Here’s how Microsoft plans to integrate ChatGPT into its apps and bots via Azure • The Register
Microsoft is pitching ChatGPT, with all its promise and shortcomings, as a cloud service for world-plus-dog via Azure.
Redmond announced this week that ChatGPT will be selectively available as a preview within Azure OpenAI Service. The service is largely aimed at companies that want to use large language models in their applications and workflows, such as DALL·E 2 for image generation, GPT-3.5 for text, and Codex for something resembling code.
By making ChatGPT available through Azure, participating organizations can access the software, embed it into their applications and pipelines, generate walls of text for whatever purpose they can successfully justify, and let users interact with a chatty, imaginative bot.
“Developers can integrate custom AI-powered experiences directly into their own applications, including enhancing existing bots to handle unexpected questions, summarizing call center conversations for faster customer service solutions, creating new ad copy with personalized offers, automating claims processing, and more,” enthused Eric Boyd, Microsoft’s corporate vice president of AI platforms.
“Cognitive Services can be combined with Azure OpenAI to create compelling use cases for enterprises.”
Then again, he would say that. Microsoft is billions of dollars deep into OpenAI. Part of the investment deal is a pact giving Microsoft the rights to commercialize OpenAI’s technology. That includes up-and-coming text generator ChatGPT, a non-intelligent bot that predicts the words to emit from a given input prompt. You can ask it to come up with a movie synopsis, for example, and it will give it a go.
Redmond has aggressively built machine learning capabilities into its portfolio and cloud services, launching Azure OpenAI Service in 2021. Boyd says more than 1,000 companies use the system, with Microsoft citing Moveworks, KPMG, and Al Jazeera as examples.
Now they and other organizations can tap into ChatGPT in Azure, with everything that comes with it. The service currently costs $0.002 per 1,000 tokens while in preview. OpenAI prices its various AI models by the token, which it describes as chunks of words, with 1,000 tokens corresponding to roughly 750 words.
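At that rate, a quick back-of-the-envelope calculation gives a feel for the bill. The sketch below assumes OpenAI's rough 750-words-per-1,000-tokens ratio; a real tokenizer would give a slightly different count depending on the text.

```python
# Rough cost estimate for ChatGPT usage at the preview price quoted above.
# The words-to-tokens ratio is OpenAI's stated approximation, not an
# exact tokenizer count, so treat results as ballpark figures only.

PRICE_PER_1K_TOKENS = 0.002  # USD, Azure preview pricing
WORDS_PER_1K_TOKENS = 750    # approximate ratio per OpenAI

def estimate_cost(word_count: int) -> float:
    """Approximate USD cost of processing `word_count` words."""
    tokens = word_count * 1000 / WORDS_PER_1K_TOKENS
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# A 7,500-word transcript is roughly 10,000 tokens, or about $0.02.
print(f"${estimate_cost(7500):.2f}")
```

In other words, the headline price is tiny per request, and costs only become meaningful at high volume.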
Billing for ChatGPT usage will begin on March 13. Developers can request access to the software in Azure here.
Trained on mountains of text scraped from web pages and other sources, the bot quickly captured the world’s imagination after it became available to the public via OpenAI’s website in November. According to a study based in part on statistics from Similarweb, it became the fastest-growing app to reach 100 million users, hitting that mark by early February.
There are problems.
In a column on The Register earlier this month, Alexander Hanff, a computer scientist and privacy technologist, said the chatbot told him he was dead and doubled down with fake obituaries. We have pointed out other problems, too.
None of this is likely to slow Microsoft’s push of the code into its products, which already include Bing, Edge, and Skype. The Windows 11 giant will talk more about its plans for ChatGPT and other AI technologies at “The Future of Work with AI,” an event hosted by CEO Satya Nadella on March 16.
Microsoft’s Boyd nodded to the challenges with AI tools like ChatGPT.
“We understand that any innovation in AI must be done responsibly,” he said. “This becomes even more important with powerful new technologies like generative models. We’ve taken an iterative approach to large models, working closely with our partner OpenAI and our customers to carefully assess use cases, learn and manage potential risks.”
When developers apply to use ChatGPT from Azure, they must outline how they intend to use the technology before being granted access, Microsoft said. It also plans to filter hateful and offensive content.
“In the event of policy violations, we may ask the developer to take immediate action to prevent further abuse,” the company added. ®