
Here’s Why Microsoft is Betting Big on ChatGPT and AI

The company has added ChatGPT to Azure OpenAI Service for developers seeking to integrate custom AI-powered experiences directly into their own applications.

What's Inside
  • ChatGPT on Azure OpenAI Service is in beta

    Microsoft is currently running previews of ChatGPT on Azure OpenAI Service for application developers seeking to integrate custom AI-powered experiences directly into their applications.

  • Why Azure is the right fit for ChatGPT

    Microsoft is supportive of ChatGPT, both in terms of its commercial investments in the AI technology and through a longstanding partnership with ChatGPT maker OpenAI.

  • The need to educate AI

    AI models have to be trained on data before they are useful. Pre-trained models let third parties either consume them directly or fine-tune them with their own company-specific data to get customized results.

  • Microsoft’s AI future

    Microsoft will continue to invest in AI, and its strategy for putting the technology into the hands of application developers is still in its infancy.


Microsoft’s move to bring ChatGPT into the realm of its Azure cloud service is an important step in the company’s big bet on the power of artificial intelligence (AI) in its future.

It was announced in March that ChatGPT – a natural-language AI tool designed to answer questions and assist with a range of computing tasks – would be available on Microsoft Azure OpenAI Service, allowing application developers to integrate ChatGPT into the cloud apps they build.

Jamie Wakeam, the National Director of Intelligent Cloud for Microsoft Canada, says that ChatGPT can, among other things, generate content such as scripts and documents.

“There are so many different use cases for applying generative AI; examples are popping up online all over the place. One great example is how generative AI will change how you interact with product support centres. We’re all used to interfacing with basic chatbot services, but generative AI takes that to an entirely new level. Not only will the bot responses be more comprehensive, but summarized transcripts can now be provided when your chat gets transferred to a live agent. Call scripts can be automatically generated, based on the chat log. All of this creates a more productive call agent and a happier customer who gets what they need solved faster.”

ChatGPT might also be used to fine-tune content. Citing the call centre application again, Wakeam says an organization’s product information and lexicon might be fed into an AI model so that, as a contact centre application generates content, the AI recognizes and uses product names, slogans and other key words relevant to its products. ChatGPT might also be used to produce summaries by paraphrasing or editing down long, detailed documents.

“Generative AI (such as ChatGPT) has the ability to reason over documents and give you a sort of executive summary of salient points,” Wakeam says. So, in the context of a call centre application, it’s possible to summarize phone conversations or a string of email threads with a particular customer that may have included several touchpoints during a support incident.
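
As a rough illustration of the summarization scenario Wakeam describes, here is a minimal Python sketch that asks a ChatGPT deployment on Azure OpenAI Service to summarize a support-chat transcript for a live agent. The resource name, deployment name and API version shown are placeholders, and the call follows the Azure configuration of the openai Python library as it stood during the preview; the current Azure OpenAI documentation should be treated as authoritative.

```python
# Minimal sketch: summarizing a support-chat transcript with a ChatGPT
# deployment on Azure OpenAI Service. Resource name, deployment name and
# api_version are placeholders -- substitute your own values.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"  # your Azure OpenAI endpoint
openai.api_version = "2023-03-15-preview"                    # example preview API version
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]

transcript = """Customer: My router keeps dropping the connection every hour.
Agent: Have you tried updating the firmware?
Customer: Yes, twice. It still drops."""

response = openai.ChatCompletion.create(
    engine="YOUR-CHATGPT-DEPLOYMENT",  # the deployment name created in Azure
    messages=[
        {"role": "system",
         "content": "Summarize this support chat for the next agent in two or three sentences."},
        {"role": "user", "content": transcript},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)  # handoff summary for the live agent
```

In a real contact-centre integration, the transcript would come from the chat system and the generated summary would travel with the ticket when the conversation is transferred to a live agent.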

ChatGPT on Azure OpenAI Service is in beta

Microsoft is currently running previews (essentially beta testing) of ChatGPT on Azure OpenAI Service for application developers seeking to integrate custom AI-powered experiences directly into their applications. However, in the spirit of ethical and responsible usage, Microsoft is limiting access to those who complete a screening process, asking them to fill out a questionnaire that outlines their intended projects and uses.

“We want people to use it and to try it,” Wakeam says. “The one guardrail we do have is – we want this done in appropriate, ethical ways. We want people to be creative and innovate, using this type of technology. But we also want them to do so responsibly.”

The preview is currently only available in certain geographic regions where Microsoft has data centres that have Azure OpenAI Service deployed.

“As you can imagine, it does take quite a bit of engineering time and effort to deploy these types of services globally,” Wakeam adds. “Part of the preview process that we have for any Azure service is a gating process to limit access through the preview period as we build out the infrastructure, do global deployments and localization for languages and regions. All those things take a little bit of time, so that’s part of the preview process that we go through.”

Why Azure is the right fit for ChatGPT

Some market watchers speculate that by bringing ChatGPT into the Azure OpenAI Service, Microsoft is helping to commoditize and further legitimize the technology. Wakeam says the company is supportive of ChatGPT, both in terms of its commercial investments in the AI technology and through a longstanding partnership with ChatGPT maker OpenAI.

Azure is all about creating a cloud computing environment for organizations to run their solutions on – both those they create and those they consume from independent software vendors. The integration of OpenAI’s APIs into the cognitive services layer of Azure is a natural fit, Wakeam says, and another important step in the evolution of AI technologies that Microsoft has been working to bring to the commercial market for decades.

“Microsoft is providing the right type of governance and controls around those APIs,” he says. “By having this as part of the Azure platform, it allows us to keep customer data within their subscription. We’re not taking customized data sets and mixing them with every other customer’s GPT conversations and sessions. Data is confined to a particular customer’s Azure subscriptions and to the queries they make against the ChatGPT service.”

Azure’s pre-existing infrastructure provides authentication, security, governance and separation of data.

“We’ve done this for a long time on the Azure platform. It’s a natural way for us to extend that with ChatGPT,” Wakeam says. 

“So, it’s a really good way for organizations to apply those customizations, integrate into their applications and have the right access controls to govern their applications and data appropriately.”

The need to educate AI

All AI models have to be trained on data before they are useful. Pre-trained models allow third parties either to consume them directly or to apply their own company-specific data to fine-tune or tweak them and get customized results.

“A lot of organizations are looking for pre-trained models and may want to put customizations on top of those,” Wakeam says. “They’re looking for a service provider like Microsoft to provide them with technologies that already have a base of (AI training) information in there – whether it be speech, handwriting recognition, image or even more advanced analytics going across different types of data sources.”

Consider AI that performs language conversion or reasons through a decision tree. Training has already been done for vision, speech and decision models, so when you look to integrate AI technologies into applications or solutions, you’re starting from a foundation of pre-trained models.

In some cases, you’re looking to fine-tune these models or add custom data sets and “tweak” the results toward what is important for you. Wakeam cites the example of an organization that may have a product catalogue it wants to feed into an AI algorithm as customized data, so that if a request is made through a speech application for a specific product or term, the AI recognizes that term from the product’s lexicon.
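
Short of actually fine-tuning a model, one lightweight way to approximate the product-lexicon scenario Wakeam describes is to ground each request with the catalogue at prompt time. The sketch below illustrates that pattern only; the catalogue, deployment name and helper functions are hypothetical, and a production system might rely on fine-tuning or a retrieval layer instead.

```python
# Hypothetical sketch: grounding a ChatGPT prompt with a product lexicon so the
# model recognizes and uses the organization's own product names and terms.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"  # placeholder endpoint
openai.api_version = "2023-03-15-preview"                    # example preview API version
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]

# Invented catalogue, for illustration only.
PRODUCT_CATALOGUE = {
    "ContosoLink 3000": "dual-band home router",
    "ContosoMesh Pro": "whole-home mesh Wi-Fi system",
}

def build_system_prompt(catalogue: dict) -> str:
    """Fold the catalogue into the system message so answers use our lexicon."""
    lines = [f"- {name}: {description}" for name, description in catalogue.items()]
    return (
        "You are a product-support assistant. Always refer to products by these exact names:\n"
        + "\n".join(lines)
    )

def answer(question: str) -> str:
    response = openai.ChatCompletion.create(
        engine="YOUR-CHATGPT-DEPLOYMENT",  # placeholder deployment name
        messages=[
            {"role": "system", "content": build_system_prompt(PRODUCT_CATALOGUE)},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("My mesh Wi-Fi keeps dropping out. Which product do I have?"))
```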

Organizations that are building their own applications and looking to integrate with other applications deployed on Azure would use Azure OpenAI Service APIs to do that, Wakeam says.

“You need to have an app that you want to plug in to one of these API services,” he says. “That application could be something very simple, like a web page or a mobile application. Or it could be a completely customized line-of-business solution that your organization has built. There’s a very broad spectrum in terms of who would use these API services.”
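
To give a sense of what plugging an application into one of these API services looks like at the HTTP level, here is a minimal sketch of a raw REST call to an Azure OpenAI chat completions endpoint, the kind of request a simple web page or mobile backend might make. The endpoint shape, api-version and key handling are assumptions based on the service’s documented pattern at the time; a production application would more likely use an SDK and Azure identity-based authentication.

```python
# Minimal sketch of calling an Azure OpenAI ChatGPT deployment over plain HTTPS.
# Resource name, deployment name and api-version are placeholders -- check the
# current Azure OpenAI documentation for the exact values your region supports.
import os
import requests

resource = "YOUR-RESOURCE"              # Azure OpenAI resource name
deployment = "YOUR-CHATGPT-DEPLOYMENT"  # ChatGPT deployment name in that resource
url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version=2023-03-15-preview"
)

payload = {
    "messages": [
        {"role": "system", "content": "You are a concise product-support assistant."},
        {"role": "user", "content": "How do I reset my router?"},
    ]
}

resp = requests.post(
    url,
    headers={"api-key": os.environ["AZURE_OPENAI_API_KEY"]},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```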

Microsoft also supports the creation of low-code or no-code solutions through its Power Platform service for building web pages, mobile and desktop apps without the need for extensive development and coding skills.

“They, too, can now plug into Azure OpenAI Service APIs. So, if you easily want to add a ‘chat’ prompt into your application, you can do so with very little or no coding and publish that through the Power Apps platform.”

Microsoft’s AI future

Wakeam says Microsoft will continue to invest in AI and that its AI strategy is “just at the infancy of getting the technology into the hands of application developers.”

“There’s a lot more to come in a short period of time,” he says. “The next wave is going to be taking the output of ChatGPT responses and having some of those actions automated with real tasks. As a simple example, if I’m working on a new product tweet, the AI systems can now ask if you’d like your message posted directly on your Twitter account, with your permission, of course. As Microsoft releases our copilot capabilities in the Office suite, you will really start to see how generative AI output can be automated. Not only will it recommend text, but it can actually create the PowerPoint slides, amazing layouts, email and share the file around to your colleagues, book meetings, summarize meetings and so much more. This will be true virtual assistance that will help you drastically improve your productivity.

“I think that’s really where it gets super exciting.”