Exploring AI innovation: Insights from Microsoft Ignite 2023

We just got back from some really interesting days in Seattle attending the Microsoft Ignite Conference. The big theme was, of course, AI.

Looking across the sessions we attended, it seems that this theme can be divided into two subthemes:

  • How is Microsoft going to improve all their applications with AI? (Office, Azure Portal, Visual Studio etc.)
  • How can you use Microsoft services and infrastructure to build your own AI solutions?


Microsoft is introducing the concept of copilots across most of their applications. A copilot is an AI assistant that is context aware and can help you solve the tasks you are working on in whatever Microsoft application you are using: in Office applications or in Visual Studio, in the Azure portal when managing resources, or in a Fabric notebook, where AI can enrich your data or generate the code that enriches and analyses it.

Some of you might already be familiar with Copilot in Visual Studio, but get used to seeing similar help in whatever Microsoft application you work in. It is really mind-blowing how much help you can get from a copilot in different tasks, especially when the copilot knows the entire context you are working in and has access to execute complex tasks and query for additional information on your behalf.

Here are some examples that I am looking forward to playing with:

  • Azure portal

    Ask the copilot questions that can be answered with queries to Azure Resource Graph. For instance: how many VMs are currently running in this tenant? Or: show me the functions where the cost has increased during the last week.

  • Visual Studio

    Explain code. Analyse stack traces. Write commit messages. Generate code.

  • Microsoft Fabric

    Generate code to train custom ML models.

  • Excel

    Let Copilot generate Python code that can be executed in Excel to analyse data in my Excel workbook.

Azure AI Studio

Microsoft is adding copilots to all their UIs, but they are also making the frameworks and infrastructure for building copilots available to developers, along with some nice tooling. From the demos I saw, the developer experience seems promising.

Here are some highlights:

  • Prompt flow

    When building on LLMs to create customized AI solutions, we need a bit more than the traditional question-and-answer interaction you have probably already played around with in ChatGPT. A prompt flow lets us combine vector search, Retrieval Augmented Generation, API calls and other techniques to build a much more capable AI assistant: one that is aware of context, can ask follow-up questions to give you better answers, and can execute queries and actions on your behalf.

    I am looking forward to experimenting with the VS Code extension for working with prompt flow. It looks promising.
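    The flow idea itself is simple: a pipeline of nodes that each enrich a shared context before the model is called. Here is a rough, library-free sketch of that shape. The node names, the order lookup, and the stubbed LLM call are all illustrative assumptions of mine, not the actual prompt flow SDK, which adds tracing, variants and a visual editor on top of essentially this structure.

```python
# Minimal sketch of a "flow": each node is a plain function that reads
# from and writes to a shared context dict, and nodes run in sequence.

def lookup_order_history(ctx):
    # Stand-in for an API call fetching the user's past orders.
    ctx["orders"] = ["TrailRunner shoes (waterproof)"]
    return ctx

def build_prompt(ctx):
    # Combine retrieved data with the user's question.
    ctx["prompt"] = (
        f"Customer orders: {ctx['orders']}\n"
        f"Question: {ctx['question']}"
    )
    return ctx

def call_llm(ctx):
    # Stubbed model call; a real flow would send ctx["prompt"]
    # to an Azure OpenAI deployment here.
    ctx["answer"] = "Yes, the TrailRunner shoes you bought are waterproof."
    return ctx

def run_flow(question, nodes):
    ctx = {"question": question}
    for node in nodes:  # each node enriches the context for the next
        ctx = node(ctx)
    return ctx["answer"]

print(run_flow("Are the shoes I bought waterproof?",
               [lookup_order_history, build_prompt, call_llm]))
```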

  • Vector search and RAG

    By adding an LLM-generated vector representation (an embedding) of your data to your database, you can implement Retrieval Augmented Generation. This technique makes it possible to provide the LLM with the relevant contextual data when submitting a query or question.

    For instance, you can create embeddings based on all your product titles, descriptions, and specifications, and the LLM will, almost magically, be able to answer detailed questions about your entire product catalogue. Combine this with a flow that retrieves your order history and submits it to the LLM as well, and you will be able to ask a simple question such as “Are the shoes I bought waterproof?” and get a precise answer.

    The changes announced in this area were that embeddings and vector search will be supported in PostgreSQL, Cosmos DB, the new MongoDB vCore offering on Cosmos DB, and of course in Azure AI Search (formerly known as Azure Cognitive Search). So it will be really easy to add and maintain an embedding representation of your data in a way that matches your current infrastructure and needs.
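    To make the retrieval step concrete, here is a minimal sketch under simplifying assumptions: the product texts are hypothetical, and a toy bag-of-words vector stands in for a real embedding model and vector store (such as an Azure OpenAI embedding deployment plus Azure AI Search). Real embeddings are dense float vectors, but the rank-by-similarity-then-augment-the-prompt logic has the same shape.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for an embedding model: a bag-of-words vector.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical product descriptions standing in for your catalogue.
documents = [
    "TrailRunner shoes: waterproof hiking shoes with rubber sole",
    "CityWalk sneakers: lightweight canvas sneakers for everyday use",
    "PeakJacket: insulated winter jacket with detachable hood",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(question: str, k: int = 1) -> list:
    # Rank all documents by similarity to the question, keep the top k.
    q = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def build_augmented_prompt(question: str) -> str:
    # The augmented prompt is what actually gets sent to the LLM.
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_augmented_prompt("Are the shoes I bought waterproof?"))
```

    In a production setup the embeddings would be precomputed and stored next to the data in one of the databases mentioned above, so the similarity search runs inside the database rather than in application code.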

  • Using AI to benchmark your AI solution

    In Azure AI Studio you get some nice features for benchmarking your solution. I saw a pretty cool demo of a setup where GPT-4 was used to evaluate the performance of a list of test cases for a fairly complex prompt flow. The flow itself was based on the faster and cheaper GPT-3.5 Turbo. Whenever the logic of the flow was adjusted, the test suite was executed, and it was easy to compare the performance of the previous implementation with the new one.
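    The evaluation loop from that demo can be sketched roughly like this. Everything here is a stub of my own invention: the two flow versions are hypothetical, and in the real setup the judge would be a GPT-4 grading prompt rather than a keyword check, but the structure — run the suite, score each answer, compare averages across versions — is the same.

```python
# LLM-as-judge benchmark harness, reduced to its bare structure.

test_cases = [
    {"question": "Are the shoes waterproof?", "must_mention": "waterproof"},
    {"question": "Do you ship to Norway?",    "must_mention": "norway"},
]

def flow_v1(question):
    # Older flow version: only handles product questions.
    return "Yes, the shoes are waterproof." if "shoes" in question else "I don't know."

def flow_v2(question):
    # Adjusted flow version: also handles shipping questions.
    if "ship" in question:
        return "Yes, we ship to Norway."
    return "Yes, the shoes are waterproof."

def judge(answer, case):
    # Stand-in for a GPT-4 grading call scoring answer quality 0..1.
    return 1.0 if case["must_mention"] in answer.lower() else 0.0

def benchmark(flow):
    # Average judge score over the whole test suite.
    scores = [judge(flow(c["question"]), c) for c in test_cases]
    return sum(scores) / len(scores)

print(f"v1: {benchmark(flow_v1):.2f}  v2: {benchmark(flow_v2):.2f}")
```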

  • Monitoring performance

    The same technique can be used for monitoring the performance of your AI solution in production. You can continuously track the same performance metrics used in your automated tests, but now based on a sample of real questions and answers from your solution. Azure AI Studio provides a nice UI for this data as well.

All in all, it is obvious that the possibilities created by OpenAI will have a huge impact on all Microsoft products and services. As users we will get used to having copilots that can help us in many aspects of our daily work. These copilots already show mind-blowing capabilities and quality in many scenarios, and they will most likely improve dramatically in the years to come.

The cost in terms of licensing or consumption is going to be interesting. It is obvious that OpenAI and Microsoft have a bit of a gold mine here. The potential is huge, and the price tag will probably match it.

When it comes to custom-built AI solutions, pricing will be consumption based. The prices of OpenAI products are going down while the capabilities and token limits are going up, so I think we can build great value here for a fair price.

When it comes to copilots across Microsoft applications, I think the licensing model will be a bit of a challenge for many smaller companies. You need a significant license commitment to unlock the good stuff, and that price can be hard to justify for many, even though there are real productivity gains to be had. But I assume that the copilot concept will soon be standard in every piece of software we work with, so it is hard to imagine a future where copilots are not standard in the Microsoft suite.

Further exploration

Many of the sessions can be streamed from the Ignite home page.

I would especially recommend the fairly technical demo of prompt flow in VS Code:


But I also recommend you take a look at:


Written by Nikolaj Kaplan, Senior Architect