UTILISING ARTIFICIAL INTELLIGENCE ON AZURE

Azure AI Workshop

01 Dec 2023 - Estimated reading time: 5 min

Following on from Fraser Skea’s post Azure build AI day, a small group from across the digital community here at Hymans attended a three-day workshop with a Microsoft AI expert. There are many AI proofs of concept (POCs) in progress at Hymans, and the main purpose of the workshop was to get a better understanding of the services available within Azure so we can get the most out of them.

Day 1 – AI Solution Example and AI History

We started off with a brief journey through the history of AI, from its humble beginnings in 1956 to the mind-bending Generative AI we have today. We then discussed the mathematical wizardry used within Azure OpenAI and Azure Cognitive Search. This really helped ground what these tools do and gave us a better understanding of how to use them.

Next, we delved into a step-by-step guide to building a full AI solution, which gave us an idea of what can be achieved with a Retrieval Augmented Generation (RAG) architecture. The main advantage of a RAG architecture is that you can orchestrate how a Large Language Model (LLM), like ChatGPT, takes in and processes your data, rather than just using the LLM directly. This makes it possible to tailor responses to the user for whatever use case is needed.
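To make the pattern concrete, here is a minimal sketch of a RAG loop in Python, assuming an existing Azure Cognitive Search index and an Azure OpenAI chat deployment. The endpoint variables, index and field names, and deployment name are placeholders for illustration, not the ones used in the workshop.

```python
# Minimal RAG sketch: retrieve relevant documents from Azure Cognitive Search,
# then pass them to an Azure OpenAI chat model as grounding context.
# Endpoints, keys, index and deployment names below are placeholders.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="policy-docs",  # hypothetical index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)


def answer(question: str) -> str:
    # 1. Retrieve: fetch the top few documents that match the question.
    hits = search_client.search(search_text=question, top=3)
    context = "\n\n".join(hit["content"] for hit in hits)  # assumes a 'content' field

    # 2. Augment and generate: give the model the retrieved text as its only source.
    response = openai_client.chat.completions.create(
        model="gpt-35-turbo",  # your Azure OpenAI deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the context provided. "
                        "If the answer is not in the context, say you don't know."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```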

Day 2 – Responsible AI, Security and Cost

We finished working through the step-by-step guide, leaving us with a fully fledged AI chatbot which can query multiple data sources and integrate with different platforms, MS Teams or MS Outlook for example, using the Azure AI Bot Service.
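For a flavour of what that integration looks like, the sketch below uses the Bot Framework SDK (botbuilder-core), which is how bots registered with the Azure AI Bot Service typically handle messages. It simply forwards each incoming message to an answer function like the one sketched earlier; the app registration and messaging endpoint needed to surface it in MS Teams or MS Outlook are omitted.

```python
# Rough illustration of a Bot Framework message handler that forwards each
# incoming message to a RAG backend. The answer function is the hypothetical
# one from the previous sketch; Azure app registration and the messaging
# endpoint wiring are not shown here.
from botbuilder.core import ActivityHandler, TurnContext


class RagBot(ActivityHandler):
    def __init__(self, answer_fn):
        # answer_fn: a callable like the 'answer' function sketched above.
        self._answer = answer_fn

    async def on_message_activity(self, turn_context: TurnContext):
        question = turn_context.activity.text
        reply = self._answer(question)  # query the RAG pipeline
        await turn_context.send_activity(reply)
```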

We then had a thought-provoking presentation about Responsible AI and Microsoft’s Responsible AI Standard, which is designed to help the community create responsible and ethical AI solutions. This is a very important topic in the development of AI solutions.

Following that, we looked at security in Azure OpenAI, and specifically how we can filter responses based on the signed-in user.
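A common way to do this, sketched here under assumptions rather than as the exact workshop setup, is security trimming at the retrieval layer: each indexed document carries the Azure AD group IDs allowed to read it, and the search query filters on the signed-in user’s groups, so the LLM is only ever grounded on documents that user can see.

```python
# Sketch of security trimming in Azure Cognitive Search. The index is assumed
# to have a 'group_ids' collection field listing the Azure AD groups allowed
# to read each document. Filtering on the caller's groups at query time means
# the LLM only sees documents the signed-in user is permitted to access.
def search_as_user(search_client, question: str, user_group_ids: list[str]):
    # Build an OData filter: keep documents whose group_ids overlap the user's groups.
    groups = ",".join(user_group_ids)
    security_filter = f"group_ids/any(g: search.in(g, '{groups}'))"
    return search_client.search(search_text=question, filter=security_filter, top=3)
```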

This led us into cost and pricing: understanding how Azure AI services are priced and the steps we could take to cut costs.

We finished the day with prompt flow engineering in the Azure Machine Learning service. This is a low-code tool which allows everyone, not just developers, to develop, test and deploy LLM solutions in the cloud.

Day 3 – Enhancing our POC

Finally, we finished the workshop by looking at a POC we had previously created. This was our opportunity to dig into our own implementation and get feedback from a Microsoft AI expert. Looking at our specific use case, we were able to get some insight into how we can tweak our prompts and use stored data to enrich responses from the LLM.
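As a purely illustrative example of that kind of tweak (not our actual prompts or data), enriching a response can be as simple as injecting stored data into the system prompt alongside the retrieved context:

```python
# Illustrative only: tweak the system prompt and enrich it with stored data
# (a hypothetical user record) so the LLM tailors its answer to the use case.
def build_system_prompt(user_record: dict, retrieved_context: str) -> str:
    return (
        "You are an internal assistant. Be concise and cite the source document.\n"
        f"The user works in the {user_record['department']} team.\n"
        "Answer only from the context below; say so if the answer is not there.\n\n"
        f"Context:\n{retrieved_context}"
    )
```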

Summary 

The workshop was a fantastic opportunity to really dig into the tools that Microsoft provides for developing AI solutions. Mutlu, our helpful AI expert, wasn’t afraid to go into detail when we asked for it, making the workshop a great interactive experience tailored to suit our needs. We came away with a better understanding of how to use these tools and improve our current POCs, and it opened our minds to other ways in which we can implement these AI services.

If you want to know more about this topic, leave a comment or message me directly.
