
April 28, 2025

One of the early criticisms of Large Language Models (LLMs) was their inability to do reliable math when answering questions. Newer LLMs can handle reasoning and mathematical tasks well, but at a high compute cost, and query times can be slow. Microsoft Fabric offers easy-to-use SaaS data tools that scale to very large volumes of data and run traditional relational database queries. While custom Retrieval-Augmented Generation (RAG) models have been able to map natural language queries to structured databases, most implementations have been high-code custom designs.
Microsoft Fabric is a vast suite of tools that are easy to use, but choosing the right tool for the right purpose can require some training and knowledge. My teammate Inder Rana and I created a GitHub repo that uses 250 million rows of real CMS open-source Medicare Part D healthcare data, which can be found here: fabric-samples-healthcare/analytics-bi-directlake-starschema at main · isinghrana/fabric-samples-healthcare. The entire repo can be deployed in a few hours without requiring a coding background.
In the video below, I show you how to connect an Azure AI Foundry Agent to a Microsoft Fabric Data Agent, which can be created with the GitHub repo. At the conclusion of the build, you can start asking the AI Foundry Agent questions and getting mathematically correct answers from 250M rows of data!
Here is a link to the video: https://youtu.be/yQkbd1f6JFk.
More information about querying Microsoft Fabric Data Agents from Azure AI Foundry Agents can be found here: https://learn.microsoft.com/en-us/fabric/data-science/how-to-consume-data-agent
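To give a rough sense of how an AI Foundry Agent gets wired to a Fabric Data Agent, the sketch below builds the kind of tool definition involved: the agent is created with a tool entry that points at a Fabric Data Agent connection. This is a minimal illustration only; the `fabric_dataagent` tool-type string, the connection ID, the model deployment name, and the helper functions are all assumptions for illustration, so follow the Microsoft Learn link above for the current API surface and SDK.

```python
# Hypothetical sketch: assembling a create-agent request that attaches a
# Fabric Data Agent as a tool. All names and the payload shape here are
# illustrative assumptions, not the authoritative API contract.

def build_fabric_tool(connection_id: str) -> dict:
    """Build a tool definition pointing at a Fabric Data Agent connection
    (shape assumed for illustration)."""
    return {
        "type": "fabric_dataagent",
        "fabric_dataagent": {"connections": [{"connection_id": connection_id}]},
    }

def build_agent_request(model: str, instructions: str, connection_id: str) -> dict:
    """Assemble the request body for creating an agent with the Fabric tool."""
    return {
        "model": model,
        "instructions": instructions,
        "tools": [build_fabric_tool(connection_id)],
    }

request = build_agent_request(
    model="gpt-4o",  # placeholder model deployment name
    instructions="Answer questions using the Medicare Part D data agent.",
    connection_id="/subscriptions/.../connections/my-fabric-agent",  # placeholder
)
print(request["tools"][0]["type"])  # fabric_dataagent
```

The key idea is that the natural-language-to-SQL work happens inside the Fabric Data Agent; the AI Foundry Agent simply routes data questions to it through this tool connection.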