Harnessing the power of Azure’s cutting-edge AI and cloud services has revolutionized the way businesses approach customer support. In this blog, we explore how you can implement a scalable and secure AI chatbot solution infrastructure using Large Language Models (LLMs) and Azure Machine Learning Prompt Flow. By employing Azure resources such as Azure OpenAI, AI Search, and Cognitive Services, along with robust architecture principles like private endpoints and shared private links, organizations can elevate their customer support experience while maintaining enterprise-grade security and seamless integration.
Overview of an AI chatbot solution:
Problem:
You want to automate customer support using a chatbot powered by a Large Language Model (LLM), but need to ensure the bot responds accurately, politely, and stays within company guidelines.
Solution:
Use Azure Machine Learning Prompt Flow to design and test an LLM-powered workflow that answers customer queries using optimized prompts and integrated knowledge sources.
Evaluate and refine responses at scale, then deploy the flow as a real-time API for seamless chatbot integration.
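As a rough illustration of the last step, here is a minimal Python sketch of calling a deployed Prompt Flow real-time endpoint. The endpoint URL, key, and the input/output field names (`question`, `answer`) are placeholders; take them from your own deployment.

```python
# Minimal sketch: calling a Prompt Flow real-time (managed online) endpoint.
# The endpoint URL, API key, and input/output field names are placeholders
# that must match your own flow and deployment.
import json
import requests

ENDPOINT_URL = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"
API_KEY = "<endpoint-key>"

def ask_chatbot(question: str) -> str:
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    }
    payload = {"question": question}          # must match the flow's input name
    response = requests.post(ENDPOINT_URL, headers=headers, data=json.dumps(payload))
    response.raise_for_status()
    return response.json().get("answer", "")  # must match the flow's output name

if __name__ == "__main__":
    print(ask_chatbot("What is your return policy?"))
```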
Azure Resources to be used in the solution:
💬 Azure OpenAI
Provides access to OpenAI’s powerful language models like GPT via Azure, enabling natural language processing and generative AI in applications. It ensures enterprise-grade security, scalability, and compliance for AI workloads.
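To give an idea of how the chatbot’s generation step can look in code, here is a minimal sketch of a chat completion against an Azure OpenAI deployment using the `openai` Python package; the endpoint, key, API version, and deployment name are placeholders.

```python
# Minimal sketch: a chat completion against an Azure OpenAI deployment.
# Endpoint, key, API version, and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com/",
    api_key="<api-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<gpt-deployment-name>",  # name of your model deployment, not the base model
    messages=[
        {"role": "system", "content": "You are a polite support assistant. "
                                      "Answer only from company guidelines."},
        {"role": "user", "content": "How do I reset my password?"},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```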
🔎 Azure AI Search
A cloud search service with built-in AI capabilities to extract and index information from various data sources. It enables intelligent, scalable search experiences across apps and websites.
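As a small illustration, the sketch below queries an existing search index of FAQ documents with the `azure-search-documents` package; the service endpoint, query key, index name, and field names are assumptions.

```python
# Minimal sketch: querying an existing AI Search index of FAQ documents.
# Endpoint, query key, index name, and field names are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="faq-index",                       # assumed index name
    credential=AzureKeyCredential("<query-key>"),
)

# Retrieve the top FAQ passages most relevant to a customer question.
results = search_client.search(search_text="How long does shipping take?", top=3)
for doc in results:
    print(doc.get("title"), "-", doc.get("content"))   # assumed field names
```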
🌐 Azure App Service
A fully managed platform for building, hosting, and scaling web apps and APIs quickly.
Supports multiple languages and integrates with CI/CD tools for streamlined deployments.
🧠 Azure AI Cognitive Services
A collection of pre-built AI models that handle vision, speech, language, and decision-making tasks. Enables developers to add intelligent features like facial recognition or sentiment analysis to apps without ML expertise.
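For example, the chatbot could score the sentiment of incoming messages and escalate negative ones to a human agent. The sketch below uses the `azure-ai-textanalytics` package; the endpoint and key are placeholders.

```python
# Minimal sketch: scoring the sentiment of incoming customer messages with the
# Language (Text Analytics) service. Endpoint and key are placeholders.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

text_client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<api-key>"),
)

messages = ["My order arrived late and the box was damaged."]
for result in text_client.analyze_sentiment(documents=messages):
    # e.g. route "negative" messages to a human agent
    print(result.sentiment, result.confidence_scores)
```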
⚙️ Azure Machine Learning (Azure ML)
An end-to-end platform for building, training, and deploying machine learning models at scale. Supports automated ML and model management with enterprise-grade tools.
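A minimal sketch of connecting to the workspace with the `azure-ai-ml` SDK is shown below; the subscription, resource group, and workspace names are placeholders. The resulting `ml_client` is reused in a later sketch to provision compute.

```python
# Minimal sketch: connecting to the Azure ML workspace that hosts the prompt flow.
# Subscription, resource group, and workspace names are placeholders.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<aml-workspace>",
)
print(ml_client.workspace_name)  # quick sanity check that the connection works
```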
💾 Azure Storage Account
Provides scalable, durable cloud storage for data objects like blobs, files, queues, tables, and disks. It supports multiple access tiers and redundancy options, making it ideal for diverse workloads from backups to web apps.
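For illustration, the sketch below uploads an FAQ document into a blob container that AI Search can later index, using the `azure-storage-blob` package; the account URL, container name, and file name are assumptions.

```python
# Minimal sketch: uploading FAQ documents into the blob container that AI Search
# will index. Account URL, container name, and file name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = blob_service.get_container_client("faq-documents")  # assumed container

with open("faq_returns_policy.pdf", "rb") as data:
    container.upload_blob(name="faq_returns_policy.pdf", data=data, overwrite=True)
```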
Architecture of the solution:
Now, if the architecture is set up in this manner with Azure Private Endpoints only, the connections from AI Search to the other resources do not work.
By default, Azure AI Search connects to other Azure services over the public internet.
When you create a search service, it does not reside in your virtual network (VNet). It connects to data sources (like Blob Storage, SQL DB, etc.) via public endpoints.
While this default setup is easy to use, it’s not ideal for:
- Highly secure or regulated environments.
- Scenarios where sensitive data shouldn’t be exposed over the public network.
- Customers who need private, encrypted traffic within Azure only.
Hence, Azure offers shared private links, which let you route this traffic over the Azure backbone network instead of the public internet.
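As a rough sketch of one step toward such a locked-down setup, the snippet below disables public network access on the search service using the `azure-mgmt-search` management SDK. This is an illustrative assumption of one way to do it; the same setting is also available in the portal.

```python
# Minimal sketch (assumption: azure-mgmt-search management SDK): locking the
# search service down so it only accepts traffic over private endpoints.
from azure.identity import DefaultAzureCredential
from azure.mgmt.search import SearchManagementClient

search_mgmt = SearchManagementClient(DefaultAzureCredential(), "<subscription-id>")

service = search_mgmt.services.get("<resource-group>", "<search-service>")
service.public_network_access = "disabled"   # refuse calls from the public internet

search_mgmt.services.begin_create_or_update(
    "<resource-group>", "<search-service>", service
).result()
```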
Architecture of the solution with shared private links:
💬 Azure OpenAI:
Provides access to OpenAI’s language models such as GPT, which the AML Prompt Flow leverages for natural language processing and generative AI to produce answers.
🔎 Azure AI Search:
Extracts and indexes the FAQ documents so they can be searched quickly when constructing the chatbot's answers.
💾 Azure Storage Account
Stores the FAQ documents that AI Search indexes and queries.
🌐 Azure App Service:
Hosts the web app that serves as the front end for the chatbot.
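A minimal, hypothetical sketch of such a front end is shown below: a small Flask app, deployable to App Service, that relays chat messages to the Prompt Flow scoring endpoint, reading the endpoint URL and key from app settings.

```python
# Minimal sketch: a Flask front end (deployable to App Service) that relays chat
# messages to the Prompt Flow scoring endpoint. The app setting names and the
# "question"/"answer" field names are assumptions.
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
ENDPOINT_URL = os.environ["PROMPTFLOW_ENDPOINT_URL"]   # assumed App Service settings
API_KEY = os.environ["PROMPTFLOW_ENDPOINT_KEY"]

@app.post("/chat")
def chat():
    question = request.get_json().get("question", "")
    resp = requests.post(
        ENDPOINT_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"question": question},
    )
    resp.raise_for_status()
    return jsonify(resp.json())

if __name__ == "__main__":
    app.run()  # on App Service, a production server such as gunicorn is used instead
```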
🧠 Azure AI Cognitive Services:
Adds intelligent features such as sentiment analysis to the app without requiring ML expertise.
⚙️ Azure Machine Learning:
AML Prompt Flow is developed and tested using a Compute Instance, which serves as an interactive, user-friendly environment to author, debug, and visualize flows.
Once ready, the prompt flow is executed at scale on an AML Compute Cluster, which automatically provisions and scales VMs to run batch jobs, evaluations, or large-scale prompt testing efficiently.
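For illustration, the sketch below provisions an auto-scaling compute cluster with the `azure-ai-ml` SDK, reusing the `ml_client` from the earlier workspace sketch; the cluster name, VM size, and scale limits are placeholder choices.

```python
# Minimal sketch: provisioning an auto-scaling compute cluster for batch prompt
# flow runs and evaluations. Name, VM size, and scale limits are placeholders.
from azure.ai.ml.entities import AmlCompute

cluster = AmlCompute(
    name="promptflow-cluster",       # assumed cluster name
    size="Standard_DS3_v2",
    min_instances=0,                 # scale to zero when idle
    max_instances=4,                 # scale out for large evaluation runs
    idle_time_before_scale_down=300, # seconds of idle time before scaling down
)
ml_client.compute.begin_create_or_update(cluster).result()  # ml_client from the earlier sketch
```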
Private architecture:
Each of these resources has a private endpoint created and in the approved state. Additionally, because AI Search must reach its dependent services privately, shared private links are created from AI Search to the storage account (blob), Azure OpenAI, and AI Services.
- Private Endpoints provide a secure, private connection to an Azure resource within a Virtual Network (VNet).
- Shared Private Links in Azure allow one Azure resource (like Azure AI Search) to securely access another resource (like Blob Storage or OpenAI) over Azure’s private backbone network instead of the public internet.
Key Benefits of Shared Private Links:
- Simplified Access for Consumers: Customers can access private resources securely without setting up their own VNet.
- Lower Infrastructure Overhead: Reduces the need for additional networking configuration, especially useful when multiple users/services need access to the same resource privately.
- Controlled Sharing: Resource owners can grant fine-grained access to specific users or services while still keeping the resource within the private network boundary.
Shared private links in AI Search:
Creating a shared private link is similar to creating a private endpoint; however, instead of connecting to a subnet, a shared private link connects to another resource that supports shared private links. Once created, you have to approve the shared private link in the connected resource, just as you would approve a private endpoint.
For example, once you have created the shared private links that connect AI Search to Blob Storage, Azure OpenAI, and AI Cognitive Services, you have to approve each link under the Private endpoint connections tab of the corresponding resource.
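For illustration only, the sketch below creates one such shared private link (AI Search to the storage account's blob sub-resource) with the `azure-mgmt-search` management SDK. The resource names are placeholders, exact model and parameter names may vary by SDK version, and the link still has to be approved on the storage account as described above.

```python
# Minimal sketch (assumption: azure-mgmt-search SDK): creating a shared private
# link from AI Search to a storage account's blob sub-resource. Names are
# placeholders; the link must still be approved on the storage account under
# "Private endpoint connections".
from azure.identity import DefaultAzureCredential
from azure.mgmt.search import SearchManagementClient
from azure.mgmt.search.models import (
    SharedPrivateLinkResource,
    SharedPrivateLinkResourceProperties,
)

search_mgmt = SearchManagementClient(DefaultAzureCredential(), "<subscription-id>")

storage_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

search_mgmt.shared_private_link_resources.begin_create_or_update(
    resource_group_name="<resource-group>",
    search_service_name="<search-service>",
    shared_private_link_resource_name="blob-spl",
    shared_private_link_resource=SharedPrivateLinkResource(
        properties=SharedPrivateLinkResourceProperties(
            private_link_resource_id=storage_id,
            group_id="blob",                       # target sub-resource
            request_message="Requested by AI Search for FAQ indexing",
        )
    ),
).result()
```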
Services that support shared private links (as of now):
💬 Azure AI Search
Commonly uses shared private links to index content stored in private resources like Blob Storage or Cosmos DB.
🔒 Azure OpenAI
For secure prompt flow execution and interactions with other private resources.
⚙️ Azure Machine Learning
To access private data sources (e.g., private storage accounts or databases) during training and inferencing — especially within Prompt Flow.
🧠 Azure Cognitive Services
For accessing data securely in private networks.
Use-Cases for the architecture:
🛒 E-Commerce
• Customer Support: Automates responses to FAQs about orders, shipping, returns, and product inquiries, improving customer satisfaction while reducing human workload.
• Product Recommendations: Integrates with product catalogs and uses AI to suggest products based on user preferences and behavior.
🏥 Healthcare
• Patient Support: Answers frequently asked questions about appointments, billing, or insurance in a HIPAA-compliant manner.
🏛️ Public Sector
• Citizen Services: Provides instant answers to queries about government services like taxes, licenses, or benefits.
🏡 Real Estate
• Property Listings: Answers inquiries about available properties, pricing, and location details using integrated data.
• Market Insights: Educates users about real estate trends and mortgage options using AI Search and language models.
The integration of Azure’s comprehensive suite of AI and cloud technologies enables businesses to achieve unparalleled efficiency in customer support automation. Through features like Azure Machine Learning Prompt Flow, optimized workflows, and intelligent data search, this solution offers more than just responsiveness—it delivers reliability and innovation. With secure architecture utilizing private shared links and endpoints, the chatbot stays well within organizational boundaries, ensuring data privacy while simplifying access for users. By embracing Azure’s ecosystem, organizations can enhance customer satisfaction, streamline operations, and establish a future-ready foundation for intelligent automation.