May 23, 2025

This post is co-authored with Shlomi Elkayam and Henry Hernandez from Amdocs CCoE.
In this blog post you will learn how the Amdocs CCoE team improved their SLA for technical support on IT and cloud infrastructure questions. They used Azure AI Agent Service to build an intelligent email agent that helps Amdocs employees resolve their technical issues. This post describes the development phases, the solution details and the roadmap ahead.
About Amdocs CCoE
Amdocs is a multinational telecommunications technology company. The company specializes in software and services for communications, media and financial services providers and digital enterprises.
The CCoE team is responsible for the automation, infrastructure and design of all Amdocs Azure solutions, whether for internal use cases or for customer projects.
Technical Support via Email before AI
One of the team's key challenges is keeping track of support inquiries received via a generic mailbox. Amdocs has dedicated software to create and manage internal support cases, yet many employees still use email to ask questions and report technical issues. Managing this additional email-based support channel is time-consuming, inefficient and repetitive.
Amdocs wanted an intelligent way to improve the end-user experience, leverage the existing knowledge base, and integrate email into the dedicated support flow.
Solution
Amdocs decided to leverage AI agents to reduce support delays and provide intelligent support on time. The agent responds to emails based on information from the existing knowledge base, guiding users on how to troubleshoot and fix their issues.
About Azure AI Agent Service
Azure AI Agent Service is a central component of the planned solution. It is a fully managed, enterprise-grade service that allows developers to securely build AI agents without needing to manage the underlying compute and storage resources. Azure AI Agent Service supports multiple large language models (LLMs), including Azure OpenAI's GPT-4o and GPT-4o-mini as well as other models such as Meta-Llama-405B-Instruct and Cohere-command-r. The service integrates seamlessly with Azure AI Search to create Retrieval-Augmented Generation (RAG) applications, enabling AI agents grounded in private corporate data.
Experimenting with Azure AI Foundry Playground
The first step of the project was to build a simple agent grounded in the Amdocs CCoE knowledge base (KB), to see whether an agent is capable of answering support questions.
The Azure AI Foundry playground is the best tool for this type of experimentation. It allows you to create a new AI agent with just a few clicks, upload the relevant files or connect to an existing Azure AI Search index, and start asking questions to evaluate the agent's responses.
After several iterations of system prompt changes and LLM parameter tuning, the results were satisfying. It was time to move to the next phase: detailed architecture design.
Architecture design
The system design needed to address multiple requirements covering the core functionality of the flow and the user experience, while keeping Amdocs corporate security standards in place. At the time of this blog post, the solution comprised multiple components besides Azure AI Agent Service, for example knowledge base synchronization and integration with the Amdocs CCoE mailbox.
Knowledge Base Synchronization
Amdocs uses Microsoft SharePoint as its corporate knowledge base for documentation and how-to guides. To ground Azure AI Agent Service in that information, Amdocs needed a solution that ingests those documents into a vector database and connects that database to the Support Email Agent.
For this, Amdocs used a combination of Azure services and features:
- Azure Logic Apps to copy SharePoint files to Azure Blob Storage. The Logic App detects changes to SharePoint files and triggers a copy of each changed file into Azure Blob Storage.
- Azure AI Search as the vector database.
- Integrated vectorization, a managed feature that encodes text into vectors and stores them in the Azure AI Search index.
Once this flow was built, the knowledge base was fully in sync with the vector database, and all updates and new documents are automatically vectorized and ingested into Azure AI Search to ground the Email agent.
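The ingestion flow above can be sketched in a few lines of Python. This is a simplified, in-memory illustration only: in the real solution Azure AI Search integrated vectorization performs the chunking and embedding server-side, and the `chunk`, `embed` and `VectorIndex` names here are hypothetical stand-ins.

```python
from dataclasses import dataclass, field

def chunk(text: str, size: int = 200) -> list[str]:
    """Split text into fixed-size chunks (integrated vectorization does this server-side)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text: str) -> list[float]:
    """Stand-in embedding; a real pipeline calls an embedding model."""
    return [float(sum(map(ord, text)) % 97), float(len(text))]

@dataclass
class VectorIndex:
    docs: list[dict] = field(default_factory=list)

    def upsert(self, doc_id: str, text: str) -> int:
        """Vectorize every chunk of a document and store it; returns the chunk count."""
        # Re-ingesting a changed file replaces its previous chunks, which is what
        # keeps the index in sync with the SharePoint source.
        self.docs = [d for d in self.docs if d["parent_id"] != doc_id]
        chunks = chunk(text)
        for c in chunks:
            self.docs.append({"parent_id": doc_id, "chunk": c, "vector": embed(c)})
        return len(chunks)

index = VectorIndex()
count = index.upsert("how-to-vpn.md", "Step 1: open the VPN client. " * 20)
```

The key design point mirrored here is idempotent upserts per source file, so a Logic App re-triggering on a changed SharePoint document never leaves stale chunks behind.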
Integrating Azure AI Agent Service with Amdocs CCoE mailbox
This is the central part of the solution; all the business logic happens in this component. Logic Apps and Function Apps were the perfect fit for integrating AI agents using the GPT-4o model. The Logic App initiates the workflow by fetching new emails and running security validations on the email components to avoid misuse. The email body is then passed to an Azure Function App, which acts as a serverless compute layer that calls Azure AI Agent Service.
The Function App breaks the process into three functions:
- Parse Body Message – removes all redundant text from the email body, making sure the question is captured as cleanly as possible. For this task Amdocs initialized a dedicated AI agent with separate instructions. The response of this Email Cleaner Agent is sent back to the Logic App to continue the flow.
- Ask AI – calls the KB AI Agent with the cleaned email body. The KB AI Agent is configured with an extensive set of instructions on how to respond, including instructing the agent to respond in valid HTML.
- Log Responses – it is critical for the CCoE team to log the question-and-answer pairs to the Storage Account. This data helps the team improve internal documentation, add new topics, and make sure the information is available for the agent to answer user questions more accurately.
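The three functions above can be sketched as plain Python, with the agent calls stubbed out. This is an illustrative outline only: the function names follow the list above, but the bodies (signature detection via a `--` line, the canned HTML answer, the in-memory log) are hypothetical stand-ins for the real AI agent calls and Storage Account writes.

```python
qa_log: list[dict] = []  # stands in for the Storage Account Q&A log

def parse_body_message(raw_email: str) -> str:
    """Step 1 - Email Cleaner Agent: reduce the email to the bare question."""
    # Stub: keep content lines, drop everything from the signature delimiter on.
    lines = [line.strip() for line in raw_email.splitlines() if line.strip()]
    cut = lines.index("--") if "--" in lines else len(lines)
    return " ".join(lines[:cut])

def ask_ai(question: str) -> str:
    """Step 2 - KB AI Agent: answer from the knowledge base, in valid HTML."""
    # Stub answer; the real agent is grounded in Azure AI Search.
    return f"<p>Answer to: {question}</p>"

def log_responses(question: str, answer: str) -> None:
    """Step 3 - persist the Q&A pair for documentation review."""
    qa_log.append({"q": question, "a": answer})

def handle_email(raw_email: str) -> str:
    """The orchestration the Logic App drives across the three functions."""
    question = parse_body_message(raw_email)
    answer = ask_ai(question)
    log_responses(question, answer)
    return answer

reply = handle_email("How do I reset my VPN token?\n--\nJane Doe\nAmdocs")
```

Splitting the pipeline into three small functions keeps each agent's prompt focused on one job, which is easier to tune than a single do-everything prompt.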
Finally, the Logic App prepares the body of the response email and sends it back to the user. The final step of the Logic App marks the email as read so it does not trigger the workflow again.
Examples of questions and answers
Logic improvements and advanced features
As often happens, additional improvements were recognized and implemented during the development and testing of the solution.
Improving accuracy and creating closed feedback loop
Amdocs used explicit instructions in the KB AI Agent's system prompt to ensure the agent relies solely on information retrieved from the knowledge base, avoiding LLM hallucinations. Every time the agent decided it did not have enough information to answer, Amdocs saved the question/answer pair to a dedicated Azure Blob Storage container for additional review by the team, who then update the existing knowledge base or create new knowledge documents that will answer the user's question next time.
This approach ensures that the agent becomes more relevant and "smarter" over time.
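The feedback loop can be sketched as a small routing step: answers that signal insufficient knowledge go to a review queue instead of being treated as final. The `NO_ANSWER` marker and the in-memory `review_queue` are illustrative assumptions; the real solution writes the pairs to Azure Blob Storage for team review.

```python
# Hypothetical refusal phrase the system prompt tells the agent to use.
NO_ANSWER = "I don't have enough information to answer this."
review_queue: list[dict] = []  # stands in for the review Blob container

def record_outcome(question: str, answer: str) -> bool:
    """Return True if the pair was flagged for knowledge-base review."""
    if NO_ANSWER in answer:
        # Unanswerable questions become input for new KB documents.
        review_queue.append({"question": question, "answer": answer})
        return True
    return False

flagged = record_outcome("How do I order a GPU VM?", NO_ANSWER)
ok = record_outcome("How do I reset my VPN token?", "<p>Open the portal.</p>")
```

Because the refusal phrase is fixed by the system prompt, a plain substring check is enough to close the loop without a second model call.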
Prompt engineering for pre- and post-formatting
Additional formatting is required because email content arrives in HTML format. To improve the agent's response accuracy, Amdocs needed to remove all redundant text from the email that is not relevant to the actual question, for example the user's signature. This is also done with a specialized AI agent instructed to receive HTML and produce cleaned text, which the application then uses to submit the question to the knowledge base agent.
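For a sense of what this pre-formatting step does, here is a minimal HTML-to-text cleaner built on Python's standard-library `HTMLParser`. In the real solution a dedicated AI agent performs the cleaning; this deterministic sketch, with a hypothetical `signature_marker`, only illustrates the idea (strip markup, drop the signature).

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, ignoring all tags."""
    def __init__(self) -> None:
        super().__init__()
        self.parts: list[str] = []

    def handle_data(self, data: str) -> None:
        if data.strip():
            self.parts.append(data.strip())

def clean_email_html(html_body: str, signature_marker: str = "Best regards") -> str:
    """Strip tags and cut everything from the signature marker onward."""
    extractor = TextExtractor()
    extractor.feed(html_body)
    text = " ".join(extractor.parts)
    cut = text.find(signature_marker)
    return text[:cut].strip() if cut != -1 else text

question = clean_email_html(
    "<html><body><p>Hi team,</p><p>How do I renew my certificate?</p>"
    "<p>Best regards,</p><p>Jane</p></body></html>"
)
```

An LLM-based cleaner handles the long tail (varied signatures, quoted threads, disclaimers) that fixed markers like this one miss, which is why Amdocs used an agent for this step.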
The KB AI Agent's responses often consisted of steps, so the agent instructions included formatting guidelines and examples of valid HTML responses. This follows the prompt engineering best practice of providing clear formatting instructions along with a few examples to achieve better response accuracy.
Security
Amdocs has a strict corporate policy and an application security review process. They configured Azure AI Agent Service and the other solution components with private networking to comply with the security standards. Private networking provides multiple security benefits, such as no public egress and private access to resources and data stores; in this context, the KB Agent accesses Azure AI Search privately during the retrieval phase.
Conclusion
Amdocs successfully built a RAG-enabled AI agent with Azure AI Agent Service. Using a managed service allowed them to move faster, without worrying about integrating with Azure AI Search, managing conversation history, or deploying and managing dedicated compute to run the agent and LLM. They used the GPT-4o model to power the AI agent.
Still, they had to build an advanced design and go through multiple development iterations to move from concept to a working agent capable of answering user support questions on time. They architected a closed feedback loop mechanism that captures critical quality information and improves agent accuracy over time. Finally, they had to choose the right services and configurations to keep the solution secure and compliant with Amdocs corporate security and networking policies.
Roadmap
Amdocs is already planning to advance the agent with additional features, such as processing screenshot attachments, supporting email conversations where the user can ask follow-up questions, and integrating with a third-party ticketing service so the agent can open a support ticket on the user's behalf when escalation is required to fix the issue.
Additional Resources
What is Azure AI Agent Service
Enabling SharePoint RAG with LogicApps Workflows