
July 10, 2025
What is Microsoft Learn Docs MCP Server?
The Microsoft Docs MCP Server is a cloud-hosted service that enables MCP hosts like GitHub Copilot and Cursor to search and retrieve accurate information directly from Microsoft’s official documentation. By implementing the standardized Model Context Protocol (MCP), this service allows any compatible AI system to ground its responses in authoritative Microsoft content.
The “Why”: The Initial Challenge of the AI Assistant
Our initial goal was simple: build a chat interface where a user could ask a question, and we would query this MCP server to get an answer. However, we quickly discovered a challenge. The MCP server is designed for AI agents; it doesn’t just return a single answer. Instead, it returns a rich payload of up to 10 high-quality content chunks from the documentation.
While this is fantastic for providing comprehensive context, it’s overwhelming for a direct chat response. Our app was successfully retrieving information, but it was just a firehose of data. The user was left with the difficult task of sifting through thousands of words to find their answer. This wasn’t a chatbot; it was just a complicated search bar.
We realized we needed to transform this wealth of data into a single, helpful response.
The “How”: A Two-Step Solution (Retrieve and Synthesize)
The solution was to architect our application around a two-step process, a pattern often referred to as Retrieval-Augmented Generation (RAG).
- Retrieve: First, connect to the specialized data source (the MCP Server) to fetch relevant, factual context.
- Synthesize: Then, provide that context to a general-purpose large language model (LLM) to generate a concise, human-readable answer.
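The two steps above can be sketched as a single composition. The function and parameter names here are hypothetical placeholders for the real MCP and Azure OpenAI calls shown later in the post:

```javascript
// Minimal shape of the two-step RAG flow. `retrieveContext` and
// `synthesizeAnswer` stand in for the actual MCP fetch and the
// Azure OpenAI call; they are injected so the pattern stays visible.
async function answerWithRag(question, retrieveContext, synthesizeAnswer) {
  // Step 1: Retrieve — fetch factual context from the specialized source.
  const context = await retrieveContext(question);
  // Step 2: Synthesize — let the LLM turn that context into a direct answer.
  return synthesizeAnswer(question, context);
}
```

Keeping the two concerns separate made each one easier to debug on its own.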
Step 1: The Retrieval Saga – Taming the MCP Server
This was the most challenging part of our journey. Connecting to the MCP server wasn’t straightforward, and it took several iterations of trial and error to get it right. We knew the endpoint was https://learn.microsoft.com/api/mcp, but the exact request format was a mystery we had to solve by carefully analyzing the server’s error messages.
Our attempts ranged from simple, direct MCP messages to various JSON-RPC structures. After a lot of debugging, we landed on the precise payload the server expected. The key was to format the request as a JSON-RPC call with a specific method (tools/call) and parameters that named the tool (microsoft_docs_search) and its arguments (question). Our AI Assistant is ready to work alongside Microsoft Learn MCP!
Working Examples
Here is the final, working code snippet from our Next.js API route (pages/api/mcp.js) that successfully retrieves the context:

```javascript
// --- STEP 1: RETRIEVE CONTEXT FROM MCP SERVER (using the final working logic) ---
const MCP_SERVER_URL = 'https://learn.microsoft.com/api/mcp';

// This is the successful payload structure we discovered.
const mcpPayload = {
  "jsonrpc": "2.0",
  "id": `chat-${Date.now()}`,
  "method": "tools/call",
  "params": {
    "name": "microsoft_docs_search",
    "arguments": {
      "question": userQuery // Using 'question' as the parameter name.
    }
  }
};

const mcpResponse = await fetch(MCP_SERVER_URL, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Accept': 'application/json, text/event-stream',
    'User-Agent': 'mcp-remote-client', // Adding the User-Agent.
  },
  body: JSON.stringify(mcpPayload),
});

// … code to parse the streaming response …
```

With this, we had successfully completed the “Retrieve” step. Our app was now a robust data-fetcher.
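The server answers with `text/event-stream`, so the JSON-RPC result arrives wrapped in Server-Sent Events `data:` lines rather than as plain JSON. A hedged sketch of the parsing step we elided above, assuming standard SSE framing:

```javascript
// Parse an SSE body and return the last JSON-RPC message it carries.
// Assumption: each event's payload sits on a single `data:` line, which is
// what we observed from the Learn endpoint; a production parser should also
// handle multi-line `data:` fields.
function parseSseResult(rawBody) {
  const events = rawBody
    .split('\n')
    .filter((line) => line.startsWith('data:'))
    .map((line) => JSON.parse(line.slice('data:'.length).trim()));
  // The final event should be the JSON-RPC response holding the tool result.
  return events.length > 0 ? events[events.length - 1] : null;
}

// Usage after the fetch above:
// const rpc = parseSseResult(await mcpResponse.text());
// const searchResult = rpc && rpc.result;
```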
Step 2: The Synthesis Engine – Making Sense of the Data with our AI Assistant
Now that we had our 10 chunks of documentation, we needed to add the “brain” to our operation. We chose an Azure OpenAI model for this task due to its powerful synthesis capabilities.
The process was as follows:
- Extract all the text from the search results returned by the MCP server.
- Combine this text into a single, large block of context.
- Create a carefully designed prompt that instructs the AI Assistant model to act as a Microsoft expert.
- Send the user’s original question along with the retrieved context to the Azure OpenAI API.
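The first three steps above can be captured in one small helper. The shape of the MCP result (a `content` array of `{ type, text }` chunks) is what we observed from the server, so treat it as an assumption rather than a documented contract:

```javascript
// Flatten the MCP search results into one context block and wrap it in the
// chat messages we send to Azure OpenAI.
function buildChatMessages(mcpResult, userQuestion) {
  const retrievedText = (mcpResult.content || [])
    .filter((chunk) => chunk.type === 'text')
    .map((chunk) => chunk.text)
    .join('\n\n---\n\n'); // separator between documentation chunks

  return [
    {
      role: 'system',
      content: 'You are an expert assistant. Generate answers based on the provided context.',
    },
    {
      role: 'user',
      content: `Context:\n${retrievedText}\n\nQuestion:\n${userQuestion}`,
    },
  ];
}
```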
The prompt is the most critical piece of this step, as it guides the AI Assistant to produce the desired output. Here’s what our prompt, and the call that sends it, looked like:
```javascript
const azureUrl = `${AZURE_OPENAI_ENDPOINT}/openai/deployments/${AZURE_OPENAI_DEPLOYMENT_NAME}/chat/completions?api-version=2025-01-01-preview`;

const aiResponse = await fetch(azureUrl, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'api-key': AZURE_OPENAI_KEY,
  },
  body: JSON.stringify({
    messages: [
      {
        role: 'system',
        content: 'You are an expert assistant. Generate answers based on the provided context.',
      },
      {
        role: 'user',
        content: `Context:\n${retrievedText}\n\nQuestion:\n${message}`,
      },
    ],
    max_tokens: 500,
    temperature: 0.7,
  }),
});

if (!aiResponse.ok) {
  const errorText = await aiResponse.text();
  console.error('Azure OpenAI Error:', errorText);
  throw new Error('Failed to get a response from Azure OpenAI.');
}
```

This prompt constrains the model to use only the official documentation we provided, ensuring the answers are factual and grounded in a reliable source.
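Once the call succeeds, the answer sits in the standard chat-completions response shape. A small helper keeps the route handler tidy; the field names follow the documented Azure OpenAI chat-completions format:

```javascript
// Pull the assistant's answer out of a chat-completions response body,
// returning null if the shape is not what we expect.
function extractAnswer(completion) {
  const choice = completion.choices && completion.choices[0];
  return choice && choice.message ? choice.message.content : null;
}

// Usage after the fetch above:
// const answer = extractAnswer(await aiResponse.json());
```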
The “What”: The Final Result of our AI Assistant
After implementing both steps, our application was complete. The user interacts with a clean, simple chat interface built with Next.js and React. When they ask a question:
- The Next.js backend silently queries the MS Learn MCP Server.
- It receives up to 10 articles of context.
- The backend passes that context and the original question to the Azure OpenAI API.
- The model returns a concise, summarized answer.
- This final answer is displayed to the user in the chat window.
What started as a data firehose was now an intelligent, conversational, and genuinely helpful AI assistant.
Conclusion
This project was a fantastic lesson in modern AI application development. It highlights a powerful pattern: using specialized, data-retrieval tools in tandem with large, general-purpose language models. The journey underscored the importance of persistence in debugging APIs and the art of crafting the perfect prompt. By combining the strengths of different services, we were able to build an application that is far more capable than the sum of its parts.
Git Repo
https://github.com/passadis/mslearn-mcp-chat
Microsoft Learn Docs MCP
https://github.com/MicrosoftDocs/mcp/