- Copilot Health launches, analyzing medical records, lab results, and wearable data.
- Designed to support doctor visits, not replace them. Insights clinically evaluated and labeled when AI-generated.
- Access to sensitive health data raises potential misuse and privacy risks.
Microsoft has introduced Copilot Health, a new feature designed to help people understand their health information by connecting the AI assistant to medical records, lab results, and wearable devices. While the company positions the tool as a way to simplify complex medical data, the idea of giving an AI system access to personal health records can also raise privacy concerns.
The feature is rolling out gradually, and users can sign up for a waitlist to try it early.
AI designed to explain your health data
Using Copilot Health, you can create a dedicated space inside the chatbot to ask questions about your health data. The system can analyze medical records, interpret lab results, and identify patterns from wearable devices.
According to Microsoft, Copilot Health can connect to health records from more than 50,000 US hospitals and healthcare organizations through HealthEx, to lab test results from Function, and to data from over 50 wearable devices, including Apple Health, Oura, and Fitbit.
Once connected, the assistant can explain confusing lab results, summarize medical information, or highlight trends such as sleep patterns or activity levels.
The platform can also help you find healthcare providers by connecting to real-time provider directories. You can search for doctors based on specialty, location, languages spoken, and insurance coverage.
Microsoft says the tool is designed to help patients better prepare for medical appointments. Copilot Health doesn’t replace your doctor.
Microsoft’s broader AI healthcare vision
The company says Copilot Health is part of a broader push to apply artificial intelligence to healthcare. Microsoft researchers are also working on projects such as the Microsoft AI Diagnostic Orchestrator, which aims to analyze complex medical cases and support clinical decision-making.
Microsoft describes its long-term goal as building systems that combine primary care physicians’ general knowledge with medical specialists’ deeper expertise.
However, the company says health insights will only be introduced after clinical evaluations and will clearly indicate when responses are generated by AI.
Privacy concerns around health data and AI
Despite Microsoft’s safeguards, connecting an AI chatbot to personal health information raises significant privacy questions.
Health data is among the most sensitive personal information users can share. Giving an AI assistant access to medical history, wearable device data, and lab results could create new risks if the information were misused or exposed.
The software giant says Copilot Health includes additional protections, including encryption for data stored and transmitted through the service, separate storage from standard Copilot chats, controls to disconnect health data sources at any time, and the ability to delete personal information. The company also states that Copilot Health is not used for model training.
However, as more tech companies build AI tools that work with medical information, concerns about how these systems handle personal health data will likely grow.

Microsoft is rolling out the new feature gradually, and it will appear as “Health” in the left pane of the Copilot home page.
