
Microsoft announced Copilot Health on Thursday, a “separate, secure space” in Copilot for asking questions about lab results and medical records, searching for providers, analyzing data from wearable devices, and other health-related chats. The feature will roll out in phases, so it won’t be available to everyone right away, but users can join the waitlist to gain access.
Microsoft says Copilot Health “does not replace your doctor” and is not intended to provide medical diagnosis or treatment, but rather to help users understand their health data. Users can import medical records from more than 50,000 US hospitals and healthcare organizations through HealthEx, and can import laboratory test results through the same feature. Copilot Health is also compatible with “more than 50 wearable devices,” including devices from Apple, Oura, and Fitbit. Depending on the data users choose to share, the Copilot Health home page can display data from wearable devices, such as current step counts, as well as reminders for upcoming appointments.
Users can also find medical professionals through Copilot Health. It’s connected to “real-time US provider directories” that can help users search for providers based on specialty, location, languages spoken, and insurance plans accepted.
In its press release about Copilot Health, Microsoft stated that it “improved the quality and reliability of answers by leveraging information from credible health organizations across 50 countries.” It also says responses in Copilot Health will include citations with links to sources and “answer cards written by experts from Harvard Health.”
User conversations in Copilot Health are “isolated from public Copilot and kept under additional access, privacy, and safety controls,” according to Microsoft. The company also says that data from Copilot Health chats is not used to train its AI models. Users can delete their health data or disconnect data sources at any time, such as turning off access to wearable data.
OpenAI launched a very similar feature in January called ChatGPT Health, which also provides an isolated environment for medical chats, encourages users to link their medical records, and does not use health chats for model training. However, Microsoft does not currently have a HIPAA-compliant version of Copilot Health, unlike ChatGPT Health and Amazon’s health AI, which was opened to more users on Tuesday. Anthropic’s Claude is likewise “HIPAA-ready.”
When asked about HIPAA compliance in a press conference prior to Thursday’s announcement, Dr. Dominic King, vice president of health at Microsoft AI, said: “HIPAA is not required for a direct consumer experience like this when you’re using your own data.” The Health Insurance Portability and Accountability Act includes security requirements to protect patients’ electronic health data and prohibits certain types of use and disclosure. HIPAA violators can face fines and possibly even prison sentences. Because companies like Microsoft are not legally required to comply with HIPAA, they are not subject to the consequences that a hospital or doctor might face for violating a patient’s HIPAA rights. “However, at Copilot, we believe it is very important that we meet the best standards out there,” King added. “So, we will be announcing some updates here on our position regarding so-called ‘HIPAA controls.’” King did not explain exactly what that entails.
King also noted that Copilot Health holds ISO 42001 certification. ISO 42001 is an independent international standard for artificial intelligence systems that aims to promote “the responsible use of artificial intelligence” as well as “traceability, transparency and reliability.” Microsoft 365 Copilot and Microsoft 365 Copilot Chat also hold this certification.
However, even with this certification and any future intention of voluntary HIPAA compliance, users may still want to be cautious about sharing their medical data with AI. As experts have pointed out, AI companies can change their data privacy policies at any time. AI also has a history of giving users inaccurate or unsafe medical advice, with a particularly poor track record when it comes to mental health.