OpenAI Launches ChatGPT Health in the US, Raising Privacy Concerns

OpenAI has introduced a new feature called ChatGPT Health in the United States. It allows users to share their medical records, along with data from fitness apps such as MyFitnessPal, Apple Health, and Peloton, to receive more personalized health advice.

According to OpenAI, conversations in ChatGPT Health are stored separately from other chats and will not be used to train its AI models. The company emphasized that the tool is designed to support, not replace, medical care, and is not intended for diagnosis or treatment.

The move has drawn both attention and criticism from privacy advocates. Andrew Crawford of the US non-profit Center for Democracy and Technology said it is "crucial" to maintain "airtight" safeguards around users' health information. He warned that as OpenAI explores advertising models, the separation between sensitive health data and other ChatGPT conversations must remain absolute.

OpenAI reports that more than 230 million people ask its chatbot questions about health and wellbeing each week. By integrating personal health data, ChatGPT Health aims to provide more relevant and personalized responses, according to a company blog post.

Max Sinclair, CEO of AI marketing platform Azoma, called the launch a “watershed moment,” describing ChatGPT Health as a potential trusted medical adviser that could reshape patient care and consumer health decisions. Sinclair also noted that the tool could give OpenAI a competitive edge against rival AI platforms such as Google’s Gemini.

Initially, ChatGPT Health will be available to a small group of early users in the US, with a waitlist open for those seeking access. The feature is not yet available in the UK, Switzerland, or the European Economic Area, where stricter rules govern the processing and protection of personal data.

Crawford emphasized that in regions without strong privacy protections, companies collecting health data must exercise extreme caution, as inadequate policies could put sensitive health information at serious risk.
