According to News Medical Life Sciences, ChatGPT is an artificial intelligence (AI) chatbot developed by OpenAI and launched in November 2022. ChatGPT can perform many tasks that humans can, which is both exciting and frightening. These tasks include writing articles, completing job applications, and composing jokes and poetry. ChatGPT has already passed law school and business school exams given at prestigious universities.
“ChatGPT is poised to disrupt numerous industries, and search engines, education, graphic design, research, healthcare, retail, banking, manufacturing, logistics, and travel are already identified as those that stand to benefit from the technology.”
News Medical Life Sciences states that ChatGPT will help healthcare professionals in many ways, including the remote management of healthcare, real-time clinical decision-making, drug management, medical record-keeping, real-time translation, clinical trials, and many other functions.
How could ChatGPT be used to help healthcare providers and patients?
Some of the possibilities for ChatGPT usage in the medical profession, according to Forbes, include personalized treatment plans, monitoring patients remotely, and the following:
- Telemedicine – virtual assistants. ChatGPT will be used to help patients schedule their appointments, manage their health information, and even receive treatments. Many patients are already using telemedicine to review their appointments, prescriptions, test results, and other aspects of their medical care.
- Clinical decision support. ChatGPT will be used to help doctors provide “real-time, evidence-based patient recommendations.” ChatGPT could review dangerous drug interactions, suggest possible treatments, and provide clinical guidelines.
- Recordkeeping. ChatGPT will help doctors and nurses dictate their notes so these healthcare providers can focus more on their patients. The AI software will also help summarize and analyze patient interactions and patients’ medical histories. ChatGPT should also make test results more readily available.
- Medical translation. ChatGPT can be used to “translate medical jargon, technical terms, and common expressions, allowing patients to understand their diagnosis, treatment options, and medication instructions.” (A brief sketch of what such a call might look like follows this list.)
- Managing medications. ChatGPT will help doctors keep track of the various medications their patients are taking and the dosages for each medication. The AI software should also help explain possible adverse reactions, side effects, drug interactions, contraindications, and other medication management issues.
- Disease surveillance. Doctors, researchers, and even ordinary citizens will be able to use ChatGPT to review global health data, which can provide “real-time insights into potential outbreaks and facilitate early response efforts.” The AI software can look for patterns and anomalies that indicate a new disease is spreading, or how quickly and where a known disease is spreading. Public health officials, doctors, and others should then be able to take a more measured response to this new information.
- Medical writing and documentation. ChatGPT could be used to help doctors and other healthcare professionals write and document their medical reports including clinical notes and discharge summaries – along with “real-time suggestions and corrections.”
- Clinical trial research. ChatGPT could be used to identify possible participants in clinical studies by analyzing large sections of patient data to find eligible participants. “By leveraging ChatGPT’s capabilities, clinical trial recruitment efforts can become more efficient, targeted, and effective in reaching diverse populations.”
- Checking for symptoms. ChatGPT can be used to help patients understand their symptoms and how serious they are, so patients can decide when medical care is necessary, what “self-care measures a patient can take before seeking medical attention, such as home remedies or over-the-counter medications,” and what other immediate steps are needed.
- Triage of patients. ChatGPT could be used to determine which patients require priority care (including the need for surgery) by asking the patients questions about their medical history and their symptoms.
- Medication information. As discussed in the medication management section, ChatGPT can be used to identify possible drug side effects, consequences, and contraindications. Patients will be able to ask questions in layman’s terms, and ChatGPT can respond “with accurate and timely information, helping patients make informed decisions about their medications.” ChatGPT should also be able to quickly provide details about the “proper dosage, administration, and storage of medications, as well as potential alternatives for patients who are allergic or intolerant to specific prescriptions.” ChatGPT will also help doctors stay current on new drugs, drug recalls, and other critical pharmaceutical information.
- Medical education. ChatGPT will help students, doctors, and other healthcare professionals obtain medical information and resources to help with their studies and with ongoing medical learning.
- Mental health support. ChatGPT “can be used to provide behavioral health support to patients, including screening for mental health conditions, offering coping strategies, and connecting patients with resources for further support.”
- Remote monitoring of patients. ChatGPT will work with patients who are wearing medical devices and sensors to help doctors and healthcare providers have access to real-time patient information – to alert the doctors/providers that a patient is in danger or has health issues that should be addressed. Early intervention should lead to better health outcomes.
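To make the medical translation use case above more concrete, here is a minimal sketch of how a developer might ask a ChatGPT model to restate clinical jargon in plain language through OpenAI’s official Python library. The model choice, prompt wording, and helper function are illustrative assumptions, not a recommended or production-ready implementation, and no real patient data should be sent this way without the privacy safeguards discussed below.

```python
# Minimal sketch: asking a ChatGPT model to restate clinical jargon in
# plain language. Assumes the official `openai` Python package (v1.x)
# and an OPENAI_API_KEY environment variable. The model name and prompt
# are illustrative; do not send real patient data without appropriate
# privacy and security safeguards.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def explain_in_plain_language(clinical_text: str) -> str:
    """Return a patient-friendly restatement of a clinical note."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any chat model works
        messages=[
            {"role": "system",
             "content": ("You translate medical jargon into plain English "
                         "for patients. Do not add diagnoses or medical advice.")},
            {"role": "user", "content": clinical_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    note = "Pt presents with acute exacerbation of COPD; start albuterol PRN."
    print(explain_in_plain_language(note))
```

Even this small sketch raises the compliance questions discussed in the next section: the clinical note leaves the provider’s systems the moment the API call is made.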
Medical professionals and hospitals will need to keep current with the latest AI tools to better serve their patients.
What are some of the concerns about the use of ChatGPT in the healthcare sector?
There are many legal and ethical issues involved with the use of ChatGPT. Some current state and federal laws already apply to the use of any type of artificial intelligence in the healthcare sector, and many more laws and regulations will likely be enacted as the use of ChatGPT and AI expands and the consequences of using this software become better understood.
- Confidentiality. Much of the data involved with ChatGPT comes directly from patients. There are already laws that protect patient information, such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and various other federal and state laws. In addition, state medical agencies regulate the use and storage of patient information. The manufacturers of ChatGPT software and the doctors who recommend and use the AI software will need to ensure that patient information is private, secure, and handled in compliance with the applicable laws. (A simplified sketch of one data-scrubbing precaution appears after this list.)
- Patients will need to be consulted about how their medical information is used. Generally, patients will likely have to “opt-in” for the AI software to be able to use a patient’s data.
- Healthcare providers will need to understand when their patient’s information is being used by ChatGPT and what information the providers need to explain to their patients about how ChatGPT may use their patient data. Many medical practices may need to update their HIPAA forms and their relationships with their various healthcare vendors.
- Accuracy. The software needs to be tested for accuracy. Patients’ lives are at risk, so the information being analyzed and generated must be correct.
- The unauthorized practice of medicine. On the surface, the ability of ChatGPT to be used for clinical trials and medical research sounds like a worthy goal – but ultimately doctors and licensed healthcare professionals need to make medical decisions for their patients. The line between gathering and analyzing data and making medical decisions is likely to become a very fine one – and the subject of further laws and regulations about the medical and legal limits of ChatGPT.
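To illustrate the confidentiality point in the list above, the sketch below shows one simplistic way a developer might scrub obvious identifiers from a note before sending it to an outside AI service. The patterns and placeholders are illustrative assumptions only; genuine HIPAA de-identification (under the Safe Harbor or Expert Determination methods) requires far more than a handful of regular expressions.

```python
# Simplistic illustration of scrubbing obvious identifiers from text
# before it leaves the provider's systems. This is NOT a HIPAA-compliant
# de-identification method: HIPAA's Safe Harbor standard covers 18
# identifier categories, including names, which regexes alone cannot catch.
import re

REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # Social Security numbers
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # US phone numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email addresses
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),   # slash-formatted dates
]

def scrub(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

if __name__ == "__main__":
    note = "Jane Doe, DOB 04/12/1988, SSN 123-45-6789, called 555-867-5309."
    print(scrub(note))  # the name "Jane Doe" survives, showing the limits of this approach
```

Note that the patient’s name survives the scrub, which is exactly why simple pattern matching falls short of HIPAA’s standards and why developers and providers need legal guidance on de-identification.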
ARTIFICIAL INTELLIGENCE AND THE HEALTH INSURANCE PORTABILITY AND ACCOUNTABILITY ACT OF 1996 (HIPAA)
AI in the healthcare industry, almost by definition, requires constant access to patient information. Developers and covered health providers need to understand when and how HIPAA applies to their […]
Other legal issues regarding ChatGPT may involve:
- Intellectual property issues
- Bias issues
- FDA compliance – there are already FDA regulations regarding the use of artificial intelligence as medical devices and for medical treatment. FDA oversight of ChatGPT is just beginning and is likely to expand.
- Many other developing issues
ChatGPT offers many possibilities and promises for providing quality healthcare to patients and allowing doctors to provide better, more advanced care. These AI services aim both to help doctors communicate better with their patients and to improve patient outcomes. Both the developers of new software and the medical practices that use ChatGPT will need to understand the technology’s medical and legal limitations, including patient privacy, data security, the unauthorized practice of medicine, and many other issues.
FDA POLICY ON WHEN SOFTWARE THAT USES ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING QUALIFIES AS A MEDICAL DEVICE
As more artificial intelligence products become available, the FDA is reevaluating its approval criteria. For now, De Novo classification and 510(k) clearance are being used along with premarket approval (PMA).
Physicians and developers should contact Cohen Healthcare Law Group, PC to discuss the legal and ethical concerns about using ChatGPT. Our experienced healthcare attorneys advise physicians and developers about healthcare compliance laws and regulations.

Contact our healthcare law and FDA attorneys for legal advice relevant to your healthcare venture.
Contact Us
