Lemmy, I really would like to hear your opinions on this. I am bipolar. After almost a decade of being misdiagnosed and on medication that made my manic symptoms worse, I found stable employment with good insurance and have been able to find a good psychiatrist. I’ve been consistently medicated for the past 3 years, and this is the most stable I have been in my entire life.
The office has rolled out the use of an app called MYIO. My knee-jerk reaction was to be unhappy about it, but I managed my emotions, took a breath, and vowed to give it a chance. After I was sent the link to validate my account, the app would force-restart my phone at the last step of activation. (I have my phone locked down pretty tight, and lots of Google shit and data sharing are disabled, so I’m thinking that might be the cause. My phone is also like 4-5 years old, so that could be it too.)
Luckily I was able to complete the steps on PC and activate that way. Once I was in the account, there were standard forms to sign, like the HIPAA release. There was also a form requesting that I consent to the use of AI. Hell to the NO. That’s a no for me dawg.jpg.
I’m really emotional and not thinking rationally. I am hoping for the opinions of cooler heads.
If my doctor refuses to keep me as a patient unless I consent to AI, what should I do? What would you do? Refuse, even though losing them would hurt, or consent despite this being a major line in the sand for me, to keep a provider I have a rapport with, who knows me well enough to know when my meds need adjusting?
EDIT: This is the text of the AI agreement. As part of their ongoing commitment to provide the best possible service, your provider has opted to use an artificial intelligence note-taking tool that assists in generating clinical documentation based on your sessions. This allows for more time and focus to be spent on our interactions instead of taking time to jot down notes or trying to remember all the important details. A temporary recording and transcript or summary of the conversation may be created and used to generate the clinical note for that session. Your provider then reviews the content of that note to ensure its accuracy and completeness. After the note has been created, the recording and transcript are automatically deleted.
This artificial intelligence tool prioritizes the privacy and confidentiality of your personal health information. Your session information is strictly used for the purpose of your ongoing medical care. Your information is subject to strict data privacy regulations and is always secured and encrypted. Stringent business associate agreements ensure data privacy and HIPAA compliance.
Edit 2: I just wanted to say that I appreciate everyone here who commented. For the most part everyone brought up valid points and helped me see things I had not considered. I emailed my doctor and let them know I did not want to agree to the use of AI. I said I was fine with transcription software being used as long as it was installed locally on their machines, but I did not want a third-party online app having access to recorded sessions for the purposes of transcription. They didn’t take issue with it.
Thank you everyone!


No, I would not keep seeing a doctor who demanded I consent to AI to continue being a patient.
Just as I’d stop seeing a doctor who said they now want to ban all vaccines. Or who demands that patients go to their specific church. Or requires that we give blood.
A doctor who asks to use it, however, is entirely tolerable. I don’t mind if my doctor has an AI model look at my MRI scan to help find precancerous growths, and neither should you. But “talk to this chatbot instead of a triage nurse” is an unacceptable level of care.
I edited my post with the text of the agreement they wanted me to sign. It sounds like the AI is for “note taking”. I wouldn’t be having sessions with AI. The AI would record my sessions and make a transcript, then the doctor would review/edit the generated note, and the recording would be deleted. I don’t like the idea of my sessions being recorded, and I don’t trust that the app will actually delete the recordings. Companies lie all the time and get away with it because the punishment is just rolled into the cost of doing business.
I think the first step here is to talk to your doctor about it. Say you’re uncomfortable and you don’t want to use it. If they are a decent person, as you say, they should accommodate that. Have you actually talked to them about it? Or are you just assuming they will try to force you to agree?