As a medical doctor, I use digital voice recorders extensively to document my work. My secretary does the transcription. As a cost-saving measure, this process is soon to be replaced by AI-powered transcription, trained on each doctor’s voice. As I understand it, the resulting model is not stored locally, and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course, I understand there are probably enough other recordings of me on the Internet to recreate my voice already, but that’s beside the point. Also, the question is about educating them, not a legal one.

How do I present my case? I’m not willing to use a non-local AI to transcribe my voice, but I don’t want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately, they are totally ignorant of the field of technology, and the explanations/examples need to translate to the layperson.

  • tonyn@lemmy.ml · 10 months ago

    Stop using the digital voice recorder and type everything yourself. This is the best way to protect your voiceprint in this situation. It doesn’t work well as a protest or to educate your colleagues, but I suppose that’s one thing you can use your voice for. Since AI transcription is a cost-saving measure, there will be nothing you can do to stop its use. No decision maker will choose the more expensive option with a higher error rate on morals alone.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      Unfortunately the interface of the medical records system will be changed when this is implemented. The keyboard input method will be entirely removed.

      • ubergeek77@lemmy.ubergeek77.chat · 10 months ago

        Even if this gets implemented, I can’t imagine it will last very long with something as completely ridiculous as removing the keyboard. One AI API outage and the entire office completely shuts down. Someone’s head will roll when that inevitably happens.

        • FlappyBubble@lemmy.ml (OP) · 10 months ago

          Ah sorry, I mean removing the option of using the keyboard as an input method in the medical records system. The keyboard itself isn’t physically removed from the computer clients.

          But I agree that in the event of a system failure the hospital will halt.

          • ubergeek77@lemmy.ubergeek77.chat · 10 months ago

            Also, one angle could be to get the permission of someone in leadership, clone their voice on ElevenLabs, and make the clone say something particularly problematic, just to stress how easily voice data can be misused.

            If this AI vendor is ever breached, all they have to do is robocall patients pretending to be a real doctor they know. I don’t think I need to spell out how poorly that would go.

  • Boozilla@lemmy.world · 10 months ago

    Will they allow you to use your own non-cloud solution? As long as you turn in text documents and they don’t have to pay a person to transcribe, they should be happy. There are a number of speech-to-text apps you can run locally on a laptop, phone, or tablet.
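
    For example, here’s a minimal sketch using OpenAI’s open-source Whisper model, which runs entirely offline once the model weights are downloaded (the file name is a placeholder):

    ```python
    # Local speech-to-text with the open-source Whisper model.
    # pip install openai-whisper
    # Runs fully offline after the model weights are downloaded,
    # so no audio ever leaves the machine.
    import whisper

    model = whisper.load_model("small")         # larger models = better accuracy
    result = model.transcribe("dictation.wav")  # placeholder file name
    print(result["text"])
    ```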

    But of course, it’s sometimes about control and exercising their corporate authority over you. Bosses get off on that shit.

    Not sure which type of doctor you are, but there’s a general shortage of NPI people. I hope you can fight back with some leverage. Best of luck.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented, my only input device for the medical records will be the AI transcriber (stupidity).

      I’m a psychiatrist in the field of substance abuse and withdrawal. Sure, there’s a shortage of us too, but I want the hospital to understand the problem, not just let me keep an old-school secretary because I threatened to go to another hospital.

      • wizardbeard@lemmy.dbzer0.com · 10 months ago

        my only input device to the medical records will be the AI transcriber

        I understand that you keep steering away from legal arguments, but that can’t be legal either. How could a doctor not have direct, manual access to patient records?

        Anyway, practical issues:

        You need some way to manually interact with patient records for the inevitable event that the AI transcription gets it wrong. It only takes one botched transcription of something critical and you have a fucking body on your hands. Is your hospital prepared to give patients the wrong dosages because background noise or someone else speaking makes the AI mishear? Who would be held responsible in a case of mistreatment due to mistranscription? Is your hospital willing to be one of the first to try to tackle that legal rat’s nest?

        A secretary is able to sanity-check that what they heard makes sense. AI transcription has no such logic behind it. It will turn whatever it thinks it heard into text and chuck it wherever it logs to. It thinks you’ve called for leeches when you said something about lesions? Have fun.
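
        To make that concrete: any such safety net would have to be bolted on after the fact. Here’s a hypothetical sketch of the kind of check a human does instinctively, flagging transcribed words that are suspiciously close to known drug names (the formulary and threshold are made up for illustration):

        ```python
        # Hypothetical post-transcription sanity check: flag tokens that
        # nearly match a known drug name but don't match it exactly.
        # The formulary and cutoff below are illustrative, not a real list.
        import difflib

        FORMULARY = {"diazepam", "lorazepam", "naltrexone", "buprenorphine"}

        def flag_near_misses(transcript, cutoff=0.8):
            flagged = []
            for word in transcript.lower().split():
                token = word.strip(".,;:")
                if token in FORMULARY:
                    continue  # exact match, assume it was heard correctly
                close = difflib.get_close_matches(token, FORMULARY, n=1, cutoff=cutoff)
                if close:
                    flagged.append((token, close[0]))  # near miss: needs human review
            return flagged

        print(flag_near_misses("Start diasepam 10 mg twice daily"))
        # [('diasepam', 'diazepam')]
        ```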

        You’d also be screwed whenever there’s an issue with the transcription service. A network outage, a power outage, a broken microphone, any failing part of this equipment, and the whole system falls apart.

        • FlappyBubble@lemmy.ml (OP) · 10 months ago

          The problem of incorrect transcription exists with my secretary too. In the system I work in, the secretary types up my recording and sends it to me, and I read it. I can edit the text at that point and then digitally sign it with a personal private key. This usually happens at least a day after the recording is made. All prescriptions and orders to my nurses are given in another system, separate from the raw text in the medical records. I can’t easily explain the practical workings, but I really don’t see that the AI system will introduce more errors.

          But I agree that in the event of a system failure, there will be a catastrophic situation.

      • Boozilla@lemmy.world · 10 months ago

        I was afraid that might be the case. Was hoping they would let you upload the files as if you had typed them yourself.

        Maybe find some studies or articles on transcription bots getting medical terminology and drug names wrong. I’m sure that happens. AI is getting scary good, but it’s far from perfect, and this is potentially a low-probability-but-dangerous-consequences kind of scenario. Unfortunately, the marketers of this software probably have canned responses to these types of concerns. Management is going to hear what they want to hear.

        • FlappyBubble@lemmy.ml (OP) · 10 months ago

          Thanks for the advice, but I’m not against AI models transcribing me, just a cloud model specifically trained on my voice without any control by me. A local model, or even better a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.

          • Boozilla@lemmy.world · 10 months ago

            I understand, and we’re basically on the same page. I’m not fully anti-AI, either. Like any tool, it can be used for good or evil. And you are right to have concerns about data stored in the cloud. The tech bros will mock you for it and then… oh look, another data breach, has it been five minutes already? :)

            • FlappyBubble@lemmy.ml (OP) · 10 months ago

              Yes, I agree. Broadening the scope a little: frankly, I’m just waiting for a big leak of medical records. The system we use is a bird’s nest of different software, countless APIs, and all sorts of database backends. Many systems stem from the MS-DOS era, just embedded in a somewhat more modern integrated environment. There are just so many flaws that I’m amazed a leak hasn’t happened (or at least surfaced) yet.

  • Boozilla@lemmy.world · 10 months ago

    I had another idea. You might be able to use something that distorts your voice so that it doesn’t sound anything like you, but the AI can still transcribe it to text. There are some cheap novelty devices on Amazon that do this, and also some more expensive pro audio gear that does the same thing. Just a thought.
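
    If you’d rather experiment in software than buy hardware, here’s a rough sketch using the librosa audio library (the file names and shift amount are placeholders, and note that simple pitch shifting is weak anonymization against serious speaker identification, so treat it as a demo rather than real protection):

    ```python
    # Rough sketch: pitch-shift a dictation so it no longer sounds like
    # the speaker, while (hopefully) staying intelligible to a transcriber.
    # pip install librosa soundfile
    import librosa
    import soundfile as sf

    y, sr = librosa.load("dictation.wav", sr=None)              # placeholder file
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=5)  # up 5 semitones
    sf.write("dictation_shifted.wav", shifted, sr)
    ```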

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      Sure, but what about my peers? I want to get the point across and build an understanding of the privacy implications. I’m certain this is just the first of many reforms made without proper analysis of the privacy implications.

      • Possibly linux@lemmy.zip · 10 months ago

        Honestly, I would be way more concerned about your patients’ privacy. You shouldn’t just ship medical data to some third party. That leads to massive data breaches.

        • Boozilla@lemmy.world · 10 months ago

          I agree with you, but that ship has sailed. I work with big medical data, and it’s shocking what gets stored and passed around. The really big players, like PBMs and major insurance providers, are supposed to abide by HIPAA, but they don’t fear enforcement at all. Only the small fish, like doctors, need fear HIPAA.

    • Possibly linux@lemmy.zip · 10 months ago

      Honestly, voice cloning is the least of your concerns, as you are sending people’s private information to the cloud.

  • stevedidwhat_infosec@infosec.pub · 10 months ago

    Simple jobs are going to continue to go away in favor of more efficient spending.

    You’re not going to get around the removal of simple jobs from the market in favor of newer concepts and more complex operations.

    All these people who said going to college to further your education was stupid and a waste of money are going to be the first to bitch and moan, while the rest of us who spent the time and money to better ourselves would like to push that same logic out into the world, so you don’t have to worry about things like underpaid fast-food workers spitting in your food, delivery drivers stealing your food, etc.

    Some people who can only do “simple” tasks are the ones who stand to be hurt the most by the world moving forward and becoming more advanced and complex, but I’m not sure what we can do to help them short of seriously considering UBI. The wealth we are generating and saving through automation deserves to be spread equally among the people it replaced. That’s fair.

    • off_brand_@beehaw.org · 10 months ago

      I think it was pretty clear the issue was one of privacy requirements, not any qualms about losing jobs, which isn’t even happening here.

      • stevedidwhat_infosec@infosec.pub · 10 months ago

        They do pretty specifically mention the using-their-own-voice thing; good point.

        However, I’d like to remind everyone that recording you while you’re in public is done, and done very frequently (look at all the whistleblower docs), so imo it’s really moot whether or not recordings of your voice exist.

        And everything else I said still stands. Idgaf about the doctor, who still goes home with one of the highest salaries in the public sector. Personally, I think medical practitioners should work for the state or the government, basically becoming servants of the public. Imo doctors should be held to the same public scrutiny, but that’s a diff topic.

        • off_brand_@beehaw.org · 10 months ago

          Not in public, though. This is a conversation with a healthcare provider, not with your partner while you’re at the grocery store. You have a legally recognized right to privacy (at least in the US) when it comes to your health details.

          Which is an unequivocally good thing.

  • small_crow@lemmy.ca · 10 months ago

    I assume you’ll be using Dragon Medical One. Nuance is a well established organization, with users in a broad range of professions, and their medical product is extensively used by many specialists. The health system where I live has been in the process of phasing out transcriptionists in favor of it for a decade or so.

    The only potential privacy concerns a hospital would care about would be if they are storing your transcripts on their servers, because that will contain sensitive information about patients. It will be impossible to get any administrator to care about your voice data.

    This is unlikely to be a tide you can stem, but you could stop dictating and type it yourself.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      I’m not sure exactly which service will be used. I won’t be able to type, as the IT environment is tightly controlled and they will even remove the keyboard as an input method for the medical records system.

  • Norgur@kbin.social · 10 months ago

    Okay, so two questions:

    1. Are you in a country that falls under the GDPR?
    2. Is this training data to be per person?
    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      I work in Sweden, and it falls under the GDPR. There probably are GDPR implications, but as I wrote, the question is not a legal one. I want my bosses to be aware of the general issue, and this is but the first of many similar problems.

      The training data is to be per person, resulting in a model tailored to every single doctor.

      • Norgur@kbin.social · 10 months ago

        I think you can use the GDPR to your advantage here. Someone has to have tried this, right? So they could put in a GDPR request, demanding all the data stored about them.

  • DontMakeMoreBabies@kbin.social · 10 months ago

    You’re going to lose this fight. Admin types don’t understand technology and, at this point, I imagine neither do most doctors. You’ll be a loud minority, because your concerns aren’t concrete enough and ‘AI is so cool. I mean, it’s in the news!’

    Maybe I’m wrong, but my organization just went full ‘we don’t understand AI so don’t use it ever,’ which is the other side of the same coin.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      I understand the fight will be hard, and I’m not getting into it if I can’t present something they will understand. I’m definitely in a minority, both among the admin staff and among my peers, the doctors. Most are totally ignorant of the privacy issue.

    • 520@kbin.social · 10 months ago

      Not OP, but if I were him/her: leakage of patient data. Even if OP isn’t responsible, simply being tied to an incident like this can look very bad in fields that rely heavily on reputation.

      AI models are known to leak this kind of information; there are news articles all over.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      My biometric data, in this case my voice: an AI trained and tailored to my voice, out of my control, hosted as a cloud solution.

      Of course there is an aspect of patient confidentiality too, but that battle is already lost. The data in the medical records is already hosted outside my hospital.

      • SheeEttin@programming.dev · 10 months ago

        Sounds like a weak argument. They’re not going to be inclined to operate a local ML system just for one or two people.

        I would see if you can get a quote for locally-hosted transcription software you can run on your own, like Dragon Medical. Maybe reach out to your IT department to see if they already have a working relationship with Nuance for that software. If they’re willing to get you started, you can probably just use that for dictation and nobody will notice or care.

  • privsecfoss@feddit.dk · 10 months ago

    I don’t know where you live, but almost all US big-tech cloud is problematic (read: illegal to use) for storing or processing personal information under the GDPR if you’re based in the EU. I don’t know about HIPAA and other non-EU legislation, but almost all cloud services use US big tech as a subprocessor under the hood. This means that the use of AI and cloud here is most likely not GDPR-compliant, which you could mention to the right people and hope they listen.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      I agree, and I suspect this planned system might get scuttled before release due to legal problems. That’s why I framed the question in a non-legal way. I want my bosses to understand the privacy issue, both in this particular case and in future cases.

    • pearsaltchocolatebar@discuss.online · 10 months ago

      You don’t have to use a cloud service to do AI transcription. You don’t even need to use AI. Speech-to-text has been a thing for 30+ years.
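
      For instance, offline recognition predates the current AI wave entirely. Here’s a minimal sketch using the SpeechRecognition package with the old CMU Sphinx engine (the file name is a placeholder):

      ```python
      # Minimal offline speech-to-text with CMU Sphinx, an engine that
      # predates deep learning; nothing is sent over the network.
      # pip install SpeechRecognition pocketsphinx
      import speech_recognition as sr

      recognizer = sr.Recognizer()
      with sr.AudioFile("dictation.wav") as source:  # placeholder file name
          audio = recognizer.record(source)
      print(recognizer.recognize_sphinx(audio))
      ```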

      Also, AWS has a FedRAMP-authorized GovCloud that’s almost certainly compliant with HIPAA (and its non-US counterparts).

      Also also, there are plenty of cloud based services that are HIPAA compliant.

  • Possibly linux@lemmy.zip · 10 months ago

    Tell them they either have a local person transcribe or you will have no choice but to step down. Tell them that the cloud is no place for medical data. It would also be a bonus if you could get a bunch of your coworkers on board.

  • Ironically, GPT can kinda get you started here…

    To present your case effectively to your bosses and colleagues, focus on simplifying the technical aspects and emphasizing the potential risks associated with using a cloud-based AI transcription service:

    1. Privacy Concerns: Explain that using a cloud-based solution means entrusting sensitive biometric data (your voice) to a third-party provider. Emphasize that this data could potentially be accessed or misused without your consent.

    2. Security Risks: Highlight the risks of data breaches and unauthorized access to your voice recordings stored in the cloud. Mention recent high-profile cases of data breaches to illustrate the potential consequences.

    3. Voice Cloning: Explain the concept of voice cloning and how AI algorithms can be trained to mimic your voice using the data stored in the cloud. Use simple examples or analogies to illustrate how this could be used for malicious purposes, such as impersonation or fraud.

    4. Lack of Control: Stress that you have no control over how your voice data is used or stored once it’s uploaded to the cloud. Unlike a local solution where you have more oversight and control, a cloud-based service leaves you vulnerable to the policies and practices of the provider.

    5. Legal and Ethical Implications: While you acknowledge that there may be existing recordings of your voice online, emphasize that knowingly contributing to the creation of a database that could potentially be used for unethical or illegal purposes raises serious concerns about professional ethics and personal privacy.

    6. Alternative Solutions: Suggest alternative solutions that prioritize privacy and security, such as using local AI transcription software that does not upload data to the cloud or implementing stricter data protection policies within your organization.

    By framing your concerns in terms of privacy, security, and ethical considerations, you can help your bosses and colleagues understand the potential risks associated with using a cloud-based AI transcription service without coming across as paranoid. Highlighting the importance of protecting sensitive data and maintaining control over personal information should resonate with individuals regardless of their level of technical expertise.

  • lorty@lemmygrad.ml · 10 months ago

    This is really weird. Is it common in other countries for doctors not to input the data into the system themselves?

  • umami_wasabi@lemmy.ml · 10 months ago

    So what’s your concern? I’m a bit confused.

    1. Using cloud to process patient data? Or,
    2. Collecting your voice to train a model?
    • DessertStorms@kbin.social · 10 months ago

      Yeah, I’d be sooooo confident and reassured if I knew my doctor was prioritising the security of their voice over the security of my information… /s

      (yes, it can be both, but this post seems not at all concerned with one, and entirely with the other)

  • ZILtoid1991@kbin.social · 10 months ago

    1. Go to the Minecraft servers of OpenAI and similar corporations.
    2. Find a room called “AI server room”, all while avoiding or defeating the mobs protecting the area.
    3. Destroy everything there.
    4. Go to the offices.
    5. Destroy everything there.
  • datavoid@lemmy.ml · 10 months ago

    Personally, I’d be more worried about leaking patient information to an uncontrolled system than about having a voice model made.

    • FlappyBubble@lemmy.ml (OP) · 10 months ago

      That’s another issue, and it doesn’t lessen the importance of this one. Both are important but separate: one is about patient data, the other about my voice model. Also, in this case I have no control over the medical records, and they’re already stored outside the hospital.