• 👍Maximum Derek👍@discuss.tchncs.de

    If you train something off the internet it’s bound to come out a bit racist. And I like to think that, thanks to me, it’s also slightly biased against people who put ranch dressing on pizza.

    • Karlos_Cantana@sopuli.xyz

      I hope you get banned for your hateful and bigoted comments. I also hate ranch on pizza, but I care about the underrepresented class of people who do. They are humans and deserve all the same rights as you or I. I am appalled that this kind of blatant hatred still exists in 2023. You, sir (or ma’am, or whatever pronoun you prefer), are a loathsome person and I’m ashamed to be in the same species as you.

      • java@beehaw.org

        I hope you get banned for your hateful and bigoted comments. I also hate people who hate ranch on pizza, but I care about the underrepresented class of people who do. They are humans and deserve all the same rights as you or I. I am appalled that this kind of blatant hatred still exists in 2023. You, sir (or ma’am, or whatever pronoun you prefer), are a loathsome person and I’m ashamed to be in the same species as you.

    • Luke_Fartnocker@lemm.ee

      I completely disagree with you. Maybe it’s because I’m old, but I don’t want any damned racist robot doctor telling me what to do. I just want my good old human, racist doctor treating me, like God intended.

  • Fizz@lemmy.nz

    Ok, but you could find studies showing that doctors, or any staff, perpetuate racism. It seems like it would be less offensive coming from a computer.

  • ailiphilia@feddit.it

    It doesn’t appear to be limited to racism.

    “Humans inherit artificial intelligence biases”:

    Artificial intelligence recommendations are sometimes erroneous and biased. In our research, we hypothesized that people who perform a (simulated) medical diagnostic task assisted by a biased AI system will reproduce the model’s bias in their own decisions, even when they move to a context without AI support. In three experiments, participants completed a medical-themed classification task with or without the help of a biased AI system. The biased recommendations by the AI influenced participants’ decisions. Moreover, when those participants, assisted by the AI, moved on to perform the task without assistance, they made the same errors as the AI had made during the previous phase. Thus, participants’ responses mimicked AI bias even when the AI was no longer making suggestions. These results provide evidence of human inheritance of AI bias.

  • intensely_human@lemm.ee

    The photo here is hilarious. It’s like the computer just said something horrible and they’re both trying to wrap their heads around it.