with the way AI is improving by the week, it just might be a reality

  • peto@lemm.ee · 8 months ago

    I think the difference is that I find ‘human’ to be too narrow a term; I want to extend basic rights to all things that can experience suffering. I worry that such an experience is part and parcel of general intelligence, and that we will end up hurting something that can feel because we consider it a tool rather than a being. Furthermore, I think the onus must be on the creators to show that their AGI is actually a p-zombie. I appreciate that this might be an impossible standard; after all, you can only really take it on faith that I am not one myself. But I think I’d rather see a p-zombie go free than accidentally cause undue suffering to something that can feel it.

    • lol3droflxp@kbin.social · 8 months ago

      I guess we’ll benefit from the fact that AI systems, despite their reputation as black boxes, are still far more transparent than living things. We will probably be able to check whether they meet definitions of suffering, and if they do, that’s a bad design. If it comes down to it, though, an AI will always be worth less than a human to me.