Art by smbc-comics

Consciousness is often said to disappear in deep, dreamless sleep. We argue that this assumption is oversimplified. Unless dreamless sleep is defined as unconscious from the outset, there are good empirical and theoretical reasons for saying that a range of different types of sleep experience, some of which are distinct from dreaming, can occur in all stages of sleep.

PubMed article: Does Consciousness Disappear in Dreamless Sleep?

ScienceAlert article: We Were Wrong About Consciousness Disappearing in Dreamless Sleep, Say Scientists

  • ArbitraryValue@sh.itjust.works · 1 year ago

    A human being is a process of computation. Ending the computation is death. Pausing the computation is, well, simply pausing the computation. It has no profound significance.

    (This is also my answer to the “teleporter problem.” As long as the computation continues, a change in the substrate on which it takes place also has no profound significance.)

    • query@lemmy.world · 1 year ago

      And if the teleportation process doesn’t terminate the original, but creates a copy on the other end, are they both the same person?

      • ArbitraryValue@sh.itjust.works · 1 year ago

        Creating and destroying perfectly identical copies of the information that corresponds to a person neither creates nor destroys people unless the very last copy of that information is destroyed, in which case the person is killed.

        Small divergences aren’t a big deal. For example, if a person spends an hour under the effect of an anesthetic (or alcohol) which prevents the formation of new long-term memories, this person isn’t dying when he goes to sleep and wakes up without any memories of that last hour.

        Larger divergences are a big deal - losing a year of memories is pretty bad, losing a decade is even worse, and having one’s mind returned to the blank slate of an infant is very close to the same thing as dying.

        So what I’m saying is that the two copies start out as the same person and then gradually become different people.

        • Anduin1357@lemmy.world · 1 year ago

          I would argue that two disconnected copies of the information that corresponds to a person do make two disjoint persons.

          Like running procedural generation with a different seed, entropy will ensure that these two initially identical persons won’t stay identical after a few ticks of the biological clock.

          • ArbitraryValue@sh.itjust.works · 1 year ago

            I agree that the copies will diverge almost instantly; I’m just saying that small amounts of divergence aren’t a big deal. That’s what I’m trying to illustrate with my example of the person who loses an hour of memories. I think this is exactly equivalent to making a copy, having that copy exist for an hour, and then destroying it. An hour of memories does make the copy different from the original, but the loss of the copy is just the loss of that hour, not of a complete human being (and we naturally quickly forget much more than that - I already can’t remember what I did every hour yesterday).

            I admit it doesn’t feel exactly equivalent to me, but I think that’s an illusion caused by my moral intuitions developing in a world where destroying a copy always means destroying the only copy.

            • Anduin1357@lemmy.world · 1 year ago

              Though the simpler explanation is that perhaps memory formation is just paused over the period in which the person ‘lost’ their memory to sleep.

              Losing memories when you’re wide awake is like a file system deleting pointers to a file. The file is still there, just inaccessible.

              Anyway, I feel that the assertion that “creating and destroying perfectly identical copies of the information that corresponds to a person neither creates nor destroys people” is extremely dangerous thinking that could lead to the premature end of consciousness for some very unfortunate individuals. After all, the copies are perfectly identical, yet we have no documented instance of anyone sharing consciousness, so it may be that each consciousness is unique and not interchangeable.
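The file-system analogy a couple of comments up can be sketched with a toy model. This is purely illustrative (a hypothetical in-memory "filesystem", not a real OS API): deleting a file removes only the directory entry pointing at the data, while the data block itself survives.

```python
# Toy sketch of "deletion removes the pointer, not the data".
# All names and structures here are hypothetical.

data_blocks = {}   # block id -> raw bytes ("the file is still there")
directory = {}     # filename -> block id (the pointer)

def write_file(name, content):
    """Store content in a fresh data block and point a directory entry at it."""
    block_id = len(data_blocks)
    data_blocks[block_id] = content
    directory[name] = block_id

def delete_file(name):
    """Remove only the directory entry; the data block is left untouched."""
    del directory[name]

write_file("memory.txt", b"an hour of experiences")
delete_file("memory.txt")

# The file is inaccessible by name, but its contents still exist:
assert "memory.txt" not in directory
assert b"an hour of experiences" in data_blocks.values()
```

Real filesystems behave similarly in spirit: unlinking a file frees its blocks only once nothing references them, which is why "deleted" data can often still be recovered.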

      • jarfil@lemmy.world · 1 year ago

        If it creates a copy, then it isn’t teleportation, it’s copying. Two copies will diverge from the moment they’re no longer a single copy.

    • Ech@lemm.ee · 1 year ago

      a change in the substrate on which it takes place also has no profound significance.

      It does to the person being “deleted”.

      • massive_bereavement@kbin.social · 1 year ago

        SOMA is one of my favorite gaming experiences and probably one of the best sci-fi stories in this medium.

        Sadly, some of the monster bits were a bit weaker, and I think Amnesia fans felt it didn’t match their expectations…

    • NegativeInf@lemmy.world · 1 year ago

      It feels like dreaming is the “training from a batch of sample memories” tactic from deep learning.
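The deep-learning tactic this comment alludes to is usually called experience replay: train on random minibatches drawn from a buffer of stored experiences rather than on experiences in order. A minimal sketch (all names illustrative, not tied to any real framework):

```python
import random

# Toy experience-replay sketch: the day's "memories" go into a buffer,
# and a "dream" samples a random minibatch of them for training.
replay_buffer = [f"memory_{i}" for i in range(100)]

def sample_batch(buffer, batch_size=8, seed=0):
    """Draw a random minibatch of past experiences without replacement."""
    rng = random.Random(seed)
    return rng.sample(buffer, batch_size)

batch = sample_batch(replay_buffer)
assert len(batch) == 8
assert all(m in replay_buffer for m in batch)
```

Sampling at random, rather than replaying experiences in the order they happened, decorrelates the training examples, which is the usual motivation for replay buffers in deep reinforcement learning.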