2024 might be the breakout year for efficient ARM chips in desktop and laptop PCs.

  • R0cket_M00se@lemmy.world · 9 months ago · +13 / −10

    “The most exciting tech isn’t the thing that currently exists and is being improved and integrated daily, it’s this other thing we don’t even know for sure will maybe happen.”

    • FaceDeer@kbin.social · 9 months ago · +5 / −4

      “Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level. This new chip design might end up with comparable capabilities to the existing chip design!”

      Yeah, there was no need to try to hype this up as the biggest thing ever.

      • richieadler@lemmy.myserv.one · 9 months ago · +4 / −5

        Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level.

        That isn’t what’s happening with “AI” right now.

        • FaceDeer@kbin.social · 9 months ago · +5 / −4

          Which is why I said possibility; I knew picky people would jump on the comment like this.

        • R0cket_M00se@lemmy.world · 9 months ago · +4 / −7

          You clearly don’t work in a field where it’s gutting swaths through workflows and taking up serious slack.

          You can describe your problem to it in native English, so it does communicate on our level. It comprehends training data in the same way a human comprehends our lived experience and assimilates the data in the same manner. It’s not truly “reasoning”, but it’s leagues ahead of anything we had even four years ago and it’s only going to grow from here.

          Commercial ventures are finding new use cases every day, and to people in IT the skepticism is hilarious in the same way that people who thought the Internet was a fad were hilarious.

    • corbin@infosec.pub (OP) · 9 months ago · +3 / −2

      Right, it’s less exciting now because it’s already here. I’m not expecting radically improved GPT models or whatever in 2024, probably just more iteration. The most exciting stuff there might be local AI tech becoming more usable, like we’ve seen with Stable Diffusion.

      • sir_reginald@lemmy.world · 9 months ago · +1 / −1

        I’m just expecting performance optimisations, especially for local LLMs. Right now there are models claimed to be as good as GPT-4 (Goliath 120B), but they require two RTX 4090s to run.

        The models that require less powerful equipment are not as good, of course.

        But hopefully, given enough time, good enough models will be able to run on mid-range hardware.
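The hardware requirements mentioned above come down to simple memory arithmetic: the weights alone take roughly (parameter count × bytes per weight), which is why quantization is the main lever for fitting big models on consumer GPUs. A minimal back-of-the-envelope sketch; the 20% overhead factor for KV cache and activations is an assumption for illustration, not a measured value:

```python
def vram_gib(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough GiB needed to hold a model's weights in memory.

    params_billion  -- model size in billions of parameters (e.g. 120 for Goliath 120B)
    bits_per_weight -- precision of the stored weights (16 = fp16, 8/4 = quantized)
    overhead        -- assumed multiplier for KV cache and activations (hypothetical)
    """
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 2**30

# Estimates for a 120B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"120B @ {bits}-bit: ~{vram_gib(120, bits):.0f} GiB")
```

Even at 4-bit this lands well above the 48 GB of VRAM two RTX 4090s provide, which is why running such models on consumer hardware also relies on aggressive quantization schemes and partial CPU offloading.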