A New York Times copyright lawsuit could kill OpenAI: authors and entertainers are also suing the tech company for damages that could total in the billions.

  • makyo@lemmy.world · ↑75 ↓9 · 10 months ago

    I always say this when this comes up because I really believe it’s the right solution - any generative AI built with unlicensed and/or public works should then be free for the public to use.

    If they want to charge for access, that’s fine, but they should have to go about securing the legal rights first. If that’s impossible, they should look for profits some other way, like maybe add-ons such as internet-connected AI and so forth.

    • fidodo@lemmy.world · ↑11 · 10 months ago

      There’s plenty of money to be made providing infrastructure. Lots of companies make a ton of money providing infrastructure for open source projects.

      On another note, why is OpenAI even called “open”?

      • ItsMeSpez@lemmy.world · ↑4 · 10 months ago

        On another note, why is OpenAI even called “open”?

        It’s because of the implication…

    • Pacmanlives@lemmy.world · ↑10 ↓1 · 10 months ago

      Not really how it works these days. Look at Uber and Lime/Bird scooters. They basically would just show up to a city and say “the hell with the law, we’re starting our business here.” We just call it disruptive technology.

      • makyo@lemmy.world · ↑6 · 10 months ago

        Unfortunately true, and the long arm of the law, at least in the business world, isn’t really that long. Would love to see some monopoly busting to scare a few of these big companies into shape.

    • miridius@lemmy.world · ↑8 ↓5 · 10 months ago

      Nice idea but how do you propose they pay for the billions of dollars it costs to train and then run said model?

          • nickwitha_k (he/him)@lemmy.sdf.org · ↑3 ↓1 · 10 months ago

            If we didn’t live under an economic system where creatives need to sell their works to make a living or even just survive, there wouldn’t be an issue. What OpenAI is doing is little different from any other worker exploitation, however. They are taking the fruits of others’ labor, without compensation of any kind, then using them to effectively destroy those workers’ livelihoods.

            Few, if any, of the benefits of technological innovation related to LLMs or related tech are improving things for anyone but the already ultra-wealthy. That is the actual reason we can’t have nice things: the greedy are obsessed with taking and taking while giving less than nothing back in return.

            Just like no one is entitled to own a business that can’t afford to pay a living wage, OpenAI is not entitled to run a business aimed at building tools to destroy the livelihoods of countless thousands, if not millions, of creatives by building those tools out of stolen works.

            I say this as someone who supports trying to create actual AGI and potentially “uplift” other species, making humanity less lonely. I think OpenAI doesn’t have what it takes and is nothing more than another scam to rob workers of the value of their labor.

            • General_Effort@lemmy.world · ↑2 · 10 months ago

              This is the wrong way around. The NYT wants money for the use of its “intellectual property”. This is about money for property owners. When building rents go up, you wouldn’t expect construction workers to benefit, right?

              In fact, more money for property owners means that workers lose out, because where else is the money going to come from? (well, “money”)

              AI, like all previous forms of automation, allows us to produce more and better goods and services with the same amount of labor. On average, society becomes richer. Whether these gains should go to the rich, or be more evenly distributed, is a choice that we, as a society, make. It’s a matter of law, not technology.

              The NYT lawsuit is about sending these gains to the rich. The NYT has already made its money from its articles. The authors were paid, in full, and will not get any more money. Giving money to these property owners will not make society any richer. It just moves wealth to property owners for being property owners. It’s about more money for the rich.

              If OpenAI has to pay these property owners for no additional labor, then it will eventually have to increase subscription fees to balance the cash flow. People who pay a subscription probably feel that it benefits them, whether they use it for creative writing, programming, or entertainment. They must feel that the benefit is worth at least that much in terms of money.

              So, the subscription fees represent a part of the gains to society. If a part of these subscription fees is paid to property owners, who did not contribute anything, then that means that this part of the social gains is funneled to property owners, i.e. mainly the ultra-rich, simply for being owners/ultra-rich.

              • nickwitha_k (he/him)@lemmy.sdf.org · ↑1 · 10 months ago

                This is the wrong way around. The NYT wants money for the use of its “intellectual property”. This is about money for property owners. When building rents go up, you wouldn’t expect construction workers to benefit, right?

                I do not find that to be an apt analogy. This is more like someone setting up shop in the NYT’s lobby, stealing issues, and cutting them up to make their own newspaper that they sell from said lobby, without permission or compensation. OpenAI just refined a technology to parasitize off of others’ labor and is using it to seek rent on intellectual property that they don’t own or have rights to use.

                So, the subscription fees represent a part of the gains to society. If a part of these subscription fees is paid to property owners, who did not contribute anything, then that means that this part of the social gains is funneled to property owners, i.e. mainly the ultra-rich, simply for being owners/ultra-rich.

                I’m going to have to strongly disagree with you here. The subscription fees are only going to the ultra-wealthy who are using LLMs to parasitize off of labor. The NYT is not who I’m worried about having their livelihoods destroyed; it’s the individual artists, actors, and creatives, as well as those whose jobs are being replaced with terrible chatbots that cannot actually do the work but are implemented anyway to drive lay-offs and boost stock prices. The NYT and the other suits are merely a proxy: the wealth gap makes it nearly impossible for those most impacted to successfully use the courts to remedy their situation.

                • General_Effort@lemmy.world · ↑1 · 10 months ago

                  I do not find that to be an apt analogy.

                  The point is that the people who create some property don’t get a cut when the property rises in value. You keep calling the intellectual property of the NYT labor. I think there’s something there you seriously misunderstand.

                  This is more like someone setting up shop in the NYT’s lobby, stealing issues, and cutting them up to make their own newspaper that they sell from said lobby, without permission or compensation.

                  That’s an analogy for a normal practice in journalism, like when other news media (other websites, TV, radio, …) report on what the NYT reports. I’m sure you have seen articles that say something like “The NYT reported that…”.

                  That’s not what the case is mainly about. I’m not sure if anything like that is even mentioned.

                    • nickwitha_k (he/him)@lemmy.sdf.org · ↑1 · 10 months ago

                      I think that you are overlooking the part about aggressively competing against the original creation with something that would be impossible without the existence of that initial creation. Also the part where the NYT isn’t really the one most impacted by the current push for adoption of LLMs and similar tech. It’s not a case of punchcard computer operators becoming obsolete. It’s a case of using technology to deny the ability to make a living both to those involved in creating and evolving culture and to those in the few remaining jobs that still allow one to get by.

                      Humanity as a whole isn’t benefiting, only the ultra-wealthy who are using and refining these tools for no other purpose but to further bludgeon and dehumanize workers, grow the number of people on the precipice of total ruin, and increase the wealth gap further. So the NYT is merely playing the role of “the enemy of my enemy”.

                      If the tools WEREN’T being used primarily to skim even more wealth and push more people into poverty, there would be no problem (especially if the result were reform of the currently awful IP laws). But we currently live in a world where billionaires are writing to profitable tech companies demanding mass lay-offs and deep salary cuts to increase stock prices, voice actors are thrown under the bus by their own union, and eating disorder helpline workers are fired en masse for unionizing, only to be replaced with chatbots that cause measurable harm to vulnerable people. OpenAI deserves to be shut down for the harm that they are enabling and profiting from.

      • Smoogs@lemmy.world · ↑8 ↓5 · edited · 10 months ago

        Defending scamming as a business model is not a business model.

    • dasgoat@lemmy.world · ↑14 ↓13 · 10 months ago

      Running AI isn’t free, and AI calculations pollute like a motherfucker.

      This isn’t me saying you’re wrong from an ethical or judicial standpoint, because on those I agree. It’s just that, on a practical level, considerations have to be made.

      For me, those considerations alone (and a ton of other considerations such as digital slavery, child porn, etc.) make me just want to pull the plug already.

      AI was fun. It’s a dumb idea for dumb buzzword-spewing Silicon Valley ghouls. Pull the plug and be done with it.

      • セリャスト@lemmy.blahaj.zone · ↑5 · 10 months ago

        The thing is that those models aren’t even open source. If they were, you could argue that OpenAI’s business model is renting out processing power. Since they’re not, their business model is effectively selling models trained on copyrighted data.

        • dasgoat@lemmy.world · ↑4 ↓2 · edited · 10 months ago

          Plus, they built the whole thing on the basis of “research purposes” when in reality, from the very start, they intended to use it as a business above all else. But tax benefits, copyright leniency, et cetera were used liberally because ‘it’s just research’.

          And then keeping it closed source. The whole thing is a typical Silicon Valley scam where they will use whatever they can get their grubby little hands on, and when the product is finally here, they make sure to throw it into the world with such force that legislators can’t even respond adequately. That’s how they make sure there will be no legislation on whether the whole thing is even legal or ethical to begin with, merely legislation to keep it contained. From then on, they can just keep everything in the courts indefinitely while the product festers like a cancer.

          It’s the same thing with blockchains basically.

          Also, again, there’s the digital slavery being used to ‘train’ models, and the child porn being used to train them, because the web scrapers they used can’t and won’t discern whatever shit they rake up into the garbled pile of other people’s works.

    • asdfasdfasdf@lemmy.world · ↑1 · 10 months ago

      That goes against the fundamental idea of something being unlicensed, meaning there are no repercussions from using the content.

      I think what you mean already exists: open source licenses. Some open source licenses (copyleft licenses like the GPL) stipulate that the material is free, can be modified, etc., and that you can do whatever you want with it, but only on the condition that whatever you create is released under the same open source license.

      • makyo@lemmy.world · ↑1 · 10 months ago

        Ugh, I see what you mean - no, I mean unlicensed as in ‘they didn’t bother to license copyrighted works’ and public as in ‘stuff they scraped from Reddit, Twitter, etc. without permission from anyone’.

    • poopkins@lemmy.world · ↑1 · 10 months ago

      What is unlicensed work? Copyrighted content will not have a licence agreement but this doesn’t mean you can freely infringe on copyright law.

      • makyo@lemmy.world · ↑2 · 10 months ago

        By unlicensed I mean works that haven’t been licensed, i.e. anything being used without permission or some other right.

        • poopkins@lemmy.world · ↑1 · 10 months ago

          Right: public works are content in the public domain where the copyright has expired, and Creative Commons licensed content is, well, licensed.

    • canihasaccount@lemmy.world · ↑3 ↓4 · 10 months ago

      Would you, after devoting full years of your adult life to the unpaid work of learning the requisite advanced math and computer science needed to develop such a model, like to spend years more of your life to develop a generative AI model without compensation? Within the US, it is legal to use public text for commercial purposes without any need to obtain a permit. Developers of such models deserve to be paid, just like any other workers, and that doesn’t happen unless either we make AI a utility (or something similar) and funnel tax dollars into it or the company charges for the product so it can pay its employees.

      I wholeheartedly agree that AI shouldn’t be trained on copyrighted, private, or any other works outside of the public domain. I think that OpenAI’s use of nonpublic material was illegal and unethical, and that they should be legally obligated to scrap their entire model and train another one from legal material. But developers deserve to be paid for their labor and time, and that requires the company that employs them to make money somehow.

      • thecrotch@sh.itjust.works · ↑4 ↓1 · 10 months ago

        Would you, after devoting full years of your adult life to the unpaid work of learning the requisite advanced math and computer science needed to develop such a model, like to spend years more of your life to develop a generative AI model without compensation?

        No. I wouldn’t want to write a kernel from scratch for free either. But Linus Torvalds did. He even found a way to monetize it without breaking any laws.

      • Prok@lemmy.world · ↑3 · 10 months ago

        Yes, good point, resource collection is nearly identical to content generation