• TheHarpyEagle@pawb.social
    6 hours ago

    I mean, we’ve already seen that AI companies are forced to be reactive when people exploit loopholes in their models or some unexpected behavior occurs. Not that they aren’t smart people, but these things are very hard to predict, and hard to fix once they go wrong.

    Also, what do you mean by synthetic data? If it’s made by AI, that’s how collapse happens.

    The problem with curated data is that you have to, well, curate it, and that’s hard to do at scale. We no longer have a few decades’ worth of unpoisoned data to work with; the only way to guarantee training data isn’t output from the model itself is to make it yourself.
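    To make the collapse point concrete, here’s a toy sketch (a made-up illustration, not any company’s actual pipeline): each “generation” fits a Gaussian only to synthetic samples drawn from the previous generation’s fit, and the distribution steadily degenerates. The sample size and generation count are arbitrary choices for the demo.

    ```python
    import numpy as np

    # Toy model-collapse demo: each generation trains (fits a Gaussian)
    # on synthetic data sampled from the previous generation's model.
    rng = np.random.default_rng(0)

    mu, sigma = 0.0, 1.0            # the "real" data distribution
    n_samples, n_generations = 20, 500

    for _ in range(n_generations):
        synthetic = rng.normal(mu, sigma, n_samples)   # model output only
        mu, sigma = synthetic.mean(), synthetic.std()  # refit on it

    # Estimation noise compounds and the fitted variance drifts toward
    # zero: later generations lose the tails and diversity of the
    # original distribution.
    print(sigma)  # typically far below the original 1.0
    ```

    The same dynamic is why feeding a model its own output without fresh, human-made data tends to narrow what it can produce over time.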