• 0 Posts
  • 33 Comments
Joined 5 months ago
Cake day: January 18th, 2024

  • For the Meta apologists, I have a reality check for you:

    Threads was immediately flooded with massive amounts of radicalizing, extremist content, and there have already been instances of users having their personal information exposed (doxxed) on Threads as a result of Meta’s information-harvesting practices. [1]

    Threads was marketed as being open to ‘free speech’ (read: hate speech and misinformation), and the far-right movement was encouraged to join; its members have already spread extremism, hate, and harassment on Threads. [2] Threads has been a hotbed of Israel-Palestine misinformation and propaganda. [3] Meta also fired fact-checkers just prior to Threads’ launch. [1]

    As already established, Meta also assisted in genocide! [4]

    Meta/FB/Instagram also have a strong history of facilitating the spread of misinformation and extremism, which contributed to the January 6th insurrection attempt. [5], [6]

    This really should be obvious by now… but Meta mines and sells its users’ information. [7] Just look at the permissions you have to grant them for Threads…

    FB users have to agree to all sorts of unethical things in the TOS, including giving Meta permission to run experiments on them without informed consent. [8] In their first published study, they manipulated users’ feeds with positive or negative content to see whether it affected their mood. It did, and they successfully induced depression in many of their users!

    I will now turn to an article that aptly summarizes Meta’s core practices as a company:

    • Elevates disinformation campaigns and conspiracy theories from the extremist fringes into the mainstream, fostering, among other effects, the resurgent anti-vaccination movement, broad-based questioning of basic public health measures in response to COVID-19, and the proliferation of the Big Lie of 2020—that the presidential election was stolen through voter fraud [16];

    • Empowers bullies of every size, from cyber-bullying in schools, to dictators who use the platform to spread disinformation, censor their critics, perpetuate violence, and instigate genocide;

    • Defrauds both advertisers and newsrooms, systematically and globally, with falsified video engagement and user activity statistics;

    • Reflects an apparent political agenda espoused by a small core of corporate leaders, who actively impede or overrule the adoption of good governance;

    • Brandishes its monopolistic power to preserve a social media landscape absent meaningful regulatory oversight, privacy protections, safety measures, or corporate citizenship; and

    • Disrupts intellectual and civil discourse, at scale and by design. [9]

  • GONADS125@feddit.de to Fediverse@lemmy.world · 0.19.3 is now the most installed version
    5 months ago

    It’s different on other platforms (like Mastodon), but on Lemmy, blocking an instance only blocks posts from that instance. Users from Threads would still be interacting in the comments with the user who blocked their instance.

    Regardless, I believe they should be defederated by instance admins on ethical grounds. Meta/FB have run unethical experiments on their users without informed consent, including purposefully inducing depression in them.

    The fact that Meta has assisted in genocide should be grounds for defederation by instances which claim to protect and care about their users.

    Meta’s platforms have also played a key role in radicalizing users, and they purposefully marketed Threads to far-right extremists.

    Here’s my argument with citations

    There are also good arguments for defederating and blocking them from the fediverse based on EEE (embrace, extend, extinguish).

    If an instance’s admins claim they care about protecting their users and providing a safe, healthy community but are federated with Threads, then they are either uninformed or liars.