I’ve never been that far south, but lately I’ve been reading those novels and watching those movies.

The prevalent idea is: in this world (Texas?) you are alone, nobody gives a cr*p about you, and you shouldn’t trust anyone because they’ll take advantage of you, ridicule you, and mock you. The world (or maybe only Texas?) is an inhospitable, inhuman, Darwinist place.

  • Alice@hilariouschaos.com · 2 months ago

    This is a very weird take.

    In places like Texas and Louisiana, there’s a ‘culture’ of its own, I guess, for lack of a better word.

    Not only that, you get backwoods living, country living, or inner-city living.

    There’s a Southern ‘way,’ I guess; I don’t know exactly what to call it. But it’s not what you’re describing.

    People are nicer. They’ll actually stop on the highway to help you.

    • FuglyDuck@lemmy.world · edited · 2 months ago

      People are nicer. They’ll actually stop on the highway to help you.

      Why do Texans assume that only happens in Texas?

      You hear this all the time, like it’s something special, but then you look into it, and nope, it’s no different from here.