I’ve never been that far south, but lately I’ve been reading the novels and watching the movies.
The prevalent idea is: in this world (Texas?) you are alone, nobody gives a cr*p about you, do not trust anyone because they’ll take advantage of you, ridicule and mock you. The world (or maybe only Texas?) is an inhospitable, inhuman, Darwinist place.
This is a very weird take.
In places like Texas and Louisiana, there’s its own ‘culture,’ I guess, for lack of a better word.
Not only that, you get backwoods living, country living, OR inner city living.
There’s a southern ‘way,’ I guess; I don’t know what to call it exactly. But it’s not what you’re describing.
People are nicer. Actually stop on the highway to help you.
Why do Texans assume that only happens in Texas?
You hear this all the time, like it’s something special, but then you actually get into it, and nope, it’s not any different than here.
Maybe you’re just unpleasant to be around lol
That doesn’t make sense. The other commenter is merely saying that they see people do the same things in other states.
That’s all. What does them being unpleasant have to do with it?