This is a genuine question.
I think the desert is beautiful. Americans love and cherish their deserts, which attract millions of tourists from all over the world too.
In Mexico the "desert" is not seen the same way. It's something you want to avoid, something with a bad connotation, a synonym for scarcity and ugliness, and definitely no one vacations there.
We share the Chihuahuan Desert, and it's amazing to see how differently it's treated on the two sides of the border.
I did not know I did this, or that anyone does this outside of an appreciation for beauty where it is.
We do?
I’m sure there’s also a difference in safety and economic development between the two regions. Northern Mexico has a lot of structural issues that the southwestern U.S. doesn’t have, which prevent it from being an economic center and tourism capital.
We do?
A lot of the US is damp and often cold. It affects the breathing, the joints, a lot of things you might not expect if you haven’t experienced a humid, cool climate for an extended period.
I love the dry air, lack of humidity, and mild temperatures mostly. I hate being cold, and I hate snow.
Rarely heard of desert fantasizing. Forest fantasizing, yes.
The mythology surrounding the “old west” probably has at least something to do with this mindset amongst those of us that have it.
The romanticization comes from the history of settling the Wild West, cowboys, and outlaws. The desert was the last frontier in the continental US, and the idea of a place free of governmental oversight is still extremely popular.
“Americans” believe this?
I wasn’t aware I did that. Thank you for letting me know. I will research “why” at once.
Plenty of people in the US see the desert as a boring wasteland, or as a place to exploit.
Then again, there are a number of protected desert areas on both sides of the border, so clearly at least some people in Mexico see the desert as worthwhile.
Cause the winter in like 80% of the rest of the country sucks so bad
Coming from Southern California, I know a lot of people who appreciate the beauty of the desert, but I don’t know of anyone who romanticizes it. It’s a dangerous place and most people avoid it, unless going for specific purposes like recreation.
I think it’s a regional thing. Outside of the states with deserts, I don’t find Americans think about the desert much. Those who live there really do seem to appreciate it.
I just think the desert is cool
We like geographical diversity and the US has a fair amount of it, I think. And not everyone loves it, but there are ~350mil of us, the third largest population in the world, of course some people are into it.
People from hot dry climates fantasize about cold wet climates, and vice versa. America is a big country and there are many different climates. I’m from Texas and I vacation in Scotland for some nice cloudy weather. Scottish people usually find this hilarious and talk about how they’d love to move to Texas and enjoy all the sunshine.
I sure don’t.
I am from a green part of the country with hills and trees. Visiting Phoenix or Vegas for work in the summer made me ask, “How can people live here for six months of the year?”
I find deserts fascinating. The endless expanses and huge sky are amazing to me. It also feels good to be in a place not overflowing with annoying people. Forests and mountains are similar in this regard, just different. I love uncrowded spaces.
A couple of reasons. First, I’d say there is very real natural beauty in it, but then also there’s intangible beauty, something more abstract. We tend to appreciate an idealized version, a pioneer spirit, going where nature wouldn’t want us to go. It’s similar to the way we romanticize space travel, oceanic exploration, climbing mountains, and the like. There are things to see and to experience, but you can’t access them unless you’re willing to risk going where we aren’t meant to survive.
I only romanticize dessert. 🧁
Beautiful scenery, wide open spaces, sunsets. In the late ’80s and early ’90s, a lot of movies romanticized the desert.
I live in the desert southwest, by choice. Moved here from the Midwest.
For me, the landscape is beautiful. The dry air lends a different tint to the colors, more like watercolors and less like pastels. The desert wildlife is unique and interesting. The quiet is peaceful. The desert has a unique smell, especially after a rare rain, like sage and piñon and juniper. It never smells so much like the desert as after one of those rains.
And the cultures of the people who live in the desert states are rich, and more to my liking than where I came from.
Having said all of that, it’s easy to like the desert with modern conveniences. If I had to live off the local land, it would be a different story, but that hasn’t been necessary for a very long time.
I think most Americans do not love or romanticize it, though some definitely do. You have to keep in mind, one of the biggest pulls for the US is our robust national park system and the fact that we have so many diverse geographic features. I live on the East Coast and have never seen a desert in person. I would love to see them though, because I’ve seen photos of Badlands and Chihuahua and Mojave, and they all seem breathtaking.
However, there are different types of vacationers. Many vacation for luxury and relaxation and would never choose a desert over a resort. Some vacation for culture or history, and also wouldn’t care as much. But many vacation to just experience everything this world has to offer, and that crowd is drawn to places like this. Same thing with my local national park. I live near Congaree, which is like a swamp with old growth trees, hot and humid as hell, and infested with mosquitos half the year. It’s frequently listed as one of the less beautiful parks, but I love the fuck out of it because it is a unique natural landscape. I think the desert is the same.
My in-laws live in the Chihuahuan Desert (US part) and that is the ugliest desert I’ve ever seen. So there’s that.
I certainly do not. I live in the PNW and our weather is just about perfect. You couldn’t pay me enough to live in the desert. I don’t even want to visit.
I just think it’s neat, but people who vacation in the desert are either camping or going to specific scenic locations. They’re usually not just wandering in the middle of nowhere.
Fuck that. If I never see a desert again it’ll be too soon. 90% of the times I almost died it was in a desert (the rest involved too much alcohol and being a young and dumb 20-something).
No romanticism here. Fuck the desert. Just surround me with green and blue. Anything but tan.