So, I've heard multiple times that the western United States has a better "quality of life". And it is beautiful, and I do like that climate.
However, something always draws me back to the South, and I can't give up my home in South Carolina.
Basically, I just want opinions from those who prefer the western United States over the South, or who prefer Southern living. What is it about the South that makes it so charming?
I live in the South also, but have never lived in the West, though I have spent time out there. The West is beautiful; it's dry and wide open. If you don't like our wet, warm, humid climate and would enjoy some desert sun, then it may be for you. There are far fewer bugs, lots of sun, etc. It's also important to remember what they don't have: there is not much water, not much green, and nowhere near the history we have. You live in SC; history is everywhere there, and Charleston is, IMO, one of the neatest towns in this nation. There's nothing like that out West. I enjoy the green landscape and the history. Here in Tennessee we have lots of water and the beautiful green Smoky Mountains. I don't think I would give that up for the drier West, but if that dry climate is your preference, it may be worth it to you. The West is very pretty.
I have lived in both, and I would have to say I prefer the South for the people and for the Atlantic and Gulf beach culture; boating is the number one reason for me, at least, though the mid-Atlantic beaches are a lot more crowded and the weather is not as good.
There's a better cost of living in the South for the most part. California is wonderful but very expensive to live in; my father bought a house with a huge backyard and a beach cottage in Fort Lauderdale for the price of our house in Southern California. I also have family in Myrtle Beach and Hilton Head; everyone stayed down South in the Carolinas and Florida. Maryland is Southern in a few places too.
The friendliest people in America are in the South, I would say. Again, I am talking about certain areas of the South; a few big cities in Florida are not always friendly or very Southern, and there are some very friendly people in many parts of the West. I like the seafood and the food in general: shrimp and grits, crab cakes, etc. There's great Southern food in certain places. It's a very sexy region too, with lots of sexy Southern girls; maybe the climate makes it more sultry.
People complain about the heat and bugs in the South, but there just aren't these types of Caribbean-style white-sand beaches out West until you get to Hawaii.
The West does have history, it's just different. There is a lot to learn about California, let alone the whole Western U.S.
I don't think the West has a better quality of life. It's more expensive, services aren't very good, and outside of tech and med the economy is very weak and undiversified. The people are also less outgoing and helpful than people in other parts of the country. Most of the cities are ugly and look stuck in the 1970s/1980s, with little historical architecture and not much in the way of modern architecture either.
Ha ha! This week we're visiting the spouse's family in NC. The back story is that we lived for three years in the famous Destin, FL, and currently live in Las Vegas, but we're originally from the vilified California. What I like about the Southern states is the feeling of slowing down; things seem less urgent, and people seem more down to earth, more salt of the earth. The parts that sucked were the intense humidity (when I got off the plane, my knees buckled), all the potential wildlife you can stumble on (in Florida I saw more snakes and bears than I cared to), and the acres of total darkness due to lots of trees and a lack of streetlights. I LOVED eating fresh shrimp and seafood from the Gulf, and regional Southern foods are always a treat.
In the West, the variety of climates and environments is a big bonus. You have the ocean, the mountains, the desert, and more of an acceptance of different cultures. There's a "laid back," to-each-his-own culture, but I wouldn't classify it as overly friendly; people just seem more tolerant of ideals they don't conform to. You can't beat the different ethnic cuisines of California, or the fusion of those cultural foods.

There is no place like Las Vegas, and I don't care how much you hate Sin City. There aren't many places where you can spend the day hiking or skiing and the night gambling, sightseeing, or eating dinner in a five-star restaurant. Have you ever stargazed on a summer night? The skies are so clear, and you get the best views.

California's offerings are immense: San Diego's marine culture, LA's entertainment industry, the Central Valley that feeds the nation, SLO and Sonoma for wine, and the North Coast for weed. Yeah, I said it. I haven't even touched on Oregon and Washington State. You want green? Oregon AND Washington have got your back. All of those areas are expensive for a reason.

The downfall: no water right now. Climate change is a mofo. Look at the floods in the Midwest and Texas; wasn't Texas in a drought for a couple of years? I'm a big fan of Americana. No place is better than another, just different, and that's what makes us beautiful. And if you don't like one place, you can just move. God bless America.
It does have history; everywhere has history. What I meant is that the West does not have large historic cities like Charleston or New Orleans. There is no Jamestown, no St. Augustine, etc. Then of course we have the history related to the war of northern aggression. The West was settled much later, and there just are not as many really old historical places there. That does not mean the story of that settlement is not fascinating, or the story of the gold rush, the conflict with the Indians, the Mexican War, etc. I did not mean to insult the West; I hope you did not take it that way. The West does offer a lot, and obviously some 30 million people find your state a good place to live. (I assume you're in CA.)
Well, my husband and I were born and raised in the West and have lived in the South for about 10 years now. We absolutely love it out here: the history, the landscape, the architecture, the people, the way of life. So, the South has won these Westerners over!