The United States derives most of its own culture from the United Kingdom, so I would say the UK. Our language, our Protestant majority, our last names. Even the way we cut meat at our butchers is derived from what we learned from ancestors who came from England.
I think that the cultures of the two countries are so intertwined, particularly over the past century, that I just don't see how anyone can intelligently compare the two.
I was going to say 'western' culture, but really, both of these nations have influenced world culture. It's pretty hard to say definitively. America has probably had the most influence in the past 50 years through the reach and power of the mass media, but Britain continues to be very influential, especially in the English-speaking world and in Commonwealth countries. Take music. While America gave us blues, jazz, rock'n'roll etc, many of the greatest rock and pop bands, like the Beatles, the Stones and Led Zep, originated from Britain.
In terms of literature, while in the past 200 years Britain has produced Jane Austen, Dickens, H.G. Wells, George Orwell, C.S. Lewis etc (not talking Shakespeare or anyone pre-1800), the US has produced Mark Twain, Ernest Hemingway, John Steinbeck and poets like T.S. Eliot and Emily Dickinson.
In terms of science/technology it's an even keel: the US has Ben Franklin and Thomas Edison, while the UK has people like Alexander Graham Bell. It's not just about names, however. I think both nations have led the world since the mid-19th century. The Industrial Revolution kicked off in Britain, France and Belgium, but many of the advancements occurred in the US as well, like the cotton gin and the standardization of the factory production line.
Nowadays I don't think culture is as totally American-dominated as some people fear. The Brits are still exporting a lot of culture, and celebs (anti-culture, perhaps?), to the world, including the States.
Anyway, I thought this might be an interesting, if not very objective, debate. Feel free to be as passionate as you like, as long as you keep to the topic and don't start a country vs country flamewar.
I think both countries have contributed exceedingly well when it comes to moral degradation in our cultures --- it's certainly going to be a close race to see who claims victory for the top national STD epidemic (per capita), as an example. It's going to be a close one for sure, with increased secular humanism and a disdain toward God on both home fronts.
Ultimately, the US itself was basically created by the British. Both countries are intertwined, but the US branched out, gained independence and took a different direction. Now the boot is on the other foot, with the US exerting dominance over British culture in terms of music, media, corporations and movies. It's a very interesting subject. The British Empire died, but now the US is the most powerful nation on earth. It's kind of like the torch has been passed from father to son, rather than any war being fought over dominance. The US has taken Britain's old role, while Britain seems content as is.