From my experience, having a mix of friends growing up, about 30% of them white, the only people I knew whose parents fully supported them pursuing a career in the arts were white. Everyone else's parents discouraged it, considering it an impractical waste of time and a sure ticket to a career of waiting tables.
And Hollywood likes to get its stars young, so if an acting career isn't started until after the aspiring actor moves out of the family home (again, it was mainly the white Americans I knew who moved out right at 18), it's probably too late.
Of course, most of my non-white friends were immigrants, so that likely influences things. I'm not saying that's definitely the reason; it's just a guess based on my observations.
Hollywood's a scuzzy, corrupt industry though, so there's that too.
Hollywood is very liberal, and liberals want more diversity and are anti-white and pro-Islam.
So logically I would assume I would see fewer White actors and more non-White actors (Blacks, Asians), but that's not the case in Hollywood.
Also, if they are pro-Islam, I would assume we would see many Muslim characters and women wearing hijabs in movies, but we barely ever see that.
Top actors in Hollywood are still heavily White.
Why is it like this? Sounds very illogical. Why are they doing the opposite of what they want in Hollywood?
They publicly claim to be anti-racist while being subconsciously racist. Liberals are racists who have fooled people into thinking they aren't. Dig deep enough and you'll find liberals are even more racist than the KKK; their subconscious emerges and their true identity is revealed. They need to pose as anti-racists because they would lose a lot of money otherwise.
I'd argue that there are more white people aspiring to be actors, writers, directors and producers, and that is reflected in the product being put out.
Quote:
Hollywood is very liberal, and liberals want more diversity and are anti-white and pro-Islam.
So logically I would assume I would see fewer White actors and more non-White actors (Blacks, Asians), but that's not the case in Hollywood.
Also, if they are pro-Islam, I would assume we would see many Muslim characters and women wearing hijabs in movies, but we barely ever see that.
Top actors in Hollywood are still heavily White.
Why is it like this? Sounds very illogical. Why are they doing the opposite of what they want in Hollywood?
Wealthy Spoiled Hypocrites
Progressive liberals are long on hypocrisy: a "do as I say, not as I do" ideology and an authoritarian posture.
As soon as their own lifestyle is affected... not so much. It's all an act for show.
Actor - professional liar
Quote:
Originally Posted by LuckyGem
Hollywood is more diverse today than it was 25 years ago.
However, it is still a nepotistic cesspool of cronyism.
Nonsense! It is intentionally perceived as that; however, nothing has changed. The WHITE MEN pulling the strings are still the same WHITE MEN pulling the strings.
Blacks have been conditioned to believe that liberal progressives care about black issues. Never mind that liberal progressives do nothing for blacks but instead do things to them.
Most, but not all, top white actors are liberals. So why not spend hard-earned money to help them get paid millions? After all, in the end there is great lip service: they speak out and lend their names to black causes, which is the same as sharing wealth, isn't it? It's the same as respect, isn't it?