People tend to agree with polling when the outcome seems to validate their perceptions. They tend to dismiss polls when outcomes conflict with their perceptions. ALL opinion POLLS generally rely on a sampling and extrapolate results across the population.
That pretty much sums it up. Somewhat funny but true. Especially the first two sentences.
we learned our lesson last time.
it is not the numbers, it is the methodology.
Good post. I would add it's also a function of the sample which many times is bent towards a certain ideology and outcome. Check methodology (wording of questions is key) and sample first to determine validity of the poll. Otherwise it's propaganda and fake news.
Some of the lefty responses on here crack me up. They do this all the time. They are not wordsmiths, but they do try to twist words around depending on who said what.
When it fits their agenda, they will say, "they never said that" or "it doesn't say that" or "that's not what it means" or that people are "reading too much into it". When it goes against their agenda, they then act like every single word matters, and anything that is left out, or isn't said exactly the right way means something it doesn't.
Quote:
NOTE: Bankrate asked this question only to individuals who were adults at the time of the recession in 2007. Bankrate surveyed 2,740 total adults nationwide, 2,315 of which were 18 or older when the downturn began.
They said that they asked "Americans", but then go on to say they asked "adults in America" or "adults nationwide". That does not mean that they were "American adults". If they felt the need to point out that "Americans" feel a certain way, then their "note" should include that they asked adult American citizens. There is a difference.
It is possible that they are sloppy in their work, which makes their polling questionable. Or it could be that their "note" is there for a reason - because there is a big difference between "American adults" and "adults living in America" or "adults nationwide".
In order for your thesis to have any validity, every one of these surveys would have to include more non-citizens than citizens.
Logically, what are the chances of that being the case?
It’s interesting that you presume that only immigrants, etc. are experiencing financial difficulties.
Is there some unwritten rule that says every native-born American is not only employed, but employed in a job that pays well enough to provide a decent standard of living?
Didn’t Trump run on bringing jobs back to depressed areas of the country?
Why would he have done that if every American was gainfully employed?
Sorry OP, thesis fail.
The lengths that some will go to not have their bubbles burst is quite astonishing.
You got the whole point of the post wrong. There are 2 issues.
1) Samples are supposed to be representative of the population. If you are talking about the population of all adults who live in America, the sample should reflect that. If you are talking about American adults, the sample should reflect that. The pollsters said they polled adults living in America, but the summary of results that goes to the media in press releases talks as if everyone polled was American. We don't know what percentage of their sample was American adults as opposed to adult residents of this country who may or may not be able to vote.
2) The other problem when they do this is that the news media makes its pronouncements not based on what the poll methodology says but what they get in their press releases. If the press releases say Americans were polled, the media runs with it and what they do around this time is use the information to source their predictions about the election. If the pollster calls everyone who lives in America, Americans, then the news media types are going to get their election predictions wrong because the pollsters polled an unknown quantity of people who can't vote.
They could do the exact same thing on a poll about abortion and it could swing the other way. It could show more people now are pro life because they polled all adults and may have a lot of people from Catholic countries who are not US citizens mixed into their sample compared to if they just polled American citizens. Meanwhile, you'd have the news media framing it that Americans are now more pro-life because that's what their press release on the poll results say.
All I want is for the pollsters' press-release summaries to say "More Adults Residing in America Say..." when that's who the methodology says was sampled, instead of "Americans Say..." I'm assuming the methodology sections of the polls are accurate.
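The composition issue in (1) can be put in numbers. Here's a minimal sketch where every figure is invented for illustration (no real poll reports these rates): if non-citizen residents answer a question differently from citizens, the share of them in the sample moves the headline percentage.

```python
# All numbers here are hypothetical, for illustration only.
citizen_rate = 0.48      # assumed "yes" rate among citizens
noncitizen_rate = 0.70   # assumed "yes" rate among non-citizen residents

def headline(noncitizen_share):
    """Blended 'yes' rate for a sample containing the given share of non-citizens."""
    return (1 - noncitizen_share) * citizen_rate + noncitizen_share * noncitizen_rate

print(f"citizens only:    {headline(0.0):.1%}")   # 48.0% - a minority position
print(f"20% non-citizens: {headline(0.2):.1%}")   # 52.4% - now a "majority of Americans"
```

The arithmetic says nothing about which mix is "right"; that's exactly why the methodology section has to define the population being described.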
Quote:
Good post. I would add it's also a function of the sample which many times is bent towards a certain ideology and outcome. Check methodology (wording of questions is key) and sample first to determine validity of the poll. Otherwise it's propaganda and fake news.
Early in the 2016 campaign cycle (prior to the primaries), a few pollsters asked a couple of questions about the many Republican candidates running and how the poll respondents rated each candidate on those qualities. One of the questions was how people rated each candidate on Leadership. Donald Trump kept coming out ahead of the other candidates on that particular quality. I'm sure you can guess what happened. The pollsters dropped the "Leadership" quality from subsequent polls even though they kept polling on other qualities. I'm pretty sure I posted about it when it happened, probably in the Elections Forum.
Quote:
You got the whole point of the post wrong. There are 2 issues.
1) Sample sizes are supposed to be representative of the population. If you are talking about the population of all adults who live in America the sample should be reflective of that. If you are talking about American adults the sample should be reflective of that. The pollsters said they polled adults living in America. But their summary of the results that goes to the media in press releases talks like everyone polled was American. We don't know what percentage of their sample was American adults as opposed to Adult residents of this country who may or may not be able to vote.
2) The other problem when they do this is that the news media makes its pronouncements not based on what the poll methodology says but what they get in their press releases. If the press releases say Americans were polled, the media runs with it and what they do around this time is use the information to source their predictions about the election. If the pollster calls everyone who lives in America, Americans, then the news media types are going to get their election predictions wrong because the pollsters polled an unknown quantity of people who can't vote.
They could do the exact same thing on a poll about abortion and it could swing the other way. It could show more people now are pro life because they polled all adults and may have a lot of people from Catholic countries who are not US citizens mixed into their sample compared to if they just polled American citizens. Meanwhile, you'd have the news media framing it that Americans are now more pro-life because that's what their press release on the poll results say.
All I want is for the pollsters to say in their summaries for press releases is, "More Adults Residing in America Say..." if that's who their methodology says those people were the ones sampled instead of saying "Americans Say..." I'm assuming the methodology section of the polls are accurate.
I didn’t get it wrong.
You are assuming, based on absolutely no evidence, that the people polled are not Americans, and therefore that the results have to be wrong or, at minimum, misleading.
If the report states “Americans” why would you assume otherwise?
Do you think that market research companies would risk legal action by misreporting their sample?
People tend to agree with polling when the outcome seems to validate their perceptions. They tend to dismiss polls when outcomes conflict with their perceptions.
ALL opinion POLLS generally rely on a sampling and extrapolate results across the population.
ALL POLLS generally rely on participants’ SELF IDENTIFICATION. This may include citizenship, voter registration status, party, age, income, wealth, employment status, education, race, ethnicity, religion and so on. It is not practical for a polling participant to prove their identity.
Politicians tend to live and die by polls.
Credible polls disclose their methods in the fine print.
You know, making sure they are polling Americans is probably harder to do and way more expensive.
I often question the idea of sampling, in that you survey a certain number of people and then try to apply the result to a much larger population. Now, "3 out of 4 Americans that were involved in this poll said [whatever]" seems more credible than saying "3 out of 4 Americans said [whatever]".
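For what it's worth, the leap from sample to population is usually quantified with a margin of error. A rough sketch using the textbook formula for a simple random sample (the 2,740 figure is the Bankrate sample size quoted earlier in the thread; 95% confidence, z = 1.96):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# The Bankrate survey quoted earlier reported 2,740 respondents:
print(f"n=2740: +/- {margin_of_error(2740):.1%}")   # about +/- 1.9 points
# Even a much smaller sample is tighter than intuition suggests:
print(f"n=1000: +/- {margin_of_error(1000):.1%}")   # about +/- 3.1 points
```

The catch is that the formula assumes a random sample drawn from the intended population. It says nothing about whether that population was "Americans" or "adults residing in America", which is the whole argument of this thread.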