Welcome to City-Data.com Forum!
Go Back   City-Data Forum > General Forums > Education
Old 02-20-2014, 09:28 AM
 
Edu.Architect
28 posts, read 61,229 times
Reputation: 42


Quote:
Originally Posted by lkb0714 View Post
Seriously, read your own source.

"This study does not identify the 33 institutions, as the authors guaranteed them anonymity in return for sharing a great deal of internal data."

They shared information that could potentially have been identifying for students, especially in very small schools. Of course they would require anonymity. Additionally, the schools have an obligation to protect the reputation of their recent graduates; if the study had not come out favorably, it could have been detrimental to their students.
Nice! I read that too and should have been more explicit in my OP... I think the PU1 and MS1 codes in the charts are fine. Anything that shows data from a single school can be suppressed (though I'm not sure how much more this would have revealed). But I don't see a reason not to provide a general list in an appendix. Would you and I have figured out some things? Sure... but they still would have only been (good) guesses.

Social science research does this all the time. To get data, researchers have to promise anonymity. But at some point the research message gets undermined. My personal guess is that many of these 33 are schools well off the radar that do not represent the aggregate of US universities. How could they? It looks like ~140 of the top 150 schools won't let you apply without test scores.

I should have mentioned the promise of anonymity. My bad. Otherwise, after reading the 70 pages, what do you think of the study? Would love to know.

 
Old 02-20-2014, 09:40 AM
 
golfgal
20,793 posts, read 61,282,830 times
Reputation: 10695
Quote:
Originally Posted by lkb0714 View Post
Seriously, read your own source.

"This study does not identify the 33 institutions, as the authors guaranteed them anonymity in return for sharing a great deal of internal data."

They shared information that could potentially have been identifying for students, especially in very small schools. Of course they would require anonymity. Additionally, the schools have an obligation to protect the reputation of their recent graduates; if the study had not come out favorably, it could have been detrimental to their students.
I don't have a problem with them not identifying the schools...the issue I have is they don't back up their conclusions with any data. What were the GPAs in high school, what are they now in college, what test scores were submitted and not submitted? Let's see some actual numbers backing up the conclusions.
 
Old 02-20-2014, 10:24 AM
 
Mathguy
78,339 posts, read 60,527,398 times
Reputation: 49627
Quote:
Originally Posted by golfgal View Post
I don't have a problem with them not identifying the schools...the issue I have is they don't back up their conclusions with any data. What were the GPAs in high school, what are they now in college, what test scores were submitted and not submitted? Let's see some actual numbers backing up the conclusions.
These days, most people pushing an agenda just go to the media with their claims....the media checks nothing and airs it as some sort of statement of fact, without any rebuttal or equal time, because that would require effort and actual journalism.

It pings around the internet with a few astute people noting it has no supporting substance, but a larger group latches onto it as fact and then regurgitates the "findings" for years to come.

For example, a horrendously bad study paid for by the Missouri trial lawyers stated that there was no need for lawsuit caps in the state because medical insurance costs were fine and it was the insurance companies that were gouging doctors.

NPR ran it, no rebuttal.
KC star newspaper ran it, no rebuttal.

The National Assoc. of Insurance Commissioners and pretty much every other actuary took one look at the study, pointed out the glaring flaw in it, laughed, and walked away....but it was too late, because the average rank-and-file person had already heard it and seen it....and so it was fact.

P.S. I don't mean to pick on NPR; it's just one example. A number of news organizations have done the same, whether it's breaking news about a shooting or Obama's birth certificate, etc.
 
Old 02-20-2014, 10:48 AM
 
golfgal
20,793 posts, read 61,282,830 times
Reputation: 10695
Quote:
Originally Posted by Mathguy View Post
Most people pushing an agenda anymore just go to the media with their claims....the media checks nothing and just airs it as some sort of statement of fact without any rebuttal or equal time because that would require effort and actual journalism. [...]
Then they come here and scream about how unfair it all is...because they read it in the news you know....
 
Old 02-20-2014, 11:12 AM
 
Edu.Architect
28 posts, read 61,229 times
Reputation: 42
Not sure how intense I am allowed to go, but if you guys are cool going down the rabbit hole, would love to get your take. I have to do all this for an off-line project, so this discussion is really sweet!

Here is a piece of research from 2012 (link). I seriously have no skin in the game on whether SATs matter or not. At the end of the day, I have this (very) personal view that if (mainstream) colleges could get around these pesky tests (and up their US News ranks!), they would. So either tests are in a sunset phase (dying a slow death) or, bottom line, they may actually be useful.

I am a bit biased toward meta-analyses (241 data sets in the above example) in social science. They bring in that wisdom-of-crowds factor. Here is a piece from the abstract...

Univariate analyses revealed that demographic and psychosocial contextual factors generated, at best, small correlations with GPA. Medium-sized correlations were observed for high school GPA, SAT, ACT, and A level scores. Three non-intellective constructs also showed medium-sized correlations with GPA: academic self-efficacy, grade goal, and effort regulation. A large correlation was observed for performance self-efficacy, which was the strongest correlate (of 50 measures) followed by high school GPA, ACT, and grade goal.
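For anyone wondering what "small/medium/large" means in that abstract: these labels typically follow Cohen's conventional benchmarks for correlations. A quick sketch (the cutoffs are rules of thumb and debated across fields, so take the labels loosely):

```python
# Cohen's conventional benchmarks for interpreting a correlation r
# (rules of thumb only; cutoffs vary by field and author).
def effect_size(r):
    r = abs(r)
    if r >= 0.50:
        return "large"
    if r >= 0.30:
        return "medium"
    if r >= 0.10:
        return "small"
    return "negligible"

# Values quoted in the abstract above:
print(effect_size(0.59))  # performance self-efficacy -> large
print(effect_size(0.35))  # grade goal -> medium
print(effect_size(0.31))  # academic self-efficacy -> medium
```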

Quick digression... I hear you on the conditional probability. I think GPA is the de facto measure of "college" success, but not "life" success. I can get an A in college and not know how to communicate in my job. I can drop out of school and be very successful. But GPA in primary, secondary, and tertiary education seems to be the best of the worst measures (only within that setting).

According to the study, "high school GPA is a stronger predictor of university GPA than is either the SAT or the ACT." Zero surprise... GPA is a continuum... I would have liked to see self-regulatory learning strategies show a stronger positive correlation with GPA, since that is my (personal) hypothesis.

A few highlights (copied and pasted)...
  1. High school GPA and SAT/ACT collectively explained 22% of the variance in (college) GPA.
  2. ACT was a stronger predictor of (college) GPA than SAT.
  3. Among 12 motivational factors, medium positive correlations were observed for academic self-efficacy (r = .31) and grade goal (r = .35), whereas a large positive correlation was found for performance self-efficacy (r = .59, wow!). Performance self-efficacy and grade goal were the strongest of the 42 non-intellective associations tested.
  4. Discounting small correlations, performance self-efficacy, grade goal, effort regulation, and academic self-efficacy emerged as the strongest correlates of tertiary GPA, alongside traditional assessments of cognitive capacity and previous performance.
  5. Three motivational constructs (academic self-efficacy, grade goal, and locus of control) explained 14% of variance in GPA, with grade goal being the strongest predictor, followed by academic self-efficacy.
  6. In the self-regulatory learning domain, a model including six behavioral and cognitive learning strategies accounted for 11% of the variance. Effort regulation was the strongest predictor, followed by meta-cognition.
  7. Analysis indicated that, combined, measures of effort regulation, test anxiety, academic self-efficacy, and grade goal accounted for 20% of the variance in GPA.
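A note on reading the percentages in the list above: for a single predictor, "variance explained" is just r squared, while the multi-predictor figures (22%, 14%, 11%, 20%) are model R-squared values from regressions, where correlated predictors overlap. A quick sketch of the conversion:

```python
# For a single predictor, "% of variance explained" is just r squared.
def variance_explained(r):
    return r ** 2

# Performance self-efficacy, r = .59:
print(round(variance_explained(0.59), 3))  # 0.348 -> ~35% of GPA variance
# SAT, r = .29:
print(round(variance_explained(0.29), 3))  # 0.084 -> ~8% of GPA variance
# Note: multi-predictor figures (e.g. 22% for HS GPA + SAT/ACT combined)
# are model R-squared values; because predictors are correlated, the
# individual r**2 values do not simply add up.
```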

Bottom line? ACT throws a kink in the test-optional argument (according to this one random research report). And SAT (r = .29) can't be ignored. Look on page 14 of the PDF at the R+ column... That self-efficacy number blew my mind! And they came at it with both performance and academic efficacy. Yes, the sample size for performance was too small, but academic was meaty. Grade goal sample size was also too small for my liking... I feel more digging around the internet is in order!

If I were a teacher, I might consider (additional?) goal setting at the individual level (may be unrealistic?). Then I would (further) focus efforts on student self-sufficiency. Teach a kid to fish... I guess? Pretty wild conclusions... will have to sleep on them! The collective classroom (approach) takes a real hit with this research...

 
Old 02-20-2014, 12:50 PM
 
lkb0714
16,825 posts, read 17,720,029 times
Reputation: 20852
Quote:
Originally Posted by golfgal View Post
I don't have a problem with them not identifying the schools...the issue I have is they don't back up their conclusions with any data. What were the GPAs in high school, what are they now in college, what test scores were submitted and not submitted? Let's see some actual numbers backing up the conclusions.
It is a violation of FERPA to release ANY of that. Following federal law is not a scientific flaw. And reporting a statistic like r along with its p-value is more meaningful than an appendix of hundreds of thousands of raw data points.

You are asking for a degree of data that is not part of the scientific standard. No peer-reviewed article publishes the raw data, which is why they always assign a corresponding author. If you have questions about the science, you ask them.
 
Old 02-20-2014, 12:52 PM
 
lkb0714
16,825 posts, read 17,720,029 times
Reputation: 20852
Quote:
Originally Posted by Edu.Architect View Post
Not sure how intense I am allowed to go, but if you guys are cool going down the rabbit hole, would love to get your take. [...]
An r value by itself is meaningless. Absolutely meaningless. If you don't give significance, then you might as well just report a mean without the SE.
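For what it's worth, the significance of a Pearson r hinges on sample size: the standard test converts r to a t statistic with n - 2 degrees of freedom. A quick sketch with hypothetical sample sizes:

```python
import math

# Standard significance test for a Pearson correlation: under the null
# hypothesis of zero true correlation, t = r * sqrt((n - 2) / (1 - r**2))
# follows a t distribution with n - 2 degrees of freedom.
def t_stat(r, n):
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# The same r = .30 is nowhere near significant at n = 10 but is
# overwhelmingly significant at n = 1000 (hypothetical sample sizes):
print(round(t_stat(0.30, 10), 2))    # 0.89 (well below the ~2.31 cutoff at 8 df)
print(round(t_stat(0.30, 1000), 2))  # ~9.93
```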
 
Old 02-20-2014, 01:09 PM
 
sheena12
Location: The New England part of Ohio
24,097 posts, read 32,437,200 times
Reputation: 68283
I think their usefulness has diminished in recent years because so many students aggressively prepare for these tests.

They are taught to take a test and to do well at it. How is that predictive of college success?
 
Old 02-20-2014, 05:04 PM
 
2,401 posts, read 3,255,451 times
Reputation: 1837
Quote:
Originally Posted by sheena12 View Post
I think their usefulness has diminished in recent years because so many students aggressively prepare for these tests.

They are taught to take a test and to do well at it. How is that predictive of college success?
Well, if the students are able to study hard for the test while not sacrificing other aspects of their college application, doesn't that show something about their work ethic and intelligence?
 
Old 02-20-2014, 05:27 PM
 
golfgal
20,793 posts, read 61,282,830 times
Reputation: 10695
Quote:
Originally Posted by lkb0714 View Post
It is a violation of FERPA to release ANY of that. Following federal law is not a scientific flaw. And giving a statistical test, like r, and then giving the p-value, is more meaningful than having an appendix of hundreds of thousands of raw data points.

You are asking for a degree of data that is not part of the scientific standard. No peer reviewed article publishes the raw data, which is why they always assign a corresponding author. If you have questions about the science, you ask them.
Not in the context of a study where no identifying information is released and the data is categorical, such as: students with a high school GPA of 3.5 had an average college GPA of 3.2 +/- .2, or whatever..... Never mind that colleges are already required to release that data to the Common Data Set, at least for test scores.
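A quick sketch of the kind of de-identified, categorical summary described here, with entirely made-up numbers (no individual student is recoverable from output like this):

```python
import statistics
from collections import defaultdict

# Entirely hypothetical (HS GPA bucket, college GPA) pairs:
records = [
    (3.5, 3.3), (3.5, 3.1), (3.5, 3.2),
    (3.0, 2.8), (3.0, 3.0), (3.0, 2.7),
]

# Group college GPAs by high-school GPA bucket.
buckets = defaultdict(list)
for hs_gpa, college_gpa in records:
    buckets[hs_gpa].append(college_gpa)

# Report mean college GPA +/- standard deviation per bucket.
for hs_gpa in sorted(buckets, reverse=True):
    grades = buckets[hs_gpa]
    print(f"HS GPA {hs_gpa}: college GPA "
          f"{statistics.mean(grades):.2f} +/- {statistics.stdev(grades):.2f} "
          f"(n={len(grades)})")
```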

Quote:
Originally Posted by Edu.Architect View Post
Not sure how intense I am allowed to go, but if you guys are cool going down the rabbit hole, would love to get your take. [...]
The SAT measures math and critical reading, as well as writing, which some schools care about but most do not. If you are not a strong math student, you aren't going to score well...but maybe you are an amazing writer, do well on that part, and still get a 4.0 in college because you play to your strengths. That is the drawback of the SAT. They have since added the subject tests, but most colleges do not use those. The ACT measures math, science, social science, English, and writing, so it hits more of the subject areas, and if you aren't as strong in one area, it doesn't hurt your score as much. More and more schools prefer the ACT because of this; it shows a clearer picture of the overall student.