Quote:
Originally Posted by Judge_Smails
3. Results in Russia not looking good at all.
If I wanted to defend Russia, it would be easy: the national ranking is calculated by simply pooling all the data. (400 from an Asian or Caucasian republic + 550 from some Russian region) / 2 = 475. The score would be a lot higher if population were taken into account.
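A minimal sketch of that arithmetic (the region names and population figures are made up for illustration): an unweighted mean of regional means can sit well below a population-weighted mean when the high-scoring regions are also the populous ones.

```python
# Hypothetical (score, student population) pairs -- assumed numbers,
# not actual PISA data.
regions = [
    (400, 1_000_000),   # e.g. a small republic
    (550, 9_000_000),   # e.g. a large Russian region
]

# Unweighted mean of regional means, as in the (400 + 550) / 2 example.
unweighted = sum(score for score, _ in regions) / len(regions)

# Population-weighted mean: each region counts in proportion to its students.
weighted = (sum(score * pop for score, pop in regions)
            / sum(pop for _, pop in regions))

print(unweighted)  # 475.0
print(weighted)    # 535.0
```

With these assumed weights the gap is 60 points, which is the whole argument: pooling without weighting drags the national figure toward the small, low-scoring regions.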
The test also probably suffers from good schools giving it to their best students, while ordinary schools don't care. Otherwise Moscow wouldn't score that much higher than all the other regions; given its immigrant population, it should actually score much lower than most.
Quote:
But OK. Can you describe which types of problem solving PISA fails to identify that Russian students excel at?
This test is alien to Russians.
Some criticism:
https://docs.google.com/viewer?a=v&q...q2OvKAEMKAQDQA
"students sometimes just misunderstood what the item writer meant to ask."
"it becomes apparent what the competence values actually measure: no more and no less than the number of right responses."
"This suggests that little bias is needed to distort test results far beyond their nominal […]"
"disparities in student sampling"
"one-dimensional 'competence' scale is neither technically convincing nor culturally fair."
"The ways students react to the lack of time vary considerably between countries: Dutch students try to answer almost every item. Towards the end of the test they become hasty and increasingly resort to guessing. Austrian and German students skip many items, and they do so from the first block on, which leaves them enough time to finish the test without […] Greek students, in contrast, seem to be taken by surprise by the time […] better than in Portugal and not far away from the USA and Italy. In the last block, however, non-reached items and missing responses add up to 35%, bringing Greece down to one of the last ranks."
"Between-country variance may be due for instance to school curricula, cultural background, test language, or to a combination of several factors. These factors are particularly influential in PISA because students have little time (about 2′20″ per item) and reading texts are too long."
"If the languages differ, correlations are at best about 0.96, as for the Czech and Slovak Republics. If the languages do not belong to the same stem, correlations are hardly larger than 0.94. While some countries belong to large clusters, others like Japan and Korea are quite isolated (no correlation larger than 0.90). These results have immediate implications for the validity of inter-country comparisons."
"Thirdly, it is clear from the outset that little can be learned when something as complex as a school system is characterised by something as simple as the average number of solved test items."
Quote:
And what results in what years are totally different. By my reckoning, the top of the table looks pretty much the same year after year.
At the very least, the regional results within Russia are BS. That's to be expected, since the sample size in each region is basically zero.
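A rough sketch of why tiny regional samples make region-level rankings noisy (the sample sizes are assumed for illustration; PISA scores are scaled so that the standard deviation is roughly 100): the standard error of a mean shrinks only with the square root of the sample size.

```python
import math

SD = 100  # approximate standard deviation of scaled PISA scores

# Standard error of the mean for a few assumed regional sample sizes.
for n in (30, 300, 5000):
    se = SD / math.sqrt(n)
    print(f"n={n}: standard error ≈ {se:.1f} points")
```

With only a few dozen students per region, the uncertainty in a regional mean is on the order of 15 to 20 points, which is comparable to the gaps being used to rank the regions in the first place.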