Assessment - China's Pisa prowess to come under pressure

News | Published in TES magazine on 6 December 2013 | By William Stewart

Dominance will be challenged when nation enters as a whole

China will enter the next round of the world’s most influential international education study as a whole country for the first time, TES can reveal.

The news comes in the week that the Programme for International Student Assessment (Pisa) results from 2012 were released, showing that the Chinese city of Shanghai had climbed even higher above the other 64 participants in all three test areas of maths, reading and science.

Much of East Asia is excelling in education according to Pisa, occupying the top seven places in the rankings for maths, the main focus of the latest assessment. Two other regions in China, Hong Kong and Macao, came in third and sixth place respectively. Singapore, a city state where three-quarters of the population are Chinese, came second, and Taiwan, another Chinese nation, finished fourth.

Many more Chinese students are set to take Pisa tests in 2015 because “China” rather than “Shanghai” is registered to take part. A spokesman for the Organisation for Economic Co-operation and Development (OECD), which runs Pisa, told TES: “I understand they expect enough Chinese regions to take part so they will be able to have a single (score) for the country as a whole.”

The inclusion of 15-year-olds from poor provinces far removed from China’s most dynamic city could tarnish the country’s stellar Pisa record when the next set of results is published in 2016.

This week’s results have prompted soul-searching in Western countries - such as former Pisa star Finland, where performance declined in all three areas, and the UK, where it flatlined - as well as speculation about the secret of Chinese success.

John Bangs, chair of the OECD Trade Union Advisory Committee’s working group on education, said: “The OECD insists that Shanghai’s success is genuine and it is not about rote learning and doing practice tests. I am not so sure.

“I don’t think any systematic study of whether or not there have been practice tests in individual countries that have done well has taken place. The OECD ought to do that study.”

But Andreas Schleicher, the OECD’s deputy education director, said Pisa evidence showed that East Asian educational achievement could not be explained by rote-learning stereotypes. “What is very important is that this is not a success (based on) drilling (students),” he said. “The emerging strength of East Asian systems is actually in the creative capacity of students to think mathematically and have advanced mathematical reasoning skills.”

Mr Schleicher also said that Shanghai authorities had done more than most governments to ensure that all children, and not just the elite, received a good education. But as Tom Loveless, a former teacher and a policy professor at Harvard University, has argued, Shanghai is not typical of the rest of China.

Writing for US thinktank the Brookings Institution, he notes that about 84 per cent of high-school graduates in the city - which traditionally attracts the country’s elite - go to university, compared with 24 per cent nationally. And its per capita GDP is more than twice that of China as a whole. This wealth allows Shanghai’s parents to spend much more on private tutors than typical Chinese workers.

Students in 12 Chinese provinces have in fact already taken Pisa tests, Mr Schleicher has revealed, although the results were never released. “Even in some of the very poor areas you get performance close to the OECD average,” he said in 2010.

This suggests that when the Chinese score is published in 2016 it will be nowhere near as high as that of Shanghai, where, according to Pisa, students are now “the equivalent of nearly three years of schooling” above the OECD average in maths.

Cambridge International Examinations chief executive Michael O’Sullivan, who worked for the British Council in China for seven years, said: “Anyone in China will tell you that the overall results won’t be as high as Shanghai, which is known for the quality of its state schools and the high aspirations of its families.”

Professor Loveless writes that even the scores from poorer provinces in China, referred to by Mr Schleicher, could be too high to be representative of 15-year-olds in those areas. This is because Pisa takes its participants from students who are actually at school, but secondary attendance rates in poor, rural areas of China are as low as 40 per cent and middle-school dropout rates could be as high as 25 per cent, he says.

The students who remain in school are likely to be from families that can afford the high school fees and are “strongly committed to formal education”, the academic writes.

This is a problem that the OECD recognised earlier this year in relation to poorer countries participating in Pisa. The organisation published a paper acknowledging that the most disadvantaged students in some low- and middle-income countries were missed by the study, which was confined to “an already relatively privileged student population”.

But Mr Schleicher stressed the equity that Pisa had uncovered in Shanghai. “The biggest performance gains in Shanghai have not been at the top,” he said. “They have been in the migrant groups and the poor schools.”

Asked to comment on the theory that “Tiger mothers” - ambitious, uncompromising parents - were the secret of China’s success, he accepted that the huge expectations placed on students were a factor. “It is not just the mothers,” he added. “It is also the grandmothers, it is the entire family, it is the schools.”

Mr Bangs said he believed that European countries could gain from examining the pedagogical approaches used in East Asia. “But there is a question about the kind of pressure those kids are under and whether or not those pressures stamp out confidence and innovation,” he added.

Read an investigation of Pisa on pages 28-32 and see the Pisa results table on pages 34-35

Case study: UK

The UK’s “stagnant” performance in Pisa 2012 has been seized upon by politicians of every hue, who are eager to use it to blame their opponents for the country’s apparent failings.

The UK was ranked 26th in maths, 23rd in reading and 21st in science, broadly similar to its performance in Pisa 2009. According to England’s education secretary Michael Gove, this underlines the “urgent need” for his reform agenda.

But could the changes being introduced by the government lead to the improvements seen elsewhere, such as in the Far East?

Speaking at the Pisa press conference on Monday, Andreas Schleicher, the OECD official responsible for the rankings, said that no judgements could be made about the education secretary’s changes until the 2015 results were released in three years’ time.

A central pillar of Mr Gove’s reforms has been to increase school choice by introducing state-funded, autonomous free schools and boosting the number of academies. But Mr Schleicher appeared to play down the importance of choice. “You expect competition to raise performance of the high performers and with low performers put them out of the market,” he said. “But in fact you don’t see that. Competition alone is not a predictor for better outcomes.

“The UK is a good example: it has a highly competitive school system but it is still only an average performer.”

Likewise, he said that school autonomy was a good predictor of performance, but only if teachers were actively involved in the management of their schools. “UK schools already have great levels of autonomy so you would predict UK schools to do better than they actually do,” he said. “But the autonomy really needs to be embedded in a strong education system.”

He added that stronger, centralised exams worked well with greater autonomy, which chimed with Mr Gove’s overhaul of GCSEs.

Mr Schleicher’s statement was echoed by Professor Robert Coe of Durham University, who said that the UK’s performance seemed to “underline the view” that improvements in exam results had more to do with “grade inflation than real improvement over time”.

Richard Vaughan

Case study: Finland

Finland learned a week early that it had lost its position as world star of education when its Pisa results were leaked to a Helsinki newspaper.

“Since that day there has been an outcry in Finland: ‘everything is lost’,” said Annu Marganen, a reporter at Finnish news agency STT. “We have lost Nokia, our economy is not going well, and now this.”

Ms Marganen was among the guests at an event held by the Finnish embassy in London a few hours after the official release of the results, in which the country fell from sixth to 12th position in maths. Yet the mood in the ambassador’s residence was not one of doom and gloom. This was partly because the results were no surprise to the Finnish academics present: years of internal assessments had already shown a downward trend in performance.

Although the Finnish education system remains egalitarian in comparison with other countries, it has grown less equitable, with a growing “tail” of underachievers concentrated in less advantaged schools.

Venla Bernelius, a researcher at the University of Helsinki, said this was the result of parental choice over schools being introduced to different parts of Finland since the mid-1990s, which she said had increased social segregation.

Ms Bernelius predicted that the disappointing result would not lead Finland to emulate the focus on tests in the Asian countries that have overtaken it. Instead it would likely explore more novel and technology-based forms of learning, she said. “I think many people in Finland will actually be happy, not because of the decline in the results but the possibility to use this as an opportunity for reform,” she added.

Some Finns may also take solace from the fact that their country’s slip in the league tables was not as dramatic as that of its historic rival Sweden, which fell from 19th to 36th place in the reading rankings.

Pekka Isosomppi, a press officer at the Finnish embassy, felt there could be a positive side to being out of the Pisa limelight. “It’s a bit like not winning the Eurovision Song Contest,” he said.

Michael Shaw

Case study: US

As a whole, the performance of the US appears to have flatlined when this week’s Pisa results are compared to those from three years ago. It is below the OECD average in maths and science, coming in at 36th and 28th place respectively, and around average in reading, in 24th position.

But when Massachusetts, Connecticut and Florida - the three states that took part in Pisa - are examined individually, the data tells a different story.

In each of the three subject areas, Florida stubbornly lingered at or below the OECD average, beneath the UK and Russia. By contrast, Massachusetts was ranked among the top performers - including South Korea and Finland - in reading and science, and above average in maths, placing it alongside Germany. Similarly, Connecticut performed above the OECD average and the US average in all subjects.

Florida and Massachusetts have undergone significant programmes of reform but, judging by their Pisa scores, with very different degrees of success. Under Jeb Bush, the former governor of Florida and a potential Republican candidate in the 2016 presidential election, the Sunshine State established a significant number of charter schools and placed greater emphasis on standardised test scores. Massachusetts, meanwhile, has largely been suspicious of charter schools, instead opting to focus on raising standards through, for example, tougher assessments.

Responding to the overall performance of the US, education secretary Arne Duncan said that the Pisa rankings showed a “picture of educational stagnation” that was at “odds with our aspiration to have the best-educated, most competitive workforce in the world”.

It would be easy to blame the US variation in performance on the fact that it is a large, economically diverse country, but in every subject it was outperformed by Vietnam, which has a much higher proportion of disadvantaged students.

Across the US, 45 states have adopted the Common Core State Standards - the first attempt at a national curriculum - which they hope will drag the country’s Pisa test scores up in the future.

Richard Vaughan

Behind the headlines

“These days, it seems you have to apologise for producing league tables,” Andreas Schleicher said as he introduced the Pisa findings. But the OECD official refused to say sorry, stating that the tables were “cross-nationally valid”, “robust” and “just a tiny part of the Pisa story”.

Inevitably it was the rankings that made the headlines this week. Stories in the UK suggested that the country had “dropped down the league” and out of the top 20 for the first time. In fact, the UK rose up the rankings very slightly in reading and maths, with small increases in its scores. In science, the UK fell from 16th to 21st in the printed table. But its score was the same as that of 20th-placed Slovenia and was unchanged from Pisa 2009.

As the OECD admits, such small changes in the rankings are not statistically significant because “large variation in single ranking positions is likely” owing to “the uncertainty that results from sample data”. And that is before you consider the criticisms from academics, already published by TES, who have said that the rankings are “meaningless”.

A wealth of data is contained in Pisa’s findings. But concerns about its methodology, together with rival surveys offering a very different picture, mean it is wise to treat the rankings with caution and the study as a useful source of information, not a bible.

William Stewart

Maths top 10

Countries and regions with the highest mean scores in maths, Pisa 2012

1. Shanghai, China (613)

2. Singapore (573)

3. Hong Kong, China (561)

4. Taiwan (560)

5. South Korea (554)

6. Macao, China (538)

7. Japan (536)

8. Liechtenstein (535)

9. Switzerland (531)

10. Netherlands (523)

