Drivers of student performance: Insights from Europe

We analyzed OECD PISA results from Europe. What did we learn about student mindsets, teaching practices, and technology?

A well-educated citizenry is an economic and social necessity. But there is little consensus about what it takes to deliver a quality education. Our latest research on this critical subject offers global findings as well as deep regional analysis—the focus of this article is Europe.

In two previous reports, one on the world’s best-performing school systems (2007) and the other on the most improved ones (2010), we examined what great school systems look like and how they can sustain significant improvements from any starting point. In this report, we switch our focus from systems to student-level performance, applying advanced analytics and machine learning to the results of the Program for International Student Assessment (PISA), a project of the Organisation for Economic Co-operation and Development (OECD). Beginning in 2000 and every three years since, the OECD has tested 15-year-olds around the world on math, reading, and science. It also surveys students, principals, teachers, and parents on their social, economic, and attitudinal attributes (Exhibit 1).

PISA is a rich set of assessment and survey data.

Using this rich data set, we have created five regional reports that consider what factors drive student performance. In this one, we analyze the results of 27 European Union (EU) countries and 12 non-EU countries in the region that participated in the 2015 PISA. Europe is a large and diverse region, and the PISA results reflect this, with performance ranging from poor to great. As a whole, the EU’s performance has been flat since 2006; performance in non-EU countries has improved slightly. There is a clear imperative for the lower-performing countries to improve faster, and for the more developed European systems not only to maintain performance but also to innovate in order to prepare students for their future.

This research is not intended as a road map to system improvement; that was the theme of our 2010 report, which set out the interventions that school systems need to undertake to move from poor to fair to good to great to excellent performance. Instead, we examine three factors that we found to be particularly important to student outcomes: student mindsets, teaching practices, and information technology.

Student mindsets have more influence on outcomes than socioeconomic background.

It is hardly news that students’ attitudes and beliefs—what we term their “mindsets”—influence their academic performance. The magnitude of this effect, and the question of which mindsets matter most, are still under debate, and that is where we focused our research. We know from years of academic research that student socioeconomic status matters for student performance. We therefore measured the effect of mindsets that is not explained by socioeconomics alone.

By analyzing the PISA data, we found that in Europe mindset factors explain a greater proportion of a student’s PISA score (29 percent) than even the home environment (18 percent) (Exhibit 2). In all other regions we surveyed, mindsets have at least double, and up to triple, the impact of home environment on PISA results, a pattern that reinforces the importance of this finding. Mindsets matter everywhere.
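One way to make this kind of comparison concrete is a permutation-based variance decomposition: fit a predictive model on grouped survey features, then measure how much explained variance is lost when each group is shuffled. The sketch below uses entirely synthetic data and illustrative feature groupings; it is not the report's actual methodology or the real PISA variable set.

```python
# Sketch: share of score variance attributable to feature groups
# (mindsets vs. home environment), via the drop in R^2 when a group
# is permuted. All data here is synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
# Synthetic stand-ins for survey indices (illustrative names only).
mindset = rng.normal(size=(n, 3))    # e.g. motivation calibration, anxiety, belonging
home_env = rng.normal(size=(n, 2))   # e.g. parental education, home possessions
X = np.hstack([mindset, home_env])
# Simulated science score: mindsets weighted more heavily than home environment.
score = 500 + 25 * mindset.sum(axis=1) + 12 * home_env.sum(axis=1) + rng.normal(0, 40, n)

model = GradientBoostingRegressor(random_state=0).fit(X, score)
base_r2 = r2_score(score, model.predict(X))

def group_importance(cols):
    """R^2 lost when a feature group is shuffled (permutation importance)."""
    Xp = X.copy()
    for c in cols:
        Xp[:, c] = rng.permutation(Xp[:, c])
    return base_r2 - r2_score(score, model.predict(Xp))

print("mindset share of explained variance:", group_importance([0, 1, 2]))
print("home-environment share:", group_importance([3, 4]))
```

Because the simulation gives mindset features the larger true weight, the permutation loss for the mindset group comes out larger, mirroring the pattern the PISA analysis reports.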

Mindsets eclipse even home environment in predicting student achievement.

Some mindsets are more important than others. In the 2015 PISA assessment, the most predictive mindset is the ability to identify what motivation looks like in day-to-day life (including doing more than expected and working on tasks until everything is perfect). We call this “motivation calibration,” as it involves a student “calibrating” what types of behaviors motivated students exhibit. Motivation calibration’s impact on the PISA science score is more than twice the impact of self-identified motivation (wanting to be the best and wanting to get top grades). Students who had good motivation calibration scored 12 to 13 percent (or 50 to 60 PISA points) higher on the science test than poorly calibrated ones. In contrast, scores are just 5 percent higher for students with high self-identified motivation than for those without. These relationships hold after controlling for socioeconomic status, location, and type of school.

The motivation calibration relationship is particularly strong for students in poorly performing schools, where having a well-calibrated motivation mindset is equivalent to vaulting into a higher socioeconomic status. In these schools, students from the lowest socioeconomic quartile who are well calibrated perform better than those from the highest socioeconomic quartile who are poorly calibrated (Exhibit 3).
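A controlled comparison of this kind can be sketched as an ordinary least-squares regression in which the mindset measure enters alongside socioeconomic, location, and school-type covariates. Everything below (data, coefficients, variable names) is synthetic and illustrative, not drawn from PISA.

```python
# Sketch: does a mindset effect survive controls? Linear regression with
# socioeconomic status (SES), location, and school type as covariates.
import numpy as np

rng = np.random.default_rng(1)
n = 3000
ses = rng.normal(size=n)                # socioeconomic index (simulated)
urban = rng.integers(0, 2, size=n)      # location dummy
private = rng.integers(0, 2, size=n)    # school-type dummy
# Motivation calibration is itself correlated with SES in this simulation.
calib = 0.4 * ses + rng.normal(size=n)
score = 500 + 20 * ses + 25 * calib + 5 * urban + 8 * private + rng.normal(0, 40, n)

# Design matrix with an intercept and all controls.
X = np.column_stack([np.ones(n), calib, ses, urban, private])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print("calibration coefficient, net of controls:", round(beta[1], 1))
```

With the controls in the design matrix, the estimated calibration coefficient recovers the simulated effect (25) rather than absorbing the SES correlation, which is the logic behind the "after controlling for" qualifier above.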

Having a well-calibrated motivation mindset is equivalent to leapfrogging into a higher socioeconomic quartile.

Other mindsets that are predictive of student outcomes include believing that one’s school science work will be useful for one’s future career, having little test anxiety, and having a strong sense of belonging to one’s school. We also found that students with a strong growth mindset (those who believe they can succeed if they work hard) outperform students with a fixed mindset (those who believe that their capabilities are static) by 11 percent in EU countries and 15 percent in non-EU countries.

The prevalence of beneficial mindsets varies between boys and girls. While girls are more likely to have strong motivation calibration, they are also more likely to have high levels of test anxiety.

To be clear, mindsets alone cannot overcome economic and social barriers, and researchers are debating the extent to which parental or school-system-level interventions can shift student mindsets. Our research does, however, suggest that mindsets matter a great deal, particularly for those living in the most challenging circumstances.

Students who receive a blend of inquiry-based and teacher-directed instruction have the best outcomes.

High-performing and fast-improving school systems require high-quality instruction. It’s that simple—and that difficult. We evaluated two types of science instruction to understand how different teaching styles affect student outcomes. The first is “teacher-directed learning,” in which the teacher explains and demonstrates scientific ideas, discusses questions, and leads classroom discussions. The second is “inquiry-based teaching,” which includes a diverse range of practices from conducting practical experiments to understanding how science can be applied in real life, to encouraging students to create their own questions.

Our research found that student outcomes are highest with a combination of teacher-directed instruction in most to all classes and inquiry-based teaching in some classes (Exhibit 4). If all students experienced this blend of instruction, average PISA scores in Europe would be 3.7 to 4.2 percent (or 19 PISA points) higher, equivalent to more than half a school year of learning. Currently over half of European students are receiving too little teacher-directed instruction.

Finding the sweet spot: The best EU student outcomes combine both teaching styles.

It is also important to note that some kinds of inquiry-based teaching are better than others. In Europe, more structured inquiry-based activities yield higher PISA scores. Explaining how a science concept can be applied to a real-world situation improves scores in both EU and non-EU countries. Conducting and drawing conclusions from scientific experiments also improves scores significantly. Less structured methods of inquiry, however, such as allowing students to design their own experiments, result in lower scores across the board.

Given the strong support for inquiry-based pedagogy, this seems counterintuitive. We offer two hypotheses. First, students cannot progress to inquiry-based methods without a strong foundation of knowledge, gained through teacher-directed learning. Second, inquiry-based teaching is inherently more challenging to deliver, and teachers who attempt it without sufficient training and support will struggle. Better teacher training, high-quality lesson plans, and school-based instructional leadership can help. So can giving principals and teachers the confidence to focus on the best forms of inquiry-based teaching while not using these methods exclusively.

While teacher-directed instruction has the most positive impact on PISA scores, inquiry-based practices do better in promoting students’ joy in science and instilling the belief that doing well in school will help them have a brighter future. We believe that is why blending teacher-directed instruction with inquiry-based teaching produces the greatest overall benefit across Europe.

While technology can support student learning outside of school, its record inside school is mixed. The best results come when technology is placed in the hands of teachers.

Screens are not the problem when it comes to student outcomes—but neither are they the answer. Our research examined the effects of students’ first exposure to information and communications technology (ICT) and of 15-year-olds’ current ICT use, both at home and at school. Students who reported their first digital exposure before the age of six score 9 to 16 percent higher than those exposed at age 13 or later (controlling for socioeconomic status, school type, and location). Higher-socioeconomic-status students are more likely to start using devices at an early age. They also get more benefit from early exposure, which has worrying implications for the equity gap.

At home, one to four hours of Internet use per day for 15-year-olds is associated with the highest science performance, 10 to 13 percent (or 45 to 61 PISA points) higher than for students with no after-school Internet use (again, after controlling for socioeconomic status, school type, and location). There appears to be declining impact—and possibly negative behavioral implications—when students spend four hours or more a day before a screen.
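An inverted-U pattern like this is typically checked with a simple dose-response binning: group students by daily hours of use and compare mean scores across the bins. The sketch below builds the inverted-U shape into synthetic data purely to illustrate the mechanics; the cutoffs and numbers are not the report's.

```python
# Sketch: dose-response binning of daily Internet hours vs. score.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
hours = rng.uniform(0, 8, size=n)       # daily Internet hours (simulated)
# Simulated science score with peak benefit around 1-4 hours of use.
score = 460 + 60 * np.exp(-((hours - 2.5) ** 2) / 4) + rng.normal(0, 30, n)

edges = [1, 4]                          # bin boundaries: <1h, 1-4h, >4h
idx = np.digitize(hours, edges)         # bin index 0, 1, or 2 per student
for i, label in enumerate(["<1h", "1-4h", ">4h"]):
    print(f"{label}: mean score {score[idx == i].mean():.1f}")
```

In this simulation the moderate-use bin has the highest mean, with the heavy-use bin falling off, the same qualitative shape described above.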

The impact of ICT use on students during the school day is much more mixed: from –16 percent to +12 percent, depending upon the type of hardware. Most important, we found that deploying ICT to teachers, rather than students, works best. For example, in non-EU countries, adding one teacher computer per classroom has over ten times the impact of adding a student computer to that same classroom. Across Europe, some student-based classroom technologies, such as laptops, tablets, and e-book readers, actually appear to hurt performance (Exhibit 5).

Technology directed to teachers boosts PISA scores, while many kinds of technology directed to students appear to hurt them.

These results, however, describe the impact of education technology as currently implemented, not its eventual potential. They evaluate only hardware, not software, and do not account for rapid evolution. Even so, European leaders should not assume the impact of ICT will always be positive or even neutral. Systems should ensure that ICT programs are integrated with curriculum and instruction and are supported by teacher professional development and coaching.


As we share these three findings, we are mindful of their limits. One cannot find definitive answers from a single source, no matter how broad or well-designed it is. The direction of causality, sample sizes, missing variables, and nonlinear relationships are all relevant issues. Many questions still need to be resolved through a thoughtful research agenda and longitudinal experimentation. That said, we believe that these three findings provide important insights into how students succeed—and that European educators should incorporate them into their school improvement programs to deliver the progress that their students deserve.

Download the full report on which this article is based, Drivers of student performance: Insights from Europe (PDF–5.3 MB) and the executive summary in French (PDF–639 KB).

About the author(s)

Etienne Denoël is a senior partner in McKinsey’s Brussels office, Emma Dorn is a specialist in the Silicon Valley office, Andrew Goodman is a partner in the London office, Jussi Hiltunen is a partner in the Helsinki office, Marc Krawitz is an associate partner in the New Jersey office, and Mona Mourshed is a senior partner in the Washington, DC, office.

The authors deeply thank the many people who supported us in bringing this report to fruition. We especially thank David Thomas and Frédéric Panier for their insight on education in Europe. We are grateful for the invaluable guidance of our analytics leadership: Rafiq Ajani, Taras Gorishnyy, and Sacha Litman. We thank our dedicated data engineer and data scientist colleagues: April Cheng, Sujatha Duraikkannan, Roma Koulikov, Devyani Sharma, and Avan Vora. And we are grateful for the substantial contributions from our colleagues Anne-Marie Frassica, Joy Lim, Esteban Loria, Miriam Owens, Corinne Spears, Amy Tang, and Paul Yuan. We further acknowledge the external thought-leaders and experts who provided counsel and expertise. Finally, this report would not have been published without the support of our editor Cait Murphy, the design creativity of Nicholas Dehaney at Spicegrove Creative, and the committed support of many local translators and designers.
