It is critical that PISA results be considered alongside other measures of Australian student performance to ensure policy decisions are based on a broad understanding of quality education, writes The Evidence Institute.

What is PISA?

The Programme for International Student Assessment (PISA) is administered every three years by the Organisation for Economic Co-operation and Development (OECD). It was first implemented in 2000 and tests students from participating OECD and partner (non-OECD) countries and provinces around the world.

PISA aims to understand how effective schooling systems are at preparing students for future work and study, as defined by the OECD. Countries and provinces are ranked against the OECD average (mean) and against each other, producing international comparisons. This information is often used as a driver for educational improvement, policy review and reform internationally.

What does PISA assess?

PISA is designed by education experts from participating countries and provinces to test real-world application of the skills taught in schools. The focus is on assessing capabilities in the domains of reading, mathematical and scientific literacy, rather than subject-specific content or knowledge. The assessment operates in a three-year cycle, with one major domain and two minor domains each cycle. For example, the major domain in PISA 2018 was reading literacy, with mathematical and scientific literacy as the minor domains.


 PISA's focus is on assessing capabilities in the domains of reading, mathematical and scientific literacy, rather than subject-specific content or knowledge.


These domains are defined by the OECD and often differ from the definitions used by countries and provinces to frame their education policies and curricula. Because definitions form the basis of assessments, when these differ the assessments may not target the same skills, capabilities and knowledge. For example, PISA’s definition of reading literacy differs from definitions used in Australia. PISA’s “innovative” definition “refers to students’ capacity to apply knowledge and skills, and to analyse, reason and communicate effectively as they identify, interpret and solve problems in a variety of situations.”1 In contrast, a comparable part of the definition of literacy in the NSW English K–10 Syllabus (2012) states: “Being ‘literate’ is more than the acquisition of technical skills: it includes the ability to identify, understand, interpret, create and communicate purposefully using written, visual and digital forms of expression and communication for a number of purposes in different contexts.”2 Consequently, comparing PISA results with other high-stakes assessments can reveal contradictions: for example, declining PISA results contradict NSW’s improving HSC results across many subjects, including English.3

When using disparate results to evaluate the quality of education, understanding the purpose of each assessment and what it reveals is essential for drawing valid conclusions.

PISA is not a static assessment, which has implications for trend data and the ability to compare results over time. The OECD periodically reviews domain definitions and testing procedures to ensure that these are contemporary and reflective of changing theory and best practice. This is desirable, but it is unclear how this impacts trend data.4


Each round of PISA results reflects differences in educational performance for entirely different groups of 15-year-old students ... and this should be kept in mind when comparing Australia’s performance against other countries and provinces, and PISA results across years.


Who participates?

OECD countries and provinces participate in PISA voluntarily. Non-OECD countries can choose when and how often they participate. This means that in every PISA cycle, different countries and provinces may be represented. The OECD average is based on data collected from OECD countries and does not include partner countries and provinces.

In Australia, students from all states and territories, and all schooling systems (government, Catholic and independent) participate, providing a representative sample of 15-year-olds. Other countries can be represented by selected cities and/or provinces. Also, each round of PISA results reflects differences in educational performance for entirely different groups of 15-year-old students. PISA is not a longitudinal study: the same students are not tracked over time. These points should be kept in mind when comparing Australia’s performance against other countries and provinces, and PISA results across years.


It is critical that PISA results be considered alongside other robust measures of Australian student performance to ensure decision-making is based on a multi-dimensional understanding of quality education.


What does this mean for the future?

PISA measures students’ abilities to use science, reading and mathematics knowledge and skills to meet real-life challenges through analysing, reasoning and communicating ideas effectively. Australia’s performance in PISA suggests that these remain priorities for education jurisdictions to ensure students are well-prepared for the future. However, it is critical that PISA results be considered alongside other robust measures of Australian student performance. This will ensure that any decision-making is based on a multi-dimensional understanding of quality education. The NSW Curriculum Review presents a timely opportunity to explore evidence-based options for reforming the content and structure of the current curriculum, ensuring it best serves Australia’s educational aspirations.

  1. OECD. (2019). PISA 2018 assessment and analytical framework. Paris: OECD Publishing, p. 25. Retrieved from https://doi.org/10.1787/b25efab8-en
  2. NSW Education Standards Authority (NESA). (2012). NSW English K–10 Syllabus. Retrieved from https://educationstandards.nsw.edu.au/wps/portal/nesa/k-10/learning-areas/english-year-10/english-k-10/learning-across-the-curriculum
  3. See NSW Education Standards Authority (NESA) for data
  4. Goldstein, H. (2017). Measurement and evaluation issues with PISA. In The PISA effect on global educational governance (pp. 49–58). New York and London: Routledge.