Report

Description

How students experience university plays a major role in their academic, personal and professional success. Over the last decade Australian universities and governments have placed considerable emphasis on key facets of the student experience such as skills development, student engagement, quality teaching, student support, and learning resources. Reflecting this, a project was conducted in 2012 to furnish a new national architecture for collecting student feedback to understand and improve the student experience.

The University Experience Survey (UES) has been developed by the Australian Government to provide a new national platform for measuring the quality of teaching and learning in Australian higher education. The UES focuses on aspects of the student experience that are measurable, linked with learning and development outcomes, and for which universities can reasonably be assumed to have responsibility. The survey yields results that can be related to outcomes across differing institutional contexts, disciplines and modes of study. The UES provides new cross-institutional benchmarks that can aid quality assurance and improvement.

In 2012 the Department of Industry, Innovation, Science, Research and Tertiary Education (DIISRTE) engaged ACER to collaborate with CSHE and UWS to build on 2011 work and further develop the UES. The project was led by A/Professor Hamish Coates (ACER) and Professors Richard James (CSHE) and Kerri-Lee Krause (UWS), and was managed by Dr Rebecca Taylor and Ali Radloff (ACER). The work was informed by the UES Project Advisory Group (PAG).

The UES is based on an ethos of continuous improvement, and it is imperative that quality enhancement work be positioned at the front end rather than the lagging tail of data collection and reporting activity. Using survey data for improvement is the most important and perpetually the most neglected aspect of initiatives such as the UES, yet without improvement the value of the work is questionable. Recommendations were made to affirm the importance of reporting:

  • Recommendation 1: Interactive online UES Institution Reports should be developed to enhance the efficiency and reliability of reporting processes. This infrastructure should provide real-time information about fieldwork administration and student response.
  • Recommendation 2: A ‘UES National Report’ should be prepared for each survey administration that provides a broad descriptive overview of results and findings, and which taps into salient trends and contexts.
  • Recommendation 14: Strategies should be explored for international benchmarking, including the cross-national comparison of items, marketing the UES for use by other systems, or broader comparisons of concepts and trends.

Further development of the UES included extensive research, consultation with universities and technical validation. The survey instrument and its scales and items were further refined to be relevant to policy and practice and to yield robust, practically useful data for informing student choice and continuous improvement. Links were to be made with benchmark international collections. The future of the Course Experience Questionnaire (CEQ) was reviewed. The UES was developed for online and telephone-based administration. The following recommendations were made regarding the substantive focus of the data collection:

  • Recommendation 3: The core UES should measure five facets of student experience: Skills Development, Learner Engagement, Quality Teaching, Student Support and Learning Resources.
  • Recommendation 4: The UES items reproduced in Appendix E of this UES 2012 National Report should form the core UES questionnaire.
  • Recommendation 5: As an essential facet of its utility for continuous improvement, protocols should be adopted to facilitate the incorporation of institution-specific items into the UES.
  • Recommendation 6: Selected CEQ items and scales should be incorporated within an integrated national higher education survey architecture. The GTS, GSS, OSI, CGS, GQS and LCS scales and their 28 items should be retained in the revised national survey architecture, and the AWS, AAS, IMS, SSS and LRS scales and their 21 items should be phased out from national administration. The name ‘CEQ’ should be discontinued and the retained scales and items should be managed as a coherent whole. A review should be conducted after a suitable period (nominally three years) to determine whether the retained scales are incorporated or discontinued.

The 2012 UES was the first time in Australian higher education that an independent agency had implemented a single national collection of data on students’ university experience. The survey was also the largest of its kind. Planning for the 2012 collection was constrained by project timelines, requiring ACER to draw on prior research, proven strategies and existing resources used for other collections to design and implement 2012 UES fieldwork. Overall, 455,322 students across 40 universities were invited to participate between July and early October 2012, and 110,135 responses were received, an overall response rate of around 24 per cent. The national student population was divided into around 1,954 subgroups, with expected returns received for 80 per cent of these. Much was learned from implementing a data collection of this scope and scale, and the following recommendations were made:

  • Recommendation 7: Non-university higher education providers should be included in future administrations of the UES.
  • Recommendation 8: As recommended by the AQHE Reference Group, the UES should be administered independently of institutions in any future administration to enhance validity, reliability, efficiency and outcomes.
  • Recommendation 9: All institutions should contribute to refining the specification and operationalisation of the UES population, and in particular of ‘first-year student’ and ‘final-year student’. Protocols should be developed for reporting results that may pertain to more than one qualification. Institutions should be invited to include off-shore cohorts in future surveys.
  • Recommendation 10: Given its significance, a professional marketing capability should be deployed for the UES, working nationally and closely with institutions. To yield maximum returns, UES marketing and promotion should begin around nine months before the start of survey administration.
  • Recommendation 13: A UES engagement strategy should be implemented nationally as part of ongoing activities to enhance the quality and level of students’ participation in the process.

Given the scope, scale and significance of the UES, it is imperative that appropriate and sophisticated technical procedures are used to affirm the validity and reliability of data and results. Quality-assured procedures should be used to process data, coupled with appropriate forms of weighting and sampling error estimation. As with any high-stakes data collection, all reporting must be regulated by appropriate governance arrangements.
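The report does not specify the estimation procedures behind these recommendations. As an illustration only, the following minimal sketch shows one common approach to the kind of weighting and sampling error estimation described above: post-stratification weights combined with the standard stratified-sampling variance approximation for a weighted mean. All stratum names, population counts and scores are hypothetical and do not represent UES data or the UES methodology.

    # Illustrative sketch only: post-stratification weighting and a simple
    # standard-error estimate for a weighted mean. All strata, counts and
    # scores below are hypothetical, not UES data or the actual UES method.
    import math

    # Hypothetical strata: name -> (population size N_h, respondent scores)
    strata = {
        "first_year_internal": (12000, [4.1, 3.8, 4.4, 3.9]),
        "final_year_internal": (9000, [3.6, 4.0, 3.7]),
        "first_year_external": (4000, [3.2, 3.9]),
    }

    total_N = sum(N_h for N_h, _ in strata.values())
    weighted_total = 0.0
    variance = 0.0
    for N_h, scores in strata.values():
        n_h = len(scores)
        mean_h = sum(scores) / n_h
        # Each respondent in stratum h carries weight N_h / n_h, so weighted
        # totals match the known population counts (post-stratification).
        weighted_total += N_h * mean_h
        if n_h > 1:
            # Within-stratum sample variance, combined via the usual
            # stratified-sampling variance formula (no finite-population
            # correction, for brevity).
            s2_h = sum((y - mean_h) ** 2 for y in scores) / (n_h - 1)
            variance += (N_h / total_N) ** 2 * s2_h / n_h

    weighted_mean = weighted_total / total_N
    std_error = math.sqrt(variance)
    print(f"weighted mean = {weighted_mean:.2f}, approx. SE = {std_error:.2f}")

A production pipeline would typically add finite-population corrections and non-response adjustments, but the basic structure of weighting respondents up to known population counts and estimating sampling error stratum by stratum is the same.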

Authored by Ali Radloff, Hamish Coates, Rebecca Taylor, Richard James and Kerri-Lee Krause.

Publication Details
Published: 2013