National Student Outcomes Survey FAQs

31 May 2019

What is the survey?

The National Student Outcomes Survey is an annual survey that collects information on vocational education and training (VET) students’ reasons for training, their employment outcomes, satisfaction with training, and further study outcomes. The survey covers students with an Australian residential address who are awarded a qualification (graduates) or who successfully complete part of a course and then leave the VET system (subject completers). Since 2017, international graduates who completed their training in Australia (referred to as international onshore graduates) have been surveyed as an additional component of the National Student Outcomes Survey.

The 2019 survey covers Australian graduates and subject completers as well as international onshore graduates. This year the survey is open from Friday 31 May to mid-August 2019. Students who completed a VET qualification or subject in 2018 may receive an invitation to participate in the survey.

Who is doing the survey?

The National Centre for Vocational Education Research (NCVER) - a not-for-profit company owned by the State, Territory and Federal ministers responsible for vocational education and training. NCVER collects information and provides research on vocational education and training in Australia to governments, the training sector, industry and the community.

Australian Government Department of Education and Training - the government department funding the survey. They will use the results to assist students to make informed training decisions and to develop government policy to help employers and industry.

Ipsos - a national market and social research company that conducts the survey on NCVER's behalf.

Why is it conducted?

The aim of the National Student Outcomes Survey is to improve the social and economic outcomes of students who undertake VET. This is achieved by providing the VET sector with information on students’ reasons for training, their employment outcomes, satisfaction with training, benefits of the training and further study outcomes.

The information is used by national and state/territory bodies, along with local training providers, to ensure vocational training is of a high quality and relevant to Australian workplaces. The survey highlights both the positive and negative outcomes from training and monitors the effectiveness of the VET system. The information collected assists in administering, planning, and evaluating the VET system.

Who is included and when?

Who is included in the survey?

Students included in the survey are those who completed their vocational training in Australia in the previous calendar year.

Since 1999, the survey has collected information on the outcomes of government-subsidised domestic VET students. In 2016, the scope of the survey was expanded to report on the outcomes of all domestic graduates - including fee-for-service graduates from private training and community education providers. Following this successful trial, the expanded scope has been applied from the 2017 survey onwards for both graduates and subject completers. Since 2017, international onshore graduates have been included in the survey as an additional component.

Graduates are defined as students who gained a qualification through their training.

Qualifications include:

  • Advanced Diploma
  • Associate degree
  • Diploma
  • Certificate IV
  • Certificate III
  • Certificate II
  • Certificate I

Subject completers are defined as students who successfully complete part of a course and then leave the VET system.

When is the survey conducted?

The survey cycle begins in March and has three main stages: project preparation, fieldwork, and data analysis and reporting.

Project preparation (March - last Friday in May)

  • NCVER selects the sample of students (graduates and potential subject completers), stratified by state/territory of student residence, provider type and funding source. Before sampling, units in each stratum are further stratified by age, sex, field of education, remoteness and Indigenous status. From 2017, the sampling has been designed to allow more records to be selected for training providers targeted for reporting at the training provider level. Contact details of selected students are then provided directly to the fieldwork contractor by State Training Authorities or the Unique Student Identifiers Office. At no time do NCVER staff have access to students' contact details.
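
The selection step above can be illustrated with a basic stratified sample. This is a simplified sketch with invented field names and a single flat sampling rate; the actual NCVER design also orders records within strata and varies selection rates by stratum and targeted provider.

```python
import random
from collections import defaultdict

def stratified_sample(records, strata_keys, rate, seed=0):
    """Group records into strata defined by strata_keys, then draw a
    simple random sample from each stratum at the given rate."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for rec in records:
        # e.g. key = (state, provider type); the real design also uses funding source
        strata[tuple(rec[k] for k in strata_keys)].append(rec)
    sample = []
    for members in strata.values():
        n = max(1, round(len(members) * rate))  # keep at least one record per stratum
        sample.extend(rng.sample(members, n))
    return sample

# Hypothetical student records (field names invented for illustration)
students = [
    {"id": i,
     "state": "SA" if i % 2 else "NSW",
     "provider": "TAFE" if i % 3 else "private"}
    for i in range(200)
]
picked = stratified_sample(students, ["state", "provider"], rate=0.1)
```

Sampling within each stratum (rather than from the whole list at once) guarantees that every state/provider combination is represented in the sample, which is what allows reliable reporting at those levels.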

Fieldwork (last Friday in May - August)

  • A personalised survey invitation and privacy notice are sent at the end of May each year. In 2019, these invitations will be sent via email or SMS, and a hard-copy letter will be sent to non-respondents with a mailing address approximately three weeks after the survey has commenced. The survey invitation includes students' training details and instructions on how to complete the survey online. Each invitation contains a unique login code, and the hard-copy letter contains a personalised QR code for accessing the online survey.
  • Email and SMS reminders are sent at various times to students with a valid email address and/or mobile number who have not yet responded to the survey.
  • From July, telephone interviews are conducted with a selection of those who have not responded.

Data analysis and reporting (September - December)

  • After completion of fieldwork, quality checks are conducted and data are analysed.

What survey results are available?

What information is available to training providers on their students' satisfaction and outcomes?

Where enough responses are received to provide accurate results, eligible registered training organisations (RTOs) can receive an individual report on the employment outcomes, training satisfaction, perceived benefits and relevance of training as reported by their domestic students (alongside state/territory and national results for comparison). The 2019 survey was designed, subject to response rates, to provide data for RTOs with:

  • 100 or more domestic graduates in 2018
  • 850 or more domestic subject completers in 2018.

What information on VET students' satisfaction and outcomes is available to the general public?

The aggregate results of the National Student Outcomes Survey are presented in the following publications.

  • Australian vocational education and training statistics: VET student outcomes
  • Australian vocational education and training statistics: International onshore VET graduate outcomes

Data on domestic students are published in December each year on NCVER's portal. The results from the survey for international onshore graduates are expected to be published early the year after the survey on NCVER's portal.

The following supporting/supplementary information is also provided on the publication page on NCVER's portal:

  • Technical notes
  • Terms and definitions
  • Reminder letter
  • Questionnaire
  • Data dictionary

For a graphical view of the data, see the data visualisation product VET graduate outcomes. This product allows data users to view graduate outcomes by field of education, qualification level, intended occupation of training and training package.

All publications and products are available free of charge on NCVER's portal, see Student outcomes.

Requests for more detailed statistical information or further information about the National Student Outcomes Survey can be made to:

(08) 8230 8400

A charge will generally be made by NCVER for more complex requests for information. See the data access and charging policy.

Additional information is made available to various stakeholders including the Australian Department of Education and Training and the State Training Authorities.

How has the survey changed over the years?

In 1995, 1997 and 1998 the survey was known as the Graduate Destination Survey. From 1999 onwards the survey has been known as the National Student Outcomes Survey.

1995 (conducted by the Australian Bureau of Statistics), 1997, and 1998

  • A census of TAFE graduates with a qualification involving at least 200 hours or one semester of training.

1999

  • A census of graduates with a qualification involving at least 200 hours or one semester of training.
  • A small sample survey of subject completers was introduced. Separate questionnaires were used for graduates and subject completers.
  • The survey was expanded to include students from TAFE, community education providers, private training providers and other government providers.
  • Separate questionnaires were used for TAFE, private training providers and community education providers.
  • Only information on TAFE students was published.

2000, 2001, and 2002

  • A sample survey of graduates and subject completers.
  • For graduates, the minimum training length condition was removed.
  • Separate questionnaires continued to be used for graduates and subject completers.

2003

  • The option to complete the survey via the internet was introduced.
  • For the first time, participants who identified themselves as graduates in the subject completer component were included in the graduate segment for reporting. Previously, these responses were collected but not used in reporting. At the aggregate level this change makes no difference, but for sub-populations the effect may be greater; caution is therefore required when making comparisons with results published in previous years.


  • For the first time the same questionnaire was used for graduates and subject completers.


  • Information on government-funded students from community education and private training providers was published for the first time.
  • The same questionnaire was used for students from TAFE and private training providers. A separate questionnaire was used for students from community education providers.


  • No changes were made.


  • A new recognition of prior learning (RPL) question was included in the survey replacing the one used previously.


  • No changes were made.


  • For the first time, those with email addresses were invited to complete the survey online via email and/or a primary approach letter, rather than being sent a hard-copy questionnaire.


  • For the first time, ALL respondents received a primary approach letter at the first mailing inviting them to complete the survey online, rather than a hard-copy questionnaire. As in previous years, those with an email address also received an email invitation. Hard-copy questionnaires were only sent to those who did not complete the survey online by a certain date.
  • Improvements were made to the classification of graduates and subject completers; see An analysis of self-reported graduates. Not all students who identified themselves as graduates in the subject completer sample were reported in the graduate segment (as per the change made in 2003); instead, these students were modelled to determine their eligibility for the qualification. This improvement was applied to data from previous years to maintain the time series.


  • Rather than proportional sampling, an improved sampling method was used that aims to achieve balanced sampling errors across institutes for the main survey variable, "labour force status after training".


  • The graduate sample was increased in order to provide estimates at an individual course level, for courses with a population of 300 or more.


  • Survey responses from fee-for-service students from community education providers were excluded from reporting and the summary publication was renamed to Government-funded student outcomes. Data have been backdated to 2006.

2016

  • The survey expanded to include graduates from private and community education providers who either paid for their training themselves or whose employer paid for the training. Previously, only students who received government funding were included in the survey.
  • The questionnaire was revised to a shorter, more user-friendly version that can be completed on mobile devices. This questionnaire was implemented in 2016 for all students (TAFE, universities/other government providers, private training providers and community education providers).
  • As a result of shortening the questionnaire, the questions used to classify subject completers were streamlined, and the classification of subject completers was improved. Data for subject completers in previous years have been backdated to 2006 using this improved derivation.

2017

  • The expanded survey scope, which includes fee-for-service students from private training providers and community education providers, was applied to subject completers.
  • The reminder postcard and the hard-copy questionnaire were removed from the contact methodology.
  • The 2017 National Student Outcomes Survey excluded students aged under 18 years. Data from previous years were backdated.
  • Government-funded training was broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and private training providers. In 2017 the scope of government-funded training was revised and data for previous years have been backdated. Government-funded training now includes only Commonwealth and state/territory government-funded training (either Commonwealth or state recurrent funding, Commonwealth specific purpose funding or state specific funding) from all training providers. All fee-for-service activity from training providers has been excluded.
  • The derivation of graduates and subject completers was changed to align more closely with administrative data. Previously, due to lags in the reporting of completed qualifications to the National VET Provider Collection, government-funded students who were sampled as subject completers were reported as graduates if they self-reported completing a qualification and were deemed eligible for that qualification via a logistic regression model. From 2017, students sampled as subject completers are reported as subject completers even if they self-report completing a qualification, as per the administrative data (or sample selection). Data have been backdated for 2016, the year the data quality improvement is first seen.

2018

  • Non-respondents with a mailing address were sent a hard-copy letter approximately four weeks after the initial email contact. This is a change from the previous methodology, where the hard-copy letter was sent to all students at the commencement of the survey.
  • In 2018, more RTOs received data about their students’ satisfaction and training outcomes than ever before. RTOs were eligible to receive an individual report on their students’ responses (alongside state/territory and national results for comparison) if they had:
    • 100 or more domestic graduates and/or 860 or more domestic potential subject completers during 2017, and
    • enough survey responses to provide accurate results.

2019

  • In 2019, the scope of students eligible to be included in the survey was revised to be consistent with the National Vocational Education and Training Regulator (NVETRE) framework. As a result of this change, students must have completed accredited training delivered by RTOs.
  • In 2019, RTOs will again be eligible to receive an individual report on their students' responses where enough responses are received to provide accurate results. As a general guideline, this will include RTOs with 100 or more domestic graduates and/or 850 or more domestic subject completers during 2018.

How do I complete the survey?

The survey can be completed online from 31 May 2019 until August 2019. Students included in the 2019 survey will be sent an invitation from Ipsos, the fieldwork contractor, to complete the survey online. Refer to your login code provided by Ipsos and go to:

Where can I obtain further information about the survey?

Who can I contact for general information?

For general queries about the survey, please contact the Student Outcomes Survey Helpline on 1800 071 219 (free call within Australia) or +61 3 9940 7745 (from outside Australia) or email

Where can I obtain information on the privacy notice and the prize draw terms and conditions?

For further information refer to:

2019 National Student Outcomes Survey: Privacy notice

Prize draw terms and conditions of entry