What is the survey?
The National Student Outcomes Survey is an annual survey that collects information on vocational education and training (VET) students’ reasons for training, their employment outcomes, satisfaction with training, and further study outcomes.
Since 1999, the survey has collected information on the outcomes of government-funded VET students. In 2016, a trial was undertaken to expand the scope of the survey to report on the outcomes of all graduates; that is, graduates whose training was Commonwealth or state government funded, as well as fee-for-service graduates (those who paid for the training themselves or whose employer paid for it).
Following the successful trial, the expanded survey scope was applied to the 2017 survey for graduates and, for the first time, subject completers.
Who is doing the survey?
The National Centre for Vocational Education Research (NCVER) - a not-for-profit company owned by the State, Territory and Federal ministers responsible for vocational education and training. NCVER collects information and provides research on vocational education and training in Australia to governments, the training sector, industry and the community.
Australian Government Department of Education and Training - the government department funding the survey. The department will use the results to assist students to make informed training decisions and to develop government policy to help employers and industry.
The Social Research Centre - a national market and social research company that conducts the survey on NCVER's behalf.
Why is it conducted?
The aim of the National Student Outcomes Survey is to improve the social and economic outcomes of students who undertake VET. This is achieved by providing the VET sector with information on students’ reasons for training, their employment outcomes, satisfaction with training, benefits of the training and further study outcomes.
The information is used by national and state/territory bodies, along with local training providers, to ensure vocational training is of a high quality and relevant to Australian workplaces. The survey highlights both the positive and negative outcomes from training and monitors the effectiveness of the VET system. The information collected assists in administering, planning, and evaluating the VET system.
Who is included and when?
Who is included in the survey?
Students included in the survey are those who completed their vocational training in the previous calendar year and have an Australian address as their usual address. In 2017, international on-shore graduates (defined as international fee-for-service students who undertook VET in Australia at an Australian training provider) were included in the survey as a trial.
Since 1999, the survey has collected information on the outcomes of government-funded VET students. In 2016, the scope of the survey was expanded to report on the outcomes of all graduates - including fee-for-service graduates (those who paid for the training or whose employer paid for the training) from private training and community education providers. The expanded scope was applied to the 2017 survey for graduates and, for the first time, subject completers.
Graduates are defined as students who gained a qualification through their training, at one of the following levels:
- Bachelor's Degree or higher
- Advanced Diploma
- Associate Degree
- Diploma
- Certificate IV
- Certificate III
- Certificate II
- Certificate I
Subject completers are defined as students who successfully complete part of a course and then leave the VET system.
When is the survey conducted?
The survey cycle begins in March and has three main stages: project preparation, fieldwork, and data analysis and reporting.
Project preparation (March - last Friday in May)
- NCVER randomly selects the sample of students (graduates and potential subject completers) stratified by age, sex, field of education and training provider. Contact details of selected students are then provided directly to the fieldwork contractor by state training authorities or the Unique Student Identifiers Office. At no time do NCVER staff have access to students' contact details.
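The stratified selection step described above can be sketched in code. This is an illustrative outline only: the strata variables match those named above, but the field names, data, allocation rule (a fixed draw per stratum) and sample sizes are hypothetical, not NCVER's actual method.

```python
import random
from collections import defaultdict

def stratified_sample(students, strata_keys, n_per_stratum, seed=42):
    """Group students into strata and draw a simple random sample from each.

    students: list of dicts with illustrative field names such as
              'sex' and 'provider' (NCVER also stratifies by age and
              field of education).
    strata_keys: the fields that jointly define a stratum.
    n_per_stratum: how many students to draw from each stratum
                   (a real design would allocate sizes per stratum).
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        stratum = tuple(s[k] for k in strata_keys)
        strata[stratum].append(s)
    sample = []
    for members in strata.values():
        # Draw at most n_per_stratum from each stratum.
        k = min(n_per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Toy population of six students, stratified by sex and provider type.
population = [
    {"id": 1, "sex": "F", "provider": "TAFE"},
    {"id": 2, "sex": "F", "provider": "TAFE"},
    {"id": 3, "sex": "M", "provider": "TAFE"},
    {"id": 4, "sex": "M", "provider": "Private"},
    {"id": 5, "sex": "F", "provider": "Private"},
    {"id": 6, "sex": "M", "provider": "Private"},
]
selected = stratified_sample(population, ["sex", "provider"], n_per_stratum=1)
```

Stratifying before sampling ensures every combination of the strata variables is represented in the sample, which is what makes reliable reporting possible for sub-populations (for example, outcomes by provider type).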
Fieldwork (Last Friday in May - August)
- A personalised covering letter and an information page are mailed at the end of May each year. The letter includes the student's training details and instructions on how to complete the survey online, along with a unique login code and a personalised QR code for accessing the online survey.
- Email and SMS reminders are sent at various times to students with a valid email address and/or mobile number who have not yet responded.
- Telephone interviews are conducted from July with a sample of those who have not responded.
Data analysis and reporting (September - December)
- After completion of fieldwork, quality checks are conducted and data are analysed.
What survey results are available?
The results of the National Student Outcomes Survey are published in November each year.
Data from the 2017 survey are presented in Australian vocational education and training statistics: VET student outcomes. This publication supersedes previous publications on government-funded student outcomes; that information (including a ten-year time series) is available in the new publication and selected data products.
The following supporting/supplementary information is also provided:
- Technical notes
- Terms and definitions
- Primary approach letter
- Data dictionary
For a graphical view of the data, see the data visualisation product VET graduate outcomes. This product allows data users to view graduate outcomes by field of education, qualification level, intended occupation of training and training package.
All products are available free of charge on NCVER's portal; see Student outcomes.
Requests for more detailed statistical information or further information about the National Student Outcomes Survey can be made to:
(08) 8230 8400
NCVER will generally charge for more complex requests for information; see the data access and charging policy.
Additional information is made available to various stakeholders including the Australian Department of Education and Training and the State Training Authorities.
How has the survey changed over the years?
In 1995, 1997 and 1998 the survey was known as the Graduate Destination Survey. From 1999 onwards the survey has been known as the Student Outcomes Survey.
1995 (conducted by the Australian Bureau of Statistics), 1997, and 1998
- A census of TAFE graduates with a qualification involving at least 200 hours or one semester of training.
1999
- A census of graduates with a qualification involving at least 200 hours or one semester of training.
- A small sample survey of subject completers was introduced. Separate questionnaires were used for graduates and subject completers.
- The survey was expanded to include students from TAFE, community education providers, private training providers and other government providers.
- Separate questionnaires were used for TAFE, private training providers and community education providers.
- Only information on TAFE students was published.
2000, 2001, and 2002
- A sample survey of graduates and subject completers.
- For graduates, the minimum training length condition was removed.
- Separate questionnaires continued to be used for graduates and subject completers.
- The option to complete the survey via the internet was introduced.
2003
- For the first time, participants who identified themselves as graduates in the subject completer component were included in the graduate segment for reporting. Previously, these responses were collected but not used in reporting. At the aggregate level this change makes no difference, but for sub-populations the effect may be larger; caution is therefore required when comparing with results published in previous years.
- For the first time, the same questionnaire was used for graduates and subject completers.
- Information on government-funded students from community education and private training providers was published for the first time.
- The same questionnaire was used for students from TAFE and private training providers. A separate questionnaire was used for students from community education providers.
- No changes were made
- A new recognition of prior learning (RPL) question was included in the survey replacing the one used previously.
- No changes were made
- For the first time those with email addresses were invited to complete the survey online via email and/or a primary approach letter rather than being sent a hard copy questionnaire.
- For the first time ALL respondents received a primary approach letter inviting them to complete the survey online at the first mailing, rather than a hard copy questionnaire. As in previous years, those with an email address also received an email invite. Hard copy questionnaires were only sent to those who did not complete online by a certain date.
- Improvements were made to the classification of graduates and subject completers (see An analysis of self-reported graduates). Not all students who identified themselves as graduates in the subject completer sample were reported in the graduate segment (as per the change made in 2003); instead, these students were modelled to determine their eligibility for the qualification. This improvement was applied to data from previous years to maintain the time series.
- Rather than proportional sampling, an improved sampling method was used that aims to achieve balanced sampling errors across institutes for the main survey variable, "labour force status after training".
- The graduate sample was increased in order to provide estimates at an individual course level, for courses with a population of 300 or more.
- Survey responses from fee-for-service students from community education providers were excluded from reporting, and the summary publication was renamed Government-funded student outcomes. Data have been backdated to 2006.
2016
- The survey expanded to include graduates from private training and community education providers who either paid for their training themselves or whose employer paid for the training. Previously, only students who received government funding were included in the survey.
- The questionnaire was revised to a shorter, more user-friendly version that can be used on mobile devices. This questionnaire was implemented in 2016 for all students (TAFE, universities/other government providers, private training providers and community education providers).
- As a result of shortening the questionnaire, the questions used to classify subject completers were streamlined, and the classification of subject completers was improved. Data for subject completers in previous years have been backdated to 2006 using this improved derivation.
2017
- The expanded survey scope, which includes fee-for-service students from private training providers and community education providers, was applied to subject completers.
- The reminder postcard and the hard-copy questionnaire were removed from the contact methodology.
- The 2017 National Student Outcomes Survey excluded students aged under 18. Data from previous years were backdated accordingly.
- Government-funded training was broadly defined as all activity delivered by government providers and government-funded activity delivered by community education and private training providers. In 2017 the scope of government-funded training was revised and data for previous years have been backcast. Government-funded training now includes only Commonwealth and state/territory government-funded training (either Commonwealth or state recurrent funding, Commonwealth specific purpose funding or state specific funding) from all training providers. All fee-for-service activity from training providers has been excluded.
- The derivation of graduates and subject completers changed to align more closely with administrative data. Previously, due to lags in the reporting of completed qualifications to the National VET Provider Collection, government-funded students who were sampled as subject completers were reported as graduates if they self-reported completing a qualification and were deemed eligible for that qualification via a logistic regression model. From 2017, students sampled as subject completers are reported as subject completers, as per the administrative data (or sample selection), even if they self-report completing a qualification. Data have been backcast for 2016, the year the data quality improvement is first seen.