Comparability between survey years

To assess trends, users need to be able to compare results between years. Changes are made to improve the research tools and processes where possible, and to make sure that the results produced are useful and relevant to users. 

Under each heading below, we state whether there was a change to the research in this area and what that change was. For more information about any of these changes, please refer to the 2014 NZCASS technical manual [PDF, 5.3 MB].


Sampling

The overall sampling process in 2014 remained largely consistent with previous surveys. The main change was the use of the NZ Post Postal Address File (PAF) and the Māori Electoral Roll. This was done to:

  • improve response rates
  • reduce survey running costs.

As part of the change review and approval process, the Māori booster sample was increased from 1200 to 1660 to account for the estimated design effects. This increased sample was applied along with a number of other risk management processes that are detailed in the 2014 NZCASS Technical Manual.
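The relationship between a booster sample increase and design effects can be illustrated with Kish's effective sample size formula. A minimal sketch, assuming a hypothetical design effect of ~1.38 (this value is invented for illustration, not a published NZCASS figure):

```python
import math

def required_sample_size(target_effective_n: float, design_effect: float) -> int:
    """Invert Kish's effective sample size (n_eff = n / deff) to find the
    actual sample size needed to achieve a target effective size."""
    return math.ceil(target_effective_n * design_effect)

# Hypothetical illustration: with an assumed design effect of about 1.38,
# a target effective size of 1200 implies an actual sample of around 1660.
print(required_sample_size(1200, 1.3833))  # → 1660
```

The larger the design effect introduced by the sampling changes, the more the nominal sample must grow to preserve the same statistical precision.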



Questionnaire

To ensure consistency with the 2006 and 2009 surveys, no changes were made in 2014 to any questions used in the following processes:

  • incident screening
  • incident selection
  • offence coding
  • imputation
  • data/statistical checking.

A number of questions were also ring-fenced to ensure comparability over time, because they are used by other agencies and groups.

Minor changes were made to bring some questions in line with stakeholder needs and, where possible, to align demographic questions with Statistics NZ standard classifications. Some questions were deleted where the sample was too small to be analysed and reported on. A small number of questions were added (for example, security questions).



Fieldwork

The fieldwork was carried out by a different provider in each survey year (2006, 2009 and 2014), and each provider used a different survey software tool (in 2014, the TSS software was used).

The fieldwork process in 2014 was mostly the same as in 2006 and 2009. The only changes made in 2014 related to the use of updated technology to manage and monitor fieldwork processes.

In 2006 and 2009 the fieldwork providers needed to physically undertake the household sampling process within each meshblock by visiting every Xth household according to the prescribed sampling process. In 2014, households could be pre-selected from the Postal Address File and Māori Electoral Roll (using the same sampling process) and loaded electronically onto interviewers’ tablets so they knew exactly which household they were visiting.
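The "every Xth household" pre-selection described above follows a standard systematic sampling pattern. A minimal generic sketch (the function, address names, and interval are invented for illustration, not the NZCASS sampling code):

```python
import random

def systematic_sample(addresses, interval, seed=None):
    """Select every `interval`-th address after a random start within
    the first interval -- the generic 'every Xth household' pattern."""
    rng = random.Random(seed)
    start = rng.randrange(interval)
    return addresses[start::interval]

# Hypothetical meshblock of 30 addresses, sampling every 5th household.
meshblock = [f"address_{i}" for i in range(30)]
selected = systematic_sample(meshblock, interval=5, seed=1)
print(len(selected))  # → 6
```

Pre-computing such a selection centrally, rather than having interviewers count doors in the field, is what allowed the 2014 households to be loaded onto interviewers' tablets in advance.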


Offence coding

No changes were made to the method or rules used to code offences. However, new technology enabled coding management and quality assurance processes to be improved. The NZCASS coding manual was also updated to give clearer instructions to coders and make it easier to use.


Data processing

Due to the survey software used, new quality assurance steps were implemented to make sure data was formatted correctly, clean, complete and correct. The datasets sent by the fieldwork provider were converted into analysis datasets to simplify analysis and reduce the risk of analytical errors. These changes were also applied retrospectively to the 2006 and 2009 datasets.



Weighting

The NZCASS weighting process in 2014 stayed largely the same as in 2006 and 2009. Small changes were made to account for the sampling changes and to update population benchmarks.
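Benchmarking survey weights against external population totals is typically done by post-stratification. A minimal generic sketch (the strata and benchmark figures are invented for illustration; the actual NZCASS weighting is described in the technical manual):

```python
def poststratify(design_weights, strata, benchmarks):
    """Scale weights within each stratum so that the weighted totals
    match external population benchmarks."""
    totals = {}
    for w, s in zip(design_weights, strata):
        totals[s] = totals.get(s, 0.0) + w
    return [w * benchmarks[s] / totals[s]
            for w, s in zip(design_weights, strata)]

# Hypothetical example: four respondents in two strata, benchmarked
# to invented population totals of 100 (urban) and 50 (rural).
weights = [1.0, 1.0, 2.0, 2.0]
strata = ["urban", "urban", "rural", "rural"]
adjusted = poststratify(weights, strata, {"urban": 100.0, "rural": 50.0})
print(adjusted)  # → [50.0, 50.0, 25.0, 25.0]
```

Updating population benchmarks between survey years changes only the target totals in this step, which is why the adjustment could be small while the overall process stayed the same.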


Variance estimation

No changes to variance estimation were made in 2014.



Imputation

A number of changes and updates were made to the imputation process in 2014:

  • The number of values imputed for each missing value was increased from 10 to 100 in 2014. This was done to increase the power of significance tests.
  • Around 40 new imputation items were added to enable analysis by the victim’s relationship to the offender for violent interpersonal offences. The methodology used to include these items in the imputation process was externally reviewed by an expert from the University of Auckland.
  • A number of minor corrections to the imputation code were made.
  • All changes were retrospectively applied to the 2006 and 2009 datasets so comparisons across time could be made.
  • All new imputation code was externally reviewed by an expert from the University of Auckland before analysis was finalised.
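Increasing the number of imputations from 10 to 100 reduces the extra variance contributed by the imputation step itself, which feeds directly into significance tests. This is visible in Rubin's rules for combining estimates across m imputed datasets, sketched below (a generic sketch, not the NZCASS code; the estimates are invented):

```python
from statistics import mean, variance

def rubins_rules(estimates, variances):
    """Combine point estimates and their variances from m imputed
    datasets using Rubin's rules."""
    m = len(estimates)
    q_bar = mean(estimates)              # pooled point estimate
    w_bar = mean(variances)              # average within-imputation variance
    b = variance(estimates)              # between-imputation variance
    total_var = w_bar + (1 + 1 / m) * b  # total variance of pooled estimate
    return q_bar, total_var

# Hypothetical estimates of a victimisation rate from 5 imputed datasets.
est = [0.30, 0.32, 0.31, 0.29, 0.33]
var = [0.0004] * 5
q, t = rubins_rules(est, var)
```

The between-imputation term carries a factor of (1 + 1/m), so raising m from 10 to 100 shrinks that penalty and tightens confidence intervals, which is the power gain the bullet above refers to.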


Classifications and groupings

All classifications and groupings used for analysis and reporting were reviewed as part of NZCASS 2014. Where possible, demographic and geographic classifications were brought in line with Statistics NZ standard classifications. The offence groupings were developed based on stakeholder analysis and reporting needs. All changes to offence groupings were retrospectively applied to the 2006 and 2009 datasets to ensure consistency across years.

