
Posts Tagged ‘PAT Mathematics’

Understanding the Year Group Report

October 18th, 2011

Generated for PAT:Mathematics, PAT:Reading (Comprehension and Vocabulary), PAT:Listening Comprehension and STAR (revised 2011)

Introduction
The Year Group Report allows schools to compare their students’ achievement on PAT:Mathematics at a particular year level with the achievement of a national reference group. In most cases the reference group used will be the same year level as the students from the school. However, if the report is produced towards the end of the year, schools might choose to compare their students’ results with those of a reference group one year level above. This is because the national reference group data was collected from students at the beginning of the school year.

It is important to be aware that students reported on in the Year Group Report may not have taken the same test. To construct the report, raw marks have been converted to PAT scale scores, which can be compared regardless of which of the tests in a PAT series was originally taken.

Schools with small numbers of students at a particular year level (20 or fewer) should take care when comparing their students’ achievement with that of a national reference group. Statistics for small cohorts are more likely to be affected by factors such as measurement error and extreme values, meaning the distribution of achievement could look quite different if the assessment were repeated.
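This sensitivity to cohort size can be illustrated with a short simulation. The sketch below (Python; the mean and standard deviation are invented for illustration, not the published national figures) repeatedly draws cohorts of 10 and of 100 students from the same achievement distribution and shows how much more the small cohorts’ medians vary from draw to draw.

```python
import random
import statistics

random.seed(1)

def median_range(group_size, draws=1000, mean=55.0, sd=12.0):
    """Draw `draws` cohorts of `group_size` students from the same normal
    achievement distribution and return the lowest and highest cohort
    medians observed across the draws."""
    medians = [
        statistics.median(random.gauss(mean, sd) for _ in range(group_size))
        for _ in range(draws)
    ]
    return min(medians), max(medians)

for n in (10, 100):
    low, high = median_range(n)
    print(f"cohorts of {n:>3}: medians ranged from {low:.1f} to {high:.1f}")
```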

The Year Group Report makes use of box plots. Follow the link to find an explanation of how to read a box plot.
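If you want to reproduce this kind of display for your own data, a minimal sketch using matplotlib is shown below. The scale scores are invented for illustration.

```python
import matplotlib.pyplot as plt

# Invented patm scale scores for a school group and a comparison group.
school_group = [38.2, 41.0, 44.5, 47.1, 49.8, 51.3, 53.0, 56.4, 60.2]
comparison_group = [35.0, 39.4, 42.8, 45.0, 47.6, 50.1, 52.9, 55.5, 58.7, 62.0]

fig, ax = plt.subplots()
ax.boxplot([school_group, comparison_group])  # one box per group
ax.set_xticks([1, 2])
ax.set_xticklabels(["School", "Reference group"])
ax.set_ylabel("patm scale score")
plt.show()
```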

The Year Group Report

[Image: an example Year Group Report]

Understanding the Item Report

October 18th, 2011

Generated for PAT:Mathematics, PAT:Reading (Comprehension and Vocabulary) and PAT:Listening Comprehension

Introduction
The Item Report indicates what percentage of a group of students answered each question (item) on a test successfully. It also compares the success of the group on the questions with the success of students from the national reference group at a given year level. An example of an Item Report is shown below. This Item Report analyses the results of a group of Year 9 students who sat PAT:Reading Comprehension Test 6. Notice that the reference year used is Year 9. This means that the students’ achievement on each item is being compared with the achievement of a national reference group at the same year level. When reading the report it is important to identify which year level the group of students is being compared with, because, on average, each year group performs at a higher level than the one before.

The Item Report for PAT:Reading Comprehension displayed below shows that Question 2 from Test 6 comes from the text ‘What’s the Problem?’ and is a local inference question. The question asked is ‘Why does the storyteller think a bracelet is preferable to a piercing?’ Of the students in this group, 87.5 percent answered the question correctly. This is well above the percentage of the Year 9 reference group who answered the question correctly (71 percent). The percentage correct on each question is given both as a number and graphically. In the graphic, the green shaded part of the rectangle shows the proportion of students in the group who correctly answered the question, while the red arrow indicates the proportion of the national reference group who were successful with the question.
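The percentages in the report are straightforward proportions computed item by item. A minimal sketch of the calculation (Python; the response data and national percentages are invented for illustration):

```python
# Each row is one student's responses: 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 0, 1],
]
# Hypothetical national reference percentages for the same four items.
reference_pct = [80.0, 71.0, 55.0, 62.0]

n_students = len(responses)
for item in range(len(responses[0])):
    correct = sum(row[item] for row in responses)
    group_pct = 100.0 * correct / n_students
    diff = group_pct - reference_pct[item]
    print(f"Q{item + 1}: group {group_pct:5.1f}%  "
          f"national {reference_pct[item]:5.1f}%  ({diff:+.1f})")
```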

The different Item Reports
There are subtle differences between the Item Reports for the different PAT subject areas. The Item Report for Mathematics, for instance, groups the questions by content category and describes each question asked.

Strengths and Weaknesses

The Item Report can be used to look for overall strengths and weaknesses a group may possess. Questions where the group’s performance is well below the national performance, or which deviate from the patterns shown on other questions, could be particular areas for concern and further investigation. It is important to remember that national performance on a question is only a reference point. A school might expect its students to consistently perform above the national proportions. In the Item Report shown, the students have performed above the national level on nearly every question. There are a few questions where their performance is slightly below the national reference group, for instance Question 31. This question could be an area to investigate further. It may be that the students have not covered these ideas or that something particular to the question made it difficult for them to answer.

The Item Report

[Image: an example Item Report]

Understanding the List Report

October 18th, 2011

Generated for PAT:Mathematics, PAT:Reading (Comprehension and Vocabulary), PAT:Listening Comprehension and STAR

Introduction

The List Report provides a breakdown of results for a group of students who have taken the same test. Each individual’s results are provided as well as summary statistics for the whole group. The List Report shown below reports on 11 Year 8 students from Class 2 at NZCER School who sat PAT:Mathematics Test 5. Their mean scale score was 57.4 patm units, which is slightly higher than the national average for Year 8 (54.9 patm units). The first student on the list is Alex Choy. Alex answered 19 questions correctly on the test. This equated to a scale score of 55.2 ± 3.3 patm units. Alex’s achievement puts him in stanine 5 (average achievement) when compared with the national reference group for Year 8.
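The group summary statistics are simple aggregates of the individual results. A minimal sketch (Python; the scores and stanines are invented, loosely echoing the example above):

```python
import statistics

# (scale score in patm units, stanine) for each student - invented data.
results = [(55.2, 5), (61.0, 6), (48.7, 4), (57.4, 5), (66.3, 7),
           (52.1, 4), (59.8, 6), (45.0, 3), (63.5, 6), (58.2, 5), (64.1, 6)]

scores = [score for score, _ in results]
stanines = [st for _, st in results]

print(f"Number in class:    {len(results)}")
print(f"Mean scale score:   {statistics.mean(scores):.1f} patm")
print(f"Standard deviation: {statistics.stdev(scores):.1f} patm")
print(f"Mean stanine:       {statistics.mean(stanines):.1f}")
```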

It is possible to sort the list of student names by ‘first name’ or ‘last name’ by clicking on the small triangle or the title “Student Name” just above the student name list.

The error component
For the PAT tests, each scale score on the List Report is provided with an estimate of measurement error. This indicates the precision of the score and gives a range within which we can be reasonably sure that the student’s true achievement level lies. This margin of error should be considered in all comparisons of scale scores, particularly when scores are close together. An example of how to read the error component is shown on the List Report below. For the STAR test the measurement error is estimated as plus or minus 3 marks.
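One practical use of the error component is deciding whether two scale scores are genuinely different. The sketch below (Python; the second student’s score is invented) treats each score as a range and checks whether the ranges overlap. Overlap is a rough rule of thumb, not a formal significance test:

```python
def ranges_overlap(score_a, err_a, score_b, err_b):
    """Return True if the two score-plus-or-minus-error ranges overlap,
    i.e. the difference may lie within measurement error."""
    return abs(score_a - score_b) <= err_a + err_b

# Alex: 55.2 +/- 3.3 patm; a classmate (invented): 57.9 +/- 3.1 patm.
if ranges_overlap(55.2, 3.3, 57.9, 3.1):
    print("Ranges overlap: treat the two scores as similar.")
else:
    print("Ranges do not overlap: the difference is more trustworthy.")
```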

Stanines
Stanines are used to compare an individual student’s achievement with the results obtained by a national reference sample chosen to represent a certain year level. Stanines divide the distribution of results for a year group into nine categories. Most students, when compared with their own year level, achieve around stanines four, five, and six. Stanines seven, eight, and nine represent comparatively high achievement for a year group, while stanines one, two, and three indicate comparatively low achievement. It is important to remember that the national reference sample data for the PAT tests was collected in March. This means that when a test is administered at the end of the year it will be more appropriate to make stanine comparisons with the next highest year level.
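Stanines follow the standard normal-curve split, with approximately 4, 7, 12, 17, 20, 17, 12, 7 and 4 percent of the reference sample falling into stanines one through nine. A minimal sketch (Python) maps a percentile rank within the reference year to a stanine using the cumulative cut-offs implied by that split:

```python
import bisect

# Cumulative percentages at the top of stanines 1-8, from the standard
# normal-curve split (4, 7, 12, 17, 20, 17, 12, 7, 4 percent).
STANINE_CUTOFFS = [4, 11, 23, 40, 60, 77, 89, 96]

def stanine(percentile):
    """Map a percentile rank (0-100) within the reference year to a stanine."""
    return bisect.bisect_right(STANINE_CUTOFFS, percentile) + 1

for p in (3, 50, 95, 99):
    print(f"percentile {p:>2} -> stanine {stanine(p)}")
```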

More information about stanines and how they were derived can be found in the teacher manuals or by following the link below.

More information about stanines

The List Report generated contains the following group and individual statistics:

Test group statistics

  • Number in class
  • The class mean on the patm scale
  • The standard deviation of class estimates
  • A mean stanine for the class

Individual statistics

  • The raw test score
  • The patm scale estimate with a confidence band
  • The number of items omitted
  • An individual stanine score

[Image: an example List Report]

Understanding Scale Scores

October 18th, 2011

What is a scale score?

The NZCER marking service uses scale scores to report student achievement on PAT:Reading (Comprehension and Vocabulary), PAT:Listening Comprehension, PAT:Mathematics and STAR (revised 2011). A scale score represents the conversion of a raw test score to a location on a described equal-interval scale designed to measure progress over several year levels. There are three separate PAT scales: the PAT:Mathematics scale, the PAT:Reading Comprehension scale and the PAT:Reading Vocabulary scale. The PAT:Mathematics scale, along with a sample of descriptors from different content categories, is shown below.

The process used to convert raw scores to scale scores takes into account the difficulty of the questions in the tests. This means that scale scores can be compared directly, regardless of which test in a series was originally administered.
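The conversion rests on item response modelling. As a rough illustration of the principle (not NZCER’s actual calibration: the question difficulties and the logit units below are invented, and patm units would be a rescaled version of this kind of estimate), the sketch assumes a simple Rasch model and finds the ability at which the expected number of correct answers equals the student’s raw score. The same raw score on a harder test then maps to a higher scale location:

```python
import math

def expected_score(ability, difficulties):
    """Expected raw score under a Rasch model: the sum over questions of
    P(correct) = 1 / (1 + exp(-(ability - difficulty)))."""
    return sum(1.0 / (1.0 + math.exp(-(ability - d))) for d in difficulties)

def ability_for_raw_score(raw, difficulties, lo=-6.0, hi=6.0):
    """Bisect for the ability whose expected score equals the raw score
    (expected_score is monotonic in ability, so bisection converges)."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if expected_score(mid, difficulties) < raw:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Invented question difficulties (in logits) for two tests in one series.
easier_test = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5]
harder_test = [-0.5, 0.0, 0.5, 1.0, 1.5, 2.0]

# The same raw score of 4/6 implies a higher ability on the harder test.
for name, test in (("easier", easier_test), ("harder", harder_test)):
    print(f"raw score 4/6 on the {name} test -> ability "
          f"{ability_for_raw_score(4, test):+.2f} logits")
```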

Each PAT scale uses its own measurement unit to measure progress. For instance, the PAT:Mathematics scale uses a unit called “patm” (short for PAT:Mathematics unit). Each unit on a PAT scale represents the same amount of progress no matter where the student is located on the scale. For example, a student who increases from 20 to 30 patm units is considered to have made the same amount of progress as a student who moves from 50 to 60 patm units. This equal-interval property makes the PAT scales ideal for tracking student achievement over time. Students’ scale scores can be recorded from test to test to show their growing levels of knowledge and skill.

Comparing achievement with national norms
Once a raw score has been converted to a scale score it becomes possible to compare a student’s achievement with the achievement of national reference samples at different year levels. A scale score of 60 patm units, for instance, represents very high achievement for a Year 4 student (stanine 9), but below average achievement for a student in Year 10 (stanine 4). The NZCER marking service allows the user to choose which reference year will be used for these comparisons. It is important to remember that data from the reference samples were collected in March.

The scale description
When each of the new reading and mathematics PAT tests was developed, the relative difficulty of every question in each test was located on the appropriate scale. This allowed the scales to be described in terms of the knowledge and skill associated with questions at different scale locations. A student’s achievement can therefore be reported in terms of the knowledge and skill required to correctly answer questions located at or below the student’s own scale score. The scale is defined so that a student whose scale score is at the same position as a particular set of questions is expected to answer correctly 50 percent of those questions, and far more than 50 percent of the questions located further down the scale.
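In item response terms, this 50 percent property reflects the success probability depending on the gap between the student’s location θ and the question’s location b. A Rasch-style formulation (an illustration of the principle, not necessarily NZCER’s exact model):

```latex
% Probability of a correct answer as a function of the gap (theta - b):
P(\text{correct} \mid \theta, b) = \frac{1}{1 + e^{-(\theta - b)}}
% When theta = b the probability is exactly 1/2; when theta > b it exceeds
% 1/2 and approaches 1 as the question sits further below the student.
```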

Using the PAT scales to indicate curriculum levels
By assigning each question to a specific curriculum level it has been possible to work out where the different curriculum levels fall on the PAT:Mathematics and PAT:Reading Comprehension scales. The graphic of the PAT:Mathematics scale shown below displays these curriculum locations using shaded bars. It is important to note that questions from each curriculum level cover a section of the scale, rather than a fixed point, and that these sections overlap.

The curriculum level a student is working at can be estimated by comparing the student’s scale score with the display of curriculum levels. For instance, using the PAT:Mathematics scale below, we can see that the middle of the Level 4 range is at approximately 60 patm units on the scale. This means a student whose scale score is 60 patm units is expected to be able to answer about 50 percent of the questions in PAT:Mathematics considered to represent Level 4 of the national curriculum. He or she will be expected to answer correctly a much higher percentage of the Level 3 questions, but a much lower percentage of the Level 5 questions. This student therefore could be considered capable of working at Level 4. A similar graphic showing curriculum levels for PAT:Reading Comprehension is presented in the PAT:Reading teacher’s manual.

Scale Scores

[Image: the PAT:Mathematics scale, with sample descriptors and curriculum level locations]

The Class Report

March 5th, 2009

Introduction

The Class Report shows how the students in a class are distributed along the scale. It also provides a comparison with national reference group data by showing the stanine distribution for a chosen year group. The Class Report is available for PAT:Mathematics, PAT:Reading Comprehension and PAT:Reading Vocabulary.

Understanding the report

The Class Report positions each student in a test group on the PAT:Mathematics scale according to their achievement on the test. A shortened form of the student’s name is used to show where they achieved. The scale can be used to see what each student scored on the test and what their test score converts to in terms of a PAT:Mathematics scale score.

It is important to remember that a student’s location on the scale is not a precise point.  Measurement error means that a student’s location on the scale should best be identified as a range.

Links to other reports

Clicking on a student’s name will take you directly to the Individual Report for that student.

[Image: an example Class Report]


Understanding the Individual Report for PAT:Mathematics (new)

March 3rd, 2009

Introduction
An Individual Report maps how well a student has performed on the different questions within a test. Although the Individual Reports for PAT:Mathematics, PAT:Reading Comprehension and PAT:Reading Vocabulary look different, they are all built around a set of common concepts. There are two versions of the Individual Report for PAT:Mathematics: the standard view and the alternative view. When viewing the Individual Report for PAT:Mathematics, users may switch between the two views using a clickable button on the results bar at the top right of the report.

Understanding the standard view
The standard view of the Individual Report for PAT:Mathematics mirrors the student report that may be constructed by hand using the blackline masters contained in the teacher’s manual. The report displays the questions positioned according to their location on the PAT:Mathematics scale and grouped according to their content category. Questions that the student has answered correctly are shown as black-filled circles, while those answered incorrectly are shown as unfilled circles. Questions that were omitted by the student are shown as grey-filled circles. The student’s overall level of achievement is indicated by the dotted line that crosses the page and intersects the scale and the stanine score distributions for three different year levels. The dashed lines above and below the dotted line indicate the measurement error associated with the student’s score. If the test could be repeated we would expect the student to score in the range indicated by the dashed lines about two-thirds of the time. Students who achieve very highly or very poorly on a test will have a larger error associated with their score.

Comparing the location of questions with a student’s achievement
The student’s location on the scale may be compared with the locations of the questions on the scale.  Typically, a student is most likely to answer correctly the questions listed below their achievement level (as shown by the dotted line).  When a question is located well below the student’s own level of achievement there is a strong expectation that the question will be answered correctly.  In contrast, when a question is located well above the student’s level of achievement there is a low expectation that it will be answered correctly.

Question descriptions
A short description for each question is provided on the right hand side of the report. These descriptions can also be generated by hovering the mouse over a question in the report. A tick symbol next to the question description indicates that the question was answered correctly.

The actual questions themselves can be displayed by clicking on a question number.  This opens the Individual Item Report for that question, which displays the actual question as presented in the test and provides a range of information about the question, including showing which options were selected by students and providing links to other similar questions in the test.

Gaps and strengths
Evidence that a gap exists in a student’s knowledge could be indicated when a student has incorrectly answered a question that he or she was expected to answer correctly. Although it could also have been just a “silly mistake”, it should be followed up and investigated further. Similarly, when a student has correctly answered a question that was expected to be answered incorrectly it could be evidence that he or she has a particular strength in an area. Again, there could also be other reasons; for instance, the student may have simply made a lucky guess. In the report shown below, the student has answered Question 7 incorrectly (it is shown as an unfilled circle). Given the student’s overall level of achievement this question was expected to be answered correctly. It is possible that he has a gap in this area, which could be confirmed with further questioning.
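This kind of screening can be done systematically. A minimal sketch (Python; the question locations, responses and the “well above/below” margin are all invented for illustration) flags questions answered incorrectly despite sitting well below the student’s scale location, and questions answered correctly despite sitting well above it:

```python
# (question number, scale location, answered correctly) - invented data.
questions = [
    (5, 42.0, True),
    (7, 46.5, False),   # well below the student's level, yet wrong
    (12, 58.0, True),
    (18, 71.0, True),   # well above the student's level, yet right
    (22, 74.5, False),
]

student_score = 60.0
margin = 8.0  # "well above/below" threshold in scale units (arbitrary)

for number, location, correct in questions:
    if not correct and location < student_score - margin:
        print(f"Q{number}: possible gap - expected correct, answered wrong")
    elif correct and location > student_score + margin:
        print(f"Q{number}: possible strength - expected wrong, answered right")
```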

Comparing Performance in Different Content Categories
The Individual Report can be used to provide an indication of how a student has performed in the different content categories.  However, it is important to note that each content category has a different number of questions and that these questions vary in difficulty.  This makes it unwise to make direct comparisons between content categories on the basis of the number of questions answered correctly.  There is evidence that a student is performing less successfully in a content category when he or she is unable to answer the majority of the questions in that content category that are located below the student’s location on the scale.

Printing
The standard view of the PAT:Mathematics report is printed on two pages. The first page provides the graphic showing the scale and the question locations, while the second page displays the question descriptors.

[Image: an example Individual Report, standard view]

Understanding the alternative view

The Individual Report for PAT:Mathematics can be viewed in an alternative configuration.  In this view questions within a test are displayed according to their content categories, using the question number and a short, one-sentence description of what the question involves.  Bold print is used to indicate when the student has correctly answered a question.  Within each content category the questions are ordered according to their difficulty levels, with the more difficult questions placed higher on the page.

The dashed line that runs through the report is used to indicate how the student’s overall level of achievement compares with the difficulty of the questions.  Typically, a student is more likely to answer correctly the questions listed below the line than above it.  The report uses coloured shading to indicate when questions are well above or well below the student’s achievement level.  A question below the line with a dark green background is expected to be very easy for the student and a question with a dark red background above the line is expected to be very difficult for the student.  By scanning the different content areas, and looking at the student’s pattern of responses, especially with questions that were expected to have been easy or difficult, the reader can quickly get a feeling for the student’s performance on the different content categories.

Gaps and strengths
As above, evidence that a gap exists in a student’s knowledge could be indicated when a student has incorrectly answered a question that he or she was expected to answer correctly. Although it could also have been just a “silly mistake”, it should be followed up and investigated further. Similarly, when a student has correctly answered a question for which an incorrect response was expected, it could be evidence that he or she has a particular strength in an area. Again, there could also be other reasons.

Understanding Stanines

June 13th, 2008

Stanines
Stanines are used to compare an individual student’s achievement with the results obtained by a national reference sample chosen to represent a certain year level. Stanines divide the distribution of results for a year group into nine categories. Most students, when compared with their own year level, achieve around stanines four, five, and six. Stanines seven, eight, and nine represent comparatively high achievement for a year group, while stanines one, two, and three indicate comparatively low achievement.

Individual Test Item Report

June 13th, 2008

On the homepage a reference year is displayed. The matching year level is displayed by default; however, you may enter a different year level (3–10) if you wish your test group to be compared with a different national level.

Click on the ‘Individual test item report’ icon in the row belonging to the test group you want the report for.

The Individual test item report provides a detailed overview of how a test group responded to each test item.

It is linked to the Individual reports via student name. Clicking on a student’s name in the table will generate an Individual report for that student on this test.

Question numbers are displayed at the top and bottom of the page, so you can jump between test questions as you like.

For Mathematics, a list of related questions within the test is displayed. Clicking on any of the related numbers will generate the Individual test item report for the related item.

Further links into the ARB website are given at the bottom of the page. These may be useful for follow-up study in possible areas of weakness indicated by the Individual test item report.

When you have finished with the report, click on the ‘close’ button in the top right-hand corner of the screen.

[Image: an example Individual Test Item Report]

Understanding the Stanine Report

June 11th, 2008

Generated for PAT:Mathematics, PAT:Reading (Comprehension and Vocabulary) and STAR

Introduction
The stanine report displays the stanine score distribution for a group of students on a particular test. The results are displayed using a small student icon. A key is used to show how many students each icon represents. Using the mouse to point to an icon will display information about the actual student or students the icon represents. If the icon represents a single student, clicking on the icon will take the user to the individual report for that student.

The stanine report can be used to display subgroups of students (for instance boys, girls and ethnicities). Click on the appropriate drop-down list to select which subgroups are to be displayed. It is possible to use both the gender and ethnicity filters at the same time.

The options for display are:

  • all students
  • boys
  • girls
  • boys and girls together, side-by-side.

The Stanine Report

[Image: an example Stanine Report]

Teacher Manuals

June 1st, 2008

Teacher manuals are available for purchase through our online shop. A sample of the PAT Mathematics Teacher Manual is available here.

Using the direct links below you can purchase full copies of the manuals by ordering online or offline.

PAT Mathematics Teacher’s Manual

PAT Reading Teacher’s Manual

PAT Listening Teacher’s Manual

STAR Teacher’s Manual – revised 2011

PAT Mathematics – Making Sense of the Reports

June 1st, 2008

This booklet contains detailed descriptions of each report generated for PAT Mathematics. Download a copy here.