2019 RISE Validity and Reliability

DATE:  
Thursday, May 16, 2019

TO:  
Elementary Principals
Middle School Principals

FROM:  
Dr. Anthony Godfrey, Associate Superintendent
Ben Jameson, Director of Evaluation, Research and Accountability

SUBJECT:   
2019 RISE Validity and Reliability


RISE Reports on Nextera:
Classroom reports for RISE continue to be delayed by up to a week before accurate information is available.  When students submit the test, the scale score they see is accurate, but that score can take up to a week to appear accurately on classroom, school and district-level reports.  This means that schools will need to wait a week after they have completely finished RISE testing to access accurate raw scores for their school.

RISE Validity and Reliability:
There have been a number of questions and concerns raised about the validity and reliability of the RISE test and the effect that the five outages have had on student scores.  To clarify, validity, as used in the assessment world, means that the question items on the test measure what they are intended to measure.  In the case of RISE, where we are using the same question items that were in SAGE, we can be confident in the validity of the test because the items have been shown to align with the Utah State Core.  There are five years of SAGE data to confirm this.

Educators often ask, “Were my students able to demonstrate their proficiency as intended?”  This is actually a question of reliability.  In other words, how consistent are students’ scores at representing their achievement when compared with their previous results?  In an email to superintendents across the state earlier this week, State Superintendent Sydnee Dickson said, “Unfortunately, the mounting issues with the operating platform created by Questar bring up many questions that will need to be answered.  The frequency of the problems that have occurred may impact the Utah State Board of Education’s ability to use the results for purposes of statewide accountability…. We believe the results can still be used to inform classroom instruction and individual student learning.  However, we are less confident about overall accountability.”

We recommend that principals be cautious about telling their teachers that their students’ RISE scores are not valid and reliable.  At this point, we do not know this for sure.  Here’s what we do know:

  • We know that the test itself is valid: it measures what it is intended to measure.
  • We know that students who were not affected by system outages will have had an opportunity to demonstrate their proficiency similar to that of the previous five SAGE-tested years.
  • We also know that if we filter out students whose scores were affected by outages (a small percentage of students in our district, since most of our schools started testing later in the RISE window), we will have more accurate scores.  Those scores will be accurate enough that principals and teacher teams may make instructional decisions based on RISE data, along with their own formal and informal assessment data, as has been done in years past.
  • We know that we don’t know how students’ scores were affected; therefore, it would not be appropriate to conclude that the data are invalid.  In fact, four of the five RISE system outages were issues with submitting a test session after the student had already finished that segment.  Last Friday’s outage was an issue of students and educators not being able to log into the system to take the test.  The outages had less of an impact on students while they were actually testing.

School Accountability:
While we will likely be able to use RISE data for instructional decisions, its use for school accountability has not yet been determined.  The issue is that the state accountability system requires us to count all students who participated, even those who were negatively impacted by the system outages.  Questar will analyze the RISE results of all students to determine whether, and to what extent, the system outages had an impact.  The USBE will engage an objective third party to analyze and verify those results.  We will monitor this analysis closely.  Depending on what they find, there are three options:

  1. If the analysis determines there is little impact, the state will move forward with the accountability system as planned.
  2. If the analysis determines there was an impact, there is an option to place an asterisk indicating where there have been testing irregularities that may have skewed the data.
  3. If the analysis determines there was a more significant impact, the USBE may decide to discontinue the accountability system for the 2018-19 school year. Superintendent Dickson indicated that this was a last resort.

Please contact Ben Jameson with any questions you may have.  Watch for more communication throughout the summer and fall of 2019 regarding the state’s findings.