4 Test Administration

Chapter 4 of the Dynamic Learning Maps® (DLM®) Alternate Assessment System 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017) describes general test administration and monitoring procedures. This chapter describes updated procedures and data collected in 2019–2020.

For a complete description of test administration for DLM assessments, including information on available resources and materials and information on monitoring assessment administration, see the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017).

4.1 Overview of Key Administration Features

This section describes DLM test administration for 2019–2020. For a complete description of key administration features, including information on assessment delivery, Kite® Student Portal, and linkage level selection, see Chapter 4 of the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017). Additional information about administration can also be found in the Test Administration Manual 2019–2020 (Dynamic Learning Maps Consortium, 2019c) and the Educator Portal User Guide (Dynamic Learning Maps Consortium, 2019b).

4.1.1 Change to Administration Model

Instructionally embedded assessments were available for teachers to optionally administer between September 9 and December 20, 2019, and between January 1 and February 26, 2020. During the consortium-wide spring testing window, which occurred between March 9 and June 5, 2020, students were assessed on each Essential Element (EE) on the blueprint. Each state education agency sets its own testing window within the larger consortium spring window.

The COVID-19 pandemic significantly affected the spring 2020 window, resulting in all states ending assessment administration earlier than planned. State education agencies were given the option to continue assessments later in the year, extending to the end of June; however, no state education agency used this option.

4.1.2 The Instruction and Assessment Planner

In 2019–2020, the Instructional Tools Interface that was used for administering the optional fall instructionally embedded assessments was replaced with the Instruction and Assessment Planner (“planner” hereafter). The planner is designed to facilitate a cycle of instruction and assessment throughout the instructionally embedded windows.

The planner includes information to track the lifecycle of an instructionally embedded assessment from the creation of an instructional plan to the completion of the testlet. Student performance on instructionally embedded testlets is also reported within the planner to assist teachers in monitoring student progress and planning future instruction. For a complete description of the planner, including the development process, see Chapter 4 of the 2019–2020 Technical Manual Update—Instructionally Embedded Model (Dynamic Learning Maps Consortium, 2020a).

4.2 Administration Evidence

This section describes evidence collected during the spring 2020 operational administration of the DLM alternate assessment. The categories of evidence include data relating to the length of the assessment and the adaptive delivery of testlets in the spring window.

4.2.1 Adaptive Delivery

During the spring 2020 test administration, the science assessments were adaptive between testlets, following the same routing rules applied in prior years. That is, the linkage level of the next testlet a student received was determined by the student’s performance on the most recently administered testlet, with the goal of matching the linkage level content to the student’s knowledge and skills.

  • The system adapted up one linkage level if the student responded correctly to at least 80% of the items measuring the previously tested EE. If the previous testlet was at the highest linkage level (i.e., Target), the student remained at that level.
  • The system adapted down one linkage level if the student responded correctly to less than 35% of the items measuring the previously tested EE. If the previous testlet was at the lowest linkage level (i.e., Initial), the student remained at that level.
  • Testlets remained at the same linkage level if the student responded correctly to at least 35% but less than 80% of the items on the previously tested EE.
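These routing rules amount to a simple between-testlet decision. The following Python sketch is a minimal illustration of that logic under assumed names (the ordered list of science linkage levels and the next_linkage_level function are illustrative only and are not the operational routing implementation):

    # Minimal sketch of the between-testlet routing rules described above.
    # Names are illustrative; operational routing is applied within Kite.
    LINKAGE_LEVELS = ["Initial", "Precursor", "Target"]  # lowest to highest

    def next_linkage_level(current_level: str, percent_correct: float) -> str:
        """Return the linkage level of the next testlet for an EE, given the
        percentage of items answered correctly on the previous testlet."""
        index = LINKAGE_LEVELS.index(current_level)
        if percent_correct >= 80:
            index = min(index + 1, len(LINKAGE_LEVELS) - 1)  # adapt up, capped at Target
        elif percent_correct < 35:
            index = max(index - 1, 0)  # adapt down, floored at Initial
        # otherwise, remain at the same linkage level
        return LINKAGE_LEVELS[index]

    # For example, a student who answered 85% of items correctly on a
    # Precursor testlet would be routed to a Target testlet next.
    assert next_linkage_level("Precursor", 85) == "Target"
    assert next_linkage_level("Target", 90) == "Target"
    assert next_linkage_level("Precursor", 20) == "Initial"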

The linkage level of the first testlet assigned to a student was based on First Contact survey responses. The correspondence between the First Contact complexity bands and first assigned linkage levels is shown in Table 4.1.

Table 4.1: Correspondence of Complexity Bands and Linkage Levels

First Contact Complexity Band    Linkage Level
Foundational                     Initial
1                                Initial
2                                Precursor
3                                Target
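Under the same illustrative conventions, the assignment of the first linkage level from a student’s First Contact complexity band (Table 4.1) amounts to a simple lookup; the dictionary and function names below are hypothetical:

    # Illustrative lookup of the first assigned linkage level (Table 4.1).
    # The "Band" prefixes are spelled out here for readability.
    FIRST_CONTACT_TO_LINKAGE = {
        "Foundational": "Initial",
        "Band 1": "Initial",
        "Band 2": "Precursor",
        "Band 3": "Target",
    }

    def first_linkage_level(complexity_band: str) -> str:
        """Return the linkage level of the first testlet assigned to a student."""
        return FIRST_CONTACT_TO_LINKAGE[complexity_band]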

For a complete description of adaptive delivery procedures, see Chapter 4 of the 2015–2016 Technical Manual—Science (Dynamic Learning Maps Consortium, 2017). For a summary of student adaptive routing during the spring 2019 administration, see Chapter 4 of the 2018–2019 Technical Manual Update—Science (Dynamic Learning Maps Consortium, 2019a).

4.2.2 Administration Incidents

As in all previous years, testlet assignment during the spring 2020 assessment window was monitored to ensure students were correctly assigned to testlets. Administration incidents that have the potential to affect scoring are reported to state education agencies in a supplemental Incident File. No incidents were observed during the spring 2020 assessment window. Assignment of testlets will continue to be monitored in subsequent years to track any potential incidents and report them to state education agencies.

4.3 Implementation Evidence

This section describes additional resources that were made available during the spring 2020 operational implementation of the DLM alternate assessment. For evidence relating to user experience and accessibility, see the 2018–2019 Technical Manual Update—Science (Dynamic Learning Maps Consortium, 2019a).

4.3.1 Data Forensics Monitoring

Beginning with the spring 2020 administration, two data forensics monitoring reports were made available in Educator Portal. The first report includes information about testlets completed outside of normal business hours. The second report includes information about testlets that were completed within a short period of time.

The Testing Outside of Hours report allows state education agencies to specify the days, and the hours within a day, during which testlets are expected to be completed (for example, Monday through Friday from 6:00 a.m. to 5:00 p.m.). Each state selects its own days and hours for setting expectations. The Testing Outside of Hours report then identifies students who completed assessments outside of the defined expected hours. The report includes the student’s first and last name, district, school, name of the completed testlet, and the time the testlet was started and completed. Information in the report is updated at approximately noon and midnight each day, and the report can be viewed by state education agencies in Educator Portal or downloaded as a CSV file.
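The check underlying this report can be thought of as a comparison of each testlet’s completion time against the state-configured schedule. The sketch below is a minimal illustration, assuming the example schedule above and hypothetical function and variable names; it does not reflect the report logic as implemented in Educator Portal:

    from datetime import datetime, time

    # Hypothetical state-configured expectation: Monday through Friday,
    # 6:00 a.m. to 5:00 p.m. Days use datetime.weekday(), where Monday == 0.
    EXPECTED_DAYS = {0, 1, 2, 3, 4}
    EXPECTED_START = time(6, 0)
    EXPECTED_END = time(17, 0)

    def completed_outside_hours(completed_at: datetime) -> bool:
        """Return True if a testlet was completed outside the expected days or hours."""
        if completed_at.weekday() not in EXPECTED_DAYS:
            return True
        return not (EXPECTED_START <= completed_at.time() <= EXPECTED_END)

    # For example, a testlet completed at 9:30 p.m. on a Tuesday would be flagged.
    assert completed_outside_hours(datetime(2020, 3, 10, 21, 30))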

The Testing Completed in a Short Period of Time report identifies students who completed a testlet within an unexpectedly short period of time. The threshold for inclusion in the report was a testlet completion time of less than 30 seconds. The report includes the student’s first name, last name, grade, and state student identifier. Also included are the district, school, teacher, name of the completed testlet, number of items on the testlet, an indicator for whether all items were answered correctly, the number of seconds taken to complete the testlet, and the starting and completion times. Information in the report is updated at approximately noon and midnight each day, and the report can be viewed by state assessment administrators in Educator Portal or downloaded as a CSV file.
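A similarly simple duration check underlies this report. The sketch below assumes start and completion timestamps and applies the 30-second threshold described above; the names are illustrative:

    from datetime import datetime

    # Threshold, in seconds, for flagging an unexpectedly fast completion.
    SHORT_COMPLETION_SECONDS = 30

    def completed_too_quickly(started_at: datetime, completed_at: datetime) -> bool:
        """Return True if the testlet was completed in under 30 seconds."""
        duration = (completed_at - started_at).total_seconds()
        return duration < SHORT_COMPLETION_SECONDS

    # For example, a testlet started at 10:00:00 and completed 20 seconds later
    # would appear in the report.
    assert completed_too_quickly(datetime(2020, 3, 10, 10, 0, 0),
                                 datetime(2020, 3, 10, 10, 0, 20))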

4.3.2 Released Testlets

The DLM Alternate Assessment System provides educators and students with the opportunity to preview assessments by using released testlets. A released testlet is a publicly available sample DLM assessment. Released testlets cover the same content and are in the same format as operational DLM testlets. Students and educators can use released testlets as examples or opportunities for practice. Released testlets are developed using the same standards and methods used to develop testlets for the DLM operational assessments. New released testlets are added on a yearly basis.

In response to state inquiries about supplemental assessment resources to address the increase in remote or disrupted instruction due to COVID-19, the DLM team published additional English language arts, mathematics, and science released testlets during the spring 2020 window. Across all subjects, nearly 50 new released testlets were selected and made available through Kite Student Portal. To help parents and educators better review the available options for released testlets, the DLM team also provided tables for each subject that display the Essential Elements and linkage levels for which released testlets are available.

The test development team selected new released testlets that would have the greatest impact for remote or disrupted instruction. The team prioritized testlets at the Initial Precursor, Distal Precursor, and Proximal Precursor linkage levels, as those linkage levels are used by the greatest number of students. The test development team selected testlets written to Essential Elements that covered common instructional ground, taking previously released testlets into account to minimize overlap between the testlets that were already available and the new released testlets. The test development team also aimed to provide at least one new released testlet per grade level, where possible.

4.4 Conclusion

The Instruction and Assessment Planner was introduced to better support learning, instruction, and the process of administering DLM testlets. Additionally, new data forensics monitoring reports were made available to state education agencies in Educator Portal. Finally, the DLM team published additional English language arts, mathematics, and science released testlets during the spring 2020 window to support remote or disrupted instruction resulting from COVID-19. Updated results for administration time, linkage level selection, user experience with the DLM system, and accessibility supports were not provided in this chapter due to the limited samples in the spring 2020 window, which may not be representative of the full DLM student population. For a summary of these administration features in 2018–2019, see Chapter 4 of the 2018–2019 Technical Manual Update—Science (Dynamic Learning Maps Consortium, 2019a).