
Economics Cumulative Assesslet Blueprint


Purpose of the Assesslets

Assesslets are formative tools for teachers to assess student learning and guide instruction. They are aligned to the Georgia Standards of Excellence (GSE). Although not intended to predict performance on state summative assessments, Assesslets can provide information on how well students understand grade-level concepts and their ability to apply knowledge and skills to extended reasoning and critical thinking beyond basic recall. As formative tools, Assesslet items are designed to provide clear feedback on a student’s understanding of the state standards, including potential student misconceptions.

Like any formative assessment, Assesslets are best used as a means of guiding instructional
next steps. These may include revisiting items many students answered incorrectly, providing
students an opportunity to revise and improve their constructed and extended responses, and
forming small-group follow-up instruction based on patterns in the data.

Development Process

State resources, such as assessment guides, achievement level descriptors, and scoring
samples, were used when creating Assesslets to determine the item formats that best prepare
students for state assessments.

The development team consists of highly qualified subject matter experts who apply a deep
understanding of the curriculum and best practices in item and assessment development. Many
of our team members are Georgia educators with vast knowledge and experience in the
classroom. Subject matter experts participated in Item Development Training and qualified as
item writers by completing a series of performance tasks in assessment development.

All Assesslet items are written and reviewed during the development process by experienced
educators with expertise in the content area and grade level. Items then go through multiple
rounds of review. The review process is rigorous, checking for: alignment to standards and
appropriate level of cognitive complexity, accuracy of content and answer key, compliance with
item specifications, and adherence to Universal Design principles, which provide access to the
widest possible range of learners. Once items are finalized for content, they go through a final
editorial review.

Following each school year, items are reviewed based on statistical item analysis information for
Assesslet items that have a large enough sample size. Item analysis information may include
answer choice or score point distributions, percentage correct (e.g., p-value), and item
discrimination information (e.g., biserial correlations). Flagged items are reviewed, discussed,
and revised if it is determined changes are needed.
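The statistics named above can be illustrated with a small sketch. This is a hypothetical example for clarity only, assuming dichotomous (0/1) item scores; the function names are invented and this is not the platform's actual analysis code.

```python
import math

def p_value(item_scores):
    """Percentage correct for one item: the fraction of students scoring 1."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    """Item discrimination: correlation between a 0/1 item score and the
    total test score. Higher values mean students who answer the item
    correctly also tend to score higher overall."""
    n = len(item_scores)
    mean_total = sum(total_scores) / n
    sd_total = math.sqrt(sum((t - mean_total) ** 2 for t in total_scores) / n)
    p = p_value(item_scores)
    # Mean total score among students who answered the item correctly.
    mean_correct = (sum(t for s, t in zip(item_scores, total_scores) if s == 1)
                    / sum(item_scores))
    return (mean_correct - mean_total) / sd_total * math.sqrt(p / (1 - p))

# Example: six students' 0/1 scores on one item and their total test scores.
item = [1, 1, 0, 1, 0, 0]
totals = [9, 8, 4, 7, 5, 3]
print(round(p_value(item), 2))              # 0.5
print(round(point_biserial(item, totals), 2))  # 0.93
```

A low p-value flags an item most students missed, while a low or negative point-biserial flags an item that fails to distinguish stronger from weaker students; both patterns would prompt the review described above.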

Types of Items

Selected-Response (SR) Items
A selected-response item, sometimes called a multiple-choice item, is a question, problem, or
statement that appears on an assessment, followed by several answer choices. The incorrect
choices, called distractors, are designed to reflect common misconceptions. The student's task is to select the best answer to the question. Items in grades 1-2 have three answer
choices: one correct key and two distractors. Beginning in grade 3, items have four answer
choices: one correct key and three distractors.

These items include rationales for each answer choice. Rationales provide clarity around
common misconceptions in the student’s thinking for incorrect choices.

Constructed-Response (CR) Items
A constructed-response item is an open-ended item that asks the student to provide a response
that he or she constructs, rather than selecting from a set of answer choices. On Assesslets, full
credit (two points) is given for a complete response. Partial credit may be awarded if part of the
response is correct.

These items include a rubric and an exemplar, which is a model response at the highest score
point. Each exemplar is written as if it were an actual student’s response.

Extended-Response (ER) Items
An extended-response item is an open-ended item that asks the student to provide a longer,
more detailed response than a two-point constructed-response item. On Assesslets, full credit
(four points) is given for a complete response. Partial credit may be awarded if part of the
response is correct.

These items include a rubric and an exemplar, which is a model response at the highest score
point. Each exemplar is written as if it were an actual student’s response.

Scoring Process

All Assesslet constructed-response (CR) and extended-response (ER) items are scored by a
trained evaluator while the selected-response items are scored automatically. Evaluators are
trained to score each grade-level Assesslet based on the scoring rubric. To ensure accuracy,
scoring supervisors review a sample of all responses assigned scores by evaluators. A scoring
platform is used that provides scoring supervisors a variety of tools to monitor scoring and
maintain scoring accuracy. The scoring rubrics are intended to guide teachers as they interpret
student responses and scores. All CR and ER items are scored within 6 business days after an
administration window has closed. Individual student results are released as soon as the scoring process has been completed.

If the Assesslet was administered in a small group or classroom setting with support from the
teacher to help students better understand how to approach the items in the Assesslet, the
individual results must be interpreted with caution. Since students received guidance and
support, the results may not be a true indication of an individual student’s performance.

If the Assesslet was administered to students to complete independently, the scores and
feedback from Assesslets may inform teachers' instructional next steps in several ways:

- What students could focus on to improve their constructed- and extended-response answers, if given an opportunity to revise (highly recommended)
- How individual students performed on the Assesslet, and how students with similar levels of performance could receive common follow-up instruction
- Student understanding of grade level concepts
- Whether students were able to follow the format of the Assesslet
- Which item types and/or domains were most difficult for students
- What students need to review prior to the summative assessment

Assesslets are not intended to predict performance on a summative assessment.

Evaluator Feedback

Evaluator feedback is included for the constructed- and extended-response items and offers additional context for a student's assigned score that is beneficial to students, teachers, and administrators. Evaluator feedback moves beyond the numerical score by explaining why a response was assigned that score. This can give teachers the information needed to connect a student's individual response to an overall score and to provide additional guidance for students.

Evaluator feedback provides perspective on how students might revise and therefore improve their responses. Further, students who receive the same evaluator feedback might be grouped together, and the teacher might provide a brief mini lesson that elaborates on this common evaluator feedback. Teachers might also decide to provide focused, more tailored feedback on their individual students’ responses. If students follow through on the guidance provided, their responses can improve.

Timeline of Results

Selected-response items are scored automatically within the District and School Connect
platform and can be viewed immediately by clicking on the “Live Responses” button or by
viewing individual student response cards.

Constructed- and extended-response items are scored by trained evaluators who assign points
and provide feedback on student responses within 6 business days after the Assesslet
administration window has closed. Assesslets with a "Complete" status have been scored, and
results with feedback are available.


Have Questions?

Our Lennections team would appreciate the opportunity to discuss your district's or school's academic goals.
