Access Dynamic Item Analysis Reports
The Dynamic Item Analysis by Test report focuses on item performance and the related domains or strands.
Teachers can base their academic goals on these results and align their curricula with them to effectively influence student learning.
Depending on your project, you may see links to view the items, student responses, and rubrics.
The Pearson Access Report Builder dynamically generates reports based on selections you make in each Report Builder section — Who, What, and How. You can change your criteria and generate various reports for different organizations, student groups, or test(s).
| Category | Description |
|---|---|
| Who | Includes organizations and student groupings. By default, all organizations and groupings are selected. |
| What | Includes subjects, tests, administrations, and available report types. |
| How | Includes columns and rows that affect how data appears within the report. |
Prerequisites
- Confirm your role and project under your name, and your district or school in the Organization dropdown. If you have access to multiple projects or organizations, click each dropdown to select the one(s) you want to view.
The Pearson Access user interface and available options change based on the organization level to which you are pinned. If you select a district or school, you see different options than if you are pinned to a local education agency (LEA) or department of education (DOE).
- Tests must have been previously scored. Pearson generates these reports from the responses received from your class, so a report reflects only those responses.
Step-by-Step
1. From the main menu, click (or tap) Report Builder.
2. In the Who section, click an organization or grouping and select from the available options. Click Confirm. By default, all Who options are selected.
3. In the What section, select at least one Subject and academic year. Click Confirm for each.
4. Click Report Type, select a report type, and then click Confirm.
5. You can also scroll past Report Type to expand options in the How (Columns) and How (Rows) sections to indicate how data appears. The report can include these columns, if selected.
Columns also vary by project:
- Score Type indicates the score the student earned (for example, raw or scale).
- Score indicates the test's hierarchical level at which the score is being provided. For example, the Overall Test, the domain, the strand, or the subject.
- Non-Numeric Score displays any information that an administrator or teacher wants to provide about student performance.
- % Correct displays a bar graph; the outline represents 100%, and the color within the outline represents the percent correct.
- Performance (Level) indicates the performance level.
- Test Count indicates the number of students included in that row.
Rows vary, depending on the report type that you selected. See the examples below:
- Score indicates the test's hierarchical level at which the score is being provided (for example, the Overall Test, the domain, the strand, or the subject).
- Score Type indicates the score the student earned (for example, raw or scale).
- All Levels (Subject, Domain, Strand, Standard) includes lower-level data (for example, strands and standards). You can click to expand the various levels to find data specific to that level.
6. Click GET REPORT.
You can click Download Report at the top-right of the report page to download a CSV version. Layout and formatting vary by report.
You can click Reset All at the top of the filter options to remove all previously selected filters.
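If you plan to work with a downloaded CSV outside Pearson Access, the sketch below shows one way to load and inspect it with pandas. The filename and the column names checked here are illustrative assumptions only; the actual CSV layout and headers vary by report and project.

```python
# Minimal sketch: load and inspect a downloaded Report Builder CSV.
# The filename and column names are assumptions for illustration;
# the actual layout and headers vary by report and project.
import pandas as pd

report = pd.read_csv("dynamic_item_analysis.csv")  # hypothetical filename

# Preview the structure before any analysis.
print(report.columns.tolist())
print(report.head())

# Example: if the export happens to include "Score Type" and
# "Performance Level" columns (not guaranteed), count rows per level.
if {"Score Type", "Performance Level"}.issubset(report.columns):
    print(report.groupby(["Score Type", "Performance Level"]).size())
```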
The examples below are online versions of the Item Analysis reports, based on your selections in the How section. Each contains several elements; see the callouts in the images below. Layout and formatting differ slightly between the online and PDF versions, and performance-level colors vary by project. The screenshots below contain sample data and mock text.
7a - Each blade displays a test indicator icon, the test name, number of students tested, and the test administration and dates below.
7b - Click the arrow to expand the test blade to see the following:
- Performance Level (PL) Key that measures student performance. Labels and colors vary, depending on your program.
- Test Description
- Points Possible for each score level
7c - Each student appears in the Student column. Click the student blade arrow to see details for an individual student.
7d - The test indicator icon also appears in each report row. Hover over it to see the same information that appears on the test blade.
7e - You can hover over the Performance bar to see performance level information for each overall test and item.
Each item has a unique ID.
7f - Click the item link to open the item previewer in another window (for low-stakes or released high-stakes test items only). If items are not available, Item Not Available appears.
7g - The Score column displays the points earned out of total possible points.
7h - The percent correct displays, along with a bar graph that illustrates this percentage (see the sketch after this list).
7i - Click a Score Level blade (for example, a standard or domain) to see each item's unique ID, with links for each low-stakes or released high-stakes test item, the score, and percentage correct for each.
7j - Click Download Report to download a PDF (or CSV, if available) of the report. If available, you can click the star icon to add the report to your favorites or click the share icon to share it.
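The Score and Percent Correct values in 7g and 7h are points earned out of points possible, and the Score Level blades in 7i aggregate those values by standard or domain. If you want to recompute or aggregate them from a downloaded CSV, a sketch like the one below can help; every column name shown ("Item ID", "Score Level", "Points Earned", "Points Possible") is an assumption, so check your actual export headers first.

```python
# Sketch: recompute percent correct per item and per score level from an
# exported CSV. All column names here are assumptions for illustration.
import pandas as pd

df = pd.read_csv("dynamic_item_analysis.csv")  # hypothetical filename

# Percent correct for a single row: points earned out of points possible.
df["Percent Correct"] = 100 * df["Points Earned"] / df["Points Possible"]

# Aggregate by item, then by score level (for example, domain or standard).
by_item = df.groupby("Item ID")[["Points Earned", "Points Possible"]].sum()
by_item["Percent Correct"] = 100 * by_item["Points Earned"] / by_item["Points Possible"]

by_level = df.groupby("Score Level")[["Points Earned", "Points Possible"]].sum()
by_level["Percent Correct"] = 100 * by_level["Points Earned"] / by_level["Points Possible"]

print(by_item.round(1))
print(by_level.round(1))
```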
7a - Each blade displays a test indicator icon, the test name, number of students tested, and the test administration and dates below.
7b - Click the arrow to expand the test blade to see the following:
- Performance Level (PL) Key that measures student performance. Labels and colors vary, depending on your program.
- Test Description
- Points Possible for each score level.
7c - The test indicator icon also appears in each report row. Hover over it to see the same information that appears on the test blade.
7d - You can hover over the Performance Level bar to see performance level information for each overall test and item.
7e - Click a Score Level blade (for example, a standard or domain) to see each item's unique ID, with links for each low-stakes or released high-stakes test item, the score, and percentage correct for each.
7f - Click the item link to open the item previewer in another window (for low-stakes or released high-stakes test items only). If items are not available, Item Not Available appears.
For non-multiple-choice items, you can click the item link to open a Student Score Distribution graph window.
a - The upper left side of the window displays graph and item information.
b - A bar represents each unique score value (for example: 0, 1, 2, 3, 4) and how many students received each score.
c - The student list contains a Points Earned column with each student's name and points earned for all items.
- You can click within the graph to filter student list results.
NOTE: In the Student Score Distribution graph and student list, null scores are displayed as Other. A sketch for building a similar distribution chart from exported data appears after this list.
7g - The Score column displays the points earned out of total possible points.
7h - The percent correct displays, along with a bar graph that illustrates this percentage.
7i - Click Download Report to download a PDF (or CSV, if available) of the report.
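The Student Score Distribution window described in 7f shows one bar per unique score value and groups null scores under Other. If you want to reproduce a similar chart from exported per-student item scores, the sketch below shows one approach; the filename and the "Points Earned" column are hypothetical, and this is not the report's own rendering.

```python
# Sketch: build a Student Score Distribution style bar chart from exported
# per-student item scores. Filename and column name are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("item_scores.csv")  # hypothetical per-student, per-item export

# One bar per unique score value; null scores are grouped under "Other",
# mirroring how the report labels them.
scores = df["Points Earned"].fillna("Other")
counts = scores.value_counts().sort_index(key=lambda idx: idx.astype(str))

counts.plot(kind="bar")
plt.xlabel("Score value")
plt.ylabel("Number of students")
plt.title("Student Score Distribution (sketch)")
plt.tight_layout()
plt.show()
```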