Producing EdTech Evidence that Meets ESSA Standards

Working together with educators, administrators, technologists and researchers, we built LearnPlatform by Instructure to help education organizations (like schools, districts, states and education networks) manage, measure and advance their edtech efforts. Education organizations across the country run rapid-cycle evaluations (RCEs) with IMPACT™ technology to generate practical, relevant evidence that administrators and educators use to make more data-informed decisions. The reports help education leaders understand the costs of their edtech initiatives, the extent to which edtech products are being used and the impact those products have on student outcomes.

RCEs reveal quality insights more quickly than traditional evaluations; they can be completed in as little as a few weeks, depending on the organization's data, goals and needs. Districts, states and edtech providers conduct RCEs to produce evidence that aligns with the levels of evidence required by the Every Student Succeeds Act (ESSA).

Foundational Evidence 

Teacher Feedback: Administrators trust teacher feedback when making purchasing and adoption decisions. Teachers know best! It’s helpful to collect their subjective feedback when first exploring questions related to product implementation and effectiveness.

Usage Analysis: Examine edtech product usage overall and for different student subgroups. It’s important to understand the extent to which a product is or isn’t being used and make adjustments before moving to an Outcomes Analysis. A Usage Analysis can be run with or without a fidelity (or recommended usage) goal.
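To make this concrete, here is a minimal sketch of a usage analysis in Python. The file name, column names and fidelity threshold are all assumptions for illustration, not LearnPlatform outputs:

```python
# Minimal sketch of a usage analysis, assuming a hypothetical export with
# columns: student_id, subgroup, weekly_minutes.
import pandas as pd

FIDELITY_MINUTES = 60  # assumed recommended-usage goal (minutes per week)

usage = pd.read_csv("product_usage.csv")  # hypothetical file name

# Share of students meeting the fidelity goal, overall and by subgroup
usage["meets_fidelity"] = usage["weekly_minutes"] >= FIDELITY_MINUTES
overall = usage["meets_fidelity"].mean()
by_subgroup = usage.groupby("subgroup")["meets_fidelity"].mean()

print(f"Overall fidelity rate: {overall:.1%}")
print(by_subgroup.map("{:.1%}".format))
```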

Cost Analysis: Analysis of usage data and cost/pricing data for an intervention. The analysis can also include direct costs (e.g., licensing) and indirect costs (e.g., PD, educator time), as well as the recommended product usage and/or subgroup analysis.
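A cost analysis can be sketched as simple arithmetic over direct and indirect costs. The figures below are purely illustrative assumptions:

```python
# Minimal sketch of a cost analysis with assumed, illustrative figures.
licensed_students = 1200   # assumed number of licensed students
active_students = 800      # assumed number meeting the recommended-usage goal
license_cost = 18_000      # assumed annual licensing cost (direct, $)
indirect_cost = 4_000      # assumed PD and educator-time cost (indirect, $)

total_cost = license_cost + indirect_cost

print(f"Cost per licensed student: ${total_cost / licensed_students:,.2f}")
print(f"Cost per active student:   ${total_cost / active_students:,.2f}")
```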

Note: While these analyses are not currently ESSA-aligned, they can and should inform decision making and reporting.

ESSA Level I* or II

Outcomes Analysis with a Comparison Group: Compare the achievement of students who use the edtech product to that of students who don't. Studies with a comparison group are considered the most rigorous and offer the strongest evidence of effectiveness, so leaders can confidently draw conclusions about a product's impact on achievement (a simple sketch follows the note below).

*If a comparison group is randomly assigned, it is ESSA Level I.
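As a rough illustration, an outcomes analysis with a comparison group can be sketched as a two-group comparison of achievement scores. The file and column names are hypothetical, and a real RCE would also account for baseline differences between groups:

```python
# Minimal sketch of an outcomes analysis with a comparison group, assuming a
# hypothetical table with columns: student_id, used_product (0/1), post_score.
import pandas as pd
from scipy import stats

scores = pd.read_csv("student_outcomes.csv")  # hypothetical file name

mask = scores["used_product"].astype(bool)
users = scores.loc[mask, "post_score"]
comparison = scores.loc[~mask, "post_score"]

# Welch's t-test: does mean achievement differ between the two groups?
t_stat, p_value = stats.ttest_ind(users, comparison, equal_var=False)

# Simple standardized effect size (Cohen's d with an averaged variance)
pooled_sd = ((users.var() + comparison.var()) / 2) ** 0.5
effect_size = (users.mean() - comparison.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {effect_size:.2f}")
```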

ESSA Level III

Outcomes Analysis without a Comparison Group: Examine relationships between edtech usage and student achievement outcomes. All students receive the product in this type of study. Because there is no comparison group, it cannot be concluded that the product caused higher or lower achievement. 
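A sketch of this kind of analysis is a simple usage-outcome correlation. Again, the file and column names are assumptions, and correlation alone does not establish causation:

```python
# Minimal sketch of an outcomes analysis without a comparison group, assuming
# hypothetical columns: total_minutes (usage) and post_score (achievement).
import pandas as pd
from scipy import stats

data = pd.read_csv("usage_and_outcomes.csv")  # hypothetical file name

# Correlation between usage and achievement describes association, not causation
r, p_value = stats.pearsonr(data["total_minutes"], data["post_score"])
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```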


Ultimately, the reports and dashboards generated through LearnPlatform's research-driven framework for rapid-cycle evaluation help education organizations demonstrate and document evidence that directly aligns with ESSA guidelines. Evidence from RCEs helps leaders understand what is working best for which students and teachers in their own contexts, leading to more informed decisions that support the growth of all student groups.

Schedule a demo to see how you can generate edtech evidence to meet ESSA standards.
