The session highlighted that the pace of edtech adoption has outstripped the speed of traditional research. By the time a traditional study delivers results, school needs, products, and budgets have often already changed. Rapid Cycle Evaluation (RCE) is designed as a rapid, iterative response to this challenge, balancing scientific rigor with practical relevance.
RCE: The Cycle for Smarter Decisions
RCE allows districts to gather evidence throughout the academic year, aligning with key assessment windows. This cyclical approach helps leaders make smarter decisions on an ongoing basis:
- Beginning-of-Year (BOY) / Fall: The focus is on Usage Data. Are students using the product, and is the implementation reaching the intended students as expected?
- Mid-Year (MOY) / Winter: The team connects usage data with mid-year assessment data to see whether higher engagement is translating into stronger outcomes.
- End-of-Year (EOY) / Spring: This is the zoomed-out view, identifying which student groups, grade levels, or schools benefited most, informing budget and renewal decisions for the next year.
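As a rough illustration of the BOY usage check, a district analyst might flag students falling short of a vendor's recommended weekly minutes. This is a minimal sketch with hypothetical data, column names, and threshold, not LearnPlatform's actual schema or process:

```python
import pandas as pd

# Hypothetical weekly usage export from a vendor dashboard.
usage = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "avg_weekly_minutes": [55, 10, 42, 0],
})

RECOMMENDED_WEEKLY_MINUTES = 30  # assumed vendor guideline

# Flag each student against the recommendation.
usage["meets_recommendation"] = (
    usage["avg_weekly_minutes"] >= RECOMMENDED_WEEKLY_MINUTES
)

pct_meeting = usage["meets_recommendation"].mean() * 100
print(f"{pct_meeting:.0f}% of students meet recommended usage")  # → 50%
```

A fall report like this tells leaders early whether an implementation problem, rather than a product problem, is driving weak engagement.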
The research team emphasized that RCE provides evidence for every stakeholder, answering each group's critical questions:
- CFOs & Cabinet: What is the Return on Investment (ROI)? Should we renew this license?
- Instructional Leaders: Is the tool working for all students, including specific subgroups (e.g., SPED, ELL)?
- School Board: Do we have a clear story backed by ESSA-aligned evidence?
Real-World Examples Shared by the Research Team
During the session, the team provided three case studies demonstrating the practical value of RCE:
- Usage vs. Outcomes: An analysis of a high school math program showed that 10th graders who used the program more had higher end-of-year math scores. Students who passed Math 1 and Math 2 had double the program usage compared to their classmates who did not pass. This insight allowed the district to focus on fidelity and implementation, rather than questioning the product itself.
- Predictive Validity: A district questioned whether a secondary benchmark assessment was actually predictive of their state test scores. The RCE confirmed a strong positive relationship, giving the district confidence in the assessment and saving them the time and cost of vetting and implementing a new testing tool.
- Growth Over Time & Fidelity: Using a latent growth model, the team showed a district that while growth varied by grade level, consistent usage was associated with higher growth across the board. This finding provided strong evidence for the school board to justify a large purchase, confirming that when students meet the recommended usage levels, they see significant benefit.
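At its core, the predictive-validity check above is a correlation between two score columns for the same students. A minimal sketch with illustrative numbers (not the district's data), using Pearson's r:

```python
import pandas as pd

# Hypothetical matched scores for the same students.
scores = pd.DataFrame({
    "benchmark":  [410, 455, 500, 530, 575, 610],
    "state_test": [2400, 2480, 2550, 2575, 2690, 2720],
})

# Series.corr computes Pearson's r by default.
r = scores["benchmark"].corr(scores["state_test"])
print(f"Pearson r = {r:.2f}")
```

A strong positive r (here close to 1.0, since the toy data are nearly linear) is the kind of evidence that let the district keep its existing benchmark rather than shop for a new one; the actual RCE analyses layer in more rigorous checks than a single correlation.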
How It Works: We Do the Heavy Lifting
The research team walked through the data and process requirements, emphasizing that the LearnPlatform Research Team does the vast majority of the work:
- Data Sources: We require Product Usage Data (from your vendor dashboard) and can layer in Outcomes Data (assessments/course grades), Demographic Data (from your SIS), and Cost Data.
- Simplified Process: Your team identifies the products and pulls the raw data. The research team then takes over to clean, analyze, run reports, and review the findings with you.
- The Deliverable: The team produces reports and a Research Brief, which Madra Cherubini highlighted as deliberately digestible: an immediate-insight, jargon-free summary that can be shared directly with instructional leaders, school boards, and administrators who may not have a background in research.
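Mechanically, the analysis starts by joining those data sources on a common student identifier. A simplified sketch with hypothetical column names (real vendor and SIS exports vary):

```python
import pandas as pd

# Hypothetical extracts: vendor usage, assessment outcomes, SIS demographics.
usage = pd.DataFrame({"student_id": [1, 2, 3],
                      "total_minutes": [900, 120, 640]})
outcomes = pd.DataFrame({"student_id": [1, 2, 3],
                         "eoy_score": [78, 52, 70]})
demographics = pd.DataFrame({"student_id": [1, 2, 3],
                             "grade": [10, 10, 10],
                             "ell": [False, True, False]})

# Left-join outcomes and demographics onto the usage roster.
merged = (usage
          .merge(outcomes, on="student_id", how="left")
          .merge(demographics, on="student_id", how="left"))

print(merged.columns.tolist())
# → ['student_id', 'total_minutes', 'eoy_score', 'grade', 'ell']
```

In practice the district only pulls these raw extracts; the cleaning, joining, and analysis shown here is the part the research team takes over.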
Next Steps
RCE is an essential tool for providing your district with a clear, data-driven story about the impact of your edtech investments. Current subscribers are encouraged to connect with their dedicated research team member to delve deeper into RCE. If you have questions about RCE and are unsure who your research team member is, please contact your Customer Success Manager (CSM).