When Research Gets in the Way of Learning: Lessons from the Accelerating Adoption Network
By Beth Holland, Ed.D.
Research and measurement · 4 min read

The rapid adoption of educational technology has presented schools and systems with new tools, strategies, and programs that promise to accelerate learning at the very core of the student experience. However, to determine whether these innovations actually provide students with meaningful content, feedback, and experiences, the field needs evidence that speaks not only to whether they work, but also how, for whom, and under what conditions they improve student outcomes at scale.

During the 2024–2025 school year, FullScale launched the Accelerating Adoption Network as part of its Exponential Learning Initiative to fund and evaluate 10 networks (i.e., edtech providers, program leaders, independent school systems, etc.) scaling innovations that make durable improvements to the instructional core. The goal was to understand both implementation and impact as these networks scaled across diverse contexts. Along the way, our evaluation, conducted with Mathematica, surfaced another critical lesson: if the field wants evidence that helps us learn how innovations scale under real-world conditions, research must evolve alongside the work itself.

Rigor Is Not the Problem—Rigidity Is

FullScale identified a heterogeneous set of networks operating across varied contexts. From the outset, differences in strategy, timing, leadership support, and local capacity shaped implementation. The process of scaling surfaced complexity that could not be ignored, yet the research design remained largely fixed and could not adapt as implementation unfolded.

Traditional methods and infrastructure, including Institutional Review Board (IRB) processes and data-sharing requirements, treated implementation variability as noise rather than as a source of learning. Fixed designs limited responsiveness, and outcome-focused approaches constrained insight. Consequently, the research itself obscured some of the most consequential learning.

Proximate measures told a different—and more actionable—story. Teachers consistently reported that the innovations were beneficial, worth continuing, and influential in shaping instructional practice even though distal outcome measures often failed to detect statistically significant effects.

This produced a consequential tension: educators reported meaningful instructional shifts, while traditional research findings read as inconclusive. The absence of statistical significance did not signal an absence of learning; it reflected a misalignment between what was measured and the stage of innovation being studied. The lesson from the Accelerating Adoption Network is not that these approaches lack rigor, but that they lack the agility necessary for understanding how innovations take root, adapt, and mature across contexts. 

When Evidence Lags Behind Innovation

The tension between rigor and rigidity reflects an evidence ecosystem shaped by long-standing incentives: funding structures that privilege quantitative evidence, policy environments that demand comparability, and research norms that reward summative outcomes over formative learning. When statistical significance becomes the primary signal of value too early, implementation evidence is sidelined, proximate indicators discounted, and learning delayed until after key scaling decisions have been made.

Understanding how innovations scale requires methods and measures that move with the work rather than freeze it in place. This includes approaches that surface variation early and treat it as an opportunity for improvement. It also requires capturing leading indicators and valuing qualitative data that explains why outcomes occur. From this perspective, proximate measures, implementation data, and educator experience are not “soft” evidence. Instead, they help the field decide what to improve, what to support, and what to study more deeply. These approaches also surface the system conditions that enable successful implementation at scale.

Shifting toward more agile and adaptive research is not merely a technical adjustment but a normative one. It requires redefining what counts as credible evidence at different stages of innovation and valuing usable evidence that supports field learning alongside traditional standards of rigor. Done well, this shift does not lower the bar but raises the field’s capacity to learn, adapt, and improve as innovations scale.

An Invitation to the Field: Reimagining Research for Innovation at Scale

These lessons point to a broader opportunity for the field. To support meaningful innovation at scale, research must help educators, leaders, edtech developers, and policymakers understand what happens as innovations take hold, not just after implementation.

Meeting this moment demands collective action. Researchers must better align methods and measures with stages of innovation. Funders must value learning trajectories and implementation insights alongside summative outcomes. Policymakers must legitimize evidence that supports responsible scaling, even when it looks different from traditional markers of success.

This need is increasingly urgent as schools and systems adopt emerging technologies and enact new instructional strategies or models. At last year’s FullScale Symposium, more than 30 researchers began laying the groundwork for a shared, field-driven charge: to build a more robust, agile evidence base capable of supporting the future of learning. We intend to continue that work in the coming months. Those interested in contributing to the next phase are invited to email us at comms@fullscalelearning.org to learn more.

About the Author

Beth Holland, Ed.D. Beth Holland is the Managing Director, Research & Policy at FullScale, the national nonprofit formed by the merger of the […]
