Forward Thinking from
Washington, DC
RESEARCH WINDOWS
The U.S. Department of Education’s Office of Educational Technology has been scouting current practices in the field and thinking about how new developments in computer science might affect education. In
January, the department published a draft report,
Expanding Evidence Approaches for Learning in
a Digital World (1.usa.gov/101n0Bo), that has
some good thinking relevant to all education
stakeholders.
This report addresses the fact that emerging technology not only changes how we learn and teach but also shifts the terrain of how we measure, research, assess, and evaluate learning and teaching tools, processes, and content. It draws from
many projects and developments in practice,
some of which were featured at ISTE 2012 in
San Diego.
Research to Practice

What is the relationship between research and practice? Dr. Eval addressed this question in the June/July 2012 issue of L&L (page 10) in response to a member who asked, “How will educational innovation ever occur if everything has to be ‘research based’?” This is an especially critical issue when there are many new things to try and very little evidence to guide us to what is most effective.

School administrators usually don’t need formal randomized, controlled trials to provide enough evidence of effectiveness before trying something out. The linear model of proving something and then trying to replicate it with fidelity is too slow and does not address the need to adapt to different implementation contexts. The Department of Education report suggests that education should adapt methods from industry research and development cycles for continuous improvement of promising products.

Further, we want to know more than the impact on standardized test scores; we want to know how to support learning and how to assess the development of digital age skills. Advances in technology tools and learning sciences point to new ways to do this. The model of design-based implementation research is not new, but this report endorses its use for putting new ideas into practice quickly.

The main idea is that innovators and researchers should collaborate with practitioners to develop innovations in situ, where they will be implemented, rather than presenting practitioners with a final package. This model of collaborative research, combined with advances in collecting, managing, and mining large data sets generated by digital education resources, can change how we work on educational challenges. Some of the challenges identified in the report that we can address with these new approaches include:

• Making sure learning resources promote deeper learning
• Building adaptive learning systems that support personalized learning
• Combining data to create support systems that are more responsive to student needs
• Improving the content and process of assessment with technology

Clare Strawn, PhD, is a senior research associate with ISTE’s Research and Evaluation Department (iste.org/research). You can contact her at cstrawn@iste.org.
Making Data Work for Us
On the other side of design-based implementation research is a new capacity for collecting and managing large data sets. New curriculum and tools can quickly and cheaply be disseminated