
The Impact of Our Research on Practice: Is It More than How Journals are Ranked?

Who actually reads our publications? The old adage is that only a handful of other researchers and perhaps a very caring significant other read what we write. Of course, an even better question is "how does what we write affect practice?" The answer to this question scares me a bit and can create an existential crisis.

Thankfully, there are those in the field who can answer this question and unequivocally point to one or more examples of how their research improved the outcomes of students with disabilities with a sustained impact on practice. Numerous individuals come to mind, many of whom have been active Division for Research members and leaders. With much gratitude, respect, and appreciation I say "thank you" for your contributions. You give us all hope!

Increasingly, the answers to the questions above are being woven into the fabric of our profession in an effort to close the research-to-practice gap. It seems that we are rapidly moving not only to identifying evidence-based practices but also to developing mechanisms for feasibly disseminating those practices to multiple stakeholders, across numerous settings, using a systems change approach.

Without resorting to hyperbole, continued special education research funding hinges on our ability to convincingly demonstrate to others how special education research makes a positive and lasting impact on the field and the public. It is not by accident that funding through the National Center for Special Education Research (NCSER) was reduced by 30% in 2011. Based on Council for Exceptional Children calculations, this legislative act resulted in a 2014 funding level of $49.9 million instead of the promised $70.1 million. The outcome was 75% fewer research projects being awarded in 2014. The value of special education research will be seen in the reauthorization of the Education Sciences Reform Act (ESRA). Projections suggest that the 2014 funding level is the new status quo and that such restricted funding is here for the near future. Unfortunately, budget cuts appear to be across the board, extending to other federal agencies (e.g., NIH), and these cuts have a negative effect on students with disabilities.

Such a low funding level is discouraging, but this is not to suggest that all is doom and gloom. Indeed, our current circumstances should serve as a catalyst to strategically advance special education research by demonstrating our value not only to the field but, more importantly, to the tax-paying citizens, public officials, and students with disabilities that we serve. Continuing to chip away at the research-to-practice gap has the potential to put us in a position to influence those with decision-making power. One starting point might be to develop mechanisms by which we can retrospectively deliver evidence-based practices that are usable in genuine educational settings by applying the principles of implementation science. Implementation science has entered the shared vocabulary of researchers and funding agencies. Broadly speaking, implementation science is the study of how to integrate research findings into educational policies and practices. This has been the emphasis of numerous organizations, including the Council for Exceptional Children during its annual convention. One goal is the adoption and sustainability of evidence-based practices. Implementation science focuses mostly on using research findings retrospectively. That is to say, a particular educational practice is studied, introduced into a setting, and analyzed over time regarding how well it fits that setting and its group of participants.

More recently, some researchers are exploring how to design research prospectively so that issues of adoption and integration are addressed while a study is being conducted. This approach goes by several names, with the term "design-based implementation research" serving as one example. Discussion continues about how best to design research prospectively; its characteristics seem to include designs that are systematic, iterative, interactive, collaborative, context-sensitive, and flexible. While designing research prospectively is a work in progress, associated concepts already appear in grant applications. Two Institute of Education Sciences (IES) grant applications serve as initial examples. The Development and Innovation Goal Two application, located in the Special Education Research Grants program (CFDA 84.324), requires researchers to design research that incorporates iterations, feasibility, end-user understanding, fidelity, usability, authentic educational settings, and dissemination. These appear to be elements of design-based implementation research.

Most recently, IES created the Partnerships and Collaborations Focused on Problems of Practice or Policy (CFDA 84.305H) grant application, another potential example. A consistent theme in the application is the importance of developing a research collaboration between stakeholders and researchers from the beginning of the project. An overarching objective is to establish ongoing partnerships that conduct further joint research activities to increase capacity and research use. Both grant applications described here are concerted efforts to address the research-to-practice gap.

Clearly, designing research prospectively involves genuine collaboration. The call for collaboration may seem like a timeworn response, but restoring existing partnerships and establishing new alliances has never been more important. The Division for Research has the luxury of comprising members who are affiliated with other divisions. Families, teachers of students with disabilities, school administrators, educational diagnosticians, and university faculty alike all have a vested interest in research. Looking within our educational walls, efforts can be made to involve psychologists, social workers, general educators, paraprofessionals, and school resource officers, to name a few. From a broader perspective, those from national educational centers, businesses, and multiple funding agencies (e.g., DOJ, IES, OSEP, NIH, NSF) can serve as collaborators.

Establishing real partnerships is one important consideration when designing research that is both meaningful and long-standing. Yet there are human characteristics and long-entrenched structures and systems that muddy the waters a bit. A starting point toward sorting out these issues could be to determine the function of behavior for university personnel and school employees. One would be hard pressed to locate a school or university whose mission statement does not declare an intent to make an impact. How "impact" is operationally defined makes all the difference.

Taking a "what is in it for me" perspective on impact could be rather discouraging. The university administrator might define impact as journal ranking, since that is considered an important factor in how the institution is evaluated nationally. The faculty member could adhere to the university administrator's definition in an attempt to gain tenure/promotion, a raise, or status. The school administrator could define impact by state test scores, where the higher the test scores, the greater the impact. The classroom teacher could reflect the school administrator's definition, since keeping his or her job is contingent upon improved state assessment scores. Okay, current circumstances are not this bleak or straightforward. The behavior of people and systems is much more complex. Additionally, special education is filled with an abundance of individuals who sincerely and deeply want to make a positive impact on the lives of the children they serve. This truly is one of the best characteristics of our profession. The point here is that we should challenge institutional structures and systems as a means of improving the quality and practicality of research.
 
Challenging institutional structures also involves questioning the assumptions of the scientific practices that guide us. Reflecting on the implications of our practices will allow us to suggest and share with others (school personnel, legislators, parents) what should or should not be taught to students with disabilities. Developing special education standards for evidence-based practices is a celebrated advancement. Creating the What Works Clearinghouse is an acknowledgement of the importance of determining and documenting evidence-based practices. The progression of using single-case design research as a means of establishing what counts as an evidence-based practice is a monumental achievement. Yet we have many more challenges and unanswered questions to address. Within the context of special education research and making the research-to-practice connection, the following is a list of questions for consideration. The list is not complete. I hope the previous information, the questions below, the thoughts of others, and questions from you will serve as a means to continue the conversation about important considerations in special education research. May we collectively and strategically research our practices and practice the good of what is researched.

  1. How can we better support the efforts of the Council for Exceptional Children?
  2. How can we better contribute to the efforts of funding agencies who address the outcomes of people with disabilities?
  3. What should we tell teachers to teach in the absence of an evidence-based practice?
  4. How can we increase the dissemination of low-cost or free access to resources on research and research-to-practice topics (social media, Google, Open Access)?
  5. What is the value of single-authorship in higher education?
  6. What created the tradition of not publishing non-significant study findings and how can we use this knowledge to advance the field?
  7. Why might the replication of research not be considered a valuable research endeavor?
  8. How can we increase the likelihood that a classroom teacher will value research more?
  9. How can we better demonstrate to legislators the importance and value of special education research?
  10. How can the Division for Research become more proactive and nimble?
  11. How can we encourage the next generation of special education researchers to seamlessly integrate theory, research, and practice?
  12. How can technology improve our research?
  13. How can we better educate the public about research and research findings?
  14. Are there methods by which knowledge generation can be expedited while still maintaining quality?
  15. How can mixed methods be used to answer important special education research questions?
  16. How can we better brand special education research?

"A pessimist sees the difficulty in every opportunity; an optimist sees the opportunity in every difficulty." - Winston Churchill
 

Posted:  1 September, 2014

© 2025 Council for Exceptional Children (CEC). All rights reserved.