Dr. Diana Sisson and Dr. Betsy Sisson, Connecticut Association for Reading Research
The purpose of this study was to investigate the current status of Scientific Research-Based Interventions (SRBI) in Connecticut public schools with respect to four research questions: 1) What are educators’ perceptions of SRBI? 2) How familiar are they with SRBI principles and practices? 3) What are their beliefs regarding its implementation and sustainability in their school systems? 4) What professional development resources and training have they previously received, and what resources and training do they believe are integral to the success of SRBI in Connecticut? A mixed-methods design utilized a questionnaire survey to collect responses from a sample group of 200 educators representing 64 school systems, including classroom teachers, reading educators, instructional support personnel, building and district administrators, and independent consultants. The quantitative research employed descriptive statistics garnered from the Likert-scale items included in the instrument, while the qualitative research focused on the embedded open-ended items. Findings suggested that participants supported the philosophy and rationale behind SRBI but harbored reservations about its implementation and sustainability. Key concerns centered on the importance of familiarizing all faculty and administrators with SRBI, ongoing professional development to facilitate the transition to this new model, the time demands associated with such intensive services, the staffing needed to provide quality interventions (including the importance of ensuring that those working with the neediest students are certified and trained to offer interventions), resources to meet the needs of diverse student populations, scheduling issues for both students and educators, and a prevailing theme focusing on participants’ perceived lack of an in-depth, comprehensive understanding of data analysis.
Policy Precedents to the SRBI Model
Response to Intervention (RtI) is the culmination of over three decades of federal involvement in special education services in this nation. Beginning with the Education for All Handicapped Children Act of 1975 (re-codified as the Individuals with Disabilities Education Act, or IDEA), this legislation ensured appropriate public education for students with disabilities and access to nondiscriminatory evaluation procedures. From the outset, controversy arose over the use of the IQ discrepancy model as the primary diagnostic procedure. Although the act was reauthorized in 1977, 1983, 1986, and 1990, the discrepancy model remained untouched. By the time of the next reauthorization, IDEA 1997 emphasized regular education interventions and a problem-solving model to determine eligibility for special education services. It also identified thirteen classifications of student disability, from which learning disabled (LD) emerged as the predominant category, with 52% of students receiving special education services in the United States categorized as LD. Research findings, however, indicated that 52% to 70% of school-identified LD students failed to meet state or federal guidelines for this classification. These numbers prompted G. Reid Lyon of the National Institute of Child Health and Human Development to suggest that “learning disabilities have become the sociological sponge to wipe up the spills of general education” (Gresham, 2001, p. 1). Despite the implications of these statistics for special education law, the identification process remained ambiguous and inconsistent. This was addressed in IDEA 2004, which recommended simplifying the identification process and incorporating students’ responses to scientifically-based instruction as part of the qualification criteria for classification. This “response to intervention” was defined by the National Association of State Directors of Special Education (NASDSE) as
a practice of providing high-quality instruction and interventions matched to student need, monitoring progress frequently to make decisions about changes in instruction or goals and applying child response data to important educational decisions. (NASDSE, 2006, p. 3)
This paradigm shift in special education regulations has a profound impact on American schools, as the “prevalence of significant reading disability in children is 17-20% (1 in 5), while more than 33% (1 in 3) struggle to learn to read” (Greenwood, Kamps, Terry, & Linebarger, 2007, p. 73), with current statistics indicating that “12%-14% of students in U.S. schools receive special education services” (Hall, 2008, p. 23). A fundamental intent of RtI is to decrease the number of students in special education by perhaps 70% (Lyon et al., 2001). Such a significant decrease in students receiving special education services would have a considerable effect on the federal government, as it is predicted that the national cost of special education services will soon total $80 billion annually (Burns & Gibbons, 2008) for the current 6.5 million children identified with disabilities (Collier, 2010).
Addressing these long-standing issues, IDEA 2004 contained three central elements: use of scientifically-based reading instruction, evaluation of how students respond to interventions, and employing data to inform decision making (Brown-Chidsey & Steege, 2005). In addition, most RtI models incorporated a multi-tier prevention system.
In this way, RtI has two interconnected goals: (1) to identify at-risk students early so that they may receive more intensive prevention services prior to the onset of severe deficits and disability identification and (2) to identify students with LD who are repeatedly unresponsive to validated, standardized forms of instruction and instead require individualized, data-based instruction. (Fuchs, Fuchs, & Vaughn, 2008, p. 2)
In 2006, the state of Connecticut created an advisory panel charged with the task of reviewing RtI research and developing an implementation framework for the state’s schools. It was during this time that the nationally-recognized Response to Intervention (RtI) model was designated in Connecticut as Scientific Research-Based Interventions (SRBI) because that language was contained in both No Child Left Behind (NCLB) and IDEA regulations and was further intended to “emphasize the centrality of general education and the importance of using interventions that are scientific and research based” (Connecticut State Department of Education, 2008, p. 4). More specifically, “RtI models are dependent on interventions in which evidence is available to attest to their effectiveness” (Costello, 2008, p. 4), while the SRBI model is not held to such parameters. Key elements of Connecticut’s SRBI model included the following:
- Core general education curricula that are comprehensive in nature
- Wide-ranging academic and behavioral support systems
- Positive school climate
- Research-based instructional strategies
- Differentiated instruction for all students
- Universal assessments
- Early interventions
- Data-driven decision making
- Continuum of support throughout the three tiers
- Common formative assessments
As a cohesive model, SRBI was designed to provide a quality education for all students and close the achievement gap that has persisted in Connecticut for a number of years. The current study sought to investigate the status of its implementation within public school systems across Connecticut, delving into educators’ perceptions about its effectiveness in individual districts and schools.
A descriptive research study employing a mixed-methods design, one “collecting, analyzing, and mixing both quantitative and qualitative data in a single study” (Creswell & Plano-Clark, 2007, p. 5), was used to investigate educators’ experiences and perceptions of SRBI implementation in the state of Connecticut. The design was carried out through a questionnaire survey comprising both numeric Likert-scale items and open-ended queries created to elicit the greatest depth of responses from a large sample group.
Developed from a review of the relevant literature pertaining to RtI as well as prior research studies investigating implementation in state-wide models, the questionnaire survey consisted of 30 items related to the sample group’s demographic information as well as to their experiences and perceptions of the current status of SRBI implementation. The items were formatted in a five-point Likert scale which could be numerically measured. In addition to this quantitative component in the study, open-ended queries were included in the survey in order to gain insights through qualitative means. The items for the instrument were developed and reviewed by multiple educational experts to ensure relevance, application to the field, content and construct validity, and bias analysis.
A total of 200 public school educators (grades K-12) were selected through nonprobability sampling. Educators – including classroom teachers, reading educators, instructional support personnel, building and district administrators, and independent consultants – were approached to participate during professional development events sponsored by the Connecticut Association for Reading Research (CARR) as well as the Connecticut Reading Association (CRA) and its local reading council affiliates.
Those who participated were asked to complete a demographic profile contained within the questionnaire survey. Table 1 presents a delineation of the sample group.
Table 1: Participant Characteristics
[Table 1 is not reproduced here; it delineated the sample group by categories including grades primarily served and number of professional development trainings pertaining to SRBI.]
Note. Fields listed as “Missing” refer to the number of participants who failed to complete that particular survey item.
The principal investigators of the study developed a questionnaire survey to examine SRBI implementation and sustainability in Connecticut. The sample group was representative of educators in attendance at a series of professional development events held during the 2010-2011 academic year, including the following: 1) CARR-sponsored events, 2) the CRA Conference, 3) the CRA Leadership Conference, 4) SRBI Lecture Series events co-sponsored by CRA, CARR, and local reading councils, and 5) local reading council events. During these events, attendees were invited to complete the survey questionnaire and share their thoughts on SRBI in their respective school systems.
Once research in the field was completed, the sample included 200 participants representing 64 public school districts. The survey instrument itself was divided into five sub-scales: Research Participant Demographic Profile, Perceptions of SRBI, Familiarity with SRBI Principles and Practices, Implementation and Sustainability, and Professional Development Resources. Descriptive statistics were used to summarize the demographic data obtained from the first sub-scale. Both quantitative and qualitative methods were utilized to analyze the remaining sub-scales.
The quantitative research consisted of 30 statements: 5 items relating to the construction of a demographic profile of the participants and 25 items using a 5-point Likert scale to determine the level of agreement or disagreement with key components of the SRBI model. Descriptive statistics, including calculations of item frequencies and chi-square tests, were conducted on each of the demographic variables as well as on the items pertaining to participant statements to determine relationships within the data. The qualitative data, consisting of four constructed-response items found in Sub-Scales II, III, IV, and V, asked participants to expand on their Likert-scale responses. To complete an organized analysis of the findings, the components of Miles and Huberman’s Interactive Model of Data Analysis (1994) were employed, which follows a process-oriented approach to qualitative research: data reduction, data display, and conclusion drawing and verification. In effect, the data reduction of the participants’ constructed responses took place through coding comments and discerning themes representative of the sample group’s responses. The data were then systematically displayed, organizing the respondents’ constructed responses graphically in matrices and charts so that conclusions could be drawn, verified, and validated as accurate representations of the sample.
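For readers unfamiliar with this kind of item analysis, the quantitative procedure described above (item frequencies plus a chi-square test of independence between a demographic variable and Likert agreement) can be sketched as follows. The response values and group labels here are illustrative placeholders, not the study’s actual data, and the chi-square statistic is computed by hand rather than with a statistics package.

```python
from collections import Counter

# Illustrative (not actual) Likert responses, 1 = strongly disagree ... 5 = strongly agree,
# for one survey item, split by a hypothetical demographic variable.
responses = {
    "classroom teachers": [5, 4, 4, 3, 5, 2, 4, 5, 3, 4],
    "administrators":     [3, 2, 4, 3, 2, 3, 4, 2, 3, 3],
}

# Descriptive statistics: item frequencies per group.
frequencies = {group: Counter(vals) for group, vals in responses.items()}

def agree_split(vals):
    """Collapse a Likert distribution to [agree (4-5), not agree (1-3)] counts."""
    agree = sum(1 for v in vals if v >= 4)
    return [agree, len(vals) - agree]

# 2x2 contingency table: group x agreement.
table = [agree_split(vals) for vals in responses.values()]

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi_square(table)
# For a 2x2 table (1 degree of freedom), stat > 3.841 indicates a
# relationship significant at the .05 level.
print(f"chi-square = {stat:.2f}")
```

Collapsing the five Likert points to agree/not-agree mirrors the study’s reporting convention of treating “agreed” and “strongly agreed” as a single category.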
Positive Perceptions Regarding SRBI
A compelling theme that emerged from the questionnaire survey was the predominantly positive attitude of the participants toward the SRBI model. When asked if they believed in the principles and practices of SRBI, 81.5% of respondents agreed (inclusive of agreed and strongly agreed), with 70.5% believing that providing systematic interventions for struggling students is more effective in determining achievement potential than IQ testing. Consensus was also reached on the need for differentiated instructional practices in Tier I classrooms (95.5%), with one classroom teacher from a suburban district offering, “I believe that Tier 1 instruction is most important. If you have effectively implement[ed] this instruction, you will have less in Tier 2 and 3.” This resolute support for the SRBI model faltered significantly when respondents were asked whether the majority of the educators with whom they work are currently prepared to implement it. Only 31.0% asserted that their colleagues were professionally ready. This percentage dropped further when analyzing administrators’ responses, with only 9.1% purporting that their staff was ready. This unease was verbalized by a reading educator from an urban district who stated, “Teachers don’t know the principles of SRBI.” Another urban reading educator offered, “It helps the teacher to meet each student’s needs by working with that student on his or her level. However, this means that the teachers must have received appropriate PD about RtI/SRBI and work the system with fidelity.” Further analysis revealed that of those respondents who agreed general education teachers should implement differentiated instructional practices to meet the needs of diverse learners, only 23.8% could easily navigate through the three tiers, 12.7% could access appropriate resources, and 12.8% could select appropriate data for progress monitoring of student performance during intervention services.
Lack of Familiarity with Assessment Tools Needed to Drive SRBI Model
A second theme that surfaced was the inconsistency with which participants responded in relationship to their familiarity with SRBI principles and practices. When questioned on specific components of the model, 50.0% (inclusive of agreed and strongly agreed) indicated that they could easily navigate among the three tiers, 86.5% could make decisions regarding core instruction and interventions, 69.0% could access appropriate resources (urban, 70.8%; suburban, 72.4%; rural, 57.7%), 75.5% could ensure that intervention plans were supported by data, and 64.5% could select appropriate data for progress monitoring of student performance during intervention services. In contrast to this comparatively stalwart belief in their understanding and application of the components of the SRBI model, coding of their constructed responses indicated that respondents expressed a lack of familiarity in two principal areas: identifying evidence-based programs and interventions as well as an even more pronounced concern regarding their ability to use assessment tools. Of note, the construct of assessment tools encompassed actual assessments, progress monitoring techniques, and data analysis. The respondents, inclusive of all sub-groups of educators, referred at length to the ambiguity of how data should be effectively utilized in the model and this topic served as the basis for a recurring theme in their call for professional development training in this area.
Systemic Obstacles Impeding Effectiveness of the SRBI Model
A further issue concerned the organizational barriers that the participants perceived as hindering successful implementation of SRBI in their schools. Among participants from 64 school systems, only 47.0% (inclusive of agreed and strongly agreed) believed that district-level leadership provided active support for SRBI, and an almost equivalent 48.0% perceived that implementation of the model was jointly directed by general education and special education efforts. Of the respondents, 36.0% deemed a clearly defined SRBI model to be in place in their school system, with one reading educator noting, “It’s not clearly defined – therefore no one is sure of their role and process.” Although relatively few in number within the study’s sample group, those who identified themselves as serving middle school and high school students expressed anxiety about implementing the model with older students, as one classroom teacher simply stated, “It doesn’t exist at the high school level.”
In considering the individual components of the SRBI model, 48.5% asserted that a school-based multidisciplinary intervention team was in place that met on a regular basis, and 77.0% affirmed that universal screenings were being administered three times a year. Complexity did exist in this response, however, as the percentage shifted dramatically based on the grades the respondents were serving: 93.0% of elementary educators affirmed that universal screenings were in place, with that statistic dropping significantly to 72.7% of middle school educators and 27.8% of high school educators. Pertaining to certified staff providing intervention services, 68.5% suggested that Tier II interventions and 78.0% that Tier III interventions were currently offered by certified personnel. The number of respondents, however, who concluded that these interventions were prescriptive to the individual needs of specific students dipped to 57.5%. This theme of unease with the role data played in the SRBI model and its impact on delivering interventions persisted, as nearly one third of the respondents did not believe that appropriate progress monitoring within the three tiers was currently in place.
In addition to these direct inquiries regarding implementation and sustainability of the SRBI model, respondents also shared specific obstacles that they perceived as impeding effectiveness. Approximately 20% cited time as the primary barrier, followed by staffing, resources, scheduling, familiarizing educators with the model, ongoing training, the need for certified personnel, and the persistent issue of data – specifically progress monitoring.
Content-Specific Training Needed to Ensure Implementation and Sustainability
In the context of professional development training that the participants had previously received, 74.5% (inclusive of agreed and strongly agreed) had attended an overview of RtI principles and Connecticut’s SRBI model; 51.5% had received information regarding modifications of special education referral practices; 42.0% had obtained data pertaining to specific practices within each of the three tiers (0.0% of administrators professed to having accessed such training); 52.5% had received training in evidence-based interventions; and 53.5% had attended training in progress-monitoring procedures. In essence, three out of four of the participants in the sample had attended an overview training, but only one out of two had attended more advanced training in the specific components of the model. Beyond the items on the survey that questioned the importance of future professional development in specific components of the SRBI model, respondents referred to several particular areas of need: interventions, the recurring theme of utilizing data and progress monitoring, accessing resources, and the importance of training classroom teachers in Tier I core instruction with differentiated strategies. The relevance of providing content-specific training in the SRBI model can be illustrated by its relationship with attitudes toward the model. Of those who had never attended any training in SRBI, only 56.3% agreed with the principles and practices of SRBI. A trend of increasingly positive attitudes emerged with additional attendance: 1-2 trainings = 75.0%; 3-5 = 91.5%; 6-9 = 92.1%. That number dropped to 76.9% for those attending 10 or more trainings.
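The attendance-versus-agreement trend above is a simple grouped proportion: agreement rates computed within attendance buckets. As a minimal sketch (with invented records, since the study’s raw data are not reproduced here), it can be computed like this:

```python
# Illustrative (training_count, agreed_with_srbi) records;
# these values are invented for demonstration, not the study's data.
records = [(0, True), (0, False), (1, True), (2, True), (4, True),
           (5, True), (7, True), (8, True), (12, False), (15, True)]

# Attendance buckets matching those used in the reported trend.
buckets = [
    ("0",   lambda n: n == 0),
    ("1-2", lambda n: 1 <= n <= 2),
    ("3-5", lambda n: 3 <= n <= 5),
    ("6-9", lambda n: 6 <= n <= 9),
    ("10+", lambda n: n >= 10),
]

def agreement_by_bucket(records):
    """Percentage of participants agreeing with SRBI, per attendance bucket."""
    rates = {}
    for label, matches in buckets:
        group = [agreed for count, agreed in records if matches(count)]
        rates[label] = round(100 * sum(group) / len(group), 1) if group else None
    return rates

print(agreement_by_bucket(records))
```

With the study’s real records, this grouping would reproduce the reported 56.3% / 75.0% / 91.5% / 92.1% / 76.9% progression.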
This study was designed to provide a preliminary investigation of SRBI implementation in public schools in Connecticut. As the SRBI model constitutes a recent shift in educational policy, little has been known about its implementation phase, how the state’s educators view it, and its potential for sustainability as a state-wide model. By incorporating a mixed-methods approach, participants were able to express their experiences and perceptions of the model in succinct, quantitative terms while also sharing deeper insights through their constructed responses. The questionnaire survey employed by the study served as an instrument by which to gauge the model in a broad array of school systems in a relatively short amount of time.
The item analysis coupled with the constructed responses suggested that participants viewed SRBI as a positive paradigm shift in educational policy and special education practices; nonetheless, they also deemed the current status of Connecticut schools unprepared to deliver the model with fidelity. Lack of a clear understanding in data analysis (from the selection of common assessments and probes to progress monitoring techniques to utilizing data to make effective intervention plans for struggling students) and a deficit of training topped their concerns – a theme that persisted throughout their responses. In addition, participants articulated a myriad of other issues that they felt impeded the effectiveness of the SRBI model, including time, staffing, resources, scheduling, and training. Specifically, they expressed a need for more comprehensive training in resources, data, and Tier I core instruction for classroom teachers.
The findings of this study support the necessity of additional training opportunities focusing on a specialized set of tools and competencies in order to ensure the success of SRBI in Connecticut’s public schools (Allington, 2009; Howard, 2010; Johnston, 2010; Wright, 2007). Successful implementation will also necessitate deeper training in the use of assessments and data to inform educational planning (Owocki, 2010) as well as a systemic response to the barriers currently impeding effective implementation of the SRBI model.
Limitations of Current Research
The results of this study were limited by several factors. First, the majority of participants in the sample were in attendance at professional development events offering SRBI training sessions, which suggests that the sample group may be more knowledgeable about and more actively involved with the model than the overall population. Those who participated were also those willing to share their perceptions; consequently, the extent to which their perceptions are representative of those who elected not to participate remains unknown. Nor were these responses verified, so the self-reported data may have been biased. Second, administrators and instructional support personnel constituted a small proportion of the sample group, which limits the weight of their responses. Third, participants came primarily from suburban school systems, which limited the representativeness of the results, especially with respect to rural school systems, which comprised only 13% of the sample group.
Recommendations for Future Research
Additional research is required to furnish a more detailed understanding of SRBI in Connecticut. Future studies should be conducted with administrators, as they hold a key role in the implementation and sustainability of the model (Hall, 2008; Shores & Chester, 2009). As this study was exploratory in nature and attempted to offer broad generalizations of current perceptions, future research should probe deeper into the model’s effectiveness in schools through correlational research to determine relationships, assess consistency, and form predictive statements (Ary, Jacobs, Razavieh, & Sorensen, 2006); focus groups to examine the issues more deeply; and document analysis of special education rates and standardized achievement tests to ascertain the degree to which students are responding to interventions and whether these interventions are affecting the percentage of students being identified for special education services.
A majority of the participants indicated that they endorsed the philosophy of SRBI but registered apprehension about whether educators were fully cognizant of and prepared to apply the model’s principles and practices with struggling students. These findings should be viewed in a positive light for educators at the school, district, and state levels. While there is strong support for the philosophy of SRBI, participants’ concerns offer a context for discourse about the systemic reforms needed to facilitate the model. First, school systems should provide active commitment and support as evidenced through the development of a strategic SRBI plan for all of their schools with clear delineations of roles and responsibilities (Howell, Patton, & Deiotte, 2008; Sack-Min, 2009; Shores & Chester, 2009). Second, a focused professional development plan should be developed that provides training for administrators and faculty members across the continuum of SRBI principles and practices so that all staff members are fully prepared to assume responsibility for the SRBI model within their specific role in ameliorating student academic weaknesses (Applebaum, 2009; Bergstrom, 2008; Foorman, Carlson, & Santi, 2007; Howard, 2009; Mellard & Johnson, 2008; Restori, Gresham, & Cook, 2008). Third, school systems and individual schools should work collaboratively to create a kit of resources and specific interventions aligned to students’ academic needs (Burns & Gibbons, 2008; Wright, 2007). Fourth, school-based multidisciplinary teams should be developed at each school to collect and monitor data (including common assessments and probes, progress monitoring programs, and data analysis techniques to drive interventions) to assess the level of commitment to and impact of the SRBI model at site-specific locations (Applebaum, 2009; Burns & Gibbons, 2008; Mellard & Johnson, 2008).
Although this study sought to provide a preliminary report on the implementation of SRBI in Connecticut’s public schools, it should also be viewed as an opening dialogue about the organizational frameworks needed to support this shift in educational policy. For example, the results of the study’s analysis make the need for more comprehensive training programs patently clear. Furthermore, attention needs to be given to the common issues of how to align schools’ time, scheduling, resources, and staffing to the SRBI model. Future research should strive to furnish status checks on Connecticut’s continued expansion of SRBI. It should also focus on the systemic reforms needed to ensure the academic well-being of Connecticut’s students.
Reader’s Note: This article provides a brief summary of the research study, A Status Report on SRBI in Connecticut. CARR will publish a comprehensive report in the coming months.
Allington, R. L. (2009). What really matters in Response to Intervention. Boston, MA: Pearson.
Applebaum, M. (2009). The one-stop guide to implementing RtI. Thousand Oaks, CA: Corwin Press.
Ary, D., Jacobs, L. C., Razavieh, A., & Sorensen, C. (2006). Introduction to research in education. Belmont, CA: Thomson Higher Education.
Bergstrom, M. K. (2008). Professional development in Response to Intervention: Implementation of a model in a rural region. Rural Special Education Quarterly, 27(4), 27-36.
Brown-Chidsey, R., & Steege, M. W. (2005). Response to Intervention: Principles and strategies for effective practice. New York: The Guilford Press.
Burns, M. K., & Gibbons, K. A. (2008). Implementing Response-to-Intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
Collier, C. (2010). RtI for diverse learners. Thousand Oaks, CA: Corwin.
Connecticut State Department of Education. (2008). Using scientific research-based interventions: Improving education for all students. Connecticut’s framework for RtI – February 2008 executive summary. Hartford, CT: Author.
Costello, K. A. (2008). Connecticut’s Response to Intervention (RtI) model: A status report. Retrieved April 2, 2009, from National Center on Response to Intervention: RtI State Database Web site: http://state.rti4success.org/index.php?option=com_state&stateId=110
Creswell, J. W., & Plano-Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications, Inc.
Foorman, B. R., Carlson, C. D., & Santi, K. L. (2007). Classroom reading instruction and teacher knowledge in the primary grades. In D. Haager, J. Klinger, & S. Vaughn (Eds.), Evidence-based reading practices for Response to Intervention (pp. 45-72). Baltimore, MD: Paul H. Brookes Publishing Co.
Fuchs, D., Fuchs, L. S., & Vaughn, S. (Eds.). (2008). Response to Intervention: A framework for reading educators. Newark, DE: International Reading Association.
Greenwood, C. R., Kamps, D., Terry, B. J., & Linebarger, D. L. (2007). Primary intervention: A means of preventing special education? In D. Haager, J. Klinger, & S. Vaughn (Eds.), Evidence-based reading practices for Response to Intervention (pp. 73-106). Baltimore, MD: Paul H. Brookes Publishing Co.
Gresham, F. (2001). Responsiveness to Intervention: An alternative approach to the identification of learning disabilities. Paper presented at the Learning Disabilities Summit, Washington, D.C.
Hall, S. L. (2008). Implementing Response to Intervention. Thousand Oaks, CA: Corwin Press.
Howard, M. (2009). RtI from all sides: What every teacher needs to know. Portsmouth, NH: Heinemann.
Howard, M. (2010). Moving forward with RtI. Portsmouth, NH: Heinemann.
Howell, R., Patton, S., & Deiotte, M. (2008). Understanding Response to Intervention: A practical guide to systemic implementation. Bloomington, IN: Solution Tree.
Johnston, P. H. (Ed.). (2010). RtI in literacy – Responsive and comprehensive. Newark, DE: International Reading Association.
Lyon, G. R., Fletcher, J. M., Shaywitz, S. E., Shaywitz, B. A., Torgeson, J. K., Wood, F. B., Schulte, A., & Olson, R. (2001). Rethinking learning disabilities. In C. E. Finn, R. A. Rotherham, & C. R. Hokanson (Eds.), Rethinking special education for a new century (pp. 259-287). Washington, DC: Progressive Policy Institute and the Thomas B. Fordham Foundation.
Mellard, D. F., & Johnson, E. (2008). RtI: A practitioner’s guide to implementing Response to Intervention. Thousand Oaks, CA: Corwin Press.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage Publications, Inc.
National Association of State Directors of Special Education (NASDSE). (2006). Response to Intervention: Policy considerations and implementation. Alexandria, VA: Author.
Owocki, G. (2010). The RtI Daily Planning Book, K- 6. Portsmouth, NH: Heinemann.
Restori, A. F., Gresham, F. M., & Cook, C. R. (2008). “Old habits die hard”: Past and current issues pertaining to Response-to-Intervention. California School Psychologist, 13, 67-78.
Sack-Min, J. (2009). A rapid response. American School Board Journal, 196(9), 37-39.
Shores, C., & Chester, K. (2009). Using RtI for school improvement: Raising every student’s achievement scores. Thousand Oaks, CA: Corwin Press.
Wright, J. (2007). RtI Toolkit: A practical guide for schools. Port Chester, NY: Dude Publishing.