Evidence for the Classroom Grants

The following are summaries of studies funded through the Evidence for the Classroom Initiative. 

2012 Grantee List

Melissa Braaten | Evidence for the Middle School Science Classroom
Jennifer Greene and Thomas Schwandt | The Role of Student Characteristics in Teachers' Formative Interpretation and Use of Student Performance Data
Ilana Seidel Horn | Data in Conversation: Professional Learning Opportunities in Teachers' Data-Centered Collaborative Talk
Carolyn Riehl | Degrees of Freedom: Teachers' Use of Student Performance Data in Decision Making and Instructional Change
Brenda Turnbull and Erikson Arcaira | Now What? Instructional Responses to Interim-Test Data

Melissa Braaten
(Start: 7/1/2013, End: 7/30/2015)
Evidence for the Middle School Science Classroom
Department of Curriculum & Instruction, University of Wisconsin-Madison

Current emphasis on “data-driven instruction” rests on assumed connections between teachers’ analysis of data and their subsequent practice. Systematic study is needed to understand how teachers’ attention to and use of student learning data are—or are not—related to instruction and decision-making. Because this practice takes place primarily in the privacy of a classroom and only periodically in the more public context of teacher groups or meetings, any study attempting to uncover data-driven decision-making from the perspective of the practitioner requires close, ongoing relationships with teachers. By examining the practices of “data use” from the perspectives of teachers, this project aims to understand how tools, colleagues, organizational structures, and other workplace resources provide both affordances and constraints for science teachers’ efforts at “data-driven” decision-making and instruction.

Over 2 years, this study will examine science teachers’ data use in daily practice in two middle schools that are racially, linguistically, and socioeconomically diverse. Using an ethnographic approach and multi-case study design, the project researchers will investigate teachers’ pedagogical decision-making when they obtain, interpret, use, and generate data in science classrooms. The researchers will examine personal, organizational, and contextual factors affecting science teachers’ data use and pedagogical practices, including purposes for using data; tools, resources, norms, and communities shaping teachers’ data use and pedagogical practice; and organizational, cultural, ideological, and political discourse shaping teachers’ data use and pedagogical practice. Based on interim findings from year 1, researchers will work with teachers and school leaders to develop tools and other supports for teachers’ professional learning teams. Iterative cycles of design studies during year 2 will examine how tools and resources shape these professional learning teams and support teachers’ attention to and use of student learning data during science teaching.

By becoming embedded observers, members of the research team will build case studies of: 1) individual science teacher practice and decision-making occurring daily in classroom settings, 2) collaborative teacher meetings and decision-making occurring at each school site, and 3) teacher participation and decision-making in periodic school level institutional meetings. Cross-case analyses will identify patterns and important variations between different instantiations of data use and instructional practice.

This study will: 1) provide rich evidence about the role that student learning data may or may not play in science teachers’ pedagogical reasoning and daily teaching practice, 2) generate findings useful for building meaningful supports that foster productive data use practices for science teachers in both pre-service and in-service settings, and 3) help draw connections between policy and practice.

Jennifer Greene and Thomas Schwandt
(Start: 1/1/2013, End: 12/31/2015)
The Role of Student Characteristics in Teachers' Formative Interpretation and Use of Student Performance Data
Department of Educational Psychology, University of Illinois at Urbana-Champaign

Teachers’ practical use of student performance data (e.g., unit tests; interim assessments known as benchmark, screening, or diagnostic tests; project-based assessments) in decisions to change instructional practices is a complex and elusive phenomenon. This study is designed to achieve two principal aims: 1) study decision-making in elementary school teacher-teams, and 2) describe aspects of school culture that may influence data use in instructional decision-making.

This study proposes that understanding practical use of these performance data requires the simultaneous investigation of three sets of circumstances: 1) how and why teacher-teams draw on knowledge of student characteristics (not simply race, ethnicity, and disability status but also students’ cultural and conceptual knowledge, language practices, and other distinguishing characteristics) in their interpretations of student performance data (and, as importantly, if they do not draw on this knowledge, why not); 2) how instructional decisions based on these two data sources—knowledge of student characteristics and data on student performance—arise through a process of collective decision-making (i.e., what transpires in teacher-team discussions); and 3) how a school’s culture (e.g., organizational and political contexts, accountability system, general climate) influences the decision-making processes of teacher-teams.

Our specific research questions are:

  1. In what ways and to what extent are interpretations of student performance data made by teacher-teams situated in their knowledge of student characteristics?
  2. How does variation in these characteristics affect the ways in which teachers interpret performance data and use those data to make instructional decisions?
  3. How are data on student characteristics brought to bear and negotiated as part of the process of interpreting and using performance data in a team setting?
  4. In what ways and to what extent does the culture of the school influence or mediate how teacher-teams interpret and use student performance data?

Ilana Seidel Horn
(Start: 1/1/2013, End: 12/31/2015)
Data in Conversation: Professional Learning Opportunities in Teachers' Data-Centered Collaborative Talk
Department of Teaching and Learning, Vanderbilt University

In the context of No Child Left Behind, student performance data have become commonplace, particularly in low-performing schools. In many places, improvement efforts attempt to harness the potential of teacher community along with the notion of evidence-based practice, making teacher workgroups important sites for interpreting student performance data.

The proposed three-year project will investigate four middle school mathematics teacher workgroups in two urban districts as they discuss performance data. Focal groups will come from an existing longitudinal research project investigating systemic instructional improvement from district offices to schools and classrooms, providing analytic leverage from existing data and instruments. Through comparative case studies, we will uncover how teachers’ data conversations shape professional learning in different settings. Focusing on videotaped conversations, we will use methods from sociolinguistics to investigate learning in each workgroup. We will mine and augment existing data to situate the interactional analyses, tracing outward from teachers’ conversations to differing school, district, and state contexts and inward to individuals’ contributions and expertise.

This project will develop a model of teachers’ professional learning opportunities through collective engagement with student performance data. We will investigate teachers’ learning opportunities by examining workgroup cultures, specifically by comparing uses of representations of practice, epistemic stances, and interactional routines on teaching. These guiding concepts come from prior work on teachers’ learning through conversations. By extending these constructs to account for data use and specifying how they come together to shape professional learning in different contexts, we will sharpen our understanding of the learning potential in data-centered teacher conversations.

Theoretically, this work will extend our understanding of teachers’ learning by examining relationships between workgroup practices and individual teachers’ development, as well as relationships of this development to broader contexts. Practically, it will illuminate learning consequences of data use, providing insight for school improvement and guidance for leaders.

Carolyn Riehl
(Start: 3/1/2013, End: 9/30/2015)
Degrees of Freedom: Teachers' Use of Student Performance Data in Decision Making and Instructional Change
Department of Education Policy and Social Analysis, Teachers College, Columbia University

Teachers’ data-driven decision-making (DDDM) can serve a technical function in guiding instruction. It also is a constitutive practice of the school as a social organization. And in many settings, it has taken on the character of an institutional myth that lends legitimacy in an era of intense pressure for evidence-based practice. The proposed research is intended to develop knowledge about how, if at all, DDDM leads to real instructional change. In particular, we will explore the logics of action that guide DDDM in schools and how those logics are enacted through structures, processes, speech, and action. We posit that at least four logics may be in play: the logic of feedback and diagnosis, the logic of instructional repertoires, the logic of accountability, and the logic of student voice.

For this naturalistic case study, we will select four public elementary schools in New York City. We will study at least two naturally occurring DDDM groups per school.

We will observe the structures, interactions, processes, and tools and resources that comprise the practice of DDDM and analyze how these components reflect different theories of action for DDDM. We will document the substantive content of DDDM discussions and the instructional decisions and plans that are made. We then will make repeated direct observations of classroom teaching, documenting the instructional changes that follow from DDDM. We will triangulate our observations with interviews with teachers and school leaders. A longitudinal design will enable us to observe how teachers’ participation in DDDM and their efforts to make change in their classrooms evolve over time.

In the second year, we will add a design study of student participation in DDDM in one or two schools that volunteer to participate, helping teachers to develop rubrics and other mechanisms for eliciting student self-reflections on their performance, and then observing how those are incorporated into DDDM practices and how they are interpreted by teachers and used for making instructional adjustments.

Overall, we aim to elaborate an evidence-derived theory of action for how the socially situated practices of teachers’ use of student performance data have impact in classrooms and schools. These findings will provide evidence that can shed light on DDDM in other settings.

Brenda Turnbull and Erikson Arcaira
(Start: 1/1/2013, End: 6/30/2015)
Now What? Instructional Responses to Interim-Test Data
Policy Studies Associates

This is a study of elementary teachers’ instructional decisions made in response to students’ performance on interim or benchmark tests of reading and mathematics. Through case studies in three elementary schools and close analysis of supports available to the teachers, the team will probe ways in which instructional prescriptions are shaped by the schools’ tools, structures, and routines for data use and for instruction—and learn how these school conditions support rigorous, applied academic work as a response to student performance data.

The central research questions are:

  1. How do elementary-grade teachers make instructional decisions in response to students’ performance on formal interim assessments?
    1. How are these decisions influenced by characteristics of the data (content and format) and the organizational conditions for the process of interpreting data (formal and informal structures, and organizational routines)?
    2. How are these decisions influenced by the availability of specific instructional tools, structures, and routines?
  2. What supports teachers in choosing instructional responses to interim assessments that go beyond narrowly focused test preparation?


Through in-depth data collection and analysis in reading and mathematics, grades two to five, across schools that have strong student performance and that have made serious efforts to use data productively, the study will identify ways of supporting teachers in assigning academic work that includes applications of skills and knowledge. Because the school sites have quite different philosophies, the types of supports that are common across settings will have some generalizability.

The study addresses a gap in the literature on data use: relatively little is known about teachers’ instructional choices made in response to student data, other than that teachers may struggle to translate data into instructional prescriptions and may fall back on re-teaching the specific items missed on tests. Thus, the answers to the research questions will have significance for research and practice.

2013 Grantee List

Catherine Brighton, Tonya Moon and Marcia Invernizzi | Kindergarten Teachers’ Use of Literacy to Make Instructional Decisions
Joan Buttram and Elizabeth Farley-Ripple | Understanding the Leverage Points: How Do Teachers Use Data to Inform Instruction
Amanda Datnow and Vicki Park | Teachers’ Use of Data for Differentiated Instruction: A Study of the Intersection of Ability Grouping and Data Use
Caroline Ebby, Jonathan Supovitz, and Philip Sirinides | Teachers' Use of Learning Trajectories in Analysis of Evidence and Instructional Response
Helenrose Fives and Nicole Barnes | Teachers with Expertise in Data Use: How Do They Engage in Data Driven Decision Making from Student Performance Data to Influence Instruction?
Matthew Kloser and Hilda Borko | Improving Teachers' Use of Data for Instructional Decisions: Using Assessment Portfolios for Professional Development
Joseph McDonald, James Kemple, and Susan Neuman | Data Use in Action
Katharine Shepherd and George Salembier | Collaborative Data Use by Teacher Decision-making Teams to Support Instructional Interventions for Struggling Students

Catherine Brighton, Tonya Moon and Marcia Invernizzi
(Start: 7/1/2014, End: 6/30/2016)
Kindergarten Teachers’ Use of Literacy to Make Instructional Decisions
Curry School of Education, University of Virginia

Phonological Awareness Literacy Screening-Kindergarten (PALS-K) is Virginia’s state-provided early literacy assessment, a screening and diagnostic tool designed to guide instructional decision making and monitor students’ progress over time. Because it closely mirrors the knowledge and skills that students need to acquire in kindergarten, it is directly actionable and should guide teachers toward differentiated reading instruction. Despite extensive support, to date there have been no studies that investigate whether (and if so how, and under what conditions) teachers use these data, along with other types of data, to differentiate reading instruction for their students’ needs. This study will fill that gap by answering the following research questions:

  1. How do kindergarten teachers use formative reading assessment data (i.e., PALS-K) to plan and implement appropriately challenging instruction for varying student needs?
  2. What are the organizational and instructional structures that support or inhibit teachers in using PALS-K data to guide instruction?
  3. Are there “micro-processes and stages” of PALS-K data use when teachers adjust their instruction in response to varying student performance data?

Through a two-year comparative multiple-case-study design, we aim to explore the distal and proximal factors in pairs of schools across six divisions (three per year), one of which “overperforms” and the other of which “underperforms” in terms of predicted literacy growth among kindergartners when accounting for expected demographic variables. Data sources include central office administrator interviews; teacher and building administrator “think-aloud,” narrative, and time-sampled interviews; monthly classroom observations; and other relevant documents. Data will be analyzed using rigorous qualitative methods, including safeguards for trustworthiness. Findings will be presented as cross-case analyses of teachers across pairs and across over- and under-performing school cases to identify common factors that contribute to these differential practices across settings. Findings are expected to add to the scholarly literature on teachers’ decision-making patterns regarding data use for instructional planning.

Joan Buttram and Elizabeth Farley-Ripple
Understanding the Leverage Points: How Do Teachers Use Data to Inform Instruction
Delaware Education Research and Development Center, University of Delaware

In spite of growing expectations for teachers’ use of data, little is known about what teachers actually do with student test data and, importantly, how their data use relates to student learning. Data use could potentially leverage improvements in instructional practice and student learning, but the mechanisms by which improvements are generated need to be unpacked. We propose to examine key leverage points between data use, instructional decisions, and student outcomes. Specifically, we seek to answer the following research questions:

  1. What types and features of student performance data are used to develop knowledge of student learning?
  2. What types of instructional decisions are made based on these data?
  3. Which types of data-informed instructional decisions are associated with student learning gains?
  4. What contextual factors support or impede effective data use practices?

The research will be a joint endeavor between the University of Delaware (UD) and the Kingsbury Center at the Northwest Evaluation Association (NWEA). Because NWEA’s MAP assessment is used in 6,000 districts, this partnership uniquely permits us to explore data use at scale, to focus on a form of data (interim assessments) that is increasingly common in schools nationwide but underrepresented in the data use literature, and to connect practice to student learning gains.

To accomplish our research goals, we propose an explanatory mixed methods approach based on large-scale survey data and follow-up case studies. The survey will be used to identify and document teachers’ data use practices as well as the conditions that facilitate or hinder their data use. Survey results will be connected to student MAP scores to identify effective practices and conditions (defined here as those associated with student learning gains). Case studies of a purposefully selected sample will follow the quantitative analyses in order to develop a deeper understanding of effective data use practices and conditions.

Amanda Datnow and Vicki Park
(Start: 3/1/2014, End: 8/31/2016)
Teachers’ Use of Data for Differentiated Instruction: A Study of the Intersection of Ability Grouping and Data Use
Department of Education Studies, University of California, San Diego

For the past decade, data-driven decision-making has been a key pillar of educational reform agendas. However, we still know little about how teachers use data, what types of data they use, and how their instruction is impacted. We are also witnessing some instructional trends that highlight the importance of delving deeper into these issues. In particular, a recent study reporting increased ability grouping in reading and math in the upper elementary grades suggests the need to look further into how data are used for instructional differentiation. The purpose of the study is to uncover how teachers use student performance data to differentiate English language arts and math instruction. We will also examine how teachers’ beliefs about teaching, learning, and student ability inform instructional decision making, as well as how teachers’ use of data is mediated by teacher collaboration, school organization, and the broader policy context.

The project will involve in-depth qualitative case studies of four elementary schools, focusing on teachers and their teacher teams in the upper elementary grades. The project’s conceptual framework draws upon research and theory on sensemaking and co-construction perspectives on reform implementation; teachers’ use of data individually and in context; and teachers’ conceptions of student achievement and ability and how these conceptions shape their practice.

Caroline Ebby, Jonathan Supovitz, and Philip Sirinides
(Start: 1/1/2014, End: 6/30/2016)
Teachers' Use of Learning Trajectories in Analysis of Evidence and Instructional Response
Consortium for Policy Research in Education, University of Pennsylvania

This study focuses on how learning trajectory-oriented formative assessment can be used to provide teachers with greater understanding of student thinking and lead to more refined instructional responses that improve student performance in mathematics. The project includes both a learning experience for teachers and an associated research study. The professional development component supports grades 3-5 teachers’ understanding of research-based learning trajectories in the examination of student work through both a standard professional development experience and a series of facilitated professional learning communities (PLCs).

The research component of the project is guided by the following questions:

  1. How do teachers make sense of learning trajectories and use them to interpret student work collaboratively with their colleagues in professional learning communities? How does the nature of their discussions change over time?
  2. How does the use of a learning trajectory-oriented formative assessment framework influence teachers’ interpretation of evidence of student thinking from student work and instructional decision making? What is the range and variation in teachers’ growth over time in their capacity to analyze student work and make instructional decisions and what factors moderate teachers’ growth trajectories?
  3. How does the learning trajectory-oriented framework influence teachers’ planned instructional responses and ongoing formative assessment practices in the classroom?

This is a mixed methods study. Qualitative data will be collected from nine PLCs and 28 teachers in three schools over two years. Quantitative methods will be used to model growth in teachers’ thinking over time, explore variation in growth across teachers, and identify factors that moderate growth. Comparison across different school contexts will further allow for exploration of how growth in teacher capacity and data-use practices varies and is shaped by school-level factors.

Helenrose Fives and Nicole Barnes
(Start: 2/1/2014, End: 12/31/2015)
Teachers with Expertise in Data Use: How Do They Engage in Data Driven Decision Making from Student Performance Data to Influence Instruction?
Department of Educational Foundations, Montclair State University

Teachers are expected to engage in data use to make instructional decisions that improve teaching and student learning (Data Quality Campaign, 2009; US DOE, 2009). Despite the demand to develop teachers’ data literacy and data use for instruction, there is little empirical evidence on how teachers understand and use data in authentic contexts that can be referenced to guide these policies and interventions (Mandinach, Honey, Light, & Brunner, 2008).

In this investigation we draw on the theoretical and empirical bases of data-driven decision making (DDDM; Mandinach et al., 2008; Marsh, 2012) and teacher expertise (e.g., Alexander et al., 2004) to uncover the craft knowledge of, and contextual influences on, teachers with expertise in data use. Craft knowledge is the experience-based “how to” knowledge developed and enacted by practitioners (Grimmett & MacKinnon, 1992). Extensive theoretical and empirical work has specified DDDM at the school or district level, but little is known about how teachers engage in this practice within their classrooms. We seek to understand whether DDDM models designed for school- and district-level decision making can be used to understand how teachers engage with data in their classrooms. Specifically, we want to uncover how and under what conditions fifth grade English Language Arts and Social Studies teachers with expertise in data use engage in a data-based decision-making process.

We hope to uncover teachers’ craft knowledge of data use in order to explore the subprocesses and microprocesses they invoke to convert classroom student performance data into actionable knowledge for distal (long-term) and proximal (short-term) instructional decisions. We focus on teachers’ use of recorded student performance data from formative assessments that are gathered deliberately at the classroom level and designed to discern and improve students’ learning. We rely on multiple rigorous qualitative methodologies (i.e., think-aloud protocols, interviews, observations, and document analysis) and a sophisticated analytic plan to answer the following research questions:

  1. How, and under what conditions, do fifth grade English Language Arts and Social Studies teachers with data use expertise use documented student performance data to inform distal and proximal instructional decisions?
  2. How, and under what conditions, do fifth grade English Language Arts and Social Studies teachers with data use expertise design instruction that is responsive to their instructional decisions derived from data use?

Building on existing conceptual models of data-based decision making and utilizing rigorous qualitative methodology, this investigation complements and extends the extant literature by examining how expert teachers in specific contexts engage in data use for their daily practice. Relevant to researchers and practitioners, our findings will inform existing theory of data-based decision making at the classroom level and will identify a repertoire of specific transferable strategies for data use that can be empirically confirmed in future research and used to guide policy and interventions.

Matthew Kloser and Hilda Borko
(Start: 6/1/2014, End: 12/31/2016)
Improving Teachers' Use of Data for Instructional Decisions: Using Assessment Portfolios for Professional Development
Institute for Educational Initiatives, University of Notre Dame

Despite many large school districts reportedly using data-driven practices, the field lacks a robust evidentiary base for identifying which teachers use formative student data, what data they use, when they use it, and how these data influence instructional choices. Furthermore, we have yet to deeply explore the ways in which professional development (PD) can shape these practices. This exploratory study uses a portfolio instrument, the Quality Assessment in Science (QAS) Notebook, to explore the following research questions:

  1. How and when do middle school science teachers use student performance data, collected in the QAS Notebook, to make decisions about and to adapt classroom instruction? 1a. How does teachers’ use of interim/benchmark assessment data compare to their use of data from textbook or teacher-created assessments?
  2. In what ways does professional development that includes the collaborative analysis of QAS Notebooks change teachers’ use of student performance data in making instructional decisions? 2a. What PD components are more or less effective in helping teachers use student performance data to make instructional decisions?
  3. How do different organizational structures of schools and districts support and constrain teachers’ use of student performance data for making instructional decisions?

In this study, teachers will collect four Notebooks over a twelve-month period. Data collected in the Notebook will include planning artifacts such as lesson plans; assessment artifacts such as interim and teacher-created tests, rubrics, and achievement data; and annotations of how teachers use the performance data to make instructional decisions. Teachers will also provide written reflections and participate in semi-structured interviews that probe how, if at all, they use data to inform their instruction.

Following the collection of the first baseline Notebook, teachers will participate in PD that focuses on rating their existing practice according to pre-defined rating dimensions and using the artifacts in the Notebook to develop an action plan for future practice. Professional development will continue throughout the school year as teachers collect additional Notebooks and meet frequently in collaborative teams to discuss student data and its implications for practice. These conversations will be recorded and transcribed for analysis. A final Notebook will be collected and rated by the research team to chart within- and between-teacher changes in practice. Results will be analyzed qualitatively according to our theoretical framework and developed into cases of teachers’ data use that pay particular attention to how context and organizational structure influence practice.

Joseph McDonald, James Kemple, and Susan Neuman
(Start: 1/1/2014, End: 12/31/2015)
Data Use in Action
Steinhardt School of Education, New York University

School districts in the U.S. have recently invested substantial financial and human capital in the pursuit of data-informed teaching, though the efficacy of this investment remains largely unknown. Moreover, the nature of data-informed teaching is under-theorized. This project of the Research Alliance for New York City aims to address these gaps with basic research on actual data use in elementary and middle schools. The schools will be selected first on the basis of their high levels of data use, as measured by School Quality Review reports and ARIS clickstream data, and then on the basis of the similarly high challenge they face in teaching students who score below proficient on state ELA assessments in grades 3 and 6. Finally, they will be selected so as to pair schools that have generated above-average growth from grade 3 to 4 or from grade 6 to 7 with those that have not.

The investigation of data use in these schools, and in the classrooms of focal literacy teachers working within them, will surface and examine school-level and team-level organizational routines, as well as instructional routines that may illuminate data use and its efficacy. Research methods will include in-depth interviewing, observation of organizational routines and their social sub-strata, and novel techniques for observing instruction that involve low-inference transcription and reflection by teacher partners.

The research team is a mix of three senior researchers, two early-career researchers, and several doctoral students at various points in their training. The team aims to produce a conceptual framework that richly describes the ‘space’ that all data-focused teaching interventions implicitly target and must ultimately devolve to (if they ever do)—namely, classroom practice. It also aims to produce writing and other products to help guide future research, policy, school development, and teacher education.

Katharine Shepherd and George Salembier
(Start: 1/1/2014, End: 12/31/2016)
Collaborative Data Use by Teacher Decision-making Teams to Support Instructional Interventions for Struggling Students
College of Education and Social Services, University of Vermont

Teacher decision-making teams play an important role in schools’ efforts to help teachers respond to diverse student learning needs. Decision-making teams bring together general and special education teachers, and often other instructional and support staff, to help classroom teachers develop and implement interventions for struggling students. In doing so, teams rely heavily on a wide range of formative and summative student performance data and on a collaborative problem-solving model to develop instructional interventions. Decision-making teams have become nearly ubiquitous in schools nationwide and are a key component of most schools’ instructional programs. Despite the fact that these teams are a well-established feature of the educational landscape, little is known about how they collaboratively use data to develop instructional interventions for classroom teachers.

This study will unpack how decision-making teams understand, use, and collaborate around student performance data to develop instructional interventions, and how school conditions influence this process. Specifically:

  1. What knowledge and skills do teacher decision-making teams rely upon to interpret, analyze, and apply data to understanding student needs? What constitutes teacher teams’ data literacy? How are data translated into instructional interventions?
  2. What does collaborative data use look like in these teams? How do teacher teams meet, interact, communicate, and cooperate around student data? In what ways do teams leverage members’ expertise and experience to develop interventions?
  3. How do school-level organizational conditions support or inhibit teams’ abilities to effectively collaborate around student data?

Comparative case studies of four Vermont elementary and middle schools’ Educational Support Teams (ESTs) will inform the study’s research questions. ESTs are data-driven teacher decision-making teams that rely on a collaborative problem-solving approach to develop appropriate instructional responses for struggling students. The proposed study will contribute to our understanding of teacher decision-making teams’ data literacy and collaborative data use for instructional improvement.