Each year, staff in Round Rock ISD administer a climate survey to all staff, parents, and students. The climate survey provides campus and district staff an opportunity to collect important information from stakeholders in order to improve campus climate.
Responses to common questions about the climate survey development and climate survey results are listed below. If you have additional questions about the climate survey, please reach out to us directly.
Climate survey results
Aggregate climate survey results for students, parents, and staff are publicly available. You can find these results for each school and for the entire district on the Performance Indicators website.
What do the climate surveys measure?
The climate surveys are designed to measure campus climate as perceived by students, parents, and campus staff members.
Some student climate survey questions are designed to measure student-level program outcomes, such as students’ social and emotional learning, in addition to students’ perceptions of campus climate.
What is campus climate?
We used Thapa and colleagues’ (2013) conceptualization of campus climate to frame our understanding. Campus climate is based on “people’s experiences of school life” and is a reflection of individuals’ perceptions of campus norms, goals, and values, along with perceptions of interpersonal relationships, teaching and learning practices, and even organizational structures. Aggregated, individuals’ responses provide an estimate of campus climate as a whole.
How are individuals’ responses kept confidential?
Individual responses to the survey are secured on computers and servers accessible only to staff in the Department of Research & Evaluation. Reports are created with quantitative data aggregated across campuses in order to ensure confidentiality for staff with unique positions (e.g., a campus with only one principal). Individual responses to Likert-style questions (i.e., agree, disagree) are not shared.
Written responses are not kept confidential: individual written comments are shared directly with campus and district administrators.
Responses to the family survey are designed to be anonymous; responses to the student and staff climate surveys are not. This allows Research & Evaluation staff to evaluate programs across the District for the purpose of continuous improvement.
How should campus leaders use their survey results?
As a performance management tool. The results of this survey provide insight into how campuses are perceived by students, parents, and staff. Ideally, the results can inform campus improvement plans for the upcoming school year.
To initiate conversations with staff. Staff likely will be very interested to know how the campus is perceived by each of the respondent groups. We encourage campus leaders to share the aggregated survey reports broadly to facilitate discussions about continuous improvement.
Climate surveys provide imprecise estimates of campus climate, so survey data should be considered in concert with additional data sources.
How were the climate surveys developed?
Climate surveys have been administered in Round Rock ISD for many years and have gone through a series of revisions. Most recently, staff revised the student climate survey (2017 – 2018 school year), the instructional staff climate survey (2016 – 2017 school year), and the family survey (2017 – 2018 school year).
To revise the climate surveys, committees of campus and district staff came together several times in the fall semester and selected questions from a series of researcher-developed and validated surveys. These included questions from other school districts’ student climate surveys (e.g., Austin ISD, Chicago Public Schools), the Marzano High Reliability Schools Framework, the Teacher Data Use Survey (Wayman et al., 2016), and the School Survey of Practices Associated with High Performance (Weinstock et al., 2016).
The committee loosely aligned climate survey questions to Round Rock ISD goals, improvement plans, and the Round Rock ISD Learning Framework. When necessary, new questions were written. Research and Evaluation staff examined all questions for readability using MetaMetrics’ Lexile framework and adjusted question wording for increased readability when necessary. Research and Evaluation staff field tested the questions with a sample of students, instructional staff, and parents for the student, instructional staff, and family climate surveys, respectively. Each year, campus principals had an opportunity to review the questions and provide feedback before a final set of questions was agreed upon.
Survey participants were presented questions from the climate survey in random orders (randomized by question and by domain), in an attempt to reduce the effect of survey fatigue on question analyses.
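The randomization described above can be sketched in a few lines. This is a hypothetical illustration, not the district's actual survey software: the domain names and question IDs are made up, and the survey questions are assumed to be grouped into named domains.

```python
import random

def randomized_order(domains, seed=None):
    """Return question IDs with both domain order and within-domain
    question order shuffled per respondent.

    domains: dict mapping a domain name to its list of question IDs.
    seed: optional seed so a respondent's order can be reproduced.
    """
    rng = random.Random(seed)
    order = []
    domain_names = list(domains)
    rng.shuffle(domain_names)        # randomize the order of domains
    for name in domain_names:
        questions = list(domains[name])
        rng.shuffle(questions)       # randomize questions within the domain
        order.extend(questions)
    return order

# Example with invented domains and question IDs:
example = {"relationships": ["q1", "q2"], "safety": ["q3", "q4"]}
print(randomized_order(example))
```

Because fatigue tends to depress responses to later questions, randomizing the order spreads that effect evenly across all questions rather than concentrating it on whichever questions happen to come last.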
Psychometric analyses were conducted separately on each survey. Based on these analyses, we reduced the number of questions to arrive at the final set.
What psychometric analyses were conducted?
Staff in Research & Evaluation conducted a series of psychometric analyses after developing the staff and student surveys.
First, we conducted simple reliability analyses (e.g., Cronbach’s alpha, item-total correlations) in SPSS or R before conducting a series of exploratory factor analyses (EFA) with a subset of participant data in Mplus.
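As a hedged illustration of the first step, Cronbach's alpha can be computed directly from its definition: alpha = (k / (k − 1)) × (1 − Σ item variances / variance of total scores), where k is the number of items. The district's actual analyses used SPSS, R, and Mplus; the sketch below is a minimal pure-Python version with invented data.

```python
def cronbach_alpha(responses):
    """Cronbach's alpha for a list of respondents, each a list of
    item scores (e.g., Likert responses coded 1-5)."""
    k = len(responses[0])            # number of items
    if k < 2:
        raise ValueError("alpha requires at least two items")

    def variance(values):            # sample variance (n - 1 denominator)
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([r[i] for r in responses]) for i in range(k)]
    total_var = variance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented example: three respondents answering two Likert items.
print(cronbach_alpha([[1, 2], [2, 1], [3, 3]]))
```

Items that covary strongly push alpha toward 1, suggesting the items measure a common construct; alpha near or below 0 suggests the items do not hang together as a scale.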
Based on these EFAs, and a review of questions by district staff, we reduced the number of questions in the survey and conducted confirmatory factor analyses (CFA) on the reduced set of questions. Model fit indices were generally strong for each survey, suggesting that the climate survey questions formed coherent, reliable scales.
Why ask questions about “most teachers” or “most students”?
When campus staff respond to questions that refer to “most teachers” or “most students,” survey participants are reflecting on their perceptions of what occurs on the campus. In the aggregate, responses are an indication of perceptions of campus climate. Researchers of campus climate commonly measure it using questions with similar “most teachers…” wording.
In addition, we hypothesize that survey respondents may be less susceptible to social desirability bias when reflecting on their peer group, rather than when self-reflecting on their own behaviors (especially when responding to a survey administered by district staff).
Finally, we believe that consistent phrasing throughout the survey reduces participant burden and survey fatigue; therefore, we re-worded some researcher-developed questions to use the “most teachers…” phrasing for consistency’s sake.