Yale Center for Teaching and Learning

15 Considerations for Building a Culture and Climate Assessment

December 1, 2021

By Meghan Bathgate

As universities across the nation focus on improving their culture and climate to become more inclusive and equitable, many of us have questions about how to gather useful and accurate data. To help you collect information on community members’ experiences, we offer the following (non-exhaustive) list of ideas to consider as you plan and construct your assessment.

Identify the purpose of your assessment. This will guide your method, selection of items, the population to be included, and the timing of your data collection. Revisiting these goals as you plan will make your questioning more purposeful, allow you to tailor your assessment to meet that purpose, and prevent data collection that does not serve your larger objectives.

Clearly define scope. While your assessment may meet multiple goals, it will not be able to evaluate everything at once. Prioritizing a set of goals within a given data collection time period can help hone your assessment to provide the deepest, most impactful results. Your subsequent assessments can build on the results, directed by these findings.

Consider which method is best for you, at this moment. Surveys, focus groups, 1:1 interviews, or the review of existing documentation are all possible avenues. The nature of the questions being asked, potential desires for anonymity, capacity for data analysis, and the current knowledge of your culture and climate all influence this choice.

Ask whether this data exists elsewhere. While there are data-sharing regulations, there are likely many reports and summaries, and in some cases direct data, that are available through existing departmental, divisional, or institutional sources. Additionally, if these data do not exist, you may be able to partner with another initiative or office to leverage data collection efforts and reduce requests on the university community.

Ask questions that will give you actionable results. A common error in assessment design is to ask only about a desired outcome and not about the possible factors that explain it. For example, asking the degree to which people feel included in your department gives you an average and the variation in how people are feeling, but not why they feel that way. If 30% of your participants feel unsupported in your department, knowing what practices or areas lead them to feel this way is critical to taking action to address their needs. Are students feeling least supported in courses? In interactions with peers? In their advising? In their career guidance? For those who feel unsupported, ask what resources or support might help them. For those who do feel supported, ask what elements help them feel this way. These answers provide insight into the steps you can take to meet participants’ needs.

Treat the assessment itself as a form of communication with participants. What you choose to assess and how you frame, administer, and share data from the assessment communicates what is valued and salient to your work. Recognize that you are saying something to your community members by what you do, and do not, include. To help establish trust and engagement with your participants:

  • Communicate the purpose of the assessment.
  • State who has access to and will see the data.
  • State the level of anonymity of the data. Will the data be confidential (identifying information known by the researcher but not shared through results) or anonymous (no identifying information requested)?
  • Consider the flow of your assessment. For example, what do you ask first to engage participants and set the tone of the assessment? How do you provide outlets for participants to offer additional feedback?
  • Be thoughtful about including and using demographic items. Put demographic and background questions at the end of the survey to reduce stereotype threat, and ask only those demographic and background questions that are essential to your initiative. Carefully consider the response options within demographic questions to best represent the identities of your sample. If you do ask demographic questions, recognize that this can reduce anonymity for some participants. Additionally, decide whether you will review your data by demographic variables (e.g., separating data based on race or gender) or whether these variables will be used only to describe the sample.

Test it out. Budget time for a few people from your target population to review your assessment and give you feedback on elements such as length, flow, and item phrasing. Cognitive interviewing procedures can serve as a guide for this review.

Consider timing. Select a time when your target population is available to give attention to your survey (e.g., avoiding midterm week). Also consider and avoid (or partner with) simultaneous data collection efforts that may be requested of your sample.

If using a survey, select a platform. Then, understand the options within your platform. This knowledge can also help inform the design of your items and flow of your survey. Yale has a university account for Qualtrics.

Determine who will analyze the data. Before launching an assessment, develop an analysis plan. Who will analyze these data? How will they be shared? Are there areas of the assessment that should be prioritized for analysis? These decisions can also affect how anonymity is treated. Additionally, follow sound data analysis practices throughout: avoid cherry-picking, be aware of your own potential biases, and represent the data accurately.
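As one illustration of how an analysis plan can protect anonymity, the sketch below aggregates hypothetical Likert-style ratings by group and suppresses any group too small to report without risking identification. The data, group labels, and the minimum group size of 5 are all assumptions for the example, not recommendations from this guide:

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5  # assumed threshold: suppress groups smaller than this

# Hypothetical responses: (group label, 1-5 agreement with "I feel included")
responses = [
    ("A", 4), ("A", 5), ("A", 3), ("A", 4), ("A", 2), ("A", 5),
    ("B", 2), ("B", 3), ("B", 4), ("B", 3), ("B", 5),
    ("C", 1), ("C", 2),  # only two respondents: too few to report safely
]

def aggregate(responses, min_n=MIN_GROUP_SIZE):
    """Return the mean rating per group, suppressing groups below min_n."""
    groups = defaultdict(list)
    for group, rating in responses:
        groups[group].append(rating)
    summary = {}
    for group, ratings in sorted(groups.items()):
        if len(ratings) < min_n:
            summary[group] = "suppressed (n < %d)" % min_n
        else:
            summary[group] = round(sum(ratings) / len(ratings), 2)
    return summary

print(aggregate(responses))
```

Writing down a rule like this before launch makes the anonymity promise concrete: small groups are never reported, no matter how interesting their results might be.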

Plan for how you will act if sensitive incidents are raised. Consider including resources on how participants can report these incidents (e.g., SHARE; Office of Institutional Equity and Access; Sexual Misconduct Response and Prevention).

Expect less than a 100% response rate. When data collection is complete, consider how well your data represent your sample. It is exceedingly rare to approach a 100% response rate with optional assessments, although there are steps you can take to increase responses. If your measures are strong, your results will validly represent those who responded. Lower response rates can be mitigated by collecting information about culture and climate from several sources instead of relying on a single one.

Share aggregated results. Data collection is one step in the continuing process of improving diversity, equity, and inclusion. Develop a plan for sharing your findings back to the community you assessed. This can increase trust and transparency in your community and can reaffirm the value of their participation. Describe the actions that will be taken or informed by these data and ask for input on these plans, if appropriate.

Think big, act small. As you move toward your larger goals of inclusion, it may be tempting to jump into action by adding “more” to your department in response to the data (more trainings, more assessments, more meetings). While these all may be helpful where called for, remember that small actions speak volumes. The spoken and unspoken social norms within your department help establish its narratives. Small considerations, such as who is invited to and speaks at meetings, whose work is featured in departmental communications, and how and to whom feedback is given, affect people’s day-to-day experience of belonging.

Follow University guidance regarding data use. If you intend to use your data for research purposes, please consult with Yale’s Institutional Review Board, which has a variety of helpful resources in its library. Medical information also has specific protections, for which the HIPAA Privacy Office offers resources and support.

For questions or additional resources, contact Meghan Bathgate, Associate Director of Educational Program Assessment at the Yale Poorvu Center for Teaching and Learning (meghan.bathgate@yale.edu).