Chandra Turpen, University of Maryland-College Park
David Craig, Oregon State University
Joel Corbo, University of Colorado at Boulder
Kathryn Svinarich, Kettering University
Robert Dalka, University of Maryland-College Park
There is abundant evidence of practices that support strong undergraduate programs, such as shared leadership and using data to guide programmatic changes. Yet in many physics and other STEM departments, motivating uptake of these strategies can be difficult due to factors such as time constraints and the cultural assumptions of the disciplinary community. The Effective Practices for Physics Programs (EP3) initiative responds to these challenges.
EP3 is led by two professional societies that are trusted among physicists: the American Physical Society (APS) and the American Association of Physics Teachers (AAPT). The EP3 Guide was created by gathering knowledge from both the research literature and practitioners within physics and education. It serves as a resource for stakeholders who wish to transform their programs by bringing together effective practices across a wide range of departmental facets, such as student mentoring, course design, and departmental leadership.
The Departmental Action Leadership Institute (DALI) was created as the initial community engagement arm of the EP3 initiative. DALI is an intensive, year-long professional learning community led by expert facilitators that supports physics faculty in apprenticing into effective change strategies and adapting practices, including those outlined in the EP3 Guide, to match their local contexts. During this year, DALI participants engage with a local team of departmental stakeholders to work on a change effort that is specific to their program's concerns.
Research on DALI has found that faculty participants value learning skills around facilitating teams and designing purposeful approaches to change. This learning is supported by DALI's consistent, high-touch approach. This symposium will focus on three primary leadership practices that faculty members brought into their local teams: shared vision for success, purposeful use of data for decision making, and partnerships with students.
Join us to learn about EP3 and discuss ways to enable uptake of effective change practices.
Kimberly LeChasseur, Worcester Polytechnic Institute
Kris Wobbe, Worcester Polytechnic Institute
Diversity trainings have been critiqued as inadequate for shifting educational practice, despite their ubiquitous presence (Bezrukova, Spell, Perry, & Jehn, 2016). Equity audits can be a powerful alternative. As a change management tool, equity audits can spur reflection, clarify shared values, and connect values to action (Harris & Hopson, 2008). While there are many guides supporting their use in K-12 education (e.g., Skrla, Scheurich, Garcia, & Nolly, 2004; Green, 2017), they remain underutilized in higher education.
In 2021, the Center for Project-Based Learning at Worcester Polytechnic Institute conducted an equity audit to advance our commitment to addressing structural inequities. As pedagogical leaders, we recognized the Center was upholding practices that reinforce privilege. An equity audit allowed us to engage in structured, evidence-based exploration of how to shift our operations to better enact our values. For example, we found that we partner with a disproportionately high percentage of Minority-Serving Institutions, yet have relatively few workshops demonstrating culturally responsive teaching through PBL. The experience of conducting an equity audit surfaced new ways of thinking about how anti-racist practices fit into our center.
In this presentation, we will provide guidance for how to conduct an equity audit. Strategies, resources, and lessons learned will be organized into four phases: 1) planning and committing to an equity audit, 2) identifying the right questions to ask, 3) making sense of data to surface new questions, and 4) addressing new questions with action. Within each phase, attendees will hear how we used the process of conducting an equity audit to initiate and manage change.
Alice Olmstead, Texas State University-San Marcos
Andrea Beach, Western Michigan University
Charles Henderson, Western Michigan University
Diana Sachmpazidi, University of Maryland-College Park
Cynthia Luxford, Texas State University-San Marcos
Instructional change teams are increasingly common in efforts to improve undergraduate STEM instruction. Such teams include three or more members who may have different perspectives and backgrounds. When they are functioning well, teams can be very effective; unfortunately, effective collaboration in teams is often challenging. In prior work, we developed a context-specific model that describes how teams are set up (inputs), how they work together (processes), how they feel and think about working together (emergent states), and what they achieve (outcomes) (Olmstead et al., 2019; Sachmpazidi et al., 2020). Based on this model, we have developed and pilot tested a survey tool that uses responses from individual team members to identify the strengths and weaknesses of an instructional change team. In this talk, we will present the survey tool. We will show how the variables in our model translated into survey items, share results of the statistical analysis, and discuss the potential utility of this survey for teams, including the types of reports generated for teams who complete the survey. We will also discuss how educators, researchers, evaluators, and other change agents can access this survey for their own work, such as setting up teams, assessing or providing feedback to existing teams, and demonstrating the effectiveness of teams to administrators and evaluators.
Sam McKagan, American Association of Physics Teachers
We discuss a potential new tool that would bring together several existing projects to support physics departments in departmental change efforts and support departmental change researchers in identifying and understanding key features that support departmental change and equity. It has long been argued that the department is the most important unit of change in higher education. Careful qualitative analysis of individual departments engaged in change has led to promising results about the key factors that make a difference in departmental change. However, because change is a complex process with a long time scale and many interacting factors, it is often difficult to determine which factors make a difference in creating and sustaining change. There is a lack of systematic long-term quantitative data about institutional and departmental change, making it difficult to make many empirical claims. The Accelerating Systemic Change Network (ASCN) has brought together change researchers and change leaders to identify research questions that could be answered with such data. The Effective Practices for Physics Programs (EP3) Initiative has produced a large collection of recommendations for physics departments engaged in change. The PhysPort Data Explorer is a tool for physics departments that provides analysis and visualization of research-based assessment results. These projects could be brought together to create a new tool, the Departmental Data Explorer, that would (1) systematically collect data from many physics departments (and eventually STEM departments more broadly) on their change initiatives, program-level outcomes, assessments, and activities, and (2) share de-identified, anonymized data with change researchers through a national database of departmental practices, program-level student learning outcomes and equity goals, programmatic assessments, assessment results, and change strategies. This tool would support research on change in a broadly representative sample of departments, with a particular focus on equity. We are actively seeking new collaborators and funding sources for this tool.