Advisor: Dr. June Ahn

Toolbox: R, Python, Dedoose, Excel, Figma, InVision

A partnership between UC Irvine, UC Riverside, University of Washington, Vanderbilt University, & Stanford University


Edsight is a visual analytics platform designed to spur new insight, learning, and decision-making for teachers, instructional coaches, and researchers involved in instructional improvement efforts.

The challenges posed to the design team include:

  • Develop interface and interaction designs to communicate instructional insights to educators through visual analytics
  • Explore how design and conditions of data-use can improve data interpretations

To tackle these challenges, our team took a co-design approach, involving partner educators and researchers in ideation, field observations, and feedback sessions to iteratively refine the designs. This co-design effort has been ongoing since 2017.

    My Roles

    • Design the user experience for the data visualization platform using learning analytics.

    • Collaborate with stakeholders (teachers, instructional coaches, and school administrators) in school districts across the U.S. to define research questions, conduct studies, and develop designs.

    • Conduct interviews, usability testing, co-design sessions, user journey mapping, and field studies to understand user experiences and ways to facilitate data interpretation and use in K-12 schools.

    Figure: The practical measures we developed to capture student engagement in mathematics classrooms.

    Insight: A Design Vignette

    Developing opportunities for deeper data sensemaking for educators

    Over two years of conducting cognitive interviews with teachers and instructional coaches, we observed that when presented with classroom data about student engagement, teachers tended to recall instructional moves and attribute causes, but rarely demonstrated actionable insight (Campos et al., under review).

    When our partners came to us with a request for a new measure, one that would capture student experience at the launch of a math instructional task, we saw an opportunity to enrich teachers' experience with Edsight and help them develop more actionable conjectures about learning and teaching.

    We built on our partners' insight that, unlike the other practical measures, which focus on group- or whole-class-level representation, this new measure should also feature individual student perspectives.

    I proposed a new experience for teachers to view student data, built around two shifts: (1) from aggregate to individual views, and (2) from static to dynamic representations. This experience has three main benefits:

  • Teachers can quickly scan at the individual level (by row) and the class level (by color patterns).
  • Teachers can set a threshold for viewing student patterns (e.g., students answering "Yes" to being ready for the task more than 60% of the time) and develop hypotheses about how and why certain subgroups of students may differ from one another in their learning experiences.
  • Giving teachers the choice to switch between aggregate/individual views, single-day/multi-day windows, and thresholds for data patterns enhances their agency in learning analytics.
  • I created a widget that lets teachers select thresholds for the data patterns and build different hypotheses about student engagement with the launch of a task. Teachers can quickly view individual learning patterns (left) alongside aggregate views over time (bar graphs and word clusters, right).
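The threshold interaction described above can be sketched in a few lines. This is a hypothetical illustration, not Edsight's actual implementation; the student names, responses, and function names are all invented for the example:

```python
# Hypothetical sketch of the threshold widget's filtering logic:
# flag students whose "Yes" rate on a readiness question exceeds
# a teacher-chosen threshold. All data here is invented.

def yes_rate(responses):
    """Fraction of days a student answered 'Yes' to feeling ready."""
    return sum(1 for r in responses if r == "Yes") / len(responses)

def above_threshold(class_data, threshold=0.6):
    """Return students whose 'Yes' rate exceeds the chosen threshold."""
    return [name for name, responses in class_data.items()
            if yes_rate(responses) > threshold]

class_data = {
    "Student A": ["Yes", "Yes", "No", "Yes", "Yes"],  # 80% Yes
    "Student B": ["No", "No", "Yes", "No", "No"],     # 20% Yes
    "Student C": ["Yes", "No", "Yes", "Yes", "No"],   # 60% Yes
}

print(above_threshold(class_data, threshold=0.6))  # ['Student A']
```

Adjusting the threshold changes which subgroup of students is surfaced, which is the lever teachers use to form and compare hypotheses about engagement patterns.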