Through Students' Eyes: Eye Tracking the SVA Library Online Collections Experience

Role

UX Researcher

Duration

11 Weeks

Tools

Tobii Pro Lab, Tobii Eye Tracker, Figma, Figma Slides

Team Members

Lillian MacGuire

Meng Shi

Qiaochu Zhang

Over the course of 11 weeks, the team conducted an in-depth eye-tracking study to evaluate how users interact with the School of Visual Arts (SVA) library’s online collections database. Using Tobii eye-tracking technology, we observed real-time user behavior as participants attempted to complete key tasks related to searching, browsing, and accessing materials.

Our findings informed a set of actionable recommendations aimed at improving the clarity, accessibility, and overall user experience of the database interface. These were translated into design mockups using Figma and delivered alongside a slide deck, a highlight reel of the testing process, and a research summary.

Executive Summary

Laying the Groundwork: Establishing Project Goals

In an initial kickoff meeting over Zoom, we met with Phoebe Stoneking, the Digital Services Librarian at SVA. Through this conversation, we pinpointed a clear project goal to base our research plan and test tasks around:

“Analyze user behavior and visual attention on the SVA Online Collections website to understand navigation, search interactions, and resource discovery, identifying usability issues to improve intuitiveness and self-service.”

This meeting also clarified the project’s constraints: we learned that SVA Online Collections uses Springshare to host its databases.

Participants: Real Users, Real Tasks

During the kickoff meeting, the team learned that SVA’s Online Collections database is primarily used by students rather than staff. Based on this insight, we identified SVA students as our primary testing group.

Figure 1: An image of real SVA students. (Source)

To meet the project’s time constraints, we also included Pratt students who had experience using online collections databases, either at Pratt or a previous institution, to supplement our participant pool.

Participants were recruited through a Google Form that captured key behavioral data, including:

  • Prior experience with online collections resources

  • Primary device used (phone or desktop)

  • Eye-tracking eligibility, confirmed through screener questions to ensure compatibility with the Tobii eye tracker, our main data collection tool for this study.

Figure 2: An image of the team conducting a pilot test.

What Needs Fixing First: Issues by Frequency and Severity

During each testing session, a second team member acted as a dedicated note taker. Their role was to capture key usability issues and document direct quotes from participants during the retrospective think-aloud (RTA) interviews.

All notes were compiled into a shared spreadsheet that included details such as the platform used (desktop or mobile), the number of participants who encountered each issue, and a severity rating for prioritization.

This system allowed us to identify and rank the most critical problems, helping to focus our recommendations on the issues most in need of attention by SVA.

Figure 5: Screenshot of the team’s shared problem sheet
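
To make the ranking concrete, here is a minimal sketch of how issues like these could be scored, assuming a simple participants-affected × severity weighting. The scoring scheme and the sample issues are illustrative, not the team’s actual spreadsheet data.

```python
# Illustrative sketch of the prioritization behind the problem sheet.
# The scoring scheme (participants affected x severity) and the sample
# issues below are assumptions for illustration, not the team's exact data.
from dataclasses import dataclass

@dataclass
class Issue:
    description: str
    platform: str               # "desktop" or "mobile"
    participants_affected: int  # how many of the 12 participants hit it
    severity: int               # 1 (cosmetic) to 4 (blocker)

    @property
    def priority(self) -> int:
        # Weight each issue by both reach and impact.
        return self.participants_affected * self.severity

issues = [
    Issue("Misunderstood scope of 'Online Collections'", "desktop", 11, 4),
    Issue("Skipped lengthy descriptive text", "desktop", 9, 3),
    Issue("Missed the filter option", "mobile", 10, 3),
]

# Most critical problems first, mirroring the ranked problem sheet.
for issue in sorted(issues, key=lambda i: i.priority, reverse=True):
    print(f"{issue.priority:>3}  {issue.description} ({issue.platform})")
```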

Seeing What Users See: Eye Tracking as a Tool

We conducted eye-tracking research using Tobii, paired with retrospective think-aloud (RTA) interviews to capture user reasoning. To supplement qualitative insights, we used the System Usability Scale (SUS) to evaluate overall usability.
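
For context on how SUS works: the scale is scored from ten five-point Likert items using a standard formula. The sketch below implements that standard scoring; the sample responses are invented for illustration.

```python
# Standard SUS scoring: ten Likert items rated 1-5.
# Odd-numbered items are positively worded and contribute (rating - 1);
# even-numbered items are negatively worded and contribute (5 - rating).
# The raw sum (0-40) is multiplied by 2.5 to give a 0-100 score.

def sus_score(responses: list[int]) -> float:
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum(
        (rating - 1) if item % 2 == 1 else (5 - rating)
        for item, rating in enumerate(responses, start=1)
    )
    return total * 2.5

# Hypothetical participant responses, for illustration only.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```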

Tasks were designed to reflect natural user behavior while covering key areas of the interface, including:

  • How easily users could locate the Online Collections database from the SVA homepage

  • Whether users could identify and use the search bar effectively

  • If users could find and apply advanced filtering options

  • Whether users could locate curated subject guides to support their research

  • If users could identify where to go for help on the site

We tested 12 participants: six on desktop and six on mobile. While both platforms were assessed, early recruitment data confirmed desktop as the primary surface, so it became our focus.

Figure 3: Gaze plot of a participant completing Task 1, which asked them to find the Online Collections entry point.

SVA’s Online Collections Database is Useful, but Lacks Efficiency

Our eye-tracking tests uncovered several critical usability issues within the Online Collections database—many of which compounded over the course of testing.

To begin with, 93% of users misunderstood what “Online Collections” referred to. Most assumed it included all of SVA’s digital resources, when in reality it links only to external databases.

Additionally, 75% of participants found the descriptive text under the Online Collections tab too lengthy, causing them to skip over important information entirely. This contributed to confusion and misdirected navigation early in their sessions.

The gaze plot in Figure 3 shows that the participant read only the headings in this section, skipping over both the section descriptions and the section images.
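
Gaze plots like this one are backed by fixation data exported from the eye tracker. As a rough sketch of how such a claim can be quantified, the snippet below aggregates total fixation time per area of interest (AOI) from a simplified export; the file name and column names are hypothetical, not Tobii Pro Lab’s actual export schema.

```python
# Rough sketch: total fixation time per area of interest (AOI) from a
# simplified fixation export. The file name and the columns "aoi" and
# "fixation_duration_ms" are hypothetical, not Tobii Pro Lab's real schema.
import csv
from collections import defaultdict

dwell_ms: dict[str, float] = defaultdict(float)

with open("fixations_task1.csv", newline="") as f:
    for row in csv.DictReader(f):
        dwell_ms[row["aoi"]] += float(row["fixation_duration_ms"])

# A pattern like Figure 3's would show headings dominating, e.g.:
# headings 5.2s, descriptions 0.3s, images 0.2s
for aoi, ms in sorted(dwell_ms.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{aoi:<14} {ms / 1000:.1f}s total fixation time")
```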

Figure 4: Gaze replay of a participant using the search bar incorrectly.

Issues Snowball Across the Interface

Building on the earlier finding that users misunderstood what “Online Collections” refers to, we also observed related confusion around how to use the search bar effectively.

The Online Collections page is a directory of databases, not a catalog of individual resources like books or articles. However, many users attempted to search for specific topics or resources directly. For example, as shown in the gaze replay in Figure 4, one participant searched for “Korea Suicide Rate” and received no results.

This misunderstanding affected 60% of participants, highlighting a critical gap in how the purpose of the search function is communicated.

Our top five recommendations were selected because the underlying issues consistently affected multiple users and created significant friction during key tasks. These issues not only disrupted navigation but also contributed to confusion around the database's core purpose and features.

In the next section, we’ll break down each of these five recommendations in more detail:

  1. Reduce text in the description, use bold text for key information, and select more relevant images to aid comprehension.

  2. Clarify the search function and provide better feedback when search results return no matches.

  3. Improve visibility and usability of filters, especially on mobile devices.

  4. Highlight the subject guide by changing its placement and turning the link into a clear call-to-action (CTA) button.

  5. Standardize the icon set and update icon labels with more descriptive, user-friendly language.

These changes aim to make the interface more intuitive, reduce user confusion, and support faster, more successful task completion.

Five High-Impact Improvements for SVA’s Online Collections

Research Findings & Proposed Recommendations

Finding 1:

  • 93% of users don't understand what "Online Collections" entails. They believe this incorporates all digital media that SVA provides, not just databases.

  • 75% of participants think text descriptions on the SVA homepage are too long and not scannable. They also believe the images do not aid scanning attempts.

Recommendation 1:

  • Reduce the amount of text in the description and bold key information.

  • Choose images that feel more relevant.

Finding 2:

  • 60% of participants didn’t realize the search bar is meant to help them find databases, not search within them.

Recommendation 2:

  • Add contextual microcopy to clarify the search function.

Finding 3:

  • 83% of participants missed the filter option and relied only on keyword search on mobile.

  • When we asked participants to search for an art history database that includes articles, 5 out of 6 of them didn’t use the filter at all.

Recommendation 3:

  • Make filters more visible and usable on mobile.

Finding 4:

  • 92% of users find subject guides potentially useful but struggle to discover them.

  • 83% of users think the New & Featured databases are irrelevant to their search results.

Recommendation 4:

  • Swap the positions of the Subject Resources and New/Trial & Featured Databases sections.

  • Move this column to the right side of the page.

  • Make the guide link a CTA button.

Finding 5:

  • 100% of users think the icons and labels lack clarity.

  • 44% of users thought the icons were clickable and could be used as a "Type" filter.

Recommendation 5:

  • Use a more standardized icon set that matches users' prior experience.

  • Change the icon labels to more descriptive language, e.g., 'contains index'.

Want to See More?

For a more detailed view of the team’s findings, please see our findings highlight reel below, as well as our final presentation, which outlines our key findings and recommendations.

Client Reactions and Positive Feedback

We presented our findings in person at the SVA library to our client, Phoebe Stoneking, and other library stakeholders. We received very positive feedback from Phoebe, who said she was impressed with our ability to create design recommendations that fit within the system constraints:

“Thank you so much for all of your hard work this semester. I thought you had really excellent recommendations regarding the icon set, improving the databases search, and for reorganizing the page content. You paid close attention to our system constraints, which is really appreciated.”

- Our Client

Reflections and Key Takeaways

I had some prior experience collecting eye-tracking data as an undergraduate research assistant, so I was excited to build on that knowledge in this project. This was my first time analyzing results from the Tobii eye tracker, and combining that with data analysis using the System Usability Scale (SUS) made for a valuable learning experience.

Here are some of the most important lessons I took away from the project, along with areas where I hope to continue improving.

  • Consistent Communication Enhances Team Alignment: Timely communication, especially around scheduling and task delegation, played a critical role in keeping the team aligned. Prioritizing clear communication led to smoother testing coordination and fewer misunderstandings.

  • How to Be a Better Moderator: In this project, maintaining neutrality as a moderator was essential to reducing bias and collecting reliable data. I learned how to give more effective, non-leading feedback to participants by shifting from affirmations like “good” to open-ended prompts such as “okay, why do you think that?” This small but intentional change encouraged participants to elaborate on their thoughts and helped us gather richer, more insightful data.

  • Need for Better Personal Documentation Habits: While collaboration was strong, keeping up with post-testing documentation was a personal challenge. Setting reminders for data entry is a clear opportunity for improvement in future projects.
