IT Portfolio
WHERE OUR JOURNEY BEGINS (PROJECT OVERVIEW)
The Office of Information Technology wanted to create a central resource for viewing the status of all products across the university's distributed community.
This tool allows IT professionals at the university to access detailed information on all available, accessible, and discontinued technology in the portfolio.
Project Team - Lead UX Analyst (me), Office of Central Information Technology senior leadership
WHAT WE WANTED TO DISCOVER (PROJECT GOALS)
The study focused on evaluating the mid-fidelity wireframes the team had created.
We defined the following project goals:
Goal 1: How well do users understand the variety of content?
Goal 2: What are users' taxonomy expectations, needs, and content requirements?
Goal 3: How do users use the tool's design options to navigate and retrieve information?
OUR PLAN FOR SUCCESS (RESEARCH STRATEGY)
The team had mid-fidelity wireframes they were hoping to learn more about. To understand how our users would interact with the tool and move through their user flows, a usability evaluation was the best approach.
A usability evaluation would validate the team's designs by highlighting areas that fulfilled user needs and flagging any experiences, content, or user flows that might be difficult for users.
OUR MAIN CHALLENGE
Gaps in user research. As we began preparing for the evaluation, I realized the team was unaware of the gaps in their user research. They had no prior experience with the UX process and expected the usability evaluation to produce a checklist of issues that, once resolved, would yield a working product.
Short timeline with high stakes. Users found the system unintuitive and surfaced a large number of usability issues, which left the team concerned about their capacity to act on the findings. I reframed the insights as intermediate milestones, a structure that respected the team's resources and allowed me to encourage them to expand their user research.
HOW WE EXECUTED OUR STRATEGY (RESEARCH PROCESS)
For this usability evaluation, I structured the facilitated guide as a set of task-based scenarios. This method gave users some guidance on using the tool while leaving room for organic exploration.
Task-Based Scenarios
The scenarios and tasks mapped directly to the defined goals. Users completed a set of tasks framed within realistic scenarios. Every task was written in plain language and had a clear end state, so participants could confidently state when they had completed it. The resulting sessions allowed the team to observe how users experienced the content and interacted with the tool. A sample scenario appears below.
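As an illustration (this is a hypothetical example, not a verbatim task from the study), a scenario might read: "Your department is evaluating a video-conferencing product. Using the portfolio, determine whether the product is currently available, accessible, or discontinued, and find where it sits on its roadmap." Framing the task as a question with a findable answer gives the participant a concrete end state to declare complete.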
Testing
9 expert users from the university's IT community; 40-minute, 1:1 sessions over 2 days.
Each session consisted of a participant briefing and an eye-tracker calibration, followed by the participant completing 4-6 task-based scenarios. The project team could view the participant's facial expressions, hear their comments, and observe their screen and eye-tracking data.
I created a note-taking template to provide a space for the project team to take collaborative notes. We recorded each session and provided viewing options through screen sharing for team members who were remote.
Analysis
After each session, I led the team in a short 20-minute analysis to consolidate the higher-level issues as they emerged. Using a rapid research template I created, we scored and prioritized each issue by its impact. Each issue was scored per participant, based on that participant's frustration, confusion, or difficulty in completing the task; a worked illustration follows.
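To make the scoring concrete (the numbers and scale here are hypothetical, not the template's exact rubric): suppose an issue caused one participant mild frustration (1), strong confusion (3), and enough difficulty that the task was abandoned (3), for a per-participant score of 7. Totaling such scores across the 9 participants gives each issue an overall impact score, and ranking issues by that total produces the prioritized list the team works from.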
Tools We Used
Tools - Google Suite, Whiteboarding, Tobii Eye Tracker, Usability Lab
WHAT WE LEARNED (DISCOVERIES)
Insight 1: Participants were unable to understand the roadmap design choices or digest the information presented to them.
Insight 2: Inconsistent design choices within the nested navigation created apprehension, confusion, and mistrust among users.
Insight 3: The complexity of the iconography within the use guidance key confused users.
These discoveries exposed underlying design and content issues. In a short whiteboarding session, I guided the team to consider immediate design and content changes as quick wins for their users. This change in approach would give users the consistent design experience they were seeking.
RECOMMENDATIONS
To ensure the team remained user-focused, I generated recommendations in the form of key questions and descriptions.
Key Question [Roadmap] - What is the purpose of displaying this information?
Ensure all design choices are intentional and consistent throughout the portfolio.
Key Question [Tabular Navigation] - Do our users understand the intent of the nested navigation?
Conduct an in-depth terminology review for each tab. Create an intentional navigation flow across the site to build trust with users.
Key Question [Use Guidance Key] - Why do we have so many terms in the key? Is this helping our users?
Reduce the number of categories and rethink the terms with the overall design intent in mind.
THE IMPACT WE HAD
Six-figure savings in university revenue
Transparency in product builds & RFPs
Enhanced student experience
The immediate feedback from users was that they felt heard. They also appreciated the consistency in content and design and found the experience intuitive to use.
The tool rolled out across 5 campuses and gave all IT departments a global system to track and view each other's roadmaps. This created transparency in product builds and RFPs. The trickle-down effect was a better student experience and a six-figure savings in revenue.