A usability assessment.
Choose a website and conduct a usability assessment, including a heuristic analysis and both moderated and unmoderated usability tests. Write a report and prepare a presentation including recommendations for site improvements.
UserTesting.com, Zoom, Google Slides, Google Sheets, Adobe Illustrator
Evaluative research, study design, writing, interviewing, moderated usability testing, unmoderated usability testing, competitive analysis, heuristic analysis, secondary research
Design Research, Kacey Ballard
5 weeks, Spring 2020
This is my slide deck presenting my findings and recommendations.
PortlandOregon.gov is the official government website of the city of Portland. It serves as the primary resource for residents looking for a government office, a service, or information about the city. The website holds a great deal of relevant information and can likely answer a user's questions, but the way the content is organized can make things hard to find and turn a simple process into a complicated one.
The purpose of this project was to understand how to conduct evaluative research on an existing design's usability, synthesize findings, gather insights, and recommend improvements for the overall design.
I chose PortlandOregon.gov for this assignment because usability and accessibility are of particular importance on a government website, which needs to connect a diverse group of people with a huge range of essential services and programs. Because these websites are so essential, I wanted to see if this one was really doing everything it needed to do to serve the citizens of Portland.
After looking over the website for myself, I listed out some concerns that I had regarding the features and design that I saw.
1. An area of risk is that the added “shortcuts” actually overcomplicate things and confuse the user rather than helping them.
2. Another potential issue is that the information isn’t effectively categorized to allow users to quickly and easily find what they’re looking for.
3. The only form of help that users can access is a number to call or an email to write to, so an additional concern is that users would rather give up on their task than use these methods to get assistance.
To guide my project, I asked some questions based on the areas of concern I had noted, to determine whether they were actually as troublesome for users as I had inferred.
1. What is a user’s first step when they aren’t sure where to go?
2. Do the items included in the navigation bar make sense to the user?
3. Which information on the homepage is necessary to the user?
4. How much does the user rely on the search bar versus the shortcuts on the homepage?
5. Does the user notice the shortcuts?
Using my research questions, I formulated hypotheses about the results that I expected to see.
1. Users prefer the navigation bar tabs over the search bar or shortcut links when navigating from the homepage.
2. Users will only use the search bar to find information when every other option has failed.
3. Users will not discover or use the “Sort by Agency/Service” feature on the homepage.
4. After clicking on a tab, users prefer to use the links in the sidebar and will not notice the same information in the center of the page.
5. Users prefer to use shortcuts over navigation bar tabs once they have left the homepage.
6. Users will attempt to use the home button to navigate back to the homepage.
I teamed up with a classmate who was researching the SF.gov website to carry out a competitive analysis of the two sites. We sought to answer:
1. What is the product or service that they offer?
2. What are they doing well?
3. What could they be doing better?
4. What are people saying about them?
5. What can we learn from them?
6. What are their business objectives?
7. Who are the customers or users they want to serve?
8. What are the most important goals of their customers?
9. What tasks do users need to complete to accomplish their goals?
I learned that a wide range of services needed to be represented on the website. The city tried to offer several navigation options to get users to the information they needed; however, I wondered whether this ultimately contributed to the cluttered, busy appearance of the site. I realized that it is imperative that users be able to successfully navigate this site, since it contains so much vital information about living in Portland – information about utilities, owning a business, and more.
I also realized that there is a huge variety of people that this website needs to be accessible to. I learned that the city was in the process of updating documents to be understandable to people who may not speak English or do not have a high level of education. This prompted me to wonder if the website was navigable for people who are not familiar with technology or might not recognize all of the key words that would guide other users without issue.
Using Jakob Nielsen's 10 Usability Heuristics for User Interface Design, I assessed the current state of the PortlandOregon.gov site. I combed through the website, seeing how its design compared to the ideal standards.
1. Visibility of System Status: After navigating to the site of one of the city’s bureaus or offices, there is a line across the top of the screen showing the user a clickable chain of the pages they passed through to get to their current location.
2. Match Between System and Real World: When hovering over items in the navigation bar, there is no drop-down menu that appears to inform the user what can be found on these pages. The user doesn’t know what the site lists under “Services” or “Bureaus” until after they click on it and leave the home page.
3. User Control and Freedom: The line across the top of the screen is a clickable record of the pages the user has passed through, giving them an easy way to go back or undo an action.
4. Consistency and Standards: After navigating to the Government page from the homepage, there are multiple links on the page with the same name, making users wonder if they go to the same place or not.
On the official version of the city’s website (right), the home button takes the user to the Portland, Oregon homepage. On any of the versions used by one of the city’s bureaus (left), the same home button takes the user to the homepage of the bureau they’re viewing, not the Portland, Oregon homepage.
5. Error Prevention: When the user attempts to fill out a form to report a problem, the website first prompts them to sign in to their account or create one before giving them access to the form, rather than waiting for them to complete the report and then prompting them to make an account before submitting.
6. Recognition Rather Than Recall: Since the user can’t see the things that are located on each of the pages in the navigation menu, they just have to click on them and hope that they eventually find what they’re looking for. Since they don’t have a way to see their options before clicking, the next time they use the site, they just have to remember the path they took to complete their goal before.
7. Flexibility and Efficiency of Use: All pages on the website allow the user to scale the text size up or down according to their needs. Most of the pages of the city's bureaus allow the user to change the language the site is displayed in; however, the homepage and some of the bureaus' pages don't offer this.
8. Aesthetic and Minimalist Design: This is the homepage of the website. It’s very modular and there’s very little space between the different blocks of content on the page, making it feel tight and crowded. There are so many options that it could be overwhelming to the user, making them not know where to look first.
9. Help Users Recognize, Diagnose, and Recover from Errors: There aren't many error messages on the site, but one example is that there are forms and areas of the site that can only be accessed by users who have accounts. Instead of just denying users access with no explanation, the site prompts users to make an account or sign in to be granted access.
10. Help and Documentation: At the top right of each page, the contact information of that specific agency is listed. Other than a phone number to call or an email to write to if the user isn’t able to complete their desired task on their own, there isn’t really a way for users to get help while using the website.
The users of this site are residents of Portland, Oregon, and there are a number of services relating to their daily lives and needs that they may be attempting to access and request when they use the site. This set the standard for the participants who would be recruited to participate in this research. There were 3 main criteria:
1. The age of the participant should be 18 or over.
2. The participant should live in the United States.
3. The participant should speak English, since portions of the site are only offered in English.
I also wanted to understand each participant's habits and preferences relating to the various services that the website offers, so I asked:
1. If you are requesting a service, do you prefer to do this online, over the phone, through email, or in person?
(1) Online (Accept)
(2) Over the phone (Accept)
(3) Through Email (Accept)
(4) In-person (Accept)
(5) Other (Reject)
2. Have you ever used your city or county’s government website?
(1) Yes, many times (Accept)
(2) Yes, occasionally (Accept)
(3) Yes, once or twice (Accept)
(4) No, I’ve never used one (Accept)
(5) Other (Reject)
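The accept/reject logic above amounts to a simple filter. The sketch below is a hypothetical illustration of that logic – UserTesting.com applies these rules automatically, and the dictionary keys and function name are my own inventions, not part of the platform:

```python
# Hypothetical sketch of the screener logic described above.
# UserTesting.com handles this automatically; the keys and names
# here are illustrative only.

SCREENER_RULES = {
    "service_preference": {
        "Online", "Over the phone", "Through Email", "In-person",
    },
    "gov_site_usage": {
        "Yes, many times",
        "Yes, occasionally",
        "Yes, once or twice",
        "No, I've never used one",
    },
}

def passes_screener(answers: dict) -> bool:
    """A participant passes only if every answer is an accepted option."""
    return all(
        answers.get(question) in accepted
        for question, accepted in SCREENER_RULES.items()
    )

print(passes_screener({"service_preference": "Online",
                       "gov_site_usage": "Yes, occasionally"}))  # True
print(passes_screener({"service_preference": "Other",
                       "gov_site_usage": "Yes, occasionally"}))  # False
```

Note that every listed option is an "Accept" – the questions existed to capture participants' habits, while only the open-ended "Other" responses were screened out.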
I created a list of tasks that I would ask participants to attempt. I tried to cover several different parts of the website, since it offers so many different things. Some tasks were specifically made to test my hypotheses, and others were intended to reveal whether the site was still navigable past the surface level.
Next, I needed to define what successful completion of a task looked like, so I defined each possible outcome:
Succeed: The user is able to complete the task by navigating the site intuitively without confusion or needing to go back to try something else.
Succeed with Difficulty: The user is able to complete the task, but the process was not clear to them. They had to guess which options would get them there, and/or they needed to go back and pick a different option after making a mistake.
Fail: The user is not able to complete the task for any reason other than an “error” message from the website. This also includes when the user is able to find information, but did so by following a link that takes them to a website other than the official government website, such as the Portland tourism site.
Utilizing UserTesting.com for this portion of the project, I input my screener questions and created my test with the tasks I had listed earlier. The unmoderated tests were done by 3 participants. I viewed each video and took detailed notes. In addition to the tasks they completed, they were also asked to explain what stood out to them, what they liked, what they disliked, and what they would change.
My notes from the unmoderated tests.
Using the same screening criteria as for the unmoderated tests, I recruited another 3 participants. I had found that many young people – my peers at CCA – were not familiar with this type of government website and largely had no need for the services these sites offer. By looking for more experienced participants who more closely fit the description of this site's typical user, I was able to arrange 3 moderated testing sessions.
Because the unmoderated tests had gone faster than I anticipated, I added one additional task to the moderated tests to get a better understanding of another aspect of the site.
My notes from the moderated tests.
With 6 participants completing each task, I was able to gather enough data to create visualizations showing how many people succeeded, succeeded with difficulty, or failed each task. This information helped me assess and validate my earlier hypotheses.
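The tallying behind those visualizations is simple: count each outcome code per task. A minimal sketch, where the per-participant outcomes are made up for illustration and do not reflect my actual results:

```python
from collections import Counter

# Outcome codes follow the definitions above:
# S = succeed, D = succeed with difficulty, F = fail.
# These outcome lists are MADE UP for illustration only.
results = {
    "Task 1": ["S", "S", "D", "F", "D", "S"],
    "Task 2": ["S", "D", "F", "F", "S", "D"],
}

for task, outcomes in results.items():
    counts = Counter(outcomes)
    total = len(outcomes)
    print(f"{task}: {counts['S']}/{total} succeeded, "
          f"{counts['D']}/{total} succeeded with difficulty, "
          f"{counts['F']}/{total} failed")
```

The resulting counts map directly onto a stacked bar per task in the final charts.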
CAUSES OF PROBLEMS
To determine what improvements could be made to the website's design, I looked back at my notes from the testing sessions and tried to determine what caused the lack of success some participants had on certain tasks.
1. Function of Home Button: A problem that arose in the first test was that the user was unable to navigate back to the homepage from another page on the website. The issue is that once the user has left the homepage, the home button in the navigation bar does not return them to the homepage, but instead to the landing page of a specific bureau’s section of the website.
2. Irrelevant Search Results: On multiple occasions, testers used the search bar after being unable to complete their task any other way. However, instead of being a helpful tool, the search bar only created more difficulty by showing results that either took the user off of the government website or results that were completely irrelevant to their search.
3. Discoverability of Shortcuts: Most of the time, testers chose to use the navigation bar tabs over the shortcuts on the homepage. However, after navigating off of the homepage, on several occasions, testers noted that the shortcuts in the sidebar were convenient and opted to use those over the links in the center of the page.
Oftentimes on the homepage, users were able to quickly identify the broad category that would take them to the information they needed and didn't feel the need to spend time scrolling down and reading through the shortcuts.
4. Navigation Tracker: Another feature of the site that caused more problems than it solved is the navigation tracker that appears in a bar at the top of the screen once the user navigates off of the homepage and onto the page of one of the city’s offices or bureaus. It is intended to show the user their location on the site and the different pages they traveled through to get to where they are.
However, one tester misread it, which made her task considerably harder. She was on the landing page of the Bureau of Housing, but the tracker didn't show where she currently was – only the previous page she had been on.
I consolidated my findings into key recommendations for improving the site and resolving the problems users are currently facing.
1. Navigation Bar: Allowing users to see a dropdown menu when hovering over a tab would prevent the frustration of not knowing what is on a page before clicking and not seeing what they need.
2. "I want to..." Shortcuts: The shortcuts sit too low in the homepage's visual hierarchy, making them hard for users to discover, which in turn complicates and lengthens the process of finding information. The shortcut buttons should be larger, so that they don't cram a lot of information into a small space, and they should occupy a more prominent position on the homepage.
3. Sidebar Information: Eliminate the unnecessary information in the sidebar that overwhelms and confuses the user.
4. Search Results: Users received too many irrelevant search results, which complicated the process of finding information and added to their frustration with the site. The problem could be remedied by requiring that a link only appear in search results when it matches keywords that are specific and in a particular order, rather than common or connecting words that can appear in many different contexts.
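One way to approximate that stricter matching rule is to require every non-trivial query term to appear, in order, in a page's title or summary. This is a toy sketch of the idea, not how the site's search actually works – the stopword list and function are my own illustrative choices:

```python
# Toy sketch of ordered-keyword matching; NOT the site's actual search.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "for", "and"}

def matches(query: str, page_text: str) -> bool:
    """Require every non-stopword query term to appear, in order."""
    terms = [w for w in query.lower().split() if w not in STOPWORDS]
    text = page_text.lower()
    pos = 0
    for term in terms:
        pos = text.find(term, pos)
        if pos == -1:
            return False          # term missing or out of order
        pos += len(term)
    return True

pages = [
    "Report a pothole in your neighborhood",
    "History of the Portland rose gardens",
]
print([p for p in pages if matches("report a pothole", p)])
# ['Report a pothole in your neighborhood']
```

A rule like this filters out pages that merely share connecting words with the query, at the cost of missing pages that phrase things differently – a trade-off a real search engine would tune more carefully.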
5. Information Architecture: On multiple occasions, users expressed confusion that what they were shown was not what they expected when they clicked. Certain titles need to be rewritten to be more representative of what they link to, and information needs to be recategorized to be more discoverable to users.
6. Color and Aesthetics: Most testers noted that the gray color scheme on the first page gave a negative first impression of the site, which also reflected on their feelings toward the city itself. Participants also favored the pages with more pictures and color, so improvement could be made to the design of the homepage to feature the qualities that testers liked the most.
7. UI Design: Another aspect of the site design users took issue with was the crowding and volume of information, specifically on the homepage. Add more white space between elements, shorten the length of the page, or remove elements that aren't commonly accessed by users.
WHAT I LEARNED
This project was the largest-scale research I had done for an IxD project. It was also the first time I'd had to conduct design research with people I'd never met – before this, I'd only used surveys to collect information, so I had never had to sit down and directly interact with a stranger. All of the "interviews" I'd done before were with friends and classmates, with no clear moderator's guide to follow.
I was nervous to attempt this, but it gave me an appreciation for embracing the things that make me uncomfortable. If I had just interviewed classmates like normal, I wouldn't have learned the things that I ended up learning, since my classmates weren't an ideal match for the intended user of this site.
It was also a valuable experience to conduct this type of usability assessment on a product that I didn't design. I realized how the context of research can influence participants. The participants in the unmoderated tests gave answers showing they believed I was affiliated with the website – which I was not. This led them to give favorable reviews of the site's usability, even when they performed poorly on some tasks. The moderated test participants, however, knew that I wasn't one of the site's designers, and they were more willing to be critical.
This taught me that when I conduct research on my own projects, I need to be mindful of the fact that interviewees might try to say things just to please me, or they may hold back critique that would be useful to me. In these scenarios, I need to recognize this or anticipate it and ask questions that can get past it.
This is my report documenting my process and findings.