Abstract

From fall 2016 to fall 2018, small-scale UX studies were conducted regularly at Penn State University’s main library entrance. More than 200 participants each spent 5 to 10 minutes on these studies in exchange for a cup of coffee and a snack. Unlike the traditional usability testing setup, this pop-up stand, called the UX Café, aimed to establish a cost-effective, agile, and sustainable UX study venue in the library. This article presents the idea and rationale behind the UX Café, details the process, and discusses the results and impact.

Introduction

Thirty years ago, Jakob Nielsen launched the “discount usability movement” when he presented “Usability Engineering at a Discount” at the 3rd International Conference on Human-Computer Interaction in Boston (Nielsen, 2009a). He challenged the early Human-Computer Interaction study norm—30 to 50 test subjects (Barnum, 2010)—by stating that a usability testing result would be good enough with five participants thinking aloud while performing simple tasks on prototypes. In his 1994 article, “Guerrilla HCI: Using Discount Usability Engineering to Penetrate the Intimidation Barrier,” Nielsen pointed out that usability engineering methods were rarely used mainly because of their perceived cost. By simplifying the early usability engineering methods, he aimed to lower the barriers of the conventional approach and further democratize the practice of user testing.

However, Chow, Bridges, and Commander (2014) surveyed 1,469 academic and public library websites and found that more than 70 percent of the polled library respondents did not conduct any web usability testing. The result is surprising given the plethora of articles documenting the success of usability testing within libraries using Nielsen’s protocol. Why haven’t libraries embraced the practice of regular UX studies? Regardless of positive results, many of us might find the case studies in the literature costly and time-consuming, deterring us from jumping into action. The pared-down usability study process did lower the cost barriers somewhat, but in reality, the budgetary and human resources needed for “discount usability” may still be costly for many libraries. The investment lies not only in the actual UX study but also in the planning of the study. The study itself may not push the budget limit, but the process of planning and execution can still be labor-intensive, especially if a comprehensive scope of study is expected or group consensus is required in web governance. As a result, the perceived cost and lengthy process may discourage libraries from making UX studies a regular practice, or from conducting studies at all.

The UX Café, a lightweight UX study framework, was an alternative approach to higher-cost, full-blown usability studies. A “pop-up” stand set up in the Penn State University (PSU) Libraries main lobby, the UX Café offered participants a cup of coffee and a granola bar in exchange for 5-10 minutes of their time spent on usability tasks, short interviews, design feedback, and other UX research. The goal of the UX Café was to develop a cost-effective, agile, and sustainable UX study framework that could be incorporated into the decision-making process. This 2-hour “UX event,” held recurrently on the same day, at the same time, and in the same spot in the library, attracted passersby to participate. By making the event regular and available to students in an open space, the overhead of recruiting participants and scheduling sessions was minimized. Apart from developing the tasks for participants, once the facility was secured and the laptop was properly prepped, all we needed was fresh coffee and snacks! In addition to the benefit of conducting tests cost-effectively and providing recommendations to resolve issues quickly, the setup allowed me to communicate the importance of UX in our services and raise awareness of UX throughout the organization. In this article, I will discuss the need for a sustainable UX framework and demonstrate how I set up the UX Café as the first step towards creating a user-centered environment in the library.

The Need for a Sustainable UX Study Framework

Principles of Creating User-Centered Design

“Test early, test often” is the rule of thumb for conducting web UX studies. Gould and Lewis (1985) proposed three principles of user-centered design: early focus on users and tasks, empirical measurement, and iterative design. Involving users early allows us to identify and resolve issues while structural changes are still relatively easy to make, preventing further investment in a potentially failed design. In Nielsen’s (1993) words, “because even the best usability experts cannot design perfect user interfaces in a single attempt, interface designers should build a usability engineering life cycle around the concept of iteration” (p. 32). From Carroll and Rosson’s (1984) proposal of using usability specifications as a tool in the iterative development process to Nielsen’s (1992a) usability engineering model to the more recent efforts on integrating UX studies in the agile development cycle (Kane, 2003; Sy, 2007; Fox, Sillito, & Maurer, 2008; Budwig, Jeong, & Kelkar, 2009), incorporating iterative UX studies into a development process has been one of the major steps in achieving user-centered design. As UX practitioners, we need to recognize that in addition to conducting testing and analyzing results, we are also responsible for establishing a UX study workflow to allow us to test early and test often.

Iterative UX Studies in Libraries

While much research in libraries focuses on usability testing, detailing the tasks and results, there is little discussion around building a sustainable UX framework (Gallant & Wright, 2014; Friberg, 2017). Several articles describe an iterative approach of “test, design, test, and revise” during website redesigns (Benjes & Brown, 2000; McMullen, 2001; Cobus, Dent, & Ondrusek, 2005; George, 2005; Becker & Yannotta, 2013), but it is unclear if an iterative process continued after the new designs were launched. A survey by Chen, Germain, and Yang (2009) found that the majority of Association of Research Libraries member libraries (64 out of 71) did not conduct iterative usability testing at all design stages—predesign, during design, and post-design—and concluded that “iterative testing was minimally conducted” in the library. Gallant and Wright (2014) reasoned that because of the abundant information and content types libraries provide and the diverse group of users libraries serve, user experience testing should be repeated regularly to ensure the website addresses these challenges in a timely manner.

Abundant information and diverse users are not unique to libraries; Amazon faces very similar challenges. Unlike e-commerce sites, however, we cannot translate website traffic into financial rewards. Yet a library is partly defined by the usage of its services and resources, which means creating and maintaining a user-centered environment on all levels at all times should be one of the core missions of a library.

Barriers of UX Studies

Nielsen is a strong advocate for discount usability, stating that the misperception of UX studies—costly, time-consuming, and entailing daunting techniques—prevents UX studies from being as prevalent as they should be (Nielsen, 1993, 1994). He also pointed to a lack of perceived need for UX and a lack of awareness of appropriate techniques as underlying barriers. Gould, Boies, and Lewis (1991) offered a more complex organizational and technical explanation for the barriers, arguing that usability was viewed as not measurable, not part of the goals, and a potential cause of delay to development work. Van Kuijk, Daalhuizen, and Christiaans (2019) distilled three drivers that influence usability during product development based on case studies: prioritization, capability for user-centered design, and design freedom.

The reasons libraries have not embraced UX studies are complex and often depend on the structure of the organization. Considering the barriers mentioned above, efforts are needed in two areas to advance user-centered design within organizations. First, the management needs to recognize and support the value of UX work. Second, the designated UX person or team needs to equip themselves with knowledge and skills and be able to conduct studies cost-effectively. There is definite merit in the comprehensive approach of some UX methods, but a low-cost, efficient approach can better position UX for a long-term commitment from the management. In the following section, I will illustrate how the UX Café can help reduce barriers and bring a UX mindset into the culture of the organization.

Case Study: UX Café

Backstory

In fall 2015, I was hired as the UX Librarian at Penn State University. While I was eager to gather user feedback and work with the web team to create a user-friendly website, I had concerns about planning usability testing based on my previous experience, which was similar to many case studies in the literature—applying for Institutional Review Board approval, obtaining funds for incentives, securing hardware/software, reserving spaces, designing tasks, recruiting participants, and scheduling tests. It was a slow and time-consuming process.

I found the recruitment for and scheduling of the testing to be most burdensome. Tidal (2017) used multiple channels in his efforts to recruit 90 students for mobile site testing: social media, in-class announcements, mass emails, and $5 Amazon gift certificates. In the end, 20 students responded. Another problem Tidal noted was that students would either cancel at the last minute or not show up after the study time was arranged. My approach was simpler. I set up a table by the library entrance to encourage students to sign up for the study with a $10 gift card as incentive. Later I contacted each student individually to schedule a testing time. Several of the web committee members planned to participate in conducting the usability testing. With two librarians present at each test (one moderator and one notetaker), it was not easy to find a schedule that worked for everybody. I spent more time scheduling than conducting the tests.

Another time-consuming aspect of the process was the committee consensus culture. Dethloff and German (2013) described a prolonged process in the early phase of site redesign at the University of Houston:

Committee members had different interpretations of the initial questions and different ideas about the overall direction for the test. Many hours were spent discussing individual questions, the intent behind them, and how best to include them in a usability test. In the end, developing a script that all parties found acceptable took over four months.

Though my experience fared better, prolonged group discussions without any action can be disheartening and obstructive to future testing. We all know the value of the iterative design process in meeting our users’ needs, yet we miss the point of an iterative process when every change is treated as monumental and consensus is needed to proceed. It is natural that we want our voices to be heard. Even the WebX team at Duke University Libraries, which has accomplished a lot with their “flash testing,” shares the same challenges every committee has: loss of connection between recommendations and implementation, ineffective discussion among ill-prepared members, and actions taken or withheld due to personal agendas (Wilkinson, 2015). But, unlike the difficulties in recruiting and scheduling, these organizational inefficiencies can be mitigated or even prevented. What better way to settle a disagreement in the committee than doing the actual testing? The more time we spend talking among ourselves, the less time we invest in getting to know our users. It was this discontent with the traditional process that urged me to come up with a plan for consistently conducting UX studies with students without too much organizational overhead.

Idea

The idea of UX Café popped into my head as I was drifting off to sleep— “why don’t I just set up a stand in the library to get students to do some usability testing right there for some coffee and snacks?” In contrast to a big project-based UX study, I envisioned a UX Café held weekly in an open space in the library to attract passersby to participate. Apart from designing the study, once I secured the space and prepped my laptop, all I needed was fresh coffee and snacks. Best of all, no recruitment or scheduling was necessary; anyone willing to spend 5-10 minutes on testing could participate.

Similar to guerilla testing, low cost and agility are the advantages of the UX Café. Guerilla testing often involves going out in the field, such as to a coffee shop or train station, and asking people what they think of your design work (Allen & Chudley, 2012). The goal is the same as Nielsen’s discount usability—doing something is better than doing nothing, but at an even lower cost of time and money. It is not uncommon to gather large amounts of useful information for improvements within a very short period of time by just asking around. The downsides of guerilla testing are the limit on how much you can test in each session, the chance of failing to find your target audience, and the lack of a recording to review and present to your colleagues afterwards (Allen & Chudley, 2012).

Libraries have used guerilla testing with good results in the cases of the University of Houston’s “spot checks” (Dethloff and German, 2013), Duke University’s “flash usability testing” (Wilkinson, 2015), and the weekly testing at Wayne State University (Nuccilli, Polak, & Binno, 2018). However, the rationale behind the UX Café is less about guerilla style and more about building a structured process that can be deployed easily for user studies when needed. My intention was to find a way to minimize the overhead without compromising the testing process itself. Unlike in guerilla testing, my equipment was set up in the same spot to record the computer screen and the participant’s voice for review after the session. The recording allowed me to focus on interacting with participants and not worry about taking notes. In addition, having the UX Café in a visible location on a regular basis garnered some attention from library employees, which I hoped would lead to more conversations about UX in the Libraries.

Planning

First, I needed funding to try out my idea. A yearly fund is set aside at the PSU Libraries for library-related research, as a way to encourage library faculty to pilot research that doesn’t already have a specific budget. In spring 2016, my UX Café proposal was granted $704 from the Library Faculty Research Grant, based on my calculation for a weekly UX Café over two semesters (see Table 1).

Table 1. Proposed budget for the first-year UX Café.
Item | Price | Total (×36 sessions)
“Joe-to-go” from Starbucks in the library (serves 12) | $14 per box | $504
Granola bars from Amazon | About $6 per 12 packages | $200 (additional funds sought in case of price fluctuation)
Total | | $704
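The arithmetic behind the proposal is simple enough to check directly. A minimal sketch (the per-session cost figures come from Table 1; the variable names are illustrative):

```python
# Budget check for 36 weekly UX Café sessions (figures from Table 1).
SESSIONS = 36

COFFEE_PER_SESSION = 14                        # one "Joe-to-go" box per session
coffee_total = SESSIONS * COFFEE_PER_SESSION   # 36 x $14 = $504

SNACK_BUDGET = 200                             # granola bars, padded for price fluctuation

total = coffee_total + SNACK_BUDGET
print(coffee_total, total)                     # 504 704
```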

With permission from the administration to set up a pop-up coffee stand in the library, I came up with three possible locations and worked with the facilities office to pin down a high-traffic spot that complied with the building’s safety code. I also commissioned our in-house Public Relations and Marketing department to create a nice big poster with the UX Café logo. The poster highlighted the incentive, “free coffee and snacks,” and what we hoped for in return, “for 10 minutes with the user experience librarian.” I wanted students to be intrigued by the design of the poster and the free coffee. To maximize the chances of recruitment, I analyzed the gate count statistics and noticed that the highest traffic flow happened on Wednesdays, with two peaks around 10 a.m. and 11 p.m. I decided the UX Café’s business hours would be Wednesdays, 9:30 a.m. to 11:30 a.m. My goal was to recruit at least five participants during the two hours (Nielsen, 2000).
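The gate-count analysis amounts to aggregating hourly entrance counts by weekday and hour and ranking the slots by traffic. A sketch of that kind of aggregation, using made-up numbers rather than the Libraries' actual gate statistics:

```python
from collections import defaultdict

# Hypothetical hourly gate counts: (weekday, hour, people entering).
# These numbers are illustrative, not real PSU Libraries data.
gate_counts = [
    ("Tue", 10, 180), ("Wed", 9, 210), ("Wed", 10, 340),
    ("Wed", 11, 325), ("Wed", 14, 150), ("Thu", 10, 200),
]

# Sum counts per (weekday, hour) slot in case of multiple readings.
totals = defaultdict(int)
for day, hour, count in gate_counts:
    totals[(day, hour)] += count

# Rank slots by traffic to pick the UX Café's business hours.
busiest = sorted(totals, key=totals.get, reverse=True)
print(busiest[:2])   # the two highest-traffic slots
```

With real gate data spanning a semester, the same ranking would reveal both daily and hourly peaks.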

The planning of an individual UX study usually started with my weekly Friday meetings with the Manager of Discovery Access and Web Services (hereinafter “web manager”). We touched base on the web development team’s projects in need of UX studies. Through the discussion, we worked to narrow a broad UX question into a more focused inquiry that allowed me to design the tasks to test with users at the UX Café. After the meeting, I would start planning the tasks and needed materials, for example, mockups or a paper-based questionnaire. I would also test the tasks with coworkers to fine-tune the process before the real testing on Wednesdays. The design and the practice run were usually completed by Tuesday. The original plan was to finish analyzing and writing up reports soon after the UX Café and to present my findings at the Friday meeting with the web manager. In reality, however, I was not always able to wrap up a study in such a short time frame, which I will discuss further in the Execution section.

Task Design

Is five to ten minutes enough?

Many usability tests in the library literature were designed for participants to complete around ten tasks within one allotted hour. Augustine and Greene (2002) asked students to complete twenty tasks in one hour with a limit of 3 minutes per task. How much data can we gather in 5 to 10 minutes? In his article “Powers of 10: Time Scales in User Experience,” Nielsen (2009b) stated that users tend to abandon a site if it requires more than 1 minute to complete a simple task. Now, ten years later, it is reasonable to assume that users are accustomed to faster connections and have become even less patient. My plan was to have students do one or two scenario-based tasks and follow up with a brief interview. I believed the five-to-ten-minute timeframe fit well with a user’s actual experience and expectations.

The setting aimed to study components of the library website one at a time. From the web design perspective, small and iterative changes of a website prove to be more effective and efficient than a complete overhaul. Rosenfeld (2012) demonstrates that tackling the “short head” (stuff that most people care about, in contrast to the “long tail”) can accomplish far more than a do-over. In the traditional hour-long usability testing described in the literature, the tasks were designed to cover a variety of library resources and services in order to catch as many pain points for users as possible at one time. The small but steady approach of UX Café shifted the goal from a single, major overhaul to regular small fixes.

Think-aloud or not?

Think-aloud protocol is a major component of Nielsen’s discount usability. The protocol, widely followed by the HCI community, asks usability testing participants to talk through the process while performing the tasks. Nielsen gives two reasons for promoting this method: first, it is not reserved for the “psychologists or user interface experts” of well-funded studies and can be used outside the laboratory by non-specialists; second, it is a “quick-and-dirty” technique for getting feedback from users (Nielsen, 1992b, 1993). Even so, I took a different approach at the UX Café: participants were not required to think aloud.

The validity of think-aloud is debatable. Nielsen (Janni Nielsen, not Jakob Nielsen), Clemmensen, and Yssing (2002) argue that human minds work faster and with more complexity than can be verbalized. In addition, think-aloud imposes more cognitive load on users and reduces their focus on the task because participants are constantly attempting to transform thoughts into words. Studies show that the execution of think-aloud methods has an impact on participants and test results. For example, a participant’s mind or behavior can be influenced by the timing of verbalization (concurrent vs. retrospective) (Van Den Haak, De Jong, & Schellens, 2003) or by facilitators using different prompts (classic, “keep talking,” vs. relaxed, “Mary, could you tell us what you are thinking now?”) (Hertzum, Hansen, & Andersen, 2009). Interestingly, studies of the same aspect have generated different conclusions. But one thing that is generally agreed upon is that think-aloud increases task completion time compared to tasks completed in silence. I found a few minutes of conversation with participants after the fact more effective than requesting that they think aloud.

The technology available for user research has become much more sophisticated. In the early days of discount usability, the think-aloud technique helped us record a participant’s thoughts during usability testing without the support of advanced technology. Now, thanks to an easily installed recording application on my laptop, I could review cursor movement, browsing paths, and the amount of time participants spent on each page. In other words, I could still see their struggles and confusion during the task without them talking it through. Van Den Haak et al. point out that there is “very limited contribution” when participants verbalize their thoughts while performing tasks because the verbalization “served predominantly to emphasize or explain the problems that could also be observed in the participants’ actions” (p. 349).

Lastly, I wanted to create a situation that allowed participants to feel less like they were working on a test and more like they were showing me how they navigate the web. Some students tackled the tasks by rapid skimming and clicking. It would have been odd to ask them to explain the reasoning, of which they may not have been aware, behind each action. When discussing think-aloud with graduate students in an Informatics course, Nielsen and her colleagues (2002) learned that students felt judged during usability testing and had negative feelings toward think-aloud because it “interferes with their interaction with the interfaces and the task.” In addition, think-aloud adds strain for international student participants, who carry the extra cognitive load of processing the tasks in a foreign language.

What kind of task?

In the literature about library usability testing, we have seen many tasks of this type: “Can you find [a name of a book/journal/database/...]?” Vaughn and Callicott (2003) point out that these tasks reflect librarians’ ideas of a proper information-seeking process more than students’. They assume that students come to the library website with a specific [book/journal/database/...] in mind and know the different types of resources that the library provides. When explaining the principle of empirical measurement in product design, Gould and Lewis (1985) clarify that “building a prototype to study it experimentally is different from building a prototype to study how people will use and react to it.” The former is an “analytic question;” the latter is an “empirical question.” When designing tasks for usability testing, we need to be careful not to use the tasks to get users to “agree to” our design, but rather to “create a situation in which users can instill their knowledge and concern” into the design.

I typically started each test with a scenario that the participants could relate to and had them begin with Google. For example, “Think of a course assignment that is due soon. How would you find the resources you need in the library to finish the assignment?” Or, “Pretend you and your friend are planning to attend the Poetry Slam event at the library. You are not sure of the date and time and where in the library it will take place. How would you find the information online?” The former task prompted us to consider the bento box approach for the discovery layer. The latter task helped us put the importance of SEO (search engine optimization) in perspective. In these two examples, I wanted to learn not just the usability of our website or a feature, but also how users approach a realistic problem by providing the context for the task. Depending on the goal, I would have tasks designed to locate a designated item in order to test a specific feature. For example, I would need an item that I was certain was checked out to test if and how students would use the interlibrary loan service.

Approaching a problem from a student’s perspective is crucial in designing tasks. We need to keep in mind that, for most students, the library takes up a very small slice of their busy college lives. They stop by UX Café because they want coffee and snacks, not because they are familiar with library resources and love to give us feedback. Students want to pass the test and not sound stupid. Vaughn and Callicott (2003) comment that in their study results, participants “rarely expressed negative comments about the libraries’ web site” and they were “vague and evasive in their responses.” A colleague involved in space planning once suggested that I ask students if they like books. No student would tell me they do not like books even though they might never check a book out from the library. We need to learn to expect generic answers to a certain type of question and to reframe the question to get the meaningful results that we can act on. In addition, I would always pretest with coworkers or student workers, and finalize the tasks based on their feedback. I wanted to make sure I used the short time with the participants fully and wisely.

Execution

The UX Café was launched on August 24, 2016. Across the 28 UX Cafés held, a total of 204 participants took part. Recruitment ranged from 3 to 12 participants per session, with an average of 7.3, exceeding my goal of 5 users. The same tasks would be conducted again the following week if more data was needed. The $704 research grant was able to fund all the UX studies from fall 2016 to spring 2018. Once the grant was expended, the funding for the UX Café was allocated by the Assessment department in the Libraries.

The poster design proved to be eye-catching and attracted students. Once students understood the purpose of the UX Café, almost all who stopped by were willing to participate. Participants were invited to sit behind a portable trifold panel with a laptop in front of them. Because the café was set up in an open space in the library, the trifold panel served as a small booth to block out the surroundings, which helped them concentrate on the tasks and gave them some privacy during the testing. Before testing, participants signed in (participants’ names were required by the Libraries’ Business Office for reimbursement), completed a brief survey (see Appendix), and gave their permission for voice and screen recording. These were all handled via one Google Form.

It was my hope that a natural, spontaneous interaction and interesting tasks would help put participants at ease and allow me to engage them in conversation. I tried not to read scripts during the testing, though I did have printouts on hand. Usually there was a follow-up exercise after the initial task. It could have been a short interview, a questionnaire, a request to complete the same task on a peer institution’s website, or a choice of a preferred mockup, depending on the goal of the testing. I also facilitated exercises like card sorting and Microsoft reaction cards (Moran, 2016) at the UX Café. Occasionally, students would stay for more discussion after they finished the test. They were curious about the goal of the study, what changes could be made, and whether their responses differed from others’. Some mentioned that they wished other campus units would do similar studies to fix their web pages. Several participants even stopped by on more than one occasion to participate. One student patronized the UX Café five times. It was not that he was an expert user, as more often than not he missed the links being tested. Nor was he in it for the food, since he sometimes politely declined the coffee and snacks. In his words, “I like the studies. They are very interesting.”

As a UX team of one at the Libraries, I was responsible for the entire process, from ordering coffee to writing up reports. The procedure for setting up the pop-up café was efficient and straightforward. Nonetheless, it would have been helpful to have a second person assisting who could attend to waiting participants while I was conducting a study. The time spent analyzing results varied depending on the number of participants and the tasks. Originally, I conceived of the UX Café as a weekly event, with each report coming out soon after the study. However, sorting through the voice/screen recordings to distill key findings in writing proved to be more time-consuming than expected. Moreover, testing at the UX Café was only one aspect of my responsibilities. The plan of a weekly routine turned out to be too ambitious. It was more practical for me to plan a study every other week or once a month to allow for analysis and report building. Also, due to the significant backlog of the web development team, a coordinated approach to align testing efforts with implementation would have been more constructive.

In their 2013 book Measuring the User Experience, Tullis and Albert identify ten types of usability study commonly measured in business settings (see Table 2). To identify the range of UX studies the UX Café can offer in comparison to industry practice, I classified the UX Café studies using these ten types. I was able to map the studies to all the categories except “evaluating the impact of subtle changes,” which is best measured with A/B testing. Though not all the metrics that Tullis and Albert recommend in the book (learnability, for example) can be measured at the UX Café, I could still gauge UX by measuring task success and task completion time and by obtaining self-reported data from the questionnaires and brief interviews. The mapping indicates that the UX Café can accommodate a variety of commonly performed UX studies.
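Because a single UX Café session yields only a handful of participants, a task-success rate is most informative when reported with a confidence interval; for small samples, Tullis and Albert recommend the adjusted-Wald interval. A sketch with hypothetical counts (the function and numbers are illustrative, not from the UX Café data):

```python
import math

def adjusted_wald(successes, n, z=1.96):
    """Adjusted-Wald (Agresti-Coull) 95% CI for a success proportion."""
    n_adj = n + z**2                       # inflate n by z^2
    p_adj = (successes + z**2 / 2) / n_adj # shift the point estimate
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical session: 5 of 7 participants completed the task.
low, high = adjusted_wald(5, 7)
print(f"success rate 5/7, 95% CI: {low:.2f}-{high:.2f}")
```

The wide interval for 7 participants is a useful reminder of how tentatively single-session results should be read.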

Other formats of testing can also make use of the UX Café-style setting. Jenn Nolte, the UX Research Librarian at Yale University, took inspiration from the UX Café and set up a similar booth in the library’s lobby for remote usability testing, with $10 Amazon gift cards as incentives (J. Nolte, personal communication, 2018). Though the UX Café can be tailored to fit different projects, it is limited by the timeframe (5 to 10 minutes) and the pool of test subjects (mainly undergraduate students, and only people who visit the library). Alternatively, the UX Café can serve as a preliminary study or an additional data point for more comprehensive studies.

Table 2. Mapping UX studies at the UX Café with UX study types.
Type of UX study UX Café study Methods (measurements)
Completing a transaction Locating a library resource or service, e.g., requesting a book
  • Usability tasks (task success, completion time)
  • Brief interview
Comparing products Benchmarking with peer institutions for discovery interface
  • Usability tasks (task success, completion time)
  • Questionnaire (ranking)
  • Brief interview
Evaluating frequent use of the same product Hours page; room reservation page
  • Usability tasks (task success, completion time)
  • Brief interview
Evaluating navigation and/or information architecture Top navigation menu
  • Card sorting (hierarchical clustering)
  • Brief interview
Increasing awareness News and Events page
  • Usability tasks (task success, completion time)
  • Brief interview
Problem discovery Looking up course reserves
  • Usability tasks (task success, completion time)
  • Brief interview
Maximizing usability for a critical product Navigating the discovery layer
  • Usability tasks (task success, completion time)
  • Brief interview
Comparing alternative designs Comparing interface mockups
  • Usability tasks (task success, completion time)
  • Questionnaire (ranking)
  • Brief interview
Creating an overall positive user experience Bento box design for discovery layer
  • Usability tasks (task success, completion time)
  • Questionnaire (ranking)
  • Brief interview
Evaluating the impact of subtle changes Not yet tested at UX Café, can be done with A/B testing
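For that one untested category, a minimal sketch of how an A/B comparison could be evaluated is shown below. The click-through counts are hypothetical, and the two-proportion z-test is one common choice for this kind of comparison, not a method prescribed by the UX Café.

```python
import math

# Hypothetical A/B data: clicks on a homepage link under two design variants
a_clicks, a_views = 48, 400   # variant A
b_clicks, b_views = 74, 400   # variant B

p_a, p_b = a_clicks / a_views, b_clicks / b_views

# Pooled proportion and standard error for a two-proportion z-test
p_pool = (a_clicks + b_clicks) / (a_views + b_views)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / a_views + 1 / b_views))
z = (p_b - p_a) / se

# |z| > 1.96 suggests a significant difference at the 0.05 level
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
```

In practice, A/B testing also requires enough traffic to reach significance, which is one reason it sits outside the UX Café's 5-to-10-minute, in-person format.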

Impact

Integrate UX into Workflow

When I arrived at Penn State University, the library was undergoing a full website migration from the proprietary Adobe CQ platform to Drupal, slated for completion in fall 2016. Planning had started in January 2015, overseen by the Web Implementation Management Team, which consisted of 10 members (11 after I joined). The team met weekly to discuss content, information architecture, and usability testing. Ultimately, one usability testing session for the homepage and one card sorting exercise for library services were conducted before the migration was completed. The limited UX work, though not ideal, did not pose a major concern, since the plan was to continue UX studies for iterative design after the new site was up and running.

The aforementioned team disbanded once its charge, the site migration, was complete. The library then adopted Scrum, an agile method, for development work. All web services and applications had (or were to have) a designated "product owner" who worked with stakeholders and was responsible for decision-making and the success of the "product" (Rubin, 2012). The product owner worked with the web manager and the web development team to frame the project timeline, and with me to conduct UX studies. The UX Café fit nicely into this new structure of web management.

One benefit of frequent, short UX studies is that they allowed me to tailor each study to the project timeline. As mentioned, I met with the web manager weekly to discuss testing results and identify the scope of the upcoming study. The scope could be a new inquiry or a modification of a previous study based on its findings. The product owner or developer could observe or administer the testing with my assistance if they were interested. After each study, both the web manager and the product owner were informed of the results and received a report outlining the findings and proposing recommendations. I kept the report succinct and in bullet points; the goal was to connect testing results to recommendations for implementation. Since the changes were meant to be iterative, there was no need for a comprehensive description of the process. In addition, the report was posted on the intranet, accessible to all library employees.

Daniel Pshock, the user experience and web content strategy coordinator at University of Houston Libraries, commented on how their User Café, based on UX Café, fit in the iterative process of their website (D. Pshock, personal communication, 2018):

Having regular small-scale usability studies has led to more frequent iterations on our website because we have a better understanding of how people respond to our design choices. It also raises our staff’s awareness of UX design—they see me out in the lobby with coffee regularly and they see the results of my studies on my blog. It’s helping my coworkers understand that our design choices are not arbitrary because we’re basing things off of what we’ve found in these small, frequent studies.

Raising Awareness of UX within the Organization

Similar to what happened at the University of Houston Libraries, the visibility of the UX Café transformed how UX was perceived at PSU Libraries. The frequency and the tasks of the UX Café demonstrated that usability testing does not have to be a big production; it can be done at a smaller scale. Over the years of running the UX Café, most of my colleagues had seen me talking to students in the library lobby. They sometimes stopped by, curious about the day's testing. After explaining the goal of the study, I would always invite them to try the test and give me their feedback. This gave me an opportunity to show my colleagues the different perspectives users may have on the library's online presence, how the library can benefit from these studies, and how the role of UX librarian fits within the services the library offers. UX is no longer just a trendy new field; it is a practice they can relate to, because we are all users at some point, and our patrons' experiences should be the focus of our services.

Rebecca Blakiston, the user experience strategist at University of Arizona Libraries, wrote to me about the positive impact of their Tiny Café, also based on UX Café (R. Blakiston, personal communication, 2018):

While we have done many UX studies using intercept recruitment in the past, we hadn’t previously scheduled a regular, ongoing fixture for this work. By having a Keurig coffee maker, along with an official name, recruitment sign, and budget, it has strengthened our ongoing UX efforts across the library. Since we launched, we have used it for usability testing, user interviews, impression testing, and surveys. We publicize it externally through our website and internally through email. While we always have at least one member of our UX team staffing the café, we also have it available for other library employees to schedule and use. We have had other library employees use it for user interviews and usability testing, with our guidance and support. We have also taken Tiny Café on the road to our Health Sciences Library and faculty learning communities. This has helped us both gather iterative feedback on a variety of projects and foster a culture of user experience across the library.

Conclusion

At the beginning of this article, I mentioned that achieving user-centered design takes a two-pronged approach to overcome the barriers to UX. One, management has to value and support UX work. Two, the designated UX person (or team) has to be capable of applying appropriate techniques and carrying out UX work cost-effectively. These two efforts can in turn propel each other, creating a positive cycle that moves UX forward.

It does not take much to start the cycle. My proposal for a library research grant was accepted; I used the grant to set up the UX Café, which promoted UX within the library and gained support from management. The ongoing testing allowed me to refine my techniques and explore methods for different projects. Consequently, I became more attuned to designing appropriate tasks that yield good results. It is worth emphasizing that it takes more than a few usability tests to gain the necessary knowledge and techniques. From time to time, the results may not be as revealing as one might hope. But because the UX Café is low-cost and already structured, I have the opportunity to improve my approach and try again the next week. The consistent, small-scale format provides a great opportunity to hone the skills of a UX team, and the UX culture is in turn strengthened by meaningful, actionable study results.

Culture shift within an organization can be challenging, especially when the change is perceived as disruptive. By promoting UX one test at a time and communicating the outcomes, I laid the groundwork for building a user-centered mindset within the organization. To this end, the UX Café served as a successful first step. Not only was I able to conduct UX studies regularly and cost-effectively, I also saw the idea of UX starting to gain traction when the administration included UX studies in project planning and committee charges, and when colleagues requested UX Café testing in meetings or emailed me with UX study questions. The results from the studies at the UX Café helped shape the directions of the library's discovery layer, top menu navigation, and more. However, I am also aware that it takes time for the whole organization to fully embrace the ethos of UX, and there will always be projects moving forward without proper UX studies, particularly given the limited capacity of a UX team of one. As I continued to run the UX Café, I also started a UX Interest Group in the library in spring 2018 in the hope of garnering interest, cultivating UX capacity within the organization, and building that UX mindset. But that's another story.

Author’s Note

An open-source repository of presentations and materials used for the UX Café is available at https://github.com/zoechao/ux_cafe under a Creative Commons Attribution-NonCommercial (CC BY-NC 4.0) license (https://creativecommons.org/licenses/by-nc/4.0/).

Appendix: Sign-in form

First & Last Name ____________________

Email ____________________

Gender

  • ◯ Female
  • ◯ Male
  • ◯ Other

Major ____________________

Year

  • ◯ Freshman
  • ◯ Sophomore
  • ◯ Junior
  • ◯ Senior
  • ◯ Graduate student
  • ◯ Other: ____________________

Why are you in the library today?

  • ◯ To study, using a library computer
  • ◯ To study, using my own laptop
  • ◯ To print
  • ◯ To check out or return stuff
  • ◯ Other: ____________________

Where in the library do you plan to go today? ____________________

How often do you come to the library?

  • ◯ Every day, including weekends
  • ◯ 4-5 times a week
  • ◯ 1-3 times a week
  • ◯ A few times a month
  • ◯ A few times a semester
  • ◯ Never

How often do you use the library website?

  • ◯ Every day, including weekends
  • ◯ 4-5 times a week
  • ◯ 1-3 times a week
  • ◯ A few times a month
  • ◯ A few times a semester
  • ◯ Never

Would you be interested in participating in other studies in the future? (Some studies offer cash, up to $20.)

  • ◯ Yes, please contact me for other studies.
  • ◯ No, please don't contact me.

References

  • Allen, J., & Chudley, J. (2012). Smashing UX design: Foundations for designing online user experiences. Chichester: Wiley.
  • Augustine, S., & Greene, C. (2002). Discovering how students search a library Web site: A usability case study. College & Research Libraries, 63(4), 354–365. doi:10.5860/crl.63.4.354
  • Barnum, C. M. (2010). Usability testing essentials: Ready, set... test! Amsterdam: Elsevier.
  • Becker, D. A., & Yannotta, L. (2013). Modeling a library website redesign process: developing a user-centered website through usability testing. Information Technology and Libraries (Online), 32(1), 6–22. doi:10.6017/ital.v32i1.2311
  • Benjes, C., & Brown, J. F. (2000). Test, revise, retest: Usability testing and library Web sites. Internet Reference Services Quarterly, 5(4), 37–54. doi:10.1300/j136v05n04_08
  • Budwig, M., Jeong, S., & Kelkar, K. (2009). When user experience met agile: A case study. Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems – CHI EA 09, 3075-3084. ACM. doi:10.1145/1520340.1520434
  • Carroll, J. M., & Rosson, M. B. (1984). Usability specifications as a tool in iterative development (No. RC-10437). Yorktown Heights, NY: IBM Thomas J Watson Research Center.
  • Chen, Y. H., Germain, C. A., & Yang, H. (2009). An exploration into the practices of library Web usability in ARL academic libraries. Journal of the Association for Information Science and Technology, 60(5), 953–968. doi:10.1002/asi.21032
  • Chow, A. S., Bridges, M., & Commander, P. (2014). The website design and usability of U.S. academic and public libraries. Reference & User Services Quarterly, 53(3), 253–265. doi:10.5860/rusq.53n3.253
  • Cobus, L., Dent, V. F., & Ondrusek, A. (2005). How twenty-eight users helped redesign an academic library web site: A usability study. Reference & User Services Quarterly, 232–246.
  • Dethloff, N., & German, E. M. (2013). Successes and struggles with building web teams: a usability committee case study. New Library World, 114(5/6), 242–250. doi:10.1108/03074801311326867
  • Fox, D., Sillito, J., & Maurer, F. (2008). Agile methods and user-centered design: How these two methodologies are being successfully integrated in industry. Agile 2008 Conference, 63–72.
  • Friberg, A. (2017). Why continuous usability testing can and should be part of regular library activity—from a UX librarian’s point of view. Revy, 40(1), 9–11.
  • Gallant, J. W., & Wright, L. B. (2014). Planning for iteration-focused user experience testing in an academic library. Internet Reference Services Quarterly, 19(1), 49–64. doi:10.1080/10875301.2014.894954
  • George, C. A. (2005). Usability testing and design of a library website: An iterative approach. OCLC Systems & Services: International Digital Library Perspectives, 21(3), 167–180. doi:10.1108/10650750510612371
  • Gould, J. D., Boies, S. J., & Lewis, C. (1991). Making usable, useful, productivity-enhancing computer applications. Communications of the ACM, 34(1), 74–85. doi:10.1145/99977.99993
  • Gould, J. D., & Lewis, C. (1985). Designing for usability: Key principles and what designers think. Communications of the ACM, 28(3), 300–311. doi:10.1145/3166.3170
  • Hertzum, M., Hansen, K. D., & Andersen, H. H. (2009). Scrutinising usability evaluation: Does thinking aloud affect behaviour and mental workload? Behaviour & Information Technology, 28(2), 165–181. doi:10.1080/01449290701773842
  • Kane, D. (2003). Finding a place for discount usability engineering in agile development: Throwing down the gauntlet. Proceedings of the Agile Development Conference, 2003. ADC 2003, 40-46. doi:10.1109/adc.2003.1231451
  • van Kuijk, J., Daalhuizen, J., & Christiaans, H. (2019). Drivers of usability in product design practice: Induction of a framework through a case study of three product development projects. Design Studies, 60, 139–179. doi:10.1016/j.destud.2018.06.002
  • McMullen, S. (2001). Usability testing in a library web site redesign project. Reference Services Review, 29(1), 7–22.
  • Moran, K. (2016, February 28). Using the Microsoft Desirability Toolkit to test visual appeal. Retrieved from https://www.nngroup.com/articles/microsoft-desirability-toolkit/
  • Nielsen, J. (1992a). The usability engineering life cycle. Computer, 25(3), 12–22. doi:10.1109/2.121503
  • Nielsen, J. (1992b). Evaluating the thinking-aloud technique for use by computer scientists. In Advances in human-computer interaction (Vol. 3). Norwood, NJ: Ablex Publishing.
  • Nielsen, J. (1993). Iterative user-interface design. Computer, 26(11), 32–41. doi:10.1109/2.241424
  • Nielsen, J. (1994, January 1). Guerrilla HCI: Using discount usability engineering to penetrate the intimidation barrier. Retrieved from https://www.nngroup.com/articles/guerrilla-hci/
  • Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Retrieved from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
  • Nielsen, J. (2009a, September 14). Discount usability: 20 years. Retrieved from https://www.nngroup.com/articles/discount-usability-20-years/
  • Nielsen, J. (2009b, October 5). Powers of 10: Time scales in user experience. Retrieved from https://www.nngroup.com/articles/powers-of-10-time-scales-in-ux/
  • Nielsen, J., Clemmensen, T., & Yssing, C. (2002). Getting access to what goes on in people’s heads? Proceedings of the Second Nordic Conference on Human-Computer Interaction – NordiCHI 02.
  • Nuccilli, M., Polak, E., & Binno, A. (2018). Start with an hour a week: Enhancing usability at Wayne State University Libraries. Weave: Journal of Library User Experience, 1(8).
  • Rosenfeld, L. (2012, May 16). Stop redesigning and start tuning your site instead. Smashing Magazine. Retrieved from https://www.smashingmagazine.com/2012/05/stop-redesigning-start-tuning-your-site/
  • Rubin, K. S. (2012). Essential Scrum: A practical guide to the most popular Agile process. Upper Saddle River, NJ: Addison-Wesley.
  • Sy, D. (2007). Adapting usability investigations for agile user-centered design. Journal of Usability Studies, 2(3), 112–132.
  • Tidal, J. (2017). One site to rule them all, redux: The second round of usability testing of a responsively designed web site. Journal of Web Librarianship, 11(1), 16–34.
  • Tullis, T., & Albert, W. (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics. Amsterdam: Morgan Kaufmann.
  • Van Den Haak, M., De Jong, M., & Schellens, P.J. (2003). Retrospective vs. concurrent think-aloud protocols: Testing the usability of an online library catalogue. Behaviour & Information Technology, 22(5), 339–351. doi:10.1080/0044929031000
  • Vaughn, D., & Callicott, B. (2003). Broccoli librarianship and Google-bred patrons, or what's wrong with usability testing? College & Undergraduate Libraries, 10(2), 1–18. doi:10.1300/j106v10n02_01
  • Wilkinson, J. (2015). Low-hanging fruit and pain points: An analysis of change implementation from flash usability testing at Duke University Libraries (master’s thesis). University of North Carolina at Chapel Hill. Retrieved from https://cdr.lib.unc.edu/record/uuid:4b088833-deb0-4d07-a382-3b8d2d971487