BeAM

Homepage of BeAM (“Be a Maker”), UNC at Chapel Hill’s makerspaces website.

Background

For a project in a course I took at UNC Chapel Hill, Usability Evaluation and Testing (INLS 719), three fellow students and I completed an evaluation of UNC’s makerspaces website, BeAM (“Be a Maker”). BeAM supports an enthusiastic community of hobbyists, and some professors incorporate the makerspaces into class projects. BeAM staff wanted to promote wider use of the makerspaces, but interviews with the full-time staff and their understanding of their users showed that some new users are frustrated by the website. With these frustrations in mind, we proposed a usability evaluation of the BeAM website to identify how it is and is not supporting community goals. We delivered the results of this evaluation to BeAM at the end of the course so they could take steps to implement our proposals.

Issues

We came up with a list of issues after a discussion with BeAM staff and a heuristic evaluation of core tasks the website supports:

  • Users have trouble finding resources for operating some of the equipment, such as formatting files correctly.
  • Makerspaces are at many different locations on campus and have different services and requirements for participating, which can be confusing.
  • Some makerspaces require people to attend an orientation first, but it isn’t very clear how attendance works.
  • Signing up for a makerspace class lacks a clear path or feedback on whether it was done successfully.

Goals

Our initial goals were developed after evaluating the key issues on the website:

  • Clarify requirements for attending makerspaces and all related information such as where to attend, what to bring, what training is required beforehand, etc.
  • Make it easy for users to search resources to learn about the equipment and how to use it themselves.
  • Users should be able to identify and contact the appropriate BeAM staff member(s) when they want more information on a specific workshop and/or tool.

Usability Testing

This usability study was primarily a formative assessment, while also incorporating exploratory methods to gain a better understanding of student experiences using the BeAM website. Participants answered pre-test questions gauging their current level of interest in and familiarity with BeAM. They were then asked to think aloud while completing five tasks, each followed by a short post-task questionnaire, and finally participated in a post-test semi-structured interview. The order of the tasks varied across participants to control for any learning effects that may have taken place: a balanced Latin square was used to develop five task orders, and since only four participants were interviewed, one order was randomly selected to drop. In addition to the qualitative think-aloud and interview data, we collected quantitative metrics focusing on task difficulty and success rates.
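A balanced Latin square like the one used for task ordering can be generated with the standard Williams-design construction. This is an illustrative sketch only; the task labels and the exact orders we used are not reproduced here:

```python
def balanced_latin_square(n):
    """Williams design: each condition appears once per position,
    and each condition immediately precedes every other condition
    equally often. Note: for odd n (like our 5 tasks), full
    first-order balance technically requires also including the
    mirror image of each row (2n orders total)."""
    square = []
    for i in range(n):
        row = []
        for j in range(n):
            # Alternate outward from the row's starting condition:
            # i, i+1, i-1, i+2, i-2, ... (mod n)
            if j % 2 == 1:
                val = (i + j // 2 + 1) % n
            else:
                val = (i - j // 2) % n
            row.append(val)
        square.append(row)
    return square

# Five task orders for five tasks (0-4), one per participant slot;
# with four participants, one row would be dropped at random.
orders = balanced_latin_square(5)
```

Each row is one presentation order; the construction guarantees every task appears in every serial position exactly once.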

The five tasks participants were asked to complete were:

  • Find out how to slice a 3D print file: “You have a 3D print due for a class assignment. Using the website, please find information on how to correctly prep a 3D file for a 3D printer.”
  • Find locations and hours of a makerspace: “Your teacher is having you do a scavenger hunt for class, where you need to pick up a sticker at each location to show you found it. Item 3 on the list is BeAM’s Murray location. Using the website, how would you go about finding this information?”
  • Contact an expert about a BeAM service: “You have decided you want to make a birdhouse for your mom’s upcoming birthday. You have never used a woodshop before but heard the makerspaces on campus have one available to UNC students. Using the BeAM website, find out how you can set up a time to meet with someone who can tell you more about the woodshop and help you get started on your project.”
  • Find an orientation time and signup waiver: “You need to build a prototype solenoid engine for a class. Your professor mentioned BeAM as a potential resource, and the equipment you need is available at Murray Hall. Find out how to attend an orientation at the Murray Hall makerspace, so that you can gain access to the materials you need to complete your project.”
  • Register for a workshop: “You want to use Fusion 360 for a collaborative engineering class project but need some help to get started. Find out if you can sign up for a Fusion 360 workshop at the KSL makerspace for the earliest available date.”

Each post-task questionnaire consisted of the following four short questions, rated on a 7-point Likert scale:

  1. The task was easy.
  2. I am confident that I completed the task.
  3. This task took a reasonable amount of time to complete.
  4. The system provided a clear path to complete the task.

Findings

We evaluated the results based on the self-reported post-task questionnaires, interview responses, observations, and how participants navigated to the desired information or service. We did not time participants, since the think-aloud procedure would have interfered with completion times, but we did track whether participants followed the “garden path” we determined for finding the information, how many steps they took, and how many times they made “errors” (back-tracked to a previous page/step).

We first looked at average responses for each of the post-task questions and computed confidence intervals for each. The chart below shows responses to the question on task difficulty:

Post-task mean scores and confidence intervals for participants’ agreement with “The task was easy”
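A 95% confidence interval for a small-sample mean like these can be computed as mean ± t·s/√n, using the t critical value for n − 1 degrees of freedom. A sketch; the Likert responses below are hypothetical, not our actual data:

```python
import math
from statistics import mean, stdev

def t_confidence_interval(scores, t_crit):
    """95% CI for a small-sample mean: mean +/- t * (s / sqrt(n)).
    t_crit is the two-tailed critical value for df = n - 1
    (e.g. 3.182 for n = 4 participants)."""
    m = mean(scores)
    half_width = t_crit * stdev(scores) / math.sqrt(len(scores))
    return m - half_width, m + half_width

# Hypothetical 7-point Likert responses from four participants:
low, high = t_confidence_interval([6, 5, 7, 4], t_crit=3.182)
```

With only four participants per task, these intervals are necessarily wide, which is one reason the quantitative results below are read alongside the qualitative data.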

For each task, we calculated lostness, a metric based on the determined “garden path” (the smallest number of screens needed to visit) and on “errors,” where users returned to a screen they had already visited. These results are summarized in the table below.

Task            | Garden Path | Mean Lostness | Lostness 95% CI | Success Rates          | Perceived Difficulty Ranking
Murray          | 2           | 0.48          | 0.04 – 0.92     | 5 complete             | 1
3D printer file | 3           | 0.50          | 0.33 – 0.66     | 2 complete, 1 partial  | 5
Consultation    | 3           | 0.28          | 0.10 – 0.46     | 1 complete, 4 partials | 4
Orientation     | 3           | 0.35          | 0.019 – 0.52    | 3 partials             | 3
Workshop        | 4           | 0.24          | 0.01 – 0.47     | 2 complete, 3 partials | 1

Display of key metrics from all tasks
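The lostness measure above combines the pages a participant actually visited with the garden-path minimum. A sketch of the standard formulation, with hypothetical page names:

```python
import math

def lostness(visited, garden_path_len):
    """Smith's lostness measure:
    L = sqrt((N/S - 1)^2 + (R/N - 1)^2)
    where S = total pages visited (revisits counted),
          N = unique pages visited,
          R = minimum pages required (the garden path).
    0 means a perfectly efficient path; higher values mean
    more wandering and back-tracking."""
    s = len(visited)
    n = len(set(visited))
    r = garden_path_len
    return math.sqrt((n / s - 1) ** 2 + (r / n - 1) ** 2)

# A participant who follows the garden path exactly scores 0:
lostness(["home", "events", "workshops"], 3)  # -> 0.0

# Back-tracking (revisiting "home") raises the score:
lostness(["home", "people", "home", "events", "workshops"], 3)
```

Each revisit to an already-seen screen counts as one of the “errors” tracked above, since it widens the gap between total and unique pages visited.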

Recommendations

Based on the results, we determined that most of the problems fell into two main categories: label mismatches, and hidden or missing information. For example:

  • Label mismatches: When looking to set up a consultation with an expert, many participants referred to the “People” page. Several participants also mentioned confusion surrounding the “Events” label when searching for orientations and workshops. Finally, “Technical Resources” was not always an obvious choice for participants searching for the 3D printing instructions.
  • Hidden or missing information: Several participants noted the lack of physical addresses on the location page. When looking to attend a workshop or orientation, it was not always clear how/if they needed to register for the event and what “prerequisites” may be required (e.g. sign a waiver form or have previously attended an orientation). Additionally, the e-book was not always an obvious choice for participants when searching for the 3D printing instructions, and only one out of five participants found the BeAM staff specialties information on the consultations page.

Based on the overall results, we developed a set of recommendations that we organized in a cost vs. impact matrix:

              | Low Cost                                  | Medium Cost                                  | High Cost
Low Impact    | Rename “Events” label                     | Add “Other locations” link in website footer | Correct links on homepage
Medium Impact | Add address information to locations page | Emphasize staff specializations              | Connect “People” and “Consultations” pages
High Impact   | Emphasize workshop registration link      | Emphasize orientation waiver                 | Reorganize “Resources” and/or “Technical Resources”
Cost vs. Impact Matrix for problems identified on the BeAM website

Reflection

Overall, our usability study was successful in meeting our evaluation goals. The post-task survey results showed that participants felt somewhat confident completing most tasks, while ease-of-use results were mixed: the difficulty levels participants reported did not always match the completion rates, lostness scores, or qualitative data. Error rates proved to be a difficult metric to collect, since participants discovered new ways to complete tasks that did not match the garden path. Finally, satisfaction was somewhat positive regarding how much time tasks took to complete, but more negative regarding whether the system provided a clear path to do so. By the time we presented the project, BeAM staff had already implemented some of the low- and medium-cost changes, which they felt made a meaningful improvement to the website, and had plans to work on our other, more time-consuming recommendations.