Aible for Business
We partnered with Aible, our client company, for this project. Aible's software enables users to apply machine learning to build models from their own data and constraints. At the time, Aible offered "Aible Advanced," a product geared specifically toward data scientists. When we began our project, Aible was expanding its offerings to business users who wanted to produce models and gather insights regardless of their level of data science fluency. "Aible Business" provides straightforward answers in the form of actionable business insights and operational metrics. Our goal was to conduct research to assess the product's usability and streamline the user experience.
Duration 10 weeks
Responsibilities organize and facilitate usability tests and interviews, write usability and interview guides, conduct qualitative analysis, provide recommendations, write and present deliverables
Team Carissa, Hannah, and Pranavi
How do business users with data science experience understand the Aible Business product? And how might we improve usability of the model building process and the model output?
Usability has a broad definition, so specifying our own definition based on the target users helped us establish guidelines for our research and strategize a research plan.
Learnability: How quickly can users learn to navigate the interface?
Satisfaction: Do the users feel like their needs and expectations are being met?
Control: Are the users confident in the outcome of the model to inform their business decisions?
Based on our goals and usability definition, we decided to conduct usability studies and interviews.
We chose to implement usability studies and interviews for this project for a few reasons:
adaptable to virtual or in-person formats (due to COVID-19, all sessions ended up being virtual), which allowed us to reach a geographically diverse range of participants
provided a balance between concrete interactions with the interface and abstract impressions about the product
provided in-depth data while still being time efficient
However, each method had limitations which we acknowledged:
could have had a larger and more diverse sample if we had implemented surveys
usability testing and interviews are more time intensive and thus constrained by participant availability
the controlled setting could limit generalizability, since people would most likely be more comfortable working with their own data than with the data we provided
Methods Usability Tests, Interviews
Participants 8 business professionals
Duration 1 hour
We developed a short and specific screening survey to ensure that our participants matched the target user.
We leveraged our social media networks for recruitment. Since we were not able to offer compensation for participants' time, we expected that individuals within our personal networks would be more willing to participate. In an ideal world, we would have contacted potential users through Aible's existing client base; however, due to our short timeframe, we decided to use this convenience sample. The screening survey helped mitigate some bias from using a convenience sample by ensuring that the participants fit specific criteria and accurately represented Aible's target user.
To actualize our research plan, we created test guides and then conducted 8 sets of usability tests and interviews.
This is an example from the interface that we watched participants navigate and interact with.
Under the guidance of our client mentor, we created specific tasks for the participant to complete during the session. All sessions were conducted over Zoom, and screen sharing allowed us to watch participants navigate the interface in real time. During the test, the moderator facilitated the general structure and outlined tasks for each participant to complete, but for the most part did not provide any further instructions. We utilized the think-aloud method: participants were told to explore and address challenges on their own, giving voice to areas of confusion or uncertainty.
We utilized semi-structured interviews, which allowed the participants to provide more open-ended feedback; we selected this method to complement the specific data gathered in the usability tests. We created an interview guide outlining the interview structure, beginning with specific questions about participants' experience using Aible and then broadening to generalized questions about the role of business analytics software in their work and their experiences with data modeling products. This variety of questions allowed us to better understand how Aible could fit the needs of business users.
We analyzed our data by coding our transcripts to identify patterns and extract insights.
Using this method of analysis, we synthesized key findings. We presented the four most common issues and provided actionable recommendations.
Participants relied on trial and error to identify the correct format for their inputs, which left them doubting the accuracy of those inputs.
Technical jargon was a barrier that caused confusion: terms on the interface did not match the functionality that users expected.
Users felt hesitant to trust model outputs for their business decisions and wanted greater transparency or explanation about the conclusions.
We compiled a set of features that received positive feedback. These areas could be further developed or incorporated into other aspects of the product.
Deliverables Usability Report, Presentation Deck
We combined our research into a report and presented it to our client who then provided us with feedback.
When we followed up with our client, the feedback we received was overall positive. Within a few weeks, our client was able to:
utilize the specific design and language issues we detected to redesign the first interface page
validate assumptions about issue areas that they had previously suspected
open up a new direction for the UX team to explore further, in an area that had not previously been researched
The lessons learned are always the most important aspect of any project.
We completed this project during a challenging time; COVID-19 was by far the biggest obstacle. We had to find creative ways to address these challenges and still deliver the project on time.
Build in extra time to address setbacks. Along the way we encountered setbacks such as participant no-shows and COVID-19 shelter-in-place. We were able to stay on track because we had built in extra time for unexpected challenges.
Lean on each other. Since we were a team of students, we all had to juggle this project with many other responsibilities. Supporting and empathizing with each other helped us work together.
Pilot and practice. Our testing tactics improved greatly over the first few tests, and we continued to refine them with each session after that.
Thank you to Professor Fadden at UC Berkeley School of Information for a great class, Joe at Aible for sponsoring this project, and my awesome teammates!