
Anomify x Goldsmiths University UX Engineering Partnership

Posted on
January 10, 2023

Over the past few years Anomify has been engaged in a partnership with Goldsmiths University in London to support students on the User Experience Engineering Masters program. We provide students from the course with a brief, which includes some design questions that could be addressed in our product, and one or two students can choose to work with Anomify for their final project.

Last year two students, Noor and Yuqing, took up the challenge!

The Anomify dashboard has to address all manner of UX questions that aren’t easily solved by heuristics. For instance, we need to present graphical data to monitoring professionals at the right time to help them understand their systems. Which data to provide depends on the context in which the anomaly occurred. At best, Anomify is the mechanic’s apprentice, passing the right tools to the mechanic before they’ve been asked for.

This is the list of UX questions that 2021/2022 students could choose from:

Research already carried out

How might users sign up and add time-series data without assistance from the Anomify team?
How can we make the training process a task that users want to engage in?
How can we help users explore events in Anomify?
How might we improve the onboarding process?
How can we help users understand our core value and experience a “moment of magic” early on in their interactions with Anomify? 
How can we make real-time graphs more useful & readable to users?
How can we structure the data in the dashboard so that the most important information is available and digestible to users?

Benefit to students

Not only do students get a case study for their portfolio that focuses on a novel UX problem, they also get to work as part of a development team which exposes them to real-world constraints.

When designs are going to be used in a working product, it’s prudent to design extendable UI components where possible, making front-end development more efficient rather than reinventing the wheel each time. And where designs replace existing systems, designers need to think about how to transition legacy parts of the dashboard to new states. Neither of these needs consideration for a purely conceptual product.

Benefit to Anomify

A fresh pair of eyes is always good. We get help with user research and advice on how best to address UX issues. This includes research, prototyping and testing with our target audience.


Yuqing’s UX Question: How might we improve the onboarding process?

Yuqing worked on simplifying the onboarding process. While users can sign up and send metrics to Anomify for free, a significant upfront investment of time is required before Anomify starts to provide value back to customers. Specifically, users need to send metrics to kickstart the value chain.

Yuqing carried out her own research with users, which corroborated our own: users disengage when reading our help documentation. They aren’t clear on the answers to these questions:

  • How easy is it for me to send metrics to Anomify?

  • Once I have sent metrics, how long / how much input will be required until the system is generating value?



She set about addressing these questions by:

  1. Splitting up the metric sending process into chunks
  2. Providing more feedback after each onboarding step has been completed
  3. Providing example code that users can play with without committing their own data
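Example code of that kind might look like the sketch below. To be clear, this is an illustration, not Anomify’s documented API: the endpoint URL, payload fields, and authentication header are hypothetical placeholders.

```python
import json
import time
import urllib.request

# NOTE: the endpoint, payload shape and auth header below are hypothetical
# placeholders for illustration -- consult the official docs for the real API.
API_URL = "https://api.example.com/v1/metrics"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                        # hypothetical credential

def build_payload(metric_name, value):
    """Package a single time-series data point as a JSON body."""
    return json.dumps({
        "metric": metric_name,          # e.g. "server1.cpu.user"
        "timestamp": int(time.time()),  # Unix epoch seconds
        "value": value,
    }).encode("utf-8")

def prepare_request(metric_name, value):
    """Build (but don't send) a POST request carrying one data point."""
    return urllib.request.Request(
        API_URL,
        data=build_payload(metric_name, value),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )
```

The appeal of this onboarding step is that a user can run the sketch with sample values and inspect the payload before ever committing their own data.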


Usability testing

Yuqing evaluated her designs with a group of participants matching our target persona. They were asked to sign up and send metrics to the dashboard. She found that her design changes helped users better understand where they were in the onboarding process.

Noor’s UX Question: How can we help users explore events in Anomify?

Anomalies aren’t valuable in and of themselves. The value is added when the interactions between multiple metrics shed light on an event happening in the system being monitored.
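That idea — an anomaly becomes meaningful alongside related metrics — can be sketched with a plain Pearson correlation between two metric series. The series below are made-up toy data, not Anomify output:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy metric series: CPU load spikes alongside request latency,
# suggesting the two anomalies belong to one underlying event.
cpu     = [0.2, 0.3, 0.2, 0.9, 0.95, 0.3]
latency = [110, 120, 115, 480, 510, 130]

r = pearson(cpu, latency)  # close to 1.0: strongly correlated
```

A high coefficient across the anomaly window is exactly the kind of cross-metric signal that turns isolated alerts into a story about a single event.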

For her project Noor took a look at how we might help users explore events in the dashboard so they could carry out more effective root cause analysis.

She first undertook an extensive literature review of data dashboard design, then used interviews, desk research, heuristic evaluation and card sorting to uncover particular pain points for users.

Two themes came out of the research phase. First, the card sorting exercise made it clear that the structure of the dashboard could be improved. Second, it wasn’t easy to explore metrics and their relationships to other events.

Noor mapped her findings onto an effort/impact matrix, then addressed the issues in the low-effort, high-impact cell during the design phase.


Prototyping Tasks

Noor turned her research insights into design opportunities and iterated through prototypes with Figma. Below are some screenshots of her final high fidelity design.

Restructuring the information architecture

  1. Noor amended the navigation structure, moving the navigation to the left-hand side with new top-level categories. She changed the menu from a narrow top navigation to a broad, shallow tree structure on the left. More top-level navigation categories mean users have to retain less information about the site's structure.





  2. She improved the Overview page, adding a heat map of recent anomalies and enabling users to see recent false positive events alongside recent anomalies.


Assisting exploratory analysis of events

  1. She enabled efficient browsing of metrics so that they could be compared and visualised in an intuitive way.



  2. Here she plotted existing correlation and related event data on the same graph so that interactions between metrics could easily be observed.

  3. Noor decluttered the metric details page to enable more straightforward investigations and to reduce cognitive load for users. This included removing any duplicate information, removing any non-functional components, and enhancing the tooltip displayed on the graphs.



Usability Testing

Noor tested her prototype by asking users to complete three evaluation tasks:

  • Explore metrics related to CPU metrics
  • Investigate CPU-related events
  • See recent alerts for CPU metrics

Conclusion

It’s always good to unpack UX problems with new people. We learned a lot from our students and we hope they got something out of it too. As for their design work, we’ll incorporate some of their ideas into the dashboard in the coming months.

Will Floutier
Product Manager
