DMPonline Usability Testing

Theo Higham | 17 August 2020

We carried out our previous usability assessment for DMPonline in 2013. We implemented many of the features suggested by our users and the tool was completely transformed; a new and improved interface was launched in 2015.

We have introduced new functionality to the tool over the past few years. We are in constant dialogue with customers, and helpdesk enquiries give us a good overview of desired feature and workflow improvements, but they rarely touch on usability. So this year we felt it was definitely time for another round of systematic assessment. This time we focused mostly on testing the administrative interface, since this is where most of the changes have occurred in the last five years.

This blog gives a brief overview of our findings. A full report, most likely in the form of a paper, will be published and circulated to all administrative users.

The testing schedule consisted of the following methods:

  • Guided interviews with 8 current users with administrative rights
  • Usability tests with 4 people who had never previously used the tool
  • An online survey, which is still open for contribution

Guided Interviews

We wanted to hear from our existing administrative users and learn about the most prevalent issues with the interface and any new features they would like to see implemented. We assessed key functionality by taking a journey through the administrative menu.

Overall, we were pleased to hear that the interface was easy to use, but we encouraged users to report any problems they had encountered. These fell into three broad classes.

Some problems are due to interface design choices. For example, too much Guidance text is crammed into a narrow column on the right-hand side of the screen. Suggestions we received included updating the layout and appearance of guidance text, and allowing organizational configuration of default Guidance to prevent information overload from multiple sources.

Some problems are down to workflow choices. Admin users would like to customize funder templates by adding a standard set of questions. At the moment, this involves manually adding the questions to each funder template in turn, rather than being able to say ‘apply these questions to the following templates’ via user-friendly tick boxes.

A third class of problem was due to a lack of understanding or use of existing functionality. We were somewhat surprised to find that only 50% of interviewees made use of the Request Feedback feature to provide researchers with feedback on their plans; some used direct email correspondence, while others used enquiry management systems provided by their institution. Some users were unaware of certain facilities, for example template versioning or more sophisticated reporting on usage via the API. It is important to us that the tool's functionality is fully utilized, so we will have to find ways of making these facilities more obvious, perhaps through an admin dashboard or help on mouse-over.

We received numerous suggestions for improvement, too many to list here; some of them were:

  • Ability to drag and drop questions when editing templates, in addition to replicating questions across templates
  • Versioning for Plans, currently only available for Templates
  • Filtering Users and Plans by department
  • Better Usage reports, including reports on answers to specific questions
  • Ability to tidy up data by deleting/archiving old accounts, and to export statistics on users as a CSV file

Overall, template editing and usage reports seemed to be priorities for admin users, and we will bear this in mind when planning development work.

Usability Tests

To put these admin features to the test, we recruited users with no prior experience of DMPonline and asked them to complete a series of tasks to simulate the experience of a new Admin using the site for the first time.

The main tasks were:

  • Create a user account
  • Create a data management plan
  • Edit organization details to add a website link
  • Enable feedback requests
  • Create a template and questions, including a conditional question
  • Add guidance so that it is displayed on the plan template
  • Interpret Usage statistics

Overall, we found that testers had no difficulties using the basic site functions, such as signing up, creating plans and putting together basic templates.

The tasks which caused moderate difficulty were those that required navigation through menus, such as downloading a plan, adding a link to the organization website and enabling feedback requests.

Most testers didn’t spot the menu options because they didn’t interpret them as tabs. Hopefully, this can be easily fixed through simple changes to the design to bring the menu options closer to the users’ mental representation of a tab.

The most difficult task of all proved to be creating a conditional question: every tester required a hint to complete it. Conditional questions were introduced as a new feature, and we were already aware that they needed further development and additional help information.

Other issues uncovered were:

  • Inconsistent behavior of the Save button (will it close the question, or the whole page?). It is also unclear whether the Save action has completed (the confirmation message is not obvious), and users can navigate away from pages without being warned about unsaved changes
  • Lengthy plan/template sections require too much scrolling and, ideally, should automatically collapse when they are completed or when the user navigates away
  • Some menu items can be confusing, e.g. ‘Guidance’ could refer to any sort of guidance, including how to use the tool, while ‘Request Feedback’ under Plans and ‘Request Feedback’ under Organisation details have similar names but perform different functions
  • The orange colour on some menu items can be quite faint and difficult to read

We didn’t like to see users struggling to complete some tasks. It goes without saying that the DMPonline interface should be intuitive: new users should be able to complete all key tasks easily. Perhaps advanced features such as the API will always require some tuition, but everybody should be able to add a conditional question without having to think too much about it.

Online Survey Results

The survey proved very useful for gauging how users feel about the tool and for gathering information on what works and what doesn’t.

We were happy that for most users the tool was a positive experience. We aim to get rid of the ticks under ‘frustrating’ and gain a few under ‘sleek’.

Common causes for frustration are:

  • Forgetting to save changes to the template
  • Typos - spell checker is needed
  • Needing to remember the workflow to create plans/guidance/themes
  • Doing things in the wrong order
  • Confusion about navigation - which menu to open for which action
  • Having to follow up on left-over DMPs, without being sure of who did what and where they are at; this is currently resolved by email communication

We are leaving this exercise with a bagful of ideas for new features and suggestions for improvement. We will be in touch with a full report of our findings, which will also contain proposals for solutions.

The online survey is still open, so please contribute!

A huge thank you to all participants!

About the Author

Theo Higham was employed as a summer intern to coordinate the usability testing exercise, in collaboration with Diana Sisu. He is a Computer Science student, currently going into his final year of study. He has studied disciplines such as Vision and Robotics, Computer Security, Introductory Applied Machine Learning, Professional Issues, and Computer Communications and Networks.

Follow @HighamTheo on Twitter and view Theo's LinkedIn profile.