Design Principles

How we make design decisions that reflect our values.

Principle criteria

  • Understand the context and spectrum of user needs.
  • Design with, not for, users by creating solutions with our community.
  • Consider how design decisions impact the end-to-end user experience.

Examples

  • Understand context: We know data collectors often use older Android devices. With that context in mind, we've designed the interface with small margins to optimize for narrow screens.
  • Design end-to-end user experiences: The app version is intentionally positioned at the bottom of the main menu so data collectors can easily compare versions and make sure everyone is using the same version of Collect. Typically, the app version would be hidden away in the settings, but we know from field research that surfacing it can help avoid issues.

Questions to consider

  • Does this meet the needs of our various user types?
    • Data collector
    • Supervisor or manager
    • Form designer
    • Trainer
    • Survey respondent
    • Trusted external partner (in Central, a Project Viewer)
  • Will this idea work for a range of abilities (e.g., motor skills, sight, cognition, hearing)?
  • Are we making decisions based on data (e.g., analytics, testing, co-design with users)?
  • Will this feature negatively impact other parts of the end-to-end experience?

Principle criteria

  • Do the hard work to make it functional and simple to use.
  • Make it easy for novices to start and for experts to accomplish complex tasks.
  • Accommodate a range of skills and technical capabilities.

Examples

  • Make it simple: We know from observing data collectors in the field that having multiple questions on the screen can be distracting and make it hard to engage with the respondent, which is why one question per page is the default.
  • Support complex tasks: We can't think of every way someone might use Central, so we provide powerful APIs that experts can use to automate every possible action.
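
As an illustration of that kind of automation, here is a minimal sketch (TypeScript, Node 18+) that uses the Central API to authenticate and list the forms in a project. The host name and credentials are placeholders, and the endpoint paths reflect our reading of the Central API documentation; treat it as a starting point rather than a finished script.

    // A minimal sketch: authenticate against Central, then list the forms in a
    // project. The host name and credentials below are placeholders.
    const BASE = "https://central.example.org/v1"; // hypothetical server

    async function listForms(email: string, password: string, projectId: number) {
      // Exchange credentials for a session token.
      const session = await fetch(`${BASE}/sessions`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ email, password }),
      }).then((r) => r.json());

      // Use the token as a Bearer credential for subsequent requests.
      const forms = await fetch(`${BASE}/projects/${projectId}/forms`, {
        headers: { Authorization: `Bearer ${session.token}` },
      }).then((r) => r.json());

      for (const form of forms) {
        console.log(form.xmlFormId, form.name);
      }
    }

    listForms("admin@example.org", "change-me", 1).catch(console.error);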

Questions to consider

  • Have we prioritized making the experience functional over aesthetics?
  • Will people be able to easily learn this feature without needing to be retrained?
  • Will this work for various experience levels (e.g., first-time or power users)?
  • Will non-developers be able to use the tooling?

Principle criteria

  • Let users know what will happen before it happens.
  • Give users visibility into what's happening and what happened.
  • Make problems actionable so users can troubleshoot.

Examples

  • Let users know what will happen: In Central, many actions show a confirmation popup with further information about the effects of the action. For example, a popup is shown before a Form Draft is published. The popup lists general information and shows additional information or warnings in certain situations. If publishing the Draft would change a Dataset, then those changes are listed.
  • Visibility into what has happened: In Central, users can visit the Server Audit Log to see the list of actions that have been performed on the server, including who performed each action and when. Each Submission also has its own activity feed that lists actions performed on the Submission, including changes made over the course of its edit history.

Questions to consider

  • Can users figure out what is happening on their own?
  • Do we provide clear documentation so people can understand new changes?
  • Are we giving users all the right information to troubleshoot on their own?

Principle criteria

  • Speed, stability, and scalability are critical to user success.
  • Reconsider changes that reduce performance.
  • If it can't be faster, it should feel responsive.

Examples

  • Scalability: The Central server lets users download more Submission data than it can hold in memory by streaming the data and responding to back-pressure (a sketch of this technique follows these examples).
  • Speed: The Central UI sends as few requests as possible so that each page loads quickly, and it renders parts of a page before all of the requested information has arrived so that users can get started on their work.
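
The back-pressure mentioned in the scalability example can be sketched with Node's stream utilities (TypeScript, Node 18+): rows are pulled from the source only as fast as the HTTP response can accept them, so the server never holds a full export in memory. This illustrates the general technique, not Central's actual export code; the row generator is a stand-in for a database cursor.

    // Streaming with back-pressure: the response pulls rows lazily instead of
    // buffering the whole export in memory.
    import { createServer } from "node:http";
    import { Readable } from "node:stream";
    import { pipeline } from "node:stream/promises";

    // Stand-in for a database cursor over Submission rows.
    async function* submissionRows() {
      for (let i = 0; i < 1_000_000; i++) {
        yield JSON.stringify({ instanceId: `uuid-${i}` }) + "\n";
      }
    }

    createServer(async (req, res) => {
      res.writeHead(200, { "Content-Type": "application/x-ndjson" });
      // pipeline() honors back-pressure: if the client reads slowly, the
      // generator is paused rather than filling server memory.
      await pipeline(Readable.from(submissionRows()), res);
    }).listen(8080);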

Questions to consider

  • Have we considered how the performance of this feature will impact the experience?
  • Will this feature idea negatively impact performance?

Principle criteria

  • Value users' current practices and avoid disrupting ongoing work.
  • Communicate early and clearly about upcoming changes.
  • Evaluate risk and anticipate potential harms for user safety.

Examples

  • Avoid disrupting the user experience: When we redesigned the form finalization flow, we considered making auto-send the default. However, we heard that field workers don't use their phone credit to send data; they use Wi-Fi in the office or other places. To avoid people accidentally sending audio files or large images, which could be costly, we decided against this change. We also included warnings so data collectors are aware if their manager decides to make auto-send the default.
  • Let users know before: With Collect, we introduced prompts and tooltips to let users know in advance about new changes, what to expect, and where to learn more. For example, we introduced inline messaging to let users know that the ability to manually edit the form name would be removed, and we also gave users advance notice in the forum.

Questions to consider

  • What are users doing currently, and how might this idea disrupt their workflow?
  • Have we communicated new changes so users won't be surprised?
  • Have we considered the negative implications or risks to user safety?

Principle criteria

  • Give users control over where their data is stored and how it is used.
  • Choose secure defaults that protect all users.
  • Help users make informed decisions about their data.

Examples

  • Control where your data is stored: Central is self-hostable on low-cost computers that users can control, and we provide detailed documentation to support users who wish to self-host.
  • Choose secure defaults: Central is served over HTTPS, securing communication between Central and clients like Collect and Enketo.
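
As a small illustration of the secure-default idea (this is not Central's actual configuration, just a sketch of the principle in TypeScript/Node), a server can refuse to serve content over plain HTTP and instead redirect every request to HTTPS:

    // Secure default: never answer over plain HTTP; redirect every request to
    // its HTTPS equivalent. The fallback host name is a placeholder.
    import { createServer } from "node:http";

    createServer((req, res) => {
      const host = req.headers.host ?? "central.example.org";
      res.writeHead(301, { Location: `https://${host}${req.url ?? "/"}` });
      res.end();
    }).listen(80);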

Questions to consider

  • If we implement this feature, will survey respondents' data be kept private and secure?
  • Will this idea protect data collectors and managers (e.g., work won't be lost, data won't become corrupted, and data will only be available to trusted parties)?
  • Given that some users may not know about best practices in data privacy, have we clearly communicated the potential implications of their decisions?