Design With the User

Successful digital initiatives are rooted in an understanding of user characteristics, needs and challenges. User-centered design — also referred to as design thinking or human-centered design — starts with getting to know the people you are designing for through conversation, observation and co-creation. Information gathered through this engagement leads to building, testing and redesigning tools until they effectively meet user needs. By designing with users rather than for them, you can build digital tools that better address the specific context, culture, behaviors and expectations of the people who will directly interact with the technology. Designing together means partnering with users throughout the project lifecycle, co-creating solutions, and continuously gathering and incorporating their feedback.

Core Tenets

  • Incorporate multiple user types and stakeholders in each phase of the project lifecycle to direct feature needs and revise the design. Here, users are people who will interact directly with the tool or system, and stakeholders are people who will be affected by or have an interest in the tool or system, such as people whose data are being collected, government officials or researchers who may study the data collected.
  • Design tools that improve users’ current processes by saving time, using fewer resources and improving quality.
  • Develop context-appropriate tools informed by users’ priorities and needs, considering the ecosystem and accepting that digital tools will not always be the best fit.
  • Develop the tool in an incremental and iterative manner, with clear objectives and purpose in mind.
  • Ensure that the design is sensitive to and considers the needs of the traditionally underserved. 
  • Embrace an iterative process that allows for incorporating feedback and adapting your tool after the initial testing and launch.
  • Be transparent in setting expectations, and let people opt out of participating in the design process.

Project Lifecycle

The following recommendations, tips and resources are drawn from the digital development community to give you options for applying this Principle during each phase of the project lifecycle. This guidance is not meant to be exhaustive, but rather to suggest actions you can take to apply this Principle in your work. If you have other tips, resources or comments to add, please share them with the community at the Digital Principles Forum.

Analyze & Plan

Designing a successful initiative means first identifying users’ needs. During the Analyze & Plan phase, you learn about users’ priorities, motivations and challenges. User adoption of technology is one of the largest barriers to a successful and sustainable initiative, so understanding comfort levels with different technologies is critical. Users can help you determine whether a digital tool makes sense for their situation (it may not) and whether it would address the underlying need; users can also identify existing technologies they already use that you can build on.
  • Allow enough time to partner with users. Include time to analyze and plan with users, as well as several opportunities to gather and incorporate users’ feedback during the Deploy & Implement stage. Determine how many weeks or months will be required at each stage based on the complexity of your initiative or environment, and incorporate this timing into your work plan.
  • Understand the context. Be knowledgeable of the ecosystem where the tool will be deployed, including the people, networks, cultures, technological landscape, research evidence, politics and markets. This understanding will help to answer questions about the existing infrastructure, whether there is internet access or reliable power, or if there are market forces or government policies in place that support the use of technology.
  • Know your users and identify your stakeholders. Identify and observe different types of users, so you can begin to develop an understanding of their motivations and daily experiences.
  • Create user personas. Develop user personas that include names, pictures, demographic characteristics and motivations. Personas help make users more understandable and relatable, especially for team members who are not based in the project country.
  • Identify business processes to learn about the work that users are doing. A business process is a set of activities and tasks grouped together to accomplish a goal or produce something of value for an individual, stakeholder or organization. Identifying business processes will help you to better understand your target users and design a tool that better meets their needs.
  • Develop and validate user scenarios. Using insights from the user personas and business processes, create user stories that describe simply and clearly what users need the tool to do and why they need it. User personas describe who your users are, while the scenarios identify what they need. These stories describe features or requirements from a user’s perspective and are typically written in this format: As a <user type>, I want/need <a desired feature> so that <why it’s needed>. For example: As a community health worker, I need to record household visits offline so that I can collect data where there is no connectivity. Observe the current processes users follow to complete their tasks. Map out these processes and validate them with users: ask them to show you the steps for each of their activities, identify who performs each step, and see how that aligns with what you drafted. To finalize, ask representative users to review the documented processes and provide feedback.
  • Develop methods for user feedback and input throughout the project lifecycle. When identifying users to engage, check that you are including users from a variety of environments and technology experience levels, and seek out participants from traditionally underserved populations. Check how these users fit with the user personas you developed, and determine whether the personas need updating or whether your group may be missing users. Plan for how you will create an environment where users are comfortable providing feedback, taking into account cultural barriers that may affect users’ willingness to provide feedback.
    • Form a representative user advisory group (UAG), where possible. The UAG gives target users a voice throughout the project lifecycle, particularly during testing and monitoring, and involves users in planning and decision-making at an earlier stage and at a more strategic level. In a broad sense, a UAG can be considered a technical working group (TWG), and the common missteps in setting up a TWG apply here as well. Build the capacity of your UAG to review designs, articulate opinions and make choices.
    • Identify a representative group of users if a UAG is not realistic. For example, it may not be possible to convene a UAG, or it may not make sense with a small project team.
    • Set expectations with users and stakeholders you involve. Many communities are part of early initiatives but will not be involved later for a variety of reasons, such as budget cuts or regional issues. Let your users know if this initiative may not go further or if they may not be asked to participate in future phases. You should also provide users with a way to opt out of participating if they need to during later phases.

Design & Develop

During this phase, continue co-creating your tool with users to ensure that it is functional and meets user needs. Validate your understanding of user needs and context, and identify any changes to the ecosystem. Before fully deploying your tool, make a plan to keep engaging users in testing its usefulness and usability and in identifying any changes that may be necessary.
  • Collaboratively develop a vision document outlining goals for the tool that will set direction and guide the team. This document should capture the overall goal for your tool and will serve as your strategy. Validate the vision with actual users. Answer these key questions:
    • What is the overall goal for the digital tool?
    • Who is it for (user personas)?
    • What user needs is it addressing?
    • What is the solution?
    • What value does it bring?
    • What is unique or new about it?
    • How will we know if it is successful?
  • Test both the design and the tool with users and the UAG, if one is in place. The size of your testing group will vary depending on the scope of your project, the budget and the number of user personas, but the group should still include underserved populations. Testing early and often will let you validate whether what you developed meets user needs and identify where gaps remain. Ensure that your testers represent your target users, and test the tool in an environment representative of the reality of the average user. When testing with users, validate the user scenarios that you identified with them. The following stages of user testing should be completed before deployment:
    • Prototype and pilot test the technical design by using wireframes or simulations. Ask target users to use the simulation while you watch, listen and take notes.
    • Once you have an initial working prototype, sit down with a small group of target users and ask them to use the tool while you again watch, listen and take notes. As you expand testing to include more users, be sensitive to users’ time: clearly communicate known limitations, such as functionality that does not yet work or the absence of real data, which means users will only be able to work with test data.
    • Beta test the tool or system with a larger group to confirm readiness for wider deployment.
    • You may include users in bug bash sessions, in which a broad group of testers is brought together to use the system in different ways so that you can find more errors.

Deploy & Implement

Consider a phased rollout so you can gather user feedback quickly and make improvements continually. This will allow you to get the tool into the hands of users, incorporate feedback and make improvements before introducing the tool to the wider community. Even after full deployment, seek out opportunities to continue engaging users, and be prepared to make changes based on their feedback. The more that users interact with the tool, the better able they will be to apply it to their own situations and the more insight they will have into how it can be modified to be more useful.
  • Provide learning opportunities to actively support users in adapting to changes introduced with the new tool. Learning opportunities may include hosted activities, formal trainings, workshop sessions and community learning events. Place printed materials in the local language where target users can readily access them.
  • Identify highly engaged users who show aptitude for the tool and encourage them to become champions of it. Prepare them to train others, and help to make them co-owners of the tool.
  • Create regular opportunities for users and stakeholders to provide feedback, such as through meetings, text messaging or a feedback phone line. When collecting feedback, acknowledge its receipt even when it will not be incorporated.
  • Update your tool iteratively based on user feedback. Once your tool is deployed, keep assessing what could be better and which changes could improve the user experience. As you refine or create new features, engage your users in providing feedback and testing.
  • Use members of the UAG to facilitate implementation and to act as advocates for the tool. Ask them to participate in different forums and learning exchanges.

Cross-cutting: M&E

Incorporating monitoring and evaluation across every phase of the project lifecycle provides useful information on how users and stakeholders are affected by the tool, whether it is being used and whether it has led to your desired outcomes. Based on this information, you can identify opportunities to improve the tool for greater impact. In previous stages, you were testing to determine whether the tool worked; now you are assessing whether it helps to achieve programmatic outcomes.
  • Begin with a shared understanding of the initiative’s purpose. Define what success looks like with your users, including a high-level goal and desired outcomes.
  • Use a participatory process to identify performance indicators, such as usage and adoption. To measure progress toward your goal and achievement of outcomes, you need to define your indicators. Ask your users to help you define how to measure progress and against what targets, and be sure to assess these for different user types.
  • Share findings and data with users and the larger digital development community. Be open about what the tool achieved and where you fell short on reaching your desired outcomes, so the larger community can make use of what you have learned. Ask users to comment on the findings and help provide explanations for why outcomes were or were not achieved.
  • Modify the initiative based on evidence. While you will have been using monitoring data and user feedback throughout the project lifecycle, the final evaluation provides another opportunity to use data to inform improvements, which may be necessary before an initiative can be scaled. In addition to reflecting on data and evidence collected by you and your team, pay attention to what others are learning in your context or sector that can inform improvements in your next design phase or round of implementation. Digital development practices change rapidly, and it is essential to reevaluate your ecosystem assumptions.
  • Assess capacity building and other activities that support adoption and use of the tool or system. Monitor the outputs of activities such as training, marketing and community mobilization throughout the initiative. Falling short of your desired outcomes may be due to factors like community awareness and uptake rather than problems with the tool or system itself.