Digital Principles Monitoring, Evaluation, & Learning Part 3: The Indicator Library

This is the third post in our series on the Digital Principles’ new suite of monitoring and evaluation resources.

Introduction
Part 2
Part 4

In August 2021, the Digital Impact Alliance (DIAL) announced the phased release of a new, three-part suite of monitoring and evaluation resources, grounded in a basic conceptual monitoring and evaluation framework. The first resource was the Organizational Self-Assessment, which helps organizations in the Digital Principles-endorsing community understand how they currently engage with the Digital Principles.

Today, we’re sharing the second installment: the Digital Principles Common Metrics Indicator Library. This catalog of over 290 process indicators for activities and outputs associated with each Digital Principle aims to support organizations’ in-depth analysis of their adherence to individual Principles.

The principles Understand the Existing Ecosystem and Reuse and Improve guided our work as we researched common-metric indicator databases created by other organizations. The Digital Principles Common Metrics were patterned in form after the IRIS+ metrics, though their intended function is more expansive than informing external stakeholders’ socially responsible investing decisions.

Building on this research, DIAL conducted a comprehensive review and synthesis of the Digital Principles literature for clear actions and recommendations, culminating in a list of over 400 potential actions organizations could take. Each action statement was then converted into an indicator that measures the recommended action.

For example, a core tenet in the Digital Principles literature around Design with the User is: “Incorporate multiple user types and stakeholders in each phase of the project lifecycle.” The Digital Principles team converted this statement into an indicator measuring the “number of user types and stakeholders that have been involved in each phase of the project lifecycle.”

Once actions were converted into indicators, DIAL developed tags to make it easier for users to find the specific indicators suited to their purpose. These tags include the indicator type (e.g., activity or output), the stage of the project lifecycle the indicator measures, the measurement theme (e.g., Learn, Assess, Engage), and additional comments and notes. Following the Design with the User Principle, these indicators were then shared with monitoring and evaluation professionals within the digital development ecosystem for feedback and iteration. Now pared down to 291 indicators, an average of 32 per Digital Principle, the list is relatively comprehensive for most organizations’ monitoring needs.
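To make the tagging scheme concrete, here is a minimal sketch in Python of how a single tagged entry might be represented and filtered. The library itself is published as a document, not code, and the field names and tag values below are hypothetical assumptions, but the tags mirror the categories described above.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One entry in the Indicator Library (hypothetical field names)."""
    principle: str       # the Digital Principle the indicator belongs to
    text: str            # the indicator statement itself
    indicator_type: str  # tag: indicator type, e.g. "activity" or "output"
    lifecycle_stage: str # tag: stage of the project lifecycle being measured
    theme: str           # tag: measurement theme, e.g. "Learn", "Assess", "Engage"
    notes: str = ""      # tag: additional comments and notes

# The worked example from above, expressed as a tagged record.
# The tag values here are illustrative, not the library's actual entries.
example = Indicator(
    principle="Design with the User",
    text=("Number of user types and stakeholders that have been "
          "involved in each phase of the project lifecycle"),
    indicator_type="output",
    lifecycle_stage="all phases",
    theme="Engage",
)

def filter_indicators(indicators, **tags):
    """Return only the indicators whose tags match every keyword given."""
    return [i for i in indicators
            if all(getattr(i, k) == v for k, v in tags.items())]

# e.g., narrow the full set of 291 indicators to Design with the User outputs
matches = filter_indicators([example],
                            principle="Design with the User",
                            indicator_type="output")
```

In the library itself, the same narrowing is done by browsing or filtering on these tags.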

To help users navigate the library and select the most relevant indicators, we developed a supplementary guidance document. It invites organizations to be precise in the terminology they use, distinguishing between values, principles, criteria, and standards, and offers caveats for appropriate use and against misuse.

Users will also find references to a variety of indicator quality frameworks that endorsing organizations can use and adapt to determine which indicators best meet their needs for measuring Digital Principles adherence.

Because the Indicator Library focuses solely on process indicators of adherence and omits outcome indicators, it is not exhaustive, nor should it be the sole resource organizations use for Digital Principles-Focused Evaluation. It is a living document that the Digital Principles team will update periodically based on feedback from the Digital Principles community.

Use our Common Metrics Indicator Library to see how you’re connecting principle to practice, and watch this space as we launch the final MEL resource later this week!

Scott Neilitz

Manager, Monitoring, Evaluation, & Learning at DIAL

Scott believes that creative innovation and technology have the potential to improve the lives of people in low and middle-income countries. He also believes that through constant and iterative research and learning, we can improve programs and, ultimately, impact. Scott joined DIAL in 2018 as a Senior Monitoring and Evaluation Associate.

Claudine Lim

Program Manager, the Principles for Digital Development

Claudine first joined the Digital Impact Alliance in October 2017, shortly after receiving a dual master’s in international relations and public relations from the Maxwell School and S.I. Newhouse School at Syracuse University. After working as a Program Coordinator and Researcher for DIAL’s Business Operations, she is currently working with the Principles for Digital Development.

Akshika Patel

Associate

Akshika initially joined the Digital Impact Alliance in February 2021, as a Fellow. After earning her Master’s degree in International Affairs from Columbia University’s School of International & Public Affairs, she transitioned to the Associate position. In this role, she assists with the creation of new Digital Principles content, delivery of communications campaigns, and engagement with external stakeholders.

Zach Tilton

Doctoral Research Associate at Western Michigan University

Zach Tilton joined DIAL in 2018 as a Monitoring and Evaluation Fellow and currently works as an Evaluation Consultant with DIAL. He is a Doctoral Research Associate in the Interdisciplinary Doctoral Program in Evaluation at Western Michigan University, specializing in peacebuilding evaluation; an Associate at Everyday Peace Indicators; and a member-at-large of the EvalYouth Global Management Group.