Case study: farm.ink

Designing in response to farmer needs over mobile with farm.ink

Principles Addressed: Be Data Driven, Design for Scale

Overview: Farm.ink has been building digital services for farmers in East Africa.

Taking inspiration from the traction of farming discussions on social media, farm.ink is looking to deliver a highly engaging learning experience to the growing audience of smartphone-owning farmers.

Farm.ink’s mobile services are built with the Principle Be Data Driven in mind. Farm.ink saw that there was a huge amount of information flowing between farmers on digital channels such as Facebook and WhatsApp. These channels offer farmers a fast and easy way to share their experiences, advice, and questions and to connect with others in their area. However, while valuable information is being shared, it is highly unstructured and disorganized, making it hard for farmers to find the information or fellow farmers they need. Being more data-driven about this information flow is one of the keys to delivering a better product to users. By studying the behavioral patterns farmers exhibit online, and taking a data-driven approach to user research and product analytics, a core team of three has been able to reach an audience of more than 150,000 farmers online.

Objectives:

Farm.ink has developed an online forum and chatbot information service that is used by tens of thousands of farmers every day. The team are now turning their attention to using web technology to deliver answers to some common, critical questions from farmers. One example is a chatbot solution designed to help maize farmers protect their crops from the emerging threat of Fall Armyworm.

Fall Armyworm (FAW) poses a serious threat to food security in sub-Saharan Africa. FAW is originally from the Americas; outbreaks first occurred in West Africa in early 2016 and now threaten to devastate food supplies across the continent, exacerbating global poverty and hunger. FAW attacks more than 80 different plant species, and agriculture experts estimate the pest may cause over $13 billion in losses for staple crops such as maize, sorghum, rice, and sugarcane. It can also fly up to 1,600 kilometers (nearly 1,000 miles) in 30 hours, meaning it can easily migrate to surrounding farms and countries.

The team designed the Fall Armyworm chatbot course to inform farmers about identification processes and to offer tailored advice on how to treat the pest based on country-level pesticide recommendations. The key lesson from early user testing was that micro-games were the most powerful method for helping users retain key information, such as how to differentiate FAW from other common pests and how to monitor the farm.

Actions:

* Analysis & Planning. In creating the Fall Armyworm course, the team drew on a body of available knowledge from expert sources such as ‘The Fall Armyworm IPM Guide for Africa’ (https://www.usaid.gov/sites/default/files/documents/1867/Fall-Armyworm-IPM-Guide-for-Africa-Jan_30-2018.pdf). This research exercise clarified which learning outcomes were both critical and feasible for a digital course.

In tandem with this approach, the team reviewed a large number of questions and responses from farmers across their existing digital forums. This provided an understanding of the most common myths and knowledge gaps in this community.

* Design & Development. The course concept was built on a model created for a previous dairy course feature. The team had identified that one of the most successful elements of the previous course was a quiz-style component, where users could test their knowledge of a small, bite-sized piece of learning material. This design approach is common to many digital learning platforms, but few, if any, such approaches were being used for farmers. Through multiple rounds of user testing with prototypes, using a design sprint methodology inspired by IDEO and the Google Ventures method, the team rapidly refined a chatbot flow that was compelling to users in tests. Key learnings were around the right quiz length and difficulty, the best ratio of images to text, and the importance of interlacing expert content with questions and experiences from real farmers; a minimal sketch of such a flow follows below.

The key thing we looked for was a smile of delight as the user played with the product, along with evidence that they were retaining new information over a longer period (e.g., did they remember what the larvae looked like when taking the test again two weeks later?). These simple principles go a long way toward shaping powerful, impactful products.
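To make the quiz-style flow concrete, the sketch below models one bite-sized lesson with an image followed by a single check question, driven through generic send/ask callbacks. The content, data structures, and function names are illustrative assumptions for this write-up, not farm.ink’s actual code or course material.

```python
from dataclasses import dataclass

@dataclass
class QuizStep:
    """One bite-sized piece of learning material plus a single check question."""
    lesson: str          # short lesson text
    image_url: str       # rich visual content shown alongside the lesson
    question: str        # quiz question testing the lesson
    options: list        # answer options presented to the user
    correct_index: int   # index of the right answer in `options`

# Illustrative content only; a real course would draw on expert sources
# such as the FAW IPM guide and be checked for the local context.
FAW_COURSE = [
    QuizStep(
        lesson=("Fall Armyworm larvae have an inverted Y mark on the head and "
                "four dark spots in a square on the last body segment."),
        image_url="https://example.org/faw-larva.jpg",  # placeholder image
        question="Which marking helps identify a Fall Armyworm larva?",
        options=["A red stripe on the tail",
                 "An inverted Y on the head",
                 "Blue spots on the legs"],
        correct_index=1,
    ),
]

def run_course(steps, send, ask):
    """Drive the chatbot flow: send each lesson, ask its question, keep score.

    `send(text)` posts a message to the user and `ask(text, options)` returns
    the index of the option the user tapped; both are platform-specific hooks.
    """
    score = 0
    for step in steps:
        send(step.lesson)
        send(step.image_url)
        choice = ask(step.question, step.options)
        if choice == step.correct_index:
            score += 1
            send("Correct! Well done.")
        else:
            send(f"Not quite. The answer was: {step.options[step.correct_index]}")
    send(f"You scored {score} out of {len(steps)}.")
    return score
```

In user testing, the main levers to tune in a flow like this would be the quiz length and difficulty and the ratio of images to text, echoing the learnings described above.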

* Deployment & Implementation. The user tests were translated into a live beta product that moved testing from tens to hundreds of users. This stage gave the team data that validated the product’s capacity to engage and educate users. At this stage of development, the team typically inserts small surveys into the user experience to ask users what further features they are interested in. For example, in the case of the previous dairy-focused product, users requested more information on current milk market prices. The team developed an approach that gave users access to average prices once they had added their own local market price. This quickly led to the aggregation of thousands of weekly price reports and a powerful new feature for dairy farmers; a minimal sketch of this give-to-get mechanic follows below.
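The sketch below illustrates the idea that a user only sees the average reported price for a market once they have submitted their own report. The data structures, function names, and figures are assumptions made for illustration, not farm.ink’s implementation.

```python
from collections import defaultdict
from statistics import mean

# price_reports[(market, week)] -> list of prices submitted by users that week
price_reports = defaultdict(list)
# contributors[(market, week)] -> set of user ids who have submitted a price
contributors = defaultdict(set)

def submit_price(user_id, market, week, price):
    """Record a user's own market price report."""
    price_reports[(market, week)].append(price)
    contributors[(market, week)].add(user_id)

def average_price(user_id, market, week):
    """Return the average reported price, but only to users who contributed."""
    if user_id not in contributors[(market, week)]:
        return None  # prompt the user to add their own local price first
    return round(mean(price_reports[(market, week)]), 2)

# Example usage with made-up figures
submit_price("farmer_1", "Nakuru", "2019-W20", 38.0)
submit_price("farmer_2", "Nakuru", "2019-W20", 42.0)
print(average_price("farmer_1", "Nakuru", "2019-W20"))  # 40.0
print(average_price("farmer_3", "Nakuru", "2019-W20"))  # None: no contribution yet
```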

* Monitoring & Evaluation. The core metrics for a product feature like this are test scores and engagement levels, and both are best considered over a longer timeframe. One powerful approach is cohort analysis: monitoring the usage patterns of a segment of users (e.g., everyone who subscribed during the same week) over a period of time. These metrics and techniques are standard in the world of mobile product design and digital learning, and there is great potential to build on existing frameworks and evidence in these areas. More generally, a conversational product like a chatbot has proved very powerful as a survey mechanism, enabling the team to quickly understand the general value of product features. The team will share more analytics and results after the full-scale rollout of these learning features toward the end of 2019.
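As a sketch of the cohort analysis described above, the snippet below groups users by the ISO week in which they signed up and reports what fraction of each cohort was active in later weeks. The data shape and field names are assumptions for illustration.

```python
from collections import defaultdict
from datetime import date

def iso_week(d):
    """Label a date by its ISO year and week, e.g. '2019-W23'."""
    y, w, _ = d.isocalendar()
    return f"{y}-W{w:02d}"

def cohort_retention(users, events):
    """users: {user_id: signup_date}; events: [(user_id, activity_date), ...].

    Returns {cohort_week: {weeks_since_signup: fraction_of_cohort_active}}.
    """
    cohort_size = defaultdict(int)
    active = defaultdict(set)  # (cohort_week, week_offset) -> active user ids

    for user_id, signup in users.items():
        cohort_size[iso_week(signup)] += 1

    for user_id, activity in events:
        signup = users[user_id]
        offset = (activity - signup).days // 7
        active[(iso_week(signup), offset)].add(user_id)

    retention = defaultdict(dict)
    for (cohort, offset), user_ids in sorted(active.items()):
        retention[cohort][offset] = round(len(user_ids) / cohort_size[cohort], 2)
    return dict(retention)

# Example with made-up data: two users signing up in the same week
users = {"a": date(2019, 6, 3), "b": date(2019, 6, 4)}
events = [("a", date(2019, 6, 5)), ("a", date(2019, 6, 18)), ("b", date(2019, 6, 6))]
print(cohort_retention(users, events))
# {'2019-W23': {0: 1.0, 2: 0.5}}
```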

Opportunities and Challenges

The major challenge for the Fall Armyworm chatbot was to create something engaging enough to incentivize users to keep using the product. The team used a number of techniques to do this, including creating rich, visual content and gamifying the experience through points, levels, and badges.
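As an illustration of the gamification techniques mentioned, the sketch below awards points for correct quiz answers and maps cumulative points to levels and badges; the point values, thresholds, and badge names are invented for this example.

```python
# Points thresholds and badge names are invented for illustration only.
LEVELS = [(0, "Seedling"), (50, "Sprout"), (150, "Grower"), (300, "Harvest Hero")]

def level_for(points):
    """Return the highest-level badge whose threshold the user has reached."""
    name = LEVELS[0][1]
    for threshold, badge in LEVELS:
        if points >= threshold:
            name = badge
    return name

def award_points(profile, correct_answers, completed_course=False):
    """Update a user profile dict with points and the badge they unlock."""
    profile["points"] = profile.get("points", 0) + 10 * correct_answers
    if completed_course:
        profile["points"] += 25  # completion bonus
    profile["badge"] = level_for(profile["points"])
    return profile

print(award_points({"points": 40}, correct_answers=3, completed_course=True))
# {'points': 95, 'badge': 'Sprout'}
```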

For certain agronomic advice and information, country-level regulation must be considered. The team worked with experts such as CABI to ensure that advice, particularly any involving pest treatments, was appropriate for the local context.

The success of the Fall Armyworm prototype highlights the opportunity to provide valuable services to many thousands of users through smartphone-based delivery channels. It also illustrates the potential for digital learning to vastly improve the knowledge and practices of small-scale farmers.

Results

While the specific results of the Fall Armyworm service won’t be made public until later in 2019, overall product engagement shows much higher repeat-use metrics than typical smartphone-native apps in the region. Recent survey results from the general user base revealed that 92% of a sample of thousands of users said using the platform had changed the way they farm, with over half of these giving examples of specific changes. A key addition has been the social character of the platform, making it easy for farmers to engage fellow farmers with questions or advice. 87% of the sample said their confidence in farming had increased, with a significant factor being the social support network users felt they had formed, in addition to pure theoretical learning.

Lessons Learned and Recommendations: Our lessons learned and recommendations are centered on the principles of “be data driven” and “design for scale”:

• Be Data Driven (https://digitalprinciples.org/principle/be-data-driven/). Mobile products are treasure troves of data, which can be used ethically and wisely. In farm.ink’s case, qualitative user interview data was captured early on to validate product value, and usage data was captured later on to validate more quantitative effects of use.

• Keep your design simple and flexible (https://digitalprinciples.org/principle/design-for-scale/). It’s important to remember that lots of things change in early product development, so you may find that the features most compelling to users are not what you originally anticipated. It is possible to maintain a focus on impact without being hamstrung by irrelevant metrics or designs.

• Build products quickly. The development space can still learn a lot from rapid prototyping. This means releasing products quickly and drawing lessons on what has succeeded and what has failed. The farm.ink team implements less than a tenth of what it prototypes and tests with users. Having failures is part of learning, and team expectations will not always meet reality; that’s expected and even encouraged.

• Be creative and smart with how you use product analytics. Use data points to see how users adopt the product, and use that adoption data to evaluate the product’s impact on users.

• Design decisions are crucial in creating an interactive program. Pay attention to the Principles to create products that can scale and use data properly (https://digitalprinciples.org/).