Amidst a frenzy of events for NYC's Climate Week, frog hosted our friends and former partners from UN OCHA's Centre for Humanitarian Data along with a few distinguished speakers to discuss data collection, building unbiased algorithms, and how we can use the power of predictive analytics to get more proactive about delivering humanitarian aid in times of crisis.
Climate Week shines a light on this pressing issue as extreme weather and unpredictable storms create greater and more frequent humanitarian crises around the world. In the last few months alone, we've seen the devastation Hurricane Dorian brought to the Bahamas and the massive food crisis caused by flooding after Cyclone Idai hit Malawi, Mozambique and Zimbabwe. These storms devastate communities, wipe out access to clean water, and destroy crops, homes and other resources that people depend on for survival. UN OCHA is dedicated to providing aid in these times of crisis and is working hard to become proactive, rather than reactive, in our greatest times of need.
frog worked with OCHA in 2014 to develop the Humanitarian Data Exchange (HDX), a platform that enables workers from the UN, NGOs, governments and universities to radically improve data sharing during extreme situations. Since then, the Centre for Humanitarian Data was founded to manage HDX and the Humanitarian Exchange Language (HXL). A trusted environment like HDX gives humanitarians confidence that personally identifiable information will be protected. Adding depth to that intelligence, HXL makes the sharing of important data less all-or-nothing: overlapping organizations can benefit from knowing which types of data are being collected without needing every underlying detail. In this way, partners can discern meaningful signals without putting individuals at risk. In addition to providing data services, data literacy, data policy and network engagement, the Centre is also expanding its capabilities into predictive analytics to help manage and finance aid for humanitarian crises in a more proactive way.
The evening featured expert speakers and a keynote presentation, followed by a panel Q&A with the audience. Our lineup included Andrew Kruczkiewicz, Technical Advisor at the Red Cross Red Crescent Climate Centre, who discussed forecast-based financing in light of climate change; Leonardo Milano, Predictive Analytics Team Lead at the OCHA Centre for Humanitarian Data; and Nadia Piffaretti, Senior Economist at the Global Center on Conflict, Security and Development at The World Bank. The keynote was delivered by Cathy O'Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, who discussed the ethical implications of using predictive models for humanitarian response.
After welcome remarks by frog's Global Impact Lead Jon Wasserman and Centre Lead Sarah Telford, Lisa Carty (Director of UN OCHA's Humanitarian Financing & Resource Mobilization Division) took the stage to speak about the global humanitarian situation and how UN OCHA is using predictive analytics and anticipatory humanitarian financing to make response faster and more cost-effective. "'We are fanatical about improving the world,'" Lisa said, quoting frog's manifesto. "That's what brings us together: the humanitarian and design communities."
And when it comes to that intersection, we know that "when organizations collaborate on common goals, they can learn from each other's strengths and challenges," says Jon Wasserman, "but it's not only best practices that get shared. When partnerships are forged, the durability and efficacy of the humanitarian data landscape increases."
To delve further into the principles surrounding the current humanitarian data landscape, we welcomed Cathy O'Neil to the stage, who spoke more specifically about the ethics of algorithms. Cathy opened by stating that you don't need a PhD in mathematics to understand how the algorithms used in predictive analytics work. In fact, she urged, it is everyone's responsibility to care about the implications of these algorithms. "You already know what a predictive algorithm is. It's in your head," she said. Algorithms follow the same human logic we draw on every day: collecting data on what was "successful" in the past in order to predict what will work in the future.
The problem is that human logic is inherently biased by the experiences of the individual, meaning "success" for you may look entirely different from "success" for someone else facing different circumstances. In much the same way, algorithms are often built without taking into account the full ecosystem of stakeholders involved.
One apt example Cathy cites involves the education and incarceration systems. Predictive analytics are often used to try to estimate an individual's propensity to commit a crime. And while this kind of predictive criminology may not look like a Minority Report dystopia yet, it can be just as harmful. In many cases, data on crimes actually committed is hard to obtain, so police use data on arrests by neighborhood or suspensions in school districts instead. The problem is that an arrest is not always indicative of a crime committed. Many reports find that arrests and suspensions are doled out disproportionately in low-income neighborhoods and underserved school districts. Still, police looking at this data are more likely to keep returning to those neighborhoods and schools to look for crime. The biased dataset functions as a self-fulfilling prophecy that perpetuates the same problems without ever getting to the root to fix them. "Biased algorithms create feedback loops that reward destructive predictive behavior," says Cathy. The numbers show the algorithms are "succeeding," but at what cost?
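The feedback loop Cathy describes can be made concrete with a toy simulation. The sketch below is our illustration, not data or code from her talk: two hypothetical neighborhoods have identical true crime rates, but a small imbalance in historical arrest records steers patrols toward one of them, and the resulting arrests "confirm" the prediction.

```python
# Toy sketch (illustrative assumptions, not real data): neighborhoods
# "A" and "B" have the SAME underlying crime rate. The only difference
# is a slight imbalance in historical arrest records.
TRUE_CRIME_RATE = 0.10       # identical in both neighborhoods
PATROLS_PER_DAY = 100

arrests = {"A": 12.0, "B": 10.0}  # slightly biased starting data

for day in range(365):
    # The "predictive" policy: send every patrol to the neighborhood
    # with the most recorded arrests so far.
    target = max(arrests, key=arrests.get)
    # Arrests can only happen where patrols actually go, so the new
    # data confirms the old prediction -- a self-fulfilling prophecy.
    arrests[target] += PATROLS_PER_DAY * TRUE_CRIME_RATE

share_A = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"Share of recorded arrests in A after one year: {share_A:.1%}")
```

After a simulated year, the records say neighborhood A accounts for nearly all arrests, even though both neighborhoods were identical by construction: the data reflects where police looked, not where crime happened.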
So what's the answer if we're trying to harness the power of these predictive algorithms to aid our most vulnerable populations in times of crisis? The answer, of course, is not so simple. But we do know it will take human-centered, diverse and inclusive teams to root out bias. And, Cathy posits, perhaps a Hippocratic oath for those building the algorithms. Like doctors, shouldn't they promise to "do no harm"?
What is clear is that we will continue to face difficulty in collecting data on some of the most underserved and vulnerable populations around the world. As designers, coders, engineers and policymakers using predictive analytics, we must always be asking ourselves, "Who is going to be affected by this algorithm?" And even further, "For whom does this algorithm fail?"
Jon points out that while we talk a lot about "unconscious bias" or "implicit bias" when critiquing data analysis, that framing ignores the potential for active, conscious bias. Selective targeting or response may have severe impacts on vulnerable communities. There must be responsible humans making thoughtful decisions before, during and after data collection and analysis. By fostering a trusted environment for sharing and discourse, HDX improves the landscape of responsible data use.
More than ever, algorithms are being used everywhere: they aid us in our day-to-day activities and function as life-altering decision makers. But algorithms are only as just as the systems that make and perpetuate them. This means it's more important than ever to have diverse voices in the room working on every aspect of these problems, from engineering and coding to designing and strategizing how they will be deployed and how the data will be collected. There are no more excuses for biased data perpetuated by biased systems. By putting people first and operating with empathy and understanding, along with engineering and design expertise, we believe we can start to use predictive analytics for good.
frog, part of Capgemini Invent, is a global design and innovation firm. We transform businesses at scale by creating systems of brand, product and service that deliver a distinctly better experience. We strive to touch hearts and move markets. Our passion is to transform ideas into realities. We partner with clients to anticipate the future, evolve organizations and advance the human experience.