Tools | August 7, 2018

Ethical OS Toolkit: A guide to anticipating the future impact of today’s technology. Or: How not to regret the things you will build.

A toolkit for technologists, teachers, and students to assess the ethical implications of developing products that use artificial intelligence. A joint creation of the Institute for the Future and Omidyar Network’s Tech and Society Solutions Lab. The 78-page toolkit is a PowerPoint slide deck that includes:

  • A checklist of 8 risk zones to help you identify the emerging areas of risk and social harm most critical for your team to start considering now.
  • 14 scenarios to spark conversation and stretch your imagination about the long-term impacts of tech you’re building today.
  • 7 future-proofing strategies to help you take ethical action today.
How can you predict the unintended consequences of the products you are building today? How can you actively safeguard users, communities, societies, and your company from new future risks? As we think about the future of tech at Google, Yoav Schlesinger (Principal for the Tech and Society Solutions Lab at Omidyar Network) showcases the Ethical Operating System (Ethical OS) and how it can help anticipate risks and scenarios before they happen. By asking the right sets of questions, Ethical OS provides a practical tool for beginning the tough conversations that ensure the tech we’re building lives up to its promise as a force for good. Runtime: 52 minutes.

For Programmers: Things to Consider

  • How could your product, service or business most positively impact the world you want to see? Beyond the problem you’re trying to solve, how will the world be better for having this innovation introduced?
  • Could your product or product feature harm someone directly? Consider not only the first-order consequences, but try to imagine second- and third-order consequences as well.
  • Could a malicious actor use your product or product feature to harm a person or group? Commit a crime?
  • Could the business model behind your product harm someone or a group of people? Could it fuel addiction? Could it impact certain groups—children, seniors, minority groups, the disenfranchised or disempowered—in different ways?
  • Could your product put someone’s privacy at risk? What happens if someone’s data gets leaked, hacked, or lost? Do you have a plan in place for these contingencies? How can you best build trust with your users?
  • Could your product or business do anything your users are unaware of? If so, why are you not sharing this information explicitly? Would it pose a risk to your business if it showed up in tomorrow’s news?
  • What can or should you do as a team to ensure your company lives up to the best version of itself?

For Teachers: Ways to Use the Toolkit in the Classroom

  • Ask students to read about all 8 Risk Zones and pick one that interests them. Challenge them to collect signals from that risk zone. Signals are real examples of things already happening in the present that might influence or shape the future. Signals can be found in the news, blogs, social media, scientific journals, tech conferences, TED talks, labs, and anywhere people or companies are sharing new products, ideas, or findings. Students can work in teams with others interested in the same risk zone.
  • Challenge students to pick a real, emerging technology, product, or app that they are interested in. Have them go through the risk mitigation questions and check off the questions in the risk zones they think are most relevant to the tech. Then have them choose one of those questions and attempt to answer it, looking for ways to make that tech more ethical and less risky.
  • Have students read all 14 “Risky Futures” scenarios and pick one that captures their imagination. Ask them to brainstorm what ethical risks or harms might occur in that future. Students can work in teams with others interested in the same scenario. This can also be done as a class discussion.
  • Hold a class discussion about one or more of the 7 strategies. Have students vote on their favorite strategy and then spend time imagining in depth how the class’s top one or two strategies might play out in the future.

As technologists, it’s only natural that we spend most of our time focusing on how our tech will change the world for the better. Which is great. Everyone loves a sunny disposition. But perhaps it’s more useful, in some ways, to consider the glass half empty. What if, in addition to fantasizing about how our tech will save the world, we spent some time dreading all the ways it might, possibly, perhaps, just maybe, screw everything up? No one can predict exactly what tomorrow will bring (though somewhere in the tech world, someone is no doubt working on it). So until we get that crystal ball app, the best we can hope to do is anticipate the long-term social impact and unexpected uses of the tech we create today.

If the technology you’re building right now will someday be used in unexpected ways, how can you hope to be prepared? What new categories of risk should you pay special attention to now? And which design, team, or business-model choices can actively safeguard users, communities, society, and your company from future risk? The last thing you want is to get blindsided by a future YOU helped create. The Ethical OS is here to help you see more clearly.

The Ethical Operating System can help makers of tech, product managers, engineers, and others get out in front of problems before they happen. It’s designed to facilitate better product development, faster deployment, and more impactful innovation, all while striving to minimize technical and reputational risks. This toolkit can help inform your design process today and manage risks around existing technologies in the future.


See the companion tool, the Risk Mitigation Checklist. More information at EthicalOS.org.

Related article in Wired Magazine: https://www.wired.com/story/ethical-os/