The power of machine learning to change—and maybe even save—the world

By Jennifer Marsman, Principal Engineer, Microsoft AI for Earth

[Image: bird houses]

In the last two decades, artificial intelligence (AI) has grown from the focus of a small community of data scientists into something woven into many people’s daily lives. Machine learning, computer vision, and other AI disciplines—supported by the cloud—are helping people achieve more, from mundane tasks, like avoiding a traffic jam, to revolutionary work, like the search for cancer cures.

Over the past year, Microsoft has been on a journey to apply these transformative technologies to the world’s biggest environmental challenges. On July 12, 2017, Microsoft launched AI for Earth as a $2 million program in London, with a goal of providing AI and cloud tools to researchers working on the frontlines of environmental challenges in the areas of agriculture, water, biodiversity, and climate change.

Since that time, AI for Earth has grown into a five-year, $50 million program, with 112 grantees in 27 countries and seven featured projects. People are using machine learning and computer vision to learn more about our planet and how it’s changing than was previously possible, and they are increasingly using these insights to chart a better future.

These are big goals, but we’re confident in our ability to get there because we know how advanced tools like machine learning and computer vision already are. Consider machine learning. We have come a long way from the simple pattern matching of ELIZA. Fifteen years ago, when I got my degree in artificial intelligence, problems like facial recognition, machine translation, and speech recognition were dreams of the field; now they are largely solved. Among other things, machine learning can group similar items together, detect unusual occurrences, and build mathematical models of historical data to make predictions about the future.
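As a toy illustration of those three capabilities (grouping, anomaly detection, and prediction), here is a short Python sketch using scikit-learn on synthetic data. It is not drawn from any AI for Earth project; it simply shows each technique in a few lines.

```python
# A toy illustration of the three capabilities above, using scikit-learn
# on small synthetic data (not an AI for Earth dataset).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
points = rng.normal(loc=[[0, 0]] * 50 + [[5, 5]] * 50, scale=0.5)

# 1. Group similar items together (clustering).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)

# 2. Detect unusual occurrences (anomaly detection); -1 marks outliers.
outliers = IsolationForest(random_state=0).fit_predict(points)

# 3. Build a model of historical data to make future predictions (regression).
years = np.arange(2000, 2020).reshape(-1, 1)
counts = 500 - 8 * (years.ravel() - 2000) + rng.normal(0, 10, size=20)
trend = LinearRegression().fit(years, counts)
print(trend.predict([[2025]]))  # projected count for a future year
```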

These techniques are incredibly helpful for sorting through large amounts of data. Today, we’re excited to share a new story about the power of this technology, one that also helps answer a basic question: what is the value of AI when we don’t already have massive amounts of data waiting to be processed? This is an issue for many individuals and organizations working in biodiversity, especially when the species they study are very small, travel great distances, or are hidden from view.

That’s precisely the challenge we set out to address recently at the most magical place in the world – Walt Disney World Resort. Purple martins are yearly visitors to Disney, nesting at the park before continuing their journey back to the Brazilian Amazon. Disney scientists have worked with the purple martin community for the past 20 years, providing homes for the birds and studying the species across more than 170 nests each year. Despite these annual visits, there is still much to learn about the nesting behavior of these birds, in part because they nest in enclosed structures known as gourds. Some of what is known is troubling – the species is in decline, with an estimated population drop of 40 percent since 1966.

How do you close this data gap quickly, to better understand the species and protect its future? Enter AI. Tiny connected homes, equipped with cameras and cloud-connected sensors, were installed, and computer vision applied to their footage began to deliver data on behaviors that were rarely observed before, such as hatching and the care and growth of young purple martins. External factors, such as temperature, humidity, and air pressure, were also recorded. Disney and Microsoft hope to expand this work, using AI to pull all of this data together into insights that can inspire the next generation of conservationists to protect purple martins.
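This post doesn’t describe the internals of the Disney system, but as a rough, hypothetical sketch of the kind of pipeline involved, a nest gourd might report environmental readings alongside camera frames, with a simple frame-difference check flagging moments worth a closer look. All names and thresholds below are illustrative, not the actual system.

```python
# A rough, hypothetical sketch of the kind of pipeline described above:
# each nest gourd reports environmental readings alongside camera frames,
# and a simple frame-difference check flags activity worth reviewing.
# The actual Disney/Microsoft system is not described in this post.
from dataclasses import dataclass
from datetime import datetime, timezone

import numpy as np


@dataclass
class NestReading:
    gourd_id: str
    timestamp: datetime
    temperature_c: float
    humidity_pct: float
    pressure_hpa: float


def activity_detected(prev_frame: np.ndarray, frame: np.ndarray, threshold: float = 12.0) -> bool:
    """Flag a frame for review when it differs noticeably from the previous one."""
    return float(np.abs(frame.astype(float) - prev_frame.astype(float)).mean()) > threshold


# Simulated values standing in for a gourd's sensors and camera.
reading = NestReading("gourd-07", datetime.now(timezone.utc), 28.5, 61.0, 1013.2)
prev, curr = np.zeros((64, 64), dtype=np.uint8), np.full((64, 64), 40, dtype=np.uint8)

if activity_detected(prev, curr):
    print(f"{reading.gourd_id}: possible activity at {reading.timestamp:%H:%M} UTC "
          f"(T={reading.temperature_c} C, RH={reading.humidity_pct}%)")
```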

While this is our newest story, this work is happening across the world. We’re proud to support AI-enabled solutions for biodiversity, including:

PAWS: Machine learning to predict poaching. The Protection Assistant for Wildlife Security (PAWS) was spearheaded by a team of researchers at USC, an AI for Earth partner, with additional work by a member of that team now at Carnegie Mellon University, an AI for Earth grantee. PAWS processes data about previous poaching activity in an area and creates optimized routes for rangers to patrol, based on where poaching is most likely to occur. These routes are also randomized to keep poachers from learning and adapting to patrol patterns. The PAWS algorithm is currently being improved so that it can incorporate new information that rangers see while on patrol, such as human footprints, to alter the proposed patrol route in real time.

Access to ranger patrol data is key. That’s why PAWS partnered with the Uganda Wildlife Authority at Queen Elizabeth National Park, which had collected 14 years of patrol data and more than 125,000 observations of animal sightings, snares, animal remains, and other signs of poaching. PAWS is now being used in several parks, and the system has led to more observations of poacher activity per kilometer patrolled than were possible without the technology.
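PAWS itself combines machine learning with game-theoretic patrol planning, and its actual models are more sophisticated than this, but the core idea of scoring locations by poaching risk from historical patrol observations can be caricatured with a simple classifier over park grid cells. The features, labels, and data below are entirely made up.

```python
# An illustrative sketch only: PAWS combines machine learning with
# game-theoretic patrol planning, but the core idea of scoring locations by
# poaching risk from historical patrol observations can be caricatured with
# a simple classifier over park grid cells. All features, labels, and data
# here are made up.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical features per grid cell: distance to road (km), distance to
# water (km), an animal-density index, and the number of past patrol visits.
X = rng.uniform(0, 10, size=(500, 4))
# Synthetic labels: snares "found" more often near roads and water where
# animal density is high.
y = ((X[:, 0] < 3) & (X[:, 1] < 4) & (X[:, 2] > 5)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score new cells and patrol the riskiest ones first, with a little noise
# added so the resulting routes are not perfectly predictable.
candidate_cells = rng.uniform(0, 10, size=(50, 4))
risk = model.predict_proba(candidate_cells)[:, 1]
noisy_order = np.argsort(-(risk + rng.normal(0, 0.05, size=risk.shape)))
print("Cells to patrol first:", noisy_order[:5])
```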

Wildbook: Machine learning and computer vision to identify species. One of our newest featured projects, Wild Me, is showing what is possible by pushing the limits of computer vision, with an AI tool that identifies, captions, and moderates pictures. Researchers often have little meaningful data on a species. But computer vision makes it possible to tap into an explosion of images, available for free or at low cost from camera traps, drones, professional photographers, safari-goers, and citizen scientists. Wild Me is not only using computer vision to identify images of zebras, for example, but is also identifying the individual animals in those photos—helping to address a fundamental problem in conservation. If we can identify individual animals, we no longer need to physically tag them, which can harm the animals.

This new data on animals then goes into Wildbook, the platform developed by Wild Me. Using machine learning, it’s possible to either match an animal within the database or determine that the individual is new. Once an animal is identified, it can be tracked in other photographs. Wildbook stores information about the animals, such as their location at a specific time, in a fully developed database. This combination of AI tools and human ingenuity makes it possible to connect information about sightings with additional relevant data, enabling new science, conservation, and education at unprecedented scales and resolution. With a much more detailed and useful picture of what is happening, researchers and other decision-makers are able to implement new, more effective conservation strategies.
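Wildbook’s matching algorithms are not reproduced here, but a common way to frame the match-or-new decision described above is to compare image embeddings against a database of known individuals. The sketch below uses made-up embeddings and a hypothetical match_or_register helper purely to illustrate the idea.

```python
# A generic sketch of individual re-identification by embedding comparison.
# This is not Wildbook's implementation; it only illustrates the
# match-or-register-new decision described above, using made-up embeddings
# and a hypothetical match_or_register helper.
import numpy as np


def match_or_register(embedding: np.ndarray,
                      known: dict,
                      threshold: float = 0.85) -> str:
    """Return the best-matching individual ID, or register a new one."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    if known:
        best_id, best_sim = max(((k, cosine(embedding, v)) for k, v in known.items()),
                                key=lambda kv: kv[1])
        if best_sim >= threshold:
            return best_id  # this individual has been seen before
    new_id = f"zebra-{len(known) + 1:04d}"
    known[new_id] = embedding
    return new_id


# Made-up vectors standing in for features extracted from photographs.
rng = np.random.default_rng(1)
database = {}
first_sighting = rng.normal(size=128)
print(match_or_register(first_sighting, database))                             # registers zebra-0001
print(match_or_register(first_sighting + rng.normal(0, 0.01, 128), database))  # matches zebra-0001
```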

We see incredible potential and tremendous progress in our grantees’ work and in the explosive pace at which new algorithms are being built, refined, and made publicly available. And these are just a few of the grantees, featured projects, and partners we’re working with in the area of biodiversity; there’s equally exciting work in water, agriculture, and climate change that we look forward to sharing on this blog in the near future. Check out the amazing organizations and individuals we’re supporting, apply for a grant to join us or our new partnership with the National Geographic Society, or just follow our progress on Twitter at @Microsoft_Green, or me at @jennifermarsman.
