November 18th 2021:
You have made almost four million classifications in this project!
Thank you so much for your continued support and all your great work 🐝
Our researchers are busy analyzing the data. We'll be back with new tasks for you very soon.

Follow our Twitter account for any news related to the project!
PollinatorWatch on Twitter

Research

PollinatorWatch: Three Years in the Making

For the past three years, researchers from Aarhus University in Denmark and The Arctic University of Norway in Tromsø have been collaborating to gather and analyse large quantities of ecological data from various locations across the Arctic. Our goal is to gain a better understanding of how organisms in these areas are affected by climate change.

The idea is simple (at least in theory): instead of manually counting observations we use time lapse photography to monitor plant and insect phenology in the areas. The thousands of images that result from this process are then used to train a convolutional neural network to automatically locate flowers - and hopefully soon, pollinators - in the images. By combining skills from biology and machine learning we aim to increase the amount of data that can be analysed in a season and at the same time decrease the amount of time spent on manual labour. This in turn allows us to spend more time on the thing that actually matters: deepening our understanding of plant-pollinator interactions and how they are affected by the current climate crisis.

In 2018 the project received support from the Independent Research Fund Denmark. In addition, its principal researcher, Toke Thomas Høye, received the prize for "Original Idea of the Year".

Where We Are and Where We Are Headed

You are looking through a lot of images, but the truth is that we have many more! The results you produce will be used to train our insect detection model. With this, we can scan a very large number of flowers and find the insect visitors. Your results, and the insect detection model, will let us study the interactions between pollinators and flowers. For example, we will investigate whether pollinators visit flowers all through the season or only in short bursts. This will give us knowledge about the sensitivity of these flower-pollinator systems to changes in climate. We are not quite there yet, but we know that it is easier to succeed when you have a clear idea of where you are headed. Below we have outlined three phases of PollinatorWatch. We are currently at Phase 1.

Phase 1:
Neural networks don't train themselves. In fact, training requires large quantities of data. This is where Zooniverse enters the picture. We have already trained a network to classify and localise flowers of the species Dryas integrifolia with 84% precision and 92% recall. The predictions from this network are what is presented to volunteers in Phase 1.
Once PolliWatchers (that's you!) have identified ~2000 images containing pollinators, we are ready to train a new network and move on to:
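For readers curious what "84% precision and 92% recall" actually measures, here is a minimal sketch of how the two numbers are computed from a detector's output. The counts in the example are made up for illustration; they are not our evaluation data.

```python
# Illustrative sketch: precision and recall for a flower detector.
# The counts below are invented for the example, not project results.

def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: of the flowers the network flagged, how many were real?
    Recall: of the real flowers, how many did the network find?"""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. 84 correct detections, 16 false alarms, 7 missed flowers
p, r = precision_recall(84, 16, 7)
print(f"precision = {p:.2f}, recall = {r:.2f}")
# prints "precision = 0.84, recall = 0.92"
```

High precision means few false alarms; high recall means few missed flowers. Improving one often trades off against the other, which is why both are reported.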

Phase 2:
In Phase 2 we will have trained a new network on the images identified in Phase 1. This means that we will have a network that can automatically classify whether an image of Dryas integrifolia contains a pollinator or not (essentially automating the task that PolliWatchers helped us with in Phase 1). Applying this network to our (very large!) image pool will help us narrow it down substantially.
As we all know by now, neural networks are far from perfect. Hopefully the new network will return images containing pollinators, but to make sure, we need PolliWatchers to run through the crops and tell us whether the computer got it right. That's Phase 2.
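The filtering step in Phase 2 can be sketched in a few lines: the trained network scores each flower crop, and only crops scoring above a threshold are forwarded to volunteers for a yes/no check. The `classify` function below is a stand-in for the real network (here just a stub so the example runs), and the image pool and threshold are invented for illustration.

```python
# Hypothetical sketch of Phase 2's filtering step. `classify` stands in
# for the trained pollinator network; here it simply reads a pre-computed
# score so the example is self-contained.

def classify(crop):
    # Stand-in for the neural network: returns a pollinator probability.
    return crop["score"]

def select_for_review(crops, threshold=0.5):
    """Keep only crops the network thinks contain a pollinator."""
    return [c for c in crops if classify(c) >= threshold]

pool = [
    {"id": "img_001", "score": 0.91},  # likely pollinator
    {"id": "img_002", "score": 0.12},  # likely empty flower
    {"id": "img_003", "score": 0.67},  # borderline, still worth checking
]

for crop in select_for_review(pool):
    print(crop["id"])  # these crops would go to PolliWatchers to verify
```

The threshold controls the trade-off: set it low and volunteers see more empty flowers, set it high and the network may silently discard real pollinators.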

Phase 3:
This is where things get really interesting - and also a little less clear. Our research aims to answer the question of how the seasonal variations caused by climate change affect both flowers and pollinators as well as their interaction. We have colleagues working on software that can 'segment' insects out of images and insert them into new ones. Right now we are considering using this technology to create 'fake' images of pollinators that can be used to train a neural network to detect insect species in more general settings (i.e. not only on Dryas integrifolia). We are also considering bringing in images of a different Arctic flower species, Silene acaulis, to see if we can replicate our results.
PolliWatchers' tasks in Phase 3 could be to determine different sub-species of insects or to test the capabilities of one of these new networks. It will all depend on the results we obtain from Phases 1 and 2. All we know is that the prospects are exciting!

Our project is still ongoing and continues to evolve into new and exciting research. We gather still more images each year, refining our methods as equipment gets better and cheaper. We have now also started tracking a different species of flower, Silene acaulis. One of our next steps is to adapt the neural network to detect flowers of this species as well as Dryas integrifolia.

Another exciting project that has sprouted from this research is the creation of a robot that can count and handle insects for inspection. In collaboration with Toke Høye, engineering students from Aarhus University have developed a robot that can register, pick up and store insects for analysis. You can read more about this project here (click the video at the bottom for English).

You are helping us study Arctic pollinators in a new way and for that, we are very grateful. Thank you!