MICS rethinks impact assessment in citizen science from the bottom up, and puts technology and data at the service of people.
In MICS, data produced by the users of the platform belong to the users of the platform. We believe data should be considered a public good.
Machine-learning algorithms now help make decisions on many issues, including impact assessment. But important decisions are often happening in the dark.
These programs are often unaccountable. To arrive at their decisions, machine-learning algorithms automatically build complex models from the data sets they are trained on, so that even the people using them may not be able to explain why or how a conclusion was reached. They're a black box.
Software owners often claim that increasing transparency could risk revealing their intellectual property. This is not the case in MICS, where all software is open source.
"If my project has a low positive impact, I don't necessarily care so much about how the algorithm works. I actually just want to know why the impact is low and have some guidance on how to improve," James Sprinks, coordinator of iMars, says.
In 2022, Earthwatch implemented Alquimics, a new impact-assessment feature using machine-learning and a rule-based engine in MICS's open-source web application.
"Alquimics is a major step forward in terms of transparency and accountability on the intersection between citizen science and AI," Luigi Ceccaroni, MICS coordinator, says.
He warns, however, that transparency in the impact-assessment process is only one side of the accountability coin. "It gives you some idea about why a decision has been made in a particular way, but it doesn't mean the decision is justified or legitimate," he says.
One problem, sometimes, is a lack of transparency around what data are being used to train impact-assessment algorithms in the first place.
In many cases, algorithms are building assumptions about citizens and their activities based on variables those citizens are not even aware of. Ceccaroni says organisations should be required to demonstrate that the data points they use to measure impact or make other decisions are both relevant and lead to accurate predictions.
MICS also wants to ensure the right to reasonable inferences. Under GDPR, the European data protection regulation implemented in 2018, everyone in the European Union has the right to access and amend any data an organisation holds about them. We also have the right to be forgotten. In MICS the same applies to project data. And Ceccaroni argues that, as well as access to project data, project coordinators should be able to see and correct any assumptions that MICS has made about their projects based on whatever content has been collected. "It's the logical counterpart to the right to be forgotten," he says. "You also have a right over how your project is seen."
Alquimics is the algorithm behind the MICS platform. It is being built through a combination of handcrafting (a labour-intensive programming technique that involves writing explicit rules and templates) and machine learning (a type of AI that learns to perform a task by analysing patterns in data).
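To make the hybrid idea concrete, here is a minimal illustrative sketch, not the actual Alquimics implementation: it blends an inspectable, handcrafted rule set with a simple learned component. All field names, weights, and thresholds below are hypothetical assumptions chosen only to show the pattern.

```python
def rule_based_score(project):
    """Handcrafted part: explicit, human-readable rules with fixed weights,
    so the reasons behind the score can be explained to coordinators."""
    score = 0.0
    if project.get("open_data", False):
        score += 0.3  # sharing data openly contributes to impact
    if project.get("participants", 0) >= 100:
        score += 0.2  # broad citizen participation
    if project.get("policy_citations", 0) > 0:
        score += 0.5  # uptake by policy-makers
    return min(score, 1.0)

def learned_score(project, weights):
    """Machine-learned part: a linear model whose weights would normally be
    fitted from training data; here they are illustrative constants."""
    features = [
        project.get("participants", 0) / 1000.0,
        project.get("publications", 0) / 10.0,
    ]
    raw = sum(w * f for w, f in zip(weights, features))
    return max(0.0, min(raw, 1.0))  # clamp into [0, 1]

def impact_score(project, weights=(0.4, 0.6), blend=0.5):
    """Blend the two components; the rule-based half stays fully auditable,
    which is one way to keep the overall decision out of the black box."""
    return blend * rule_based_score(project) + (1 - blend) * learned_score(project, weights)

project = {"open_data": True, "participants": 250,
           "policy_citations": 1, "publications": 3}
print(round(impact_score(project), 3))  # prints 0.64
```

Because the rule component is explicit code, a coordinator asking "why is my impact low?" can be shown exactly which conditions their project did or did not meet, which is the kind of guidance the article describes.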
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 824711.
MICS science is open science. All the information on this website is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.