ELSA Lab for Multi-Agency Public Safety Issues in AI

How can we use AI technology in the Netherlands to improve public safety? And how can we do that in a responsible and human-centric way, so that the use of AI does not lead to undesired effects and social dissatisfaction? These are the questions the participants in this ELSA Lab are currently grappling with. The Lab focuses on real-world practical cases involving wide-ranging cooperation, aiming to make progress step by step.

What social challenges in AI are being worked on?

When it comes to safety issues, heavy emphasis is often placed on the use of data technology, even though we now know that this technology can also create new problems and social unease. You can end up in a situation where safety has indeed improved, but at the cost of social dissatisfaction, for example because people worry about their privacy. In developments of this kind, it is important to pay close attention to the ethical, legal and societal aspects. It is about finding the right balance between what is needed to improve public safety and which solutions enjoy widespread public support.

What types of solutions are offered to the end user?

This ELSA Lab focuses on three specific use cases. The first concerns high-impact crime, such as robbery and burglary. The second looks at residential areas, in particular neighbourhoods that need special attention. These were chosen deliberately because many people in those neighbourhoods may feel unsafe, although that feeling is by no means always linked to criminality. Exploratory research has shown that this social dissatisfaction has many causes, such as poor household waste disposal, parking problems or mopeds zipping around at speed. The third use case is about crowd control. It involves real-time image analysis in public spaces, where an important point for consideration is the extent to which privacy issues play a role in such a setup. This is not just a matter of legal responsibility but also of the ethical question of what kind of society we want. Attention is also paid to the sociological side: what effect does increased surveillance have on society? All of this is done with the goal of arriving at well-thought-out AI solutions, with careful attention to how the technology is designed.
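
To make that privacy consideration concrete: a crowd-control pipeline can be designed so that only aggregate information ever leaves the camera. The sketch below is purely illustrative and is not the consortium's actual system; it assumes OpenCV's stock HOG pedestrian detector and a hypothetical video source, and it yields per-frame head counts while discarding the frames themselves.

    import cv2

    def crowd_counts(stream_url: str):
        """Yield one pedestrian count per frame; frames are discarded immediately."""
        # Stock HOG + linear SVM pedestrian detector shipped with OpenCV.
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
        cap = cv2.VideoCapture(stream_url)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
                yield len(boxes)  # only the aggregate count leaves this loop
        finally:
            cap.release()

    if __name__ == "__main__":
        # "camera_feed.mp4" is a placeholder; an RTSP stream URL would also work.
        for i, n in enumerate(crowd_counts("camera_feed.mp4")):
            print(f"frame {i}: {n} people in view")

Keeping only the counts, and never the images, is one way to reduce the privacy footprint of such a system, although it does not by itself answer the ethical and sociological questions the Lab raises.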

What AI methods or techniques are used in the research?

Erasmus University Rotterdam, one of the initiators of this ELSA Lab, takes open access as its starting point, together with an approach based on honesty, facts and scientific integrity whenever data technology is used. Several parties are involved in that setup, tackling the various challenges with multiple approaches and multiple goals. Very complex, in other words, which is why a new term was coined for this approach: transdisciplinary innovation. It also means that many different data standards and technologies are used in this ELSA Lab: from the Data Documentation Initiative (for questionnaire and observation data) and the Text Encoding Initiative (for textual data) to the Dublin Core Metadata Element Set (for video and audio data). Data engineers play an important role in all of this.
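
By way of illustration only (the helper and all field values below are hypothetical, not taken from the Lab), a minimal Dublin Core record for a video fragment could be built and serialised in Python like this:

    import xml.etree.ElementTree as ET

    DC_NS = "http://purl.org/dc/elements/1.1/"  # Dublin Core elements namespace
    ET.register_namespace("dc", DC_NS)

    def dublin_core_record(fields: dict) -> bytes:
        """Serialise a flat dict of Dublin Core elements to XML."""
        root = ET.Element("metadata")
        for element, value in fields.items():
            child = ET.SubElement(root, f"{{{DC_NS}}}{element}")
            child.text = value
        return ET.tostring(root, encoding="utf-8")

    # Illustrative record for a piece of footage in the crowd-control use case.
    record = dublin_core_record({
        "title": "City-centre camera feed, 2023-05-01",
        "creator": "Example municipality",
        "date": "2023-05-01",
        "format": "video/mp4",
        "rights": "Restricted; processed under the consortium's data agreements",
    })
    print(record.decode("utf-8"))

Consistent metadata of this kind is what allows the different partners to exchange and reuse each other's data in a controlled way.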

Are we collaborating with other sectors?

The consortium consists of a very diverse mix of parties: the National Police, Woonstad Rotterdam (housing corporation), Rathenau Institute, HSD (The Hague Security Delta), Erasmus University Rotterdam, Delft University of Technology, Leiden University, TNO (Netherlands Organisation for Applied Scientific Research), ICAI (the Innovation Center for Artificial Intelligence), Google and Deloitte, to name a few. Social organisations are also on board, in addition to governmental authorities, the commercial sector and centres of expertise.

What is the ultimate success this ELSA Lab can achieve?

The aim is to create applications that all the parties involved can support and which, once implemented, prove that they not only deliver real-world improvements in public safety but also have a positive influence on social well-being. So the bar is set pretty high. To achieve that long-term goal, this ELSA Lab is first focusing on important stepping stones. In the initial phase, the focus is on a research programme in which four PhD students will be given the opportunity to pursue doctoral research on relevant subjects. This will provide an excellent foundation on which the consortium can continue to build.

Awarded the NL AIC Label

The Netherlands AI Coalition has developed the NL AIC Label to underline its vision for the development and application of AI in the Netherlands. The NL AIC Label formally recognises that an activity is in line with the aims and strategic goals of the NL AIC and/or acknowledges the quality of that activity. The NL AIC would like to congratulate the ELSA Lab for Multi-Agency Public Safety Issues in AI.

More information?


If you would like more information about human-centric AI and the ELSA concept, please visit this page.

Building blocks

The NL AIC collaborates on the necessary common knowledge and expertise, which has resulted in five themes, also called building blocks. These are important for achieving a robust impact in economic and social sectors.

Sectors

AI is a generic technology that is ultimately applicable in all sectors. To develop knowledge and experience with the use of AI in the Netherlands, it is essential to focus on specific industries that are relevant to our country. These industries can achieve excellent results and generate knowledge and experience that can be leveraged for application in other sectors.

Become a participant

The Netherlands AI Coalition is convinced that active collaboration with a wide range of stakeholders is essential to stimulate and connect initiatives in Artificial Intelligence, both within fields of expertise and with other stakeholders in the ecosystem, in order to achieve the greatest possible result in the development and application of AI in the Netherlands. Representatives from the business community (large, small, start-up), government, research and educational institutions and civil society organisations can participate.
Interested? For more information, see the page about participation.