
Protecting civil liberties and children: how ethical AI is revolutionising law enforcement

Veronica Martin caught up with Ashley Beck, Senior Industry Consultant – Global, Fraud & Security Intelligence Division at SAS, at the World Police Summit in Dubai. They discussed the measures that can be used to protect individual privacy and civil liberties when using AI, the steps that should be taken to ensure transparency and accountability in the use of AI systems in law enforcement, and the most innovative technologies currently being used to investigate cases of child sexual exploitation and how they have improved the investigative process.

You have many years of experience in the field of law enforcement and have developed a specialist cybercrime training program. What measures can be put in place to protect individual privacy and civil liberties when using AI technologies?

The ethical AI technologies that we are developing at SAS are being built alongside customers, ensuring people from different specialisms, different cultures and experiences and, crucially, the people who will use the technology are involved in the development. This approach injects breadth and depth of understanding to extract real value from every piece of data owned and received by law enforcement. A deep understanding within the investigation of the policing objective, and of what data translates to valuable information, is crucial to encourage focus and improve effectiveness and efficiency, protecting people from harm whilst taking cognizance of privacy and civil liberties.

If an investigator is faced with thousands of lines of data that have been legally obtained for a policing purpose, and this data includes information unrelated to the investigation, such as personal information belonging to a completely unconnected person, the risk of collateral intrusion increases. However, if this data is woven with indications of serious criminality, how do investigators minimize the intrusion into privacy whilst maintaining their ability to fully understand, from that data, the identity of subjects or suspects and the risk they present?

This is where ethically developed technology comes in: technology that embeds trust and confidence into its use, and where elevating the required data, as well as supporting and maintaining ‘the human in the loop’, is at the heart of the objective and the development. I cannot emphasize this point enough! AI technologies work well only when they are used to enhance – not replace – human decision making. This approach can elevate the capability of investigators to a point where they have the capacity to uphold the efficiency and effectiveness of processes whilst respecting privacy, civil liberties and legislation within society.

How can community stakeholders be involved in the design and implementation of ethical AI systems in policing?

This comes back to ensuring that the stakeholders who inject value are involved at every stage, including pre- and post-implementation of technological support solutions. We need to learn from experience, and learn from data owned by other significant stakeholders as well as law enforcement organisations, to feed back into the technology and support the prevention of further victims as well as detect criminals, stopping them in their tracks. The sources of information to support this approach could be derived from many data streams which perhaps wouldn’t be immediately obvious. An example of this is anonymous surveys of young people to understand the volume and types of criminality that have gone unreported.

I am currently working with a charitable organisation, Citizens of Cyber in Scotland, which is looking at this landscape and has uncovered innovative ways of understanding the threat to children online and, with this, supporting communities. Questions may be asked: how many children have been randomly contacted by a stranger? On what platform did this occur? Did sexualized content exist in this contact? Understanding not only the information reported to police, but also data external to the organisation, can be key to developing technology that can truly capture, elevate and illuminate the threat and risk posed, so that law enforcement agencies can act quickly as a result.

Post pandemic, we have seen a 45% rise in children being contacted at random by strangers online. If we couple data sources such as the survey information mentioned above, third-sector organisations’ data and public sector data with policing data, this could generate an enhanced, proactive approach to tackling a growing problem where children remain at risk online on a daily basis.

Data can elevate the understanding of modus operandi from the criminal world, which can enable the translation of a preventative message to communities and also feed into the enhancement of technology to further enrich future law enforcement capabilities.

What steps should be taken to ensure transparency and accountability in the use of AI systems in law enforcement, particularly with respect to decision-making processes and data use?

There are a number of ways that we can ensure transparency and accountability as a joint venture between software vendors, policing and national security organisations. It is about making sure not only that the investigator can explain decision making to authorities, the public and the judicial system, but also that the officers or operatives themselves understand the objectives, capabilities and limitations of the technologies. At SAS, we recognize and respect the importance of ensuring the technology is explainable to different people, to stimulate public confidence and trust.

As a result, we have developed linguistic rules for law enforcement customers, which present the investigator with a clear view of why data elements are being presented as important to them. The investigator would also understand clearly why data has been omitted from the extraction, allowing a truly ethical, informed decision with the support of technology. Machine learning can also be utilized to feed into the explainable rules, creating an ecosystem of development, extraction and decision making that is ethical, trustworthy and transparent. And remember – accountability starts and ends with the law enforcement professional who uses AI to make a faster, better-informed decision.
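To illustrate the general idea of rule-based, explainable flagging described above – not SAS’s actual product – here is a minimal sketch. The rule names, patterns and explanations are entirely hypothetical; the point is that every elevated item carries the rule that fired and a human-readable reason, and every omission is explained too.

```python
import re
from dataclasses import dataclass

@dataclass
class RuleMatch:
    rule_name: str   # which linguistic rule fired
    reason: str      # human-readable explanation shown to the investigator
    text: str        # the data element that was elevated

# Hypothetical linguistic rules: each pairs a pattern with the explanation
# an investigator would see, so every elevated item is traceable to a rule.
RULES = [
    ("gift_offer", re.compile(r"\b(buy|send|give) you\b", re.I),
     "Offer of a gift, a known grooming indicator"),
    ("platform_move", re.compile(r"\b(another app|private chat|different platform)\b", re.I),
     "Attempt to move the conversation to a less-monitored platform"),
]

def triage(messages):
    """Split messages into elevated matches and explained omissions."""
    elevated, omitted = [], []
    for msg in messages:
        hits = [RuleMatch(name, reason, msg)
                for name, pattern, reason in RULES if pattern.search(msg)]
        if hits:
            elevated.extend(hits)
        else:
            omitted.append((msg, "No rule matched; excluded from extraction"))
    return elevated, omitted
```

Because the output keeps both the elevated items and the explained omissions, the human in the loop – not the software – remains the decision maker.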

What role does ongoing education and training play in ensuring that law enforcement agencies are using AI in a way that aligns with ethical principles and best practices?

In my experience, in the main, police are doing the best job they possibly can. Emphasis is shifting to education and training in policing, which is crucial to eliminate the fear of data and of digital evidence or intelligence gathering. Investigations have changed dramatically over the years: the sentiment of ‘every contact leaves a trace’ has expanded from the purely physical space of DNA, CCTV and so on to include the digital landscape among the areas where evidence exists and holds real value. Police officers are very keen to learn because this shift to gathering digital evidence has been recognised. They desperately want to stay ahead and to be able to investigate and explain complex crimes. So, ongoing education and training play a huge part in elevating investigative capability.

I saw firsthand the passion and thirst for knowledge, to ensure each officer completes their duties for the day knowing they have done everything they can to protect the public. The training program that I created for UK law enforcement, covering networking concepts, cybersecurity and programming languages, was aimed at giving officers the ability to understand what is available to them in terms of evidence and the digital footprint. This includes not only the gathering of evidence but also ensuring the capture of exculpatory evidence to guarantee a robust prosecution that has integrity.

You have been part of Child Sexual Exploitation (CSE) Online Investigations. What are some of the most innovative technologies currently being used to investigate cases of child sexual exploitation, and how have they improved the investigative process?

This is something that’s really close to my heart. Once you deal with this type of criminality, you can’t go back. It’s something that lives with you forever, and you really want to help. You want to make sure that you are safeguarding children across the world, while also supporting the officers, who are doing an amazing job.

The digital footprint of evidence and intelligence being left by such criminality is huge, not only from the data received but also from the subsequent research of other data in external and internal repositories. Therefore, optimizing the exploitation of data using technologies such as AI, text analytics, machine learning and automation is incredibly beneficial and can inject the ability to triage and investigate large volumes of reports. I thought that when I moved to the private sector from policing, I wouldn’t be surrounded by people who were equally passionate about supporting law enforcement agencies to save our communities.

However, I have found skilled, enthusiastic, caring individuals who go over and above to support this cause. This has created a perfect storm to innovate technology which produces tangible results and has founded ideas that will truly support officers in saving children and vulnerable people across the world. The work we are delivering in this space is revolutionary: it uses technology developed over 45 years, collating technologies built for industries such as banking and insurance and transposing them for the benefit of law enforcement, where they can essentially save lives.

We have grown capability through skill, experience and determination to produce a system that can alert officers to risk elements contained in big data. This approach can detect the potential of criminality within the data, but where we are taking this to a new level is being able to filter the data again to determine the probability of whether the criminality has occurred, will occur or could occur. This can allow investigators to ‘see’ if a victim is in immediate danger.
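The two-pass filtering described above could be sketched roughly as follows. This is an illustrative toy, not the actual system: the risk elements, weights and report structure are all hypothetical, standing in for whatever indicators and models a real deployment would use.

```python
# Hypothetical two-pass triage: pass one flags risk elements in a report,
# pass two combines them into a score approximating how immediate the risk
# is, so reports can be ranked for investigator attention.
RISK_ELEMENTS = {
    "request_for_images": 3,
    "threat_or_coercion": 5,
    "arranged_meeting": 8,   # strongest indicator of immediate danger
    "gift_offer": 2,
}

def detect_elements(report):
    """Pass one: which known risk elements are present in this report?"""
    return [e for e in RISK_ELEMENTS if e in report["indicators"]]

def immediacy_score(report):
    """Pass two: weighted score used to rank reports for triage."""
    return sum(RISK_ELEMENTS[e] for e in detect_elements(report))

reports = [
    {"id": 1, "indicators": ["gift_offer"]},
    {"id": 2, "indicators": ["threat_or_coercion", "arranged_meeting"]},
]
# Highest scores surface first for the human in the loop to review.
ranked = sorted(reports, key=immediacy_score, reverse=True)
```

The ranking only prioritises reports; the decision to act stays with the investigator, consistent with the human-in-the-loop principle above.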

What role can data science play in detecting patterns of child trafficking, and how can this information be used to protect vulnerable children?

We have data from proven cases of grooming exchanges between a number of suspects and ‘children’ over a period of two years, and we have found trends buried within that data. The analysis showed a trend in terms of gifting items to children to manipulate the relationship. In 2019, it was mobile phones that suspects were providing to children, but by 2021, we realized that it had moved on to gifting money. On finding this, we have to feed it back into communities and make sure preventative messaging is shared with children, parents and persons responsible for children.
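The kind of trend extraction described here – surfacing a shift in gifting behaviour across years – can be as simple as counting categories per year. A minimal sketch, with invented sample records standing in for the real case data:

```python
from collections import Counter

# Hypothetical simplified records of proven grooming exchanges: each row
# notes the year and the type of item gifted to manipulate the relationship.
cases = [
    {"year": 2019, "gift": "mobile phone"},
    {"year": 2019, "gift": "mobile phone"},
    {"year": 2021, "gift": "money"},
    {"year": 2021, "gift": "money"},
    {"year": 2021, "gift": "mobile phone"},
]

def gift_trends(records):
    """Count gift types per year to surface shifts in modus operandi."""
    trends = {}
    for row in records:
        trends.setdefault(row["year"], Counter())[row["gift"]] += 1
    return trends

trends = gift_trends(cases)
# The most common gift per year reveals the phones-to-money shift, which
# can then feed preventative messaging back to communities.
shift = {year: counts.most_common(1)[0][0] for year, counts in trends.items()}
```

On realistic volumes the same per-year aggregation applies; only the data loading and categorisation steps grow.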

We also reviewed profiles of the characteristics of children who have become victims. By understanding this, law enforcement agencies can direct messaging over mediums that can capture the audience and prevent these profiles from becoming victims. It is really insightful to see that these trends can be so easily extracted from volume data and make a real difference to protecting our children both online and in real life.

How can technology be used to help victims of child sexual exploitation recover and heal from their experiences?

As already said, authorities can publish preventative messaging through various mediums, which can then be viewed by victims of child sexual exploitation, and this can aid the healing process: knowing that these crimes are potentially being prevented stops others from becoming victims in the future. Injecting analytics into detecting patterns can also direct support mechanisms to the areas where criminality is focusing. People who have been trafficked, and victims of child sexual abuse, can be motivated to speak about their experiences, which is a crucial part of healing and prevention in society.

The National Center for Missing & Exploited Children (NCMEC) contributes extensively to this important topic. There are a number of ways analytics can support organizations in understanding trends and sharing preventative messaging with parents and children in order to reduce the risk of someone falling victim to sextortion.

What advice would you give to policymakers and practitioners looking to use data science to protect vulnerable children, and what key considerations should they keep in mind?

It’s about transparency, integrity, accountability, and making sure that you are using technology that is explainable, that officers are comfortable with, and that is continually evolving. Law enforcement organisations can partner with a technology company that enables the organisation, the technology and the police officers to grow and learn together collectively. Most importantly, we need to make sure that this is not just a textbook exercise, and that activities and implementation are really injecting public confidence and trust in the organisation’s ability to deliver. This way, remaining ahead of the curve is achievable, allowing breathing space to remain proactive.

There are a lot of positive byproducts evident from supporting police officers with automation, visualization, data extraction and deconfliction of data. You can’t boil the ocean, but when you add a significant number of improvements and elevate capability, you end up with something really valuable, and officers feel they can focus on investigating and safeguarding children and adults.


