Future Lawyer Blog

Event review: Using Open Source Investigation for Human Rights Violations

Tara Kelly, a member of the Lawbore journalist team, reports back on the TRUE Project and Inner Temple conference, entitled “New Frontiers in Evidence: The Admission of User Generated and Open Source Material”.

Since World War II, national security agencies have collected and analysed publicly available information to uncover and verify intelligence. Known as Open Source Intelligence (OSINT), the practice has its roots in the analogue era, when investigators gathered newspaper clippings, photographs, maps, satellite imagery and other material for a specific investigative outcome.

Over the past decade, the widespread adoption of smartphones, 3G connections and social media has reshaped the OSINT field, with citizens documenting human rights violations in war zones and civil unrest at protests. User-generated content, shared in massive volumes in real time, continues to impact journalism, activism, academia and law enforcement. The law is no exception.

As deepfakes and AI-manipulated imagery spread online at a dizzying rate, in conflicts from Ukraine to Gaza and beyond, the need for lawyers and judges to understand, assess and verify such digital content for evidentiary purposes in a court of law has never been more pressing.

To learn more about this ever-evolving field, the Inner Temple hosted “New Frontiers in Evidence: The Admission of User Generated and Open Source Material”. The first conference of its kind designed solely for the legal profession, the event included a full day of talks and panels from leading OSINT experts hailing from journalism, big tech, academia, law, human rights, counter-terrorism and more.

From hobbyist to OSINT investigator

Bellingcat’s Eliot Higgins opened the event with a keynote talk charting his humble beginnings working with open source content as a blogger. He explained that when he started in 2012, his motivation was pure curiosity about the Syrian conflict and how it was progressing.

Frustrated that major media outlets were making little use of the overwhelming stream of user-generated content, Higgins took it upon himself to examine footage and images from anonymous and unknown sources on social platforms. Using online satellite imagery, he began geolocating and chronolocating content from his living room. Soon, a community of volunteer digital sleuths began to follow him and join him in verifying and debunking a mountain of digital content.

Two years later, in July 2014, he founded Bellingcat. Serendipitously, the outlet launched the same week Malaysia Airlines Flight 17 was shot down, an incident the Bellingcat team quickly got to work on and attributed to Russia.

From an evidentiary legal standpoint, Higgins explained how Bellingcat naturally developed a following among human rights organisations such as Human Rights Watch, and how that growing attention drew in more journalists and human rights advocates interested in its work.

This later led to the formation of the Justice and Accountability Unit, a joint venture between Bellingcat and the Global Legal Action Network. The initiative’s mission is to demonstrate the viability of online open source information in judicial processes. The J&A Unit operates separately from the rest of Bellingcat to ensure independence, and has published various reports on the landscape of OSINT as evidence.

Uncovering the truth with open-source evidence and user-generated content

The event also examined research focusing on the impact of open-source and user-generated content on citizens and the courts in the age of disinformation.

Jonathan Hak KC of Leiden University, who spent 30 years as a Crown prosecutor in Canada, spoke about the search for the truth in open-source evidence, calling it both a benefit and a burden:

“It’s a benefit because it amounts to a windfall of evidence from disparate sources, in a myriad of locations from varying perspectives,” Hak said. “It’s a burden because open source evidence may not be authentic or reliable and may adversely impact the search for the truth.”

Beware the ‘seeing is believing’ trap

Hak highlighted some challenges prosecution teams may encounter when authenticating content, as accessing the original image is often nearly impossible. This is made harder still by the fact that metadata is stripped from such content when it is uploaded to social media platforms.
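To make the point about stripped metadata concrete, the short sketch below (an illustration, not something presented in Hak’s talk) shows how an investigator might check whether an image file still carries EXIF metadata, using Python’s Pillow library; the filenames are hypothetical. A copy saved back down from a social media platform will typically come back empty, which is why access to the original file matters so much.

```python
# Illustrative sketch only (not from the talk): inspect an image for EXIF
# metadata with the Pillow library. Copies re-downloaded from social media
# platforms are usually stripped of this data and will print nothing useful.
from PIL import Image
from PIL.ExifTags import TAGS

def print_exif(path: str) -> None:
    with Image.open(path) as img:
        exif = img.getexif()
        if not exif:
            print(f"{path}: no EXIF metadata found")
            return
        for tag_id, value in exif.items():
            tag = TAGS.get(tag_id, tag_id)  # translate numeric tag IDs to readable names
            print(f"{path}: {tag} = {value}")

# Hypothetical filenames for illustration.
print_exif("original_photo.jpg")   # straight from the camera: date, device, sometimes GPS
print_exif("downloaded_copy.jpg")  # saved from a social platform: usually empty
```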

A lack of visual literacy among lawyers and judges when interpreting OSINT imagery is another problem he highlighted. He warned, “When counsel doesn’t engage experts competent in assessing such evidence, they may fall into a ‘seeing is believing’ trap.”

However, even when an image or video is questionable, verification is still possible, and trust in the material can be earned.

“Open source is not an evil to be eradicated…It is evidence that has additional hurdles to clear to be used confidently in a courtroom. It is evidence that must be approached with scepticism.”

Prof Yvonne McDermott of Swansea University spoke about the TRUE project, which seeks to explore the impact of deepfakes and AI-generated media on trust in user-generated evidence in accountability processes for human rights violations.

“The biggest danger with the advent of deepfakes is probably not that deepfake footage will come to be introduced in the courtroom, but rather that real footage will come to be reflexively dismissed as possibly fake,” said McDermott.

She noted that no empirical study has yet answered this question. The TRUE project combines three strands of research: interviews with judges, analysis of case law in this area, and a mock jury trial that the team hosts and analyses.

Exploring principles and methodologies for digital authentication

One of the most significant challenges prosecutors at the International Criminal Court have faced over the past decade is handling and preserving open-source evidence to ensure it is verified and admissible in international courts.

Dr. Alexa Koenig, Executive Director of the Human Rights Center at the University of California, Berkeley School of Law, shared how the Berkeley Protocol came about. It serves as a practical guide on the effective use of digital open-source information in investigating violations of international criminal, human rights and humanitarian law.

Launched in 2020 in partnership with the UN, the Berkeley Protocol aims to establish baseline guidelines for those seeking to put open-source evidence forward in a legal accountability process.

Koenig explained that the need for such guidelines arose after Berkeley researchers noticed a considerable number of cases falling apart at the early stages of prosecution at the International Criminal Court due to two issues. First, prosecution teams were relying too heavily on open-source information derived from unverified, aggregated reports. Second, prosecutors were relying on the testimony of survivors without corroborating data, and so failing to meet the evidentiary threshold needed for these cases to move forward.

The ability to corroborate and verify the who, what, where and why of a human rights atrocity was missing, said Koenig. The protocol sought to fix this.

OSINT data collection at the International Criminal Court

Another recent development concerns not just methodologies for handling open-source evidence, but also the collection and analysis of such data. The International Criminal Court’s Office of the Prosecutor launched the OTPLink portal in May 2023 to collect and analyse submissions about atrocities made under Article 15 of the Rome Statute.

David Hasman, Head of eDiscovery and Data Analysis at the International Criminal Court, spoke about how the OTP developed a new and accessible online platform for witnesses of international crimes to submit evidence in real time. Since launching, OTPLink has received 35,000 monthly submissions, averaging 125,000 files in various formats and in some 145 languages.

But OTPLink doesn’t just receive information; it sorts and categorises that material for investigators with the help of AI. Meanwhile, considerable work is required to keep such a large-scale platform secure and running, especially when it is hit with malware and attempts to penetrate its systems.


At the moment, Hasman feels reassured that investigators and OSINT experts can detect what is real and what is fake. “But what happens in the next 12 months when deepfakes become so good we can’t even detect fake material?” he asks. “Maybe blockchain can help us do that.”
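To unpack the provenance idea behind that remark, here is a minimal sketch of the building block such schemes rely on (an illustration, not Hasman’s or the ICC’s actual system): a cryptographic fingerprint of a file recorded at the moment of capture, against which any later copy can be checked. Blockchain-based approaches essentially anchor fingerprints like these in a tamper-evident ledger so they cannot be quietly rewritten; the filenames below are hypothetical.

```python
# Illustrative sketch only: verify a file against a SHA-256 fingerprint
# recorded when the footage was first captured.
import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 hash of a file, read in chunks to handle large videos."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical filenames for illustration.
recorded_at_capture = fingerprint("witness_video_original.mp4")
received_later = fingerprint("witness_video_submitted.mp4")

if recorded_at_capture == received_later:
    print("Submitted file is bit-for-bit identical to the one captured.")
else:
    print("Submitted file differs from the capture-time fingerprint.")
```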

Tara Kelly is a GDL student at City, University of London. She is an aspiring solicitor transitioning from a career in media and the non-profit world. She has a passion for tech, media, and immigration law. Tara is a member of the 2023-24 Lawbore Journalist Team.
