At Web-IQ, our mission is to help Law Enforcement Agencies worldwide access crucial information quickly and efficiently. As a proud partner of the STARLIGHT project, Web-IQ is committed to researching how AI can support the field of Open-Source Intelligence and to leveraging new machine learning techniques to improve LEAs' access to online information.
Using responsible AI, STARLIGHT improves prevention, detection, and investigation capabilities for Law Enforcement Agencies. STARLIGHT supports LEAs in embracing artificial intelligence tools that optimise investigative practices while preventing the illicit exploitation of AI in the fight against serious criminal threats.
The STARLIGHT project is funded by the European Commission under the Horizon 2020 funding programme. The strong and strategic consortium comprises more than 50 partners, including innovative SMEs and corporates, research institutes, law enforcement agencies and other public organisations.
For more information: STARLIGHT Project ->
StreetWise is a three-year project, funded by the European Commission, the Northern Netherlands Alliance (SNN) and the Dutch Ministry of Economic Affairs and Climate Policy.
Over the next three years, Web-IQ will be able to expand and transform our Voyager OSINT platform to support practical use cases in the field. In collaboration with partners including Gemeente Groningen, Veiligheidsregio Groningen, and the University of Groningen, we will explore impactful use cases, develop innovative technology, test-drive new OSINT solutions and conduct pilot projects in real-world scenarios on the street. Our efforts will contribute to a better regional information position and ultimately help create a safer place for all citizens.
Is social media a safe place for children? Do these platforms protect our kids sufficiently from harmful content when their business models have other priorities? Web-IQ is working on innovative tools to assess whether children and minors are at risk in these environments.
AI is powerful, and Web-IQ therefore pays specific attention to compliance with the GDPR, other AI-related legislation, and ethics. Our commitment to making the world a safer and better place drives all of our efforts. With our OSAgE (Open Source Age Estimation) research project, funded by Safe Online, we place the safety of children at the forefront.
OSAgE is an AI-driven age assurance tool designed to support authorities in enforcing age legislation on digital platforms, using publicly available web information to verify users' ages. The primary objective is to protect children from the dangers of sexual abuse on the internet. Leveraging cutting-edge technologies, we aim to support children who might not be aware of the risks of being online and to create a more secure environment for them.
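As a rough illustration only (the actual OSAgE methodology is not described here), the sketch below shows how several weak, publicly observable age cues could be combined into a single estimate. All signal names, values and weights are hypothetical assumptions, not part of the project.

```python
from dataclasses import dataclass


@dataclass
class AgeSignal:
    """One publicly observable, hypothetical cue about a user's age."""
    source: str      # e.g. "stated_birth_year", "school_level_mention", "image_model"
    estimate: float  # age in years suggested by this cue
    weight: float    # how much this cue is trusted (0..1)


def combine_signals(signals: list[AgeSignal]) -> tuple[float, float]:
    """Return (estimated_age, total_weight): a weighted average of the cues,
    with the summed weight serving as a rough confidence indicator."""
    total_weight = sum(s.weight for s in signals)
    if total_weight == 0:
        raise ValueError("no usable age signals")
    estimate = sum(s.estimate * s.weight for s in signals) / total_weight
    return estimate, total_weight


# Example: three weak cues that together point to a likely minor.
signals = [
    AgeSignal("stated_birth_year", 13.0, 0.6),
    AgeSignal("school_level_mention", 14.0, 0.3),
    AgeSignal("image_model", 16.0, 0.2),
]
age, confidence = combine_signals(signals)
print(f"estimated age ~{age:.1f} years (confidence weight {confidence:.1f})")
```

A real age assurance system would of course rely on far richer models and safeguards; the sketch only illustrates the idea of aggregating multiple public signals instead of trusting any single one.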
Eldert van Wijngaarden, CEO at Web-IQ: "Web-IQ is deeply committed to its mission of creating a better and safer world by focusing on the online protection of children. Our efforts in this project are vital in both preventing potential harm, as well as assisting minors who may be unaware of the potential dangers lurking on the internet".
The selection of Web-IQ to be funded by Safe Online for this research project underscores our expertise and dedication. We are proud to be among the chosen organisations contributing to this meaningful cause.
For more information: Tackling age assurance & live streaming of abuse to make the internet safer for children ->
AviaTor is an efficient tool that helps triage all aspects of NCMEC reports, empowering law enforcement to focus on identifying perpetrators and saving victims.
A substantial part of the development has been supported and funded by the European Commission through the ISF-P program. AviaTor is developed by a consortium of domain experts and technical partners, including the Dutch National Police, Belgian Federal Police, INHOPE, ZiuZ, Web-IQ, TimeLex and DFKI.
Web-IQ supports the AviaTor platform by integrating its Child Protection Solution ATLAS, which allows investigators to prioritise NCMEC reports and the resulting investigations using crucial OSINT information.
Reports of Child Sexual Abuse Material (CSAM) require labour-intensive processing.
Electronic Service Providers in the United States, such as Facebook and Google, are required by law to report potential CSAM to NCMEC as soon as they become aware of it, resulting in a huge number of NCMEC reports. In 2018, NCMEC processed more than 18 million reports of potential CSAM from US-based companies. By 2023, this number had risen to 35.9 million reports. These reports are distributed to Law Enforcement Agencies (LEAs) around the globe, which then attempt to identify the victims and the offenders.
Although the reports can be enormously valuable, their increasing number places an unsustainable workload on LEAs. The AviaTor project was initiated to provide LEAs with a system to triage and process these reports more efficiently.
Reduce the time LEAs spend on triaging NCMEC reports. Developed with support from 16 LEAs across Europe, AviaTor offers a long-term solution for law enforcement. To process industry reports of online Child Sexual Abuse Material, numerous LEAs still rely on manual solutions, which are unsustainable given the ever-increasing number of reports.
AviaTor provides a long-term solution by creating a locally installed system that makes the process much more efficient. Behind seemingly identical NCMEC reports, a 'simple' upload or an urgent situation of ongoing child abuse might be hidden. AviaTor reduces duplication and enriches reports by integrating information from automated visual analysis and targeted online research, aiming to offer faster insight into the situation behind each report, as the sketch below illustrates.
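To make the triage idea concrete, here is a minimal, hypothetical sketch of deduplicating and ranking incoming reports. It is not AviaTor's actual implementation: the Report fields, signal names and score weights are assumptions chosen purely for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class Report:
    """Hypothetical, simplified view of one incoming industry report."""
    report_id: str
    media_hashes: frozenset[str]                       # hashes of the attached media
    known_hashes: int = 0                              # how many of those hashes are already known
    osint_signals: dict = field(default_factory=dict)  # enrichment results from online research


def deduplicate(reports: list[Report]) -> list[Report]:
    """Keep only the first report for each distinct set of media hashes."""
    seen: set[frozenset[str]] = set()
    unique: list[Report] = []
    for report in reports:
        if report.media_hashes not in seen:
            seen.add(report.media_hashes)
            unique.append(report)
    return unique


def priority_score(report: Report) -> float:
    """Toy scoring: previously unseen material and live OSINT signals raise the priority."""
    new_material = max(len(report.media_hashes) - report.known_hashes, 0)
    score = float(new_material)
    if report.osint_signals.get("active_profile"):
        score += 5.0   # the uploader still appears active online
    if report.osint_signals.get("contact_with_minors"):
        score += 10.0  # indications of possible ongoing abuse outweigh everything else
    return score


def triage(reports: list[Report]) -> list[Report]:
    """Deduplicate, then order reports from most to least urgent."""
    return sorted(deduplicate(reports), key=priority_score, reverse=True)
```

In practice, a ranking like this would combine many more signals and always keep an investigator in the loop; the sketch only illustrates the deduplicate, enrich and prioritise flow described above.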
The Belgian Federal Police is one of the law enforcement agencies that has been using the AviaTor tool since its initial version in 2019.
For more information: AviaTor Project ->