Your sexts can be viewed by the Royal Canadian Mounted Police, and kept for 100 years

Police keep for 100 years intimate photos exchanged by Canadians over private messaging and mistakenly flagged by artificial intelligence (AI) as child pornography, a Quebec court case reveals.

The lengthy Court of Quebec decision, dated July 3, reveals new details about how the Royal Canadian Mounted Police (RCMP) obtains and manages certain reports of online child sexual exploitation. The process is described as "abusive mass surveillance by the State" by the lawyer of the man accused in this criminal case of accessing child pornography.

The judgment also highlights the ways in which police use AI, and how that use can go astray.

The police method described opens the door to innocent photos shared by a parent of their child in the bath ending up in a database of child sexual exploitation. "Potentially, [every] parent in the country is on file," says cybersecurity specialist Luc Lefebvre, who was invited by Le Devoir to review the case file.

The court decision states that web platforms use AI to automatically detect potentially illegal content, including messages exchanged in private conversations. Under American law, messages suspected of involving child sexual abuse are reported to an American organization, which then forwards to Canada those that fall under Canadian jurisdiction. The vast majority of reports received by the RCMP come from Facebook, Google, Snapchat or other private messaging services, according to the decision.

All of this content passes before the eyes of about twenty civilian RCMP employees tasked with the painful job of viewing the reported photos, videos or text messages. A police officer who testified in court explained that the images sometimes show individuals with adult physical characteristics, such as body hair, breasts or hips. Officers stop there, and "close" the file, when they have doubts and believe they may be dealing with, for example, a simple consensual exchange of intimate photos by text.

Yet even when it determines that the content is not child pornography and was flagged by the AI in error, the RCMP catalogues all of these reports, photos and videos included, in its database for a period of a century, beyond the life expectancy of any internet user.

Le Devoir asked the RCMP why it retains the data from files it has "closed" without investigation, including reports where it determined that no illegal activity had taken place. The federal police merely responded that this information is "collected" for "the detection, prevention or suppression of crime, and under the Criminal Code of Canada for investigations into the sexual exploitation of children," without addressing the question asked about the "retention" of the data.

A justified method

Up to 70% of the reports received in the country are "closed" by the initial triage carried out at the RCMP's National Centre Against Child Exploitation (CNCEE), without being forwarded to police forces for investigation. And even among those that are forwarded, the majority do not lead to any investigation, for operational reasons or because doubt remains about the illegality of the content.

"These reports are part of a specific legislative context targeting a precise, important and urgent social objective, namely the fight against the sexual exploitation of children," wrote Judge Julie Roy in her recent decision. Even though she recognizes that platforms' use of AI leads to erroneous reports, children's safety justifies the ongoing retention of this data, which is not "excessive, indiscriminate, or arbitrary."

The accused's lawyer in this case, Me Félix-Antoine Doyon, intends to challenge this interpretation, and the legislative regime in force, all the way to the Supreme Court if necessary. "If we accept that, we must then accept that the State can intrude into our private conversations, into our privacy, for a host of other reasons deemed valid in the name of public safety," he said in an interview with Le Devoir.

"In other words, it could redefine the boundaries between what the State can monitor and what should remain private to citizens."

His client, Jonathan Filiatrault, saw police arrive at his home in May 2022 with a search warrant obtained thanks to a report from the private messaging application Kik, which had allegedly intercepted an explicit video showing sexual acts on a young child. They did not find that video, his lawyer maintains, but did find other content justifying the laying of charges. Mr. Filiatrault maintains his innocence and has not yet been tried.

100,000 reports per year

The court first considered the accused's motion to exclude evidence, alleging that his constitutional right to protection against unreasonable search and seizure had been violated; the motion was denied. His lawyer is now seeking to have various provisions of the federal legislative regime that enabled the collection of incriminating evidence against him declared unconstitutional.

"In a democracy where the rule of law prevails, it is imperative that the State remain the guarantor of fundamental rights, even when it relies on private companies such as Google, Apple or Kik to collect or use data," reads his motion.

Above all, this case shows the extent of the data stored by police, and gives an idea of the number of "false positives" it includes, believes Luc Lefebvre, co-founder of CQ (formerly Crypto Québec), an organization dedicated to raising public awareness of computer security issues.

"It raises a lot of questions, such as the case of a parent who has a telemedicine consultation and is asked to take a photo of their child, which will then be stored by a cloud service. Could they end up on file?" He fears that a future authoritarian government in Canada could use these databases for purposes other than fighting crimes against children.

According to data presented before the court, the CNCEE processed around 111,000 child pornography reports in one year (2023-2024), the overwhelming majority of which came from the automated monitoring mechanisms of large American platforms. All the information contained in a report, including the user's pseudonym and internet protocol (IP) address, and in some cases even their email address and telephone number, is recorded in a database known by its English acronym, OCEAN.

About 30% of the files triaged by the RCMP are handed over to various police forces, such as the Sûreté du Québec (SQ), when they concern internet users residing in their territory. There again, the information is stored indefinitely on SQ servers, even though approximately 55% of these files are never investigated, "due to the very high volume of cases and limited resources," the police force told Le Devoir.

"Retaining this data makes it possible to identify unknown victims by correlation with other documented cases. This is crucial for detecting ongoing abuse situations and intervening quickly when new victims are identified," it says.
