Deepfakes and falsehoods in Ottawa's sights
Disinformation, including deepfake videos and software spreading falsehoods, should be included as part of a future online harms bill, according to a panel of experts appointed by Heritage Minister Pablo Rodriguez to help shape a future law.
Deepfakes are fabricated videos or photos produced with deep learning technology, which can generate highly realistic counterfeit images.
Members of the panel, including Bernie Farber of the Canadian Anti-Hate Network and Lianna McDonald of the Canadian Centre for Child Protection, said the law should require tech giants to combat the spread of fake news and videos.
Some have suggested that Canada take inspiration from the European Union's (EU) Digital Services Act, which allows for stronger action to combat disinformation in times of crisis — for example, during elections, international conflicts and public health emergencies.
They said the EU measure was linked to Russia's attempts to spread false allegations to justify the invasion of Ukraine.
Public Safety Minister Marco Mendicino said in an interview that the technology is now so sophisticated that some falsified content and images are "virtually indistinguishable" from authentic content, making it very difficult for people to tell the difference.
He said a "whole-of-government approach" spanning multiple departments was needed to combat the spread of disinformation in Canada. "We are at a crucial moment in our public discourse. We are witnessing a growing amount
of disinformation and misinformation based on extremist ideology," he said.
An academic analysis of more than six million Twitter posts, their shares and their origins found that Russia is targeting Canada in an effort to influence Canadian public opinion.
The study, released this month by the University of Calgary's School of Public Policy, found that a large number of tweets and retweets about the war in Ukraine can be linked to Russia and China, and that many of the tweets expressing pro-Russian sentiment are connected to the United States.
The government has announced plans to introduce an online harms bill that would address online abuse, including racist slurs, antisemitism and offensive statements targeting members of the LGBTQ community.
It follows a previous online hate bill tabled just before last year's federal election, which did not become law.
The panel, which also includes law and policy professors from across the country, said a bill should not only combat online abuse, including child abuse, but also take into account misinformation and misleading information online.
This could include coordinated disinformation campaigns "used to create, disseminate and amplify disinformation," including the use of bots (software agents that perform repetitive tasks) and botnets, as well as inauthentic accounts and deepfakes.
Some experts on the panel said the bill should also combat misleading advertising, misleading political communications and content that contributes to "unrealistic body image."
The group said platforms would have a "duty to act" to combat "harmful content online, including disinformation, by conducting risk assessments of content that can cause significant physical or psychological harm to individuals."
Some experts on the panel warned that measures to combat disinformation must be formulated carefully so that they cannot be abused by governments to justify censorship of journalism or criticism.
Their warning was echoed by Emmett Macfarlane, a constitutional law expert at the University of Waterloo. "There are still valid concerns about the potential for overbreadth and unintended consequences of this type of legislation. Our existing criminal laws on hate speech and obscenity have resulted in the unjust restriction or blocking of material at the border, for example," he said.
According to the panel's twelve experts, who have just completed their work, misinformation and false messages could pose heightened risks to children.
They recommended that the bill impose strict requirements on social media companies and other platforms to remove content that presents or encourages child abuse and exploitation.
A few of the members criticized the platforms for not immediately removing this content, saying that "the current performance of online services in removing child sexual exploitation material is unacceptable."
The group also criticized platforms in general for reporting the percentage of harmful content they remove, but not how long removal takes.
Minister Rodriguez thanked the panel for completing its discussions last week, saying that "its advice is critical to developing a legislative and regulatory framework to address this complex issue and help create a safe online space that protects all Canadians."
"Freedom of expression is at the heart of everything we do, and Canadians should be able to express themselves freely and openly without fear of harm online and our government is committed to taking the time to get it right," he promised.
The Minister also thanked the Citizens' Assembly, a group of 45 Canadians studying the impact of digital technology on democracy, for its advice. At a conference last week, the group stressed the importance of combating the spread of disinformation online,
saying it can manipulate public opinion.
Marie Woolf, The Canadian Press
Deepfakes and falsehoods in Ottawa's sights | Le Devoir (ledevoir.com)