The bill, which proposes adding the use of deepfake technologies as a new qualifying element in several articles of the Criminal Code, will be submitted to the State Duma on Monday. It was drafted by Yaroslav Nilov (LDPR), head of the Duma Committee on Labor, Social Policy and Veterans' Affairs, and Senator Alexei Pushkov; TASS has obtained the text of the document.
"We proposed to amend the Criminal Code and clarify some qualifying signs for attempts to use deepfake technologies for illegal purposes. Legally, this concept has not yet been spelled out in the legislation, although it has been around for a long time. But by implication, we understand that these are technologies that allow you to use an image or sound, both natural and synthesized, to mislead a person. And scammers take advantage of this," Nilov told TASS.
The bill proposes amending the articles "Slander", "Fraud", "Theft", "Extortion", "Fraud in the Field of Computer Information" and "Causing Property Damage by Deception or Abuse of Trust". Under the bill, slander committed using the victim's image or voice (including a falsified or artificially created one), or using their biometric data, would be punishable by a fine of up to 1.5 million rubles or imprisonment for up to two years. Fraud involving such technologies could carry a fine of up to 400,000 rubles or imprisonment for up to six years.
As the bill's authors note in the explanatory note, advances in computer technology have expanded the possibilities for creating video and audio materials based on citizens' images and voices, "artificially recreating non-existent events." They stressed that attackers previously faked photographs for slander, for example, but modern technologies, including neural networks and artificial intelligence (deepfakes, digital-mask technologies, etc.), make it possible to create forgeries that are almost impossible to distinguish from the real thing, and also to reproduce other biometric data.
The government's official response to the bill, also obtained by TASS, notes that sectoral legislation does not regulate the use of identity-substitution technologies. "Thus, the introduction of the proposed regulation into criminal legislation is not possible due to the absence of corresponding norms of substantive legislation, which entails significant risks of incorrect law enforcement practice taking shape," the document says.
In addition, the government points out that the explanatory note to the bill contains no statistical or other data "indicating the public danger of the criminalized acts." For that reason, it is impossible to conclude that these acts pose an increased degree of public danger, the government notes. "Based on the above, the bill requires significant revision," the review says.
Source
"We proposed to amend the Criminal Code and clarify some qualifying signs for attempts to use deepfake technologies for illegal purposes. Legally, this concept has not yet been spelled out in the legislation, although it has been around for a long time. But by implication, we understand that these are technologies that allow you to use an image or sound, both natural and synthesized, to mislead a person. And scammers take advantage of this," Nilov told TASS.
The bill proposes to amend the articles "Slander", "Fraud", "Theft", "Extortion", "Fraud in the field of computer information" and "Causing property damage by deception or abuse of trust". According to the bill, defamation committed using the image or voice (including falsified or artificially created) of the victim, as well as using his biometric data, will be punishable by a fine of up to 1.5 million rubles or imprisonment for up to two years. Fraud using such technologies can result in a fine of up to 400 thousand rubles or imprisonment for up to six years.
As the authors of the project noted in the explanatory note, the development of computer technologies has expanded the possibilities of creating video and audio materials based on images and the voice of citizens, "artificially recreating non-existent events." They stressed that earlier attackers, for example, faked photos for slander, but modern technologies, including neural networks and artificial intelligence (deepfake, digital mask technologies, etc.), make it possible to create fakes that are almost impossible to distinguish from real ones. They also make it possible to reproduce other biometric data.
The government's official response to the bill, which is also at the disposal of TASS, notes that the industry legislation does not regulate the use of identity substitution technologies. "Thus, the introduction of the proposed regulation into criminal legislation is not possible due to the lack of corresponding norms of substantive legislation, which entails significant risks of the formation of incorrect law enforcement practice," the document says.
In addition, the government clarifies that the explanatory note to the bill does not contain statistical and other data "indicating the public danger of criminalized acts." In this regard, it is impossible to conclude about the increased degree of public danger of these acts, the government notes. "Based on the above, the bill requires significant revision," the review says.
Source