The State Duma has proposed establishing criminal liability for theft and defamation committed using deepfake technology.
The head of the State Duma Committee on Labor, Social Policy and Veterans Affairs, Yaroslav Nilov (LDPR), has introduced a bill that would add a new qualifying feature to several articles of the Criminal Code: the use of deepfake technology. The document is available to TASS.
The proposed changes would affect articles such as "Defamation", "Fraud", "Theft", "Extortion", "Fraud in the field of computer information" and "Causing property damage by deception or abuse of trust". For example, defamation carried out using forged images or the voice of the victim (including falsified or artificially generated materials) would be punishable by a fine of up to 1.5 million rubles or imprisonment for up to two years. For fraud involving deepfakes, the bill provides a fine of up to 400 thousand rubles or imprisonment for up to six years.
The explanatory note states that advances in computer technology have expanded the possibilities for creating video and audio materials based on citizens' images and voices, making it possible to "artificially recreate non-existent events." Attackers have long forged photographs for defamation purposes, but modern technologies, including neural networks and artificial intelligence, can produce fakes that a layman can scarcely distinguish from the real thing. These technologies can also reproduce other biometric data.
According to Nilov, a person's image and voice are often used for identification, yet in some cases they lack the legal status of biometric data. Given that these elements are the ones most often exploited for deception, the MP proposed singling them out as a separate category.
The government's official response to the draft law, which is also available to TASS, notes that current legislation does not regulate the use of identity-substitution technologies. Introducing the proposed amendments to the Criminal Code is therefore impossible given the absence of corresponding norms in the substantive legislation, which could lead to incorrect law enforcement practice.
In addition, the government notes that the explanatory note to the draft law contains no data confirming the public danger of such acts, so no conclusion can be drawn about the heightened public danger of crimes committed using the victim's image or voice. The review emphasizes that the draft law needs significant revision.
Nilov told TASS that the bill will be submitted to the State Duma once the necessary improvements and adjustments have been made.
Source