Experts criticize the software used in criminal proceedings



Hello to everyone running in the shadows! Hello, random carders. It's no secret that lawyers try to keep up with the times too (though it doesn't always work out for them). This time our worlds overlap: the databases are raw, the programs don't work, and when they do work, they work badly. Read on.

Let's go:

Digital forensics specialists and information security experts speaking at the DEF CON conference described the shortcomings of the software that law enforcement agencies and courts use in their work.

Jerome Greco, a digital forensics specialist at the Legal Aid Society, Dr. Jeanna Matthews, a professor at Clarkson University, and Nathan Adams, an engineer at Forensic Bioinformatic Services, spoke about the problems with the software used to catch criminals, bring them to trial, and pass sentences. The researchers said that the code of such software often undergoes no independent review, is not made available to the defense, and in some cases the tools show outright bias.


As an example, the experts cite the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) system, which trial judges use when working out sentences and determining parole conditions.

"The company that develops COMPAS considers gender to be one of the most important factors influencing the decision-making process. So, men are much more likely to become repeat offenders, which means that they are less likely to be recommended for release on bail, " says Greco. — Thus, women are more likely to be released on bail, and men are more likely to receive longer sentences. We don't know exactly how this data affects us or how important gender is. The company uses trade secret laws to hide its code from any checks."

The researchers add that such systems are often "trained" on data sets containing systematic errors. For example, facial recognition systems are trained mainly on examples of white faces and perform poorly on people with darker skin.
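To see why a skewed training set matters, here is a minimal sketch (synthetic data and an invented group shift; requires numpy and scikit-learn) in which one group makes up 95% of the training examples. On fresh data, the resulting model is noticeably less accurate for the under-represented group:

```python
# Sketch of dataset bias: a model trained mostly on "group A" data
# works well for group A and poorly for group B.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic 2-feature samples; each group has its own class boundary."""
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# 95% of the training data comes from group A, 5% from group B.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=3.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Evaluate on fresh samples from each group: group B scores far lower.
for name, shift in [("group A", 0.0), ("group B", 3.0)]:
    Xt, yt = make_group(1000, shift)
    print(name, "accuracy:", model.score(Xt, yt))
```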

Greco says that predictive policing systems, which help police choose areas to patrol, are also flawed. He claims such systems are often openly racist and let the police justify their actions by saying that "the decision was made by a computer."

At the same time, it is not only the manufacturers but also the law enforcement agencies themselves that try to shield these products from any review. A good example is the use of IMSI catchers, or Stingrays, which we will discuss later on this channel. Let me remind you that such devices are widely used both by intelligence services and by the "bad guys". They exploit a design feature of mobile phones: a handset prefers the cell tower with the strongest signal (to maximize signal quality and minimize its own power consumption).
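The trick is trivial to sketch. Below is a minimal illustration (the tower list and field names are invented) of the "camp on the strongest signal" rule: a rogue base station only has to transmit louder than the legitimate cells to win the selection:

```python
# Toy model of cell selection; tower names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class CellTower:
    name: str
    signal_dbm: int  # received signal strength; closer to 0 is stronger

def pick_tower(towers: list[CellTower]) -> CellTower:
    """Phones camp on the cell with the strongest received signal."""
    return max(towers, key=lambda t: t.signal_dbm)

towers = [
    CellTower("legit-cell-1", -95),
    CellTower("legit-cell-2", -88),
    CellTower("stingray", -60),  # rogue tower transmitting at high power
]
print(pick_tower(towers).name)  # prints "stingray"
```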

As practice shows, law enforcement officers prefer to keep quiet about the use of such systems, yet between 2008 and 2015 alone the New York Police Department used these devices more than 1,000 times.

Dr. Jeanna Matthews says that many of these systems urgently need bug hunting and improvement:

"Independent third-party testing is a big advantage, you need teams that will encourage [manufacturers] to detect problems, and not people who convincingly declare that everything is fine."

In short, gentlemen, while the lawyers are this fucked up, we have every chance to earn our money and quietly slip back into the shadows.
 