Throw away your iPhone, urgently! The company from Cupertino is about to look through absolutely every photo on your phone!
The fight for anonymity
I could not let myself miss the most scandalous news of recent days about everyone's favorite fruit company. It concerns all iPhone users without exception, and to my deep regret it is rather sad. As you know, anonymity and the safety of personal data are key values for me, which is why I have never carried an iPhone. I don't think that even needs explaining. How many scandals there have been... starting with the iCloud leaks, through the endless hacks of politicians' correspondence, and ending with surveillance via Pegasus. Those were truly dark times. And then, it seemed, Apple finally took the right path: the company started working on protecting personal information and fighting tracking by third-party apps. But no. With its latest trick it has crossed all of that out. If you own an iPhone, you no longer have privacy. Here is why.
CSAM
As usual, it all starts under the banner of a good cause. Apple has cast itself as the guardian of virtue and taken the lead in the fight against child abuse imagery. Starting with iOS 15, the personal photos of every iPhone user with iCloud Photos enabled will be scanned for child sexual abuse material (CSAM). Apple says it is not going to scan iPhone and iPad users' entire photo libraries; instead, it will use cryptographic hashing to compare images against a database of known material provided by the National Center for Missing and Exploited Children. Well, yes, of course, we totally believe that. It's all purely to protect the children, which Tim Cook no doubt thinks about constantly! Most likely, only in their fantasies...
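To make the mechanism concrete: the scheme described above does not read photos as such, it compares compact fingerprints (hashes) of photos against fingerprints of known material. Below is a deliberately toy sketch of that idea. This is NOT Apple's actual NeuralHash algorithm (which is a neural-network-based perceptual hash over decoded image data); here a tiny "average hash" over an 8x8 grayscale grid stands in for it, and all names and the threshold are illustrative assumptions.

```python
# Toy illustration of hash-based image matching. NOT Apple's NeuralHash:
# a real perceptual hash survives resizing and recompression; this 8x8
# "average hash" only demonstrates the match-against-a-database idea.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(pixels, known_hashes, threshold=5):
    """Flag an image only if its hash is close to a known database hash."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical images: horizontal stripes vs. vertical stripes.
img_a = [[0] * 8 if i % 2 == 0 else [255] * 8 for i in range(8)]
img_b = [[0, 255] * 4 for _ in range(8)]
db = {average_hash(img_a)}  # img_a plays the role of "known" material

print(matches_database(img_a, db))  # True  (exact match)
print(matches_database(img_b, db))  # False (unrelated hash)
```

The point of the design: an unrelated photo produces an unrelated hash and is never flagged, while a near-copy of a database image (even with a pixel or two changed) still lands within the distance threshold.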
Privacy uppercut
Okay, most Apple fanatics will swallow this. After all, it is all for a good cause, and the pictures will only be checked by a neural network, so no big deal... Except that is not how it works! Let me tell you a little secret: all these algorithms and neural networks are imperfect. Yes, they can find prohibited material, but the final decision will always be made by live moderators. Which means all your nudes will pass before more than one pair of eyes, because the algorithm will most likely flag any naked body, and whether it is actually CSAM or not is left for a person to decide. In short, it's a complete disaster. But don't be too upset: for now the system will work only in the USA, so you still have about a year to save up for a replacement phone.
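For fairness, the pipeline the paragraph above describes is not "one match, one moderator": automated matches accumulate per account, and only past a threshold does a human reviewer get involved (Apple later cited a figure of roughly 30 matches). A minimal sketch of that escalation logic, with function names and the exact threshold as illustrative assumptions:

```python
# Hedged sketch of the escalation step: algorithmic matches pile up per
# account, and human review only triggers past a threshold. The threshold
# and names are illustrative, not Apple's actual implementation.

from collections import Counter

REVIEW_THRESHOLD = 30  # Apple cited ~30 matches; treat as illustrative

match_counts = Counter()

def record_match(account_id):
    """Register one algorithmic match; return True once humans get involved."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= REVIEW_THRESHOLD

# A single false positive never reaches a moderator...
print(record_match("alice"))  # False
# ...but a long run of matches on one account does.
for _ in range(30):
    escalated = record_match("bob")
print(escalated)  # True
```

Which is exactly why the moderator concern in the text cuts both ways: the threshold filters out stray false positives, but once an account crosses it, human eyes do look at the flagged material.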
![apple-budet-iskat-zagruzhennoe-v-icloud-detskoe-porno-1.png](https://www.natpress.net/uploads/posts/2020-01/apple-budet-iskat-zagruzhennoe-v-icloud-detskoe-porno-1.png)