How simply opening a tax return led to unexpected results.
Recently, Kevin Bankston, senior advisor on AI governance at the Center for Democracy and Technology (CDT), was surprised to discover that Google's Gemini AI assistant had automatically analyzed a document he opened in Google Docs without any permission.
Kevin shared his story on social media, explaining that he simply opened his tax return in PDF format in the cloud-based document editor Google Docs and then unexpectedly received a brief summary of his taxes from Gemini. He says he did not invoke Gemini manually and never gave it consent to analyze his files.
After looking more closely at how the AI assistant behaved, Kevin noticed that the problem occurred only with PDF documents. He assumed that by once opening a PDF and clicking the Gemini button, he had "allowed" the assistant to appear every time he opened a PDF file.
Kevin then tried to disable the feature by asking Gemini itself how to do it. Unfortunately, the chatbot gave him incorrect instructions that led nowhere. He eventually found the relevant settings on his own, only to be disappointed again: the feature was already turned off, which suggests the logic governing how the assistant handles user files is even more confusing than it first appeared.
Reactions to Kevin's post varied. Some users worried that Google's services could be used to train AI on user data. Others criticized Kevin for uploading tax returns to Google Docs in the first place. One commenter asked whether Gemini's unsolicited help was at least useful; Kevin said it was not.
The takeaway from this unpleasant situation is that you should avoid opening personal documents, especially anything related to finances or other sensitive information, in cloud-based editors.
Of course, it is possible that Kevin himself made a mistake when working with Gemini, but we cannot rule out that the "corporation of good" really did decide to study user files in order to make its neural network more personalized.
Therefore, always check your privacy settings and stay aware of the risks of AI assistants accessing your personal data, especially if you do not want that data to become public.
Source