February 22, 2024

Google confirms AI privacy nightmare for iPhone and Android users

Android and iPhone updates this year will be dominated by AI, much of which will come from Google. But this comes with a serious new warning for every user, and it should change the way we use our phones.

The rollercoaster AI smartphone ride is now in full swing. It was always going to be the case that integrating generative AI into the smartphone apps we use most would overshadow the introduction of ChatGPT last year. And here we are now.

But not so fast: all this poses a huge risk to your security and privacy.

It seems we all have a blind spot when it comes to generative AI chatbots. We can be careful about the apps we install, the permissions we grant, the browsers we use, and the data we share with Facebook, Google, and others. But put us in front of an AI chatbot and we forget all this. Suddenly we find ourselves in what feels like a private chat with a helpful new friend. And we want to share.


But this is clearly not a friend, this is a front for a computer ecosystem worth billions of dollars that is ultimately funded by advertising and data trading.

I’ve warned about this before, with the introduction of AI chat into our private messaging apps. Now that Bard is becoming Gemini and a range of new apps are starting to make their way to our phones, Google itself has warned all Android and iPhone users to be very careful with these new technologies.

“Don’t provide confidential information in your conversations or data that you wouldn’t want a reviewer to see or that Google uses to improve our products, services, and machine learning technologies,” Google warns. “Google collects your Gemini Apps conversations, related product usage information, location information, and your feedback.” It says the information is used to “improve and develop Google products and services and machine learning technologies.”

Fortunately, Google assures that “your Gemini Apps conversations will not be used to serve you ads,” although that could change, at which point “we will clearly communicate this to you.”

The risk here is that a huge privacy nightmare befalls those who share too much with the various AI chatbots that help us write our business plans and sales presentations, or cheat on our school homework. When you exit the chat, the questions you asked and the answers you received are part of a record that is stored, meaning it can be retrieved and reviewed. And it can also potentially leak.

The standalone apps are just the beginning, of course, and Google also warns that “when you integrate and use Gemini Apps with other Google services, they will store and use your data to provide and improve their services, in accordance with their policies and the Google Privacy Policy. If you use Gemini Apps to interact with third party services, they will process your data according to their own privacy policies.”

The risks of generative AI are only now becoming clear. For example, when it comes to messaging, I warned that Google will apparently ask Gemini (née Bard) to view all your previous private messages to shape the context and content of its suggestions. It will also violate the end-to-end encryption of Messages.

That open storage outside the device is the real problem here. Google says that data will be retained for a maximum of 18 months “by default,” which you can change to 3 or 36 months in your Gemini Apps Activity setting. Information about your location, including the general area of your device, IP address, or home or work addresses in your Google Account, is also stored with your Gemini Apps activity.


This is of course not limited to Google. This level of data collection and use is fairly typical of the emerging Gen-AI industry. How the security and privacy of a Google will compare with those of an OpenAI, for example, remains to be seen.

But, as ESET’s Jake Moore warns, “any data we share online – even in private channels – has the potential to be stored, analyzed and even shared with third parties. When information is at a premium and even seen as its own currency, AI models can be designed to dig deeper into users who reveal large amounts of personal information. Data sharing could ultimately cause security and privacy issues in the future and many users are simply unaware of the risks.”

Google says you can disable long-term data collection within Gemini if you play with the settings. “Future conversations will not be sent for human review or used to improve our generative machine learning models… You can also delete a chat from your pinned and recent chats. Doing this will also delete the related activity from your Gemini Apps activity.” But even if you do this, “your conversations will be retained on your account for up to 72 hours so that we can provide the service and process feedback. This activity will not appear in your Gemini Apps activity.”

As I’ve noted before, the on-device/off-device nature of the AI analytics that will power this new generation of smartphone functionality will be the next big divide. Apple will likely do as much as possible with its own apps and services on the device, if that works. And we know it’s now experimenting with on- and off-device performance. Google will do a lot more in its cloud, given its very different setup and focus.


For the millions of Android and iPhone users who now have Gemini-powered apps on their devices, there are some tough choices to make. The rest of us won’t be far behind.

We can’t have it both ways. Either we value our privacy and the tremendous strides made in recent years in private browsing, anti-tracking, and location-sharing controls. Or AI integrated into mainstream apps is just too cool to live without, and we’ll use it for everything.

If the latter is the case, this could become the ultimate “be careful what you wish for.”
