Worried about DeepSeek? Well, Google Gemini is collecting more of your personal data.

DeepSeek AI, developed in China, has raised a number of privacy and security concerns since its launch, with some governments banning the service or launching investigations into its data handling practices. When it comes to privacy, though, the Chinese chatbot may not be the worst offender.

According to recent data from Surfshark, one of the best VPN providers on the market, Google Gemini takes the gold medal as the most data-hungry AI chatbot app. DeepSeek, in fact, comes in only fifth out of the 10 most popular apps for aggressive data collection.

Surfshark researchers also found that 30% of the chatbots they analyzed shared user data, such as contact details, location, and search and browsing history, with third parties, including data brokers.

The Real Cost of Using AI Chatbots

As Tomas Stamulis, Chief Security Officer at Surfshark, explains, the apps we use every day routinely collect our personal information. While some of this data is essential for the apps to function, some of it is tied to our identities. “AI chatbot apps can go even further by processing and storing conversations,” he says.

To determine the true privacy price tag attached to AI chatbots, Surfshark researchers looked at the privacy details of the 10 most popular apps in the Apple App Store. They then compared how many types of data each app collects, whether any of that data is linked to its users, and whether the app includes third-party advertisers.

The analysis revealed that the apps collect an average of 11 different types of data out of 35 possible. As mentioned earlier, Google Gemini stands out as the most data-hungry service, collecting 22 of these data types, including highly sensitive data like precise location, user content, device contact lists, browsing history, and more.

Of the applications analyzed, only Google Gemini, Copilot, and Perplexity were found to collect precise location data. The controversial DeepSeek chatbot sits in the middle, collecting 11 unique types of data, such as user input like chat history. The main issue here – and what has attracted privacy complaints under GDPR rules – is that the provider's privacy policy states it keeps this data for as long as necessary on servers located in China.

Its competitor ChatGPT is hot on the heels of Gemini, with 10 types of data collected. These include contact information, user content, identifiers, usage data, and diagnostics. It’s also worth noting that while ChatGPT collects chat history, you can choose to use a temporary chat instead to ensure this information is deleted after 30 days – or request that personal data be removed from its training sets.

App data collection is only one aspect of the privacy issue.

That's because, Stamulis explains, "this data could be used within the company or shared across third-party networks, potentially reaching hundreds of partners, and leading to highly targeted advertising or an increase in spam calls."

Researchers also found that 30% of these chatbot apps track user data. This means the user or device data collected by the app is linked to third-party data for targeted advertising or ad measurement purposes.

Copilot, Poe, and Jasper are the three apps that collect data used to track you. Surfshark experts noted that this data “can be sold to data brokers or used to serve targeted ads in your app.” Copilot and Poe only collect device identifiers for this purpose, while Jasper collects device identifiers, product interaction data, advertising data, and other usage data, which refers to “any other data about user activity in the app.”

“As a general rule, the more information is shared, the greater the risk of data leakage,” Stamulis said, adding that cybercriminals have been known to exploit such incidents to create personalized phishing attacks that can result in massive financial losses.

Stamulis recommends being mindful of the information you share with chatbots, reviewing your sharing settings, and disabling chat history whenever possible.
