Being single on Valentine’s Day can be depressing, but seeking comfort in conversations with an AI companion may be no better. Not only do these apps lack a real personality, but what they truly desire is your personal data.
Surfshark privacy experts found that four of the five most popular AI companion apps on the Apple App Store may track their users’ personal data for profit.
“Instead of being there for us, they can feel more like surveillance tools,” said Surfshark cybersecurity expert Miguel Fornés, pointing out how tracking by AI companions can shake users’ confidence while invading their privacy.
AI companions: which are the hungriest for data?
The Surfshark team closely inspected the data collection practices of five AI companion services. The details were obtained from the Apple App Store and cover the number, type, and handling of the data types each application collects.
Of the applications analyzed (Kindroid, Nomi, Replika, EVA, and Character AI), 80% “can use data to track their users.”
Tracking, the experts explain, refers to linking user or device data collected from the app with user or device data collected from other apps and websites for targeted advertising purposes. Tracking also involves sharing user or device data with data brokers.
“Such detailed data can allow companies to influence your choices, which can have negative effects such as overwhelming ads, financial risks, or other unexpected problems,” the Surfshark cybersecurity expert said.
Character AI was the most in love with user data. While the average app collected 9 of the 35 possible unique data types, Character AI rose above its competitors by collecting as many as 15. EVA was the second-hungriest of the lot, collecting 11 data types. Worse, both applications collect users’ approximate location to deliver targeted ads.
Nomi was the only application that stood apart by not collecting data for tracking purposes.
However, the data the service collects is not the only concern. App developers, Surfshark explains, could also access the data users voluntarily share during their conversations with the AI chatbot.
The danger here is that AI companion apps are designed to simulate human interactions such as friendship and love, so users may be willing to reveal more sensitive information than they would to chatbots like ChatGPT.
“This can lead to unprecedented consequences, particularly as AI regulations are only just emerging,” the experts point out.
That is why Surfshark advises taking some precautions when using an AI companion, to keep your personal data safe and minimize misuse.
Fornés said: “Be sure to check what permissions these applications have, and be mindful of what information you are sharing.”