Since OpenAI introduced ChatGPT, privacy advocates have warned consumers about the potential threat to privacy posed by generative AI apps. The arrival of a ChatGPT app in the Apple App Store has ignited a fresh round of caution.

“[B]efore you jump headfirst into the app, beware of getting too personal with the bot and putting your privacy at risk,” warned Muskaan Saxena in Tech Radar.

The iOS app comes with an explicit tradeoff that users should be aware of, she explained, including this admonition: “Anonymized chats may be reviewed by our AI trainer to improve our systems.”

Anonymization, though, is no ticket to privacy. Anonymized chats are stripped of information that can link them to particular users. “However, anonymization may not be an adequate measure to protect consumer privacy because anonymized data can still be re-identified by combining it with other sources of information,” Joey Stanford, vice president of privacy and security at Platform.sh, a Paris-based maker of a cloud services platform for developers, told TechNewsWorld.

“It’s been found that it’s relatively easy to de-anonymize information, especially if location information is used,” explained Jen Caltrider, lead researcher for Mozilla’s Privacy Not Included project. “Publicly, OpenAI says it isn’t collecting location data, but its privacy policy for ChatGPT says they could collect that data,” she told TechNewsWorld.

Nevertheless, OpenAI does warn users of the ChatGPT app that their information will be used to train its large language model. “They’re not hiding anything,” Caltrider said.