Ammon News - Experts from the Hasso Plattner Institute (HPI) in Germany, which specializes in information technology, have warned against sharing confidential information with the artificial intelligence chatbot ChatGPT, citing the platform's weak data protection and privacy safeguards.
The experts said that, beyond concerns that ChatGPT can give false or misleading answers to questions, the service should also not be regarded as a safe place for private data.
This comes at a time when users increasingly rely on AI chat platforms such as ChatGPT in their daily lives, prompting data security experts to warn against sharing sensitive information through them.
At the same time, the HPI experts point out that users may be tempted to feed sensitive data into the platform, believing that the more information they provide, the better and smarter the answers will be.
It should be noted that any user who shares confidential information on an artificial intelligence platform risks giving up the privacy of that data, a point confirmed by ChatGPT's own privacy policy. OpenAI, the developer of ChatGPT, says: "As part of our commitment to security and responsibility, we review our chats to improve our systems and ensure that the content meets our policies and security requirements." In other words, platform employees can see what users are typing, although the company does provide a data deletion feature.