Privacy Campaign Group Takes on ChatGPT in Austria: Demands Accuracy and Compliance


Vienna-based privacy campaign group NOYB announced on Monday that it intended to file a complaint against OpenAI with Austria's data protection authority. The group claimed that the company's flagship AI tool, ChatGPT, produced incorrect answers about individuals and that OpenAI was unable to correct these inaccuracies.

NOYB pointed out that under the EU's General Data Protection Regulation (GDPR), information about individuals must be accurate. The group argued that if a system cannot produce accurate and transparent results, it should not be used to generate data about individuals. NOYB data-protection lawyer Maartje de Graaf stressed that technology has to comply with legal requirements, not the other way around.

The campaign group cited instances in which ChatGPT provided incorrect information about the birth date of NOYB founder Max Schrems. Despite requests for OpenAI to rectify or erase the incorrect data, the company refused, claiming it was impossible. NOYB also said that OpenAI failed to respond adequately to Schrems' request to access his personal data, in violation of EU law.

Since its launch in November 2022, ChatGPT has attracted attention for its impressive capabilities, but the technology has also faced criticism and legal action in several countries. Italy temporarily blocked the service, and France's data protection regulator opened an investigation in response to complaints. NOYB argues that authorities have so far struggled to keep AI tools in check, and it is asking for OpenAI to be fined in order to ensure compliance with EU law.

In recent years, there have been growing concerns about privacy and data protection as more people share their personal information online. This is where privacy campaign groups like NOYB play a critical role in advocating for stronger laws and regulations to protect individuals’ rights.

One of the central issues NOYB raises is accuracy when AI systems generate statements about individuals. Because tools like ChatGPT rely on statistical machine learning models, there is always a risk of errors or biases in their outputs.

NOYB also raised concerns about where the data used by ChatGPT came from and what the system stores about individuals. OpenAI says it does not deliberately collect personally identifiable information from ChatGPT users, but that does not mean the company avoids processing personal details such as names or dates of birth.

As such, it is crucial for companies like OpenAI to have robust data-protection policies and procedures in place so that they comply with legal requirements and respect users' rights.
