
Samsung Employees Exposed Sensitive Data While Communicating with ChatGPT

According to media reports, Samsung allowed its engineers to use ChatGPT in their work, and they ended up disclosing confidential company data.

Samsung employees began using the AI to quickly fix errors in source code and, in the process, “leaked” confidential data to the chatbot, including notes from internal meetings and data related to the company’s production and profitability. As a result, access to ChatGPT may be blocked for Samsung employees.

Let me remind you that we also wrote that Amateur Hackers Use ChatGPT to Create Malware, and that Microsoft Will Limit Chatbot Bing to 50 Messages a Day.

Also, information security specialists reported that a Blogger Forced ChatGPT to Generate Keys for Windows 95.

The Economist reports that within 20 days the company recorded three separate cases of data leaking via ChatGPT. In one case, a Samsung developer gave the chatbot the source code of a proprietary error-correction program, essentially exposing secret code to an AI application run by a third party.

In the second case, an employee shared with ChatGPT test patterns designed to identify defective chips and asked it to optimize them. These test patterns are also highly sensitive data: optimizing them can speed up chip testing and verification, significantly reducing costs for the company.

In the third case, a Samsung employee used the Naver Clova app to convert a recording of a private meeting to text and then sent it to ChatGPT to prepare a presentation.

All this prompted Samsung management to warn employees about the dangers of using ChatGPT. The company informed managers and staff that any data ChatGPT receives is transmitted to and stored on external servers and cannot be “revoked”, which increases the risk of confidential information leaks. In addition, ChatGPT learns from the data it receives, which means it could disclose confidential information to third parties.

The company is currently working on safeguards to prevent similar incidents in the future. If such situations continue even after these measures are taken, Samsung employees’ access to ChatGPT may be blocked altogether. According to journalists, the company also plans to develop its own AI service, similar to ChatGPT, but intended for internal use.

Daniel Zimmermann

Daniel Zimmermann has been writing on security and malware subjects for many years and has been working in the security industry for over 10 years. Daniel was educated at the Saarland University in Saarbrücken, Germany and currently lives in New York.
