THINK AI-FFERENT

Apple Says Employees Can’t Use ChatGPT: Report

The company is concerned its staff could leak confidential data to the artificial intelligence tool, according to The Wall Street Journal.

Apple Inc. has cracked down on employee use of ChatGPT and other external artificial intelligence tools over concerns about confidential data leaks, according to The Wall Street Journal. Citing a document and sources familiar with the matter, the newspaper reported that Apple, which is working to develop its own artificial intelligence engine, has also restricted use of Copilot, a code-generating tool from GitHub, which Microsoft owns. With the move, the iPhone maker joins a growing group of companies, including JPMorgan, Walmart, and Verizon, that have reacted with trepidation to the potential of artificially intelligent models. During a quarterly earnings call earlier this month, Apple CEO Tim Cook said that while “the potential is certainly very interesting,” there remained “a number of issues that need to be sorted” before the technology could be explored further. The news comes the same day it was reported that OpenAI, the Microsoft-backed research laboratory, had launched a ChatGPT app on the U.S. App Store.

Read it at The Wall Street Journal