ChatGPT Edu feature reveals researchers’ project metadata across universities (exclusive)
A configuration in Codex Cloud Environments lets thousands of colleagues see repository names and activity linked to ChatGPT accounts.
[Photos: HyperlapsePro/Adobe Stock; Jonathan Kemper/Unsplash]
BY Chris Stokel-Walker
High-level information about the private work of students and staff using ChatGPT Edu at several universities can be viewed by thousands of colleagues across their institutions, owing to a widespread misunderstanding of what is being shared, according to a University of Oxford researcher who identified the issue.
The problem affects Codex Cloud Environments in ChatGPT Edu and exposes the names and some metadata associated with the public and private GitHub repositories that users within a university have connected to their ChatGPT Edu accounts.
No private code or repository contents were exposed to unauthorized users. Nevertheless, the metadata that is visible can still reveal a meaningful picture of users’ activity.
“Anyone at the university, or a large number of people at least—including me—can see a number of projects [people have] been working on with ChatGPT,” says Luc Rocher, an associate professor at the University of Oxford, who identified the issue and raised it with both the University of Oxford and OpenAI through responsible disclosure. He later approached Fast Company after what he felt was an inadequate response from both.
In addition to the projects, Rocher says he could see how many times users interacted with ChatGPT on a given project and when those conversations began. From that metadata, Rocher was able to piece together that an Oxford student was working on an article for submission using OpenAI’s tools—something the student confirmed when Rocher approached them.
“In terms of the width of different people that can access each other’s behavioural data, that is quite worrying,” says a separate University of Oxford researcher, who was granted anonymity by Fast Company to speak freely about their employer. However, the researcher acknowledges that the data exposure is internal and, while broad, limited in depth. “I suspect that might be why the data protection team haven’t reacted as quickly as if it was a public-facing thing.”
However, the researcher calls the institution’s lack of response “naïve.” They add: “There are reasons for researchers to have private repositories.”