1 minute read

Does using ChatGPT count as a public disclosure?

Amazon has warned its employees not to share confidential information on ChatGPT. This is very sensible advice.

We have also noticed a number of articles discussing whether ChatGPT could be used to help applicants (or attorneys) with patent drafting, an idea we would strongly warn against. One knotty issue is whether the ChatGPT system may be trained on confidential input provided to it, and may subsequently disclose that information in conversations with other users.

However, there is a far clearer reason to avoid inputting your inventions into ChatGPT. The FAQ for ChatGPT explicitly warns users not to share any sensitive information in their conversations with the chatbot, as the content of those conversations may be reviewed by the system as well as by human trainers. The relevant section of the FAQ is shown below.

Making information available for OpenAI employees to review, whether or not they actually do so, is likely to be considered a public disclosure, as these employees have not entered into a confidentiality agreement with the user regarding the invention. While some countries offer limited grace periods, a public disclosure of this nature will generally mean that the invention is no longer eligible for patent protection.

It seems almost inevitable that Large Language Models (or their successors) will have a profound impact on almost all areas of life, including Intellectual Property. At the moment the hype is around ChatGPT in particular, but for now at least, we would strongly discourage using ChatGPT to help draft patent applications.



patents, artificial intelligence, data & connectivity, digital transformation