
ChatGPT data security concerns


By Mary Jones - OIT Security Compliance Program Manager

This awareness article discusses concerns arising around the AI application ChatGPT, which is developed and owned by OpenAI. The most pressing issue is the company's vague privacy policy, which creates major concerns for any research data or university personal information entered into its applications. ChatGPT and similar AI apps place few restrictions on the data they collect, both automatically and through conversations.

OpenAI has marketed ChatGPT as a tool that can enhance productivity and wrangle complex datasets. AI apps have grown wildly popular almost overnight, and with that popularity come heightened threats to data privacy. Users who share sensitive information, such as research data or personal information, may not realize that there is no guarantee of how it will be used. According to OpenAI's privacy policy, personal information can be used in numerous ways:

  • To provide, administer, maintain, improve and/or analyze the Services;
  • To conduct research;
  • To communicate with you;
  • To develop new programs and services;
  • To prevent fraud, criminal activity, or misuses of our Services, and to ensure the security of our IT systems, architecture, and networks; and
  • To comply with legal obligations and legal process and to protect our rights, privacy, safety, or property, and/or that of our affiliates, you, or other third parties.

This type of policy language gives the company broad legal rights to collect your data and use it to its benefit. In effect, any data you enter into its applications is the company's to capitalize on for its own gain. Two examples of how your data could be exposed:

  • If you wanted to use the app to summarize notes from a meeting and pasted your notes into the app, you would have inadvertently disclosed personal information to ChatGPT about colleagues and the university.  
  • If you wanted to share your research notes so that the tool could generate a summary for a presentation, you would have placed research findings into ChatGPT that might be discovered and used by other researchers or the general public.  
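If notes must be pasted into such a tool at all, one defensive habit is to strip obvious identifiers first. The sketch below is a hypothetical illustration (the patterns, labels, and function name are ours, not an OIT tool), and it only reduces exposure; redacted text is still being shared with a third party.

```python
import re

# Illustrative patterns only -- real notes contain many identifiers
# (names, IDs, addresses) that simple regexes will miss.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

notes = "Follow up with jdoe@colorado.edu or call 303-555-0100 about the grant."
print(redact(notes))
# -> Follow up with [EMAIL] or call [PHONE] about the grant.
```

Even with a scrub like this, the safest choice remains not pasting sensitive notes into these tools at all.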

If you think you can simply ask OpenAI to delete your history and information, it is not that simple. Identifying the origin of data shared with ChatGPT and similar apps can, in some cases, be impossible, and the company's privacy policy gives no assurance that a user's data can be removed after the fact.

The takeaway here is to be extremely cautious with what type of data you share with ChatGPT. Once it is shared online, it will be there forever!  

If you are currently using or considering apps like ChatGPT, the Office of Information Technology asks that you not enter any highly confidential or confidential information into them. If you have questions about what the university considers highly confidential or confidential information, please visit the data classification webpage for clarity or contact us directly at security@colorado.edu.

If you believe that sensitive information has been inappropriately shared with ChatGPT, or anywhere else online, please contact the OIT Security team immediately via the email provided above.

 

Additional Readings

OpenAI Terms of Use

Wired “ChatGPT Has a Big Privacy Problem”

CTV News “Don't tell anything to a chatbot you want to keep private”