Artificial Intelligence Data Safety Guidelines

Last Updated: June 20, 2025

Artificial Intelligence (AI) tools like Gemini, ChatGPT, and Claude can be powerful resources, but it is important to understand how these tools use your data so you can make an informed decision about which tool may be most appropriate for your data.

This document is designed to help you determine what data can be safely shared with AI while ensuring compliance with the 91³Ô¹ÏºÚÁÏÍø’s Information Security policies. It provides recommendations for safe data sharing based on four data categories: Restricted, Confidential, Protected, and Unrestricted. If you are unsure about how to classify your data, please contact Enterprise Information Governance.

This guide offers general recommendations on data safety when using AI tools, but it is not exhaustive. As AI technologies, privacy policies, and terms of use continue to evolve, it is your responsibility to assess whether sharing specific data is appropriate and secure. If you are unsure, follow the Information Security Policies or contact aiguide@ualberta.ca before proceeding.

Why we recommend Gemini

Google’s Gemini App provides an institutionally approved, free and secure option for meeting the demand for access to generative AI as a productivity tool. Interactions with the Gemini App stay within the Google Workspace Cloud environment isolated to the U of A and are not reviewed by humans or otherwise used to improve Google's generative AI models.

The Gemini App is currently the only generative AI platform approved for use with 91³Ô¹ÏºÚÁÏÍø data. While it is recommended as a safe generative AI option in terms of data privacy and security, there are still limitations on what data is safe to use with it.

The table below details which types of data are safe to use with institutionally approved tools and with tools that have not been reviewed by the institution. Other generative AI tools are not currently authorized for use involving University data and should be approached with caution.

If you are using AI tools for research, please connect with the Safeguarding Research Office (safegrd@ualberta.ca) to discuss the security considerations relating to use of AI tools to support your research.

Table: data safety by data classification and AI tool type

Data Classification   Institutionally Approved Tool*   Tools Not Reviewed by the Institution
Restricted            High risk                        High risk
Confidential          Not advised                      High risk
Protected             Approved                         Not advised
Unrestricted          Approved                         Approved

*Institutionally approved tools include the Gemini App accessed via your CCID.

Legend

Approved: This type of data is generally safe to use with the identified AI tool type, provided the tool adheres to the specified data use conditions.

Not Advised: Some risks may exist; review the tool’s privacy settings and terms of use before sharing data. Before proceeding, confirm the appropriateness of the data use with the data steward or data owner, in alignment with the university’s records management policy.

High Risk: This type of data must not be shared with the AI tool.

Definitions

Restricted Data

This classification is for information that is extremely sensitive and could cause extreme damage to the integrity, image or effective service delivery of the 91³Ô¹ÏºÚÁÏÍø. Extreme damage includes loss of life, risks to public safety, substantial financial loss, social hardship and major economic impact. Restricted information is available only to named individuals or specified positions. (Examples include restricted spaces, credit card numbers, social insurance numbers and personal medical records).

Confidential Data

This is for information that is sensitive within the 91³Ô¹ÏºÚÁÏÍø and whose disclosure could cause serious loss of privacy or competitive advantage, loss of confidence in university programs, or damage to partnerships, relationships and/or reputation. Confidential information includes highly sensitive personal information. Confidential information is available only to a specific function, group or role. (e.g., personnel files, including personal salary data, and third-party business information submitted in confidence).

Protected Data

This is for information that is sensitive outside the 91³Ô¹ÏºÚÁÏÍø and could impact service levels or performance, or result in low to medium levels of financial loss to individuals or enterprises, loss of privacy, loss of confidence in university programs or damage to partnerships, relationships and/or reputation. Protected information includes personal information, financial information or details concerning the effective operation of the 91³Ô¹ÏºÚÁÏÍø. Protected information is available to employees and authorized non-employees (contractors, sub-contractors and agents) possessing a need to know for a business-related purpose. (e.g., grades, dates of birth and personal contact information other than university email addresses).

Unrestricted Data

This is for information that is created in the normal course of business and is unlikely to cause harm. Unrestricted information includes information deemed public by legislation or through routine disclosure or active dissemination. Unrestricted information is available to the public, employees, and contractors, sub-contractors and agents working for the university; where the information has not been made available to the public, its release would not have any harmful or negative effect. (e.g., university email addresses, building names, program names).

Data classifications and categories are defined in the Institutional Data Management and Governance Procedure.

Additional Resources