Protection, Equity and Specific Areas
Data Protection, Confidentiality and Information Security
From a good practice perspective, it is recommended not to enter personal data of students or university staff into external Artificial Intelligence tools, nor to include sensitive or specially protected data, confidential information or non-public institutional documentation.
It is also recommended that Artificial Intelligence systems be used in accordance with the applicable data protection legislation and with the information security policies in force at the EHU.
When outputs generated by Artificial Intelligence systems are incorporated into official documents or institutional communications, it is advisable that a responsible individual review and verify them beforehand, in order to ensure their appropriateness, reliability and compliance with institutional criteria.
Equity and Access
When the use of Artificial Intelligence systems forms part of an assessable activity, it is recommended to guarantee equitable conditions of access, by providing reasonable alternatives or institutional resources that prevent situations of inequality.
Procedural Safeguards in the Interpretation of the Use of Artificial Intelligence
Use of Artificial Intelligence Systems by Technical, Management and Administrative Staff
In the context of administrative functions, it is recommended that the use of Artificial Intelligence systems take into account, among other aspects, the following:
- That it be conducted under effective human supervision, avoiding the full automation of administrative processes.
- That prior human verification be incorporated into official documents, institutional communications, and procedures with legal or administrative effects.