Office of Undergraduate Research AI Policy

The learning and creation of new knowledge through research is a deeply human venture. Researchers use a variety of tools to assist learning and creating, yet the responsibility for the work, learning, validity, authenticity, and accuracy always lies with the human researcher. Artificial intelligence (AI), specifically generative AI, has recently and rapidly become a tool available to researchers to enhance or expedite the research process. Generative AI is not, and cannot be, a substitute for the human endeavor in creating and learning.

The use of AI tools may also depend on the values of the researcher. As with many other tools or technologies, the use of AI can have unintended negative consequences for the environment, the generation and spread of misinformation, social and professional productivity, and even the educational progress of users; therefore, it is important that every researcher consider their use of AI tools carefully and ensure it is consistent with their core values. The University of North Carolina at Charlotte has expressed a commitment to the ethical and responsible use of artificial intelligence to enhance academic and administrative endeavors. The University’s vision is to create an institutional environment where faculty, staff, and students can engage with AI technologies to enhance teaching, learning, research, and operations, while upholding the highest standards of responsibility and integrity.

In support of developing new knowledge ethically and responsibly, the Office of Undergraduate Research (OUR) has provided additional guidelines and interpretation of the University’s vision, which apply to the application process, program requirements, program management, recruitment and identification of students, and the research that OUR-sponsored students and mentors are doing. First and foremost, the Office of Undergraduate Research emphasizes that the student researcher must be able to understand, defend, and articulate their work, regardless of whatever tools they may use to facilitate that research.

The use of generative AI in a research proposal, application essay, or conference abstract, as well as in presentations of findings, posters, research or scholarly manuscripts or project reports, creative works, or even the program reflection or assessment, must adhere to the Code of Student Academic Integrity. OUR has adopted the following standards for any content produced or significantly assisted by AI-powered systems within the context of academic research, scholarship, or creative work.

Generative AI Use in Office of Undergraduate Research (OUR) Applications and Proposals or in OUR Supported Research, Scholarship, and Creative Works

  • When using AI for any purpose, integrate your personal knowledge and expertise. AI should enhance your understanding rather than serve as a shortcut that limits the learning and benefit you receive from your work.
  • AI-generated content may be used as a tool to assist in the writing or research processes but not as a replacement for original thought, analysis, and critical thinking.
  • Students using AI for a research project, a research manuscript, or creative work must have documented prior approval from their faculty mentor for the project.
  • The final responsibility for the quality, accuracy, and academic integrity of a research project, a research presentation or manuscript, or creative work produced with the assistance of AI lies with the student. (See the Code of Student Academic Integrity)
  • If you cannot explain or demonstrate full understanding of the content of your proposal or project without AI support, you have gone beyond acceptable use.
  • If AI tools are used during the development of the research proposal, application essay, or conference abstract, their use must be disclosed in the proposal or essay, and students must clearly identify which part(s) of their work were generated or significantly influenced by AI, providing proper citation and attribution where appropriate.

Generative Artificial Intelligence Use by Reviewers during any Review Process

To protect the intellectual confidentiality of the proposals submitted, OUR does not promote the use of generative AI tools during the peer-review process for analyzing or formulating peer-review critiques of grant applications or abstracts. Reviewers are asked to refrain from uploading any content containing intellectual property submitted to OUR into any generative AI technology. OUR may use AI only in a manner consistent with OneIT’s AI security checklist, and only in the following ways:

  • To protect reviewer identity, OUR may use generative AI to summarize review content for student feedback.
  • OUR may use AI to organize and pre-categorize student applications to expedite the evaluation of application competitiveness within the applicant pool.

Examples from Purdue University Office of Undergraduate Research

Examples of Acceptable Uses of AI: 

  • Brainstorming and Idea Generation: Using AI to help explore potential directions or organize thoughts. This can be useful when determining a way to convey a complex topic to a generally educated audience. 
  • Editing Support: Using AI for grammar, spelling, clarity, tone, and readability improvements. 
  • Presentation Practice: Using AI to generate sample questions for Q&A or simulate practice sessions. 
  • Citation Checks: Using AI tools to verify formatting or suggest citation improvements. 

Examples of Unacceptable Uses of AI: 

  • Content Creation: AI must not be used to generate full sections of your presentation’s content or your abstract. 
  • Automated Design: AI tools must not be used to fully design your research poster or slides without substantial human input and editing. 
  • Data Fabrication or Analysis: AI may not be used to fabricate data or perform analyses that misrepresent the project. 
  • Unacknowledged AI Use: Any substantial assistance from AI beyond the acceptable uses above must be disclosed to your mentor and may require acknowledgement on your presentation. 

Additional Resources:

Student Guide to Artificial Intelligence (2025)

Council on Undergraduate Research Statement (2026)

Additional Resources for UNC Charlotte:

AI Task Force Report (2025)

AI at UNC Charlotte

OneIT Guiding Principles, Security Checklist, and Guidance

Adopted January 2026