Turnitin Clarity Limited Pilot

Summary

While vendors often claim high accuracy for their AI detection tools, faculty experience and independent research have shown mixed results. OIT conducted an assessment of AI detection tools in summer 2025 and found that many CU Boulder faculty perceive these tools as inconsistent and biased. Reflecting these sentiments, the majority of CU's peer institutions in the AAU do not endorse AI detection technologies and caution against their use, citing both ethical and pedagogical concerns.

In response to faculty interest in transparency tools as alternatives to AI detection, OIT coordinated two vendor-led information sessions for Turnitin Clarity in October 2025. As an add-on to Turnitin Feedback Studio, Clarity is designed to support writing instruction by providing visibility into the student writing process. In a follow-up survey to the information sessions, some faculty indicated interest in exploring Clarity further to support student writing and academic integrity.

To address these instructional needs, OIT is launching a pilot of Turnitin Clarity for the spring 2026 semester. The goal is to assess Clarity's usability, pedagogical value, accessibility, and overall impact on teaching and learning. While this initiative reflects interest in transparency tools as alternatives to AI detection, adoption of Clarity is not guaranteed. Findings from the pilot will be shared with advisory groups and IT governance to help determine next steps.

Process

Launch

Once the tool is integrated into the LMS, the ATI team will launch the pilot with faculty participants. The team will:

  • Host vendor-led onboarding sessions for faculty.
  • Provide quick-start guides and troubleshooting resources for faculty and students.
  • Establish escalation paths for technical issues.
  • Monitor early usage to resolve onboarding issues promptly.

Analysis

During the pilot, ATI will:

  • Liaise with faculty and students to collect qualitative feedback through periodic check-ins.
  • Deploy mid- and end-of-semester surveys to gather faculty and student feedback.
  • Analyze data, including support tickets submitted through ServiceNow and logged with the vendor, to evaluate the tool's usability, pedagogical impact, and accessibility.
  • Prepare a summary report with recommendations.
  • Document lessons learned for future assessment and pilot initiatives.

OIT Technical & Business Operating Principles

  • User experience matters: Through the pilot, assess the tool's effects on the student writing experience and the impact on faculty workflow and grading.
  • Security is foundational: Keep the pilot small and controlled to allow a thorough security review while minimizing exposure to risk during implementation and use.
  • Strategically use governance: Collect and analyze usage data and feedback to support leadership and IT governance in informed decision making.
  • Innovate where it matters: Address a critical instructional need by supporting faculty and students in navigating the challenges of AI writing in education.
  • Understand higher ed landscape: Further our commitment to supporting evolving pedagogical practices and student expectations in the context of AI and academic integrity.

Project Participants and Roles

  • Ann Ruether, Academic Technology Professional
  • Raj Singh, LMS Administrator
  • Rebecca Kallemeyn, Program Manager for Academic Technology Initiatives, Consulting & Training
  • Kortney Russell, Graduate Research Assistant
  • Jessica George, Academic Technology Professional