Beyond AI-Detection: How VisibleAI Builds Future-Ready Graduates Through Authorship Transparency

As generative AI becomes inseparable from academic work, higher education faces a pivotal question: How do we prepare students to use AI responsibly while strengthening the human skills employers value most, such as critical thinking, creativity, judgment, and originality?

This question guided a recent Kritik panel discussion featuring:

  • Dr. Keely Croxton, Associate Dean of Undergraduate Studies, Fisher College of Business, Ohio State University
  • Dr. Andrew Reffett, Associate Dean for Educational Excellence, Farmer School of Business, Miami University

Both leaders agreed: AI does not replace the need for human thinking; it amplifies it.

Preparing students to thrive in an AI-integrated world requires more than new rules or restrictions. It calls for a renewed focus on developing human skills, promoting transparency, and helping learners engage critically with technology. VisibleAI, Kritik's authorship transparency tool, reveals how student work is created by distinguishing between AI-assisted and original writing.

The platform gives educators full visibility and control over how students use AI in their assignments: all edits are documented, and its built-in AI assistant can be configured to allow or restrict specific capabilities beyond those of a default chatbot. This enables instructors to guide ethical, responsible AI use while assessing student originality and fostering meaningful reflection throughout the writing process.

From “AI as a Threat” to “AI Literacy”

AI is now a core workplace skill. Employers are increasingly looking for graduates who not only know how to use AI tools but can use them well, responsibly, and creatively. Educators must move beyond limiting or banning AI use and instead guide students in applying it ethically. Understanding both the benefits and the challenges AI brings to education allows educators to make informed decisions and support students effectively.

At Ohio State, Dr. Croxton led a curriculum redesign intended to align with workforce needs and clarify the intended outcomes for students, centered on four pillars:

  • Know and apply: foundational business knowledge
  • Solve with insight: critical thinking and problem-solving
  • Build bridges: communication across roles and cultures
  • Grow with resilience: adaptability and grit

This methodology emphasizes durable human capabilities that AI cannot replicate, while also recognizing the importance of technological fluency. It reflects a broader pedagogical shift: one that positions AI not as a threat to learning, but as an opportunity to deepen it, provided that students are guided to use it thoughtfully.

By embedding these principles into course design, Dr. Croxton ensures that AI is introduced in context, connected to real-world professional scenarios, and framed as a tool that supports human insight rather than replaces it. Her work illustrates how AI literacy can be meaningfully integrated into core learning outcomes without losing sight of what higher education is fundamentally meant to develop: original thinkers, ethical decision-makers, and adaptable professionals. In the next section, we explore how these principles come to life through specific classroom activities that encourage reflection, critical thinking, and responsible AI use.

Watch Now: Highlights from Panel Discussion with Dr. Keely Croxton & Dr. Andrew Reffett

Preparing Students for a Rapidly Changing Job Market

AI is transforming industries faster than curriculum updates can keep pace. In fields like supply chain and auditing, tasks that once trained entry-level professionals are now automated. As a result, students must be prepared to take on more complex responsibilities from the start.

“Whatever I teach them today… six months from now, year from now, two years from now, it’s going to look completely different.” — Dr. Keely Croxton

Instead of teaching specific tools alone, institutions are shifting focus toward adaptability, problem-solving, and recognizing where and when AI adds value. VisibleAI supports this by helping students trace their decision-making, demonstrate how they used AI responsibly, and reflect on the value they contributed beyond what AI could generate.

Transparency in AI systems is crucial for building trust and understanding. When students, instructors, and stakeholders can see how an AI model was created and trained, and how it makes decisions, they are better able to interpret and trust its outputs.

Balancing Innovation with Integrity while Leveraging AI Writing Tools

As artificial intelligence becomes more integrated into higher education, building collaborative learning communities is more important than ever. AI writing tools and automated assignment generators can help students improve content quality, streamline their workflow, and engage more deeply with complex topics. These technologies support a range of academic tasks, from organizing data to generating citations, giving students more time to focus on creativity and critical thinking.

However, the growing use of AI in academic work raises new challenges around academic integrity. Many students are still unclear about what counts as AI-facilitated plagiarism, and institutions are grappling with how to maintain originality in student submissions. Plagiarism detection and AI content identification tools are essential in addressing these concerns and upholding fair assessment standards.

Fostering trust in academic communities requires more than just detection software. It also involves creating a culture of transparency and accountability. By openly discussing AI use and teaching students how to apply it ethically, educators can reinforce the importance of original thinking and prepare students for real-world expectations.

AI also opens the door to more collaborative and engaging classroom experiences. AI-powered platforms can support peer feedback, collaborative writing projects, and the inclusion of diverse perspectives. These tools help students connect, engage, and explore ideas more deeply.

When used strategically, AI can enhance both teaching and learning. By combining the benefits of AI with clear guidelines for academic integrity, institutions can create future-ready classrooms built on trust, collaboration, and authentic learning.

Read More: How to Effectively Integrate AI Tools in Education

Why Transparency in the World of Artificial Intelligence is Essential

Generative AI can produce essays, reports, and analyses in seconds, challenging traditional models of assessment and academic integrity. The goal is no longer simply to detect AI use, but to understand how it was used, how the content was created, and whether students engaged in critical thinking. Being transparent about AI means being honest about what a system is intended to do, where it fits within an institution's broader strategy, and how it is likely to affect people.

“It allows us to assess the process. And if we can get students to show us how they are using AI and how their own mind is crafting and enhancing what the AI is putting out, I think that is really important.” — Dr. Keely Croxton

AI is increasingly shaping students' lives, influencing how they learn and interact with educational content. For stakeholders to trust that AI is making effective and fair decisions on their behalf, they need visibility into how the models operate and the logic of the algorithms.

In a world where AI can produce a polished assignment instantly, focusing solely on the final product is no longer sufficient. VisibleAI tracks student engagement with AI tools, enabling educators to assess deeper learning and cognitive effort. By highlighting the student's own voice and decisions, VisibleAI promotes academic integrity and moves the conversation from AI prevention to AI accountability.

AI tools can improve grammar, structure, and clarity, but they also risk obscuring student voice and original thinking. Research shows, however, that when students combine AI feedback with structured reflection, they develop AI literacy, critical awareness of AI's limitations, and stronger writing agency, supporting deeper learning and intentional revision (Sperber et al., 2025). AI can also introduce privacy and security concerns around the data it collects and processes, so educational platforms must prioritize data security and confidentiality.

VisibleAI offers transparency into the writing process by showing:

  • Where AI was used
  • How much of the content came from AI
  • How students edited, revised, or enhanced AI-generated text

Instructors can view when AI tools were used during writing, how students revised AI-generated drafts, and whether the student's thinking is evident in the final submission. This supports a shift toward process-based assessment, giving credit for reflection, judgment, and iteration.

Download: Teaching Students How to Properly Cite AI

How Faculty Are Redesigning AI-Enhanced Learning

Faculty across institutions are exploring creative strategies to bring critical thinking and AI literacy into the classroom. In this process, they are examining different aspects of AI-enhanced learning and involving various stakeholders, including faculty, students, and instructional designers, in redesigning educational experiences. Here are four use cases discussed by Dr. Reffett and Dr. Croxton:

1. Critiquing AI Output

Rather than asking students to generate answers, some faculty ask them to evaluate AI responses as part of their academic assignments.

“They have the questions and then the faculty member gives the printout of the AI response… and the exam itself is to critique and add to the response.” — Dr. Andrew Reffett

As part of this critique, students are encouraged to assess the accuracy of AI-generated responses, ensuring the information provided is correct and reliable. This strengthens students’ ability to analyze, refine, and improve AI-generated content, skills that are increasingly relevant in industry.

2. Real-Time Rationales of Written Work

To promote accountability and discourage overreliance on AI, instructors call on students to explain their written homework on the spot. Students are required to respond to questions or prompts in real time, demonstrating their own understanding without external assistance.

“Different students are randomly selected to walk the class through the homework… without the aid of any sort of AI.” — Dr. Andrew Reffett

This encourages genuine understanding and preparation.

3. Reflection and Metacognition

VisibleAI helps students document and reflect on their use of AI tools, promoting deeper thinking.

“Having students reflect and then convey… what they received, what AI provided, and how they added value… That has very clear real-world applicability.” — Dr. Andrew Reffett

These reflections train students to think critically about both input and output, a skill employers increasingly demand.

4. Plagiarism Detection with Context

AI plagiarism checkers and similarity tools offer only part of the picture. A plagiarism checker lets instructors review a submission for originality, helping to catch unintentional use of others' words or ideas before it becomes misconduct. These tools play a crucial role in maintaining academic integrity, but VisibleAI goes a step further: it shows how AI was used throughout the writing process, helping faculty distinguish ethical use from misconduct and supporting fairness in evaluation.

Want more activity examples? Check out: Point Loma Nazarene’s Approach to Ethical AI Use in the Classroom

Key Takeaway: Durable Skills Still Win

While AI will continue to shape how tasks are performed, it cannot replace the human skills that drive innovation and leadership. Employers want graduates who can think critically, adapt to change, and contribute original ideas.

“What I really need to teach students… is… that they continue to stay on top of the developments.” — Dr. Keely Croxton

VisibleAI helps educators teach these durable skills by making students’ thinking visible and encouraging responsible AI use. It prepares students not only to work alongside AI, but to lead with it.

“The cut and paste isn’t going to work… and it’s probably not going to work in the real-world setting either.” — Dr. Keely Croxton

Want all the insights? 

Access the full recording of our panel discussion with Dr. Andrew Reffett and Dr. Keely Croxton below: Building Future-Ready Graduates with AI and Critical Thinking

VisibleAI for Academic Integrity at Your Institution

With a focus on AI visibility and AI transparency rather than AI detection, academic institutions can shift from punitive measures to a more constructive approach that embraces AI as a learning tool rather than a source of misconduct.

By redefining what academic integrity means and embracing AI transparency factors, institutions can empower students to leverage AI responsibly, ensuring that technological advancements serve as facilitators of learning rather than barriers to integrity.

Book a pilot with VisibleAI for your institution today!

References

Croxton, K., Reffett, A., & Shahini, M. (2025, October 8). Building Future-Ready Graduates with AI and Critical Thinking [Webinar]. Kritik. Available upon request from https://www.kritik.io/resource/building-future-ready-graduates-with-ai-and-critical-thinking

Sperber, L., MacArthur, M., Minnillo, S., Stillman, N., & Whithaus, C. (2025). Peer and AI Review + Reflection (PAIRR): A human-centered approach to formative assessment. Computers and Composition, 76, 102921. https://doi.org/10.1016/j.compcom.2025.102921
