With the growing popularity of AI platforms like ChatGPT, YouChat, and other OpenAI-powered tools, students are curious to try them on their assignments and projects. This has caused panic among instructors over whether or not they should embrace AI technology (Celik et al., 2022).
As instructors look for ways to remodel academic coursework to deliver an engaging learning experience, they also want to ensure students are not misusing this technology.
In this article, we will be covering the following:
- How can you tell if an Article was written by AI
- 4 Ways to Transform your Assignments for AI Technology
- 8 New Assessment Types to Try out in your course
- 3 Tools that can Detect AI-written content
How can you tell if an Article was written by AI
AI text tools do not produce content organically; they function as content generators that use natural language generation (NLG) and natural language processing (NLP) to create individualized content from the prompt and description a user provides. As a result, AI-generated content is not always accurate or coherent. Although software tools exist that can detect AI-generated content, you can also spot an AI-written article by looking out for certain signs.
- Repetition: Since AI lacks the capacity for human-like organic thinking (Korteling et al., 2021), AI-generated content is often riddled with repeated words and phrases. These tools tend to use keywords as frequently as possible, so words and phrases turn up in places that feel unnatural and forced.
- Unverified data: AI platforms are known for sourcing data from open-access forums, which often include blogs with unverified claims. This data is repurposed in AI-generated content, which can introduce inaccuracies. Looking out for such discrepancies can help you judge whether content was generated using AI.
- Unrefined content: A lack of organic analysis and refinement can be a useful signal for differentiating between human-written and AI-generated content. Although AI tools are immensely efficient at collecting data, organic analysis is not their strongest suit, which can result in unrefined content that rarely meets collegiate standards.
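The repetition sign above can even be checked mechanically by counting repeated word n-grams. The function below is only a minimal sketch of that idea; the n-gram size and count threshold are arbitrary illustrative choices, not values used by any real detector, and a high score is a hint rather than proof of AI authorship.

```python
from collections import Counter

def repeated_phrases(text, n=3, min_count=2):
    """Return word n-grams that appear at least min_count times.

    A large number of repeated n-grams is one crude signal of
    keyword-stuffed or machine-generated writing.
    """
    words = text.lower().split()
    ngrams = [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
    counts = Counter(ngrams)
    return {" ".join(g): c for g, c in counts.items() if c >= min_count}

sample = ("great results come from great tools and great tools "
          "deliver great results because great tools are great")
print(repeated_phrases(sample, n=2))
# flags "great results" and "great tools" as repeated phrases
```

In practice an instructor would compare such scores against known human-written samples from the same course, since academic writing legitimately repeats technical terms.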
4 Ways to Transform your Assignments for AI Technology
By embracing AI tools for education, instructors can transform their assignments and create learning experiences that prepare students for the workforce.
1. Introducing peer assessment
By making students part of the assessment process and giving them the opportunity to provide constructive feedback on their peers' work, instructors create an effective form of engagement. With the help of growth-oriented, skill-based rubrics, peer assessment can encourage students to turn in original, self-crafted work rather than depending on AI to complete assignments.
2. Incorporating creative displays instead of written work
Reframing the assignment type to accommodate creative displays for answering questions instead of written work can go a long way in helping students utilize their skills instead of depending on AI tools. By encouraging students to include personal experiences and opinions in their answers, instructors can ensure more original and well-thought-out submissions.
Here are some examples:
1. Record a video of yourself explaining your routine on a working day.
2. Draw your family tree and talk about 1-2 good values that have been passed on from each generation.
3. Creating assignments with open-ended questions
Open-ended questions demand critical thinking, allowing students to hone their analytical skills and rely on their academic prowess rather than on AI-generated content. They also encourage students to conduct individual research and showcase their own perspectives and insights.
Here are some examples:
1. Explain what the future of war looks like.
2. How would you explain the idea of justice to an alien?
4. Including group assignments
Group assignments that require students to work as a team can ensure they stay true to their coursework by using their critical thinking, understanding of the assignment, and creativity. Instructors can include questions that require original ideation, recommendations, and critique of a problem.
Here are some examples:
1. Submit a 3-minute group video proposing one solution to the widening income gaps between countries.
2. Role-play a conversation with your partner to highlight how NOT to respond to an angry customer.
8 New Assessment Types Instructors can Try out
In addition to accommodating the permissible use of AI tools (Chubb et al., 2021), instructors should devise new assessment types that encourage students to rely on their intuition and on perspectives grounded in what they've learned. Below are some new assessment types that reduce reliance on AI writing bots.
1. Peer Assessment: Using peer assessment methods, encourage students to add to their knowledge by evaluating their peers’ work and seeing how their peers approached the problem.
2. Videos: Video projects offer a refreshing break from theoretical studies, allowing students to display their perception and creativity in explaining a topic.
3. Podcasts: Podcasts give students a unique opportunity to express their opinions and work on their presentation abilities.
4. Diagrams: Since diagrams rarely depend on written text, students cannot use AI tools to generate them, allowing them to conceptualize their knowledge and express it meaningfully.
5. Performances and debates: Class presentations, such as performances and debates, provide productive grounds for integrating ideas and learning through interactive sessions.
6. Self-Reflection: As self-reflection is introspective, it pushes students to delve deeply into the assignment and assess their own work.
7. Case study analysis: Analyzing case studies encourages students to think critically through all possible outcomes with the given information.
8. Developing a Product or Proposal: Asking students to create a product or proposal helps instructors evaluate their understanding of the course's practical applications.
3 Tools that Detect AI-written content
There are also effective online tools and software that detect AI-generated content. Here are some commonly used AI detection tools:
GPTZero is an AI detection tool known for its streamlined UI, efficiency, and accuracy in recognizing AI-written assignments. It has since been updated with several unique features, alongside improvements to its detection.
Originality.ai is an AI checker that academic institutions can use to scrutinize assignments for AI-generated writing. It uses machine learning (ML) to scan for and detect plagiarized content, and its advanced linguistic analysis makes it a potent tool for detecting AI-generated content.
GLTR (Giant Language model Test Room) is an advanced AI detection tool known for its accuracy and efficiency. It provides in-depth reports on whether content was AI-generated or written by a human. Although GLTR offers a limited feature set, it is quite effective at detecting AI-generated content.
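GLTR's underlying idea is to check how predictable each word is under a language model: human writing tends to include more low-probability word choices than machine output. The toy sketch below substitutes a simple unigram frequency model for the real language model GLTR uses (GPT-2), so it illustrates only the mechanics, not real detection quality; the reference corpus, `top_k` cutoff, and resulting ratio are all illustrative assumptions.

```python
from collections import Counter

def predictability_ratio(reference_text, sample_text, top_k=10):
    """Fraction of sample words that are 'highly predictable'.

    A word counts as highly predictable if it is among the top_k
    most frequent words of a reference corpus. GLTR does something
    analogous with per-token ranks from a real language model; this
    unigram stand-in only demonstrates the shape of the computation.
    """
    freq = Counter(reference_text.lower().split())
    top = {w for w, _ in freq.most_common(top_k)}
    words = sample_text.lower().split()
    return sum(1 for w in words if w in top) / len(words)
```

A ratio near 1.0 means nearly every word is a high-frequency choice, one weak hint of formulaic text; real tools use contextual probabilities, so a ratio like this should never be treated as evidence on its own.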
Implement an AI-driven peer assessment tool in your course
Although the rising trend of AI-written assignments has left academic institutions concerned about their legitimacy, Kritik can play an active role in bringing instructors closer to AI-supported pedagogy. Through effective discussion around peer assessment, instructors can encourage students to see this as a learning opportunity while simultaneously regulating the rise of AI plagiarism.
Schedule a demo with Kritik today to adopt an engaging peer learning experience in your course.
Celik, I., Dindar, M., Muukkonen, H., & Järvelä, S. (2022). The Promises and Challenges of Artificial Intelligence for Teachers: A Systematic Review of Research. TechTrends, 66, 616-630. https://doi.org/10.1007/s11528-022-00715-y
Korteling, J. E., van de Boer-Visschedijk, G. C., Blankendaal, R. A. M., Boonekamp, R. C., & Eikelboom, A. R. (2021). Human- versus Artificial Intelligence. Frontiers in Artificial Intelligence, 4, 1-13. https://doi.org/10.3389/frai.2021.622364
Chubb, J., Cowling, P., & Reed, D. (2021). Speeding up to keep up: exploring the use of AI in the research process. AI & Society, 37, 1439-1457. https://doi.org/10.1007/s00146-021-01259-0