
How to convert your course content into Kritik Activities

Break Large Assignments Down into Smaller Activities

Cumulative assignments can be broken into smaller peer evaluation activities that ultimately help your students produce a higher-quality final assignment.

How to:

Start by segmenting the steps needed to create the final assignment, then create an activity for each portion. For example, for a research paper, you might assign:

  • One activity in which the student presents their hypothesis
  • One activity in which the student presents their data collection methodology
  • One activity in which the student presents their findings and discussions

Turn Readings into Engaging Material

Students absorb an abundance of content through weekly readings, but they rarely get to apply it until class or an exam. You can turn these readings into opportunities for students to retain the information and extend their learning through peer evaluation.

How to:

For each weekly reading, you can assign quick, frequent activities such as:

  • Creating a question based on the reading materials
  • Sharing notes and comments made on the readings
  • Teaching the readings in a creative way
  • Answering thought-provoking discussion questions
  • Creating a video explaining the contents of the reading

Homework Questions and Problem Sets

Homework questions and problem sets are also a great source of peer evaluation content. Not only can students evaluate solutions to questions, but they can also investigate and build on their peers' thought processes.

How to:

Assign an activity for each set of homework questions, and ask students to clearly outline their thought processes, formulas, and diagrams for peer review. Be sure to share the solutions with the class as soon as the deadline has passed.

Labs and In-Class Activities

In-class teaching methods transfer easily to Kritik. The beauty of using Kritik for labs and in-class activities is that you can take advantage of the activity scheduler's precise timing and extend the discussion well beyond class time.

How to:

Set the creation deadline shortly after the lab or in-class activity ends. The creation stage can be used to submit lab results or findings from class, while the evaluation and feedback stages serve as a discussion board for the different conclusions your students have drawn from their findings.

Bloom's Taxonomy

Our activity templates incorporate higher-order thinking skills from Bloom's Taxonomy. Learn more about Bloom's Taxonomy here (HYPERLINK)

Create a Question

Students are required to formulate a higher-order thinking question focused on course material. The question is assessed on its complexity, depth, reach, and importance.

Create an Essay

Students are encouraged to write an argumentative essay on a controversial opinion or a course-related subject. Students are assessed on clarity of thought, accuracy, creativity, critical thinking, and use of sources and evidence.

Create Content to Teach Peers

Students are asked to teach content to their peers in a way that promotes better retention. Students are assessed on content organization, relevance, clarity, and knowledge.

Creative Communication

Students are expected to express course content in a creative manner (e.g., an illustration, infographic, short video, or anything that conveys the message more easily than plain text). Creations are assessed on organization, knowledge, text, readability, creativity, and visual aids.

Best Practices

Best practices for creating activities and structuring peer evaluations

Validity

The outcome you should look for in a peer evaluation is validity: a student's peer evaluation should show the same depth, thought process, and insight as a professor's evaluation. This is a clear marker of success, because a professor's marking is typically held as the gold standard. A valid student evaluation also demonstrates that grading automation is sustainable, because it replicates a professor's evaluation.

Reliability

Reliability is measured by the consistency among peer evaluations. Unless a piece of work is subjective, a collection of peer evaluations must point in the same general direction to provide value. This can only happen when evaluations are consistent across the board, both in depth and in the scores given.

Maximum Word Count for Bodies of Work (Creations)

For written work, essays should not exceed 1,000 words. As you might imagine, students can provide more precise feedback on shorter content; this leaves less room for variation in evaluations and prompts students to reach more consistent conclusions.

Maximum Word Count for Evaluations

The length of evaluations should also be limited to ensure that feedback is effective, regular, and concise. According to a study conducted by West Virginia University, feedback should not exceed 50 words.

Clarity of Criteria Given to Students 

Keep your rubrics clear and concise. Give examples and indicators of poor, moderate, and excellent bodies of work. To transfer your professional knowledge to students, explain your thought process as well as the tips and tricks you use throughout your grading process. Naturally, this boosts the validity of your students' evaluations, because you help shape an evaluation process that mirrors your own.

Number of Evaluations Required of Students

An excessive number of assigned evaluations will exhaust students and their time, which can reduce the quality and validity of the peer evaluations despite adequate training and instruction. At the same time, grading accuracy will be compromised if a given work receives only one or two peer evaluations. According to studies conducted by Georgia State University and Pennsylvania State University, the optimal number of peers to review is between four and six.