Methods

Problem Framing

Concept Testing

Storyboarding

Interviewing

Co-Design with Users

Responsibilities

Project Lead

User Experience Design

Wireframes

Research

Team

2 User Experience Designers

Timeline

6 weeks


The Ask

Peer review is a common step in English Comp courses, but it can be quite difficult for an instructor to manage.

We set out to create the peer review process from the ground up for a new writing tool to be used in college-level composition courses. Peer review has generally been poorly executed within digital tools. Our team's objective was to streamline and execute the basic functions of peer review as smoothly as possible, while also looking for new opportunity areas that our competitors had not explored.


The User

For this project we had to design both the student and instructor experience. Those two groups are too broad on their own, though, so we had to break them down even further.

For both groups we recruited based on experience level, institution type (university or community college), and gender mix. For students we also looked at more specific traits such as major, year, and non-academic lifestyle factors. For instructors we looked at their title and experience level, class size, and other teaching responsibilities, such as serving as a department head.

Additionally, we had previously created archetypes for both students and instructors based on extensive research with both groups. These helped us home in on and target more specific people within each group.

Students were the more complex group. Since peer review is a group activity, we had to consider all of the student archetypes, though we opted to put a bit more focus on the strivers and the laid-backs. We chose strivers because we were finding that peer review stressed them out: they have a tendency to judge themselves based on their perception of their peers, and this is exacerbated in a group setting.

Additionally, since the success of a peer review relies on everyone putting forth appropriate effort, we wanted to make sure the laid-back archetype stayed involved enough to help their peers.

On the instructor side we primarily targeted the coach, with a secondary look at teachers and explorers. While all instructors use peer review, these three archetypes get the most excited and involved in the process, and we found they were putting significantly more time into making peer review successful.


Current Situation

In addition to interviews, we conducted remote design activities with 11 instructors. The goal was to understand the steps involved in conducting a peer review and the pain points in the process.

Initially we had some internal hypotheses about what the opportunity areas would be: creating the groups, managing the quality of the work each student was putting in, and monitoring group progress. As the chart above shows, our hypotheses weren't far off the mark, and we had the benefit of learning the nuances of these issues in much more detail.


New Opportunities

We also uncovered a bit of an anomaly in the current state of peer review: it kept coming up, over and over, just how much prep time peer review required. Instructors kept telling us they had to spend a lot of time building community in their classrooms so that students would feel comfortable enough to do a quality review. For example, you couldn't conduct a peer review in the first few weeks of class, or the feedback each student gave would be very surface level, nothing that would help the author improve their paper.

Interestingly, all of this time devoted to creating community was rated very highly in our As-Is scenario above. Instructors really enjoyed this part of their teaching, even though they were spending so much precious class time on it. We realized that while this wasn't a pain point, there was a real opportunity here to build on a bright spot and bring new value to a peer review tool.

"You have to build that up well(community)...it takes time"

                                                                                                                                            Instructor, UT El Paso


Concept Testing

The next step in our process was to explore that community-building aspect and see what we could bring to the product that might wow these instructors and show them new value. While we were exploring this area, we were also working on the known opportunities, such as group creation and monitoring, to make life easier there. In effect we were dual-tracking: exploring new value adds while working on the more common usability and experience problems.

We used storyboarding to test two of our new ideas at a vision level to see whether our users responded positively. The first storyboard dealt with the issues of managing and monitoring groups and getting students engaged. The idea was a competitive scoreboard: each group could see what the other groups were doing, and the instructor could see how each group and each student were contributing.

The second idea focused more on community building and getting students comfortable with critique. Each student would peer review the same sample paper, and the system would then compile all of the reviews from each group member into one view. This let students see what their group mates were commenting on, giving them confidence that they were aligned with one another. It also gave some early insight into how each group member worked, so students would know what to expect.

These concepts were tested initially with instructors, and at a later date with students. Feedback from both groups was overwhelmingly positive.


Peer Review

In addition to the concept work, we continued to improve the usability of peer review's core functionality. Over the course of four weeks we conducted four RITE (Rapid Iterative Testing and Evaluation) studies, two with instructors and two with students. These helped us iterate quickly on the basic functionality of reviewing papers, giving feedback, receiving feedback, and monitoring progress.

Additionally, we are looking into patent opportunities for the new concepts that were developed.

This project is currently being built and will be released as part of our Fall 2018 testing, with continued iteration through Spring 2019.