Methods

Problem Framing

Concept Testing

Storyboarding

Interviewing

Co-Design with Users

Responsibilities

Project Lead

User Experience Design

Wireframes

Research

Team

2 User Experience Designers

Timeline

6 weeks


The Ask

Peer review is a common step in English Comp courses, but it can be quite difficult for an instructor to manage.

We were tasked with creating the peer review process from the ground up for a new writing tool to be used in college-level composition courses. Overall, peer review has been poorly executed in digital tools. Our team's objective was to streamline and execute the basic functions of peer review as smoothly as possible, while also looking for opportunity areas that our competitors had not explored.


The User

For this project we had to design both the student and instructor experiences. Those two groups are too broad on their own, so we had to break them down further.

For both groups we recruited based on experience level, university or community college, and gender mix. For students we also looked at more specific traits such as major, year, and non-academic lifestyle factors. For instructors we looked at title and experience level, class size, and other teaching responsibilities, such as serving as a department head.

Additionally, we had previously created archetypes for both students and instructors based on extensive research with both groups. These helped us home in on and target more specific people in each group.

Students were more complex. Since peer review is a group activity, we had to consider all of the archetypes, though we opted to put a bit more focus on the strivers and the laid-backs. We chose strivers because we found that peer review stressed them out: they tend to judge themselves based on their perception of their peers, and this is exacerbated in a group setting.

Additionally, since the success of a peer review relies on everyone putting forth appropriate effort, we wanted to make sure the laid-back archetype was involved enough to help their peers.

On the instructor side we primarily targeted the coach, with a secondary look at the teacher and the explorer. While all instructors use peer review, these three archetypes get the most excited and involved in the process, and we found that they put significantly more time into making peer review successful.


Current Situation

In addition to interviews, we conducted remote design activities with 11 instructors. The goal was to understand the steps involved in conducting a peer review and the pain points in the process.

Initially we had internal hypotheses about what the opportunity areas would be: creating the groups, managing the quality of the work each student put in, and monitoring group progress. As the chart above shows, our hypotheses weren't far off the mark, and we had the benefit of learning the nuances of these issues in much more detail.


New Opportunities

We also uncovered a bit of an anomaly in the current state of peer review: it came up over and over just how much prep time went into it. Instructors kept telling us that they had to spend a lot of time creating community in their classrooms so that students would feel comfortable enough to give a quality review. For example, you couldn't conduct a peer review in the first few weeks of class, or each student's feedback would be very surface level and would do nothing to help the author improve their paper.

Interestingly, all of this time devoted to creating community was rated very highly in our As-Is scenario above. Instructors really enjoyed this part of their teaching, even though they were spending so much precious class time on it. We realized that while this wasn't a pain point, there was a real opportunity here to build on a bright spot and bring new value to a peer review tool.

"You have to build that up well(community)...it takes time"

                                                                                                                                            Instructor, UT El Paso


Concept Testing

The next step in our process was to explore that community-building aspect and see what we could bring to our product that might really wow these instructors and show them new value. While we explored this area, we also worked on the known opportunities, such as group creation and monitoring, to make life easier there. In effect, we ran a dual track: exploring new value adds while addressing the more common usability and experience problems.


Project Monocle

For the remainder of the project we moved immediately to mid-fidelity wireframes and used InVision to build a prototype we could test with our users. The UX designers worked directly with our researcher, using a modified RITE (Rapid Iterative Testing and Evaluation) method to quickly iterate on the prototype.

This RITE method allowed us to make 3 iterations of the prototype per week, with each version tested by our users. Not only did it help us rapidly improve our designs, but by acting quickly on user feedback we built strong trust with our users. From one week to the next they saw their feedback being incorporated, which kept them highly engaged with our team.

Towards the end of the project our team delivered a coded prototype that highlighted Monocle's key features. Extensibility was a key component: for the scope of this project we dealt with a single type of security update, but we built Monocle on a framework that could scale to include other updates, such as OS updates. We had learned that these admins are foremost concerned with problems, so we included a layout and filtering that let them drill into problems immediately. We also designed a role-based, customizable dashboard, which meant more people could access Monocle and get job-relevant information without having to go through an administrator.
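As a rough illustration of that extensible, problem-first, role-based filtering (this is not the product's actual code; every type, field, and name below is hypothetical), a minimal sketch could look like this:

    from dataclasses import dataclass
    from enum import Enum

    # Hypothetical update categories; the prototype covered only security
    # updates, but the framework was meant to scale to other types (e.g. OS).
    class UpdateType(Enum):
        SECURITY = "security"
        OS = "os"

    @dataclass
    class UpdateItem:
        name: str
        update_type: UpdateType
        has_problem: bool          # flagged issues surface first for admins
        affected_roles: set[str]   # which dashboard roles should see this item

    def dashboard_view(items: list[UpdateItem], role: str) -> list[UpdateItem]:
        """Return the items relevant to a given role, problems first."""
        visible = [i for i in items if role in i.affected_roles]
        return sorted(visible, key=lambda i: not i.has_problem)

    # Example: a database admin sees only database-relevant updates,
    # with problem items pinned to the top of the dashboard.
    items = [
        UpdateItem("kernel-patch-42", UpdateType.SECURITY, False, {"sysadmin"}),
        UpdateItem("db2-sec-fix-7", UpdateType.SECURITY, True, {"dba", "sysadmin"}),
    ]
    print([i.name for i in dashboard_view(items, "dba")])

Adding a new update type in a model like this only means extending the category list and tagging items accordingly, which is the kind of scaling the framework was designed for.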

The real power of Monocle was that we brought all of the update information into the GUI, including dependency information. This eliminated the need for users to search IBM's repository to find what they needed; the information was proactively served to them inside Monocle. Users could also see any dependency relationships that had to be addressed and schedule the update process through Monocle, which could then carry out the updates on its own without requiring users to give up more control than they were comfortable with. None of this was possible before Monocle. A few of the key screens can be seen below.
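To make the dependency handling concrete, here is a minimal, purely illustrative sketch of a dependency-aware scheduler along the lines described above. It is not Monocle's implementation, and all names are hypothetical; it simply orders a requested update after any prerequisites it depends on:

    from dataclasses import dataclass, field

    @dataclass
    class Update:
        name: str
        depends_on: list[str] = field(default_factory=list)

    def schedule(updates: dict[str, Update], requested: list[str]) -> list[str]:
        """Return an install order that applies prerequisites before each update.
        (A sketch only: no cycle detection or error handling.)"""
        ordered: list[str] = []
        seen: set[str] = set()

        def visit(name: str) -> None:
            if name in seen:
                return
            seen.add(name)
            for dep in updates[name].depends_on:
                visit(dep)
            ordered.append(name)

        for name in requested:
            visit(name)
        return ordered

    updates = {
        "sec-patch-101": Update("sec-patch-101", depends_on=["lib-base-3.2"]),
        "lib-base-3.2": Update("lib-base-3.2"),
    }
    # The user asks for one update; its prerequisite is surfaced and ordered first.
    print(schedule(updates, ["sec-patch-101"]))  # ['lib-base-3.2', 'sec-patch-101']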


Results

This project was presented to our stakeholders, with several executives present. Our users also attended the presentation and laid out a strong business case for Monocle to be picked up and brought to life.

"When is the release date for Monocle? I already have reserved a portion of my budget to purchase it."

Due to the response and success of the project, this became the first IBM Design incubator project to be made a hallmark project. Our 6 weeks of work was so well received that a new team of designers, supported by business and technical staff, was formed to continue building Monocle into a real product to be released to the market.

Additionally, our design team was asked to present the work to some of IBM Design's executive leadership as a success story. Our work and our process became an example that leadership has used to highlight the benefits of IBM Design Thinking throughout the company.

We are also exploring 4 possible patents for the work done on Monocle.