Case Study: Doctor vs. Doctor
In late 2017, my team wanted to create a product offering that could leverage the competitiveness of physicians, but none of our existing products could be repurposed to meet that need. I was tasked with coming up with a new idea and putting together a beta that could debut at a conference later in the year.
My team and I went through the exercise of understanding the physician mindset. We interviewed the physicians we work with and walked through a typical day for a clinician.
The takeaway was that while physicians are indeed competitive, they are also among the busiest people around. I determined that any new product would have to demonstrate its value quickly and offer snack-sized interactions to capitalize on a short window of user attention.
Research + Brainstorm
For an application built around competition in a professional skill, the first thing that came to mind was CodeFights, where users are pitted head-to-head to solve timed coding problems.
I determined that we could create a head-to-head quiz game, but with hard-hitting content from our vast network of top medical faculty. From there, I performed competitive analysis on a number of trivia-based applications like QuizUp, HQ, and TriviaCrack.
I started with quick initial mockups of what a quiz-style interface could look like and developed a simple prototype for an application called MedFights. I put together enough views that our engineering team was able to build the beta, which we debuted at the American Society of Clinical Oncology conference in the summer of 2018.
With the momentum we had gained, we decided to rebrand the application and address the issues surfaced at the conference debut, in particular that on-boarding was too cumbersome: the original design asked the user for a photo, profession, and specialty before allowing access to the game.
Remembering from the empathy phase that users need to see the value as quickly as possible, I decided instead to allow users to register as a guest with as few fields as possible, and then complete their account later after trying the game.
I designed the logo for the newly-dubbed MedChallenge, shown below.
Along with my new team of designers, I also spent a good deal of time working on possible scoring algorithms for the application. After a lot of math and white-boarding, we arrived at a new pattern that led to a redesign of the scoreboard component.
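The final algorithm isn't spelled out here, but a common pattern for timed head-to-head quiz games, and a reasonable sketch of the kind of scheme we were white-boarding, is to award a base score for a correct answer plus a bonus that decays with response time. All names and constants below are illustrative assumptions, not the production MedChallenge algorithm.

```python
# Hypothetical scoring sketch: a correct answer earns a base score plus
# a speed bonus that decays linearly over the question's time limit.
# Constants and function names are illustrative, not the shipped algorithm.

BASE_POINTS = 100       # awarded for any correct answer
MAX_SPEED_BONUS = 50    # extra points for an instantaneous answer
TIME_LIMIT = 20.0       # seconds allowed per question

def question_score(correct: bool, seconds_taken: float) -> int:
    """Score a single answer: 0 if wrong, base + speed bonus if right."""
    if not correct:
        return 0
    remaining = max(0.0, TIME_LIMIT - seconds_taken)
    bonus = MAX_SPEED_BONUS * remaining / TIME_LIMIT
    return BASE_POINTS + round(bonus)
```

Under this sketch an instant correct answer scores 150, a correct answer at the buzzer scores 100, and a wrong answer scores 0; a speed-weighted scheme like this rewards the decisiveness that our competitive users responded to.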
From our user testing at ASCO, we knew that users also struggled with understanding what content and challenge types were available. We decided against having a random option and moved toward a pattern where each challenge includes a title, challenge type and description. We also added a point incentive for users who challenge a top performer.
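The challenge card described above, with its title, challenge type, description, and top-performer incentive, could be modeled with a simple structure like the following. Field and function names here are my own assumptions for illustration, not the application's actual data model.

```python
# Illustrative data model for a challenge card; all names are assumptions.
from dataclasses import dataclass

@dataclass
class Challenge:
    title: str                    # e.g. "Cardiology Sprint"
    challenge_type: str           # e.g. "head-to-head"
    description: str
    top_performer_bonus: int = 0  # extra points for challenging a leader

def total_points(base_score: int, challenge: Challenge,
                 opponent_is_top_performer: bool) -> int:
    """Add the top-performer incentive on top of a base gameplay score."""
    if opponent_is_top_performer:
        return base_score + challenge.top_performer_bonus
    return base_score
```

Keeping the incentive on the challenge itself means each challenge type can tune how strongly it nudges users toward the leaderboard.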
With everything in place for MedChallenge, we needed to see whether our decisions on gameplay and on-boarding would resonate with real users.
Testing with physicians can be very expensive, and it’s often difficult to get their time. Given that this application was so new, we wanted to validate things a bit more before investing heavily.
I realized that the gameplay could be tested as long as the player had some chance of knowing the answers; if the subject matter of the challenge made sense to the user, then the scoring, answering, and overall flow could all be exercised. So on a few consecutive work days we loaded up MedChallenge with content about people in the office, and we were able to observe that:
- The scoring algorithm worked.
- The game flowed nicely.
- Users did not want to stop playing. They were addicted.
- It wasn't a very productive day (for everyone else!).
With the concept validated, we took the application to one of the Medscape booths at the American Heart Association conference, where we performed live usability testing.
The application has been a resounding success, and we’re working on cultivating the existing user base.