Five Questions to Ask Yourself and Your Team

There are five questions you and your team should ask when setting up a new study. The answers will give you a foundation for thinking through the study and its many facets.

  1. What do we want to learn? (Objectives)
  2. How will we collect and analyze data? (Methods)
  3. What will we test? (Materials)
  4. With whom will we test? (Participants)
  5. What will we do with findings? (Application)


1. What do we want to learn?

This is by far the most crucial question to ask when setting up a new study. The answers will help you and your team define the study’s objectives. Objectives are essentially a list of what the study aims to accomplish, and they are important because they set the scope and expectations of the study. Knowing what you want to learn upfront makes it much easier to answer the other four questions.

2. How will we collect and analyze data?

This question will help you outline your study’s methods. Methods are the means by which you will collect and analyze the data needed to answer what you want to learn. Methods can be thought of as tools in a toolbox. Here, what you’re trying to accomplish will determine the tool(s) you’ll need to accomplish the job.

3. What will we test?

This question will help you identify your study’s materials. Materials are the actual medium you will be testing, such as a prototype, wireframes, low-fidelity comps, or an existing website. Materials can also be the instrument of your methods. For example, the digital or printed surveys/forms you would like filled out. In some cases, you may not have materials at all. Ethnographic research (user research), for example, focuses on the behavior of a given demographic without respect to a particular product or service. The purpose of this form of research is to identify the goals of the observed demographic and to understand the hurdles they face in achieving them. Note, this form of research is one of the best ways to fuel product innovation.

4. With whom will we test?

This question will help you identify your study’s participants.

5. What will we do with findings?

This question will help you with application – specifically, strategy and prioritization.



Usability Testing

Usability testing is the task-based assessment of a design direction. Here, participants execute predefined tasks requiring them to interact with various elements, features, etc. of a design direction. Performance can be assessed in a variety of ways depending on what it is you are trying to find out or measure.


Example: Student | Dynamic Geometry Tools

Moderated 1-on-1s are excellent when your experimental design requires a great deal of control and when what you are trying to measure is quantitative in nature (such as time on task). Here, each participant executes the same predefined tasks on their own. Results are then used to formulate an unbiased average.
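To make the "unbiased average" concrete, here is a minimal sketch of how time-on-task results from a handful of sessions might be summarized. The task names and timings below are entirely hypothetical, not from the study described.

```python
from statistics import mean, stdev

# Hypothetical time-on-task results (in seconds), keyed by task.
# Each list holds one measurement per participant from a moderated 1-on-1.
times = {
    "construct a triangle": [42.0, 55.5, 38.2, 61.0, 47.3],
    "measure an angle": [18.4, 22.1, 25.0, 19.8, 21.5],
}

# Report the mean and standard deviation for each predefined task.
for task, samples in times.items():
    print(f"{task}: mean={mean(samples):.1f}s, sd={stdev(samples):.1f}s")
```

Because every participant attempts the same predefined tasks independently, averaging per task gives a comparable performance baseline across design directions.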


Example: Students | Studio Collaboration

Example: Principals | Usage Reporting Dashboards

Moderated Group Usability tests should be used when:

  1. The product / feature you are testing is collaborative in nature and/or requires multiple end-users to complete a task.
  2. You’re not as concerned with quantifying individual task performance as you are with getting a grasp of the product / feature’s general learnability.

Number two is not as self-explanatory, so I’ll provide more context. I have found that when you sit 4-5 students or educators together at a table (each with their own device) and present them with loosely defined tasks to complete on their own, they naturally gravitate towards helping each other complete the tasks. The benefit is the highly qualitative conversations and interactions that occur between the participants.


Example: Teacher | Classrooms Prototype

Remote, unmoderated usability tests allow participants to complete specified tasks on their own time in a setting convenient to them. Meaning, you designate the tasks and they complete them someplace else. These types of usability tests allow data to be collected without the need for a researcher to be present. However, I have found that many of the online tools for remote usability testing fall short in two areas: getting participants up and running and providing useful outputs/artifacts to the researcher. For example, if the steps just to get into the test are too complicated, participants will say f-it and drop off. Further, many online tools will try to sell you their ability to analyze the tests for you. Sorry, there is no one-size-fits-all analysis tool as interfaces and the objectives they allow users to accomplish vary in complexity. User behavior on its own is extremely complex and, if you consider yourself a good researcher, you should be doing the analysis yourself. Of course, this requires the tool to provide you with a recording of each participant’s session / interactions. Seems like a no-brainer but you’d be surprised how many tools out there do not provide this crucial artifact.


Observations are crucial to user research. Observations allow end-user behavior to be captured as it exists naturally in real-world form. Without observations, understanding of end-user behavior would be based entirely on speculation, hearsay, self-reporting, and other biased translations. However, capturing end-user behavior naturally can be tricky.

As researchers, we want to capture quality data without being too invasive or perturbing the environment we are observing. Ideally, we would be omnipresent flies on the wall. The traditional field-notes-and-pencil approach to observations (researcher looks over the end-user’s shoulder and scribbles notes) is burdensome: it inserts the researcher too closely into the equation and, more importantly, limits the data that can be collected to what can be seen, written, and remembered from a single, biased perspective. This is why I encourage researchers to come up with clever ways to 1. remove themselves from the equation and 2. capture higher-quality data from multiple, unbiased perspectives. This is extremely important when what is being observed is a complex, multi-end-user environment such as a classroom.

Remember, you can only observe something once. If you capture it correctly, you can review and analyze it as many times as you like.


Example: 10th Grade Geometry Classroom | Math Techbook


Example: Educator Searching for Content / Media | Search

Orion: A Story of Innovation from the Perspective of a UX Researcher

This is a work in progress. My goal here is to tell a story about the key UX research studies that molded my understanding of educator behavior and evolved my subsequent thoughts on how to accommodate the larger, unfulfilled needs of those educators. However, this story isn’t just about my research or my thoughts, but rather how they slowly merged with those of the many brilliant and talented individuals in my organization, ultimately bringing to fruition some pretty incredible innovations. Now, as I mentioned, this is a work in progress and at some point my writing will come to a dead stop. Writing is not exactly my strong suit – just ask my supervisor, J.S., who rarely sees reports from me in a written format. J.S. is an amazing boss and definitely a good sport for dealing with my particular style of communication. See, I communicate through visuals – mainly diagrams, animated gifs, videos, whiteboard time, and a whole lot of hand waving – excerpts of which you will see scattered throughout this story. So let’s begin.

I started working for Discovery Education in the spring of 2013 and soon began conducting a number of studies focusing on how educators were using our services. These studies were meant to shed light on the pain points educators were experiencing with our services, but they turned into a slow unveiling of the larger objectives educators were attempting to accomplish.

JUNE 2013

MediaShare 001

MediaShare was a tool that allowed educators to share DE content. However, a major pain point with the service was that it only allowed educators to post content to a mass audience of educators residing in their school, district, or larger DE network. What educators wanted was to share content with smaller, more intimate groups of individuals – both educators and students. The animated gif and image below are from a presentation suggesting that what was needed was functionality allowing educators to create their own “circles” of students and/or educators with which they could freely exchange content to serve a variety of purposes.

The analogy I made was that a Circle needed to act like Dropbox + Google Plus (which I later revised to be Edmodo). The thought at the time was that a system of circles could eventually evolve to be…

A network that visually promotes the interaction of specific individuals and/or self-assimilated groups of individuals for the purposes of collaboration, communication, PLN, content management* / sharing / assigning, and observation of student performance & critical thinking.

I didn’t realize it at the time, but this concept of a group through which educators could easily share content with specified students and educators was the beginning of a much larger map of educator behavior and needs.

JULY 2013

My Content 001

In July of 2013 my research turned its focus to My Content, a tool that allows educators to archive and organize gathered DE content. See, educators like to reuse things, so the ability to organize the digital resources (aka “content”) they have gathered saves them a lot of time in the long run. The purpose of this particular study was to identify ways to overhaul the service in order to bring it up to educators’ expectations. I won’t get into those exact findings here as they are irrelevant to the story. However, I do want to mention that this project was my first time ever interacting with my colleague, the mad genius J.F. You will hear plenty more about J.F. later on. Anyways, the insight I gathered from this study, paired with that from MediaShare 001, led to the next evolution in thought.


Board Builder 001

Board Builder 002


Community 001




Asteroid 001


Search 001

Search 002



Digital Resource Management 001

Digital Resource Management 002

Global Navigation 001