No time to waste, yet everything to gain...
When tasks in an educator’s day take them away from their ultimate joy, we need to take stock. See the insights and designs that let educators spend more time with their kiddos!
intro
A non-profit designed to teach underprivileged kids to read by grade 3 was not designed to monitor its own operations and determine how to continuously improve. That’s where a UX researcher comes in handy: to find the most time-consuming pain points and design new platforms to complete the mission.
Designed an educator resource site that reduced Time-on-Task by almost 50%. I used diary studies, user interviews, and usability testing to gather data. Key deliverables were a Time-on-Task vs. Favorability study, a heuristic evaluation, wireframes, prototypes, and a full list of requirements for future development.
the tldr:
Meet Springboard Collaborative!
Springboard Collaborative (SBC) is a nonprofit striving to bridge the literacy gap in grades K-3. Post-COVID realities in the educational space placed great pressure on nonprofits serving school districts to ‘cut costs’. But where to cut while keeping or improving the UX?
I was tasked with improving and streamlining the experience of the educator, an integral part of the “secret sauce” at SBC. I had just begun gathering a picture of the user journey, but there were many holes to fill. This is the study of one of those touch points and how we ultimately improved it AND established a process to monitor and keep improving.
the problem
“We want a complete picture. There is so much we do not know. So much is service and we have no way of knowing just how [much time] any of it [takes] for our educators”
-Chief Product Officer
THE GOAL
Financially, the goal was clear. After the finance and sales teams analyzed competing organizations and the depletion of COVID-era funding, the org needed to reduce costs by 45%.
A large driver of cost was hourly labor. Reducing Time-on-Task (ToT) for educators was critical. But we could not hurt learning outcomes for pupils, so I suggested focusing on tasks we could identify as “administrative”, which I defined as high cost/low learning impact. Management agreed, and the goal became less abstract and more manageable.
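To make the target concrete, here is a back-of-the-envelope sketch of how administrative ToT feeds the hourly-labor line. Every figure in it (educator count, wage, hours) is a hypothetical placeholder, not SBC’s actual data; the point is only the shape of the calculation.

```python
# Hypothetical back-of-the-envelope model: none of these figures are SBC's
# actual numbers; they only illustrate how administrative ToT drives labor cost.
educators = 200        # assumed number of hourly educators in a session
hourly_rate = 30.0     # assumed hourly wage (USD)
admin_hours = 10.0     # assumed administrative hours per educator per session
impact_hours = 40.0    # assumed high-learning-impact hours per educator

def session_labor_cost(admin_h: float, impact_h: float) -> float:
    """Total hourly-labor cost for one session under the assumptions above."""
    return educators * hourly_rate * (admin_h + impact_h)

baseline = session_labor_cost(admin_hours, impact_hours)
# Cutting administrative ToT in half while protecting impact hours:
reduced = session_labor_cost(admin_hours * 0.5, impact_hours)

savings_pct = (baseline - reduced) / baseline * 100
print(f"Baseline: ${baseline:,.0f}  Reduced: ${reduced:,.0f}  Savings: {savings_pct:.1f}%")
```

Even in this toy version, halving administrative ToT moves total labor cost by only a modest amount, which is one reason the 45% target felt so ambitious and why pinpointing exactly which tasks to cut mattered so much.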
THE HURDLE
With ToT and educator experience (EdX henceforth) as clear levers to pull, I had to gain an inventory of all touch points and tasks in a 5-to-10-week period, a.k.a. a ‘Session’. I thus interviewed 3 internal SMEs.
Management first requested a survey to record the ToT and EdX for each point. Instead, I suggested a diary study of N=16 participants in strategic districts across the U.S. The diary method was approved. As a result, we got very rich, accurate qual AND quant data as our baseline for improvement. So… first hurdle? The Educator Resource Site or, simply, the ERS.
the research
"The biggest problem is not knowing what you don’t know".
-Samuel Langhorne Clemens
WHAT I DIDN’T KNOW
Coming into this project, I knew that the greatest takeaway would be the establishment of an accurate baseline: where were we, and how would we achieve our quite ambitious overall 45% cost-reduction goal? I needed to know:
As accurately as possible, the total number of tasks, broken down into “administrative” and “impact” tasks.
Were tasks repeated?
How much time did they take for seasoned educators? For newcomers?
Did unionized teachers take longer than non-union teachers… or vice versa?
How much data did we already have? Analytics on user logs and support tickets?
I needed to ask the right question regarding task favorability. I knew from prior interviews that teachers positively correlate a task’s impact on student learning with the task’s desirability. I needed to decouple the two concepts and focus on how well we aided them in achieving the goal of each task. I needed to know:
What is the goal of each task?
Did they feel we set them up for success in completing that goal?
What systems or materials do we have in place to aid in the completion of that task?
Which did they feel had failed or succeeded? How? Why?
WHAT I LEARNED
After gaining some clarity on the questions above from the SMEs, I could now conduct a diary study in close collaboration with our Partner Success Team, the team closest to programming. Here’s what I learned from the diary study:
- A large share of all administrative ToT was spent looking for materials in the Educator Resource Site (ERS)! It was also tied for the most repeated task: 6 of 14 touch points required educators to interact with it.
- 6 minutes and 23 seconds was the average time to find a resource on the ERS. An eternity, even for a content-oriented page like the ERS.
- The modal response to “How helpful was the [ERS] in helping you find the resources you needed throughout your session?” (Likert scale of 1 to 5, 1 = lowest) was low.
- Running a word cloud of the diary-study responses also flagged the 2nd most frequent input; the 1st was “Lesson Plans”, which at this point is not very useful, but will become so later! (A minimal frequency-count sketch follows below.)
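The word cloud itself came out of a standard research tool, but the underlying idea is just a frequency count over the open-ended diary entries. A minimal sketch of that count, assuming the entries have been exported to a plain-text file (the file name and stop-word list are illustrative assumptions):

```python
# Minimal word-frequency count over open-ended diary entries.
# "diary_entries.txt" and the stop-word list are illustrative assumptions.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "for", "i", "my", "it"}

def term_frequencies(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            words = re.findall(r"[a-z']+", line.lower())
            counts.update(w for w in words if w not in STOP_WORDS)
    return counts

if __name__ == "__main__":
    for term, n in term_frequencies("diary_entries.txt").most_common(10):
        print(f"{term:20s} {n}")
```

In practice you would also want to count adjacent word pairs (bigrams), so that phrases like “lesson plans” surface as a single unit rather than two separate words.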
Having conducted a diary study, the org now had, for the first time, a full inventory of ToT for activities both on and off a screen. If it’s on a screen, site analytics or user logs could help us, but so much of our offering was based on off-screen activities that we naturally had many holes in our total picture. The following graphic was instrumental in showing which tasks needed the greatest attention and in getting org buy-in:
The diary study allowed for a more accurate illustration of Time-on-Task AND the ability to gather nuanced feedback on the sentiment of completing each task.
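For context, the prioritization behind that graphic boils down to plotting each task’s Time-on-Task against its favorability and flagging the administrative tasks that are both time-hungry and disliked. A rough sketch of that logic, with made-up task names and numbers standing in for the study’s data:

```python
# Rough sketch of the Time-on-Task vs. favorability prioritization.
# Task names, minutes, and scores below are illustrative, not the study's data.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    avg_minutes: float      # average ToT from the diary study
    favorability: float     # mean Likert response, 1 (low) to 5 (high)
    administrative: bool    # high cost / low learning impact?

tasks = [
    Task("Find materials in ERS", 6.4, 2.0, True),
    Task("Log attendance", 3.1, 3.5, True),
    Task("Family phone call", 12.0, 4.6, False),
]

# Candidates for redesign: administrative, time-hungry, and disliked first.
candidates = sorted(
    (t for t in tasks if t.administrative),
    key=lambda t: (t.avg_minutes, -t.favorability),
    reverse=True,
)
for t in candidates:
    print(f"{t.name:25s} ToT={t.avg_minutes:5.1f} min  favorability={t.favorability}")
```

In the real study, it was this kind of ranking, grounded in the diary-study numbers, that surfaced the ‘top three’ tasks mentioned below.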
Given that I am also a Digital Product Designer, I was entrusted with the redesign of the Educator Resource Site. The other two of the ‘top three’ tasks were assigned to two other teams that now had abundant qual data to guide their redesigns! Diary studies are incredibly useful when you want a relatively accurate depiction of the UX over a period of time AND can’t be present on site.
the solution
“I don’t know... It’s like I see these documents so often but I’m never sure where anything is.”
-Educator undergoing SBC programming for the 3rd time
look and feel
OK, so first, a good heuristic evaluation and site audit! Oh my… no need for a usability test just yet. There were clear issues with the site upon inspection. See below.
In Figma I overlaid a 12-column grid on the site, and it became clear that elements were not placed with much care for layout and alignment.
This exercise revealed:
- No adherence to basic layout conventions and design best practices; little adherence to a grid or alignment.
Issue: Visually and mentally taxing to the visitor.
- CTAs were not clear, and the “button” was abused as an interaction element.
Issue: Visually and mentally taxing to the visitor.
- Section hierarchy followed the order of the “FELA” method, not the frequency of use of each component of the method, even though the most commonly used should be placed at the top of the home page.
Issue: Not following user mental models and possibly pointing at fundamental IA issues.
- Not depicted in the screenshot above: the destination pages were massive and contained many sections, so educators ‘skimmed’ many pages to find resources.
Issue: Content not parsed correctly, again suggesting IA issues.
A SOLUTION WILL HAVE TO WAIT. SOME MORE TESTING, PLEASE!
So, before we could consider a solution, we conducted some usability tests to uncover specific issues with the current design. The diary study helped point me in the right direction for what to fix; the usability tests would uncover why it was broken.
A remote usability test to gauge where educators were getting stuck, causing them to take a whopping 6 minutes and 23 seconds, on average, to find resources.
I chose an N=4 test to uncover some preliminary insights, recruiting 2 returning and 2 new teachers. The diary study had qualitative input suggesting a difference in how new and returning educators were experiencing the site. I used the existing site, without any changes, for this test. Here is what I learned (a small sketch of how these metrics were tallied follows the list):
- Only 1 of 4 participants actually recalled the FELA* method, and the one who did could not describe it properly; they did not know the order of its phases.
Takeaway: Ordering the site by the FELA method was NOT the way to organize sections on the home page.
- Only 69% (11 of 16) of tasks were completed correctly, and there were 3 false positives (wrong material found, with participants reporting high confidence that they had found the correct materials!).
Takeaway: The size of the destination page and the navigation are not clear. Content reduction, modularization, and IA exercises are needed.
- Click-backs clustered at the destination-page level, and 80% of the time was spent at the destination page. CONTENT OVERLOAD!
Takeaway: Participants were arriving at the end of their navigation only to learn that they were in the wrong section of the site, which cost a lot of time. We need to modularize content and improve our IA.
- There was little correlation between program seniority and performance; N=4 is small, but there was no obvious difference in performance.
Takeaway: The site is poorly designed and inconsistent, so users couldn’t learn its flows. There are potentially too many clicks to reach the destination page; designs must be limited to “3 clicks to find.”
*FELA Method: Family Educator Learning Accelerator. It is a series of steps taken between families, educators and students that help all interact more effectively.
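For transparency, the metrics above are simple tallies over the recorded sessions: completion rate, false positives, and where click-backs occurred. A minimal sketch of that tally; the attempt records below are mocked-up stand-ins for what the remote testing tool actually logged:

```python
# Minimal tally of usability-test metrics: completion rate, false positives,
# and where click-backs occurred. The records below are mocked-up examples;
# the real data came from the remote testing tool's session logs.
from dataclasses import dataclass

@dataclass
class TaskAttempt:
    completed_correctly: bool
    reported_confident: bool
    clickbacks_at_destination: int
    clickbacks_elsewhere: int

attempts = [
    TaskAttempt(True, True, 0, 0),
    TaskAttempt(False, True, 2, 0),   # a false positive: wrong material, high confidence
    TaskAttempt(True, True, 1, 1),
    TaskAttempt(False, False, 3, 0),
]

total = len(attempts)
completed = sum(a.completed_correctly for a in attempts)
false_positives = sum((not a.completed_correctly) and a.reported_confident for a in attempts)
dest_cb = sum(a.clickbacks_at_destination for a in attempts)
all_cb = dest_cb + sum(a.clickbacks_elsewhere for a in attempts)

print(f"Completion rate:      {completed}/{total} ({completed / total:.0%})")
print(f"False positives:      {false_positives}")
print(f"Click-backs at dest.: {dest_cb}/{all_cb} ({dest_cb / all_cb:.0%})")
```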
It became clear that my task was more than just layout and aesthetic improvements. We would need to modify terminology, content structures, and Information Architecture; essentially, a full revamp. Prior insights, and the following, informed the decision-making for our design:
DESIGN DECISIONS
- During the diary study, I learned that educators MOST used the site to find student instruction materials, followed closely by family interaction resources. These needed to be above the fold in the redesign.
- The links that finally led to the destination page were ‘titled’ in the format above, with no correspondence to ‘lesson plans’. Recall that ‘lesson plans’ was the most common word/expression in our diary-study word cloud… This was a clue to how we needed to structure our content: structure it the way educators structure their teaching.
- The diary studies revealed what is almost inevitable in design: if a design is inadequate, people will often “satisfice” until abandoning it, meaning they find workarounds and then quit. There was in fact a spreadsheet circulating that served as the workaround. As one educator put it: “The worksheet shows my progress, I need to know where I’m at.” A progress visual and a grid design for the units and lessons may ultimately be integrated.
I synthesized my findings, got approval for a card-sort exercise, and workshopped some wireframe candidates with my team until settling on this design:
The most important feature here: the diary study revealed the two most commonly visited parts of the site to be Student Instruction and Family Engagement, so these were placed prominently above the fold. The design also included a first attempt at a grid layout for lessons to mimic educators’ mental models.
After arriving at this wireframe, we then proceeded to mock up and test the following iteration:
There were some ‘mishaps’ where educators chose the wrong grade level, along with confusion about what the page contained. I made some small ‘tweaks’ to the design and arrived at this prototype for final user testing.
A final round of prototype testing (N=6, 4 tasks each) led to:
- 21 of 24 tasks (87.5%) were completed accurately, and all participants reported high confidence in finding the proper materials.
However, for the 3 of 24 tasks that were not properly completed, participants still reported proper completion…
Overall, terminology is used properly, content is quickly identifiable, CTAs are clear, and the affordance of navigation elements comes through strongly.
- No participant went down a false path.
Only two participants performed click-backs on a task, but they eventually pushed on and found they were on the right track.
- The average time for finding the proper material fell by better than 50% against the 6-minute-23-second baseline (the arithmetic is sketched below).
It was better than expected. The portion of navigation that saw the greatest improvement was time on the destination page: CONTENT DESIGNERS’ REDESIGN IS CRITICAL!
- The mode of responses to “How helpful was the ERS in helping you find the resources you needed?” improved between the old and new sites, with the lowest individual score now being 4.
Although not yet just right, this was ample evidence that this resource prototype was a step in the right direction.
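For the curious, the “better than 50%” claim is just the before-and-after ratio against the 6-minute-23-second baseline from the diary study. A quick sketch of that arithmetic, with the post-redesign average left as a parameter rather than hard-coded:

```python
# Quick check of the headline improvement: percent reduction in average
# time-to-find, relative to the 6:23 baseline measured in the diary study.
# Plug the post-redesign average into `new_avg_seconds`.

BASELINE_SECONDS = 6 * 60 + 23  # 6 minutes 23 seconds from the diary study

def reduction_pct(new_avg_seconds: float, baseline: float = BASELINE_SECONDS) -> float:
    """Percent decrease in average time-to-find versus the baseline."""
    return (baseline - new_avg_seconds) / baseline * 100

# Example: any post-redesign average under about 3 minutes 11 seconds clears
# the 50% bar; a 3-minute average is roughly a 53% reduction.
print(f"{reduction_pct(3 * 60):.0f}% reduction at a 3-minute average")
print(f"50% bar sits at {BASELINE_SECONDS / 2:.0f} seconds "
      f"({BASELINE_SECONDS // 2 // 60} min {BASELINE_SECONDS // 2 % 60} sec)")
```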
the takeaway
“I’m sorry in advance. I know I will be wrong, but I don’t know about what…”
-Ronald Ricardo (yours truly)
be a thought partner
While discussing a method for fleshing out our educator user journey, management suggested a survey. We walked through the reasoning, and I gathered that they wanted a strong Time-on-Task baseline, but also a strong sense of ‘what [educators] actually did, heard, saw and thought’. There was an emphasis on filling in the gaps in our journey maps, and I sensed a desire to bring all design teams up to speed on the full user experience.
I suggested that we invest a little more and conduct a diary study with a smaller, carefully curated participant sample, rather than a large, random sample. The cost increased by roughly $1K, and our budget easily supported the difference.
I said: “Even with open-ended survey questions, I can’t ‘dig’ like I can with an interactive diary study where I chat with a participant for 5 to 10 weeks.” I cautioned against a survey that would take place maybe 13 weeks after a task’s completion, then hope the participant recollects details such as time spent and their ultimate experience resolving the task. The fact that internal SMEs identified 14 touch points and 33 distinct tasks meant a survey would need at least 2 data points per task; a 66-question survey for $25 in compensation would lead to survey fatigue or abandoned surveys. I am glad management and I regarded each other as thought partners, arriving at an optimal method that yielded rich data.
a word of caution
Once the designs were submitted and all the requirements approved, I included a recommendation for a Ways-of-Working design flow for our team. It included designating a team to monitor the performance of our deployed designs, with clear KPIs established to track. This unfortunately was not adopted. There were concerns that our designs would ‘never end’ and that staff would be bogged down by ‘analysis paralysis’, spreading the team too thin. This made sense, and I suspect it may be the sentiment of teams everywhere. The linear design process you just witnessed above could be considered a bit naive. I am proud of the design I submitted, and I stand by it given the information I had. IT WILL BE PROVEN WRONG to some extent; there are aspects that will fail, as with any design.
Another concern I had was that the ERS is one of three different systems that educators must interact with. In my deliverable, I suggested investigating how to merge all these systems, thereby putting the existence of the ERS as a standalone platform into question. I left this project knowing that there was still work to be done, but confident improvement had been achieved.
other case studies
A story of a startup not leaving vacation planning to chance.
This story began with a founder wanting to figure out what to do when Friday rolled around. Research revealed that whatever challenge he faced, it paled in comparison to the challenges facing those traveling abroad.
“Well, frankly, you’re the worst, and it’s not close.”
I put on my UX Researcher hat while talking to some clients at a trade finance bank, asking one of them: “So, how does this bank stack up versus its competition?”