Study with Millie 2.0

team
1 product designer (myself!), 1 manager

skills

product strategy, product management, design systems, visual design, prototyping, interaction design, user research, information architecture

duration

2.5 months (jan 2022 - mar 2022)

🦄 Designing at a startup with no product

as a primarily service-based education startup, millie did not have a team dedicated to design. therefore, i was the first formal design member of the company.

as such, i was heavily involved in all parts of the design process. since the company was not yet familiar with the product space, i was also heavily involved in product management and strategy.

thanks to this opportunity, i dabbled in product roadmapping and feature prioritization, in addition to creating a design system, defining the information architecture, mapping out user flows, and creating an MVP.

Project context

from january to april 2022, i worked part-time as a product design intern at millie, a NYC-based edtech startup.

this was a blue-sky project: i was given the freedom to explore any direction i wanted without being constrained by technical or business feasibility.

i was the only designer on this project, and communicated product decisions directly with the CEO of the company.

The digital test-taking era is here at Millie... but the process is a pain for administrators

test creation and facilitation are hugely manual processes, and test grading is prone to human error. admins juggle multiple third-party apps to get things done.

End-to-End Digital Test Platform

the new platform lets staff create tests on the platform itself, and automates the test facilitation and grading process.

Understanding the user

Understanding the humans behind the process

since the staff members had already been using the internal product for some time, talking with them could provide me with insights into their tasks, desires, workflows, and frustrations.

Research objective

Understand the test creation process, its primary frustrations, and the role the product plays in it

questions that i asked:

1. Are there any steps during the whole process that do not happen on the Study with Millie platform? What effect does that have on your productivity?

2. Can you walk me through how you completed [task]?

3. Does any step of the process while completing [task] stand out as confusing or frustrating to you?

4. What kind of information do you need when completing [task]?

Test facilitation and grading is a huge frustration

based on the initial round of interviews, i found that the product was not being fully utilized, which in turn resulted in an unnecessary reliance on third-party software.

millie staff expressed significant frustration with the room creation and assignment process and with the grading process.

these frustrations stem from two common sources:

1. repetitive actions fuel boredom and increase operation time

2. manual operations introduce human error due to a lack of visibility into system status

Setting metrics

Defining success metrics to guide conversations

after conducting multiple stakeholder interviews, i established the following goals and success metrics:

1. reduce the number of third-party applications

2. shorten the time needed to complete a task

3. increase task automation

4. reduce manual error when completing a task

Design Decisions

information architecture

Grouping business requirements and user pain points

because of the project's scope, it was crucial to map out the relationships between the different product requirements before diving deep into the design.

grouping relevant product requirements together gave me a starting point for brainstorming user flows and features.

1. Test creation

electronic test creation system
PDF upload system

the objective of this design is to achieve the business goal of providing a wider variety of tests, while addressing the success metrics established earlier: increasing automation and reducing operation time.
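to make the two authoring paths concrete, here is a minimal TypeScript sketch of how the test data could be modeled; all type and field names are hypothetical, not millie's actual schema.

```ts
// Two supported authoring paths: a test built question-by-question on the
// platform, or an uploaded PDF paired with an answer key.
interface Question {
  prompt: string;
  choices: string[];
  correctChoice: number; // index into choices, set by staff at creation time
}

interface ElectronicTest {
  kind: "electronic";
  title: string;
  questions: Question[];
}

interface PdfTest {
  kind: "pdf";
  title: string;
  pdfUrl: string;
  answerKey: number[]; // correct choice per question, in order
}

type MillieTest = ElectronicTest | PdfTest;
```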

2. Room creation and assignment

grouping by timeslot

I recognized that in the previous version, staff still had to go through a number of steps to set up multiple time slots and rooms. I explored reorganizing the test hierarchy to increase efficiency.

in this design, each room is grouped under a time slot. to create two rooms under the same time slot, the user only needs two steps: time slot creation and room creation. this reduces the number of steps from the previous version by 75%.

grouping by room

in this version, i introduced the new concept of "time slots". within each test, staff are able to create multiple rooms and assign time slots to each room. this way, staff no longer need to recreate entire tests based on time slots.

grouping by test

this is a sample mockup of the original design. here, staff create a new test for every time slot. within each test/time slot, staff can create even more rooms. not only is this time-consuming, but staff also have to take extra steps to conduct the analysis they need.
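to make the chosen "grouping by timeslot" hierarchy above concrete, here is a minimal TypeScript sketch of the nested data model; the names are hypothetical, not the product's actual schema.

```ts
// One test owns its time slots; each time slot owns its rooms. Creating two
// rooms in the same slot only touches this nested structure twice (one slot,
// two rooms) instead of duplicating the whole test.
interface Room {
  id: string;
  zoomLink: string;
  studentIds: string[];
}

interface TimeSlot {
  id: string;
  startsAt: Date;
  rooms: Room[];
}

interface DiagnosticTest {
  id: string;
  title: string;
  timeSlots: TimeSlot[];
}
```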

3. Grading

automated grading

due to the complete migration of the sat to an online model, staff can now set the correct answers when creating a test. grading is automated, effectively decreasing operation time by at least 3 hours per test and reducing the room for manual error.

manual grading with spreadsheet

prior to the product, staff graded each student's answers manually. they also had to compute the absolute and relative ranks of each section by hand.

given that more than 150 students take each test, grading often took up more than 3 hours, time that can now be saved through automation.
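as an illustration of the automated grading described above, here is a rough TypeScript sketch that scores one section against an answer key and computes absolute and relative ranks; the function and field names are hypothetical.

```ts
interface SectionResult {
  studentId: string;
  score: number;
  absoluteRank: number; // 1 = highest score
  percentile: number;   // relative rank, 0-100
}

// Grade one section for every student against the answer key set by staff at
// test-creation time, then compute absolute and relative (percentile) ranks.
function gradeSection(
  answerKey: number[],
  submissions: Map<string, number[]>, // studentId -> chosen answer indices
): SectionResult[] {
  const scored = [...submissions].map(([studentId, answers]) => ({
    studentId,
    score: answers.filter((answer, i) => answer === answerKey[i]).length,
  }));

  // Sort by score descending; tied scores share the same absolute rank.
  scored.sort((a, b) => b.score - a.score);
  const total = scored.length;

  return scored.map((entry) => {
    const absoluteRank = scored.findIndex((s) => s.score === entry.score) + 1;
    return {
      ...entry,
      absoluteRank,
      percentile: Math.round(((total - absoluteRank) / total) * 100),
    };
  });
}
```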

4. Section separation and timer

dividing test into sections

student pov

sections are now timed and graded individually. students are only able to access the section's questions when the timer for that respective section begins.

not only does this better simulate a proper test, but it also allows staff to conduct section-based analysis more easily!

single pdf test

previously, tests were uploaded as a whole pdf.

this meant that students were able to access all sections of the test at once, increasing the risk of cheating.

it was also difficult for staff to separate data by test section, as there was no guarantee that each section was completed during its allocated time period.
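a minimal sketch of the per-section access gate described under "student pov" above, assuming each section stores its own start time and duration; all names here are hypothetical.

```ts
interface TimedSection {
  name: string;           // e.g. "Reading", "Writing", "Math"
  startsAt: Date;
  durationMinutes: number;
}

// Students can only open a section's questions while that section's timer is
// running; before it starts and after it ends, the questions stay locked.
function canAccessSection(section: TimedSection, now: Date = new Date()): boolean {
  const start = section.startsAt.getTime();
  const end = start + section.durationMinutes * 60_000;
  return now.getTime() >= start && now.getTime() < end;
}
```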

Atomic design

Building a style guide one step at a time

as i was the first internal design team member, there was no design system in place. all i had to start with was fewer than 10 hex codes.

referencing chakra ui, i laid the groundwork for millie's first-ever design system.
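as an illustration, a Chakra UI theme extension along these lines could serve as the starting point for the design system; the token names, hex values, and fonts below are placeholder assumptions, not millie's actual palette.

```ts
import { extendTheme } from "@chakra-ui/react";

// All brand colors, fonts, and component defaults below are hypothetical
// placeholders standing in for the handful of hex codes i inherited.
const theme = extendTheme({
  colors: {
    brand: {
      50: "#EBF4FF",
      500: "#3B6FD4",
      900: "#1A2E5A",
    },
  },
  fonts: {
    heading: "'Inter', sans-serif",
    body: "'Inter', sans-serif",
  },
  components: {
    Button: {
      defaultProps: { colorScheme: "brand" },
    },
  },
});

export default theme;
```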

Documentation and Hand-off

Outlining flows, limitations, and next steps

to hand off my designs, i compiled a 39-page document outlining how different users would use the product, what actions they can complete on each page, error states, and known limitations.

Deep dive: automating room assignment

every month, millie holds a mock sat diagnostic test for students all around the world. these tests are held online through zoom.

students take the test during one of their designated timeslots.

since zoom meetings have a participant limit, millie staff may have to set up separate zoom meetings in advance for the same timeslot.

for partner schools, yet another separate zoom meeting is used.

this makes it incredibly hard for millie staff to grade and rank all students taking the test.

in the original product, millie staff had to manually assign students to different zoom meetings. sometimes, students would receive the wrong link.

how might i

help millie staff assign students to rooms and grade tests more efficiently, without forcing them to relearn their existing workflows?

given that millie staff have little bandwidth for relearning workflows, i explored retaining the original test structure while automating the room assignment process.

this is the final version after several iterations.

this design automates the room assignment process and removes the need to manually combine multiple submissions to analyze students' scores.

here, there is an overarching "Test" page where the admin can create and configure different timeslots. within each timeslot are different "rooms", each separated into its own page.

this design ensures that millie staff can see all students taking the test, regardless of timeslot or meeting room, while still being able to view which students are in each time slot and each meeting room.

in addition, the overarching "test" page acts as a centralized platform for millie staff to upload test papers and other supplementary materials.

millie staff have the flexibility to create new timeslots and new rooms with just one click. a new room is automatically created once all previous rooms are full.

millie admins no longer have to upload the same files multiple times, achieving two of the central goals of this project: reducing operation time and manual error.
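a minimal sketch of this auto-assignment behavior, assuming a fixed per-room capacity; the capacity value and all names are hypothetical.

```ts
interface Room {
  id: string;
  capacity: number;
  studentIds: string[];
}

interface TimeSlot {
  id: string;
  rooms: Room[];
}

const DEFAULT_ROOM_CAPACITY = 100; // hypothetical Zoom participant limit

// Put the student into the first room in their chosen time slot with space
// left; if every existing room is full, open a new room automatically.
function assignStudent(slot: TimeSlot, studentId: string): Room {
  let room = slot.rooms.find((r) => r.studentIds.length < r.capacity);
  if (!room) {
    room = {
      id: `room-${slot.rooms.length + 1}`,
      capacity: DEFAULT_ROOM_CAPACITY,
      studentIds: [],
    };
    slot.rooms.push(room);
  }
  room.studentIds.push(studentId);
  return room;
}
```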

Learnings

📈 Never forget metrics and evidence

when scoping out the project, the business team was reluctant to cut several features that could attract new users. however, it was impossible to build all of the great features they came up with in such a short timeline. clarifying business goals and prioritizing features using metrics and user interview insights made communication between design and business a lot smoother.

🧱 Timeblocking really works!

