SaaS Enterprise Software / User Research / UX + UI Design / Lean UX / Product Strategy
Duration

May 2021 - Present

Role

Product Designer | UX Researcher

Company

Papercurve

Team

2 Designers | 3 Engineers | 2 CS Reps

Tools

Resources: GV Sprint, W3C Web Accessibility, NN/g   |   Design: Figma, Illustrator, Principle, Webflow   |   Research: PlaybookUX, FullStory, Mixpanel, Zoom   |   Ideation: FigJam   |   Collaboration: Jira, Confluence, Zoom, Slack, Google Meet

Overview
Company Brief

Papercurve is an AI-powered Content Lifecycle Management platform for life sciences companies to streamline their MLR review process. Woven into Papercurve's DNA is the goal of making "software disappear" through an emphasis on good UX in a sea of unfriendly enterprise software.

My role
Design

End-to-end design of 8 full features (so far)

12 product fixes (so far)

Strategy

Shape product strategy and build the roadmap with the PM & CEO, and present decisions to the company

Collab

Collaborate with cross-functional teams (Product, Eng, CS)

Work 1:1 with engineering counterpart


Present

Synthesize research and report in company meetings


Manage

Improve company processes by creating standard practices and work templates


Metrics

Track and report success metrics for my features


Research

Conduct UX research on current and future features 

Specification

Write technical specs on Jira tickets & feature specs on Confluence

Webflow

Design new pages for the company site in Webflow

Workshops

Run design workshops for larger features

Demos

Demo high-value feature prototypes to prospects and clients

Maintain

Maintain and improve Design System

Case Study

This case study details 2 of the 8 features I designed, each chosen to illustrate a different point.

Comments Search, Sort & Filter
To show my design process when everything goes smoothly

Website Review
Designing with technological constraints and unforeseen challenges

Process
001 Comments search, filter & sort
Problem

Users did not have the ability to search or filter through comments on any given document in Papercurve. This is troubling because they had to rely on poor workarounds or manually hunt for a specific comment when there could be hundreds of them.

Narrow Down Comments By Filtering For Date & People

Users are able to narrow down comments using simple date parameters and by the reviewers who commented on the content

Rationale

The simple date parameters were chosen over specific date selection because they are faster and impose less cognitive load. For the same reason, the people filter lists only reviewers who have already commented, trimming a potentially massive list

Find Comments With Searching Keywords

Users can search through comments by typing in any keyword.

Rationale

Our research indicated that 30% of users reach for a search box when one is available. Keeping searched terms highlighted when a thread is open lowers the user's cognitive load, since they don't have to hunt for the match again.

Sort By Relevant Method

Users are able to sort comments according to their need instead of being stuck with the default sort.

Rationale

Enabling users to sort by other methods allows them to organize comments according to the need at hand.

Comment Feed Curated Just For Me

The "For Me" filter provides a personalized feed of comments.

Rationale

Our research indicated that users mostly care about comments that are relevant to them. This enables users to see just that with a single click. This feature is inspired by Slack's "Threads" feature.

UI Considerations

  1. The dropdown highlights the selected filter when active so users know which option is selected
  2. The filter pill has various states indicating system status
  3. The left line in the 'threads' section creates hierarchy
  4. A red dot appears when a new reply is added to a thread to visually notify the user of the new reply
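As an illustration only (this is not Papercurve's implementation; the data model and field names here are invented), the search, filter and "For Me" behavior described above can be sketched as a simple pipeline where each active filter narrows the comment list in turn:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Comment:
    author: str
    text: str
    created: date
    mentions: list = field(default_factory=list)

def filter_comments(comments, since=None, authors=None, keyword=None, me=None):
    """Apply the date, people, keyword and 'For Me' filters in sequence."""
    result = comments
    if since is not None:                      # simple date parameter, e.g. "last 30 days"
        result = [c for c in result if c.created >= since]
    if authors:                                # only reviewers who already commented
        result = [c for c in result if c.author in authors]
    if keyword:                                # case-insensitive keyword search
        result = [c for c in result if keyword.lower() in c.text.lower()]
    if me:                                     # "For Me": my comments or comments mentioning me
        result = [c for c in result if c.author == me or me in c.mentions]
    return result
```

Composing the filters this way keeps each one independently toggleable, mirroring how the filter pills behave in the UI.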
Research
User Interviews

Our CS team records complaints, insights & requests along with the associated contact. So, I reached out to some of those contacts to find out what problems users face with comments as they exist now. I found that if a user is trying to find a specific comment, they can do so in 3 ways, none of which are good:

Key Insights
01
Manually search with the comments panel in the sidebar
This takes users a long time and is tedious
02
Use the left side panel (Legacy UI)
This doesn't show the comment itself (only the author); the search bar technically exists but doesn't work, it doesn't show general comments, and it mixes up references and comments
03
Use Cmd+F in Chrome
This works, but it is a browser feature, not one facilitated by Papercurve
Define
Business Objectives

Before diving into anything else, I identified the business goals with my Product Manager to ensure my later design decisions were aligned with them.

01
Increase Usefulness ASAP
Comments are already among the most used features, so adding value to a high-traffic feature makes Papercurve more "sticky" and visibly valuable
02
Increase productivity & personalization
The ability to search, sort & filter comments should decrease the time users waste sifting through comments to find the few relevant ones
03
Lower cognitive load in searching comments
Currently, finding comments is cognitively intensive; lowering that load is a huge win because it lets users spend their time and energy more productively
"How Might we?"s

Using what I learned from my user research and conversations with the customer success, product and engineering teams, I listed out a summary of our users' pain points, technical constraints and business priorities, and then conducted a HMW exercise with the design team.

Ideate
Affinity map

Then, with the design team, I organized the HMWs into an affinity map alongside a series of potential solutions.

Design Patterns Research

Before I began my design phase, I looked at various applications for design flows, patterns, and concepts that would be relevant to my design. By doing this, I can cater to users' existing mental models and use safe, well-researched UI/UX concepts employed by established platforms.

Early Wireframes
Prioritize
Value-Complexity Matrix

I wireframed the concepts (see above) and used them to communicate with engineering, business development & my PM. Then, together, we evaluated the concepts by placing them on a Value-Complexity Matrix based on business goals, desirability/value-add & technical complexity.

Design
Feedback Loop

My design process consists of designing and receiving feedback in iterations: from the design team for critique, from engineering for technical feasibility, and later from CS, Sales and other teams.

This feedback loop enables me to move iteratively from wireframes to a high-fidelity prototype. The image below is a sample of my iterations:

Usability Test
Process

To make sure there were no usability issues, I conducted an unmoderated usability test using PlaybookUX with 5 participants who work in Marketing, Sales or Brand Management at life sciences companies.

Tasks

  • Find a comment using Search
  • Find a comment using a Filter & Sort
  • Guess the expected behaviour of "For Me"

Questions & Metrics

  • System Usability Scale Questionnaire (SUS)
  • How useful do you think the "For Me" filter is (1-5 scale)?
  • How and why do you think "For Me" is used?
  • If you could change anything about this experience, what would it be?
  • What is the largest number of comments you would expect to find in a document you are reviewing?
  • How do you search for a specific comment on the platform that you currently use?
  • Time Per Task
  • Error Rate
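The SUS questionnaire listed above has a standard scoring formula: each of the ten 1-5 responses is rescaled to 0-4 (odd, positively worded items contribute the score minus 1; even, negatively worded items contribute 5 minus the score), and the sum is multiplied by 2.5 for a 0-100 score. A minimal sketch of that calculation:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (index 0, 2, ...) are positively worded and contribute
    (score - 1); even-numbered items are negatively worded and contribute
    (5 - score). The 0-40 total is scaled to 0-100 by multiplying by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

A score around 68 is commonly treated as the benchmark for average usability.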
Key Insights
01
All participants completed the first task correctly and quickly
02
Accurately guessed what "For Me" does
03
Commented that "For Me" is a useful feature they would use frequently
04
Found the overall feature straightforward and intuitive
05
Commented on the simplicity of the UI
06
Missing highlight in the dropdown when a filter or sort is selected
Overall Takeaway

This usability test was an overall success, which meant the feature was ready to prep for development!

Handoff to development + wrap up
Update Design System

Though I make every effort to use existing patterns when creating new designs, both to reduce engineering effort and to match users' existing mental models, new patterns are sometimes inevitable, so I update the design system to include the new components.

Success Metrics
# of Comments Left

Total number of comments created has increased by 62% in a 3 month period
Feature Usage

68% of active users use search & filter
81% of 'super users' use search & filter

My final step is to go back to the business objectives and select success metrics. Once chosen, I keep track of them and report on them after 3 months.

002 Website Review
Problem

Currently, users don't have an easy way to collaborate on, review and approve website content. The only workarounds are screenshots, which do not allow highlighting text or any interactivity, or PDF exports, which require external tools and can be inaccurate.

Easily Add New Webpage

Users are able to add new webpages by simply inputting the URL.

Rationale

Our research showed that website review is done on many URLs at a time and that our low-tech users want the simplest solution possible. This list also ensures all webpages in review are seen by reviewers.

Review Desktop, Mobile & Tablet Mode

Users are able to review the responsiveness of their website at 3 of the most common screenwidths by clicking the screenwidth icons.

Rationale

Our research indicated that users conduct review primarily on desktop, but mobile and tablet views are also checked. Our users are also not very tech savvy, hence the 3 simple icons. I selected the screenwidth px sizes based on statistics for the most common screenwidths globally.
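For illustration, mapping the three icons to fixed viewport widths might look like the snippet below. The actual pixel values Papercurve ships are not stated here, so the breakpoints used (1440/768/375) are an assumption based on commonly cited device widths:

```python
# Assumed breakpoints for illustration only; the real values would come from
# global screenwidth usage statistics, as described above.
VIEWPORTS = {
    "desktop": 1440,
    "tablet": 768,
    "mobile": 375,
}

def viewport_width(mode):
    """Return the review-frame width (px) for the selected screenwidth icon."""
    try:
        return VIEWPORTS[mode]
    except KeyError:
        raise ValueError(f"Unknown screenwidth mode: {mode}") from None
```

Keeping the widths in a single lookup table means adding a fourth preset later is a one-line change.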

Interact With Website On Browse Mode

Users are able to click around and interact with the website using browse mode.

Rationale

A website is an interactive medium, so a sufficient review cannot be done without being able to review the interactive elements. Users go into browse mode by clicking on the cursor icon, a pattern already familiar to users today.

Give Feedback with Annotate Mode

Reviewers are able to leave feedback using annotate mode by selecting the pindrop icon, putting the pin on the webpage and then typing up a comment.

Rationale

The pindrop flow to leave comments has not changed for website review so it is a familiar flow to users, thus matching their mental models.

See All Feedback Organized in Comments

Users are able to see all the feedback in the comments panel which is categorized by page name and the accompanying screenwidth.

Rationale

All the feedback (regardless of where it is left) lives in one area to make sure nothing is left unchecked. If one needs to filter further, they can use the filter feature, which now sports a new screenwidth filter.

Website Review Process
001 Research
User Interviews
Key Insights
01
Tediously reviews websites in a spreadsheet
Did everything via an Excel sheet, tediously inputting e.g. "webpage name, line two: typo, change to '_'"
02
Uses PDFs for reviewing non live websites
Before the site is live, review is done on the PDF version; once it goes live, review becomes a very manual process.
03
Conducts Multiple Rounds of Review
Website may go through multiple rounds of review before and after development depending on the project, but not always.
04
Needs to check multiple URLs  
Many webpages are reviewed at once (up to double digits sometimes)
05
Checks responsiveness
Often needs to do a mobile and tablet view check
06
Some websites are password protected
This is due to unreleased content, presence of a paywall or confidential content
Website Review Process
002 Research
Technical constraints
01
Constrained by proprietary software
Due to the use of proprietary software (PDFTron), I was constrained by its technical limitations, of which there were plenty
02
Webpage height cannot be auto-detected
Users have to manually guess & check until the full webpage height is found
03
webpage load time is quite slow
Especially problematic when guessing and checking height
04
webpage is not interactive
Cannot review website interactivity (dropdowns, hover states, videos)
05
Cannot detect if website is blank or an error page
No way to detect if website is blank or an error page before loading
06
Loading updated webpage versions is problematic
Each height needs to be re-checked manually in case content was added
07
Cannot get through password-protected websites
Websites with a password input cannot be reviewed past the login page
Website Review Process
003 Research
Competitive Analysis

I found 2 products in the market designed for website review: Userback.io and Markup.io. I investigated both platforms for patterns, weaknesses/opportunities and how they solved the problem overall. Some features that stood out were:

Video feedback | image and video attachments | numeric ID for each comment | browse & annotate modes | desktop, tablet and mobile views | naming pages with the HTML page name | Slack and Jira plugins | share as PDF

Website Review Process
001 Define
Proto-personas
Website Review Process
002 Ideation
Early wireframes

Using screenshots, I identified the points where the regular upload flow would deviate and made some low-fidelity mockups, just good enough to communicate to my team how these differences could manifest.

Website Review Process
003 Design Iterations - New Website Content
Add New Content

I did several iterations of design for each aspect of the feature to arrive at my solution; here are some iterations for the add-new-content flow with some feedback notes.

Website Review Process
004 Design Iterations - Adding Page
Preview Mode v1

The first version of adding a page in a 'preview mode' visually separated the add-webpages flow from the actual review, but it was ultimately retired: it was not very user friendly due to a lack of affordance on key inputs, and it created needless engineering effort.

Website Review Process
005 Design Iterations - Adding Page
no preview mode

Next, I experimented with the 'add page' flow in the main viewer. This flow uses a pattern familiar to users, has the error prevention & affordance needed, and also lowers engineering effort. However, it brought with it lots of error states and technical complexity.

Website Review Process
Usability Test - Cognitive Walkthrough
Key Insights
01
Load Latest Versions is problematic
Although participants accurately understood the functionality, watching them go through the task brought up a number of potential problematic use cases.
02
Add new webpage without preview mode created complexity
The current add-new-page flow is missing a loading state that cannot be implemented elegantly in the main viewer.
03
Upload button is not accurate anymore
Participants were initially confused by the upload button because they are not "uploading" the website.
04
Metrics show tasks were done correctly & quickly, but...
Participants completed assigned tasks with accuracy & speed while on the happy path, but problems arose for participants in the cognitive walkthrough, where step-by-step instructions were omitted.
Overall Takeaway

'Load latest versions' needed to be descoped, the add-new-page flow needed iteration, and the upload button copy needed changing for more affordance.

Website Review Process
006 Design Iterations - Adding Page
preview mode v2

Based on the feedback, I brought a new version of 'preview mode' back to the add-page flow. Testing showed this flow solved the issues found in the previous iterations.

Website Review Process
007 Design - Load Latest Versions (Descoped)
What happened?

A lot went into figuring out how to facilitate multiple rounds of review & tackle versioning, but due to the technical constraints, it had to be descoped until the technology improves.

Website Review Process
Handoff Prep
Usability test 2

The second usability test showed that other than a few minor changes, the MVP was ready for development.

Ticket

I created all the secondary screens, empty states, edge cases and error states and then wrote the engineering tickets.

uh oh... Technology update
PDFTron Update

The day after I wrapped up ticketing, we got notice from PDFTron that the software had been updated with significant changes. Engineering created a demo site to test with. Using it, I determined the new constraints and opportunities this update presented:

New Opportunities

  • Website is interactive now
  • Adding webpage flow MUCH simpler
  • No more manual height and width selection
  • Toggle between screenwidths
  • Browse mode & Annotate mode
  • No need for edit webpage flow
  • No need to load new versions

New Constraints

  • Tracking comments after browsing away from the initially loaded webpage is not possible
  • Browse mode poses some usability challenges
  • Review is done on the live website, so comments may become out of date if the website is updated
Redesign
User Flow Update

I then took a look at the current flow and identified what needed to be changed, deleted or added.

Usability Test Insights
01
Average time to add a webpage cut in half
Metrics showed the average time for the 'add a webpage' task was cut by around 50% from the initial design
02
Correctly understood each feature element and its function
Post-test questions showed an understanding of each element of the feature and its value
03
Commented on the intuitiveness
Participants commented verbally in post-test questions that the feature was overall comprehensive and intuitive
Finalize Feature

After completing my iterations of design, feedback and usability testing (where the average time to add a page was cut in half and users commented on how intuitive the feature was), I moved on to revising the engineering tickets and wrapping up the feature. Below is the Figma ticket I wrote for development; you can see the solution here.

Success Metrics
% of Clients Creating Website Content

So far, a significant % of clients have already created website content, exceeding our expectations!
% increase of inquiries about website review

Inquiries about Website Review have doubled since release. This includes training requests, replies to our email blast, Customer Success requests & tickets, and general questions from clients and prospects.

These are the success metrics that were selected, measured 3 months after release; while I cannot report the exact numbers, these are the results.

Final Thoughts

This feature was a challenging one and really forced me to think outside of the box. I also got better at staying agile and adaptable when the technology is only tentative.

003 Other responsibilities
In addition to end-to-end design work, I cultivate a better product for our users and a more productive company in several other ways
Communicate / Pitch Designs

A key part of my process is pitching my design work. Through trial and error, I developed a method that is well received by my audience (below).

I begin by using storytelling to paint a picture of the problem, then I detail my process, followed by a demo of the happy case, and finally an explanation of why certain design decisions were made.

Exploratory Research

For Papercurve's long-term product strategy, I conduct in-depth research on new features & systems. For instance, I recently did a deep dive into Learning Management Systems where I looked at:

technical requirements
feasibility
user pain points
competitors  
design patterns


I also conduct ethnographic studies, user interviews and/or focus groups when needed.

Actively empathize with users

An important part of my process is constantly and actively empathizing with users. This is done through biweekly meetings where the CS team breaks down all the insights, feedback and requests made by our users, and by conducting monthly ethnographic studies (while not in a pandemic, of course).

This is crucial as a UX Designer because it is not enough to "do empathy" by conducting interviews once, drafting a persona or two and calling it a day.

Optimize company Workflows

I create a number of work templates to standardize & improve our processes where there is friction. For example, I standardized the post-test questions for usability testing, the engineering ticket template, and the CS ticket template (see below).

More features
Conclusion
Reflection

Working at Papercurve over the past year has been an incredible learning journey. As a UX Designer, I am grateful to work at a company that truly "walks the walk" when it comes to user-centred design.

I am also grateful to have an incredible mentor as my product manager as well as a team that supports me in my work and gives me constant feedback to improve.

Lastly, I am privileged to have gotten such a well-rounded experience, from product strategy/management to working 1:1 with engineering, so early in my career.

Next Project

PiggyBank