As the Product and Experience designer, I led the design of the Skills Discovery toolkit to help organisations engage in meaningful employee development. This B2B tool was built to turn key concepts in positive psychology into a simple, intuitive experience, designed for long-term engagement and client value.

Preview of the Skills Discovery Toolkit project

Overview

Cappfinity are global leaders in measuring and developing potential in Talent Acquisition and Talent Management. The Skills Discovery Toolkit is part of a new product offering focused on helping individuals and organisations unlock and apply their strengths. Building on that reputation, the toolkit expands Cappfinity's offerings into the Talent Development space, with this product serving as the foundation.

Role
Product and UX Designer
UX Audit, User Research, Usability Testing, Prototyping, UI Design, Interaction Design

Team
2 Engineers, Product Owner, 2 Content Strategists

Date
April 2024 - February 2025

01
UX Outcome

Early feedback showed users felt more confident navigating, revisiting and using the content, signalling improvements in long-term product value

02
Organisational Impact

Reduced delivery complexity and content duplication, moving from fragmented tools to a single experience

03
Strategic Value

Laid the foundation for a new product suite with strong upsell potential, supporting the company’s expansion into the talent development space

01

The Toolkit was built to fix a broken talent development journey

Where were Cappfinity headed?

Talent development at Cappfinity had grown ad hoc, leading to fragmented branding and delivery across PDFs, emails, and portals. This caused internal inefficiencies, limited upsell potential, and lowered the perceived value of our offerings.

Existing fragmented and redundant user flows

Business Problem Framing

As Cappfinity expanded its talent development offering, internal tools and content had become fragmented, inconsistent, and difficult to maintain. We needed a scalable, unified solution that could improve delivery efficiency, strengthen brand perception across the talent lifecycle, and support long-term growth in this space.

The Skills Discovery Toolkit emerged as a strong proof of concept for the broader product suite. It was a natural extension of existing business initiatives.

While the business goal was clear, it wasn’t immediately obvious how users would engage with it, or what content would feel useful in practice.

02

Reframing the business goal through the lens of the user

Navigating research constraints

Limited direct access to end users

Compressed timeline due to other project demands

Difficulty securing buy-in for research, as previous attempts had delivered limited value

Determined not to go into the design process blind, I had to get resourceful. I created and secured sign-off for the following research plan:

Client-facing interviews

Capture common friction points, user feedback and delivery challenges through proxy users.

Audit of legacy product

Identify structural and content weaknesses in the previous toolkit.

Review of existing user feedback

Identify patterns in user sentiment, feature requests, and drop-off points.

Auditing a legacy learning product

Key insights

  • Internal teams needed a more structured and unified approach to content
  • Some "power users" were screenshotting their entries to share with others
  • Weak feedback on responses and user actions
  • Very content-dense, with minimal interactivity
  • Users wanted to be able to revisit reflections
  • Users saw little value in one-off tasks
  • Users noticed the content duplication/irrelevance caused by internal inefficiencies in delivery

How did I use the findings from the discovery phase in the rest of the design process?

03

Translating insights into design direction

Identifying opportunities

The first step in figuring out exactly what to do with these insights was creating a sort of "opportunity-risk" map.

A snapshot of the process of mapping user insights onto risks and opportunities

From emerging patterns to behavioural modes

Opportunity mapping clarified what the solution needed to do, but behavioural modes showed how users would engage with it, helping us align design with real usage patterns.

Insights from stakeholders and qualitative data were brought together to define distinct modes of user engagement with the toolkit

These behavioural modes flowed directly from the research. They also mapped, to some degree, onto Goal Orientation Theory, grounding the observations in established behavioural psychology. This was the first step in thinking about how to implement the opportunities the research surfaced.

How I moved from research to the start of solution thinking through the lens of different modes of user engagement

04

Converging on an MVP

Understanding the technical constraints

Disconnected CMS platforms

Assessment and toolkit sat on separate systems so we couldn't integrate personalised assessment information

No shared user identity

We couldn’t pass data across platforms, so users had to create another account separate from the assessment platform

Disconnected databases

We couldn’t trigger automated onboarding or re-engagement flows

Problem reframed

How might we create a modular, cohesive toolkit experience that adapts to different user behaviours and delivers long-term value despite the lack of shared user identity, automated flows, or integrated content systems?

Designing with intention

To ensure alignment and clarity on what we were building, we collaboratively defined four design principles. I then mapped each principle onto specific success metrics:

Principle 01: Design for Repeatable Reflection
→ Success metric: Measuring Return & Reuse (e.g. number of sessions per user)

Principle 02: Keep Interactions Lightweight
→ Success metric: Measuring Clarity & Depth (e.g. engagement with optional deep content)

Principle 03: Content-Led Relevance
→ Success metric: Measuring Perceived Usefulness (e.g. qualitative feedback by user type)

Principle 04: Structure for Sustainability
→ Success metric: Measuring Internal Efficiency (e.g. delivery time for new toolkits)

Value-to-effort mapping

A vital step towards converging on a design solution was mapping out our ideas based on the impact each would have on users and how hard it would be to implement within the existing technical architecture.

Using the tension between user value and technical feasibility to narrow focus, leaving us with a set of pilot features, defining our MVP

Features prioritised for the MVP were chosen for technical feasibility, immediate user value, and ease of future scaling

How does it all come together?

We started by defining a set of modular, content-agnostic tools that could flex across topics. These gave the content team a clear framework to work within.

From there, the content team grouped these tools into three thematic learning areas. Their groupings shaped most of the sitemap for the toolkit:

The final MVP sitemap for the Skills Discovery Toolkit platform

User journey mapping

Next, I mapped out a couple of prospective user journeys for both first-time and returning users.

A snapshot of the proposed user journeys for first-time and returning visitors, focused on goals and emotions at each stage of the journey

Reflection: Process Wins

Collaborating early and often to balance vision, user needs, and feasibility

Weekly Sessions

helped the team align on priorities and user flows, keeping momentum and reducing handoff friction

Real-Time Problem Solving

meant engineers could flag technical constraints early, allowing for faster iteration

Co-Creation of Tools

between content and UX helped balance narrative clarity with interaction design

MVP Co-Definition

was a shared effort, enabling us to scope features based on both user needs and delivery feasibility

05

Building a modular, scalable toolkit

What guided my wireframing process

How all of the work we'd put in helped me define a framework that guided my design explorations, making things much more efficient and focused

Adapting one feature for different ways of engaging

Here’s an example of how this framework came to life in the process of designing the "Missions" experience:

An example of how I used the wireframing decision framework to make design decisions at the lo-mid fidelity wireframing stage for the Missions feature

Every part of this layout was intentionally shaped by our guiding inputs: user insights, behavioural modes, design principles, technical constraints, and the features we had prioritised.

I repeated this process across screens, seeking feedback from engineers on feasibility and from the product owner on priorities, refining the wireframes until they were in a good enough place to start applying styling.

06

Validating designs with usability testing

Prototype testing: early directional validation

Goal:

Validate foundational design logic, content, and usability before committing to build.

Findings & Actions
  • Finding: Users struggled to form a clear mental model of the toolkit
    Action: Refine homepage copy to emphasise the value of key features and the tutorial; consistent page structure and modular layout
  • Finding: Navigation was generally intuitive, but lacked reinforcement
    Action: Include breadcrumbs and progress indicators for each section; persistent exit points; active state cues; reposition page heading and details
  • Finding: Missions were perceived as one of the highest-value areas of the toolkit
    Action: Make Missions more prominent so users can access them quickly and understand their value upfront; more guidance to help users get the most out of them
  • Finding: Uncertainty around positioning limited perceived value for non-managers
    Action: Update the content approach for certain areas of the toolkit to broaden relevance

Fully functional and interactive prototypes of the branch track scenario videos and the bucket sort component built in Figma

Beta testing with live toolkit

Goal:

Validate improvements from the previous phase and evaluate the stability of components in a live environment.

Gathering feedback from live beta testing

Findings & Actions
  • Finding: Broken buttons, browser compatibility issues, unreliable interactions
    Action: Full interaction QA audit; rebuilt or simplified interactive components
  • Finding: Task instructions and video-based activities still felt unclear
    Action: Usage prompts before video content; repositioned task support information on screens

07

Design Showcase

What we delivered

After 10 months of collaboration across content strategy, UX design, and development, we successfully delivered:

A live MVP with 6 modular, interactive tools

Documentation and recommendations for future integrations

A user experience designed to be revisited and to become part of the end user's career development workflow

A style guide extracted from the learning branch of our design system, establishing a consistent foundation for future iterations

Interactive learning tools

To move beyond passive learning experiences like static PDFs, I designed reusable interactive components that encourage active engagement. These tools were designed to support better knowledge retention in a self-directed learning environment.

Responsive, Accessible Design

I designed the toolkit to be fully responsive across devices, ensuring accessibility and ease of use whether users were engaging on desktop, tablet, or mobile. Greater flexibility in how users interact with the toolkit facilitates repeat use by reducing friction.

Persistent, Revisitable Missions

Rather than one-and-done tasks, Missions allow users to reflect, save, and evolve their inputs over time. This directly solved the problem of users feeling unsure about how to use and reflect on their learnings from previous offerings.

Flexible Theming

We introduced both light and dark mode options to adapt to user needs and environments. This addressed accessibility and usability concerns. Offering flexible theming also contributed to creating a premium-feeling product experience.

08

Results

Early validation

While the toolkit has only recently launched, we've conducted early validation with internal client-facing teams and a pilot client cohort.

Key early signals

  • Positive feedback on the clarity of the navigation and modular structure
  • Strong perceived value in the ability to revisit and build on reflections
  • Pilot users described the toolkit as a noticeable improvement in ease of use compared to previous offerings

UX Outcomes

Designed for engagement, clarity, and reuse
  • Supported both light and deep engagement through flexible interaction models
  • Early feedback showed users felt more confident navigating and returning to content

Organisational Impact

Unlocked scalability and internal alignment
  • Showcased the efficiency of a highly collaborative process
  • Built modularly, reducing overhead for content and tech teams
  • Created shared language and process for future toolkit design

Strategic Value

Foundation for a new suite of talent products
  • Toolkit now forms the model for future development solutions
  • Created internal momentum behind development-focused offerings

Post-launch validation plan

To support continuous UX improvement, I established a structured validation strategy linked to the design principles outlined earlier and the related success metrics:

Principle 01: Design for Repeatable Reflection
→ Research focus: e.g. What triggers users to return unprompted, and when do they fall off?

Principle 02: Keep Interactions Lightweight
→ Research focus: e.g. Are lighter interaction styles contributing to repeat use and perceived usefulness?

Principle 03: Content-Led Relevance
→ Research focus: e.g. How do different user types interpret relevance? Are there features or moments where this breaks down?

Principle 04: Structure for Sustainability
→ Research focus: e.g. Where do bottlenecks or friction points appear when reusing modular components?

Reusable foundations for faster delivery

The approach we took to building the Skills Discovery Toolkit has already started to pay off. We've been able to begin production on the next version of the toolkit significantly faster.

By investing early in reusability and scalability, the toolkit is making product development more efficient and commercially viable.

09

Reflections

Key Learnings

  • Early alignment on both user needs and business goals is critical to designing scalable products.
  • Building modular, reusable systems up front unlocks massive efficiency later.
  • True cross-functional collaboration strengthens product quality and delivery speed.

What I'd Do Differently

  • Taking a component-first approach alongside wireframing would have allowed us to focus conversations on the logic and reuse value of individual components - the building blocks of the toolkit ecosystem
  • Pushing harder for lightweight external user validation earlier, even with limited resources, would also have helped; as we got closer to launch, we still had open questions around product positioning and commercial strategy
