'Kastane' Applicant Tracking System

Kastane: at a glance

No time for the details? Click through the summary slider below to understand my key contributions and takeaways.

summary

'Kastane' is a job applicant filtering software designed specifically for recruiting software engineering talent, developed by Atera Technologies Pvt. Ltd.

I operated as the Product Design Lead, unclogging a 6-month roadblock, spearheading user research, establishing a design pattern library, and working with development to deliver Kastane V1 to clients within 8 months.

The Goal

The team was stuck in deadlock, unable to apply a user research and product development process. They wanted the product to be usable, but leadership couldn't commit the time needed to guide the team, and had no idea how to.

I was recruited by leadership to ensure several key project goals were met.

Methods

  • Conducted 3 stakeholder interviews and 6 team interviews, collating the research gathered so far.
  • Conducted 8 interviews with early-adopter clients.
  • Revised 3 Personas for key user groups according to the JTBD framework.
  • Generated 40+ user stories for core features.
  • Developed an information architecture map and UML diagrams.
  • Led the team through weekly mid-fidelity wireframe iterations, addressing 250+ usability issues.
  • Led the team through high-fidelity prototyping for the “Create Job”, “Applicant dashboard”, and “Job Application” user flows.
  • Usability tested with 5 participants, identifying 25 usability issues.

insights

  • The stakeholders' and team's research emphasized the core need for a low-effort, customizable solution that:
    • Reflected organizationally unique hiring philosophies
    • Integrated into existing tech stacks instead of replacing them.
  • The revised Personas compartmentalized role-specific features for recruiters, hiring managers and team-leaders.
  • Usability testing revealed that:
    • The “Job Overview” text box was the most time-consuming step for 4/5 participants.
    • 3/5 participants tried and failed to edit skills in the “Screening Questions” screen, failing to realize they had to navigate to the previous page.
    • 3/5 participants were confused about filling out the screening questions.
    • 4/5 participants overlooked the “Seniority level” selector, focusing on the “Job category” dropdown instead.

what went right & what went wrong

The project successfully delivered an MVP for development and a timely release. Key outcomes included:

  • What went Right
    • The revised personas and user stories finally aligned the stakeholders, design, research, and development teams.
    • Mid-fidelity wireframe iterations addressed 250+ usability issues.
    • Usability testing validated the MVP while uncovering further usability issues.
    • Provided comprehensive findings and recommendations for future iteration.
  • What went Wrong
    • Accessing the target user demographic for testing was challenging.
    • The product's positioning as a plug-and-play solution conflicted with client preferences for an end-to-end solution.

Introduction

what is 'Kastane'?

Kastane was a job applicant recruitment and filtering software specifically designed for recruiting software engineering talent.

The product was being developed by Atera Technologies Pvt. Ltd., and was envisioned as a plug-and-play solution that integrated into local and international companies' existing tech stacks.

Fun fact: 'Kastane' is the Sinhala word (Sinhala being one of Sri Lanka's native languages) for a Sri Lankan ceremonial sword. In short, a pseudonym.

Project goals and challenges

The team was stuck in deadlock, unable to apply a user research and product development process. They wanted the product to be usable, but leadership couldn't commit the time needed to guide the team, and had no idea how to.

I was recruited by leadership to ensure several key project goals were met.

Implement an iterative research process with milestones, and concrete progress.

Create an MVP prioritized roadmap of application features.

Train the team on basic usability methods.

Stakeholder & Client Interviews

catching up on the research so far

In order to understand the insights gathered so far, I interviewed the leadership, design, and development team leads and gathered some fundamentals.

Kastane began with a client problem: clients needed large amounts of tech talent, but couldn't wade through hundreds of mixed quality applications.

From leadership's interviews with 8 clients, and the design team's literature review, I gathered the following key insights:

The app should automate as much of the process as possible.

Users should be able to customize the application to suit their organization’s hiring practices.

The app should integrate with, and add functionality to, existing systems.

case study scope

Narrowing the focus to the 'Create Job' flow

The full project was a complete application design covering hundreds of screens; it is still ongoing and beyond the scope of this case study.

Therefore, this case study will focus in detail on one specific user-flow — the Create Job flow.

This case study will detail the journey to the final high fidelity prototype, containing the following features:

  • Create a Job posting.
  • Select job type.
  • Choose job selection criteria.
  • Customize job screening questions.
  • Review the job summary.
  • Publish the posting.

personas

Updating the personas

The team had already conducted informal interviews, competitor analyses, and customer research online, and then used this data to generate Personas.

However, due to irrelevant details and feature-first framing, the Personas were failing to help the team identify who they were designing for, or what they were designing.

no time for a redo

Unfortunately, there was no time to redo the Personas from the ground up.

I reorganized the insights gathered so far, and made the following improvements to the Personas:

  • Simplified the bio so it did not stray from our insights.
  • The original goals had been reverse-engineered from features the team was already developing. The new goals took a step back and focused on the actual business problems the Persona might encounter in their entire work context.
  • Adding “Platforms” gave the team an idea of which design patterns the Personas would be most used to.
  • Removed unhelpful personal traits and replaced them with more detailed motivations.

Looping between info architecture & user stories  

In order to help the team translate their disconnected insights into concrete features, I decided to iterate between information architecture mapping and user stories.

This format helped the team switch between micro and macro, giving them confidence and direction.

Information Architecture

no map, and no agreeable direction

The team lacked a strong idea of what the overall application would look like, which features would be integral, and which would not.

So we started by using Miro to create an Information Architecture (IA) diagram, adding to it as new features were introduced over time.

This Miro IA diagram could then be converted into a formal IA diagram once we reached the developer handoff documentation.

User stories

Using User stories to get really specific

At this point, the team was still having trouble converting Persona characteristics into relevant features. So I chose to break the Personas down by helping the team generate exhaustive user stories for each feature.

They had to explain how each Persona would use each screen in a user-flow in various scenarios, forcing them to think more concretely about how each feature would be used.

This stage substituted for storyboarding, helping the non-design members of the team get over their design fears by writing instead of drawing.

As a Hiring Manager, I want to...

  1. “…create requirements for a specific job, so I can clearly define what type of person we are looking for.”
  2. “…save the job I’m creating as a draft, so I can return later to finalize the changes.”
  3. “…create different levels for resume matching. This way, I can sort a list of potential candidates according to what level of match they fit into.”
  4. “…assign a job category for a new job, so I can have similar types of jobs grouped together.”
  5. “…make job requirements mandatory or optional, so that I can give priority to certain skill sets over others.”
  6. “…view a summary of the job I created, so I can review it before publishing.”

After several revisions, the user stories helped the team get a more granular idea about how each Persona would interact with each stage in a feature’s user-flow.
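As an illustration of story 3, the tiered resume-matching idea can be sketched as a simple scoring rule. The tier names, threshold, and field shapes here are hypothetical assumptions for illustration, not the product's actual specification:

```python
def match_level(candidate_skills, required, optional, threshold=0.5):
    """Classify a candidate into a resume-match tier.

    Mandatory requirements gate the result; optional ones refine it.
    The tiers and threshold are illustrative, not Kastane's real spec.
    """
    skills = set(candidate_skills)
    if not set(required) <= skills:
        return "low"          # missing a mandatory requirement
    if not optional:
        return "strong"
    coverage = len(skills & set(optional)) / len(optional)
    return "strong" if coverage >= threshold else "partial"

# Candidates can then be bucketed by tier and sorted for review
candidates = {"A": {"python", "sql"}, "B": {"sql"}}
tiers = {name: match_level(skills, {"python"}, {"django", "sql"})
         for name, skills in candidates.items()}
```

This mirrors story 5 as well: mandatory requirements act as a hard filter, while optional ones only influence the tier.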

Mid Fidelity Wireframes

improving the wireframes made so far

Now armed with revised Personas, user stories, and an Information Architecture diagram, we focused on revising the mid-fidelity wireframes the team had started.

The team was now thinking more critically about the application's features. While reviewing these wireframes, we revisited the user stories and IA diagram, expanding as we went along.

The screens listed below demonstrate some of the core issues that were most prevalent throughout the screens.

The page size was uneven and there was poor design hierarchy.

Introduced a more detailed progress bar.

The screening method didn’t fit any mental model, terminology, or best practices used by recruiters. This made the candidate ranking system difficult to understand.

We modified pages, categories, and feature names to reflect the current processes by which job creators create and publish jobs.

Lots of empty, unused screen real estate that reflected poor information grouping.

Introduced customizable job criteria, to better match the traditional job advertisement format.

'Assessments' were this version of the application's method for evaluating applicants. However, the grading criteria were unclear, and what assessments were, and what they entailed, was not communicated.

We spent a lot of time experimenting with the grading criteria.

High-Fidelity prototype

Refining key features, flow, and aesthetics for the “Create Job” user-flow

Our main objective now was to create a believable-enough prototype for usability testing.

One of the main flows we wanted to test was the 'Create a Job' flow, which contains the following key stages:

  • Create Job.
  • Select Job template or Create Custom Job.
  • Upload Job Description.
  • Select End date or Headcount.
  • Choose Job Criteria.
  • Customize Job Screening Questions.
  • Review Job Summary.
  • Publish Job.

While iterating through this prototype, I outlined several major objectives we had to achieve to be ready to test.

Re-build categories:
Current categorization features were contradictory, and did not fit recruiters' mental models.

Add advanced filters:
Filtering options were sparse and unhelpful.

Apply Heuristics:
Many screens violated heuristic principles, and progress indication was unclear and inflexible.


Create Job Features

Create New Job
The “Create Job” function can be accessed while on the “All Job Postings” page. User testing indicated that not having this function in the sidebar might pose an issue for users who want to create a job quickly.
Select A Template
Once in the job creation wizard, users can select pre-made job templates, or create one from scratch. Templates were included to speed up the process — and enable users to create their own templates as well.
Upload Job Description
On the “Overview” screen, users have two options. Either, they can upload a job description (JD), and we can extract and populate the application, which the user can then review. Or, if there is no pre-existing JD, they can enter information field by field.
Select Closing Criteria
The most important new feature we added was the End date / Head Count job close state — allowing for the job to automatically close upon satisfaction.
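The auto-close rule described above can be sketched as a small predicate. The field names and dictionary shape are illustrative assumptions, not Kastane's actual schema:

```python
from datetime import date

def should_close(job, today, hires_made):
    """Return True once a posting satisfies its closing criterion.

    A job closes when its end date has passed, or when its target
    headcount is reached, whichever criterion was configured.
    (Field names are hypothetical, not the product's real schema.)
    """
    end_date = job.get("end_date")
    if end_date is not None and today >= end_date:
        return True
    headcount = job.get("headcount")
    if headcount is not None and hires_made >= headcount:
        return True
    return False

# A headcount-based job stays open until 3 hires are made
open_job = {"end_date": None, "headcount": 3}
```

Making the two criteria independent optional fields keeps the flow's “End date / Head Count” choice a configuration detail rather than two separate job types.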
Select Job Criteria
The “Job criteria” screen is the most important and most complicated screen in this user-flow. Users must select and review their desired skills, academic qualifications, and work experience. Users can also add additional cards for language, portfolio, and other criteria.
Choose Skill Levels
Each skill in our library has 3 proficiency levels — novice, intermediate, and expert. The skills and their levels then correspond to automatically generated screening questions — our application's first screening method. Work experience, on the other hand, scans applicant profiles for pre-defined keywords.
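The two screening mechanisms described above might look like this in outline. The question bank and keyword list are invented for illustration; Kastane's real questions were auto-generated per skill and proficiency tier:

```python
# Hypothetical question bank keyed by (skill, level) pairs.
QUESTION_BANK = {
    ("python", "novice"): "What does a list comprehension do?",
    ("python", "expert"): "How would you profile a slow service?",
}

def screening_questions(selected_pairs):
    """First mechanism: map chosen (skill, level) pairs to questions."""
    return [QUESTION_BANK[p] for p in selected_pairs if p in QUESTION_BANK]

def experience_matches(profile_text, keywords):
    """Second mechanism: scan an applicant profile for pre-defined keywords."""
    text = profile_text.lower()
    return [kw for kw in keywords if kw.lower() in text]
```

The point of the split is that skills are tested actively (questions) while work experience is checked passively (keyword scan), so the two never compete for the applicant's attention.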
Customize Screening Questions
The “Screening Questions” screen allows users to do two things:
  • Users can select five questions covering salary, remote versus in-person work, or other important screening criteria.
  • Users select 3 top skills to test applicants on. The selectable skills are auto-populated from the “Skills and Technologies” card on the previous screen.
Review via the Job summary page
The “Job Summary” screen allows users to review their job posting. If a user sees an error in any part of the process, they can click the pencil icon, and edit that specific part — eliminating the need for backtracking.

A user can then either save the job as a draft or publish it, upon which they will be prompted to share the job URL, publish to any integrated social platforms, and then be returned to the home screen.

usability testing

Accomplishments & limitations

Upon completing the high-fidelity prototype for this function, we began testing with 5 users. After conducting a pre-qualification survey, we filtered the applicants to simulate the Personas we had designed for.

Our test was conducted in person using a scripted set of questions. Each question asked users to complete job-relevant tasks using the application. Examples include creating a job and then publishing it, or filtering candidates to select those that most closely match the job description.

By asking users to complete these tasks, and speak out loud as they found their way through the application, our team gathered data on thoughts, sticking points, missing functions, and a whole host of other usability issues.

Our key findings were as follows:

The most time-consuming step was the Job Overview text box on the first page.

Users tried to edit skills in the 5th question of the “Screening Questions” step; these could not be edited without navigating back.

Users tried and failed to click on the progress bar to go back quickly, instead of using the “Back” button.

Users were confused about whether they were supposed to fill out the screening questions.

Users missed the Seniority level selector and went ahead with the default level — not seeing, or understanding, this selector's importance.

Users focused on the “Job category” dropdown instead.

Conclusion

Accomplishments & limitations

We presented our final high-priority findings and core recommendations, with screen illustrations, in a comprehensive, referenceable report.

We encountered a few limitations:

  1. Our user demographic was hard to access, forcing us to settle for a near-random assortment of users to test the platform with.
  2. Our applicant filtering approach is contrarian; its learning curve would force recruiters to work against much conventional recruiting wisdom. As a result, it could face adoption resistance when marketed.
  3. Our testing was still in the concept validation stage. We did not get to the point of testing for optimization or usability excellence.
  4. One of our biggest pieces of feedback when pitching this product was that clients want an end-to-end solution (e.g., BambooHR). But when we discussed this feedback with stakeholders, an E2E solution proved to be highly time-consuming and resource-intensive. It would go against the business case of the product as originally conceived: a 'plug-and-play' solution that fits into other tech stacks.