TopResume is one of the largest career-services brands in the world, providing job-search advice and products to a large share of the job-seeker market. TopResume’s freemium product, a personalized expert resume critique, goes out to millions of people each year, but doesn’t allow for repeat usage.
To provide more value to job-seekers looking to perfect their resumes, we built a fully automated version of the critique, called the Scorecard, that shows people how their resume is pulled apart and presented to recruiters by the automated hiring software that is pervasive in the hiring process.
The biggest challenge in showing customers how automated hiring software scans their resume is educating them up front that this software is used by the vast majority of recruiters and hiring managers. It’s important for job-seekers to see what the software gets wrong about their resume before they submit job applications.
Unfortunately, without clear context, it’s very easy for incorrect scan results to look like we have a poor product that gets a lot of things wrong. To alleviate this, we included a friendly scanning animation and brief overview of what the user is about to see before diving into the results.
TopResume has a unique opportunity to provide context for the scan results by comparing them, using machine learning, to the millions of other scans we’ve run. Visualizing this data in a simple way was key for the Scorecard. To start, we put file size and word count into perspective, showing people where they fell on the spectrum, as well as when they were outside the industry-standard range for length or size in the job hunt.
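The out-of-range check above can be sketched as a simple classification against an accepted band. This is a minimal illustration; the thresholds below are assumptions for the example, not TopResume’s actual values.

```python
def classify_metric(value: float, low: float, high: float) -> str:
    """Place a resume metric on a spectrum relative to an accepted range."""
    if value < low:
        return "below range"
    if value > high:
        return "above range"
    return "in range"

# Illustrative (assumed) industry-standard band: 400-800 words.
WORD_COUNT_RANGE = (400, 800)

print(classify_metric(950, *WORD_COUNT_RANGE))  # → above range
print(classify_metric(600, *WORD_COUNT_RANGE))  # → in range
```

The same function serves both word count and file size, so each metric can be rendered on the same kind of spectrum visualization.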
Integrating Useful Advice
While the automated scan can offer data like whether contact information is missing, whether a complete employment history has been parsed, and other binary information, it’s up to us to give useful advice based on this data. We worked with our career advice expert and resume writers to draw conclusions based on the possible responses from the scan, and provide actionable advice to improve. For example: if we find employment dates longer than 15 years ago, we let the user know that recruiters generally see that as a negative. (That’s a freebie!)
Developing a Score
Finally, we developed a metric to score resumes based on the presence or absence of key information, like contact details, and on subtler signals, like detected employment gaps or overlaps. The formula takes into account a holistic picture of the employment history, best guesses at the user’s intentions, and recruiting-industry standards. We then deliver actionable recommendations based on the score.
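The shape of such a score can be sketched as weighted presence checks plus penalties for detected gaps and overlaps. The weights, field names, and penalty sizes here are illustrative assumptions, not the real formula.

```python
def score_resume(parsed: dict) -> int:
    """Score a parsed resume from 0 to 100 (illustrative weights)."""
    score = 0
    # Presence/absence of key information (binary checks).
    score += 30 if parsed.get("contact_info") else 0
    score += 30 if parsed.get("employment_history") else 0
    score += 20 if parsed.get("education") else 0
    score += 20 if parsed.get("skills") else 0
    # Subtler signals: penalize each detected gap or overlap.
    score -= 5 * len(parsed.get("employment_gaps", []))
    score -= 5 * len(parsed.get("employment_overlaps", []))
    return max(0, min(100, score))


complete = {"contact_info": True, "employment_history": True,
            "education": True, "skills": True}
print(score_resume(complete))  # → 100
```

Recommendations then attach to whichever checks or penalties lowered the score, which is what makes the number actionable rather than just a grade.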
User testing revealed that users found the Scorecard to be a valuable tool that was successful at contextualizing the importance of computer-readable resumes. Upon launch, Scorecard increased our engaged user-base by 8% and added an additional $100,000 in revenue in three months.