An HR manager at a London professional services firm cut job description writing time from 2–3 hours to 30 minutes by learning how to structure AI prompts for HR-specific outputs. The same approach extended to screening questions, rejection emails, and onboarding checklists — all using the AI Survival Kit for HR Professionals.
How an HR Manager Cut Job Description Writing Time by 70% Using AI
James T. had tried ChatGPT once, got output that was "too generic to use", and assumed AI wasn't for HR. He was wrong about the tool. He was right about the prompt.
Disclaimer: This is a composite account based on early user experiences. Names and identifying details are fictional. Individual results vary.
The Problem
Writing job descriptions from scratch for 15–20 roles a quarter
James T. is an HR manager at a professional services firm in London, leading a team of three. His firm is growing — 15 to 20 new roles a quarter, ranging from mid-level analysts to senior consultants. Each role is different enough that copy-pasting the previous JD wasn't an option.
Writing a job description properly — clarity on seniority, responsibilities, must-have qualifications, company culture — took two to three hours. Multiply that across 20 roles and job descriptions alone consumed a significant chunk of James's month.
The screening question sets were generic. He used a standard bank of 12 questions and rotated them. The questions weren't wrong — they just weren't specific to the role. Interviewers noticed. Occasionally, a candidate did too.
Rejection emails were the most uncomfortable part. His team used a single template, copy-pasted and tweaked by hand. For candidates who'd made it to a final round, it felt impersonal. He knew it. He just didn't have time to do it differently.
He'd tried ChatGPT once, typed "write me a job description for a senior consultant in professional services", and got back something that read like a LinkedIn post from 2018. He closed the tab and didn't try again.
The problem wasn't ChatGPT. It was the prompt.
What He Did
The kit didn't just give him templates — it showed him why the prompts work
James bought the AI Survival Kit for HR Professionals. The first thing he did was read the system prompt module — a section on writing a persistent instruction set that tells the AI it's an HR professional working in a specific context, before any task is given.
This was the key change. His previous ChatGPT attempt had no system prompt — he'd just asked for a JD cold, with no context about the firm, the sector, the seniority level, or the audience the output needed to serve. The system prompt fixed all of that.
His second step was the JD prompt. The kit provides a five-bullet brief format: seniority, team structure, top three responsibilities, must-have qualifications, and one sentence on company culture. With those five bullets filled in, the AI produces a first-draft JD that reflects the actual role — not a generic consultant job spec.
The third step was adapting the same brief to generate screening questions — something that took him a few iterations to get right, but which now runs automatically from the same role input.
3 Prompts He Used
Prompt 1 — HR System Prompt (set this first in every session)
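The kit's exact wording isn't reproduced here; this is a sketch of the kind of system prompt the module describes, with the firm details and phrasing as assumptions:

```
You are an experienced HR professional at a mid-sized professional
services firm in London. You write recruiting materials — job
descriptions, screening questions, candidate emails — in clear,
warm, concrete British English.

Your audience is the candidate, not the recruiter: assume they are
experienced, have other options, and will skim. Avoid jargon and
generic filler. If any role details are missing, ask before drafting.
```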
Why it works: This grounds the AI in a specific context before any task begins. The "your audience" instruction is particularly important — it shifts the AI from writing a JD for a recruiter to writing one for a candidate who has options.
Prompt 2 — Job Description from 5-Bullet Brief
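Again, a sketch rather than the kit's verbatim prompt — the bracketed placeholders and the 400–500 word target are illustrative assumptions; the five bullets match the brief format described above:

```
Using the role brief below, draft a job description of roughly
400–500 words, structured as: a two-sentence role summary, a
bulleted list of responsibilities, a bulleted list of must-have
qualifications, and a short closing paragraph on the team and
culture.

Role brief:
- Seniority: [e.g. senior consultant]
- Team structure: [who they report to, team size]
- Top 3 responsibilities: [...]
- Must-have qualifications: [...]
- Company culture (one sentence): [...]
```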
Why it works: The five-bullet brief gives the AI specific material to work from. The word count and format instruction ensures you get a usable structure — not a wall of text. With the system prompt already set, the tone is consistent throughout.
Prompt 3 — Role-Specific Screening Questions
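A hedged sketch of this prompt, run in the same session as the JD so the context carries over; the "strong answer" line is an illustrative addition:

```
Based on the job description above, write a screening question set
for this role: one question for each of the top three required
competencies, plus one question that distinguishes candidates who
led this kind of work from candidates who only supported it. For
each question, add one line on what a strong answer looks like.
```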
Why it works: By referencing the JD you've already built in the same session, you don't have to repeat the context. The "distinguishes candidates who led vs. supported" instruction is the most valuable part — it forces the AI to write a question with discrimination power, not a softball.
What Changed
Across every stage of the recruiting workflow
JDs: 2–3 hours each, written from scratch → 30 minutes including review and edits
Screening: generic 12-question bank, rotated → role-specific question sets generated per JD
Rejection emails: single template, copy-pasted → staged emails by candidate interview round
Onboarding: checklists built from memory → full checklist drafted in one session
The rejection email upgrade was the change James mentions most. He now has three templates — early-stage, mid-process, and final-round — each written with different levels of warmth and specificity. Candidates who reached the final interview receive a more considered message. The AI wrote the first drafts of all three in a single session.
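The article doesn't show the prompt behind these drafts; a plausible sketch, with the word limit and "door open" instruction as assumptions:

```
Draft three rejection email templates for this role: one for
early-stage candidates, one for mid-process candidates, and one for
candidates rejected after a final-round interview. Increase the
warmth and specificity at each stage — the final-round email should
thank the candidate for the time invested, acknowledge a strength,
and leave the door open for future roles. Keep each under 150 words.
```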
The onboarding checklist was a surprise use case. He'd been putting it off for months. He gave the AI a role description and a list of systems the new hire would need access to, and asked for a 30-day onboarding plan. It took 20 minutes to produce something he could review and send.
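A sketch of the kind of request described here — the week-by-week structure and first-deliverable detail are illustrative assumptions:

```
Here is the role description for our new hire, and the list of
systems they'll need access to: [...]. Draft a 30-day onboarding
plan broken into week-by-week goals, covering system access, key
introductions, and a first small deliverable by the end of week
four.
```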
"The prompt library is the bit I keep coming back to. It's not just copy-paste — it teaches you why the prompt works."
Common Questions
AI for HR professionals
Why does ChatGPT produce generic job descriptions?
Generic output is almost always a prompting problem. If you tell ChatGPT 'write me a job description for a project manager', it has no context — it produces the statistical average of every PM JD it's been trained on. The fix is a structured role brief: seniority, team size, reporting line, top 3 responsibilities, must-haves, and company tone. With that context, the output is specific and usable.
Can AI help with candidate screening as well as job descriptions?
Yes. Once you've written the JD, you can ask the AI to generate a role-specific screening question set based on the same brief. Ask it to produce questions that test for the top 3 required competencies, plus one question that distinguishes candidates who have actually done the work from those who've only supported it. The HR kit includes a full module on this.
Is it appropriate to use AI to write rejection emails to candidates?
Yes — with care. The risk with AI rejection emails is that they feel even more impersonal than a standard template. The solution is to give the AI the candidate stage (phone screen vs. final-round) and instruct it to vary the level of acknowledgement accordingly. A candidate who went through three interviews deserves a warmer tone and more specific thanks. The HR kit includes a staged rejection email module.
Get the Same System
AI Survival Kit for HR Professionals
30 copy-paste prompts for HR workflows. System prompt templates. JD builder, screening question generator, rejection email library, and onboarding toolkit — all included.
See the HR Professionals Kit →
From $47 · Instant PDF · 30-day money-back guarantee