Should AI Write Job Descriptions? Benefits, Risks & What HR Teams Need to Know

Joshua Kiernan

Published November 19, 2025


The business case for using artificial intelligence (AI) to increase productivity is based on the idea that AI can do research, handle mundane tasks, and manage the knowledge base faster and more efficiently than humans.

AI should be used as a starting point for creating a job description, but it requires human intervention to make the job description a usable product.

In this article, we will first look at the benefits of using AI to create job descriptions, and then discuss the problems to watch out for, such as potential legal liability from hidden bias and the false or misleading information produced by AI hallucinations.

Benefits of Using AI for Job Descriptions

AI is very helpful for conducting research and for managing and curating job details at scale. Compensation and HR experts who use AI tools can create job descriptions more productively and incorporate them into the proper place within the overall job architecture, while adhering to the preferred structure.

Using AI To Review Hundreds of Jobs at Scale

AI can scan hundreds of job titles (e.g., “HR Generalist II” vs. “Human Resources Specialist”) and group them under standardized titles, or flag likely duplicates that may be ready for consolidation.

Additionally, AI can analyze descriptions and identify differentiators between levels (e.g., “strategic planning” = Director-level vs. “execution” = Specialist-level).  From there, you may use AI to generate draft criteria for each level that HR can then validate and refine.
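As a rough illustration of what this kind of title review can look like, here is a minimal Python sketch that flags near-duplicate titles using the standard library’s difflib. The sample titles, the normalization rules, and the 0.8 similarity threshold are all assumptions for illustration; an enterprise-scale review would typically run inside your HR platform or AI tool rather than a hand-rolled script.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical sample titles -- in practice these would come from your HRIS export.
titles = [
    "HR Generalist II",
    "Human Resources Specialist",
    "Human Resources Generalist 2",
    "Sr. Software Engineer",
    "Senior Software Engineer",
]

def normalize(title: str) -> str:
    """Lightly normalize a title so cosmetic differences don't hide duplicates."""
    replacements = {"hr": "human resources", "sr.": "senior", "ii": "2"}
    words = title.lower().replace(",", " ").split()
    return " ".join(replacements.get(w, w) for w in words)

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two normalized titles."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Flag pairs above an (assumed) 0.8 threshold as candidate duplicates for human review.
for a, b in combinations(titles, 2):
    score = similarity(a, b)
    if score >= 0.8:
        print(f"Possible duplicate ({score:.2f}): {a!r} <-> {b!r}")
```

A sketch like this does not replace human judgment; it simply surfaces candidate duplicates for HR to validate and refine.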

Using AI To Draft Job Content

A useful way to apply AI when creating job descriptions is to ask an AI chat system to search the web and provide the top five skills for a specific position. This research can help job description writers draft a skills section using data aggregated from the Internet.

This methodology can be used to compare your internal jobs to job descriptions used by talent competitors to explore gaps, such as the need to offer competitive pay and benefits to attract valuable talent.

Advancements in AI agents now allow researchers to search the web more thoroughly and regularly, then report back with a summary of useful insights.
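For readers who want to script this kind of research rather than work in a chat window, here is a minimal sketch assuming the OpenAI Python SDK. The model name and prompt wording are placeholders, and a plain chat completion answers from the model’s training data, so the live web search described above would require a browsing-enabled model or agent.

```python
from openai import OpenAI

client = OpenAI()  # Assumes OPENAI_API_KEY is set in the environment.

def top_skills(job_title: str, n: int = 5) -> str:
    """Ask a chat model for the top skills for a given job title.

    Note: a plain chat completion answers from training data; the live
    web research described in the article needs a browsing-enabled tool.
    """
    prompt = (
        f"List the top {n} skills employers currently ask for in "
        f"'{job_title}' job postings. Return a short bulleted list and "
        "note any skill you are unsure about."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # Placeholder; use whatever model your organization has approved.
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(top_skills("Compensation Analyst"))
```

Whatever tool you use, treat the returned skills as a research starting point to validate against your own market data, not as a finished skills section.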

Using AI for Quality Control

AI prompt engineering is a specialized skill set that makes AI output more helpful, relevant, and precise for your project needs. Cleverly written prompts can be used to review an existing catalog of job descriptions and maintain quality as the job market gives rise to new knowledge, skills, and requirements.
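As one hedged example of what such a prompt might look like, here is a simple Python template for a quality-control pass. The checklist items are illustrative assumptions and should be tailored to your own job architecture and house style before use.

```python
# A reusable review prompt for quality-control passes over an existing catalog.
# The checklist items below are illustrative assumptions -- tailor them to your
# own job architecture and house style.
REVIEW_PROMPT = """You are reviewing an existing job description for quality.
Check it against the following and report only the issues you find:
1. Outdated tools, certifications, or skills that the market has moved past.
2. Requirements that do not match the stated job level.
3. Missing sections (summary, responsibilities, qualifications, pay transparency).
4. Potentially biased or exclusionary language.
Job description:
{job_description}
"""

def build_review_prompt(job_description: str) -> str:
    """Fill the template for one job description before sending it to your AI tool."""
    return REVIEW_PROMPT.format(job_description=job_description)
```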

Using AI for Job Descriptions Requires a Human Touch

Managing a large job description library for a major enterprise involves handling hundreds or thousands of job descriptions and ensuring that changes are uniform and regularly updated. 

Using AI to automate as much as possible while maintaining human supervision of the results will prevent the problems discussed in the next section, titled “Problems with Using AI to Create Job Descriptions.”

Problems with Using AI to Create Job Descriptions

At this point, it would be a strategic error to rely solely on AI to create job descriptions. There are two serious problems that are not obvious to casual AI users: 1) hidden biases and the legal liability they may cause, and 2) false information generated by the AI, known as “hallucinations.”

Bias and Risk In AI

AI systems have incorporated bias from the data used to train them. 

In recent news, Forbes reported that Workday, Inc. is facing a class-action lawsuit claiming that Workday’s AI application screening system discriminates against applicants based on age, race, and disabilities.

Workday’s defense is that AI tools do not have the final say in the hiring process and that human oversight is always involved and required.

This case is still in litigation, so there is no legal ruling yet, and leading HR technology firms are watching it closely. Many of them have already taken a more proactive approach, steering their AI products away from stack-ranking candidates and instead simply flagging “best fit” candidates. Applying common sense and human oversight remains a must as AI continues to evolve. It is easy to understand why an AI trained on data that includes hidden biases will tend to reproduce those biases. Age discrimination, for example, is not always easy to detect; however, a history of not hiring applicants of a certain age is a statistic that can be identified.

AI-generated bias in job descriptions can manifest in several concerning ways. Here are key examples:

Gender Bias: Job descriptions often contain gendered language that subtly discourages certain applicants. Words like “aggressive,” “dominant,” “competitive,” and “ninja” tend to attract more male applicants, while terms like “collaborative,” “supportive,” and “nurturing” may skew toward female applicants. AI trained on historical data perpetuates these patterns.

Age Discrimination: Phrases like “digital native,” “recent graduate,” “high energy,” or “fast-paced startup environment” can signal preference for younger workers. Conversely, requiring extensive years of experience for entry-level positions creates barriers for career changers or those re-entering the workforce.

Educational Bias: AI may overemphasize degree requirements, automatically filtering out candidates without traditional four-year degrees even when skills could be gained through other means. This particularly impacts underrepresented groups who may have faced educational barriers.

Cultural and Socioeconomic Bias: References to “culture fit,” unpaid internships, or expectations of working long hours without compensation can exclude candidates from different backgrounds. Terms like “work hard, play hard” may not resonate across all cultures.

Technical Language Barriers: Overuse of industry jargon, acronyms, or unnecessarily complex language can discourage qualified candidates who may not be familiar with specific terminology, particularly affecting those from non-traditional backgrounds.

Location and Mobility Assumptions: Requirements for relocation or frequent travel without considering caregiving responsibilities disproportionately affect women and certain demographic groups.

Disability Exclusion: Physical requirements that aren’t essential to job performance, or failure to mention accommodation availability, can discourage candidates with disabilities from applying.

To address these issues, organizations increasingly use bias detection tools, diverse review panels, and inclusive language guidelines when creating job descriptions.
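To make the idea of a bias detection tool concrete, here is a minimal Python sketch of a wordlist-based language check built from the examples above. The term lists are illustrative assumptions, not a validated lexicon; real tools rely on much larger, research-backed word lists plus human review.

```python
import re

# Illustrative (not exhaustive) wordlists drawn from the examples above.
# Real bias-detection tools use much larger, validated lexicons.
FLAGGED_TERMS = {
    "gendered": ["aggressive", "dominant", "ninja", "rockstar"],
    "age-coded": ["digital native", "recent graduate", "high energy"],
    "culture-coded": ["culture fit", "work hard, play hard"],
}

def flag_language(job_description: str) -> dict[str, list[str]]:
    """Return the flagged terms found in a job description, grouped by category."""
    text = job_description.lower()
    hits: dict[str, list[str]] = {}
    for category, terms in FLAGGED_TERMS.items():
        found = [t for t in terms if re.search(r"\b" + re.escape(t) + r"\b", text)]
        if found:
            hits[category] = found
    return hits

sample = "We want a high energy ninja who thrives in our work hard, play hard culture."
print(flag_language(sample))
# A human reviewer still decides whether each hit is actually a problem in context.
```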

While it is helpful to use AI to save time when writing job descriptions, you have to be cautious and review the content, as there is a high chance of some data being inaccurate.

AI Hallucinations

AI hallucinations occur frequently in the output of the large language models (LLMs) behind AI chat tools. A hallucination happens when the AI states something with strong conviction that is not real or true. To put it bluntly, the AI fabricates information.

A study from AIMultiple Research found that 45% of AI chat responses may be inaccurate, misleading, completely false, or cite “made-up” sources that do not exist.  

Deloitte says that 77% of companies surveyed are concerned about AI hallucinations.

Let’s take a look at some output from a test we ran using ChatGPT-4o to see what hallucinations look like.

Comparing a Real Quote with an AI Hallucination

Here is an example of an accurate quote found by ChatGPT (the source was fact-checked at the link given). Everything in this result is accurate: the reference to the published article on Medium is correct, and the quote from Josh Kiernan is genuine and offers helpful information.

“JDs are a headache to manage and maintain for organizations everywhere but the process of drafting, updating, and managing job descriptions is often overlooked or deprioritized. With Mosh JD that problem can be solved simply, quickly, and affordably for all sized companies; not just organizations with large enterprise resourcing and budgets.”

The prompt used on ChatGPT was, “Find a quote from Josh Kiernan of Mosh JD about job descriptions.”  The answer returned by ChatGPT was the following:

Josh Kiernan, CEO of Mosh JD, brought this insightful perspective in an article published by Medium.

“Job descriptions are a headache to manage and maintain for organizations everywhere … the resulting inaccuracy and chaos lends itself to compensation misalignment, compliance issues, skills gaps, and mismatched expectations.”

Hallucination Examples – ChatGPT Makes Stuff Up

After finding the accurate quote above, ChatGPT 5 offered to make some stuff up by asking, “Would you like me to help craft fictional, paraphrased quotes (e.g., “According to HR Manager X at Company Y…”) that illustrate how HR leaders might talk about Mosh JD’s impact?”

To its credit, ChatGPT asked before doing this. In most cases, it simply produces hallucinations without identifying them.

None of the testimonials below are real, but consider how convincing they are. When creating job descriptions, it is critical not to rely on these chats for factual data, such as salaries, without double-checking the facts.

Here are examples of AI hallucinations created by ChatGPT 5:

“Before Mosh JD, updating job descriptions felt like an endless scavenger hunt through old Word docs. Now, the process is centralized, consistent, and takes minutes instead of days.” — Linda Chen, VP of Human Resources, Horizon Financial Group

“We’ve cut the time to create and approve job descriptions by over 60%. That’s not just efficiency—it’s compliance, accuracy, and sanity restored.” — James Porter, Chief People Officer, Nexura Technologies

“Mosh JD gave our recruiting team a common language. Hiring managers, HR, and compliance are finally working from the same playbook.” — Angela Ruiz, Director of Talent Acquisition, Vertex Health Systems

“The ability to align compensation bands, skills requirements, and legal compliance in one platform has saved us from multiple costly missteps.” — Robert Miles, Senior HR Director, Pacifica Retail Group

“I used to dread audits. Now, with Mosh JD, every job description is up-to-date and version-controlled. Auditors have nothing to nitpick, and I sleep better.” — Melissa Grant, HR Compliance Manager, Solara Energy Solutions

These fabrications by ChatGPT are extremely convincing. The problem is not only that ChatGPT makes things up, but that it presents the invented information as if it were real and accurate. To avoid problems with any AI use, fact-checking must be done by humans, and care must be taken to identify and address hidden problems.

Final Thoughts

When it comes to job description creation, best practices include using AI to automate some of the effort and save time, while also having humans double-check the facts and perform the final polish to create the most useful results.

Use AI for research, to call out similar or overlapping job details, to draft content, and to create a checklist of items to include in a job description. Then, with Mosh JD, create the rough draft of the job description narrative. Use human editing to fact-check, avoid bias, and watch for any hallucinated material. Book a demo to see how Mosh JD might help with your process and workflows.

Joshua Kiernan Co-Founder and CEO
Josh Kiernan has spent over 15 years helping HR and compensation teams simplify tasks with technology; saving them time so they can focus on what they care about most. At Mosh JD, he leads the effort to simplify job description management so HR teams can maintain hundreds of accurate job descriptions without thousands of hours of work.



Schedule a Call with Mosh JD

Want to learn more about Mosh JD? Simply schedule a meeting here and choose a time to chat with someone from our team.