AI Gives an Illusion of Expertise — New Report Warns It’s Eroding Workers’ Core Skills

The Work AI Institute report warns that generative AI can create an "illusion of expertise," making workers feel more capable while their underlying skills erode. The risk is greatest in creative and knowledge‑intensive roles and among early‑career employees who miss out on apprenticeship learning. Researchers contrast a beneficial "cognitive dividend" when AI complements existing expertise with a harmful "cognitive debt" when it becomes a reflexive shortcut. The report urges leaders to measure AI by outcomes (quality, customer satisfaction, innovation) rather than raw usage metrics.

A new report from the Work AI Institute, produced with researchers from institutions including the University of Notre Dame, Harvard University, and UC Santa Barbara, warns that generative AI is giving many office employees a false sense of expertise while their fundamental skills quietly deteriorate.

Rebecca Hinds, head of the Work AI Institute at workplace search company Glean and a coauthor of the report, told Business Insider that AI often blurs the line between a person's knowledge and the machine's suggestions. "AI is putting expertise into our hands in a way that's not always predictable," she said. "There's often this illusion that you have more expertise, more skills than you actually do. Even if you're very well aware you're using the technology, it's often unclear where your knowledge ends and where the technology begins."

How the Illusion Forms

Hinds compares the phenomenon to the early days of search engines, when easy access to information sometimes substituted for genuine understanding. With generative AI, the effect is more powerful: workers can quickly produce drafts, ideas, and analyses that feel polished — but may hide gaps in underlying judgment and domain knowledge.

The report highlights a common pattern: employees use AI to beat the "blank page" by generating first drafts or initial solutions. While this accelerates output, it also removes the messy, time-consuming process of wrestling with ideas — the very practice that builds deep understanding and the confidence to defend work in meetings. "That process is highly inefficient," Hinds said, "but it's also really healthy. If you lean too heavily on AI to skip it, your skills are going to atrophy."

Cognitive Dividend vs. Cognitive Debt

The researchers describe AI's effects as either a "cognitive dividend" or a "cognitive debt." When used intentionally as a partner in areas where workers already have expertise, AI can free time and sharpen judgment. But when it becomes a reflexive shortcut into unfamiliar domains, it produces weaker skills and misplaced confidence.

Who’s Most at Risk

Early-career employees face the greatest exposure. Roles that traditionally function as apprenticeships — junior developers learning from senior engineers, entry-level marketers learning campaign craft, or young analysts building models from scratch — may be hollowed out if routine tasks are automated or juniors rely entirely on AI. Without hands-on practice and mentoring, these employees may never develop the foundational skills they need to advance.

Perverse Incentives From Metrics

Hinds warns that organizational measurement can unintentionally make the problem worse. Some companies equate AI adoption with clicks or usage metrics and even tie those numbers to performance reviews. That encourages employees to maximize tool use rather than invest time in genuine understanding. "Organizations stack-ranking employees based on how many times they're clicking an AI tool is a big red flag," she said.

How Leaders Should Respond

Hinds does not recommend rejecting AI. Instead, she urges deliberate adoption that aligns with business outcomes. She suggests leaders and teams ask three guiding questions:

  • What roles should stay deeply human? Identify the parts of work that develop judgment, creativity, and motivation — and resist fully automating those.
  • Where is AI truly adjacent? Use AI where it supplements existing expertise, not as a shortcut into domains employees don't understand.
  • What are you measuring? Focus less on raw tool usage and more on whether AI improves outcomes like quality, customer satisfaction, and innovation.

"AI does not magically transform you as a leader," Hinds said. "More often, it amplifies what already exists within the organization."

Adopted deliberately and measured by outcomes, AI can be a force multiplier. Left unchecked, however, it risks producing a workforce that appears more expert than it truly is, and a generation of employees with weakened core skills.

Read the original reporting on Business Insider.