For decades, large organisations have run on professional gatekeepers. IT decided which tools were approved, procurement decided what was compliant, legal decided what was safe. Specialist departments decided what “good” looked like.
That structure concentrated authority. If you wanted something done, you went through the function that controlled the expertise. They set the standards, defined the metrics and controlled the flow of execution. AI is quietly destabilising that system.
In software engineering there is a concept known as “shifting left”: moving responsibility earlier in the process and closer to the people doing the work, rather than keeping it concentrated in specialist teams.
Power is shifting left
AI is beginning to create the same shift inside organisations. Not by eliminating those functions outright, but by weakening their monopoly over execution. When intelligence is embedded directly into everyday tools, the person closest to the task no longer needs to route work through a central department. They can generate analysis, draft contracts, translate content, test products or build workflows themselves. And when that happens, power shifts.
This is not primarily a productivity story. It is a redistribution of authority inside organisations. Professional functions derive influence from scarcity – scarcity of knowledge, access and approved pathways. Over time they formalise that influence through frameworks, standards and performance metrics. These structures are often necessary, but they also create control. If only one team can execute safely or correctly, that team holds leverage. AI reduces that scarcity.
When a product manager can produce legal-grade drafts with embedded guardrails, or a marketing team can localise content instantly using AI systems with expert review layered in, the argument that “only we can execute this properly” becomes harder to sustain. The question changes from “Is this perfectly compliant with our professional framework?” to “Is this good enough to ship?”
That shift sounds minor. It is not. Perfection, as defined by specialists, is a source of institutional power. “Good enough”, as defined by operators, redistributes it. Software development illustrates this clearly. For years, testing was controlled by centralised quality assurance teams. Modern practices introduced the idea of shifting testing left, encouraging developers to test their own code early rather than waiting for a separate quality assurance stage.
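For readers who have not seen it, the shift-left pattern can be sketched in a few lines: the developer who writes a function also writes and runs its checks immediately, instead of handing the code to a downstream QA stage. The function and check names here are purely illustrative, not taken from any real codebase.

```python
# A minimal sketch of "shift-left" testing. The same person who writes
# the function writes the checks, and runs them at write time (for
# example in a pre-commit hook or a CI step) rather than deferring
# verification to a separate quality assurance team.

def normalise_discount(percent: float) -> float:
    """Clamp a discount percentage to the valid 0-100 range."""
    return max(0.0, min(100.0, percent))

# Developer-owned checks, run before the code ever leaves the desk.
assert normalise_discount(150.0) == 100.0   # out-of-range input is clamped
assert normalise_discount(-5.0) == 0.0      # negative input is clamped
assert normalise_discount(20.0) == 20.0     # valid input passes through
```

The specialist function does not vanish in this model; it defines what a sufficient check looks like, while execution sits with the person closest to the code.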
AI will change business functions
Quality assurance did not disappear. But its authority changed. It moved from day-to-day gatekeeping to defining standards, building automated frameworks and overseeing risk.
Enterprise IT followed a similar trajectory. For years, business units waited for central approval and provisioning. SaaS platforms chipped away at that control. Teams began selecting tools directly. Shadow IT emerged not because governance disappeared, but because operational needs moved faster than central processes.
IT still exists. But its role evolved. It sets policy and manages security rather than controlling every purchase. AI is accelerating that same pattern across far more functions at once.
Legal departments will not vanish, but routine drafting will increasingly begin outside their walls. Localisation teams will still matter, but translation will often start at the point of need. Finance teams will continue to manage risk, but analysis will be generated long before it reaches them. In each case, the centre of gravity moves outward.
This shift has consequences beyond workflow efficiency. When execution becomes self-service, buying power moves as well. The people closest to the work begin to define what matters. They optimise for speed, usability and outcomes rather than internal process metrics.
Governance is not the same as control
Professional KPIs rarely disappear overnight. They erode when users can achieve acceptable results without going through the traditional channel. The destabilising force is not that AI makes experts obsolete. It is that AI makes expertise ambient.
When capability is embedded directly in the tool, the tool competes with the department – and tools scale faster than organisational hierarchies. This does not eliminate risk. Governance may become even more important. But governance is not the same as control. Setting guardrails from the perimeter is different from sitting at the centre of every decision.
For leaders, the strategic question is not whether AI will replace functions. It is whether their authority depends on being a mandatory intermediary.
AI makes influence fragile
If influence depends on owning the only path to execution, that influence is fragile. AI will route around bottlenecks wherever possible. If influence instead comes from defining standards that scale across decentralised execution, it can endure.
Inside organisations, power rarely disappears. It migrates. AI lowers friction at the edge. Its most visible impact may be faster drafting, cheaper translation and quicker analysis. The deeper impact will be less visible: a shift in who gets to decide what “good” looks like.
And inside any institution, that is never a neutral change. AI is not just automating work. It is shifting power left.
Yoav Ziv is the CEO of Tasq AI, a platform that helps enterprises scale AI and GenAI models by integrating human judgment into high-stakes data workflows.
