Analytics and AI create the most value when they help you outperform the status quo, not merely replicate it.
I. What Automation Actually Does
The seductive business case writes itself: find a repetitive task, measure the time spent doing the work, attach a labor cost, and build the ROI story around automating it away or scaling it out indefinitely. It feels like progress. It looks like efficiency. It sells well internally.
The problem is that automating a task is also a vote of confidence in the specific process behind it. The implication is that the current process is appropriate, correct, and the best available way of working. That is where the logic quietly breaks down.
II. The Problem with That Vote
Every process carries embedded assumptions. Tasks are performed to achieve a goal—which raises the most important question: is this the right goal, and is it actually aligned to business objectives?
Often, a process exists to reduce friction caused by systemic constraints. Something—data, information, a physical product—needs to move from one place to another, and the existing systems don’t facilitate the transfer. A workaround gets created. Over time, that workaround becomes the pain point and the time sink that now looks ripe for automation.
‘Because it has always been done this way’ is a magnificently poor justification if you truly care about driving success. When you choose to automate, you don’t surface any of the failures that led to things being done that way in the first place. Without addressing the underlying problems and aligning to outcomes, you risk cementing a flawed process in place indefinitely. You’ve made it permanent, and you’ve given it significantly more credibility.
The harder questions—the ones worth asking upfront—are these: Why does the process exist at all? Can we eliminate it? Do we know the existing process is the best way to achieve the goal? Is the behavior we’re trying to replicate actually aligned to business objectives like profitability and growth?
III. The Reframe—Outperform, Don’t Replicate
Our instinct when faced with a business problem is to throw the newest capabilities at it and automate the repetitive parts away. Fight that instinct.
The framing that has guided my approach: address core problems first, build models tailored to optimize objective outcomes, and align incentives so analytic solutions and business processes reinforce the right behaviors—not just the current ones.
A good analytics solution should perform better, not just faster. Imitation sets the ceiling at what a person currently does. A well-designed analytics solution removes that ceiling entirely.
IV. What This Looks Like When It Works
The difference in practice is what the solution is designed to optimize. Rather than replicating what a person is doing, the goal is to provide better information so better decisions get made.
A common example: analysts produce reports from disparate systems, track them in a spreadsheet, append data periodically, and maintain a tracker that feeds KPIs to leaders. A lot of time is spent. The process is prone to inaccuracies, formatting breaks, and delays when upstream systems are slow or someone in the chain is unavailable.
The pain is the process around a workaround that was originally intended to reduce friction. Many automation strategies could be employed: macros to copy/paste/transform; bots to log into systems and pull reports; copilots or LLMs to provide commentary and flag anomalies. All of these are imitation with better tooling.
A good analytics project asks different questions. What are these KPIs actually used for? What business objective do they support? Often the KPI is an intermediate artifact—leaders digest it to make tactical decisions that drive final outcomes like revenue or profitability. The crucial insight isn’t the KPI itself; it’s understanding how and where the business objectives will actually be met.
Instead of automating the existing workflow, redesign around the outcome (a minimal sketch follows the list):
- Extract the necessary data directly from the source system;
- Enrich it with relevant internal and external data;
- Design and deploy models trained to predict the final outcomes;
- Predict profitability or revenue directly using source data as inputs;
- Allow for changes in the control variables the business leader can actually influence—headcount, production levels, pricing;
- Deploy tools that show current state, predict future values, and enable scenario planning conditional on those changes.
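To make the redesign concrete, here is a minimal sketch in Python. It assumes extraction and enrichment have already produced a modeling table; the column names, the external demand index, and the scenario values are illustrative stand-ins rather than a prescription.

```python
# Minimal sketch: model the outcome (profit) directly from operational drivers,
# then use the model for scenario planning instead of automating the old report.
# Column names, the demand index, and the scenario values are all illustrative.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Stand-in for data extracted from the source system and enriched with an
# external signal; in practice this table would come from a warehouse query.
n = 500
df = pd.DataFrame({
    "headcount": rng.integers(40, 80, n),
    "units_produced": rng.integers(8_000, 15_000, n),
    "unit_price": rng.uniform(9.0, 14.0, n),
    "demand_index": rng.uniform(0.8, 1.2, n),  # enrichment: external market signal
})
# Synthetic outcome so the sketch runs end to end; a real project would use
# the historical profit recorded in the financial system.
df["profit"] = (
    df.units_produced * df.unit_price * df.demand_index * 0.35
    - df.headcount * 6_500
    + rng.normal(0, 5_000, n)
)

# Train the model on the final outcome, not on the intermediate KPI.
features = ["headcount", "units_produced", "unit_price", "demand_index"]
model = GradientBoostingRegressor(random_state=0)
model.fit(df[features], df["profit"])

# Scenario planning on the levers a leader can actually pull:
# baseline vs. a 5% price increase with slightly lower headcount.
baseline = pd.DataFrame([{"headcount": 60, "units_produced": 12_000,
                          "unit_price": 11.0, "demand_index": 1.0}])
scenario = baseline.assign(unit_price=11.0 * 1.05, headcount=57)

print("baseline profit:", model.predict(baseline[features])[0])
print("scenario profit:", model.predict(scenario[features])[0])
```

The design choice that matters: the model is trained on the outcome leaders actually care about, so the tool can answer ‘what happens if we change headcount or price’ rather than merely reproducing last month’s report faster.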
Now you’ve automated the data flows and provided real-time intelligence on the outcomes that actually matter—so decisions improve, not just execution speed. That’s the difference between automating a process and outperforming it.
V. The Closing Argument
The automation trap isn’t laziness. It’s that imitation is easier to scope, easier to sell internally, and easier to measure. It’s easy to look at a business environment and see how a model could make decisions the same way a person does. The case is visible. The path is clear.
That’s precisely why it dominates—and precisely why it underdelivers.
There’s a lot of focus on imitating human behavior when we’re better served by providing the intelligence to make a better decision than the one currently being made. Outperforming the status quo requires understanding the business problem deeply enough to define what ‘better’ means before you build anything. It requires asking whether the process itself is worth preserving, or whether analytics is the opportunity to improve on it entirely.
That’s the harder problem. It’s also the one actually worth solving.