Over the past decade, companies have spent billions on data infrastructure. Petabyte-scale warehouses. Real-time pipelines. Machine learning (ML) platforms.
And yet, ask your operations lead why churn increased last week, and you'll likely get three conflicting dashboards. Ask finance to reconcile performance across attribution systems, and you'll hear, "It depends on who you ask."
In a world drowning in dashboards, one truth keeps surfacing: Data isn't the problem; product thinking is.
The quiet collapse of "data-as-a-service"
For years, data teams operated like internal consultancies: reactive, ticket-based, hero-driven. This "data-as-a-service" (DaaS) model was fine when data requests were small and the stakes were low. But as companies became "data-driven," the model fractured under the weight of its own success.
Take Airbnb. Before the launch of its metrics platform, product, finance and ops teams pulled their own versions of metrics like:
Nights booked
Active user
Available listing
Even simple KPIs varied by filters, sources and who was asking. In leadership reviews, different teams presented different numbers, resulting in arguments over whose metric was "correct" rather than what action to take.
These aren't technology failures. They're product failures.
The implications
Data mistrust: Analysts are second-guessed. Dashboards are abandoned.
Human routers: Data scientists spend more time explaining discrepancies than producing insights.
Redundant pipelines: Engineers rebuild similar datasets across teams.
Decision drag: Leaders delay or ignore action due to inconsistent inputs.
Because data trust is a product problem, not a technical one
Most data leaders think they have a data quality issue. But look closer, and you'll find a data trust issue:
Your experimentation platform says a feature hurts retention, but product leaders don't believe it.
Ops sees a dashboard that contradicts their lived experience.
Two teams use the same metric name, but different logic.
The pipelines are running. The SQL is sound. But nobody trusts the outputs.
This is a product failure, not an engineering one. The systems weren't designed for usability, interpretability or decision-making.
Enter: The data product manager
A new role has emerged across top companies: the data product manager (DPM). Unlike generalist PMs, DPMs operate across brittle, invisible, cross-functional terrain. Their job isn't to ship dashboards. It's to ensure the right people have the right insight at the right time to make a decision.
But DPMs don't stop at piping data into dashboards or curating tables. The best ones go further: They ask, "Is this actually helping someone do their job better?" They define success not in terms of outputs, but outcomes. Not "Was this shipped?" but "Did this materially improve someone's workflow or decision quality?"
In practice, this means:
Don't just define users; observe them. Ask how they believe the product works. Sit beside them. Your job isn't to ship a dataset; it's to make your customer more effective. That means deeply understanding how the product fits into the real-world context of their work.
Own canonical metrics and treat them like APIs: versioned, documented, governed. And ensure they're tied to consequential decisions like $10 million budget unlocks or go/no-go product launches.
Build internal interfaces, like feature stores and clean room APIs, not as infrastructure, but as real products with contracts, SLAs, users and feedback loops.
Say no to projects that feel sophisticated but don't matter. A data pipeline that no team uses is technical debt, not progress.
Design for durability. Many data products fail not from bad modeling, but from brittle systems: undocumented logic, flaky pipelines, shadow ownership. Build with the assumption that your future self, or your replacement, will thank you.
Solve horizontally. Unlike domain-specific PMs, DPMs must constantly zoom out. One team's lifetime value (LTV) logic is another team's budget input. A seemingly minor metric update can have second-order consequences across marketing, finance and operations. Stewarding that complexity is the job.
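To make the "metrics as APIs" practice concrete, here is a minimal sketch of a versioned metric registry. It is purely illustrative: the `MetricDefinition` and `MetricRegistry` classes, the metric names, and the semantic-versioning scheme are all assumptions, not any particular company's system.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MetricDefinition:
    """A canonical metric treated like an API: versioned, documented, owned."""
    name: str
    version: str      # semantic version; downstream consumers pin against it
    owner: str        # accountable team, not an individual
    description: str
    sql: str          # single source of truth for the metric's logic


class MetricRegistry:
    """In-memory registry; a real system would back this with a metrics store."""

    def __init__(self):
        self._metrics = {}

    def publish(self, metric: MetricDefinition) -> None:
        # Published definitions are immutable: changing logic means a new version.
        key = (metric.name, metric.version)
        if key in self._metrics:
            raise ValueError(
                f"{metric.name}@{metric.version} already published; bump the version"
            )
        self._metrics[key] = metric

    def resolve(self, name: str, version: str) -> MetricDefinition:
        # Consumers resolve an exact version, so a logic change never
        # silently alters someone else's dashboard or budget model.
        return self._metrics[(name, version)]


registry = MetricRegistry()
registry.publish(MetricDefinition(
    name="nights_booked",
    version="2.0.0",
    owner="core-metrics",
    description="Confirmed nights, net of cancellations within 24 hours.",
    sql="SELECT SUM(nights) FROM bookings WHERE status = 'confirmed'",
))

m = registry.resolve("nights_booked", "2.0.0")
print(m.owner)  # core-metrics
```

The design choice worth noting is immutability: publishing the same name and version twice fails loudly, which forces the kind of explicit versioning conversation that prevents the "metric changed, but nobody knows when or why" failure mode described later in this piece.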
At these companies, DPMs are quietly redefining how internal data systems are built, governed and adopted. They aren't there to clean data. They're there to make organizations believe in it again.
Why it took so long
For years, we mistook activity for progress. Data engineers built pipelines. Scientists built models. Analysts built dashboards. But nobody asked: "Will this insight actually change a business decision?" Or worse: We asked, but nobody owned the answer.
Because executive decisions are now data-mediated
In today's enterprise, nearly every major decision (budget shifts, new launches, org restructures) passes through a data layer first. But these layers are often unowned:
The metric version used last quarter has changed, but nobody knows when or why.
Experimentation logic differs across teams.
Attribution models contradict each other, each with plausible logic.
DPMs don't own the decision; they own the interface that makes the decision legible.
DPMs ensure that metrics are interpretable, assumptions are clear and tools are aligned to real workflows. Without them, decision paralysis becomes the norm.
Why this role will accelerate in the AI era
AI won't replace DPMs. It'll make them essential:
80% of AI project effort still goes to data readiness (Forrester).
As large language models (LLMs) scale, the cost of garbage inputs compounds. AI doesn't fix bad data; it amplifies it.
Regulatory pressure (the EU AI Act, the California Consumer Privacy Act) is pushing orgs to treat internal data systems with product rigor.
DPMs are not traffic coordinators. They're the architects of trust, interpretability and responsible AI foundations.
So what now?
If you're a CPO, CTO or head of data, ask:
Who owns the data systems that power our biggest decisions?
Are our internal APIs and metrics versioned, discoverable and governed?
Do we know which data products are adopted, and which are quietly undermining trust?
If you can't answer clearly, you don't need more dashboards.
You need a data product manager.
Seojoon Oh is a data product manager at Uber.