The Perfection Trap: Why Best Practices Are Killing Your Data Consulting Business
- Christian Steinert

Walking the walk when speed beats perfection—and what I learned selling $100K ROI guarantees under impossible timelines
Like the Back to the Future DeLorean's license plate, we're OUTATIME.

In this issue, I’m focusing on how to truly walk the walk as a consultant. I think this piece is therapeutic for my own psyche, which, selfishly, is why I’m diving into it in this week’s Rooftop Insights issue.
It’s easy to talk about best practices in data. After four years of establishing my brand as a true data management expert, all of my content promotes robust data modeling, minimal tech debt, and building data platforms correctly the first time.
Proper data management feels so effortless when writing content for LinkedIn or Substack.
Start with the problem. Talk to end users. Understand what questions they’re trying to answer. Build a data solution that helps them answer that question (even if it’s just one or two to start). Reinforce taking action on these insights.
I get a hit of dopamine when I align my content with these best practices: highlighting what we did well at clients that led to success using these frameworks (it’s true!), and trashing every data team that violates them.
Poor practices that easily arise in a real-world working environment include:
Building what a stakeholder asks for
Querying directly off the raw transactional tables
Lifting and shifting legacy SQL logic into the cloud
Throwing 30 metrics at a stakeholder without qualifying their necessity
Two-plus years full time in my own data consulting practice have taught me a massive lesson: best practices are sexy to talk about, but the speed needed to show value ASAP creates obstacles to achieving them fully.
Data management best practices are the North Star: a goal to ALWAYS keep in your sights, but typically not practical in the initial phases of development at a client.
Why Best Practices Take a Back Seat (And Why That’s Okay)
Why, you might ask?
Investors and C-suite executives don’t care. They need value NOW.
In a truly data-first organization, C-suite and executive buy-in is the ultimate solution to an “order-taking data culture”. It’s also the solution to extended timelines for building with best practices.
People talk about buy-in at the top as one of the most important aspects of successful data teams.
Is this always realistic though? In my near decade of experience in analytics, hell no.
CEOs report to investors. You’re dealing with incredibly talented and accomplished individuals. This comes with a hefty amount of ego and pride.
These executives have gotten to where they are due to operating with extreme efficiency, cutting the fat and executing only the pieces that move a business 10X. They’re also extremely aware of the latest ways of working.
In my opinion, AI tooling is more hype than reality, but it absolutely does accelerate our efficiency. In the world of data management, C-suite execs are over-promised efficiency gains by AI vendors, but do you think they care to learn why that is? No.
If they hire a data team, they need to see results immediately. All of this combines to make our job incredibly hard, with massive expectations.
In the same vein, our goal as data consultants is to show value ASAP. That’s part of our offer’s ROI guarantee at Steinert Analytics.
Otherwise, what’s the point of hiring a 5-6 figure data consultant for a 4-8 week project “sprint”?
In 2025, we don’t have 3-6 months of runway before showing value. Value must be shown at hyper speed. And to do that, you have to deliver incrementally and iteratively.
This requires ditching perfection. In my eyes, delivering perfection as a data engineer is aligning to best practices while considering the business need.
When you’re up against the clock, you’re in a return-on-investment race, created by the pressure of your clients’ and stakeholders’ needs and the promise you made to sell your impressive offer.
Today’s State of Working in Healthcare Data for the Mid-Market
Let me give you some insight into a mid-sized healthcare company’s data ecosystem in the United States.
They’re typically brownfield. We begin an engagement by documenting all databases, tables, fields, source systems, data flows, existing reports, data lineage, and data pipeline operations. This builds the foundation of a data catalog they never had.
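The first pass at that catalog can be largely mechanical: pull table and column metadata out of the source system and group it per table. A minimal sketch in Python, assuming rows shaped like SQL Server's INFORMATION_SCHEMA.COLUMNS output (the sample schema and table names here are invented for illustration):

```python
from collections import defaultdict

def build_catalog(columns):
    """Group (schema, table, column, type) rows into a per-table catalog."""
    catalog = defaultdict(list)
    for schema, table, column, dtype in columns:
        catalog[f"{schema}.{table}"].append({"column": column, "type": dtype})
    return dict(catalog)

# Invented sample rows, as you might fetch from INFORMATION_SCHEMA.COLUMNS
rows = [
    ("dbo", "Encounters", "EncounterID", "int"),
    ("dbo", "Encounters", "AdmitDate", "datetime"),
    ("dbo", "Patients", "PatientID", "int"),
]
catalog = build_catalog(rows)
print(sorted(catalog))  # table inventory, e.g. ['dbo.Encounters', 'dbo.Patients']
```

Even a flat inventory like this beats what most of these clients start with, and it becomes the backbone for documenting lineage and report dependencies later.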
They’re querying directly off their transactional database (typically SQL Server). Sometimes that’s even hosted on-prem (my goodness!).
There is little to no documentation of anything. We’re unraveling thousands of lines of SQL code that sits in 400 stored procedures and views written by a DBA no longer at the company.
The logic in these stored procedures has never gone through due diligence. It’s simply assumed to be the correct source of truth, regardless of data validation or code quality.
They have 30-50 Power BI reports that are directly connected to the transactional database. This is mainly querying EHR data.
There may be 3 versions of the same Power BI report in two separate workspaces. Stakeholders with influence are picking and choosing which reports are considered “source of truth”.
You may have one or two high level stakeholders that have built their own version of a Power BI report and claim that’s the source of truth for a given set of metrics. Other users follow along due to the influence of these top level stakeholders.
There’s also plenty of Excel reports that are being manually updated daily by pulling from a few of these legacy Power BI reports and native EHR UI reporting.
As the consultant, it’s our responsibility to identify one critical report we can take from a manual workflow to full automation. This baselines our offer with a hard ROI based on time-to-dollar savings in labor cost.
We’ve found this use case is common enough to confidently propose the solution to them, while providing extremely hard evidence of an ROI we can guarantee quickly.
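The math behind that hard ROI is simple enough to sketch. A hypothetical example in Python; the hours saved, loaded hourly rate, and project fee below are made-up numbers for illustration, not figures from any real engagement:

```python
def automation_roi(hours_saved_per_week, loaded_hourly_rate, project_fee):
    """Hard ROI of automating a manual report: labor dollars saved vs. fee."""
    weekly_savings = hours_saved_per_week * loaded_hourly_rate
    annual_savings = weekly_savings * 52
    return {
        "annual_savings": round(annual_savings, 2),
        # weeks until labor savings cover the consulting fee
        "payback_weeks": round(project_fee / weekly_savings, 1),
        "first_year_roi_pct": round((annual_savings - project_fee) / project_fee * 100, 1),
    }

# Hypothetical: an analyst spends 15 hrs/week updating Excel from legacy reports
print(automation_roi(hours_saved_per_week=15, loaded_hourly_rate=55, project_fee=25_000))
```

Framing the guarantee this way keeps the conversation anchored to labor dollars the client is already spending, rather than abstract data-quality benefits.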
The Trust-Building Paradox: Match Legacy First, Then Improve
When we enter the build phase, it’d be ideal to start from scratch. However, the reality is these mid-sized healthcare companies treat their legacy reports as the source of truth.
If you’re going to gain trust early, you don’t have the luxury of redefining the data model into a new logic that produces slightly different numbers in your new and shiny data warehouse. Often, you’re stuck building so numbers match legacy as a baseline.
Once you’ve proven that in a quick POC phase, trust builds. Then and only then can you start challenging the legacy logic vs. your new, best practice data model that sits in the data warehouse. From there, conversations between you and these influential stakeholders start happening.
It forces these stakeholders to think critically about how metric logic is defined. You’d be shocked at the lack of clarity uncovered in these conversations. You, as the data person, begin showing them why their legacy logic is incorrect, but you only earn that right by proving you can validate your new build against their legacy first.
How to Deliver Value Immediately While Thinking Longer Term
I often hear that you need to build what satisfies their requirements in the immediate while simultaneously building your new and improved solution for the long term.
While I agree with this in theory, in practice while data consulting it’s a whole hell of a lot harder. You don’t have 40+ hours per week to dedicate to one particular client.
Maybe it’s on me to get more efficient, but if you’re capped at 15-20 hours per week, you have to prioritize delivering value that can be seen and showcased, at least early on.
This is why you must over-communicate to your clients! If you’re delivering value but know it’s violating best practices for the long term health of a company’s data, flag it to them.
A great set of questions that helps me baseline “the value:tech debt ratio” is this:
What value is this delivering to our stakeholders?
e.g., saves them $40,000 per year in labor costs
What’s the opportunity cost of this approach?
Maybe slightly more subjective, but here’s the deal: if you build on spaghetti legacy logic to automate a manual report quickly, long term it limits stakeholders’ ability to use the same dataset for other decision-making.
You’re going to be limited on detail and other information. This will require end users to submit requests to the data team.
They’re going to ask the data team to add a field and that ends up consistently burning 5-10 hours per week of an analytics engineer’s time to add that field into the nightmare of spaghetti code.
No documentation will be produced, and no one will truly understand the logic of this hairy spaghetti query used to power this automated report.
Whereas if you’d taken the time to build a best-practice data model, stakeholders could have gotten what they needed right away (with proper training), and analysts and engineers wouldn’t be stuck adding one more field to a table and burning valuable labor hours.
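That trade-off can be framed as a simple break-even: the spaghetti shortcut ships sooner, but it burns analytics-engineer hours every week afterward. A hedged sketch, where the 40 extra modeling hours and 7.5 hours per week of field-request churn are illustrative numbers, not measurements:

```python
def breakeven_weeks(extra_modeling_hours, weekly_maintenance_hours):
    """Weeks until the upfront modeling investment beats the quick hack,
    assuming the hack keeps costing `weekly_maintenance_hours` in ongoing
    ad hoc requests against spaghetti code."""
    return extra_modeling_hours / weekly_maintenance_hours

# Hypothetical: 40 extra hours to model properly vs. 7.5 hrs/week of
# field-addition requests (midpoint of the 5-10 hrs/week above)
weeks = breakeven_weeks(extra_modeling_hours=40, weekly_maintenance_hours=7.5)
print(round(weeks, 1))
```

If the break-even lands inside the engagement window, the "slower" best-practice build is actually the cheaper one, which is exactly the argument to overcommunicate to the client.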
Overcommunicate and Carve Out Time
With all that said, overcommunicate the tech debt, limitations, and drawbacks of the deliverables that satisfy quick value creation and ROI guarantees. This helps set expectations with your clients.
Encourage the team to carve out time in established Jira tickets/statements of work for the more robust data modeling and future state of the data warehouse.
As long as you are delivering value that establishes trust, overcommunicating to level set expectations, and illustrating the outcomes of a more robust data warehouse build, stakeholders will be much more understanding of where you spend your time.
I hate to say this, but it’s really all a balance.
A balance of delivering ROI that establishes trust. A balance of overcommunicating where we are still falling short. And guiding them on the action steps and plan to improve so we can ensure we’re building the most robust data foundation for these healthcare companies in the Age of AI.
Healthy Long Term Partnerships
What I’m describing are really just attributes of good project management and reliable consultants.
Deliver quick wins that establish trust.
Stay transparent about the pros and cons of what we’re doing.
Create an action plan to keep moving towards a positive outcome while ensuring everyone is still getting value while we’re doing it.
Sounds easy enough? Hah! I know, this is extremely challenging.
It’s a process I think I’ll be learning forever, but let me tell you: sticking to our core values of transparency, egoless candor, continuous learning, and work ethic has always shone through in our success with clients.
At the end of the day, data transformation consulting is a people & relationship game.
Without transparency, trust, and continuous follow-through, the relationship erodes. Sound like a long-term relationship of any kind? That’s because it is.
This is a partnership, not a transactional sale. Treat it as such.
Christian Steinert is the founder of Steinert Analytics, helping healthcare organizations turn data into actionable insights. Subscribe to Rooftop Insights for weekly perspectives on analytics and business intelligence in healthcare.
Also - check out our free Healthcare Analytics Playbook email course here.