How to Run a Healthcare Data Migration Without It Running You
- Christian Steinert

The exact system I use to deliver high-stakes healthtech migrations without the chaos
10,000 lines of legacy SQL. 30+ semantic models. One sole analyst who built it all.
This is what you walk into for a healthcare data migration—and how to handle it without burning out yourself, your team, or your client relationship.

We’re getting technical for this issue of Rooftop Insights. My goal is to paint a crystal-clear picture of what you’ll walk into during a data migration at the majority of healthcare/healthtech companies, and how to handle it smoothly.
A tale as old as time: when it comes to platform work vs. data infrastructure, the platform always wins the prioritization battle. HealthTech companies know data is extremely important; it’s often the true lifeblood of their business. However, the platform (in my case telehealth) is what the customer actually sees and interacts with.
So the platform gets chosen over everything else. When I say “chosen,” I’m specifically referring to resource investment: you might have two to four senior software engineers working on the platform, and zero or one data person curating reports and analysis of the data.
I’ve seen environments where a sole analyst built out over 15 multi-page Power BI dashboards, the equivalent of roughly 10,000 lines of SQL querying the production OLTP database directly. We’re talking around 30-35 semantic models that are really just massive SQL queries sitting in Power Query.
Timezone nuances, grain differences, CASE groupings, and report-specific filters mean these queries reuse many of the same tables, so reading the code closely is the only way to spot the differences and the intent behind them.
Layer on the high-pressure deadlines that come with a high-growth (and sometimes seed-funded) HealthTech scale-up, and you’ve got A LOT on your hands to ensure this migration goes smoothly.
Now I’ll walk you through how we at Steinert Analytics handle projects like this to ensure they go smoothly. This is pulled directly from my own experience navigating these migrations. I don’t want to give the impression that I’ve never hit a bump in the road.
I have, and more likely than not, you will too at some point. But this is my best play for giving you something valuable to walk away with and act on today in your own data migration projects.
Win as a Consultant by Setting Expectations
The first piece of success is to manage expectations with stakeholders and your team. Stakeholders want things FAST. And since they’re your customer, it’s your duty to strive to deliver fast. But that doesn’t mean sacrificing quality or overpromising.
In the Statement of Work (SOW), I map the deliverables to their expected timeline, but I include a clause stating that we will deliver as many migrated dashboards as possible in the allotted time (i.e., 20 hours per week). I always build buffer and breathing room into how I write the SOW for a project like this.
It’s impossible to know at the start of a project what snags or gotchas you might hit, especially when the logic is extremely nuanced, buried in arcane legacy SQL written years ago by an analyst or engineer with a depth of business knowledge you’re only just getting up to speed on.
Furthermore, build out a Gantt chart and use Trello for your internal Kanban board. Assign your team individual tasks on the Gantt chart that map onto the Trello board, and add new tickets each weekly sprint.
This gives your team operating structure and shared visibility into what everyone is responsible for, which is a key to successful data migrations.
The Development Process That Actually Works
Step 1: Inventory All Source Tables for Each Dashboard Being Migrated
Do you have all the source tables pulled into your data platform (in my case Microsoft Fabric)? If not, the first thing to do is load those tables and build any required DDL and stored procedures.
If you have the metadata from the database’s information schema, leveraging Claude to generate the DDL and stored procedure scripts is super convenient.
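If the source database exposes a standard information schema, the metadata pull can be as simple as the query below. This is a minimal sketch; the schema and table names are placeholders, not a real client environment.

```sql
-- Pull column metadata from the source database's information schema.
-- This output is the raw material you can hand to Claude (or script yourself)
-- to generate CREATE TABLE and stored procedure scripts for the target platform.
SELECT
    TABLE_SCHEMA,
    TABLE_NAME,
    COLUMN_NAME,
    DATA_TYPE,
    CHARACTER_MAXIMUM_LENGTH,
    NUMERIC_PRECISION,
    NUMERIC_SCALE,
    IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'                                   -- placeholder schema
  AND TABLE_NAME IN ('patient', 'episode', 'appointment')    -- tables behind the dashboard
ORDER BY TABLE_NAME, ORDINAL_POSITION;
```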
Step 2: Validate the Legacy Query in Your Data Platform
Rebuild the legacy query you’re migrating away from in your data platform. Don’t jump right into building the data model just because you’ve checked the table counts and done some quick data quality checks on the pipeline.
The reality is that timezone offsets, case sensitivity, and other data type quirks can cause slight differences from the source. The best baseline is to get the legacy query matching in the new environment; doing so also forces you to become intimately familiar with the logic.
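As a rough sketch of what that validation can look like (the table and column names here are hypothetical), a pair of EXCEPT queries will surface any rows where the rebuilt query drifts from the legacy baseline:

```sql
-- Diff the legacy results against the rebuilt query at the same grain.
-- Any rows returned are discrepancies to investigate.
SELECT report_week, patient_count
FROM staging.legacy_patient_report        -- legacy numbers, loaded as a baseline
EXCEPT
SELECT report_week, patient_count
FROM staging.rebuilt_patient_report;      -- same query rebuilt in the new platform

-- Run the comparison in the other direction as well,
-- so rows introduced by the rebuild are caught too.
SELECT report_week, patient_count
FROM staging.rebuilt_patient_report
EXCEPT
SELECT report_week, patient_count
FROM staging.legacy_patient_report;
```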
Real example: I was recently migrating a patient report on a telehealth engagement. Table counts matched. Quick spot-checks looked fine. But when I lined the new query up against the legacy numbers, episodes were landing on the wrong week.
The culprit? The legacy SQL was implicitly converting UTC timestamps to the client’s local timezone before applying the week boundary. The new environment wasn’t. A handful of episodes near the Sunday-Monday cutoff were flipping into the wrong reporting week and throwing off the totals.
The fix was straightforward once I found it—AT TIME ZONE with an org-to-client or person-to-client timezone join. But here’s the point: if I had skipped this validation step and gone straight into building FACT and DIMENSION tables, that timezone drift would have been baked into the entire star schema. Every dashboard built on top of it would have been wrong.
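For context, the general pattern of that fix looks roughly like the sketch below. The table and column names are illustrative rather than the client’s actual schema, and DATETRUNC assumes a Fabric or SQL Server 2022+ environment.

```sql
-- Interpret the stored timestamp as UTC, shift it to the client's local timezone
-- via an org-level mapping table, and only then compute the reporting week.
SELECT
    e.episode_id,
    DATETRUNC(
        week,
        e.created_at_utc
            AT TIME ZONE 'UTC'                 -- declare the stored value as UTC
            AT TIME ZONE tz.windows_tz_name    -- e.g., 'Central Standard Time'
    ) AS report_week
FROM dbo.episode AS e
JOIN dbo.organization_timezone AS tz           -- org-to-client timezone mapping
    ON tz.organization_id = e.organization_id;
```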
Validating the legacy query first is the most critical foundation to build in a healthcare data migration.
Step 3: Build the STAR Schema(s)
After the source legacy query is validated, NOW is when you can build the formal FACT and DIMENSION tables for a STAR schema. This is the meat of data modeling.
Leverage Claude to help you weigh the modeling options, but I’ve found it always points me back to a dimensional model when we’re working with Power BI.
I literally prompt Claude to “Think like a combination of Ralph Kimball, Bill Inmon, and Joe Reis for constructing this data model” before I get into it. It’s been mind-blowing at helping me work through grain decisions and FACT table design for my clients.
Don’t rely entirely on it, though. A successful data modeling experience with AI depends just as much on you, the data practitioner. It turns out years of experience are the key to a robust data model; they’re what allow me to challenge Claude when it suggests creating a dimension table that actually represents a fact. I give that feedback to Claude and it continues to be a great thinking partner during this critical exercise.
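To make the shape of this concrete, here is a deliberately stripped-down sketch for a telehealth workload: one fact table at the grain of a single episode, surrounded by conformed dimensions. The tables and columns are illustrative, not a client’s actual model.

```sql
-- Illustrative star schema: one row in fact_episode per telehealth episode.
CREATE TABLE dim_patient (
    patient_key         BIGINT      NOT NULL,   -- surrogate key
    patient_source_id   VARCHAR(50) NOT NULL,   -- natural key from the OLTP system
    birth_year          INT,
    state_code          CHAR(2)
);

CREATE TABLE dim_date (
    date_key            INT         NOT NULL,   -- e.g., 20250106
    calendar_date       DATE        NOT NULL,
    report_week_start   DATE        NOT NULL    -- pre-computed week boundary, client-local
);

CREATE TABLE fact_episode (
    episode_key               BIGINT  NOT NULL,
    patient_key               BIGINT  NOT NULL,   -- references dim_patient
    organization_key          BIGINT  NOT NULL,   -- references dim_organization (not shown)
    episode_date_key          INT     NOT NULL,   -- references dim_date, already client-local
    episode_duration_minutes  INT,
    completed_flag            BIT     NOT NULL
);
```

Notice that the timezone conversion happens once, upstream, when episode_date_key is assigned, so every downstream consumer inherits the correct week boundary.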
Step 4: Reproduce the Validated Legacy Numbers Using Your New FACT and DIMENSION Tables
If you took proper care modeling grain and logic, your “GOLD” layer tables (if using a medallion architecture) should produce the same results as the legacy query, except with far fewer lines of code and joins.
Once the numbers tie out, congrats! You can take this query and start building in the BI tool of your client’s choice. For us, it’s Power BI.
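Continuing the illustrative schema sketched above, the gold-layer version of that weekly patient report ends up looking something like this:

```sql
-- The timezone logic and groupings now live in the fact and dimension tables,
-- so the report query itself stays short and readable.
SELECT
    d.report_week_start            AS report_week,
    COUNT(DISTINCT f.patient_key)  AS patient_count
FROM fact_episode AS f
JOIN dim_date     AS d
    ON d.date_key = f.episode_date_key
WHERE f.completed_flag = 1
GROUP BY d.report_week_start
ORDER BY d.report_week_start;
```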
I then assign a ticket to my Power BI engineer, who builds that Gold query in Power BI and validates the visualization against the legacy dashboard.
There’s always a consideration here as to how much logic you want built into the backend Gold query itself vs. Power BI DAX and the semantic model. Oftentimes, our clients prefer the logic be baked directly into the data warehouse tables, since healthcare data is used in far more places than just Power BI.
Healthcare data demands versatility, so if key logic is dependent on the BI tool, it restricts where else we can use this data. Food for thought as you consider these options and patterns with your clients.
The Non-Negotiable: Constant Communication
Perhaps the biggest key to success is constant communication. We typically have weekly updates to cover progress made and where we’re at against the timeline set. If I see deliverables slipping, I make that known right away.
In my opinion, you should ping your stakeholders more than once a week. A progress update every few days builds continued trust and allows for pivots and iterations sooner.
For example, at one of my telehealth clients we uncovered the need to have all major logic live in the FACT and DIMENSION tables themselves, rather than in Power BI DAX. That only came to light in our routine weekly update meeting, through collaborative conversations with the CTO. Otherwise, we would’ve spent a whole lot more time modeling the custom logic in DAX rather than in the data warehouse SQL itself.
The Bottom Line
There you have it: my direct experience migrating healthcare and healthtech companies to a cloud-native OLAP data platform with robust data models and management in place. I make this sound streamlined. Trust me, it isn’t. And it’s taken me years to move at the pace we go now.
If there are two things to take away, they’re managing expectations and communicating constantly with your stakeholders; beyond that, the key to successful migrations always comes back to the relationship you have with your clients.
Just stay transparent, stay consistent, and ALWAYS FOLLOW THROUGH—and the data migration will fall into place.
I hope this piece relieves a little stress for you. I know writing it served as a good reminder for even myself.
Until next time, see you in the next issue of Rooftop Insights. Thank you!
Christian Steinert is the founder of Steinert Analytics, helping healthcare organizations turn data into actionable insights. Subscribe to Rooftop Insights for weekly perspectives on analytics and business intelligence in these industries.
Also - check out our free Healthcare Analytics Playbook eBook here.