How Steinert Analytics Saved a Healthtech Company 25 Hours a Week
- Christian Steinert
- May 14
- 4 min read
Updated: May 20
An in-depth look at our success as a data engineering team for healthcare companies
Data Infrastructure in Healthcare Is a Mess — Here’s How We Fixed It
When a fast-scaling Central Ohio-based healthtech company reached out in Fall 2023, they were in a bind.
Their only data engineer had just put in his notice. There was no replacement on deck. Yet their operations, sales, finance, and leadership teams depended on a fragile web of scripts, queries, and manual processes to function every day.

As a telehealth company focused on patient engagement, outcomes, and remote monitoring, this wasn’t just a technical inconvenience. It was an operational risk — and a compliance risk.
They needed a bridge, fast.
Steinert Analytics stepped in with one goal: stabilize and mature their data operations to carry them through a critical growth period, and leave the organization stronger than we found it.
Here’s how we did it — and the measurable impact we made.
Step 1: Bring Order to the Chaos
The company’s data stack centered around an EHR integration tool that aggregated patient and performance data from multiple source systems — including EMRs. The tool was useful, but the data output was dirty.
All the raw data was dumped into massive, generalized tables. No modeling. No governance. Just tangled, semi-structured data across thousands of rows, with inconsistent identifiers, null values, and incompatible formats.
Dozens of brittle SQL scripts were being used to make sense of the mess. They were slow, inefficient, and error-prone.
Our first step was to rewrite and refactor over 30 core SQL queries — improving execution time by more than 60% and standardizing them with naming conventions, modular logic, and consistent filters. This instantly reduced the troubleshooting burden across the organization and made onboarding future data engineers significantly easier.
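To make "modular logic and consistent filters" concrete, here is a minimal, hypothetical sketch of the pattern in Python: a shared filter is defined once as a named CTE and reused across report queries, instead of being copy-pasted with slight variations into every script. The table and column names are illustrative, not the client's actual schema.

```python
# Hypothetical illustration of the refactor pattern: shared, named building
# blocks instead of duplicated ad-hoc filters. All names are made up.
ACTIVE_PATIENTS_CTE = """active_patients AS (
    SELECT patient_id, practice_id, enrolled_at
    FROM raw_patient_feed
    WHERE discharge_date IS NULL
      AND patient_id IS NOT NULL
)"""

def monthly_engagement_query(year: int, month: int) -> str:
    """Compose a report query from the shared CTE rather than re-pasting filters."""
    return f"""
WITH {ACTIVE_PATIENTS_CTE}
SELECT p.practice_id,
       COUNT(DISTINCT e.patient_id) AS engaged_patients
FROM active_patients AS p
JOIN raw_engagement_events AS e
  ON e.patient_id = p.patient_id
WHERE e.event_month = '{year:04d}-{month:02d}'
GROUP BY p.practice_id
"""

if __name__ == "__main__":
    print(monthly_engagement_query(2023, 11))
```

When the same filter lives in one place, a definition change (say, how an "active" patient is counted) propagates to every report at once, which is where most of the debugging time savings came from.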
Outcome:
60% faster runtime on core SQL pipelines
70% less time spent debugging broken logic
Set foundation for scalable, repeatable workflows
Step 2: Automate Manual Data Cleaning
The company was spending hours every week downloading CSV files, manually spot-checking them, removing duplicate rows, renaming columns, and formatting them for ingestion into their proprietary data warehouse.
This process took anywhere from 15 to 20 hours a week.
We replaced this entirely with a suite of Python scripts that handled validation, transformation, and reformatting based on dynamic schema definitions.
These scripts could be run with a single command — no more dragging and dropping files or manually opening spreadsheets.
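As a rough illustration of the schema-driven approach, here is a minimal sketch in Python using pandas. The schema format, column names, and file paths are assumptions for the example, not the client's actual configuration; the real scripts also handled reformatting for warehouse ingestion.

```python
# Minimal sketch of schema-driven CSV cleaning (illustrative only).
# Column names, the schema format, and paths are assumptions.
import sys
import pandas as pd

# "Dynamic schema definition": source column -> (target name, dtype)
SCHEMA = {
    "Patient ID": ("patient_id", "string"),
    "Visit Date": ("visit_date", "datetime64[ns]"),
    "Reading":    ("reading_value", "float64"),
}

def clean_csv(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)

    # Validate: fail loudly if an expected column is missing.
    missing = [c for c in SCHEMA if c not in df.columns]
    if missing:
        raise ValueError(f"{path}: missing columns {missing}")

    # Rename, cast, then drop exact duplicates and rows without an identifier.
    df = df[list(SCHEMA)].rename(columns={c: t[0] for c, t in SCHEMA.items()})
    for _, (target, dtype) in SCHEMA.items():
        if dtype.startswith("datetime"):
            df[target] = pd.to_datetime(df[target], errors="coerce")
        else:
            df[target] = df[target].astype(dtype)
    return df.drop_duplicates().dropna(subset=["patient_id"])

if __name__ == "__main__":
    # The "single command": python clean_csv.py exports/readings.csv
    print(clean_csv(sys.argv[1]).head())
```

The point of the pattern is that validation and cleanup are driven by a declarative schema rather than by someone opening a spreadsheet and eyeballing it.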
Outcome:
15–20 hours/week saved across engineering and operations
Dramatic reduction in human error
Reusable codebase for future use cases
Step 3: Secure the Last Mile (Distribution)
Even once data was cleaned, it was still being shared manually. Stakeholders — from sales to marketing to finance — were waiting on CSVs sent by email or Teams.
We built secure Python-based automation to handle email distribution at scale, ensuring:
The right files went to the right people
No PHI or HIPAA-sensitive data was sent by mistake
A full audit trail was captured for compliance
With scheduling and distribution now automated, operational leaders got the insights they needed faster — without relying on a human bottleneck.
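For readers curious what such a guardrail can look like, here is a heavily simplified sketch: recipients are resolved from an approved routing table, a naive header check blocks files with PHI-looking columns, and every send is appended to an audit log. The routing table, PHI column names, and SMTP settings are assumptions for illustration, not the production implementation.

```python
# Illustrative sketch of guarded report distribution (not the production code).
# Routing table, PHI column list, and SMTP details are assumptions.
import csv
import json
import smtplib
from datetime import datetime, timezone
from email.message import EmailMessage

ROUTING = {  # report file name -> approved recipients
    "weekly_engagement.csv": ["ops-team@example.com"],
    "finance_summary.csv": ["finance@example.com"],
}
PHI_COLUMNS = {"patient_name", "dob", "ssn", "address"}  # must never be emailed

def has_phi(path: str) -> bool:
    with open(path, newline="") as f:
        header = next(csv.reader(f), [])
    return bool(PHI_COLUMNS & {h.strip().lower() for h in header})

def send_report(path: str) -> None:
    name = path.rsplit("/", 1)[-1]
    if name not in ROUTING:
        raise ValueError(f"No approved recipients for {name}")
    if has_phi(path):
        raise ValueError(f"{name} appears to contain PHI columns; send blocked")

    msg = EmailMessage()
    msg["Subject"] = f"Automated report: {name}"
    msg["From"] = "reports@example.com"
    msg["To"] = ", ".join(ROUTING[name])
    with open(path, "rb") as f:
        msg.add_attachment(f.read(), maintype="text", subtype="csv", filename=name)

    with smtplib.SMTP("localhost") as smtp:  # assumed relay for the sketch
        smtp.send_message(msg)

    # Append-only audit trail for compliance review.
    with open("distribution_audit.jsonl", "a") as log:
        log.write(json.dumps({
            "file": name,
            "recipients": ROUTING[name],
            "sent_at": datetime.now(timezone.utc).isoformat(),
        }) + "\n")
```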
Outcome:
Eliminated 8–10 hours/week of administrative overhead
Improved security, transparency, and consistency
Step 4: Make EMRs Work for the Sales Team
Before we came in, the sales and account executives were manually requesting and wrangling EMR data from doctors' offices just to prepare for meetings or performance reviews.
It was time-consuming, awkward, and often delayed by admin back-and-forth.
We logged into their source EMR platforms and built custom reports and views that allowed sales staff to click one button and export the exact data they needed — no more playing middleman between office admins and the internal team.
Outcome:
Reduced EMR data access time by 90%
Empowered sales team to self-serve
Unlocked new opportunities to act on performance trends faster
Step 5: Train the Next Engineer
Once the company was ready to hire a full-time data engineer, our job shifted from doer to enabler.
We spent several weeks documenting every pipeline, SQL convention, and Python script. Then, we trained the incoming engineer on:
How to query the middleware platform effectively
How to run and modify the ETL scripts
How to manage and update Tableau dashboards
How to safely distribute data across the org
By the time we stepped away, the new engineer had full ownership and confidence.
Outcome:
Zero disruption during engineer transition
New team member onboarded with full systems knowledge
Sustainable workflows that didn’t rely on tribal knowledge
The Takeaway: Infrastructure Is Your Force Multiplier
What this healthtech company experienced is common.
You start small. You duct-tape some queries together. You lean on one key person to handle the data flow. Then you grow — and suddenly that “good enough” system becomes a liability.
But with the right partner, stabilizing and upgrading your data ops doesn’t have to mean starting from scratch or building a massive data team.
In just a few months, Steinert Analytics helped:
Save 25+ hours per week through automation
Improve data security and HIPAA compliance posture
Speed up insights for sales, marketing, and leadership
Leave behind sustainable systems that outlive any one person
Results in Brief
60% faster runtimes on core SQL pipelines, with 70% less time spent debugging
15–20 hours/week of manual CSV cleaning automated away
8–10 hours/week of distribution overhead eliminated, with a full audit trail
90% reduction in EMR data access time for the sales team
Zero disruption during the handoff to a full-time data engineer
If you’re in healthtech and your reporting feels duct-taped together — or your infrastructure depends on one or two people — we can help.
At Steinert Analytics, we specialize in building modern, maintainable, and secure data platforms for healthcare orgs that are scaling fast.
Ready to get your data operations under control?
P.S. Shout out to Logan Colyer, Lead Data Engineer at Steinert Analytics, for all the great work here.