When this blog post gets published, my son will be elbow-deep in a recipe for homemade bow tie pasta with tomato cream sauce. He’s taking over our kitchen this week as part of a virtual cooking camp, which is a ploy to keep him busy – and our family fed – over his winter break.
Clever – yes. Delicious – let’s hope. But why are we talking about it here?
As I reviewed the list of ingredients he’d need for the week, I made some substitutions. Fresh basil. Nope, we’ve got dried. Semolina flour. We can manage with all-purpose. Yet with each replacement, I was keenly aware of the tradeoffs we’d be making. The quality of our inputs (ingredients) would have a direct impact on the quality of the output (dinner). Which brings me to IMO’s new white paper, The impact of data quality on healthcare: Ongoing issues surrounding patient data are costing providers time and money.
A few months ago, IMO and HIMSS conducted a survey of clinical, business, and IT personnel at various US hospitals and health systems. We were looking to understand how patient data is being used in decision-making analytics and how challenges with data quality are getting in the way of achieving enterprise goals. The research was revealing.
- 57% reported that data is inconsistent due to subjective documentation from providers.
- 50% said that data derived or extracted from external sources is variable in accuracy and completeness.
- Despite these issues (and others), organizations are using patient data for a range of initiatives, including quality measurement and reporting (81%), revenue cycle management (60%), and clinical decision support (55%).
In short, important decisions are being made based on the analysis of data that is often inconsistent, inaccurate, or incomplete. And not surprisingly, 95% of respondents said they experienced negative outcomes as a result of patient data quality challenges in the last 12 months.
In culinary terms, the ingredients aren’t always what they should be, but the kitchen remains open for business.