This doodle came to me while I was contemplating the time value of money – that is, the idea that a sum of money is worth more now than the same sum will be at a future date because of its earnings potential in the interim. It struck me that the same basic concept applies to data.
As a general principle, the value of data decays over time, and that’s especially true when the temporal dimension of decision-making is measured in seconds, minutes, and hours. Formula 1 drivers need to know their speed and fuel right now, not five minutes ago. And therein lies another data principle – the frequency at which you collect data should be matched to the temporal dimension of the decision-making.
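One way to make the analogy concrete is to borrow the discounting math directly. The sketch below is purely illustrative – the exponential form, the `decay_rate`, and the five-second half-life are assumptions chosen for the Formula 1 example, not a model from the article:

```python
import math

def present_value_of_data(initial_value: float, decay_rate: float, delay: float) -> float:
    """Illustrative model: data value decays exponentially with the delay
    (in seconds) between the event and the action taken on it.
    Mirrors continuous discounting: PV = V0 * e^(-r * t)."""
    return initial_value * math.exp(-decay_rate * delay)

# Hypothetical: a telemetry reading whose value halves every 5 seconds.
half_life = 5.0
rate = math.log(2) / half_life

fresh = present_value_of_data(100.0, rate, delay=0)    # full value: 100.0
stale = present_value_of_data(100.0, rate, delay=30)   # six half-lives later: ~1.6
```

Under these assumptions, a 30-second delay leaves less than 2% of the reading's original value – which is the intuition behind matching collection frequency to the decision cadence.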
Now, back to the doodle.
As data strategists, we should constantly attempt to compress the curve – that is, the time that elapses between the business or clinical event of interest, the capture of data about that event, and the action taken based on that data. As anyone who has worked in data and analytics has experienced, it takes a painful amount of time to pre-process, curate, normalize, and model data in preparation for analysis. At IMO, our team is focused on standardizing terminology and data quality at the front end of clinical care in order to minimize the need for downstream data curation and normalization.
Since data is often collected with less regard for standards and quality at the front end, we are developing a normalization engine – IMO Precision Normalize – that supports human data engineers with artificial intelligence, machine learning, and natural language processing. Our goal is to minimize the human labor and data latency in the normalization phase of the data lifecycle while also improving the quality of the output of that normalization process. In other words, as this doodle depicts, we are compressing the curve and raising the time value of data, taking a proprietary but very common-sense approach to the problem.