Bad data can kill. Literally.
I’m not talking about the impact of bad data on the bottom line of business, where estimates of total losses typically run into the hundreds of billions of dollars per year. Nor am I talking about the everyday inconvenience of dealing with a real-world situation caused by incorrect or stale data. The scenario I’m describing is far more harmful than even the emotional crash landing of watching years of your own work destroyed by a unit of measure error.
Bad data can accidentally end a human life.
Just a few years ago, there was a major news story about actor Dennis Quaid’s then-newborn twins. The Quaid infants, along with one other newborn, were accidentally given 1,000 times the normal dose of the anticoagulant heparin because of a data error in the unit of measure. Fortunately, none of the children suffered any lasting harm, but it’s a very small leap of the imagination to picture the same mistake with a more dangerous concoction; even the most benign drug administered in massive doses can be harmful or fatal. Healthcare is becoming increasingly automated, which often means better and faster care, but it also means fewer opportunities for manual checks to confirm that what is administered is what was actually intended. That increases the opportunity for a negative outcome when bad data is present.
Healthcare is not the only vertical where this could happen. Earlier this year, a demolition crew knocked down the wrong house in a neighborhood recently hit by a tornado. The error was blamed on a glitch in Google Maps, which directed the crew to the wrong physical location for the address. Although no one was injured, it’s not hard to imagine what could have happened if the family – who were busy rehabbing their tornado-damaged home – had been inside the residence at the time.
The examples above are all cases where a person could be accidentally killed because of bad data. What about applications where bad data in a device intended to kill leads to killing the wrong person? It’s a sad reality that we share the world with some very bad people, and a too-frequent conclusion to interactions with those people is the use of deadly force. Military and law enforcement applications rely heavily on technology to guide that deadly force, but like any other system, they are only as reliable as the data fed to them. Though not heavily publicized, numerous stories have surfaced of strikes carried out against the wrong targets. While I’m not going to debate the morality of automated smart bombs and the like in this post, I will use the scenario to emphasize how critical good data and good data processes are to ensuring that no one is mistakenly harmed because of incorrect information.
We Are All on the Front Lines
It’s easy for any of us to fall into the trap of thinking that our jobs are just about data. For some of us, that is true. However, most of us touch systems that could, directly or indirectly, harm a person. I remember the first time I heard this, and how ridiculous it sounded. I was working for a school district at the time and attending college classes after hours. A technical writing professor told us on the second day of class, “When writing technical documentation, you could kill someone.” I actually laughed out loud, thinking he was making a joke. But his background was in robotics, where he wrote user manuals for robots used to disarm bombs and mines. He was absolutely correct – as ludicrous as it sounds, under the right circumstances a misplaced adjective or modifier in a user manual could lead to injury or even death.
When I left my school district job to go to work for a hospital, my role as patient protector quickly came into focus. I’ve never written a prescription, never given an injection, and never treated anything more serious than a minor scratch. However, the people who do all of those things rely on the systems that I write and the data I deliver. To that end, I have as much responsibility as any physician or nurse to ensure that I’m doing my job properly.
The short story is this: Don’t think for a second that you’re “just” a DBA, developer, or ETL architect. The system you touch could help or hurt people, and it’s up to you – and all of us – to keep watch over those who could be impacted. Never forget that bad processes or bad data can kill.