It’s all too familiar: Big Data comes at a “Big Cost”. Big Data has the potential (and in some cases the track record) to be a catalyst for increasingly efficient, higher-quality healthcare – let’s call it “enlightened healthcare”. The demand for enlightened healthcare exists across thousands of providers regardless of setting or size, so making these Big Data tools affordable is essential to its spread. Yet today, health systems are spending a fortune on Big Data architecture and intelligence. Some can afford it and some cannot.
Despite the hundreds of millions of dollars health systems have invested to build a Big Data industry segment, the average hospital still does not get actionable ROI from Big Data. And they certainly are not ready to add artificial intelligence into the milieu. The competent generalists on staff at the average health system have not proven they can quickly deploy the right amount of technology for the challenge, nor that they can consistently organize and drive internal performance improvement projects. The results of these efforts are all too often committees, multiple vendor evaluations, 18-month projects and seemingly endless debate.
Better decision support soon becomes intertwined with population health and other macro-initiatives. Meanwhile patients, real people with problems, are being run through care processes with too much variation and sub-optimal results. And care providers, real people solving problems, trudge along making the best decisions they can without the full decision support their organization is already capable of delivering. What’s needed is technology that disrupts the “$3M and 3-year data warehouse” paradigm. What’s needed is an advanced but adoptable means of real-time alerting and retrospective analysis. What’s needed is a tool that can take advantage of the data already in the hospital’s systems and can analyze structured AND unstructured data for opportunity.
Instead of the prevalent “Big Data” model, why not approach this from a lean perspective? Why not address the 5 clinical workflows the hospital already knows need improvement? Why not produce the just-in-time data warehouse that is sufficient to go after those problems? Why not buy just enough technology to get that ROI, then build on the success? Can that technology please be implemented in 100 days and cost less than six figures? We need a little approach to Big Data.
For example, incidental findings from outpatient medical imaging are a persistent problem. Radiologists do their jobs and document the further follow-up needed, yet those findings often sit unread, waiting for manual handling, simply because the language around the finding is unstructured. Fixing this “miss” has repeatedly been shown to lead to early-stage cancer detection. How much more of a call to action is required? And yet the fix, using readily and economically available technology, is often buried in larger efforts around Big Data. So patients wait and health systems underperform.
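The core technical idea here, pulling a structured signal out of free-text radiology reports, can be sketched with a simple phrase scan. This is only a minimal illustration under assumed inputs: the phrase list and the `flag_report` function are hypothetical, and a production tool would rely on far more sophisticated natural-language processing than keyword matching.

```python
import re

# Hypothetical phrases that often signal a follow-up recommendation in
# free-text radiology reports. A real system would use trained NLP models;
# this regex scan only illustrates the concept.
FOLLOW_UP_PATTERNS = [
    r"recommend(?:ed)?\s+(?:follow[- ]?up|repeat)\b",
    r"follow[- ]?up\s+(?:CT|MRI|imaging|study)",
    r"further\s+evaluation\s+(?:is\s+)?(?:recommended|suggested|advised)",
    r"incidental\b.*?\b(?:nodule|mass|lesion)",
]

def flag_report(report_text: str) -> list:
    """Return the follow-up phrases found in an unstructured report."""
    hits = []
    for pattern in FOLLOW_UP_PATTERNS:
        match = re.search(pattern, report_text, flags=re.IGNORECASE)
        if match:
            hits.append(match.group(0))
    return hits

report = (
    "Incidental 8 mm pulmonary nodule in the right upper lobe. "
    "Recommend follow-up CT in 6 months."
)
print(flag_report(report))
```

A report that trips any pattern would be routed to a worklist for human review rather than left to manual discovery; the point is that even modest technology can turn buried free text into an actionable alert.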
If you have read this far, it is certainly self-serving for me to now state that we have such a tool at iWT health called NOTIFI. But that makes my point no less valid. NOTIFI was developed to solve the incidental-findings problem described above, and it has been applied to that problem for several years now. As the failure of the Big Data approach became clear to iWT health, NOTIFI was enhanced and repurposed to solve other clinical workflow problems as well. Regardless of whether you adopt NOTIFI or some other tool that works, health systems should reconsider the status quo approach. The “$3M and 3-year data warehouse” option is not working for most, and in fact Big Data becomes an excuse for delays, failures and wasted resources. Go “little” on Big Data.