One item addressed here is cost, and I don’t think anyone has been able to miss over the last few years that for every efficiency we gain in healthcare with technology, it spurs yet another layer of technology or IT investment; it’s the beast we live with today. Thus all those studies and reports you saw about billions, millions, and trillions to be saved in healthcare through efficiencies were false. You still see them out there today, and anyone in technology knows that all these projections by economists and others won’t hold water, as there’s always the unexpected or a Black Swan event that will make huge changes almost instantly. When it comes to all these wearable devices you have to think not only about the cost, but about what I said below, “do people work this way,” as we are seeing software model after software model begin to fail, some of them even before they get off the ground.
“People Don’t Work That Way” A World of Broken Software Models That Don’t Align To the Human Side, Too Much Push At Times With Only A Proof of Concept That Fails in the Real World… - Medical Quack
We are starting to see the wear and tear, and there’s nothing more glaring and in our face than the VA right now, as we have become overdependent on stats and reports. Sure we need them, but how many, and where, and let’s not go into overload either. We almost go into the dog chasing its tail at times.
VA Crisis Just The Tip of the Iceberg As US Needs a Full On Healthcare Culture Change Everywhere To Get Back In Touch With the Real World of Patients…
Let’s face it, when you need healthcare would you rather look a human or an algorithm in the face? Sure, that’s a bit of a stretch, but we’re not far from it. Another good point made here is that it’s not so much the treatment avenues, it’s the billing and cost that’s up front here too. We just had that this week with WellPoint and their new answer for cancer treatments. Do you want your care based on what an oncologist gets paid by the insurer to use only their regimens? That is exactly what we have here.
WellPoint Begins New Oncology Program For Providers Offering Doctors $350 Monthly Payment For Each Patient Treated Using Insurers Recommendations - Is This A Kickback Offer?
The nurses work with these systems every day and live it, so they deserve some attention for sure. Humans will add ethics to their decisions as well, and machines can’t do this with decision making. Again, as I mentioned above, the data model may show you one thing, but ethics added to the data may bring about a different decision, so the human doctor still needs to be in the picture. In just doing this blog I have seen a huge deterioration of ethics all over the place, and it’s getting worse. One example was just the language used when two IPAs merged with United: “inventory was transferred”…so doctors and patients are now just inventory? See what I mean about ethics, and those were just words; it gets worse. Here’s a bit of a scientific write-up where someone really tried to explore giving machines ethics; it can’t be done.
Limitations And Risks Of Machine Ethics (That Really Don’t Exist) - Abstract Basically Substantiating the Existence of What I Coined As “The Attacks of the Killer Algorithms”…
When it comes to models working or not working, look at this video below. Do you want an end-of-life robot? I’d rather have someone shoot me in the head than have to be put through this, but again we have folks living in virtual values today who get the real world and virtual values confused. The person who created this is certainly out there in a virtual world. This is the worst of it, but people will create unethical garbage like this if we leave the humans out. Again, think of the VA and what occurred there: everyone initially interviewed was just a stat rat, as they were so brainwashed. That is also why I created “The Attack of the Killer Algorithms” page with videos that help you see what’s going on around you so you are aware.
So again, nurses make some good “human real world” points: don’t lose our ethics by any means, and what in the heck is all of this going to cost? Sadly we have resorted to a bit of a sick way to rationalize who gets what, and that is “scoring,” and the World Privacy Forum is all over how this denies access; it’s proprietary, we don’t know the formulas, models, etc., and we just have to believe that some of the junk science is true? I hope not, as it’s getting worse with “The Grays.”
World Privacy Forum Report - The Scoring of America: How Secret Consumer Scores Threaten Your Privacy and Your Future - One Big Element that Fuels the Continued Attack of Killer Algorithms & Demise of the Middle Class Creating Profiteering And/Or Denial of Access
Again, for all the efficiencies we gain, we lose any savings to some form of software or medtech, and I’m not saying it’s all bad, because it’s not, but how we use it and how we balance it is what’s important. Is that the future of the kind of care we get…algorithms with no ethics? Something to think about. BD
“What this technology does is generate profits for healthcare corporations because they standardize based on this model of care that’s based on the factory floor. You treat everybody like a Model T Ford,” Deborah Burger, a registered nurse and co-president of NNU, told me over the phone.
By speeding up the provision of care with algorithms, private hospitals can serve more patients in a day—and charge them accordingly. Apache III collects patients’ information and sends it right to the hospital’s billing department. “It’s actually a billing mechanism more than it is a treatment protocol,” Burger said.
Algorithms that can analyze symptoms and spit out a diagnosis favor efficiency over proper care, according to a recent campaign by National Nurses United (NNU). The union claims that automated diagnosis systems lack the individualized care a nurse can provide and mainly allow private hospitals to boost their bottom line.
Healthcare professionals have worked for years to develop diagnostic algorithms—including early methods like Apache III and SAPS III, as well as more advanced clinical decision support systems—which are used to help determine how patients are treated.
They compare patient symptoms to a base dataset—Apache III's was culled from over seventeen thousand ICU patients—to determine things like mortality probability and whether a patient should remain in intensive care or be moved. It may be a bit impersonal and morbid, but it’s efficient. Instead of improving the quality of healthcare, however, NNU sees these algorithms as eroding it.
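To make the mechanism above concrete, here is a minimal toy sketch of how a severity-scoring system of this general shape works: crude point values for a few vitals are summed into a score, and the score is pushed through a logistic curve to get a mortality probability. Every point value, threshold, and coefficient below is made up for illustration; these are not the real APACHE III rubric or weights, which were fitted to that large ICU dataset.

```python
import math

# Toy illustration only -- the point values, thresholds, and logistic
# parameters below are invented, NOT the real APACHE III coefficients.

def toy_severity_score(age, heart_rate, mean_bp):
    """Sum crude point values for a few vitals (hypothetical rubric)."""
    score = 0
    if age >= 75:
        score += 17          # advanced age adds points
    elif age >= 65:
        score += 11
    if heart_rate >= 140 or heart_rate < 50:
        score += 8           # abnormal heart rate adds points
    if mean_bp < 60:
        score += 15          # dangerously low blood pressure adds points
    return score

def toy_mortality_probability(score, intercept=-4.0, slope=0.05):
    """Map the score to a probability with a logistic curve (made-up parameters)."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

if __name__ == "__main__":
    s = toy_severity_score(age=78, heart_rate=145, mean_bp=55)
    p = toy_mortality_probability(s)
    print(f"score={s}, estimated mortality probability={p:.2f}")
```

The point of the sketch is the nurses’ point: the output is entirely determined by whatever rubric and fitted parameters the vendor chose, with no room for the individualized judgment a clinician would add.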