Hospitals Around the World Are Struggling in the Aftermath of the Great IT Meltdown

“If one of those computers is affected, suddenly all of your sterilization procedures have to slow down or even stop, and then operations stop,” he says.

With large healthcare systems employing thousands of personnel and looking after vast numbers of patients—last year Michigan Medicine had more than 2.7 million outpatient visits—modern healthcare has become reliant on digitization as a matter of necessity, from systems that relay communications between busy departments to electronic medical records, or EMRs, which store vital information about individual patients.

But in recent years, concerning reports have emerged about the potential consequences of those systems breaking down. Studies have shown that during electronic medical record downtime, laboratory test turnaround times increase by an average of 62 percent compared with normal operations, while in the NHS, IT failings have been directly linked to cases of patient harm.

In April, Sofia Mettler, at the time a resident physician at Mount Auburn Hospital, published a paper in JAMA Internal Medicine in which she described a day when the hospital’s EMR system was down for a period of seven to eight hours. The disruption meant that samples for morning lab tests could not be collected because the phlebotomy team did not know which patients needed which tests, while the results of tests conducted before the downtime could not be disseminated, making it harder to assess overnight progress.

Mettler, now a pulmonary and critical care fellow at Brigham and Women’s Hospital, says that experience pales in comparison to the consequences of the CrowdStrike outage.

“This time, the extent of the system downtime is way more profound,” she says. “We are currently unable to use any software that relies on digital data transmission. For example, we are unable to review CT scans, because the radiology software is down as well. It is difficult to make clinical decisions without access to what has become the essential part of medicine. We are using bedside ultrasound machines, but it is not nearly as good as CT scans in telling us what is going on in the lungs.”

Dean Sittig, a professor of biomedical informatics at the University of Texas Health Science Center at Houston, says that in the event of such incidents, hospitals are supposed to have paper backup systems and to ensure that vital devices such as IV pumps, blood pressure monitors, and ventilators that run on the internal network are isolated from the internet. However, this doesn’t always happen. “Every hospital has fire drills, but they should have things like downtime drills as well, where they turn off the computer and make sure that everything still functions,” he says.

According to Sittig, there are many reasons why computer failures can lead to patient safety issues, such as delays in prescribing certain medications. However, some of the biggest problems are subtler, such as a shortage of manpower. When a healthcare center has to rely on lab test results being passed on by hand, discharging patients can be delayed, meaning they stay in hospital for longer and become more vulnerable to contracting infections.
