I recently had a chance to read the article The Overdose: Harm in a Wired Hospital. If you have the time, it’s worth reading. This article brings up a number of important issues. I wanted to share my thoughts about those issues and the evolution of technology and human interaction in pharmacy.
As technology evolves, the role of the healthcare professional is changing. I am not entirely convinced that all of the changes are for the better. In fact, I believe the interface between humans, technology, and competency is potentially heading in the wrong direction! I base my observations on the following practice tenets:
1. Assume it’s wrong.
2. Who’s making the decision?
3. Make this screen go away!
4. Can I do this with my eyes closed?
5. It’s too big to fail?
Assume it’s wrong.
As a practicing pharmacist, one of my responsibilities is to enter physicians’ orders into the pharmacy electronic system. As I do this, I ask myself several questions with each order: What was the physician’s intent? Is the drug/dose appropriate? Are there clinical issues with this drug on the patient’s medication profile? Only when I can answer ALL of these questions will I enter the order. By doing so, I have taken ownership of that order.
It’s easy to become complacent and assume that my purpose is to simply enter the order. But there is a big difference between just entering an order and taking ownership of it. I have a colleague who, when training new pharmacists, tells them to always assume the order is incorrect until it can be proven otherwise.
Many times the pharmacist’s approach is to default to the physician’s judgment. This unspoken chain of command was certainly a factor in The Overdose. The author points out, “As is so often the case with medical mistakes, the human inclination to say, ‘It must be right’ can be powerful, especially for someone so low in the organizational hierarchy, for whom a decision to stop the line feels risky.” We may not be naturally inclined to question the physician’s orders, but sometimes it is imperative that we do so.
Additionally, human instinct is generally to look at something in a digital format and assume it’s correct. As the author of The Overdose points out, “humans have a bias toward trusting the computers, often more than they trust other humans, including themselves.” It takes extra effort to step back, ask questions, and actually take ownership of the order.
Who’s making the decision?
Where are we in healthcare IT today? We need an interface. We need to minimize keystrokes. It needs to be in the cloud. We want computer systems to talk to one another. We want to minimize human intervention because human intervention leads to errors.
In theory, I agree with all of these statements. But I believe we must ask, “When do we need human intervention?” The fatal error that occurred in The Overdose points directly to this issue. This error was not due to a knowledge deficit in either physician or pharmacist. Both knew what was appropriate for this patient, but they missed the mark. I think the key factor was humans taking a backseat and allowing the IT system to make the decision. Are we comfortable with that?
In an effort to eliminate interface issues, we are gradually pulling healthcare professionals out of the decision-making process. What role does that leave for us humans? I am certainly not against technology; in fact, I’m all for it! But I am concerned that we are going too far, too fast. Technology should enhance and guide our decisions, provide answers, and make our jobs easier. It should help to educate us and make us better at our jobs. It should not make decisions in place of us.
At the hospital in The Overdose, “They eliminated the step of the pharmacist checking on the robot, because the idea is you’re paying so much money because it’s so accurate.” We need to ensure that someone is still there to check on the robot. We need to retain human intervention.
Make this screen go away!
Anyone working in a modern hospital pharmacy has dealt with the complexity of most IT systems. Often, you know what you want to do but just can’t get it done. You sometimes find yourself in a maze that seems to have no exit. The medication order you are trying to enter becomes secondary to managing the system.
If you have ever taken the subway, you can probably relate. Recently, I traveled to Paris, France. My plan was to take a train from the Paris station to my next destination. I was told to take the Red line train, the A train. I knew the direction I wanted to go, and as the train approached the station, I hopped on. All was well.
Not so fast. The Red Line split and headed in two different directions. I was on the wrong Red train.
The analogy I draw with IT systems is the same: I know what I want to do, I know the train I want to take, but I just can’t seem to get there. Take, for example, a physician who enters an order for a PEDIATRIC patient when the drug also has an ADULT pathway. If the physician chooses the adult pathway, the dose rounding may be different: adult dose rounding could be to the nearest 10 mg instead of to the nearest 0.1 mg. Same drug, but with a very different outcome. In this situation, the responsibility to catch the error lands squarely on the pharmacist.
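To make the rounding difference concrete, here is a minimal sketch in Python. The function name and the specific rules (nearest 10 mg for the adult pathway, nearest 0.1 mg for pediatric) are illustrative assumptions taken from the example above, not any vendor’s actual logic.

```python
def round_dose(dose_mg, pathway):
    """Round a calculated dose per the selected order pathway.

    Hypothetical rules for illustration only: adult pathways round to
    the nearest 10 mg, pediatric pathways to the nearest 0.1 mg.
    """
    if pathway == "adult":
        return round(dose_mg / 10) * 10
    if pathway == "pediatric":
        return round(dose_mg, 1)
    raise ValueError("unknown pathway: " + pathway)

# The same calculated dose, e.g. 2.53 mg/kg for a 12 kg child = 30.36 mg:
calculated_mg = 30.36
print(round_dose(calculated_mg, "pediatric"))  # 30.4
print(round_dose(calculated_mg, "adult"))      # 30
```

Same drug, same patient, same calculated dose; only the pathway selection differs, and with other dose values the adult rule can shift a pediatric dose far more dramatically than this.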
As difficult as it is for pharmacists, I think the complexity of IT systems also presents a clear challenge for physicians, especially residents. The true intent of the physician may be misinterpreted just because it was the wrong order set, the wrong panel, or the wrong patient category. That is exactly why, for us pharmacists, I will reiterate tenet #1 – Always assume the order is incorrect until it can be proven otherwise.
Can I do this with my eyes closed?
One of the greatest innovations in medication safety has been the introduction of barcode verification for dose preparation, medication dispensing, and dose administration. Barcode verification provides assurance that you have the right drug, in the right form, and in the right dose. It has definitely saved lives. The problem, though, is the potential for the caregiver to become disengaged and detached from the process. Instead of reading the label, we just listen for the confirmation beep. When barcode verification becomes a substitute for reading the label, I believe we could actually be increasing the risk of medication error!
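The “right drug, right form, right dose” check itself is simple; a minimal Python sketch might look like the following. The product codes, the formulary dictionary, and the function name are all hypothetical placeholders (not real NDCs or any vendor’s API). The point is that the confirmation beep only verifies what is in the database against the order, not what a disengaged caregiver failed to read on the label.

```python
# Hypothetical formulary keyed by placeholder product codes (not real NDCs).
# The classic heparin mix-up: same drug, concentrations 1,000x apart.
FORMULARY = {
    "CODE-LOW":  {"drug": "heparin", "units_per_ml": 10},
    "CODE-HIGH": {"drug": "heparin", "units_per_ml": 10000},
}

def verify_scan(scanned_code, ordered_drug, ordered_units_per_ml):
    """Compare a scanned product against the verified order."""
    product = FORMULARY.get(scanned_code)
    if product is None:
        return (False, "product not in formulary")
    if product["drug"] != ordered_drug:
        return (False, "wrong drug")
    if product["units_per_ml"] != ordered_units_per_ml:
        return (False, "wrong concentration")
    return (True, "match")
```

Note what the check cannot do: if the formulary entry itself is wrong, or the wrong product was stocked under the right barcode, the scan passes and only a human who actually reads the label will catch it.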
In The Overdose, the author points out that, “the nurse trusted something she believed was even more infallible than any of her colleagues: the hospital’s computerized bar‐coding system.” This is why it is so important that we retain human engagement in every process.
Additionally, most electronic systems offer little help in identifying when there is actually a problem. There are so many false alerts that most experienced users pay little or no attention to them. Another reason to agree with Sully Sullenberger! To summarize his point from the article: we need to be capable of independent critical thought, and we need to prioritize our warning systems so that important alarms don’t get lost in the shuffle. Check out this blog post to see how Sully Sullenberger has also inspired us. We have even used that inspiration as a springboard for development at RxTOOLKIT!
It’s too big to fail?
As IT systems become more inclusive, it is sometimes impossible to tweak one aspect of the program without affecting another part of the program. Sometimes that second part could include a medication issue. For example, let’s suppose you want to create a tool that nursing can use during a pediatric crisis. The tool is designed by nursing, programmed by a non‐healthcare professional, and then published. The tool really doesn’t affect pharmacy, so pharmacy is not consulted. What could go wrong?
In most instances, the medication files in an IT system would be set up using several different drug files: one drug file for adults, one for pediatrics, and one for NICU. If the programmer is unaware that there are three separate drug files and builds the application using only the adult drug file, all of the doses and concentrations could potentially be incorrect. This type of nuance is not always readily apparent to the people building or performing the QA checks on the final product.
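Here is a sketch of how that silent failure can happen. The drug name, the concentrations, and the `population` default are all made-up illustrations; the bug is structural: a lookup helper that defaults to the adult file returns plausible-looking but wrong data for a pediatric tool whose builder never knew the other files existed.

```python
# Illustrative: three separate drug files, one per patient population.
# Drug names and concentrations are invented for this sketch.
DRUG_FILES = {
    "adult":     {"dopamine": {"mg_per_ml": 3.2}},
    "pediatric": {"dopamine": {"mg_per_ml": 0.8}},
    "nicu":      {"dopamine": {"mg_per_ml": 0.4}},
}

def get_concentration(drug, population="adult"):
    # The "adult" default is the trap: a programmer who doesn't know the
    # pediatric and NICU files exist never passes `population`, and every
    # lookup silently returns adult data -- no error, no warning.
    return DRUG_FILES[population][drug]["mg_per_ml"]

print(get_concentration("dopamine"))               # 3.2 (adult, by default)
print(get_concentration("dopamine", "pediatric"))  # 0.8
```

Nothing fails loudly here; every value looks reasonable in isolation, which is exactly why the error survives QA performed by people who don’t know the other files exist.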
It’s easy to see how individual silos within the programming team can lead to bad results simply because, “you don’t know what you don’t know”.
Where do we go from here?
So how do we move forward and deal with these issues? Here are my current recommendations:
1. Lower the expectation that your primary IT system can do everything. These systems are fantastic, but like everything else, there are things they do well and things they don’t.
2. There must be a balance between the operational process and IT capabilities. If you find the balance is disproportionate, stop, rethink, and readjust.
3. Don’t build a new process around IT capabilities. We should never expect IT capabilities to supersede operational process. IT solutions should integrate with your established process.
4. Observe how your staff is really using your technology. If you observe that your staff has become disengaged, it’s time to re‐train and re‐engage.
5. Whenever there is a process or programming change and medications are involved, include pharmacy in the development team. This applies even if the new process was not built for or will not be utilized by pharmacy.
6. Don’t give up on the humans. Knowledgeable users who are engaged in the operational process are absolutely necessary for positive outcomes.
7. Don’t give up on the humans. (This is worth repeating!) There are qualified and conscientious people out there who care about doing the job correctly and accurately. These are the folks you want on your team!