I hate tape. I don’t really mind having blood drawn, but tape is a pain. Yanking or coaxing it off, it doesn’t matter. Getting hair ripped off my arms always hurts more than a needle.
So when the phlebotomist or medical assistant says, “Hold this for five minutes,” I happily comply. And if they go for the tape, I insist on holding the gauze over the wound. It will stop quickly enough.
One reason we apply some kind of pressure bandage is that many patients do not comply. I’ve said, “Hold pressure on that for five minutes,” and as soon as the patient stands up he or she daubs the site, shrugs, and chucks the gauze. “I don’t bleed” and “I clot quickly” are two comments I’ve heard many times. Telling the patient that blood will not clot for four or five minutes doesn’t matter, because he or she is too busy thinking about leaving and won’t listen.
I’ve held gauze in place for patients, such as following a therapeutic phlebotomy or arterial blood gas. Otherwise, some kind of pressure bandage involving tape or Coban wrap is needed with most routine phlebotomy patients. The walking and talking prefer to do just that, eager to leave and get on with their day.
But tape doesn’t just pull hair. Patient skin is sensitive to different types, some tape is stickier than others, and older skin can be frail enough to be damaged when tape is removed. Depending on the position of the arm and venipuncture site, a wound can still ooze under the bandage, causing bruising and even a hematoma. Coban adhesive wrap is a neat solution to tape, but it can curl or tighten and be ineffective.
Our morning rounds now include visits to patient rooms. We’ve started looking at skin care issues, especially if the patient has had blood drawn recently. This is good postanalytical quality assessment. We’re looking at issues with tape, bruising, hematomas, or other skin problems (such as chafing or signs of infection). Just maybe, some patients hate tape as much as I do.
NEXT: Computers are Stupid
Middle managers are often told to “engage” employees with buzzwords and gimmicks: empowerment, inclusion, work teams with stupid names made of acronyms, action plans, team huddles, and good old-fashioned delegation. We need to train our replacements, mentor those with potential, tell stories, emphasize cultural values, give feedback, hold people accountable, and manage by walking around. And we should have an open door, transparency, communication, standing meetings, one on one meetings, goal setting, and crucial conversations.
Employees will spot a gimmick the way they spot a used car lot. And they won’t buy from either unless they really have no choice. The further removed management is from the basic purpose of the job, the harder that is to see. It’s too easy to narrow our focus, clutch our perceived power to our breast, and believe our own hype.
So if gimmicks are a waste of time, why do we work?
I’ve made this simple and told my staff many times, “If my son was in an automobile accident and came into the ED, I would trust his life to any one of you. That’s what it’s all about. That’s what we are doing here. It’s all about the patient.”
We assume the unspoken doesn’t need to be said, but it does, plainly and often. It never hurts to realign a team by stating “the obvious,” because it gives perspective. And it simplifies decisions, because a clear purpose creates a universal litmus test, e.g., does this help the patient? All great teams keep their eyes on the prize.
According to a New York Times article, employees need four core needs met to be satisfied and productive. The more of these that are met, the more “engaged” the employee becomes. Here’s the breakdown:
- Physical - regular opportunities to recharge and renew energy
- Emotional - feeling valued and appreciated
- Mental - the when and where of the most important tasks are well defined with a chance to get them done well
- Spiritual - a chance to do what we enjoy the most while connected to a higher purpose
Is it that simple?
NEXT: Hold This For Five Minutes
As much as we would love to believe a job is just a job, it is much more than that: it consumes time, energy, and emotions; it advances, stalls, or kills careers; it creates fulfillment, ennui, or anger. And there’s the other thing: coworkers. One blogger writes, “In business, it’s not always about liking people, it’s about being able to trust -- and work -- with them. Sometimes, you will actually grow to like someone, in addition to trusting them and working well with them - and that is special; but let's face it, it's quite rare.”
That depends on how two people hit it off. Some people just form friendships easily.
Another blog points out that having friends at work is a great morale booster, quoting a survey in which 70% of respondents valued friends over salary. All things being equal, a workplace with friends is a better place to work. Working in a place without friends is almost worse than a hostile environment, where at least you get attention. A great friendship makes an emotionally dark place a little brighter.
There are health benefits to friendships, too. The Mayo Clinic web site lists:
- Increase your sense of belonging and purpose
- Boost your happiness and reduce your stress
- Improve self-confidence and self-worth
- Help you cope with life trauma
- Encourage you to make positive changes
Without friendships, it can seem like we work alone, and emotionally we might as well. Without a friendly face to greet, laughter to share, or all-important support, a job really is “just a job.” We may do great work but leave feeling a bit empty. Isn’t it better to share success with friends?
Absolutely. More than that, friendships are crucial for meaningful success.
But for managers, it is much harder to form friendships. When a friend is promoted into management, it can be hard on everyone. Managers hired from the outside can find forming friendships with staff a minefield. And workplace politics is often vicious at the middle management level, arbitrarily isolating a manager. I wonder how many managers fail for those very reasons.
NEXT: Why We Work
I keep forgetting about the cool site Stumbleupon, which will find web sites based on your interests. Then I’ll get an email, say “Aha!” and stumble away. I’ve found a few good ideas this way. For example, the other day I read an article on microsleep.
Microsleep, which is caused by sleep deprivation, is just what it sounds like: a short episode of sleep that lasts anywhere from a fraction of a second to thirty seconds. According to Wikipedia, microsleep episodes have caused disasters such as train wrecks, plane crashes, and the Chernobyl nuclear reactor accident.
Researchers at the University of Wisconsin-Madison explain microsleeping as regions of the brain going off-line while the rest of the brain appears to be awake and functioning. Not getting enough sleep, it turns out, can affect some regions of the brain before others, making us check out at random intervals. In an experiment with rats, for example, 18 out of 20 neurons stayed awake; the rats in the meantime made mistakes.
Signs of microsleep include drooping eyelids, head nods, blinking, a blank stare, and poor concentration. But at times these episodes may happen with no outward signs. The person simply stops responding to stimuli. According to one site, these events are more likely before dawn and in early afternoon.
Like most of us in our sleep-robbed world, I’ve had this happen. I’ve felt sleepy behind the wheel, struggled to stay awake in meetings, and periodically “zoned out.” I’ve always thought this was simply fatigue. But the above suggests that parts of my brain are literally shutting down and not responding for short periods of time. That is far more dangerous, and more than a little frightening.
Repetitive tasks that require less attention than novel tasks may be more susceptible to microsleep episodes, such as computer data entry. I wonder, too, about some laboratory testing. We all joke that after so many years of doing this job we can do it in our sleep. Ironically, we may be doing just that.
How about you? Do you microsleep?
NEXT: Work Friendships Are Crucial
The more I hear about diabetes, the worse it sounds. The statistics on the disease, recently updated by the CDC, are alarming:
- 29.1 million people have diabetes (9.3% of the US population)
- 8.1 million people are undiagnosed (about one in three with the disease)
- Based on fasting glucose or glycated hemoglobin levels, 37% of adults 20 years or older have prediabetes (about 86 million Americans)
- Most diabetics (56.9%) are treated with oral medications, but a third take insulin
We’ve all seen and heard the horror stories: patients in the ED in diabetic ketoacidosis, those who get feet and legs amputated, and peripheral skin wounds that refuse to heal. I hear diabetics who can’t feel their feet describe walking with no sensation at all. It all sounds bad to me.
According to the same CDC publication above, “Many people with type 2 diabetes can control their blood glucose by following a healthy meal plan and a program of regular physical activity, losing excess weight, and taking medications.” That’s a tall order for an insidious disease created by lifetime habits. It is the hardest thing in the world to change habits related to diet and exercise.
If 94 million Americans are really walking diabetic time bombs -- that is an incredible number of people -- we will see an increased need for screening and diagnostic testing in the laboratory. Rapid glucose, ketone (and tests such as beta hydroxybutyrate), and glycated hemoglobin (A1C) are some of the tests on the front lines. But on the periphery -- literally -- are tests arising from complications of this disease, such as wound care and antibiotic stewardship.
As laboratorians we have an obligation to teach nursing and other team members who do point-of-care testing proper collection and testing technique, as well as why quality control is important. But we also have a role in diabetes education. A nurse educator may do the initial education and teaching on using a home meter, for example, but we can help answer any further questions a patient may have. We’ll be busy, that’s for sure.
NEXT: Do You Microsleep?
Scripting has been imposed on us, like many organizations, to standardize the customer experience. From answering the telephone to directing traffic, guidelines are set to make sure no matter who a patient or family member interacts with they receive similar treatment. This practice is so common in retail these days it’s expected in many other settings.
Most often I’ve seen scripting developed by committees or teams to enhance satisfaction scores (e.g., “To protect your privacy...”) or comply with a policy (e.g., “Would you please state your full name?”). These become cultural dogma, assumed to be crucial for service excellence, and never seriously questioned. Of course, this kind of scripting must be effective, because the experts tell us it is. And it meets our goal: a common customer experience. Doesn’t it?
But when was the last time you applied for a Walmart credit card because you heard the cashier ask the four people before you in line, “Would you like to apply for a Walmart credit card today?” And did that question meet your expectations as a customer, or make you feel like the scripting was all about the company’s bottom line?
Hospitals aren’t Walmart; retail goals are conspicuously financial by necessity. But it seems clear that scripting in a hospital setting should consider the needs of the patient more than the organization. Scripting by committee, it strikes me, has a bias toward the latter; the goal will always be to improve a score, comply with a policy, impose a change, or meet some theoretical need of an imaginary customer. I don’t think I’ve ever worked in a place that has asked a patient or family member directly how they want to be treated. The idea seems strange.
“What are your expectations?” is abstract and buzzwordy. Patients and family members may reflect that they expected different treatment, but I wonder if it’s possible to list those expectations in advance. As a patient myself I had one: I needed help. Many of our patients are helpless.
Is scripting that simple? Why can’t we just ask, “How can I help?”
NEXT: Fast Facts About Diabetes
“We never got the result!” is our most common complaint, followed closely by “My doctor never got the result!” It is frequently delivered in an accusatory tone instead of the more accurate “We can’t find it. Did you send it?” Or even, “Did you perform the test?”
We hear the latter once in a while. We look up the patient who hasn’t been seen in our hospital since 2008, and on the other end of the telephone a frustrated office worker just hangs up in exasperation. Or the patient was seen, but we didn’t perform the test because it wasn’t ordered. Or it was ordered, but it wasn’t sent in time. Etc.
Our most bizarre complaint was from a physician who claimed he ordered a free T4 and we performed a total T4. Since we don’t have T4 on our test menu, I couldn’t figure out at first what he was angry about. He berated me for a few minutes while I looked at the report. It turned out that he confused a CPSI interface code (T4) with the name of the test (Free T4); both appeared on the report, viz.
[T4] Free T4 __
Clearly, he only saw the interface code. He still insisted we had run the wrong test and demanded that we redraw the patient.
Along with confused, frustrated, or angry telephone calls, this kind of failure is expensive. Physicians reorder tests or demand that they be collected or sent elsewhere if they aren’t done or the report is missing, translating into extra cost or lost revenue. Over time, these events whittle away a lab’s reputation and bottom line.
As partners in the patient’s healthcare, we can be treated like an obstacle, blamed for not doing a test that the doctor didn’t order. But many of our customers are genuinely frustrated. They may have no idea what lab work has been ordered by other providers or if the patient has complied. And sometimes they really don’t get a report when they should. As a fee for service business, how can a lab possibly fix this?
NEXT: How Can I Help?
Standard deviation index (SDI) measures bias using simple, easy to understand criteria. I also like this for daily quality control, because it works on all levels. Here’s the calculation:
SDI = (Value - Target Mean) / Standard Deviation
Thus, a glucose of 97 with a control range of 80-100 (a mean of 90 and, reading the range as ±2 SD, an SD of 5) has an SDI of 1.4. A positive SDI indicates a value above the mean; a negative SDI indicates a value below the mean.
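A minimal Python sketch of the calculation (assuming the 80-100 range represents the mean ± 2 SD, i.e. a mean of 90 and an SD of 5):

```python
def sdi(value, target_mean, sd):
    """Standard deviation index: how many SDs a value sits from the target mean."""
    return (value - target_mean) / sd

# The glucose example: range 80-100 read as mean 90 +/- 2 SD, so SD = 5
print(sdi(97, 90, 5))  # 1.4
```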
Also called z-score, the SDI corresponds to where on a run chart a value falls. As James Westgard explains on his web site, “It is very helpful to have z-scores when you are looking at control results from two or more control materials at the same time, or when looking at control results on different tests and different materials on a multitest analyzer.”
SDI is commonly used on laboratory peer reports, comparing to dozens or hundreds of other laboratories. Here are some guidelines for interpreting an SDI:
- 0.0 - perfect match with the peer group
- Less than or equal to 1.25 - acceptable performance
- 1.25 - 1.49 - some investigation may be required
- 1.5 - 1.99 - investigation is recommended; marginal performance
- Greater than or equal to 2.0 - unacceptable performance
Typically, peer reports are scanned for high or low SDI values, especially if they are seen across multiple levels. But the SDI is useful internally, too. If your information system calculates an SDI, a tech can quickly see what’s in, what’s out, and what’s trending instead of just responding to Westgard flags. For example, if all levels of QC on an analyte have a negative SDI, there may be a calibration bias.
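Both checks are easy to automate. Here is a sketch in Python; the interpretive labels follow the guidelines listed earlier, but the function names and label strings are my own shorthand, not standard terminology:

```python
def interpret_sdi(value_sdi):
    """Map an SDI magnitude to the peer-report guidelines (labels are shorthand)."""
    a = abs(value_sdi)
    if a <= 1.25:
        return "acceptable"
    if a < 1.5:
        return "some investigation"
    if a < 2.0:
        return "marginal; investigation recommended"
    return "unacceptable"

def calibration_bias_suspected(level_sdis):
    """True when every QC level sits on the same side of the mean --
    e.g. all negative -- suggesting a systematic (calibration) shift."""
    return all(s < 0 for s in level_sdis) or all(s > 0 for s in level_sdis)

print(interpret_sdi(1.4))                              # some investigation
print(calibration_bias_suspected([-0.8, -1.1, -0.6]))  # True
```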
As I’ve described, our lab uses a program that plots SDI values instead of standard run charts. Every chart looks similar with plus or minus 2 SDI; actual means and standard deviations are indicated. Because this data depends on the parameters at the time of posting, these charts reflect QC at run time and answer the question, “What did the QC look like on a particular day?”
NEXT: We Never Got The Result
According to US News and World Report, the top five “best jobs” in 2014 are: software developer, computer systems analyst, dentist, nurse practitioner, and pharmacist. We make the list, too: phlebotomist is #16 and clinical laboratory technician #22. These jobs “offer a mosaic of employment opportunity, good salary, manageable work-life balance and job security,” the site reports.
We think we know more about those top jobs, too. We all use software, for example, and it can seem like a great job to create popular games. (It isn’t. Most software is hurried, cobbled together, filled with bugs, coded under extreme pressure, and barely works at all before hitting the market.)
I love that we made the list, but how do we explain ourselves? People may think they have a good idea of what a nurse does from television, but we are nonexistent in popular culture and thus the public imagination. Explaining ourselves can be a challenge when asked, “What do you do?”
The most common answer I hear techs give is, “It’s like detective work.” This isn’t bad, and it’s more or less true. We piece together parts of the clinical picture, but that’s just part of the story.
We handle specimens and instrumentation, which requires a level of technical competence unique within healthcare. That’s something. But the “detective” aspect is also true. More and more, we handle information as a product and commodity. We see more data than ever before, from quality data to delta checks, and we’re expected to interpret it to produce more reliable results. That’s much more than simply collecting specimens and pressing buttons.
As information technology becomes more integrated with healthcare and the data we see expands with the electronic health record, I expect we’ll become more like information analysts than bench technicians. This “applied information technology” is exciting but even more difficult to explain to a neophyte. (Or bean counter!)
But that’s just my view of our profession. I’m a geek, so I tend to see my job as information. How about you? How do you explain yourself?
NEXT: Using SDI
If there’s anything a lifetime of change teaches us, it’s that change never matches the hype. At some point in your career you’ve heard and seen it all: staffing ideas, alphabet soup, meeting gimmicks, efficiency notions. Like hornets, buzzwords are best left undisturbed, because after a while we’re allergic to their sting.
Or are we?
According to an article in Scientific American, openness to new experiences peaks during one’s 20s and declines thereafter (it increases in some people after age 60, so relax -- there’s hope). Since this implies less resistance to change, it seems to reinforce the stereotype (especially at computer keyboards) of old dogs and new tricks.
Much of this resistance is toward computers and information technology. Millennials (if you have any in your lab) likely use smart phones, smart watches, and Google as a verb. Your older techs may have a flip phone that they rarely use, have a wind-up Timex that has worked for thirty years, and read books. They won’t or can’t change. So I hear.
But according to a 2010 study, age is negatively related to resistance to change. Author and research scientist Jennifer Deal argues that generation related stereotypes are wrong; everyone hates change. Resistance to change has to do with what a person has to gain or lose. The stereotype may be self-fulfilling; one article on Medscape points out that younger nurses view older nurses as both resistant to change and using more sick time. Neither is true.
What all this suggests to me is that younger workers have less to lose by adopting ideas that are new (to them). It’s one way to compete in a workplace where experience (and age) teaches connections between ideas and data, too. But I’ve always found people agreeable to change when it means less work, fewer errors, or better patient care. The trick is finding change that really does that, and older techs are often skeptical or cynical, depending on your perspective.
So, do older techs hate change? What do you think?
NEXT: Explaining Ourselves
While working out I listen to short articles using an Android app called Umano. Many of the articles I hear claim that we have a lack of people who can write computer programs. Indeed, last year President Obama endorsed an “Hour of Code” during Computer Science Education Week to encourage students to learn how to program computers (write code). “Learning these skills isn’t just important for your future, it’s important for our country’s future,” he said.
The number of computer programming jobs is expected to outpace the number of students, and coders are needed everywhere. I’ve listened to other articles explaining that it should be added to your resume, no matter the job.
That certainly applies to the laboratory. And not just coding, but any transferable skills involving how hardware and software function. A laboratory may or may not need a programmer -- chances are the need exists -- but there are other useful skills that you can market, such as:
- Interface experience. We are more connected than ever before, and setting up interfaces can be a pain. If you have experience with test environments, test formats, truth tables, etc., it’s a plus.
- LIS or middleware rules. I expect software algorithms to replace many routine decisions, so any experience writing these, e.g., autoverification rules, is another plus.
- Meaningful use. Obamacare is the law of the land, and labs are struggling with meeting its terms as it rolls out. Understanding and implementing LOINC codes, SNOMED codes, and other details is critical to getting paid. Another plus.
- Reducing cost. Many managers see computers as a necessary evil that adds expense and work. If you’ve led or been part of projects that have saved money or time, e.g., creating templates in Excel to calculate cost per test or creatively using mobile technology, we want to hear about it. So will everyone else.
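To illustrate the kind of rule logic an LIS or middleware encodes, here is a hypothetical autoverification check in Python. The thresholds, parameter names, and delta-check logic are invented for the example; real rules are configured in the LIS or middleware, not written from scratch like this:

```python
def autoverify(result, low, high, previous=None, delta_limit=None):
    """Hypothetical autoverification rule: release a result without tech
    review only if it is within range and passes an optional delta check."""
    if not (low <= result <= high):
        return False  # out of range: hold for tech review
    if previous is not None and delta_limit is not None:
        if abs(result - previous) > delta_limit:
            return False  # failed delta check against the prior result
    return True

print(autoverify(95, 70, 110))                                 # True
print(autoverify(95, 70, 110, previous=140, delta_limit=30))   # False
```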
I never see this kind of thing on resumes. Usually, if there is anything related to computers it’s using Microsoft Word, a program that has been around since 1983, the Folin-Wu of software. (Don’t list either.)
NEXT: Do Older Techs Hate Change?
Laboratories revise a test menu based on clinical need or cost. The common “bread and butter” tests I blogged about last time are easy to decide on and often performed on analyzers with volume discounts. But in many cases estimating the volume of tests to be performed can make the decision for us. In other words, is it cheaper to make or buy? Do we bring a test in house or send it out?
Ideally, a test should be brought in if it fills a clinical need that changes patient treatment faster or cheaper than sending it out. Many of these fall under the “soft dollar” umbrella of reducing length of stay -- notoriously hard to sell to bean counters.
Better, a cost per test can be calculated based on an estimated volume and compared to the cost of sending it out. If these “hard dollars” (variable cost) save money, that’s an easy sell. Bean counters love saving beans.
But it’s tricky, especially with new technology or services, pushing us again into that area where the beans are slippery. Here are a few ideas:
- Look at sendout volumes (easiest) - referral labs provide usage reports or numbers can be pulled from your information system. It’s a good idea to compare your top five sendout tests against your instrumentation menus to see if it’s worth performing these tests in house. Sometimes you can save money!
- Ask the docs who will order the test (harder) - having a physician advocate for a new test can’t hurt, and docs can have an expert opinion of how many tests are likely to be ordered.
- Make an educated guess (hardest) - an educated guess can be made easier if you already have a protocol in place; for example, a sepsis protocol which includes lactates can suggest how many procalcitonins may be ordered. If you don’t have a protocol you can ask local medical centers or do research online to guess how many tests might be ordered. The more data-based your estimate, the better.
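Once you have a volume estimate from any of those sources, the make-or-buy comparison is simple arithmetic. A sketch with invented numbers (the costs here are hypothetical, chosen only to show the shape of the calculation):

```python
def in_house_cost(volume, cost_per_test, fixed_annual=0.0):
    """Annual cost of performing a test in house: variable cost per test
    plus fixed overhead (QC material, proficiency testing, service)."""
    return volume * cost_per_test + fixed_annual

def sendout_cost(volume, fee_per_test):
    """Annual cost of referring the test out at the reference lab's fee."""
    return volume * fee_per_test

# Hypothetical: 600 tests/year, $4.00 reagent cost, $2,000/yr QC and
# proficiency overhead, versus a $12.00 referral-lab fee per test.
make = in_house_cost(600, 4.00, fixed_annual=2000)
buy = sendout_cost(600, 12.00)
print(make, buy)  # 4400.0 7200.0 -- bringing it in house saves beans
```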
Note that this doesn’t hinge on keeping techs busy. One way or another we always have plenty to do.
NEXT: Market Your Computer Skills
The bread and butter of labs are those tests ordered on most patients: chemistry panels, blood counts, urinalysis and culture, and to an extent coagulation and blood bank. These are often ordered serially on patients admitted to your hospital, creating a cumulative report of laboratory values. As professionals we tend to be most productive and competent performing these tests, or at least operating the instruments that produce them.
But it is those rarely ordered but critical tests that can drive new technology and make us valuable to clinicians. Not just anyone can run them.
For example, in Maine many people use wood heat or are shut in for much of the winter. Carbon monoxide poisoning is a seasonal risk. These patients arrive at our ED, and it would be good if a physician could have a carboxyhemoglobin level STAT. Instead we offer this test as a sendout to our referral lab. It’s rarely ordered, instrumentation is costly, the docs don’t complain about not having it except (naturally) when they really need it, and when the weather warms it falls off the radar. It’s a good example of a test that, when the doc needs a result, he or she needs it now and not in two to four hours.
These rare but critical tests present unique challenges. They are expensive to keep in house, sometimes difficult to verify and control, more likely to have expired materials, and are harder for techs to remember how to do. Since they can be ordered in an urgent context, the latter is a real problem. It isn’t a problem to put a STAT chemistry panel on a busy instrument designed to prioritize STATs, but cerebrospinal fluid testing, for example, is always disruptive -- ask any night tech!
“Rare but critical” is the reason laboratories need highly trained professionals. Running routine specimens through analyzers isn’t worry-free, but variables are limited and repetition ensures a degree of competency. But a rarely ordered test ordered on a critical patient is already a problem.
We’re looking at the Avoximeter to resolve our carboxyhemoglobin issue. Is anyone using this?
NEXT: How Many Tests?
I blogged about “the cloud” in 2010: “If you’re using any applications that run in your web browser over the Internet, you’re using cloud computing.” While our hospital still uses aging Microsoft Office software and local storage, the world has moved on.
Hardware doesn’t matter. I can write on a desktop, laptop, tablet, or smartphone, here or at work. I’ve talked into my smartphone and seen the text appear in real time on my desktop. Since everything is stored in the cloud, I don’t worry about hard drive failures, corrupted file structures, making backups, or printing a paper backup. In fact I never print anything.
And not just my writing, but everything is in the cloud. I store receipts online using an Android app, for example. Sites such as Dropbox and Box offer additional gigabytes of free storage for documents or photographs that can be uploaded and accessed from multiple devices. Anything I upload can be accessed anywhere. This all happens without any extra work. That’s a paradigm shift.
All this translates into efficiency and convenience, at least for me. Working in the cloud is easy.
Healthcare has been slow to adopt these changes and laboratories slower still. But change is happening. We are connected to more systems, our information systems are being configured to meet terms of “meaningful use” set by the Affordable Care Act, and the ubiquity of cheap, vast online storage is making electronic document management a dawning reality. Similar to my own experience, labs will find working in the cloud efficient and convenient.
NEXT: Rarely Ordered But Critical Tests
I read a lot of resumes, and most are awful.
I recently interviewed a candidate, for example, who waxed eloquent during the interview about how he valued great customer service. He gave examples, talked about involving line staff, insisted that people needed to talk to each other to get things done. Yet his resume didn’t contain the words “customer service.” It looked generic, as though he had prepared it for any job interview.
I’ve read resumes that list every tiny detail about work history, education, and hobbies. I’ve read resumes that list every instrument a tech has ever worked on, boring tasks associated with any job, or too much experience (and too many pages) for too few years in the field.
I’ve flubbed up, too. When I applied for my current position the HR director looked at my resume near the end of the interview and said in good humor, “I can’t help but comment that you haven’t listed ‘Leadership’ as a core skill.” (“Give me that!” I said, and I grabbed it and wrote down “Leadership.”)
But resumes aren’t mysterious. There are only two rules.
- Write for an interview. The purpose of a resume is to get an interview; potential employers form their first impression of you from it. Trust me, after you’ve read enough resumes they really help in selecting good candidates.
- Write for your audience. This is such a basic rule that it surprises me how often it isn’t followed. Yet many candidates submit a generic resume that sells them in broadest terms instead of for the job they want.
For example, list core skills applicable to the job you want. If you’re applying for a generalist position, list experience in microbiology, blood bank, and any area that sells you as a generalist. Find out the instrumentation used in the laboratory and list any you’ve worked on. Above all, list experience that sells your core values. If you plan to say you value customer service, for example, make sure those words appear on your resume.
NEXT: Working in the Cloud