The more I hear about diabetes, the worse it sounds. The statistics on the disease, recently updated by the CDC, are alarming:
- 29.1 million people have diabetes (9.3% of the US population)
- 8.1 million people are undiagnosed (about 1 in 3 with the disease)
- Based on fasting glucose or glycated hemoglobin levels, 37% of adults 20 years or older have prediabetes (about 86 million Americans)
- Most diabetics (56.9%) are treated with oral medications, but a third take insulin
We’ve all seen and heard the horror stories: patients in the ED in diabetic ketoacidosis, feet and legs amputated, and peripheral skin wounds that refuse to heal. I hear from diabetics who can’t feel their feet, who describe walking without any sensation. It all sounds bad to me.
According to the same CDC publication above, “Many people with type 2 diabetes can control their blood glucose by following a healthy meal plan and a program of regular physical activity, losing excess weight, and taking medications.” That’s a tall order for an insidious disease created by lifetime habits. It is the hardest thing in the world to change habits related to diet and exercise.
If 94 million Americans are really walking diabetic time bombs -- that is an incredible number of people -- we will see an increased need for screening and diagnostic testing in the laboratory. Rapid glucose, ketone (and tests such as beta hydroxybutyrate), and glycated hemoglobin (A1C) are some of the tests on the front lines. But on the periphery -- literally -- are tests arising from complications of this disease, such as wound care and antibiotic stewardship.
As laboratorians we have an obligation to teach nursing and other team members doing point of care testing proper collection and testing technique as well as why quality control is important. But we also have a role in diabetes education. A nurse educator may do the initial education and teaching on using a home meter, for example, but we can help answer any further questions a patient may have. We’ll be busy, that’s for sure.
NEXT: Do You Microsleep?
Scripting has been imposed on us, as it has on many organizations, to standardize the customer experience. From answering the telephone to directing traffic, guidelines are set so that no matter whom a patient or family member interacts with, they receive similar treatment. This practice is so common in retail these days that it’s expected in many other settings.
Most often I’ve seen scripting developed by committees or teams to enhance satisfaction scores (e.g., “To protect your privacy...”) or to comply with a policy (e.g., “Would you please state your full name?”). These become cultural dogma, assumed to be crucial for service excellence, and never seriously questioned. Of course, this kind of scripting must be effective, because the experts tell us it is. And it meets our goal: a common customer experience. Doesn’t it?
But when was the last time you applied for a Walmart credit card because you heard the cashier ask the four people before you in line, “Would you like to apply for a Walmart credit card today?” And did that question meet your expectations as a customer, or make you feel like the scripting was all about the company’s bottom line?
Hospitals aren’t Walmart; retail goals are conspicuously financial by necessity. But it seems clear that scripting in a hospital setting should consider the needs of the patient more than the organization. Scripting by committee, it strikes me, has a bias toward the latter; the goal will always be to improve a score, comply with a policy, impose a change, or meet some theoretical need of an imaginary customer. I don’t think I’ve ever worked in a place that has asked a patient or family member directly how they want to be treated. The idea seems strange.
“What are your expectations?” is abstract and buzzwordy. Patients and family members may reflect that they expected different treatment, but I wonder if it’s possible to list those expectations in advance. As a patient myself I had one: I needed help. Many of our patients are helpless.
Is scripting that simple? Why can’t we just ask, “How can I help?”
NEXT: Fast Facts About Diabetes
“We never got the result!” is our most common complaint, followed closely by “My doctor never got the result!” It is frequently delivered in an accusatory tone instead of the more accurate “We can’t find it. Did you send it?” Or even, “Did you perform the test?”
We hear the latter once in a while. We look up the patient who hasn’t been seen in our hospital since 2008, and on the other end of the telephone a frustrated office worker just hangs up in exasperation. Or the patient was seen, but we didn’t perform the test because it wasn’t ordered. Or it was ordered, but it wasn’t sent in time. Etc.
Our most bizarre complaint was from a physician who claimed he ordered a free T4 and we performed a total T4. Since we don’t have T4 on our test menu, I couldn’t figure out at first what he was angry about. He berated me for a few minutes while I looked at the report. It turned out that he confused a CPSI interface code (T4) with the name of the test (Free T4); both appeared on the report, viz.
[T4] Free T4 __
Clearly, he only saw the interface code. He still insisted we had run the wrong test and demanded that we redraw the patient.
Along with confused, frustrated, or angry telephone calls, this kind of failure is expensive. Physicians reorder tests or demand that they be collected or sent elsewhere if they aren’t done or the report is missing, translating into extra cost or lost revenue. Over time, these events whittle away a lab’s reputation and bottom line.
As partners in the patient’s healthcare, we can be treated like an obstacle, blamed for not doing a test that the doctor didn’t order. But many of our customers are genuinely frustrated. They may have no idea what lab work has been ordered by other providers or whether the patient has complied. And sometimes they really don’t get a report when they should. As a fee-for-service business, how can a lab possibly fix this?
NEXT: How Can I Help?
Standard deviation index (SDI) measures bias using simple, easy-to-understand criteria. I also like it for daily quality control, because it works across all control levels. Here’s the calculation:
SDI = (Value - Target Mean) / Standard Deviation
Thus, a glucose of 97 run against a control with a mean of 90 and an SD of 5 (a range of 80-100, taken as the mean plus or minus 2 SD) has an SDI of 1.4. A positive SDI indicates a value above the mean; a negative SDI indicates a value below the mean.
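The calculation is simple enough to script. A minimal sketch in Python, assuming (as above) that the 80-100 control range represents the mean plus or minus 2 SD, i.e., a mean of 90 and an SD of 5:

```python
def sdi(value, target_mean, sd):
    """Standard deviation index: how many SDs a result is from the target mean."""
    return (value - target_mean) / sd

# Glucose control of 97 against a target mean of 90 and SD of 5
# (an 80-100 range interpreted as mean +/- 2 SD):
print(round(sdi(97, 90, 5), 2))  # 1.4 -- above the mean
print(sdi(85, 90, 5))            # -1.0 -- below the mean
```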
Also called z-score, the SDI corresponds to where on a run chart a value falls. As James Westgard explains on his web site, “It is very helpful to have z-scores when you are looking at control results from two or more control materials at the same time, or when looking at control results on different tests and different materials on a multitest analyzer.”
SDI is commonly used on laboratory peer reports, comparing to dozens or hundreds of other laboratories. Here are some guidelines for interpreting an SDI:
- 0.0 - perfect match with the peer group
- Less than or equal to 1.25 - acceptable performance
- 1.25 - 1.49 - some investigation may be required
- 1.5 - 1.99 - investigation is recommended; marginal performance
- Greater than or equal to 2.0 - unacceptable performance
Typically, peer reports are scanned for high or low SDI values, especially if they are seen across multiple levels. But the SDI is useful internally, too. If your information system calculates an SDI a tech can quickly see what’s in, what’s out, and what’s trending instead of just responding to Westgard flags. For example, if all levels of QC on an analyte have a negative SDI, there may be a calibration bias.
As I’ve described, our lab uses a program that plots SDI values instead of standard run charts. Every chart looks similar, scaled to plus or minus 2 SDI, with the actual means and standard deviations indicated. Because the data depend on the parameters at the time of posting, these charts reflect QC at run time and answer the question, “What did the QC look like on a particular day?”
NEXT: We Never Got The Result
According to US News and World Report, the top five “best jobs” in 2014 are: software developer, computer systems analyst, dentist, nurse practitioner, and pharmacist. We make the list, too: phlebotomist is #16 and clinical laboratory technician #22. These jobs “offer a mosaic of employment opportunity, good salary, manageable work-life balance and job security,” the site reports.
We think we know more about those top jobs, too. We all use software, for example, and it can seem like a great job to create popular games. (It isn’t. Most software is hurried, cobbled together, filled with bugs, coded under extreme pressure, and barely works at all before hitting the market.)
I love that we made the list, but how do we explain ourselves? People may think they have a good idea of what a nurse does from television, but we are nonexistent in the popular culture and thus, public imagination. Explaining ourselves can be a challenge when asked, “What do you do?”
The most common answer I hear techs give is, “It’s like detective work.” This isn’t bad, and it’s more or less true. We piece together parts of the clinical picture, but that’s just part of the story.
We handle specimens and instrumentation, which requires a level of technical competence unique within healthcare. That’s something. But the “detective” aspect is also true. More and more, we handle information as a product and commodity. We see more data than ever before, from quality data to delta checks, and we’re expected to interpret it to produce more reliable results. That’s much more than simply collecting specimens and pressing buttons.
As information technology becomes more integrated with healthcare and the data we see expands with the electronic health record, I expect we’ll become more like information analysts than bench technicians. This “applied information technology” is exciting but even more difficult to explain to a neophyte. (Or bean counter!)
But that’s just my view of our profession. I’m a geek, so I tend to see my job as information. How about you? How do you explain yourself?
NEXT: Using SDI
If there’s anything a lifetime of change teaches us, it’s that change never matches the hype. At some point in your career you’ve heard and seen it all: staffing ideas, alphabet soup, meeting gimmicks, efficiency notions. Like hornets, buzzwords are best left undisturbed, because after a while we’re allergic to their sting.
Or are we?
According to an article in Scientific American, openness to new experiences peaks during one’s 20s and declines thereafter (it increases in some people after age 60, so relax -- there’s hope). Since this implies less resistance to change, it seems to reinforce the stereotype (especially at computer keyboards) of old dogs and new tricks.
Much of this resistance is toward computers and information technology. Millennials (if you have any in your lab) likely use smart phones, smart watches, and Google as a verb. Your older techs may have a flip phone that they rarely use, have a wind-up Timex that has worked for thirty years, and read books. They won’t or can’t change. So I hear.
But according to a 2010 study, age is negatively related to resistance to change. Author and research scientist Jennifer Deal argues that generation related stereotypes are wrong; everyone hates change. Resistance to change has to do with what a person has to gain or lose. The stereotype may be self-fulfilling; one article on Medscape points out that younger nurses view older nurses as both resistant to change and using more sick time. Neither is true.
What all this suggests to me is that younger workers have less to lose by adopting ideas that are new (to them). It’s one way to compete in a workplace where experience (and age) also teaches connections between ideas and data. But I’ve always found people agreeable to change when it means less work, fewer errors, or better patient care. The trick is finding change that really does that, and older techs are often skeptical or cynical about whether it will, depending on your perspective.
So, do older techs hate change? What do you think?
NEXT: Explaining Ourselves
While working out I listen to short articles using an Android app called Umano. Many of the articles I hear claim that we have a lack of people who can write computer programs. Indeed, last year President Obama endorsed an “Hour of Code” during Computer Science Education Week to encourage students to learn how to program computers (write code). “Learning these skills isn’t just important for your future, it’s important for our country’s future,” he said.
The number of computer programming jobs is expected to outpace the number of students, and coders are needed everywhere. I’ve listened to other articles explaining that it should be added to your resume, no matter the job.
That certainly applies to the laboratory. And not just coding, but any transferable skills involving how hardware and software function. A laboratory may or may not need a programmer -- chances are the need exists -- but there are other useful skills that you can market, such as:
- Interface experience. We are more connected than ever before, and setting up interfaces can be a pain. If you have experience with test environments, test formats, truth tables, etc., it’s a plus.
- LIS or middleware rules. I expect software algorithms to replace many routine decisions, so any experience writing these (e.g., autoverification rules) is another plus.
- Meaningful use. The meaningful use requirements (set by the HITECH Act) are rolling out, and labs are struggling to meet their terms. Understanding and implementing LOINC codes, SNOMED codes, and other details is critical to getting paid. Another plus.
- Reducing cost. Many managers see computers as a necessary evil that adds expense and work. If you’ve led or been part of projects that have saved money or time (e.g., creating templates in Excel to calculate cost per test, or using mobile technology creatively), we want to hear about it. So will everyone else.
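An autoverification rule of the kind mentioned above boils down to a chain of conditions the LIS or middleware checks before releasing a result. A toy sketch in Python (not any particular vendor’s rule syntax; the ranges and limits are hypothetical):

```python
def autoverify(result, low, high, delta=None, delta_limit=None):
    """Toy autoverification rule: release a result automatically only if it
    falls within the reportable range and does not fail a delta check
    against the patient's previous value."""
    if not (low <= result <= high):
        return False  # out of range: hold for tech review
    if delta is not None and abs(result - delta) > delta_limit:
        return False  # delta check failure: hold for tech review
    return True       # release without tech intervention

# A potassium of 4.1 with a previous result of 4.3 autoverifies;
# an out-of-range 6.8, or a big jump from a previous 3.0, does not.
print(autoverify(4.1, 3.5, 5.1, delta=4.3, delta_limit=1.0))  # True
print(autoverify(6.8, 3.5, 5.1))                              # False
print(autoverify(5.0, 3.5, 5.1, delta=3.0, delta_limit=1.0))  # False
```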
I never see this kind of thing on resumes. Usually, if there is anything related to computers it’s using Microsoft Word, a program that has been around since 1983, the Folin-Wu of software. (Don’t list either.)
NEXT: Do Older Techs Hate Change?
Laboratories revise a test menu based on clinical need or cost. The common “bread and butter” tests I blogged about last time are easy to decide on and often performed on analyzers with volume discounts. But in many cases estimating the volume of tests to be performed can make the decision for us. In other words, is it cheaper to make or buy? Do we bring a test in house or send it out?
Ideally, a test should be brought in if it fills a clinical need that changes patient treatment faster or cheaper than sending it out. Many of these fall under the “soft dollar” umbrella of reducing length of stay -- notoriously hard to sell to bean counters.
Better, a cost per test can be calculated based on an estimated volume and compared to the cost of sending it out. If these “hard dollars” (variable cost) save money, that’s an easy sell. Bean counters love saving beans.
But it’s tricky, especially with new technology or services, pushing us again into that area where the beans are slippery. Here are a few ideas:
- Look at sendout volumes (easiest) - referral labs provide usage reports or numbers can be pulled from your information system. It’s a good idea to compare your top five sendout tests against your instrumentation menus to see if it’s worth performing these tests in house. Sometimes you can save money!
- Ask the docs who will order the test (harder) - having a physician advocate for a new test can’t hurt, and docs can have an expert opinion of how many tests are likely to be ordered.
- Make an educated guess (hardest) - an educated guess can be made easier if you already have a protocol in place; for example, a sepsis protocol which includes lactates can suggest how many procalcitonins may be ordered. If you don’t have a protocol you can ask local medical centers or do research online to guess how many tests might be ordered. The more data-based your estimate, the better.
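Once a volume estimate is in hand, the make-or-buy comparison itself is simple arithmetic: spread the fixed costs over the expected volume, add the per-test variable cost, and compare to the sendout fee. A sketch with entirely hypothetical numbers:

```python
def in_house_cost(annual_volume, reagent_per_test, annual_fixed):
    """Estimated in-house cost per test: variable reagent cost plus fixed
    costs (instrument lease, QC, proficiency testing) spread over volume."""
    return reagent_per_test + annual_fixed / annual_volume

# Hypothetical: 600 tests/year, $4.00 in reagents per test,
# $3,000/year in fixed costs, versus a $22.00 sendout fee.
per_test = in_house_cost(600, 4.00, 3000.00)
sendout_fee = 22.00
print(f"In house: ${per_test:.2f}/test; sendout: ${sendout_fee:.2f}/test")
# At this volume the test costs $9.00 in house -- bringing it in saves money.
```

Notice how sensitive the answer is to volume: at 100 tests a year, the same fixed costs push the in-house cost to $34.00 per test, and the sendout wins. That is why the volume estimate drives the decision.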
Note that this doesn’t hinge on keeping techs busy. One way or another we always have plenty to do.
NEXT: Market Your Computer Skills
The bread and butter of labs are those tests ordered on most patients: chemistry panels, blood counts, urinalysis and culture, and to an extent coagulation and blood bank. These are often ordered serially on patients admitted to your hospital, creating a cumulative report of laboratory values. As professionals we tend to be most productive and competent performing these tests, or at least operating the instruments that produce them.
But it is those rarely ordered but critical tests that can drive new technology and make us valuable to clinicians. Not just anyone can run them.
For example, in Maine many people use wood heat or are shut in for much of the winter. Carbon monoxide poisoning is a seasonal risk. These patients arrive at our ED, and it would be good if a physician could have a carboxyhemoglobin level STAT. Instead we offer this test as a sendout to our referral lab. It’s rarely ordered, instrumentation is costly, the docs don’t complain about not having it except (naturally) when they really need it, and when the weather warms it falls off the radar. It’s a good example of a test that, when the doc needs a result, he or she needs it now and not in two to four hours.
These rare but critical tests present unique challenges. They are expensive to keep in house, sometimes difficult to verify and control, more likely to have expired materials, and are harder for techs to remember how to do. Since they can be ordered in an urgent context, the latter is a real problem. It isn’t a problem to put a STAT chemistry panel on a busy instrument designed to prioritize STATs, but cerebrospinal fluid testing, for example, is always disruptive -- ask any night tech!
“Rare but critical” is the reason laboratories need highly trained professionals. Running routine specimens through analyzers isn’t worry-free, but variables are limited and repetition ensures a degree of competency. But a rarely ordered test ordered on a critical patient is already a problem.
We’re looking at the Avoximeter to resolve our carboxyhemoglobin issue. Is anyone using this?
NEXT: How Many Tests?
I blogged about “the cloud” in 2010: “If you’re using any applications that run in your web browser over the Internet, you’re using cloud computing.” While our hospital still uses aging Microsoft Office software and local storage, the world has moved on.
Hardware doesn’t matter. I can write on a desktop, laptop, tablet, or smartphone, here or at work. I’ve talked into my smartphone and seen the text appear in real time on my desktop. Since everything is stored in the cloud, I don’t worry about hard drive failures, corrupted file structures, making backups, or printing a paper backup. In fact I never print anything.
And not just my writing, but everything is in the cloud. I store receipts online using an Android app, for example. Sites such as Dropbox and Box offer additional gigabytes of free storage for documents or photographs that can be uploaded and accessed from multiple devices. Anything I upload can be accessed anywhere. This all happens without any extra work. That’s a paradigm shift.
All this translates into efficiency and convenience, at least for me. Working in the cloud is easy.
Healthcare has been slow to adopt these changes and laboratories slower still. But change is happening. We are connected to more systems, our information systems are being configured to meet the terms of “meaningful use” set by the HITECH Act, and the ubiquity of cheap, vast online storage is making electronic document management a dawning reality. Similar to my own experience, labs will find working in the cloud efficient and convenient.
NEXT: Rarely Ordered But Critical Tests
I read a lot of resumes, and most are awful.
I recently interviewed a candidate, for example, who waxed eloquent during the interview about how he valued great customer service. He gave examples, talked about involving line staff, insisted that people needed to talk to each other to get things done. Yet his resume didn’t contain the words “customer service.” It looked generic, as though he had prepared it for any job interview.
I’ve read resumes that list every tiny detail about work history, education, and hobbies. I’ve read resumes that list every instrument a tech has ever worked on, boring tasks associated with any job, or too much experience (and too many pages) for too few years in the field.
I’ve flubbed up, too. When I applied for my current position the HR director looked at my resume near the end of the interview and said in good humor, “I can’t help but comment that you haven’t listed ‘Leadership’ as a core skill.” (“Give me that!” I said, and I grabbed it and wrote down “Leadership.”)
But resumes aren’t mysterious. There are only two rules.
- Write for an interview. The purpose of a resume is to get an interview; potential employers form their first impression of you from it. Trust me, after you’ve read enough resumes you learn how much they help in selecting good candidates.
- Write for your audience. This is such a basic rule that it surprises me how often it isn’t followed. Yet many candidates submit a generic resume that sells them in broadest terms instead of for the job they want.
For example, list core skills applicable to the job you want. If you’re applying for a generalist position, list experience in microbiology, blood bank, and any area that sells you as a generalist. Find out the instrumentation used in the laboratory and list them if you’ve worked on them. Above all list experience selling your core values. If you plan to say you value customer service, for example, make sure those words appear on your resume.
NEXT: Working in the Cloud
The keyboard, which started out attached to a desktop personal computer, is now everywhere. Keyboards are attached to computer terminals, COWs (Computers on Wheels), and many instruments. Yet little has changed in their design (more about that below). They are big, bulky, clunky, difficult to clean, and hard to adapt to a traditional laboratory setting designed for paper.
The OSHA Computer Workstations e-tool offers these tips:
- Put the keyboard directly in front of you.
- Your shoulders should be relaxed.
- Your wrists should be straight and in line with your forearms.
The goal is to ensure your posture and joints are in a neutral position. Placing a keyboard too high or low strains joints after hours of repetitive motion. Cumulative strain causes injury.
Adjustments are fine for a single workstation in an office or cubicle used by one person but notoriously difficult amidst specimens, requisitions, reagents, instruments, and on countertops installed pre-computer. We do the best we can.
A keyboard’s switch type -- what’s between the plastic key and the keyboard circuit board -- also affects its ergonomics.
Most use a “rubber dome” switch type, in which a key compresses a polyurethane bubble coated in graphite that completes a circuit. These dome-type keyboards are dirt cheap, lightweight, and reasonably quiet. The downside is they wear out quicker, the keys have a “mushy” feel to them when pressed, and each key has to be pressed down all the way to work. Typing speed is slower, and your fingers have to work a teensy bit harder.
Mechanical switches, these days favored by gamers, used to be common and have a separate switch beneath each key. The design of the switch determines the amount of pressure needed and how loud it is. But generally mechanical keys are more accurate, far more durable, and require less effort to press. (They’re also pricey.) Typing speed is faster, and your fingers don’t work as hard. It adds up.
Recently I dug my old mechanical Microsoft Internet Keyboard (circa 2005) out of my attic and hooked it up using a PS/2-to-USB adapter. And you know what? The keys are better.
NEXT: What to Put on Your Resume
According to a Rand Corporation report about half of U.S. employers offer wellness programs. The bigger the employer, the bigger the program, many of which include risk assessments. Despite evidence that wellness is associated with lower healthcare costs and use, less than half of employees undergo screening or participate.
Our small hospital offers a great wellness program. We have health fairs, risk assessments, smoking cessation programs, team challenges (e.g., losing weight), and an employee gym. Yet participation seems poor to me.
A 2010 National Institute for Health Care Reform research brief states, “While employer wellness programs have spread rapidly in recent years, few employers implement programs likely to make a meaningful difference in employees’ health...” Experts believe financial incentives are the most effective way to ensure employee buy-in.
This isn’t a matter of employee greed. Undergoing a health risk assessment and sharing this information with an employer can be unsettling. As the Los Angeles Times asked last year, “Would you be willing to share with your employer how much you eat, drink, smoke, or exercise?” Studies of wellness program effectiveness show mixed results; financial incentives can get employees in the door, but sustained gains are still a challenge.
I haven’t been involved in our hospital’s wellness programs, I’ll confess. But I exercise every day at home, which works for me. I don’t smoke, hardly ever drink, don’t have hypertension, and my knees still work. I appreciate how difficult it can be to commit to wellness. It isn’t easy to find the time, energy, or motivation when you have to work so hard for so little. Slow, incremental improvement is the reality of daily exercise. It’s boring and it hurts.
But it’s worth it to take advantage of a wellness program. If your employer offers one, it can be a win-win, reducing their risk and increasing your satisfaction in the long run. I don’t believe there is any hidden agenda to coerce employees to reveal healthcare information about themselves. Good health benefits all, and what better place to learn about health? Finding your inner motivator is the challenge.
NEXT: Keyboard Ergonomics
Remarkably, 54 million Americans have been bullied at some point in their careers, either peer to peer or from a boss. Writes author Sherri Gordon in About, “Many times people don’t even realize that their boss is bullying them. Instead, they falsely believe that their boss is just tough or pushes his workers to get results.” She lists verbal abuse (shouting, humiliation, etc.), intimidation, questioning performance, intrusion of privacy, undermining, and other characteristics of these workplace monsters.
I’ve certainly worked for bosses who have annoyed, harassed, and micromanaged me only to blow off their own bullying with “I’m just pushing you to succeed.” Then why haven’t I ever felt like a success working for these jerks?
Jacquelyn Smith describes several bullying bosses in Forbes, from those who throw tantrums to those who are covert, changing their behavior day to day. “These bosses with bullying tendencies are masters at pushing you to the limit ... they may attempt to disguise their demeaning and discourteous behavior with levity, saying, ‘Oh, I was just joking,’ or ‘You’re too sensitive. You know you’re doing a great job,’” she writes.
I repeat: why doesn’t this ever feel like a great job?
While there are bullies at all levels, most are in management. It’s riskier to bully peer to peer but easy to push subordinates around. The article above cites a Workplace Bullying Institute study finding that 72% of workplace bullies are bosses.
I’ll admit I’ve never found a solution to working for a bully other than quitting. If the bully is the boss, chances are he or she was hired by a bully and exists in a culture that tolerates bullying. These environments are toxic and toxigenic, and they are as unlikely to change from the bottom up as a prison. Smart and creative people jump ship, and victims and enablers languish in the hold.
A bullying boss can be the toughest problem in your career, keeping you awake at night, upsetting your stomach, and robbing joy from your family time. It sure ain’t fun. Do you work for a bully?
NEXT: Take Advantage of Wellness
The nitroprusside test typically performed with a Bayer Acetest tablet is a laboratory classic. It’s one of the first tests I learned. In the nitroprusside reaction, acetoacetic acid, a serum or urine ketone, reacts with sodium nitroferricyanide and glycine to produce a purple color. I’ve been in labs where two-fold serial dilutions are common. Sample is dropped on the tablet, and you either see a color or not.
Acetone and beta-hydroxybutyrate (BOHB) are the other significant circulating ketone bodies during ketogenesis, a process that breaks down fat when there aren’t carbohydrates to burn for energy. This can happen while we sleep, fast, and in diabetics when there is a severe drop in insulin levels. Since ketones are acidic, the result is a measurable acidosis.
The problem with the nitroprusside test is simple: it doesn’t detect beta-hydroxybutyrate. This is significant in diabetic ketoacidosis (DKA) where the ratio of BOHB to acetoacetate may increase tenfold. BOHB has greater sensitivity and specificity in patients presenting to the ED with hyperglycemia and correlates better with treatment monitoring than anion gap or pCO2.
About two-thirds of patients with DKA have type 1 diabetes and can present to an ED with hyperglycemia, metabolic acidosis, electrolyte imbalance, dehydration, and/or shock. From 1985 to 2005, DKA-related hospitalizations increased 42%, according to one source, suggesting this is an emerging problem that likely mirrors our obesity epidemic.
Clearly beta-hydroxybutyrate is a better marker for diagnosis and treatment of diabetic ketoacidosis. There are other applications such as alcoholic ketoacidosis, but in an emergent setting we’re more likely to encounter the former. Since the test needs to be performed STAT, it should be done point of care or in house.
I don’t know how many labs perform the test. I’ve been looking for opportunities to bring this test in house, and recently decided to evaluate a user-defined method for our Siemens analyzer. Our ED director is excited about this test, which is good. But in previous years there was less interest. It’s funny how these things depend on timing, an odd phenomenon of the laboratory business. It also keeps our job interesting.
NEXT: Do You Work For a Bully?