Are you ready to embrace the paradigm shift in data management and its latest tools and techniques? Wellcentive’s Kirk Elder shows us how.
Is your data doing enough good for you and your health organization? Better yet, is your data any good to begin with? If these are the questions you’re asking yourself, here’s another good one: What are you doing to successfully manage your data to effectively manage your population? Wellcentive’s Kirk Elder says you probably already know the answer to that, too: not nearly enough.
There’s no question that data management in healthcare, while a popular topic, has become increasingly perplexing. Thanks to the impending data demands of healthcare reform, it’s hard for health organizations to get a handle on it all. Add in big data now bursting the digital dams, and you have yourself a tall task when it comes to data management. Still, Elder said successful data management is possible, and it begins with having the right frame of mind.
“Data management is often preconceived as this perfected, technical conquest supported by stiff governance and complex algorithms,” said Elder, CTO of the Atlanta, GA-based company specializing in comprehensive population health solutions. “Successful data management stems from the fingertips of those behind it. It’s not a task made perfect by a computer. It’s a work in progress made possible by the people who work with it every day, in the pursuit of bettering the populations they serve. If you want to employ successful data management, stop waiting for perfection and bring speed-to-value to what you’re trying to achieve, one accurate data piece at a time.”
That’s an especially important principle to understand when employing what Wellcentive calls responsible population health management (PHM). Organizations like ACOs know all too well the challenges that come with tackling the clinical measures and reimbursement requirements that build the foundation of responsible PHM. Without clear, accurate, and actionable data to breathe life into your efforts, your best attempts at building an effective PHM model are already at a dead end, according to Elder.
“The world of PHM is changing fast, so your data management capabilities have to change right along with it,” he said. “Some organizations are trying to apply slow and inflexible data management solutions to a model that requires nothing short of agile adaptability. It’s a brave new data management world out there, and health organizations practicing PHM have to accept that. In fact, all health organizations have to at some point.”
Elder may be onto something. According to IT-research firm Gartner, Inc., 20 percent of CIOs in regulated industries will lose their jobs by 2016 for failing to implement data governance (a major component of data management) successfully. Those losses could be even graver in healthcare, where quality of life, as well as corporate livelihoods, could be at stake.
Still, garnering the courage to take on the challenges that come with data management isn’t easy. But Elder said there are guidelines you can follow to ensure your data management efforts gain traction from the very beginning. He also helped us at HIT Consultant spot some of the roadblocks that often lead improvement efforts astray and straight into data debacles.
Here’s Wellcentive’s take on successful data management for PHM, the do’s and don’ts:
You know by now that PHM lives and breathes in accurate, actionable data. So, how does your data measure up? Well, you won’t know until you measure it, of course. This seems like a data management no-brainer, but it’s a mistake organizations make in their haste to meet a set of outward expectations.
Elder explained: “Sometimes organizations get so caught up in where they want to be, they fail to accurately evaluate where they are. However, no path for progress can be navigated without knowing where you stand in the first place. How can you possibly dictate where you will end up if you don’t have a trusted starting point? You need a compass here. Reliable data has to be that guide.”
To achieve that level of trust in your data, you have to ensure you have the technical means to consistently measure that data, monitor it, and display how it flows and functions throughout your IT landscape. It’s a tall order, but with the right, sophisticated tools, it is possible. But first, you have to know what to look for when it comes to achieving quality measurement.
“Is your data complete? Is it consistent? Is it valid? Those are the questions you need to ask,” Elder said. “Those are the same questions we ask ourselves at Wellcentive, as we work to create out-of-the-box metrics and best practices to help organizations quantify the answers.”
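Two of those three questions lend themselves to simple, quantifiable metrics. The sketch below is illustrative only: the patient records and field names are made up, and consistency (e.g., agreement across feeds) is omitted because it requires comparing multiple sources. Completeness here is the share of records with a value at all; validity is the share of populated values that fall in a plausible clinical range.

```python
# Hypothetical patient records; field names are assumptions for illustration.
RECORDS = [
    {"patient_id": "P001", "hba1c": 6.8, "measured_on": "2014-03-01"},
    {"patient_id": "P002", "hba1c": None, "measured_on": "2014-03-02"},
    {"patient_id": "P003", "hba1c": 42.0, "measured_on": "2014-02-28"},
]

def completeness(records, field):
    """Share of records in which the field is populated at all."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def validity(records, field, low, high):
    """Share of populated values that fall in a plausible clinical range."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return 0.0
    return sum(1 for v in values if low <= v <= high) / len(values)

print(f"completeness: {completeness(RECORDS, 'hba1c'):.0%}")  # 2 of 3 populated
print(f"validity: {validity(RECORDS, 'hba1c', 3.0, 20.0):.0%}")  # 42.0 fails
```

In practice, checks like these would run per interface or feed, so a drop in one source’s completeness surfaces immediately rather than silently skewing downstream measures.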
Keeping that in mind, Elder cautions that you shouldn’t compromise when it comes to your tool set. “If your data flow isn’t capturing everything it needs to, you’re missing something. And if you’re not sure if you’re missing anything, you most likely are,” he said. “Ask your physicians if they trust the data. If they don’t, you shouldn’t either. It’s just that simple.”
Sometimes the wrong tool doesn’t measure enough, but other times, it might be measuring your data wrong altogether. Elder cautioned that although measuring your data is essential, how you measure it should be defined by your organization, not delineated by the restrictions of your software.
“Don’t get boxed in,” he said. “The right solution is configurable to what you are trying to achieve and adaptable to the constant changes that occur, such as payer regulations or government stipulations; between the government and the commercial payers, new quality and risk-sharing programs are being released weekly. Your software needs to let you quickly fold in new measures.”
Elder continued: “To ensure you capture the data with the most effective workflow, you need modern technologies and vendors that have open technology stacks that can deliver integrations effectively and affordably. Bottom line: you should be setting the bar on where and how you want to measure up, not the other way around.”
Having the ability and adaptability to measure multiple data streams for your PHM efforts is crucial, but it all can seem a bit overwhelming. Elder says don’t get caught up in the enormity of it all. Simply start small and start soon.
“Get traction as soon as you can,” he said. “You don’t have to have all your data issues solved to start making progress. Adopt a small goal, meet that goal, and move on to the next. So, when it comes to meeting clinical benchmarks, you can start by focusing on ensuring that 85 percent of your diabetic patients have a controlled HbA1c, for example.”
Focusing on measuring just one clinical measure accurately can have a profound impact on your organizational engagement, said Elder. “If you can align your IT systems to accurately measure that data, and gain physician acceptance along the way, then the entire organization will be marching to the same beat. That’s how you apply the speed-to-value principle to your data management processes.”
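Once the underlying data is trusted, that single benchmark is straightforward to compute. The sketch below is a toy version: the patient list is made up, and the 8.0 percent control cutoff is an assumption for illustration — the right threshold comes from your own quality program’s measure definition.

```python
# Assumed goal and control cutoff; both are illustrative, not a measure spec.
GOAL = 0.85
CONTROL_CUTOFF = 8.0  # HbA1c, in percent

# Made-up registry of diabetic patients with their most recent HbA1c result.
diabetic_patients = [
    {"id": "P001", "latest_hba1c": 6.9},
    {"id": "P002", "latest_hba1c": 9.4},
    {"id": "P003", "latest_hba1c": 7.2},
    {"id": "P004", "latest_hba1c": 7.8},
]

# Count patients whose latest result falls under the control cutoff.
controlled = sum(1 for p in diabetic_patients
                 if p["latest_hba1c"] < CONTROL_CUTOFF)
rate = controlled / len(diabetic_patients)

print(f"{rate:.0%} controlled vs. {GOAL:.0%} goal:",
      "met" if rate >= GOAL else "not met")
```

The value of keeping the computation this simple is exactly Elder’s point: one measure, computed the same way everywhere, that physicians can inspect and trust before the organization moves on to the next.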
It takes more than a concentrated focus to accomplish that goal, however. You not only need measurement and configurability, but also real-time analytics: a key complexity solved by big data tools and techniques. “If your analytics vendor updates its measure snapshot monthly, then that will be the fastest you can adapt. Speed-to-value requires being able to tweak your data management processes hourly. It’s the only way you can ensure your data is accurate,” said Elder.
The message seems clear: Don’t sit on your data management and improvement efforts. It seems like common sense, but it’s not a commonplace practice in the IT world. There are many vendors who pride themselves on taking six to nine months to perfect your data infrastructure. But Elder warns that if it sounds too good to be true, it probably is.
“While waiting for your data to be cleaned and perfected, you make no impact on your population,” he said. “That’s nine months of twiddling your thumbs. That’s nine months you could be fixing minor health issues that turn into bigger ones during the latency period. And then, when you finally put your data back into play, you’re going to be looking at it differently than how you imported it in the first place. In this scenario, you’re waiting on a perfection that never comes, and guess what? Progress won’t come from that approach either.”
Similarly, Elder thinks that trying to map out every conceivable code or data classification is a fruitless place to start. “Focus on what you are trying to measure and don’t go beyond that. Creating results from that one measure is far more valuable than building out an infrastructure you may have to reconfigure anyways,” he said.
You don’t have to have every aspect of your data flow figured out from the get-go, but that doesn’t mean you have to resort to data management anarchy either. Elder said creating data governance is an important part of the process. However, your governance doesn’t have to be stodgy or structured from the top down like the data committees of yesteryear.
“Organizational awareness is still important. Your data governance practices should help create the culture shift you need to accomplish your data management and improvement efforts. Everyone needs to understand how their data input processes or any outside factors could alter or influence your data output,” said Elder. “You should ensure that you have a team of experienced stakeholders who are passionate about moving fast, and who can weigh in on significant changes to assess the downstream impacts of these changes. Moving fast doesn’t mean that you move without looking where you are going.”
To break down the divide between your IT and clinical teams, it’s also essential to create transparency around your technology. Gaining a clear understanding of how your data functions and flows throughout your organization can also help you pinpoint potential problems. “Make sure everyone understands what systems you have in place, what data you have in each one of them, and how that data flows throughout your data management ecosystem,” said Elder. “Then, you can build out a PHM program with that data, and keep the rules of how that data comes into the system simple.”
In addition, you need to adopt crowdsourced data management practices that support the proper care and feeding of your interfaces and data quality. “Everyone contributes to the problems surrounding data quality and everyone can be part of the solution,” said Elder. “If administration checks their possible-duplicate patient list or their physician attribution list for 15 minutes a day, then you not only have an economically sound data management model, but also one that fosters engagement. It puts everyone in the same boat and paddling in the same direction.”
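The daily possible-duplicate check Elder describes can start very simply. The sketch below is a toy version of that worklist, assuming records carry a name and date of birth: it groups records on a (last name, DOB) key and emits candidate pairs for a human to review. Real master-patient-index matching uses probabilistic scoring across many more attributes, but the output — a short list someone checks for 15 minutes a day — looks much like this.

```python
from collections import defaultdict
from itertools import combinations

# Made-up patient records; field names are assumptions for illustration.
patients = [
    {"id": "P001", "first": "Ann",  "last": "Lee", "dob": "1950-04-02"},
    {"id": "P002", "first": "Anne", "last": "Lee", "dob": "1950-04-02"},
    {"id": "P003", "first": "Bob",  "last": "Ray", "dob": "1962-11-09"},
]

def possible_duplicates(records):
    """Return ID pairs sharing a (last name, DOB) blocking key for review."""
    buckets = defaultdict(list)
    for r in records:
        buckets[(r["last"].lower(), r["dob"])].append(r["id"])
    pairs = []
    for ids in buckets.values():
        pairs.extend(combinations(sorted(ids), 2))
    return pairs

print(possible_duplicates(patients))  # [('P001', 'P002')]
```

Note the design choice: the script never merges records itself. It only surfaces candidates, keeping the human reviewer — the “crowd” in Elder’s crowdsourced model — in the loop for the final call.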
Adding to that idea of everyone moving upstream together, Elder stressed it’s important to engage your physicians with that same proactive approach. “Create a forum where physicians can understand the organization’s goals. Let them tell you what’s working and what’s not. Remember, they need to trust the data in order to effectively apply any improvements. In addition to the physicians, engage your community-based quality organizations, educate them on your program goals and identify opportunities for collaboration. Your MVPs in PHM will be the people who use the data, so engage them,” he said.
When it comes to outfitting your data management system with all that it needs, collaboration is important here, too. Elder cautioned that partnering with closed vendors that are not open to working with others can be a costly mistake. “No one vendor can solve all your problems,” said Elder. “You shouldn’t have to wait two years and spend $75,000 on an interface to serve your population effectively. That vendor is essentially saying, ‘let’s slow our customer down for our own financial gain.’ At Wellcentive, we commit to being open; we will send data to our competitors, if that is better for your PHM program. Remember, it’s about creating speed-to-value practices to ensure you end up with the best possible outcomes for your population.”
Elder said the technologies are out there to collaborate both simply and effectively. “Implementing single sign-on today is a commodity; interfacing with web-based APIs is a commodity. If your vendors don’t have those tools, and aren’t willing to work with them, then they’re out of touch with today’s data management world.”
All this talk about configuration, agility, and antiquated vendors may have you thinking about building your own data warehouse solution by hiring business intelligence experts. Don’t. Elder warned that if you want to walk away with any data management wisdom here, it should be this: there is no need, nor is it financially sustainable, to build an analytics solution yourself.
HIT resources that understand both the clinical concepts and information technology best practices are becoming scarce while the demand for them is growing. “There aren’t enough IT resources out there; for example, there are probably only a few hundred people in the nation with experience in clinical information and big data tools, and they come at a premium,” said Elder.
“Why waste time and money on a solution that isn’t going to be user friendly or inexpensive in the long run? There are already configurable, out-of-the-box solutions out there that can help you achieve successful data management. Remember, it’s about bringing speed-to-value to your PHM efforts. Any solution that doesn’t give you that isn’t worth the effort, the expense, or the wait.”