Mining Healthcare Variations for Gold

Cardinal Point’s Enterprise-Scale Multi-Site Improvement Methodology

Transforming multi-site healthcare enterprises is often seen as more of an art than a science. Organizations attempting large-scale change often experience fits and starts and never cross the finish line. The obstacles and distractions are many: the lure of the latest technology, rabbit holes dug in search of perfect solutions to relatively unimportant problems, an inability to execute in the face of a flood of new data, difficulty knowing where to start, resistance to change, and the sheer challenge of managing the change process. Even when organizations correctly identify necessary improvements, they are rarely able to scale those improvements enterprise-wide, and even the isolated improvements prove notoriously difficult to sustain.[1]

To overcome these obstacles, Cardinal Point Healthcare Solutions created a unique methodology that combines clinical epidemiology—specifically, identifying clinical variation—with systems engineering principles in a repeatable process. Applied clinical epidemiology allows hospitals and other healthcare providers to identify pockets of outstanding care. Systems engineering principles enable those organizations to replicate the successful methods across all locations in a long-term, sustainable manner.


Clinical variation is well-documented in the literature—both in publications by our team and by other clinical epidemiology researchers. These studies have found substantial variation in healthcare service costs and utilization with little or no correlation to improvements in patient health. In fact, variation has been shown to result in the underuse of effective care, misuse of preference-sensitive care, and overuse of supply-sensitive care. While Lean-influenced organizations often view variation as a problem to be corrected, Cardinal Point views variation as a source of “gold” that is waiting to be mined. We use clinical epidemiology to unearth pockets of outstanding care that can serve as a model of care across the enterprise. Merely identifying these care protocols, however, does not lead to lasting improvements.[2]

Drawing from many fields of expertise—including multiple types of engineering, natural sciences, social sciences, management and leadership—systems engineers have consistently improved productivity, efficiency, reliability and quality across many industries. Although less common in healthcare, a recent report from the President’s Council of Advisors on Science and Technology (PCAST) provides examples in which applying systems engineering principles—analyzing a system, its elements and the connections between those elements; assisting with policy and process design; and helping manage operations—has led to better quality and better outcomes at lower cost.[3]


“You drive. We navigate.”

Cardinal Point’s tagline describes our method succinctly. We guide and advise organizations as they execute the 14 steps provided below.

This approach creates two strong advantages for our clients. First, it allows them to learn the requisite skills through experiential learning, one of the most powerful and effective teaching methods. Second, by putting our clients “in the driver’s seat,” it ensures they retain those skills and can sustain a continuous learning and improvement environment long after our contract expires.

Methodology Overview

Step 1:   Identify project champion and steering committee
Step 2:   Build common language and understanding of clinical variation
Step 3:   Benchmark aggregate performance
Step 4:   Identify initial targets
Step 5:   Deep dive on individual site performance
Step 6:   Reveal blinded findings
Step 7:   Jointly develop improvement plan and corresponding metrics
Step 8:   Implement small tests of change
Step 9:   Adjust approach and ramp up implementation
Step 10: Document lessons learned
Step 11: Form enterprise improvement center
Step 12: Share success publicly and recognize team contributions
Step 13: Repeat process
Step 14: Move beyond your boundaries with an external learning collaborative

Methodology Details

Step 1: Identify project champion and steering committee

First, an organization must identify a strong project champion. As studies and experience have confirmed time after time, a strong project champion is critical to the success of projects that modify an organization’s culture. Next, the organization must form a steering committee composed of three to five senior leaders with the power to influence care pathways across all sites. This committee will lend broad support to the project and determine key priorities and the order of business. The committee must also visibly demonstrate its involvement and commitment by regularly visiting sites throughout the process and providing updates to the executives and front-line staff at those sites. Additionally, to inform committee decisions, the organization must assemble a data team capable of retrieving and interpreting clinical, billing, registry and patient-reported data.

Step 2: Build common language and understanding of clinical variation

When undertaking a large improvement and culture change effort, communication is key. As such, the organization must invest the time to construct and disseminate a shared glossary of consistently applied terms and a collection of case-study references for illustrative analogies.

Step 3: Benchmark aggregate performance

To build an understanding of overall health-system performance, the data team compiles aggregated system performance measures and compares them to external benchmarks such as national averages, regional competitors, and “top hospitals” identified by U.S. News & World Report, Consumer Reports or other sources.
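As a rough illustration only, the benchmarking comparison in this step might be sketched as follows. All measure names, rates and benchmark values are hypothetical, and this is not Cardinal Point tooling:

```python
# Illustrative sketch of Step 3: compare aggregate system measures against
# external benchmarks and flag underperformance. All values are hypothetical.

SYSTEM_MEASURES = {  # hypothetical system-wide rates, per 100 cases
    "readmission_rate": 16.2,
    "elective_induction_rate": 12.5,
    "behavioral_referral_rate": 12.0,
}

BENCHMARKS = {  # hypothetical national-average benchmarks
    "readmission_rate": 15.0,
    "elective_induction_rate": 10.0,
    "behavioral_referral_rate": 40.0,
}

# Direction matters: for some measures a lower rate is better,
# while for others (e.g., an effective-care referral rate) higher is better.
LOWER_IS_BETTER = {"readmission_rate", "elective_induction_rate"}

def benchmark_gaps(system, benchmarks):
    """Return each measure's gap versus its benchmark and an underperformance flag."""
    report = {}
    for measure, value in system.items():
        gap = round(value - benchmarks[measure], 1)
        worse = gap > 0 if measure in LOWER_IS_BETTER else gap < 0
        report[measure] = {"gap": gap, "underperforming": worse}
    return report
```

The point of the sketch is the direction-aware comparison: a gap is only meaningful once each measure is labeled as “lower is better” or “higher is better.”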

Step 4: Identify initial targets

Once the system’s performance against benchmark data is analyzed, the committee should use this analysis to identify target areas for improvement—i.e., areas where the health system as a whole performs substantially below the benchmark and/or areas targeted for strategic growth. While evaluating candidate areas for improvement, the committee should reflect upon the three ways variation can negatively affect patient outcomes: the underuse of effective care, the misuse of preference-sensitive care, and the overuse of supply-sensitive care. While these types of care are not always discrete, they do provide a framework for identifying needed solutions. Underuse of effective care requires system optimization. Misuse of preference-sensitive care requires identifying informed preferences and aligning delivery of care so that those preferences are integrated into the care plan. Overuse of supply-sensitive care requires a careful analysis of alternatives concerning resource allocation.

Step 5: Deep dive on individual site performance

Next, the data team shifts from evaluating system-wide performance to evaluating how individual sites within the system perform against both the system average and the benchmark. For reasons discussed later, the data team redacts the site names prior to presenting their findings to the steering committee. Once the committee sees the results, they experience an “aha!” moment as it becomes clear how understanding variation enables and informs improvement efforts. Without fail, each site performs well in some areas and poorly in others; in our experience, no site has ever performed uniformly well or uniformly poorly across all areas. Thus, it becomes clear that staff at each site have something to teach to and something to learn from colleagues at other sites.
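The site-level deep dive can be sketched the same way. The snippet below, again purely illustrative with hypothetical site names and rates, scores each site against the system mean for a single measure and blinds the site names before the results reach the steering committee:

```python
import statistics

# Illustrative sketch of Step 5: rank sites on one measure, compute each
# site's deviation from the system mean, and anonymize the site names.
# Site names and rates are hypothetical.

site_rates = {"Metro Hospital": 8.0, "Valley Hospital": 22.0, "Coastal Hospital": 12.0}

def blinded_site_report(rates):
    """Rank sites by rate and replace real names with anonymous labels."""
    mean = statistics.mean(rates.values())
    ranked = sorted(rates.values())  # drop the names entirely; keep only rates
    return [
        {"site": f"Site {chr(ord('A') + i)}",  # Site A, Site B, ...
         "rate": rate,
         "delta_vs_mean": round(rate - mean, 1)}
        for i, rate in enumerate(ranked)
    ]
```

Discarding the real names before ranking, rather than merely relabeling them, is what makes the later “reveal” step (Step 6) possible without anyone reverse-engineering the rankings in the meantime.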

Step 6: Reveal blinded findings

During this step, the steering committee convenes a meeting with executives representing each site and shares the findings. To keep nerves calm, the findings remain blinded. Because site performance is revealed in this blinded manner, the site executives share a common dissatisfaction with the composite system performance and the level of variation, as well as hope for future improvement as they recognize that some sites are already performing well.

Step 7: Jointly develop improvement plan and corresponding metrics

Next, the steering committee and site executives responsible for improvement targets jointly develop improvement plans and mechanisms to measure progress towards established goals. At this stage, the executives still do not know their site’s ranking. As such, their common desire not to be ranked last unifies them and heightens collaboration.

Rather than issuing blanket institutional changes across the enterprise, site executives plan a series of small tests of change that will quickly reveal progress towards established goals or, conversely, provide early indications of where procedures must be modified to meet the unique needs of a particular site. Prior to implementing the small tests of change, site executives share the concepts with the representative clinical teams tasked to implement the plan. These teams include representatives from among care providers, administrators and patients. The clinical teams provide critical input on which aspects of the plan are likely to work and which present risks. The site executives and representative clinical teams jointly modify the plans and identify intermediate metrics and targets.

Once the plan, metrics and targets are determined, the steering committee requests that an executive volunteer his or her site to pilot test the plan. After the pilot site is identified, the data team reveals the identities of the sites and their corresponding performance levels.

Step 8: Implement small tests of change

As a single site pilot tests the proposed change, the site executives share the results with the steering committee, other site executives and the representative clinical team on a regular basis. This collective group will adjust and fine-tune the implementation approach as it is rolled out and executed. The site will also document lessons learned and share those with the steering committee and other site executives.

Step 9: Adjust approach and ramp up implementation

The steering committee, site executives and representative clinical team apply the lessons learned and modify the implementation approach. Then, the site executives simultaneously field the change across all remaining sites. During this process, the steering committee and site executives reconvene at regular intervals to evaluate progress, make further adjustments and eventually begin facilitating the clinical team’s work of sustaining their gains. They also celebrate successes as sites meet or exceed the target goals.

Step 10: Document lessons learned

By this point in the process, much has been learned by all of those involved, including the steering committee, data team, site executives and clinical teams. These groups then document what went well and what went poorly. These lessons learned inform a guide to support future improvement efforts at each site and across the enterprise.

Step 11: Form enterprise improvement center

To sustain a culture of continuous learning and improvement, the organization establishes an Enterprise Improvement Center (EIC). This center is composed similarly to the steering committee: strong leaders augmented by a data team capable of retrieving and interpreting critical data. The group is then responsible for “carrying the torch” of improvement for the entire organization, identifying areas ripe for change and serving as a resource center for everyone involved in implementing change.

Step 12: Share success publicly and recognize team contributions

The project champion should make a concerted effort to share findings and progress with the organization as a whole throughout the improvement process. Now, with a verified success case, the project champion should credit and reward the key participants who made success possible. Additionally, the organization introduces the newly formed EIC and explains the role the EIC and its members will serve in support of the organization’s continued drive for improvement and learning. These communications can be released through a combination of newsletters, press releases, website postings, social media and in-person gatherings. Finally, when the organization’s leadership deems appropriate, news of success is shared with patients, payers and other external stakeholders.

Step 13: Repeat process

Following similar steps and procedures (now modified per lessons learned), the EIC and site executives identify and pursue new improvement efforts, reward successes and continually add new tips and techniques to the EIC tool chest of improvement resources.

Step 14: Move beyond your boundaries with an external learning collaborative

Often, after several successful rounds, improvement efforts will start to stall. This can be caused by many factors, including running out of “low-hanging fruit,” a lack of fresh perspective or general burnout. To avoid this plateau, the EIC members join (or establish) a learning collaborative of organizations that share a commitment to continuous improvement, a desire to exchange lessons learned, tools, techniques and successes, and a willingness to learn and try new ideas. By participating in a learning collaborative, the organization reignites the friendly competition that helps push boundaries and leads to further success.

Representative Case Studies

While the process described above has yet to be implemented from start to finish, several components have been tested with proven success. Using the principles of systems engineering, we have braided and sequenced these proven techniques. The subset of relevant case studies below highlights successes of major components of our methodology.

  • A military treatment facility in the southeastern US compared utilization rates, costs and outcomes data for the most common treatments and procedures among their active-duty personnel: obstetrics care, joint replacements, lumbar spine surgery and diabetes care. Their analysis of variation between facilities identified one facility with a rate of elective inductions among pregnant women more than three times higher than suggested guidelines and more than twice as high as any other military facility. These data helped target their initial improvement efforts and provided assurance the goals were achievable.
  • A spine treatment facility in New England assessed each of its providers’ referral rates to behavioral medicine for patients with a high likelihood of co-morbid depression. Rates varied from less than 5% to more than 85%, with a group mean of 12% for the nearly two dozen providers. The rates were de-identified and fed back to the providers in semi-monthly meetings. Academic papers citing the benefits of behavioral therapy for adjustment disorders were circulated and patient-reported outcomes were compared. Without knowing their own ranking, the providers worked collaboratively in an environment free of punishment or rewards. As a result, the providers decreased variation and increased the mean referral rate to more than 65% over a six-month period.
  • Learning collaboratives have a well-documented history of improving care delivery for specific populations of patients across numerous sites. The Northern New England Cardiovascular Disease Study Group, the Cystic Fibrosis Foundation, the Ohio Perinatal Quality Collaborative and the American Joint Replacement Registry are just a few examples. The NNECDSG and CF Foundation collaboratives produced many peer-reviewed publications detailing improved survival rates following specific procedures, extended lifespan, and reduced complication rates. The CF and OPQC collaboratives also documented their hurdles and efforts to overcome resistance to change, providing a wealth of information as to what to expect when change efforts are launched.

About Cardinal Point

Based in southern California, Cardinal Point Healthcare Solutions helps clients throughout the U.S. successfully navigate today’s complex and chaotic healthcare environment. We are passionate about improving healthcare and look forward to discussing our methodologies, successes and prior lessons learned with those who share our passion. Visit Cardinal Point to learn more or to contact us.

[1] Berwick, Donald M. “Disseminating Innovations in Health Care.” JAMA 289.15 (2003): 1969-1975.
[2] Clarke, John R. “Editorial: The Value of Collaborative Learning for Disseminating Best Healthcare Delivery Practices.” Pennsylvania Patient Safety Authority 8.2 (2011): 45-46.
[3] President’s Council of Advisors on Science and Technology. “Report to the President: Better Health Care and Lower Costs: Accelerating Improvement through Systems Engineering.” (2014).