    Insight Article
    MGMA Staff Members

    Better performance across your medical group doesn’t happen without a culture focused on continuous monitoring and improvement, and those efforts don’t happen without data.

    Karl Sundberg, FACHE, CMPE, chief executive officer, Catalyst Medical Group, put the need to move from data to insights and action succinctly during his 2022 Medical Practice Excellence: Leaders Conference session, “Creating a Data-Driven, High-Performance Culture.”

    “If we’re not continually problem-solving, what’s the point of having data in the first place?” Sundberg noted, invoking V.F. Ridgway’s cautionary note about understanding why you collect data: “What gets measured gets managed — even when it’s pointless to measure and manage it, and even if it harms the purpose of the organization to do so.”1

    As Sundberg said, the purpose must stay at the forefront for organizations to succeed: “The idea here is not that data is not important or that measuring performance isn’t important — the problem is … a lot of times we start measuring things, and then the measure itself becomes the goal, the purpose sometimes gets lost.”

    For historically change-resistant healthcare providers, that purpose is usually clear: “Continually looking for norms that they can challenge,” Sundberg said, and then incrementally improving performance by setting objective goals and testing hypotheses about subjective opinions of what needs to change.

    Exercising caution and building alignment

    One of the biggest missteps among healthcare leaders is forgetting that data and various measures are tools and not always capable of telling an accurate story of the practice’s performance. They are not “an exact measurement of truth in and of themselves,” Sundberg stressed. “They require context” to move beyond a “data-rich, information-poor” (DRIP) situation in which data can be misinterpreted or relied upon too heavily.

    Table 1. Examples of alignment versus misalignment

    For example: Setting a goal to improve patient experience scores without looking more broadly at the experience of clinicians, staff and others who might influence that score can lead to the score becoming the target. “The purpose at that point is completely lost,” Sundberg cautioned, if the changes result in cutting corners or other uncommon activities that could have unintended, adverse consequences on finances, employee burnout and beyond.

    Successful transformational cultures typically will build alignment across mission/vision statement(s), operating models and governance structure, with the organization’s people as the central resource to enable change that has widespread buy-in.

    “Be mindful of [whether] the operating model of the organization is structured in a way that decisions can be made, supported and moved forward,” and that those changes actually drive improvement in the measures you’re targeting, Sundberg said.

    Practice leaders also need to examine the organization’s norms — beliefs and myths about performance and how work gets done. “You may have to leverage some data to actually unwind some of those beliefs and prove or disprove some of those subjective truths [e.g., assumptions of what good performance is] that have been established within the organization before you can move forward,” Sundberg added. “Look at some benchmarks: Are we really as effective as we believe we are?”

    Creating a climate for change will involve bringing in the right stakeholders, setting a small number of goals at the outset and keeping those goals top of mind for those involved. Any gaps in those three key areas (as seen in Table 1) can result in distractions, poor execution or resistance from stakeholders.

    “This is something that I have learned the hard way: There are so many different things that need to be improved, and it’s tempting to take them all on at once,” Sundberg said. “But this is a situation where you can create burnout for your team” or otherwise see poor execution from too many competing initiatives.

    MASC for data

    Sundberg suggested four key characteristics for using data to drive performance — remember the acronym MASC:

    • Meaningful: “Make it have the impact and tell the story that it’s supposed to,” Sundberg said.
    • Actionable: Select data that will be useful for your purpose(s).
    • Simple: Ensure it is easily understood by everyone involved.
    • Consistent: Especially when building spreadsheet reports, formula errors tend to creep in and then require manual fixes; addressing them early will make data reviews easier across projects (see the sketch after this list).
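
    As a minimal sketch of that last point, assuming a hypothetical monthly report exported to CSV with charges, adjustments and net_revenue columns, derived figures can be recomputed in code and any drift from the spreadsheet formulas flagged automatically:

```python
import pandas as pd

# Hypothetical export of a spreadsheet report; file and column names are assumptions.
report = pd.read_csv("monthly_report.csv")

# Recompute the derived column rather than trusting the spreadsheet formula.
expected_net = report["charges"] - report["adjustments"]

# Flag rows where the stored value has drifted from the recomputed one.
mismatches = report[(report["net_revenue"] - expected_net).abs() > 0.01]
if not mismatches.empty:
    print(f"{len(mismatches)} rows have inconsistent net_revenue values")
    print(mismatches[["charges", "adjustments", "net_revenue"]])
```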

    Data to drive performance

    With an understanding of how to align your organization for transformative change, the work shifts to ensuring that the data you plan to use to guide that work is standardized.

    Table 2. Examples of data "cubes" for standardization and visualization

    The standardization process should be prefaced by “cubing” the data: assembling like types of data (e.g., finance/accounting, HR, operations, clinical) from the various sources. In operations, that typically will include data around patient type, insurance, appointment length, on-time rate, rescheduled appointments, cancellations and unfilled slots.
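
    As a rough illustration, the sketch below assembles an operations cube from two hypothetical extracts (a scheduling file and a registration file); every file and column name here is an assumption for illustration, not a prescribed format:

```python
import pandas as pd

# Hypothetical extracts; file and column names are illustrative assumptions.
schedule = pd.read_csv("scheduling_extract.csv")        # appt_id, patient_type, appt_length_min, status, start_time, arrival_time
registration = pd.read_csv("registration_extract.csv")  # appt_id, insurance_plan

# Assemble the "operations" cube: one row per appointment, with like data joined together.
ops = schedule.merge(registration, on="appt_id", how="left")

# Derive operational measures of the kind the article lists: on-time rate and cancellations by patient type.
ops["on_time"] = pd.to_datetime(ops["arrival_time"]) <= pd.to_datetime(ops["start_time"])
summary = ops.groupby("patient_type").agg(
    appointments=("appt_id", "count"),
    on_time_rate=("on_time", "mean"),
    cancellations=("status", lambda s: (s == "cancelled").sum()),
)
print(summary)
```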

    Another example is being able to measure data not just by provider but also by clinical FTE, since many clinicians have significant hours carved out for leadership duties or academic work, or practice clinically on a part-time basis. Using that level of analysis will yield different results than simply measuring by provider.
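
    A small, hypothetical example of how the two views diverge (the providers and numbers below are made up for illustration):

```python
import pandas as pd

# Hypothetical providers; the wRVU totals and clinical FTE splits below are invented.
providers = pd.DataFrame({
    "provider": ["Provider A", "Provider B", "Provider C"],
    "wrvus": [4800, 4800, 2400],
    "clinical_fte": [1.0, 0.6, 0.5],   # share of time actually spent in clinical work
})

# Raw per-provider totals treat every clinician the same; normalizing by clinical FTE
# tells a different story for part-time or administratively loaded clinicians.
providers["wrvus_per_clinical_fte"] = providers["wrvus"] / providers["clinical_fte"]
print(providers)
# Provider B produces the same raw total as Provider A but is far more
# productive per clinical FTE (8,000 vs. 4,800 wRVUs).
```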

    Standardizing that data then should involve making sure common fields across data sources match — such as date formatting, department names, user/employee/provider identifiers — with a common naming/numbering convention. From there, Sundberg advised establishing benchmarks wherever possible, pointing to MGMA DataDive and other sources, as needed. “Many times we don’t have an external benchmark,” he noted, so use of an internal benchmark may be necessary.
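
    One way to apply a common convention across sources, sketched here with assumed column names (service_date, department, provider_id) and an illustrative department crosswalk:

```python
import pandas as pd

# Hypothetical crosswalk so department names match across source systems.
dept_map = {"FAM MED": "Family Medicine", "Family Practice": "Family Medicine"}

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Apply a common date format, department naming and provider ID convention."""
    out = df.copy()
    out["service_date"] = pd.to_datetime(out["service_date"]).dt.strftime("%Y-%m-%d")
    out["department"] = out["department"].replace(dept_map)
    out["provider_id"] = out["provider_id"].astype(str).str.zfill(6)
    return out

# Apply the same convention to every source before joining or benchmarking, e.g.:
# billing = standardize(pd.read_csv("billing_extract.csv"))
# scheduling = standardize(pd.read_csv("scheduling_extract.csv"))
```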

    This process will begin to reveal areas in which the practice may not be collecting needed data and spur a look into discovering or implementing new collection methods, be that from the EHR or an integrated system.
    With those elements collected and the data cleaned up, the next step is to build reports and/or dashboards, which vary in frequency and use:

    • Reports are episodic in nature; they can miss trends, but they often spark additional questions.
    • Dashboards are longitudinal, easily showing comparisons and trends over time, which often help answer the questions raised by reports.
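
    To make that distinction concrete, here is a brief sketch using a hypothetical encounter-level extract: the same collections measure viewed as a single-period report and as a month-over-month dashboard trend.

```python
import pandas as pd

# Hypothetical encounter-level extract with a service date and collections per encounter.
encounters = pd.read_csv("encounters.csv", parse_dates=["service_date"])
encounters["month"] = encounters["service_date"].dt.to_period("M")

# Report-style view: one period in isolation (episodic; can miss the trend).
latest = encounters["month"].max()
print(encounters.loc[encounters["month"] == latest, "collections"].sum())

# Dashboard-style view: the same measure tracked month over month (longitudinal).
print(encounters.groupby("month")["collections"].sum())
```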

     
    Sundberg cautioned against the instinct to rely heavily on reports, as they often require multiple steps to produce and don’t always provide the level of detail needed to answer the questions you have as a practice leader.

    While several EHRs and PM systems now have a payer mix report built in with even some type of data visualization, Sundberg pointed out there are other questions to be answered: “A lot of times they don’t compare to our encounter volume; you can have a payer mix that’s not just based on payments, [but rather] based on number of encounters — and that’s a totally different answer.”
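
    A minimal sketch of that comparison, assuming a hypothetical encounter-level extract with one row per encounter and columns for payer and payments:

```python
import pandas as pd

# Hypothetical encounter-level extract; one row per encounter with payer and payment amount.
encounters = pd.read_csv("encounters.csv")

# Payer mix by dollars collected.
by_payments = encounters.groupby("payer")["payments"].sum() / encounters["payments"].sum()

# Payer mix by encounter volume: often a very different answer when
# reimbursement per visit varies widely across payers.
by_encounters = encounters.groupby("payer").size() / len(encounters)

print(pd.DataFrame({"by_payments": by_payments, "by_encounters": by_encounters}).round(3))
```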

    Longitudinal data visualization in an area such as revenue can show stratified amounts of revenue across different payers over time and let you hover over the data to identify discrete elements — “who’s bringing it in, where’s it coming from, who’s the highest, who’s the lowest,” Sundberg summarized. This approach can yield actionable information around total collections by payer while still examining the data elements contributing to it, such as Medicare levels versus commercial payers.
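
    One way to build that kind of view, sketched with Plotly Express and a hypothetical monthly collections-by-payer file (file and column names are assumptions):

```python
import pandas as pd
import plotly.express as px

# Hypothetical monthly collections by payer; file and column names are assumptions.
revenue = pd.read_csv("collections_by_payer.csv", parse_dates=["month"])
monthly = revenue.groupby(["month", "payer"], as_index=False)["payments"].sum()

# A stacked area chart stratifies total collections by payer over time;
# hovering a point reveals the discrete payer-level amount behind the total.
fig = px.area(monthly, x="month", y="payments", color="payer",
              title="Collections by payer over time")
fig.show()
```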

    Processes to drive performance

    With data visualized and helping you tell a story about your organization’s performance, it becomes a matter of driving better performance through new or updated processes. Sundberg recommended using what he referred to as the three Fs:

    1. Focus: With clear and aligned goals and three to five key measures per goal, set intervals for reviewing performance toward those goals and provide visible progress updates for your teams to see, especially against available benchmarks.
    2. Fix: Identify areas of needed improvement (what needs changing and why) through the creation of improvement/action teams, with clearly assigned leader responsibilities and goals. Gather stakeholders and subject matter experts on staff as needed to map processes, and consider using Lean Six Sigma concepts for optimal efficiency.
    3. Follow up: Regularly scheduled huddles or virtual huddle boards should be used along with team meetings and individual meetings to engage on how performance has changed and whether the team has achieved what they set out to do.

     
    These steps can help reinforce the idea that the measures themselves are not always the outcome, and that monitoring is not the same as actual performance, Sundberg said. While collecting, standardizing and visualizing the data are vital steps, they are only part of the foundation for the work ahead. The other parts of that foundation are your people and culture.

    “Building a high-performance team really starts with culture,” Sundberg said. “Many times I see people looking at data as the answer. Data is necessary to drive that performance … But too many times we forget that change is driven by people, and changes driven by people need to involve those people from the start.”

    Note:

    1. Ridgway VF. “Dysfunctional Consequences of Performance Measurements.” Administrative Science Quarterly, 1(2): 240-247, 1956.

    Written By

    MGMA Staff Members


