By the numbers

Dermatologists move toward data gathering, registries to enhance practices

As regulators endeavor to foster provider accountability and reduce the cost of care, both medicine in general and dermatology specifically will be asked to demonstrate value at every turn. Dermatologists, and dermatology as a specialty, are responding. Individual dermatologists are identifying and addressing practice gaps, with their participation in activities required for maintenance of certification (MOC) leading to the adoption of better data-collection methods. Meanwhile, groups within the specialty, including the AAD, are working to build a specialty-specific body of quantitative data. Together, these individual and collective efforts are positioning dermatology for medicine’s data-driven future.

Data demands

The move toward more data gathering, the development of substantial outcomes registries, and the creation of targeted education come from multiple fronts. Payers, regulators, and legislators want to be able to benchmark physician performance and track patient results against treatment costs to ensure that health care spending is being translated into value. In response, specialty societies want to develop targeted continuing education, demonstrate the value their physicians provide, and provide feedback to members about their performance and what they can do to improve. And individual physicians want to ensure that they provide the best quality care to their patients and evaluate how they rank against colleagues in their specialty as well as other physicians who treat similar conditions. 

Data gathering is currently being undertaken by a host of parties including universities, physician organizations, hospital groups, and both the federal and state governments. One example at the government level is the reporting to approved clinical data registries under the Physician Quality Reporting System (PQRS). The system collects certain practice metrics and clinical data from participating physicians and reports it to a central registry in exchange for incentive payments. Under the 2014 Medicare Physician Fee Schedule, eligible professionals must report at least nine measures for at least 50 percent of patients to a qualified clinical data registry to earn a 0.5 percent incentive payment and must report at least three measures this year to avoid a payment adjustment of -2.0 percent in 2016.
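The 2014 reporting thresholds described above reduce to a simple decision rule. The sketch below is a hypothetical illustration of that arithmetic only; the function name and inputs are invented for this example and are not part of any CMS system.

```python
def pqrs_adjustment(measures_reported: int, patient_coverage: float) -> float:
    """Sketch of the 2014 PQRS incentive/penalty rule described above.

    Returns the Medicare payment adjustment as a fraction:
    +0.5% incentive for reporting at least 9 measures on at least
    50 percent of patients to a qualified clinical data registry;
    0.0% (the -2.0% adjustment is avoided) for at least 3 measures;
    -2.0% adjustment (applied in 2016) otherwise.
    """
    if measures_reported >= 9 and patient_coverage >= 0.5:
        return 0.005   # 0.5 percent incentive payment
    if measures_reported >= 3:
        return 0.0     # payment adjustment avoided
    return -0.02       # -2.0 percent adjustment in 2016

# Example: reporting 9 measures on 60 percent of patients earns the incentive
print(pqrs_adjustment(9, 0.60))  # -> 0.005
```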

While participation status is the only thing about PQRS that Physician Compare currently reports to users, data on the results of that participation will become public in the near future. Medicare is in the process of adding quality care ratings for group practices that reported PQRS data in 2013, and plans to have these scores live for patients this year. CMS has stated plans to post performance ratings for individual physicians in 2015 “if technically feasible.” Whatever the timeline, CMS has made clear that individual performance ratings are coming, reflecting a belief that public reporting of performance data will motivate demonstrable performance improvement.

Yet some regulators feel that an even more aggressive approach is warranted. In December 2013, the Government Accountability Office issued a report to the Department of Health and Human Services (HHS), “Clinical Data Registries: HHS Could Improve Medicare Quality and Efficiency through Key Requirements and Oversight,” that recommended an increased focus on clinical data registries (CDRs). The report said the 2014 fee schedule offers “little specificity concerning [clinical data registry] objectives or results,” and recommended that HHS take more proactive steps to benchmark physician performance through patient outcomes and treatments, provide educational activities to close clinical practice gaps, and provide performance feedback to Medicare physicians via a set of registry-specific performance data. The report calls for data registries crafted by physicians, who would select a set number of applicable measures from an HHS-defined list of quality measures.

“The new qualified CDR program,” the report said, “will provide physicians with an alternative to participation in HHS’s existing [PQRS].”

In addition to Medicare-mandated clinical data registries, medical specialty societies and hospital systems are also investigating the collection and analysis of members’ care data. Doing so allows for a clearer picture of the behaviors and outcomes of a cohort of physicians, and can lead to the creation of specialty-wide quality improvement measures, according to Joel Gelfand, MD, MSCE, associate professor of dermatology at the University of Pennsylvania, who is currently studying comparative effectiveness under a grant from the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS). To collect the necessary data, Dr. Gelfand and his partners in the project started a network of practices known as the Dermatology Clinical Effectiveness Research Network (DCERN).

“We created a network that involves 10 sites across the country, two academic sites and the rest private practice. We collected data over a year on nearly 2,000 patients who came in for treatment of moderate to severe psoriasis at their regularly scheduled appointments,” Dr. Gelfand said. “As a result, we (DCERN) have obtained clinical data that can be used to plan outcome-based measures for the AAD to consider as quality performance measures in clinical practice.”

The network’s studies have also demonstrated the importance of collecting data in real-world settings, Dr. Gelfand said, in order to understand how treatments really work for patients.

“We’ve found that with drugs like adalimumab, or infliximab, or ustekinumab, none of these therapies achieved a rate of clearance in the real world close to what they achieve in clinical trials,” Dr. Gelfand said. “A lot of that, we think, has to do with in clinical trials in week 12, they look to see if the patient is clear, whereas in the real world, the patients are on these drugs for more than just 12 weeks, and the drugs lose effectiveness over time.” (See sidebar, “Research and reality,” for more on the difference between trial results and what happens in daily practice.)

Data and MOC

Better clinical and patient outcome data collection is integral to the requirements of MOC across specialties. Dermatologists can satisfy part of component 4 of Maintenance of Certification — Evaluation of Performance in Practice — through performance improvement CME (PI CME). Participants collect and review clinical data, develop a plan to improve the quantified results, implement that plan, and report on the results. (The Academy offers PI CME modules in acne, atopic dermatitis, melanoma, biopsy, psoriasis, chronic urticaria, and venous insufficiency.)

“The American Board of Dermatology is very interested in making very straightforward and simple quality improvement measures connect to component 4 of MOC,” said Erik Stratman, MD, editor of the practice gaps section of JAMA Dermatology and chair of the department of dermatology at the Marshfield Clinic in Wisconsin. “The busy clinician isn’t a two-day-a-week clinical researcher with a background in epidemiology and chart abstraction.” The process of introspection about one’s true educational needs doesn’t have to be arduous, he said. “At JAMA Derm, we use our articles to try and stimulate people to look for their own gaps, and look for ways to improve. That’s a great way to start the QI process.”

Participation in PI CME allows dermatologists to not only improve gaps in their practice, but demonstrate that they are engaged in quality improvement efforts. Demonstrating a commitment to quality improvement, Dr. Stratman said, will prove vital to the specialty going forward.

June Robinson, MD, editor of JAMA Dermatology and research professor in dermatology at Northwestern University, agreed. She said that in a future when fee-for-service is slowly phased out, every physician and specialty will need to provide quantitative metrics or risk declining reimbursement and poor perception.

“Someday, probably soon, someone’s going to show up at the door and say, ‘show me your quality improvement measures,’ and it’s important that you be able to answer what you’re measuring here and why it’s important,” Dr. Robinson said. “If you want to really demonstrate value, you pick something that makes an immediate, noticeable difference to patients. For instance, prescribing gabapentin more frequently for older patients with herpes zoster can prevent post-herpetic neuralgia. You can stop the patient from hurting and track the decrease in occurrence if you start it once you diagnose herpes zoster.”

According to Alexa Kimball, MD, MPH, professor of dermatology at Harvard Medical School and medical director of the Massachusetts General Physicians Organization and a member of the Academy’s Outcome Study Workgroup, the time to get used to measuring performance is now, both because it can lead to better outcomes and because accountability for the results of measurement is around the corner.

“Over the next five to 10 years, all doctors are going to be held accountable for performance on outcomes from payers, credentialing boards, and state licensure bodies,” she said. “I can’t emphasize enough how important this trend is for the field.”

Gathering dermatology data

Dermatologists who have already participated in data collection have largely reported on their processes: whether a melanoma diagnosis or a biopsy result was shared with a primary care provider, for example. But payers and other stakeholders want data on the outcomes of care as well — whether a patient given a particular treatment improved — according to Robert S. Kirsner, MD, PhD, who chairs the Academy’s Council on Education and Maintenance of Certification and is professor, vice chairman, and the endowed Stiefel Laboratories Chair in the department of dermatology and cutaneous surgery at the University of Miami Miller School of Medicine and chief of dermatology at the University of Miami Hospital. The iPLEDGE program is an example of data reporting that measured processes and also provided some outcome data.

“Our patients taking isotretinoin systematically have labs and those of child-bearing potential have pregnancy tests, they then fill their prescriptions and also receive information on the drug. We then track whether or not pregnancy occurred,” he said. In other words, the outcome being measured, fetal exposure to isotretinoin, was measured at a specialty-wide level, with the results available to guide future health care decision-making by individuals and by payers and regulators. “We’d like to be able to apply that type of process, albeit in a less onerous fashion than iPLEDGE, to more complex dermatologic conditions and learn what that process tells us about the outcomes of our treatments,” Dr. Kirsner said. “It’s very important for registries to capture outcomes. While registries that capture the process and structure of how you integrate knowledge and apply it to patients are helpful, most stakeholders want to see outcomes.”

Some other specialties, Dr. Kimball said, have led the way in developing practitioner-friendly data-collection methods and practice improvement measures. Many surgical specialties, as well as cardiology, have developed robust registries that focus on simple results and short-term outcomes — a rise or fall in blood pressure, or short-term surgical survival rates. (See sidebar, “Data gathering in other specialties,” for details.) But for dermatology, she said, measurements must be carefully chosen and implemented with a longer-term vision in mind, depending on the condition, as many conditions dermatologists treat are complex, with change that can take place over a longer period of time. (See sidebar, “Integrating data collection into workflow,” for discussion of how Massachusetts General Hospital has addressed this challenge for acne and psoriasis.)

“We don’t have hemoglobin A1C or blood pressure to use as a measure. We’re relying on scores that are generated by either physicians or patients,” Dr. Kimball said. “If we’re going to start to collect outcomes data in community-based and academic practices that are not part of a study registry, we have to make it simple.”

Getting onboard

As with any large data collection effort, the most important aspect, according to Oliver Wisco, DO, chair of the Academy’s Performance Measurement Task Force and clinical assistant professor of dermatology at Tulane University, is getting the majority of the concerned physicians onboard. While some physicians are still wary about sharing detailed practice data, he said, the widespread collection and interpretation of data is only going to become more integrated into daily practice. The most significant hurdle to more widespread data collection, he said, is that in most cases larger registries are not yet integrated with EHRs, meaning that to participate in many of them, data must be entered twice.

“I think objections are mainly based on workflow now. If you’re duplicating your data, who is re-entering that data? I don’t fault anyone for being frustrated with that process. It’s very onerous on the user,” Dr. Wisco said. “In terms of fear for the data, I think there’s a lot of skepticism about what we’re collecting. People are scared that these measures are going to look negatively on them. The measures we have now are more focused on enabling participation in PQRS. Eventually, once we have more meaningful data, we will be developing outcome-driven quality measures with appropriate performance benchmarks. The objective of a performance measurement system is not achieving perfection; it is to instill a drive to constantly improve. As for the reporting burden, a number of EHR companies are working to make their records more compatible with the specialty registries, and as CMS qualifies more measures, it should enable more participation.”

Despite their concerns, Dr. Wisco said, dermatologists should embrace data collection and reporting as a way to demonstrate value to patients, in addition to health system regulators and payers —while pushing to make doing so less burdensome.

“You take a baseball card, look at the back, it contains every performance statistic on that player,” Dr. Wisco said. “Those are very specific indices that people can relate to in order to understand value. We have to do the same thing in medicine.” Performance benchmarks should also be set at reasonable levels, much like baseball players are considered good if they can hit the ball one out of three times, he said.

The players, however, don’t collect their own data, he added. They have a system built around them that does it for them, and dermatologists will eventually have the data generated in a way that does not impede the day-to-day workflow of the practice.

Improving the care provided by dermatologists individually and by the specialty as a whole, and demonstrating the improvement to certifying bodies, payers, and the government, will require both individual dermatologists and the specialty to investigate areas where they may be underperforming and apply their findings to practice, Dr. Kirsner said.

“It’s going to be vital to unify the specialty in looking at disease processes and how dermatologists compare in treatment,” he said. “We can now develop our own registries and it’s empowering to collect our own data.” 



Data gathering in other specialties
Melanoma reporting
Integrating data collection into workflow
Research and reality