
Dove Medical Press

Validation of asthma recording in electronic health records: a systematic review

Overview of attention for article published in Clinical Epidemiology, December 2017

  • Above-average Attention Score compared to outputs of the same age (53rd percentile)
  • Above-average Attention Score compared to outputs of the same age and source (51st percentile)

Mentioned by

4 X users

Citations

32 Dimensions

Readers on

56 Mendeley
DOI 10.2147/clep.s143718
Authors

Francis Nissen, Jennifer K Quint, Samantha Wilkinson, Hana Mullerova, Liam Smeeth, Ian J Douglas

Abstract

To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies. Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential if these databases are to be used for credible epidemiological asthma research. We searched the EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables. Thirteen studies met the inclusion criteria. Most studies demonstrated high validity using at least one case definition (PPV >80%). Ten studies used manual validation as the reference standard; each had at least one case definition with a PPV between 63% and 100%. We also found two studies using a second independent database to validate asthma diagnoses; the PPVs of their best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%. Attaining high PPVs (>80%) is possible using each of the discussed validation methods. Identifying asthma cases in electronic health records is possible with high sensitivity, specificity, or PPV by combining multiple data sources or by focusing on specific test measures. Studies testing a range of case definitions show wide variation in the validity of each definition, suggesting that the choice of case definition is important for obtaining asthma definitions with optimal validity.
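The validation statistics the review summarizes (sensitivity, specificity, PPV, NPV) all derive from a standard 2×2 table comparing a database case definition against a reference standard such as manual record review. A minimal sketch of that arithmetic, using illustrative counts that are not taken from the review:

```python
def validation_stats(tp, fp, fn, tn):
    """Compute validation statistics from a 2x2 table comparing a
    database case definition (flagged / not flagged) against a
    reference standard (true case / true non-case)."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases the definition detects
        "specificity": tn / (tn + fp),  # non-cases correctly excluded
        "ppv": tp / (tp + fp),          # flagged records that are true cases
        "npv": tn / (tn + fn),          # unflagged records that are true non-cases
    }

# Hypothetical counts for illustration only:
# 90 true positives, 10 false positives, 20 false negatives, 880 true negatives
stats = validation_stats(tp=90, fp=10, fn=20, tn=880)
print(stats)  # ppv = 90 / (90 + 10) = 0.90, clearing the >80% threshold
```

Note that PPV depends on how common asthma is in the source population, not just on the case definition itself, which is one reason the same style of definition can perform differently across databases.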

X Demographics

The data shown below were collected from the profiles of the 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 56 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 56 100%

Demographic breakdown

Readers by professional status Count As %
Student > Master 10 18%
Researcher 9 16%
Student > Ph.D. Student 6 11%
Student > Postgraduate 3 5%
Professor > Associate Professor 3 5%
Other 5 9%
Unknown 20 36%
Readers by discipline Count As %
Medicine and Dentistry 18 32%
Social Sciences 3 5%
Biochemistry, Genetics and Molecular Biology 3 5%
Nursing and Health Professions 2 4%
Computer Science 2 4%
Other 6 11%
Unknown 22 39%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 December 2017.
All research outputs
#12,998,893
of 23,009,818 outputs
Outputs from Clinical Epidemiology
#349
of 727 outputs
Outputs of similar age
#201,449
of 437,935 outputs
Outputs of similar age from Clinical Epidemiology
#16
of 33 outputs
Altmetric has tracked 23,009,818 research outputs across all sources so far. This one is in the 43rd percentile – i.e., 43% of other outputs scored the same or lower than it.
So far Altmetric has tracked 727 research outputs from this source. They typically receive considerably more attention than average, with a mean Attention Score of 14.4. This one has received more attention than average, scoring higher than 51% of its peers.
Older research outputs tend to score higher simply because they have had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 437,935 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 53% of its contemporaries.
We can also compare this research output to the 33 others from the same source that were published within six weeks on either side of it. This one has received more attention than average, scoring higher than 51% of its contemporaries.