Agreement among healthcare professionals in ten European countries in diagnosing case-vignettes of surgical-site infections.
Details
ID Serval
serval:BIB_F6593FA765F3
Type
Article: journal or magazine article.
Collection
Publications
Institution
Title
Agreement among healthcare professionals in ten European countries in diagnosing case-vignettes of surgical-site infections.
Journal
PLoS ONE
ISSN
1932-6203 (Electronic)
ISSN-L
1932-6203
Editorial status
Published
Publication date
2013
Peer-reviewed
Yes
Volume
8
Issue
7
Pages
e68618
Language
English
Notes
Publication types: Journal Article; Research Support, Non-U.S. Gov't
Abstract
OBJECTIVE: Although surgical-site infection (SSI) rates are advocated as a major evaluation criterion, the reproducibility of SSI diagnosis is unknown. We assessed agreement in diagnosing SSI among specialists involved in SSI surveillance in Europe.
METHODS: Twelve case-vignettes based on suspected SSI were submitted to 100 infection-control physicians (ICPs) and 86 surgeons in 10 European countries. Each participant scored eight randomly assigned case-vignettes on a secure online relational database. The intra-class correlation coefficient (ICC) was used to assess agreement on SSI diagnosis, rated on a 7-point Likert scale, and the kappa coefficient to assess agreement on SSI depth, rated on a 3-point scale (a computational sketch of these agreement statistics follows the abstract).
RESULTS: Intra-specialty agreement for SSI diagnosis ranged across countries and specialties from 0.00 (95% CI, 0.00-0.35) to 0.65 (0.45-0.82). Inter-specialty agreement varied from 0.04 (0.00-0.62) to 0.55 (0.37-0.74) in Germany. For all countries pooled, intra-specialty agreement was poor for surgeons (0.24, 0.14-0.42) and good for ICPs (0.41, 0.28-0.61). Reading the SSI definitions improved agreement among ICPs (0.57) but not among surgeons (0.09). Intra-specialty agreement for SSI depth ranged across countries and specialties from 0.05 (0.00-0.10) to 0.50 (0.45-0.55) and was not improved by reading the SSI definitions.
CONCLUSION: Among ICPs and surgeons evaluating case-vignettes of suspected SSI, considerable disagreement occurred regarding the diagnosis, with variations across specialties and countries.
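The two agreement statistics named in the METHODS can be illustrated with a short sketch. The Python fragment below is not the paper's analysis code: the ratings are randomly generated stand-ins and the rater/vignette counts are illustrative assumptions, but the measures match those described above, an intra-class correlation for the 7-point diagnosis scores (computed here with the pingouin library) and Fleiss' kappa for the 3-point depth categories (computed with statsmodels).

    # Minimal sketch of the agreement statistics, with hypothetical data.
    import numpy as np
    import pandas as pd
    import pingouin as pg                                   # pip install pingouin
    from statsmodels.stats.inter_rater import fleiss_kappa  # pip install statsmodels

    rng = np.random.default_rng(0)
    raters, vignettes = 5, 8  # illustrative sizes, not the study's

    # Long-format table: each rater scores each vignette on a 7-point
    # Likert scale (certainty that the vignette represents an SSI).
    long = pd.DataFrame({
        "vignette": np.repeat(np.arange(vignettes), raters),
        "rater": np.tile(np.arange(raters), vignettes),
        "score": rng.integers(1, 8, size=raters * vignettes),
    })

    # Intra-class correlation coefficient for the diagnosis scores.
    icc = pg.intraclass_corr(data=long, targets="vignette",
                             raters="rater", ratings="score")
    print(icc[["Type", "ICC", "CI95%"]])

    # Fleiss' kappa for SSI depth on a 3-point categorical scale:
    # rows = vignettes, columns = counts of raters choosing each category.
    depth = rng.integers(0, 3, size=(vignettes, raters))
    counts = np.stack([(depth == k).sum(axis=1) for k in range(3)], axis=1)
    print("Fleiss' kappa:", fleiss_kappa(counts))

With random uniform ratings both statistics will hover near zero, which is the expected behavior for raters who agree no better than chance; the study's reported values should be read against that baseline.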
Keywords
Clinical Competence, Cross Infection, Europe, Health Personnel, Humans, Physicians, Quality Assurance, Health Care, Surgical Wound Infection/diagnosis
PubMed
Open Access
Yes
Record created
27/12/2013 12:29
Record last modified
20/08/2019 17:22