This article has Open Peer Review reports available.
Practice development plans to improve the primary care management of acute asthma: randomised controlled trial
© Foster et al; licensee BioMed Central Ltd. 2007
Received: 13 September 2006
Accepted: 24 April 2007
Published: 24 April 2007
Our professional development programme aimed to improve the primary care management of acute asthma, which is known to be suboptimal.
We invited 59 general practices in Grampian, Scotland to participate. Consenting practices were randomised to early or delayed intervention groups. Practices undertook audits of their management of all acute attacks (excluding those in children under 5 years) occurring in the 3 months preceding the baseline, 6-month and 12-month study time-points. The educational programme (including feedback of audit results, attendance at a multidisciplinary interactive workshop, and formulation of a development plan by practice teams) was delivered to the early group at baseline and to the delayed group at 6 months. The primary outcome measure was the recording of peak flow compared with best/predicted at 6 months. Analyses are presented both with and without adjustment for clustering.
23 consenting practices were randomised: 11 to early intervention. Baseline practice demography was similar. Five early intervention practices withdrew before completing the baseline audit. There was no significant improvement in our primary outcome measure (the proportion with peak flow compared to best/predicted) at either the 6- or 12-month time point after adjustment for baseline and practice effects. However, the between-group difference in the adjusted combined assessment score, whilst non-significant at 6 months (early: 2.48 (SE 0.43) vs. delayed: 2.26 (SE 0.33), p = 0.69), reached significance at 12 months (early: 3.60 (SE 0.35) vs. delayed: 2.30 (SE 0.28), p = 0.02).
We demonstrated no significant benefit at the a priori 6-month assessment point, though improvement in the objective assessment of attacks was shown after 12 months. Our practice development programme, incorporating audit, feedback and a workshop, successfully engaged the healthcare team of participating practices, though future randomised trials of educational interventions need to recognise that effecting change in primary care practices takes time. Monitoring of the assessment of acute attacks proved to be a feasible and responsive indicator of quality care.
National and international guidelines provide evidence-based advice about the management of acute asthma, emphasising the need for objective assessment, prompt treatment of the attack, and provision of self-management education as part of structured follow-up. [1, 2] Despite early hopes about the potential of guidelines to improve practice, [3] and increasing emphasis on ensuring wide dissemination, [4, 5] there is continuing concern that the care of acute asthma remains suboptimal. [6–9] This is of particular importance in primary care, where 90% of acute asthma is managed. [6–8]
Acute asthma is common, with over 100,000 admissions and 1,500 deaths a year attributed to asthma in the UK. [11, 12] Confidential enquiries into the causes of asthma deaths over the last three decades have consistently implicated failure to appreciate the severity of the attack, resulting in delayed and inadequate emergency treatment, as a potentially preventable factor. [13–16]
Didactic lectures and written guidelines, even those including a brief summary of relevant points for clinicians, are known to be ineffective in promoting change in practice. This has stimulated the development of less passive educational interventions. Audit and feedback can change professional practice, though the magnitude of the effect varies and questions remain about the most appropriate supporting interventions. Recent policy-driven changes in continuing medical education have led to a shift towards formal needs assessment and multi-professional practice-based learning in primary care through professional development plans. Founded on these concepts, a Professional Development Programme was developed by the General Practice Airways Group (a UK primary care interest group). The programme provides a framework, incorporating audit, feedback and an interactive workshop, for practices wishing to improve their management of acute asthma. An early pilot study using the programme in three selected, asthma-interested practices suggested that it may have the potential to change practice behaviour positively.
Our randomised controlled trial aimed to establish the effectiveness of the Acute Asthma Professional Development Programme to improve management (specifically the recording of objective assessment of severity) in general practices recruited from one region of Scotland.
The trial was undertaken during 2002 with approval from Grampian Research Ethics Committee.
Recruitment of practices
We invited, by letter, all 59 general practices in the Aberdeen, and Banff and Buchan regions of Grampian, Scotland to participate in the study. All non-responding practices were telephoned and a personal invitation issued. Participating practices gave their fully informed consent to taking part in all aspects of the development programme.
Using random number tables, consenting practices were centrally randomised to early or delayed intervention groups by a researcher not involved in the trial. It was not possible to blind the practices as the intervention was an active process. The researchers were aware of allocation, though the audit data were submitted to a different centre (Tayside Centre for General Practice, Dundee) to that of the research team organising the educational intervention in Aberdeen.
Each enrolled practice undertook audits of their management of acute asthma at baseline, 6-months and 12-months using previously piloted methodology. We provided full instructions on how to undertake the audits and a study helpline was available for support. The educational intervention included feedback of audit data and attendance at a workshop during which a practice development plan was formulated by participants. The intervention was provided immediately after the baseline audit to the early intervention group, and immediately after the 6-month audit to the delayed group.
The Acute Asthma Professional Development Programme
The baseline audit
The educational intervention
• Multi-disciplinary interactive workshop: A 3-hour workshop held at the University of Aberdeen, facilitated by two of the researchers (a general practitioner (HP) and a respiratory nurse (GH)), was attended by representatives of the participating practice teams (normally a GP and practice nurse; the practice manager or senior receptionist was also invited). Amalgamated audit results from the group were discussed and used to tailor discussion towards specific aspects of acute asthma care identified as falling short of guideline standards. Case studies were used to facilitate discussion of practical aspects of acute asthma management, highlighting key deficiencies reported in the published literature. Recording of objective assessment of attacks was emphasised.
• A list of suggested references (selected for their relevance to the primary care management of acute asthma), resources (e.g. guideline summary charts, websites of professional organisations, training organisations and equipment manufacturers) and practical materials (e.g. examples of asthma action plans) was provided to all participants and discussed in the workshops.
The audits completed at 6-months and 12-months were undertaken using the same methodology as at baseline and were fed back to all practices. The 6-month audit provided a review of progress for the early intervention group, and baseline assessment for the delayed intervention group. The time scale of 6 months had proved to be feasible in our pilot study.
Our primary outcome measure was the proportion of acute episodes with a recording of peak flow compared to the patient's best (or predicted, if best was not known) at the 6-month audit. This measurement is recommended by current asthma guidelines as a basis for the classification of severity of acute attacks and for determining appropriate emergency treatment. Our pilot study demonstrated an improvement in the proportion of patients with a peak flow compared to best/predicted from the baseline prevalence of 21% to 61% at 6 months.
Other outcome measures from the critical event analysis, reflecting the recommendations of the guideline in force at the beginning of the trial, were considered in three domains: assessment (recording of peak flow, peak flow compared to best/predicted, respiratory rate, heart rate, ability to speak), management (provision of oxygen, bronchodilation, systemic steroids actually administered, steroids prescribed, inhaled steroids, referral to hospital) and follow-up (provision of advice, follow-up consultation, self-management education). Combined scores for each domain were calculated by summing the questions answered 'yes' to give a score out of 5, 6 or 3 for assessment, management and follow-up respectively.
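As a minimal sketch of the domain scoring described above, a combined score is simply a count of the 'yes' answers for the items in that domain. The item names below paraphrase the audit questions and are hypothetical; the actual audit proforma may have used different labels.

```python
# Illustrative sketch of the combined domain scoring described above.
# Item names paraphrase the audit questions; the real proforma may differ.

ASSESSMENT_ITEMS = ["peak_flow", "peak_flow_vs_best_or_predicted",
                    "respiratory_rate", "heart_rate", "ability_to_speak"]        # score out of 5
MANAGEMENT_ITEMS = ["oxygen", "bronchodilation", "systemic_steroids_given",
                    "steroids_prescribed", "inhaled_steroids", "hospital_referral"]  # score out of 6
FOLLOW_UP_ITEMS = ["advice_given", "follow_up_consultation",
                   "self_management_education"]                                  # score out of 3

def combined_score(episode: dict, items: list) -> int:
    """Count the domain items recorded as 'yes' for one acute episode."""
    return sum(1 for item in items if episode.get(item) == "yes")

episode = {"peak_flow": "yes", "peak_flow_vs_best_or_predicted": "no",
           "respiratory_rate": "yes", "heart_rate": "yes", "ability_to_speak": "no"}
print(combined_score(episode, ASSESSMENT_ITEMS))  # 3 of the 5 assessment items recorded
```

Practice-level means of these per-episode scores give the figures reported in the results.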
Sample size calculation and statistical analysis
In order to take account of the clustering by practice, the sample size calculation needed to include an estimate of the intracluster correlation coefficient (ICC). Our pilot work suggested that we could expect three acute episodes per full-time general practitioner during each three-month period, and we estimated an average of 5 general practitioners per practice. Using a conservative estimate for the ICC of 0.05, detecting an increase of 30 percentage points in the proportion with a peak flow compared to best/predicted, from a baseline prevalence of 21%, with 80% power and a 5% two-sided significance level would require 5 practices per arm (i.e. an estimated 25 general practitioners and 75 clinical episodes in each group). To allow for withdrawals we decided to recruit at least 20 practices in total.
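The calculation above can be reproduced by inflating a standard two-proportion sample size by the design effect for cluster randomisation, 1 + (m − 1) × ICC, where m is the expected cluster size (here 3 episodes/GP × 5 GPs = 15 episodes per practice). This is a sketch under those assumptions, not necessarily the exact formula the trial statisticians used:

```python
from math import ceil
from statistics import NormalDist

# Assumptions taken from the text: 21% baseline prevalence, a 30 percentage
# point absolute improvement, 80% power, 5% two-sided alpha, ICC 0.05, and
# 3 episodes/GP x 5 GPs = 15 episodes per practice per audit period.
p1, p2 = 0.21, 0.51
alpha, power, icc = 0.05, 0.80, 0.05
cluster_size = 3 * 5

z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96
z_b = NormalDist().inv_cdf(power)           # ~0.84

# Unadjusted episodes per arm for comparing two independent proportions
n_individual = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p2 - p1) ** 2

# Inflate by the design effect, then convert episodes to whole practices
design_effect = 1 + (cluster_size - 1) * icc          # 1.7
n_clustered = n_individual * design_effect            # ~62 episodes per arm
practices_per_arm = ceil(n_clustered / cluster_size)  # 5 practices per arm

print(practices_per_arm, practices_per_arm * cluster_size)  # 5 practices, 75 episodes
```

With these inputs the sketch reproduces the 5 practices (75 episodes) per arm quoted in the text.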
We analysed associations between categorical data using the chi-square test. Differences in normally distributed combined scores between early and delayed intervention groups were examined with independent sample t-tests.
Because randomisation was at practice level, the statistical analysis had to adjust for the effect of clustering. Multilevel modelling was inappropriate since the outcomes were practice-level data across time and not multiple measurements across time on the same individuals. The primary analysis was therefore based on practice summary measures. Analysis of covariance was used to examine differences in outcomes between groups at 6 months and 12 months after adjustment for baseline differences and practice effects. Practices that recorded no exacerbations during an audit cycle were excluded from the analysis at that time point.
Table 1. Baseline practice demography. Values are median (interquartile range).

                                          Early (n = 11)    Delayed (n = 12)
Number of partners                        5 (3.0, 7.0)      5 (4.0, 8.8)
Number of practice nurses                 3 (2.0, 4.0)      3 (2.0, 4.0)
Number of nurses with asthma training     1 (1.0, 1.0)      1 (0.3, 2.0)
Critical event analysis
Table 2. Comparison of audit data at baseline, 6 months and 12 months in the 'early' vs. 'delayed' groups. Rows: number of episodes submitted; mean age (SE); gender, male % (n); peak flow compared with best, unadjusted data % (n) and adjusted data %* (*adjusted for baseline values and practice).
Early vs. delayed intervention groups
Primary outcome measure: proportion with peak flow compared to best/predicted
The proportion of episodes in which the presenting peak flow had been compared to best (or predicted) was comparable in the early and delayed intervention groups at baseline (see Table 2). There was a trend to continuing improvement in the recording of peak flow compared to best/predicted in the early group practices throughout the year. Unadjusted data showed no significant difference at 6 months (p = 0.07), but a statistically significant improvement was observed at 12 months (p < 0.001). Although this trend remained, the differences did not reach significance at either time point after adjustment for baseline and practice effects.
Table 3. Comparison of combined assessment, management and follow-up scores at baseline, 6 months and 12 months in the early vs. delayed groups. Values are mean (SE); data adjusted for baseline values and practice. Rows: combined assessment score; combined management score; combined follow-up score.
Formulation of practice development plans
Twelve practices (6 'early', 6 'delayed') returned their completed practice development plans. Aspects of care they hoped to improve included the assessment process (5 'early', 1 'delayed'), follow up provision (3 'early', 3 'delayed'), the use of self-management plans (4 'early' 3 'delayed') and the use of oxygen (1 'early' 3 'delayed').
Our practice development programme, incorporating audit, feedback and a workshop, did not result in a significant improvement in our primary outcome measure (the proportion with peak flow compared to best/predicted) at either the 6- or 12-month time point after adjustment for baseline and practice effects. However, one of our secondary outcome measures (the recording of the objective assessment of attacks) showed a trend to improvement which was apparent at 6 months and reached statistical significance at 12 months, suggesting that the hypothesised 6-month period was too short to enable practices fully to implement change. The marked trend to improvement in follow-up arrangements lost its statistical significance after adjustment for practice effects and baseline differences. The latter, however, may be an over-adjustment, as collecting and responding to the baseline audit data was part of the educational intervention: poor baseline performance should itself stimulate a practice to develop that aspect of its care. The true effect of this complex intervention is therefore likely to fall somewhere between our adjusted and unadjusted results.
Limitations of the study
The withdrawal of five practices from the early intervention group before submitting their baseline data was unfortunate. The remaining practices are likely to have been the most motivated, making it easier to effect change in this group. By contrast, all but one of the delayed intervention group practices completed the programme, though five did not attend the educational intervention. If less motivated practices in the 'delayed' group had chosen to 'stay the course' rather than withdraw, this might have diluted any effect in this group. However, because of the considerable commitment required, we believe it is unlikely that reluctant practices would have continued to provide data.
We rejected the option of collecting baseline data before randomisation because the audit was an integral part of the educational intervention. Willingness to participate in all aspects of the programme (including the audit, workshop and development plan) was one of the factors potentially influencing the effectiveness of this 'complex intervention'. Randomising only those practices that submitted baseline data would have eliminated one of these important factors.
Practices submitted fewer than the projected three episodes per GP in each three-month audit period. This, combined with the withdrawal of five practices, meant that we did not achieve the required number of 75 acute episodes at baseline and 6 months in the early intervention group, so we were slightly underpowered to detect a statistically significant change in the primary outcome measure. Although this raises the possibility that some acute episodes had been overlooked, the rate of identification of acute attacks per GP was similar in both groups and consistent over the three audit time points. In addition, the demography of the patients suffering acute episodes was similar to that described in previous regional and national audits (allowing for the slightly different recruitment strategies). [6–8] The reduction in the number of acute episodes may reflect the reported decline in asthma exacerbations in the UK. The audits were carried out internally by practice staff, which introduced the possibility of inaccurate, incomplete or biased data. Despite this concern, we considered it important that practices undertook their own data collection as part of the learning process. Financial constraints prevented quality checks of the accuracy of submitted data.
Of the 59 practices approached, just over a third agreed to participate and just over a quarter completed the programme, which may be interpreted as limiting generalisability. However, the recruited practices reflected the range of demography typical of the area. In addition, participation in a professional development plan is an active process which demands considerable application, with significant workload implications for members of the team. The reasons given for non-participation and withdrawal suggest that the time and effort required were not possible for many practices during the study period. Practice development needs vary over time, and it is likely that under other circumstances other practices would have wished to participate.
Main strengths of study
Our trial included a broad range of urban and rural practices, though to facilitate local workshops we limited recruitment to one Scottish region. The demography of the practices in the two study groups was similar. The audit tools were developed from a published survey, and the proforma had been used successfully in an earlier pilot project. The data submitted from the baseline audit were broadly similar to those observed in other published surveys, [6–8, 20] increasing confidence in the reliability of our outcomes.
Interpretation of findings in relation to previously published work
In common with the majority of studies of educational interventions, [24, 25] our trial evaluates 'behaviour' – the third tier of Kirkpatrick's hierarchy of levels of evaluation. Although this is one step removed from the ultimate goal (improved patient outcomes) our selected process outcomes are recommended by evidence based guidelines. Importantly, poor performance in objectively assessing severity has been consistently linked with poor patient outcomes in a number of confidential enquiries. [13–16] In addition, keeping adequate records which allow retrospective assessment of severity is good clinical practice and a medico-legal requirement. [27, 28] The development plans prepared by individual practices confirmed that objective assessment was a priority for five of the early intervention group practices. Referral data, a potential outcome measure which we included within the management domain, is difficult to interpret as rates could be influenced by management strategies or different severity of presenting attacks.
Using the structure of Professional Development Plans, our intervention not only involved the recognised formula of 'audit and feedback', but also sought formally to engage the practice team in identifying the challenges and obstacles relevant to their practice, and in planning how change might be effected. This is a recognised educational strategy, [25, 29] but 'individualising' the practice development plan may have diluted the effect on specific outcomes, as some practices may have decided that certain aspects were not relevant to them. Whilst incorporating many of the key elements of the stepwise cycle of change described by Grol, we may have underestimated the importance of the 6-month audit. Audit is associated with improved asthma care, and any improvements at 6 months (albeit not significant) may have encouraged the 'early' practices to develop their care further in time for the 12-month audit. By contrast, the delayed intervention group received no feedback until 6 months and had little time to effect change.
There is no consensus on the ideal duration of educational intervention studies, though six months is the median duration of the trials reviewed by Wensing. The ideal duration is a balance between the time needed to effect change and the risk of deterioration if the timescale is too prolonged. Our trial was designed to demonstrate a difference between the two groups at 6 months, when the early intervention group had received the education component of the programme and the 'delayed' practices were acting as controls. However, after adjustment for baseline differences and practice effects, statistically significant change only occurred at 12 months. Future randomised trials of educational interventions need to recognise that effecting change in practices takes time. Although the timeframe for change was implicit within the structure of the trial, the addition of an explicit timeline within the Practice Development Plan might have facilitated more rapid change.
In order to achieve improvement, the practices had to commit to the project for a year, participating in three audit cycles and a workshop. We offered no financial incentives to cover the cost of the work involved, suggesting that the process of formulating a practice-specific acute asthma development plan in response to their baseline performance had sufficiently engaged practices to encourage an ongoing focus on the aspects of care they wished to improve.
Other factors which may have influenced outcomes include the publication of national guidelines in February 2002, though this is unlikely to have significantly affected our results as the final audit was already underway at the time of the launch. The primary care management of acute asthma was not one of the key messages promoted by the publicity following the launch. The lack of substantial change in the delayed intervention group confirms that this was unlikely to be a significant factor.
Interpretation of the management scores is difficult without objective independent assessment of asthma severity. It is possible that the more aggressive management of attacks at baseline reflected more severe attacks at this audit time point rather than inappropriate management.
Although our randomised trial of an acute asthma professional development programme for general practices did not demonstrate significant improvements at the a priori 6-month assessment point, there was an improvement compared to baseline in the objective assessment of severity 12 months into the trial. Monitoring of the assessment of acute attacks proved to be a feasible and responsive indicator of quality care. Our findings, especially the timescale needed to effect change, have implications for the design of future trials.
The trial was funded by Grampian Health Board. We thank Professor Aziz Sheikh and Dr Philip Cotton who commented on the early drafts of the paper, and are grateful to Sofia Fonseca who assisted with the statistical analysis. We acknowledge the hard work of the practices of Grampian who participated in this trial.
1. British Thoracic Society and Scottish Intercollegiate Guidelines Network: British guideline on the management of asthma. Thorax. 2003, 58 (Suppl I): 1-94. November 2005 update available from http://www.brit-thoracic.org.uk and http://www.sign.ac.uk (accessed June 2006).
2. Global Strategy for Asthma Management and Prevention: GINA Workshop Report. Updated October 2005. http://www.ginasthma.com (accessed June 2006).
3. Jones K: Asthma care in general practice – time for a revolution? Br J Gen Pract. 1991, 41: 224-226.
4. Dennis SM, Edwards S, Partridge MR, Pinnock HJ, Qureshi S: The dissemination of the British Guideline on the Management of Asthma 2003. Respir Med. 2004, 98: 832-837. doi:10.1016/j.rmed.2004.02.018.
5. Partridge MR, Harrison BDW, Rudolph M, Bellamy D, Silverman M: The British Asthma Guidelines – their production, dissemination and implementation. Respir Med. 1998, 92: 1046-1052. doi:10.1016/S0954-6111(98)90353-5.
6. Neville RG, Clark RC, Hoskins G, Smith B, for the General Practitioners in Asthma Group: National asthma attack audit 1991–2. BMJ. 1993, 306: 559-562.
7. Neville RG, Hoskins G, Smith B, Clark RA: How general practitioners manage acute asthma attacks. Thorax. 1997, 52: 153-156.
8. Pinnock H, Johnson A, Young P, Martin N: Are doctors still failing to assess and treat asthma attacks? An audit of the management of acute attacks in a health district. Respir Med. 1999, 93: 397-401. doi:10.1053/rmed.1999.0575.
9. Pearson MG, Ryeland I, Harrison BDW, on behalf of the BTS: Comparison of the process of care of acute severe asthma in adults admitted to hospital before and 1 yr after the publication of the guidelines. Respir Med. 1996, 90: 539-545. doi:10.1016/S0954-6111(96)90146-8.
10. Fleming DM, Sunderland R, Cross KW, Ross AM: Declining incidence of episodes of asthma: a study of trends in new episodes presenting to general practitioners in the period 1989–98. Thorax. 2000, 55: 657-661. doi:10.1136/thorax.55.8.657.
11. Lung and Asthma Information Agency: Trends in hospital admissions for asthma. LAIA Factsheet 2. 1996.
12. British Thoracic Society: The Burden of Lung Disease. 2001. http://www.brit-thoracic.org.uk/article46.html
13. British Thoracic Association: Deaths from asthma in two regions of England. BMJ. 1982, 285: 1251-1255.
14. Mohan G, Harrison BDW, Badminton RM, Mildenhall S, Wareham NJ: A confidential enquiry into deaths caused by asthma in an English health region: implications for general practice. Br J Gen Pract. 1996, 46: 529-532.
15. Wareham NJ, Harrison BDW, Jenkins PF, Nichols J, Stableforth DE: A district confidential enquiry into deaths due to asthma. Thorax. 1993, 48: 1117-1120.
16. Bucknall CE, Slack R, Godley CC, MacKay TW, Wright SC, on behalf of SCAID: Scottish Confidential Inquiry into Asthma Deaths (SCAID) 1994–6. Thorax. 1999, 54: 978-984.
17. Baker R, Fraser RC, Stone M, Lambert P, Stevenson K, Sheils C: Randomised controlled trial of the impact of guidelines, prioritised review criteria and feedback on implementation of recommendations for angina and asthma. Br J Gen Pract. 2003, 53: 284-291.
18. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD: Audit and feedback: effects on professional practice and health care outcomes. The Cochrane Database of Systematic Reviews. 2006, Issue 2. Art. No.: CD000259. doi:10.1002/14651858.CD000259.pub2.
19. Calman KC: A Review of Continuing Professional Development in General Practice. Department of Health. 1998.
20. Pinnock H, Hoskins G, Smith B, Weller T, Price D: A pilot study to assess the feasibility and acceptability of undertaking acute asthma professional development in three different UK primary care settings. Prim Care Respir J. 2003, 12: 7-11. http://www.thepcrj.org/journ/vol12_1/007_011pinnock.pdf
21. British Thoracic Society: The British Guidelines on Asthma Management: 1995 Review and Position Statement. Thorax. 1997, 52 (Suppl 1): S1-20.
22. Campbell M, Fitzpatrick R, Haines A, Kinmouth AL, Sandercock P, Spiegelhalter D, Tyrer P: Framework for design and evaluation of complex interventions to improve health. BMJ. 2000, 321: 694-696. doi:10.1136/bmj.321.7262.694.
23. Fleming DM, Sunderland R, Cross KW, Ross AM: Declining incidence of episodes of asthma: a study of trends in new episodes presenting to general practitioners in the period 1989–98. Thorax. 2000, 55: 657-661. doi:10.1136/thorax.55.8.657.
24. Davis DA, Thomson MA, Oxman AD, Haynes RB: Evidence for the effectiveness of CME. A review of fifty randomised controlled trials. JAMA. 1992, 268: 1111-1117. doi:10.1001/jama.268.9.1111.
25. Davis DA, Thomson MA, Oxman AD, Haynes RB: Changing physician performance: a systematic review of continuing medical education strategies. JAMA. 1995, 274: 700-705. doi:10.1001/jama.274.9.700.
26. Kirkpatrick DI: Evaluation of training. In: Training and Development Handbook. Edited by: Craig R, Bittel I. 1967, New York: McGraw-Hill.
27. General Medical Council: Good Medical Practice. GMC. 2001. http://www.gmc-uk.org/guidance/good_medical_practice/index.asp
28. Medical Defence Union: Can I see the records? The MDU's guide to clinical notes, disclosure and patient access. MDU. 2000. http://www.the-mdu.com/associatedArticles/canisee.pdf
29. Kaufman DM: ABC of learning and teaching in medicine: Applying educational theory in practice. BMJ. 2003, 326: 213-216. doi:10.1136/bmj.326.7382.213.
30. Grol R: Beliefs and evidence in changing clinical practice. BMJ. 1997, 312: 949-952.
31. Neville RG, Hoskins G, McCowan C, Smith B: Pragmatic 'real world' study of the effect of audit of asthma on clinical outcome. Prim Care Respir J. 2004, 13: 198-204. doi:10.1016/j.pcrj.2004.06.007.
32. Wensing M, van der Weijden T, Grol R: Implementing guidelines and innovations in general practice: which interventions are effective? Br J Gen Pract. 1998, 48: 991-997.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2296/8/23/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.