Format and Content of Manuscripts Reporting Evaluation/Demonstration Case Studies of Educational and Other Interventions


Author of Article: Dr. Gajanan Bhalerao (PT)

PhD Scholar, MPTh (Neuro), Certified Adult NDT Therapist; HOD, Physiotherapy & Rehabilitation Department, Sancheti Hospital, Shivajinagar, Pune; Associate Professor & HOD, Neuro Rehabilitation PT Department, Sancheti Institute College of Physiotherapy, Shivajinagar, Pune.

We all want to do good research and publish our papers in scientific journals, but very few of us know how to write a research article. Writing up a research paper is called manuscript writing. M.U.H.S. Nashik arranged a workshop on manuscript writing and invited Donald Pathman, MD MPH, a journal editor from the US, as the trainer. Fortunately, I got the opportunity to attend the workshop. I am sharing here what we were taught and the guidelines we were given for preparing a manuscript.

Format and Content of Manuscripts Reporting Evaluation/Demonstration Case Studies of Educational and Other Interventions

By Donald Pathman, MD MPH

Sections within the Manuscript, each paired with an example from a clinical quality improvement initiative

Introduction

Background to field: ACE inhibitors (ACEIs) decrease mortality in patients with heart failure.
Problem for field: Nationally, many eligible CHF patients are not placed on ACEIs.
Purpose of the intervention/initiative undertaken: There is a need to develop effective models for increasing the proportion of eligible CHF patients on ACEIs, and to demonstrate their effectiveness.
Purpose of this evaluation: To assess the effectiveness of an intervention that uses chart audit data and feedback to clinicians to increase ACEI use.

The Program/Intervention

Organizational setting: Three clinics affiliated with an academic center, each with a different patient population SES profile.
Issue and initiative's context (historical, cultural): ACEI use in CHF in these clinics was previously documented to be low; there had been no previous intervention on this issue, but the network's doctors are notoriously resistant to feedback on their care.
Rationale/Purpose/Goals of the initiative: To increase the proportion of eligible CHF patients on ACEIs; to increase physicians' acceptance of QA data intended to improve their care.
Theory/Rationale for the intervention design selected: Evidence shows that repeated reminders through a variety of sources are most effective in helping clinicians change clinical practices.
Programmatic components of the initiative: Educational lunch conferences; an oversight committee of clinic staff and clinicians; monthly chart audits; reminder/alert stickers placed on charts; monthly progress graphs; token incentives for "most improved".
Internal programmatic evaluation components (formative and summative): Monitoring improvement in ACEI use over time; quarterly provider satisfaction survey.
Program history: Program initiated in October 2008; chart stickers added in February 2009; initiative terminated in May 2011 when funding was lost.

Evaluation Methods (of the intervention)

Evaluation design: Pretest/posttest and time-series analysis (no comparison group); identify the evaluator and his/her connection to the program.
Outcome measures: Proportion of eligible CHF patients whose medication lists include an ACEI; proportion of providers indicating satisfaction with their autonomy, with clinic management, and with the quality of care they can provide; qualitative data on provider acceptance of the CHF/ACEI initiative.
Data collection methods: Augmented sample size of the chart audit data already routinely collected as part of the program; quarterly physician satisfaction survey items drawn from validated instruments; informal focus groups of physicians and staff.
Documentation of program fidelity: Retrospective assessment that the targeted number of charts were audited each month, that feedback reports to providers were generated, that chart stickers were used whenever appropriate, and that token incentives were given.
Ethical review and funding disclosure: Funded by Pfizer; approved by the UNC School of Medicine IRB.

Results (findings of the evaluation)

Program fidelity indicators: More than 90% of targeted charts were reviewed each month; only 60% of monthly provider feedback reports were generated; token incentives were stopped in the third month due to provider backlash.
Outcome data: A 30 percentage point increase in ACEI use (from 40% to 70%), found principally among non-physician providers; non-physician satisfaction rose on all indicators while physician autonomy indicators fell; focus group data revealed intense positive and negative reactions to the program.
Other and unexpected outcomes: The QA coordinator required supportive counseling; total program costs averaged $35,000 per year.

Discussion

Review of key findings: As above under "Outcome data"; identified barriers and facilitators to implementation.
National perspective/congruence with literature: Mirrors previous reports of the effectiveness of the chart audit and chart sticker approach to QI, and mirrors the problem of physician resistance to external scrutiny and "forced" change.
Inferences: The use of data and a longitudinal approach with continuous feedback were helpful; perceived encroachment on physician autonomy by a non-physician-initiated program fueled the backlash.
Limitations: Program terminated earlier than planned; not all desired satisfaction survey items could be used; reasons for physician resistance not fully identified.
Conclusions: This QA approach is effective in increasing ACEI use but can cause backlash among some physicians; it was more effective with non-physician providers.
