LETTER TO EDITOR
Year : 2017  |  Volume : 3  |  Issue : 2  |  Page : 331-333

Systematically designing a questionnaire to assess the knowledge levels of postgraduate students and faculty about competency-based medical education


1 Department of Community Medicine, Shri Sathya Sai Medical College and Research Institute (Member, Medical Education Unit and Medical Research Unit), Kancheepuram, Tamil Nadu, India
2 Department of Community Medicine, Shri Sathya Sai Medical College and Research Institute, Kancheepuram, Tamil Nadu, India

Date of Web Publication: 9-Jan-2018

Correspondence Address:
Dr. Saurabh Ram Bihari Lal Shrivastava
3rd Floor, Department of Community Medicine, Shri Sathya Sai Medical College and Research Institute, Ammapettai Village, Thiruporur - Guduvancherry Main Road, Sembakkam Post, Kancheepuram - 603 108, Tamil Nadu
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/IJAM.IJAM_78_17


How to cite this article:
Shrivastava SR, Shrivastava PS, Ramasamy J. Systematically designing a questionnaire to assess the knowledge levels of postgraduate students and faculty about competency-based medical education. Int J Acad Med 2017;3:331-3




To the Editor,

Survey scales and questionnaires are frequently used in medical education research.[1] Indeed, analyses of reputed medical education journals indicate that a substantial proportion of published research articles rely on a questionnaire as their data collection tool.[1] It is therefore alarming that most of these surveys lack evidence of validity and reliability, which means their results cannot be generalized.[2] This calls for designing questionnaires through an organized approach, so that every study participant understands each item of the questionnaire in the same way.[1],[2]

A questionnaire can be designed to assess the knowledge levels of postgraduate students and faculty of the department of community medicine about competency-based medical education (CBME) and entrustable professional activities (EPAs).[1],[2] To ensure that the designed questionnaire is reliable and valid, however, the following seven steps should be adhered to:[1],[3]

  1. Perform a review of the literature: This is the foremost step, undertaken to precisely define the construct and to determine whether any similar tool has already been developed

    The construct in the present case is the knowledge of postgraduate students and faculty of the department of community medicine about CBME and EPAs. After defining the construct, a thorough review of the literature should be performed to look for any similar tool used earlier by other researchers. If such a tool is identified, it must be validated for the local context and study population, in addition to obtaining the authors' approval to use it.[1],[3]
  2. Organize interviews or focus groups: If the review of literature does not reveal a similar tool, the next step is to gain insight into what the study participants think and understand about the construct defined in the previous step. This can be done by collecting data, in their own words, from individuals similar to the target population (postgraduate students and faculty from other departments), through either interviews or focus group discussions. It is important to let these respondents speak freely at first, without interrupting their train of thought. Subsequently, the researcher can use focused questions to ascertain whether their understanding of the construct matches the one derived from the literature review [1],[3]
  3. Synthesize the review of literature and the interviews/focus groups: The findings from steps 1 and 2 are amalgamated, and a comprehensive list of indicators for the construct can be formulated at this stage. If any discrepancy emerges between the findings of steps 1 and 2, it is preferable to define the construct using the words of the postgraduate students and faculty from departments other than community medicine, rather than the phrasing found in the literature [1],[3]
  4. Develop items: The objective of this step is to prepare a comprehensive list of items that adequately covers the predefined construct (viz., the knowledge of postgraduate students and faculty of the department of community medicine about CBME and EPAs). While preparing this item list, it is of utmost importance that the language be one the study participants (postgraduate students and faculty of community medicine) can easily understand. Furthermore, it is wise to adhere to the basic rules pertaining to:
    1. Number of items: Start with a larger pool of items (viz., What is CBME? How does it differ from conventional medical education? What are the advantages of CBME? What are EPAs? Why are EPAs needed? etc.) and then gradually refine it to somewhere between 6 and 10 items. A concise list of items improves participation and response rates among potential respondents
    2. Language: The language used should be clear, unambiguous, and matched to the comprehension of the study participants
    3. General rules: It is good practice to write questions rather than statements, to avoid negatively framed items, and to use appropriate anchor stems so that responses can be quantified during the analysis stage of the questionnaire.[2],[3]
  5. Conduct expert validation: This essential step involves obtaining the opinions of content experts on whether each developed item relates to the construct decided earlier and whether any key item has been missed. The success of this step depends most on selecting a group of experts (6-10) based on their experience, knowledge, willingness, and availability. These experts can be drawn from other institutes that have implemented CBME, or they can be educational researchers with expertise in CBME. Their opinions can be collected on a content validation form, and their responses can be categorized for analysis as:
    1. Quantitative: The experts rate the representativeness, clarity, relevance, and distribution of each formulated item, and these ratings can be analyzed using the content validity ratio, the content validity index, or the factorial validity index (an illustrative computational sketch is given after this list)
    2. Qualitative: The content validation form provides extra space for free-text comments, allowing the experts to record any other valuable input.

    By combining both kinds of feedback, meaningful additions to or deletions from the list of items can be made.[1],[3]
  6. Perform cognitive interviews: This step assesses how respondents interpret the items and response anchors that were initially drafted by the investigator and then cross-checked by the content experts. Cognitive interviews can be conducted using either the think-aloud approach or verbal probing (concurrent, retrospective, or a combination of both), with a small number of postgraduate students and faculty from departments other than community medicine. The resulting qualitative data can be analyzed through coding, interaction analysis, or a mixed method. The aim of this step is to ensure that respondents understand each item in the same way, which further improves the quality of the survey tool [1],[4]
  7. Conduct pilot testing: Despite all of the above steps, some survey items may still have problems; to deal with them effectively, the questionnaire is pilot tested in a group identical to the target population of postgraduate students and faculty. The objectives are to:
    1. Examine the internal structure of the survey scale and appraise the degree to which the formulated items measure the construct (the knowledge of postgraduate students and faculty of the department of community medicine about CBME and EPAs), which is done with the help of factor analysis (see the illustrative sketches after this list)
    2. Assess the reliability of the scale by computing Cronbach's alpha, to draw an inference about the internal consistency of the item scores (also sketched below)
    3. Review descriptive statistics and plot histograms to examine the distribution of responses for each individual item
    4. Finally, compute a composite score for the scale and check whether its associations are consistent with previous research.[1],[2],[3],[5]
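To make the quantitative analysis of step 5 concrete, the sketch below shows one common way to quantify expert agreement. It is purely illustrative and not part of the original letter: the panel size, item names, and ratings are hypothetical, and Lawshe's content validity ratio and the item-level content validity index are only two of the indices mentioned above. Python is used for all the examples.

```python
# Illustrative sketch (hypothetical data): computing Lawshe's content
# validity ratio (CVR) and the item-level content validity index (I-CVI)
# from expert ratings of draft questionnaire items.

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    experts rating the item 'essential' and N is the panel size."""
    half = n_experts / 2
    return (n_essential - half) / half

def item_cvi(n_relevant: int, n_experts: int) -> float:
    """I-CVI = proportion of experts rating the item relevant
    (e.g., 3 or 4 on a 4-point relevance scale)."""
    return n_relevant / n_experts

# Hypothetical panel of 8 experts rating 5 draft items on CBME/EPAs.
N_EXPERTS = 8
ratings_essential = {"item1": 7, "item2": 8, "item3": 4, "item4": 6, "item5": 8}

for item, n_e in ratings_essential.items():
    # Items with a low CVR (below the critical value for the panel size)
    # are candidates for revision or removal.
    print(f"{item}: CVR = {content_validity_ratio(n_e, N_EXPERTS):+.2f}, "
          f"I-CVI = {item_cvi(n_e, N_EXPERTS):.2f}")
```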

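Similarly, the internal-consistency check of step 7 can be sketched as follows. Cronbach's alpha is computed from a respondents-by-items score matrix; the pilot data here are randomly generated placeholders, so the resulting alpha will be low, whereas real pilot data from a coherent scale would typically be expected to reach roughly 0.7 or above.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 10 respondents answering 6 knowledge items
# scored 1-5 (e.g., Likert-type responses about CBME and EPAs).
rng = np.random.default_rng(0)
pilot = rng.integers(1, 6, size=(10, 6)).astype(float)
print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")
```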

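Finally, a minimal sketch of the remaining pilot-testing analyses of step 7: per-item descriptive statistics and response distributions, plus an exploratory look at the internal structure of the scale. The responses are again hypothetical, and scikit-learn's FactorAnalysis stands in for whichever factor-analysis routine a researcher actually prefers.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical pilot responses: 50 respondents x 6 items scored 1-5.
rng = np.random.default_rng(1)
responses = pd.DataFrame(
    rng.integers(1, 6, size=(50, 6)),
    columns=[f"item{i}" for i in range(1, 7)],
)

# Descriptive statistics per item (step 7.3): means and spread reveal
# floor/ceiling effects; value counts act as a text-mode histogram.
print(responses.describe())
print(responses["item1"].value_counts().sort_index())

# Exploratory factor analysis (step 7.1): inspect whether the items
# load on a single underlying construct (knowledge of CBME/EPAs).
fa = FactorAnalysis(n_components=1, random_state=0)
fa.fit(responses)
loadings = pd.Series(fa.components_[0], index=responses.columns)
print(loadings.round(2))
```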
To conclude, considering that the majority of educational research employs questionnaires, there is an immense need to adhere to the above steps to strengthen the validity and reliability of the survey tool.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Artino AR Jr., La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach 2014;36:463-74.

2. Artino AR Jr. Good decisions cannot be made from bad surveys. Mil Med 2017;182:1464-5.

3. Magee C, Rickards G, Byars LA, Artino AR Jr. Tracing the steps of survey design: A graduate medical education research example. J Grad Med Educ 2013;5:1-5.

4. Willis GB, Artino AR Jr. What do our respondents think we're asking? Using cognitive interviewing to improve medical education surveys. J Grad Med Educ 2013;5:353-6.

5. Ayob A, Awadh AI, Hadi H, Jaffri J, Jamshed S, Ahmad HM, et al. Malaysian consumers' awareness, perception, and attitude toward cosmetic products: Questionnaire development and pilot testing. J Pharm Bioallied Sci 2016;8:203-9.