In this week’s episode (don’t miss the new theme song!!), the group looks at an article on knowledge flow – the authors use a bibliometric approach to answer a question: is #meded insular, or is it sampling widely from other fields like psychology, sociology, or economics?
KeyLIME Session 337
Albert et al. Barriers to cross-disciplinary knowledge flow: The case of medical education research. Perspect Med Educ. 2021 Oct 14. Online ahead of print.
Jason R. Frank (@drjfrank)
“Where do you get your news from?” may be a very 21st century question. In this era of dueling worldviews and alternative facts, our sources of information are a very modern angst.
Applied to #meded: in your practice as an educator, where do you learn? What journals do you read? Who do you talk to? Which conferences influence you? What media? In #meded, that is likely to be very local.
So is #meded insular, or is it sampling widely from other fields like psychology, sociology, or economics? Enter Albert et al., who chose a bibliometric approach to an answer.
This paper, Barriers to cross-disciplinary knowledge flow: the case of medical education research, by Albert, Rowland, Friesen, and Laberge, appeared in Perspectives on Medical Education in December 2020. The authors set out to compare meded and higher ed as two similar fields and to measure their degree of “knowledge exchange”. This was operationalized as:
Do medical education researchers draw on knowledge developed in education, higher education, and other education-related disciplines (e.g., sociology, psychology, political sciences, economics)?
Key Points on the Methods
Higher education was chosen as a suitable comparative field as they are both domains of education, both postsecondary, and share a confluence of social sciences & education epistemic cultures.
As a theoretical framework, the authors were informed by Bourdieu’s concepts of doxa and field:
- Doxa refers to a “set of fundamental beliefs which does not even need to be asserted in the form of an explicit, self-conscious dogma”, or the “cultural orthodoxy of a field”.
- Field is a “space in which social actors struggle for scientific authority” to influence what is legitimate practice for a group of scholars.
The authors chose bibliometric citation analysis for their comparative methodology. Briefly, they:
- Chose the year 2017 to sample meded and higher ed journals;
- Selected the top 5 core meded and higher ed journals by impact factor, as defined by Web of Science;
- Limited their inclusion to research papers only (not well justified in the text);
- Randomly chose 10% of these papers (the randomization method is not described);
- Selected the citations from these 10% and sorted them into journal sources or books/chapters;
- Coded these inductively into a typology of 8 “knowledge orientations”:
- Medical Education
- Applied health services or clinical research
- Interdisciplinary health
- Disciplinary research
- Education in general
- Topic-centred non-health
- Science education
- Higher education
They also looked at author affiliations as another measure of interdisciplinarity, and add a brief reflexivity statement noting that the authorship team has diverse backgrounds.
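The sampling-and-tallying steps above can be sketched in a few lines of Python. This is only an illustration: the records below are hypothetical, and the authors coded each citation by hand rather than programmatically, so the sketch shows the 10% sampling and the share-per-orientation arithmetic, not their actual coding.

```python
import random
from collections import Counter

# Hypothetical records: one dict per research paper from the 2017 journal
# sample, with each citation already coded into a "knowledge orientation".
papers = [
    {"field": "meded",
     "citations": ["Medical Education",
                   "Applied health services or clinical research",
                   "Disciplinary research"]},
    {"field": "higher_ed",
     "citations": ["Higher education",
                   "Education in general",
                   "Disciplinary research"]},
]

random.seed(1)
# Step 1: draw a 10% random sample of the research papers (at least one).
sample = random.sample(papers, max(1, len(papers) // 10))

# Step 2: tally every citation in the sample by knowledge orientation.
counts = Counter(c for p in sample for c in p["citations"])

# Step 3: report each orientation as a rounded percentage of all citations.
total = sum(counts.values())
shares = {k: round(100 * v / total) for k, v in counts.items()}
print(shares)
```

With real data, `shares` for the meded block would reproduce figures like the 40% meded / 40% clinical split reported below.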
The top journals of 2017, and the dataset from each journal block, are shown in the tables in the original post.
Main results were:
- 40% of meded citations were to meded sources, versus 36% of higher ed citations to higher ed sources
- Another 40% of meded citations were from clinical/health services research, so together ~80% of meded citations were “within medicine”
- By contrast, Higher Ed citations were more diverse across other categories of disciplines
- Within the Disciplinary Research category, Meded tapped Psychology 67% of the time, while Higher Ed had a much wider spread of inputs (e.g. Psychology 15%, Economics 25%)
- 90% of meded authors were appointed to Faculties of Medicine, and 60% held MDs
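The headline numbers above lend themselves to a quick back-of-envelope check (shares as reported in the paper):

```python
# Shares of meded citations by source, as reported above
meded_share = 0.40      # citations to meded itself
clinical_share = 0.40   # citations to clinical / health services research

within_medicine = meded_share + clinical_share   # ~80% "within medicine"
external = 1 - within_medicine                   # ~20% external citations
print(f"within medicine: {within_medicine:.0%}, external: {external:.0%}")
```

That ~20% external citation rate is the figure the authors contrast with the 50-60% often quoted for other fields.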
The authors' choices of such a limited dataset (the single year 2017, 5 journals selected by JIF, research papers only, a 10% sample) and their unique typology of sources may limit the validity of these findings.
The authors conclude that Meded is “inwardly focused and selective about the sources of knowledge they allow entry into academic space”. Only 20% of meded citations in the dataset were to non-medical sources; external citation rates in other fields are often quoted in the range of 50-60%.
The authors speculate on the sources of this Meded insularity, which they call an “epistemic gap”:
- Culture: The epistemic doxa or intellectual culture of meded is oriented to positivism, “evidence-based medicine”, hierarchies of evidence, and the all-powerful RCT. This shapes the scientific training of those involved in medicine and limits the scope of acceptable ways of academic thinking.
- Busy Docs: Physicians, in an applied, intensive occupation, may have less time to seek out and be exposed to other literatures and ways of knowing.
- Databases Suck: PubMed and other medical databases are oriented to epidemiology, and not social sciences.
- Promotions: Evaluation criteria for quality scholarship are heavily weighted toward a high number of publications in clinical journals. Other methods, media (e.g. books, blogs), and paradigms are discouraged.
- Tribes: Higher Ed may be better at valuing and including a breadth of fields and epistemic cultures.
Spare Keys – other take home points for clinician educators
- This is another in a series of papers we have covered that use bibliometric methods to answer meded questions.
- The methodological choices may limit the utility of this paper, but even proof of concept papers can get published, so others can build on them.
- A recently published systematic dataset of meded journals may be a better one to sample: the MEJ24 by Maggio et al. (Medical Education, 2021).
The views and opinions expressed in this post and podcast episode are those of the host(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our About Page