Review - Artificial intelligence in clinical practice: Implications for physiotherapy education

Article: Artificial intelligence in clinical practice: Implications for physiotherapy education
Article status: accepted
Author: Ken Masters
Review date: 16 July 2019
DOI: 10.14426/opj/20190716


The paper is rather speculative in places, but this is inherent in the nature of the topic, so it is not necessarily a weakness.  For the most part, the speculation is well-grounded in what is currently known and practised in both AI and medicine, and it reads easily and fluently.  While I’m not sure that the paper adds a great deal to our knowledge of the topic, it does provide a useful summary for those who are not entirely aware of AI in medicine and its implications.

Some issues to consider

  • Overall, I would like to see a few concrete examples of where AI is currently being used in physiotherapy and physiotherapy education. (The last bullet point in my comments should also be noted, because, if the authors wish to argue that nothing is being done in the field, then they are opening themselves for a storm of criticism).
  • “The field of AI research was first identified in the 1950s …” Depending on how intelligence in machines is defined and described, this date could be set a hundred years earlier.  I would not recommend that the authors use the expression “first identified” unless they are using it in a very particular way, and then they would need to define that way.  In any case, a reference is recommended, as “in the 1950s” is very vague, especially given that Alan Turing’s “Computing machinery and intelligence” was published in 1950.  If this is the authors’ reference, then it should be stated properly and cited.
  • “…it has not lived up to the enthusiastic predictions of its supporters.” Again, this is far too vague.  Many of the predictions did not have dates, and many of the predictions have occurred, so a blanket statement like this could easily be challenged, which would threaten the integrity of the paper.
  • “the shelf life of a human education gets shorter” “shelf-life” might be considered a rather colloquial expression for an academic journal, and, perhaps, the authors can consider a more appropriate term.
  • “Unfortunately, there is no evidence that any health professions programmes are even considering the incorporation of data science, deep learning, statistics, or behavioral science into the undergraduate curricula even though this is what is required to develop, evaluate, and apply algorithms in clinical practice” This is really far too broad.  The authors simply cannot make such a blanket statement.  At best, they can argue that they have not been able to find any evidence, or much evidence.  But for that statement to have validity, the authors will have to give details of how they searched for that evidence, because the lack of evidence may be a failure to find, rather than a failure of existence.  In fact, the statement is plainly wrong, as it says that there are no health professions undergraduate curricula that include statistics.  That is simply not the case.

So, a generally useful read, supplying some food for thought, but it can be strengthened, and some tidying up is required.


One Reply to “Peer review (Ken Masters) – Artificial intelligence in clinical practice: Implications for physiotherapy education”

  1. Dear Ken

    Thank you for taking the time to review my article. I have considered your comments and agree with much of what you have shared and adapted the article in accordance with your suggestions. However, there are also a few places where I do not agree, and have provided a rationale, which I describe below.

    “While I’m not sure that the paper adds a great deal to our knowledge of the topic, it does provide a useful summary for those that are not entirely aware of AI in medicine and its implications.” I should emphasise that, while the article does indeed provide a broad overview of AI, and then narrows it to summarise the use of AI in clinical practice, this is not the main purpose of the paper. My intention is to explore some broad principles for how physiotherapy education might consider changing in order to graduate health professionals who can work within an AI-enabled system. And these broad principles for how higher and professional education might change do provide something more than a mere summary of AI in clinical practice.

    “Overall, I would like to see a few concrete examples of where AI is currently being used in physiotherapy and physiotherapy education.” I have opted not to provide more examples of how AI-based systems are being used in clinical practice. The current examples that are included in the text (i.e. image recognition, clinical decision support, expert systems, wearables/ingestibles, etc.) are used to provide an overview of the kinds of *broad* changes to be expected in practice. At such an early stage of the integration of these systems into healthcare, it seems likely that very specific examples in the article would quickly become obsolete, as the technology and healthcare teams adapt in response to a rapidly changing context. And again, the purpose of the article is not to provide an overview of how AI is being used in physiotherapy practice. Rather, the article explores how physiotherapy education might adapt (without the integration of AI into higher education) in order to prepare graduates for the AI-enabled health system in which they will need to practise. I believe that the examples of AI in clinical practice that are provided in the previous section of the article are sufficient for this purpose.

    Your comments that several claims in the article are vague and unsupported were useful, and I have tried to provide more detail and clarity around the ones that you highlighted. For example, I have updated the sentence related to the beginning of the research discipline of AI to be more precise and have also included a citation in support of the claim (“The modern field of artificial intelligence was born in 1956 at the Dartmouth Workshop where the term was coined and was initially conceived as a project that might take a few decades to conclude (Frankish & Ramsey, 2014)”). The precise definition of AI (in the context of this article) is provided in the previous paragraph (“…it is the study of the synthesis and analysis of computational agents that exhibit intelligent behaviour”).

    “…it has not lived up to the enthusiastic predictions of its supporters.” You noted that this was far too vague. I have provided two specific examples of the initial enthusiasm of AI pioneers (Simon and Minsky), and linked them to a specific aspect of AI research, i.e. the development of artificial *general* intelligence. The passage has been rewritten to read, “However, until recently, progress in AI research has not lived up to the enthusiastic predictions of its initial supporters. This was particularly true with respect to the claims made by pioneers in AI research around the development of artificial general intelligence. Predictions such as, ‘…machines will be capable, within twenty years, of doing any work that a man can do’ (Simon, 1965) and ‘In…3 to 8 years, we will have a machine with the general intelligence of an average human being’ (Minsky, 1970), have turned out to be completely wrong. While we are still not much closer to a generally intelligent machine, there are three main characteristics of modern AI research that demonstrate how and why the development of AI-based systems has seen significant growth over the past ten years (Susskind & Susskind, 2015)”.

    You suggested that the author consider a more appropriate phrase than “the shelf life of a human education gets shorter”. I have added a citation and changed the sentence to read, “…the relative value of a professional degree is reduced (Susskind & Susskind, 2015)”.

    You noted that the claim that there is “…no evidence that any health professions programmes are even considering the incorporation of data science…” is far too broad and likely wrong. I have changed the sentence to read, “It may be necessary for professional programmes to integrate data science, deep learning, and behavioral science into their undergraduate curricula in order that health professionals are able to develop, evaluate, and apply algorithms in clinical practice (Obermeyer & Lee, 2017; Hodges, 2018).” I have also removed “statistics” from the list.

    In addition to the suggested changes mentioned above, I have also tried to use the spirit of the feedback to make other changes throughout the paper. These changes all deal with the use of colloquial language, missing citations, and vague claims.
