Artificial intelligence in clinical practice: Implications for physiotherapy education

Article accepted

This article has been accepted for publication. Peer reviews and author responses are available at the end of the article.

Abstract

About 200 years ago, the invention of the steam engine triggered a wave of unprecedented development and growth in human social and economic systems, whereby human labour was supplanted by machines. The recent emergence of artificially intelligent machines has seen human cognitive capacity augmented by computational agents that are able to recognise previously hidden patterns within massive data sets. The characteristics of this technological advance are already influencing all aspects of society, creating the conditions for disruption to our social, economic, educational, health, legal and moral systems, and are likely to have a more significant impact on human progress than the steam engine. As AI-based technology becomes increasingly embedded within devices, people and systems, the fundamental nature of clinical practice will evolve, resulting in a healthcare system requiring profound changes to physiotherapy education. Clinicians in the near future will find themselves working with information networks on a scale well beyond the capacity of human beings to grasp, thereby necessitating the use of artificial intelligence to analyse and interpret the complex interactions of data, patients and the newly-constituted care teams that will emerge. This paper describes some of the possible influences of AI-based technologies on physiotherapy practice, and the subsequent ways in which physiotherapy education will need to change in order to graduate professionals who are fit for practice in a 21st century health system.

Reviews

Author: Ken Masters
Review date: 16th July 2019
DOI: 10.14426/opj/20190716
Permalink: Review - Artificial intelligence in clinical practice: Implications for physiotherapy education

The paper is rather speculative in places, but this is inherent in the nature of the topic, so is not necessarily a weakness.  For the most part, the speculation is well-grounded in what is currently known and practised in both AI and medicine, and reads easily and fluently.  While I’m not sure that the paper adds a great deal to our knowledge of the topic, it does provide a useful summary for those who are not entirely aware of AI in medicine and its implications.

Some issues to consider

  • Overall, I would like to see a few concrete examples of where AI is currently being used in physiotherapy and physiotherapy education. (The last bullet point in my comments should also be noted, because, if the authors wish to argue that nothing is being done in the field, then they are opening themselves up to a storm of criticism).
  • “The field of AI research was first identified in the 1950s …” Depending on how intelligence in machines is defined and described, this date could be set a hundred years earlier.  I would not recommend that the authors use the expression “first identified” unless they are using it in a very particular way, and then they would need to define that way.  A reference, in any case, is recommended, as “in the 1950s” is very vague, especially given that Alan Turing’s “Computing machinery and intelligence” was published in 1950.  If this is the authors’ reference, then it should be stated properly and cited.
  • “…it has not lived up to the enthusiastic predictions of its supporters.” Again, this is far too vague.  Many of the predictions did not have dates, and many of the predictions have occurred, so a blanket statement like this could be easily challenged, and threaten the integrity of the paper.
  • “the shelf life of a human education gets shorter” “shelf-life” might be considered a rather colloquial expression for an academic journal, and, perhaps, the authors can consider a more appropriate term.
  • “Unfortunately, there is no evidence that any health professions programmes are even considering the incorporation of data science, deep learning, statistics, or behavioral science into the undergraduate curricula even though this is what is required to develop, evaluate, and apply algorithms in clinical practice” This is really far too broad.  The authors simply cannot make such a blanket statement.  At best, they can argue that they have not been able to find any evidence, or much evidence.  But for that statement to have validity, the authors will have to give details of how they searched for that evidence, because the lack of evidence may be a failure to find, rather than a failure of existence.  In fact, the statement is plain wrong, as it says that there are no health professions undergraduate curricula programmes that include statistics.  That is just wrong.

So, a generally useful read, supplying some food for thought, but it can be strengthened, and some tidying up is required.


Author: David Nicholls
Review date: 16th July 2018
DOI: 10.14426/opj/20180716
Permalink: Review - Artificial intelligence in clinical practice: Implications for physiotherapy education

Michael

Thank you for a very interesting and engaging article on AI and its implications for clinical physiotherapy practice.

Overall I thought the article set out a strong argument for physiotherapy educators and regulators to think carefully about the way physiotherapy thinking and practice evolve over the near future.  Here are a few comments and thoughts on your argument.

Firstly, I wonder whether it is entirely accurate to portray the technological advance of human history in terms of three distinct periods: a pre-machine age, a first and second machine age?  Many historians, for instance, argue that the development of the printing press and the means of mass food production were at least as significant as the invention of steam-power.  And later inventions like electric light and telephony probably brought about different machine ‘ages’ in their own right.  Hence I think your argument that ‘For thousands of years the trajectory of human progress showed only a gradual increase over time with very little changing over the course of successive generations. If your ancestors spent their lives pushing ploughs there was a good chance that you would spend your life pushing a plough’ may be going too far in service of a particular rhetoric.

I also wonder if you create too much of a sense of epoch when you talk about ‘The introduction of machine power [making] a mockery of previous attempts to shape our physical environment and created the conditions for the mass production of material goods, ushering in the first machine age and fundamentally changing the course of history’.  I’m not sure it mocked previous cultures, as much as supplement them. Evidence for mockery would be useful if you hold to that view.

Throughout your paper, there is an implicit socio-political critique running that is unstated.  For instance, Marx and Engels argued that the Industrial Revolution created the conditions for worker exploitation and the creation of capitalism, and many have argued that the move to big data (mentioned later in your piece) is really a manifestation of corporate neoliberalism (big data capture and manipulation by the ‘Big Five’).  Is there a place here, then, for a critical comment on whether these machine ‘ages’ also further corporate interests over the personal? (It would be interesting, for instance, to comment on how much of the AI discourse is currently being driven by American-based military interests).

I wonder, too, whether there is an inadvertent binary created between machines and humans, ‘us’ and ‘them’ – particularly later in the article. You talk about healthcare professions being ‘replaced by more appropriate alternatives’, about learning ‘when to hand over control to them, and when to ignore them’ (my emphasis). You talk about us being deluded about how replaceable we are, and of finding a niche in areas that ‘computers find difficult – or impossible – to replicate’, and you situate this firmly in a traditional humanistic notion of ‘care’.  Surely this goes against your earlier argument that ‘we’ should learn to integrate better with the kinds of augmented intelligences that are now increasingly available; that we should be redefining the boundaries of ‘human’ thought and capabilities like reason, cognition, and even caring; and that we should cease to make artificial distinctions between humans and machines.

To that end, I wonder if a brief discussion of the possibilities of AI to radically reshape our notion of ourselves is in order.  It would be good to see if you think this is any different to the way people talked about telephones or the railways or looms in the past.  You mention, for instance, that people will need ‘technological literacy that enables them to understand the vocabulary of computer science and engineering that enables them to communicate with machines’.  But I’ve never needed to be an electrical engineer to use the phone, so why should things change now?  You also suggest that machines will be the repositories of vast amounts of information that humans cannot possibly handle. But, again, hasn’t this always been the case? Is not a library just a vast machine (repository) for the storage and retrieval of knowledge? We have, of course, seen times in the past when massive data on the population became necessary.  The birth of epidemiology and statistics in the 18th century was made necessary by the growth in urban (poor and rebellious) populations, and so much data was generated from that effort that it was impossible for humans to hold in their heads.

Finally, there are one or two typos and grammatical errors within the text that can be ironed out in the final review of the paper. Thank you again for the provocation to think through these issues. I look forward to your response to these thoughts.

Conflict of interest statement

The author of this paper is also the Editor of the journal. As such, all editorial decisions regarding peer-review and subsequent editorial processes around this submission are managed by others on the editorial board.
