Artificial intelligence in clinical practice: Implications for physiotherapy education

Article under review

This article is currently under peer review and has not yet been accepted for publication. While it may still be referenced at this web address, please bear in mind that amendments to the article may occur as a result of the review process.

Abstract

About 200 years ago the invention of the steam engine triggered a wave of unprecedented development and growth in human social and economic systems, whereby human labour was supplanted by machines. The recent emergence of artificially intelligent machines has seen human cognitive capacity augmented by computational agents that are able to recognise previously hidden patterns within massive data sets. The characteristics of this technological advance are already influencing all aspects of society, creating the conditions for disruption to our social, economic, educational, health, legal and moral systems, and will likely have a more significant impact on human progress than the steam engine did. As AI-based technology becomes increasingly embedded within devices, people and systems, the fundamental nature of clinical practice will evolve, resulting in a healthcare system that requires profound changes to physiotherapy education. Clinicians in the near future will find themselves working with information networks on a scale well beyond the capacity of human beings to grasp, thereby necessitating the use of artificial intelligence to analyse and interpret the complex interactions of data, patients and the newly-constituted care teams that will emerge. This paper describes some of the possible influences of AI-based technologies on physiotherapy practice, and the subsequent ways in which physiotherapy education will need to change in order to graduate professionals who are fit for practice in a 21st century health system.

Reviews

Author: David Nicholls
Review date: 16th July 2018
DOI: to be allocated
Permalink: Review - Artificial intelligence in clinical practice: Implications for physiotherapy education

Michael

Thank you for a very interesting and engaging article on AI and its implications for clinical physiotherapy practice.

Overall I thought the article set out a strong argument for physiotherapy educators and regulators to think carefully about the way physiotherapy thinking and practice will evolve in the near future. Here are a few comments and thoughts on your argument.

Firstly, I wonder whether it is entirely accurate to portray the technological advance of human history in terms of three distinct periods: a pre-machine age, and a first and second machine age. Many historians, for instance, argue that the development of the printing press and the means of mass food production were at least as significant as the invention of steam power. And later inventions like electric light and telephony probably brought about different machine ‘ages’ in their own right. Hence I think your argument that ‘For thousands of years the trajectory of human progress showed only a gradual increase over time with very little changing over the course of successive generations. If your ancestors spent their lives pushing ploughs there was a good chance that you would spend your life pushing a plough’ may be going too far in service of a particular rhetoric.

I also wonder if you create too much of a sense of epoch when you talk about ‘The introduction of machine power [making] a mockery of previous attempts to shape our physical environment and created the conditions for the mass production of material goods, ushering in the first machine age and fundamentally changing the course of history’. I’m not sure it mocked previous cultures so much as supplemented them. Evidence for mockery would be useful if you hold to that view.

Throughout your paper there is an implicit socio-political critique that remains unstated. For instance, Marx and Engels argued that the Industrial Revolution created the conditions for worker exploitation and the creation of capitalism, and many have argued that the move to big data (mentioned later in your piece) is really a manifestation of corporate neoliberalism (big data capture and manipulation by the ‘Big Five’). Is there a place here, then, for a critical comment on whether these machine ‘ages’ also further corporate interests over the personal? (It would be interesting, for instance, to comment on how much of the AI discourse is currently being driven by American-based military interests.)

I wonder, too, whether there is an inadvertent binary created between machines and humans, ‘us’ and ‘them’ – particularly later in the article. You talk about healthcare professions being ‘replaced by more appropriate alternatives’, about learning ‘when to hand over control to them, and when to ignore them’ (my emphasis). You talk about us being deluded about how replaceable we are, and of finding a niche in areas that ‘computers find difficult – or impossible – to replicate’, and you situate this firmly in a traditional humanistic notion of ‘care’.  Surely this goes against your earlier argument that ‘we’ should learn to integrate better with the kinds of augmented intelligences that are now increasingly available; that we should be redefining the boundaries of ‘human’ thought and capabilities like reason, cognition, and even caring; and that we should cease to make artificial distinctions between humans and machines.

To that end, I wonder if a brief discussion of the possibilities of AI to radically reshape our notion of ourselves is in order. It would be good to see if you think this is any different to the way people talked about telephones or the railways or looms in the past. You mention, for instance, that people will need ‘technological literacy that enables them to understand the vocabulary of computer science and engineering that enables them to communicate with machines’. But I’ve never needed to be an electrical engineer to use the phone, so why should things change now? You also suggest that machines will be the repositories of vast amounts of information that humans cannot possibly handle. But, again, hasn’t this always been the case? Is not a library just a vast machine (repository) for the storage and retrieval of knowledge? We have, of course, seen times in the past when massive data on the population became necessary. The birth of epidemiology and statistics in the 18th century was made necessary by the growth of urban (poor and rebellious) populations, and so much data was generated that it was impossible for humans to hold it all in their heads.

Finally, there are one or two typos and grammatical errors within the text that can be ironed out in the final review of the paper. Thank you again for the provocation to think through these issues. I look forward to your response to these thoughts.

Conflict of interest statement

The author of this paper is also the Editor of the journal. As such, all editorial decisions regarding peer-review and subsequent editorial processes around this submission are managed by others on the editorial board.
