Healthcare Technology

By Ben W Morrison and J Michael Innes

In these uncertain times, the demand for psychological services is at an unprecedented level. While the emergence of artificial intelligence offers great promise in addressing the public’s needs, we highlight the increasing disruption to the nature and delivery of psychological practice, which positions the profession at a critical juncture.   

The last three years have seen enormous pressures placed on communities, from climate-related disasters, to the COVID-19 pandemic and the resultant social and economic turmoil, to the threat of war escalating to unthinkable exchanges of weapons of mass destruction. With this assault on our lives has come a renewed appreciation of the role of psychology as a discipline. Psychologists in their many areas of practice are employing their skills across a range of diverse, impactful areas, from helping alleviate the stresses and strains that now beset ordinary lives, to developing the selection and training systems that enable the upscaling of national defence forces. Consequently, demand for these services, and for the skilled practitioners and researchers who provide them, is at an unprecedented level. Fortunately, emerging technologies are primed to help address any shortfall.

Despite initial projections of the impact of automation-enabling technologies on the psychology profession being rather modest (economists Frey and Osborne (2013) famously estimated the probability of psychology being automated at just .0043), technological disruption in the profession has already been substantial. For instance, the automation of aspects of psychological assessment has seen the profession take enormous strides in the efficient appraisal of clients in numerous contexts (e.g., therapeutic, organisational). Many psychological assessment methods, including tests, surveys and even interviews, are now largely delivered online and unproctored, with data collection, scoring, interpretation and report-writing all automated. More recently, machine learning (ML) techniques have extended our capacity to draw inferences about clients from more ‘organic’ sources of data that obviate the need for a human assessor (Groves, 2011). For instance, information drawn from social media may provide insights into clients’ depression (De Choudhury et al., 2013), while the text mining of narrative comments can predict aspects of workers’ job performance (e.g., performance ratings, promotions) beyond the capacity of traditional rating measures (Speer, 2018). Concerns that ML may introduce bias that prevents appropriate modelling of behaviour are being addressed, and new models are emerging (D’Mello et al., 2022; Jacobucci & Grimm, 2020).
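
To make this concrete, the sketch below shows, with entirely hypothetical data, one generic way that narrative comments might be mined to predict a numeric performance rating, using a TF-IDF plus ridge-regression pipeline in Python. It is an illustration only, not the modelling approach used in the studies cited above.

```python
# Minimal, illustrative sketch only: hypothetical data, generic pipeline.
# Not the method used in Speer (2018) or the other studies cited.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Hypothetical narrative comments paired with supervisor ratings (1-5 scale)
comments = [
    "Consistently exceeds targets and mentors junior staff",
    "Misses deadlines and needs frequent follow-up",
    "Reliable, collaborative and quick to learn new systems",
    "Struggles to prioritise competing demands",
]
ratings = [4.8, 2.1, 4.2, 2.6]

# Turn the free text into TF-IDF features, then fit a simple regression model
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(comments, ratings)

# Score a new, unseen comment
print(model.predict(["Proactive and dependable, often leads team projects"]))
```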

Even the delivery of psychological interventions is not off-limits, with ‘bots’ administering therapies such as Cognitive Behavioural Therapy (CBT) to alleviate clients’ depression and anxiety (Vaidyam et al., 2019). Other forms of therapist-guided internet interventions are also showing promise (Titov et al., 2015). Taken together, these advancements bring into focus the apparently ‘shrinking’ role of the human psychologist in the delivery of psychological services, as well as that of the researchers who develop and test psychological theories, theories that may yet prove redundant in the age of Artificial Intelligence (AI) and ML (see recent discussions of a ‘post-theory’ science; Spinney, 2022).

The surging effects of automated technologies that can replicate, and even exceed, the cognitive capacity of human workers are now abundantly clear. Such machines circumvent the well-established biases that plague human judgement, which are too often ignored by practitioners and the professions at large. They also provide an antidote to the variability in competency observed among experienced practitioners (e.g., Vollmer et al., 2013). The development of deep learning algorithms and the emergence of predictive analytic systems (Rahwan et al., 2019; Sadler & Regan, 2019; Sejnowski, 2018), with access to big data and the mass of information available in the psychological and related literature, point to a more comprehensive and arguably safer application of the evidence base. Ultimately, such advances may signal a restructuring of the psychology employment market, which has already been greatly affected by the onset of the COVID-19 pandemic.

Many of those who argue against the adoption of technologically dependent methods for delivering psychological assessments and interventions point to the likelihood of clients preferring human, face-to-face interactions. The massive adoption of telehealth and wellness apps in the post-COVID world, however, has drastically reduced face-to-face interactions and provides further evidence that clients will readily engage with technology-mediated psychological services. Recent data from Australia (Oracle, 2020) reveal a preference among workers for technology-mediated counselling, with 68% reporting that they would rather talk to a robot than to their manager about stress and anxiety at work. This preference extends beyond mental health to other areas commonly serviced by psychologists, such as career progression, with 82% of respondents believing that robots can support their careers better than humans can (Oracle, 2021).

As clients become increasingly accustomed to these forms of service delivery, and as the technology continues to improve rapidly, client approval is likely to increase further. At the very least, the improved efficiencies will increase the number of clients that can be managed by a single human psychologist, but we cannot rule out the mass displacement of practitioners in the future. Although such ‘doomsday’ projections may be premature for now, fields such as psychology already show the hallmarks of mass disruption to the nature and delivery of services. Such technology has the power to greatly alleviate the pressures placed on both members of the community and those of a highly valued profession, offering a way to keep pace with the demands of an increasingly volatile and uncertain world.

Whilst practitioners’ employment is assured for now, the profession is rapidly approaching a judgement day of sorts: a time when we must establish which parts of a psychologist’s role must remain exclusively human. For instance, human empathy is invariably cited both as a critical capacity of psychologists and as one presumed to fall outside the capabilities of existing forms of ‘narrow’ AI. However, the mass migration of psychologists and their clients to highly systematised and automated forms of assessment and intervention (e.g., CBT; Leichsenring & Steinert, 2017) means that the expected value of empathy among future psychologists is likely to have been significantly overstated.

Revealing the extent of those components of psychological practice that are genuinely human-centred will shape the identity of the profession in the coming years. The nature of the education and training of psychologists needs to be examined and restructured to take account of the technological changes that are occurring and of the changing preferences of the general community. We have previously pointed to the possibility of advanced technologies such as AI having a significant impact upon the profession of psychology (e.g., Innes & Morrison, 2017, 2021a, 2021b, 2021d; Innes et al., 2022), but there has been little response from the profession to date.

We acknowledge that there are legitimate concerns about conceptual and methodological issues in the development of AI. For instance, we contributed to the debate about the validity of research into human-robot interaction (Innes & Morrison, 2021c). But we also realise the need to respond to the positive developments in AI that will undoubtedly transform the nature of psychological work. Indeed, we believe that there exists an extraordinary opportunity to both elevate and futureproof psychology as a human profession. In opening our eyes to the rise of the machines, we can best shape the trajectory of our expertise in a way that maximises our potential to help our clients.

About the Authors

Ben W Morrison is a Senior Lecturer in the School of Psychological Sciences at Macquarie University, Australia. He is an organisational psychologist with interests relating to expertise development and the impacts of emerging technologies, especially artificial intelligence. He has a particular interest in applied research, and as such regularly undertakes research projects within real-world work settings or simulations of such environments.
J Michael Innes is an Adjunct Research Professor at the EU Jean Monnet Centre of Excellence, University of South Australia. He has taught at the Universities of Edinburgh, Michigan, and Adelaide. He is a social psychologist, with interests in the psychology of social influence, group dynamics, psychological methodology and the social consequences of technological development, especially artificial intelligence.

References