At the moment there’s a Senate inquiry into the new computerised Integrated Assessment Tool that is being used to assess eligibility and assign funding levels for aged care services. It is completely automated, with no human override function when the algorithm spits out an assessment that is inappropriate, insufficient or just plain wrong. At the inquiry, the First Assistant Secretary of the Department of Health, Disability and Ageing, Robert Day, said:
The no override comes from the fact that that is an objective outcome….If you have these scores from your assessment, you get this level of classification … there’s no discretionary element (Guardian, 3 April 2026)
SPOILERS
I was reminded of this when reading Laila Lalami’s The Dream Hotel. My library has designated it ‘Science Fiction’, but there’s not much science fiction about it: it’s just an extrapolation of what is already here. Set in an alternative present day, in response to a moral panic about the rising crime rate, the Risk Assessment Administration has been charged with investigating suspicious individuals in order to prevent future crimes, and it can draw on myriad data sources in order to do so. American citizen Sara Hussein is pulled from the arrivals line at the airport because her risk score is too high. An archivist by training, she has been attending a conference, and she bristles and pushes back at being flagged as a risk. The risk assessment has picked up on a complaint from a fellow passenger who was off-loaded from the plane before take-off after she assisted him when he was having trouble breathing; her explanation that her employer paid for her flight was questioned because, technically, she had not yet submitted the receipts to recoup her expenses. But most damaging of all for her assessment was the information that the authorities could access from her Dreamsaver, a device that she, along with many other Americans, used to maximize the value of her sleep. As the mother of young twins, trying to keep her career afloat, she had turned to the device to overcome her insomnia, and although she didn’t realize it, there among the terms and conditions was her permission for the data to be handed on to a third party if required by a law enforcement authority. Her dreams revealed a propensity to violence, they claimed, and so she needed to be assessed further at the Madison Retention Centre.
So began her months-long stint in a ‘retention centre’ which increasingly became prison-like, with 24-hour surveillance, curtailed freedoms presented as ‘privileges’, and enforced work. The organization contracted to run the Madison retention centre, Safe-X Inc., has its own internal economy. It has contracts with outside clients like film studios to have AI-generated video content assessed for its verisimilitude, and its internal laundry and catering facilities fall under Safe-X budgets. Communications are provided and monitored by the AI-driven PostPal, and there is a commissary where residents can purchase goods with their own money or with funds provided by their families. Sara can receive visitors, but the scheduling program is capricious, cancelling her visits without any recourse. Her Dreamsaver is monitored daily, and periods of detention can be extended at whim by the Attendants. In the narrative, you (and she) are never quite sure what is dream, or increasingly nightmare, and what is real.
What seems a Kafkaesque and dystopian situation does come to an end (the book has an ending, after all) when she resists, using the time-worn tactics of strike and solidarity. In fact, the book is almost optimistic in its ending:
…isolation is the opposite of salvation…she owes her release to the women who joined together to say no….Freedom isn’t a blank slate…[it] is teeming and complicated and, yes, risky, and it can only be written in the company of others…This is what Madison has given her, even as it has taken so much from her: the knowledge that she isn’t alone, that she doesn’t have to be. (pp. 321-322)
This is a fantastic book. I had only ten pages to go, so I sat at the station and let my train go past, wanting to finish it. It seems that so many articles and events are converging: I have just read Anna Krien’s ‘The Screens That Ate School’ from The Monthly (2020), and at a recent appointment my doctor asked me if I would agree to Heidi, an ‘AI co-pilot for modern healthcare’. Did I read the screen after screen of Terms and Conditions? Did I take the doctor’s word that the recording of my appointment wouldn’t go any further than her computer? Do parents have the courage to push back against Google and Apple programs in their schools? No, no and no. This book isn’t Science Fiction: it’s a warning.
My rating: 9/10
Sourced from: Yarra Plenty Regional Library
Read because: I had read excellent reviews of it.