
Australia’s aged care algorithm is under fire. At last, someone’s listening


The way Australians are assessed for home-based aged-care funding is being investigated by the Commonwealth ombudsman.

Critics say assessment for funding under the Support at Home program is flawed, leaving some older people unable to access the level of care they need to live safely at home.

Complaints about the process are increasing significantly. Even an expert who helped design the system is unhappy.

Here’s why the Commonwealth should reconsider its approach.

What’s the key issue?

The new Support at Home program was introduced in 2025. One of its aims is to support more Australians to remain at home rather than moving into residential aged care.

When an older person wants to join the program, they are interviewed using a structured digital assessment known as the “Integrated Assessment Tool”. This tool assesses the support they need – physical, cognitive and psychosocial. It also assesses the urgency and the level of assistance required.

An algorithm then analyses the answers and determines Support at Home funding levels.

To be useful, assessments need to predict the actual service levels required for high-quality outcomes for older people with different levels of need.

In developing assessment tools, the gold standard is to first conduct a large number of assessments to see what kind of care older people need, and at what level. The next stage is to determine if the services actually provided produce high-quality outcomes for people with different levels of need.

But there is no publicly available evidence this has been done.

Instead, a second-best option was adopted. Experts gave a score to estimate what level of support someone would need based on answers to assessments.

But even when experts use well-developed tools, there is room for disagreement.

The Integrated Assessment Tool includes 11 separate validated tools, each with an inherent error rate. These error rates compound when they are combined.
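To see how error rates compound, consider a simple sketch. The per-tool error rate below is purely hypothetical – no published figures exist for the Integrated Assessment Tool – and the calculation assumes errors are independent across the 11 tools:

```python
# Hypothetical illustration: if each of the 11 validated tools
# independently misclassifies 5% of the time, what is the chance
# that at least one tool errs in a combined assessment?
per_tool_error = 0.05  # assumed error rate per tool (not a real figure)
num_tools = 11

# Probability all 11 tools are correct, then its complement.
p_all_correct = (1 - per_tool_error) ** num_tools
p_at_least_one_error = 1 - p_all_correct

print(f"{p_at_least_one_error:.0%}")  # about 43%
```

Even modest individual error rates, under these assumptions, produce a sizeable chance that at least one component of a combined assessment is wrong.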

Worse, given there are no studies of the extent to which integrated assessments predict actual services and outcomes, it is difficult to say how good the algorithm is. Lack of transparency means it’s a black box, which is why the ombudsman’s inquiry is welcome.

This is particularly true because funding determined by the algorithm may be systematically lower than funding determined by experts. This means older people may have their cognitive, safety and complex care needs underestimated.

How about human oversight?

Despite the limitations and against expert advice, the Commonwealth has explicitly removed the power to manually override the algorithm’s allocation of support levels. The idea is for the algorithm to provide consistent results for thousands of older people.

However, this approach has a number of serious potential consequences.

The Support at Home program has eight levels of support, ranging from A$10,731 a year for level 1 (the most basic support) to $78,106 a year for level 8 (the highest level of support).

If the algorithm allocates one level of support higher or lower than what a person actually needs, this can mean a difference of between $5,300 and $20,000 a year depending on the level.

Appeals are increasing

If an older person or their family wants to question the funding allocation, they can appeal. But they often don’t know the specific reasoning behind the scoring that led to their allocation. And the appeals process can be cumbersome and stressful.

Some 800 older people have requested a review of their assessment since the introduction of the new system.

The Older Persons Advocacy Network says requests for information and advocacy have gone up by 50% over the same three-month period.

One of the system’s designers, Lynda Henderson, said she felt “fury” that the tool she helped design has been turned into a prescriptive algorithm.

What needs to happen next?

The Robodebt Royal Commission warned government agencies that automated systems must ensure transparency, fairness and human oversight.

But this has not happened when assessing individuals’ circumstances for home-based aged-care funding.

The best approach is to use the algorithm as a guide for making individual decisions about older people’s support needs and to allow assessors to override the algorithm when the circumstances warrant it.

Systems-level data should then be used to refine the algorithm and provide guidance to assessors as the system matures.


© The Conversation