Worldwide, machines are undertaking complex tasks, “learning” from the outcomes and improving their performance accordingly. The possibilities are revolutionary, and many are being realised at the University of Adelaide.
“We are now developing technology that can compete with, and sometimes exceed, human capabilities in tasks like recognition, statistical analysis and classification,” says Professor Anton van den Hengel, Director of the University of Adelaide’s Australian Institute for Machine Learning.
The big breakthrough, he believes, has been the advent of “deep learning” technology, a form of machine learning—itself a subset of artificial intelligence (AI)—inspired by the neural networks of the human brain. “That’s enabled machines to distil and interpret huge amounts of prior and incoming information, particularly visual information.”
Professor Ian Reid, a senior colleague of van den Hengel’s and Deputy Director of the Australian Research Council Centre of Excellence for Robotic Vision, agrees. “Artificial neural networks, together with vast computing power and data volume, have enabled a step change in the level of intelligence machine learning can achieve.”
Speeding disease diagnosis
A particularly exciting extension of this pioneering work is the University of Adelaide’s collaborative creation of the world’s first AI microbiology screening technology for use in pathology laboratories. Developed in partnership with Australia-based LBT Innovations (LBT), the Automated Plate Assessment System (APAS) went into production in 2017 and is attracting huge international interest.
Professor van den Hengel led the University’s six-person APAS software development team. He says the system promises to dramatically accelerate patient diagnosis and treatment, giving humanity a powerful new weapon in the fight against infectious disease.
“APAS will enable doctors to order more tests, which will give them more information, sooner. It could even allow country or developing-world hospitals to run their own tests without having to ship samples to a central lab. That would save a huge amount of time, and potentially many lives.”
The system automates the traditionally time-consuming functions performed by microbiologists in screening culture plates after incubation. It takes high-quality images of the plates, then analyses and interprets any microbial growth, matches this against key patient data, presents a diagnosis, and continually updates its own knowledge base.
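That screening workflow can be sketched in broad strokes. The code below is an illustrative toy, not LBT’s or the University’s actual APAS software: the class, the threshold and the stand-in growth classifier are all hypothetical.

```python
from dataclasses import dataclass

# Toy triage loop mirroring the screening steps described above:
# score each plate image for microbial growth, then separate
# confidently negative plates from those needing expert review.

@dataclass
class PlateResult:
    plate_id: str
    growth_score: float   # 0.0 = no growth, 1.0 = confident growth
    patient_notes: str    # key patient data matched against the result

def classify_growth(pixel_intensities: list[float]) -> float:
    """Stand-in for the trained image model: here, simply the mean
    intensity of colony-like pixels, clamped to [0, 1]."""
    if not pixel_intensities:
        return 0.0
    mean = sum(pixel_intensities) / len(pixel_intensities)
    return max(0.0, min(1.0, mean))

def triage(plates, negative_threshold=0.1):
    """Remove confidently negative plates from the workflow and
    forward the rest to a microbiologist for review."""
    review, removed = [], []
    for p in plates:
        (removed if p.growth_score < negative_threshold else review).append(p)
    return review, removed
```

In a real system the classifier would be a trained deep network and the threshold would be validated clinically; the point here is only the shape of the automate-then-triage loop.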
Significantly, APAS also removes non-pathogenic plates from the workflow. “This is very important,” explains LBT co-founder Lusia Guthrie, now Chair of Clever Culture Systems, the joint-venture company bringing APAS to market. “In routine microbiology testing, up to 70 per cent of plates may be negative. Removing them automatically will give microbiologists more time to spend on complex decisions, enabling even greater accuracy and allowing more tests to be run.”
LBT CEO Brent Barnes believes the system will ultimately mean faster recovery for millions. “The science we’ve put into practice through APAS with the University of Adelaide could well become part of hospital protocols all over the world,” he says.
“More, and more accurate, testing will see patients getting the right treatment earlier and spending less time in hospital.”
Guthrie believes LBT’s decision to embed a University of Adelaide researcher in its internal APAS project team was key to the project’s success. Van den Hengel originally appointed computer scientist Rhys Hill to assist LBT in-house with proof-of-principle research. “It worked so well,” says Guthrie, “particularly in terms of communication flow, that Rhys stayed with us throughout prototype development and right up to United States Food and Drug Administration compliance.”
Keen to build on the foundation laid with APAS, the University and LBT are now jointly developing three other related medical devices utilising the University’s AI image-analysis technology.
“Our work with the University of Adelaide has taken LBT to a new level,” says Guthrie. “We’ve transitioned from a ‘robotics company’ to an ‘AI company’, and now we have the opportunity to become a ‘digital health company’.”
Accelerating crop farmers’ adaptation to climate change
Another valuable world-first application of the University of Adelaide’s AI image-analysis technology is in the agricultural sector. Again led by Professor van den Hengel, the technology is being tailored to accurately estimate the potential yields of new cereal varieties after only very short periods of growth, enabling rapid identification of robust varieties able to thrive in harsh conditions.
Here too, the potential global impact is significant—a fact not lost on industry partner Bayer CropScience, which has signed on to help commercialise the ground-breaking technology. While food security has been on the international agenda for some time, the confluence of climate change and rising global population has elevated the issue to emergency status.
“It’s estimated we’ve already lost nearly 33 per cent of the planet’s arable land over the past 40 years through erosion and pollution,” says van den Hengel. “A further increase in global average temperatures could be catastrophic.”
In Australia, a major contributor to the world’s cereal stocks, climate models suggest drought could be as much as 20 per cent more common by 2030 across much of the country, and up to 80 per cent more common by 2070 in the crop-reliant south-west.
“This novel approach promises to transform crop breeding,” van den Hengel continues.
“By using image analysis to understand plants’ shape and structure at all stages of growth, we’ll be able to identify and automatically measure attributes associated with high yields very early in test plants’ lifespans.”
The system uses multiple images taken from numerous angles to construct computerised 3D models of the plants for analysis. Once completed, it will be incorporated into the University’s state-of-the-art Plant Accelerator facility, which provides important complementary capability.
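The multi-view idea can be illustrated with a minimal “visual hull” sketch: silhouettes from two orthogonal camera angles carve a voxel grid, and a simple structural attribute is then read off the result. The two-view setup, grid sizes and biomass proxy below are simplifying assumptions for illustration, not the University’s actual reconstruction pipeline.

```python
# Carve a 3D occupancy grid from two orthogonal plant silhouettes.
# A voxel survives only if it projects into the plant's outline in
# BOTH views -- the intersection is a coarse 3D model of the plant.

def carve_voxels(front_silhouette, side_silhouette):
    """front_silhouette[z][x] and side_silhouette[z][y] are booleans:
    True where the plant occludes that pixel at height z."""
    depth = len(front_silhouette)
    occupied = set()
    for z in range(depth):
        for x, in_front in enumerate(front_silhouette[z]):
            if not in_front:
                continue
            for y, in_side in enumerate(side_silhouette[z]):
                if in_side:
                    occupied.add((x, y, z))
    return occupied

def estimated_biomass(occupied, voxel_volume_mm3=1.0):
    # Occupied-voxel volume as a crude proxy for one measurable
    # structural attribute (above-ground bulk).
    return len(occupied) * voxel_volume_mm3
```

A production system would use many camera angles and calibrated projection, but the principle is the same: agreement across views constrains the 3D shape, from which traits can be measured automatically.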
“The Plant Accelerator’s fully robotic plant management system allows automatic and repeatable control of growing conditions for up to 2400 plants at a time, and enables automatic delivery of those plants to our imaging stations.
“That’s going to allow rapid, detailed and accurate estimations of vast numbers of crop varieties’ potential yields under all kinds of climate-change-related stresses, such as high salinity or drought. We’ve no doubt it will expedite the development of hardy, high-yield varieties and help improve global food security.”
Emulating nature’s perfect pursuit
The University of Adelaide’s machine learning expertise is also making waves in defence and other sectors by enhancing autonomous-pursuit capabilities.
In an entirely novel approach, computer scientists, engineers and neuroscientists at the University have adapted the dragonfly’s neuronal processes into a unique algorithm that emulates the insect’s phenomenal visual tracking capability.
Widely considered nature’s most effective predator, dragonflies are able to target, pursue and capture tiny flying prey in mid-air at speeds of up to 60 km/h—even if that target attempts to disappear within a seething swarm—with an incredible hit-rate of over 95 per cent.
“Tested in various nature-mimicking virtual reality environments, our pursuit algorithm matches the accuracy of other state-of-the-art pursuit algorithms, but runs up to 20 times faster,” says Professor Ben Cazzolato. “So it requires far less processing power.”
Mechanical engineering researchers at the University have also incorporated the algorithm in an autonomous robot that, in testing, has effectively and efficiently pursued targets in unstructured environments.
The interdisciplinary project is being led by neuroscientist Dr Steven Wiederman, of the University of Adelaide Medical School’s Visual Physiology and Neurobotics Laboratory. It was Wiederman’s team that first identified how the dragonfly is able to focus on a single moving target and shut out all else—a remarkable find in itself.
“We recorded the activity of dragonfly neurons, and discovered the first identified neuron in any animal that exhibits an ‘attentional spotlight’,” he explains. “It’s able to select a single target amidst distracters. We also recorded from target-detecting neurons that predictively encode trajectory, enabling the dragonfly to estimate its target’s future location and ambush it.”
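A toy model can illustrate the two mechanisms Wiederman describes: a winner-take-all “attentional spotlight” that stays locked on one target among distracters, and a constant-velocity prediction of where that target will be next. The gains and selection rule below are illustrative assumptions, not the published neuronal model.

```python
import math

def select_target(candidates, predicted, bias=2.0):
    """Winner-take-all 'spotlight': pick the candidate with the
    strongest response, boosting responses near the previously
    predicted position so attention stays on the same target
    amid distracters. Each candidate is ((x, y), strength)."""
    def score(candidate):
        (x, y), strength = candidate
        if predicted is None:
            return strength
        dist = math.hypot(x - predicted[0], y - predicted[1])
        return strength + bias / (1.0 + dist)
    return max(candidates, key=score)[0]

def predict_next(prev, curr):
    """Constant-velocity extrapolation: a crude stand-in for the
    predictive encoding that lets the pursuer aim ahead of the
    target's current position rather than at it."""
    if prev is None:
        return curr
    return (2 * curr[0] - prev[0], 2 * curr[1] - prev[1])
```

Run in a loop, the prediction feeds back into the next selection, so a weaker response near the expected position can still hold the spotlight against a stronger distracter elsewhere.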
Keen to see how much further this translational path can take them, Wiederman and his team are now collaborating with Professor Reid to develop neurobiology-inspired machine learning systems for drone tracking.
“We’re excited to further define the principles underlying neuronal processing. Translating them into advanced artificial vision systems could result in some incredibly effective autonomous robotics and drones, as well as neuronal prosthetics and many more applications.”