A missing brother. Chickens. Lithium.
What do these three things have in common?
Each one featured in an article I read recently, and the common link between the articles was that all three described how AI is being used in projects involving, respectively, missing people, chickens, and lithium.
I am not a technologically sophisticated creature. When chatter about AI started to surface a few years ago I must admit that, at first, I struggled to understand what AI even was. During 2023 that chatter built to a crescendo of proclamations - some euphoric and some doom-laden - about the far-reaching impact that AI is going to have on us. But amongst the wilder predictions, some good journalism and useful commentary about AI also surfaced. I want to share three good articles with you in this post. Although they seem to be about disparate things, they each speak to how AI can be applied and, taken together, they helped me towards a fundamental understanding of how AI functions and, if used towards ethical ends, the good it could do in the world*.
In ‘AI breakthrough hope in missing man case’, we learn how AI-powered software can search and analyse large volumes of disparate data - locations, telephone numbers, names, and bank details - to track down traces of missing people. The software can also transcribe audio and then summarise its findings, highlighting patterns that may point to a missing person’s whereabouts.
In ‘AI discovers new materials that could slash lithium use in batteries’, we learn how AI has been used to search vast amounts of data and identify an electrolyte that could reduce the amount of lithium needed in batteries by up to 70%.
And, in my favourite article out of the three** - ‘Fowl language’ – some researchers share how they have been using AI to analyse audio data to decode chicken language:
“Chickens are quite the communicators — their clucks, squawks and purrs are not just random sounds but a complex language system.”
These three articles highlighted some things about AI for me:
We have all been told - ad nauseam - that AI is good at analysing data, but these three articles gave different examples of just how good it is. In all three cases, the amount of data that needed to be processed was so huge that it would have been almost impossible for humans to handle.
“AI can analyze vast amounts of audio data… our algorithms are learning to recognize patterns and nuances in chicken vocalizations. This isn’t a simple task…” – ‘Fowl language’.
In the example of tracking down missing people, I imagine that sifting through and cross-referencing personal and financial details across a range of platforms, and then recognising a possible pattern, might be akin to finding a few needles in a very big and messy haystack. The same goes for analysing data to identify battery components:
“The system analyzed over 32 million potential inorganic materials and, within just 80 hours, it managed to whittle this down to 18 promising candidates that could be used in battery development. Humans then tested these candidates and discovered one electrolyte that looked particularly promising… Tasks like this, which essentially involve sifting through colossal amounts of data, are perfect for AI.” – ‘AI discovers new materials’
“Humans then tested…”
The other thing that these articles highlighted for me was how humans and AI worked together. And, for me, this is where the question of human agency – a theme which fascinates me - came in.
In all of these examples, AI was used to tackle data sets that humans would have found overwhelming in ways that were faster, more efficient, more precise, and more comprehensive. But it took clever and innovative human brains to conceptualise and scope the projects, oversee the artificially intelligent research and analysis, test the outcomes, and patrol the ethical boundaries.
During 2023 I went to hear a digital artist and designer talk about his work. He told us that he had started experimenting with AI and was enjoying the process. His advice to us? ‘Treat it like a collaborator. It’s an alien; it doesn’t think like us. But treat it like a collaborator.’ I feel that in these three articles we can see examples of these alien+human collaborations.
There is one final point about human agency that I was glad to see in these articles: decision-making to use AI for projects that clearly have beneficial goals.
“Natural deposits of lithium are relatively scarce, while mining can be costly, damaging to the environment and local communities, and prone to rile up geopolitical conflict. This is why scientists, as well as the powers that be, are so keen to find materials that could be used as a substitute.” – ‘AI discovers new materials’
The importance of easing the anguish of the families and friends of missing persons, and perhaps of uncovering instances of foul play, is obvious. And even in the case of understanding the language of chooks, what might seem a cute and quirky use of AI has far more serious objectives:
“Understanding these vocalizations can transform our approach to poultry farming, enhancing chicken welfare and quality of life… Our journey into decoding chicken language is more than just an academic pursuit: it’s a step towards a more empathetic and responsible world.” – ‘Fowl language’
I am not a technologically sophisticated creature; after a lifetime of professional creative practice, starting off with a stint in the performing arts, narrative is more my thing. And what has irritated me greatly about the discourse around AI is how much flimsy narrative has been thrown around: AI will save us; AI will kill us; AI is ‘better’ at creativity than us; AI is ‘smarter’ than us. The technological numpty in me has no idea what to make of all this, but the creative worker with a background in the arts and humanities recognises dodgy narrative when she sees it.
For me, articles like these serve to mitigate the dizzying effect of the hyperbole surrounding AI and help me to understand some of its potential.
*I’ll get onto whining about what worries me about AI in another article.
**Blame my mother. She would keep chickens. We all doted on them.
Thank you for reading!
Do you want to support this project? Here are some things you can do:
Subscribe to this free Substack.
Share this post and help me get past the stifling effect of social media algorithms.
Contribute to my crowdfunding campaign so I can write more about this. Thanks to the ACF Boost program, your donations will be matched dollar-for-dollar with a funding ‘boost’ up to $2,000 from Creative Australia.