Meditation on Country: Professor Angie Abdilla
I’m always really interested and curious about the different ways in which our old people designed systems that were innately tied to the natural rhythms and ebbs and flows of Country. And they were systems that were didactic, they had utility, but they also had an incredible ability to regulate the ways in which humans and the environment connect. So that’s what keeps me curious and interested.
Angie Abdilla creates video installations interrogating Indigenous deep-time knowledges, automation and AI, focusing on technology as cultural practice. Her research, artworks and films have been exhibited at premier cultural institutions, including the current Data Dreams: Art and AI exhibition at the MCA, and previously, the United Nations; Ars Electronica, Linz, Austria; the International Documentary Festival Amsterdam; the Museum of Old and New Art, nipaluna/Hobart; and the Goethe-Institut, Sydney.
Her pioneering research on cultural governance for AI has influenced governments globally. She is the founder and director of Old Ways, New; co-founder of the Indigenous Protocols for AI working group; has won the inaugural Women in AI Award for Creative Industries; and is a Professor at the School of Cybernetics at the Australian National University.
Transcript
Sue Keay: Good evening, everybody, and welcome to tonight’s conversation with Angie Abdilla, Meditation on Country. My name is Sue Keay and I’m the Director of the UNSW AI Institute.
First, I would like to acknowledge the Gadigal people of the Eora Nation, the traditional owners of the land and waters on which we gather tonight. I would like to pay my respect to their elders, both past and present, and extend that respect to other Aboriginal and Torres Strait Islander people who are with us here today.
Tonight, I’m delighted to welcome Angie Abdilla to join us to share her experience in developing Meditation on Country.
Professor Angie Abdilla is a professor at the ANU School of Cybernetics and is the founder and director of Old Ways, New. Angie is a Palawa woman who works at the nexus of technology and the complexity and sophistication of Indigenous knowledge systems. Through research, film and video installations, she explores technology as a cultural practice grounded in deep-time knowledge and connection to Country.
Her work has been shown internationally and her research on Indigenous-led AI governance has influenced governments across the world. Angie’s Meditation on Country, featured in the Data Dreams exhibition on this level, brings together Indigenous knowledge systems and Western astrophysics by exploring the intersections between ancestral time and cosmic evolution. So please find time to see her artwork. Please join me in welcoming Angie.
Well, first, Angie, if you wouldn’t mind, could you describe what drives you to do the work that you do?
Angie Abdilla: Well, I originally trained as a filmmaker, and so I’m very fortunate and very grateful to be able to make work that comes from a creative basis.
I’m also extremely curious about the various different types of knowledge systems that I’ve learned about, and I’ve been fortunate enough to be taught a lot by elders over the years. And I guess the more you know, the more you realise you know nothing. That, I guess, is what drives me.
I’m always really interested and curious about the different ways in which our old people designed systems that were innately tied to the natural rhythms and ebbs and flows of Country. And they were systems that were didactic, they had utility, but they also had an incredible ability to regulate the ways in which humans and the environment connect. So that’s what keeps me curious and interested.
And I guess I’m very lucky to be able to do that work, to have a research practice that is also creative.
Sue Keay: And tell us a bit about Meditation on Country. What was some of the thinking behind your creation of that piece?
Angie Abdilla: The work is an installation and is essentially based on a creation story that was shared with me. In the sharing of this creation story, many years ago now, I was, and still am, quite blown away by this reference to the Big Bang. When I heard it, I did a double take: are you talking about the Big Bang, Uncle? Yeah, in a really matter-of-fact way.
And so once I heard that, I couldn’t stop thinking: how did they know? And not just about the Big Bang, but about a number of key evolutionary events referenced in this particular creation story, which I later learned aligned with the core knowledges of astrophysics and evolutionary science. So I started to bring different people together: Uncle Ghillar Michael Anderson, a senior lawman renowned for the specificity of his cultural knowledge and sky knowledges;
Distinguished Professor Brian Schmidt, astrophysicist and former Vice-Chancellor of ANU; Karlie Noon, a Gamilaraay woman doing her PhD in astrophysics; and a number of other people. We got into the room and started teasing out these different versions of evolution, and the alignments, the common understandings of these particular moments in time, were quite profound.
So the work itself is really about bringing these two different paradigms of knowledges together in a cross-cultural simulation to highlight the power and the beauty of resonance and vibration in these particular different ways of understanding the world.
Sue Keay: So one of the things I’ve heard you talk about that I find fascinating is your thinking on the resonance of the spaces in between, and the value of looking at the blankness, or is it the black hole? I can’t remember quite how you described it.
Angie Abdilla: Well, the study of sky knowledges is just as much about dark matter as it is about stars and other galaxies and so forth. I’m sure there are probably a number of people in the room who have heard about the Dark Emu, which is about the negative space between stars.
And that is also quite fascinating to me: that there can be quite radically different ways of understanding the same context, and that the different types of knowledges that come from these different ways of seeing, being, knowing and doing are really important. It’s really interesting to me that, in a particular scenario, one group of people will identify particular elements within a frame, or within whatever the context might be, and just not see anything else that exists. Whereas a different group of people will understand, quite intimately, a whole array of other knowledges that are completely invisible to the first group.
So that’s just one particular scenario. This is quite evident in the way First Nations people also understand data. We could go on and on about the different ways of seeing, being, knowing and doing, and there are various examples of this.
And I think it’s important to be able to draw that out because it shapes and informs the way we understand the world, the way that systems and societies are shaped.
Sue Keay: One thing I would like you to draw out: you’ve looked at the intersection between Indigenous knowledge and astrophysics, but you’ve also identified an interesting intersection between computer science and Indigenous knowledge in terms of pattern thinking and pattern recognition, terms that have quite specific meanings in computer science and in the development of artificial intelligence. Would you mind expanding on that?
Angie Abdilla: Yeah, so many years ago, I think it was back in 2003, I ran what I understand to be the very first prototype workshop introducing robotics to a group of young Aboriginal kids from Glebe Primary School. And it was a really great event.
All the kids got so into it. We were using Lego Mindstorms kits. And then after the workshop, myself and a roboticist, Professor Robert Fitch from UTS, co-wrote a paper exploring the workshop, but also delving deeper into what Indigenous knowledges are and into the makeup of this workshop, which was fundamentally interested in drawing out a sense of pride and strength in culture.
So for these young people, we did that through devising a way to introduce the concepts of code through a commonality. In robotics, it’s about time and space, how you move through time and space. And likewise, within Indigenous cultures, there are all sorts of different knowledges and protocols around how you move through time and space.
And so we used these concepts of tracking and moving, and the protocols of moving through different Countries, as a conceptual framework for designing the course. But then in the paper, we delved much deeper into these notions of what pattern thinking and pattern recognition are from an Indigenous standpoint. I started talking about this with Rob, and he said, yeah, I know what you mean about pattern thinking, and told me what he understood pattern thinking and pattern recognition to be. I thought, oh, that’s interesting; I’d never heard about it from that perspective.
My old uncle, who’s passed on now, when we were teasing out these ideas about how we could do this workshop, was teaching me softly, softly and gently, gently. He would give me a little bit more information over time, and it took many, many years to really understand what he was talking about. But this idea of pattern thinking and pattern recognition is essentially about relationality. It’s about understanding how all things are connected and interrelated within an Indigenous paradigm, like kin and Country.
There are all different ways in which we are able to understand those relationships between things: humans and humans, humans and plants and animals, and other realms of things. Pattern recognition is the ability to see those relationships at play, and pattern thinking is an extension of that.
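Relationality of this kind can only be crudely gestured at in code, but as a loose analogy, with entirely invented example data, recognising the relationships around one entity might be sketched as walking outwards through a web of recorded connections rather than looking at any single link in isolation:

```python
# A deliberately crude analogy (not a model of Indigenous knowledge):
# relationships between entities recorded as a graph, and "seeing the
# relationships at play" as walking the connections outwards.
from collections import deque

relations = {  # invented example data
    "person": {"kin", "Country"},
    "kin": {"person", "Country"},
    "Country": {"plant", "animal", "water", "kin"},
    "plant": {"animal"},
    "animal": {"water"},
    "water": set(),
}

def related(start, relations):
    """Everything reachable from `start` -- the whole web of
    connection, not just the immediate link."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in relations.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen - {start}

print(sorted(related("person", relations)))
# → ['Country', 'animal', 'kin', 'plant', 'water']
```

The point of the sketch is only that nothing in the graph is understood on its own: every entity is defined by what it connects to.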
Sue Keay: So you’ve been able to identify quite a close connection between things that are quite technical and cultural knowledge, but unfortunately there’s a huge gap at the moment between the ethical and technical approaches to how AI is designed. Perhaps you could talk about why there is such a gap when, in your experience, you clearly have been able to see where the connections lie.
Angie Abdilla: Yeah, well, what we’ve just been talking about comes from a particular cultural paradigm that extends to all different ways of life. You can see it in the different types of automation, the automated systems that exist within Country. It’s evident within kinship systems, for example. It’s evident in all sorts of different ways of life.
And I guess the problem, from a computer science perspective, which isn’t my background, is that the concept is confined to a very particular, set context. It doesn’t often have relationality, or an ability to extend beyond a very narrow context, like inside computer science only. And this is where, when we were talking about the differences between sky knowledges and astrophysics, one of the differences, and I think there are many, is that Indigenous sky knowledges can’t be understood without a relationship to earth and water, because Country is those three elements combined. All things always have a relationality.
So we’re talking about the way, at a fundamental level, a very Eurocentric, Enlightenment way of thinking tends to rationalise the segregation of things, the segregation of knowledges and so forth. Whereas from an Indigenous paradigm, that’s not the case: all things are related and connected.
So coming back to your question, in a big round loop: within an Indigenous system, complex system, automated system or otherwise, the ethics of the system are part of the system itself. Ethics are not another layer slapped on at the end, or another layer within a technology stack. Ethics aren’t even really talked about in Indigenous cultures or communities. There’s law and there are cultural protocols, and protocols, I think, are often an extension of those laws. They’re the active expression, typically. It’s how you do.
Sue Keay: So in some of the conversations we have around AI governance, where there is a focus on ethics principles, in your experience, if you use cultural protocols instead, they are much easier to translate into programmatic engineering practices?
Angie Abdilla: I wouldn’t say easy. None of it’s easy, really. But I think what is possible is this particular framework that I’ve developed, which was part of a paper called Out of the Black Box: Indigenous Protocols for AI.
We were looking at, when you’re training a model, whether there’s a way to embed these protocols within the model so they’re part of the model itself. And there is. I mean, there’s different capacity depending on the model and what you’re working with.
But in essence, there’s a cultural ethic, and that can be translated: here’s the value or the principle, and here’s how it translates in action. And so you can move through a process where you translate these cultural ethics and cultural protocols into programming protocols and programming logic.
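The shape of that translation, a value paired with how it acts, can be loosely sketched in code. Everything below (the protocol, the names, the consent rule) is a hypothetical illustration, not the actual framework from the paper:

```python
# Sketch: a protocol expressed as a principle plus an executable rule,
# checked inside the pipeline itself rather than audited afterwards.
# The protocol and field names here are invented for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Protocol:
    value: str                    # the principle, in words
    rule: Callable[[dict], bool]  # how it translates in action

# Hypothetical example: data may only be used where consent is recorded.
consent = Protocol(
    value="Community consent governs data use",
    rule=lambda record: record.get("consent_given", False),
)

def admit_training_data(records, protocols):
    """Keep only records that satisfy every protocol; the check is part
    of the system, not a layer slapped on at the end."""
    return [r for r in records if all(p.rule(r) for p in protocols)]

records = [
    {"id": 1, "consent_given": True},
    {"id": 2, "consent_given": False},
]
admitted = admit_training_data(records, [consent])
print([r["id"] for r in admitted])  # → [1]
```

The design point is simply that the rule lives where the data flows, so nothing enters the model without passing through the protocol.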
Sue Keay: So in essence, you’ve created some tools that AI engineers, if they were so motivated, could use at the design phase to insert these cultural protocols as part of the full solution, not just as a layer over the top that may or may not work.
Angie Abdilla: Look, to be honest, it’s long, hard, deep work. A lot of it, for me, comes up when I’m working with a developer: often it’s the questions we ask, and the way they’re framed, that embed assumptions within those particular ways of working, assumptions that are almost invisible because they seem benign.
Cultural assumptions like productivity, or survival of the fittest. A programmer is not consciously programming survival of the fittest into the model; they’re not thinking like that. But if I go back and ask, why did you make this particular decision, where were you coming from, often it’s these cultural assumptions that, unless you’re aware of them and checking yourself, just find their way in there. And so it’s slow work.
It’s slow work. It’s deep work. And we’ve spoken about this before: what is the incentive for a technologist? Not just those who are coding; there are many people within an AI lifecycle, or even just developing the models themselves, and everyone’s contributing these different cultural biases without even realising.
So how do you do this work? Well, it’s finding the right team to begin with, where you can have frank, open, honest and courageous conversations about all things, really. And it’s often finding developers who have an ability to work conceptually, outside of the normative, binary ways of thinking that are quite fundamental to the way these particular sciences are taught.
Sue Keay: And I think what we’re seeing now is the result, unfortunately, of a lot of the technology that is being created. I think you used an interesting term: we now have these general-purpose technologies that are untethered from the communities who are using them. And as you’ve said, it will be a lot of hard work to try and address that situation. But what would the world look like if we were paying more attention to how these technologies intersect with community?
Angie Abdilla: Well, this whole notion of black box systems, they just don’t really exist within Aboriginal culture, as far as I’m aware. So I think that’s a problem. And of course it comes back to these neoliberal models that are also often quite invisible; people don’t even think about them, about the types of tools and technologies that you allow into your world, your bodies and your homes, and the business models that are driving these particular agendas, and therefore the nature and shape of the automated systems that exist within our society.
So I think there’s a need for us to start asking more questions, better questions, but also to have a more critical conversation about what regulation is. It’s a provocative and controversial conversation for some, but I don’t think it’s that scary a conversation. There are all different types of ways in which regulation, with either a capital R or a small r, could occur.
And I think what would happen with that is a lot more confidence in the sector. From what I can see around the world, where there is regulation, there’s a lot more confidence; people are moving ahead in different ways because of that capacity to know: all right, this is safe, this is responsible. Yes, there are very particular guardrails, they’re very clear, and this is how I work with them.
Whereas we don’t have that here. There’s a lot of ambiguity still, and I think it’s not good for anybody.
Sue Keay: So apart from advocating for regulations, how can people here in the audience have some agency and control over the AI tools that they’re being asked to use?
Angie Abdilla: I think we’ve all got to come back to that. There was a documentary about Facebook years ago where one of the ex-Facebook employees said something profound: if you’re not paying for it, you are the product. Essentially it’s the same thing, I think, with AI. Be careful, be mindful, be more discerning about what you’re giving over and how much, and about the ramifications of the types of information you feed into these models. Where is it going?
Sue Keay: So in the final sequence, going back to your artwork Meditation on Country, we see an abstract morphing of soaring Australian native birds created by generative AI. What does this metaphor tell us about the flattening AI can produce, through its frameworks and understandings, on a practical level?
Angie Abdillah: Well, in the making of the work, I was really reluctant to use any large language models and it was through...
Sue Keay: Perhaps you could first just let people know about generative AI and large language models.
Angie Abdilla: I guess generative AI is a product of large language models, and it’s the basis of so much of the AI that we use now. But pre-LLMs, there’s a whole world of AI that exists. And in those various models, I think there’s more capacity for agency and autonomy, and more individual control over what data is used and how it’s being trained. There’s more capacity to work with those models, as opposed to generative AI. So I was very reluctant to use it, but for a whole bunch of reasons at the time in the creative development, my developer said, let’s just give it a go, let me test something for you.
And I remember saying, I’m really interested to see how this data set of native Australian birds would go. And he wasn’t showing anything back after weeks and weeks. I was like, dude, what’s going on? Just show me what you’re working on, please, I need to see something. He’d been reluctant to show me because he was having all these problems.

And the problem was lossy, the technical term for the information degradation, the signal loss, from one node to another. As soon as he showed me, I saw what the issue was: there was this morphing and skewing of the birds, and they weren’t beautiful and pretty like they are in the final artwork. It was a mess. But I could see there was something interesting going on there conceptually. It was like, let’s sit with this, let’s tease this out and see what we can make from it. Because this could be a subversive way of drawing out, of highlighting, the problems we have with LLMs: this lossiness, and the cultural loss that happens, from a technical term that is often not thought of as a cultural loss.
So when we started working with the skewing of these native Australian birds, as opposed to trying to fix it, the morphing reminded me of a very Dali-esque kind of abstraction. And so we pushed that and made them beautiful. In the process, we started thinking about this lossiness in terms of the loss, the extinction, of Australian native birds, and linking it to the really serious, concerning reliance LLMs have on our natural resources, in particular water.
This linkage between LLMs and water is something people are starting to talk about now, but a couple of years ago they really weren’t. So it was a way to create some visibility over the invisibility of Country and its connection within LLMs.
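"Lossy" here is the engineering sense of the word: information lost in re-encoding. As a toy sketch only (an invented example, not the actual pipeline behind the artwork), the way degradation compounds as a signal passes through successively coarser stages can be shown with simple quantisation:

```python
# Toy illustration of lossy degradation: each stage quantises the
# previous stage's output, and error against the original accumulates.
def lossy_encode(signal, levels):
    """Quantise a signal to a fixed number of discrete levels."""
    lo, hi = min(signal), max(signal)
    step = (hi - lo) / (levels - 1) if hi > lo else 1.0
    return [lo + round((x - lo) / step) * step for x in signal]

signal = [i / 100 for i in range(101)]  # a smooth ramp from 0.0 to 1.0
generations = [signal]
for levels in (16, 8, 4):  # each stage re-encodes the previous output
    generations.append(lossy_encode(generations[-1], levels))

# mean absolute error against the original grows stage by stage
errors = [
    sum(abs(a - b) for a, b in zip(signal, g)) / len(signal)
    for g in generations
]
print([round(e, 4) for e in errors])
```

Nothing in any stage can recover detail the previous stage threw away, which is the conceptual point: the loss is one-way, whatever downstream processing does with what remains.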
Sue Keay: Well, thanks, Angie, and thank you everyone here for joining us tonight. This has been co-presented by the UNSW Centre for Ideas and the Museum of Contemporary Art Australia. I’d like to thank Angie for joining us this evening in conversation. Thank you everyone, and good night.
Sue Keay
Sue Keay is the Director of the UNSW AI Institute and founder of Robotics Australia Group, the peak body for the robotics industry. As an expert in robotics, AI and automation, she led the development of Australia’s robotics roadmap leading to Australia’s first National Robotics Strategy. A strong advocate for the Australian AI ecosystem, Sue is a fellow of the Australian Academy of Technology and Engineering (ATSE), a Chaikin medallist, a member of the Kingston AI Group and Chief Executive Women, and is on the board of computer vision start-up, Visionary Machines. Sue has an MBA from UQ Business School, PhD in Earth Sciences from ANU and is a Graduate of the Australian Institute for Company Directors.