
Dark Technologies

Michael Richardson, Antony Loewenstein and Toby Walsh

In the wrong hands for the wrong ends, these are incredibly powerful technologies that will be used by state actors, by non-state actors, to commit harm and to commit atrocities.

Toby Walsh

Machines lead the charge on today’s battlefields, but what does this mean for the people caught in the crossfire? 

Learn from journalist Antony Loewenstein, whose Walkley Award-winning investigation, The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World, uncovered the widespread commercialisation and global deployment of Israeli weaponry tested in Palestinian territories. Antony is joined by AI expert Toby Walsh, whose new book, Faking It: Artificial Intelligence in a Human World, explores how AI impersonates human intelligence.  

Listen to this vital conversation with host Michael Richardson about the intersection of technology, conflict, occupation and surveillance.

This event was presented by the Sydney Writers' Festival and supported by UNSW Sydney. 

Transcript

Michael Richardson: Hello, good afternoon and welcome to Dark Technologies at the Sydney Writers' Festival. My name's Michael Richardson, I'm an Associate Professor in Media and Culture at the University of New South Wales in the Faculty of Arts, Design and Architecture, and an Associate Investigator with an ARC Centre of Excellence on Automated Decision Making and Society.

My research looks at how emerging technology, power and culture intersect. I've got my own new book out, although that's not what we're here to talk about; it's called Non-Human Witnessing: War, Data and Ecology After the End of the World. And one of its major themes is how drones, autonomous weapons and artificial intelligence are transforming how we witness and experience war.

So you can imagine that I was thrilled to be asked to have a conversation with Toby Walsh and Antony Loewenstein. These topics are going to feature quite heavily in our conversation today, but it's important to remember, I think, that these new technologies, like facial recognition and drones and ChatGPT and so on, are built on older systems of knowledge and control and power, many of them designed and refined in places like Australia.

Colonisation in this place was not only a matter of guns and steel but of property registers, land surveys, lines on maps and fences on the ground. It depended on categorising people and places, on controlling who moved where and when and how. So I want to acknowledge that today we're meeting on the unceded and sovereign lands of the Gadigal people of the Eora Nation, the traditional owners of this place, and pay my respects to elders past and present and to Aboriginal and Torres Strait Islander peoples who are here today. This always was and always will be Aboriginal land.

Now today's session is a remarkably timely one. This current wave of AI technologies shows few signs of slowing, while tragically we're seeing in Gaza, Ukraine and elsewhere the impact of new technology on war surveillance and control and what that means for the people and places there. And both our authors have been longstanding and outspoken voices on these issues.

So Professor Toby Walsh is a Laureate Fellow and Scientia Professor of Artificial Intelligence at UNSW and CSIRO Data61. He's the Chief Scientist for the UNSW AI Institute and an Adjunct Professor at QUT.

Toby has an astonishingly long list of academic appointments, honours and contributions, but he is best known outside the academy for his fierce advocacy for limits to ensure AI is used to improve our lives, having spoken at the UN and to heads of state, parliamentary bodies, company boards and many other bodies on this topic. That's why he received the prestigious Celestino Eureka Prize for promoting understanding of science and was named on the International Who's Who in AI list of influencers, and also why he's been banned from ever going to Russia. He appears regularly on TV and radio, has been profiled by the New York Times and has authored four books on AI for general audiences, including his most recent book, Faking It: Artificial Intelligence in a Human World.

Antony Loewenstein is an independent investigative journalist, filmmaker, co-founder of Declassified Australia and best-selling author. He's written for mastheads, from the New York Times, to The Guardian, to the New York Review of Books and is the author of a number of influential books including Disaster Capitalism, The Blogging Revolution and all the way back in 2006, My Israel Question. Throughout his career, Antony has been an unflinching chronicler of state secrecy and violence and his reflections and investigations of Israel extend back more than two decades.

So his latest book, The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World – bought a copy of this too – is not only the product of intensive research but builds upon long reflection, investigation and analysis of Israel and Palestine. A global bestseller, The Palestine Laboratory won the 2023 Walkley Book Award and the People's Choice Award at the Victorian Premier's Literary Awards – two of Australia's most prestigious book awards – among numerous other honours and shortlists.

The book has played a major role in debates about the role of weapons technology and the Israeli arms industry in the occupation of Palestine, and these questions have of course become even more urgent in the wake of the Hamas attacks of October 7, which killed around 1,200 Israelis and saw over 200 people taken hostage. The invasion that has followed has, by some estimates, killed over 35,000 people, 14,000 of them children, which is a devastating and ongoing instance of violence.

So Antony, let's start with you and I think it'd be great for the audience if they heard just a little about your book, and so what's the central idea behind The Palestine Laboratory and what motivated you to write it?

Antony Loewenstein: So the reason I started investigating this, I guess a number of years ago, was that I found that so much of the reporting on Israel-Palestine was very much around what happened today or yesterday, and that's not irrelevant. It's important to know what happens in the West Bank or Gaza or elsewhere, and I've been visiting Israel-Palestine as a reporter since 2005, every three or four years.

I went to Israel, the West Bank and Gaza, and between 2016 and 2020 I was living there with my partner in East Jerusalem. And I thought it was really important to understand how Israel was trialing and testing in Palestine a variety of forms of surveillance and weapons that were then ending up around the world. In other words, what was being done to Palestinians – there are about five million or so under occupation in Palestine – was bad enough, of course, but what I was seeing more and more was almost a Palestinianisation of many other conflicts around the world. What I mean by that is – really, as I show in the book – pretty much since Israel was born in 1948, and certainly from the 1950s, there was already an understanding from Israel's perspective that they needed to have a weapons industry to sell to the world, to get influence, essentially. It was a way to make money, but it was more than that: it was also a way to sell a certain idea.

And so I document in the book a range of examples, from apartheid South Africa to the Iranian regime before 1979. Pretty much every single repressive regime since 1948: most people know that the US supported those regimes across the globe, but what is far less known is that Israel was often there with them. And what I was seeing was that what was being done to Palestinians, this horrific, brutal and unrelenting occupation, was also increasingly being replicated on other minorities around the world. That's essentially what the book explains in great detail: many other countries want what Israel is doing.

So this was obviously an issue long before October 7, and even since October 7 – which I know we'll get to in a bit – many nations are looking with admiration at what Israel is doing, not disgust. Of course many people in the world are appalled and disgusted by this unrelenting genocide against Palestinians, but I'm saying many nations and militaries and police forces are looking with pride and saying, how do we get a piece of that technology, that drone? And I'm already seeing evidence of this at global arms fairs since October 7 – there have been two in particular, in Paris and Singapore – where Israel is promoting the tools that they are battle testing in Palestine, in Gaza. So that is the threat that I think we need to be aware of.

Michael Richardson: Thanks Antony.

We're going to get into some more detail on that but I want to give Toby a chance to set up his book for us too, and we can start to bring the conversation together.

So Toby you've written in the past about the role of AI technology in warfare and as I'm sure many in the audience know, Toby's been a leading advocate around the dangers of developing autonomous weapon systems and so-called killer robots.

Toby Walsh: Which includes, I'm sure we'll come to this shortly, the way it's being used in Gaza.

Michael Richardson: That's right, yeah.

But this new book, Faking It, in some ways pursues a broader line of inquiry.

What's the new book about? And I'm curious too: as an AI scientist, someone who really knows the technology, why did you feel compelled to illuminate the fakery of AI?

Toby Walsh: Well, as the title suggests, it is about the artificiality of artificial intelligence, about how increasingly we're going to be fooled by the fake AI and the AI fakes that are out there deceiving us – that they are more capable, or more humane in the case of warfare, than they actually are. And certainly, you know, for most people I think when ChatGPT arrived it suddenly became very concrete. We suddenly had this person – it almost seemed like a person – typing away, that you could have a conversation with, and we saw how easily fooled we were going to be.

But the reason I wrote it is the fundamental Promethean challenge that you have as a scientist. The one that, you know, was so well documented in that Oscar-winning film Oppenheimer, which is: you work on these technologies, they're incredibly powerful, incredibly useful, but they're always a double-edged sword. There are incredible benefits that we will get from artificial intelligence. I'm convinced of that; it's why I get up in the morning and work on it.

But equally, in the wrong hands for the wrong ends, these are incredibly powerful technologies that will be used by state actors, by non-state actors, to commit harm and to commit atrocities. And so I think it's a really important role, as a scientist with some public footprint, to engage the public in the conversation, because these things will be done in your name otherwise. And we need as a society to make choices. You know, as we did with the nuclear bomb, we had to make choices as to what we were going to do with this amazing discovery: were we going to use it for peaceful purposes to generate energy, or were we going to use it for destructive purposes, to possibly even end humanity itself? And similarly with AI, it's a technology which has equally positive and negative applications, and I felt a responsibility as someone who's spent 40 years now working in the field to push forwards and engage the conversation. Wonderful to see all of you in the room engaging with this conversation, because it's about the world we're going to build.

Michael Richardson: Us academics sometimes have a tendency to get into the weeds of our subject matter, and we've been accused from time to time of being a little obscure in the way we say and describe things, but that's not a feature of your public work. And so I was wondering, Toby, how do you translate such deep technical knowledge into text for a public audience that's so clear and easy to understand?

Toby Walsh: I think it's like teaching. I mean, it's about trying to engage people with the story. We're storytellers – you're in a place, this is a place, where people are telling stories – and so you've got to find the stories, you've got to take complex ideas and craft a narrative around them that people can engage with. You know, just like we've always crafted stories, just like Antony crafted a story in his book as well.

Michael Richardson: Turning to your book, Antony – I mean, the story, and the many stories, really, in the book are very compelling, and the level of research that you've undertaken is incredibly impressive. One of the challenges I've often found in my work in trying to write about and research military technologies and their impacts, and similarly with policing and intelligence technologies, is that they're often very difficult to access and to discover things about. In AI and technology writing more generally we often talk about things being black boxed: they're closed up and you can't see inside the black box to understand what's going on. Antony, you have the problem of the technologies themselves being difficult, if not impossible, to see inside, but then you might have national security, secrecy, military confidentiality, and so on, all surrounding what you're going to look at. So how do you overcome that problem?

Antony Loewenstein: It's challenging. One of the things that's always been important in all my work, including in The Palestine Laboratory, is to humanise these stories. I have in this book on Palestine lots of examples and interviews with people who have suffered in Palestine, yes, but more broadly with people who have suffered elsewhere – for example, who have been hacked by Israeli spyware, from Mexico to India. And again this shows what I try to explain in the book, this Palestinianisation: many states seem very keen to mimic what Israel is doing in Palestine.

And you're right, obviously Israel is one of the most secretive countries in the Western world. It doesn't leak much information at all, and it's the only Western country where journalists who write about military issues have to go through a censor – that happens in no other Western country. And I think that partly explains the political and media culture in Israel. Of course there are some good journalists doing good work, but in general I think it's a very sick, blindly patriotic media culture, and frankly political culture. And this I think goes to the heart of why people often ask me why there are not more protests in Israel, for example, about what's going on.

Toby Walsh: There have been huge great protests though.

Antony Loewenstein: Yes and no, there are people protesting /

Toby Walsh: Yes.

Antony Loewenstein: But they're ultimately protesting a particular policy, they're not protesting the broader questions. And that's important to say because, in lots of public opinion polls since October 7, but frankly for years – and I have some of these in the book, and I've said this for a long time – Israeli society has been radicalised – talking about Jewish society now – to the point where hatred of Palestinians is so normalised and mainstream. Obviously not every Israeli Jew thinks like that. But the studies are clear. Since October 7 there have been countless studies of the public in Israel showing how the vast majority think that there's too much aid going into Gaza when people are literally starving to death, that more force should be applied.

And this I think goes to your question, which is: when you have a culture which is so used to impunity for so long – Israel has essentially got away with this for its entire existence – it's only now, maybe, possibly, that there's a beginning. The International Criminal Court wants to arrest Netanyahu and the Defense Minister, and leaders of Hamas, and there are various other attempts to try to bring a degree of accountability. I would hope in years to come that there'd therefore be more pressure within Israel to change the equation, but I'm actually not that convinced that'll happen. I think what'll happen finally is like what happened with South Africa years ago: anyone who talks about the apartheid regime there will say this was never going to change from within. Of course black South Africans opposed it, and a handful of whites did as well, but the vast majority of whites were very happy with apartheid; it suited them very, very well. And this is exactly the same in Israel: unless there is an economic price paid for decades and decades of occupation and racism and apartheid, why would you change? Putting aside the moral reason.

So that secrecy that exists in Israel was challenging: lots of declassified documents, harassing people in a lovely, polite sort of way.

Audience Laughter

Antony Loewenstein: To get information, and people have been giving me more information since, but Israel is a bit of a black box. But it is changing if you know the right people.

Michael Richardson: So, in both books something that really struck me was the role of hype in selling technology – technologies of occupation, and of artificial intelligence more generally – and I come across this too in my own research all the time. You mentioned some of this just before, Toby: that, you know, drones and other technologies will make war more precise, more humane, that we should welcome these developments. And that hype can be pretty powerful.

Toby Walsh: Before you move off that point, I mean, that's I think one of the most interesting moral arguments often put forward: that we are morally obliged to develop these technologies because they will transform warfare in that positive sense. It's fundamentally flawed in a couple of ways. One of which is that that's not the current state of the technology we have; they're actually going to commit lots of errors, a lot more errors.

And the second is that it ignores the fact that they're going to change the nature of war. They're going to allow you to do things that you couldn't previously do; they're going to perhaps reduce the barriers to conflict. You know, the president of the United States can sit in the comfort of the White House thinking he's not risking US soldiers by putting boots on the ground, and he can just send the drones in, not realising that that is just going to inflame the situation. We never solved Afghanistan, and we were never going to solve it by raining terror in from above. You had to literally go in there and, you know, risk people's lives to make that change. And then there's the change in the speed and character of war, the fact that we can now kill people quicker – and maybe they're not necessarily the right people. It's not a simple substitution of humans for robots; you're actually changing the very nature of the way we kill.

Antony Loewenstein: And mass killing actually is the point. I mean, in terms of Gaza, for example, Israel has been extensively using AI in its warfare there, to be sure. We know some of those details; much of it we don't yet. But mass killing of Palestinians, that is the point. The idea that Israel believes, in a profoundly deluded way, that by doing that, by killing tens of thousands of people, that makes them safer, or any Jew, or anyone, safer, is insane. And interestingly enough, even a few days ago, just as an aside, US intelligence released an assessment report – and obviously we take everything the US says with a grain of salt, but let's say it's true in this case – in which they claim that only 35 percent of Hamas fighters have been killed. Now we're seven-plus months in, Israel has killed 35,000 to 45,000 people at least, the vast bulk of them civilians, women and children and obviously men. So essentially a relatively small number of Hamas fighters, the US and Israel claim, have been killed.

So therefore the end result is what? Gaza is rendered unlivable, Israelis are far less safe than they were, frankly, on October the 6th, and all this technology that Israel is deploying with the assistance of the US – and I might add Australia too, which we can get to if you like – has shown that mass killing is the end game. There is no point beyond death in this warfare, in this particular case and in other contexts as well, but particularly in Gaza.

So if you ask, what is the Israeli thinking along those lines, I think revenge is the primary driver. And although yes, there have been protests in Israel against some aspects of the war, let's not be convinced that that actually is what the majority of Israelis think. Any replacement of Netanyahu, if he falls tomorrow, is no different on these issues. That's the sad reality: the occupation is Israel. Israel is the occupation, that is it, and until we accept that and, I think, as a globe respond to that, we're living in this world of delusion.

Audience Applause

Toby Walsh: I think it's worth pointing out these are, you know, ultimately crimes against humanity, right? There is /

Antony Loewenstein: As the ICC alleges.

Toby Walsh: Yeah. And we have enshrined in international humanitarian law the principle of proportionality, and it's very clear that 20 civilian deaths, or even possibly 100 civilian deaths, for one Hamas fighter is violating what the international community has accepted as a proportional response to a threat. You know, of course, both sides have committed crimes against humanity, right? Going in and taking those hostages, and killing those people – again, that was a war crime as well. But, you know, one war crime does not justify another war crime.

Michael Richardson: Yeah, I mean, we saw in Afghanistan, over two decades of American drone warfare – a different kind of occupation, but an occupation – not just huge numbers of civilian deaths but also transformations of the way people live, which I think too is part of the purpose behind these kinds of technologies, right? Where, for example, if you have drones circling overhead and you go out to hang up the washing, or to visit a friend, or to have a meal, and you can hear the buzzing sound of a drone overhead that you can't see, you have to live with the idea that a bomb might be coming any second, and that you will not have enough warning to escape it. And so the way people live their lives changes.

So we saw in Afghanistan for example that traditional tribal councils and forms of gathering started to disappear, because those were deemed potential threats. A bunch of men, often armed, sitting around in a circle would be sort of considered something that needed to be monitored and perhaps acted against. And so you know when we think about the applications of these technologies, obviously the killing and dying is horrific, and part of the crimes against humanity, but so too are the transformations of the way people actually live their lives day to day.

Toby Walsh: And then you have to think, how is this actually going to resolve itself? You're just radicalising, understandably, people in Gaza. You know, that is only going to encourage and inflame and continue and perpetuate this. It's not a way you're going to get a solution.

Antony Loewenstein: And it's why I think the decision last week by the International Criminal Court against Hamas, yes, but also Israel, is an earthquake. Now obviously, let's see where it goes, no one really knows. But there's a reason why the US particularly, and frankly Australia, is petrified of that. It's not particularly because they're just supporting their ally Israel, it's because it's going to come after them.

As everyone knows, in Iraq and Afghanistan the list of US-led war crimes is long. And it's interesting: America is happy to support the ICC going after Putin – he's an enemy, we can't support that guy. But if it comes after us, Australia or the US, or Israel, we apparently say the ICC has no legitimacy. Sorry, there's a reason why the legitimacy of the US and Israel, and many other Western states, in those states and frankly in the Global South, has never been worse. This is profound hypocrisy, and it can be dressed up in a multitude of languages of AI, and whatever other tools Israel and the US choose to use. You can't take away from the fact that people can smell bullshit, and that's what it is!

Michael Richardson: I'd like to, if you don't mind, come back to this question of hype and the economies of AI and weapons technologies. So I wondered if you, Toby, could talk a little bit about the role that hype plays in selling AI, because one of the themes throughout your book is this idea of imitation and fakery, and so on. Some of that is about what AI is and how it works, but some of it too is about pitching these technologies and trying to sell them to all of us to transform our lives. I think if you turn on the TV, or – I was going to say open a newspaper, but what I mean is click on a website, open TikTok for the young people – you will find stories about how AI is transforming the world around us. How much of this is real, how much of it is hype, and what does the hype do, do you think?

Toby Walsh: Yeah, in the book I talk about how it's one of the original sins of the field, from the very beginnings in 1956 when the field started. The very first proposal to work on artificial intelligence promised to make significant progress over the course of a summer. And of course we didn't solve it over the course of the summer of 1956; nearly 70 years later we're still working on it. But people have been over-promising and under-delivering ever since. That's delivered us some fantastic technologies, but equally it's inflated people's expectations.

And we're also quick to fool ourselves. I mean, it's a human trait: we see a machine, a computer, do something intelligent, and we immediately think, oh well, it's smart like us! We built a computer that can play chess, as we did in 1997 with IBM's Deep Blue, which got to play chess better than Garry Kasparov. And you think, oh well, if it can play chess, that's an intelligent activity, all the people I know who can play chess well are intelligent people. But that AI couldn't do anything else but play chess. Indeed, it couldn't even play a variant of chess. There's a variant of chess called killer chess where the idea is to lose your king as quickly as possible. It wouldn't be able to play that game; humans would be able to adapt.

So we apply our values, our experience of intelligence – the rich experience that we have when we open our eyes, and are alive and experiencing – and we think, oh, if the machines do something intelligent they're going to be like us, they're going to do lots of things. Not realising that they're not, and that they actually will be quite brittle, and therefore we should be very reluctant, for example, to hand over responsibility for deciding who lives and dies to machines that are brittle.

Michael Richardson: This kind of hype and mythologizing around technology is really significant in the Israeli arms industry, but also in this, sort of, idea that you write about in the book about the myth of Israel as a startup nation, full of innovation that’s just sort of bubbling away out of a mix of like, you know, inbuilt creativity and good government policy or something. What do you mean by this sort of myth of the startup nation in Israel, and how's it tied to arms?

Antony Loewenstein: I mean, the startup nation philosophy – ideology, propaganda, really, is what it is – has been pushed in some ways very much since the beginning of Israel's existence, but it really took off in the last 20 years. Essentially it was promoted by pro-Israel propagandists, and Israelis as well, who advocated for its technologies, not just in the defense sector, but particularly in the defense sector: how Israel was a tiny country with a tiny population, which is obviously true, and how it was developing these amazing tools that could assist all of us, from water management, to environmental support and climate change, to weapons.

And as I talk about extensively in the book, the one thing that was usually ignored in most of this coverage was that it was pushed mostly by my profession, the media – and talking about AI hype, it's coming from my cretinous profession. That's where it's coming from! Journalists who are on the teat of these companies. Not necessarily in a literal financial sense, but that's how it happens.

Why is Elon Musk, who clearly is a bigot from way back, celebrated as someone who's a visionary? He is not! And what's remarkable about the startup nation concept is that in some ways you'd think October 7 would be the end point of that. Israel had spent billions and billions of dollars surrounding Gaza, the 2.3 million Palestinians there, in this open-air prison – huge amounts of technology, walls, surveillance tech. It was breached by what is essentially a guerrilla army, Hamas. Yes, it was a very sophisticated attack in a way, but it was breached.

And you'd think that, logically speaking, how would any country now look at Israel and say, yeah, we're going to buy amazing surveillance tech from you? But in fact, so far at least – and it's only seven, eight months after that event – I see no evidence that nations have stopped doing so, or have stopped being interested in purchasing some of the tools and hardware that Israel has been testing in Gaza, and indeed now in the West Bank too.

So hype is a massive factor, but the hype can often only be stopped if people in my profession think about that and actually ask those serious questions. So much of the IT hype in the last nearly 20 years – and you see this constantly in articles, at the bottom it often says person X was a guest of company Y in San Francisco, or whatever. Now that doesn't by definition mean that the story is going to be crap, but usually it does. And this is the problem: people in my profession are willingly giving up their skepticism for access.

Access is what is killing my profession, in my view. Access is important, you need to speak to people. I mean, speaking to sources is partly like a romance – I don't mean in a sexual sense, I mean in the sense of getting to know people, and trying to find information. But that access can lead to what I think we often see with the Israeli arms industry, which is still celebrated in many parts of the world, including here, in much of the media, even despite Israel committing genocide in Gaza.

And finally, even many countries that have opposed publicly what Israel is doing in Gaza, rest assured in the coming years, they will want to buy some of the repressive tech that Israel has been using in Gaza. That's how it works. As they often say, look at what countries do, not what they say.

Michael Richardson: How big a role does the defense industry play in the Israeli economy?

Antony Loewenstein: It's big. It's big.

The latest figures we have for the Israeli arms industry are from 2022. We haven't got the figures yet from last year, not officially anyway, but they will almost certainly be higher. And in 2022, Israel was the ninth or 10th biggest arms dealer in the world. America remains number one – yay America – but Israel was about ninth or 10th. It sold about 12 and a half billion US dollars of weapons. We don't have an exact number of how many Israelis work in these industries, but it's a sizable proportion.

And the industry there, as I said before, is not just an arms industry. It's not just we're building surveillance tech. Yes, that's obviously part of it. We're testing it in Palestine on Palestinians, and then we're selling it globally as battle tested. That's essentially how this sick industry works. But it's also an ideological belief.

This idea that, as I said before, which starts very young in many… in Israeli society, and again, like in any example, there's exceptions to this. But if you dehumanize Palestinians so much for so long, it's not such a big deal if you need to surveil them the whole time. They're terrorists. What choice do we have? We have to surveil them. We have to monitor them. They're terrorists. This is how it works.

And the fact that there is so little dissent within Israel too means that the arms and surveillance industry can thrive. And Israel promotes itself as the only democracy in the Middle East, which has always been bullshit: if you're Jewish, maybe yes, but if you're not, clearly not. And the only way that arms industry can be curtailed or challenged, even after the last seven months, is an arms embargo. It's the only way. As I said before, if a country feels economic pain, it changes. Beyond that, words are no longer enough.

So it's a big part of Israeli society and has been for years. And in fact, because Israel's economy has taken a bit of a hit from the ongoing war in Gaza, that arms industry, from its perspective, will need to grow. They'll need to promote it even more. And that's something that we should be really opposing, clearly.

Toby Walsh: Antony, I get the impression from my Israeli friends and from time I've spent in Israel that it really is deeply embedded, not just as an industry sector, but into the fabric of the society. The way that everyone has to do national service for many, many years. Everyone walks around with guns, because when they go home at the weekend, they have to keep hold of their gun. The universities are very closely connected to the military. Everyone who does military intelligence ends up in a startup, or working for the defense industry. I mean, it's just in the fabric of the society.

Antony Loewenstein: It is.

Toby Walsh: Everyone is engaged in it.

Antony Loewenstein: And that is why that can only succeed, in inverted commas, if there is a belief that Palestinians are not equal to me as a Jew. That's the only way that can work. Because if you see a Palestinian as an equal to you, you're not going to be convinced that, well, they're all potential terrorists, we therefore have to surveil them 24/7, or monitor them, or drone them, or whatever it may be. That's the only way that can work. And that has worked, in inverted commas, for so long because much of the world, particularly in the West, has given Israel support. And not just financial support, but diplomatic support as well.

And that's still happening, including from here in Australia, where our government's response since October 7 has been woefully inadequate, to put it politely, from our appalling prime minister and government. And it's because they're complicit in this system. I mean, most people aren't aware of this, but Australia for years, like many Western countries, has been part of a global supply chain for weapons that Israel uses in Palestine.

Australia has been, before October 7 and since, sending parts from Australian companies that are being used in Gaza as we speak. So we, as Australians, as an Australian government, are complicit in what is going on. And the Australian government knows that very well. And it doesn't make much of a difference, frankly, whether it's Albanese as prime minister or whether it's Scott Morrison. That has not changed. And that very unhealthy support, and frankly illegal support, considering what Israel is doing, needs to be changed and stopped.

Michael Richardson: Toby, you've been a very outspoken advocate on the global stage for regulation of autonomous weapon systems and AI technologies, including some of the technologies that Antony has been talking about. Backstage, you were expressing just a tiny little bit of cautious optimism that we might be moving in the right direction on that issue. But we've had decades of stalled progress on regulating AI and autonomous weapons through the Convention on Conventional Weapons, which is a confusing name for something straightforward: it's in contrast to nuclear weapons.

So in the Convention on Conventional Weapons, there's been discussion for decades now about how we regulate these things. And you've gone and spoken at various conventions and meetings of that group. So could you talk a little bit about that experience and why you think there might be a little shift happening, perhaps?

Toby Walsh: Well, I mean, in a positive sense, it is positive that a random professor in Australia could end up speaking at the United Nations to the diplomats trying to make these important decisions. And it's worth being positive. We have made some positive decisions as a global society: chemical weapons are outlawed, biological weapons, cluster munitions, blinding lasers. There's actually quite a significant raft of technologies that we've decided are inhumane ways to fight war, against our morals and the public appetite.

And we've got plentiful other ways of defending ourselves, so we don't need to use those technologies. And so there have, as you say, been discussions at the United Nations now for the last 10 or more years around, what are the rules? What are going to be the guardrails around using AI in warfare?

And the thing that's always troubled me is that I'm pretty sure we're going to decide that this is going to make warfare not a better thing, not a safer thing, not a more humane thing, but a more terrible thing. And therefore, we probably should put some guardrails in.

The thing that's always troubled me is that it's only after, if you look at almost every one of those regulations, it was only after the event, after we had seen the horrors of chemical warfare in the First World War, after we saw the horrors of the nuclear bombs being used at the end of the Second World War, that we actually had the ability to say, okay, let's draw a line here. Let's actually decide that there's some things that we shouldn't do as a world.

That's the thing that troubles me: I'm pretty sure we're going to, at some point, decide that some of these uses of AI are actually not making us a safer world, not making warfare a better thing, but making it a worse thing. But we'll only do that once we've seen it on our TV screens.

But the good news is that after 10 years of discussion, there was a very important meeting last month in Vienna, where many nations of the world came together, brought together at the initiative of the Austrian government. And many nations are now actually saying that, declaring that we should regulate this space. What worries me, what troubles me most though, is what we see actually in Gaza. It's easy enough when it's concrete, when it's an autonomous drone, a piece of kit like that. But it's when you see systems like Lavender, where you're now getting AI systems deciding who the targets are and telling humans where to do the killing.

And as far as we can tell, most of those targets are not legitimate. And you hear stories about how the IDF are going to the people running these systems saying, give us more targets, we need more targets. And so there's no reasonable human oversight of the targeting decisions being made. And the sad irony is that while AI is being used to make the decisions, it's humans that are being used to do the killing.

Michael Richardson: Yeah, I mean, I've had exactly the same worry as you for some time now. And part of the problem, it seems to me, emerges if we contrast what's happening with AI in warfare, and national security more generally, with nuclear weapons, which require huge amounts of state resources and very specific, difficult-to-obtain materials to build.

It's incredibly difficult to have a nuclear program that no one knows about. And even though we have had struggles over nuclear proliferation, and I do not want to minimize the risks of nuclear warfare, there's perhaps a different capacity to regulate that. Versus AI, where the techniques we see in a tool like ChatGPT, or other AI systems that we use, can be taken up and applied in a bunch of different places, and big tech companies rather than states are the main drivers of the development.

So it seems to me there's a big cause for concern about who's driving this too, states, yes, but also lots of other actors.

Toby Walsh: It is, but it's easy to get pessimistic on this front, because AI is going to be a more accessible technology. But then look at the example of chemical weapons. Chemical weapons are a really accessible technology. But what stops chemical weapons being used? Ultimately, it comes down to public disgust. That's why they don't get used.

Antony Loewenstein: They are still used.

Toby Walsh: Unfortunately, yes.

Antony Loewenstein: Including in Gaza right now.

Toby Walsh: Yes, including Gaza.

Antony Loewenstein: But yes, in general, less than they used to be.

Toby Walsh: They're not used that much.

Antony Loewenstein: Less than they used to be.

Toby Walsh: And when they are used, there's headlines in the New York Times, there’s…

Antony Loewenstein: Well, that helps, yeah.

Toby Walsh: That helps, yes.

But ultimately, it comes down to the fact that the public has decided, rightly so, that these are incredibly distasteful. We do not want anything to do with that being done in our name. Which means, and this is the most important limitation, arms companies do not openly sell you chemical weapons. And that limits their proliferation. Organizations, both state and non-state actors, decide it's probably not going to help their cause to be seen to be behind a chemical weapon attack.

There are plentiful other things, conventional explosives and conventional ways of killing people they can use, so they don't go down that alleyway. And that largely contains the threat of chemical weapons, and they don't get used. You're right, they still get used, in Syria, Gaza, various other places, and that is terrible. But they're largely not used, because the public, you, have decided that they shouldn't be. And that's what we have to decide with AI in warfare, with autonomous weapons. We have to decide that that's not to be done in our name.

Michael Richardson: You mentioned a minute ago the system Lavender, which has been reported on by +972 Magazine, an Israeli investigative outlet, along with Gospel, a similar program, that's used...

Toby Walsh: Who comes up with these wonderful names?

Michael Richardson Laughs

Toby Walsh: Lavender sounds like something you'd want.

Michael Richardson: I believe they're both translations from Hebrew, but I won't attempt to bungle them.

Toby Walsh: Do they have better meanings in Hebrew?

Michael Richardson: No, I mean another part of that tool is called Where's Daddy?

Toby Walsh: Where's Daddy? Oh my God. Sorry, excuse me.

Michael Richardson: So there are these reports about these targeting systems, and these are pieces of software, essentially, large AI models that are interfaced through some kind of piece of software. And as we talked about before, it's very difficult to know exactly how such a system works. Even if we had it in front of us, it might take Toby a while to take it apart and give us some guesses about how it actually functions.

But I did want to ask you, in general, how would an AI target recommendation system work? If you had to just speculatively take us through what it might look like to have a system that looks at a population of people and attempts to identify targets.

Toby Walsh: The sad thing is, four or five years ago, there was this big demonstration. Workers at Google resigned over Project Maven, which you may remember, where they said, we're being asked to develop object recognition systems, AI systems to recognize targets.

The problem is we're drowning in a sea of knowledge, right? We're getting all this intel back. We've got satellites, we've got drones, bringing all this imagery back. Too much imagery for a human to look at. So of course the obvious solution is to throw some AI at the problem, and you could do that. And the people working for Google said, well, wait a second, this could be used to automate targeting. And they were pooh-poohed at the time. Google said they weren't going to do that, but Palantir, another tech company, took up the contract.
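
[Editor's note: to make the scale problem Toby describes concrete, here is a minimal, purely illustrative sketch, with no relation to Lavender, Gospel, Maven, or any real system. A recommendation pipeline of this general shape is just a classifier plus a confidence threshold, so a human only ever sees the machine's shortlist. The classifier here is a stub and every name in it is invented.]

```python
# Toy sketch of automated triage of an imagery stream (hypothetical, not any
# real system). A classifier scores each image; only high-confidence hits
# are surfaced for human review, so the human never sees what the model drops.

def classify(image):
    """Stub classifier: returns (label, confidence).
    A real pipeline would run a trained object-recognition model here."""
    return image["label_guess"], image["signal"]

def triage(images, threshold=0.8):
    """Return only the items the model flags with high confidence."""
    flagged = []
    for img in images:
        label, confidence = classify(img)
        if confidence >= threshold:
            flagged.append((img["id"], label, confidence))
    return flagged

# Invented example data standing in for an incoming imagery feed.
stream = [
    {"id": 1, "label_guess": "vehicle", "signal": 0.95},
    {"id": 2, "label_guess": "building", "signal": 0.40},
    {"id": 3, "label_guess": "vehicle", "signal": 0.85},
]
print(triage(stream))  # → [(1, 'vehicle', 0.95), (3, 'vehicle', 0.85)]
```

The worry the speakers raise lives in exactly this structure: "human oversight" only reviews whatever the model has already shortlisted.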

And five years later, we see exactly, we see it being operationalized in Gaza. And we see exactly the worries that people said five years ago, which is that, they'll make mistakes! And they will make mistakes.

Antony Loewenstein: It's not mistakes. No, I disagree. It's not mistakes.

Toby Walsh: Well, actually, no, yes, you're right. They know.

Antony Loewenstein: Exactly. So exactly. Mistakes suggest, oh, sorry, we just killed, you know, your entire family. I don't know. It's willful, it's willful destruction. This is the point.

Toby Walsh: They're fully aware.

Antony Loewenstein: They're fully aware. I mean, obviously under the guise that we're going after terrorists. But ultimately, yeah, it's a conscious decision not to care whether you kill huge numbers of Palestinians, which is exactly what is happening. To what end? To the end that Gaza has been rendered unlivable. And the ultimate outcome will be that Israel, I mean, you asked for, sort of, a positive ending, so to speak, because we've got three minutes to go.

Audience Laughter

Antony Loewenstein: The so-called positive, which is on one level impossible to say when there are literally still mass killings going on at the moment, so I'm not saying this in any glib way. But Israel, for the first time, certainly in my adult life, is facing a backlash and pressure which it has never experienced before. And it's only beginning. And that's something that I obviously welcome. What would that look like? Divestment, potential arms embargoes, sanctions. Not every state is going to impose that next week. I'm under no illusion about that. The US won't, yet.

But there's no doubt that Israel is aware that it is facing increasing global isolation, and they will bombard us with influencers being sent to Israel saying what a glorious, wonderful democracy it is. Mark my words, I know the script. I don't think the majority of people, including young people, who are the future, thank God, will buy that bullshit. Because they're not buying it now, while it's happening. This is not based on my view; it's based on public opinion polls, in the US and elsewhere, and here.

So it's something to be aware of that it took decades and decades of apartheid South Africa before finally enough of the world came on board and said, enough. Enough. You're a pariah state. You either join and change and evolve and treat people equally, or you're going to be excommunicated. This is how it works. And the last seven months have shown every reason why that needs to happen sooner rather than later.

Michael Richardson: Indeed.

Audience Applause

Michael Richardson: So we have AI technologies being deployed in genocide in Palestine right now against Palestinian people. We've seen AI applied in a host of harmful ways, including in this country and around the world in various kinds of applications.

Toby, you've talked a lot over many years, and you've written in your book about some things that we might do to try to tame AI, to bring it back in line a bit. So what would be your number one desire to see? What's the number one action that people here, or the government, could take?

Toby Walsh: As I said, I think, as Antony says, it comes down ultimately to public pressure. And so we live in democracies. Every time you go to the ballot box, every time you turn up at a political demonstration, you're helping to change the world.

Michael Richardson: Be part of the public pressure, everybody.

I'd love to continue this conversation for another hour, but I'm afraid we've come to the end of the time that we have. So thank you to the Sydney Writers' Festival for bringing this important topic to the stage. And thank you all for being here.

Please join me in thanking Toby Walsh and Antony Loewenstein.

UNSW: Thanks for listening. This event was presented by the UNSW Centre for Ideas and the Sydney Writers' Festival. For more information visit unswcentreforideas.com

Speakers
Antony Loewenstein

Antony Loewenstein is an independent journalist, best-selling author, filmmaker and co-founder of Declassified Australia. He's written for The Guardian, The New York Times, The New York Review of Books and many others. His books include Pills, Powder and Smoke, Disaster Capitalism and My Israel Question. His documentary films include Disaster Capitalism and the Al Jazeera English films West Africa's Opioid Crisis and Under the Cover of Covid. He was based in East Jerusalem 2016–2020.

Michael Richardson

Michael Richardson is an Associate Professor in Media at UNSW Sydney in the School of the Arts and Media. His transdisciplinary research looks at how technology and culture intersect with war and ecological violence.

Richardson recently held an ARC Discovery Early Career Researcher Award for a project on drones and witnessing, and is now working on the intersection of automation, military technologies and the climate crisis.  

The author of academic books, journal articles, chapters, and essays, Michael often appears in the media to discuss drones, surveillance, and military technology. He is the co-director of the UNSW Media Futures Hub and the Autonomous Media Lab, and an Associate Investigator in the ARC Centre of Excellence for Automated Decision-Making and Society. 

Toby Walsh

Toby Walsh is Chief Scientist of UNSW.AI, UNSW Sydney’s new AI Institute. He is a strong advocate for limits to ensure AI is used to improve our lives, having spoken at the UN and to heads of state, parliamentary bodies, company boards and many others on this topic. This advocacy has led to him being "banned indefinitely" from Russia. He is a Fellow of the Australian Academy of Science and was named on the international "Who's Who in AI" list of influencers. He has written four books on AI for a general audience, the most recent being Faking It! Artificial Intelligence in a Human World.
