Friday, 05 April 2019

3AW Melbourne Tom Elliott Drive



Subjects: social media laws

TOM ELLIOTT: Okay. Now, I mentioned this new law that actually passed through the Parliament yesterday. I don't know how they did that with all the wash-up of the Budget, but they did. Now, it's essentially a reaction to the massacre in Christchurch two and a half weeks ago. The new law is entitled the Sharing of Abhorrent Violent Material Bill and what it seeks to do is stop people who engage in violent acts like terrorism or murder or assaults or kidnapping or rape from filming it and putting it online. And of course, we know that Brenton Tarrant, the alleged Christchurch gunman, that's exactly what he did - he livestreamed his assault on the two mosques.

Now, I see what the Government is trying to do. They want to try and stop people from doing it. The bill - and I read through it - also puts the onus on media organisations and hosting platforms, and I guess companies like Facebook, to take down such material if it is posted. But I have concerns about it. I do have concerns. For example, it says that if you kidnap someone and you film yourself doing it, that is the sort of violent abhorrent act that should be taken down. And yet three years ago, Sally Faulkner, whose ex-husband had kidnapped their two children and taken them to Beirut in Lebanon, she and a Channel 9 film crew went into Lebanon, into Beirut, and tried to extract the two kids and in fact, the encounter was filmed. Now, arguably under these laws, an Australian mum trying to get back her own kids would not be allowed to film the efforts and put them on a website. I just wonder if this legislation has been properly thought through.

Joining me on the line now, the Attorney-General of Australia, Christian Porter. Mr Porter, thank you for joining us.

CHRISTIAN PORTER: Tom, good to be here.

TOM ELLIOTT: Okay. So look, I've read through the bill. What sort of material are you trying to stop being posted to the web?

CHRISTIAN PORTER: Well, the absolute, most serious and violent materials. So, if I could just perhaps clarify with respect to that kidnapping example, kidnapping is specifically defined for the purposes of the act as a scenario where a person detains another person without their consent for the purpose of holding the person to ransom or hostage, or to murder or torture the person, or cause them serious harm. So quite clearly, the scenario that you've described where the mother was attempting to retrieve children from a foreign country wouldn't fall into that definition of kidnapping. So it's meant to do two things with respect to defining abhorrent violent material. Firstly, narrowly define the material as the most graphic violent types of offending, but most importantly, the footage has to emanate from, be created by the perpetrator themselves. So it is a very narrow category of video footage, if you like, and what we are saying is with respect to that very narrow category of video footage, there are particular responsibilities on social media platforms and other platforms to ensure that they become aware at an appropriate time, reasonable time, that they're carrying that material and when they do, to take it down expeditiously.

TOM ELLIOTT: Okay. Well, can I just throw another scenario which is realistic? So, during the Syrian civil war, the Free Syrian Army, which was declared to be a terrorist organisation by the official government of Syria, that's Bashar al-Assad's Government, it quite often filmed battle scenes. Now, in those battle scenes, they were shooting artillery and firing machine guns and whatever and they would have been killing Syrian soldiers in the process. Now, that sort of imagery was put up on the internet. In fact, it was widely used by many news groups out there. Under this law, I reckon that'd be stopped because they were officially a terrorist organisation by the Assad Government and they filmed themselves engaging in military action against the Assad Government, and that film or that video was put up on many websites and used on many news services.

CHRISTIAN PORTER: Well, I'm not overly familiar with that particular piece of footage but terrorism has a particular definition and indeed, the definition of terrorism has been particularly narrowed for the purposes of defining abhorrent violent material. So the terrorism has to involve physical impact on other human beings … harm.

TOM ELLIOTT: Well, this terrorism involved- well, sorry, but this terrorism involved firing artillery pieces and shooting 50-calibre machine guns. I mean, I would say that's a pretty decent physical impact.

CHRISTIAN PORTER: Well, but showing the firing of a weapon is very different from showing the impact of the weapon's ballistic projectile into a person that kills them. So, it's the harm here. And I understand your point that even with respect to the Christchurch video, there was a range of material in the video which, if you like, was preparatory to the actual violence of the video, and this bill doesn't seek to prevent that from being used by media outlets.

TOM ELLIOTT: But I'm sorry but it would, because if you showed a battle scene that was graphic - and they've had graphic battle scenes from - going back to the Vietnam War. I remember people talking about how you watch the nightly news in the late 60s and early 70s and how graphic it was. I mean, take that little girl who was running away from a napalm attack. What I'm saying is, this bill would stop that sort of footage.


TOM ELLIOTT: It would. I'm sorry, it would.

CHRISTIAN PORTER: No, it wouldn't. But let me explain why, because that footage of that young girl in Vietnam, who was horrifically injured with napalm, wasn't taken by the perpetrator of the violence. It was taken by a journalist, right? So you can understand that this is about violent footage that emanates from the perpetrator of the violence...

TOM ELLIOTT: Okay. But you're ignoring the recent example I gave you - the Free Syrian Army. They did film themselves in battle. They did share that footage because they were trying to get the rest of the world to come onto their side. And I'm just saying that they were, however, officially deemed to be terrorists by the Assad Regime, and under this law, them posting that footage of them fighting a battle against someone most of the world thought was a monster, being Assad, that would not be allowed.

CHRISTIAN PORTER: But there is a difference between footage of a battle and footage of someone being killed, literally the act of their death, and no mainstream media in Australia show that sort of footage on their nightly news. They just don't do it, which you'd acknowledge, surely.

TOM ELLIOTT: No, I don't know. I've seen some pretty horrific …

CHRISTIAN PORTER: Well, when was the last example of Channel 7 News showing someone being physically shot and killed?

TOM ELLIOTT: I can't think of Channel 7 News but I can tell you I've seen things on plenty of websites like …

CHRISTIAN PORTER: Well, they shouldn't be there. Like, murder should not be shown on websites, I'm sorry.

TOM ELLIOTT: Well wait a minute. But this is the point. This is the point. Is a resistance army fighting against an evil dictator, which does involve deaths, is that murder? And the point is the evil dictator would say yes, it is murder and …

CHRISTIAN PORTER: Well, that would be up for the prosecution to have to prove evidentially in a matter of this type, and there are a range of defences about reporting in the public interest, about having to show the footage that is necessary to uphold a law. So those sort of marginal cases are catered for in this legislation.

TOM ELLIOTT: Well, I'm sorry. I disagree. I read through the legislation and I reckon it's got a few holes in it. Okay …

CHRISTIAN PORTER: Well, every example you've given me of the holes is itself holey. I mean, you haven't provided an example …

TOM ELLIOTT: Alright. Well, here's another one. One of my producers emailed me the Christchurch footage before I went to air two and a half weeks ago, so I knew what was going on. Now, we are an organisation that hosts material. We circulated that material, admittedly, only amongst ourselves, but we did circulate it and it was available for people in here to look at. Did we - this law didn't exist back then - but did we break a law by doing that?

CHRISTIAN PORTER: No. This law doesn't deal with the on-sharing, if you like, and we think in excess of a million people downloaded it and saw it or shared it, which is one of the problems obviously, and in about 1.5 million instances Google managed to stop that. But this bill doesn't deal with the on-sharing by people who might have gone on to the original platform that hosted the material and downloaded it and on-shared it. I mean, that is still itself an issue …

TOM ELLIOTT: Well, I'm sorry. Section 474.33 - a person commits an offence if they provide a content service - I would say that's us; that's exactly what we do. The person is aware that the service provided by the person can be used to access particular material that the person has reasonable grounds to believe is abhorrent violent material. So …

CHRISTIAN PORTER: …that is occurring or has occurred in Australia, and they don't need to do that …

TOM ELLIOTT: Well, let's just, okay.

CHRISTIAN PORTER: … They don't need to do that if they reasonably believe the AFP is already aware of it. That's the notice.

TOM ELLIOTT: … But we didn't know. We didn't know. Okay. Let's just, I mean, obviously …

CHRISTIAN PORTER: How could you not know- how could you not be reasonably aware that the AFP knew that the attacks were going on in Christchurch when it was on every major news service in the country?

TOM ELLIOTT: Yes, but this is the thing: sometimes we get tip-offs about things and we get stuff. We're the first people to see it. It happens all the time.

CHRISTIAN PORTER: I'm happy to talk about every example that you raised but you've not yet raised an example that would cause anything that any reasonable citizen would think is problematic under these laws. They're narrowly designed to stop platforms like Facebook livestreaming and then having available to play the most horrific events of mass murder that a 10-year-old can go online and see. And we're a Government who decided that we had to do something urgently because there was no appropriate law that would give us recourse against an organisation that was so derelict in its duty with respect to content that they recklessly allowed to exist on their server and to be accessed.

TOM ELLIOTT: So do you think there's enough carve-outs here, enough protection, for journalists to be able to do their jobs?

CHRISTIAN PORTER: Absolutely. And there's a specific defence for any journalist who, in reporting news, considers that it's in the public interest to relay the material - they have a defence in that respect. But again, the type of material that this bill specifically is designed to prevent being available on electronic platforms is the type of thing that responsible editorial decisions ensure doesn't get shown on the nightly news or on a radio station's website; and the reason it doesn't get shown is because responsible journalists, like yourself and your editorial staff, make a clear decision before they allow anything like that to be shown as to whether it's appropriate and in or outside the law.

The problem with Facebook and Twitter is that this material goes up; in the case of the Christchurch material, it livestreams for 17 minutes. At the 29-minute mark, there's a complaint, and they do precisely nothing to remove it from their hosting service until after the New Zealand Police formally called them to tell them that it's on their site. Well after that point in time where the rest of the world …

TOM ELLIOTT: So, but is that what you really want: you want the Facebooks of this world to look at these sorts of clips and sort of make a decision on the spot: yes, this should go down, before the police come and knock on their door?

CHRISTIAN PORTER: Well, the bill describes a set of circumstances where if a hosting service like Facebook is reckless as to the existence of content that we would describe as abhorrent violent material and then doesn't make efforts to take it down expeditiously, they commit an offence. Now, I can't tell you precisely at what point in time it was reasonable for Facebook to act - a jury would have to make that decision - or at what point they were reckless as to the material being on their website. But what I can say is it's totally unreasonable that this goes on for well over an hour and the rest of the world knows about it. They themselves receive a formal complaint at 29 minutes that the material is on their platform and they do nothing until the New Zealand Police called them to take it down. I mean, no one thinks that that's reasonable, and there has to be ultimately a sanction which can be brought against an organisation that behaves that irresponsibly with respect to its content.

Now, we very much hope that this will change behaviour of the major social media platforms with respect to their content.

TOM ELLIOTT: Thank you so much for joining us.

Christian Porter there, the federal Attorney-General. I don't have the same certainty that he does. Obviously, he had carriage of the legislation. I just feel that there are going to be some grey areas and there are going to be times when we might look to see or view or host footage that others might find uncomfortable. We might think it has a purpose and the law might say otherwise.