Should Australia Ban Teens from Social Media? w/ Cam Wilson

Released Thursday, 5th December 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

We're now at a point where tech is

0:02

stronger than ever. We're seeing the power consolidated

0:04

in just a few tech companies. And so as

0:06

a result, we need to rely

0:08

on government to be able to regulate these

0:10

things. If we're doing things, if we're

0:12

taking steps that we're saying are going to

0:14

help and it doesn't, the next time we push

0:16

for something that maybe is more targeted,

0:18

maybe is more proven, who's to say

0:20

whether the public is gonna support something like

0:22

that? Hello

0:39

and welcome to Tech Won't Save Us, Made

0:41

in partnership with The Nation magazine. I'm your host, Paris Marx, and

0:43

this week my guest is Cam Wilson. Cam

0:45

is an associate editor at Crikey. And based

0:47

on that name, maybe you can tell

0:50

where Cam is from. You've probably heard

0:52

about this policy in Australia where the government

0:54

wants to ban people under the age

0:56

of 16 from using social media platforms, or

0:58

at least some social media platforms. It has

1:00

kicked up a lot of debate, not just

1:02

in Australia, but around the world as other

1:05

countries, or at least the media in

1:07

other countries, start to talk about whether this

1:09

is something that should actually be considered in

1:11

their jurisdictions. I wanted to talk

1:13

to somebody in Australia who understands these

1:15

issues well and can not just run

1:17

me through what is happening there, but can

1:19

actually dig into this as we do

1:21

on Tech Won't Save Us to understand what

1:23

this policy is, what the implications of

1:25

it would be, and whether this is

1:28

really the right approach that we

1:30

should be taking. Because we recognize

1:32

that we should be regulating social

1:34

media platforms that they do not have

1:36

all of our best interests at

1:38

heart as they are making their decisions. Does

1:40

that mean we simply ban people under the age

1:42

of 16 and leave everyone else to use these

1:44

platforms regardless of what the companies do

1:46

with them and the decisions that they make

1:48

about how they should work? I'm not so

1:51

sure about that and neither is Cam as

1:53

you'll hear in our discussion. I'm

1:55

not even wholly opposed to

1:57

age limits on social media or

1:59

certain things online. It seems to me

2:01

that if we have issues with how

2:03

these companies are operating when it comes

2:05

to the interaction of people under 16

2:07

with using these platforms, maybe that means

2:09

there are issues with these platforms that

2:11

will affect everyone else who is using

2:14

them as well. And instead of just

2:16

banning people up to the age of

2:18

16, we should be putting much stricter

2:20

regulations on how these platforms actually work

2:22

to make sure that they better align

2:24

with the values that our societies have,

2:26

how we want these platforms to work

2:28

to make sure that we maximize the

2:30

public good of them and minimize the

2:32

harms, instead of just doing a blunt ban

2:35

of users of certain ages that we're not

2:37

even totally sure how it will be enforced?

2:39

So I think you're really going to enjoy

2:41

this conversation because we dig not only into

2:44

the policy but also the interests that are

2:46

pushing it, you know, why Australia is considering

2:48

it right now, and also the broader context

2:50

of Australian tech regulation in general because it

2:53

feels like this country of 20 odd million

2:55

people is actually taking a lot of bold

2:57

moves even if you don't agree with all

2:59

of them in a way that countries many

3:01

times their size don't even attempt. So I

3:04

found this conversation fascinating. I loved talking

3:06

with Cam and I think you're really going

3:08

to enjoy it as well. If you do,

3:10

make sure to leave a five-star review on

3:13

your podcast platform of choice. You can also

3:15

share the show on social media or with

3:17

any friends or colleagues who you think would

3:19

learn from it, and certainly ones who are,

3:21

you know, skeptical of the role that social

3:24

media is playing in our lives and might

3:26

be open to a policy like this. Maybe

3:28

this helps to drive a bit of a

3:30

discussion there as to what a better approach

3:33

might be. And if you do want to

3:35

support the work that goes into making Tech

3:37

Won't Save us every single week, get access

3:39

to the premium full-length interviews from our Data

3:41

Vampire series that are slowly being published on

3:44

our premium feed for Patreon supporters, you can

3:46

join people like the highly sensitive gaze, which

3:48

is a band in Los Angeles that supports

3:50

the show, along with Jim from Atlanta, Shane

3:53

in Dublin, and Dave from Hamilton, Ontario, by

3:55

going to patreon.com/techwontsaveus, where you can

3:57

become a supporter as well. Thanks so much

3:59

and enjoy this week's conversation. Cam, welcome to

4:02

Tech Won't Save Us. So good to be

4:04

here. I'm really excited to have you

4:06

on the show. You know, we've been in

4:08

touch for a while, even got a coffee

4:10

or a beer or something at one point

4:13

when I was in Sydney. So it's great

4:15

to finally have you on the show to

4:17

talk about a proper Australian issue. Yeah, and

4:19

well, hey, I've been a big fan of

4:22

you and the podcast, I mean, the way

4:24

that you've covered, not just tech, you know,

4:26

internationally, but looking at the experience outside of

4:28

the American experience. The rest of us out

4:30

here, I think it's been really great. Thanks

4:33

so much. And yeah, you know, maybe it

4:35

helps that I'm coming from outside the US

4:37

as well. So, you know, you got a

4:39

bit more of that perspective in there. But

4:42

I wanted to start with that broader view,

4:44

right? I'm sure a lot of people have

4:46

heard about this ban of under 16s from

4:48

using social media that Australia has embarked upon

4:50

and that a lot of other countries are

4:53

talking about now as a result of that.

4:55

And some countries were talking about it before

4:57

anyway, but it has really gotten into the

4:59

conversation because of what Australia is doing. But

5:02

before we dig into that specific policy, I

5:04

wanted to ask you because coming from Canada,

5:06

I often feel like Canada is really behind

5:08

in talking about tech policy and is often

5:10

just kind of a follower of what other

5:13

countries are doing. But I feel like when

5:15

I look at Australia, you know, for a

5:17

country of just over 20 million people, it

5:19

feels like it is actually trying to move

5:22

things forward. Maybe it doesn't always get things

5:24

right and kind of messes things up sometimes,

5:26

but it feels like it's much more focused

5:28

on trying to get some degree of like

5:30

control over what these tech companies are doing

5:33

in a way that you maybe don't expect

5:35

of a country of its size. I wonder

5:37

how you kind of reflect on that as

5:39

someone who is down there. Yeah, I mean,

5:42

okay, it's funny, I think we feel an

5:44

affinity with our Canadian brothers and sisters. I

5:46

think we call them snow Aussies or maybe

5:48

we're, I guess, beach Canadians, but the approach

5:51

that we've taken is, I think it comes

5:53

from the context of a few things not

5:55

well known outside of Australia, which is that Australia

5:57

actually, funnily enough, ends up being the testing

5:59

ground for a lot of big tech things

6:02

because we are a country that is similar

6:04

in demographic to a lot of the other

6:06

bigger Western countries. You know, we're quite a

6:08

well-off country but we are kind of small

6:11

and so for quite a while we've been

6:13

the testing ground for a lot of their

6:15

product features you know Google and Facebook or

6:17

meta are always testing out new things here

6:19

to see how they kind of go like

6:22

I remember I think they first tested out

6:24

the idea of hiding likes on Instagram posts

6:26

down here before rolling out to the rest

6:28

of the world but at the same time

6:31

we've kind of turned that back on them

6:33

and also turned ourselves into a bit of

6:35

a bit of a testing ground about some

6:37

forms of tech regulation here. I think the

6:39

kind of optimistic view of it is that

6:42

the context of Australia is a bit different.

6:44

We don't have the same kind of free

6:46

speech protections, but as a result we've kind

6:48

of got greater speech regulation. We also don't

6:51

have like a massive tech industry or at

6:53

the very least like we don't have a

6:55

huge, like, representation of big tech company, you

6:57

know, employees or industry in Australia. So politicians

6:59

are not, to the same extent, worried

7:02

about, you know, pissing off constituents or an

7:04

industry here that's drastically going to, you know,

7:06

lobby against them or affect their kind of

7:08

re-election chances. So as a result, we've kind

7:11

of done a lot of interesting things. The

7:13

more cynical view is that Australia, more so

7:15

than almost any country in the world, has

7:17

a highly concentrated media market. In particular, News

7:19

Corp, which started out down here and is

7:22

now, across the world, has a huge influence

7:24

over public policy in Australia. You know, they're

7:26

very, very active in campaigns. And they've kind

7:28

of been on the front foot for a

7:31

lot of pushing for regulation for tech companies

7:33

as well. And so that's how we've ended

7:35

up with things like the News Media Bargaining

7:37

Code, you know, this kind of world-first plan

7:39

to force tech companies to... negotiate with news

7:42

media publishers here and you know essentially like

7:44

pay them for their work and that's kind

7:46

of how we ended up with the social

7:48

media ban as well which is that you

7:51

know there was this real populist campaign led

7:53

by news corps publications here to get tech

7:55

companies to, as they would say, do

7:57

something about the harms that are being done

8:00

on social media you can probably guess some

8:02

of the reasons why this campaign came along.

8:04

But yeah, for whatever reason, whether you think

8:06

it's because the difference in, you know, kind

8:08

of the way that the country is or

8:11

the way that we have different players and

8:13

vested interests here, we've ended up with

8:15

some really interesting tech regulation that you see

8:17

playing out in all kinds of different ways.

8:20

That makes a lot of sense and it's

8:22

really interesting as well right to think about

8:24

how on the one hand Australia is very

8:26

similar to the United States Canada these types

8:28

of countries so it's a good market to

8:31

test these products but then on the other

8:33

hand you know it has a different set

8:35

of interests so that it takes different policy

8:37

decisions as a result of that you know

8:40

whether it's on tech policy or for example

8:42

one of the things that I follow really

8:44

closely is EV policy as well. And I

8:46

know a lot of the Western automakers are

8:48

watching the Australian market closely to see what

8:51

happens when Chinese brands are allowed to sell

8:53

next to them as well, which is really

8:55

interesting. But as you say, Australia has this

8:57

very powerful news media industry with News Corp

9:00

in particular, and so that plays into the

9:02

types of policies that get adopted as a

9:04

result of it. You know, up here in

9:06

Canada, we followed you with the News Media

9:08

Bargaining Code, and I wouldn't be surprised if

9:11

now as a result of seeing the push

9:13

to pass this ban of under 16s that

9:15

we're going to start seriously having that kind

9:17

of conversation because you guys pushed it forward

9:20

first, not so much because we're a kind

9:22

of like a tech policy innovator or something

9:24

like that. It's like, oh Australia has done

9:26

it. They're kind of similar to us. Maybe

9:28

we should consider something like this. Yeah, for

9:31

sure. I mean, look, I think it's interesting,

9:33

you know, the government likes being able to

9:35

say that they can be a world leader

9:37

in this stuff. And, you know, speaking about

9:40

the kind of popularity of these policies, big

9:42

tech in Australia, like a lot of the

9:44

rest of the world, isn't super popular. And

9:46

sometimes, you know, companies like Facebook have at

9:49

times, some of the worst favorability ratings. Not

9:51

that much better, often about the same as

9:53

some news media companies, but you know there

9:55

is a widespread support for policies that crack

9:57

down on big tech. You know it's a

10:00

way of showing yourself as a strong government.

10:02

The flip side here is that, you know, like,

10:04

to an extent, I think that because

10:06

of the size of Australia, the big tech

10:09

companies themselves aren't super concerned about what's happening

10:11

down here, as in like, you know, this

10:13

isn't a massive moneymaker for them. They've all

10:15

seen what happens when we kind of institute

10:17

policies like the news media bargaining code, which

10:20

then started to like find, I guess, imitators

10:22

or people kind of taking inspiration from it.

10:24

around the rest of the world. So they've

10:26

definitely kind of caught on to the idea

10:29

that maybe they should care about what's happening

10:31

down here. But at the same time, I

10:33

just don't see from them as much of

10:35

a, you know, like, the lobbying efforts that

10:37

you see overseas as much, because for them,

10:40

it's not a massive line on the balance

10:42

sheet. And so as a result, you know,

10:44

we're seeing that governments feel like they can

10:46

pass these policies, they're not necessarily going to

10:49

face a huge amount of opposition from the

10:51

tech companies. it's popular and also of course

10:53

like you know the tech companies have a

10:55

lot of money and so what might be

10:57

not a massive amount for the tech companies

11:00

but can end up being like quite a

11:02

lot for Australian industries down here and so

11:04

you know, like, the News Media Bargaining Code, which

11:06

we can maybe talk about or not but

11:09

like, in summary, like, it had an intention

11:11

to fund journalism, and does it in a very

11:13

kind of bizarre way that you know people

11:15

could call, like, a link tax or whatever, but

11:17

like, essentially, like, you know, it has funded

11:20

a lot of journalism here, regardless of the

11:22

actual process of it. And now at the

11:24

moment, many of the

11:26

deals that were signed under the News Media

11:29

Bargaining Code, or should I say, like, kind

11:31

of in response to it, have finished. And,

11:33

you know, there's this push for News Media

11:35

companies to be able to sign new agreements.

11:37

The tech companies are a lot less happy

11:40

to do so. We've obviously seen this transition

11:42

over the last few years as places like

11:44

Meta have just said, we're kind of out

11:46

of the journalism-supporting business, it's not really

11:49

our problem anymore. And so as a result,

11:51

you know, this kind of context where the

11:53

companies being like, well, we don't want to

11:55

be part of this anymore, has actually created

11:58

the conditions for the social media ban, because

12:00

places like News Corp are looking for a

12:02

way to, in my opinion, pressure these tech companies

12:04

and you know make the implicit argument that

12:06

they don't really have social license to operate

12:09

so as a result as a way of

12:11

getting money out of them in other aspects.

12:13

That's so fascinating and we'll move more specifically

12:15

onto the ban, you know, and the proposal

12:18

and what it's going to mean in just

12:20

a second but I wanted to pick up

12:22

on that piece in particular right because you

12:24

said that the companies are not too worried

12:26

often about what is happening down in Australia

12:29

because you know it's a relatively small market

12:31

compared to the other places like that. I

12:33

wonder if you think that changes at all

12:35

because, you know, as you're saying, when the

12:38

news media bargaining code passed in Australia, Google

12:40

and meta, you know, made these deals and

12:42

kind of just made it work, right? But

12:44

when Canada moved forward and did something similar,

12:46

Google has made a deal with publishers in

12:49

Canada, but Meta just said, no, we're just

12:51

removing all news links from our platform and

12:53

both of the companies started to, I feel

12:55

like, talk a lot more aggressively about these

12:58

types of proposals to make sure that they

13:00

didn't expand to California in places like that.

13:02

Do you feel like maybe they start to

13:04

pay more attention to what's happening in Australia

13:06

because they get more worried that what happens

13:09

there might spread to other places and they

13:11

need to head it off before that can

13:13

happen? Yeah, well I think it's hard to

13:15

say at the moment because we're still in

13:18

the midst of it. I mean, I think

13:20

like not to get ahead of ourselves, but

13:22

the social media ban in terms of the

13:24

like commercial aspect of it, I don't think

13:26

it's a massive commercial aspect. So their thoughts

13:29

about what they're doing in response to that

13:31

might not reflect their response to a policy

13:33

that might have cost them a lot more.

13:35

So, you know, like I said, they're

13:38

kind of looking at renewing these partnerships.

13:40

Meta says they're not. Google has indicated in

13:42

reporting that they're saying, well, they'll get back into

13:44

some partnerships, but they're going to be worth

13:47

a lot less. You know, they clearly know

13:49

that the news media industry is a lot

13:51

less powerful than it was, even just like

13:53

a few years ago. I think we're kind

13:55

of seeing tech companies in Australia, I think,

13:58

feel like more or less we're kind of

14:00

coming to a little bit more of a

14:02

stalemate. But yeah, I think, I mean, definitely

14:04

in terms of how they're seeing

14:07

the rest of

14:09

the world,

14:11

the response

14:13

to Canada

14:15

was obviously a lot stronger. I still just

14:18

think that to some extent, you know, the

14:20

only reason that, you know, the mother ship,

14:22

the big offices in the US are paying attention

14:24

to what Australia is doing is really because

14:27

they don't want it to happen anywhere else

14:29

if it's significant, but if not, they're kind

14:31

of happy to let the kiddies' table down

14:33

in Australia deal with it. Yeah, that makes

14:35

a lot of sense, unfortunately. But let's move

14:38

on to talking about the social media potential

14:40

ban of under 16s, you know, whatever is

14:42

going to come of it. So you were

14:44

talking a lot about the leverage that News

14:47

Corp has had and how News Corp, which

14:49

of course is the company that runs Fox

14:51

News in the United States, has been running

14:53

this campaign across its media properties in Australia

14:55

in order to advocate for social media to

14:58

be banned for under 16s, which of course

15:00

is the policy that the government has now

15:02

moved forward. What would you say is driving

15:04

this ban? You know, is it just the

15:07

news corp thing? Like, what else is behind

15:09

this? Yeah, for sure. I mean, I think

15:11

like we've seen around the world, there's been

15:13

a real push over the last few years

15:15

around the ideas of what tech companies are

15:18

doing and how they're treating children. And I

15:20

think, you know, saying we can look at

15:22

the news corp kind of role in this

15:24

in a second, but like, you know, the

15:27

broader context around the world is that, you

15:29

know, what these companies knew about how

15:31

teens were feeling when they used Instagram and

15:33

just the kind of, you know, the

15:35

vibes like people for the first time you

15:38

know we're seeing generations of parents look down

15:40

at their kids and saying you know my

15:42

kid has had a mobile phone since they're

15:44

11 years old I see them using it

15:47

I don't like it you know I think

15:49

that it's replacing things like in person interaction

15:51

and you know generally knowing that around the

15:53

world also like kids are for the most

15:56

part less happy than they have been in

15:58

the past they're reporting more mental health issues

16:00

and even you know things as severe as

16:02

suicide attempts and people have kind of linked

16:04

that to, you know, they've kind of

16:07

pointed to mobile phones as being the

16:09

big change over the last few years and

16:11

said we've got to do something about that.

16:13

Perhaps at the kind of forefront of it

16:16

over the last year or two is the,

16:18

I think, social psychologist, I think that's the

16:20

right term, Jonathan Haidt, who's an author who

16:22

published a kind of pop psychology book called

16:24

The Anxious Generation, which makes the case that

16:27

this generation of young people are essentially much

16:29

worse off as a result of using mobile

16:31

phones and social media. Now what the actual

16:33

real research says from experts who do actually

16:36

conduct studies on young people and understand this

16:38

kind of stuff and look at it really

16:40

really closely is it's a lot more complicated

16:42

than that and you know the kind of

16:44

consensus is that it's very hard to know

16:47

whether social media itself is making children less

16:49

happy, more mentally unwell, or having other negative

16:51

outcomes? And essentially that we need kind of

16:53

more research on this because it's very hard

16:56

to draw out things, for example, like, yes,

16:58

like kids have obviously had mobile phones, you

17:00

know, only in the last few years. But

17:02

a lot of the things have changed around

17:04

the world as well. Like you talk about

17:07

it all the time in this podcast, but

17:09

of course, like, they're massive societal trends and

17:11

global trends that are making a lot of

17:13

people less happy. And so attributing that

17:16

purely to mobile phones is kind of a

17:18

very elementary link. And of course, I should

17:20

mention as well, social media allows teens and

17:22

people of all ages to connect with other people

17:24

as well. So there are benefits, you know,

17:27

having this kind of nuance. The context is

17:29

that, you know, there's this fear about what

17:31

social media companies and the products have done

17:33

to young people's brains and then you kind

17:36

of have this push from, well, there were

17:38

really two campaigns led by mainstream media companies

17:40

News Corp which you mentioned before which was

17:42

I think the campaign was called Let Them

17:45

Be Kids and one also led by a

17:47

radio host here in Australia who works in

17:49

Nova, which funnily enough I think is or

17:51

was partly owned by Lachlan Murdoch as well,

17:53

which is called 36 Months, which was calling

17:56

to raise the minimum age of social media

17:58

use from 13 to 16. So, you know,

18:00

against this kind of, like, groundswell of

18:02

international support for changing something like this,

18:05

capped off by these two really mainstream campaigns,

18:07

you ended up having both the Prime Minister

18:09

and the opposition leader, so the heads of

18:11

both our major parties, saying that they wanted

18:13

to ban teens from social media. There was

18:16

a lot of chat about that, but the

18:18

actual process of kind of creating a bill

18:20

and legislating it happened in very short order, in

18:22

pretty much a week or two. and then

18:25

as of last week they've passed the legislation

18:27

saying that Australia will ban teens under 16

18:29

from being able to create accounts on social

18:31

media, and tech companies have got a year

18:33

to figure out exactly how they're going to

18:36

do it. That gives us a really good

18:38

picture of what played out there so thank

18:40

you for that and it's also really interesting

18:42

to see how you know the influence of

18:45

major interests in Australia can push forward a

18:47

policy like this because you know anyone who

18:49

follows Australia knows the influence that news corp

18:51

and that these media organizations have down there,

18:53

and that media can have in many countries

18:56

when they use their influence in order to

18:58

drive a particular policy or position. Now you

19:00

were talking about the evidence behind this, right?

19:02

And I think how you described it really

19:05

lines up a lot with how I have

19:07

understood this issue, right, where there is a

19:09

legitimate concern here about the broader effects of

19:11

social media, the effects on mental health, but

19:13

it often feels like that is conflated or

19:16

exaggerated in order to try to create this

19:18

like moral panic that is happening and that

19:20

people like Jonathan Hate have really picked up

19:22

on, which is to say that yeah there

19:25

probably should be something done here and we

19:27

should be looking at this issue but is

19:29

an outright ban really the right way to

19:31

approach it or should we be trying to

19:33

look at something more nuanced here? Why is

19:36

it that the government has gone with this

19:38

ban approach? You know, you've been explaining the

19:40

media push behind it, so maybe that's just

19:42

all of it. And what is the real

19:45

goal that they're trying to achieve here? Like,

19:47

what are they saying that this policy is

19:49

going to do? Yeah, so I mean, look,

19:51

I think there's obviously a lot of players

19:54

involved in this, but, you know, setting the

19:56

table, like, it is a widely popular policy,

19:58

depending on which poll you look at, there's

20:00

somewhere between like 55 to 77, I

20:03

think it was, percent of people support

20:05

banning children from social media. So, you

20:07

know, doing something like this is very

20:09

popular. And I should also add that

20:11

like, you know, all the time we

20:13

talk about polling and policies have a,

20:15

you know, certain support or whatever, like

20:17

there was a misinformation bill, which you

20:19

might get a chance to chat about

20:21

later, which was broadly, I think, supported

20:23

or at least the, you know, the idea

20:25

of doing something about misinformation on social

20:27

media is broadly supported. What's different about

20:29

this policy is that not only is

20:31

it popular, but I do think it

20:33

actually really matters to a lot of

20:35

people. A lot of parents out there

20:37

are worried about their kids, particularly after

20:39

COVID, after we spent a few years

20:41

indoors and people were so worried about

20:43

their children who lost all this direct

20:45

face-to-face communication. So a policy like this

20:47

not only is like, you know, a

20:49

lot of people like, but also is

20:51

potentially one that is I think like

20:53

a vote getter, a vote change or

20:55

a vote winner, whatever you want to

20:57

call it. The other thing is like

20:59

if you think about it from like

21:01

a political perspective, it's a pretty good

21:03

policy to have in terms of you

21:05

set a law, and the law itself

21:07

was kind of created as a very

21:10

basic framework. It says Social media companies

21:12

have to take reasonable steps to restrict

21:14

children under 16 years old from

21:16

using their platforms. Reasonable steps is going

21:18

to get defined in the next year

21:20

or so. Australia has an internet regulator

21:22

called the E Safety Commissioner. If you've

21:24

done that, like you know if you

21:26

said you guys need to ban this

21:28

and then you said we'll give you

21:30

some rough guidelines and the government is

21:32

also running a trial looking at some

21:34

of the different technologies of how to

21:36

figure out people's ages online to be

21:38

able to restrict 16 year olds and

21:40

under. Then like you know you've kind

21:42

of done everything. It doesn't really cost anything,

21:44

you know, it's like in terms of

21:46

like, policies, there's very little downside for

21:48

the government. It's not like they have to,

21:50

you know, balance the budget for it. And I

21:52

can see from their perspective Why it's

21:54

something that they kind of want to

21:56

push. At the same time, it undercuts

21:58

a lot of the work that Australia

22:00

has done, you know I was talking

22:02

about some of the big pieces of

22:04

regulation, but some of the other stuff

22:06

that doesn't get quite as much attention

22:08

in Australia is, like I mentioned before,

22:10

we have this regulator called the E

22:12

Safety Commissioner. Started out as actually the

22:14

Children's E Safety Commissioner in the mid-2010s,

22:17

a lot of what it has done

22:19

at the start was working with tech

22:21

companies essentially more or less is like

22:23

almost like an ombudsman or almost like

22:25

the Australian government kind of like a

22:27

liaison to big tech like almost like

22:29

an ambassador or something what the role

22:31

did was a lot of it was

22:33

just getting complaints about how children were

22:35

having bad experiences online everything from like

22:37

cyber bullying to image-based abuse because of

22:39

the kind of relationships that set up

22:41

with the tech companies, it was able

22:43

to report stuff and get them acted

22:45

on quickly. So essentially helping Australians navigate

22:47

our tech companies' existing policies, you know,

22:49

in terms of like, you know, that's

22:51

not necessarily the most powerful role, but

22:53

in terms of like... kind of what

22:55

it did. You know, I do hear

22:57

from a lot of people that it

22:59

was of great assistance when they, you

23:01

know, had problems like this. And then

23:03

they've kind of added more and more

23:05

power to this role to be able

23:07

to regulate the tech companies. And so

23:09

over the last few years it's been

23:11

coming up with these online safety codes,

23:13

which are regulations where essentially what

23:15

it does is it kind of says

23:17

hey tech providers everything from social media

23:19

companies to search providers you have to

23:21

come up with rules about how you're

23:24

keeping children and Australians safe online and

23:26

so it's saying, you know, these are

23:28

the things that we're worried about we're

23:30

worried about you know abhorrent violent material,

23:32

child sex abuse material, you need to

23:34

come up with rules that say how

23:36

you're dealing with this. And then I'll

23:38

decide whether those rules that you've kind

23:40

of created for yourself are good enough.

23:42

If they're not good enough, then I

23:44

will come up with my own. In

23:46

most cases, the tech companies kind of

23:48

came up with stuff that met the

23:50

standards of what this regulator wanted, and

23:52

then it says, now you have to

23:54

put them into place, and then you

23:56

have to report on how you go

23:58

with them, and then if you don't

24:00

reach standards that we expect, then we'll

24:02

either improve these rules, or we'll rewrite them.

24:04

For the most part, these tech companies

24:06

have not ended up facing any fines

24:08

or anything. It's mostly been like, you

24:10

know, like, as I kind of described,

24:12

it's pretty co-regulatory, like it's pretty friendly

24:14

with these companies. But as a first

24:16

step of kind of being like, what

24:18

are you doing? This is the way

24:20

we need to head towards, I think it's been quite

24:22

effective. There has been a little bit

24:24

of fighting with Elon Musk over Twitter,

24:26

who obviously, as you might know, is

24:29

not super cooperative with some of these

24:31

schemes. But generally, you know, it has

24:33

been about this kind of softer approach

24:35

to regulation that has helped, I think,

24:37

at least a little bit, in terms

24:39

of companies and how they are treating

24:41

Australian citizens and, you know, how they're

24:43

expected to behave in Australia under our

24:45

rules. When you kind of introduce this

24:47

tech social media ban for under 16s,

24:49

all of a sudden all the kind

24:51

of progress that they've made on saying

24:53

we needed to do more about these

24:55

kinds of things, that kind of goes

24:57

out the window, because rather than, like,

24:59

closely regulating and understanding how

25:01

children are using the technology, we're just

25:03

saying you can't use it. And at

25:05

the same time, it's saying you can't

25:07

use it and saying tech companies, you're

25:09

responsible for keeping kids off it. The

25:11

Australian government, the prime minister, everyone involved

25:13

with it, said we expect kids will get

25:15

around this in some ways. It's about

25:17

creating friction. It's not about stopping every

25:19

single teen from getting on social media,

25:21

but it's about making it harder. The

25:23

funny thing is that you've ended up

25:25

with the system where you've said for

25:27

so long we've been trying to encourage

25:29

tech companies to change their products in

25:31

Australia to, you know, for example, customize

25:33

them, change features, so they're more friendly for

25:36

Australian users and in this case children.

25:38

And then all of a sudden we're

25:40

going to get rid of those features

25:42

and we expect that you're now just

25:44

going to use the products without them

25:46

being customized for children and possibly facing

25:48

the very problems that they were trying

25:50

to regulate them out of by changing

25:52

the features. Now I'm just going to

25:54

be exposed to that because in the

25:56

eyes of tech companies, to try to

25:58

make sure that they are taking that

26:00

are helping to address some of the

26:03

problems without needing like explicit regulation to

26:05

affect every different thing and I feel

26:07

like you know for me for my

26:09

approach to it this has kind of

26:11

been my biggest criticism of this attempt

26:13

at a ban right you know it

26:15

doesn't differentiate between the different users, it

26:17

just says, okay, if you're under a

26:19

certain age you can't access things and

26:21

if you're above it you can, but

26:23

if these, you know, particular features, if

26:25

the way that these products are designed

26:27

are such that people under 16 are

26:29

being harmed by them, then you would

26:31

imagine people over 16 or 16 and

26:33

above are also being harmed by them

26:35

in certain ways. So why shouldn't we

26:37

be having a discussion not just about

26:39

banning social media for a certain age,

26:41

but talking about the way that these

26:43

platforms work? the way that these different

26:45

features are designed, the way that the

26:48

algorithms work, all these sorts of things

26:50

to say, okay, there are certain aspects

26:52

of social media that we don't agree

26:54

with in our societies that don't align

26:56

with our values, and we should be

26:58

targeting those instead of just saying this

27:00

whole group of people is off of

27:02

social media completely. Yeah, totally. I mean,

27:04

this is like a really interesting aspect

27:06

to it, which is that we kind

27:08

of acknowledge in society that children are

27:10

more vulnerable, and so we expect to

27:12

take greater steps to protect them. But,

27:14

you know, implicitly, when you take a

27:16

step to protect someone, you also kind

27:18

of infringe upon their rights. And we

27:20

acknowledge, for example, Instagram, run by Meadow,

27:22

obviously has a feature that it calls

27:24

children's accounts. And what it does is

27:26

it changes the platforms in a few

27:28

ways. including restricting children that can't message

27:30

people who aren't other teens. So, presumably

27:32

the idea is to stop, you know,

27:35

any exploitation or untoward communications between adults

27:37

and kids. Obviously that is a step

27:39

that's been taken to protect teens, but

27:41

at the same time, of course, that's

27:43

like literally limiting their ability to communicate

27:45

on the platform. It is, I just

27:47

find it endlessly funny that like we've

27:49

ended up in this position where, around

27:51

the world, you said, there's a lot

27:53

more push for regulation for tech for

27:55

children. You might say some of the features

27:57

that are pushed for children could have

27:59

been pushed for other people too. But generally

28:01

I kind of see it actually almost

28:03

as like an uncomfortable bifurcation where we

28:05

say we want to help kids, we're

28:07

more worried about them, but once you

28:09

hit 16 you're kind of on your

28:11

own, anything that happens to you as

28:13

a result of these platforms, well you've

28:15

kind of taken the choice. Whereas I

28:17

think like for most of us There's

28:19

not really that much of a choice

28:22

using social media other than whether you

28:24

use it or not. Wouldn't it be

28:26

great if we had some more of

28:28

these kind of nuanced conversations about how

28:30

we could regulate for adults about some

28:32

of the things that you raise as

28:34

well? But unfortunately, I think this is

28:36

very much this false dichotomy where it's

28:38

like if you're trying to do anything

28:40

that in any way changes platforms, you're

28:42

somehow infringing on free speech. and you

28:44

know this kind of like it's almost

28:46

like a trump card right like anything

28:48

that you might do for example some

28:50

of the proposed changes in this misinformation

28:52

law, which was pushed and then ultimately

28:54

ditched by the government you know the

28:56

opposition was not like well you know

28:58

how do we kind of balance this

29:00

out with ideas of speech, how do

29:02

we make sure that we can

29:04

still have good political discussion but maybe

29:06

stop some of the speech that is

29:09

inhibiting political discussion because it's you know

29:11

bullshit and it's overpowering any discourse it's

29:13

just, you're either for, like, limitless

29:15

free speech or you're against it. Unfortunately

29:17

I think that ends up stopping a

29:19

lot of attempts at regulation, but when it

29:21

comes to children we're not as much

29:23

worried about that and you know that

29:25

the children themselves I think have a

29:27

right to political communication. And interestingly, that's

29:29

actually been flagged as a way that

29:31

this social media teen ban might actually

29:33

be challenged in court, but at the

29:35

same time, like, I would love it if

29:37

we could have a bit more of

29:39

the, you know, the thoughts that we

29:41

have about children, we decide that and

29:43

how they use technology and how it

29:45

might be affecting them, that, you know,

29:47

our approach to it wouldn't just change

29:49

the day that someone reaches 16 years

29:51

old plus one day or whatever. We're

29:55

sponsored today by Audio Maverick. A new

29:57

nine-part documentary podcast about one of the

30:00

most influential figures in radio, Himan Brown. Explore

30:02

the Golden Age of Radio through the

30:04

life of this famous New Yorker whose

30:06

programs brought millions of families around their

30:08

radios each night. Audio Maverick features archival

30:11

audio and contemporary interviews with media scholars

30:13

and podcasters, learn about the birth of

30:15

radio, height of radio drama, and discussions

30:17

about the new generation of podcasting audio

30:19

Mavericks that Brown inspired. As a podcast

30:22

listener, you'll love learning about Nikola Tesla

30:24

and the history of radio. Plus, hearing

30:26

incredible clips of famous shows like Dick

30:28

Tracy and Mystery Theater. Produced by CUNY

30:30

TV and the Himan Brown Archive, this

30:33

series covers decades of history in nine

30:35

episodes. Subscribe to Audio Maverick in your

30:37

podcast app now. That's Audio Maverick brought

30:39

to you by CUNY TV. I

30:44

feel like because countries like Canada and Australia

30:46

have different understandings of free speech than say

30:49

the United States that there's a lot more

30:51

opportunity to have that kind of a conversation

30:53

around the different tweaks to these platforms around

30:55

the different ways that we want to change

30:58

them that might encourage a different kind of

31:00

dialogue and conversation on them a different kind

31:02

of usage pattern than we typically see on

31:04

these platforms, but that these very commercial social

31:06

networks that we have now are not designed

31:09

to engage with or to encourage because at

31:11

the end of the day they want to

31:13

increase the ad profits that they are making

31:15

and the engagement that they're receiving, they're less

31:18

concerned about the broader public benefits to these

31:20

interactions and what these platforms are providing. And

31:22

to me, that feels like the biggest missed

31:24

opportunity of what Australia has embarked on. You

31:27

know, I'm not against talking about regulating social

31:29

media or whether things should be a bit

31:31

different at different ages depending on who is

31:33

using them, but it feels like this blunt

31:36

approach has unfortunately really missed the mark and

31:38

missed out on what could have been a

31:40

very productive conversation, which as you say, it

31:42

often gets sidelined by often disingenuous arguments around

31:45

free speech and not to say that free

31:47

speech is not an important thing but it's

31:49

this particular conception of free speech that is

31:51

often tied up in these discourses that really

31:54

can push things away from where they would

31:56

be more productive and actually lead us in

31:58

a better direction. Yeah, for sure. And I

32:00

think like, you know, the greatest example of

32:03

how Australia's, you know, this social media ban

32:05

kind of undermined a lot of the other

32:07

work that Australia was doing, it was the

32:09

last day of Parliament this year that they

32:12

passed the social media ban. and the Australian

32:14

government was trying to pass through a whole

32:16

bunch of stuff before the end of the year. There's

32:18

speculation they might go to an election

32:20

before parliament sits next year or if not

32:23

either way they're kind of running out of

32:25

time this parliament and so they wanted to

32:27

pass a bunch of stuff that they've been

32:29

promising. So they, I think, ended up passing

32:32

somewhere close to 40 bills including the social

32:34

media ban, which actually got a lot

32:36

of attention. There was actually a

32:38

huge amount of public interest about this and

32:41

in the end quite a significant amount of

32:43

opposition despite the fact that both major parties

32:45

ended up supporting it. But one of the

32:47

other bills that passed that day was some

32:50

amendments, some long-awaited amendments to the Privacy Act,

32:52

and in that was an obligation for Australia's

32:54

information commissioner, who's kind of also a privacy

32:56

commissioner, who's responsible for privacy protections in Australia.

32:59

there was something that required that she would

33:01

set up a children's online privacy code, which

33:03

was to create a new set of obligations

33:05

for online companies, including social media companies, to

33:08

have greater obligations about how they take care

33:10

of children's data and their privacy online. They

33:12

passed out at the exact same time that

33:14

they actually banned kids from using social media.

33:17

So at the same time that they're actually

33:19

giving them greater protections and saying, maybe they

33:21

can use these products, but maybe we need

33:23

to think about it in a slightly different

33:26

way, at the same time this

33:28

longer worked-on approach to

33:30

policy was completely wiped out with a ban that

33:32

essentially was kind of headed up by the

33:34

government and a real populist campaign. That's

33:37

hilarious and so unfortunate at the same time

33:39

to hear that. Does that suggest to you

33:41

that this like social media ban, yes we

33:43

know it was pushed by media and you

33:46

know, News Corp in particular, and certain

33:48

interests, does that suggest to you that, you

33:50

know, the government really pursued this policy because

33:52

it feels like they're going to an election

33:55

early in the new year, whether it's right

33:57

away or after a few months, and that

33:59

they figured it would be kind of good

34:01

electorally to do that? Like, is it that

34:04

craven the calculus? Yes, yeah totally. And I

34:06

think like we actually saw this at our

34:08

last federal election where like, you know, the

34:10

government actually proposed some bizarre laws, including

34:13

that tech companies either need to know who

34:15

a user was or if they didn't, they

34:17

would then be responsible as the publisher and

34:19

would be sued for defamation. So if you

34:22

think about like kind of like getting rid

34:24

of Section 230 unless they knew who

34:26

the person was behind the account in which

34:28

case they would then like forward on the

34:31

defamation suit. So you know kind of as

34:33

a result you kind of had this policy

34:35

where essentially that was going to require the

34:37

end of anonymity on the internet. But the

34:40

reason I mention that is that that's obviously

34:42

like a huge change but it was sold

34:44

to the public, though not ultimately passed, as

34:46

a policy that was about protecting children online

34:48

from cyber bullying when like if you think

34:51

about it realistically how many kids are suing

34:53

other kids for defamation like doesn't really happen

34:55

it was going to end up being another

34:57

kind of like tool probably used by largely

35:00

like powerful and rich people but the way

35:02

that it was sold kind of shows that

35:04

they were trying to think of ways to

35:06

appeal to this audience of like parents who

35:09

are really worried about their kids which is

35:11

a significant audience that I've kind of heard

35:13

about from both major parties here, they're like

35:15

we know that these are voters who this

35:18

matters to them a lot and they'll vote

35:20

for either party depending on something like this.

35:22

So, like, you know, this

35:24

is something that the government clearly had marked

35:27

down to themselves this is something that we

35:29

know would do really well for us and

35:31

that's kind of why we're pursuing it despite

35:33

the fact that it had you know opposition

35:36

from like you know most of the academic

35:38

community you know obviously the tech companies as

35:40

well mental health groups youth groups like all

35:42

the kinds of people involved in it were

35:45

largely against this but it did of course

35:47

have this very big media campaign that was

35:49

powered by like you know very sad anecdotes

35:51

about people for example parents who'd lost their

35:53

kids, who died by suicide after cyberbullying and

35:56

stuff. I do think that in that regard,

35:58

like Australia's media, whether it was the media

36:00

that was actually pushing it or like the

36:02

rest of the media, did actually kind of

36:05

fail on their responsibility there because I think

36:07

that this policy was not very well communicated.

36:09

For example, I was reading this article in

36:11

News Corp the other day where it said

36:14

that it was very sad, like, parents

36:16

were saying we're calling for Snapchat to be

36:18

included as social media to the government. They're

36:20

saying we're calling for it to be banned

36:23

because our child sadly killed herself after being

36:25

cyber-bullied on it. And the article went on

36:27

to describe that while Snapchat is a messaging

36:29

app which is supposed to be excluded, so

36:32

you know the social media ban isn't going

36:34

to stop kids from using WhatsApp for example

36:36

or something like, I guess, Signal, if

36:38

they wanted to, we consider it different because

36:41

you can have group chats that can have

36:43

a whole school in it that act more

36:45

or less like Instagram. And I read that

36:47

and I was like, that's just, that's not

36:50

true. In fact, I looked it up: like, on

36:52

WhatsApp, you can have larger groups than

36:54

you can have on Snapchat. You know, there

36:56

was this concern about, like, one of the

36:59

big things the government spoke about in proposing

37:01

this policy was this idea of cyber bullying.

37:03

It doesn't really make sense, like, you know,

37:05

if you think about it, to ban one

37:07

app that you're saying is being used to

37:10

cyberbully people through messaging features, and I

37:12

think like the reason like that that was

37:14

obviously expressed in the newspaper and people kind

37:16

of took that as fact and I think

37:19

the lack of like really clear reporting about

37:21

how this policy would work you know for

37:23

example how a tech company is actually going

37:25

to implement this how are they going to

37:28

figure out what age a user is? What

37:30

is that going to require: facial analysis? Is it

37:32

going to require giving government ID? Like, that

37:34

kind of stuff was really really like not

37:37

hashed out. And when it really did start

37:39

to get scrutiny, it was a bit at the end of

37:41

the campaign and a lot of people were

37:43

already you know, very broadly in support of

37:46

it, and also kind of already had enough

37:48

momentum to kind of make its way through

37:50

despite this kind of push at the end.

37:52

It's quite a sad story in a

37:55

way because if you think about it like

37:57

you know, those grieving parents who called

37:59

for bans of apps because they're trying to

38:01

stop what happened to their child I don't

38:04

blame them, I totally understand it,

38:06

in fact I can only imagine that if

38:08

I was in this circumstance I would do

38:10

the exact same thing. They're not the tech

38:13

experts, you know, it's supposed to be the

38:15

experts that these outlets quoted. It's supposed to

38:17

be the journalists who scrutinize claims and kind

38:19

of, you know, decide what context they provide

38:21

and who else they quote on it. You

38:24

know, those are the people who actually, in

38:26

my opinion, exploited really sad stories, because, you

38:28

know, based on what I understand, you know,

38:30

of the various ways that this policy is

38:33

going to work, it's not going to stop

38:35

cyber bullying once and for all. In fact,

38:37

I think it may restrict some, but largely,

38:39

I imagine people will just use other means

38:42

that they'll still be able to use. Another

38:44

big concern they raised was the impact of

38:46

algorithms. But one thing that I realized in how

38:48

the law is written, the law actually applies

38:51

to teens having social media accounts. So you

38:53

can still use TikTok, you can still use

38:55

YouTube shorts, you just can't log in, you

38:57

just can't like a photo or a

39:00

video or comment or upload yourself. But the

39:02

algorithm, you know, this recommendation engine that they're

39:04

so worried about, still works. And I imagine

39:06

teens will still be using this after the

39:09

ban actually comes into force. For all these

39:11

reasons, you know, we just had this extremely

39:13

poorly covered policy that I'm just, I'm really

39:15

honestly worried won't end up helping the people

39:18

that they hope to help, and will end up

39:20

hurting people. This is what kind of comes

39:22

out in the research that marginalized groups, people

39:24

in situations where they don't have people who

39:27

they know who fit in the same identity

39:29

groups as them. So, you know, the LGBT

39:31

community, people who are facing familial violence. People who

39:33

turn to social media as a way to

39:35

contact people with experiences that they might not

39:38

have represented around them. They're the kind of

39:40

people who end up, I think, in my

39:42

opinion, most affected by this. And so when

39:44

you've got this government and a media supporting

39:47

it, who are both pushing for the welfare

39:49

of children, but are doing something that might

39:51

end up hurting them, it's a really sad

39:53

outcome. one that I think that I just

39:56

hope that once it kind of happens that

39:58

people don't just turn away and kind of

40:00

say, well that's the way it is now

40:02

and that's how we think about it. I

40:05

hope it's like reviewed and closely evaluated and

40:07

also evaluated against the context of we shouldn't

40:09

compare a policy like this to either doing

40:11

something about social media or not. we should

40:14

compare it to another road that's not taken,

40:16

which is thinking cleverly about how to regulate

40:18

tech companies to make their products better for

40:20

all of us. And I think that like,

40:23

you know, that kind of approach is a

40:25

real lost opportunity that happens as a result

40:27

of pursuing something like a blanket ban. Yeah,

40:29

I think you've made a lot of really

40:32

good points there. And I feel like when

40:34

you're talking about how the media campaign drives

40:36

this and doesn't dig into the other aspects

40:38

of it, the potential downsides of a policy

40:41

like this, the other approaches that can be

40:43

taken, I feel like we saw something similar

40:45

when it came to the news media bargaining

40:47

code and the equivalent up here in Canada,

40:49

where the media was really pushing this particular

40:52

policy outcome and wasn't really talking a whole

40:54

lot about the other potential approaches or the

40:56

downsides of this kind of a policy because

40:58

that is the outcome that they wanted. And

41:01

then that really, you know, it doesn't just

41:03

affect the policy process, but it really diminishes

41:05

the public conversation that can be had about

41:07

these things so that we can have a

41:10

democratic debate about tech policy and what we

41:12

want all of this to look like and

41:14

how we want these companies to operate in

41:16

our jurisdictions, rather than having the media push the outcomes they want to

41:19

pursue and want to make sure the government

41:21

passes, that, you know, kind of serve them

41:23

or at least some of the interests that

41:25

they have. I mean the thing that depresses

41:28

me is like I'm a journalist I think

41:30

all the time about, like, the journalism

41:32

industry and how we actually reach people and

41:34

you know make people trust in us because

41:37

ultimately that's what needs to happen. The government, I'm

41:39

sure, is having the same thoughts in their

41:41

heads as well like you know seeing the

41:43

dropping support for public institutions. And like, you

41:46

know, in this regard, giving the public a policy that

41:48

it wants, but that is, in my

41:50

opinion, very unlikely to have the outcome that it says

41:52

it's going to, I think like, you know,

41:54

maybe you end up winning an election, maybe

41:57

you don't, you know, maybe this doesn't end

41:59

up mattering for them, but you end up

42:01

again, you know, promising something that doesn't come

42:03

through and ultimately making people more cynical. And,

42:06

you know, in this regard, like, people are

42:08

already cynical of all three parties in this.

42:10

They're already cynical about the media, they're cynical

42:12

about big tech. Like if you want anything

42:15

done about, for example, big tech, you need

42:17

to have people trust in the media, you

42:19

need to have people trust in the government to

42:21

be able to regulate them because, you know,

42:24

that's one of the things that I'm really

42:26

starting to see. Australian tech policy for years

42:28

and years, decades, has kind of been maligned

42:30

as not very good. And I think we've

42:33

done some good stuff and some bad stuff.

42:35

You know, I think it was back in

42:37

the late 2000s that the Australian government was

42:39

proposing to have a mandatory internet filter for

42:42

explicit content. So, you know, any person who

42:44

was using an ISP, the ISP was supposed

42:46

to block adult content. They tested the policy

42:48

and then I think it was this really

42:51

famous thing where, like, a 15 year

42:53

old was able to get around the filter,

42:55

I think, just using like a VPN or

42:57

something in like 45 seconds. That's remembered as

43:00

one of the like, you know, massive failures

43:02

of Australian tech policy and although the policy

43:04

never actually went into place, it kind of,

43:06

you know, represents the idea that governments don't understand technology and

43:08

we can't trust them to regulate them. We're

43:11

now at a point where tech is stronger

43:13

than ever. We're seeing the power consolidated in

43:15

just a few tech companies. And so as

43:17

a result, you know, we need to rely

43:20

on government to be able to regulate these

43:22

things. If we're doing things, if we're taking

43:24

steps that we're saying, this is going to

43:26

help and it doesn't, the next time we

43:29

push for something that maybe is more targeted,

43:31

maybe is more proven, who's to say whether

43:33

the public's going to support something like that?

43:35

So well said, and I completely agree. Do

43:38

you think that there is any opportunity or

43:40

possibility that this ban ends up falling by

43:42

the wayside, say, after an election or something

43:44

like that? Or that we end up moving forward to try to

43:47

take a more reasonable and more evidence-based approach

43:49

to actually dealing with what is a legitimate

43:51

problem, even if it is often exaggerated for

43:53

certain people's purposes. Yeah, I think so. So

43:56

just for context, I'm going to explain how

43:58

the news media bargaining code works, because I

44:00

think it's actually an example of how this

44:02

happens, which is the news media bargaining code

44:05

gives the government the ability to say, hey,

44:07

Meta or Facebook, or it could be another

44:09

social media company, you need to negotiate with

44:11

this publisher. And the negotiation is this weird

44:14

style, I'd never heard of it before, I

44:16

think it's called, like, baseball arbitration, which is

44:18

where, you know, the two parties come in,

44:20

they say, here's how much I think that

44:22

this partnership is worth, and someone who's making

44:25

a decision, like an adjudicator, I guess, they

44:27

have to pick out of the two. So

44:29

if the government said, Meta, you have to

44:31

negotiate with News Corp, News Corp comes in

44:34

and says, you need to pay us

44:36

10, the platform comes in with a much lower number, and the adjudicator has to pick one of just those two options. And so what's

44:38

called designation, you know, the decision that you

44:40

need to come to the table and actually

44:43

negotiate has never actually been used. It's just

44:45

the fear that this policy will be used,

44:47

that you'll end up in a negotiation where

44:49

really you're going to pay a crazy amount,

44:52

that is the thing that has forced the

44:54

tech companies to the table. And they've done

44:56

all these other deals that essentially allow the

44:58

government to say, well, we don't need

45:01

to use the stick because things are going

45:03

as we hoped. I can see something similar

45:05

happening with this. The fact they need to

45:07

take reasonable steps and the government has yet

45:10

to figure out what reasonable steps means. And I should

45:12

also add as well, like the fact that

45:14

the Communications Minister decides what social media companies

45:16

are included under this, and so it can

45:19

be everything from Meta to tiny social media

45:21

companies. I think there is a very large

45:23

chance that essentially we see a few of

45:25

the major companies, so I'm thinking, you know,

45:28

Meta, TikTok, Snapchat, X, Reddit, are told that they

45:30

need to take some steps, and those steps

45:32

might be, like, not that

45:34

intrusive, and they might be actually quite like

45:36

light-touch. For example, if you had an account

45:39

on Facebook for 10 years, we'll decide, well,

45:41

you're probably over 16, because I doubt you

45:43

started out at six years old. Or they

45:45

might even just say, whatever steps you're taking,

45:48

as long as we can be roughly sure,

45:50

based on everything from investigations to just vibes,

45:52

if we see no reason to think that

45:54

there are massive numbers of under-16s on this

45:57

platform, you're fine. So in practice you might

45:59

have a policy that ends up being: these

46:01

tech companies, don't do anything that makes us

46:03

have to kind of crack down on you,

46:06

make it seem like you're doing enough and

46:08

we'll be happy. And in practice, we might

46:10

end up seeing, you know, social media companies

46:12

still end up having like a significant amount

46:15

of teen users. It's just not enough to

46:17

cause any ruckus about it. So that could

46:19

happen and we could continue to see regulation

46:21

coming from other places, like through other

46:24

policies as well. That being said I think

46:26

we will definitely see those major ones step

46:28

up their practices. And that's kind of where

46:30

I think some really interesting

46:33

questions end up, which is how do they actually do

46:35

that? And to what extent does the average

46:37

punter who's told either give me your face

46:39

or show me your driver's license, do they

46:42

blame the government for that? Or do they

46:44

keep blaming social media companies who end up

46:46

kind of copping the flak? Because I don't

46:48

think that's going to be very popular in

46:50

practice. I think the question is who ends

46:53

up being blamed, and that ends up being more

46:55

vibes-based. Yeah, I was feeling that as well

46:57

when I was reading about it in the

46:59

sense that, you know, these companies already have

47:02

limits on users below the age of 13

47:04

using the platforms, already have initiatives that they

47:06

take to try to, you know, limit how

47:08

many of those people are using the platforms

47:11

already or have accounts on the platforms. We can

47:13

debate about how good they are at doing that

47:15

or how much work they actually put into

47:17

that kind of policing or you know do

47:20

they go really far and as you're saying

47:22

ask for the IDs and things like that

47:24

it feels like we've been down that road

47:26

before with the tech companies and they've already

47:29

felt the flak for, like, trying to ask

47:31

for people's IDs like in the case of

47:33

Facebook but you know maybe that

47:35

seems appealing to them so that their

47:38

lobbyists and you know their kind of

47:40

PR arms can try to blame the government

47:42

for it and direct a fair bit of

47:44

blowback in that direction but yeah it depends

47:47

right? And that's kind of what I

47:49

wanted to ask you because you know this

47:51

bill has been passed as you said but

47:53

it doesn't take effect for I believe it's

47:55

like 12 months or something like that as

47:58

they work out exactly how that is going

48:00

to happen. And as I understand it, it's

48:02

the eSafety Commissioner that, you know, kind

48:04

of determines what these ultimate requirements will be.

48:07

How do you feel about that kind of

48:09

process? What do you think is going to

48:11

come of that? And I guess just broadly,

48:13

how do you feel about this position of

48:16

the eSafety Commissioner? Do you think that

48:18

they are often like a positive kind of

48:20

government liaison in that discussion with tech companies?

48:22

Or, you know, is it part of Australian

48:25

tech policy or the government's approach to tech

48:27

that doesn't work so well? Yeah, I mean

48:29

God, I could literally talk about this for

48:31

so long. I would split it into

48:34

two things. I think that the regulator role

48:36

itself has actually been given a crazy amount

48:38

of powers and they have the ability to,

48:40

for example, even force app stores and

48:43

search engines to stop linking to websites and

48:45

apps if they haven't complied with some of

48:47

their other laws. So there is actually this

48:49

incredible capacity in this role if it was

48:52

misused, I think, to be a very, very

48:54

powerful internet censor. At the same time, I

48:56

think that the person who's held the role,

48:58

Julie Inman Grant, interestingly, like I think

49:01

one of her first jobs, she was

49:03

like a Republican staffer. She then has worked

49:05

for Microsoft, she's worked in big tech, and

49:07

then was appointed to the role when it

49:09

was first started in 2016, when it was

49:12

called the Children's eSafety Commissioner, and then

49:14

it was like largely powerless and more to

49:16

do with that kind of what I was

49:18

talking about before being that kind of interface

49:21

with tech companies to be like, you know,

49:23

you need to do something about this content

49:25

which I think seems to like, you know,

49:27

violate your own policies. I'm just like flagging

49:30

this with you. It's grown into this massive

49:32

role that now is going to be in

49:34

charge of writing guidelines that would determine what

49:36

steps companies will be expected to take and

49:39

it's incredibly powerful. I think that she has

49:41

for the most part been, I

49:43

would say, very good politically. I'd say that

49:45

she has for the most part kind

49:48

of been in the Australian media and has helped

49:50

the Australian government with messages about cracking down

49:52

on big tech, but mostly through, I

49:54

think, a very amicable relationship with big

49:57

tech which has helped them. She hasn't used

49:59

the nuclear option really at any opportunity. She's

50:01

actually currently having some fights with Elon Musk

50:03

which have been, like, the biggest ones, over

50:06

Twitter, but for the most part she's kind

50:08

of mostly I think avoided making waves. So

50:10

in terms of like how it's actually going

50:12

to work, I mean, like, her office

50:15

has a really really strong sense of digital

50:17

literacy and I think the way that they've

50:19

kind of run the office while she's been

50:21

there has been, I would say,

50:23

like, you know, while she's definitely pushed for

50:26

things like age verification for explicit content and

50:28

stuff, she mostly has come from a place

50:30

of like I think understanding technologies and understanding

50:32

some of the privacy tradeoffs. So I kind

50:35

of think when she's kind of been tasked

50:37

to do something like this, I expect something

50:39

that will be like quite nuanced. Funnily enough,

50:41

like while this whole process was happening while

50:44

the Prime Minister and the opposition leader were

50:46

saying we support this policy, she was actually

50:48

in public subtly, politely saying that she didn't

50:50

actually support the policy, essentially saying that the

50:53

research, including research done by her office, doesn't

50:55

support the benefits of banning social media. She's

50:57

now in charge of actually figuring out how

50:59

it's going to work. I think it's a

51:02

very interesting position to be in. I think

51:04

there's, again, a very good chance that a

51:06

lot of the steps that companies are required

51:08

to take won't be like massively onerous. But

51:11

that being said, like, you know, I should

51:13

also mention as well, like... I think it's

51:15

very reasonable that tech companies should be

51:17

figuring out ways to actually stop kids of

51:20

any age from accessing their services. So for

51:22

example, like, you know, Snapchat, which is, you

51:24

know, a major company, enormous, like, used by

51:26

children, obviously has an appeal to children, not

51:29

just like 13 to 16, but under that,

51:31

you know, the only way that they figure

51:33

out the age of their users is by

51:35

asking them. They say, hey, what's your birthday,

51:37

and then they just take their

51:40

word for it. Is it reasonable to expect

51:42

that tech companies could be kind of doing something

51:44

more? I mean I think so, like I

51:46

think that the last, you know, like 20

51:49

years of the internet have kind of convinced

51:51

us that like, you know, essentially, like we

51:53

should just take people's word when they say

51:55

their age. I mean, I'll put my hands

51:58

up now, we're what, 50 minutes into the

52:00

podcast, so only people who are massive fans

52:02

are still listening, hopefully, and so hopefully no

52:04

one who's gonna get me in trouble, but

52:07

like, you know, like porn sites, when I

52:09

was like... 15, like I was doing the

52:11

thing, saying that I was 18 and I

52:13

definitely wasn't. There are kind of like two

52:16

interesting questions when it comes to this. Like

52:18

one, what age should people be able to

52:20

access things? And two, how do you actually

52:22

figure out what age people are? Let's just

52:25

move aside whatever age you think people should

52:27

be able to access things and just say

52:29

that like there is an age, just make

52:31

up whatever age you want in your head.

52:34

I do think at the moment the fact

52:36

that for the most part a lot of

52:38

the internet is just kind of based off

52:40

the trust system is a problem. And on the technology side,

52:43

we do have, like, interesting technologies, which again

52:45

come with complex trade-offs about, you

52:47

know, whether they limit access for people who, you

52:49

know, for instance, might not have

52:51

government ID, whether they're invasive for things like

52:54

facial scanning. At the very least, like, there

52:56

are kind of other novel ways that we

52:58

should be, I think, probably, like, investigating

53:00

a bit more, but they're really, really not

53:03

used even by these most massive companies who

53:05

have enormous, like, capacity. It's a very complex

53:07

and nuanced conversation. I agree with you on

53:09

that, right? I think that we should be

53:12

exploring these options in the way that we

53:14

used to have laws like limiting what advertising

53:16

could be directed at young people and you

53:18

needed to know like what age these people

53:21

were in order for those things to be

53:23

effective. I think it's reasonable that we should

53:25

have those, you know, those checks in place

53:27

for the internet and I feel like even

53:30

discussing that can often be headed off by

53:32

these conversations of, oh, now everyone's going to be

53:34

asking for your ID and all this kind

53:36

of stuff online. I think that we should

53:39

be open to discussing that, to talking about

53:41

potential solutions, to accepting that maybe we wouldn't

53:43

be okay with everyone having to present their

53:45

IDs, but maybe there are other ways of

53:48

doing these forms of authentication. And it will

53:50

be interesting to see if, you know, that

53:52

examination from the eSafety Commissioner of these trials

53:54

that they're running will have any interesting results

53:56

that maybe we can learn from. And I

53:59

feel like it was fascinating when you were

54:01

talking about the eSafety Commissioner. Usually when you

54:03

hear, like, former Republican staffer, formerly worked with big

54:05

tech, it's not usually someone you inherently trust,

54:08

right? So it's interesting to see that in

54:10

this position, this person has actually done some

54:12

reasonably good things and we'll see how that

54:14

continues. But to close off our conversation, I

54:17

wanted to ask you this, right? Because we've

54:19

been talking a lot about this particular bill,

54:21

and tech policy more generally. Is there anything

54:23

else that as an international audience we should

54:26

be paying attention to with regards to tech

54:28

in Australia? You mentioned earlier a misinformation bill

54:30

and changes to the privacy bill. Yeah, is

54:32

there anything else that we should be kind

54:35

of watching that is important for maybe international

54:37

viewers to be paying attention to in the

54:39

Australian context? Yeah, I mean, so the mis- and disinformation

54:41

bill, I think I mentioned before, the bill

54:44

has the powers to get records from tech

54:46

companies to say, we want to be able

54:48

to know certain facts about your platforms, so

54:50

we can kind of audit them ourselves. And

54:53

also, we want to be able to, similar

54:55

to those other codes I was talking about,

54:57

require that you take some steps, you can

54:59

suggest what steps you're taking to deal with

55:02

misinformation. And then if we don't like those

55:04

steps, we can instead write our own, and

55:06

then we can also then enforce those steps

55:08

that you're supposed to be taking to make

55:10

sure that you're actually carrying out what you

55:13

say you are. Obviously anything around like the

55:15

ideas of censoring and acting on people, and

55:17

specifically I should say platforms because there was

55:19

never anything in this law about like for

55:22

example like jailing people for sharing you know

55:24

a conspiracy theory. It was always about understanding

55:26

the platforms and their responsibilities. It's a very

55:28

sensitive area and obviously it's incredibly inflammatory. I

55:31

can kind of understand, you know, I think

55:33

it's reasonable at some level to just be

55:35

like at the end of the day, like

55:37

I'm uncomfortable about that. But I do think

55:40

like one of the massive things that's happening

55:42

in tech at the moment that I still just

55:44

don't think gets enough scrutiny and attention is

55:46

the fact that like all the social media

55:49

companies are becoming increasingly opaque about their platforms.

55:51

You know, Twitter, you can't get API access, like,

55:53

from the outside. More than ever before we have

55:55

no idea what's happening on these platforms other

55:58

than what we can kind of like cobble

56:00

together from the outside. So a law like

56:02

this, which, yes, was partly like, we can

56:04

potentially fine you if you're not doing enough

56:07

about misinformation on your platform. I can understand

56:09

how people might get uneasy about that. But

56:11

at least half of it, which was about

56:13

saying, we have the right and the ability

56:16

to compel you to give us some information

56:18

about what's happening. I was like, I want

56:20

to see that. You know, I think the first

56:22

step in regulation around tech is understanding what's

56:24

happening inside because at the moment, these tech

56:27

companies, they hold all the cards and that

56:29

allows them to really influence how the debate

56:31

happens. So it got kind of abandoned by

56:33

the government, but that was a kind of

56:36

interesting aspect of it. And yeah, like I

56:38

mentioned, you know, the quiet work of the

56:40

eSafety Commissioner with these kinds of regulations.

56:42

They're very interesting and a lot of other

56:45

governments around the world have kind of copied

56:47

this eSafety role. You're seeing it more and

56:49

more. Seeing the kind of regulation happen, I

56:51

mean, on one hand it kind of goes

56:54

under the radar. It's got names like

56:56

the Basic Online Safety Expectations, the Online Safety Act codes, class one

56:58

and two content that, you know, if you

57:00

weren't already into tech, you'd probably fall asleep

57:03

now when I just mentioned that. You know,

57:05

you don't really get much public debate about

57:07

it, which makes me kind of a little

57:09

bit uneasy, and largely leaving it to one

57:12

unelected regulator again makes me kind of uneasy

57:14

but at the same time, like, coming up with

57:16

more like nuanced regulations that allow them to

57:18

say hey let's avoid some of the populism

57:21

that kind of you know happened with this

57:23

teen social media ban, and, like, trying to

57:25

come up with really, really sophisticated expectations, I

57:27

think is really cool. You know, I've kind

57:30

of sounded like a bit of an eSafety

57:32

booster here. In some regards, I think it's

57:34

like, you know, with this co-regulatory approach, you're

57:36

always going to kind of end up a

57:38

bit, like, closer to the tech companies as

57:41

a result of it and I think sometimes

57:43

that's frustrating. But, and I

57:45

don't mean to say like you know we

57:47

should be kindly working with tech companies I

57:50

think it's the opposite we should actually be

57:52

saying in a country like Australia we set

57:54

the rules for what happens in Australia. In

57:56

the same way that when we had commercial

57:59

television, or when television was the biggest format,

58:01

and there were only a handful of television stations,

58:03

we decided that we wanted to regulate it

58:05

because there wasn't a bunch of choice. And

58:08

so as a result, government needs to do

58:10

something about it. I think it's kind of

58:12

the same with how tech works these days,

58:14

because the utopian idea of tech was that, essentially,

58:17

like, there'd be an infinite amount of choice.

58:19

And so you'd be able to decide, you

58:21

know, where you want to go: if a company

58:23

isn't serving your interests, you go somewhere

58:26

else. You know, we've kind of ended up

58:28

with essentially the same model as

58:30

television. We've ended up with a handful of

58:32

massive companies who kind of are now increasingly

58:35

these walled gardens that lock all this information

58:37

in and make it very hard for people

58:39

to choose. I think that we should be

58:41

trying to regulate them and in Australia, particularly

58:44

where like, you know, many of them don't

58:46

even have that many employees here. That doesn't

58:48

mean we shouldn't have expectations about how

58:50

Australians should be able to use

58:52

these platforms. I am really supportive

58:55

of regulation around this stuff. I like the

58:57

idea that we're saying, hey I don't really

58:59

care if you know you're a multinational organization

59:01

who doesn't even think Australia is that important.

59:04

If you want to operate here you've got

59:06

to play by our rules and we want

59:08

to set these rules in an interesting way

59:10

that like you know again maybe it's a

59:13

little bit too close to tech but the

59:15

very least it's kind of allowing some kind

59:17

of change to happen instead of these kinds

59:19

of populist policies that aren't great either. Yeah,

59:22

I'm completely on the same page that we

59:24

need to be going after this more aggressively. No one

59:26

will be surprised about that. And I think

59:28

the point that you bring up about the

59:31

opaqueness of these platforms and

59:33

how that really compels further

59:35

regulation, not just to

59:37

find out what's going

59:40

on on these platforms,

59:42

but also to, you know, address

59:44

the further problems

59:46

that come of that is

59:49

a really good

59:51

point and something that

59:53

we need to be

59:55

paying more attention to

59:57

and bringing more into

1:00:00

these conversations. Cam, it's

1:00:02

been fantastic to learn

1:00:04

more about what's been

1:00:06

happening in Australia, not

1:00:09

just about this under-16

1:00:11

social media ban, but

1:00:13

the broader context of

1:00:15

what's happening down in

1:00:18

your part of the

1:00:20

world. Thanks so much

1:00:22

for taking the time

1:00:24

to talk to me.

1:00:27

Thanks, Paris. Cam Wilson

1:00:29

is the associate editor at Crikey. Tech Won't Save

1:00:31

Us is made in partnership with The Nation and

1:00:33

is hosted by Paris Marx. Production is by

1:00:35

Eric Wickham and transcripts are by Brigitte Pawliw-Fry.

1:00:37

Tech Won't Save Us relies on the support

1:00:39

of listeners like you to keep providing critical

1:00:41

perspectives on the tech industry. You can

1:00:43

join hundreds of other supporters by going to

1:00:45

patreon.com/techwontsaveus and making a pledge of

1:00:47

your own. Thanks for listening and make sure

1:00:49

to come back next week.
