Omar Suleiman – Gaza Diaries – How Your Tech Is Being Used For Genocide

Omar Suleiman
AI: Summary ©
The speakers discuss how the tech industry has been deployed in the genocide in Gaza: drones and AI-driven targeting, the suppression of pro-Palestinian voices on social media, and the web of venture capital, defense contractors, and tech leadership connecting them. They discuss making genocide bad for business through boycott and divestment, the dilemma facing tech workers deciding whether to push for change from the inside or organize from the outside, and how ordinary people can get involved, for example by investigating Israel bonds held by their municipalities. The conversation ends with thanks and reflections on the resilience of the Palestinian people.
AI: Transcript ©
00:00:00 --> 00:00:01

I can't sleep.

00:00:02 --> 00:00:04

I'm lying in bed every night and images

00:00:04 --> 00:00:06

of Gaza are running through my head. Fathers

00:00:06 --> 00:00:09

holding their babies, dead, caked in dust.

00:00:09 --> 00:00:12

Bombs dropped on homes, on hospitals,

00:00:12 --> 00:00:13

on schools.

00:00:13 --> 00:00:17

Tens of thousands of dead in indiscriminate bombings.

00:00:17 --> 00:00:18

Children crying,

00:00:19 --> 00:00:21

pulling through rubble to find their families.

00:00:21 --> 00:00:23

This was the first paragraph

00:00:24 --> 00:00:26

in a blog post that many of us

00:00:26 --> 00:00:27

read back in December

00:00:28 --> 00:00:30

called "I can't sleep."

00:00:30 --> 00:00:33

And it was written by an individual named

00:00:33 --> 00:00:36

Paul Biggar, who has since gone on to

00:00:36 --> 00:00:36

found

00:00:37 --> 00:00:39

a group called Tech for Palestine and has

00:00:39 --> 00:00:41

been vocal about the role of tech

00:00:42 --> 00:00:44

in this atrocious genocide that we have been

00:00:44 --> 00:00:46

witnessing day in and day out.

00:00:46 --> 00:00:48

Dear brothers and sisters, as we've been speaking

00:00:48 --> 00:00:48

about

00:00:49 --> 00:00:52

Gaza diaries, people that have been in Gaza

00:00:52 --> 00:00:53

on the ground

00:00:53 --> 00:00:55

and have witnessed the atrocities firsthand.

00:00:56 --> 00:00:58

I think one of the things that all

00:00:58 --> 00:00:59

of us have greatly underestimated

00:01:00 --> 00:01:03

is the way that tech has been deployed

00:01:03 --> 00:01:04

in this atrocious genocide

00:01:05 --> 00:01:06

that we have been witnessing

00:01:07 --> 00:01:10

in different ways. That the same products that

00:01:10 --> 00:01:12

we consume from the same companies

00:01:12 --> 00:01:14

are being used to

00:01:15 --> 00:01:17

destroy the lives of others, even as they

00:01:17 --> 00:01:19

seemingly improve ours.

00:01:19 --> 00:01:22

That drones that fly above their heads, you

00:01:22 --> 00:01:24

might remember when we interviewed Doctor Hayfa and

00:01:24 --> 00:01:26

she spoke about how

00:01:26 --> 00:01:28

you're safe as long as you can hear

00:01:28 --> 00:01:30

the sound of drones. You're not safe when

00:01:30 --> 00:01:31

they disappear because that means that a bombing

00:01:31 --> 00:01:32

is imminent.

00:01:32 --> 00:01:34

You live with this sound,

00:01:35 --> 00:01:35

zzz.

00:01:36 --> 00:01:37

And I was like, what is this?

00:01:38 --> 00:01:40

Because my bed was exactly next to the

00:01:40 --> 00:01:41

window.

00:01:51 --> 00:01:54

And they say it's so casual. Oh, this

00:01:54 --> 00:01:55

is the drones.

00:01:55 --> 00:01:56

Doctora,

00:01:56 --> 00:01:57

don't be scared.

00:01:58 --> 00:02:00

I said drones. They said, oh, no. As

00:02:00 --> 00:02:01

long as you hear it,

00:02:02 --> 00:02:04

then you're safe. It's when it's quiet

00:02:05 --> 00:02:06

you need

00:02:07 --> 00:02:09

to be careful. Mhmm. And if you hear

00:02:09 --> 00:02:10

the sound of bombing,

00:02:11 --> 00:02:13

you need to smile. Listen to this,

00:02:13 --> 00:02:14

SubhanAllah.

00:02:14 --> 00:02:16

If it is coming to you, you will

00:02:16 --> 00:02:18

feel nothing, either you will feel nothing because

00:02:18 --> 00:02:19

you're dead

00:02:20 --> 00:02:22

or the next thing you'll see, everything is

00:02:22 --> 00:02:24

in your head. You will not hear the

00:02:24 --> 00:02:26

sound of the bombing.

00:02:27 --> 00:02:28

That those same drones

00:02:29 --> 00:02:32

have made the sounds of crying children to

00:02:32 --> 00:02:34

force Palestinians to run out to try to

00:02:34 --> 00:02:38

save children only to become the next set

00:02:38 --> 00:02:38

of casualties.

00:02:39 --> 00:02:41

I remember a cousin that,

00:02:41 --> 00:02:42

lived in Gaza

00:02:43 --> 00:02:44

speaking to me, and he just made it

00:02:44 --> 00:02:45

out of Rafah,

00:02:46 --> 00:02:47

just a few days ago.

00:02:47 --> 00:02:49

And I remember in 2021, he shared this

00:02:49 --> 00:02:52

with me. He said that what made this

00:02:52 --> 00:02:55

bombing so different, what made this particular round

00:02:55 --> 00:02:57

of atrocities so different back in

00:02:58 --> 00:02:59

2021, and he's lived through all of them.

00:03:00 --> 00:03:02

He said that it felt like this time

00:03:02 --> 00:03:05

we were lab rats, that this is just

00:03:05 --> 00:03:05

some

00:03:06 --> 00:03:08

big experiment and we are the subject of

00:03:08 --> 00:03:10

that experiment. He said that

00:03:10 --> 00:03:12

those that he lost in 2021

00:03:13 --> 00:03:13

were those that

00:03:14 --> 00:03:16

came closest to the bombings in 2021, and

00:03:16 --> 00:03:19

they described these tiny devices coming right to

00:03:19 --> 00:03:19

their windows

00:03:20 --> 00:03:22

and then entering in as tiny devices before

00:03:22 --> 00:03:23

they exploded

00:03:23 --> 00:03:26

that they had seen sophisticated weaponry that they

00:03:26 --> 00:03:28

had not seen before.

00:03:28 --> 00:03:29

We are witnessing

00:03:29 --> 00:03:31

a genocide on our screens,

00:03:32 --> 00:03:35

but the same companies that manufacture our screens

00:03:35 --> 00:03:37

are also part of the perpetrating of this

00:03:37 --> 00:03:38

genocide.

00:03:39 --> 00:03:41

Paul Biggar, I want to welcome you, and

00:03:41 --> 00:03:42

I want to thank you,

00:03:43 --> 00:03:44

for the work that you have been doing

00:03:44 --> 00:03:47

in the past few months to try to

00:03:47 --> 00:03:49

bring about awareness and then to also try

00:03:49 --> 00:03:50

to bring about a solution

00:03:51 --> 00:03:51

to

00:03:52 --> 00:03:55

the atrocious nature of tech these days. Thank

00:03:55 --> 00:03:58

you so much for being with us, today.

00:03:58 --> 00:04:00

If you could briefly introduce yourself and the

00:04:00 --> 00:04:01

work that you do,

00:04:02 --> 00:04:04

to the audience here today. Thank you so

00:04:04 --> 00:04:05

much for having me.

00:04:05 --> 00:04:07

The I I've always been

00:04:08 --> 00:04:09

a startup founder.

00:04:10 --> 00:04:12

I'm a software engineer by training, but I

00:04:12 --> 00:04:14

got into startups and entrepreneurship,

00:04:14 --> 00:04:16

and I've been doing that for

00:04:17 --> 00:04:18

15-ish years.

00:04:18 --> 00:04:19

The,

00:04:20 --> 00:04:22

I started Tech for Palestine

00:04:23 --> 00:04:24

along with a group of

00:04:25 --> 00:04:26

25 other people,

00:04:27 --> 00:04:29

because we were all doing projects that were

00:04:29 --> 00:04:32

that were in some way intended to help

00:04:33 --> 00:04:35

to help people in Palestine, to help change

00:04:35 --> 00:04:38

the narrative around Palestine in the US,

00:04:38 --> 00:04:41

and in particular, to to change the narrative

00:04:41 --> 00:04:42

in, the tech field.

00:04:43 --> 00:04:43

What made

00:04:44 --> 00:04:45

you

00:04:45 --> 00:04:47

write this blog post on December 14th? If

00:04:47 --> 00:04:49

you can kinda walk me through the process,

00:04:49 --> 00:04:51

when did you think about writing it,

00:04:52 --> 00:04:54

what made you write it, and what has

00:04:54 --> 00:04:55

been the reaction since?

00:04:55 --> 00:04:56

I was on vacation

00:04:57 --> 00:05:00

for for a couple of weeks in October,

00:05:00 --> 00:05:01

November,

00:05:04 --> 00:05:05

and I just I wasn't

00:05:06 --> 00:05:07

you know, all the time, I was just

00:05:07 --> 00:05:10

checking my phone to see what was happening

00:05:10 --> 00:05:11

in Gaza.

00:05:11 --> 00:05:12

And so I was I was sort of

00:05:12 --> 00:05:13

marinating

00:05:13 --> 00:05:15

in in the genocide.

00:05:16 --> 00:05:18

And the whole time, I was thinking, what

00:05:18 --> 00:05:19

what can I do

00:05:19 --> 00:05:21

about this? What you know? I I I

00:05:21 --> 00:05:23

think we all expect that that at some

00:05:23 --> 00:05:24

point in our lives, we get to the

00:05:24 --> 00:05:26

point where where we can have some sort

00:05:26 --> 00:05:27

of impact, where where, you know, if we're

00:05:27 --> 00:05:30

successful in our careers, then, you know, finally,

00:05:30 --> 00:05:32

people will will listen to us or we

00:05:32 --> 00:05:33

can do something about

00:05:34 --> 00:05:36

the bad things that happen in the world.

00:05:37 --> 00:05:39

And I was I was feeling extremely helpless,

00:05:41 --> 00:05:41

and

00:05:42 --> 00:05:44

I thought, you know, just what is the

00:05:44 --> 00:05:46

thing that that I can do? And I

00:05:46 --> 00:05:47

realized that,

00:05:47 --> 00:05:48

you know,

00:05:48 --> 00:05:49

I'm good at writing,

00:05:51 --> 00:05:52

and I have a little bit of a

00:05:52 --> 00:05:54

platform, a little bit of a following, not

00:05:54 --> 00:05:55

not huge.

00:05:56 --> 00:05:58

But I realized that the the thing that

00:05:58 --> 00:06:00

I had was was that I have extremely

00:06:00 --> 00:06:01

high

00:06:03 --> 00:06:04

tolerance for risk,

00:06:04 --> 00:06:06

and I'm at a place in my life

00:06:06 --> 00:06:07

where

00:06:07 --> 00:06:10

being canceled just isn't the thing that's gonna

00:06:10 --> 00:06:10

affect me.

00:06:11 --> 00:06:13

And I realized that if I don't write

00:06:13 --> 00:06:15

a piece like this,

00:06:15 --> 00:06:17

then then who will?

00:06:17 --> 00:06:19

The the narrative that I that I realized

00:06:19 --> 00:06:21

I had to take was was one where,

00:06:22 --> 00:06:23

you know, I I don't you know, I

00:06:23 --> 00:06:25

was I was new to Palestine. I I

00:06:25 --> 00:06:27

I've never been, I've never been to the Middle

00:06:27 --> 00:06:28

East at all. The,

00:06:30 --> 00:06:31

you know, I I wasn't

00:06:32 --> 00:06:34

someone who could discuss the history that I

00:06:34 --> 00:06:36

had, you know, just read about for the

00:06:36 --> 00:06:37

first time weeks beforehand.

00:06:38 --> 00:06:40

But I realized that that what I could

00:06:40 --> 00:06:41

write was just the feelings that I was

00:06:41 --> 00:06:44

having, and I I felt that probably a

00:06:44 --> 00:06:45

lot of other people were having the same

00:06:45 --> 00:06:47

feelings. So I kinda wanna,

00:06:47 --> 00:06:48

you know,

00:06:49 --> 00:06:50

ask you to sort of take us through

00:06:50 --> 00:06:52

a process of how

00:06:52 --> 00:06:55

this all happens in terms of the tech

00:06:55 --> 00:06:57

world's involvement. You know, over the last few

00:06:57 --> 00:06:57

years,

00:06:58 --> 00:07:00

we've been trying to paint a picture for

00:07:00 --> 00:07:02

people of what the process of apartheid is.

00:07:02 --> 00:07:04

Right? And how that is so

00:07:05 --> 00:07:05

embedded

00:07:06 --> 00:07:06

in the institutions

00:07:08 --> 00:07:10

that we partake in on a daily basis.

00:07:10 --> 00:07:12

So at the core of the BDS Movement,

00:07:12 --> 00:07:14

many people are learning about divestment for the

00:07:14 --> 00:07:16

very first time. Right? They kind of understood

00:07:16 --> 00:07:18

the concept of boycott at a personal level,

00:07:18 --> 00:07:19

but divestment

00:07:19 --> 00:07:21

at the education level. And you can see

00:07:21 --> 00:07:22

the way

00:07:23 --> 00:07:23

that people

00:07:24 --> 00:07:27

are sickened when they come to realize that

00:07:27 --> 00:07:29

the same institutions of higher learning

00:07:29 --> 00:07:30

that they put their kids in or that

00:07:30 --> 00:07:32

they enroll in

00:07:33 --> 00:07:35

to become change makers in the world are

00:07:35 --> 00:07:36

also investing

00:07:36 --> 00:07:39

in such a horrendous occupation, investing

00:07:39 --> 00:07:41

in Apartheid. And you kind of paint that

00:07:41 --> 00:07:42

picture of

00:07:43 --> 00:07:45

someone who goes to school, who works hard

00:07:46 --> 00:07:47

to pay their tuition

00:07:48 --> 00:07:50

with whatever money is left over after the

00:07:50 --> 00:07:52

tax dollars that are also being used to

00:07:52 --> 00:07:53

fund the weaponry

00:07:54 --> 00:07:56

and then coming to realize that that money

00:07:56 --> 00:07:57

is also going to

00:07:58 --> 00:08:01

the investment in occupation and apartheid.

00:08:01 --> 00:08:04

Making people aware of defense contractors and the

00:08:04 --> 00:08:07

role that the military industrial complex plays in

00:08:07 --> 00:08:08

all of this,

00:08:08 --> 00:08:10

the cruelty of this, that this is a

00:08:10 --> 00:08:13

big market. Right? I think many people, average

00:08:13 --> 00:08:15

Americans, came to know about Halliburton and how

00:08:15 --> 00:08:17

that factored into the Iraq war

00:08:18 --> 00:08:20

and so many different ways that

00:08:21 --> 00:08:22

it just speaks to the cruelty

00:08:24 --> 00:08:26

of the so called progress,

00:08:27 --> 00:08:29

in society where you've got someone,

00:08:29 --> 00:08:31

you know, sitting behind a computer

00:08:32 --> 00:08:34

in Nevada, you know, eating a bag of

00:08:34 --> 00:08:35

chips,

00:08:35 --> 00:08:38

sipping on a Coke can, who just presses

00:08:38 --> 00:08:40

a button and then that deploys this drone

00:08:41 --> 00:08:43

that goes and blows up a wedding and

00:08:43 --> 00:08:46

murders 200 innocent people. So it's it's painting

00:08:46 --> 00:08:47

the picture. Right?

00:08:47 --> 00:08:50

And walking people through the entire process of

00:08:50 --> 00:08:53

apartheid, the the process of being invested in

00:08:53 --> 00:08:54

apartheid as a country.

00:08:55 --> 00:08:57

How do you start to paint that picture

00:08:57 --> 00:09:00

for someone who doesn't understand how tech factors

00:09:00 --> 00:09:02

into all of this? What is the role

00:09:02 --> 00:09:03

that these companies play

00:09:04 --> 00:09:05

in apartheid,

00:09:05 --> 00:09:08

in the genocide, and how were we maybe

00:09:08 --> 00:09:11

subconsciously playing a part in that process without

00:09:11 --> 00:09:13

even realizing it? I think the challenge that

00:09:13 --> 00:09:15

people have in understanding it is is how

00:09:15 --> 00:09:18

systemic it is. It's it's not

00:09:19 --> 00:09:22

a direct connection between the things that we

00:09:22 --> 00:09:22

do.

00:09:23 --> 00:09:23

It's rather

00:09:24 --> 00:09:24

a

00:09:25 --> 00:09:28

system of everything being connected in a way

00:09:28 --> 00:09:30

that makes it so the money that we

00:09:30 --> 00:09:31

make

00:09:31 --> 00:09:32

when

00:09:32 --> 00:09:34

we scroll through Instagram, right, we we we

00:09:34 --> 00:09:36

make money for Meta. Meta uses that

00:09:37 --> 00:09:37

to,

00:09:38 --> 00:09:38

suppress

00:09:39 --> 00:09:40

the,

00:09:42 --> 00:09:44

the the content that we see. They build

00:09:44 --> 00:09:47

they build AI, and that AI is used

00:09:47 --> 00:09:49

to decide which voices are going to be

00:09:49 --> 00:09:50

seen.

00:09:51 --> 00:09:52

And when you

00:09:53 --> 00:09:55

take a step back from a company that's

00:09:55 --> 00:09:58

doing specific suppression like Meta or Google or

00:09:58 --> 00:10:00

YouTube,

00:10:00 --> 00:10:03

and you take it back 20 years earlier.

00:10:03 --> 00:10:06

There was a point at which Mark Zuckerberg

00:10:06 --> 00:10:07

was trying to set up Meta or that

00:10:07 --> 00:10:09

Larry Page was setting up Google

00:10:10 --> 00:10:13

where a room full of people,

00:10:15 --> 00:10:16

were deciding,

00:10:17 --> 00:10:18

should we give

00:10:19 --> 00:10:19

$500,000

00:10:20 --> 00:10:22

to Mark Zuckerberg to decide that he can

00:10:22 --> 00:10:24

make this this thing that's gonna take over

00:10:24 --> 00:10:27

the world? And the people in that room

00:10:27 --> 00:10:28

are,

00:10:28 --> 00:10:30

you know, a huge part of the system.

00:10:30 --> 00:10:32

They're you you're talking about universities. They are

00:10:32 --> 00:10:34

also getting a lot of their funding from

00:10:34 --> 00:10:36

universities. So people like Peter Thiel, for example.

00:10:36 --> 00:10:39

Peter Thiel, who made a decisive

00:10:39 --> 00:10:40

and important,

00:10:41 --> 00:10:41

donation

00:10:42 --> 00:10:44

to Donald Trump in the 2016 election.

00:10:44 --> 00:10:47

Peter Thiel, who invested in Palantir,

00:10:47 --> 00:10:50

and who who who started and cofounded Palantir,

00:10:50 --> 00:10:51

which is part of the AI behind the

00:10:51 --> 00:10:53

war, is the same Peter Thiel who gave

00:10:53 --> 00:10:54

Mark Zuckerberg $500,000

00:10:55 --> 00:10:58

that was instrumental in building Facebook in 2005.

00:10:59 --> 00:11:03

The systems of venture capital, of right wing,

00:11:03 --> 00:11:05

of weapons manufacturers,

00:11:05 --> 00:11:07

of private equity, they're all,

00:11:08 --> 00:11:08

you know,

00:11:09 --> 00:11:09

interrelated

00:11:10 --> 00:11:12

with the tech companies, whether they're they're social

00:11:12 --> 00:11:14

media tech companies or whether they're they're tech

00:11:14 --> 00:11:16

companies that are, you know, doing

00:11:17 --> 00:11:19

cybersecurity and testing it directly on the Palestinian

00:11:20 --> 00:11:21

people. I think that one of the things

00:11:21 --> 00:11:22

that

00:11:22 --> 00:11:24

really shocks people is not the idea that

00:11:24 --> 00:11:27

these social media companies are actively suppressing

00:11:28 --> 00:11:30

the voices on behalf of the Palestinian cause,

00:11:32 --> 00:11:33

you know, in service to

00:11:34 --> 00:11:36

the Zionist project, I think that what is

00:11:36 --> 00:11:38

shocking to people is how active these same

00:11:38 --> 00:11:41

companies have been in the actual oppression and

00:11:41 --> 00:11:43

the actual operation that's taking place. So, like,

00:11:43 --> 00:11:45

if you were to tell me a few

00:11:45 --> 00:11:46

months ago

00:11:47 --> 00:11:47

that WhatsApp

00:11:48 --> 00:11:51

would be directly complicit, that Meta would be

00:11:51 --> 00:11:53

feeding facial recognition

00:11:54 --> 00:11:56

to the Israeli government, right, which literally puts

00:11:56 --> 00:11:58

people's lives in danger.

00:11:58 --> 00:12:00

That a group of billionaires on a WhatsApp

00:12:00 --> 00:12:01

group

00:12:01 --> 00:12:04

are directing the New York mayor on behalf

00:12:04 --> 00:12:06

of a foreign government

00:12:06 --> 00:12:07

to

00:12:07 --> 00:12:11

shut down encampments at Columbia. That AI

00:12:11 --> 00:12:14

is being used to eliminate Palestinian people already,

00:12:15 --> 00:12:16

that that they have a way by which

00:12:16 --> 00:12:19

they deploy pure tech and that they are

00:12:19 --> 00:12:21

experimenting, that there's big money in this.

00:12:22 --> 00:12:23

You know, I wouldn't have thought

00:12:24 --> 00:12:26

it would be this advanced already. Right? I

00:12:26 --> 00:12:28

think that there have been warnings about AI.

00:12:28 --> 00:12:30

There have been warnings about where these tech

00:12:30 --> 00:12:31

companies are going,

00:12:32 --> 00:12:34

but how far have we already lost the

00:12:34 --> 00:12:35

plot? Right? That's kind of the question:

00:12:35 --> 00:12:37

like, how far gone are we already? You've

00:12:37 --> 00:12:38

already got

00:12:38 --> 00:12:41

so much that's been done and so much

00:12:41 --> 00:12:42

that's been developed that

00:12:43 --> 00:12:45

there still haven't been leaks about that we

00:12:45 --> 00:12:46

still haven't had,

00:12:46 --> 00:12:48

you know, the the Pauls of the world

00:12:48 --> 00:12:49

come out and tell us about that are

00:12:49 --> 00:12:51

that are operating behind the scenes. So how

00:12:51 --> 00:12:52

far

00:12:52 --> 00:12:55

has this gone, and can it ever be

00:12:55 --> 00:12:58

reeled back? We are gone gone. It's it's

00:12:58 --> 00:12:59

so far.

00:13:01 --> 00:13:02

The

00:13:02 --> 00:13:03

everywhere

00:13:03 --> 00:13:04

that you look,

00:13:05 --> 00:13:05

there's

00:13:06 --> 00:13:07

people who are

00:13:08 --> 00:13:09

promoting Israel,

00:13:09 --> 00:13:10

who are,

00:13:11 --> 00:13:14

controlling some aspect of the conversation,

00:13:14 --> 00:13:16

whether it's the editors of the New York

00:13:16 --> 00:13:18

Times who are printing

00:13:19 --> 00:13:21

direct propaganda, direct false propaganda.

00:13:22 --> 00:13:25

Everywhere I look, there's there's someone pulling on

00:13:25 --> 00:13:26

the strings. So I'll give you I'll give

00:13:26 --> 00:13:29

you one example that you're you're asking about

00:13:29 --> 00:13:30

Meta, and this is this is, you know,

00:13:30 --> 00:13:31

frankly shocking.

00:13:32 --> 00:13:32

The

00:13:33 --> 00:13:34

you know, when when you look at the

00:13:34 --> 00:13:35

Meta leadership,

00:13:36 --> 00:13:37

Mark Zuckerberg

00:13:37 --> 00:13:40

donated money to ZAKA, which is one of

00:13:40 --> 00:13:42

the largest or sorry, one of the creators

00:13:43 --> 00:13:44

of the,

00:13:44 --> 00:13:48

October 7th atrocity propaganda that that has been

00:13:48 --> 00:13:50

consistently and repeatedly proven false by by lots

00:13:50 --> 00:13:51

of different,

00:13:53 --> 00:13:53

publications,

00:13:54 --> 00:13:57

including Israeli ones. They have Sheryl Sandberg, who's

00:13:57 --> 00:13:59

on their board and who for a long

00:13:59 --> 00:14:01

time was was their leader, who is one

00:14:01 --> 00:14:03

of the most active,

00:14:04 --> 00:14:05

spreaders of the,

00:14:07 --> 00:14:08

of the mass * hoax,

00:14:09 --> 00:14:10

that is used to dehumanize

00:14:11 --> 00:14:11

Palestinians.

00:14:12 --> 00:14:15

Their their chief information security officer, Guy Rosen,

00:14:15 --> 00:14:15

is

00:14:16 --> 00:14:17

former IDF,

00:14:18 --> 00:14:20

was formerly in Unit 8200, the Israeli NSA, the same people

00:14:20 --> 00:14:23

that built Lavender.

00:14:23 --> 00:14:24

He lives in Israel

00:14:25 --> 00:14:27

and lives in Tel Aviv. And this is

00:14:27 --> 00:14:30

the person who has the most power in

00:14:30 --> 00:14:31

deciding at Meta

00:14:32 --> 00:14:34

what conversations are had,

00:14:34 --> 00:14:37

what policies are made about the content that

00:14:37 --> 00:14:37

we're seeing,

00:14:39 --> 00:14:39

and

00:14:40 --> 00:14:40

what

00:14:41 --> 00:14:43

you know, who gets suppressed and and who

00:14:43 --> 00:14:45

doesn't get suppressed. And so everywhere

00:14:45 --> 00:14:47

you look in

00:14:48 --> 00:14:50

Meta, in LinkedIn, in Twitter, in,

00:14:53 --> 00:14:54

in Google,

00:14:55 --> 00:14:56

you are facing

00:14:58 --> 00:15:01

the same sort of of suppression, and it's

00:15:01 --> 00:15:03

the same it's the same story every time.

00:15:03 --> 00:15:06

There's there's people who are extremely pro Israel

00:15:06 --> 00:15:09

in very important positions who are making the

00:15:09 --> 00:15:11

decisions about what content

00:15:12 --> 00:15:14

you and I get to see. And it's

00:15:14 --> 00:15:16

it's in tech, but it's the same thing

00:15:16 --> 00:15:18

that you see in newspapers. It's the same

00:15:18 --> 00:15:21

thing that you see in in CNN and

00:15:21 --> 00:15:23

on Fox News and and on the,

00:15:24 --> 00:15:27

and in international media as well. It's it's

00:15:27 --> 00:15:28

the same thing everywhere.

00:15:28 --> 00:15:29

What do you say

00:15:31 --> 00:15:33

to the idea of making genocide bad for

00:15:33 --> 00:15:36

business? Right? So, like, this idea that, look,

00:15:36 --> 00:15:36

you have,

00:15:37 --> 00:15:39

just from a pure numbers perspective, more people

00:15:39 --> 00:15:42

in the world that use these platforms

00:15:42 --> 00:15:43

that are pro Palestinian

00:15:44 --> 00:15:46

than pro Israel. Right? I mean, in in

00:15:46 --> 00:15:48

the United States, we live in a bubble,

00:15:48 --> 00:15:51

a very carefully crafted genocidal bubble. Right? I

00:15:51 --> 00:15:53

mean, it's it's clear as night and day

00:15:53 --> 00:15:56

for anyone that travels outside of the United

00:15:56 --> 00:15:57

States or

00:15:57 --> 00:16:01

doesn't limit themselves to US media, corporate media,

00:16:01 --> 00:16:04

or talking points from their politicians at every

00:16:04 --> 00:16:06

at every level of government. From just a

00:16:06 --> 00:16:08

business perspective, at what point is it just

00:16:08 --> 00:16:10

bad for business for these companies? Is that

00:16:10 --> 00:16:13

a way forward to make more people around

00:16:13 --> 00:16:16

the world aware and then perhaps to try

00:16:16 --> 00:16:16

to instrumentalize

00:16:17 --> 00:16:19

that people power, that numbers game,

00:16:20 --> 00:16:21

against these companies?

00:16:21 --> 00:16:23

I think that's exactly the way that it

00:16:23 --> 00:16:23

has to go.

00:16:24 --> 00:16:27

It doesn't seem like tech is gonna reform

00:16:27 --> 00:16:27

itself.

00:16:29 --> 00:16:30

In terms of

00:16:31 --> 00:16:33

how the narrative has shifted in the US,

00:16:33 --> 00:16:35

we've seen the narrative shift massively over the

00:16:35 --> 00:16:36

last 6 months,

00:16:38 --> 00:16:41

But tech has been extremely stubborn. And in

00:16:41 --> 00:16:43

fact, I would say is,

00:16:43 --> 00:16:44

is,

00:16:44 --> 00:16:46

you know, one one one of the holdouts

00:16:46 --> 00:16:49

in terms of of narrative shift. You've seen

00:16:49 --> 00:16:51

very little shift at all. When you look

00:16:51 --> 00:16:52

at at

00:16:53 --> 00:16:55

what changed things in South Africa, the the

00:16:55 --> 00:16:56

economic boycott

00:16:56 --> 00:16:58

was one of the

00:16:58 --> 00:16:59

major

00:16:59 --> 00:17:01

aspects of of what caused

00:17:02 --> 00:17:05

apartheid to fall eventually. I don't think that

00:17:05 --> 00:17:07

it's gonna be exactly the same in Israel.

00:17:07 --> 00:17:10

But the economic boycott, the academic boycott, the

00:17:10 --> 00:17:13

sports boycott, the divestment, those are all gonna

00:17:13 --> 00:17:14

be key parts

00:17:15 --> 00:17:15

of,

00:17:16 --> 00:17:17

damaging,

00:17:18 --> 00:17:21

the Israeli economy and and and making

00:17:22 --> 00:17:24

our displeasure felt within Israel,

00:17:25 --> 00:17:27

but also in making it clear to the

00:17:27 --> 00:17:30

United States that you can't keep doing this,

00:17:30 --> 00:17:32

that if you continue to suppress, to shut

00:17:32 --> 00:17:33

down,

00:17:33 --> 00:17:36

to, you know, even to to side with

00:17:36 --> 00:17:36

Israel

00:17:37 --> 00:17:39

or in this case or in in the

00:17:39 --> 00:17:42

most recent case with the Blockout 2024

00:17:42 --> 00:17:43

campaign,

00:17:43 --> 00:17:45

if you stay quiet during a genocide,

00:17:46 --> 00:17:48

you know, we, as people, are not gonna

00:17:48 --> 00:17:51

continue to support you, and we know, you

00:17:51 --> 00:17:53

know, it's it's America. The place where

00:17:54 --> 00:17:56

we have an effect is with money.

00:17:57 --> 00:17:59

And it's purely for for for these companies.

00:18:00 --> 00:18:01

The only thing that they care about is

00:18:01 --> 00:18:03

is how much money they're gonna make, and

00:18:03 --> 00:18:04

so it becomes

00:18:04 --> 00:18:05

how can

00:18:06 --> 00:18:06

the,

00:18:06 --> 00:18:08

the economics be changed

00:18:08 --> 00:18:11

from continuing to support Israel, which which for

00:18:11 --> 00:18:13

many of them is an,

00:18:14 --> 00:18:14

ideological

00:18:15 --> 00:18:17

interest and not a financial interest.

00:18:18 --> 00:18:20

So as it starts to as it starts

00:18:20 --> 00:18:22

to hurt, there's gonna there's a lot of

00:18:22 --> 00:18:24

people in those organizations who don't have that

00:18:24 --> 00:18:27

ideological interest and who are then saying, why

00:18:27 --> 00:18:28

are we doing this? You know, it's it's

00:18:28 --> 00:18:30

hurting the bottom line.

00:18:30 --> 00:18:32

The reality is is that you've got these

00:18:32 --> 00:18:35

companies that operate that we're all familiar with.

00:18:35 --> 00:18:38

Right? So suppression is happening across the board.

00:18:39 --> 00:18:42

I wouldn't be surprised if Google, YouTube suppressed

00:18:42 --> 00:18:44

this video. They've shadow banned much of our

00:18:44 --> 00:18:47

content on Palestine already. We've been shadow banned

00:18:47 --> 00:18:49

on Meta, even on TikTok.

00:18:50 --> 00:18:50

X is,

00:18:51 --> 00:18:53

you know, long gone as well. So the

00:18:53 --> 00:18:55

shadow banning, the suppression is all there with

00:18:55 --> 00:18:57

these companies that we're all familiar with, but

00:18:57 --> 00:18:59

then there are names that operate in the

00:18:59 --> 00:19:01

background that I think are even less familiar

00:19:01 --> 00:19:02

to people

00:19:02 --> 00:19:05

than the military contractors. Right? So now people

00:19:05 --> 00:19:07

are kind of starting to become familiar with,

00:19:08 --> 00:19:10

you know, some of the ways that,

00:19:10 --> 00:19:13

Lockheed Martin and Raytheon and, like, you hear

00:19:13 --> 00:19:14

some of these names and and some people

00:19:14 --> 00:19:17

that are not privy to that industry

00:19:17 --> 00:19:18

can catch on a bit.

00:19:19 --> 00:19:21

But when you hear a name like Palantir,

00:19:22 --> 00:19:24

that means nothing to 99.9%

00:19:24 --> 00:19:25

of people.

00:19:25 --> 00:19:27

And they just had a conference

00:19:28 --> 00:19:29

a few weeks ago on,

00:19:30 --> 00:19:33

AI warfare, I guess, the first AI warfare

00:19:33 --> 00:19:33

conference.

00:19:34 --> 00:19:36

And the thought of a bunch of business

00:19:36 --> 00:19:38

executives sitting in a room,

00:19:39 --> 00:19:42

eating their 5 course meals and, you know,

00:19:42 --> 00:19:45

laughing about how they've just developed a new

00:19:45 --> 00:19:47

technology that could eliminate people more efficiently while

00:19:47 --> 00:19:49

still being profit generating

00:19:49 --> 00:19:51

is terrifying to people. But I think that

00:19:51 --> 00:19:52

it's a sign of the moral decay of

00:19:52 --> 00:19:54

society too. Right? Like,

00:19:54 --> 00:19:56

you know, as as the United States sort

00:19:56 --> 00:19:59

of pontificates to the entire world about,

00:19:59 --> 00:20:01

human rights and and and freedom

00:20:01 --> 00:20:02

and,

00:20:03 --> 00:20:04

just

00:20:05 --> 00:20:05

everything

00:20:06 --> 00:20:08

that it stands in complete

00:20:08 --> 00:20:09

contradiction of

00:20:11 --> 00:20:13

as the curtain is kind of being drawn

00:20:13 --> 00:20:15

and the face is being exposed.

00:20:15 --> 00:20:17

How do you start to expose the sick

00:20:17 --> 00:20:20

world of the likes of Palantir,

00:20:20 --> 00:20:22

right, in these conferences? What do you do

00:20:22 --> 00:20:23

to start

00:20:24 --> 00:20:26

sort of shining a spotlight on that so

00:20:26 --> 00:20:28

that people become more aware of the companies

00:20:28 --> 00:20:29

that operate

00:20:29 --> 00:20:30

behind the companies?

00:20:31 --> 00:20:33

The fact is there's nothing we can do

00:20:33 --> 00:20:35

about Palantir, and I think it's it's very

00:20:35 --> 00:20:37

much designed that way in the same way

00:20:37 --> 00:20:39

that there's that there's, you know, nothing we

00:20:39 --> 00:20:41

can directly do about about Lockheed Martin.

00:20:42 --> 00:20:43

Or there's very you know, we can actually

00:20:43 --> 00:20:45

do more about Lockheed Martin because they make

00:20:45 --> 00:20:47

physical products that they need to ship to

00:20:47 --> 00:20:49

places. But when you make

00:20:49 --> 00:20:51

AI that that ships over the Internet, right,

00:20:51 --> 00:20:53

it's it's quite a different,

00:20:54 --> 00:20:56

quite a different experience. You know, one of

00:20:56 --> 00:20:58

the major things that the that we can

00:20:58 --> 00:20:59

do, and this is why I push on

00:20:59 --> 00:21:02

venture capital so much, is that we can

00:21:02 --> 00:21:04

change which companies get funded.

00:21:05 --> 00:21:07

The companies that get funded

00:21:08 --> 00:21:09

at the

00:21:09 --> 00:21:10

very start of the process

00:21:11 --> 00:21:13

are the ones that are decided by a

00:21:14 --> 00:21:16

relatively small group of people that are

00:21:17 --> 00:21:18

very right wing,

00:21:19 --> 00:21:20

folks like Andreessen Horowitz,

00:21:21 --> 00:21:24

Peter Thiel. You know, the the these are

00:21:24 --> 00:21:25

people who

00:21:25 --> 00:21:25

are,

00:21:26 --> 00:21:27

you know, extremely

00:21:28 --> 00:21:28

pro

00:21:29 --> 00:21:29

defense

00:21:30 --> 00:21:31

and war,

00:21:32 --> 00:21:33

and they're

00:21:34 --> 00:21:36

you know, it's it's hard to to

00:21:37 --> 00:21:39

back a society away from that when the

00:21:39 --> 00:21:40

entire

00:21:40 --> 00:21:41

economic engine

00:21:42 --> 00:21:44

of the society is is is based on

00:21:44 --> 00:21:46

that. People make a lot of money from

00:21:46 --> 00:21:47

war.

00:21:47 --> 00:21:49

And the fact that,

00:21:49 --> 00:21:51

you know, we we we moved

00:21:51 --> 00:21:53

a lot of our,

00:21:54 --> 00:21:56

a lot of the the war machine into

00:21:56 --> 00:21:59

the private sector means that the private sector,

00:21:59 --> 00:22:01

you know, has to make money, has to

00:22:01 --> 00:22:03

make more money. You know, they're they're traded

00:22:03 --> 00:22:05

on the stock exchange, and then our,

00:22:05 --> 00:22:08

our pensions get wrapped into it, and our,

00:22:08 --> 00:22:11

you know, public investments and everything

00:22:11 --> 00:22:14

gets pulled into the same system. And then

00:22:14 --> 00:22:16

people start having conversations like, well, if we

00:22:16 --> 00:22:18

divest from Raytheon, the

00:22:18 --> 00:22:19

university

00:22:20 --> 00:22:20

will make,

00:22:21 --> 00:22:23

you know, 9% instead of 10% on its

00:22:23 --> 00:22:26

investments this year, and then the poor students

00:22:26 --> 00:22:27

won't have won't have funding,

00:22:28 --> 00:22:30

for for their for their activities. And, you

00:22:30 --> 00:22:32

know, we we we don't wanna deprive the

00:22:32 --> 00:22:34

students, do we? You know, we have here

00:22:34 --> 00:22:36

in Dallas, we have, General Dynamics, which is

00:22:36 --> 00:22:39

one of the weapons manufacturers. It's literally right

00:22:39 --> 00:22:39

next door

00:22:40 --> 00:22:41

to UTD,

00:22:42 --> 00:22:43

which has stock in it, and it's just

00:22:43 --> 00:22:45

crazy to think about that process. Right? It's

00:22:45 --> 00:22:47

like, you know, we we're gonna need to

00:22:47 --> 00:22:49

continue to blow up universities so that you

00:22:49 --> 00:22:51

can have a better gym. Right? It's, you

00:22:51 --> 00:22:53

know, the it is it's a matter of

00:22:53 --> 00:22:55

1% or 2% in the in in the

00:22:55 --> 00:22:57

stock exchange or in the investment in in

00:22:57 --> 00:23:00

the ROI. In tech, you you you have

00:23:00 --> 00:23:02

that that exact same thing that people people

00:23:02 --> 00:23:04

talk about, you know, making the world a

00:23:04 --> 00:23:06

better place. They're they're, you know, we're building

00:23:06 --> 00:23:09

the the future. We're building the the companies

00:23:09 --> 00:23:11

that are gonna change the future. They're gonna

00:23:11 --> 00:23:12

change our lives.

00:23:12 --> 00:23:14

And a lot of these companies are gonna

00:23:14 --> 00:23:16

change our lives in extremely negative ways.

00:23:17 --> 00:23:18

And, you know, even,

00:23:18 --> 00:23:21

you know, even looking past, they're gonna blow

00:23:21 --> 00:23:24

up our homes or they're gonna militarize the

00:23:24 --> 00:23:25

police or they're gonna,

00:23:26 --> 00:23:27

you know, have have,

00:23:28 --> 00:23:30

AI checking on us and and,

00:23:31 --> 00:23:31

you know,

00:23:32 --> 00:23:33

you you you you take it back to

00:23:33 --> 00:23:35

what has actually happened and what tech has

00:23:35 --> 00:23:38

actually done for us so far. They've destroyed

00:23:38 --> 00:23:39

our our attention,

00:23:40 --> 00:23:43

you know, the the dopamine economy of of

00:23:43 --> 00:23:46

the social media networks. They've created this this

00:23:46 --> 00:23:48

gig economy where where people don't have good

00:23:48 --> 00:23:50

jobs anymore, and people are, you know, stringing

00:23:50 --> 00:23:53

2 or 3 jobs together in order to

00:23:53 --> 00:23:55

be able to to to make their rent,

00:23:55 --> 00:23:55

the,

00:23:57 --> 00:24:01

they're they're working with landlords to to collude,

00:24:01 --> 00:24:03

to drive rent prices up. You know, the

00:24:03 --> 00:24:05

the the kind of things that that tech

00:24:05 --> 00:24:07

have done for the world have not been

00:24:07 --> 00:24:08

extremely positive for us,

00:24:09 --> 00:24:11

even when they do the things that they

00:24:11 --> 00:24:13

say are supposed to be positive for us.

00:24:13 --> 00:24:14

There's an interesting question,

00:24:16 --> 00:24:17

that I've personally been asked. So I'm I'm

00:24:17 --> 00:24:19

sure you know many good people that work

00:24:19 --> 00:24:22

at these tech companies. I personally know many

00:24:22 --> 00:24:24

good people that work at Google,

00:24:24 --> 00:24:26

Muslims and otherwise. Right? Palestinians

00:24:27 --> 00:24:29

that work at these companies, and they've tried

00:24:29 --> 00:24:30

to make change

00:24:30 --> 00:24:33

from within. And they kinda have this moral

00:24:33 --> 00:24:35

conundrum. Right? You've got, like, if you think

00:24:35 --> 00:24:37

about it in the government sense, right, you've

00:24:37 --> 00:24:39

got people that still talk about how far

00:24:39 --> 00:24:41

should we engage, which politician should we engage,

00:24:41 --> 00:24:43

how much do we engage the system versus

00:24:43 --> 00:24:44

fighting the system. And,

00:24:45 --> 00:24:46

you know, you've seen a number of people

00:24:46 --> 00:24:49

resign from this particular administration, the Biden administration,

00:24:49 --> 00:24:52

which has proven to be the worst manifestation

00:24:52 --> 00:24:54

of everything that Donald Trump would express, right,

00:24:54 --> 00:24:55

in regards to,

00:24:56 --> 00:24:56

Palestine.

00:24:57 --> 00:24:58

What do you kind of say to people

00:24:58 --> 00:24:59

that work in tech? I mean, I get

00:24:59 --> 00:25:01

asked this question regularly as an Imam. Right?

00:25:01 --> 00:25:03

People come to me with the faith crisis

00:25:03 --> 00:25:05

of sorts, like, look, I work in these

00:25:05 --> 00:25:06

companies.

00:25:06 --> 00:25:08

I've been trying to move the needle.

00:25:08 --> 00:25:10

I thought I was more important to the

00:25:10 --> 00:25:12

company than I actually am because you see

00:25:12 --> 00:25:14

how many I mean, Google laid off what?

00:25:14 --> 00:25:17

Like, 170 people after those protests, 174 people

00:25:17 --> 00:25:19

in in one strike. I thought I was,

00:25:19 --> 00:25:21

you know, I thought I was in good

00:25:21 --> 00:25:23

standing with the company and that my expression

00:25:23 --> 00:25:25

would mean something, but clearly that hasn't been

00:25:25 --> 00:25:28

the case. Do I continue to stay

00:25:28 --> 00:25:30

and keep pushing for,

00:25:31 --> 00:25:34

you know, some sort of ethical boundaries in

00:25:34 --> 00:25:35

what is

00:25:35 --> 00:25:36

inherently unethical?

00:25:38 --> 00:25:39

Do I try to make

00:25:39 --> 00:25:41

the best of the situation,

00:25:41 --> 00:25:43

or do I exit

00:25:43 --> 00:25:46

and fight, from the outside? What what do

00:25:46 --> 00:25:47

you say to young people in tech that

00:25:47 --> 00:25:49

come to you with that moral conundrum?

00:25:50 --> 00:25:52

We need people on the inside as well

00:25:52 --> 00:25:52

as the outside.

00:25:53 --> 00:25:53

I'm

00:25:54 --> 00:25:56

not someone who works very well on the

00:25:56 --> 00:25:56

inside.

00:25:57 --> 00:25:58

I'm

00:25:58 --> 00:26:00

much more likely to to start fires on

00:26:00 --> 00:26:01

the outside.

00:26:02 --> 00:26:04

The it is definitely true that that that

00:26:04 --> 00:26:04

people,

00:26:05 --> 00:26:07

you know, are needed on the inside. It's

00:26:07 --> 00:26:09

a difficult it's a difficult game to play.

00:26:09 --> 00:26:10

It's,

00:26:10 --> 00:26:12

you know, when you start getting that that

00:26:12 --> 00:26:13

tech money,

00:26:13 --> 00:26:15

it's difficult to step away from. You know,

00:26:15 --> 00:26:16

people,

00:26:17 --> 00:26:19

I I I've talked to dozens of people

00:26:19 --> 00:26:20

on the inside and, you know, they they

00:26:20 --> 00:26:22

want to speak up. They they know that

00:26:22 --> 00:26:24

the things that they're working on are harmful.

00:26:25 --> 00:26:27

But they have mortgages, and they have kids

00:26:27 --> 00:26:29

in private schools, and it's it's,

00:26:30 --> 00:26:32

it's a difficult line for people,

00:26:33 --> 00:26:35

and I I I I think that that

00:26:35 --> 00:26:36

a lot of people who

00:26:37 --> 00:26:39

who want to to work on it from

00:26:39 --> 00:26:40

the inside,

00:26:42 --> 00:26:44

I think they end up massively dissatisfied by

00:26:44 --> 00:26:45

their own impact.

00:26:46 --> 00:26:48

But at the same time, it is something

00:26:48 --> 00:26:50

that's needed, and the the the people who

00:26:50 --> 00:26:53

are getting fired for for protesting Google,

00:26:54 --> 00:26:56

the people who are writing open letters,

00:26:57 --> 00:26:58

who are publishing

00:26:58 --> 00:26:59

about

00:26:59 --> 00:27:01

what Meta does internally,

00:27:01 --> 00:27:03

even the people who are talking to me

00:27:03 --> 00:27:05

and giving me you know, telling me the

00:27:05 --> 00:27:07

story about what really happens within these companies

00:27:07 --> 00:27:09

can be massively impactful. And if they're not

00:27:09 --> 00:27:10

there,

00:27:11 --> 00:27:11

then,

00:27:12 --> 00:27:14

you know, then the only people who are

00:27:14 --> 00:27:16

having voices in those rooms

00:27:16 --> 00:27:18

are people who are pro Israel. So it

00:27:18 --> 00:27:19

is

00:27:20 --> 00:27:20

absolutely

00:27:21 --> 00:27:21

necessary,

00:27:23 --> 00:27:24

but I think it is also

00:27:26 --> 00:27:28

yeah. I can I can only imagine the

00:27:28 --> 00:27:30

toll that it takes on them. And, I

00:27:30 --> 00:27:32

mean, some of those people, frankly, are are

00:27:32 --> 00:27:33

are there.

00:27:33 --> 00:27:35

And what's keeping them there is not just,

00:27:35 --> 00:27:37

you know, and I'm sure you know this

00:27:37 --> 00:27:38

obviously. And

00:27:38 --> 00:27:39

it's not just the mortgage. It's not just

00:27:40 --> 00:27:41

it's it's actually that they feel like this

00:27:41 --> 00:27:43

is the best way they can make change.

00:27:43 --> 00:27:44

And they usually do,

00:27:46 --> 00:27:48

after a few small wins, suffer a massive

00:27:48 --> 00:27:51

loss, and they wonder about themselves. And, you

00:27:51 --> 00:27:52

know, my advice to them has been along

00:27:52 --> 00:27:55

similar lines. Right? Look. I mean, we need

00:27:55 --> 00:27:56

people there.

00:27:57 --> 00:27:59

You know, obviously, continue to be principled, continue

00:27:59 --> 00:28:02

to be unambiguous about where you stand, continue

00:28:02 --> 00:28:03

to try to push the needle,

00:28:04 --> 00:28:06

and to be wise and to be calculated,

00:28:06 --> 00:28:08

but

00:28:09 --> 00:28:11

to never forsake your courage or your conscience

00:28:11 --> 00:28:13

in the process of that. And it's a

00:28:13 --> 00:28:15

it's a rough conversation even from I mean,

00:28:15 --> 00:28:16

I come to it from a pastoral perspective.

00:28:16 --> 00:28:17

Right? Like,

00:28:18 --> 00:28:20

look, you're doing the right thing, I hope,

00:28:20 --> 00:28:22

based upon what you're saying to me. But,

00:28:23 --> 00:28:24

you know, at the same time,

00:28:25 --> 00:28:27

we need people on the outside as well.

00:28:27 --> 00:28:29

What does the outside look like? You started

00:28:29 --> 00:28:30

Tech for Palestine.

00:28:31 --> 00:28:33

There is a group,

00:28:33 --> 00:28:35

No Tech for Apartheid, which is not the

00:28:35 --> 00:28:37

same thing as Tech for Palestine, for those

00:28:37 --> 00:28:38

that might be wondering.

00:28:38 --> 00:28:41

Right? What does the outside look like for

00:28:41 --> 00:28:42

someone that works in tech, and how can

00:28:42 --> 00:28:45

they be effective on the outside if that's

00:28:45 --> 00:28:46

the if that's the route that they choose?

00:28:47 --> 00:28:49

So, fundamentally, tech is a sort of

00:28:49 --> 00:28:51

a way of thinking about the world.

00:28:51 --> 00:28:53

The there there is the industry, and there's

00:28:53 --> 00:28:55

there's an awful lot that people can do

00:28:56 --> 00:28:58

to call out what's happening in the industry,

00:28:58 --> 00:29:00

but it's also a place where

00:29:00 --> 00:29:01

you can build

00:29:01 --> 00:29:02

tech projects

00:29:03 --> 00:29:05

that actually help change things. So for example,

00:29:06 --> 00:29:07

I was on the I was on a

00:29:07 --> 00:29:09

call earlier with people who are working on

00:29:09 --> 00:29:10

this Blockout 2024 thing

00:29:11 --> 00:29:14

where, you know, they're they're they're helping influencers

00:29:14 --> 00:29:16

feel the the pain of not having spoken

00:29:16 --> 00:29:19

out for for Palestine, and they're helping automate

00:29:19 --> 00:29:20

that. You know, people are going out and

00:29:20 --> 00:29:23

they're unfollowing Kim Kardashian one at a time,

00:29:24 --> 00:29:25

and there's people out there who are building

00:29:25 --> 00:29:27

software to allow people to

00:29:27 --> 00:29:28

unfollow

00:29:28 --> 00:29:29

thousands,

00:29:29 --> 00:29:31

you know, hundreds of thousands of these

00:29:31 --> 00:29:32

influencers

00:29:32 --> 00:29:34

for for not speaking up. So it's it's

00:29:34 --> 00:29:35

a thing where

00:29:38 --> 00:29:39

where where people can have

00:29:40 --> 00:29:41

impact, the same,

00:29:42 --> 00:29:44

type of impact that makes tech a lucrative

00:29:45 --> 00:29:45

industry

00:29:46 --> 00:29:48

for you know, because of because of automation,

00:29:48 --> 00:29:51

because of AI. All these things can be

00:29:51 --> 00:29:52

can be used to

00:29:52 --> 00:29:54

to help Palestine and to fight against the

00:29:54 --> 00:29:57

system at the same time. So with that

00:29:57 --> 00:29:59

being said, what can the person who is

00:29:59 --> 00:30:01

not in tech do? You know, you've got

00:30:01 --> 00:30:02

thousands of people watching this.

00:30:03 --> 00:30:04

How do they get involved?

00:30:04 --> 00:30:06

Where's the hope in all of this? I

00:30:06 --> 00:30:08

mean, obviously, we've all been

00:30:08 --> 00:30:10

you know, you write that you can't sleep.

00:30:10 --> 00:30:11

None of us have been able to sleep.

00:30:11 --> 00:30:13

Right? Anyone with a moral conscience,

00:30:14 --> 00:30:16

immediately resonates with the words that you wrote,

00:30:17 --> 00:30:18

and they're looking for

00:30:18 --> 00:30:20

a silver lining. I mean, it's hard to

00:30:20 --> 00:30:22

say silver lining in the midst of a

00:30:22 --> 00:30:23

genocide, but how do

00:30:24 --> 00:30:26

we get involved? How can the average person

00:30:26 --> 00:30:29

be more conscious with what they purchase, with

00:30:29 --> 00:30:30

what they consume,

00:30:30 --> 00:30:32

how they use their time, how they involve themselves in

00:30:32 --> 00:30:34

something that is so foreign to them, being

00:30:34 --> 00:30:35

tech, right, when they're not when they're not

00:30:35 --> 00:30:37

in that world? I I think the main

00:30:37 --> 00:30:38

thing that,

00:30:40 --> 00:30:42

that that affects people and makes them think

00:30:42 --> 00:30:44

that they're that they're not able to do

00:30:44 --> 00:30:44

anything

00:30:45 --> 00:30:45

is

00:30:46 --> 00:30:48

that there's so much going on, and they

00:30:48 --> 00:30:50

see so many things that that they they

00:30:50 --> 00:30:52

feel overwhelmed and and helpless.

00:30:53 --> 00:30:54

And I think that that

00:30:55 --> 00:30:56

if people do wanna get involved,

00:30:57 --> 00:30:59

pick one thing that comes by,

00:31:00 --> 00:31:02

that that you see come past you and

00:31:02 --> 00:31:05

figure out how you can get more involved

00:31:05 --> 00:31:06

in that. So for example,

00:31:08 --> 00:31:10

you see things about Israel bonds sometimes.

00:31:11 --> 00:31:14

Israel bonds is is a thing where,

00:31:14 --> 00:31:17

people actually invest. And when I say people,

00:31:17 --> 00:31:17

I mean,

00:31:17 --> 00:31:18

often

00:31:19 --> 00:31:21

cities, municipalities,

00:31:21 --> 00:31:22

large companies,

00:31:23 --> 00:31:26

invest in Israel bonds in order to support

00:31:26 --> 00:31:28

the Israeli economy and also to to make

00:31:28 --> 00:31:29

money because they're investments.

00:31:30 --> 00:31:31

This is something that they can start to

00:31:31 --> 00:31:32

investigate

00:31:32 --> 00:31:34

in their locality,

00:31:34 --> 00:31:36

in whatever town you're in. Is there is

00:31:36 --> 00:31:38

there an investment that has been made by

00:31:38 --> 00:31:38

the municipality?

00:31:39 --> 00:31:41

Is there one in your state?

00:31:41 --> 00:31:44

This information is public. You can do

00:31:44 --> 00:31:45

similar investigations

00:31:46 --> 00:31:47

on your local politicians,

00:31:48 --> 00:31:50

on the companies that that are in your

00:31:50 --> 00:31:51

area.

00:31:51 --> 00:31:52

There's

00:31:53 --> 00:31:55

there's so many facets

00:31:55 --> 00:31:58

to how Israel has this control in the

00:31:58 --> 00:32:00

United States that it actually makes it really

00:32:00 --> 00:32:02

easy to get involved because you just have

00:32:02 --> 00:32:04

to pick one thing that they're doing

00:32:04 --> 00:32:06

and start to spend time on it, find

00:32:06 --> 00:32:08

the other people who are working on it,

00:32:09 --> 00:32:11

and get organized. Have you been able to

00:32:11 --> 00:32:12

work with anyone from Palestine?

00:32:13 --> 00:32:14

Have you been able to take, like, a

00:32:14 --> 00:32:17

young person from Gaza or from Palestine and

00:32:17 --> 00:32:18

sort of train them up, tech

00:32:19 --> 00:32:20

minds from there? I mean, could you share

00:32:20 --> 00:32:23

any stories about that or any particular individual

00:32:23 --> 00:32:24

that maybe stands out to you? So that

00:32:25 --> 00:32:27

that's not the sort of advocacy that we're

00:32:27 --> 00:32:28

doing at the moment.

00:32:29 --> 00:32:31

We are we are very much focused on,

00:32:32 --> 00:32:33

advocacy within the US,

00:32:34 --> 00:32:36

and and and in the west, but an

00:32:36 --> 00:32:37

awful lot of our

00:32:40 --> 00:32:42

volunteers and the people

00:32:42 --> 00:32:44

who set up Tech for Palestine, the the

00:32:44 --> 00:32:46

25 initial members were 80%

00:32:47 --> 00:32:49

Palestinian, Arab, or Muslim.

00:32:49 --> 00:32:52

So it's a very, you know, very heavy

00:32:52 --> 00:32:55

involvement in it. The the funny thing about,

00:32:55 --> 00:32:56

you know, about asking about,

00:32:57 --> 00:32:58

you know, helping Palestinians

00:32:59 --> 00:33:00

get technical

00:33:00 --> 00:33:04

is that Palestine is incredibly heavily technical.

00:33:05 --> 00:33:08

The there is a huge number of engineers,

00:33:09 --> 00:33:11

software engineers in particular in Palestine.

00:33:12 --> 00:33:13

As as you know, it's, you know, an

00:33:13 --> 00:33:15

incredibly well educated,

00:33:15 --> 00:33:18

workforce and and population, and a lot of

00:33:18 --> 00:33:21

that is within tech and within within software

00:33:21 --> 00:33:22

engineering.

00:33:22 --> 00:33:25

You know, you're Irish, which is, obviously,

00:33:26 --> 00:33:27

something that

00:33:27 --> 00:33:30

in some ways has to be formative to

00:33:30 --> 00:33:31

the way that you kinda enter into this

00:33:31 --> 00:33:32

arena. You know, I had the pleasure of

00:33:32 --> 00:33:35

going to Ireland, to Dublin, in fact,

00:33:35 --> 00:33:36

you know, back in February,

00:33:37 --> 00:33:39

And I was absolutely blown away by the

00:33:39 --> 00:33:39

solidarity,

00:33:40 --> 00:33:43

of the Irish people with Palestine. It's something

00:33:43 --> 00:33:44

you hear about. It's something you you you

00:33:44 --> 00:33:47

see, but it's another thing to experience that.

00:33:47 --> 00:33:50

And it was absolutely overwhelming. I think that

00:33:50 --> 00:33:51

the experience

00:33:52 --> 00:33:53

of Palestinian solidarity,

00:33:54 --> 00:33:56

finding that within our allies,

00:33:58 --> 00:34:00

this time around, as ugly as the genocide

00:34:00 --> 00:34:01

has been,

00:34:02 --> 00:34:04

as refreshing as the breadth of the allyship

00:34:04 --> 00:34:07

has been. We're seeing, obviously, Ireland and South

00:34:07 --> 00:34:09

Africa, but if you go to any encampment,

00:34:09 --> 00:34:11

you see a pretty large Jewish presence.

00:34:12 --> 00:34:14

You see a large presence of people across

00:34:14 --> 00:34:16

the board, Black Palestine solidarity.

00:34:17 --> 00:34:18

Do you see any hope,

00:34:19 --> 00:34:21

you know, in in sort of the breadth

00:34:21 --> 00:34:23

of the pro Palestine movement now and the

00:34:23 --> 00:34:26

broadening of the pro Palestine movement? Is that

00:34:26 --> 00:34:27

something that gives you

00:34:28 --> 00:34:31

hope in particular? Absolutely. And I feel that

00:34:31 --> 00:34:32

that it's barely even started.

00:34:34 --> 00:34:36

You know, we're we're seeing this in the

00:34:36 --> 00:34:36

encampments.

00:34:37 --> 00:34:37

But

00:34:38 --> 00:34:41

the vast majority of the people in America,

00:34:41 --> 00:34:42

for example,

00:34:42 --> 00:34:44

are are not aware of the other side

00:34:44 --> 00:34:47

of the narrative. They've only gotten what they've been

00:34:47 --> 00:34:47

fed

00:34:47 --> 00:34:48

for for decades.

00:34:49 --> 00:34:52

The, you know, the the anti Arab sentiment

00:34:52 --> 00:34:53

and and Islamophobia

00:34:53 --> 00:34:55

after 9/11, for example,

00:34:56 --> 00:34:57

they

00:34:58 --> 00:35:00

have very much been only fed the idea

00:35:00 --> 00:35:02

of Israel as our ally. Israel is the

00:35:02 --> 00:35:04

only democracy in the Middle East. We need

00:35:04 --> 00:35:06

to support them. We need to keep sending

00:35:06 --> 00:35:07

money their way.

00:35:08 --> 00:35:10

That's that's the narrative they've been fed. And

00:35:10 --> 00:35:13

as, you know, Gen Z, in particular, starts

00:35:13 --> 00:35:14

to

00:35:15 --> 00:35:17

starts to ask questions, they learn very quickly

00:35:17 --> 00:35:19

that that they're being lied to. And so

00:35:19 --> 00:35:20

when you see that

00:35:21 --> 00:35:24

narrative break open, as we are seeing now.

00:35:24 --> 00:35:26

And, you know, it is mainstream.

00:35:26 --> 00:35:27

Questioning

00:35:28 --> 00:35:30

our alliance with Israel is mainstream.

00:35:32 --> 00:35:34

Questioning whether we should be bombing children in

00:35:34 --> 00:35:36

Gaza, and in particular

00:35:36 --> 00:35:38

whether we should be spending our money on

00:35:38 --> 00:35:40

that instead of health care, is mainstream.

00:35:41 --> 00:35:43

You know, we're still at

00:35:43 --> 00:35:46

the very start of that movement

00:35:46 --> 00:35:48

becoming pervasive within the US.

00:35:48 --> 00:35:49

And

00:35:49 --> 00:35:50

the number of supporters

00:35:51 --> 00:35:51

that

00:35:53 --> 00:35:56

Israel has is actually incredibly low. The

00:35:56 --> 00:35:59

diehard Zionists in the US, I

00:35:59 --> 00:36:01

think, you probably have

00:36:02 --> 00:36:02

3 million

00:36:03 --> 00:36:06

Jewish Zionists, maybe 10 million Christian Zionists. It's, you

00:36:06 --> 00:36:08

know, it's not

00:36:09 --> 00:36:09

representative

00:36:10 --> 00:36:12

of a very large portion of the United

00:36:12 --> 00:36:13

States at all. They're just in

00:36:14 --> 00:36:17

pretty powerful positions, like the ADL, like AIPAC,

00:36:18 --> 00:36:19

like the tech companies.

00:36:20 --> 00:36:21

And as the broader

00:36:23 --> 00:36:24

population

00:36:25 --> 00:36:25

learns,

00:36:26 --> 00:36:27

as they are learning,

00:36:29 --> 00:36:30

then it's

00:36:30 --> 00:36:32

dramatically gonna change things. And you can see

00:36:32 --> 00:36:33

this in

00:36:33 --> 00:36:36

how heavy the suppression is coming

00:36:36 --> 00:36:38

down. The students are being attacked by

00:36:38 --> 00:36:41

paramilitary police. They're being beaten.

00:36:41 --> 00:36:42

TikTok

00:36:42 --> 00:36:44

is being banned,

00:36:45 --> 00:36:46

and, you know, Mitt Romney

00:36:46 --> 00:36:49

and Antony Blinken sat down in Arizona to

00:36:49 --> 00:36:51

tell us why that was, and it's because

00:36:51 --> 00:36:53

there were pro-Palestinian voices on it.

00:36:54 --> 00:36:56

The suppression that we're seeing

00:36:57 --> 00:36:57

is

00:36:58 --> 00:37:01

because it has the potential to

00:37:01 --> 00:37:03

bring down the whole system, frankly,

00:37:04 --> 00:37:07

and they are terrified. And that terror is

00:37:07 --> 00:37:07

because

00:37:07 --> 00:37:09

as people learn what's happening,

00:37:10 --> 00:37:13

they immediately say, "That's not right. Why are

00:37:13 --> 00:37:15

we supporting that?" So I wanna ask you

00:37:15 --> 00:37:17

a deeply personal question that you probably won't

00:37:17 --> 00:37:20

be asked on any podcast or in any

00:37:20 --> 00:37:20

interview.

00:37:22 --> 00:37:24

But, you know, one of the things that

00:37:24 --> 00:37:25

many have spoken about,

00:37:26 --> 00:37:28

not being Palestinian, not being Muslim, is that

00:37:28 --> 00:37:30

as horrified as they are

00:37:31 --> 00:37:34

by the genocide and our complicity in the

00:37:34 --> 00:37:34

genocide,

00:37:35 --> 00:37:37

they're also inspired by the resilience of the

00:37:37 --> 00:37:40

Palestinian people, and perhaps they see their faith,

00:37:40 --> 00:37:41

their

00:37:41 --> 00:37:44

courage as being generated by something deeper.

00:37:45 --> 00:37:47

What have you, as Paul Biggar, kind of

00:37:47 --> 00:37:49

learned about the Palestinian people,

00:37:49 --> 00:37:51

maybe even about Islam as a faith,

00:37:51 --> 00:37:54

by watching the resilience of the Palestinian people?

00:37:54 --> 00:37:56

How does that kinda factor into your own

00:37:56 --> 00:37:58

ability to take on

00:37:58 --> 00:38:00

these companies and

00:38:01 --> 00:38:02

to take this path that you've taken? Oh,

00:38:02 --> 00:38:04

that's an interesting question.

00:38:06 --> 00:38:06

Well,

00:38:07 --> 00:38:08

you know, I'll say that

00:38:08 --> 00:38:11

I knew very little about Palestine

00:38:11 --> 00:38:13

and very little about Islam when

00:38:14 --> 00:38:16

I started my journey, and I think

00:38:16 --> 00:38:17

that

00:38:17 --> 00:38:18

probably

00:38:19 --> 00:38:21

the narrative that I had was

00:38:21 --> 00:38:23

one that was fed by the same

00:38:24 --> 00:38:26

US system that's very

00:38:26 --> 00:38:27

Islamophobic.

00:38:27 --> 00:38:29

Then it's been really interesting to see,

00:38:30 --> 00:38:30

you know,

00:38:31 --> 00:38:31

how peaceful

00:38:32 --> 00:38:34

Islam is. That was

00:38:35 --> 00:38:36

not something that I'd been led to believe

00:38:36 --> 00:38:37

my entire life. And,

00:38:38 --> 00:38:39

you know, as we're

00:38:40 --> 00:38:42

breaking down the narrative, we're

00:38:42 --> 00:38:44

breaking down, you know, some of ours as

00:38:44 --> 00:38:45

well. And

00:38:46 --> 00:38:47

it seems extremely

00:38:48 --> 00:38:48

important

00:38:48 --> 00:38:50

to how the Palestinians,

00:38:51 --> 00:38:52

you know, the people undergoing

00:38:53 --> 00:38:55

the bombing in Gaza,

00:38:55 --> 00:38:57

continue their lives. You know, as a

00:38:57 --> 00:38:58

not quite lifelong, but

00:38:59 --> 00:39:01

most

00:39:01 --> 00:39:02

of my life, atheist,

00:39:03 --> 00:39:05

it's certainly an interesting

00:39:05 --> 00:39:07

lesson in how people can

00:39:07 --> 00:39:09

apply religion positively.

00:39:09 --> 00:39:11

I come from a country that is

00:39:13 --> 00:39:16

removing itself from a multi-hundred-year

00:39:17 --> 00:39:19

suppression by the Catholic Church.

00:39:20 --> 00:39:23

And so, you know, I've always been

00:39:23 --> 00:39:24

going in the other direction in my life,

00:39:24 --> 00:39:25

so it's very interesting to see people

00:39:25 --> 00:39:28

who are applying religion very positively.

00:39:28 --> 00:39:30

So 5, 10 years from now, let's

00:39:30 --> 00:39:32

take 10 years from now, and I'll kinda

00:39:32 --> 00:39:33

conclude with this.

00:39:34 --> 00:39:36

If you were to kinda look back 10

00:39:36 --> 00:39:38

years from now and say, we

00:39:39 --> 00:39:40

have

00:39:40 --> 00:39:41

succeeded,

00:39:42 --> 00:39:44

what does success look like in 10 years,

00:39:45 --> 00:39:47

you know, considering just how overwhelming

00:39:48 --> 00:39:49

and how far gone this all seems? Like,

00:39:49 --> 00:39:51

what do you hope this

00:39:51 --> 00:39:53

conversation looks like in 10 years,

00:39:54 --> 00:39:57

between you and me, and perhaps

00:39:57 --> 00:39:59

you and others that have embarked upon

00:39:59 --> 00:40:00

this mission

00:40:01 --> 00:40:01

to

00:40:02 --> 00:40:04

rein in tech? I mean, I

00:40:04 --> 00:40:05

think fundamentally,

00:40:05 --> 00:40:06

our mission,

00:40:07 --> 00:40:09

the mission of all of us in

00:40:09 --> 00:40:11

this, is to rein in

00:40:12 --> 00:40:14

quite a lot of the system. Right?

00:40:14 --> 00:40:16

We're trying to rein in tech. We're

00:40:16 --> 00:40:17

trying to rein in the US. We're trying

00:40:17 --> 00:40:19

to rein in Zionism, trying to rein in

00:40:19 --> 00:40:21

Israel, of course. If we have succeeded

00:40:21 --> 00:40:24

in 10 years, then things will,

00:40:24 --> 00:40:26

you know, have the potential to look dramatically

00:40:26 --> 00:40:26

different.

00:40:27 --> 00:40:30

A free Palestine, obviously, being one, but one

00:40:30 --> 00:40:33

where a lot of power is pulled

00:40:33 --> 00:40:34

out of tech and in particular pulled out

00:40:34 --> 00:40:36

of big tech, where they no longer have

00:40:36 --> 00:40:37

power

00:40:37 --> 00:40:41

to suppress content because it's politically advantageous

00:40:41 --> 00:40:41

to them.

00:40:42 --> 00:40:43

One where

00:40:44 --> 00:40:47

the war machine is not promoted and not

00:40:47 --> 00:40:47

funded,

00:40:48 --> 00:40:50

and one, you know, ultimately, where

00:40:51 --> 00:40:53

workers in

00:40:53 --> 00:40:56

tech have a lot more power

00:40:56 --> 00:40:57

to effect change,

00:40:58 --> 00:40:59

from the bottom.

00:41:00 --> 00:41:01

And where that's true of the United States

00:41:01 --> 00:41:05

in general, where the actual people who

00:41:05 --> 00:41:05

are

00:41:07 --> 00:41:07

dramatically

00:41:08 --> 00:41:11

pro-ceasefire, who are dramatically against bombing, who

00:41:11 --> 00:41:14

are dramatically against funding Israel, are listened

00:41:14 --> 00:41:15

to, not only in

00:41:16 --> 00:41:18

Middle Eastern foreign policy decisions,

00:41:19 --> 00:41:21

but also in the broad range of things

00:41:21 --> 00:41:24

in which we are ignored and in which

00:41:24 --> 00:41:26

the US system doesn't work to serve

00:41:26 --> 00:41:27

us.

00:41:27 --> 00:41:29

Paul, thank you so much for

00:41:30 --> 00:41:31

all that you've done and all that you

00:41:31 --> 00:41:33

continue to do and for taking the time

00:41:33 --> 00:41:34

out to be with us and to share

00:41:34 --> 00:41:35

your insights,

00:41:36 --> 00:41:37

to expose us to,

00:41:38 --> 00:41:40

you know, what we are deeply unfamiliar

00:41:40 --> 00:41:41

with

00:41:41 --> 00:41:43

at the process level, but, unfortunately, have been

00:41:43 --> 00:41:45

seeing at the outcome level. Thank you for

00:41:45 --> 00:41:47

your conscience, and thank you for your solidarity.

00:41:48 --> 00:41:50

And we look forward to having you on

00:41:50 --> 00:41:53

once again and continuing to struggle alongside you.

00:41:53 --> 00:41:54

Thank you so much for having me.
