Shadee Elmasry – NBF 167 All You Need to Know About Artificial Intelligence

Shadee Elmasry
Share Page

AI: Summary ©

The speakers emphasize the importance of following the Sunnah and of acknowledging errors even in the work of respected scholars. They then discuss artificial intelligence, including ChatGPT and machine learning, and the potential harm from AI, with personal data and deep fakes highlighted as threats. The speakers stress the importance of privacy laws and the need for people to stay up to date and adapt as technology shifts toward the "metaverse."

00:00:00 --> 00:00:04
			Love, he was so happy when Manuela
welcome everybody to the Safina
		
00:00:04 --> 00:00:09
			side to the Safina society live
stream, nothing but facts. I
		
00:00:09 --> 00:00:12
			totally butchered that this time
around what is it? Welcome
		
00:00:12 --> 00:00:16
			everybody to the nothing but
facts. What do I usually say?
		
00:00:18 --> 00:00:22
			Oh welcome everybody to this just
nothing but facts the Safina
		
00:00:22 --> 00:00:27
			society live stream. And where
when I say nothing but facts I
		
00:00:27 --> 00:00:28
			mean facts are
		
00:00:30 --> 00:00:35
			what's going to set us free from
conjecture. All right. But today
		
00:00:35 --> 00:00:39
			is finally we got some snow and
hamdulillah is a day in which
		
00:00:40 --> 00:00:44
			the snow has come down. It's not
sticking yet but it is coming down
		
00:00:44 --> 00:00:48
			in huge flakes. Such a gorgeous
thing. Such a romantic thing is
		
00:00:48 --> 00:00:51
			the snow. Right? You gotta love
the snow.
		
00:00:52 --> 00:00:57
			And on a Sunday, or a Monday
morning, or Monday afternoon, in
		
00:00:57 --> 00:00:58
			which
		
00:01:02 --> 00:01:04
			the month of Rajab had come in
last night.
		
00:01:06 --> 00:01:09
			And the month of Rajab is one of
those things where you wanted to
		
00:01:09 --> 00:01:12
			make sure you take advantage of
the DUA on that night because it
		
00:01:12 --> 00:01:18
is stated in many books of hadith
that the du'a of the servants ('ibad)
		
00:01:19 --> 00:01:22
on the first night of Rajab is
maqbul (accepted).
		
00:01:23 --> 00:01:24
			Next announcement is that
		
00:01:26 --> 00:01:29
			we're shifting things a little
bit. We're moving Shama'il to Mondays
		
00:01:29 --> 00:01:32
			because that's what makes sense.
She meant it to be on Mondays
		
00:01:32 --> 00:01:36
because everything that is sirah-
related
		
00:01:39 --> 00:01:42
			is to be done on Mondays. Just
because the Prophet sallallahu
		
00:01:42 --> 00:01:44
			alayhi wa sallam was asked should
I fast on Monday Prophet said
		
00:01:44 --> 00:01:47
			Would it to fee so therefore
things related to the prophets
		
00:01:47 --> 00:01:52
			lights on them happen on Monday,
we're on the age of the Prophet
		
00:01:52 --> 00:01:56
			Cindy Rasulullah sallallahu alayhi
wa salam Babel, my geography
		
00:01:56 --> 00:01:59
			Sydney Rasulullah sallallahu
alayhi wa sallam. That's going to
		
00:01:59 --> 00:02:03
			be segment one segment two. We're
going to have Maureen coming on to
		
00:02:03 --> 00:02:08
			talk about AI and it updates on
AI. What are the facts on the
		
00:02:08 --> 00:02:13
			ground? On a what do we know? As
investigators and police officers
		
00:02:13 --> 00:02:14
			like to say
		
00:02:15 --> 00:02:18
			investigators so what do we know?
What are the facts? And then what
		
00:02:18 --> 00:02:23
			are the concerns or the theories
surrounding these facts? So
		
00:02:23 --> 00:02:25
			that'll be segment two, segment
three, we'll take your questions
		
00:02:25 --> 00:02:28
			and answers as usual on a Monday
and that'll take us right up to
		
00:02:28 --> 00:02:31
			three o'clock. So let's get going
straight up. Babble my Jaffe,
		
00:02:31 --> 00:02:35
			Sydney rasool Allah He sallallahu
alayhi wa alayhi wa sallam
		
00:02:36 --> 00:02:41
			had nothing to do with pneumonia,
her death in a row. Ivanova had
		
00:02:41 --> 00:02:46
			done as Korea is Huck had death
and now I'm in the dinar. And if
		
00:02:46 --> 00:02:51
			nobis kala Mecca Nabil sal Allahu
Allahu alayhi wa sallam, we met
		
00:02:51 --> 00:02:57
			Cata filata Asha Rata Senate and
you have la heat within Medina to
		
00:02:57 --> 00:03:00
			ashram. What will fear Well, what
happened with the left? And was it
		
00:03:00 --> 00:03:05
			Dean, this is the dominant opinion
on the matter. Because remember,
		
00:03:05 --> 00:03:08
			in those days, people didn't
always keep track of things.
		
00:03:09 --> 00:03:12
			People kept track we know when the
Prophet was born because Allah
		
00:03:12 --> 00:03:17
			made it in on the year in which a
massive event with Toto occurred
		
00:03:17 --> 00:03:23
			which is a feat. And, and so we
know that and then they used to
		
00:03:23 --> 00:03:26
			count how many months or whatever
everyone would count differently,
		
00:03:26 --> 00:03:31
			how many winters whatever, people,
people would count, well, that
		
00:03:31 --> 00:03:34
			actually wouldn't work in the
lunar calendar. Right. So you
		
00:03:34 --> 00:03:40
			would just know that year and how
many modems have passed or things
		
00:03:40 --> 00:03:40
			like that?
		
00:03:42 --> 00:03:42
			Yep.
		
00:03:44 --> 00:03:48
			i Some of you are putting
questions up on Instagram. I will
		
00:03:48 --> 00:03:53
			take the questions after we finish
this segment. And between getting
		
00:03:53 --> 00:03:56
			more in on I'll take some
questions and then after Maureen
		
00:03:56 --> 00:04:00
			will take questions again. Maureen
is the guest he's a person who has
		
00:04:00 --> 00:04:02
			a great interest in AI and he's
going to talk to us about it
		
00:04:07 --> 00:04:08
			have fun with that.
		
00:04:10 --> 00:04:14
			had done a Muhammad a bit of a
shot Davina Muhammad Ali jofra on
		
00:04:14 --> 00:04:19
			Shaba and Abby is hop on I'm gonna
miss out. And Jared and Anwar
		
00:04:19 --> 00:04:24
			here. And now Samia. Some Yahoo
Yakubu Carla Murthy. Rasulullah
		
00:04:24 --> 00:04:28
			sallallahu alayhi wa alayhi wa
sallam oho Nutella thing was it
		
00:04:28 --> 00:04:33
			Dean? Will Abu Bakr Omar well and
even with an earthen was a teen.
		
00:04:33 --> 00:04:37
			So Abu Bakr and Omar also died at
the same age as the prophets of
		
00:04:37 --> 00:04:39
Allah, except that
Abu Bakr was two years
		
00:04:39 --> 00:04:40
			younger
		
00:04:41 --> 00:04:44
			than the prophet and he died two
years after the Prophet said no
		
00:04:44 --> 00:04:48
			Ahmad was 13 years younger than
the prophets of Allah who sent
		
00:04:48 --> 00:04:53
them and passed away a bit over 13
years after him. Sayyiduna
		
00:04:53 --> 00:04:57
Uthman, however, lived a much
longer life.
		
00:04:58 --> 00:04:59
			He lived into his 80s
		
00:05:05 --> 00:05:09
			Next Hadith Hadith Anna Hussein
Abu Mahdi al bacillary had death
		
00:05:09 --> 00:05:14
			and Abdul Razak, an oblate,
Eurasian and his ovary and orwa
		
00:05:14 --> 00:05:15
			Ana Aisha
		
00:05:18 --> 00:05:21
			and then the via sallallahu alayhi
wa. He was salam Amato hope
		
00:05:21 --> 00:05:24
			Nutella and was it Dina Sena so
why are they bringing the same
		
00:05:24 --> 00:05:28
			narration over and over different
sayings the same thing showing you
		
00:05:28 --> 00:05:30
			that many have said this
		
00:05:31 --> 00:05:34
			not just one person who have said
this was said this or two people
		
00:05:34 --> 00:05:39
			many, many people have said this.
Okay, and that's the point here
		
00:05:39 --> 00:05:40
			many have said this.
		
00:05:42 --> 00:05:45
			Why does not have made a bit of
money I Ananya COVID and a
		
00:05:45 --> 00:05:50
			behemoth, a Dota fleet got up had
otherness eliminar Alia and Carla
		
00:05:50 --> 00:05:55
			didn't have that and burner Amar
Mola, Benny Hashem, Carla, Samia
		
00:05:55 --> 00:05:59
			to ignite best in your code to
fear Rasulullah sallallahu alayhi
		
00:05:59 --> 00:06:02
wa sallam wa huwa ibnu khamsin
wa sittin. So Ibn
		
00:06:02 --> 00:06:09
Abbas says 65. And Ibn Abbas is
the Turjuman al-Qur'an. He is the
		
00:06:09 --> 00:06:13
			chief interpreter of the Quran,
from the companions of the prophet
		
00:06:13 --> 00:06:18
			named him that Tata joumana Al
Quran, the translator meaning the
		
00:06:18 --> 00:06:21
			interpreter of the Quran, yet
nonetheless, on this matter, we
		
00:06:21 --> 00:06:23
			say that he's not the one who's
right.
		
00:06:24 --> 00:06:30
			Okay. And the one that the one
that the answer is right, is that
		
00:06:30 --> 00:06:32
			he was 63 years of age, not 65.
		
00:06:34 --> 00:06:37
			Somebody can be the most noble
person, you know, they could make
		
00:06:37 --> 00:06:42
			a mistake, they could say
something wrong. Right? A chef a
		
00:06:42 --> 00:06:47
			honored ematic right or wrong. He
then went on to author a whole
		
00:06:47 --> 00:06:51
			different philosophy of Islamic
law will soon be completely
		
00:06:51 --> 00:06:54
			different. Or I should say
methodology philosophies a bit
		
00:06:54 --> 00:06:58
			different, but the methodology of
deriving rulings completely
		
00:06:58 --> 00:07:01
			different. Was that supposed to be
like offensive or something? No.
		
00:07:03 --> 00:07:07
			Some say that he waited until
Malik passed away first. So that
		
00:07:07 --> 00:07:09
he could maintain the adab and not
hurt his feelings.
		
00:07:11 --> 00:07:17
			Or, or or not affect magic in that
way. Maybe, maybe not. But there
		
00:07:17 --> 00:07:18
			is nothing wrong with that.
		
00:07:20 --> 00:07:26
			So nobody should. So when people
have see you, and that shade has
		
00:07:26 --> 00:07:29
			an opinion, and you have a better
one, and you're qualified now Chef
		
00:07:29 --> 00:07:33
			a was qualified, then you feel
free to formulate it. I was just
		
00:07:33 --> 00:07:36
talking to a shaykh of mine. And he
said that Murabit al-Hajj,
		
00:07:36 --> 00:07:40
one of his students wrote up
a response to what Murabit
		
00:07:40 --> 00:07:43
al-Hajj had said, and we were talking
about how there are some people
		
00:07:43 --> 00:07:47
			that have an ideological
groupthink, they will never, that
		
00:07:47 --> 00:07:51
one of their shuyukh will say
something that's out of bounds. It
		
00:07:51 --> 00:07:53
			happens because he's a human
being. Nobody will say anything.
		
00:07:55 --> 00:07:58
			Right? That's a problem right
there. That means that all your
		
00:07:58 --> 00:08:02
other refutations are insincere.
All your other refutations,
		
00:08:03 --> 00:08:06
			the sincerity is not yet there. If
you're truly sincere to the book
		
00:08:06 --> 00:08:09
			and the Sunnah, that's what we
follow. That's it, then you have
		
00:08:09 --> 00:08:10
			to admit when you're wrong.
		
00:08:11 --> 00:08:13
You have to admit when your
shaykh is wrong.
		
00:08:15 --> 00:08:18
			Because the book and the Sunnah,
if the book and the Sunnah can be
		
00:08:18 --> 00:08:21
known, then the accuracy of your
shuyukh can be known, right or wrong.
		
00:08:22 --> 00:08:26
You never follow a shaykh because,
oh, he has a secret source of
		
00:08:26 --> 00:08:30
			knowledge. That's nonsense,
always. Okay, legislation of
		
00:08:30 --> 00:08:34
aqidah and fiqh, political
matters, worldly matters is never
		
00:08:34 --> 00:08:38
			ever going to be based upon
somebody has a secret source of
		
00:08:38 --> 00:08:42
			knowledge, right? Oh, he's got he
sees visions that we don't know.
		
00:08:42 --> 00:08:46
			He knows the future that we don't
know. So he has wisdoms that we
		
00:08:46 --> 00:08:50
			don't know. That will that may be
the case, but it will never be the
		
00:08:50 --> 00:08:54
			source of our legislation. It will
never be the source of a fatwa;
		
00:08:54 --> 00:08:56
			it will never be the source of a
political action.
		
00:08:57 --> 00:09:01
			Okay, the stuff will never be the
source. We have to understand
		
00:09:01 --> 00:09:06
			this. Secondly, if we can know the
Quran and the Sunnah and the
		
00:09:06 --> 00:09:09
Shari'ah, then we can know the
accuracy of our own shuyukh.
		
00:09:09 --> 00:09:13
			Eventually you will come to that
point, right.
		
00:09:14 --> 00:09:18
			You can study with somebody and
the beginning you have view you
		
00:09:18 --> 00:09:22
			really have no knowledge at all,
he is the only source of knowledge
		
00:09:22 --> 00:09:23
			that you have.
		
00:09:24 --> 00:09:29
			Okay, you can bring in more in
whenever you are moyen can even
		
00:09:29 --> 00:09:31
			pitch into this discussion before
we get to the AI.
		
00:09:33 --> 00:09:39
			Good. Okay, good. No problem. So,
we're 20 minutes out until we have
		
00:09:39 --> 00:09:42
			the AI discussion. But for now,
this concept is extremely
		
00:09:42 --> 00:09:46
			important in Islam. The difference
between a student and a disciple,
		
00:09:46 --> 00:09:50
			a student will eventually come to
learn the sources of his teacher.
		
00:09:51 --> 00:09:55
			The student will eventually come
to learn and even if he advances
		
00:09:56 --> 00:10:00
			will be able to assess his teacher
but the
		
00:10:00 --> 00:10:03
			disciple always remains his head
bowed and he never looks at his
		
00:10:03 --> 00:10:06
			teacher. That's the difference.
That's the difference between
		
00:10:06 --> 00:10:10
			student and disciple when we're
studying aqidah and fiqh, anyone who
		
00:10:10 --> 00:10:12
			advances can look at the
		
00:10:14 --> 00:10:15
			at the
		
00:10:16 --> 00:10:20
			methodology of his teacher and the
conclusion and make an assessment
		
00:10:20 --> 00:10:24
			and that does not decrease from
the teachers rank will not one
		
00:10:24 --> 00:10:30
			iota what decreases of his rank if
is is if he a does a moral wrong
		
00:10:30 --> 00:10:33
			like he does something haram for
example, or decreases from his
		
00:10:33 --> 00:10:37
			rank if another situation where he
becomes arrogant and refuses to
		
00:10:37 --> 00:10:41
			admit his error? Or a decrease in
his rank? If he deviates from
		
00:10:41 --> 00:10:45
			something that is what we call
qat'i, something that should never
		
00:10:45 --> 00:10:50
			even be a question. Then we'd say
yeah, okay, that's something that
		
00:10:50 --> 00:10:50
			we
		
00:10:51 --> 00:10:56
			that's a deviation. But for for
you ever have a teacher and that
		
00:10:56 --> 00:11:00
			teacher taught you everything you
know about a certain subject and
		
00:11:00 --> 00:11:02
			you advance to another teacher to
another teacher? And then you come
		
00:11:02 --> 00:11:04
			back to your original teacher and
say, Okay, well, he actually made
		
00:11:04 --> 00:11:08
			a mistake here that how does that
decrease from his rank? Did you
		
00:11:08 --> 00:11:10
			take him as a teacher because he
was ma'sum (infallible)?
		
00:11:11 --> 00:11:13
			there are groups out there
		
00:11:14 --> 00:11:18
			that refute that the the right and
the wrong is not determined by the
		
00:11:18 --> 00:11:21
			book and the Sunnah is really
determined by the their teacher,
		
00:11:21 --> 00:11:24
			and that's a problem. And that's
where you get you have to close
		
00:11:24 --> 00:11:25
yourself off from the Ummah.
		
00:11:27 --> 00:11:32
			And anytime you see, even a Sunni
their concept of a Sunni cult,
		
00:11:33 --> 00:11:37
			yes, they're upon the Sunnah, but
they don't they the treatment they
		
00:11:37 --> 00:11:41
			give to their teacher versus the
rest of the Ummah, forces them to
		
00:11:41 --> 00:11:44
			cut off everyone else, and they
have to cut off everybody else.
		
00:11:45 --> 00:11:49
			Right. That is that type of
ideological cultish behavior, that
		
00:11:49 --> 00:11:55
			is a problem. Okay, a major
problem. Right? So we all have to
		
00:11:55 --> 00:11:58
			realize when you take on a
teacher, you do not take him on as
		
00:11:58 --> 00:11:59
being ma'sum.
		
00:12:01 --> 00:12:04
			You take him on because he's
willing to teach you he went out
		
00:12:04 --> 00:12:08
			of his way, he learned something
that you didn't learn and
		
00:12:09 --> 00:12:15
			benefited you. And you owe them
respect for life. Unless he
		
00:12:15 --> 00:12:19
			deviates, completely goes crazy.
But otherwise, if you make some
		
00:12:19 --> 00:12:22
			mistake on a small matter, a
political decision, a practical
		
00:12:22 --> 00:12:27
			decision, or he just made a
mistake in fiqh, or even in aqidah, in
		
00:12:27 --> 00:12:27
the furu'.
		
00:12:29 --> 00:12:32
			No, so what people make mistakes.
		
00:12:33 --> 00:12:34
			And as he said,
		
00:12:36 --> 00:12:39
and the shaykh said that Murabit
al-Hajj is strict on being
		
00:12:39 --> 00:12:41
			respected by his
		
00:12:43 --> 00:12:47
			students, yet one of our students
wrote a lot. He wrote a response
		
00:12:48 --> 00:12:50
saying that the fatwa of my shaykh was
wrong.
		
00:12:51 --> 00:12:56
			That doesn't take anything away
from her all and told me that shit
		
00:12:56 --> 00:12:57
			that chicken McFeely.
		
00:12:59 --> 00:13:04
			And chicks do hair. They go at it
all the time on photo, on machine
		
00:13:04 --> 00:13:07
			slaughter chicken, they just
differ all the time on machine
		
00:13:07 --> 00:13:08
			slaughter chicken.
		
00:13:09 --> 00:13:12
			Except for all the time on the
subject, what's wrong with that?
		
00:13:14 --> 00:13:16
			And one goes, Okay, good. There's
more chicken for me. And the other
		
00:13:16 --> 00:13:19
			one goes, enjoy your Mehta.
They're even joking about it.
		
00:13:19 --> 00:13:23
			Because it's a matter of fun. It's
a matter of speculation, you're
		
00:13:23 --> 00:13:27
			thinking, if I'm thinking and
you're thinking, then my thought,
		
00:13:27 --> 00:13:31
			and my methodology is correct, and
your methodology is correct. And
		
00:13:31 --> 00:13:35
			you arrive at different
conclusions. Obviously, some
		
00:13:35 --> 00:13:38
			perceptions are going to be
different along the way. So you're
		
00:13:38 --> 00:13:40
			going to arrive at different
conclusions. You're not
		
00:13:40 --> 00:13:41
			blameworthy.
		
00:13:43 --> 00:13:46
			And we're saying all of this you
wonder why we're saying this.
		
00:13:46 --> 00:13:48
			We're saying this because here we
have Ibn Abbas saying the
		
00:13:48 --> 00:13:52
			Prophet was 65 Previous to that
say that it's just that he was 63.
		
00:13:53 --> 00:13:57
			And another one of my best is 63.
So what gets what probably
		
00:13:57 --> 00:13:58
			happened?
		
00:13:59 --> 00:14:00
Ibn Abbas probably changed his
position.
		
00:14:02 --> 00:14:05
When you have two narrations from
Ibn Abbas, one saying 65, one saying 63,
		
00:14:06 --> 00:14:10
then without a doubt he changed
his position, obviously change his
		
00:14:10 --> 00:14:13
			position, right. It's okay to
change your position.
		
00:14:15 --> 00:14:17
			Wayne, is it alright let's finish
this chapter real quick. I don't
		
00:14:17 --> 00:14:21
			know how many of you have a shot,
or how many have an event called
		
00:14:21 --> 00:14:25
			her death and I'm while ignition
had Devaney Abby. Katahdin has an
		
00:14:25 --> 00:14:30
			N double Velib Ambala and then
absl Allahu alayhi wa sallam will
		
00:14:30 --> 00:14:34
			be the one who comes in with city.
There you go. Another one Sahabi
		
00:14:34 --> 00:14:35
			said 65
		
00:14:36 --> 00:14:37
			Good.
		
00:14:39 --> 00:14:40
			Let's see the next one.
		
00:14:42 --> 00:14:45
			On NS of numeric and spend a lot
of time with the messenger peace
		
00:14:45 --> 00:14:47
			be upon him and live long after
him so we can gather information.
		
00:14:48 --> 00:14:51
			What does he say? Canada sal
Allahu Allahu alayhi wa sallam
		
00:14:51 --> 00:14:54
			laced with the wheel baton. The
Prophet peace be upon him was not
		
00:14:54 --> 00:14:59
			too tall will ever proceed and he
wasn't sure it will be available M
		
00:14:59 --> 00:15:00
			Huck, nor was
		
00:15:00 --> 00:15:04
			See pale white will eyeball Adam
nor was he black like very dark
		
00:15:05 --> 00:15:09
			willable judge of cotton, so his
skin was between the two. So
		
00:15:09 --> 00:15:12
			whatever two sides you have, he
was in the middle. His hair
		
00:15:12 --> 00:15:16
			neither was very curly willable
was cept nor was it perfectly
		
00:15:16 --> 00:15:21
			straight. Botha hello to Allah
Allah Rossi Arbaeen Asana, Allah
		
00:15:21 --> 00:15:25
			sent him while he was 40 years old
at the beginning of his 40th year
		
00:15:25 --> 00:15:29
			for a comma v McCosh. Listening.
He stayed in Mecca for for for 10
		
00:15:29 --> 00:15:34
			years. 10 years we've been Medina,
Tasha Sydney and 10 years in
		
00:15:34 --> 00:15:37
			Medina whatsoever who La La
assisted Tina center. So he
		
00:15:37 --> 00:15:39
			concluded 60 years What did NS
Miss there three years of the
		
00:15:39 --> 00:15:45
			secret Dawa. Right? What a Sufi
ROTC he will adhere to Yash Runa
		
00:15:45 --> 00:15:48
			charlatan, Badal there was not in
his hair or his beard.
		
00:15:49 --> 00:15:53
			More than 20 white hairs less than
20 the number of white hairs in
		
00:15:53 --> 00:15:55
			his hair and his beard was less
than 20.
		
00:15:56 --> 00:15:56
			Okay.
		
00:15:58 --> 00:16:01
			So what do we have about that
Ennis and Malik? Giving another
		
00:16:01 --> 00:16:06
			number and what do we say about
that? That's a mistake he made he
		
00:16:06 --> 00:16:09
			missed the three years of secret
doll that was just to the family.
		
00:16:11 --> 00:16:12
			Okay.
		
00:16:13 --> 00:16:15
			isn't nothing wrong with saying
your shoe has made a mistake?
		
00:16:16 --> 00:16:18
			If we're here make saying as a
hobby.
		
00:16:19 --> 00:16:24
			So hobby, the Sahaba are not their
rank, because everything they ever
		
00:16:24 --> 00:16:28
			said was right. The Sahaba are not
what they are because every single
		
00:16:30 --> 00:16:35
			thing they said or action they
took was correct. It was not
		
00:16:35 --> 00:16:37
			because they lived in the desert
there's a harbor or not no,
		
00:16:37 --> 00:16:40
			because they lived in the desert
because they were hungry because
		
00:16:40 --> 00:16:43
			that no it's because of their sit
and there is to come on the path
		
00:16:43 --> 00:16:47
			of the prophets of Allah he was on
them that's what they were. That's
		
00:16:47 --> 00:16:50
			what they were upon. So therefore
says okay, no, that's his
		
00:16:50 --> 00:16:53
			narration that's not what we're
going by. If you go to automotive
		
00:16:53 --> 00:16:57
			NAB that as is what he told to.
		
00:16:59 --> 00:17:02
			Who was it? What he all might have
been Abdulaziz
		
00:17:04 --> 00:17:06
			Subhanallah His name is keeping my
mind
		
00:17:07 --> 00:17:07
			secret.
		
00:17:11 --> 00:17:13
			What was his name? Subhan. Allah,
I don't know how I'm skipping his
		
00:17:13 --> 00:17:15
			name because I haven't eaten. So
		
00:17:16 --> 00:17:19
Umar ibn Abd al-Aziz, it was he
who said: gather the hadith of
		
00:17:19 --> 00:17:22
the Prophet; leave off the
strictness of Ibn
		
00:17:23 --> 00:17:27
Umar, the leniency of Ibn
Abbas, and the odd statements of
		
00:17:27 --> 00:17:27
Ibn Mas'ud.
		
00:17:29 --> 00:17:29
			Okay.
		
00:17:32 --> 00:17:36
			Spatola Mohammed bin Muslim,
what's he known by? How's it
		
00:17:36 --> 00:17:38
			skipping my mind right now?
Muhammad have been Muslim.
		
00:17:39 --> 00:17:41
			Okay, anyway, skipping my mind.
		
00:17:43 --> 00:17:46
			That's what he said about the
Sahaba about what their fatawa
		
00:17:46 --> 00:17:50
			they had fatawa they had states
that had fatawa on matters that
		
00:17:50 --> 00:17:50
			were
		
00:17:51 --> 00:17:52
			up for discussion.
		
00:17:54 --> 00:17:56
			So so they concluded and he's a
Tevye
		
00:17:58 --> 00:18:02
			and he's saying, this one he is,
was too strict, this one is too
		
00:18:02 --> 00:18:06
			lenient, strict and lenient
relative to what relative to the
		
00:18:06 --> 00:18:11
			opinion of the people of the other
scholars. Right, not strict and
		
00:18:11 --> 00:18:12
			lenient relative to like some
culture.
		
00:18:14 --> 00:18:18
			So if they're saying that about
the Sahaba, and that's not
		
00:18:18 --> 00:18:22
			anywhere near a wrong action for
them to do, they're saying about
		
00:18:22 --> 00:18:26
			the judgments. So then what about
your ulama? When they say
		
00:18:26 --> 00:18:29
			something that should be altered,
it should be corrected and 99% of
		
00:18:29 --> 00:18:34
the Ummah is on one wavelength, and your
shaykh on this wavelength over here?
		
00:18:35 --> 00:18:38
			If you think that you follow him
because he's ma'sum, you've got
		
00:18:38 --> 00:18:41
			issues. Alright, let's go to
segment two before we answer that
		
00:18:41 --> 00:18:45
			what I just said about the secret
Dawa means that the Tao of the
		
00:18:45 --> 00:18:47
			Prophet sallallahu alayhi wa
sallam for the first period of
		
00:18:47 --> 00:18:52
			time was just his family and who
he selected. It was not open
		
00:18:52 --> 00:18:56
			announcement to everybody. Okay,
so he was gathering first. The
		
00:18:56 --> 00:19:00
			prophets of Allah what He was was
gathering first, the initial core
		
00:19:00 --> 00:19:00
			of Sahaba
		
00:19:01 --> 00:19:02
			All right.
		
00:19:03 --> 00:19:08
			Okay, is there some big deal here?
Kevin Lee, do we know him? UFC
		
00:19:08 --> 00:19:12
			fighter entered Islam Very good.
Kevin Lee since being public about
		
00:19:12 --> 00:19:15
			my conversion as a Muslim. I've
had a lot of people reach out to
		
00:19:15 --> 00:19:18
			give support to the message and
calls and I feel the love Allah
		
00:19:18 --> 00:19:21
			always had a plan. And I'm glad
I'm on the right path. And he is
		
00:19:21 --> 00:19:26
			friends with Phil Ross is Zabi who
was a UFC fighter.
		
00:19:28 --> 00:19:29
			convert to
		
00:19:30 --> 00:19:30
			Islam
		
00:19:32 --> 00:19:36
			really accepted my place in life
as a Muslim and just there and
		
00:19:36 --> 00:19:41
			alone kind of brought me into the
brotherhood of speaking with more
		
00:19:41 --> 00:19:43
			Muslims and and
		
00:19:45 --> 00:19:48
			me forming like more of a bond
with these people
		
00:19:50 --> 00:19:56
			have accepted Islam, right. I've
converted over to Islam, and
		
00:19:56 --> 00:19:59
			really accepted my place in life
as a Muslim and
		
00:20:00 --> 00:20:01
			Just there and alone. No.
		
00:20:02 --> 00:20:08
			All right, very good. That's nice.
Next let's go to Moines. We have
		
00:20:08 --> 00:20:09
			Wayne here today.
		
00:20:11 --> 00:20:13
			It is good. Wayne is
		
00:20:15 --> 00:20:18
			looking there sort of like he's
coming out of Star Wars.
		
00:20:20 --> 00:20:23
			This is technical raise your mic
online. Sound like can you hear
		
00:20:23 --> 00:20:27
			me? Well, I will say that we got
you. Yes. All right. My Europe
		
00:20:27 --> 00:20:33
			talk to us. What do you think what
is going on? In terms of AI?
		
00:20:33 --> 00:20:36
			First, I would like to ask you a
question the facts about AI.
		
00:20:37 --> 00:20:41
Ibn Shihab! Thank you, Mohammed. I'm
sorry, Ryan had
		
00:20:42 --> 00:20:46
the name that was given was the
famous legend of his time, Ibn Shihab
		
00:20:46 --> 00:20:51
al-Zuhri, Muhammad ibn Muslim, to whom
Umar ibn Abd al-Aziz said:
		
00:20:51 --> 00:20:56
			gather as the Sunnah. Leave off
the excess. The excess strictness
		
00:20:56 --> 00:21:00
of Ibn Umar, the leniency of Ibn
Abbas, and the odd statements of
		
00:21:00 --> 00:21:04
Ibn Mas'ud. So he's talking about their
fatawa their judgments, and he is
		
00:21:04 --> 00:21:08
			assessing their judgments. Oh, how
can he assess their judgments?
		
00:21:08 --> 00:21:12
			Because we're a religion of
sources, textual sources, which we
		
00:21:12 --> 00:21:15
			can all understand. If you
understand them, well, you can
		
00:21:15 --> 00:21:17
			eventually come to even assess
your teacher.
		
00:21:18 --> 00:21:21
			So that's nothing wrong with that
in our religion, we don't honor
		
00:21:21 --> 00:21:25
			people because they're sinless, or
they never make mistakes in
		
00:21:25 --> 00:21:25
			academia.
		
00:21:27 --> 00:21:31
			We honor people by their piety,
their longevity, right, things
		
00:21:31 --> 00:21:36
			like that. Alright, so let's now
move on to back to AI. So first, I
		
00:21:36 --> 00:21:38
			would like to ask you my the
facts.
		
00:21:39 --> 00:21:41
			Number two, the concerns. Alright.
		
00:21:43 --> 00:21:45
			I will administrate the energy
Bismillah R Rahman r Rahim,
		
00:21:45 --> 00:21:49
			Allahumma salli ala Sayyidina,
Muhammad Ali, Salam Bismillah, the
		
00:21:49 --> 00:21:53
			loaded MassMutual invalid. And I
also knew that him. So to get
		
00:21:53 --> 00:21:54
			started,
		
00:21:55 --> 00:21:59
			I think you've been brought on to
talk about AI. Right. But AI is
		
00:21:59 --> 00:22:04
			broad, right? It's like trying to
bring on a doctor to talk about,
		
00:22:04 --> 00:22:08
			you know, medicine in general,
there's a lot in medicine. So we
		
00:22:08 --> 00:22:12
			have to dive deeper, right. And
I'm not an expert on AI. But I do
		
00:22:12 --> 00:22:13
			work with
		
00:22:15 --> 00:22:18
			technology. And I've been a
technologist, I suppose you could
		
00:22:18 --> 00:22:23
			say for most of my career. And
there's a lot of chatter that's
		
00:22:23 --> 00:22:26
			been going on about AI over the
last
		
00:22:27 --> 00:22:30
			six, I mean, in the tech world has
been going on for a while. But in
		
00:22:30 --> 00:22:34
			the lay people community who are
not aware it's been going on for
		
00:22:34 --> 00:22:38
			the last few months, especially
with the advent of chatbot, or
		
00:22:38 --> 00:22:43
ChatGPT, which was announced by
OpenAI back in December. And
		
00:22:43 --> 00:22:46
			this was a free this is an was a
free tool that's open to the
		
00:22:46 --> 00:22:49
			public. And it really shocked
people, especially people, I've
		
00:22:49 --> 00:22:50
			never seen
		
00:22:52 --> 00:22:56
			what some of the advances in
technology have been over the
		
00:22:56 --> 00:22:56
			last,
		
00:22:58 --> 00:23:01
			you know, 15 years, if you're
coming on to this, and all of a
		
00:23:01 --> 00:23:04
sudden you log on to ChatGPT,
or you log on to Midjourney,
		
00:23:04 --> 00:23:07
			which I'll talk about in a bit,
where you log on to any of these
		
00:23:07 --> 00:23:11
			newer AI generation tools as we
can say,
		
00:23:12 --> 00:23:18
			it's surprising, and it's shocking
to a lot of people. Now, your next
		
00:23:18 --> 00:23:23
			question was, you know, what are
the facts? Okay, so the facts are
		
00:23:23 --> 00:23:29
			that I'm very hesitant on saying
that AI is something that we
		
00:23:29 --> 00:23:33
			should be worried about. But
simultaneously, I'm also hesitant
		
00:23:33 --> 00:23:36
			on saying it's something that we
shouldn't be worried about.
		
00:23:36 --> 00:23:38
			Because at the end of the day,
right, Allah subhanaw taala is the
		
00:23:38 --> 00:23:41
			one who controls all things. And
even the invention and creation of
		
00:23:41 --> 00:23:44
			AI is from humans. And you know,
Allah subhanaw taala has allowed
		
00:23:44 --> 00:23:47
			this to happen, right? So it's,
it's there in the world. So now we
		
00:23:47 --> 00:23:48
			need to contend with it right.
		
00:23:49 --> 00:23:54
			Now, how do we need to worry?
Right? Because there's two types
		
00:23:54 --> 00:23:58
			of worries one is sort of this
existential type worry, what's
		
00:23:58 --> 00:24:01
			going to happen to me, what's
going to happen to me if my family
		
00:24:01 --> 00:24:07
			look, even in a nuclear, you know,
war, if you're meant to be taken
		
00:24:07 --> 00:24:09
			care of, you're going to be taken
care of. So let's let's not even
		
00:24:09 --> 00:24:12
			discuss the possibility of what's
going to happen to you the answer
		
00:24:12 --> 00:24:16
			is probably nothing. Right? You
know, even in the worst of times,
		
00:24:16 --> 00:24:18
			if nothing's gonna happen to you,
nothing's gonna happen to you,
		
00:24:18 --> 00:24:21
			right? So if Allah has nothing to
happen to you, nothing will happen
		
00:24:21 --> 00:24:24
			to you. So let's just leave that
off the discussion. Right? Let's
		
00:24:24 --> 00:24:28
			assume that nothing will happen to
you. Now let's talk about what are
		
00:24:28 --> 00:24:33
			the practical things that may be
impacted in your personal life and
		
00:24:33 --> 00:24:39
			with your family and the world
with the growing use of artificial
		
00:24:39 --> 00:24:42
			intelligence technologies. So I'll
stop there and then see if you
		
00:24:42 --> 00:24:46
			want to ask any questions, I just
want you to separate between AI
		
00:24:46 --> 00:24:47
			and self learning.
		
00:24:49 --> 00:24:54
			So AI is a broad category. And
there's there's a subcategory of
		
00:24:54 --> 00:24:59
			when we say AI, it's it's an easy
to use term because what is art
		
00:25:00 --> 00:25:03
artificial intelligence, right?
Because intelligence if we're
		
00:25:03 --> 00:25:08
			defining it is the ability to be
able to process and think and make
		
00:25:08 --> 00:25:12
			conclusions, right. So for
example, is a computer intelligent
		
00:25:12 --> 00:25:16
			the way that a human being is? Not
yet, right?
		
00:25:17 --> 00:25:24
			Because we would say that a human
being is able to reason at a farm
		
00:25:24 --> 00:25:29
			at a much more complex level right
at a at a capability that is far
		
00:25:29 --> 00:25:35
			further than a computer. Now,
someone what may say, and this is
		
00:25:35 --> 00:25:39
			the belief of many people, such as
Sam Harris, who believe that human
		
00:25:39 --> 00:25:43
			beings are just new neurons being
fired. And it's all just chemical
		
00:25:43 --> 00:25:47
			reactions, right, there is no
separation between a human being
		
00:25:47 --> 00:25:52
			and a machine, all you need to do
is map all of the data points and
		
00:25:52 --> 00:25:55
			beliefs, beliefs and behaviors
that a human being has. And you
		
00:25:55 --> 00:25:59
			can all map it all back to data,
and you could simulate a human
		
00:25:59 --> 00:26:01
			being right. This is the
		
00:26:02 --> 00:26:07
			belief amongst most materialist
naturalist, especially in the AI
		
00:26:07 --> 00:26:09
			community. So that's why it's
called artificial intelligence,
		
00:26:09 --> 00:26:12
			because there's this, the
overarching terms is that
		
00:26:12 --> 00:26:14
			eventually we'll reach this point
of intelligence where we can
		
00:26:14 --> 00:26:18
			essentially simulate a human
being. Now it's artificial, in
		
00:26:18 --> 00:26:23
			that it's not real intelligence,
right. And So machine learning is
		
00:26:23 --> 00:26:27
			a subcategory within artificial
intelligence, machine learning is
		
00:26:27 --> 00:26:31
			different types of algorithms that
are used in order to
		
00:26:33 --> 00:26:36
			do AI, you can say, so, for
example,
		
00:26:37 --> 00:26:39
			there's different types of
algorithms out there and machine
		
00:26:39 --> 00:26:42
			learning algorithms. We don't need
to go into all the nuances, that
		
00:26:42 --> 00:26:43
			doesn't really matter.
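To make the distinction concrete, here is a minimal illustrative sketch, entirely hypothetical and not something the speakers reference: a rule whose threshold is hand-coded by a programmer versus one that is fit ("learned") from a few labeled examples, which is the basic idea behind the machine learning algorithms mentioned above.

```python
# Minimal illustration of "machine learning" vs. a hand-coded rule.
# Hypothetical toy data: (number_of_links_in_message, is_spam) pairs.
examples = [(0, False), (1, False), (2, False), (5, True), (7, True), (9, True)]

# Hand-coded rule: a programmer guesses a threshold, never looking at data.
def hand_coded_rule(num_links):
    return num_links > 10

# "Learned" rule: pick the threshold that best separates the labeled examples.
def learn_threshold(data):
    best_t, best_correct = 0, -1
    for t in range(0, 11):
        correct = sum((links > t) == label for links, label in data)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

threshold = learn_threshold(examples)

def learned_rule(num_links):
    return num_links > threshold

print("learned threshold:", threshold)
print("hand-coded says spam for 6 links:", hand_coded_rule(6))
print("learned rule says spam for 6 links:", learned_rule(6))
```

The learned rule changes automatically if the examples change; the hand-coded one does not.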
		
00:26:45 --> 00:26:48
			But people use those techniques
and algorithms in order to do
		
00:26:48 --> 00:26:51
			certain things. Now, someone might
say, well, what's the difference
		
00:26:51 --> 00:26:54
			between regular programming and
machine learning? Isn't it just
		
00:26:55 --> 00:26:58
			advanced? advanced programming?
Well, yes, of course. It's just
		
00:26:58 --> 00:27:03
			advanced programming. But that's
as that's almost as simple as
		
00:27:03 --> 00:27:04
			saying, What's the difference
between,
		
00:27:05 --> 00:27:10
			you know, bits on a, you know,
motherboard and, you know, going
		
00:27:10 --> 00:27:12
			on to Gmail, it's like, well,
they're, yeah, they're both
		
00:27:12 --> 00:27:16
			programming. But one is obviously,
something that has much more
		
00:27:16 --> 00:27:19
			impact in the real world and is
far more complicated and complex,
		
00:27:19 --> 00:27:24
			right? So it's not the same thing.
So saying that, Oh, AI is just
		
00:27:24 --> 00:27:27
			advanced programming. It's like,
well, yeah, sure. So is the cell
		
00:27:27 --> 00:27:32
			phone, right. But it obviously
makes a difference. So let me let
		
00:27:32 --> 00:27:36
			me look at it this way. So chat
GPT.
		
00:27:37 --> 00:27:39
Made by OpenAI.
		
00:27:40 --> 00:27:45
Yes, made by OpenAI, which is one of
Elon Musk's companies. Okay. Yeah.
		
00:27:46 --> 00:27:50
			That's irrelevant. I'm just
pointing that out. They have a
		
00:27:50 --> 00:27:54
			database of information. Okay,
where we got that we don't know.
		
00:27:54 --> 00:27:59
			Okay. And they've basically been
able to program this thing to
		
00:27:59 --> 00:28:02
			simulate human writing.
		
00:28:03 --> 00:28:05
			Correct. So
		
00:28:06 --> 00:28:08
			AI is simply
		
00:28:10 --> 00:28:13
			mimicking, basically what it sees
already.
		
00:28:14 --> 00:28:15
			And finding solutions,
		
00:28:16 --> 00:28:21
			or gathering the data and then
fashioning it together. So Chat
		
00:28:21 --> 00:28:25
GPT is one level up from
Google, essentially. Because
		
00:28:25 --> 00:28:28
			Google, you can see the
information, you have to put it
		
00:28:28 --> 00:28:32
			together yourself. You see the
information a lot slower at the
		
00:28:32 --> 00:28:36
			pace of you clicking and reading,
right, you then have to synthesize
		
00:28:36 --> 00:28:41
			information on your own. So chat,
GPT took this one level up. So
		
00:28:41 --> 00:28:46
			that's why it is in fact, a Google
killer. Really, because it now is
		
00:28:46 --> 00:28:50
			gathering far more information
than you could have ever. And it's
		
00:28:50 --> 00:28:53
			synthesizing it in a way that you
could read. And you don't have to
		
00:28:53 --> 00:28:56
			worry, right? So
		
00:28:57 --> 00:29:00
			if Facebook arguments Twitter
arguments caused a lot of people
		
00:29:00 --> 00:29:05
			to study, right, a lot of people
would not read or study it unless
		
00:29:05 --> 00:29:08
			someone bothered them with a post.
They go then they read the
		
00:29:08 --> 00:29:13
			research, they come back with a
really fancy answer. Okay. But
		
00:29:13 --> 00:29:15
			they had to research it
themselves. They had to synthesize
		
00:29:15 --> 00:29:18
			it themselves. GPT now has done
that for them.
		
00:29:19 --> 00:29:22
			Is that that's an accurate summary
of what's going on. Right.
		
00:29:24 --> 00:29:28
			It's accurate in that it does
synthesize information, whether
		
00:29:28 --> 00:29:34
			that information so that
information is accurate or not. It
		
00:29:35 --> 00:29:41
			but I can filter it. I could say
only tell me what Encyclopedia
		
00:29:41 --> 00:29:45
			Britannica it says about
rhinoceroses, right, I could do
		
00:29:45 --> 00:29:49
that on ChatGPT. Yes, you can
filter stuff, yeah, depending. So
		
00:29:49 --> 00:29:51
			it doesn't right now it doesn't
have access to the internet,
		
00:29:52 --> 00:29:54
			right. It doesn't have access, it
has access to what they fed it
		
00:29:54 --> 00:29:58
			right. Which is correct. And it's
been it's been trained essentially
		
00:29:58 --> 00:30:00
			on the internet and massive
amounts of data.
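A toy sketch of the contrast being described, assuming nothing about how ChatGPT actually works internally: a search step that only returns matching documents from a fixed corpus, next to a "synthesis" step that stitches the matches into a single answer. The corpus, query, and function names here are invented for illustration.

```python
# Toy contrast between "search" (return matching documents) and "synthesize"
# (combine the matches into one answer). Corpus and query are made up.
corpus = {
    "doc1": "Rhinoceroses are large herbivores with thick skin.",
    "doc2": "The white rhinoceros is the largest living rhino species.",
    "doc3": "Foxes are small omnivorous mammals.",
}

def search(query, docs):
    """Search-engine style: return the documents that mention the query term."""
    q = query.lower()
    return [doc_id for doc_id, text in docs.items() if q in text.lower()]

def synthesize(query, docs):
    """Chatbot style: pull the matching sentences together into one response."""
    hits = [docs[d] for d in search(query, docs)]
    if not hits:
        return f"No stored text mentions '{query}'."
    return " ".join(hits)

print(search("rhino", corpus))      # the reader still has to open and read each result
print(synthesize("rhino", corpus))  # one combined answer, less work for the reader
```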
		
00:30:00 --> 00:30:05
			All right, yeah. So one thing that
when we look at every AI, we have
		
00:30:05 --> 00:30:08
			to ask the methodology of the AI,
and the resource, the sources, the
		
00:30:08 --> 00:30:12
			sourcing and the method. So with
ChatGPT, we don't know what the
		
00:30:12 --> 00:30:19
			founders deemed knowledge worthy
of feeding this thing. Because if
		
00:30:19 --> 00:30:23
			if I came in, I got a reliable
source that said, Here, we're
		
00:30:23 --> 00:30:26
			going to put all of the let's say,
Shafi'i fiqh,
		
00:30:28 --> 00:30:33
			in in a in a database, and no
human being came out and said,
		
00:30:33 --> 00:30:37
Yes, we got the Shafi'i fiqh, it
was reliable. We tested it, we
		
00:30:37 --> 00:30:41
			made sure that the PDFs, and the
books and everything were
		
00:30:41 --> 00:30:46
			reliable. Here it is. Now you can
go use it as a search engine.
		
00:30:46 --> 00:30:49
			Wonderful. We all accept that. And
scholars have been using these
		
00:30:49 --> 00:30:52
			search engines forever. Okay.
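A hedged sketch of the point about sources: only human-verified texts are allowed into the collection an assistant answers from, and a query can be restricted to one named source. The source names and functions below are hypothetical.

```python
# Hypothetical "vetted corpus" gatekeeper: only texts whose source is on an
# approved list (verified by people, as described above) get indexed.
APPROVED_SOURCES = {"encyclopedia_britannica", "verified_shafii_texts"}

vetted_corpus = []

def add_document(text, source):
    """Index a document only if its source has been human-verified."""
    if source not in APPROVED_SOURCES:
        raise ValueError(f"Source '{source}' has not been vetted; refusing to index.")
    vetted_corpus.append({"source": source, "text": text})

def answer_from(source, keyword):
    """Answer using only one named source, like 'only tell me what X says.'"""
    hits = [d["text"] for d in vetted_corpus
            if d["source"] == source and keyword.lower() in d["text"].lower()]
    return hits or [f"Nothing from {source} mentions '{keyword}'."]

add_document("A rhinoceros is a large, thick-skinned herbivore.", "encyclopedia_britannica")
print(answer_from("encyclopedia_britannica", "rhinoceros"))
```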
		
00:30:53 --> 00:30:54
			You all hear the construction?
		
00:30:55 --> 00:31:02
			It's very light. Okay, good. So
all that this AI does, is gather
		
00:31:02 --> 00:31:05
			that information faster than you
ever gathered it, and then
		
00:31:05 --> 00:31:09
			synthesizes it now here's Now my
next question. What is the
		
00:31:09 --> 00:31:12
			methodology of synthesizing? So if
I said,
		
00:31:14 --> 00:31:17
to ChatGPT, or to my hypothetical
		
00:31:18 --> 00:31:26
example of a Shafi'i chatbot: what
things break wudu in
		
00:31:26 --> 00:31:30
the Shafi'i school, according to
the dominant opinion of al-Nawawi
		
00:31:30 --> 00:31:31
and al-Rafi'i?
		
00:31:32 --> 00:31:32
			Okay,
		
00:31:33 --> 00:31:37
			it's gonna go to No, it have a,
what if I didn't put that? What if
		
00:31:37 --> 00:31:40
			I didn't put a filter? So that's
where I'm asking the methods we
		
00:31:40 --> 00:31:45
			need to know, for every chatbot,
the methodology, the sources, and
		
00:31:45 --> 00:31:50
			then the methodology of how it's
synthesizing things. So this
		
00:31:50 --> 00:31:54
			depends a little bit on how on the
model that it's been trained on,
		
00:31:54 --> 00:31:55
			right.
		
00:31:56 --> 00:31:59
			So for example, you mentioned you
know, a specific book is not or
		
00:31:59 --> 00:32:02
			specific text is not in this
model, then it won't be able to
		
00:32:02 --> 00:32:04
			reference it, it's just not
something that it will be able to
		
00:32:04 --> 00:32:08
do. Now, ChatGPT is known to do
something called hallucination,
		
00:32:08 --> 00:32:11
			right, which is that it will
imagine that that thing is there.
		
00:32:12 --> 00:32:14
			And it will just hallucinate what
that answer is, and give it back
		
00:32:14 --> 00:32:19
			to you. Right and other models. I
know. Google has one that they've
		
00:32:19 --> 00:32:23
			done, that might come out called
Sparrow. Sorry, not Google.
		
00:32:24 --> 00:32:27
			DeepMind has one called Sparrow,
and a Google has one called
		
00:32:27 --> 00:32:30
LaMDA. And they also do similar
sorts of hallucinations. Right.
		
00:32:30 --> 00:32:35
			And this is a problem that AI is
the AI community is dealing with,
		
00:32:35 --> 00:32:40
			right? It's how do you deal with
this hallucination aspect of how
		
00:32:40 --> 00:32:43
			this information is presented?
Because the way it really presents
		
00:32:43 --> 00:32:47
			information is it reads character
by character, right? Okay. I read
		
00:32:47 --> 00:32:51
			the letter A, and then I match it
up to what that A is supposed to
		
00:32:51 --> 00:32:54
			mean. And then I and I understand
from that, what is the next letter
		
00:32:54 --> 00:32:56
			that comes and then from that it
understands what a word is, it's
		
00:32:56 --> 00:32:59
			similar to how a human being would
understand the word and then based
		
00:32:59 --> 00:33:03
			on all that, it's now able to go
look up. Oh, so let's say it's,
		
00:33:03 --> 00:33:08
			it's the sentence, the fox jumped
over the log, the fox jumped over
		
00:33:08 --> 00:33:12
			the log, right? It reads the Fox,
and then it goes and tries to
		
00:33:12 --> 00:33:15
			understand what is a fox, it looks
in its database of all these
		
00:33:15 --> 00:33:17
			things. And then from that, it's
like, okay, the fox jumped. Okay,
		
00:33:17 --> 00:33:21
			what does jumped me in, so then it
attaches it back to that Fox. And
		
00:33:21 --> 00:33:24
			so this is how it would construct
this entire understanding of this
		
00:33:24 --> 00:33:28
			sentence. And similarly, if you
asked it, let's say, in chef, a
		
00:33:28 --> 00:33:32
			fit, I want this thing. It's going
to understand that question based
		
00:33:32 --> 00:33:35
			on those words, and then it'll go
try to look up based on its
		
00:33:35 --> 00:33:38
			understanding, now it may
understanding correctly, that's
		
00:33:38 --> 00:33:38
			why
		
00:33:39 --> 00:33:45
			in the modern iteration of these
intelligent these models,
		
00:33:46 --> 00:33:49
			you have to be pretty specific in
what you're looking for. Because
		
00:33:49 --> 00:33:52
			otherwise, if you're not, you're
going to cause you know it to
		
00:33:52 --> 00:33:55
			hallucinate or give you something
that's exactly that's why you need
		
00:33:55 --> 00:33:58
to know the methodology: the
sources and the method of synthesizing.
		
00:33:59 --> 00:34:03
			Right? The the methodology that
you can add filters, that's really
		
00:34:03 --> 00:34:08
great. Yeah. But besides that,
the sources are the other question.
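As a rough illustration of the one-step-at-a-time generation and hallucination described above (production models predict tokens rather than single characters, but the idea is similar), here is a toy next-word generator trained on a made-up sentence; when it has nothing learned for a word, it still keeps emitting fluent-looking output.

```python
import random

# Toy next-word generator. Real models predict tokens, not whole words,
# but the one-step-at-a-time idea is the same. The corpus is made up.
corpus = "the fox jumped over the log and the fox ran into the woods".split()

# Count which word tends to follow which (a tiny bigram table).
follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=8, seed=0):
    """Emit one word at a time, each chosen only from what followed it in training."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:          # nothing learned: there is no real answer here,
            options = corpus     # yet the generator keeps producing fluent-looking words
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))
```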
		
00:34:09 --> 00:34:14
			So, right now Chatbot is just like
a general somebody's saying my
		
00:34:14 --> 00:34:18
			thesis here. Oh, there you go. Now
we got Zen, we got the guy we had
		
00:34:18 --> 00:34:21
			now we have the real guy, you
know, he's the he's the guy who
		
00:34:21 --> 00:34:24
			actually could talk about the
details of how these things work.
		
00:34:24 --> 00:34:24
			That's your mic.
		
00:34:27 --> 00:34:27
			Okay.
		
00:34:29 --> 00:34:34
			So that's how that's how these
things work. All that's fun. Let's
		
00:34:34 --> 00:34:35
			go into the science fiction stuff.
		
00:34:38 --> 00:34:41
			Ai develops and grows and starts
to manage bigger and bigger
		
00:34:41 --> 00:34:46
			things. But there has to always be
a manual override to AI right or
		
00:34:46 --> 00:34:48
			wrong. Okay.
		
00:34:49 --> 00:34:54
			Oh, the science fiction is part of
this thing asked the question. Can
		
00:34:54 --> 00:34:57
			AI override the manual override?
		
00:34:58 --> 00:34:59
			Can it realize that this is a
stopgap
		
00:35:00 --> 00:35:05
			I write and override that manual
override. And then you have all
		
00:35:05 --> 00:35:09
			your Netflix movies after that. Is
that something that's pure
		
00:35:09 --> 00:35:11
			fiction, or is it even possible?
		
00:35:13 --> 00:35:19
			At the moment, so that first line,
can you go? Is he allowed? Okay?
		
00:35:20 --> 00:35:25
Right now, it's not feasible. But
at some point,
		
00:35:26 --> 00:35:30
			I can see some scenarios where it
will become, it may become
		
00:35:30 --> 00:35:35
			plausible. So, so right now it is
not even a scenario not even a
		
00:35:35 --> 00:35:38
			scenario. Okay. So cancel right
away. So right away, yes. Good.
		
00:35:39 --> 00:35:40
			Now, next question is,
		
00:35:42 --> 00:35:46
			by the way, just for, can we be
introducing the fees as well, just
		
00:35:46 --> 00:35:50
			so that the viewers know, some of
these is out of Harvard? What did
		
00:35:50 --> 00:35:55
			you do in Harvard? I did research
with artificial intelligence and
		
00:35:55 --> 00:35:59
			computational biology. Okay. As
for what program an MA or PhD,
		
00:36:00 --> 00:36:03
			that is my postdoctoral work.
postdoctoral, postdoctoral. So
		
00:36:03 --> 00:36:08
			after a PhD, this is one of these.
Yes. He's a guy. He's the guy you
		
00:36:08 --> 00:36:09
			want to bring on? Yes.
		
00:36:11 --> 00:36:17
			Oh, Frenchie. Yes. Princeton. So
we got postdoc at Princeton and AI
		
00:36:17 --> 00:36:22
			part of our crew. We got postdoc
from Harvard and our crew. So we
		
00:36:22 --> 00:36:26
			got to mesh a lot really good crew
here. And so in the fees, so where
		
00:36:26 --> 00:36:30
			did you do doc? I stayed in
university. Okay. But then you
		
00:36:30 --> 00:36:32
			went to Harvard and your study and
you said,
		
00:36:33 --> 00:36:37
			What conclusions you have that
could benefit regular people from
		
00:36:37 --> 00:36:40
			my postdoctoral work? Yeah. So
basically, I worked
		
00:36:42 --> 00:36:48
			with CRISPR, which is genome
editing. So you can you can genome
		
00:36:48 --> 00:36:52
			editing with genome editing, you
can change people's DNA, with cell
		
00:36:53 --> 00:36:53
			in yourself.
		
00:36:56 --> 00:36:59
			Like you were like before they're
born. Now after they're born. So
		
00:36:59 --> 00:37:04
			now, yeah, now. So mainly, mainly,
we were focused on diseased cells.
		
00:37:04 --> 00:37:08
			So suppose for leukemia, you might
say might have seen the news
		
00:37:08 --> 00:37:13
recently, that there was a girl of
seven years old, okay. She was
		
00:37:13 --> 00:37:19
			terminal with leukemia. Then they
used the genome editing, to change
		
00:37:19 --> 00:37:24
only the DNA part that was
responsible for that leukemia. Now
		
00:37:24 --> 00:37:28
			she is completely cancer free,
amazing. Without any radiation
		
00:37:28 --> 00:37:33
without any chemo. I think the
technique is called base editing. Base
		
00:37:33 --> 00:37:36
			editing everyone here in this,
their audience going through it.
		
00:37:37 --> 00:37:42
			Yeah. So Paula, that is amazing.
Okay, keep going. Yeah. So my work
		
00:37:42 --> 00:37:46
			was basically how we can use these
latest AI works to make genome
		
00:37:46 --> 00:37:49
			editing more efficient, where
people are gonna go and make
		
00:37:49 --> 00:37:51
themselves redheads one day and
blondes the next day.
		
00:37:52 --> 00:37:56
			How does that work? Yeah. So
there. So there's, there's the
		
00:37:56 --> 00:37:59
			ethics part can come in? Well,
yeah, because I'm going straight
		
00:37:59 --> 00:38:02
			to human instincts. Okay, I'm
better now. Let's use this
		
00:38:02 --> 00:38:08
			technology for a consumer purpose
for just a personal Yeah, whimsy
		
00:38:08 --> 00:38:11
			whimsical purpose. Let me make
myself blue out today. Yeah. So
		
00:38:11 --> 00:38:14
			there's basically so it's a big
discussion in the scientific
		
00:38:14 --> 00:38:18
			community. Yeah. So for example,
you can go into a woman's body,
		
00:38:18 --> 00:38:24
			and while she is pregnant, and
change the embryos, DNA to make it
		
00:38:24 --> 00:38:28
			not susceptible to some diseases.
So the discussion is whether we
		
00:38:28 --> 00:38:32
			should even do it or not, for
example, a Chinese scientist who
		
00:38:32 --> 00:38:40
			went rogue, and he changed the
baby's DNA, while they were still
		
00:38:40 --> 00:38:42
			in the in their mother's body.
		
00:38:43 --> 00:38:48
			And it sparked a big controversy.
And later that scientists like it
		
00:38:48 --> 00:38:51
			really vanished because it was so
controversial. The Chinese
		
00:38:51 --> 00:38:54
			government like they disappeared,
disappeared.
		
00:38:55 --> 00:39:00
			Okay, can you tell me something?
What did he change? Gender? No. So
		
00:39:00 --> 00:39:04
			it was I forgot the specific
disease, but there's, there's,
		
00:39:05 --> 00:39:08
			oh, it's for their health for
their health. So okay, so So the
		
00:39:08 --> 00:39:12
			specific part is responsible for
the disease. So he went there and
		
00:39:12 --> 00:39:17
			change that. But the thing is, you
don't know if changing that
		
00:39:17 --> 00:39:23
			particular part permanently, would
render what side effects later in
		
00:39:23 --> 00:39:25
			their lives the problem? So that's
where the ethics you got to do
		
00:39:25 --> 00:39:30
			this on lambs and monkeys and cats
and dogs. Yeah. And even that's an
		
00:39:30 --> 00:39:32
			ethical question, right? Even
that's an ethical question, and
		
00:39:32 --> 00:39:37
			also, that they don't even always
transfer to human. Of course, it
		
00:39:37 --> 00:39:41
			doesn't even transfer. And now is
the mapping of genetics complete.
		
00:39:42 --> 00:39:46
			The genome is complete so you know
exactly where the DNA is for
		
00:39:46 --> 00:39:51
			nails, for hair for skin. For
bony. Many of the phenotypic, we
		
00:39:51 --> 00:39:55
			call them phenotypic traits. We
know like what are there okay, so
		
00:39:55 --> 00:39:59
			that makes the editing the
possibilities are endless,
		
00:39:59 --> 00:39:59
			endless. If
		
00:40:00 --> 00:40:03
			Okay, and you could do this, what
do you mean the dude while their
		
00:40:03 --> 00:40:07
			person is alive, that means you
transform their hair color, their
		
00:40:07 --> 00:40:08
			hair thickness?
		
00:40:09 --> 00:40:15
			I just skin color. Yeah. I'm not
sure if you can change it. While
		
00:40:16 --> 00:40:19
			they have already developed those
like they have reached a certain
		
00:40:19 --> 00:40:24
			age. I'm not sure what that okay.
Now how does this connect now to
		
00:40:24 --> 00:40:30
			AI now? So this connects, because
when you're when you're sending
		
00:40:30 --> 00:40:33
			this particular we call it a
delivery vehicle. Yep, that
		
00:40:34 --> 00:40:39
			the human being takes in, and then
it goes into the cell. And it has
		
00:40:41 --> 00:40:45
			it has kind of a signature with
which it understands where to go
		
00:40:45 --> 00:40:48
			and attach to the DNA. Yeah,
within those 3 billion letters.
		
00:40:48 --> 00:40:50
			Okay, yeah. And
		
00:40:51 --> 00:40:55
			so what happens is, it's not
really foolproof. So sometimes,
		
00:40:56 --> 00:40:59
			most of the time, it will have off
target effect. Like it will also
		
00:40:59 --> 00:41:02
			change some other parts of the
data, it's always the problem,
		
00:41:02 --> 00:41:08
			right? That's always the problem.
So we were trying to find out if
		
00:41:08 --> 00:41:12
			through machine learning, we can
design the sequences in such a way
		
00:41:12 --> 00:41:16
that they won't do this off-
target editing; they will just
		
00:41:16 --> 00:41:20
stick to that one particular area.
So curing cancer in the future
		
00:41:20 --> 00:41:24
			will have nothing to do with
clinical testing. And everything
		
00:41:24 --> 00:41:29
			to do with DNA AI. Right. Yeah. I
mean, that's what it sounds like,
		
00:41:29 --> 00:41:32
			the idea have come and come in and
walk and let me look at you and
		
00:41:32 --> 00:41:36
			let me test your temperature. All
that stuff is extremely
		
00:41:36 --> 00:41:39
			rudimentary and go straight to the
cause now, and now they have the
		
00:41:39 --> 00:41:41
			sorts of even more advanced
techniques where you can also
		
00:41:41 --> 00:41:47
			insert sequences. Yeah. into the,
into your own DNA. Yeah. So what
		
00:41:47 --> 00:41:51
			they're doing is basically
training these large models, just
		
00:41:51 --> 00:41:55
			like they're trained jadibooti
They are training it on the DNA
		
00:41:55 --> 00:41:58
			sequence 3 billion letters. And
then they are trying to generate
		
00:41:58 --> 00:42:03
			sequences with which you can if
you can insert them into your DNA,
		
00:42:03 --> 00:42:06
			you can permanently change from
the the state or things like
		
00:42:06 --> 00:42:11
			those. Okay, okay, good. So these
are all the like functional and,
		
00:42:12 --> 00:42:18
			and uses of AI that are not
consumer based? No, this is only
		
00:42:18 --> 00:42:24
			within a specific field, right? So
we now let's shift over, let's
		
00:42:24 --> 00:42:25
			shift back to the consumer.
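Before the conversation shifts to the consumer side, here is a toy sketch of the off-target idea just described: scoring candidate guide sequences by how many places in a genome they could plausibly bind. Real pipelines use machine learning models trained on experimental data; this exact-matching toy and its sequences are invented purely for illustration.

```python
# Toy off-target check for candidate guide sequences. The genome string and
# guides here are tiny made-up examples; real pipelines use ML models trained
# on experimental data, not simple string matching.
genome = "ATCGGGCTAAATCGGGCTTTGGCCATATCGAAATTTCCCGGGAAATTTGCGTACGTAGCT"

def count_near_matches(guide, genome, max_mismatches=1):
    """Count genome positions the guide could bind, allowing a few mismatches."""
    hits = 0
    for i in range(len(genome) - len(guide) + 1):
        window = genome[i:i + len(guide)]
        mismatches = sum(a != b for a, b in zip(guide, window))
        if mismatches <= max_mismatches:
            hits += 1
    return hits

candidates = ["ATCGGGCT", "GCGTACGT", "AAATTTCC"]
for guide in candidates:
    hits = count_near_matches(guide, genome)
    # 1 hit = binds only its intended site; more hits = likely off-target edits
    print(guide, "near-matches:", hits)
```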
		
00:42:26 --> 00:42:29
			In your view, what is the number
one thing that a regular person
		
00:42:29 --> 00:42:33
			needs to be aware of a heads up on
how life is going to change with
		
00:42:33 --> 00:42:33
			AI?
		
00:42:35 --> 00:42:39
Regular everyday life. Look, when
2007 came, the smartphone
		
00:42:39 --> 00:42:40
			came around.
		
00:42:41 --> 00:42:44
			And then six months later, apps
were downloadable from the
		
00:42:44 --> 00:42:48
			smartphone. And amongst them were
social media apps, then the world
		
00:42:48 --> 00:42:53
			changed in one year. Right? Before
that a big revolution was do
		
00:42:53 --> 00:42:56
			YouTube. Anybody could broadcast
themselves to the internet,
		
00:42:56 --> 00:43:01
			through YouTube. That was 2005.
These are massive technological
		
00:43:01 --> 00:43:06
			jumps. I mean, it's not even an
invention is just the development
		
00:43:06 --> 00:43:10
			of a technology. A jump, that many
people weren't aware of that life
		
00:43:10 --> 00:43:14
			would change drastically because
of this. So if you want to if I'm
		
00:43:14 --> 00:43:16
a regular guy, and I don't want to get caught off
		
00:43:16 --> 00:43:18
			guard like I was last time with
the smartphone.
		
00:43:20 --> 00:43:24
			Give me the lowdown. What should I
expect? How's it going to change
		
00:43:24 --> 00:43:24
			life?
		
00:43:25 --> 00:43:29
			I think immediately right now what
people will see that lots of
		
00:43:29 --> 00:43:31
			things will become much easier.
		
00:43:32 --> 00:43:35
That's what they will see. Like writing an email: you just have to
		
00:43:35 --> 00:43:40
			tell it I want to reply to this
email with this sentiment in such
		
00:43:40 --> 00:43:43
			a way that this person doesn't get
offended even though I am
		
00:43:45 --> 00:43:48
replying negatively to him. And then it will generate this email for you.
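As a rough sketch of that email step: the instruction itself carries the sentiment constraint, and the model call below is a stand-in for whichever chat API is actually used, not any specific product's interface:

```python
# Sketch only: 'chat_model' is any callable that maps a prompt to a completion.
def draft_reply(original_email: str, chat_model) -> str:
    prompt = (
        "Reply to the email below. The answer is no, but phrase it politely "
        "so the sender is not offended.\n\n" + original_email
    )
    return chat_model(prompt)

# usage: reply = draft_reply("Can you extend the deadline?", chat_model=my_llm)
```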
		
00:43:48 --> 00:43:52
Yeah, your shopping experience might get better.
		
00:43:53 --> 00:43:58
That's because AI will learn. Like, I'm most certain
		
00:43:58 --> 00:44:00
			that Google will use this to
		
00:44:01 --> 00:44:04
			target their advertising more
efficiently right now. It's very
		
00:44:04 --> 00:44:05
			inefficient.
		
00:44:07 --> 00:44:11
			So most of these changes right now
will be mundane.
		
00:44:12 --> 00:44:14
			But the bigger shifts
		
00:44:15 --> 00:44:21
will happen much later, in my opinion. That's because
		
00:44:23 --> 00:44:27
the data these models are being trained on right now. Right now, it's just
		
00:44:27 --> 00:44:31
			Internet data. And they are doing
a lot of filtering on that data.
		
00:44:31 --> 00:44:35
So for example, with the data OpenAI trained on, OpenAI
		
00:44:35 --> 00:44:40
			filtered it through cheap labor
from Kenya to remove like, text
		
00:44:40 --> 00:44:43
			data about like, you know,
		
00:44:44 --> 00:44:49
* *, like all these things that exist on these
		
00:44:49 --> 00:44:53
forums. Who's doing this? OpenAI. OpenAI is doing what? So they are
		
00:44:54 --> 00:44:59
removing this text data through this cheap labor
		
00:44:59 --> 00:44:59
			from Kenya.
		
00:45:00 --> 00:45:04
from the data set, so that ChatGPT doesn't train on that.
		
00:45:05 --> 00:45:08
			Otherwise, it will also generate
those data again, because it's
		
00:45:08 --> 00:45:11
			being trained. I see. Okay, so you
mean manual removal, manual
		
00:45:11 --> 00:45:15
			removal? And what happens is these
people are suffering from PTSD
		
00:45:15 --> 00:45:19
from doing this work. What do you mean? Like the nine-
		
00:45:19 --> 00:45:22
hour-a-day exposure to these things? So what does that
		
00:45:22 --> 00:45:25
actually mean, when they're removing the data? Like, they see a website?
		
00:45:25 --> 00:45:29
			They hit X? Like, what does that
mean? Physically? Physically
		
00:45:29 --> 00:45:32
speaking, what is the guy doing on the computer? So I don't
		
00:45:32 --> 00:45:35
			know specifically what they are
doing? But my guess would be they
		
00:45:35 --> 00:45:38
are finding this text, they are reading them. And the text, or
		
00:45:38 --> 00:45:42
websites, the texts are from where? From the internet, all of this
		
00:45:42 --> 00:45:46
			internet? Okay, probably been
compiled on like some Yeah, some
		
00:45:46 --> 00:45:50
UI that OpenAI created for them to read and review the
		
00:45:50 --> 00:45:52
			information. And then they're
going through and you know,
		
00:45:52 --> 00:45:56
clicking exclude when they see certain keywords; they delete that.
		
00:45:57 --> 00:46:00
They couldn't create software for that? It might be
		
00:46:00 --> 00:46:02
			a little bit more nuanced than
just seeing the keyword and
		
00:46:02 --> 00:46:05
			deleting it, maybe this is
probably why it's manual, right?
		
00:46:05 --> 00:46:08
			Because what if it's a paper
that's talking about the problems
		
00:46:08 --> 00:46:12
			about *, right, or the problems
about *, so they need people
		
00:46:12 --> 00:46:15
to manually review it, and this is why it's cheap labor. These aren't PhDs who
		
00:46:15 --> 00:46:20
understand, you know, the nuances of all these discussions. It's just,
		
00:46:20 --> 00:46:25
okay, does this sound like something that's bad about *?
		
00:46:25 --> 00:46:27
Then, you know, let's exclude this. Or if this is
		
00:46:27 --> 00:46:30
			controversial, and I'm guessing
there's probably a weighting
		
00:46:30 --> 00:46:33
			factor to it? It's not just like
some binary yes or no answer. It's
		
00:46:33 --> 00:46:36
			like, okay, this is more
problematic or less problematic.
		
00:46:36 --> 00:46:38
And it's weighted based on something like this.
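A toy illustration of "weighted rather than binary" filtering; the categories, weights, and threshold below are invented, and in practice the scores would come from human annotators or trained classifiers rather than a lookup table:

```python
# Invented severity weights per category, purely for illustration.
SEVERITY = {"violence": 0.6, "hate": 0.8, "sexual": 0.9}

def severity_score(category_hits: dict[str, int]) -> float:
    # category_hits might look like {"hate": 2}, produced by reviewers or a model.
    return sum(SEVERITY.get(cat, 0.0) * n for cat, n in category_hits.items())

def keep_for_training(category_hits: dict[str, int], threshold: float = 1.0) -> bool:
    # A borderline document (e.g. a paper about harms) scores low and is kept;
    # heavily flagged documents are excluded from the training set.
    return severity_score(category_hits) < threshold
```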
		
00:46:38 --> 00:46:42
And these people are getting traumatized by exposure to *, * and
		
00:46:42 --> 00:46:46
			*, nine hours of their
working day. This happens
		
00:46:46 --> 00:46:50
			similarly, even in other like
content moderation, like things
		
00:46:50 --> 00:46:53
			like in Facebook, Instagram, all
these places, because they hire
		
00:46:54 --> 00:46:59
			like people to manually go through
many like controversial posts. So
		
00:46:59 --> 00:47:02
			for example, let's say some big
person.
		
00:47:03 --> 00:47:06
			I don't know Donald Trump posts
something, right. It's not like
		
00:47:06 --> 00:47:10
			some automatic, you know, reject
or deny. And as we know, it's
		
00:47:10 --> 00:47:14
actually manually reviewed by human beings, who say
		
00:47:14 --> 00:47:17
whether that post is controversial or not controversial, whether it's under the
		
00:47:17 --> 00:47:22
radar, and if it should be allowed. So yeah, let me ask
		
00:47:22 --> 00:47:28
			you this. Let's take a shift.
Who's in the lead? I think what
		
00:47:28 --> 00:47:30
			I'm hearing is that Microsoft is
in the lead, because they
		
00:47:30 --> 00:47:31
purchased ChatGPT.
		
00:47:33 --> 00:47:38
			Microsoft, and Google is actually
sweating. They're behind for the
		
00:47:38 --> 00:47:41
			first time. Before we get to that
question. I actually was in
		
00:47:41 --> 00:47:44
			LaLaLand. Yeah. Before we get to
that question, I think Nikki's
		
00:47:44 --> 00:47:46
			will know this a little bit
better. But I want to answer the
		
00:47:46 --> 00:47:50
			last question as well, which is,
what are some of the functional
		
00:47:51 --> 00:47:55
			issues that will occur? Yeah.
Because I think day to day life.
		
00:47:55 --> 00:48:00
			Yeah, a lot of emotional. A lot of
people that I talked to,
		
00:48:00 --> 00:48:01
			especially who don't understand
		
00:48:03 --> 00:48:05
			product development, right. A lot
of people might understand
		
00:48:05 --> 00:48:07
			technology, right, like, so for
example.
		
00:48:09 --> 00:48:11
			GPT,
		
00:48:12 --> 00:48:13
			which is the
		
00:48:15 --> 00:48:20
you know, it's the back end of ChatGPT, right? The
		
00:48:20 --> 00:48:24
			technology for this existed almost
a year ago, what you're seeing in
		
00:48:24 --> 00:48:27
ChatGPT. Okay, yeah, it's a little bit more advanced; it's GPT-3.5.
		
00:48:27 --> 00:48:31
			But it's still, it was still
available for people to consume
		
00:48:31 --> 00:48:37
			via their API for a, you know, a
year now. But the normal public
		
00:48:37 --> 00:48:40
			found out because some product
team decided that, hey, we need to
		
00:48:40 --> 00:48:43
			take this technology, put a nice
interface on it, and be able to
		
00:48:43 --> 00:48:46
			show it to the people. So there's
a difference between understanding
		
00:48:46 --> 00:48:49
			things from a technological
perspective, right, which is,
		
00:48:49 --> 00:48:52
			okay, here's the what the
engineers understand. Here's what
		
00:48:52 --> 00:48:54
			all the geeks and the nerds and
stuff, they're building all this
		
00:48:54 --> 00:48:58
			stuff, okay. And then there's this
other level of understanding,
		
00:48:58 --> 00:49:01
			which is okay, how is this going
to functionally impact and how do
		
00:49:01 --> 00:49:05
			we bring this to society? So I
think what people are a little bit
		
00:49:05 --> 00:49:11
			myopic about is, and short sighted
about is this idea that the growth
		
00:49:11 --> 00:49:16
			of AI is exponential. Many people
are very forgetful about when the
		
00:49:16 --> 00:49:19
			first iPhone came out. I don't
know if you remember, it didn't
		
00:49:19 --> 00:49:22
			have an app store. Yeah, right. It
was just like,
		
00:49:23 --> 00:49:27
iPhone. That doesn't mean that the first iPhone didn't
		
00:49:27 --> 00:49:30
revolutionize technology. It was the first device that came out
		
00:49:31 --> 00:49:34
			that combined the iPod, the camera
and the phone. Nothing had ever
		
00:49:34 --> 00:49:38
			done that before. It blew people's
minds, right. And as soon as they
		
00:49:38 --> 00:49:41
			added the App Store, it blew
people's minds again, right. And,
		
00:49:41 --> 00:49:43
			and in the beginning, I don't know
if everybody remembers the apps
		
00:49:43 --> 00:49:46
			were very rudimentary. You could
do things like you know, turn on a
		
00:49:46 --> 00:49:49
flashlight, or you could do that lighter thing, and people would
		
00:49:49 --> 00:49:52
			be like, Oh, look, I could turn on
the lighter. It's so cool. Right?
		
00:49:52 --> 00:49:55
			But that was something that you
couldn't do before. And this is
		
00:49:55 --> 00:49:58
			where we are in the stage of AI
development.
		
00:50:01 --> 00:50:04
There are advancements in AI. For example, there's GitHub,
		
00:50:04 --> 00:50:08
which is, you know, a code repository
		
00:50:09 --> 00:50:12
			management application. It's used
by most coders.
		
00:50:13 --> 00:50:16
They've, you know, as part of their suite, they now
		
00:50:16 --> 00:50:17
			included a new
		
00:50:18 --> 00:50:24
technology called Copilot. It allows you to have an AI read your
		
00:50:24 --> 00:50:27
			code, understand it, and it's
based off of the same fundamental
		
00:50:27 --> 00:50:30
technology that ChatGPT is. It reads your code, understands it, and it
		
00:50:30 --> 00:50:33
			codes with you. And I started
using it, you know, two months
		
00:50:33 --> 00:50:38
back, and it's already increased my productivity by 30 to 40%. And
		
00:50:38 --> 00:50:41
			I know many people who are already
using it, there are people who are
		
00:50:41 --> 00:50:45
			writing, you know, there, I saw a
bunch of posts of people saying,
		
00:50:45 --> 00:50:47
			you know, I didn't have to go to
my lawyer to write such and such
		
00:50:47 --> 00:50:50
things, because I just had ChatGPT write it for me. I didn't have to
		
00:50:50 --> 00:50:54
			do X, Y, and Z things, because,
you know, I was able to outsource
		
00:50:54 --> 00:50:59
it to the AI. Let's assume they've done it right. Well, it's
		
00:50:59 --> 00:51:01
irrelevant whether they've done it right or
		
00:51:01 --> 00:51:03
			not, because people are still
using it, right.
		
00:51:04 --> 00:51:09
And so this is where the issue comes in, in terms of, like,
		
00:51:09 --> 00:51:10
			practical life.
		
00:51:12 --> 00:51:16
People keep saying that, well, what ChatGPT gives you isn't
		
00:51:16 --> 00:51:19
			reliable, but I don't think
anybody remembers when people were
		
00:51:19 --> 00:51:22
			using Wikipedia, and you would go
to school, or your, you know, or
		
00:51:22 --> 00:51:24
			your college and your professor
would say, Hey, don't use
		
00:51:24 --> 00:51:27
			Wikipedia, but people still use
Wikipedia anyway. Right? And then
		
00:51:27 --> 00:51:30
			they would just cross verify what
was on Wikipedia to make sure that
		
00:51:30 --> 00:51:36
			it was accurate. But almost every
student that came after Wikipedia
		
00:51:36 --> 00:51:40
existed used Wikipedia and said, oh, here's the list of the sources
		
00:51:40 --> 00:51:42
that Wikipedia cites. Let me go cross-reference it, make sure it's
		
00:51:42 --> 00:51:45
			accurate. But it was the starting
point to begin your journey of
		
00:51:45 --> 00:51:49
			analysis. Yeah. Right. And
similarly, chatbot will be the
		
00:51:49 --> 00:51:52
			same thing. People may not, you
might say, Oh, well, chatbots not
		
00:51:52 --> 00:51:54
			accurate. Well, that depends if
you know the subject or not, if
		
00:51:54 --> 00:51:58
			you know the subject, then it
becomes a very good tool for you
		
00:51:58 --> 00:52:01
			to use it to be able to do
research and do many other things.
		
00:52:01 --> 00:52:04
			So it's a matter of,
		
00:52:05 --> 00:52:09
let's look down the line, because this version of ChatGPT, I mean,
		
00:52:09 --> 00:52:13
it's supposed to be updated, according to OpenAI, to GPT-4
		
00:52:13 --> 00:52:17
in the next quarter or two, right. And I'm sure it already
		
00:52:17 --> 00:52:20
			exists, people have used the beta.
And it's far more advanced than
		
00:52:20 --> 00:52:23
			the current version. So what
happens when we begin this
		
00:52:23 --> 00:52:26
			iterative phase, and we get to
more advanced versions of these,
		
00:52:26 --> 00:52:29
			we need to think like two three
years down the line. And that's
		
00:52:29 --> 00:52:34
			pretty, pretty fast, right? The if
anybody has looked at mid journey,
		
00:52:34 --> 00:52:36
			which is the image generation, and
you go look at, there's a lot of
		
00:52:36 --> 00:52:40
			videos out there now of what mid
journey was six months ago, and
		
00:52:40 --> 00:52:44
			what it is today, right? Six
months ago, it wasn't able to, you
		
00:52:44 --> 00:52:48
			know, recreate a human being or
anything as good as it is now. Now
		
00:52:48 --> 00:52:51
			you look at it, and there's
people, there's artists, many
		
00:52:51 --> 00:52:54
			artists complaining that well,
this kind of just eradicates you
		
00:52:54 --> 00:52:56
			know, a lot of the work that we
were doing, because people are
		
00:52:56 --> 00:52:59
			able to take all this stuff,
there's a lot of legal issues that
		
00:52:59 --> 00:53:05
			are happening now. For example,
nobody could in the past, take a
		
00:53:06 --> 00:53:10
painting, or take an imaginary idea, say, like, a house on the
		
00:53:10 --> 00:53:16
			hill, and say, hey, I want this to
look like how Hayao Miyazaki or
		
00:53:16 --> 00:53:17
			Pixar Studios or,
		
00:53:18 --> 00:53:23
you know, some other artist has, you know, made it look. You
		
00:53:23 --> 00:53:26
would have to hire somebody to do that. Now, Midjourney can go and
		
00:53:26 --> 00:53:30
make it; it can build that. So now there are all these legal troubles of,
		
00:53:30 --> 00:53:33
you know, is Pixar going to start coming down on you, you know, coming
		
00:53:33 --> 00:53:36
			down to you and saying, Hey,
you're not allowed to use this
		
00:53:36 --> 00:53:37
			because all of
		
00:53:38 --> 00:53:41
all of Midjourney was based on a catalog of images. And in those
		
00:53:41 --> 00:53:44
			images, there were Pixar images,
and that's how it's able to create
		
00:53:44 --> 00:53:48
the Pixar style. Yes. Okay, then let me ask you this. Every painter
		
00:53:48 --> 00:53:51
			when he makes a painting, he walks
through the Louvre. He looks at
		
00:53:51 --> 00:53:55
			stuff, he gets inspired by 500
images, he produces an image, that
		
00:53:55 --> 00:53:58
			image is based upon the 500
paintings you saw at the Louvre.
		
00:54:00 --> 00:54:03
			What's the difference? There is a
difference, but you're in a phase,
		
00:54:03 --> 00:54:06
			you want to take that one. Like,
seriously, what is the difference?
		
00:54:06 --> 00:54:08
			You mean, looking through all of
your work?
		
00:54:09 --> 00:54:13
			Letting it settle in my mind? Then
pushing it aside, then making my
		
00:54:13 --> 00:54:16
own thing? Clearly, it's gonna look 25% like yours, 10% like
		
00:54:16 --> 00:54:21
yours, 25% like yours. What's the difference? Well, I would say that,
		
00:54:22 --> 00:54:26
see, when we talk about this stuff, right? Because
		
00:54:26 --> 00:54:29
			we have an epistemology of what
good is, what truth is, what
		
00:54:29 --> 00:54:33
			morality is, et cetera, et cetera,
et cetera. But the argument that's
		
00:54:33 --> 00:54:37
			used by pro AI people is sorry,
the argument that's used by anti
		
00:54:37 --> 00:54:42
			AI people is, well, a human being
works differently, right? A human
		
00:54:42 --> 00:54:44
			being when they walk into the
Louvre, and they see all these
		
00:54:44 --> 00:54:47
			images and they see all these
things. We don't have recall
		
00:54:47 --> 00:54:51
			memory the way that a computer
does, right? When we recall
		
00:54:51 --> 00:54:54
			something we recall something
based on events based on
		
00:54:54 --> 00:54:58
			interactions based on things. So
for example, if you told me Hey,
		
00:54:58 --> 00:54:59
			we had a podcast with Alex and the
		
00:55:00 --> 00:55:02
			Pass. And this is what we were
talking about. Okay, I'm gonna
		
00:55:02 --> 00:55:06
			remember the smell the day, I'm
going to remember what happened
		
00:55:06 --> 00:55:08
			that day what we were talking
about. And then based on the
		
00:55:08 --> 00:55:10
			context of that conversation, I'm
gonna remember I remember one
		
00:55:10 --> 00:55:13
			thing. And you know what? I'm not
going to remember it exactly, it's
		
00:55:13 --> 00:55:16
			going to be a little bit
different. It's never going to be
		
00:55:16 --> 00:55:22
exactly like what it was before. A human being doesn't record; a
		
00:55:22 --> 00:55:28
human being remembers, right? A machine recalls information, right?
		
00:55:28 --> 00:55:33
			That's the difference, right? And
so when legally like, Yeah, and so
		
00:55:33 --> 00:55:38
			when the machine when the
algorithm, it looks at your prompt
		
00:55:38 --> 00:55:43
			that says, hey, I want you to make
me an image of myself in the style
		
00:55:43 --> 00:55:44
			of DaVinci.
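For a sense of what that prompt step looks like in practice, here is a minimal sketch using the open-source Stable Diffusion weights discussed later in this conversation, via the Hugging Face diffusers library; it is not Midjourney's internals, and the model name and output path are only examples:

```python
# Sketch: text-to-image with an open-source model, driven purely by a prompt.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe("a portrait of a man in the style of Leonardo da Vinci").images[0]
image.save("portrait_davinci_style.png")
```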
		
00:55:45 --> 00:55:49
			It's now going to go and recall
every single one of Da Vinci's
		
00:55:49 --> 00:55:54
			paintings, it's going to recall
how that style of you know,
		
00:55:54 --> 00:55:58
			whatever Baroque art or whatever
it was, at the time, you know, was
		
00:55:58 --> 00:56:01
done, the style of dozens of different artists. If I told an
		
00:56:01 --> 00:56:05
			artist paint me the way DaVinci
would have painted me, he's gonna
		
00:56:05 --> 00:56:09
click on DaVinci images, look at all the DaVinci images: okay, he looks
		
00:56:09 --> 00:56:12
			like he's doing this stroke here.
He uses this color palette, he
		
00:56:12 --> 00:56:15
uses this background, blah, blah, blah, and then he goes and
		
00:56:15 --> 00:56:19
draws that, makes the painting. The access is the same. So
		
00:56:19 --> 00:56:25
			for these apps to have access to
the internet, there is no way for
		
00:56:25 --> 00:56:28
			an artist to say that you're,
you're ripping me off, and you're
		
00:56:28 --> 00:56:31
basing it on that. Any artist that I would have hired
		
00:56:31 --> 00:56:33
			would have done the same thing.
Yes, but it's different for the
		
00:56:33 --> 00:56:36
			artists because somebody could
argue that the artists took a
		
00:56:36 --> 00:56:41
			lifetime to learn how to design
and draw in the style of DaVinci
		
00:56:41 --> 00:56:47
if somebody has a legal... Oh, yeah. Yeah. Well, so, if a person... It's
		
00:56:47 --> 00:56:51
not a legal argument; it's an ethical argument, right? If a
		
00:56:51 --> 00:56:54
person can... This is why I think they're gonna lose. They're gonna
		
00:56:54 --> 00:56:55
lose to the AI people, right?
		
00:56:56 --> 00:56:59
			Because the AI argument is a very
naturalist materialist argument,
		
00:56:59 --> 00:57:03
			they're gonna win. But somebody
could argue that this person that
		
00:57:03 --> 00:57:06
			learned how to draw in the style
of DaVinci, if you're a DaVinci,
		
00:57:06 --> 00:57:11
			impressionist painter, right?
Yeah, you are good at what you do.
		
00:57:11 --> 00:57:14
			Because I can't do that. You can't
do that random people can't do
		
00:57:14 --> 00:57:18
			that. But if I can go in and write
a prompt into mid journey, it
		
00:57:18 --> 00:57:21
			allows this power of creating
Impressionist paintings to
		
00:57:21 --> 00:57:25
everyone, right? Before, you had to spend a lifetime learning DaVinci,
		
00:57:25 --> 00:57:28
studying Da Vinci, studying, you know, the
		
00:57:28 --> 00:57:32
			paint strokes, and how all of
these things are constructed. And
		
00:57:32 --> 00:57:34
			you had to learn that and you had
to practice it. And you had to do
		
00:57:34 --> 00:57:37
it for years and years and years until you became very good. So this
		
00:57:37 --> 00:57:41
			is a sympathy lawsuit, to not put
people out of a job, all right.
		
00:57:44 --> 00:57:47
Then probably people would say, AI is just more efficient than a
		
00:57:47 --> 00:57:51
human. Yeah, it's not the AI's fault. That's true. Like, so, I
		
00:57:51 --> 00:57:57
			mean, I may have had to practice
dunking for five years. Then my
		
00:57:57 --> 00:58:00
			neighbor comes along because he
had better genetics, and he's a
		
00:58:00 --> 00:58:05
			foot taller. Right? Or, and then
another guy comes along, and he's
		
00:58:05 --> 00:58:10
a foot shorter than me, and so I dominate him. Is there unfairness
		
00:58:10 --> 00:58:15
here? What's the difference? There is no way to win that argument. Yeah,
		
00:58:15 --> 00:58:20
as an anti-AI person without bringing in another version of
		
00:58:20 --> 00:58:27
epistemology. I will tell you what: with every technology, society ends up,
		
00:58:27 --> 00:58:34
just out of mercy for the previous generations, you know,
		
00:58:34 --> 00:58:39
slowing the technology down until you guys find another source
		
00:58:39 --> 00:58:43
of income. Right? So when the typewriter came out, what do you
		
00:58:43 --> 00:58:47
think the Ottomans did? They flipped out. All those scribes
		
00:58:47 --> 00:58:50
			flipped out, they said, this
should be banned. There's no way
		
00:58:50 --> 00:58:54
an idiot can come in and type hadha, and it comes out in a
		
00:58:54 --> 00:58:59
			beautiful script, when it took me
30 years to be able to write that
		
00:58:59 --> 00:59:03
			write the same exact thing. That's
actually why the Ottomans didn't
		
00:59:03 --> 00:59:05
			bring in the typewriters. Right,
because it's not that they said
		
00:59:05 --> 00:59:08
			that there's something inherently
wrong with the typewriters is that
		
00:59:08 --> 00:59:10
			we're now going to put all of
these people out of jobs, right?
		
00:59:10 --> 00:59:13
			These are the scribes that we
have. And the scribes are so
		
00:59:13 --> 00:59:16
			emotionally affected by that they
come up with these arguments,
		
00:59:16 --> 00:59:21
			where the only real basis of it is
that you just basically
		
00:59:21 --> 00:59:23
			essentially all your years of
training got wiped out in one
		
00:59:23 --> 00:59:29
			second, by me buying a typewriter
and typing the book, in the same
		
00:59:29 --> 00:59:34
			exact script that you type it in.
And I put zero effort. Literally,
		
00:59:34 --> 00:59:37
			it seems unfair, but that's sort
of what life is. Right? It's I'm
		
00:59:37 --> 00:59:40
			putting in zero effort. That's
what technology does. And that's
		
00:59:40 --> 00:59:43
			what they call disruptors and all
these you know, these things out
		
00:59:43 --> 00:59:48
			in the west coast, but zero
effort, and I'm doing the same
		
00:59:48 --> 00:59:52
exact thing. Now, personally speaking, I don't like
		
00:59:52 --> 00:59:55
			it. But I don't see a legal basis
for it being illegal. There's no
		
00:59:55 --> 00:59:59
			legal basis. I mean, look, I don't
like it. I'd rather go with the
		
00:59:59 --> 00:59:59
			natural guy.
		
01:00:00 --> 01:00:04
The human being who put in the effort. But, well, I mean, the
		
01:00:04 --> 01:00:08
			reason you don't like it is based
on your epistemology of like,
		
01:00:08 --> 01:00:10
			well, what is truth? What is
goodness? What is all of these
		
01:00:10 --> 01:00:12
			things? Right? And, and they
don't, they're not bringing those
		
01:00:12 --> 01:00:15
			things to the fore when it comes
to a legal argumentation, right?
		
01:00:15 --> 01:00:17
			It's nothing, there's none,
there's none of that there, right?
		
01:00:17 --> 01:00:21
			Like, we're not going to go sit
there and say, hey, you know, we
		
01:00:21 --> 01:00:24
			believe that, you know, these, the
effort and the work of these
		
01:00:24 --> 01:00:27
			people matters, right. And this is
we have to, we have to favor
		
01:00:27 --> 01:00:30
			humanity over the robot, right?
We're not going to go around, say
		
01:00:30 --> 01:00:34
that. Any legal argument, if it comes, will come
		
01:00:35 --> 01:00:39
from the ownership of the data being trained on.
		
01:00:39 --> 01:00:43
So for example, Microsoft has already been sued because of its
		
01:00:43 --> 01:00:46
Copilot. So it released Copilot, which helps software developers
		
01:00:47 --> 01:00:51
			generate code automatically,
right. But it was trained on all
		
01:00:51 --> 01:00:55
of the public code in GitHub repositories, and all that public
		
01:00:55 --> 01:01:00
code. They're written by other people who haven't given
		
01:01:00 --> 01:01:04
			explicit permission to use it for
training. So that's why it's
		
01:01:04 --> 01:01:09
there. But you put it out in public. It's like walking in a mall. If I
		
01:01:09 --> 01:01:12
			walk in the mall? Right? If I walk
in the street
		
01:01:13 --> 01:01:17
			for inspiration, and I get
inspired, and I produce something,
		
01:01:18 --> 01:01:21
			and I sell it, to those people who
are in the public who put their
		
01:01:21 --> 01:01:25
own shops, who put their own faces, their own stuff in public,
		
01:01:26 --> 01:01:29
			they have a right to that money.
No, they don't. So if I go and I
		
01:01:29 --> 01:01:34
say, take my thing, and I say, go onto the internet, go on Safari,
		
01:01:34 --> 01:01:38
do whatever, Google it, and get all the information you can. Then I
		
01:01:38 --> 01:01:41
			produce a product with that
information, and I sell it.
		
01:01:42 --> 01:01:45
I'm using the public. I'm like walking in the
		
01:01:45 --> 01:01:48
street. I'm using the public. How is that different? So what's the basis of their
		
01:01:48 --> 01:01:49
argument?
		
01:01:50 --> 01:01:53
			So there is some basis, there is
some basis for their argument.
		
01:01:53 --> 01:01:58
			Right. And, and I think the anti
AI people will hop on this
		
01:01:58 --> 01:02:01
			argument because it suits their
interests, right? Which is that
		
01:02:01 --> 01:02:03
			hey, and here's the flaw in the
argument.
		
01:02:04 --> 01:02:07
And the problem with this argument is, hey, let's say they
		
01:02:07 --> 01:02:14
say, well, Disney says: the only way you were able to get all of
		
01:02:14 --> 01:02:18
			these images from Disney, is
because you were able to scrape
		
01:02:18 --> 01:02:21
			the web and use copyrighted images
that were not permissible for you
		
01:02:21 --> 01:02:24
to use; you didn't get explicit permission from Disney. To use what? No,
		
01:02:24 --> 01:02:28
I just used it for inspiration. No, no, but they would say that
		
01:02:28 --> 01:02:33
			when you're generating an image,
it's directly going in and using
		
01:02:33 --> 01:02:35
			the data model. And the data model
itself is based on
		
01:02:38 --> 01:02:41
			copyright images. And so they
would say that the data model
		
01:02:41 --> 01:02:45
itself is wrong; like, you can't use it at all. And so any image
		
01:02:45 --> 01:02:49
that's created from the data model is illegal. Now, here's
		
01:02:49 --> 01:02:52
the problem with this thing. Let's say they went with
		
01:02:52 --> 01:02:53
			this, which they might,
		
01:02:54 --> 01:02:59
			it now just allows people who have
large amounts of data to collude
		
01:02:59 --> 01:03:03
			amongst one another to make their
own AI. Right. So Disney could
		
01:03:03 --> 01:03:08
			just get together, let's say a big
company like Warner Brothers, or
		
01:03:08 --> 01:03:11
			Comcast, or Disney or one of these
larger media houses. They got
		
01:03:11 --> 01:03:13
			together, they colluded with
		
01:03:15 --> 01:03:20
OpenAI or Google and said, hey, listen, we're willing to work with
		
01:03:20 --> 01:03:25
			you. As long as we have
proprietary rights to this API, if
		
01:03:25 --> 01:03:28
there's any ad revenue that comes from it, it has
		
01:03:28 --> 01:03:34
			to go to us 10%. And we'll allow
you to use our enormous dataset
		
01:03:35 --> 01:03:38
that includes, you know, our films. And my data, too, then?
		
01:03:39 --> 01:03:42
Well, I mean, stuff they've taken from my website, or something
		
01:03:42 --> 01:03:46
from your website, from his website, it doesn't matter, because you are
		
01:03:46 --> 01:03:49
in a chain. But yeah, it's not about that; you can't sue
		
01:03:49 --> 01:03:52
			because you're not as big as
Disney. Disney just has the
		
01:03:52 --> 01:03:54
			manpower and the ability and the
money to just tell you to be
		
01:03:54 --> 01:03:59
			quiet, right. So it allows, it's
not like these people are
		
01:03:59 --> 01:04:01
principled in this idea that, oh, we're not
		
01:04:01 --> 01:04:04
			gonna allow it for anybody. Look,
an armistice of AI is not
		
01:04:04 --> 01:04:08
happening. Right? This whole idea, this is like the guns. People say
		
01:04:08 --> 01:04:10
			people shouldn't have guns. Well,
people have guns, what do you want
		
01:04:10 --> 01:04:12
			to do about it? You're going to
take everybody's guns, it's not
		
01:04:12 --> 01:04:14
			going to happen. You can sit here
and argue all day.
		
01:04:16 --> 01:04:20
			And I'm telling you the argument
is, as I said earlier, it's their
		
01:04:20 --> 01:04:24
sympathy lawsuits. And by the way, the Sharia would
		
01:04:25 --> 01:04:27
recognize that. The Sharia would recognize: don't put people out of
		
01:04:27 --> 01:04:30
business. Of course, the Sharia would recognize that.
		
01:04:31 --> 01:04:33
			It's not who cares what your
technology is, you're not allowed
		
01:04:33 --> 01:04:36
			to just try to put someone out of
business tomorrow. You have to
		
01:04:36 --> 01:04:39
			slow this down somehow. Let's
shift from this. Who are the
		
01:04:39 --> 01:04:40
			leaders right now?
		
01:04:43 --> 01:04:44
			Microsoft,
		
01:04:45 --> 01:04:47
			Zuckerberg, Facebook,
		
01:04:48 --> 01:04:53
Apple's in la-la land. And the main research organizations are
		
01:04:53 --> 01:04:56
Microsoft. Microsoft? Is that because they bought
		
01:04:56 --> 01:05:00
ChatGPT? They have their own... it's because of their partnership
		
01:05:00 --> 01:05:03
with OpenAI. They have a partnership with Elon Musk. Yeah.
		
01:05:03 --> 01:05:06
I mean, Elon Musk was previously in OpenAI,
		
01:05:06 --> 01:05:09
but he's not anymore. So he's not in it anymore. Yeah, yeah.
		
01:05:09 --> 01:05:16
Fine. So then Google has DeepMind. Yeah. And then Facebook
		
01:05:16 --> 01:05:23
has FAIR. DeepMind, FAIR. Yeah. FAIR? Yeah. What a terrible name.
		
01:05:23 --> 01:05:29
So the thing is, there are also large open-source code
		
01:05:29 --> 01:05:31
repositories with open-source models, right? Like, so for
		
01:05:31 --> 01:05:35
example, there's Stability, which is, you know, Stability AI has
		
01:05:35 --> 01:05:36
Stable Diffusion.
		
01:05:37 --> 01:05:40
If you've heard of Stable Diffusion, it's the image generator; that's
		
01:05:40 --> 01:05:44
from Stability AI. It's open source. There's
		
01:05:44 --> 01:05:47
			no, I mean, I guess there's a
company that runs the open source,
		
01:05:47 --> 01:05:50
			but it's not proprietary, you can
use it, anybody can use it. And so
		
01:05:50 --> 01:05:53
			there's always going to be these
open source competitors. They're,
		
01:05:53 --> 01:05:56
they're competing with Midjourney and other, you know, top-level
		
01:05:56 --> 01:05:58
			tools now. So
		
01:05:59 --> 01:06:04
			it is going to slowly become the
case where it doesn't matter how
		
01:06:04 --> 01:06:07
			big you are, you know, if some
open source team comes together
		
01:06:07 --> 01:06:10
			and start building things, then
you know, they have the same
		
01:06:10 --> 01:06:13
			models, they can build the same
things. Let's go to the question
		
01:06:13 --> 01:06:15
then, the two questions:
		
01:06:18 --> 01:06:22
			data, personal data, and then deep
fakes. These are two questions
		
01:06:22 --> 01:06:25
			that came up. So who wants to take
that one? First? The first one,
		
01:06:26 --> 01:06:30
your data, personal data. How is AI affecting this?
		
01:06:30 --> 01:06:33
			like this is going to take our
personal data
		
01:06:34 --> 01:06:37
			being out there at another level,
it's already at another level with
		
01:06:37 --> 01:06:40
			with what we have now. But it's
going to go to another level. So
		
01:06:40 --> 01:06:45
			most people, most people always
use this example to say that in
		
01:06:45 --> 01:06:49
			the future, your personal data,
and your information will be used
		
01:06:49 --> 01:06:53
			to attack you. Yeah, no, that
already happened 10 years ago?
		
01:06:53 --> 01:06:57
			Yeah. Right, this whole idea that,
you know, you're going to be
		
01:06:57 --> 01:07:01
			targeted and your data is going to
be used? No, no, that already
		
01:07:01 --> 01:07:06
			happened to you. Many people are
sitting on this stream, are on
		
01:07:06 --> 01:07:09
			this stream, because they were
zooming through Facebook or
		
01:07:09 --> 01:07:12
			through Instagram. And they saw
the live icon that was there. And
		
01:07:12 --> 01:07:15
it was targeted: they're Muslim and they're followers of Dr. Shadee
		
01:07:15 --> 01:07:19
or followers of me, or Nafees. And based on this, they were
		
01:07:19 --> 01:07:23
			targeted and shown that, hey,
these are the people that are on
		
01:07:23 --> 01:07:27
			this live stream you should join
in. They already have your data,
		
01:07:27 --> 01:07:30
			they already have your
information. And look, let's say
		
01:07:30 --> 01:07:32
			there's a lot of people who say
like, well, I don't put my data
		
01:07:32 --> 01:07:38
			out there, you think you don't,
but your metadata is, and a
		
01:07:38 --> 01:07:42
			profile of you is composed pretty
easily, right? Like, for example,
		
01:07:42 --> 01:07:45
			if I know, I can already
		
01:07:46 --> 01:07:50
			filter down the types of people
that are in a specific area,
		
01:07:50 --> 01:07:54
			right? A person that lives in New
Brunswick is going to be
		
01:07:54 --> 01:07:58
			distinctly different from a person
that lives in Guatemala, right? We
		
01:07:58 --> 01:08:01
			just know this, most of the time.
Now you're telling me that this
		
01:08:01 --> 01:08:06
			person also had is Muslim, this
person is also friends with XY and
		
01:08:06 --> 01:08:09
			Z people, this person is a male or
a female or a non binary, or
		
01:08:09 --> 01:08:15
			whatever they are nowadays. And
this person is XYZ, ABC metadata,
		
01:08:15 --> 01:08:18
we have all this information, and none of it is even personal information, right?
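A toy illustration of the point: a few metadata fields, none of them secret on their own, already narrow a person down. The records and field names below are made up:

```python
# Made-up metadata records: no names, no message contents, just attributes.
people = [
    {"id": 1, "city": "New Brunswick", "religion": "Muslim", "follows": {"safina_society"}},
    {"id": 2, "city": "Guatemala City", "religion": "Catholic", "follows": {"futbol_hoy"}},
    {"id": 3, "city": "New Brunswick", "religion": "Muslim", "follows": {"safina_society", "gardening"}},
]

def narrow_down(records, **criteria):
    matches = []
    for r in records:
        ok = all(
            v in r[k] if isinstance(r[k], set) else r[k] == v
            for k, v in criteria.items()
        )
        if ok:
            matches.append(r["id"])
    return matches

# Location plus one interest already shrinks the pool dramatically.
print(narrow_down(people, city="New Brunswick", follows="safina_society"))  # [1, 3]
```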
		
01:08:18 --> 01:08:22
People say, you know, you can gather all
		
01:08:22 --> 01:08:24
			this information from almost
anybody on the internet, if you
		
01:08:24 --> 01:08:26
			just know their name, and you know
where they live, you could
		
01:08:26 --> 01:08:29
probably find a lot of this information. And if I
		
01:08:29 --> 01:08:32
			can find it, then I can guarantee
you that, you know, agencies that
		
01:08:32 --> 01:08:36
			are looking for this data can most
definitely find it. Yeah, I think
		
01:08:36 --> 01:08:41
what the internet did was actually make most of the FBI
		
01:08:41 --> 01:08:42
			search
		
01:08:43 --> 01:08:46
			teams irrelevant. Because people
		
01:08:47 --> 01:08:51
			put their own information on
Facebook, when they check in
		
01:08:51 --> 01:08:55
			somewhere, right? This is why this
is why it was so silly. When
		
01:08:55 --> 01:08:58
people were posting that stupid picture of Greta Thunberg and
		
01:08:58 --> 01:09:01
			saying that, like Andrew Tate got
caught because of the pizza box.
		
01:09:01 --> 01:09:03
			It's like, no, they didn't need
the pizza box. You think they're
		
01:09:03 --> 01:09:07
			so these, these agencies needed a
pizza box to find where somebody
		
01:09:07 --> 01:09:10
			is? No. They know where you are.
They know what you're doing.
		
01:09:10 --> 01:09:14
			Right? So they have tons of data
already out there. But go ahead
		
01:09:14 --> 01:09:17
and... I can't see you, so there's no concept of
		
01:09:17 --> 01:09:22
visual cues to know who wants to go. To take it to another level: the police
		
01:09:22 --> 01:09:25
			can even search through your DNA.
Yeah, basically, if they have a,
		
01:09:25 --> 01:09:29
they have your DNA, then they can start searching. So, these 23andMe databases.
		
01:09:29 --> 01:09:32
			Yeah. And that's how they caught
some serial killers from the 70s
		
01:09:32 --> 01:09:36
and 80s recently. So you think that your data is not there, but
		
01:09:36 --> 01:09:40
			your innermost data is already out
there? Yeah. So the question just
		
01:09:40 --> 01:09:45
			becomes, who's using it and how?
Because when they when they put up
		
01:09:45 --> 01:09:48
			when I say something like I need
to get
		
01:09:50 --> 01:09:51
			I need to get some batteries.
		
01:09:53 --> 01:09:58
And then I find in my Amazon suggestions: batteries. You did
		
01:09:58 --> 01:09:59
			me a favor to be honest with you,
right?
		
01:10:00 --> 01:10:04
They did me a favor. So I don't even have a problem with that. We're
		
01:10:04 --> 01:10:08
gonna have a problem when it goes to other levels. Right? And
		
01:10:08 --> 01:10:11
			that's what people are always
nervous about. Well, you can get
		
01:10:11 --> 01:10:15
			real philosophical on this really
quickly. But for example, I, I'm
		
01:10:15 --> 01:10:16
			pretty
		
01:10:17 --> 01:10:20
			anti tech, even though I work in
tech, I do tech, you know, I know
		
01:10:20 --> 01:10:24
			all these things. But I don't have
any smart devices in my house,
		
01:10:24 --> 01:10:28
			right? I don't have whatever,
Alexa, I don't have a smart
		
01:10:28 --> 01:10:31
			thermostat. I don't have, you
know, smart printers. I have no
		
01:10:31 --> 01:10:35
			such devices in my house, right,
other than my laptop. And this is
		
01:10:35 --> 01:10:37
			probably the most events. I have a
laptop and an iPad, right. But I
		
01:10:37 --> 01:10:40
			don't have other things that are
listening. I don't have Siri
		
01:10:40 --> 01:10:44
			turned on on my phone. And
somebody might say, well, aren't
		
01:10:44 --> 01:10:47
			you being contradictory? You just
said they know everything. So then
		
01:10:47 --> 01:10:50
			what's the why? Why are you trying
to hide everything? I'm not trying
		
01:10:50 --> 01:10:53
			to hide everything. It's my right
to privacy. Don't you use curtains
		
01:10:53 --> 01:10:55
			on your house? Do you want
everybody to see going on in your
		
01:10:55 --> 01:10:58
			life? It's your right to privacy.
I don't want people listening in
		
01:10:58 --> 01:11:03
			to my stuff. And I assure you,
people do listen, I work on this
		
01:11:03 --> 01:11:08
			stuff. I you know, I build this
stuff. Regular normal employees
		
01:11:08 --> 01:11:11
			are listening in on your personal
conversations at home. Most
		
01:11:11 --> 01:11:15
			definitely. Oh, you're saying
humans, not just it's not just
		
01:11:15 --> 01:11:20
			being gathered. Even humans, let
alone the AI the AI is definitely
		
01:11:20 --> 01:11:23
			listening. But like the humans are
also listening. I didn't realize
		
01:11:23 --> 01:11:27
			that. So I didn't realize even
that humans have access to this.
		
01:11:27 --> 01:11:30
			Like I know that it all goes into
some data center. Definitely.
		
01:11:31 --> 01:11:34
			Do have access. Yeah, really
humans? Yeah. So tell us more
		
01:11:34 --> 01:11:38
			about that. I mean, so basically,
they would have different levels
		
01:11:38 --> 01:11:42
			of levels of classification right
in these companies. So some people
		
01:11:42 --> 01:11:45
			will have access to Class One,
two, some will have access to
		
01:11:45 --> 01:11:50
			class, higher level classes. And
so for the, for the top accounts
		
01:11:50 --> 01:11:54
			in the social media companies,
they will have certain amount of
		
01:11:54 --> 01:11:59
			employees assigned to just handle
these top accounts. And they have
		
01:11:59 --> 01:12:02
			access to all of that information
from there. They can even listen
		
01:12:02 --> 01:12:05
			in to their phones. That's how
actually Saudi
		
01:12:07 --> 01:12:10
			Saudi Arabia actually that's why
they bribed that's how they bribed
		
01:12:10 --> 01:12:14
			two employees via Twitter to take
down their opposition.
		
01:12:15 --> 01:12:19
			opposition's on Twitter. And yeah,
they supplied them with all of
		
01:12:19 --> 01:12:22
			their Twitter information internal
that internally they had. And they
		
01:12:22 --> 01:12:27
			later got I think arrested by FBI
your thumb. Oh, really? Well, I
		
01:12:27 --> 01:12:29
			remember we had mine did an
experiment one time and started
		
01:12:29 --> 01:12:31
			talking about luxury car.
		
01:12:32 --> 01:12:37
			supercars. What was it supercars?
It was luxury bags, luxury, luxury
		
01:12:37 --> 01:12:38
			women's bags, like,
		
01:12:39 --> 01:12:44
			you know, these $1,000 handbags,
just chit chatting about it with
		
01:12:44 --> 01:12:45
			his phone off.
		
01:12:46 --> 01:12:50
			It only took about a day or two
days, three, four days. Yeah. And
		
01:12:50 --> 01:12:53
			so I tried to figure out why this
happened. Because I don't have
		
01:12:54 --> 01:12:59
			this was about seven years ago,
right? Yeah, at that time, this
		
01:12:59 --> 01:13:02
stuff wasn't as prevalent. And at that time, I didn't have
		
01:13:03 --> 01:13:06
Siri turned on. Anybody who knows me knows I don't use Siri. I don't
		
01:13:06 --> 01:13:09
have any of that turned on. My wife doesn't have Siri or anything turned
		
01:13:09 --> 01:13:14
			on. I had no application that was
listening. But the only
		
01:13:14 --> 01:13:18
application I had was WhatsApp. And you actually
		
01:13:18 --> 01:13:21
			explicitly grant permission for
the microphone on WhatsApp to do
		
01:13:21 --> 01:13:25
voice notes. Now, WhatsApp, Facebook, claims that they don't
		
01:13:25 --> 01:13:27
			listen outside of that. And it's
only when you press that button
		
01:13:27 --> 01:13:31
			that they have access. But I can
no longer trust these
		
01:13:31 --> 01:13:34
			organizations and what they're
saying that they do, right it's
		
01:13:35 --> 01:13:39
			now could you say that that's
illegal? Could you sue Facebook?
		
01:13:39 --> 01:13:43
If you found out? Yeah, sure. If you found out and you
		
01:13:43 --> 01:13:46
			could make a case against it and
sue them then yes, but until
		
01:13:46 --> 01:13:49
			somebody brings that up and
actually does it then you know,
		
01:13:50 --> 01:13:53
they don't. Do you think some application like TikTok is
		
01:13:53 --> 01:13:56
listening? Anybody who's listening to the stream: if you have TikTok
		
01:13:56 --> 01:13:59
installed on your phone, please delete it. If you have Facebook
		
01:13:59 --> 01:14:02
			Messenger on your phone please
delete it. If you want to go on a
		
01:14:02 --> 01:14:04
			desktop and use it they're most
definitely spying on you
		
01:14:07 --> 01:14:11
TikTok, you should remove it, should make them remove it, yeah.
		
01:14:12 --> 01:14:14
Yeah, TikTok is the number one enemy right
		
01:14:14 --> 01:14:19
now of any kind of human decency, in terms of the content or in terms
		
01:14:19 --> 01:14:23
of taking the information. In terms of the content, also, it's
		
01:14:23 --> 01:14:26
Chinese spyware. So basically, they are targeting the
		
01:14:27 --> 01:14:29
			American youth and they are trying
to bring in all sorts of
		
01:14:29 --> 01:14:34
degeneracy. So they are using TikTok to convert these children,
		
01:14:34 --> 01:14:38
teenagers to all these, you know, LGBTQ identities. Yeah, it's one of the
		
01:14:38 --> 01:14:39
			number one converter
		
01:14:40 --> 01:14:45
In China, what's the status? TikTok is banned in China. They banned
		
01:14:45 --> 01:14:45
			it.
		
01:14:47 --> 01:14:52
Let's go to deep fakes now. Deepfake technology is, in
		
01:14:52 --> 01:14:55
			a sense, amazing, but it's
actually terrible at the same
		
01:14:55 --> 01:14:57
			time. If you don't know what a
deep fake is for those listening,
		
01:14:57 --> 01:15:00
			the deep fake has the ability
		
01:15:00 --> 01:15:05
to mash up a person's voice, mouth, and facial expressions and make
		
01:15:05 --> 01:15:09
			a full video of the person saying
something they never said. Now
		
01:15:09 --> 01:15:13
this has been around for a while. And I remember Key and Peele,
		
01:15:13 --> 01:15:17
there was one out there that they made of Obama trashing
		
01:15:17 --> 01:15:21
			Trump, which was hilarious. And
Obama was saying stuff, you know,
		
01:15:21 --> 01:15:25
in the beginning, it's fine, right? They get you
		
01:15:25 --> 01:15:29
			slowly, then eventually they start
making Obama say things that, you
		
01:15:29 --> 01:15:33
			know, he would never say, right.
And it was hilarious, but it was
		
01:15:33 --> 01:15:38
			scary in that the video looks so
real. And then the starting point
		
01:15:38 --> 01:15:41
of that, or that was a starting point of saying that
		
01:15:43 --> 01:15:48
very soon, video evidence will have to have support. Video, by
		
01:15:48 --> 01:15:52
			itself should mean nothing to
people. Let me bring you another
		
01:15:52 --> 01:15:56
situation for that. This doesn't have to do with deep fakes, but there was a
		
01:15:56 --> 01:16:02
kid one time who, it seemed, was harassing an old Native
		
01:16:02 --> 01:16:05
			American, beating his drum.
		
01:16:06 --> 01:16:11
			And it seemed like the kid was
staring him down. Whereas the fact
		
01:16:11 --> 01:16:15
the truth was the exact opposite. The kid was giving his speech,
		
01:16:15 --> 01:16:19
			which was his right to give. And
then the Native American came up
		
01:16:19 --> 01:16:21
			to him and started beating the
drum. So there's two aspects,
		
01:16:21 --> 01:16:24
			there's one aspect of the false
clippings
		
01:16:25 --> 01:16:30
the misleading clips. But this is another level; we're not even
		
01:16:30 --> 01:16:34
			talking about misleading clips.
Out of context clips, we're
		
01:16:34 --> 01:16:37
			talking about literally, the
person never uttered a single word
		
01:16:37 --> 01:16:40
			of this. And the technology to get
to doing this is going to
		
01:16:40 --> 01:16:44
eventually someday be one of these fun apps, like these
		
01:16:44 --> 01:16:48
experimental apps that they release for fun, and everyone
		
01:16:48 --> 01:16:49
			will just type in
		
01:16:51 --> 01:16:55
			Barack Obama uttering the shahada,
right, so and so, you know,
		
01:16:55 --> 01:16:58
			cursing somebody else. And then
		
01:16:59 --> 01:17:01
			they're already there, they just
haven't trickled down to the
		
01:17:01 --> 01:17:04
			everyday user. So talk to us about
that. Let's go into fees first,
		
01:17:04 --> 01:17:04
			then we.
		
01:17:06 --> 01:17:07
			So the the
		
01:17:10 --> 01:17:15
			the main point of discussion here
is not the technology, but how
		
01:17:15 --> 01:17:19
how quickly people are getting used to it. Yeah. So
		
01:17:20 --> 01:17:24
			suppose right now, a clip comes
out of Biden saying something
		
01:17:27 --> 01:17:31
an anonymous clip on the forums that came out, and he's
		
01:17:31 --> 01:17:32
			saying something, really,
		
01:17:34 --> 01:17:35
			really, really, you know,
controversial.
		
01:17:37 --> 01:17:41
			Most people right now will believe
it. And it will, even if they
		
01:17:41 --> 01:17:44
claim that, okay, it was a deep fake.
		
01:17:46 --> 01:17:49
I don't think that would fly, because it's
		
01:17:50 --> 01:17:54
			the technology is just not out
there yet. But I don't know if
		
01:17:54 --> 01:17:56
			people know not enough people know
about it. Whereas everyone knows
		
01:17:56 --> 01:18:01
			about Adobe, you know,
photoshopping something. Yes.
		
01:18:01 --> 01:18:06
Yeah. So once these models are, like, productionized, like they
		
01:18:06 --> 01:18:08
			are in apps, different apps, where
you are using them for like
		
01:18:08 --> 01:18:13
innocent, innocuous uses, then it will be
		
01:18:13 --> 01:18:16
			much more difficult to decipher
them. Yeah. And then you're going
		
01:18:16 --> 01:18:23
to need forensic AV guys to be able to tell us what's a
		
01:18:23 --> 01:18:27
			deep fake, and what's real, you're
gonna need like forensic editors,
		
01:18:27 --> 01:18:33
forensic means going into the granular, smallest possible
		
01:18:34 --> 01:18:37
			identifiable trait in something.
So you have forensic accountants,
		
01:18:38 --> 01:18:40
			and then forensic scientists,
everything.
		
01:18:41 --> 01:18:44
On the other hand, people will also come up with,
		
01:18:44 --> 01:18:47
			like, reverse models, yeah, where
they will be able to, like predict
		
01:18:48 --> 01:18:52
if this video was, like, a real clip or generated by
		
01:18:52 --> 01:18:57
another model. So we have to see how those two face
		
01:18:57 --> 01:19:01
off. So could you repeat that? So it will be like a reverse
		
01:19:01 --> 01:19:04
			model, where the model takes in a
video, and then it can predict
		
01:19:04 --> 01:19:08
like whether it was a genuine clip or whether it was generated
		
01:19:08 --> 01:19:12
by another AI model. So that's what we're gonna need.
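A rough sketch of what such a "reverse model" amounts to; the feature extractor here is only a placeholder, since real detectors learn their features (blending seams, lighting inconsistencies, frequency artifacts) from large labeled sets of genuine and generated clips:

```python
# Sketch: frame-level real-vs-generated classifier with placeholder features.
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_features(frame: np.ndarray) -> np.ndarray:
    # Placeholder statistics; real systems use learned (CNN) features.
    return np.array([frame.mean(), frame.std()])

def train_detector(frames, labels):
    # labels: 1 = generated by a model, 0 = genuine footage
    X = np.stack([extract_features(f) for f in frames])
    return LogisticRegression().fit(X, labels)

def fake_probability(model, frames) -> float:
    X = np.stack([extract_features(f) for f in frames])
    return float(model.predict_proba(X)[:, 1].mean())
```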
		
01:19:12 --> 01:19:15
And for example, with ChatGPT right now, kids have already gotten around that:
		
01:19:15 --> 01:19:18
there is an app that can identify if your essay was made by
		
01:19:18 --> 01:19:22
ChatGPT. But youth have already gotten around that by throwing it into
		
01:19:22 --> 01:19:26
Google Translate, putting it into Spanish, copying that, then
		
01:19:26 --> 01:19:30
translating that back to English. You lose all that; you throw
		
01:19:30 --> 01:19:33
			in a couple of your own words. And
the kid probably ended up spending
		
01:19:33 --> 01:19:37
			another hour of work right? To
avoid 40 minutes of work, right?
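The workaround being described is just a paraphrase by round-trip translation; sketched below with a stand-in translate function rather than any particular translation service:

```python
# 'translate' is a stand-in callable: translate(text, src="en", dest="es") -> str
def round_trip(text: str, translate, pivot: str = "es") -> str:
    # English -> Spanish -> English rewords the text enough to confuse
    # naive AI-text detectors (and usually degrades the writing too).
    pivoted = translate(text, src="en", dest=pivot)
    return translate(pivoted, src=pivot, dest="en")
```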
		
01:19:38 --> 01:19:42
Anyway. So, Moin, your take on this, on deep
		
01:19:42 --> 01:19:46
fakes. Deep fakes: soon a video will not be sufficient
		
01:19:46 --> 01:19:51
evidence in a legal court. But in the court of public opinion, it's
		
01:19:51 --> 01:19:55
			going to be a disaster for a lot
of people. Yeah, so I think let's
		
01:19:55 --> 01:19:59
go back a little bit on the history of deep fakes.
		
01:20:00 --> 01:20:04
			Because I think there's something
even more nefarious here than just
		
01:20:04 --> 01:20:08
			the idea of like this legal,
criminal incrimination evidence
		
01:20:08 --> 01:20:12
and all that stuff; I think Nafees covered that pretty well. Deep
		
01:20:12 --> 01:20:16
			fakes originally started with
*, right? Because it, it
		
01:20:16 --> 01:20:19
started off of, I think, some Reddit forum or some 4chan forum
		
01:20:20 --> 01:20:25
			where somebody made a video of a
celebrity, and it was based on,
		
01:20:26 --> 01:20:31
			you know, them doing the act, or
whatever. And it was a deep fake,
		
01:20:31 --> 01:20:34
			right. And it started from that,
and the growth of the deep fake
		
01:20:34 --> 01:20:39
			technology actually happened from
*, right? People wanted to put
		
01:20:39 --> 01:20:43
			their celebrities in various
different * videos. And based
		
01:20:43 --> 01:20:46
			on this, they wanted to build it.
There's actually a saying in
		
01:20:46 --> 01:20:48
			technology that if you really want
to see the most advanced
		
01:20:48 --> 01:20:50
			technology out there go to the
* industry, because they have
		
01:20:50 --> 01:20:54
			the most money and the most people
working there. Anybody wants to
		
01:20:54 --> 01:20:56
			work on that stuff? I mean, yeah,
you have to be a little bit
		
01:20:56 --> 01:20:58
			degenerate to want to work on it,
but they have a lot of funding and
		
01:20:58 --> 01:21:02
			a lot of money to build this
stuff. But what becomes very
		
01:21:02 --> 01:21:07
			nefarious, is this idea of deep
fakes being used to place people
		
01:21:07 --> 01:21:11
			in any sort of situation, right?
It might not even be legally
		
01:21:11 --> 01:21:17
			incriminating, right? But would
you want a picture of yourself in
		
01:21:17 --> 01:21:20
			like this, you know, sexual
escapade that somebody has, would
		
01:21:20 --> 01:21:23
			you want a picture of your your
children, your daughters, your
		
01:21:23 --> 01:21:26
			mothers, your sisters, any of
these types of things, people by
		
01:21:26 --> 01:21:27
			the way,
		
01:21:28 --> 01:21:32
			one of the most disturbing things
I've ever seen in my life is there
		
01:21:32 --> 01:21:39
			is a, there was a secret subreddit
of people who had made fake images
		
01:21:39 --> 01:21:43
			using random Muslim sisters
pictures, and it was like for
		
01:21:43 --> 01:21:46
			* material, and they
like unclothed them and put them
		
01:21:46 --> 01:21:51
on, you know, different people, right? The most disturbing
		
01:21:51 --> 01:21:55
thing I've seen. But they used, like, this face-swap technology;
		
01:21:55 --> 01:21:58
			there's a bunch of these other AI
tools out there to make it look
		
01:21:58 --> 01:22:01
			almost, you know, make it look
very realistic, right, you
		
01:22:01 --> 01:22:04
			wouldn't be able to tell. And so
there was actually a version of
		
01:22:04 --> 01:22:09
Stable Diffusion called Unstable Diffusion. Stable Diffusion,
		
01:22:09 --> 01:22:13
in the first place, is an open-source image
		
01:22:13 --> 01:22:18
generation tool, like Midjourney or DALL-E, or any of these
		
01:22:18 --> 01:22:21
tools. So Stable Diffusion is the open-source version. And there was
		
01:22:21 --> 01:22:26
Unstable Diffusion, which was the open-source version to create
		
01:22:26 --> 01:22:30
			*, right? To just create
naked images. So you could say,
		
01:22:30 --> 01:22:35
			hey, I need a naked image of x
actress or y actor, and you would
		
01:22:35 --> 01:22:37
			say, like, I want them in this
position, and it would give it to
		
01:22:37 --> 01:22:43
			you. Yeah, right now, there was
work being done. And this, this
		
01:22:43 --> 01:22:45
project got shut down, and you can't find it now. I'm sure it
		
01:22:45 --> 01:22:47
			probably exists somewhere and
people are working on it, right.
		
01:22:48 --> 01:22:52
			But there was an effort to be able
to upload your category, your
		
01:22:52 --> 01:22:56
			category of images and say, Hey,
here's this entire like folder I
		
01:22:56 --> 01:23:00
have of these women that I like, you know. Can you generate
		
01:23:00 --> 01:23:04
an image of these women and myself doing such and such a thing? And it
		
01:23:04 --> 01:23:07
			will generate it for you? Right?
This is something that's even more
		
01:23:07 --> 01:23:12
			nefarious, right? And it could be
something that's not even legally
		
01:23:12 --> 01:23:15
			incriminating; you could use it to
place anybody anywhere at any
		
01:23:15 --> 01:23:19
			time. Right? This is how it can be
misused. You need forensics to undo
		
01:23:19 --> 01:23:24
			this, right? Now, so this is where,
like, there can be a legal claim
		
01:23:24 --> 01:23:27
made: okay, if this
causes harm to specific
		
01:23:27 --> 01:23:32
			individuals. Do you need to get
consent before you can use
		
01:23:32 --> 01:23:37
			somebody's picture online inside
of this database? Yeah, I don't
		
01:23:37 --> 01:23:41
			know. Right. So, well, the court:
that was one of the reasons
		
01:23:41 --> 01:23:45
			that awareness of this technology
actually sort of needed to speed
		
01:23:45 --> 01:23:50
			up in terms of the trickle down.
Because the court of public
		
01:23:50 --> 01:23:55
			opinion is still not there yet.
Like not enough people know about
		
01:23:55 --> 01:23:58
			deep fakes, you can easily destroy
someone's life on a deep fake
		
01:23:58 --> 01:24:01
			right now, because not enough
people know, it's a deep fake,
		
01:24:01 --> 01:24:05
			right? And then saying it's a
deep fake wouldn't even be
		
01:24:05 --> 01:24:08
			plausible or believable. Right?
		
01:24:09 --> 01:24:13
			So it's, it's in everyone's best
interest that this knowledge at
		
01:24:13 --> 01:24:17
			least of deep fakes trickles down
to everybody. So everyone knows
		
01:24:17 --> 01:24:20
			it. So then eventually, all
certain types of videos will be
		
01:24:20 --> 01:24:24
cast aside, right, will not be
treated as evidence.
		
01:24:24 --> 01:24:28
			Unfortunately, I think that's only
going to happen for a short period
		
01:24:28 --> 01:24:32
			of time. And this is me being a
little bit pessimistic. But you
		
01:24:32 --> 01:24:37
			can see that the Overton Window on
degeneracy has shifted a lot over
		
01:24:37 --> 01:24:40
			the last, you know, 15 years what
was considered something like
		
01:24:40 --> 01:24:44
			* in 1980 is now
considered like PG 13 material,
		
01:24:44 --> 01:24:48
			right? Like that's considered
nothing. Right? You have decent
		
01:24:48 --> 01:24:52
			people, right? Even people like
you and I, who have probably
		
01:24:52 --> 01:24:57
			watched like a normal TV show, and
it has, like, you know, explicit
		
01:24:57 --> 01:24:59
			content that probably would have
been considered * in
		
01:25:00 --> 01:25:04
			'95. Right? Like, the Overton
Window on what is acceptable has
		
01:25:04 --> 01:25:09
			shifted so far that I'm afraid
that even if something like this
		
01:25:09 --> 01:25:12
			comes out, and there's deep fakes
of regular people, I'm afraid that
		
01:25:12 --> 01:25:15
			most people will be like, Well,
that's obvious. There's going to
		
01:25:15 --> 01:25:18
			be deep fakes of me out there
somebody's you know, making images
		
01:25:18 --> 01:25:22
			and videos of me and using them.
Yeah, I'm afraid that that is
		
01:25:22 --> 01:25:24
			going to be like, well, that's
obvious, that's what
		
01:25:24 --> 01:25:28
			you get with technology. I'm
afraid, I'm pretty sure, that
		
01:25:28 --> 01:25:31
			people are just gonna
capitulate to that stuff. But at
		
01:25:31 --> 01:25:35
the same time, we're still at
the point where a clip, a video
		
01:25:35 --> 01:25:40
			clip can destroy someone's life.
Yes. And we haven't
		
01:25:40 --> 01:25:44
			yet seen a situation where someone
responded and said, No, no, that's
		
01:25:44 --> 01:25:48
			a deep fake. Like, have we
had a scandal like that, where a
		
01:25:48 --> 01:25:51
			scandal literally stopped in its
tracks, because the person said,
		
01:25:51 --> 01:25:55
			it's a deep fake? The
reason for it is just the
		
01:25:55 --> 01:26:00
			technology is just not really at
that point where it's, like,
		
01:26:00 --> 01:26:03
			convincing. Yeah, like, yeah, it's
not convincing yet,
		
01:26:03 --> 01:26:07
			like it still has
glitches. But you know that, like
		
01:26:07 --> 01:26:10
			at the root of much more
rudimentary things, such as a
		
01:26:10 --> 01:26:15
			tweet, you can easily fake
something, a tweet or a text, put it
		
01:26:15 --> 01:26:20
			on someone else's tweet, and then
share that on a different
		
01:26:20 --> 01:26:24
			platform altogether. You will have
people for years
		
01:26:24 --> 01:26:28
			attributing that fake tweet to
that other person. Oh, yeah.
		
01:26:28 --> 01:26:33
			Leftist websites? Oh yeah, they
do this all the time. Yeah. So
		
01:26:33 --> 01:26:34
			there's actually
		
01:26:35 --> 01:26:40
			there's actually something I don't
want to get bogged down into
		
01:26:40 --> 01:26:43
			philosophy here. But there's
actually a postmodernist
		
01:26:44 --> 01:26:48
			philosopher by the name of Jean
Baudrillard. The Matrix movies
		
01:26:48 --> 01:26:52
			were actually made on some of his
philosophy, right, that you have
		
01:26:52 --> 01:26:56
			these levels of simulation in a
world, and you can reach a point.
		
01:26:56 --> 01:26:59
			So he brings a story: you
have the world,
		
01:27:00 --> 01:27:04
			right, and you have this empire.
And in this empire, the king
		
01:27:04 --> 01:27:06
			decided to hire some
cartographers, and they decided to
		
01:27:06 --> 01:27:09
			make a map of this empire. And
eventually, they made this map and
		
01:27:09 --> 01:27:13
			it was so large that it covered
the entirety of the Empire. Right.
		
01:27:14 --> 01:27:17
			And this is how big this map was.
And eventually, what happens is
		
01:27:17 --> 01:27:21
			that the terrain, the Empire that
it was based on, it withered away,
		
01:27:21 --> 01:27:25
			and it disappeared. And now all
you're all you're left with is the
		
01:27:25 --> 01:27:29
			map itself. And so then when
people 100 years later, they
		
01:27:29 --> 01:27:32
			commonly say, Oh, well, this is
what the Empire is, well, they're
		
01:27:32 --> 01:27:35
			not looking at the Empire. They're
looking at the map itself. They're
		
01:27:35 --> 01:27:38
			looking at this, what he
calls a simulacrum. You're looking
		
01:27:38 --> 01:27:42
			at this, you know, representation
of reality, but it's not reality.
		
01:27:42 --> 01:27:44
			Right? And that's what the Matrix
movies were kind of based on. Wow. Right,
		
01:27:44 --> 01:27:48
			which is, you have this reality,
but it's not really the real
		
01:27:48 --> 01:27:52
			thing. And so I don't think this
thing that you're talking about is
		
01:27:52 --> 01:27:56
			going to become plausible, until
we reach a stage at which we reach
		
01:27:56 --> 01:27:59
			hyperreality, which is where there are
certain things which are more real
		
01:27:59 --> 01:28:05
			than the real thing. Which is,
people trust the image, or the
		
01:28:05 --> 01:28:09
			video, or the thing that's there
more than they do the real person,
		
01:28:09 --> 01:28:12
			because they're gonna say, Well,
how can we trust you, you're just
		
01:28:12 --> 01:28:15
			the guy. But we can trust the
data, we can trust the machine, we
		
01:28:15 --> 01:28:18
			can trust all of the history of
everything that's there and your
		
01:28:18 --> 01:28:22
			data, and we can verify it. And
here's all these things. Here's
		
01:28:22 --> 01:28:24
			this is what your doctor said,
This is what your mother said,
		
01:28:24 --> 01:28:26
			This is what xy said, this is
where you were, this is what's
		
01:28:26 --> 01:28:29
			happening. You're telling me
you're not in the video, but all
		
01:28:29 --> 01:28:33
			evidence shows that this is you in
the middle? Well, we already have
		
01:28:33 --> 01:28:36
			that, in a sense, through DNA. The
most honest person in the world
		
01:28:36 --> 01:28:40
			could tell you I wasn't there. And
then they'll say, well, your DNA is all
		
01:28:40 --> 01:28:45
			over the doorknobs, your DNA is
all over the pillow, right? And
		
01:28:45 --> 01:28:47
			we're gonna choose the DNA, most
people will believe the DNA over
		
01:28:47 --> 01:28:50
			the person. And what are you going
to do in the situation where
		
01:28:50 --> 01:28:54
			someone says, you know, I wasn't
there, but they say, well, the
		
01:28:54 --> 01:28:57
			data shows that you were there? Yeah.
Well, what?
		
01:28:58 --> 01:29:00
			Well, let's say somebody, let's
say,
		
01:29:01 --> 01:29:04
			for example, if somebody's making
a fake video, right, I'm imagining
		
01:29:04 --> 01:29:07
			along with this fake video, they
probably did all the things,
		
01:29:07 --> 01:29:09
			right. Because in the future,
let's say somebody's trying to
		
01:29:09 --> 01:29:12
			make a fake video to impersonate
someone or incriminate someone.
		
01:29:13 --> 01:29:16
			They're also changing, like where
they were at a certain location or
		
01:29:16 --> 01:29:19
			how that happens. And there's data
points that you're they're also
		
01:29:19 --> 01:29:21
			messing with. So somebody could
look at it and say, well, the
		
01:29:21 --> 01:29:25
			evidence is there. It's saying
that you're here. And all the data
		
01:29:25 --> 01:29:28
			points to the fact that you were
there. So your word, and the
		
01:29:28 --> 01:29:32
			word of your witnesses is not as
strong as the data.
		
01:29:33 --> 01:29:38
			So this is an issue. So we covered
some of the basic facts on how AI
		
01:29:38 --> 01:29:43
			works, we covered who's in the
lead with AI. Okay. We covered
		
01:29:43 --> 01:29:48
			some of the medical stuff, which I
think is going to be the biggest
		
01:29:48 --> 01:29:55
			right, curing leukemia, and using
AI models, generative models, for
		
01:29:55 --> 01:29:58
			that type of work is going to be
probably the biggest and most
		
01:29:58 --> 01:30:00
			important advance
		
01:30:00 --> 01:30:03
			And then we talked about some of
the practical aspects of life,
		
01:30:04 --> 01:30:08
			including data or personal data
being out there, deep fakes not
		
01:30:08 --> 01:30:13
			being sound evidence for anything,
eventually. So I think we're gonna
		
01:30:13 --> 01:30:15
			get into another chat,
		
01:30:16 --> 01:30:20
			another discussion we need to
have: we need to talk about the
		
01:30:20 --> 01:30:24
			metaverse and how that's going to
affect people psychologically, and
		
01:30:24 --> 01:30:27
			how that's going to affect their
view of actual reality in the
		
01:30:27 --> 01:30:33
			world. Right. So we need to have
these tech conversations, like
		
01:30:33 --> 01:30:37
			once every two months to stay up
to date, because you can't be
		
01:30:38 --> 01:30:40
			behind the eight ball in these
things, you got to be ahead of it
		
01:30:40 --> 01:30:46
			ahead of the curve. And so with
that, you guys can hang out and
		
01:30:46 --> 01:30:50
			stay out. Let's turn it to the
audience here. Let's turn it to
		
01:30:50 --> 01:30:54
			everybody who has a comment or
question on Instagram. If you're
		
01:30:54 --> 01:30:57
			on Instagram, you could still
listen in but you can also watch
		
01:30:57 --> 01:31:01
			the video on YouTube, the full
video on YouTube. And let's start
		
01:31:01 --> 01:31:05
			going to any questions that we
have. All right.
		
01:31:06 --> 01:31:10
			We have Nevus Hamid here... no, we
have Nafisa Hamid, and in relation,
		
01:31:11 --> 01:31:16
			a Ziva. Oops, I misread it, okay.
Oh, your sister, mashallah. So
		
01:31:16 --> 01:31:19
			your parents chose to name you
Nafisa and then Ziva.
		
01:31:19 --> 01:31:21
			Okay, it was just a game, right?
		
01:31:23 --> 01:31:26
			Or is it a play on the names?
Okay, so.
		
01:31:29 --> 01:31:33
			All right. Let's go to comments
and questions here. If you're
		
01:31:33 --> 01:31:38
			having depression, Melody 21 says,
Just wait. All right. You know,
		
01:31:38 --> 01:31:44
			just wait until everyone's
on the metaverse, right? When you
		
01:31:44 --> 01:31:49
			take that off, this world will
seem to you to be less colorful,
		
01:31:49 --> 01:31:53
			less everything. And especially
when they put up the haptic body
		
01:31:53 --> 01:31:56
			suit in which like you would be
able to shake someone's hands in
		
01:31:56 --> 01:31:59
			the metaverse and you would feel
it, they would tap you on the
		
01:31:59 --> 01:32:01
			back, you would feel it they would
touch you in a pleasurable way you
		
01:32:01 --> 01:32:02
			would feel it.
		
01:32:04 --> 01:32:07
			Where's depression gonna go then,
when you take that off of somebody?
		
01:32:07 --> 01:32:10
			that's going to be a crack
addiction, basically a heroin
		
01:32:10 --> 01:32:13
			addiction? Let me ask you a
question. Yeah.
		
01:32:15 --> 01:32:18
			They are. Well, let me ask you a
question. Okay.
		
01:32:19 --> 01:32:22
			If I asked you to draw me a
princess, what would you?
		
01:32:24 --> 01:32:29
			What? Repeat it. If I asked you to
draw me a princess? Yeah. What
		
01:32:29 --> 01:32:29
			would you draw?
		
01:32:30 --> 01:32:35
			Probably a Disney princess is what's on
my mind. Right? Like, a Snow White
		
01:32:35 --> 01:32:40
			princess, a Cinderella princess.
So your idea of what a princess
		
01:32:40 --> 01:32:46
			is, has already been influenced
right in the immediate shot.
		
01:32:47 --> 01:32:50
			Because what you what you think
about a princess, like you're
		
01:32:50 --> 01:32:54
			unable to even comprehend the
reality that there could be some
		
01:32:54 --> 01:32:57
			other version of a princess that
isn't like Belle, or Cinderella,
		
01:32:57 --> 01:33:00
			or whatever these Disney
Princesses are, right? So people
		
01:33:00 --> 01:33:04
			think that they're going to go
into the metaverse, and they're
		
01:33:04 --> 01:33:06
			going to enter this simulation and
start believing all these things.
		
01:33:06 --> 01:33:10
			No, no, you already believe all
those things. Yeah. This is the
		
01:33:10 --> 01:33:14
			whole point of understanding like
today's world, right?
		
01:33:14 --> 01:33:18
			You've already been fed all of
these beliefs. And these ideas.
		
01:33:18 --> 01:33:22
			This is just another level of the
simulation. You could say right?
		
01:33:22 --> 01:33:25
			You're already in the simulation,
right? You believe what a princess
		
01:33:25 --> 01:33:28
			is what love is, if I asked you
what love is, you'll tell me like,
		
01:33:28 --> 01:33:30
			oh, you know, some X, Y and Z
movie. You tell me Gone With the
		
01:33:30 --> 01:33:33
			Wind this, this, this and that.
Right? This is love Romeo and
		
01:33:33 --> 01:33:36
			Juliet. What Shakespeare? You're
saying we've been influenced from
		
01:33:36 --> 01:33:38
			the outside in? Yeah, this
		
01:33:39 --> 01:33:43
			is really, yeah, by strangers by
ideologies that have kind of, you
		
01:33:43 --> 01:33:47
			know, taken a hold of people. And
now they believe all these things.
		
01:33:47 --> 01:33:49
			But anyways, yeah, so I don't want
to get too into that. But yeah,
		
01:33:49 --> 01:33:53
			but an immediate, immediate
example would be like, if you are
		
01:33:53 --> 01:33:56
			a heavy social media, Facebook or
Twitter user. And if you have
		
01:33:56 --> 01:34:01
			friends there, and you haven't met
them for a long time, you already
		
01:34:01 --> 01:34:04
			have a certain kind of image of
them. Yeah, like of what they are,
		
01:34:04 --> 01:34:08
			what their favorite things are, how
they're acting. But that
		
01:34:08 --> 01:34:11
			will probably be completely
different from what they really
		
01:34:11 --> 01:34:15
			are in real life. Yeah,
it's good. And the more someone,
		
01:34:15 --> 01:34:18
			the more that there's a separation
of the online person and the on
		
01:34:18 --> 01:34:22
			site person, the more weirdness is
going to develop, like, for
		
01:34:22 --> 01:34:26
			example, there's a lot of people
that are there in the real world,
		
01:34:26 --> 01:34:28
			and they're online at the same
time. Right? There's a lot of
		
01:34:28 --> 01:34:31
			people like that. So there's
probably going to be more
		
01:34:31 --> 01:34:34
			consistency between their online
and their real-world selves.
		
01:34:34 --> 01:34:40
			Once you have someone with no real
world footprint out there, but a
		
01:34:40 --> 01:34:43
			ton of online presence, you know
that that that online presence
		
01:34:43 --> 01:34:47
			becomes less and less reliable. So
the question is the reliability of
		
01:34:47 --> 01:34:51
			the ability to assess character or
assess the person. If there's a
		
01:34:51 --> 01:34:53
			big gap between your online
presence and your real world
		
01:34:53 --> 01:34:57
			presence. We're going to say your
online presence is not reliable.
		
01:34:58 --> 01:34:59
			Whereas when there's a lot of
online
		
01:35:00 --> 01:35:03
			and real-world overlap, then
probably your online is a reliable
		
01:35:03 --> 01:35:07
			reflection of you. The one piece
of advice I can give for most
		
01:35:07 --> 01:35:13
			people with regards to AI coming,
okay, it's coming, it's
		
01:35:13 --> 01:35:16
			inevitable, whether it's going to
affect your job or not, yeah, I
		
01:35:16 --> 01:35:19
			can give you a plethora of
advice. Like, if you're in this
		
01:35:19 --> 01:35:20
			field, you can do this, you can
probably find all that stuff on
		
01:35:20 --> 01:35:24
			the internet. Right? But one
unconventional piece of advice
		
01:35:24 --> 01:35:26
			that you're not going to find on
the internet that, you know, I,
		
01:35:27 --> 01:35:31
			I've understood just from, you
know, our understanding of the
		
01:35:31 --> 01:35:33
			world as Muslims is
		
01:35:35 --> 01:35:39
			the way that we understand the
world now is so convoluted and
		
01:35:39 --> 01:35:43
			impacted from all these like
random ideologies. The one thing
		
01:35:43 --> 01:35:47
			you can prepare for, for the next
version of the simulation, as we
		
01:35:47 --> 01:35:50
			could say, you know, when the AI
comes in, you know, the world is
		
01:35:50 --> 01:35:53
			impacted from all this stuff, is
prepare yourself mentally,
		
01:35:53 --> 01:35:57
			spiritually, right emotionally,
for the things that are going to
		
01:35:57 --> 01:36:00
			be coming, right. So for example,
I'll give you a very dangerous
		
01:36:00 --> 01:36:00
			example.
		
01:36:01 --> 01:36:08
			Someone starts putting data into a fiqh
bot. Let's say there's a fiqh bot
		
01:36:08 --> 01:36:11
			out there, right? Somebody, and we
can probably make this now, I'm
		
01:36:11 --> 01:36:15
			sure some organization like Yaqeen
is already working on it, right? There's
		
01:36:15 --> 01:36:19
			an AI fiqh bot, where you can go
ask a question. And in this fiqh
		
01:36:19 --> 01:36:24
			bot, as part of the
data, one of the chains of one of
		
01:36:24 --> 01:36:28
			the hadith was missing, right. And
then over time, people stop
		
01:36:28 --> 01:36:31
			memorizing the chains. And all
they remember is that they're
		
01:36:31 --> 01:36:34
			getting this information from this
fiqh bot. Now people start
		
01:36:34 --> 01:36:36
			learning from this fiqh bot. And
eventually it comes to the point that people
		
01:36:36 --> 01:36:39
			don't actually know the original
chain for this hadith, right?
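
To illustrate the data-loss worry in that hypothetical, here is a small, entirely made-up sketch; the narrator names and text are placeholders, not real hadith data or any real product. It shows how a naive retrieval step that returns only the matn (the text) silently drops the isnad (the chain), while a careful one keeps the chain attached.

# A hypothetical sketch with placeholder data: how a naive "fiqh bot"
# retrieval step can silently drop the isnad (chain of narration)
# if it only returns the matn (text). All names below are placeholders.
from typing import Optional

hadith_db = [
    {
        "matn": "Example hadith text (placeholder).",
        "isnad": ["Narrator C", "Narrator B", "Narrator A (Companion)"],
        "source": "Placeholder collection, no. 1",
    },
]

def naive_answer(query: str) -> Optional[str]:
    # Returns only the text field; the chain never reaches the user,
    # so downstream readers never see or memorize it.
    for h in hadith_db:
        if query.lower() in h["matn"].lower():
            return h["matn"]
    return None

def careful_answer(query: str) -> Optional[str]:
    # Keeps provenance attached to every answer.
    for h in hadith_db:
        if query.lower() in h["matn"].lower():
            chain = " <- ".join(h["isnad"])
            return f'{h["matn"]} [isnad: {chain}; {h["source"]}]'
    return None

print(naive_answer("example"))    # text only, chain silently lost
print(careful_answer("example"))  # text plus chain and source

Whatever the pipeline drops is exactly what the next generation of users never even learns to ask for.
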
		
01:36:39 --> 01:36:43
			You're going to enter into a world
where, you know, it's, it's going
		
01:36:43 --> 01:36:46
			to be really difficult to
differentiate between what is
		
01:36:46 --> 01:36:50
			truth and what is false. Right? So
that's where, you know, clinging
		
01:36:50 --> 01:36:54
			on to the ulama is critical. That's
good. That's a good point. Here's
		
01:36:54 --> 01:36:58
			another point. There's going to be
a massive disconnection, if we
		
01:36:58 --> 01:37:01
think that the 17-year-old today is
disconnected from the 60-year-old,
		
01:37:02 --> 01:37:07
			right? And that 60-year-old is
going to say to his, let's say his
		
01:37:07 --> 01:37:10
			grandson, man, when I was young,
we used to ride our bike and knock
		
01:37:10 --> 01:37:14
			on the doors of our friends to
come out and play, right? And the
		
01:37:14 --> 01:37:17
			17-year-old is like, or the 12-
year-old is like, what is this,
		
01:37:17 --> 01:37:20
			right? Am I gonna hear these
stories from ancient times?
		
01:37:22 --> 01:37:26
			That gap, we have to be ready. If
you don't go on, if you don't want
		
01:37:26 --> 01:37:30
			to go on to VR and live on VR,
there will be a generation that
		
01:37:30 --> 01:37:34
			does live on VR. That generation
is coming. I really believe that
		
01:37:34 --> 01:37:37
			jet generation is coming. Because
Facebook is not going to rename
		
01:37:37 --> 01:37:42
			its operation meta without really
knowing full well that they're
		
01:37:42 --> 01:37:45
			headed to the metaverse, right
where you can go and they're going
		
01:37:45 --> 01:37:50
			to create a whole Metaverse and
the haptic technological suit,
		
01:37:50 --> 01:37:54
			give that another 20 years. And
people will spend five and six and
		
01:37:54 --> 01:37:58
			eight hours a day at a time. Now,
if you choose not to become a
		
01:37:58 --> 01:38:03
			crack addict, because that world
will be addictive, okay. And you
		
01:38:03 --> 01:38:06
			choose not to be on it, well then
be ready to have a very massive
		
01:38:06 --> 01:38:08
			gap between you and the next
generation.
		
01:38:09 --> 01:38:13
			Right. That's something that we
have to think about to what degree
		
01:38:13 --> 01:38:17
			is that gap okay? To what degree
is that gap dangerous? And to what
		
01:38:17 --> 01:38:20
			degree do we literally just check
out of life, it's easy to check
		
01:38:20 --> 01:38:23
			out of life. It's much harder
when someone says that your
		
01:38:23 --> 01:38:24
			grandson,
		
01:38:25 --> 01:38:28
			they need you. Right. And that's
where I think people who have
		
01:38:28 --> 01:38:31
			families are gonna get pulled,
they're always pulled into
		
01:38:31 --> 01:38:34
			adapting. People who don't have
families don't have to adapt.
		
01:38:35 --> 01:38:37
			Because I don't I don't need to
worry about the other people's
		
01:38:37 --> 01:38:37
			kids, right.
		
01:38:39 --> 01:38:42
			And adapting means that you're
going to be someone in their 30s
		
01:38:42 --> 01:38:46
			and 40s and 50s. almost looking
like a child putting on these
		
01:38:46 --> 01:38:46
			goggles.
		
01:38:48 --> 01:38:52
			And learning what this world is
all about. Right? Or do I rather
		
01:38:52 --> 01:38:54
			look at it from this
standpoint. I'm a person who's
		
01:38:54 --> 01:38:57
			going slowly towards the
afterlife. Do I really want to do
		
01:38:57 --> 01:39:01
			that? Let's say I'm 60 years old.
Do I really want to? Do I have the
		
01:39:01 --> 01:39:06
			energy and the temperament? Right.
Give the grandson a pep talk in
		
01:39:06 --> 01:39:09
			the metaverse. Yeah, exactly. Do I
have the temperament to learn a
		
01:39:09 --> 01:39:14
			brand new life altering technology
at that age? That's the those are
		
01:39:14 --> 01:39:17
			the questions that are going to
come up. And the second thing is I
		
01:39:17 --> 01:39:20
			really think it is going to be
like the drug if the cell phone is
		
01:39:20 --> 01:39:23
			a drug right now. And you still
have all your peripheral vision
		
01:39:23 --> 01:39:28
			around you imagine the metaverse
and imagine now next generation
		
01:39:28 --> 01:39:31
			the next Elon Musk is going to put
the metaverse in a little chip in
		
01:39:31 --> 01:39:32
			your brain.
		
01:39:33 --> 01:39:38
			And you're gonna see that without
goggles. I pray the Mahdi comes
		
01:39:38 --> 01:39:42
			before that, before we get to that.
But it's important. I think
		
01:39:42 --> 01:39:43
			it's
		
01:39:44 --> 01:39:48
			I think it's critical to actually
be ready because a lot of people
		
01:39:48 --> 01:39:53
			their life gets disrupted. Just
because technology came in and
		
01:39:53 --> 01:39:56
			took their next generation away
from them. And they didn't know
		
01:39:56 --> 01:39:59
			how to figure it out, they didn't
know, you know, how to manage it.
		
01:40:00 --> 01:40:01
			All right.
		
01:40:03 --> 01:40:05
			So, a lot of questions. We'll go
ahead.
		
01:40:07 --> 01:40:08
			Okay.
		
01:40:09 --> 01:40:10
			Can you read this?
		
01:40:14 --> 01:40:16
			Alright, so he said,
		
01:40:17 --> 01:40:23
			For the fees, could we potentially
use CRISPR to end neurodivergence,
		
01:40:23 --> 01:40:29
			as in, we change how the brain
develops once the symptoms of
		
01:40:29 --> 01:40:31
			autism and ADHD come
		
01:40:33 --> 01:40:36
			up. So so the critical part is
basically to find out what to
		
01:40:36 --> 01:40:43
			change, that's the most difficult
part. So the problem with these
		
01:40:43 --> 01:40:48
			diseases or especially
neurodegenerative diseases, is
		
01:40:48 --> 01:40:51
			finding what is really causing
that disease. Most of the times,
		
01:40:51 --> 01:40:55
			it's not really a single gene. So
there will be like, many, many,
		
01:40:56 --> 01:41:02
			multiple genes that work together
to cause these diseases. So So
		
01:41:03 --> 01:41:07
			then it becomes a problem, like,
what to change. And what, like,
		
01:41:07 --> 01:41:11
			you have lots of genes like where
to change, you cannot ideally
		
01:41:11 --> 01:41:15
			change like all of them, right?
Because those genes have their own
		
01:41:15 --> 01:41:22
			functions also. So basically
the hard part of
		
01:41:22 --> 01:41:24
			the research is like, what is
causing it?
		
01:41:25 --> 01:41:30
			And for, like, diseases
like ADHD, schizophrenia, we have
		
01:41:30 --> 01:41:35
			like 20 years of research. Like,
it's still, like, most of the
		
01:41:35 --> 01:41:38
			drugs haven't worked, just because
		
01:41:39 --> 01:41:42
			we haven't really found a
causal connection between a gene
		
01:41:42 --> 01:41:43
			and those diseases.
		
01:41:45 --> 01:41:47
			So, good question.
		
01:41:48 --> 01:41:53
			Yeah, so you have to have
certainty on the causal, the gene
		
01:41:53 --> 01:41:57
			that's causing it, and the side
effects, you can't just pull out a
		
01:41:57 --> 01:42:01
			domino and replace it. And then
imagine there's going to be no
		
01:42:01 --> 01:42:04
			side effects. Human beings, they're
all one whole. Right? So that's
		
01:42:04 --> 01:42:07
			why the playing around with this
stuff does have issues. And
		
01:42:07 --> 01:42:10
			probably the people who are
about to die are going to be the
		
01:42:10 --> 01:42:14
			ones who put themselves up. Right,
they're going to throw out the
		
01:42:14 --> 01:42:16
			last thread. They're the ones who
are going to be willing to be part
		
01:42:16 --> 01:42:19
			of these experiments, right?
Actually, recently a paper
		
01:42:19 --> 01:42:24
			came out where, like, they studied
the cells of six patients who
		
01:42:24 --> 01:42:28
			already died. Alzheimer patients,
and they like from that study,
		
01:42:28 --> 01:42:32
			they found a particular gene
that, like, really,
		
01:42:33 --> 01:42:36
			it was producing extra
protein in the diseased patients'
		
01:42:36 --> 01:42:37
			brains. So
		
01:42:39 --> 01:42:43
			I'm going to close out by saying
that, I think it's extremely
		
01:42:43 --> 01:42:46
			important for everyone to go get a
book called virtues of seclusion.
		
01:42:48 --> 01:42:53
			Virtues of seclusion, and
seclusion for us in our world. It
		
01:42:53 --> 01:42:56
			levels out your brain levels out
your heart, it levels out your
		
01:42:56 --> 01:43:00
			priorities. And seclusion is not
any longer me going out to hang
		
01:43:00 --> 01:43:03
			out with people. Seclusion is
seclusion from
		
01:43:04 --> 01:43:10
			stimuli, tech stimuli. That's the
that is the seclusion of our day
		
01:43:10 --> 01:43:15
			and age, cutting off technological
stimuli. That's our seclusion,
		
01:43:15 --> 01:43:19
			right, today. In the old days,
going out and chit chatting with
		
01:43:19 --> 01:43:25
			people was your stimuli. That
today has been so downgraded. And
		
01:43:25 --> 01:43:29
			all we have is the constant
digital stimulation. cutting that
		
01:43:29 --> 01:43:32
			off is our version of seclusion.
And I believe every person at a
		
01:43:32 --> 01:43:35
			certain time, let's say you do a
practice, like seven or 8pm. And
		
01:43:35 --> 01:43:39
			you just go, I don't care who
calls, who texts, what happens in
		
01:43:39 --> 01:43:42
			the world. If a meteor hits the
world, I don't care. I'm
		
01:43:42 --> 01:43:46
			going to put all my phones in the
car, and my computer, my iPads in
		
01:43:46 --> 01:43:49
			the car, and I'm just going to
enjoy the rest of the evening. We
		
01:43:49 --> 01:43:53
			also need to know as human beings,
what diseases and what makes us
		
01:43:53 --> 01:43:56
			upset and what throws off balance
is not having a direction.
		
01:43:58 --> 01:44:01
			I remember there was a shaykh who
used to have a casket in his
		
01:44:01 --> 01:44:01
			house.
		
01:44:02 --> 01:44:06
			In his room, he had a prayer room
with a casket. When he would go
		
01:44:06 --> 01:44:09
			there, it would level out because
he feels like if that's where I'm
		
01:44:09 --> 01:44:13
			headed. Nothing else matters
except the scoreboard at that
		
01:44:13 --> 01:44:17
			moment. Right. And that actually
used to wash away a lot of his
		
01:44:17 --> 01:44:21
			whole movement of the concerns,
anxieties, fears, past present
		
01:44:21 --> 01:44:25
			future, it washes away because
this is not only this, is this the
		
01:44:25 --> 01:44:27
			only thing that matters. This is
the one thing that's a worldwide
		
01:44:27 --> 01:44:31
			guarantee. No human being will
ever dispute that you're going
		
01:44:31 --> 01:44:32
			there.
		
01:44:33 --> 01:44:37
			And now let's assess who has
something that's going to tell us
		
01:44:37 --> 01:44:38
			what's going to happen when we go
down there.
		
01:44:40 --> 01:44:43
			And the only real answers are
nothing or heaven and *.
		
01:44:44 --> 01:44:46
			Is there really a third answer? No
one really believes the Hindu
		
01:44:46 --> 01:44:51
			stuff. Right? No one's buying into
that. You come back in a different
		
01:44:51 --> 01:44:53
			form. I don't think so.
		
01:44:54 --> 01:44:58
			And even that is based on
righteousness, I think. Right? So
		
01:45:00 --> 01:45:06
			That to me is the great balancer
the great stabilizer, is that
		
01:45:06 --> 01:45:10
			casket right there. All your
decisions eventually have to go
		
01:45:10 --> 01:45:13
			through that filter. It's a binary
am I going to do this or not? does
		
01:45:13 --> 01:45:17
			it benefit my afterlife or not?
Right and that's the stabilizer.
		
01:45:17 --> 01:45:18
			So when we talk about this stuff,
		
01:45:20 --> 01:45:23
			it's a bombardment of
information. It's a bombardment of
		
01:45:23 --> 01:45:28
			stimuli. And seclusion and
remembrance of death ultimately is
		
01:45:28 --> 01:45:32
			the great wiper-away and washer-away
of all the excess that can
		
01:45:32 --> 01:45:38
			make a person dizzy, lose focus,
addicted to these things, and this
		
01:45:38 --> 01:45:41
			thing is going to dislodge all
those addictions, insha'Allah ta'ala.
		
01:45:41 --> 01:45:46
			So, we will stop here in sha Allah
today. And tomorrow we'll give more
		
01:45:46 --> 01:45:49
			time for open Q&A. All right,
because today we spent time on
		
01:45:49 --> 01:45:55
			open AI. So get the book virtues
of seclusion, in the times of
		
01:45:55 --> 01:45:58
			confusion versus the seclusion in
times of confusion from Mecca
		
01:45:58 --> 01:46:03
			books.com Support the live
[email protected] forward slash
		
01:46:03 --> 01:46:08
			Safina society. And with that, we
will see you all tomorrow in sha
		
01:46:08 --> 01:46:10
			Allah. Subhanak Allahumma wa
bihamdik,
		
01:46:11 --> 01:46:16
			nastaghfiruka wa natubu ilayk.
Wal-'asr, innal-insana lafi khusr,
		
01:46:16 --> 01:46:20
			illa alladhina amanu wa 'amilus-salihati
wa tawasaw bil-haqqi
		
01:46:20 --> 01:46:24
			wa tawasaw bis-sabr. Was-salamu alaykum
wa rahmatullah.
		
01:47:08 --> 01:47:08
			God