Yasir Qadhi – AI & Islam: Future Implications

Yasir Qadhi

AI: Summary ©

Prompted by a day-long power outage, this talk reflects on how dependent we have become on technology and turns to artificial intelligence, which is being woven into our lives at an unprecedented pace. It distinguishes niche AI (translation, facial recognition, medical diagnosis, self-driving cars) from the general AI now emerging, then raises the concerns Muslims need to engage with: machines making programmed life-and-death decisions, autonomous weapons, algorithmic filter bubbles that can be used to sway public opinion, and power shifting toward the corporations that hold the data. It also considers AI within the religion itself, from fatwa chatbots and automated hadith analysis to attempts to use AI to cast doubt on Islam, and closes with a call for Muslim experts to contribute ahead of a planned conference presentation on AI ethics.

AI: Transcript ©

00:00:00 --> 00:00:01
			So I don't know about the rest of
		
00:00:01 --> 00:00:03
			you, but, I have been without electricity all
		
00:00:03 --> 00:00:05
			day. How many of you without electricity all
		
00:00:05 --> 00:00:06
			day?
		
00:00:08 --> 00:00:10
So no, actually not everybody. Most of us
		
00:00:10 --> 00:00:11
			apparently have electricity. But,
		
00:00:12 --> 00:00:15
			so the topic therefore that I thought about
		
00:00:15 --> 00:00:17
			was exactly related to
		
00:00:18 --> 00:00:19
			a concept that I feel we need to
		
00:00:19 --> 00:00:21
			bring up. And it just so happened that
		
00:00:21 --> 00:00:22
today
		
00:00:22 --> 00:00:24
			reminded me of it.
		
00:00:24 --> 00:00:25
			Of course, overall,
		
00:00:26 --> 00:00:29
			the dependence that we have on technology,
		
00:00:30 --> 00:00:31
			it is actually frightening.
		
00:00:31 --> 00:00:32
			Because
		
00:00:33 --> 00:00:35
			today, since 6 AM, I have been without
		
00:00:35 --> 00:00:38
electricity, myself, my whole family, whatnot. And
		
00:00:38 --> 00:00:40
			in reality, it is so utterly trivial because
		
00:00:40 --> 00:00:43
for 10,000 years, mankind has lived like this.
		
00:00:44 --> 00:00:44
			But
		
00:00:45 --> 00:00:46
			for us, 12 hours,
		
00:00:47 --> 00:00:48
			it's as if we don't know how we're
		
00:00:48 --> 00:00:49
			gonna survive.
		
00:00:50 --> 00:00:51
			We are so dependent
		
00:00:52 --> 00:00:54
			on this function,
		
00:00:54 --> 00:00:57
			we have forgotten how our own grandparents
		
00:00:58 --> 00:01:00
and 10,000 years before them have lived
		
00:01:01 --> 00:01:02
			without that interdependence.
		
00:01:03 --> 00:01:05
			So today I wanted to bring up a
		
00:01:05 --> 00:01:06
			new technology
		
00:01:07 --> 00:01:10
			that is already integrating into our lives and
		
00:01:10 --> 00:01:11
			we are becoming
		
00:01:12 --> 00:01:12
			frighteningly
		
00:01:13 --> 00:01:15
			dependent on this new technology.
		
00:01:16 --> 00:01:17
			And this generation,
		
00:01:18 --> 00:01:18
			our generation
		
00:01:19 --> 00:01:20
			is gonna see the interweaving
		
00:01:21 --> 00:01:22
			of this technology
		
00:01:23 --> 00:01:24
			to an unprecedented
		
00:01:24 --> 00:01:27
			level, and we're already seeing it. And it
		
00:01:27 --> 00:01:28
			is very important.
		
00:01:28 --> 00:01:30
			Every one of us, even those that don't
		
00:01:30 --> 00:01:32
			believe in a higher power, they must be
		
00:01:32 --> 00:01:34
			talking about this because
		
00:01:34 --> 00:01:37
			it is truly frightening, and I'm talking about
		
00:01:37 --> 00:01:38
			artificial intelligence,
		
00:01:38 --> 00:01:39
			AI.
		
00:01:39 --> 00:01:42
			I'm talking about artificial intelligence. Now, what exactly
		
00:01:42 --> 00:01:44
			is artificial intelligence and the pros and cons,
		
00:01:44 --> 00:01:47
			that's a whole long topic. But to summarize
		
00:01:47 --> 00:01:48
			in a nutshell,
		
00:01:48 --> 00:01:52
			artificial intelligence is a relatively new field. It
		
00:01:52 --> 00:01:54
only dates back about two decades or
		
00:01:54 --> 00:01:55
			a decade and a half, and it's only
		
00:01:55 --> 00:01:57
			gained traction in the last few years in
		
00:01:57 --> 00:01:58
			particular.
		
00:01:58 --> 00:02:01
			And what artificial intelligence is,
		
00:02:01 --> 00:02:02
			is
		
00:02:02 --> 00:02:03
			us human beings
		
00:02:04 --> 00:02:04
			programming
		
00:02:05 --> 00:02:06
			processors
		
00:02:06 --> 00:02:07
			to essentially
		
00:02:08 --> 00:02:09
			think like us.
		
00:02:10 --> 00:02:12
So imagine, realistically, in
		
00:02:13 --> 00:02:14
			5, 10 years,
		
00:02:14 --> 00:02:16
			your child or even you
		
00:02:16 --> 00:02:17
			having
		
00:02:17 --> 00:02:18
			a
		
00:02:18 --> 00:02:19
			partner
		
00:02:19 --> 00:02:23
			that you can have conversations with, ask information
		
00:02:23 --> 00:02:23
			of,
		
00:02:24 --> 00:02:25
			find detailed
		
00:02:26 --> 00:02:28
			analysis of your particular niche field, and that
		
00:02:28 --> 00:02:29
			partner is imaginary.
		
00:02:30 --> 00:02:33
			Imagine engaging in deep philosophical
		
00:02:33 --> 00:02:33
			conversation,
		
00:02:34 --> 00:02:36
			and this partner that you're having a talk
		
00:02:36 --> 00:02:38
			with will be able to research as you
		
00:02:38 --> 00:02:40
			are saying something
		
00:02:40 --> 00:02:42
			and be able to
		
00:02:42 --> 00:02:43
			amass
		
00:02:43 --> 00:02:44
			thousands
		
00:02:44 --> 00:02:46
			of books instantaneously.
		
00:02:47 --> 00:02:49
			And as you ask a question about a
		
00:02:49 --> 00:02:50
			totally new field,
		
00:02:51 --> 00:02:53
			this partner of yours becomes a global expert.
		
00:02:55 --> 00:02:59
			And this is what artificial intelligence is. It's
		
00:02:59 --> 00:02:59
			the mechanism,
		
00:03:00 --> 00:03:01
			the potentiality
		
00:03:02 --> 00:03:03
			to
		
00:03:03 --> 00:03:06
			almost not actually create, because only Allah is
		
00:03:06 --> 00:03:07
al-Khaliq, but to
		
00:03:08 --> 00:03:08
			program
		
00:03:09 --> 00:03:10
			a system
		
00:03:11 --> 00:03:13
			whose results even we
		
00:03:13 --> 00:03:14
			cannot predict.
		
00:03:15 --> 00:03:16
			This is what is frightening.
		
00:03:17 --> 00:03:19
			Up until now, we knew exactly what the
		
00:03:19 --> 00:03:22
			output would be. Up until now, computers and
		
00:03:22 --> 00:03:25
			programming was basically doing a lot of computation
		
00:03:25 --> 00:03:27
			super fast. We know exactly what the output's
		
00:03:27 --> 00:03:29
			gonna be. We could have done it ourselves,
		
00:03:29 --> 00:03:32
			except that the computer does it much faster
		
00:03:32 --> 00:03:33
			than us. Right?
		
00:03:34 --> 00:03:37
			What's happening now is that we are
		
00:03:37 --> 00:03:39
			allowing a program
		
00:03:40 --> 00:03:41
			to program itself
		
00:03:41 --> 00:03:42
			and learn
		
00:03:43 --> 00:03:44
			as it continues.
		
00:03:44 --> 00:03:47
			And this is uncharted territory.
		
00:03:47 --> 00:03:47
			Now,
		
00:03:49 --> 00:03:50
			already we are seeing the impact of this.
		
00:03:50 --> 00:03:52
			Now by the way, there's 2 types of
		
00:03:52 --> 00:03:52
			AI.
		
00:03:53 --> 00:03:55
			You have, AI which is very specific or
		
00:03:55 --> 00:03:58
			niche oriented, and this is fairly common already.
		
00:03:58 --> 00:04:02
So for example, we already have the technology,
		
00:04:02 --> 00:04:04
			which is a very frightening human technology,
		
00:04:05 --> 00:04:06
			to be able to translate
		
00:04:07 --> 00:04:09
			a verbal speech into any other language. AI
		
00:04:09 --> 00:04:11
			can already do this. Right? Google can already
		
00:04:11 --> 00:04:13
			do this. You can speak into an app
		
00:04:14 --> 00:04:15
			in your conversational
		
00:04:15 --> 00:04:19
			English, Urdu, Swahili, in your particular dialect, in
		
00:04:19 --> 00:04:20
			your particular region.
		
00:04:21 --> 00:04:23
			And this app will now be able to
		
00:04:23 --> 00:04:24
			assess your accent,
		
00:04:25 --> 00:04:28
your terminologies, your nuances, even your tone,
		
00:04:29 --> 00:04:32
			and will then translate into any of the
		
00:04:32 --> 00:04:34
			languages it is programmed to do on the
		
00:04:34 --> 00:04:38
spot. This is specific AI, right? Or, frighteningly,
		
00:04:38 --> 00:04:40
			already being used by Israel and by China
		
00:04:40 --> 00:04:42
and others, is facial recognition.
		
00:04:42 --> 00:04:44
			And again, this is something we think is
		
00:04:44 --> 00:04:47
			basic, it's not. Imagine walking down the street
		
00:04:47 --> 00:04:48
			of anywhere in the world, and the camera
		
00:04:48 --> 00:04:51
catches half of your face, at
		
00:04:51 --> 00:04:53
an angle, and it's a blurry picture.
		
00:04:53 --> 00:04:56
			Now AI has already reached a level where
		
00:04:56 --> 00:04:59
			it can recognize exactly who you are, and
		
00:04:59 --> 00:05:01
			if the governments have the databases, China does
		
00:05:01 --> 00:05:03
and Israel does. And Allahu a'lam, our own
		
00:05:03 --> 00:05:05
			country most likely does, but they're not saying
		
00:05:05 --> 00:05:07
			this. And you know every time we go
		
00:05:07 --> 00:05:08
to the airport, this is exactly what they
		
00:05:08 --> 00:05:11
do. Right? So, once they're able to capture
		
00:05:11 --> 00:05:14
			your image, you will be trackable wherever you
		
00:05:14 --> 00:05:14
			are.
		
00:05:15 --> 00:05:18
			Wherever you go, just one image anywhere, and
		
00:05:18 --> 00:05:20
			AI will be able to recognize out of
		
00:05:20 --> 00:05:22
			billions of people,
		
00:05:22 --> 00:05:23
			instantaneously,
		
00:05:23 --> 00:05:25
			they'll be able to recognize exactly where you
		
00:05:25 --> 00:05:28
			are. We're already using this in medicine. Doctors
		
00:05:28 --> 00:05:30
			here can tell us what AI is doing.
		
00:05:30 --> 00:05:34
			Amazing technological advances, where your sonogram or your
		
00:05:34 --> 00:05:37
			MRI or whatever your diagnosis might be, that
		
00:05:37 --> 00:05:39
			the computer will tap into
		
00:05:40 --> 00:05:42
			millions of different data points and be able
		
00:05:42 --> 00:05:43
			to analyze
		
00:05:43 --> 00:05:45
			far better than any doctor can.
		
00:05:46 --> 00:05:47
			It might be possible
		
00:05:48 --> 00:05:50
that soon we won't need medical doctors. Guys, don't
		
00:05:50 --> 00:05:52
			be scared because when that happens, you guys
		
00:05:52 --> 00:05:54
			will be in charge of it anyway. So
		
00:05:54 --> 00:05:56
			jobs will be there. Don't worry. But realistically,
		
00:05:57 --> 00:05:59
			you will be better off getting
		
00:05:59 --> 00:06:02
			analyzed by an AI doctor than by a
		
00:06:02 --> 00:06:05
			real doctor. Because the AI doctor will have
		
00:06:05 --> 00:06:07
			a database that is infinitely larger than any
		
00:06:07 --> 00:06:10
			human being. And the AI doctor will have
		
00:06:10 --> 00:06:12
a one-stop specialty. Right now, if you
		
00:06:12 --> 00:06:15
wanna see one specialist, then another, then another,
		
00:06:15 --> 00:06:17
you have to keep on booking 5 different
		
00:06:17 --> 00:06:19
			appointments, go through your this and that. Imagine
		
00:06:19 --> 00:06:22
			one computer screen, one camera in front of
		
00:06:22 --> 00:06:25
			you, one analysis, everything being done
		
00:06:26 --> 00:06:26
			simultaneously.
		
00:06:27 --> 00:06:29
We are literally just a few years away
		
00:06:29 --> 00:06:31
			from something like this. This is tangible,
		
00:06:32 --> 00:06:33
			realistic, niche AI.
		
00:06:34 --> 00:06:36
			The more frightening is general
		
00:06:36 --> 00:06:37
			AI,
		
00:06:38 --> 00:06:40
and we're heading there now. ChatGPT-4
		
00:06:40 --> 00:06:43
			and others, we're heading there now. And general
		
00:06:43 --> 00:06:46
			AI is basically a polymath, a super intellectual
		
00:06:47 --> 00:06:47
			genius,
		
00:06:48 --> 00:06:48
			a partner,
		
00:06:49 --> 00:06:50
			an intellectual partner
		
00:06:51 --> 00:06:51
			that
		
00:06:51 --> 00:06:53
			you have no idea
		
00:06:53 --> 00:06:55
			what is gonna come from this. You are
		
00:06:55 --> 00:06:56
			talking about
		
00:06:56 --> 00:06:58
			all of the specialties of the world combined
		
00:06:58 --> 00:07:01
			into 1 person, and you can converse with
		
00:07:01 --> 00:07:03
			that person. Imagine something like this.
		
00:07:03 --> 00:07:06
			That is where the world is heading, and
		
00:07:06 --> 00:07:08
the way things are happening, it's
		
00:07:08 --> 00:07:11
just a matter of time. Already, ChatGPT
		
00:07:11 --> 00:07:13
			version 4 and others, they can do some
		
00:07:13 --> 00:07:15
			amazing things that are already frightening. And I'm
		
00:07:15 --> 00:07:17
			telling you as a professor, as a lecturer,
		
00:07:17 --> 00:07:19
			as a teacher, it's very frightening what is
		
00:07:19 --> 00:07:22
			happening already. And this is just the beginnings
		
00:07:22 --> 00:07:24
			of, you know, the realities. Now,
		
00:07:24 --> 00:07:26
			there's a lot of positives as with all
		
00:07:26 --> 00:07:28
			technology. Today's not about the positives.
		
00:07:28 --> 00:07:30
			Of the positives, by the way, we just
		
00:07:30 --> 00:07:33
told you, medical developments are gonna be
		
00:07:33 --> 00:07:36
groundbreaking. One of the biggest positives, we're already seeing
		
00:07:36 --> 00:07:36
			this, is
		
00:07:37 --> 00:07:38
the
		
00:07:39 --> 00:07:42
self-driving cars. Right? This
		
00:07:42 --> 00:07:45
is AI, self-driving cars, to be able
		
00:07:45 --> 00:07:47
			to recognize anything on the road,
		
00:07:48 --> 00:07:49
			and then be able to
		
00:07:49 --> 00:07:51
			maneuver and navigate.
		
00:07:51 --> 00:07:53
			We've already seen this. I mean, this is
		
00:07:53 --> 00:07:54
			our generation.
		
00:07:54 --> 00:07:56
A few years ago, it was a dream. Now,
		
00:07:56 --> 00:07:58
			I drive a Tesla. So many people drive
		
00:07:58 --> 00:08:00
			a Tesla. You just click the button, and
		
00:08:00 --> 00:08:02
I answer my email messages while
		
00:08:02 --> 00:08:04
			I'm driving the Tesla. Literally. You know, it's
		
00:08:04 --> 00:08:06
halal, it's legal, because my hands are
		
00:08:06 --> 00:08:07
on the wheel. I literally do my
		
00:08:07 --> 00:08:09
			WhatsApp. That's why I got the Tesla because
		
00:08:09 --> 00:08:11
of time. Like, I'm literally just doing
		
00:08:11 --> 00:08:14
my WhatsApp and answering email, and the
		
00:08:14 --> 00:08:15
car is driving
		
00:08:15 --> 00:08:17
			automatically on the freeway. I have the full
		
00:08:17 --> 00:08:20
self-driving version. Even on the roads, it's driving.
		
00:08:20 --> 00:08:21
			And this is something we heard about for
		
00:08:21 --> 00:08:24
			a decade. Now, I have it. We have
		
00:08:24 --> 00:08:26
it. The technology is already there, and it
		
00:08:26 --> 00:08:28
			keeps on improving. Every few days,
		
00:08:28 --> 00:08:30
			there's a new version,
		
00:08:30 --> 00:08:31
automatic update,
		
00:08:31 --> 00:08:33
			and it does some amazing things. So we're
		
00:08:33 --> 00:08:35
already seeing this reality.
		
00:08:37 --> 00:08:40
			There are some challenges and ethical concerns, and
		
00:08:40 --> 00:08:42
			we as Muslims need to be very cognizant
		
00:08:42 --> 00:08:44
			of this because our shari'ah provides an answer
		
00:08:44 --> 00:08:46
			for everything. We need to be at the
		
00:08:46 --> 00:08:48
			forefront of some of the problems of these,
		
00:08:48 --> 00:08:50
			new technologies. So I wanted to just bring
		
00:08:50 --> 00:08:51
them up, and then at the end of
		
00:08:51 --> 00:08:53
the day, it's not my area or forte.
		
00:08:53 --> 00:08:55
Other people need to take this up and
		
00:08:55 --> 00:08:56
			help us in this regard. What are some
		
00:08:56 --> 00:09:00
			of the problems that AI might bring about?
		
00:09:00 --> 00:09:00
			Well,
		
00:09:01 --> 00:09:02
			first and foremost,
		
00:09:02 --> 00:09:04
			one of the most obvious problems
		
00:09:05 --> 00:09:06
			is going to be
		
00:09:06 --> 00:09:08
			AI will have to decide
		
00:09:09 --> 00:09:10
			who lives and who dies
		
00:09:11 --> 00:09:13
			in a rational manner and not an emotional
		
00:09:13 --> 00:09:14
			manner.
		
00:09:14 --> 00:09:16
			You see, when you are driving, may Allah
		
00:09:16 --> 00:09:18
			protect all of us, but you see something
		
00:09:19 --> 00:09:21
			and you react on impulse.
		
00:09:21 --> 00:09:22
			You swerve,
		
00:09:23 --> 00:09:25
			you see a child, you see something, and
		
00:09:25 --> 00:09:26
			you just do something on impulse.
		
00:09:27 --> 00:09:31
			Generally speaking, we are forgiving of you, because
		
00:09:31 --> 00:09:32
			what could you do?
		
00:09:33 --> 00:09:35
			Generally speaking, no matter what happens, like if
		
00:09:35 --> 00:09:38
			it's not your fault, if somebody ran onto
		
00:09:38 --> 00:09:40
			the road and you did something and then
		
00:09:40 --> 00:09:41
			an accident happened,
		
00:09:41 --> 00:09:42
			generally speaking,
		
00:09:43 --> 00:09:45
			you might not be criminally responsible. You might
		
00:09:45 --> 00:09:47
			be legally whatever, but nobody's gonna look at
		
00:09:47 --> 00:09:49
			you as a criminal. Like you couldn't do
		
00:09:49 --> 00:09:51
			anything. You're a human being. You just reacted.
		
00:09:52 --> 00:09:54
			But you see, AI has zero emotion,
		
00:09:54 --> 00:09:57
			and AI is doing things at supersonic speed.
		
00:09:57 --> 00:09:59
			Right? Faster than or at the speed of
		
00:09:59 --> 00:10:00
			light to be precise. So
		
00:10:01 --> 00:10:03
			you will have to figure out
		
00:10:04 --> 00:10:06
			which life is more important.
		
00:10:07 --> 00:10:09
			And the AI is gonna make a calculation
		
00:10:10 --> 00:10:12
in milliseconds, nanoseconds,
		
00:10:12 --> 00:10:14
			and will literally decide,
		
00:10:14 --> 00:10:17
			is the life of the driver more important
		
00:10:17 --> 00:10:20
			or the life of that person on the
		
00:10:20 --> 00:10:20
			road?
		
00:10:21 --> 00:10:23
			And how will you decide that? Well, some
		
00:10:23 --> 00:10:26
programmers are gonna have to face ethical questions.
		
00:10:26 --> 00:10:28
Because at the end of the day, no
		
00:10:28 --> 00:10:30
			matter how awkward it is, you have to
		
00:10:30 --> 00:10:32
			program the AI to figure out what to
		
00:10:32 --> 00:10:34
			do. If there's a child, if there's an
		
00:10:34 --> 00:10:36
			elderly man, now we get to the famous,
		
00:10:36 --> 00:10:36
			you know,
		
00:10:38 --> 00:10:40
			philosophical problem that if you can swerve
		
00:10:41 --> 00:10:43
			a train with 2 people in it, and
		
00:10:43 --> 00:10:45
			you know, in the process save 5 people,
		
00:10:45 --> 00:10:48
			are you allowed to swerve the train? Or
		
00:10:48 --> 00:10:49
			should you let the train just go and
		
00:10:49 --> 00:10:52
			crash? You know these famous problems, these ethical
		
00:10:52 --> 00:10:54
			problems. Well right now, when we do it,
		
00:10:54 --> 00:10:55
			it's just a hypothetical.
		
00:10:56 --> 00:10:58
			With AI, it will become a real problem.
		
00:10:58 --> 00:11:00
			The car is going straight down, it sees
		
00:11:00 --> 00:11:01
			a bunch of school children.
		
00:11:02 --> 00:11:05
			You will have to program the AI what
		
00:11:05 --> 00:11:06
			it needs to do.
		
00:11:06 --> 00:11:08
			And some people will die and some people
		
00:11:08 --> 00:11:10
			will live. And this will not be on
		
00:11:10 --> 00:11:11
			impulse.
		
00:11:11 --> 00:11:12
			This will be
		
00:11:12 --> 00:11:14
			a rational decision
		
00:11:14 --> 00:11:17
			that somebody programmed into the AI.
		
00:11:17 --> 00:11:18
			Right? So this is one of the issues
		
00:11:18 --> 00:11:20
			that we're gonna have to be thinking about.
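
To make that concrete: the sketch below is purely illustrative, with invented numbers and no claim about how any real vehicle is programmed. It only shows what it means for the "decision" to be a rule that somebody wrote in advance.

```python
# Illustrative only: a toy, hypothetical decision rule for an automated vehicle.
# The weighting (everyone counted equally) is itself an ethical choice made by
# whoever wrote the code, which is exactly the issue raised above.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrians_at_risk: int  # predicted people outside the car who could be hit
    occupants_at_risk: int    # predicted occupants of the car who could be hurt

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pick the option that puts the fewest total people at risk."""
    return min(options, key=lambda m: m.pedestrians_at_risk + m.occupants_at_risk)

if __name__ == "__main__":
    options = [
        Maneuver("brake and stay in lane", pedestrians_at_risk=3, occupants_at_risk=0),
        Maneuver("swerve toward the barrier", pedestrians_at_risk=0, occupants_at_risk=1),
    ]
    print(choose_maneuver(options).name)  # "swerve toward the barrier" under this rule
```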
		
00:11:20 --> 00:11:23
			Already, AI is being used in war.
		
00:11:24 --> 00:11:27
			Right now, according to international law, not that
		
00:11:27 --> 00:11:30
it's being applied, but still there is a modicum
		
00:11:30 --> 00:11:32
			of human involvement. Right now, according to the
		
00:11:32 --> 00:11:35
UN and international law, for any drone that
		
00:11:35 --> 00:11:36
drops bombs,
		
00:11:37 --> 00:11:39
			a human must make that decision.
		
00:11:40 --> 00:11:42
So that a finger can be pointed at somebody:
		
00:11:42 --> 00:11:43
			it's your fault.
		
00:11:44 --> 00:11:45
			This is now the law of the world,
		
00:11:45 --> 00:11:48
			right? It is not allowed to send a
		
00:11:48 --> 00:11:50
			bomb on a civilization
		
00:11:51 --> 00:11:53
			or a population, whatever it might be, unless
		
00:11:53 --> 00:11:55
			and until at the last minute a human
		
00:11:55 --> 00:11:58
			being presses a button. The reason being, obviously,
		
00:11:58 --> 00:12:00
			they want to blame somebody or hold somebody
		
00:12:00 --> 00:12:02
			accountable. Pause here. Not that it helps all
		
00:12:02 --> 00:12:03
			the time, we see what's happening, but still
		
00:12:03 --> 00:12:05
			there's a human being that's made that decision.
		
00:12:06 --> 00:12:08
What AI is gonna bring into the
		
00:12:08 --> 00:12:10
			picture is why do we need a human
		
00:12:10 --> 00:12:12
			to make the decision? What if we need
		
00:12:12 --> 00:12:14
			to make a split second decision? We don't
		
00:12:14 --> 00:12:16
			need to get involved with a human being.
		
00:12:16 --> 00:12:18
			So once again, we're gonna get all of
		
00:12:18 --> 00:12:18
			these
		
00:12:19 --> 00:12:23
complex problems where literally life and
		
00:12:23 --> 00:12:23
			death
		
00:12:24 --> 00:12:25
issues are being decided based upon
		
00:12:26 --> 00:12:27
			artificial intelligence.
		
00:12:27 --> 00:12:28
Another
		
00:12:29 --> 00:12:31
problem that we get with AI, and it
		
00:12:31 --> 00:12:34
is already happening at a small level,
		
00:12:34 --> 00:12:36
			we're gonna see this times 100 within a
		
00:12:36 --> 00:12:38
			year or 2. And that is
		
00:12:39 --> 00:12:41
			AI, and this is frightening and we see
		
00:12:41 --> 00:12:42
			this now,
		
00:12:42 --> 00:12:46
			figures out who you are based upon your
		
00:12:46 --> 00:12:47
			history.
		
00:12:49 --> 00:12:52
			AI knows you better than your spouse and
		
00:12:52 --> 00:12:54
			your children. This is already true.
		
00:12:55 --> 00:12:58
			AI knows what types of videos you like,
		
00:12:58 --> 00:13:01
			what types of music, what types of clips,
		
00:13:01 --> 00:13:03
			what types of intellectual talks, what type of
		
00:13:03 --> 00:13:06
astaghfirullah, fahish and haram. Everything,
		
00:13:06 --> 00:13:07
			AI has a profile.
		
00:13:08 --> 00:13:11
			And because AI wants you to look at
		
00:13:11 --> 00:13:14
the computer screen (well, because the
		
00:13:15 --> 00:13:16
			social media apps want you to look at
		
00:13:16 --> 00:13:18
the computer screen, because they want money),
		
00:13:19 --> 00:13:22
			AI will then show you what it knows
		
00:13:22 --> 00:13:24
			will attract your attention.
		
00:13:25 --> 00:13:26
			And what
		
00:13:27 --> 00:13:28
			this allows
		
00:13:28 --> 00:13:29
			AI to do
		
00:13:30 --> 00:13:31
			is to brainwash you,
		
00:13:33 --> 00:13:34
			and to keep you
		
00:13:35 --> 00:13:38
			cut off from learning outside of your own
		
00:13:38 --> 00:13:39
			comfort zone.
		
00:13:41 --> 00:13:43
			And we already see this in the Israeli
		
00:13:43 --> 00:13:44
			Palestinian conflict.
		
00:13:45 --> 00:13:47
			Even though, I would say at this stage
		
00:13:47 --> 00:13:49
			it's not being done intentionally
		
00:13:49 --> 00:13:50
			because
		
00:13:50 --> 00:13:53
			all of your news feeds, without exception, when
		
00:13:53 --> 00:13:55
			you're going down Twitter and Facebook,
		
00:13:55 --> 00:13:56
			all of us in this masjid,
		
00:13:57 --> 00:13:59
			our newsfeed is generally
		
00:13:59 --> 00:14:00
			pro Palestinian.
		
00:14:02 --> 00:14:05
			Our news feed is generally people that are
		
00:14:05 --> 00:14:06
			sympathizing.
		
00:14:07 --> 00:14:09
			Why is that happening? Because of AI.
		
00:14:10 --> 00:14:12
			And what you guys need to understand
		
00:14:13 --> 00:14:16
			is that pro Zionist and pro far right
		
00:14:16 --> 00:14:17
			Christian fanatics
		
00:14:18 --> 00:14:20
			who are on a different wavelength,
		
00:14:21 --> 00:14:22
			their entire
		
00:14:22 --> 00:14:24
			scroll and news feed,
		
00:14:24 --> 00:14:26
at the exact same time as you,
		
00:14:26 --> 00:14:29
is gonna be completely different from yours.
		
00:14:30 --> 00:14:32
			And here we are, me and you, when
		
00:14:32 --> 00:14:34
			we're going down Twitter and Facebook and everything,
		
00:14:34 --> 00:14:36
			we're like, why can't everybody else see this?
		
00:14:36 --> 00:14:37
I see it. Why isn't everybody
		
00:14:37 --> 00:14:40
			else seeing what I'm seeing? Because they're not
		
00:14:40 --> 00:14:42
			seeing what you're seeing. You can follow the
		
00:14:42 --> 00:14:44
			exact same 2 people,
		
00:14:44 --> 00:14:46
			but the ones that come after it, the
		
00:14:46 --> 00:14:48
			ones that come in between, it will be
		
00:14:48 --> 00:14:50
catered to your own psyche.
		
00:14:50 --> 00:14:51
			And therefore,
		
00:14:52 --> 00:14:54
right now it is non-malicious, i.e.,
		
00:14:54 --> 00:14:55
			the computer algorithm
		
00:14:56 --> 00:14:58
			wants to feed you what it knows you'll
		
00:14:58 --> 00:15:01
			be interested in reading. And so you liked
		
00:15:01 --> 00:15:02
			a Palestinian protest
		
00:15:03 --> 00:15:06
			video somewhere, guess what? The next 5 days,
		
00:15:06 --> 00:15:08
you're gonna see more pro-Palestinian protests.
		
00:15:08 --> 00:15:09
			You're gonna say, masha Allah, the tide is
		
00:15:09 --> 00:15:11
			changing, and it is changing, by the way.
		
00:15:11 --> 00:15:13
			But I'm saying, you are in your bubble.
		
00:15:13 --> 00:15:15
			Believe it or not, the other side,
		
00:15:16 --> 00:15:17
			they're only gonna see
		
00:15:18 --> 00:15:19
			the news,
		
00:15:19 --> 00:15:20
			you know,
		
00:15:21 --> 00:15:23
items and vignettes that are catering to
		
00:15:23 --> 00:15:26
			their world view. And they will form a
		
00:15:26 --> 00:15:27
			totally skewed worldview.
		
00:15:28 --> 00:15:30
			And they're gonna hear from politicians that are
		
00:15:30 --> 00:15:32
			pandering to them. And they're gonna see advertisers
		
00:15:32 --> 00:15:34
			that are pandering to them. And you too
		
00:15:34 --> 00:15:36
			could be neighbors, next house. And you too
		
00:15:36 --> 00:15:38
			could be looking at the exact same screen
		
00:15:38 --> 00:15:41
			at the exact same time, but everything is
		
00:15:41 --> 00:15:42
			different.
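
As a rough sketch of the mechanism being described here, and assuming nothing about any real platform's code, a feed only has to score each candidate post against what a user already engaged with for two neighbors to end up seeing completely different things:

```python
# Toy engagement-driven ranker (no real platform's algorithm is implied):
# posts most similar to what a user already clicked rise to the top.
from collections import Counter

def interest_profile(clicked_tags: list[list[str]]) -> Counter:
    """Count how often each topic tag appears in the user's past engagement."""
    profile = Counter()
    for tags in clicked_tags:
        profile.update(tags)
    return profile

def rank_feed(candidates: list[dict], profile: Counter) -> list[dict]:
    """Order candidate posts by overlap with the user's interest profile."""
    return sorted(candidates, key=lambda post: sum(profile[t] for t in post["tags"]), reverse=True)

if __name__ == "__main__":
    user_a = interest_profile([["palestine", "protest"], ["palestine", "news"]])
    user_b = interest_profile([["politics", "settlement"], ["politics", "news"]])
    candidates = [
        {"id": 1, "tags": ["palestine", "protest"]},
        {"id": 2, "tags": ["politics", "settlement"]},
        {"id": 3, "tags": ["penguins", "travel"]},
    ]
    # Same candidate pool, same moment, two different orderings:
    print([p["id"] for p in rank_feed(candidates, user_a)])  # [1, 2, 3]
    print([p["id"] for p in rank_feed(candidates, user_b)])  # [2, 1, 3]
```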
		
00:15:42 --> 00:15:44
And imagine this in
		
00:15:45 --> 00:15:48
			10 months, 10 years, imagine what's gonna happen.
		
00:15:48 --> 00:15:51
			Now imagine, which is totally illegal,
		
00:15:52 --> 00:15:55
			but it might be happening by one particular
		
00:15:55 --> 00:15:56
			government, you can understand which one.
		
00:15:57 --> 00:15:59
			Imagine if it is now intentionally
		
00:16:00 --> 00:16:00
			done.
		
00:16:01 --> 00:16:05
			Right now it's algorithms, random. Meaning, you can
		
00:16:05 --> 00:16:06
			program it, and you can give it a
		
00:16:06 --> 00:16:08
			try. You can literally give it a try.
		
00:16:08 --> 00:16:09
			Look at something
		
00:16:10 --> 00:16:12
			of a news item or something you have
		
00:16:12 --> 00:16:14
			never been interested in in your life, okay.
		
00:16:15 --> 00:16:17
Do some, you know, Antarctica cruise to see
		
00:16:17 --> 00:16:19
the penguins. I'm just giving you an example,
		
00:16:19 --> 00:16:21
			literally. And look at 2, 3
		
00:16:22 --> 00:16:25
			news items. You've never in your life been
		
00:16:25 --> 00:16:27
			interested in doing a cruise to the Antarctica
		
00:16:27 --> 00:16:28
			to go visit the penguins.
		
00:16:28 --> 00:16:30
			Next thing you know, for a few days,
		
00:16:31 --> 00:16:33
			little thing popping up there. Did you know
		
00:16:33 --> 00:16:35
this about a penguin? And then
		
00:16:35 --> 00:16:36
			slowly but surely you get drawn into a
		
00:16:36 --> 00:16:38
			whole different world. Now imagine
		
00:16:39 --> 00:16:41
			this is being done to advertise, right now
		
00:16:41 --> 00:16:41
			it's money.
		
00:16:42 --> 00:16:43
			One country might be doing it to brainwash.
		
00:16:44 --> 00:16:44
			Imagine
		
00:16:46 --> 00:16:47
			if powerful interests
		
00:16:48 --> 00:16:48
			decided,
		
00:16:49 --> 00:16:50
			let's sway
		
00:16:50 --> 00:16:53
			American public opinion in a certain way.
		
00:16:54 --> 00:16:57
			This is very, very doable.
		
00:16:57 --> 00:17:00
			Because what AI can do, it can monitor
		
00:17:00 --> 00:17:01
			all of your biases
		
00:17:02 --> 00:17:05
			and then figure out how to begin
		
00:17:05 --> 00:17:08
			tapping in and swaying you the way that
		
00:17:08 --> 00:17:09
			it wants you to be swayed.
		
00:17:10 --> 00:17:12
			You will become a pawn in a game
		
00:17:12 --> 00:17:16
			that we have no understanding of how deep
		
00:17:16 --> 00:17:17
			it can go. Right?
		
00:17:17 --> 00:17:19
			And this was predicted in a different way
		
00:17:19 --> 00:17:22
			by the famous intellectual Noam Chomsky when he
		
00:17:22 --> 00:17:25
wrote his book in 1988, Manufacturing
		
00:17:25 --> 00:17:27
Consent, where he said that this is being
		
00:17:27 --> 00:17:29
			done by the media at a very low
		
00:17:29 --> 00:17:31
			level. But now we're talking about AI
		
00:17:32 --> 00:17:34
			assessing your psychological profile,
		
00:17:35 --> 00:17:38
			having a detailed analysis. Even your psychiatrist wouldn't
		
00:17:38 --> 00:17:39
			know what the AI knows.
		
00:17:40 --> 00:17:41
			And it knows exactly
		
00:17:42 --> 00:17:44
			how to begin to persuade you to have
		
00:17:44 --> 00:17:45
			a different worldview.
		
00:17:46 --> 00:17:48
			And this leads us to my next point,
		
00:17:48 --> 00:17:49
			and that is
		
00:17:49 --> 00:17:50
			what AI is doing,
		
00:17:51 --> 00:17:52
			it is shifting
		
00:17:53 --> 00:17:54
			power dynamics.
		
00:17:55 --> 00:17:55
			Right now,
		
00:17:56 --> 00:17:59
			power is in the hands of the governments,
		
00:17:59 --> 00:18:01
			which is also bad.
		
00:18:01 --> 00:18:03
			But at least it's a physical tangible government
		
00:18:03 --> 00:18:04
			and you understand.
		
00:18:06 --> 00:18:06
			With AI,
		
00:18:07 --> 00:18:10
			power is gonna go to multibillion dollar corporations
		
00:18:11 --> 00:18:12
			that are operating
		
00:18:13 --> 00:18:14
			in extreme privacy.
		
00:18:15 --> 00:18:17
			And this is why there's so much tension
		
00:18:17 --> 00:18:19
			between Facebook and whatnot and between our governments,
		
00:18:19 --> 00:18:21
because the governments are worried, and
		
00:18:21 --> 00:18:23
the government wants to ban TikTok and
		
00:18:23 --> 00:18:25
			whatnot, because things are happening beyond their control.
		
00:18:27 --> 00:18:30
			AI is going to completely change power dynamics,
		
00:18:30 --> 00:18:32
			and the real power will be in the
		
00:18:32 --> 00:18:34
			hands of those who have access to all
		
00:18:34 --> 00:18:35
			of that data.
		
00:18:36 --> 00:18:39
			They will be far more powerful than any
		
00:18:39 --> 00:18:39
			government.
		
00:18:40 --> 00:18:40
			And
		
00:18:41 --> 00:18:42
			when it comes to our Islamic religion in
		
00:18:42 --> 00:18:44
			particular, there are a number of specific issues.
		
00:18:45 --> 00:18:47
			We've already seen this a few months ago.
		
00:18:47 --> 00:18:49
			Somebody attempted an AI fatwa program.
		
00:18:51 --> 00:18:53
			It was a disaster
		
00:18:53 --> 00:18:55
			of the highest magnitude.
		
00:18:56 --> 00:18:58
			You ask it a basic question, and it'll
		
00:18:58 --> 00:19:00
			give you something totally irrelevant.
		
00:19:01 --> 00:19:03
			And so the guy himself had to apologize
		
00:19:03 --> 00:19:04
			and say, It was just a prototype.
		
00:19:04 --> 00:19:07
			But here's the point. What does a prototype
		
00:19:07 --> 00:19:09
			mean? It's only a matter of time
		
00:19:10 --> 00:19:10
			before
		
00:19:11 --> 00:19:12
			you don't need me anymore.
		
00:19:13 --> 00:19:15
Yasir Qadhi becomes superfluous. Okay? You
		
00:19:15 --> 00:19:17
will have Mufti ChatGPT,
		
00:19:18 --> 00:19:19
			mufti saab.
		
00:19:19 --> 00:19:20
			Mufti GPT.
		
00:19:21 --> 00:19:23
			And I'm not even joking. This is where
		
00:19:23 --> 00:19:25
			this is heading now. Right?
		
00:19:26 --> 00:19:28
			This is where this is heading, where you
		
00:19:28 --> 00:19:30
			will ask your fatwa, your question, and you
		
00:19:30 --> 00:19:32
will even be able to input: I
		
00:19:32 --> 00:19:34
			want the Hanafi response.
		
00:19:35 --> 00:19:38
			I want the this response, that response, and
		
00:19:38 --> 00:19:41
			AI will be able to, and here's the
		
00:19:41 --> 00:19:42
			scary point,
		
00:19:42 --> 00:19:44
			99% of the time probably,
		
00:19:44 --> 00:19:46
			be accurate in giving you a response.
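
Purely as a sketch of what "input the Hanafi response" could look like, and without describing any actual fatwa service, the school of law simply becomes a parameter wrapped around the question before it goes to whatever model sits behind the tool; the send_to_model stub below is hypothetical:

```python
# Hypothetical sketch only: parameterizing a question by legal school before
# passing it to a language model. send_to_model() is a stand-in stub; no real
# service or API is being described.

VALID_SCHOOLS = {"Hanafi", "Maliki", "Shafi'i", "Hanbali"}

def build_query(question: str, school: str) -> str:
    """Wrap the user's question with the requested school of law."""
    if school not in VALID_SCHOOLS:
        raise ValueError(f"Unknown school: {school}")
    return (
        f"Answer according to the {school} school, cite its standard references, "
        f"and state clearly if the matter is disputed.\nQuestion: {question}"
    )

def send_to_model(prompt: str) -> str:
    # Stub: a real system would call some model here; this only echoes.
    return f"[model response to: {prompt[:50]}...]"

if __name__ == "__main__":
    print(send_to_model(build_query("Does travel shorten the prayer on a 50-mile trip?", "Hanafi")))
```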
		
00:19:47 --> 00:19:48
			The problem comes that one time it'll be
		
00:19:48 --> 00:19:51
wrong, it'll be majorly wrong. But we're
		
00:19:51 --> 00:19:53
			heading there. We're heading there, and it's very
		
00:19:53 --> 00:19:56
soon. I have a friend; I cannot say too
		
00:19:56 --> 00:19:57
			much more about the project.
		
00:19:58 --> 00:19:59
			Let me just say generically,
		
00:20:00 --> 00:20:02
he's one of these computer geeks, neuro-whatever.
		
00:20:02 --> 00:20:04
			He's using AI
		
00:20:04 --> 00:20:04
			for
		
00:20:05 --> 00:20:06
hadith isnads,
		
00:20:06 --> 00:20:08
			and an analysis of hadith.
		
00:20:09 --> 00:20:11
			And I've seen aspects of this, and it
		
00:20:11 --> 00:20:14
			is super exciting and super scary all at
		
00:20:14 --> 00:20:15
			once.
		
00:20:16 --> 00:20:18
			Where you just put in the hadith and
		
00:20:18 --> 00:20:19
			it's gonna automatically
		
00:20:19 --> 00:20:21
			look at all the books in the database
		
00:20:22 --> 00:20:24
and draw an entire chart
		
00:20:24 --> 00:20:25
			for you, and then give you its own
		
00:20:25 --> 00:20:27
verdict. You don't need Ibn Hajar or al-Albani,
		
00:20:27 --> 00:20:30
you don't need them at all. Right? Chat
		
00:20:30 --> 00:20:31
GPT will tell you the isnad,
		
00:20:33 --> 00:20:35
And it'll tell you whether it's authentic or
		
00:20:35 --> 00:20:37
not based upon all of these criteria.
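
As a very rough sketch of the kind of tool being described (the friend's actual project is not shown, and the narrator gradings below are invented placeholders), the workflow is: look each narrator up in a graded database, trace the chain, and emit a provisional verdict.

```python
# Invented illustration of an isnad-checking tool of the kind described.
# The gradings are placeholders; a real system would draw on the rijal literature.

NARRATOR_DB = {  # hypothetical gradings: name -> (reliable?, note)
    "Narrator A": (True, "thiqa (trustworthy)"),
    "Narrator B": (True, "saduq (truthful)"),
    "Narrator C": (False, "graded weak by several critics"),
}

def analyze_isnad(chain: list[str]) -> dict:
    """Trace a chain of narrators and return notes plus a provisional verdict."""
    notes, all_reliable = [], True
    for name in chain:
        reliable, note = NARRATOR_DB.get(name, (False, "unknown narrator"))
        notes.append(f"{name}: {note}")
        all_reliable = all_reliable and reliable
    verdict = "chain provisionally sound" if all_reliable else "chain has a weakness"
    return {"chain": " -> ".join(chain), "notes": notes, "verdict": verdict}

if __name__ == "__main__":
    report = analyze_isnad(["Narrator A", "Narrator C", "Narrator B"])
    print(report["chain"])
    print(*report["notes"], sep="\n")
    print("Verdict:", report["verdict"])
```

Whatever the implementation, the judgment call does not disappear; it moves to whoever builds and grades the database.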
		
00:20:37 --> 00:20:40
			We are already there. This is not in
		
00:20:40 --> 00:20:43
			1 generation. This is within a year or
		
00:20:43 --> 00:20:43
			2.
		
00:20:44 --> 00:20:46
			This is right now we are seeing this.
		
00:20:46 --> 00:20:49
			So the whole globe is changing in this
		
00:20:49 --> 00:20:49
			regard.
		
00:20:49 --> 00:20:51
And people
		
00:20:52 --> 00:20:52
			who want
		
00:20:53 --> 00:20:56
			to find problems with Islam, they're using AI
		
00:20:56 --> 00:20:58
			for the wrong stuff as well when it
		
00:20:58 --> 00:20:59
			comes to Islam.
		
00:20:59 --> 00:21:01
			You know, and again I don't wanna get
		
00:21:01 --> 00:21:03
			too explicit here but, you know, the main
		
00:21:03 --> 00:21:05
			miracle we have is that our book cannot
		
00:21:05 --> 00:21:06
			be reproduced.
		
00:21:07 --> 00:21:09
AI is being used, and I know
		
00:21:09 --> 00:21:10
this from my friends and whatnot, it
		
00:21:10 --> 00:21:12
is being used to try to do something.
		
00:21:12 --> 00:21:14
			What are you gonna do in this regard?
		
00:21:14 --> 00:21:16
			Right? These are people that have
		
00:21:16 --> 00:21:19
			complete, you know, nefarious intentions.
		
00:21:19 --> 00:21:21
			And they're using these types of technologies
		
00:21:22 --> 00:21:24
			to try to bring doubts to Islam and
		
00:21:24 --> 00:21:25
			the Muslims.
		
00:21:25 --> 00:21:28
			SubhanAllah. So we have now a very, very
		
00:21:29 --> 00:21:31
			different world coming up.
		
00:21:31 --> 00:21:32
			And
		
00:21:32 --> 00:21:34
			if you're aware of what's happening, in the
		
00:21:34 --> 00:21:35
			last year,
		
00:21:35 --> 00:21:38
			massive internal scandals have happened within the AI
		
00:21:38 --> 00:21:41
			community. Even last week, one of the senior
		
00:21:41 --> 00:21:42
			highest
		
00:21:42 --> 00:21:44
			level officials resigned in public
		
00:21:45 --> 00:21:47
			and said, there's no oversight.
		
00:21:47 --> 00:21:49
			I love this, but I'm frightened to death
		
00:21:49 --> 00:21:51
of it, of what you guys are doing. And
		
00:21:51 --> 00:21:53
			she didn't say more, but she resigned
		
00:21:54 --> 00:21:56
			and it caused shock waves because
		
00:21:56 --> 00:21:58
			she didn't tell us explicitly what's going on.
		
00:21:58 --> 00:22:01
But something happened,
		
00:22:01 --> 00:22:03
			and what she was saying is that there
		
00:22:03 --> 00:22:05
			is no oversight and you guys are not
		
00:22:05 --> 00:22:09
			understanding the ethical issues involved over here. So
		
00:22:09 --> 00:22:11
			bottom line, I know it's not exactly a
		
00:22:11 --> 00:22:13
			purely Islamic thing, but here's my philosophy.
		
00:22:14 --> 00:22:16
			We can't separate the deen from the dunya.
		
00:22:16 --> 00:22:18
			Muslims have to be aware of this. It's
		
00:22:18 --> 00:22:21
			gonna impact us, and it is impacting us.
		
00:22:21 --> 00:22:23
			If we can't live 12 hours without electricity,
		
00:22:24 --> 00:22:25
			in a few years,
		
00:22:25 --> 00:22:28
			AI will be integrated into our phones.
		
00:22:28 --> 00:22:30
			In a few years, AI will be in
		
00:22:30 --> 00:22:32
			our houses. In a few years, we're literally
		
00:22:32 --> 00:22:33
gonna be dependent
		
00:22:34 --> 00:22:36
			on it, right? There will be a lot
		
00:22:36 --> 00:22:37
			of positives.
		
00:22:37 --> 00:22:39
			Can you imagine one of the easiest positives
		
00:22:40 --> 00:22:40
			already happening
		
00:22:41 --> 00:22:42
			is that
		
00:22:42 --> 00:22:44
			schools will not be needed anymore.
		
00:22:44 --> 00:22:47
			An AI will take charge of teaching your
		
00:22:47 --> 00:22:48
			child
		
00:22:48 --> 00:22:51
			exactly in the best manner that your child
		
00:22:51 --> 00:22:52
			needs.
		
00:22:52 --> 00:22:54
			Your child is strong in one field, AI
		
00:22:54 --> 00:22:56
will zoom over that. If they're weak in another,
		
00:22:56 --> 00:22:58
			AI will be able to figure out what
		
00:22:58 --> 00:23:00
			is the best way to help your child
		
00:23:00 --> 00:23:03
			in that maths problem, in that engineering problem,
		
00:23:03 --> 00:23:06
in that algebra problem. AI will know exactly
		
00:23:07 --> 00:23:09
			what will be the most, you know,
		
00:23:09 --> 00:23:10
			repetitive
		
00:23:10 --> 00:23:12
			routines that need to be done so that
		
00:23:12 --> 00:23:15
your child understands this particular problem, and it'll stay with them
		
00:23:15 --> 00:23:17
forever. So can you imagine a tutor
		
00:23:18 --> 00:23:21
			specifically for every human being in the world
		
00:23:21 --> 00:23:22
			catered to
		
00:23:22 --> 00:23:25
your particular mindset? That's a massive positive.
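
A minimal sketch of that adaptive behavior, with made-up mastery scores and no particular product in mind: skip what the child has already mastered, and repeat drills where mastery is low.

```python
# Toy adaptive tutor (invented data): mastered topics are skipped,
# weak topics get proportionally more repeated drills.

def next_exercises(mastery: dict[str, float], per_topic: int = 3) -> list[str]:
    """Return practice items, concentrating on the topics with the lowest mastery.

    mastery maps topic -> score between 0.0 (weak) and 1.0 (mastered).
    """
    plan = []
    for topic, score in sorted(mastery.items(), key=lambda kv: kv[1]):
        if score >= 0.9:  # already strong: "zoom over that"
            continue
        repetitions = max(1, round((1.0 - score) * per_topic))
        plan.extend(f"{topic} drill #{i + 1}" for i in range(repetitions))
    return plan

if __name__ == "__main__":
    child = {"fractions": 0.95, "algebra word problems": 0.40, "geometry": 0.70}
    print(next_exercises(child))
    # ['algebra word problems drill #1', 'algebra word problems drill #2', 'geometry drill #1']
```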
		
00:23:25 --> 00:23:26
			But in the process,
		
00:23:27 --> 00:23:29
			by the time this child grows up, this
		
00:23:29 --> 00:23:30
			AI companion
		
00:23:30 --> 00:23:32
			will be even more knowledgeable than his kareen
		
00:23:32 --> 00:23:33
			of the ins and the jinn.
		
00:23:34 --> 00:23:36
			The AI will know more about you than
		
00:23:36 --> 00:23:37
			the kareen of your own jinn knows about
		
00:23:37 --> 00:23:39
			you. Right? Maybe even the jinn will be
		
00:23:39 --> 00:23:41
			frightened of the AI because the AI knows
		
00:23:41 --> 00:23:43
			about the kareen as well. And we as
		
00:23:43 --> 00:23:47
			Muslims are disconnected from that reality completely.
		
00:23:48 --> 00:23:49
			But my point to bring it up is
		
00:23:49 --> 00:23:51
			just to remind us that,
		
00:23:52 --> 00:23:52
			SubhanAllah,
		
00:23:53 --> 00:23:55
			we have to be cognizant. We're living in
		
00:23:55 --> 00:23:58
			a very fragile world. We're living in a
		
00:23:58 --> 00:24:00
			time and a place where within our lifetimes,
		
00:24:00 --> 00:24:02
			and anybody above the age of 30,
		
00:24:02 --> 00:24:05
the technological change that has happened in your
		
00:24:05 --> 00:24:05
			lifetime,
		
00:24:06 --> 00:24:07
			it is
		
00:24:07 --> 00:24:10
			exponentially at the speed of light. I mean,
		
00:24:10 --> 00:24:12
I remember, and you all remember, those around
		
00:24:12 --> 00:24:14
the age of 40: even cell phones,
		
00:24:14 --> 00:24:16
			we didn't have them. And now the first
		
00:24:16 --> 00:24:17
			phones that came, remember the Nokia that came
		
00:24:17 --> 00:24:19
			out, right? The little brick that came out.
		
00:24:19 --> 00:24:21
Remember that back in the nineties.
		
00:24:21 --> 00:24:24
Now look, we have more power on
		
00:24:24 --> 00:24:24
			this
		
00:24:25 --> 00:24:25
			phone
		
00:24:25 --> 00:24:28
			than NASA did on its supercomputers
		
00:24:28 --> 00:24:30
			when it went to the moon. We have
		
00:24:30 --> 00:24:33
			more power on this phone than NASA did
		
00:24:33 --> 00:24:35
			on the computers that filled this whole room,
		
00:24:35 --> 00:24:36
			and they use them to go to the
		
00:24:36 --> 00:24:38
			moon. We have more power here. We've already
		
00:24:38 --> 00:24:40
			seen this in 1 generation. What is gonna
		
00:24:40 --> 00:24:43
			happen next? Allahu a'alam. We need to be
		
00:24:43 --> 00:24:45
			very very careful. Final point which is truly
		
00:24:45 --> 00:24:48
			terrifying. One of the biggest concerns that ethicists
		
00:24:48 --> 00:24:49
			have about AI
		
00:24:50 --> 00:24:53
			is that once you give AI that much
		
00:24:53 --> 00:24:53
			power,
		
00:24:54 --> 00:24:56
			AI will make choices
		
00:24:56 --> 00:24:58
			that might be logical and rational,
		
00:24:58 --> 00:24:59
			but completely
		
00:25:00 --> 00:25:00
			unethical.
		
00:25:02 --> 00:25:04
			Because AI is not interested in ethics.
		
00:25:05 --> 00:25:07
			And some of those choices might even bring
		
00:25:07 --> 00:25:10
			about types of destructions to human species.
		
00:25:11 --> 00:25:12
			And there's a frightening
		
00:25:14 --> 00:25:17
			science fiction novel by Isaac Asimov in which
		
00:25:17 --> 00:25:19
			the computers take over the world. What is
		
00:25:19 --> 00:25:20
it called? I forgot. I, Robot, I think it
		
00:25:20 --> 00:25:22
was. Read it as a kid. But
		
00:25:22 --> 00:25:24
			he predicted this that a time will come
		
00:25:24 --> 00:25:27
			when humans are gonna be fighting the machine,
		
00:25:27 --> 00:25:29
			and the machine will
		
00:25:30 --> 00:25:32
			know more than the human beings.
		
00:25:32 --> 00:25:34
			This is the reality we are facing here.
		
00:25:34 --> 00:25:37
And wallahi, one wonders, perhaps it's better
		
00:25:37 --> 00:25:38
			that we don't go down all of that
		
00:25:38 --> 00:25:40
			route, and we just live our simple lives
		
00:25:40 --> 00:25:43
			so that once electricity is gone, we don't
		
00:25:43 --> 00:25:45
			even know how to light a candle anymore,
		
00:25:45 --> 00:25:48
right? Maybe our ancestors had it wiser
		
00:25:48 --> 00:25:50
			and better that they could actually live a
		
00:25:50 --> 00:25:53
simple and easy life. Allahu a'lam what the right
		
00:25:53 --> 00:25:54
answer is. In any case, I wanted to bring
		
00:25:54 --> 00:25:57
			up to you some difficult issues. And by
		
00:25:57 --> 00:25:58
the way, I will inshaAllah be presenting at
		
00:25:58 --> 00:26:01
an AI conference next year, inshaAllah, about issues
		
00:26:01 --> 00:26:03
			of ethics. So I'm doing my own research
		
00:26:03 --> 00:26:04
			in this regard. If any of you are
		
00:26:04 --> 00:26:05
			experts in this regard, please come to me
		
00:26:05 --> 00:26:07
			to benefit me so that I can, get
		
00:26:07 --> 00:26:09
			some ideas as well. Jazaakumullah khair until next
		
00:26:09 --> 00:26:10
			time. Assalamu alaykum.