Yasir Qadhi – Is Artificial Intelligence Harmful for the Future of Islam

Yasir Qadhi

AI: Summary ©

The lecture discusses artificial intelligence: what it is, its rapid integration into daily life, and its benefits in fields such as medicine, translation, self-driving cars, and education. It also covers its dangers: ethical dilemmas over life-and-death decisions, its use in warfare, algorithmic manipulation of public opinion, the shift of power toward large corporations, and its implications for Islamic scholarship, including fatwas and hadith analysis.

00:00:00 --> 00:00:03
			AI will make choices that might be logical
		
00:00:03 --> 00:00:05
			and rational, but completely
		
00:00:05 --> 00:00:06
			unethical
		
00:00:06 --> 00:00:09
			because AI is not interested in ethics. And
		
00:00:09 --> 00:00:11
			some of those choices might even bring about
		
00:00:11 --> 00:00:14
			types of destruction to the human species.
		
00:00:15 --> 00:00:17
			So I don't know about the rest of
		
00:00:17 --> 00:00:19
			you, but, I have been without electricity all
		
00:00:19 --> 00:00:21
			day. How many of you without electricity all
		
00:00:21 --> 00:00:21
			day?
		
00:00:22 --> 00:00:22
			So
		
00:00:23 --> 00:00:25
			not actually, not everybody. Most of us apparently
		
00:00:25 --> 00:00:27
			have electricity. But,
		
00:00:28 --> 00:00:31
			so the topic therefore that I thought about
		
00:00:31 --> 00:00:32
			was exactly related to
		
00:00:33 --> 00:00:35
			a concept that I feel we need to
		
00:00:35 --> 00:00:36
			bring up. And it just so happened that
		
00:00:36 --> 00:00:37
			today was,
		
00:00:38 --> 00:00:39
			reminded me of it.
		
00:00:39 --> 00:00:43
			Of course, overall, the dependence that we have
		
00:00:43 --> 00:00:44
			on technology,
		
00:00:45 --> 00:00:46
			it is actually frightening.
		
00:00:47 --> 00:00:47
			Because
		
00:00:48 --> 00:00:51
			today, since 6 AM, I have been without
		
00:00:51 --> 00:00:53
			electricity, myself, my whole family, what not. And
		
00:00:53 --> 00:00:56
			in reality, it is so utterly trivial because
		
00:00:56 --> 00:00:58
			for 10000 years, mankind has lived like this.
		
00:00:59 --> 00:01:01
			But for us, 12 hours,
		
00:01:02 --> 00:01:04
			it's as if we don't know how we're
		
00:01:04 --> 00:01:04
			gonna survive.
		
00:01:05 --> 00:01:07
			We are so dependent
		
00:01:07 --> 00:01:09
			on this function,
		
00:01:10 --> 00:01:12
			we have forgotten how our own grandparents
		
00:01:13 --> 00:01:15
			and 10000 years before them have lived
		
00:01:16 --> 00:01:17
			without that interdependence.
		
00:01:18 --> 00:01:20
			So today I wanted to bring up
		
00:01:20 --> 00:01:21
			a new technology
		
00:01:22 --> 00:01:25
			that is already integrating into our lives and
		
00:01:25 --> 00:01:26
			we are becoming
		
00:01:27 --> 00:01:28
			frighteningly
		
00:01:29 --> 00:01:30
			dependent on this new technology.
		
00:01:31 --> 00:01:34
			And this generation, our generation,
		
00:01:34 --> 00:01:36
			is gonna see the interweaving
		
00:01:36 --> 00:01:37
			of this technology
		
00:01:38 --> 00:01:39
			to an unprecedented
		
00:01:39 --> 00:01:42
			level, and we're already seeing it. And it
		
00:01:42 --> 00:01:43
			is very important.
		
00:01:44 --> 00:01:46
			Every one of us, even those that don't
		
00:01:46 --> 00:01:48
			believe in a higher power, they must be
		
00:01:48 --> 00:01:49
			talking about this because
		
00:01:49 --> 00:01:50
			it is truly
		
00:01:51 --> 00:01:53
			frightening, and I'm talking about artificial intelligence,
		
00:01:54 --> 00:01:54
			AI.
		
00:01:55 --> 00:01:58
			I'm talking about artificial intelligence. Now, what exactly
		
00:01:58 --> 00:02:00
			is artificial intelligence and the pros and cons?
		
00:02:00 --> 00:02:02
			That's a whole long topic. But to summarize
		
00:02:02 --> 00:02:03
			in a nutshell,
		
00:02:04 --> 00:02:07
			artificial intelligence is a relatively new field. It
		
00:02:07 --> 00:02:09
			only dates back to almost 2 decades or
		
00:02:09 --> 00:02:10
			a decade and a half, and it's only
		
00:02:11 --> 00:02:13
			gained traction in the last few years in
		
00:02:13 --> 00:02:13
			particular.
		
00:02:14 --> 00:02:17
			And what artificial intelligence is, is
		
00:02:17 --> 00:02:19
			us human beings
		
00:02:19 --> 00:02:20
			programming
		
00:02:21 --> 00:02:21
			processors
		
00:02:22 --> 00:02:24
			to essentially think like us.
		
00:02:25 --> 00:02:28
			So imagine, realistically, in
		
00:02:28 --> 00:02:29
			5, 10 years,
		
00:02:30 --> 00:02:31
			your child or even you
		
00:02:32 --> 00:02:32
			having
		
00:02:33 --> 00:02:33
			a
		
00:02:34 --> 00:02:34
			partner
		
00:02:35 --> 00:02:36
			that you can have conversations
		
00:02:36 --> 00:02:39
			with, ask information of,
		
00:02:39 --> 00:02:40
			find detailed
		
00:02:41 --> 00:02:44
			analyses of your particular niche field, and that
		
00:02:44 --> 00:02:45
			partner is imaginary.
		
00:02:46 --> 00:02:48
			Imagine engaging in deep philosophical
		
00:02:48 --> 00:02:49
			conversation
		
00:02:49 --> 00:02:51
			and this partner that you're having a talk
		
00:02:51 --> 00:02:54
			with will be able to research as you
		
00:02:54 --> 00:02:55
			are saying something
		
00:02:56 --> 00:02:57
			and be able to
		
00:02:58 --> 00:02:58
			amass
		
00:02:59 --> 00:02:59
			thousands
		
00:03:00 --> 00:03:01
			of books instantaneously.
		
00:03:02 --> 00:03:04
			And as you ask a question about a
		
00:03:04 --> 00:03:05
			totally new field,
		
00:03:06 --> 00:03:09
			this partner of yours becomes a global expert.
		
00:03:10 --> 00:03:14
			And this is what artificial intelligence is. It's
		
00:03:14 --> 00:03:15
			the mechanism,
		
00:03:15 --> 00:03:16
			the potentiality
		
00:03:17 --> 00:03:18
			to
		
00:03:19 --> 00:03:21
			almost, not actually create, because only Allah is
		
00:03:21 --> 00:03:22
			Al Khaliq, but
		
00:03:23 --> 00:03:24
			to program
		
00:03:25 --> 00:03:25
			a system
		
00:03:26 --> 00:03:29
			whose results even we cannot predict.
		
00:03:30 --> 00:03:31
			This is what is frightening.
		
00:03:32 --> 00:03:34
			Up until now, we knew exactly what the
		
00:03:34 --> 00:03:37
			output would be. Up until now, computers and
		
00:03:37 --> 00:03:40
			programming were basically doing a lot of computation
		
00:03:40 --> 00:03:43
			super fast. We know exactly what the output's
		
00:03:43 --> 00:03:45
			gonna be. We could have done it ourselves,
		
00:03:45 --> 00:03:47
			except that the computer does it much faster
		
00:03:47 --> 00:03:48
			than us. Right?
		
00:03:49 --> 00:03:52
			What's happening now is that we are
		
00:03:53 --> 00:03:54
			allowing a program
		
00:03:55 --> 00:03:56
			to program itself
		
00:03:57 --> 00:03:59
			and learn as it continues.
		
00:04:00 --> 00:04:01
			And this is uncharted
		
00:04:02 --> 00:04:02
			territory.
		
00:04:02 --> 00:04:03
			Now,
		
00:04:04 --> 00:04:06
			already we are seeing the impact of this.
		
00:04:06 --> 00:04:07
			Now by the way, there's 2 types of
		
00:04:07 --> 00:04:08
			AI.
		
00:04:08 --> 00:04:11
			You have, AI which is very specific or
		
00:04:11 --> 00:04:14
			niche oriented, and this is fairly common already.
		
00:04:14 --> 00:04:17
			So for example, we already have, the technology,
		
00:04:17 --> 00:04:19
			which is a very frightening human technology
		
00:04:20 --> 00:04:21
			to be able to translate
		
00:04:22 --> 00:04:25
			a verbal speech into any other language. AI
		
00:04:25 --> 00:04:27
			can already do this. Right? Google can already
		
00:04:27 --> 00:04:29
			do this. You can speak into an app
		
00:04:29 --> 00:04:30
			in your conversational
		
00:04:30 --> 00:04:34
			English, Urdu, Swahili, in your particular dialect, in
		
00:04:34 --> 00:04:35
			your particular region.
		
00:04:36 --> 00:04:38
			And this app will now be able to
		
00:04:38 --> 00:04:42
			assess your accent, your terminologies, your nuances,
		
00:04:43 --> 00:04:44
			even your tone,
		
00:04:44 --> 00:04:47
			and will then translate into any of the
		
00:04:47 --> 00:04:49
			languages it is programmed to do on the
		
00:04:49 --> 00:04:53
			spot. This is specific AI, right? Another frightening one,
		
00:04:53 --> 00:04:55
			already being used by Israel and by China
		
00:04:55 --> 00:04:57
			and others is facial recognition.
		
00:04:58 --> 00:04:59
			And again, this is something we think is
		
00:04:59 --> 00:05:02
			basic, it's not. Imagine walking down the street
		
00:05:02 --> 00:05:04
			of anywhere in the world, and the camera
		
00:05:04 --> 00:05:06
			catches half of your face
		
00:05:06 --> 00:05:08
			at an angle, and it's a blurry picture.
		
00:05:09 --> 00:05:11
			Now AI has already reached a level where
		
00:05:11 --> 00:05:14
			it can recognize exactly who you are, and
		
00:05:14 --> 00:05:17
			if the governments have the databases, China does
		
00:05:17 --> 00:05:19
			and Israel does. And Allahu Alam, our own
		
00:05:19 --> 00:05:20
			country most likely does, but they're not saying
		
00:05:20 --> 00:05:22
			this. And you know, every time we go
		
00:05:22 --> 00:05:23
			to the airport, this is exactly what they
		
00:05:23 --> 00:05:26
			do. Right? So, being able to capture
		
00:05:27 --> 00:05:30
			your image, you will be trackable wherever you
		
00:05:30 --> 00:05:33
			are. Wherever you go, just one image anywhere,
		
00:05:33 --> 00:05:36
			and AI will be able to pick you out
		
00:05:36 --> 00:05:37
			of billions of people,
		
00:05:37 --> 00:05:38
			instantaneously,
		
00:05:39 --> 00:05:41
			they'll be able to recognize exactly where you
		
00:05:41 --> 00:05:43
			are. We're already using this in medicine. Doctors
		
00:05:43 --> 00:05:46
			here can tell us what AI is doing.
		
00:05:46 --> 00:05:48
			Amazing technological advances,
		
00:05:48 --> 00:05:51
			where your sonogram, or your MRI, or whatever
		
00:05:51 --> 00:05:54
			your diagnosis might be, the computer will
		
00:05:54 --> 00:05:54
			tap into
		
00:05:55 --> 00:05:58
			millions of different data points, and be able
		
00:05:58 --> 00:05:58
			to analyze
		
00:05:59 --> 00:06:01
			far better than any doctor can.
		
00:06:01 --> 00:06:03
			It might be possible
		
00:06:03 --> 00:06:06
			that soon we won't need medical doctors. Guys, don't
		
00:06:06 --> 00:06:08
			be scared because when that happens, you guys
		
00:06:08 --> 00:06:09
			will be in charge of it anyway. So
		
00:06:09 --> 00:06:12
			jobs will be there. Don't worry. But realistically,
		
00:06:12 --> 00:06:14
			you will be better off getting
		
00:06:15 --> 00:06:17
			analyzed by an AI doctor than by a
		
00:06:17 --> 00:06:20
			real doctor. Because the AI doctor will have
		
00:06:20 --> 00:06:23
			a database that is infinitely larger than any
		
00:06:23 --> 00:06:25
			human being. And the AI doctor will have
		
00:06:25 --> 00:06:28
			a one-stop specialty. Right now, if you
		
00:06:28 --> 00:06:30
			wanna have one specialist, then another, then another,
		
00:06:30 --> 00:06:32
			You have to keep on booking 5 different
		
00:06:32 --> 00:06:34
			appointments, go through your this and that. Imagine
		
00:06:35 --> 00:06:37
			one computer screen, one camera in front of
		
00:06:37 --> 00:06:41
			you, one analysis, everything being done
		
00:06:41 --> 00:06:41
			simultaneously.
		
00:06:42 --> 00:06:44
			We are literally a few years away
		
00:06:44 --> 00:06:47
			from something like this. This is tangible, realistic,
		
00:06:48 --> 00:06:49
			niche AI.
		
00:06:49 --> 00:06:50
			The more frightening
		
00:06:50 --> 00:06:52
			is general AI,
		
00:06:53 --> 00:06:56
			and we're heading there now. ChatGPT 4
		
00:06:56 --> 00:06:58
			and others, we're heading there now. And general
		
00:06:58 --> 00:07:02
			AI is basically a polymath, a super intellectual
		
00:07:02 --> 00:07:02
			genius,
		
00:07:03 --> 00:07:04
			a partner,
		
00:07:04 --> 00:07:06
			an intellectual partner
		
00:07:06 --> 00:07:07
			that
		
00:07:07 --> 00:07:09
			you have no idea what is gonna come
		
00:07:09 --> 00:07:11
			from this. You are talking about
		
00:07:11 --> 00:07:14
			all of the specialties of the world combined
		
00:07:14 --> 00:07:16
			into 1 person and you can converse with
		
00:07:16 --> 00:07:19
			that person. Imagine something like this. That is
		
00:07:19 --> 00:07:22
			where the world is heading. And it's way
		
00:07:22 --> 00:07:24
			the way things are happening is just a
		
00:07:24 --> 00:07:26
			matter of time. Already, chat gpt version 4
		
00:07:26 --> 00:07:29
			and others, they can do some amazing things
		
00:07:29 --> 00:07:31
			that are already frightening. And I'm telling you
		
00:07:31 --> 00:07:32
			as a professor, as a lecturer, as a
		
00:07:32 --> 00:07:35
			teacher, it's very frightening what is happening already.
		
00:07:36 --> 00:07:38
			And this is just the beginning of, you
		
00:07:38 --> 00:07:39
			know, the realities. Now,
		
00:07:40 --> 00:07:41
			there's a lot of positives as with all
		
00:07:41 --> 00:07:43
			technology. Today's not about the positives.
		
00:07:44 --> 00:07:45
			Of the positives, by the way, we just
		
00:07:45 --> 00:07:48
			told you, medical developments are gonna be
		
00:07:48 --> 00:07:51
			groundbreaking. Of the biggest positives, we're already seeing
		
00:07:51 --> 00:07:52
			this is

00:07:52 --> 00:07:53
			the

00:07:55 --> 00:07:58
			self-driving cars, right?
		
00:07:58 --> 00:08:00
			This is AI, self driving cars, to be
		
00:08:00 --> 00:08:02
			able to recognize anything on the road,
		
00:08:03 --> 00:08:04
			and then be able to
		
00:08:05 --> 00:08:06
			maneuver and navigate.
		
00:08:06 --> 00:08:08
			We've already seen this. I mean, this is
		
00:08:08 --> 00:08:09
			our generation.
		
00:08:09 --> 00:08:11
			A few years ago, it was a dream. Now
		
00:08:11 --> 00:08:13
			I drive a Tesla. So many people drive
		
00:08:13 --> 00:08:15
			a Tesla. You just click the button and
		
00:08:15 --> 00:08:17
			you just, I answer my email messages while
		
00:08:17 --> 00:08:19
			I'm driving the Tesla. Literally. You know, it's
		
00:08:19 --> 00:08:21
			halal, it is legal, because my hands are
		
00:08:21 --> 00:08:23
			on the road. I literally do my WhatsApp.
		
00:08:23 --> 00:08:25
			That's why I got the Tesla because of
		
00:08:25 --> 00:08:27
			time. Like I'm just literally just doing my
		
00:08:27 --> 00:08:29
			WhatsApp and answering email, and the
		
00:08:29 --> 00:08:30
			car is driving
		
00:08:31 --> 00:08:33
			automatically on the freeway. I have the full
		
00:08:33 --> 00:08:35
			self-driving version. Even on the roads it's driving.
		
00:08:35 --> 00:08:37
			And this is something we heard about for
		
00:08:37 --> 00:08:39
			a decade. Now, I have it. We have
		
00:08:39 --> 00:08:41
			it. The technology is already there. And it
		
00:08:41 --> 00:08:44
			keeps on improving. Every few days, there's a
		
00:08:44 --> 00:08:45
			new version.
		
00:08:45 --> 00:08:48
			Automatic upload. And it does some amazing things.
		
00:08:48 --> 00:08:50
			So we're already seeing this
		
00:08:50 --> 00:08:51
			reality.
		
00:08:52 --> 00:08:53
			There are some
		
00:08:53 --> 00:08:56
			challenges and ethical concerns, and we as Muslims
		
00:08:56 --> 00:08:58
			need to be very cognizant of this because
		
00:08:58 --> 00:09:01
			our sharia provides an answer for everything. We
		
00:09:01 --> 00:09:02
			need to be at the forefront of solving some
		
00:09:02 --> 00:09:05
			of the problems of these new technologies. So
		
00:09:05 --> 00:09:06
			I wanted to just bring them up, and
		
00:09:06 --> 00:09:07
			then at the end of the day, it's
		
00:09:07 --> 00:09:09
			not my area or forte. Other people need
		
00:09:09 --> 00:09:10
			to take this up and help us in
		
00:09:10 --> 00:09:13
			this regard. What are some of the problems
		
00:09:13 --> 00:09:16
			that AI might bring about? Well,
		
00:09:16 --> 00:09:17
			first and foremost,
		
00:09:18 --> 00:09:20
			one of the most obvious problems
		
00:09:20 --> 00:09:21
			is going to be
		
00:09:22 --> 00:09:23
			AI will have to decide
		
00:09:24 --> 00:09:26
			who lives and who dies
		
00:09:26 --> 00:09:29
			in a rational manner and not an emotional
		
00:09:29 --> 00:09:31
			manner. You see, when you are driving, may
		
00:09:31 --> 00:09:33
			Allah protect all of us, but you see
		
00:09:33 --> 00:09:33
			something
		
00:09:34 --> 00:09:36
			and you react on impulse.
		
00:09:37 --> 00:09:38
			You swerve,
		
00:09:38 --> 00:09:40
			you see a child, you see something, and
		
00:09:40 --> 00:09:42
			you just do something on impulse.
		
00:09:43 --> 00:09:47
			Generally speaking, we are forgiving of you, because
		
00:09:47 --> 00:09:48
			what could you do?
		
00:09:48 --> 00:09:50
			Generally speaking, no matter what happens, like if
		
00:09:50 --> 00:09:53
			it's not your fault, if somebody ran onto
		
00:09:53 --> 00:09:55
			the road and you did something and then
		
00:09:55 --> 00:09:56
			an accident happened,
		
00:09:56 --> 00:10:00
			generally speaking, you might not be criminally responsible.
		
00:10:00 --> 00:10:01
			You might be, you know, legally, whatever, but
		
00:10:01 --> 00:10:03
			nobody's gonna look at you as a
		
00:10:03 --> 00:10:05
			criminal. Like, you couldn't do anything. You're a
		
00:10:05 --> 00:10:07
			human being. You just reacted.
		
00:10:07 --> 00:10:10
			But you see, AI has zero emotion, and
		
00:10:10 --> 00:10:13
			AI is doing things at supersonic speed. Right?
		
00:10:13 --> 00:10:14
			Faster than that, at the speed of
		
00:10:14 --> 00:10:15
			light, to be precise.
		
00:10:16 --> 00:10:18
			So you will have to figure out
		
00:10:19 --> 00:10:21
			which life is more important.
		
00:10:22 --> 00:10:24
			And the AI is gonna make a calculation
		
00:10:25 --> 00:10:27
			in milliseconds, nanoseconds,
		
00:10:28 --> 00:10:29
			and will literally decide,
		
00:10:30 --> 00:10:32
			is the life of the driver more important
		
00:10:33 --> 00:10:33
			or the life
		
00:10:34 --> 00:10:36
			of that person on the road?
		
00:10:36 --> 00:10:39
			And how will you decide that? Well, some
		
00:10:39 --> 00:10:41
			programmers are gonna have to answer ethical questions.
		
00:10:42 --> 00:10:43
			Because at the end of the day, no
		
00:10:43 --> 00:10:45
			matter how awkward it is, you have to
		
00:10:45 --> 00:10:47
			program the AI to figure out what to
		
00:10:47 --> 00:10:49
			do. If there's a child, if there's an
		
00:10:49 --> 00:10:51
			elderly man, now we get to the famous,
		
00:10:51 --> 00:10:52
			you know,
		
00:10:53 --> 00:10:56
			philosophical problem that if you can swerve
		
00:10:56 --> 00:10:58
			a train onto a track with 2 people on it, and
		
00:10:58 --> 00:11:00
			you know, in the process save 5 people,
		
00:11:00 --> 00:11:03
			are you allowed to swerve the train? Or
		
00:11:03 --> 00:11:05
			should you let the train just go and
		
00:11:05 --> 00:11:07
			crash? You know these famous problems, these ethical
		
00:11:07 --> 00:11:09
			problems. Well, right now, when we do it,
		
00:11:09 --> 00:11:10
			it's just a hypothetical.
		
00:11:11 --> 00:11:13
			With AI, it will become a real problem.
		
00:11:13 --> 00:11:15
			The car is going straight down, it sees
		
00:11:15 --> 00:11:16
			a bunch of school children.
		
00:11:17 --> 00:11:20
			You will have to program the AI what
		
00:11:20 --> 00:11:21
			it needs to do.
		
00:11:21 --> 00:11:23
			And some people will die and some people
		
00:11:23 --> 00:11:25
			will live. And this will not be on
		
00:11:25 --> 00:11:26
			impulse.
		
00:11:26 --> 00:11:29
			This will be a rational decision
		
00:11:29 --> 00:11:32
			that somebody programmed into the AI.
		
00:11:32 --> 00:11:34
			Right? So this is one of the issues
		
00:11:34 --> 00:11:35
			that we're gonna have to be thinking about.
		
00:11:36 --> 00:11:36
			Already,
		
00:11:37 --> 00:11:38
			AI is being used in war.
		
00:11:40 --> 00:11:42
			Right now, according to international law, not that
		
00:11:42 --> 00:11:45
			it's being applied, but still, there is a modicum
		
00:11:45 --> 00:11:47
			of human involvement. Right now, according to the
		
00:11:47 --> 00:11:51
			UN and the international law, any drones that
		
00:11:51 --> 00:11:52
			drop bombs,
		
00:11:53 --> 00:11:55
			a human must make that decision.
		
00:11:55 --> 00:11:58
			So that a finger can be pointed at somebody:
		
00:11:58 --> 00:11:58
			it's your fault.
		
00:11:59 --> 00:12:01
			This is now the law of the world,
		
00:12:01 --> 00:12:03
			right? It is not allowed to send a
		
00:12:03 --> 00:12:05
			bomb on a civilization
		
00:12:06 --> 00:12:08
			or a population, whatever it might be, unless
		
00:12:08 --> 00:12:10
			and until at the last minute a human
		
00:12:10 --> 00:12:13
			being presses a button. The reason being, obviously,
		
00:12:13 --> 00:12:15
			they want to blame somebody or hold somebody
		
00:12:15 --> 00:12:17
			accountable. Pause here. Not that it helps all
		
00:12:17 --> 00:12:18
			the time. We see what's happening, but still
		
00:12:18 --> 00:12:20
			there's a human being that's made that decision.
		
00:12:21 --> 00:12:24
			What AI is gonna bring into the
		
00:12:24 --> 00:12:25
			picture is why do we need a human
		
00:12:25 --> 00:12:27
			to make the decision? What if we need
		
00:12:27 --> 00:12:29
			to make a split second decision? We don't
		
00:12:29 --> 00:12:31
			need to get a human being involved.
		
00:12:31 --> 00:12:33
			So once again, we're gonna get all of
		
00:12:33 --> 00:12:34
			these
		
00:12:35 --> 00:12:37
			complex problems that literally involve deciding
		
00:12:38 --> 00:12:39
			life and death
		
00:12:39 --> 00:12:42
			issues based upon artificial intelligence.
		
00:12:42 --> 00:12:43
			Another,
		
00:12:44 --> 00:12:46
			problem that we get of AI
		
00:12:46 --> 00:12:48
			is, and it's already happening at a
		
00:12:48 --> 00:12:51
			small level. We're gonna see this times 100
		
00:12:51 --> 00:12:53
			within a year or 2. And that is
		
00:12:54 --> 00:12:56
			AI, and this is frightening and we see
		
00:12:56 --> 00:12:57
			this now,
		
00:12:58 --> 00:13:01
			figures out who you are based upon your
		
00:13:01 --> 00:13:02
			history.
		
00:13:04 --> 00:13:07
			AI knows you better than your spouse and
		
00:13:07 --> 00:13:09
			your children. This is already true.
		
00:13:10 --> 00:13:13
			AI knows what types of videos you like,
		
00:13:13 --> 00:13:16
			what types of music, what types of clips,
		
00:13:16 --> 00:13:19
			what types of intellectual talks, what type of
		
00:13:19 --> 00:13:21
			astaghfirullah, fahish and haram. Everything,
		
00:13:21 --> 00:13:23
			AI has a profile.
		
00:13:23 --> 00:13:24
			And because
		
00:13:25 --> 00:13:27
			AI wants you to look at the computer
		
00:13:27 --> 00:13:29
			screen, well, because the,
		
00:13:30 --> 00:13:32
			social media apps want you to look at
		
00:13:32 --> 00:13:33
			the computer screen because they want money,
		
00:13:34 --> 00:13:38
			AI will then show you what it knows
		
00:13:38 --> 00:13:39
			will attract your attention.
		
00:13:41 --> 00:13:41
			And what
		
00:13:42 --> 00:13:43
			this allows
		
00:13:44 --> 00:13:45
			AI to do
		
00:13:45 --> 00:13:47
			is to brainwash you,
		
00:13:48 --> 00:13:50
			and to keep you
		
00:13:51 --> 00:13:54
			cut off from learning outside of your own
		
00:13:54 --> 00:13:55
			comfort zone.
		
00:13:56 --> 00:13:58
			And we already see this in the Israeli
		
00:13:58 --> 00:13:59
			Palestinian conflict.
		
00:14:01 --> 00:14:03
			Even though I would say at this stage,
		
00:14:03 --> 00:14:04
			it's not being done intentionally
		
00:14:05 --> 00:14:05
			because
		
00:14:06 --> 00:14:08
			all of your news feeds, without exception, when
		
00:14:08 --> 00:14:11
			you're going down Twitter and Facebook, all of
		
00:14:11 --> 00:14:12
			us in this masjid,
		
00:14:12 --> 00:14:14
			our news feed is generally
		
00:14:14 --> 00:14:15
			pro Palestinian.
		
00:14:17 --> 00:14:20
			Our news feed is generally people that are
		
00:14:21 --> 00:14:21
			sympathizing.
		
00:14:22 --> 00:14:25
			Why is that happening? Because of AI.
		
00:14:25 --> 00:14:27
			And what you guys need to understand
		
00:14:28 --> 00:14:31
			is that pro Zionist and pro far right
		
00:14:31 --> 00:14:32
			Christian fanatics
		
00:14:33 --> 00:14:35
			who are on a different wavelength,
		
00:14:36 --> 00:14:37
			their entire
		
00:14:37 --> 00:14:39
			scroll and news feed,
		
00:14:39 --> 00:14:41
			exact same time as you,
		
00:14:42 --> 00:14:44
			is gonna be completely different than
		
00:14:45 --> 00:14:47
			you. And here we are, me and you,
		
00:14:47 --> 00:14:49
			when we're going down Twitter and Facebook and
		
00:14:49 --> 00:14:51
			everything, we're like, why can't everybody else see
		
00:14:51 --> 00:14:52
			this? I see it. Why why is not
		
00:14:52 --> 00:14:55
			everybody else seeing what I'm seeing? Because they're
		
00:14:55 --> 00:14:57
			not seeing what you're seeing.
		
00:14:57 --> 00:14:59
			You can follow the exact same 2 people,
		
00:14:59 --> 00:15:01
			but the ones that come after these, the
		
00:15:01 --> 00:15:03
			ones that come in between, it will be
		
00:15:03 --> 00:15:05
			catered to your own psyche.
		
00:15:06 --> 00:15:07
			And therefore,
		
00:15:07 --> 00:15:09
			right now it is non-malicious, i.e.,
		
00:15:09 --> 00:15:11
			the computer algorithm
		
00:15:11 --> 00:15:13
			wants to feed you what it knows you'll
		
00:15:13 --> 00:15:16
			be interested in reading. And so you liked
		
00:15:16 --> 00:15:18
			a Palestinian protest
		
00:15:18 --> 00:15:21
			video somewhere, guess what? For the next 5 days,
		
00:15:21 --> 00:15:23
			you're gonna see more pro Palestinian protests.
		
00:15:23 --> 00:15:25
			You're gonna say, masha Allah, the tide is
		
00:15:25 --> 00:15:28
			changing, and it is changing, by the way.
		
00:15:28 --> 00:15:30
			But I'm saying, you are in your bubble.
		
00:15:30 --> 00:15:32
			Believe it or not, the other side, they're
		
00:15:32 --> 00:15:34
			only gonna see the news,
		
00:15:34 --> 00:15:35
			you
		
00:15:36 --> 00:15:38
			know, items and vignettes that are catering
		
00:15:38 --> 00:15:39
			to their worldview.
		
00:15:40 --> 00:15:41
			And they will form a totally
		
00:15:42 --> 00:15:43
			skewed worldview.
		
00:15:43 --> 00:15:45
			And they're gonna hear from politicians that are
		
00:15:45 --> 00:15:47
			pandering to them. And they're gonna see advertisers
		
00:15:47 --> 00:15:49
			that are pandering to them. And you 2
		
00:15:49 --> 00:15:52
			could be neighbors. Next house. And you 2
		
00:15:52 --> 00:15:54
			could be looking at the exact same screen
		
00:15:54 --> 00:15:57
			at the exact same time, but everything is
		
00:15:57 --> 00:15:57
			different.
		
00:15:58 --> 00:16:00
			And imagine this times,
		
00:16:00 --> 00:16:03
			10 months, 10 years, imagine what's gonna happen.
		
00:16:04 --> 00:16:06
			Now imagine, which is totally illegal,
		
00:16:07 --> 00:16:10
			but it might be happening by one particular
		
00:16:10 --> 00:16:12
			government, you can understand which one.
		
00:16:12 --> 00:16:14
			Imagine if it is now intentionally
		
00:16:15 --> 00:16:15
			done.
		
00:16:17 --> 00:16:20
			Right now it's algorithms, random, meaning you can
		
00:16:20 --> 00:16:21
			program it, and you can give it a
		
00:16:21 --> 00:16:23
			try. You can literally give it a try.
		
00:16:24 --> 00:16:26
			Look at something of a news item or
		
00:16:26 --> 00:16:28
			something you have never been interested in in
		
00:16:28 --> 00:16:29
			your life, okay.
		
00:16:30 --> 00:16:33
			Do some, you know, Antarctica cruise to see the
		
00:16:33 --> 00:16:35
			penguins. Just I'm giving you an example, literally.
		
00:16:35 --> 00:16:37
			And look at 2, 3
		
00:16:37 --> 00:16:40
			news items. You've never in your life been
		
00:16:40 --> 00:16:42
			interested in doing a cruise to the Antarctica
		
00:16:42 --> 00:16:43
			to go visit the penguins.
		
00:16:44 --> 00:16:46
			Next thing you know, for a few days,
		
00:16:46 --> 00:16:48
			little thing popping up there. Did you
		
00:16:48 --> 00:16:50
			know this about a penguin? And
		
00:16:50 --> 00:16:52
			then slowly but surely, you get drawn into
		
00:16:52 --> 00:16:54
			a whole different world. Now imagine this is
		
00:16:54 --> 00:16:57
			being done to advertise. Right now, it's money.
		
00:16:57 --> 00:16:59
			One country might be doing it to brainwash.
		
00:16:59 --> 00:17:00
			Imagine
		
00:17:00 --> 00:17:01
			if
		
00:17:01 --> 00:17:02
			powerful
		
00:17:02 --> 00:17:03
			interests decided,
		
00:17:04 --> 00:17:04
			let's
		
00:17:05 --> 00:17:05
			sway
		
00:17:06 --> 00:17:08
			American public opinion in a certain way.
		
00:17:10 --> 00:17:12
			This is very, very doable.
		
00:17:13 --> 00:17:16
			Because what AI can do, it can monitor
		
00:17:16 --> 00:17:17
			all of your biases
		
00:17:18 --> 00:17:21
			and then figure out how to begin tapping
		
00:17:21 --> 00:17:23
			in and swaying you the way that it
		
00:17:23 --> 00:17:24
			wants you to be swayed.
		
00:17:25 --> 00:17:28
			You will become a pawn in a game
		
00:17:28 --> 00:17:31
			that we have no understanding of how deep
		
00:17:31 --> 00:17:34
			it can go. Right? And this was predicted
		
00:17:34 --> 00:17:36
			in a different way by the famous intellectual
		
00:17:37 --> 00:17:38
			Noam Chomsky when he wrote his book in
		
00:17:38 --> 00:17:38
			1985
		
00:17:39 --> 00:17:42
			or something, Manufacturing Consent, where he said that
		
00:17:42 --> 00:17:43
			this is being done by the media at
		
00:17:43 --> 00:17:46
			a very low level. But now we're talking
		
00:17:46 --> 00:17:47
			about AI
		
00:17:47 --> 00:17:49
			assessing your psychological profile,
		
00:17:50 --> 00:17:53
			having a detailed analysis. Even your psychiatrist wouldn't
		
00:17:53 --> 00:17:54
			know what the AI knows.
		
00:17:55 --> 00:17:56
			And it knows exactly
		
00:17:57 --> 00:17:59
			how to begin to persuade you to have
		
00:17:59 --> 00:18:00
			a different worldview.
		
00:18:01 --> 00:18:03
			And this leads us to my next point,
		
00:18:03 --> 00:18:04
			and that is
		
00:18:04 --> 00:18:06
			what AI is doing,
		
00:18:06 --> 00:18:07
			it is shifting
		
00:18:08 --> 00:18:09
			power dynamics.
		
00:18:10 --> 00:18:11
			Right now,
		
00:18:11 --> 00:18:13
			power is in the hands
		
00:18:13 --> 00:18:14
			of the governments,
		
00:18:15 --> 00:18:16
			which is also bad.
		
00:18:16 --> 00:18:19
			But at least it's a physical tangible government
		
00:18:19 --> 00:18:20
			and you understand.
		
00:18:21 --> 00:18:22
			With AI,
		
00:18:22 --> 00:18:26
			power is gonna go to multibillion dollar corporations
		
00:18:27 --> 00:18:27
			that are operating
		
00:18:28 --> 00:18:30
			in extreme privacy.
		
00:18:30 --> 00:18:32
			And this is why there's so much tension
		
00:18:32 --> 00:18:35
			between Facebook and whatnot and between our governments,
		
00:18:35 --> 00:18:36
			because the governments are worried. And
		
00:18:36 --> 00:18:38
			the government wants to ban TikTok and
		
00:18:38 --> 00:18:41
			whatnot, because things are happening beyond their control.
		
00:18:42 --> 00:18:45
			AI is going to completely change power dynamics,
		
00:18:46 --> 00:18:48
			and the real power will be in the
		
00:18:48 --> 00:18:50
			hands of those who have access to all
		
00:18:50 --> 00:18:51
			of that data.
		
00:18:52 --> 00:18:54
			They will be far more powerful than any
		
00:18:54 --> 00:18:54
			government.
		
00:18:55 --> 00:18:55
			And
		
00:18:56 --> 00:18:58
			when it comes to our Islamic religion, in
		
00:18:58 --> 00:19:00
			particular, there are a number of specific issues.
		
00:19:00 --> 00:19:02
			We've already seen this a few months ago.
		
00:19:02 --> 00:19:05
			Somebody attempted an AI fatwa program.
		
00:19:07 --> 00:19:08
			It was a disaster
		
00:19:09 --> 00:19:10
			of the highest magnitude.
		
00:19:11 --> 00:19:14
			You ask it a basic question, and it'll
		
00:19:14 --> 00:19:15
			give you something totally irrelevant.
		
00:19:16 --> 00:19:18
			And so the guy himself had to apologize
		
00:19:18 --> 00:19:19
			and say it was just a prototype.
		
00:19:20 --> 00:19:21
			But here's the point.
		
00:19:21 --> 00:19:23
			What does a prototype mean?
		
00:19:23 --> 00:19:24
			It's only a matter of time
		
00:19:25 --> 00:19:26
			before
		
00:19:26 --> 00:19:27
			you don't need me anymore.
		
00:19:28 --> 00:19:31
			Yasir Qadhi becomes superfluous. Okay. You
		
00:19:31 --> 00:19:34
			will have Mufti ChatGPT, mufti saab.
		
00:19:35 --> 00:19:36
			Mufti GPT.
		
00:19:37 --> 00:19:39
			And I'm not even joking. This is where
		
00:19:39 --> 00:19:40
			this is heading now. Right?
		
00:19:41 --> 00:19:43
			This is where this is heading, where you
		
00:19:43 --> 00:19:45
			will ask your fatwa or your question, and
		
00:19:45 --> 00:19:47
			you'll even be able to input:
		
00:19:47 --> 00:19:49
			I want the Hanafi response.
		
00:19:50 --> 00:19:53
			I want this response, that response.
		
00:19:53 --> 00:19:53
			And
		
00:19:54 --> 00:19:55
			AI will be able to,
		
00:19:56 --> 00:19:57
			and here's the scary point,
		
00:19:58 --> 00:19:59
			99% of the time probably,
		
00:20:00 --> 00:20:02
			be accurate in giving you a response.
		
00:20:02 --> 00:20:04
			The problem comes that one time it'll be
		
00:20:04 --> 00:20:06
			wrong, it'll be majorly wrong. But they're
		
00:20:06 --> 00:20:09
			heading there. We're heading there, and it's very
		
00:20:09 --> 00:20:11
			soon. I have a friend; I cannot say too
		
00:20:11 --> 00:20:13
			much more about the project.
		
00:20:13 --> 00:20:15
			Let me just say generically,
		
00:20:15 --> 00:20:18
			he's a computer geek, neuro-whatever. He's
		
00:20:18 --> 00:20:19
			using AI
		
00:20:19 --> 00:20:20
			for
		
00:20:20 --> 00:20:21
			hadith isnads,
		
00:20:21 --> 00:20:23
			and an analysis of hadith.
		
00:20:24 --> 00:20:26
			And I've seen aspects of this, and it
		
00:20:26 --> 00:20:30
			is super exciting and super scary all at
		
00:20:30 --> 00:20:30
			once.
		
00:20:31 --> 00:20:33
			Where you just put in the hadith and
		
00:20:33 --> 00:20:34
			it's gonna automatically
		
00:20:35 --> 00:20:37
			look at all the books in the database
		
00:20:37 --> 00:20:39
			and draw an entire chart
		
00:20:39 --> 00:20:41
			for you, and then give you its own
		
00:20:41 --> 00:20:43
			verdict. You don't need Ibn Hajar or al-Albani.
		
00:20:43 --> 00:20:45
			You don't need it at all. Right? ChatGPT
		
00:20:45 --> 00:20:47
			will tell you the isnad,
		
00:20:48 --> 00:20:50
			And it'll tell you whether it's authentic or
		
00:20:50 --> 00:20:52
			not based upon all of these criteria.
		
00:20:53 --> 00:20:55
			We are already there. This is not in
		
00:20:55 --> 00:20:58
			1 generation. This is within a year or
		
00:20:58 --> 00:20:58
			2.
		
00:20:59 --> 00:21:01
			This is right now we are seeing this.
		
00:21:01 --> 00:21:04
			So the whole globe is changing in this
		
00:21:04 --> 00:21:04
			regard.
		
00:21:05 --> 00:21:06
			And people
		
00:21:07 --> 00:21:08
			who want
		
00:21:08 --> 00:21:11
			to find problems with Islam, they're using AI
		
00:21:11 --> 00:21:13
			for the wrong stuff as well when it
		
00:21:13 --> 00:21:14
			comes to Islam.
		
00:21:15 --> 00:21:16
			You know, and again, I don't wanna get
		
00:21:16 --> 00:21:18
			too explicit here but, you know, the main
		
00:21:18 --> 00:21:20
			miracle we have is that our book cannot
		
00:21:20 --> 00:21:21
			be reproduced.
		
00:21:22 --> 00:21:24
			AI is being used, and I know
		
00:21:24 --> 00:21:25
			this from my friends and whatnot, it
		
00:21:25 --> 00:21:27
			is being done to try to do something.
		
00:21:27 --> 00:21:29
			What are you gonna do in this regard,
		
00:21:29 --> 00:21:31
			right? These are people that have
		
00:21:32 --> 00:21:34
			complete, you know, nefarious intentions.
		
00:21:35 --> 00:21:37
			And they're using these types of technologies
		
00:21:37 --> 00:21:40
			to try to bring doubts to Islam and
		
00:21:40 --> 00:21:40
			the Muslims.
		
00:21:41 --> 00:21:44
			SubhanAllah. So we have now a very, very
		
00:21:44 --> 00:21:47
			different world coming up. And
		
00:21:47 --> 00:21:49
			if you're aware of what's happening, in the
		
00:21:49 --> 00:21:50
			last year,
		
00:21:51 --> 00:21:53
			massive internal scandals have happened within the AI
		
00:21:53 --> 00:21:56
			community. Even last week, one of the senior
		
00:21:57 --> 00:22:00
			highest level officials resigned in public
		
00:22:00 --> 00:22:02
			and said, there's no oversight.
		
00:22:02 --> 00:22:04
			I love this, but I'm frightened to death
		
00:22:04 --> 00:22:06
			of it, what you guys are doing. And
		
00:22:06 --> 00:22:09
			she didn't say more, but she resigned and
		
00:22:09 --> 00:22:12
			it caused shock waves because she didn't tell
		
00:22:12 --> 00:22:15
			us explicitly what's going on, but something happened
		
00:22:15 --> 00:22:17
			and what she
		
00:22:17 --> 00:22:19
			was saying is that there is no oversight
		
00:22:19 --> 00:22:22
			and you guys are not understanding the ethical
		
00:22:22 --> 00:22:25
			issues involved over here. So bottom line, I
		
00:22:25 --> 00:22:27
			know it's not exactly a purely Islamic thing,
		
00:22:27 --> 00:22:29
			but here's my philosophy.
		
00:22:29 --> 00:22:31
			We can't separate the deen from the dunya.
		
00:22:32 --> 00:22:33
			Muslims have to be aware of this. It's
		
00:22:33 --> 00:22:36
			gonna impact us, and it is impacting us.
		
00:22:36 --> 00:22:38
			If we can't live 12 hours without electricity,
		
00:22:39 --> 00:22:40
			in a few years
		
00:22:41 --> 00:22:43
			AI will be integrated into our phones.
		
00:22:44 --> 00:22:45
			In a few years AI will be in
		
00:22:45 --> 00:22:48
			our houses. In a few years, we're literally
		
00:22:48 --> 00:22:49
			gonna be dependent
		
00:22:49 --> 00:22:51
			on it, right? There will be a lot
		
00:22:51 --> 00:22:52
			of positives.
		
00:22:52 --> 00:22:55
			Can you imagine one of the easiest positives
		
00:22:55 --> 00:22:56
			already happening
		
00:22:56 --> 00:22:57
			is that
		
00:22:57 --> 00:22:59
			schools will not be needed anymore.
		
00:23:00 --> 00:23:02
			An AI will take charge of teaching your
		
00:23:02 --> 00:23:03
			child
		
00:23:04 --> 00:23:07
			exactly in the best manner that your child
		
00:23:07 --> 00:23:09
			needs. Your child is strong in one field,
		
00:23:09 --> 00:23:11
			AI will zoom over that. If the child is weak in
		
00:23:11 --> 00:23:13
			another, AI will be able to figure out
		
00:23:13 --> 00:23:15
			what is the best way to help your
		
00:23:15 --> 00:23:18
			child in that maths problem, in that engineering
		
00:23:18 --> 00:23:22
			problem, in that algebra problem. AI will know exactly
		
00:23:22 --> 00:23:24
			what will be the most, you know,
		
00:23:24 --> 00:23:25
			repetitive
		
00:23:25 --> 00:23:27
			routines that need to be done so that the
		
00:23:27 --> 00:23:30
			child understands this particular problem, and it'll be there
		
00:23:30 --> 00:23:33
			forever. So can you imagine a tutor
		
00:23:33 --> 00:23:36
			specifically for every human being in the world
		
00:23:36 --> 00:23:37
			catered to
		
00:23:38 --> 00:23:40
			your particular mindset. That's a massive positive.
		
00:23:40 --> 00:23:42
			But in the process,
		
00:23:42 --> 00:23:44
			by the time this child grows up, this
		
00:23:44 --> 00:23:47
			AI companion will be even more knowledgeable than
		
00:23:47 --> 00:23:49
			his kareen of the ins and the jinn.
		
00:23:49 --> 00:23:51
			The AI will know more about you than
		
00:23:51 --> 00:23:53
			the kareen of your own jinn knows about
		
00:23:53 --> 00:23:54
			you. Right? Maybe even the jinn will be
		
00:23:54 --> 00:23:56
			frightened of the AI because the AI knows
		
00:23:56 --> 00:23:59
			about the kareen as well. And we as
		
00:23:59 --> 00:24:00
			Muslims are disconnected
		
00:24:01 --> 00:24:02
			from that reality completely.
		
00:24:03 --> 00:24:04
			But my point to bring it up is
		
00:24:04 --> 00:24:06
			just to remind us that,
		
00:24:07 --> 00:24:08
			SubhanAllah,
		
00:24:09 --> 00:24:11
			we have to be cognizant. We're living in
		
00:24:11 --> 00:24:13
			a very fragile world. We're living in a
		
00:24:13 --> 00:24:15
			time and a place where within our lifetimes,
		
00:24:15 --> 00:24:17
			and anybody above the age of 30,
		
00:24:18 --> 00:24:20
			the technological change that has happened in your
		
00:24:20 --> 00:24:21
			lifetime,
		
00:24:21 --> 00:24:23
			it is exponential,
		
00:24:23 --> 00:24:25
			at the speed of light. I mean, I
		
00:24:25 --> 00:24:27
			remember, you all remember, those above about
		
00:24:27 --> 00:24:29
			the age of 40. Even cell phones, we
		
00:24:29 --> 00:24:31
			didn't have them. And now the first phones
		
00:24:31 --> 00:24:33
			that came, remember the Nokia that came out,
		
00:24:33 --> 00:24:35
			right? The little brick that came out. Remember
		
00:24:35 --> 00:24:37
			that back in the nineties.
		
00:24:37 --> 00:24:39
			Now look, we have more power on this
		
00:24:40 --> 00:24:40
			phone
		
00:24:41 --> 00:24:41
			than
		
00:24:42 --> 00:24:44
			NASA did on its supercomputers when it went
		
00:24:44 --> 00:24:46
			to the moon. We have more power on
		
00:24:46 --> 00:24:49
			this phone than NASA did on the computers
		
00:24:49 --> 00:24:51
			that filled this whole room, and they used
		
00:24:51 --> 00:24:52
			them to go to the moon. We have
		
00:24:52 --> 00:24:54
			more power here. We've already seen this in
		
00:24:54 --> 00:24:57
			1 generation. What is gonna happen next? Allahu a'lam.
		
00:24:58 --> 00:25:00
			We need to be very very careful. Final
		
00:25:00 --> 00:25:02
			point which is truly terrifying. One of the
		
00:25:02 --> 00:25:05
			biggest concerns that ethicists have about AI
		
00:25:05 --> 00:25:08
			is that once you give AI that much
		
00:25:08 --> 00:25:08
			power,
		
00:25:10 --> 00:25:13
			AI will make choices that might be logical
		
00:25:13 --> 00:25:15
			and rational, but completely
		
00:25:15 --> 00:25:16
			unethical.
		
00:25:17 --> 00:25:20
			Because AI is not interested in ethics.
		
00:25:21 --> 00:25:23
			And some of those choices might even bring
		
00:25:23 --> 00:25:25
			about types of destruction to the human species.
		
00:25:26 --> 00:25:27
			And there's a frightening
		
00:25:30 --> 00:25:33
			science fiction novel by Isaac Asimov in which
		
00:25:33 --> 00:25:34
			the computers take over the world. What is
		
00:25:34 --> 00:25:35
			it called? I forgot. I, Robot? I forgot;
		
00:25:35 --> 00:25:37
			I read it as a
		
00:25:37 --> 00:25:39
			kid. But he predicted this that a time
		
00:25:39 --> 00:25:42
			will come when humans are gonna be fighting
		
00:25:42 --> 00:25:42
			the machine,
		
00:25:43 --> 00:25:44
			and the machine will
		
00:25:45 --> 00:25:47
			know more than the human beings.
		
00:25:47 --> 00:25:50
			This is the reality we are facing here.
		
00:25:50 --> 00:25:52
			And wallahi, one wonders, perhaps it's better
		
00:25:52 --> 00:25:54
			that we don't go down all of that
		
00:25:54 --> 00:25:55
			route, and we just live our simple lives
		
00:25:55 --> 00:25:58
			rather than, once electricity is gone, not
		
00:25:58 --> 00:26:00
			even knowing how to light a candle anymore,
		
00:26:00 --> 00:26:03
			right? Maybe our ancestors had it wiser
		
00:26:03 --> 00:26:05
			and better that they could actually live a
		
00:26:05 --> 00:26:08
			simple and easy life. Allahu a'lam what the right
		
00:26:08 --> 00:26:10
			answer is. In any case, I wanted to bring
		
00:26:10 --> 00:26:12
			up to you some difficult issues. And by
		
00:26:12 --> 00:26:14
			the way, I will inshaAllah be presenting at
		
00:26:14 --> 00:26:17
			an AI conference, next year inshaAllah about issues
		
00:26:17 --> 00:26:18
			of ethics. So I'm doing my own research
		
00:26:18 --> 00:26:19
			in this regard. If any of you are
		
00:26:19 --> 00:26:21
			experts in this regard, please come to me
		
00:26:21 --> 00:26:23
			to benefit me so that I can, get
		
00:26:23 --> 00:26:24
			some ideas as well.
		
00:26:24 --> 00:26:25
			Until next time. Assalamu alaikum.