#Software #Engineering - Lecture 11: Security and Dependability, Safety, Usability #339
FurkanGozukara announced in Tutorials
Full tutorial: https://www.youtube.com/watch?v=NYlwD4XScuQ
#Free #Lesson #Course #Education
Writing good software is not just coding; it is also an engineering process. Large-scale real-life systems must be well planned and organized before coding begins. To succeed in such a large-scale project, every phase, from the initial idea to the delivery of the final product, must be well documented and the established rules must be followed. In this course, rather than writing code, you will be taught how to become a better software engineer.
Computer Skills course playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffsk7Rfb3OPCTnDdfKV3cNl5l
GitHub repository for the course: https://github.com/FurkanGozukara/Yazilim-Muhendisligi-IT522-2021
Discord channel invite link for the course: https://discord.gg/6Mrb8MwteQ
How Discord works / how to use it: https://youtu.be/AEwPtYiLvsQ
There are no prerequisites for taking this course.
If you want to learn programming or improve yourself, you can also follow our courses below:
[1] Introduction to Programming with C# course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffskoSXySh0MdiayPJsBZ7m2o
[2] Advanced Programming with C# course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffslHaBdS3RUW26RKzSjkl8m4
[3] Object-Oriented Programming with C# course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffsnH3XJb66FDIHh1yHwWC26I
[4] Responsive web design with BootStrap on C# ASP.NET Core MVC course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffsnAWtgk4ja3HN3xgMKF7BOE
[5] Artificial Intelligence and Machine Learning (sample programs in C#) course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffskVschrADCL6KEnL_nqDtgD
[6] Software Engineering course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffslgxMVyXhnHiSn_EWTvx1G-
[7] Security of Information Systems course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffslM_o92NwkaUzD7C6Fekx26
[8] Computer Skills course video playlist: https://www.youtube.com/playlist?list=PL_pbwdIyffsmyE2e909ea1MXLcMb8MenG
Video Transcription
00:00:03 Hello everyone, welcome to the 11th lecture of our Software Engineering course. The topic of today's lecture is security and dependability.
00:00:11 Before we begin, I want to comment on the concepts of security and dependability. These concepts get quite confusing when translated from English: there is safety, there is dependability, there is reliability, there is security. All of these can be translated into Turkish as "security" or "safety", so there's a bit of conceptual confusion. Actually, if you know English, it might be better to read the original version of the slides.
00:00:46 The topics covered in this section are the dependability properties. What we call "dependable" here means trustworthy: if something is dependable, you can trust it and rely on it. The system properties that lead to dependability include availability and reliability.
00:01:15 Availability: systems must be up in order to deliver services and must operate as expected. So, if you trust a system, the system must meet your expectations: it must always be online, it must not fail, it must not crash.
00:01:25 Safety: systems must not behave in an unsafe manner. Take a medical device, say an MRI machine or an X-ray machine: these devices are expected to operate in a way that does not harm human health.
00:01:45 And security: a system, like an insulin pump device, should be protected from external interference. For example, we hear about data leaks all the time: the email passwords of many people were stolen and published online. Or someone steals your credit card information from an e-commerce site where you shop and makes purchases without your knowledge. That happened to me once: the database of an e-commerce site was breached, and purchases were made with the information obtained from it. Of course, I had to cancel them all one by one.
00:02:36 Systems must be secure. Yes, system dependability is the most important system property for many computer-based systems. The dependability of a system reflects the user's level of trust in that system: the belief that it will work as expected and will not fail in normal use. Dependability encompasses the related system properties of availability, reliability, safety, and security. You see, translating such concepts from English is a bit difficult, but we'll manage.
00:03:21 Does dependability matter? System failures can have widespread effects, with many people affected by the failure. So, if there is a failure in the system, many of the people who use it can be affected: the credit card information of millions can be stolen, home addresses leaked, cell phone numbers exposed.
00:03:49 Unreliable, unsafe, or insecure systems... look, there is a mistake in the translation here; let me check the English original and fix it. Yes, look: system failures may affect a large number of people. "Unreliable" and "unsafe" can again both be translated into Turkish the same way; should we say unsafe or insecure? Either way, systems that are undependable, unsafe, or insecure may be rejected by their users.
00:05:05 In other words, in most cases you can never tolerate a system you use leaking your information. Furthermore, the costs of system failure can be very high if the failure causes economic losses or physical damage. For example, very serious compensation has to be paid for those who lose their lives in a plane crash, and of course no compensation can bring a lost life back. If a bank's system breaks down, serious money is lost, and that money must be returned to people. So a failure can cause both serious physical and economic damage.
00:05:42 Undependable systems can cause information loss with a high recovery cost. I also want to touch on a good point here.
00:05:53 There's a very serious cryptocurrency craze right now, right? But a crypto coin is not really money; that claim is not very accurate. These systems are not reliable in most cases. For example, people get hacked and all their money can be lost in an instant, and there's no authority you can turn to. Do you know what these systems are mostly used for? As a payment tool for crime. For example, very recently, let me show you what ransomware is. If you ask what ransomware is: just as a person can be kidnapped and held for ransom, there is ransom software, ransomware.
00:06:36 Let's search for it. Let's see, yes, here it is: the American oil producer paid $5 million to the hackers. Do you know how it was paid? Probably with some coin; let's see which coin. It doesn't matter which one, but a cryptocurrency. This is how people's systems are attacked; these things work like this.
00:07:23 The ransomware gets onto a computer; when it runs, it selects all the information on that computer according to the file extensions. It doesn't just delete that information: it first encrypts it, then deletes the originals, so your only way to recover the system and your data is to get the key from the hacker, meaning the key used for the encryption algorithm. You can't access your data without this key. That's why, in very critical systems, hackers are paid as a last hope. This increases the cost of recovering your data significantly. That's what happened here: because a very large company was hit, they said, "Let's pay the hackers, recover our keys, and continue production."
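As a side note (my own toy illustration, not from the lecture): the reason the data is unrecoverable is the symmetry of encryption, where only the exact key restores the plaintext. Here is a minimal sketch using a toy XOR stream cipher; real ransomware uses strong algorithms such as AES, and this toy cipher offers no real security.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a deterministic pseudo-random byte stream from the key
    # by chained SHA-256 hashing (illustration only, not secure).
    out = b""
    block = key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR is symmetric: applying it twice with the same key restores the data.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plain = b"quarterly-report.xlsx contents"
cipher = xor_crypt(plain, b"attacker-secret-key")
assert xor_crypt(cipher, b"attacker-secret-key") == plain  # right key: recovered
assert xor_crypt(cipher, b"wrong-guess") != plain          # wrong key: garbage
```

Without the exact key, the victim gets only garbage back, which is why the attacker's key is the only path to recovery once the originals are deleted.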
00:08:24 In fact, you hear about cryptocurrencies in many places, and they really have no benefit to the world. There's no real project behind them; no, this isn't "the technology of the future". That's all a story told within the community. The essence of it is speculation: some people make very serious profits through the exchanges, and each one tries to attract more people to join so that newcomers buy at a higher price and the profits keep rolling over. In fact, it's no different from Çiftlik Bank; it just operates globally. It also facilitates many things like money laundering, hacking, and tax evasion.
00:09:08 That's why I advocate banning crypto exchanges. They say, "Cryptocurrency can't be blocked." No, it can easily be prevented: if crypto exchanges are prohibited from working with banks, it can be easily prevented, and I think governments will make this decision in the future.
00:09:30 Also, because of cryptocurrency mining, graphics cards have become unavailable. It causes serious problems, most importantly with electricity. We are already a country dependent on imports for electricity; for example, we produce electricity from gas. So mining increases our electricity costs: it raises the price of the electricity we use at home, and so on. I can list the harms one after another, and you can't list a single benefit to me. I ask exactly this of those who defend this money on social media, but there are no benefits.
00:10:07 Yes, let's continue and not get off topic. As said, undependable systems can cause information loss with a high recovery cost, and as you can see, this news was published just 73 days ago. In America, for example, hundreds of millions of dollars are said to be paid to hackers every year. And how is it paid? With cryptocurrency. Now, let's look at the causes of failure.
00:10:44 Hardware failures occur due to design or manufacturing errors, or because components reach the end of their natural life. Software failures occur due to design or implementation errors. Operational failures are caused by human operators making mistakes. Perhaps this last one is the biggest problem facing socio-technical systems, and the biggest cause of system failures. Software can now be developed quite reliably, especially open-source software, since the code is read and verified by many people; and hardware is far less error-prone now that high-quality hardware is available. However, when humans are involved, no matter how reliable our software is, errors appear.
00:11:28 For example, what happened at Twitter? A vulnerability was exploited; with it, the attackers could tweet from any account they wanted. For example, they committed cryptocurrency fraud; it comes back to crypto again. Let me remind you right away: the Twitter hack; let's search for it like this. Here it is, on Wikipedia. Many accounts tweeted a Bitcoin scam: "send cryptocurrency to this address and I'll send back 2-3 times the amount", and many people sent money. Look, they stole about 110,000 US dollars, for example. They could have caused much greater damage; I mean, they could even have caused a war or a crisis between countries. Look, as you can see, the Apple account, for example, says, "We will send double the money to anyone who sends money to this address within 30 minutes." It's a scam.
00:12:41 Let's continue. The cause of this breach was again a socio-technical error, in other words, social engineering: the attackers obtained the credentials of people who had access to Twitter's management tools. So the system itself was not breached through its code; they achieved it through human error.
00:13:01 The principal dependability properties are these. Dependability, in other words, trustworthiness. Availability: the system should always be online and running; how much can I trust the system to be up? Look at the definition written here: the ability of the system to deliver services when requested. So the system offers a service and must be able to deliver it whenever it is asked. Reliability: the ability of the system to deliver services as specified; that is, the system operates without serious failure. Safety: look, in Turkish this is again translated as "security"; it is the ability of the system to operate without catastrophic failure, in other words without consciously or unconsciously harming people or the environment. Security: the ability of the system to protect itself against accidental or deliberate intrusion. These intrusions can come from malicious people; let's say the system prevents infiltration.
00:14:31 Yes, the principal properties again: availability is the probability that the system will be up and running and able to deliver useful services at any given time. Reliability is the probability that, over a given period of time, the system will correctly deliver services as expected by the user.
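For the availability definition above, a common back-of-the-envelope model (my own example, not from the lecture) estimates steady-state availability from the mean time between failures (MTBF) and the mean time to repair (MTTR):

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    # Steady-state availability: the fraction of time the system is up.
    # availability = MTBF / (MTBF + MTTR)
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A server that fails on average every 1000 hours and takes 2 hours to repair:
a = availability(1000.0, 2.0)
print(f"{a:.4%}")  # 99.8004%
```

Note how repairability feeds directly into availability: halving the repair time (MTTR) raises availability even if the failure rate stays the same.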
00:15:01 Safety is a judgment of how likely it is that the system will cause damage to people or its environment, and security is a judgment of how likely it is that the system can resist accidental or deliberate intrusions. The other dependability properties: repairability reflects the extent to which the system can be repaired in the event of a failure. Maintainability reflects the extent to which the system can be adapted to new requirements. Survivability reflects the extent to which the system can deliver services whilst under hostile attack. Error tolerance reflects the extent to which user input errors can be avoided and tolerated. We will look at each of these other dependability properties one by one.
00:15:51 reliability features, we will see all alone. Repairable Lig.
00:15:53 we will see all alone. Repairable Lig. This is repairable faster than sound and in a way that
00:15:56 This is repairable faster than sound and in a way that
00:16:03 minimizes the certainty caused by system failure. This downloadable diagnostic
00:16:11 requires access to faulty components and making changes to fix the problems. Repairability is a measure of
00:16:18 how easy it is to predict the software that causes a system failure. What is it like to
00:16:21 it like to
00:16:25 apply software to fix the faults that cause a system failure? In other words, how easy is it to predict a fault that has occurred?
00:16:29 predict a fault that has occurred? How easy is it to fix it?
00:16:31 How easy is it to fix it? This is their acceptability criterion. They are also
00:16:34 This is their acceptability criterion. They are also affected by the Unity operating environment.
00:16:37 affected by the Unity operating environment.
00:16:43 Therefore, it is difficult to evaluate the system before deploying it to the environment where it will run. It is difficult to predict this before deploying the system
00:16:47 before deploying it to the environment where it will run. It is difficult to predict this before deploying the system because it
00:16:51 because it depends a lot on the environment in which the system will run.
00:16:53 depends a lot on the environment in which the system will run. We will be working in this environment. It may be
00:16:57 We will be working in this environment. It may be
00:17:01 more difficult to repair the error of soap under water on your important device, or we will be working in the air.
00:17:10 Yes, or it could be an embedded system as an operating system. It could be Windows, Windows 11,
00:17:14 could be Windows, Windows 11, and so on. You
00:17:17 and so on. You
00:17:20 can quickly get them. Downtime caused by system failure can be minimized.
00:17:22 Downtime caused by system failure can be minimized. This diagnostic requires access to faulty components
00:17:26 This diagnostic requires access to faulty components and making changes to fix the problems. It is possible to
00:17:28 and making changes to fix the problems. It is possible to
00:17:35 repair the software that causes the faults that cause a system store that can buy Ben 10. Yes, it is the same thing. Did we
00:17:38 Yes, it is the same thing. Did we read it again? I put this twice.
00:17:40 read it again? I put this twice. Let's read it again.
00:17:43 Maintainability is a system attribute concerned with the ease of repairing a system after a fault is detected, or of modifying the system to include new features. Repairability takes a short-term perspective: immediate troubleshooting so that the system can be restored to its operating state. Maintainability takes a long-term perspective.
00:18:13 Maintainability is crucial for systems because maintenance is performed on a system many times, and when you perform maintenance, a new fault can be introduced into the system and cause a new problem, or an existing fault can be overlooked. Therefore maintainability is crucial: if a system is maintainable, the probability of a fault being introduced and going undetected is lower.
00:19:11 Survivability is an increasingly important attribute for distributed systems. It encompasses the concept of resilience: a system's ability to continue providing services to users in the face of deliberate or accidental attacks that could compromise security, and its ability to continue operating despite component failure. What's the English word here? "Survive"; in Turkish it actually means staying alive. Yes, there's a translation error here on the slide; "the attack that compromises security" fits better. Why are distributed systems the most exposed to this danger? Because distributed systems communicate over the internet, and passwords and similar protections often aren't taken very seriously.
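The idea of continuing to provide (possibly degraded) service when a component fails can be sketched like this (an illustrative example of my own; the function and cache names are made up):

```python
def fetch_live_price(fail: bool = False) -> float:
    # Primary component: stands in for a call to a remote service.
    if fail:
        raise ConnectionError("price service unreachable")
    return 42.5

def get_price(cache: dict, fail: bool = False) -> float:
    # Survivable design: fall back to the last cached value instead of
    # failing outright when the remote component is down.
    try:
        price = fetch_live_price(fail)
        cache["last_price"] = price
        return price
    except ConnectionError:
        return cache.get("last_price", 0.0)  # degraded but still available

cache = {}
assert get_price(cache) == 42.5             # normal operation warms the cache
assert get_price(cache, fail=True) == 42.5  # component down: degraded service
```

The service keeps answering with slightly stale data rather than going offline, which is the essence of resilience in a distributed system.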
00:20:09 Error tolerance is part of a more general usability attribute. It reflects the extent to which user input errors can be prevented, and are detected or flagged. Errors that may occur should be detected and corrected as automatically as possible before they are passed on to the system, so that they do not cause malfunctions. Here, you should think about the possible errors a user may make and take precautions against them. You should also log all errors in the system, that is, the errors you cannot foresee; later, you can find these errors by checking the logs. This is a very important attribute.
00:20:56 Dependability and security. A system's dependability requires that it runs in a secure environment and operates securely. A system can become unreliable because its data has been corrupted by an external attack, or because the service it provides is attacked — in other words, the attacker's aim is to make the system unusable. If such an attack reaches its goal, you can no longer be sure of the system's reliability or safety.
00:21:30 How do we achieve dependability? First, by avoiding the introduction of accidental errors while developing the system.
00:21:40 Second, by using verification and validation processes to discover and correct errors in the system. Both "verification" and "validation" are usually translated into Turkish with the same word, but they differ. Verification asks: does our system really have the features we wrote down — have we implemented the features we designed and planned correctly? Validation asks: can the system fulfill what the customer actually requested? Meeting the specification alone does not validate the system.
00:22:54 Third, by designing protection mechanisms that protect against external attacks — a firewall, for example, is a very common one. What is a firewall? Fourth, by configuring the system correctly for its operating environment — for example Windows: use it correctly and don't leave unnecessary ports open. You can close unused ports, and this gives you extra security. What is a port? A port is a gateway that allows the operating system to be accessed over the internet or over the local network.
00:23:34 Fifth, by including recovery mechanisms that help restore normal system service after a failure. Let me add a note here: it is very important to back up the system regularly and to store the backups in another location.
00:23:45 Dependability costs are very important. Dependability costs increase exponentially as higher and higher dependability levels are required. There are two reasons for this. The first is that more expensive development techniques and hardware are necessary to achieve a higher level of dependability.
00:24:09 For example, what does it mean for my hard drive to be in RAID 1? RAID 1 is a mirror. What is mirroring? Data written to one hard drive is simultaneously written to another hard drive, so for data to be lost, both hard drives would need to fail at the same time. I do this on my own computer as well, and you should definitely do it for your important data — set up a regular RAID mirror or something similar. You can even do this with three disks; then all three disks would have to fail simultaneously for data to be lost. If one of them fails, you can immediately replace it with a new disk and have the array automatically rebuild onto that disk. And this, of course, means cost, because you have to buy more disks.
00:25:02 The second reason is that more testing and system verification are required to convince the client and regulators that the required dependability has been achieved. In other words, more verification is needed to show that we have reached the level we can trust — and of course more testing and system verification means higher cost.
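The mirroring idea above can be put in numbers. Here is a minimal sketch, assuming disks fail independently with the same probability over a given period — a strong simplification for illustration only, since real disk failures are often correlated:

```python
# Sketch: why mirroring (RAID 1) cuts the chance of data loss.
# Assumes disks fail independently with the same probability p over
# some period -- a simplification for illustration only.

def loss_probability(p: float, copies: int) -> float:
    """Data is lost only if every mirrored copy fails."""
    return p ** copies

p = 0.05  # assumed per-disk failure probability (made-up figure)
for copies in (1, 2, 3):
    print(f"{copies} disk(s): loss probability = {loss_probability(p, copies):.6f}")
```

With two mirrored disks the loss probability drops from 5% to 0.25%, and with three to 0.0125% — at the cost of buying two or three times as many disks, which is exactly the cost/dependability trade-off the lecture describes.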
00:25:23 Now look at this graph of cost versus dependability. Initially we can reach a fairly high dependability level for a small cost, but as the required dependability level increases, what happens to our cost? It increases exponentially. So what should we do? We establish a trade-off between the dependability our system actually needs and the cost. When we find a reasonable level, we stop there: we aim for neither maximum security and excessive dependability, nor too little.
00:26:12 reliability. Yes, the economics of reliability are key to
00:26:14 the economics of reliability are key to achieving reliability. It might be
00:26:16 achieving reliability. It might be
00:26:23 more cost-effective to accept unreliable systems and pay for failures due to their very high costs. So, you're saying, "Okay, such
00:26:26 So, you're saying, "Okay, such a system shouldn't be very reliable because
00:26:28 a system shouldn't be very reliable because if a crash occurs, we can
00:26:30 if a crash occurs, we can fix it with a small cost.
00:26:34 fix it with a small cost. But my mother is making a game.
00:26:37 But my mother is making a game. What is the cost of
00:26:39 What is the cost of reloading the game from a popular location?
00:26:42 reloading the game from a popular location? Or there
00:26:46 Or there might be other factors here. This is for you to
00:26:50 might be other factors here. This is for you to decide. However, this depends on social and political
00:26:53 decide. However, this depends on social and political factors.
00:26:56 For any product, trust is essential: a reputation for unreliable products alone can cause future business losses. So how dependable does the product itself need to be? It depends on the type of system; for business systems in particular, a modest level of reliability may be sufficient. But let's look at this.
00:27:24 Now, the definitions — these apply to business systems too. Reliability is the probability that a system operates correctly, for a specific purpose, in a specific environment, for a specific period of time.
00:27:46 Do we have a translation problem here? The Turkish slide renders "reliability" and "dependability" with the same word, which is a bit problematic. What reliability expresses is: how much can you trust that the system will keep working?
00:28:24 Availability, in turn, is the probability that the system is up at any given time — that it will be operational and able to provide the requested services.
00:28:42 Both of these attributes can be expressed quantitatively; for example, an availability of 0.999 can be given.
00:28:54 An availability of 0.999 means the system is online 99.9 percent of the time. So, for example, over 1000 hours the system is operational for 999 hours and offline for only one hour. But if that one hour of unavailability falls exactly when you need the system, you can't use it — and some systems are really critical.
00:29:52 What do banks do? For example, they do their maintenance during the hours when the fewest people are using the system.
00:30:19 Sometimes it's possible to consider system availability within the scope of system reliability — frankly, if a system isn't available, it obviously isn't delivering its specified services. However, it is possible to have systems with low reliability that must still be available: as long as failures can be repaired quickly and without damaging data, some system failures may not be a problem. Therefore, availability is best evaluated as a separate attribute, reflecting whether the system's service delivery can be maintained. If the system must be taken out of service to repair failures, that out-of-service period is also taken into account: while the system is out of service, it is not usable, and that time enters the availability calculation.
00:31:14 However, this strict definition of reliability does not always reflect the user's perception of a system's reliability, because assumptions about the environment in which the system will be used may be incorrect. Using a system in an office environment can be quite different from using the same system in a university environment. Why? Because, for example, three or five people in an office will access the system, while thousands of people may need to access it at a university. We can give the same kind of example for firewalled systems, or for failures related to any internet-connected device.
00:31:53 The consequences of system failures also affect the perception of reliability. Unreliable windshield wipers in a car may be insignificant in a dry climate — because when would you need them? But if you can't trust them at the moment you do use them, that is a different matter.
00:32:11 Failures with serious consequences, such as an engine failure in a car, are given more weight by users. This is an important point. For example, if your car doesn't start, yes, it's annoying, but it may not cause a life-threatening problem — it's manageable. Or say the air conditioner doesn't work: this is annoying, but not life-threatening; you sweat a lot, but you can cope. When there's an engine failure, however, it prevents you from using your car at all and may even cause accidents. Therefore, I believe more weight should be given to failures that will have serious consequences.
00:33:18 Reliability and specifications. Reliability can only be formally defined with respect to a system specification; that is, a failure is a deviation from the specification. Let's look at the English here: the term on the slide is "reliability", not "dependability". So if a behavior is in the system specification and the system deviates from it, the system is unreliable.
00:33:51 However, many specifications are incomplete or incorrect. Therefore, a system that meets its specification can still fail from the user's perspective. If, while extracting the features, you got the requirements wrong and the system turns out not to be good enough, it might fail in the market — even though, in terms of reliability, it is successful for you, because the system fully meets its specifications. (By the way, the Turkish translation of "incomplete or incorrect" on the slide isn't quite right either.)
00:34:38 Also, users don't read specifications, so they don't know how the system is supposed to behave. Therefore, the perceived reliability — meeting the users' actual demands — is more important in practice.
00:34:56 Availability is generally expressed as the percentage of time that the system is available to deliver its services — for example, 99.95 percent. However, this figure does not take two factors into account.
00:35:19 One is the number of users affected by a service outage. For most systems, loss of service in the middle of the night is less serious than loss of service during peak usage periods. The system may be 99.95 percent available, yet unavailable exactly when its users actually need it — or, conversely, you might choose a time when nobody would be using it anyway, so nobody is aggrieved. Look at the numbers: 99.95 percent means five parts in ten thousand of downtime, not one in a thousand. If that five-in-ten-thousand of downtime falls at your usage peak, it might affect as much as one percent of your actual usage — in that case the effective impact is not five in ten thousand but one in a hundred. That's why the time at which service was interrupted is actually very important.
00:36:20 The other factor is the length of outages. The slide says several short interruptions are less likely to be disruptive than one long interruption, and that the repair time for very long interruptions is a special issue. We might have a translation error here — in many cases several short interruptions are in fact more disruptive than one long one. So: were all the interruptions a single long outage, or did several small interruptions occur without causing much disturbance? This is also an important element.
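The point that a raw uptime percentage ignores when an outage happens can be sketched by weighting each hour of downtime by how many users were active then. The hourly traffic profile and outage window below are invented purely for illustration; they are not from the lecture:

```python
# Sketch: naive uptime percentage vs. user-weighted availability.
# The traffic profile and the outage window are made-up figures.

HOURS = 100
traffic = [1] * HOURS   # relative number of active users per hour
traffic[50] = 200       # one artificial peak-usage hour

outage = {50}           # the system was down exactly during the peak

naive = 1 - len(outage) / HOURS
lost = sum(traffic[h] for h in outage)
weighted = 1 - lost / sum(traffic)

print(f"naive availability:         {naive:.4f}")     # 99% uptime by the clock
print(f"user-weighted availability: {weighted:.4f}")  # far worse for actual users
```

One hour of downtime out of a hundred looks like 99% availability, but because it hit the peak hour, most of the actual user demand went unserved.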
00:36:17 Yes, let's note the key points. What is dependability? The dependability of a system reflects the user's trust in that system. As I mentioned, dependability covers a set of related non-functional system attributes: availability, reliability, safety, and security.
00:37:41 See what it says here: non-functional. Functional features were a different thing, if you remember. These non-functional attributes are also very important, but they are not functions. For example, a system being 100% online is not a function — it's not an action — whereas, say, the ability to automatically lower and raise the windows while driving is a feature that does something. "The car starts and runs" is functional; something like "the car can do 200,000 kilometers without major trouble" is not a function but a quality attribute.
00:38:20 The availability of a system is the probability that its services will be ready for use whenever they are requested. The reliability of a system is the probability that its services will be delivered as specified. (On the slide, "dependable" and "reliable" are again conflated in translation; let's treat them together for now.)
00:38:46 Yes. Next: dependability and security, section 2 — reliability terminology. Term: human error or mistake. Description: human behavior that results in the introduction of faults into a system.
00:39:04 Look — the examples here are for the wilderness weather station system from our earlier examples. Yes, once more: the term is human error or mistake, and its description is human behavior that results in faults entering a system.
00:39:38 In the wilderness weather station system, if you recall, a programmer might decide that the way to calculate the time of the next transmission is to add one hour to the current time. This works except when the current time is between 23:00 and midnight — in the 24-hour clock, midnight is 00:00. Let's see what happens.
00:40:07 That human error can cause a system fault. The system fault is in the code: one hour is added to the last transmission time regardless of whether the result is greater than or equal to 23:00. That is the code the programmer wrote, and it is a system fault.
00:40:31 The fault can generate an erroneous system state, which can lead to unexpected system behavior: when the code executes, the transmission time value is set incorrectly — to 24:XX instead of 00:XX. That erroneous state then leads to a system failure: the system does not deliver the service expected of it. The stored transmission time is an hour that never arrives, so the weather data is not sent.
00:41:04 these three different error situations here. Do you know how this happened?
00:41:07 you know how this happened? Our goal is that the time
00:41:15 Our goal is that the time is between 23:00 and 24:00. I
00:41:24 is between 23:00 and 24:00. I don't want transmissions except between 23:00 and 24:00. I
00:41:28 don't want transmissions except between 23:00 and 24:00. I mean, let's
00:41:34 mean, let's say this. Let me explain it this way.
00:41:40 say this. Let me explain it this way. Yes, you have a weather
00:41:43 Yes, you have a weather station in our system. We say that
00:41:45 station in our system. We say that reporting should be hourly, then
00:41:48 reporting should be hourly, then what hours are there? For example,
00:41:50 what hours are there? For example, if you do it according to the American time system,
00:41:54 if you do it according to the American time system, how many hours are there from zero to 12?
00:41:58 how many hours are there from zero to 12? But if we
00:42:02 But if we do it according to the 24-hour system, how
00:42:05 do it according to the 24-hour system, how many hours are there from one to 24? 24 becomes 00 again,
00:42:09 many hours are there from one to 24? 24 becomes 00 again, that is, 23:59. Now the program says, "Let
00:42:13 that is, 23:59. Now the program says, "Let me add a fake hour."
00:42:15 me add a fake hour." Okay, that's a good logic, let me
00:42:17 Okay, that's a good logic, let me send it last. For example, it sent at 23:33.
00:42:21 send it last. For example, it sent at 23:33.
00:42:25 Yes, when you add it at this time, it should be the income hour. Should the sending time be
00:42:27 be the income hour. Should the sending time be 24/33? Should it be 00:33? It should
00:42:30 24/33? Should it be 00:33? It should be 00:33. But what happens if the
00:42:34 be 00:33. But what happens if the programmer makes a mistake and
00:42:36 programmer makes a mistake and doesn't check it? 24/33 becomes 24/33. This is
00:42:41 doesn't check it? 24/33 becomes 24/33. This is an error. What will happen when it is 24/32? It will
00:42:44 an error. What will happen when it is 24/32? It will never send data again. This is
00:42:47 never send data again. This is a system error, but who
00:42:49 a system error, but who caused it? Was it caused by human error?
00:42:52 caused it? Was it caused by human error? Is there another situation?
00:42:59 Is there another situation? This one: writing 24 instead of 00 — we already covered that. It could also be that the report is never sent because the scheduled time has already passed. That too would be a service delivery failure. So, in this case, could a failure resulting from the programmer's logic error lead to many system failures? Yes.
00:43:33 Yes — and what is the potential fault here for this system? If, instead of adding the hour by hand, the programmer had used the add-hours function that his programming language provides, the rollover to the next day in the 24-hour system would have been handled automatically. There is a ready-made date-time class for exactly this. Let's say we have an object of that class: what happens when we add a time span to it? It automatically adjusts everything — the day, the hour, the year, and so on. This way, you can easily and safely set the time to one hour later.
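In the C# used in this course, `DateTime.AddHours` plays this role. Here is a minimal Python sketch of the same idea — the buggy manual hour arithmetic from the example versus letting the standard date-time library handle the wrap-around (the 23:33 send time is taken from the lecture; the function names are illustrative):

```python
from datetime import datetime, timedelta

def next_send_hour_buggy(hour: int) -> int:
    # Naive logic from the example: just add 1, no wrap-around check.
    return hour + 1  # 23 -> 24, an invalid hour: the report never fires again

def next_send_time_safe(last_sent: datetime) -> datetime:
    # The library handles the rollover to the next day automatically.
    return last_sent + timedelta(hours=1)

last = datetime(2021, 5, 10, 23, 33)    # last report sent at 23:33
print(next_send_hour_buggy(last.hour))  # 24 -- no such hour exists
print(next_send_time_safe(last))        # 2021-05-11 00:33:00
```

The safe version never produces an invalid time because the library, not the programmer, owns the calendar arithmetic.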
00:44:29 Yes — faults, errors, and failures. Let's say failures are usually the result of system errors, which are themselves caused by faults in the system. However, faults do not necessarily lead to system errors: the erroneous system state may be transient and get corrected before a failure occurs, and faulty code may simply never be executed. Likewise, errors do not necessarily lead to system failures: they can be caught with built-in error detection and recovery, and failures can be prevented with built-in protection facilities, which can, for example, protect system resources from system errors.
00:46:16 Yes, we are having some trouble with the Turkish translation here, but the idea is this: the system is an input-output mapping. There is an input set that enters the program and an output set that it produces. There are also inputs that cause errors, and these lead to erroneous outputs. On top of this come software usage patterns: user one, user two, user three, and which of the erroneous inputs and outputs each of them actually encounters.
00:46:53 Yes, it is very important that reliability in use. Removing two percent of errors in a system
00:46:56 reliability in use. Removing two percent of errors in a system
00:47:03 and increasing confidence by x percent.
00:47:09 So, there is no direct one-to-one relationship. What should I do? A research study shows that
00:47:11 What should I do? A research study shows that 160% of product defects will be eliminated
00:47:16 160% of product defects will be eliminated with King confidence. Why do
00:47:17 with King confidence. Why do you think it showed a 100% improvement?
00:47:20 you think it showed a 100% improvement? Because those parts
00:47:23 Because those parts
00:47:28 might be very few users encountering those flaws. In this case, this only provides a 100% improvement.
00:47:31 only provides a 100% improvement. Program flaws
00:47:34 Program flaws can be in sections of code that are rarely executed. See, they
00:47:35 can be in sections of code that are rarely executed. See, they
00:47:41 are rarely used, so they are never encountered by users. Therefore,
00:47:42 never encountered by users. Therefore, their elimination
00:47:45 their elimination
00:47:54 does not affect the perceived reliability. Users can use warnings to avoid system features that could be successful for them.
00:47:55 system features that could be successful for them.
00:48:00 So, if a user encounters an error, if
00:48:02 user encounters an error, if there is a way to avoid this error,
00:48:05 there is a way to avoid this error, this user can change their behavior.
00:48:07 this user can change their behavior. A program with known bugs can
00:48:10 A program with known bugs can therefore
00:48:12 therefore still be
00:48:14 still be perceived as reliable by puzzles. A yes,
00:48:22 reliability success. Error prevention.
00:48:33 Development techniques are used to minimize the likelihood of errors or trap errors before they cause errors. Fault detection and correction. Verification and validation techniques that increase the likelihood of
00:48:36 and correction. Verification and validation techniques that increase the likelihood of detecting and correcting errors before they enter service.
00:48:38 detecting and correcting errors before they enter service.
00:48:43 And one person and change techniques
00:48:47 person and change techniques are used. Fault tolerance. System
00:48:51 are used. Fault tolerance. System errors do not cause system errors. And when you say
00:48:54 errors do not cause system errors. And when you say system errors here,
00:48:58 system errors here, of course, it
00:49:00 of course, it sounds illogical, but this
00:49:05 sounds illogical, but this yes. Come on, this page.
00:49:08 yes. Come on, this page. Ercan Tekniksa, the executing one right there
00:49:14 Ercan Tekniksa, the executing one right there on the system board, straight to the system Erdost.
00:49:19 on the system board, straight to the system Erdost. What should we say here instead of an error? I wonder if it
00:49:22 What should we say here instead of an error? I wonder if it means a system error or a volt error. But
00:49:24 means a system error or a volt error. But this system flaws, let's say this, are
00:49:29 this system flaws, let's say this, are system errors. Runtime techniques are used to ensure that these flaws do not cause system errors
00:49:31 system errors. Runtime techniques are used to ensure that these flaws do not cause system errors and that system errors
00:49:35 and that system errors lead to correct site stores.
00:49:40 lead to correct site stores.
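A minimal sketch of such a run-time fault-tolerance technique (the retry scheme, the simulated sensor, and all names are illustrative): an error is detected while the system is running and a recovery action keeps it from becoming a system failure.

```python
def read_sensor(attempt_results: list):
    # Simulated sensor: pops the next reading; None models a transient error.
    value = attempt_results.pop(0)
    if value is None:
        raise IOError("transient sensor error")
    return value

def read_with_recovery(attempt_results: list, retries: int = 3, fallback: float = 0.0):
    # Fault tolerance: detect the error at run time and recover by retrying,
    # falling back to a safe default instead of failing the whole system.
    for _ in range(retries):
        try:
            return read_sensor(attempt_results)
        except IOError:
            continue
    return fallback

print(read_with_recovery([None, None, 42.0]))  # 42.0 -- error detected, recovered
```

Here the fault (a flaky sensor) still produces errors, but built-in detection and recovery stop those errors from becoming a failure of the service.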
00:49:47 What does this mean? Run-time techniques are techniques that operate while the system is running. Next comes safety: the system's ability to operate, normally or abnormally, without danger of causing human injury or death — that is, without causing physical harm — and without damage to the system's environment. Although we could say that physical harm to people is really included in "the environment" as well. In other words, even when it operates in an abnormal situation, a safe system does not harm the environment or people.
00:50:41 It is increasingly important to consider software safety, because devices whose failure is critical now incorporate software-based control systems. What does this mean? It means we need to pay more attention to software safety, because nearly all systems that are critical in the event of a failure are now written in, and controlled by, software. For example, a dam is now controlled by software that decides when to open and close the dam gates — so that software is critical. It used to be a person manually opening and closing the gates; now it's the software. Is a bug here serious? In many places, yes: it could cause a collapse, which could lead to loss of human life. That's why software safety is so important in almost every system.
00:51:21 Nuclear power plants have been around for a very long time. How were the first nuclear power plants controlled? Probably with completely manual systems, by humans. But now they are controlled by software, which is why this matters so much. Safety requirements are usually exclusive requirements: instead of specifying required system services, they exclude undesirable situations — "the system shall not do this." These "shall not" statements make up the safety requirements.
00:52:31 There seems to be a sentence error here in the translation, right? What it means is that, instead of specifying required features, it says "these situations must not occur." Rather than listing all the requirements, the book phrases it this way, but we can think of it as specifying the bad situations and then offering ways to rule them out.
00:53:09 For example, the insulin pump control system: when a failure occurs, it directly affects and endangers human life, so it is safety-critical in the primary sense. Secondly, there are secondary safety-critical systems: their failures cause failures in other sociotechnical systems, and those failures then have safety consequences. In other words, they affect safety indirectly.
00:53:44 Failures can also harm people indirectly. For example, the patient management system of a mental health hospital is safety-critical at this second level, as a business system. A failure does not directly injure a patient, but it can lead to the prescription of inappropriate treatment, and the unsuccessful, incorrect treatment applied as a result can harm the patient's health. This is critical at the second level of safety.
00:54:22 Safety and reliability are interrelated but different. In general, reliability and availability are necessary but not sufficient conditions for system safety. Reliability is concerned with conformance to a given specification and with service delivery: a piece of software may be perceived as safe by some users simply because they have never, or only very rarely, encountered its unsafe faults — yet it may not actually be safe. Safety, in contrast, is concerned with ensuring that the system cannot cause damage, irrespective of whether or not it conforms to its specification. That is, even if a dangerous behavior is not excluded by the specification, the system should not contain it: regardless of whether a situation is inside the specification or not, what the user encounters should never cause the system to do damage.
00:55:26 This is about physical and environmental safety. There can be unsafe but reliable systems — let me look at some of these cases. A dormant fault can remain undetected in a system for many years and only rarely cause a malfunction. And if the system specification is incorrect, the system may behave exactly as specified but still cause an accident.
00:56:03 So, is the system reliable? With respect to its specification, yes — but it is not a safe system. Then there are hardware failures that generate spurious inputs; these are hard to anticipate. Hardware failures are really quite troublesome, because they are very difficult to predict. And there are context-sensitive commands — issuing the right command at the wrong time — which are usually the operator's fault, the result of user error.
00:56:38 Now, safety terminology. A note on the terms: I did this translation with artificial intelligence, and some of these terms have two meanings depending on context — this is exactly why we still haven't built very good translation with artificial intelligence; understanding context is hard. An accident (or mishap) is an unplanned event or sequence of events that results in human death or injury, or in damage to property or the environment. Notice there is nothing software-specific in this definition. An insulin overdose is an example of an accident. A hazard is a condition with the potential to cause or contribute to an accident; a failure of the blood sugar sensor is an example. Damage is a measure of the loss resulting from a mishap. Damage can range from minor injury or property damage upward: the damage resulting from an insulin overdose could be severe injury, or the pump user could die. So you see, damage ranges from the smallest to the largest. Hazard severity is an assessment of the worst possible damage that could result from a particular hazard. It can range from catastrophic, where many people die, to minor, where only slight damage occurs. When an individual death is a possibility, a reasonable assessment of hazard severity is "very high." That is, if death is possible, the severity of the hazard is very high — the highest, even — because death is irreversible.
00:58:35 There is also hazard probability: the probability that the events creating the hazard actually occur. Probability values tend to be arbitrary, but let's look at "probable" in quotes: the probability of the hazard occurring is, say, one in a hundred. So if a hazard can occur with about a one percent chance, we call it probable. You even see this kind of figure in the package inserts when using drugs: one section lists the common effects, and another says, for example, 1 in 1,000,000 for the rare ones. At the other end is "implausible": no conceivable situation is likely in which the hazard could occur. The probability of a sensor failure in the insulin pump that results in an overdose is probably low — for example, one in a billion hours of use is very low.
00:59:33 of use is very low. Here, a loop needs to be established. So,
00:59:38 a loop needs to be established. So, preventing everything 100% is impossible, but with
00:59:40 preventing everything 100% is impossible, but with a probability close to impossible, the system
00:59:44 a probability close to impossible, the system needs to be developed. The probability of failure.
00:59:46 needs to be developed. The probability of failure. Desire. That is, a system
00:59:51 Desire. That is, a system
00:59:55 can always operate stably when there is no hardware failure. But you cannot predict it in a frozen store.
00:59:57 you cannot predict it in a frozen store. Something very unusual could happen.
01:00:00 Something very unusual could happen. It's possible to take precautions against these,
01:00:02 It's possible to take precautions against these, but the cost increases. Therefore, the cost
01:00:04 but the cost increases. Therefore, the cost needs to be increased, but
01:00:08 needs to be increased, but
01:00:13 since this is your face pump, it's vital. It requires adding a few additional fault detection or preventative measures.
01:00:16 Risk is a measure of the probability that the system will cause an accident. It is assessed by taking into account the probability of the hazard, its severity, and the likelihood of the hazard actually leading to a serious accident. The risk of an insulin overdose is probably low. For example, is there always a risk of a plane crashing? There is — but because it is very rare and the risk is very low, we continue to fly aircraft. We can think of it this way.
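A minimal sketch of how the three factors in a risk assessment combine (all the numbers here are illustrative, not from the lecture): the hazard probability, the chance that the hazard leads to an accident, and the severity of the worst outcome are multiplied into one expected-loss style figure.

```python
def risk(hazard_probability: float,
         p_accident_given_hazard: float,
         severity: float) -> float:
    # A simple expected-loss style estimate: all three factors multiply,
    # so a rare hazard with a severe outcome can still be a low risk.
    return hazard_probability * p_accident_given_hazard * severity

# Illustrative figures: sensor failure once per 1e9 hours of use,
# 10% of such failures lead to an overdose accident, severity scored 1000.
print(risk(1e-9, 0.1, 1000))  # ~1e-07: a very low risk
```

This is why the insulin-pump overdose risk and the plane-crash risk can both be judged acceptable: even with maximal severity, a tiny enough hazard probability keeps the product small.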
01:00:45 Yes — achieving safety. Hazard avoidance: the system is designed so that certain classes of hazard simply cannot arise. Hazard detection and removal: the system is designed so that hazards are detected and removed before they result in an accident. Damage limitation: the system is designed so that the damage resulting from an accident can be, and should be, minimized. For example, at a nuclear facility they take precautions to minimize the consequences of a potential malfunction: they build very thick containment walls, they develop automatic systems to cool the reactor inside, and so on. In other
words, they are working very hard to limit damage. 01:01:32 Normal accidents: accidents in complex systems rarely have a single cause, because these systems are designed to be resilient to a single point of failure. This is very important, and it means many precautions need to be taken. For example, it is common to use RAID for storage — I just mentioned that I do something like this myself. Data is lost only if two disks fail simultaneously, which is really very unlikely; so what am I doing? I am taking a precaution. Designing systems so that a single point of failure does not cause an accident is a fundamental principle of safe system design. Almost all accidents are the result of combinations of malfunctions rather than individual failures.
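The RAID remark can be sketched numerically (the per-disk failure probability is illustrative, and the disks are assumed to fail independently): with mirrored copies, data is lost only if every copy fails at once, so the combined probability is the product of the individual ones.

```python
def data_loss_probability(p_disk_failure: float, copies: int) -> float:
    # Independent failures assumed: ALL copies must fail for data loss,
    # so the probabilities multiply.
    return p_disk_failure ** copies

p = 0.01  # illustrative chance that one disk fails in some time interval
print(data_loss_probability(p, 1))  # 0.01 -- a single point of failure
print(data_loss_probability(p, 2))  # ~1e-4 -- mirrored pair: far less likely
```

This is the same logic as the aircraft example later in the lecture: one engine failing is survivable; only the combination of all of them failing brings the plane down.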
01:02:23 Anticipating all possible combinations of failures, especially in software-controlled systems, is probably close to impossible. Therefore, achieving complete safety is impossible; accidents are inevitable. It's like plane crashes: there are many systems in place to prevent them. If one of a plane's engines fails, nothing happens; only if all of its engines fail might it actually crash. Think of it that way — there are many backup systems. Similarly, if one component fails, there are backup software systems; there are redundant processors, so one processor burning out is tolerated.
01:03:04 There are processors. One processor burns out. Like me, software security advantages. Although
01:03:07 Like me, software security advantages. Although software
01:03:08 software and security can be critical, the
01:03:10 and security can be critical, the use of software control systems
01:03:12 use of software control systems contributes to increasing system security.
01:03:14 contributes to increasing system security. Software monitoring and control.
01:03:17 Software monitoring and control.
01:03:21 Using electromechanical security systems is more than possible. It
01:03:26 allows monitoring and controlling the conditions of the reservoir. Software control allows for the identification of security strategies that
01:03:28 Software control allows for the identification of security strategies that reduce the time people spend in hazardous environments.
01:03:30 reduce the time people spend in hazardous environments.
01:03:36 Software can also detect and correct safety-critical operator errors, which is quite important. How does this work? Let's consider an example. Say a drinking water facility is dosed with chlorine; the operator enters the amount, and software checks it. Suppose a very high amount of chlorine is entered by mistake. Because the software system has validation checks, it can catch this error — whether the software produced the bad value or the user did. For example, say the maximum that should ever be entered in one operation is 5 kilograms and the operator types 50. What happens? The system immediately detects this and prevents a critical operator error.
01:04:40 This brings us to security: a system attribute that reflects the system's ability to protect itself from accidental or intentional external attacks. This capability has become an increasingly important system attribute, since most systems are networked and external access to the system over the internet is possible. Security matters because it is an essential prerequisite for availability, reliability, and safety. Of course, since this is translated from English, some of these terms come out the same way in Turkish.
01:05:34 Yes, fundamental security. If a system is a networked system and it is insecure, then statements about its reliability and safety become unreliable. Those statements depend on the executing system and the developed system being the same — but an intrusion can change the executing system or its data. Therefore, the reliability and safety assurances are no longer valid.
01:06:06 So attackers can do whatever they want to a compromised system. An asset is something of value that needs to be protected: the software used by the system, or the system itself.
01:06:18 Exposure is the potential loss or damage. This could be loss or corruption of data, or the loss of time and effort needed to recover after a security breach.
01:06:40 A vulnerability is a weakness in the system that can be exploited.
01:06:50 Yes, an attack is an exploitation of a system's vulnerability. It usually comes from outside the system and is a deliberate attempt to cause damage.
01:06:58 Threats are circumstances that have the potential to cause loss or harm; we can think of them as challenges to the system's security.
01:07:07 A control is a protective measure that reduces a system's vulnerability. Encryption, for example, is a control that reduces the vulnerability of a weak access-control system.
01:07:15 Here is an example of a security vulnerability, and of what encryption means.
01:07:25 When we register on a website, we provide a password. If that password is kept in plain form and the system's database is compromised, attackers can then log in to the system with your password and make transactions in your name.
01:07:42 The email address and password are matched as a pair, and the same password can be tried on other systems using that email address.
01:07:58 What can be done to prevent this? For example, the passwords are run through a one-way hash algorithm and are not stored in plain form. The numbers and letters are transformed, so when the hashed data is stolen, it gives the thief no usable information.
01:08:16 I'm going to go through security examples here from the mental-health hospital management system.
01:08:32 The asset is the record of every patient who has received or is receiving treatment. That's the example of an asset.
01:08:46 What is the exposure of the patient records? The potential financial loss from future patients who don't seek treatment because they don't trust the clinic to protect their data, the financial loss resulting from legal action by a sports star, and the loss of reputation.
01:08:59 What does this mean? For example, what happened to the sports star? He received treatment. But this treatment would negatively impact his career, so under patient privacy, no one should know about it.
01:09:15 What happens when this data is disclosed? It causes a very serious loss of reputation.
01:09:25 What about his future career? It's ruined. Or we can think of a politician in the same situation, or something like that.
01:09:28 There is also an effect on the hospital system itself. Normally, new patients would come to that system, right? But people who learn that the system is insecure don't come. What happens? This leads to a potential financial loss. And what is that? Exposure.
01:09:41 A vulnerability: a weak password system that makes it easy for users to set predictable passwords.
01:09:51 For example, user IDs that are the same as names, or a system that allows people to set passwords identical to their names: that is a vulnerability. Or a system that allows someone to try passwords over and over without limit; that is a vulnerability too.
01:10:05 An attack: impersonating an authorized user. The threat is that an unauthorized user will gain access by guessing an authorized user's credentials, the login name and password.
01:10:25 The control is a password-checking system that disallows passwords that are proper names or ordinary dictionary words. So if you don't allow easy passwords, that is a control.
01:10:33 That's why modern systems require that your password contain at least 8 characters, including one uppercase letter, one lowercase letter, one digit, and one special character.
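The password policy just stated can be expressed as a short check. This is a sketch of that specific policy (at least 8 characters, one uppercase, one lowercase, one digit, one special character); the function name is an assumption.

```python
import re


def is_strong_password(pw: str) -> bool:
    """Enforce the policy from the lecture: length >= 8, plus at least one
    uppercase letter, one lowercase letter, one digit, and one special
    (non-alphanumeric) character."""
    return (
        len(pw) >= 8
        and re.search(r"[A-Z]", pw) is not None
        and re.search(r"[a-z]", pw) is not None
        and re.search(r"\d", pw) is not None
        and re.search(r"[^A-Za-z0-9]", pw) is not None
    )
```

A registration form would call this before accepting the password, rejecting dictionary-style or all-lowercase choices outright.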
01:10:50 Threat classes. Threats to the confidentiality of the system and its data: unauthorized people or programs can access this information, and the information can be disclosed.
01:11:02 Threats to the integrity of the system and its data: software or data can be damaged or corrupted.
01:11:10 Threats to the availability of the system and its data can restrict authorized users' access to the system.
01:11:15 Look, by authorized users here I don't just mean the system's administrators. If you're using your own account, you're an authorized user. So if you can't access that account, what is that? A restriction on an authorized user.
01:11:34 These are the kinds of damage a security breach can cause. Denial of service: the system is forced into a state where normal services are unavailable or service delivery is significantly degraded.
01:11:40 If you perform a DoS attack, you don't actually get into the system, and you can't damage its data, but you prevent its use because all of the system's resources are consumed. Burning up resources leaves the people who genuinely want to use the system without any system resources.
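One common control against the resource exhaustion just described is rate limiting. The lecture doesn't prescribe a mechanism, so the token-bucket limiter below is only an illustrative sketch: each client gets a budget of tokens, requests spend them, and the bucket refills over time, so a flood from one source is throttled before it consumes the whole system.

```python
import time


class TokenBucket:
    """Illustrative per-client token-bucket rate limiter: at most `capacity`
    requests in a burst, refilled at `refill_per_sec` tokens per second."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; otherwise reject the request."""
        now = time.monotonic()
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A server would keep one bucket per client address; legitimate users stay under the limit, while a flooding client quickly runs out of tokens and is refused.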
01:12:05 Corruption of programs or data: programs or data in the system can be modified without authorization.
01:12:11 Disclosure of confidential information: information managed by the system can be exposed to people who aren't authorized to read or use it.
01:12:25 Yes, security assurance. Vulnerability avoidance: the system is designed so that security vulnerabilities cannot arise. For example, if there's no external network connection, an external attack is impossible.
01:12:39 For example, what if a nuclear power plant system has no open internet connection? This increases its security, because only those with access to the local network can reach it.
01:12:50 What do large companies usually do? They shut down direct internet access, set up a virtual private network, and the system can only be reached from outside through that VPN.
01:13:04 So how do hackers still manage to damage such a system? They compromise the account or computer of one of the users who has access to that system, and get in that way.
01:13:24 Attack detection and neutralization: the system is designed so that attacks are detected and neutralized before they result in an exposure. For example, virus checkers find and remove viruses before they infect a system.
01:13:42 That's the important point. What do they say? It's difficult to treat a disease after it enters the body; that's why there are preventive policies to stop the disease in the first place.
01:13:50 That's why it's very important to install an antivirus before your system gets infected. Blocking a virus is much easier than removing it after infection.
01:14:02 Why do we wear masks? To prevent the coronavirus from infecting us, because once the virus has infected you, it's much harder to get rid of it.
01:14:11 Exposure limitation and recovery: the system is designed to minimize the negative consequences of a successful attack. For example, a backup policy allows damaged data to be restored.
01:14:24 To continue the analogy: suppose we got sick despite wearing a mask. The mask still helps, because the viral load we received would be very low, so our body can defeat it much more easily than a large dose, and it develops antibodies in the meantime. How fast the virus multiplies depends on the amount received, so the more virus we receive, the faster it multiplies. Think about it this way.
01:14:50 Key points for this section: reliability is related to the probability of a failure occurring in operation. A system known to contain faults can still be reliable, because those faults occur so rarely that people do not encounter them; in practice, it is reliable.
01:15:06 Safety is a system attribute that reflects the system's ability to operate without threatening people or the environment. That is, whatever the circumstances of operation, it will never harm you, the environment, or people.
01:15:20 Security is a system attribute that reflects the system's ability to protect itself from external attacks. It is increasingly important, since most systems are networked.
01:15:33 If a system is not secure, its reliability and safety assurances are compromised, because an attack can modify its code or data. This also shows the dependencies among these attributes.
01:15:48 Yes, friends, that's all for this week. Let's get the lesson uploaded. Of course, this subject, like computer and information systems in general, is always better understood in English; the terms don't translate well into Turkish, and awkward translated terms abound.
01:16:03 That's why those who know English should read the English sources; you will benefit greatly from that. Go take a look at our lesson materials right away, and let's get them uploaded.
01:16:35 Take care of yourselves, let's pay attention to the Corona precautions. Hopefully, see you next time.