Introduction: Attacks on Computers
1. Security Trends
2. Need for Security
3. Security Attacks (Active and Passive attacks)
4. Security Services and Mechanisms
5. Network Security
6. Network Security Model
7. Internet Standards and RFCs
Unit 1 Network Fundamentals and Security
Introduction: Attacks on Computers
Computer threats and attacks involve accessing information, obliterating or manipulating data, destabilizing the computer, or degrading its performance. Computer attacks mainly take the form of information gathering, privilege escalation, buffer overflow exploits, remote access by unauthorized users, and denial of service attacks.
Security Trends: Security trends refer to the emerging patterns, practices, and technologies that are evolving to address new and ongoing security challenges. As cyber threats grow in sophistication and frequency, security trends capture the collective strategies and tools used to counteract and prevent these threats. Key cybersecurity trends include:
Adoption of Zero Trust Models - Emphasizing strict identity verification for every individual and device attempting to access network resources, reducing internal threats.
AI and Machine Learning in Security - Using artificial intelligence (AI) and machine learning (ML) to enhance threat detection, automate responses, and analyze data for patterns that reveal potential attacks.
Rise of Ransomware and Ransomware as a Service (RaaS) - Increased focus on ransomware prevention as attackers increasingly target sensitive sectors, with RaaS making attacks more accessible.
Supply Chain Security - Enhanced scrutiny and protection for third-party vendors and software providers to prevent supply chain breaches.
Cloud Security Enhancements - Addressing security for cloud-based infrastructures and applications as businesses migrate to the cloud, focusing on configuration, monitoring, and data protection.
Extended Detection and Response (XDR) - Integrating and analyzing security data across multiple layers, such as network, endpoint, and cloud, for faster and more effective threat response.
IoT Security - Securing Internet of Things (IoT) devices that are increasingly targeted due to weak security protocols and massive proliferation.
Human-Centric Security and Training - Recognizing human error as a major vulnerability, with a strong emphasis on employee training and awareness to prevent phishing and social engineering attacks.
Privacy-Enhancing Technologies - Protecting sensitive data during processing, especially crucial for collaboration across organizations and industries.
Quantum-Resistant Cryptography - Preparing for the potential of quantum computing to break traditional encryption, leading to the exploration of new encryption standards.
Three key objectives are at the heart of computer security:
1. Confidentiality: Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information. A loss of confidentiality is the unauthorized disclosure of information. This term covers two related concepts:
Data confidentiality: Assures that private or confidential information is not made available or disclosed to unauthorized individuals.
Privacy: Assures that individuals control or influence what information related to them may be collected and stored, and by whom and to whom that information may be disclosed.
2. Integrity: Guarding against improper information modification or destruction, including ensuring information nonrepudiation and authenticity. A loss of integrity is the unauthorized modification or destruction of information. This term covers two related concepts:
Data integrity: Assures that information (both stored and in transmitted packets) and programs are changed only in a specified and authorized manner.
System integrity: Assures that a system performs its intended function in an unimpaired manner, free from deliberate or inadvertent unauthorized manipulation of the system.
3. Availability: Ensuring timely and reliable access to and use of information. A loss of availability is the disruption of access to or use of information or an information system. Availability assures that systems work promptly and that service is not denied to authorized users.
Types of Computer Security
Computer security can be classified into four types:
1. Cyber Security: Cyber security means securing our computers, electronic devices, networks, programs, and systems from cyber attacks, i.e. attacks that happen when our system is connected to the Internet.
2. Information Security: Information security means protecting our system's information from theft, illegal use, and piracy. It has three main objectives: confidentiality, integrity, and availability of information.
3. Application Security: Application security means securing our applications and data so that they do not get hacked, and ensuring that the applications' databases remain safe and private to their owner so that users' data remains confidential.
4. Network Security: Network security means securing a network and protecting the information of the users connected through that network. Over the network, hackers steal packets of data through sniffing, spoofing, man-in-the-middle attacks, war driving, and similar techniques, and misuse the data for their own benefit.
Need for Security
Network security is the practice of protecting a computer network from unauthorized access, misuse, or attacks. It involves using tools, technologies, and policies to ensure that data traveling over the network is safe and secure, keeping sensitive information away from hackers and other threats. The basic principle of network security is to protect stored data and networks in layers, with rules and regulations that must be satisfied before any activity is performed on the data. These levels are:
Physical Network Security: The most basic level, which prevents unauthorized personnel from gaining physical control over the network and its data and thus compromising its confidentiality. This can be achieved by using devices such as biometric systems.
Technical Network Security: Primarily focuses on protecting data stored in the network and data in transit across the network. It serves two purposes: protection from unauthorized users and protection from malicious activities.
Administrative Network Security: Governs user behavior, such as how permissions are granted and how the authorization process takes place. It also determines the level of sophistication the network needs in order to protect itself against attacks, and suggests the necessary amendments to the infrastructure.
Types of Network Security
Email Security
Network Segmentation
Access Control
Sandboxing: Sandboxing is a cybersecurity technique in which code is run or files are opened on a host machine that simulates end-user operating environments, inside a secure, isolated environment. To keep threats off the network, sandboxing watches the code or files as they are opened and looks for harmful activity.
Firewall Security: A firewall is a network security device, either hardware or software based, which monitors all incoming and outgoing traffic and, based on a defined set of security rules, accepts, rejects, or drops that specific traffic (a small packet-filtering sketch follows this list). Before firewalls, network security was performed by Access Control Lists (ACLs) residing on routers.
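To make the firewall idea concrete, the following is a minimal, hypothetical sketch of rule-based packet filtering. The rule format, field names, and addresses are illustrative assumptions for teaching purposes, not the API of any real firewall product.

```python
# Minimal sketch of rule-based packet filtering (illustrative only; the
# rule format and packet fields are assumptions, not a real firewall API).

RULES = [
    # (source address prefix, destination port, action)
    ("10.0.0.",  22, "accept"),   # allow SSH from the internal subnet
    ("0.0.0.0",  23, "drop"),     # silently drop all Telnet
    ("0.0.0.0",  80, "accept"),   # allow HTTP from anywhere
]

DEFAULT_ACTION = "reject"         # anything unmatched is rejected

def filter_packet(src_ip: str, dst_port: int) -> str:
    """Return the action for a packet, using first-match-wins semantics."""
    for prefix, port, action in RULES:
        if dst_port == port and (prefix == "0.0.0.0" or src_ip.startswith(prefix)):
            return action
    return DEFAULT_ACTION

if __name__ == "__main__":
    print(filter_packet("10.0.0.5", 22))     # accept
    print(filter_packet("203.0.113.9", 23))  # drop
    print(filter_packet("203.0.113.9", 25))  # reject (default rule)
```

Real firewalls match on many more fields (protocol, state, interface), but the first-match-wins rule traversal shown here is the basic idea behind accepting, rejecting, or dropping traffic.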
Benefits of Network Security
Network security has several benefits, some of which are mentioned below:
Network security helps protect clients' information and data, ensures reliable access, and helps protect the data from cyber threats.
Network security protects the organization from the heavy losses that may occur from data loss or any security incident.
It protects the overall reputation of the organization, since it protects the data and confidential items.
Advantages of Network Security
Protection from Unauthorized Access: Network security measures such as firewalls and authentication systems prevent unauthorized users from accessing sensitive information or disrupting network operations.
Data Confidentiality: Encryption technologies ensure that data transmitted over the network remains confidential and cannot be intercepted by unauthorized parties.
Prevention of Malware and Viruses: Network security solutions like antivirus software and intrusion detection systems (IDS) detect and block malware, viruses, and other malicious threats before they can infect systems.
Secure Remote Access: Virtual private networks (VPNs) and other secure remote access methods enable employees to work remotely without compromising the security of the organization's network and data.
Disadvantages of Network Security
Complexity and Management Overhead: Implementing and managing network security measures such as firewalls, encryption, and intrusion detection systems (IDS) can be complex and require specialized knowledge and resources.
Cost: Effective network security often requires investment in hardware, software, and skilled personnel, which can be expensive for organizations, especially smaller ones.
Privacy Concerns: Some network security measures, such as deep packet inspection and monitoring, may raise privacy concerns among users and stakeholders, requiring careful balancing of security needs with individual privacy rights.
Security Attacks (Active and Passive Attacks)
What is an Active Attack?
Active attacks are attacks in which the attacker attempts to change or modify the content of messages. An active attack is a danger to integrity as well as availability. In an active attack the system is damaged and system resources can be changed. Most importantly, in an active attack the victim gets informed about the attack.
Advantages of an Active Attack (from the attacker's perspective)
Immediate Impact: Active attacks can immediately and visibly bring about conditions such as system halts, loss of data, and the like.
Potential for Data Manipulation: Attackers may corrupt or compromise data, creating data integrity problems with significant and prolonged implications for organizations.
Disruption of Services: Active attacks can be a great threat to services, since they aim at key systems or networks.
Disadvantages of an Active Attack
Higher Risk of Detection: Because active attacks involve modification or disruption, they are easier for security systems and administrators to identify.
Legal Consequences: Active attacks are unlawful, and if the attacker is apprehended, they will face legal repercussions.
Resource Intensive: An active attack normally requires more resources, tools, and technical skill than a typical passive attack.
What is a Passive Attack?
Passive attacks are attacks in which the attacker observes or copies the content of messages. A passive attack is a danger to confidentiality. A passive attack does no direct harm to the system. Most importantly, in a passive attack the victim does not get informed about the attack.
Advantages of a Passive Attack (from the attacker's perspective)
Low Risk of Detection: Passive attacks are hidden in the sense that they do not attempt to modify or destroy data or systems, and as such they are more difficult to recognize.
Information Gathering: Such attacks allow attackers to obtain useful information that can be used in future active attacks or for other malicious purposes.
Minimal Resources Required: Passive attacks can be accomplished with fewer means and less skill, and are therefore available to a larger set of potential attackers.
Disadvantages of a Passive Attack (from the attacker's perspective)
No Immediate Impact: Compared to active attacks, passive attacks cannot directly affect system resources, which may reduce their applicability in some cases.
Reliance on Future Actions: The information obtained in a passive attack has to be used at some later point to fulfill the attacker's goals, which requires additional measures.
Limited to Information Gathering: Passive attacks do not let the attacker manipulate or destroy data and are usually confined to the collection of information.
Security Services and Mechanisms
A security mechanism is a method or technology that protects data and systems from unauthorized access, attacks, and other threats. Security mechanisms provide data integrity, confidentiality, and availability, thereby protecting sensitive information and maintaining trust in digital transactions. A security mechanism can therefore also be described as a set of processes that deal with detection of, prevention of, and recovery from security attacks. Various mechanisms are designed to counter specific attacks at various protocol layers.
Types of Security Mechanism
Encipherment: This mechanism deals with hiding and covering data, which helps keep it confidential. It is achieved by applying mathematical calculations or algorithms that transform information into an unreadable form, using the two well-known techniques of cryptography and steganography. The level of protection depends on the algorithm used for encipherment.
Access Control: This mechanism is used to stop unauthorized access to the data you are sending. It can be achieved by techniques such as applying passwords, using a firewall, or simply adding a PIN to the data.
Notarization: This mechanism involves the use of a trusted third party in communication. The third party acts as a mediator between sender and receiver, reducing the chance of conflict, and keeps a record of the requests made by the sender to the receiver so that they cannot later be denied.
Data Integrity: This mechanism works by appending to the data a check value that is generated from the data itself (a short sketch follows this list). It is similar to sending a packet of information known to both the sending and receiving parties, which is checked before and after the data is received. If the appended value is the same when sending and receiving, data integrity is maintained.
Authentication Exchange: This mechanism deals with making identities known in communication. It is achieved, for example, at the TCP/IP layer, where a two-way handshaking mechanism is used to confirm the identities of the communicating parties before data is exchanged.
Bit Stuffing: This mechanism adds extra bits into the data being transmitted so that it can be checked at the receiving end, for example using even or odd parity.
Digital Signature: This mechanism is achieved by adding digital data that is not visible to the eye. It is a form of electronic signature that is added by the sender and checked electronically by the receiver. It is used when the data itself is not especially confidential but the sender's identity must be verifiable.
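As a concrete illustration of the data integrity mechanism above, the sketch below appends a keyed check value (an HMAC from Python's standard library) to a message and verifies it on receipt. The shared key and messages are hypothetical; this is one possible way to realize the "appended value" idea, not the only one.

```python
# Sketch of the data-integrity mechanism: append a check value derived
# from the data itself and verify it at the receiving end.
import hashlib
import hmac

SHARED_KEY = b"example-shared-secret"   # assumed to be known to both parties

def protect(message: bytes) -> tuple[bytes, bytes]:
    """Sender side: compute a keyed digest over the message."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message, tag

def verify(message: bytes, tag: bytes) -> bool:
    """Receiver side: recompute the digest and compare in constant time."""
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    msg, tag = protect(b"transfer 100 units to account 42")
    print(verify(msg, tag))                                    # True: unchanged in transit
    print(verify(b"transfer 900 units to account 42", tag))    # False: message was modified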
Network Security Model
A network security model shows how a security service is designed over the network to prevent an opponent from causing a threat to the confidentiality or authenticity of the information being transmitted through the network. For a message to be sent or received there must be a sender and a receiver, and both must mutually agree to exchange the message. The transmission of a message from sender to receiver needs a medium, i.e. an information channel, which is an internet service. A logical route is defined through the network (internet) from sender to receiver, and using communication protocols the sender and the receiver establish communication. Any such security service has the three components discussed below:
1. Transformation of the information to be sent to the receiver, so that any opponent present on the information channel is unable to read the message. This means encryption of the message. It also includes the addition of a code during the transformation that will be used to verify the identity of the authentic receiver.
2. Sharing of secret information between sender and receiver, of which the opponent must not have any clue. This is the encryption key, which is used during encryption of the message at the sender's end and during decryption of the message at the receiver's end.
3. A trusted third party, which takes responsibility for distributing the secret information (key) to both communicating parties and keeping it from any opponent.
The network security model presents two communicating parties, a sender and a receiver, who mutually agree to exchange information. The sender has information to share with the receiver, but cannot send the message over the information channel in readable form, as it would then be at risk of attack by the opponent. So, before sending the message through the information channel, it is transformed into an unreadable format. Secret information is used while transforming the message, and the same secret information is required when the message is retransformed at the recipient's side. That is why a trusted third party is required to take responsibility for distributing this secret information to both parties involved in the communication. Considering this general model of network security, one must address the following four tasks while designing the security service (a small code sketch follows this list):
1. Design an appropriate algorithm to transform a readable message at the sender's side into an unreadable format, such that it is difficult for an opponent to crack the security algorithm.
2. Generate the secret information, known as a key, which is used in conjunction with the security algorithm to transform the message.
3. Develop methods to distribute the secret information, since it is required at both ends: at the sender's end it is used to encrypt the message into unreadable form, and at the receiver's end it is used to decrypt the message back into readable form. A trusted third party distributes the secret information to both sender and receiver.
4. Specify the protocol the two parties will use, which makes use of the security algorithm and the secret information to achieve the particular security service.
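The minimal sketch below illustrates the transformation and shared-secret steps of this model using the third-party `cryptography` package (an assumption; any symmetric cipher would serve the same role). The "trusted third party" is simulated simply by generating the key once and handing it to both ends.

```python
# Sketch of the network security model: a shared secret key transforms the
# message into unreadable form at the sender and back at the receiver.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

# In the model, a trusted third party would generate this key and distribute
# it securely to both communicating parties.
shared_key = Fernet.generate_key()

def sender(plaintext: bytes, key: bytes) -> bytes:
    """Transform the readable message into an unreadable form (encryption)."""
    return Fernet(key).encrypt(plaintext)

def receiver(ciphertext: bytes, key: bytes) -> bytes:
    """Retransform the message back into readable form (decryption)."""
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    token = sender(b"meet at the usual place", shared_key)
    print(token)                         # unreadable on the information channel
    print(receiver(token, shared_key))   # b'meet at the usual place'
```

An opponent who observes the channel sees only the unreadable token; without the shared key, the message cannot be recovered.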
There are also attackers who attack your system because it is accessible through the internet. These attackers fall into two categories:
1. Hackers: Those who are only interested in penetrating your system. They do not cause any harm; they are satisfied simply by gaining access to your system.
2. Intruders: These attackers intend to damage your system or try to obtain information from the system that can be used for financial gain.
The attacker can place a logical program on your system through the network which can affect the software on your system. This leads to two kinds of risks:
a. Information threat: This kind of threat modifies data on the user's behalf that the user should not actually have access to, such as enabling some crucial permission in the system.
b. Service threat: This kind of threat prevents the user from accessing data on the system. Such threats can be introduced by launching worms, viruses, and similar programs on your system. Attacks with worms and viruses are software attacks that can be introduced to your system through the internet.
The network security model used to secure your system against such attackers works in two ways (a login-check sketch follows this paragraph). The first is to introduce a gatekeeper function: introducing login-ids and passwords keeps unwanted access away. In case an unwanted user does get access to the system, the second way to secure your system is to introduce internal controls that detect the unwanted user by analyzing system activities. This second method is what we call antivirus software, which we install on our systems to prevent unwanted users from accessing the computer system through the internet. So, this is all about the network security model. We have discussed two network security models: one for securing your information over the network during transmission, and one for securing your information system itself, which can be accessed by an attacker through the network or internet.
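The following is a hypothetical sketch of the gatekeeper function described above: a login-id and password check using a salted hash from Python's standard library. The stored user, password, and parameters are illustrative assumptions, not part of any real system.

```python
# Sketch of a gatekeeper function: verify a login-id and password before
# granting access. Passwords are stored only as salted PBKDF2 hashes.
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a salted hash so plaintext passwords are never stored."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Illustrative "user database": login-id -> (salt, password hash)
_salt = os.urandom(16)
USERS = {"alice": (_salt, hash_password("correct horse battery", _salt))}

def gatekeeper(login_id: str, password: str) -> bool:
    """Return True only if the login-id exists and the password matches."""
    record = USERS.get(login_id)
    if record is None:
        return False
    salt, stored_hash = record
    return hash_password(password, salt) == stored_hash

if __name__ == "__main__":
    print(gatekeeper("alice", "correct horse battery"))  # True: access granted
    print(gatekeeper("alice", "wrong password"))         # False: kept out
    print(gatekeeper("mallory", "anything"))             # False: unknown user
```

The internal-control (antivirus) layer mentioned above would sit behind this gatekeeper, watching system activity for anything the login check failed to stop.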
Internet Standards and RFCs
Internet standards are fundamental to the global internet: given the complex and diverse nature of the systems and technologies on the internet, they define how these systems can interoperate. These standards are produced by the Internet Engineering Task Force (IETF), which has worked on internet protocols for a long time and has a strict mechanism for revising and approving them.
Key Terms
Internet Standard (STD): A formal specification of rules and policies that promotes uniform and compatible use of the internet architecture across the world.
Request for Comments (RFC): The collection of documents that forms the foundation for developing, promoting, and implementing internet standards and protocols.
Proposed Standard: A specification that has gone through a basic level of evaluation and review and has entered the standardization process, showing that it is more than just someone's planned piece of work.
Draft Standard: A more mature stage for a specification whose ideas have already been implemented and proven in other settings.
Internet Standard: A widely adopted and highly stable standard that has turned out to be very effective in actual practice for interconnection.
Importance of Internet Standards
Interoperability: They enable many different systems, networks, and technologies to work together efficiently.
Security: They define measures that help improve the overall level of internet security.
Innovation: They provide a stable foundation on which new technologies and services can be built.
Reliability: They help ensure the stability and functionality of the internet globally.
Global Adoption: They ensure that internet technologies are accepted and usable across the world.
Examples of Internet Standards
TCP/IP (Transmission Control Protocol / Internet Protocol): The fundamental suite of protocols that governs how data is addressed and transmitted across the internet.
HTTP/HTTPS (Hypertext Transfer Protocol / Secure HTTP): The general communication rules of the World Wide Web, with HTTPS adding encryption for secure transfers.
SMTP (Simple Mail Transfer Protocol): The general rules established for transferring email.
DNS (Domain Name System): The system that maps human-readable domain names to IP addresses (see the lookup sketch after this list).
IPv6 (Internet Protocol version 6): The internet protocol selected to replace the older IPv4, whose capacity for providing addresses is limited.
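To illustrate the DNS example above, the short sketch below resolves a domain name to an IP address using Python's standard library. The domain is just an example; any public hostname would do, and the call requires network access.

```python
# Sketch of what DNS does in practice: map a human-readable name to an IP address.
import socket

hostname = "www.example.com"                 # example domain; any public name would do
address = socket.gethostbyname(hostname)     # performs the DNS lookup
print(f"{hostname} resolves to {address}")
```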
RFCs
The planning and growth of the internet can be traced back to a sequence of documents called Requests for Comments (RFCs). RFCs are the foundation stones of internet standardization, providing the essence, specifications, protocols, procedures, and methodology by which the distinct elements of the internet are interlinked. The RFC series began in 1969 and is constantly updated to date; the Internet Engineering Task Force (IETF) is responsible for its establishment, development, and current management.
What is a Request for Comments (RFC)? In the world of networking and internet protocols, an RFC, or 'Request for Comments', is a type of technical document issued by the Internet Engineering Task Force (IETF) that describes specifications, procedures, and standards for given internet technologies. This section covers the historical perspective, aims, formation, and role of RFCs in defining today's internet.
Historical Background
RFCs were created in 1969 by Steve Crocker, an early ARPANET proponent. The first RFC was titled "Host Software" and was the very first step in documenting the protocols used in ARPANET; although it was a simple document, it marked the beginning of the RFC series. At first, RFCs were meant to be basic, informal communications that could be written and circulated quickly to encourage conversation among the researchers and engineers developing the initial network.
ARPANET stands for Advanced Research Projects Agency Network. ARPANET was the first network with distributed control and the first to implement the TCP/IP protocols; it was essentially the beginning of the internet built on these technologies. It was designed with the basic idea of allowing scientific users at institutes and universities to communicate with one another.
Characteristics of ARPANET:
It is basically a type of WAN.
It used the concept of a packet-switching network.
It used Interface Message Processors (IMPs) for sub-netting.
ARPANET's software was split into two parts: a host and a subnet.
Advantages of ARPANET:
ARPANET was designed to remain in service even during a nuclear attack.
It was used for collaboration through e-mail.
It advanced the transfer of important defense files and data.
Limitations of ARPANET:
The increasing number of LAN connections became difficult to handle.
It was unable to keep up with advances in technology.
Purpose of RFCs
RFCs serve several critical functions in the development and maintenance of internet protocols and standards:
Documentation: RFCs offer a complete and authoritative source of knowledge about specific protocols, procedures, and standards, and act as an archive of the development of internet standards.
Standardization: RFCs are the common vehicle for developing protocols and procedures, helping to create compatible systems and networks.
Discussion and Collaboration: Discussions carried out in the context of RFCs are relatively free and unstructured, allowing researchers, engineers, and developers to come together and improve internet technology.
Guidance: An RFC is a finely documented method or technique for employing a particular technology or set of technologies on the internet, and offers best practices for implementing and deploying those technologies.
Structure of an RFC
An RFC can be general or complex in nature, but it follows a specific format, usually consisting of several essential parts:
Title Page: Typically contains the RFC number and title, the authors, the date of publication, and possibly an abstract giving a brief description of the document's contents.
Status of This Memo: Shows how far along the standards track the RFC is, for example whether it is currently a Proposed Standard, a Draft Standard, or simply an informational RFC.
Table of Contents: An organized outline that makes the document easy to navigate.
Introduction: Gives the context, the aim of the RFC, and a description of the problem being solved.
Specification: Elaborates on the main topic, including definitions, protocols, algorithms, procedures, and other technical descriptions.
Security Considerations: Identifies common security threats and makes recommendations regarding their management.
References: Lists all the sources referred to throughout the document, such as other RFCs, journal articles, and technical papers.
Acknowledgments: Credits the people and organizations who assisted in putting the RFC together.
Types of RFCs
RFCs can be grouped into several types depending on their function and status. In most cases, RFCs are of the following types:
Standards Track RFCs: These RFCs contain new internet standards or changes to existing ones, and may specify protocols, how they are to be implemented, and how they interconnect. They are put through a great deal of scrutiny before being considered ready for adoption as standards.
Informational RFCs: These documents offer information and explanations concerning a range of internet topics but do not include new standardization recommendations.
Experimental RFCs: These RFCs, produced by the IETF, define experimental protocols or procedures that may not yet be ready for standardization but are published for testing and further review.
Best Current Practice (BCP) RFCs: These are a valuable source of information about how to properly employ and deploy internet technologies.
Historic RFCs: These RFCs describe protocols that are no longer implemented or procedures that have been replaced by newer ones.
The RFC Process
The process of preparing and releasing an RFC is multi-step and includes several indispensable stages:
Internet Draft: The document is first offered as an Internet Draft, an initial proposal open to many amendments. Internet Drafts are normally valid for a finite period, usually about six months, during which they may be updated, replaced, or abandoned.
Working Group Review: The draft then passes to the relevant IETF working group, which critically discusses, tests, and modifies it.
Last Call: When the working group is ready to sign off on the document, it is issued for a Last Call, during which other individuals in the IETF can respond.
IESG Review: The document is then reviewed by the Internet Engineering Steering Group (IESG) to assess its accuracy, comprehensiveness, and conformity with existing standards.
Publication as an RFC: If the IESG endorses it, the draft is published as an RFC by the RFC Editor, given an official number, and made public.
Significant RFCs in Internet History
Several RFCs have shaped the development of the internet, including the following (a short sketch after this list shows how an RFC's text can be retrieved over HTTP):
RFC 791: This RFC defined the Internet Protocol (IP), which is fundamental to the routing of data packets through networks.
RFC 793: This document laid down the specification of the Transmission Control Protocol (TCP), which is essential for ensuring that computing devices connected to a network can exchange data reliably.
RFC 1035: This RFC defined the Domain Name System, the convention for translating human-readable names into IP addresses.
RFC 2616: This document described HTTP/1.1, the version of the Hypertext Transfer Protocol that is so crucial for web communication.
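As a small illustration of how RFCs and HTTP come together in practice, the sketch below fetches the plain-text version of RFC 791 from the RFC Editor's public site using only Python's standard library. The URL pattern is the RFC Editor's published one; the choice of RFC number is just an example, and the call requires network access.

```python
# Sketch: fetch the plain-text version of an RFC (here RFC 791, the Internet
# Protocol specification) over HTTPS from the RFC Editor.
from urllib.request import urlopen

rfc_number = 791
url = f"https://www.rfc-editor.org/rfc/rfc{rfc_number}.txt"

with urlopen(url) as response:                      # issue an HTTP GET request
    text = response.read().decode("utf-8", errors="replace")

print("\n".join(text.splitlines()[:10]))            # show the first few lines
```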