Distributed Operating Systems (distributed_os.pptx)
Distributed Operating System
Definition: A Distributed Operating System (Distributed OS) runs applications across interconnected computers, coordinating their communication and integration. It manages multiple CPUs, appears as a single seamless system to end users, and shares resources across sites.
Characteristics
Resource Sharing: CPUs, disks, network interfaces, nodes, and whole computers are shared across the system.
Loosely Coupled System: Processors are connected via high-speed buses and communication lines.
Interconnected Nodes: Multiple computers and sites are linked through LAN/WAN lines.
Effective Communication: High-speed buses and telephone lines connect the processors.
Local Memory: Each processor is equipped with its own local memory.
Networked Processors: Neighboring processors facilitate communication.
Hardware Concepts
Tightly coupled systems (multiprocessors): shared memory; short intermachine delay; high data rate.
Loosely coupled systems (multicomputers): private memory; long intermachine delay; low data rate.
Types of Distributed Operating Systems
Client-Server Systems
Peer-to-Peer Systems
Middleware
Applications of Distributed OS
Internet technology
Distributed database systems
Air traffic control systems
Airline reservation systems
Peer-to-peer networks
Telecommunication networks
Scientific computing
Cluster computing
Grid computing
Data rendering
Examples of Distributed Operating Systems
A few examples of distributed OSs are:
AIX operating system for IBM RS/6000 computers.
Solaris operating system for Sun multiprocessor workstations.
Mach/OS, a multitasking, multithreading, UNIX-compatible operating system.
OSF/1 operating system.
Message Passing in Distributed Systems
The communication medium for nodes in a distributed system.
Used for communicating information and coordinating actions.
Messages are transferred and received between nodes.
Supports coordination, synchronization, and data sharing.
Types of Message Passing
Synchronous Message Passing: the sender and receiver synchronize on each message.
Asynchronous Message Passing: allows concurrent execution and non-blocking communication.
Synchronous Message Passing
The sender and receiver synchronize their actions: the sender waits until the receiver has handled the message.
Suited to critical applications that require precise coordination.
Example: real-time systems and safety-critical applications.
Asynchronous Message Passing
Enables concurrent execution and non-blocking communication: the sender continues without waiting.
Suited to systems where performance and responsiveness are critical.
Example: high-throughput distributed databases and web services.
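A minimal sketch of the two styles, assuming two "nodes" modeled as Python threads sharing a queue; the names mailbox, send_sync, and send_async are invented for illustration and do not come from the slides or any specific library.

import queue
import threading
import time

mailbox = queue.Queue()        # shared channel between the two "nodes"
ack = threading.Event()        # used only by the synchronous variant

def receiver():
    while True:
        msg = mailbox.get()
        if msg is None:                      # shutdown signal
            break
        print(f"receiver: handling {msg!r}")
        time.sleep(0.1)                      # simulated work
        ack.set()                            # confirm the message was processed

def send_sync(msg):
    """Synchronous: block until the receiver confirms it handled the message."""
    ack.clear()
    mailbox.put(msg)
    ack.wait()                               # sender is blocked here

def send_async(msg):
    """Asynchronous: enqueue and return immediately; the receiver catches up later."""
    mailbox.put(msg)

t = threading.Thread(target=receiver)
t.start()
send_sync("transfer $100")                   # returns only after processing finishes
send_async("log: transfer done")             # returns at once, before processing
mailbox.put(None)                            # stop the receiver
t.join()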
Synchronous Message Passing in a Banking System
Transferring funds from one account to another.
Sender: initiates the transfer.
Receiver: processes the transfer, ensuring consistency.
Blocking: the sender waits for confirmation that the transaction is complete.
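A hedged sketch of this blocking transfer, assuming the "bank" is a worker thread and the sender waits on a per-request reply queue; the helpers transfer and bank_worker are made up for this example, not part of any real banking API.

import queue
import threading

requests = queue.Queue()
accounts = {"alice": 500, "bob": 200}

def bank_worker():
    while True:
        req = requests.get()
        if req is None:
            break
        src, dst, amount, reply_q = req
        if accounts[src] >= amount:              # keep balances consistent
            accounts[src] -= amount
            accounts[dst] += amount
            reply_q.put("committed")
        else:
            reply_q.put("insufficient funds")

def transfer(src, dst, amount):
    reply_q = queue.Queue()
    requests.put((src, dst, amount, reply_q))
    return reply_q.get()                          # the sender blocks here until confirmation

worker = threading.Thread(target=bank_worker)
worker.start()
print(transfer("alice", "bob", 100))              # prints "committed" once the worker replies
print(accounts)                                   # {'alice': 400, 'bob': 300}
requests.put(None)
worker.join()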
Asynchronous Message Passing: Email
Email Sending: A user composes an email in their email client and clicks "Send". The client packages the content into a message and sends it to the email server. The client does not block, so the user can continue with other tasks.
Email Server Processing: The server receives the message, stores it in the outgoing mail queue, and acknowledges receipt. It does not block and continues handling other incoming messages.
Message Delivery: The server processes messages from the outgoing queue, identifies the recipient's mail server, and attempts delivery. If the recipient's server is online and available, it accepts the message; if it is offline, the sending server retries at intervals without blocking the sender.
Recipient's Email Server: Receives the email, stores it in the recipient's inbox, and notifies the recipient's client or device if it is online, while continuing to process other incoming mail.
Recipient's Email Client: Receives a new-email notification; the recipient opens and reads the email when ready.
Response (Reply): If the recipient replies, the process runs in reverse: the recipient's client sends a reply, and the sender's client receives it asynchronously, letting the user respond at their convenience.
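A toy version of the non-blocking "Send" step above, assuming the mail server is a background thread draining an outgoing queue; deliver() is a stand-in for a real SMTP call (it fails randomly just to exercise the retry path) and is not a real library function.

import queue
import random
import threading
import time

outgoing = queue.Queue()

def deliver(msg):
    # Stand-in for a real delivery attempt; fails randomly to show retries.
    return random.random() > 0.3

def mail_worker():
    while True:
        msg = outgoing.get()
        if msg is None:
            break
        for attempt in range(3):                 # retry without ever blocking the sender
            if deliver(msg):
                print(f"delivered: {msg}")
                break
            time.sleep(0.1)
        else:
            print(f"could not deliver: {msg}")

def send_email(msg):
    outgoing.put(msg)                            # returns immediately; the client is not blocked
    print(f"queued: {msg}")

worker = threading.Thread(target=mail_worker)
worker.start()
send_email("hello from the async example")
outgoing.put(None)                               # shut the worker down once the queue drains
worker.join()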
Real-Time Collaborative Document Editing
Synchronous Message Passing (Synchronous Collaboration): Users A, B, and C edit a document together in real time. When user A types a message in the chat window embedded in the editing tool, it is sent synchronously to the server, which processes it immediately and broadcasts it to users B and C. The message appears in their chat windows instantly, so every message is delivered and displayed to all users at once, supporting synchronous collaboration.
Asynchronous Message Passing (Document Changes): When user A edits a paragraph, the change is packaged into a message and sent asynchronously to the server; user A does not wait for confirmation or synchronization before continuing to edit. The server processes and stores the change and acknowledges receipt asynchronously. Users B and C receive the change asynchronously, and their documents update without blocking their own editing or reviewing.
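A rough sketch of this scenario, assuming the server is a single Python object and each user's "screen" is just a list; the CollabServer class and its methods are invented for illustration, not taken from any real collaboration product.

import queue
import threading
import time

class CollabServer:
    def __init__(self, users):
        self.users = users                       # user name -> list acting as that user's screen
        self.document = []
        self.edit_queue = queue.Queue()
        threading.Thread(target=self._apply_edits, daemon=True).start()

    def send_chat(self, sender, text):
        # Synchronous: every user's chat window is updated before this call returns.
        for screen in self.users.values():
            screen.append(f"[chat] {sender}: {text}")

    def submit_edit(self, sender, change):
        # Asynchronous: the edit is queued; the sender keeps editing immediately.
        self.edit_queue.put((sender, change))

    def _apply_edits(self):
        while True:
            sender, change = self.edit_queue.get()
            self.document.append(change)
            for screen in self.users.values():
                screen.append(f"[edit] {sender}: {change}")

screens = {"A": [], "B": [], "C": []}
server = CollabServer(screens)
server.send_chat("A", "reviewing section 2 now")   # visible on B's and C's screens on return
server.submit_edit("A", "rewrite paragraph 3")     # applied later by the background thread
time.sleep(0.1)                                    # give the background thread time to catch up
print(screens["B"])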
Message Passing in Practice
Common mechanisms: message queues, publish-subscribe systems, and RPC (Remote Procedure Call), implemented by a range of popular message passing tools and libraries.
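A bare-bones publish-subscribe sketch in plain Python, standing in for the tools this slide alludes to; production systems would use a broker such as RabbitMQ or Kafka, whose APIs differ from this toy Broker class.

from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)      # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:  # fan the message out to every subscriber
            callback(message)

broker = Broker()
broker.subscribe("orders", lambda m: print("billing saw:", m))
broker.subscribe("orders", lambda m: print("shipping saw:", m))
broker.publish("orders", {"id": 42, "item": "disk"})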
Parallel Operating Systems
Parallel operating systems speed up processing by dividing a task into multiple sub-tasks or sub-processes. Numerous processors can then work on these tasks simultaneously.
Parallelism can be achieved using multiple processors within a single system or across systems forming a cluster, allowing the operating system to reach faster processing speeds. Such systems can also run multiple applications simultaneously without interference and handle multiple workloads at the same time. Parallel operating systems have many applications, such as databases and data mining, augmented reality, various engineering fields, graphics, and more. Some examples of parallel operating systems are Microsoft Hyper-V, Oracle VM, Sun xVM Server, etc.
Examples of Parallel Operating Systems
Linux with support for SMP (Symmetric Multi-Processing).
AIX (IBM's UNIX-based OS) for high-performance computing.
Solaris with support for multi-core processors.
Windows Server for enterprise parallel processing.
HPE NonStop OS for high availability and parallel processing.
Working of a Parallel Operating System
In a parallel operating system, the main task is broken down into smaller sub-tasks, and each sub-task is assigned to a different processor or system for processing. The task finishes quickly because the work is further divided across the various CPU components. Where a single system would transmit and process instructions one after another, a parallel operating system has several components process them simultaneously, completing the same work much faster and handling a much larger workload at once.
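An illustrative sketch of the split-and-process idea above, assuming application-level parallelism with Python's multiprocessing pool as a stand-in for what a parallel OS schedules across processors; the job (summing squares) and the helper sub_task are invented for the example.

from multiprocessing import Pool, cpu_count

def sub_task(chunk):
    # Each worker process handles one slice of the overall job.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    workers = cpu_count()
    size = len(data) // workers + 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(processes=workers) as pool:
        partials = pool.map(sub_task, chunks)     # sub-tasks run on different CPUs simultaneously
    print("total:", sum(partials))                # combine the partial results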
It provides a multiprocessing environment for faster processing while also enforcing security measures among the processors. A parallel operating system design can handle much more work than a single system, and resources are shared effectively among the different processors. Processes that might conflict with one another are managed so that simultaneous processes do not interfere.
Types of Parallel Operating Systems
Parallel operating systems are categorized into two types: Type 1 and Type 2.
Type 1: Runs directly on bare metal and acts as a native hypervisor. Systems that share physical hardware or virtual machines fall into this type, and the operating system does not provide any Input/Output emulation. For example, VMware uses the Type 1 design.
Type 2: Hosted on a hypervisor that itself runs on a conventional operating system such as Windows or Linux.
Applications of Parallel Operating Systems
Parallel operating systems are used in many applications to harness the power of multiple processors or cores for improved performance and efficiency.
High-Performance Computing (HPC): essential for scientific simulations, weather modeling, and other data-intensive HPC tasks.
Data Centers and Cloud Computing: enable efficient resource allocation and scalability in data centers and cloud environments.
Real-Time Systems: used in applications like autonomous vehicles and industrial automation, where rapid response times are critical.
Database Management: parallel databases benefit from distributing and processing data across multiple nodes.
Scientific and Engineering Simulations: parallel computing accelerates complex simulations in fields like physics and engineering.
Machine Learning and AI: parallelism accelerates deep learning and the training of complex neural networks.
Parallel operating systems deliver better performance, scalability, and responsiveness in these and other compute-intensive applications.
Benefits of Parallel Operating Systems
Reduces task completion time, since multiple processes run simultaneously.
Can solve large and complex operating system problems by sharing resources efficiently.
Can allocate much more memory and resources to processors than a single system, making it faster.
Limitations of Parallel Operating Systems
The architecture of such systems is complex and demands a high power supply and high maintenance.
They are costly, since numerous processors require many resources.
Cooling equipment is also needed, which adds further expense.