02-New Trends in Computer Science.pptx

SamahAdel16 · Oct 07, 2024

About This Presentation

This presentation explores the latest developments and emerging trends in the field of computer science. It covers a range of topics including artificial intelligence, machine learning, data science, cloud computing, and blockchain technology, highlighting their impact on various industries. The presentation...


Slide Content

NEW TRENDS IN COMPUTER SCIENCE LECTURE 2

OBJECTIVES Understand the basic concepts of the latest trends in computer science: Artificial Intelligence, Machine Learning, Robotic Process Automation, Blockchains, Edge Computing, Virtual Reality and Augmented Reality, Cyber Security, and the Internet of Things. Provide examples of real-world applications of the latest trends in computer science.

ARTIFICIAL INTELLIGENCE

ARTIFICIAL INTELLIGENCE Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Some of the activities computers with artificial intelligence are designed for include speech recognition, learning, planning, and problem solving. The idea of a "thinking machine" came from Alan Mathison Turing (23 June 1912 – 7 June 1954), an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist. Turing proposed the "Turing test", which is used to measure a machine's ability to think and is an important concept in the philosophy of artificial intelligence. The basic premise of the Turing test is whether a human judge can determine that he or she is talking to a machine or to another human.

Turing Test A computer's ability to think is determined through an imitation game. In this game, there are three players: A, B, and C. Player A is a man, B is a woman, and C is of either gender. C cannot see A and B, and communicates with the others through written notes. Player C determines which of the others is a man and which is a woman by asking a series of questions. Turing proposed that a computer can be said to possess artificial intelligence if it can mimic human responses under specific conditions. The original Turing Test requires three terminals, each of which is physically separated from the other two. One terminal is operated by a computer, while the other two are operated by humans. ARTIFICIAL INTELLIGENCE

During the test, one of the humans functions as the questioner, while the second human and the computer function as respondents. The questioner interrogates the respondents within a specific subject area, using a specified format and context. After a preset length of time or number of questions, the questioner is then asked to decide which respondent was human and which was a computer. The test is repeated many times. If the questioner makes the correct determination in half of the test runs or less, the computer is considered to have artificial intelligence because the questioner regards it as "just as human" as the human respondent. ARTIFICIAL INTELLIGENCE
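The pass criterion described above can be turned into a small sketch. The Python snippet below is a minimal, hypothetical simulation of the scoring rule only (the run_trial probability stands in for a real interrogation and is invented for illustration): the machine is taken to pass if the questioner identifies it correctly in no more than half of the test runs, i.e. no better than chance.

import random

def run_trial(correct_id_prob):
    # Simulate one test run: True means the questioner correctly
    # identified which respondent was the computer.
    return random.random() < correct_id_prob

def passes_turing_test(num_trials=100, correct_id_prob=0.5):
    # Pass criterion from the slide: the computer is considered
    # intelligent if the questioner is right in at most half of
    # the test runs (no better than guessing).
    correct = sum(run_trial(correct_id_prob) for _ in range(num_trials))
    return correct <= num_trials / 2

print("Machine passes:", passes_turing_test())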

Applications: Marketing, Banking, Finance and Economics, Agriculture, Healthcare, Gaming, Space Exploration, Autonomous Vehicles, Chatbots, Military Training, Art and Music. ARTIFICIAL INTELLIGENCE Please refer to https://www.edureka.co/blog/artificial-intelligence-applications/ for further details.

Please visit this link: https://www.youtube.com/watch?v=nOnsbd7rdhc to watch a video about the most amazing examples of AI ARTIFICIAL INTELLIGENCE

MACHINE LEARNING

MACHINE LEARNING Machine learning is an application of Artificial Intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves. The process of learning begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in data and make better decisions in the future based on the examples that we provide. The primary aim is to allow computers to learn automatically, without human intervention or assistance, and to adjust their actions accordingly.
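As a minimal illustration of learning from examples rather than explicit programming, the sketch below (the toy data and function name are invented for illustration) implements a 1-nearest-neighbour classifier in plain Python: the program is never told the rule, it simply copies the label of the most similar example it has seen.

import math

def nearest_neighbour_predict(train_points, train_labels, query):
    # Predict the label of `query` by copying the label of the
    # closest training example (1-nearest-neighbour).
    closest = min(range(len(train_points)),
                  key=lambda i: math.dist(train_points[i], query))
    return train_labels[closest]

# Toy training data: two clusters of 2-D points with labels.
points = [(1.0, 1.0), (1.2, 0.8), (8.0, 9.0), (9.1, 8.7)]
labels = ["small", "small", "large", "large"]

print(nearest_neighbour_predict(points, labels, (1.1, 0.9)))  # -> small
print(nearest_neighbour_predict(points, labels, (8.5, 9.2)))  # -> large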

Applications: Image Recognition, Speech Recognition, Medical Diagnosis, Classification, Prediction, Recommendation of Products and Services, Online Customer Support, Virtual Personal Assistants, Robot Control. MACHINE LEARNING

MACHINE LEARNING Please visit this link: https://www.youtube.com/watch?v=HKcO3-6TYr0 to watch a video about the top 10 applications of machine learning

ROBOTIC PROCESS AUTOMATION

ROBOTIC PROCESS AUTOMATION Robotic Process Automation (RPA) is the term used for software tools that partially or fully automate human activities that are manual, rule-based, and repetitive. They work by replicating the actions of an actual human interacting with one or more software applications to perform tasks such as entering data, processing standard transactions, or responding to simple customer service queries. For example, the "chat bot" that has started to become ubiquitous on websites is almost always a robotic process automation tool, not a human; it can handle typical standard queries like "where is X on the website", "how do I reset my password", and the like.
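The website chat bot described above is, at its core, a lookup over fixed rules. The sketch below (the rules and function name are hypothetical) shows how such a rule-based bot answers standard queries and hands anything else to a human.

# Minimal rule-based chat bot: each rule maps a keyword to a canned reply.
RULES = {
    "reset my password": "Use the 'Forgot password' link on the sign-in page.",
    "opening hours": "The support desk is open 9:00-17:00, Monday to Friday.",
    "where is": "Use the search box at the top of the page to find it.",
}

def answer(query):
    # Return the first canned reply whose keyword appears in the query,
    # or escalate to a human agent when no rule matches.
    text = query.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    return "Let me connect you to a human agent."

print(answer("How do I reset my password?"))
print(answer("Where is the pricing page on the website?"))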


Healthcare (patient registration, billing), HR (payroll processing, hiring shortlisted candidates), Manufacturing and retail (bills of materials, calculation of sales), Banking and financial services (card activation, fraud claims), Travel and logistics (ticket booking, passenger details, accounting). ROBOTIC PROCESS AUTOMATION Applications

ROBOTIC PROCESS AUTOMATION Please visit this link: https://www.youtube.com/watch?v=xW95yb6J1eU to watch a video about how robotic process automation works

BLOCKCHAINS

BLOCKCHAINS A Blockchain is essentially a distributed database of records or public ledger of all transactions or digital events that have been executed and shared among participating parties. It enables peer-to-peer transactions thereby eliminating third parties from the scenario (for instance, banks/financial institutions/state, in the case of the traditional mode of exchanging currency) and ensuring transparency as well as cost-effectiveness. https://cuts-ccier.org/pdf/Briefing_Paper-Introduction_to_Blockchain_Technology.pdf


Blockchains are claimed to be an incorruptible, irrefutable, and permanent record of transactions. They exhibit an edge over traditional mediums of exchange by introducing unprecedented security benefits. If a hacker were to tamper with a particular block of a blockchain, they would need to alter not only that specific block but every block linked to it across the entire history of that blockchain, and they would need to do so on every copy of the ledger in the network, which could number in the millions, simultaneously. BLOCKCHAINS
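The tamper-resistance claimed above follows from each block storing the hash of the block before it. The simplified Python sketch below (field names are invented, and real blockchains add consensus, signatures, and proof-of-work on top) shows why altering one block invalidates the recorded hashes on every copy of the ledger.

import hashlib, json

def block_hash(block):
    # Hash the block's contents, which include the previous block's hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def is_valid(chain):
    # Valid only if every stored hash still matches its block and every
    # block points at the hash of the block before it.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, "Alice pays Bob 5")
append_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                    # True
chain[0]["data"] = "Alice pays Bob 500"   # tamper with an earlier block
print(is_valid(chain))                    # False: stored hash no longer matches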


BLOCKCHAINS Please visit this link: https://www.youtube.com/watch?v=wFyiSnGV1sc to watch a video about real world examples of blockchain projects

EDGE COMPUTING

EDGE COMPUTING Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. The move toward edge computing is driven by mobile computing, the decreasing cost of computer components and the sheer number of networked devices in the Internet of Things (IoT). Depending on the implementation, time-sensitive data in an edge computing architecture may be processed at the point of origin by an intelligent device or  sent to an intermediary server located in close geographical proximity to the client.  Data that is less time sensitive is sent to the cloud for historical analysis, big data analytics and long-term storage.
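A hypothetical sketch of the routing decision described above: the more time-sensitive a reading is, the closer to its source it is processed, and everything else goes to the cloud for batch analytics and long-term storage. The class, field names, and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    value: float
    latency_budget_ms: int  # how quickly this reading must be acted on

def route(reading):
    # Edge-computing rule of thumb: time-sensitive data stays near its source.
    if reading.latency_budget_ms <= 10:
        return "process on the device itself"
    if reading.latency_budget_ms <= 100:
        return "send to a nearby edge server"
    return "send to the cloud for batch analytics and long-term storage"

print(route(Reading("brake-sensor", 0.97, latency_budget_ms=5)))
print(route(Reading("thermostat", 21.5, latency_budget_ms=60)))
print(route(Reading("usage-meter", 1234.0, latency_budget_ms=86_400_000)))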


VIRTUAL REALITY AND AUGMENTED REALITY

VIRTUAL REALITY AND AUGMENTED REALITY Virtual reality (VR) implies a complete immersion experience that shuts out the physical world. VR is typically achieved by wearing a headset, such as Facebook's Oculus, equipped with the technology, and is used prominently in two different ways: to create and enhance an imaginary reality for gaming, entertainment, and play (such as video and computer games, 3D movies, and head-mounted displays), and to enhance training for real-life environments by creating a simulation of reality where people can practice beforehand (such as flight simulators for pilots).

Augmented reality (AR) often adds digital elements to a live view by using a smartphone camera. Examples of augmented reality experiences include Snapchat lenses and the game Pokémon Go. Applications of AR in manufacturing may include complex assembly, maintenance, expert support, quality assurance, and automation. VIRTUAL REALITY AND AUGMENTED REALITY

VIRTUAL REALITY AND AUGMENTED REALITY Please visit this link: https://www.youtube.com/watch?v=f9MwaH6oGEY to watch a video about what virtual reality and augmented reality are.

SUMMARY AND DISCUSSION