Risk and Security in Cloud Computing


About This Presentation

This presentation explains various risks, security issues, and other related concepts in cloud computing



SECURITY IN CLOUD
COMPUTING by
Amol S. Gaikwad
Lecturer
Government Polytechnic Gadchiroli
Maharashtra

Learning Outcomes
Explain key features of data security.
Write the steps to implement cloud data security.
Explain the identity management and access facility of a given cloud set-up.
Explain security and related risks in cloud computing.
Explain the features of Security-As-A-Cloud Service.

Multi-tenancy
Multi-tenancy in clouds means multiple customers (tenants) with different
requirements are served by a single hardware infrastructure
Virtual infrastructure (virtual CPU, memory etc.) partitions a data center
among multiple customers (tenants)
In the software-as-a-service model, cloud providers provide solutions to multiple
client organizations (or tenants) from a single, shared instance of the software
The cloud user's information is virtually, not physically, separated from that of
other users

Multi-tenancy
The major benefit of this model is cost-effectiveness for the cloud provider
Some risks or issues with this model of multi-tenancy for the cloud user
include -
(a) the potential (possibility) for one user to be able to access (use)
data belonging to another user
(b) difficulty in backing up and restoring data
Multi-tenancy, if not implemented or maintained properly, can put a cloud
user’s data at risk of corruption, contamination, or unauthorized access

Data Outsourcing
Data outsourcing means the responsibility of storing, managing or processing
data is given to an external third party organisation or service provider
If data management is outsourced, then the company does not handle the data
within its own organisation
For example - a business might store its customers' data on Google Cloud or
Amazon Web Services (AWS) instead of storing it on its own servers
Benefits (Advantages) of data outsourcing -
(a) Cost Saving - There is no need to buy and maintain expensive
hardware like servers and storage
(b) Data Management - Professionals or experts at the service provider
handle data security, data backup and maintenance

Data Outsourcing
Benefits (Advantages) of data outsourcing -
(c) Scalability - Easy to increase or decrease storage as needed
(d) Accessibility - Accessible anytime, anywhere through the internet
Risks (disadvantages) of data outsourcing -
(a) Data security and privacy - Sensitive (important) data is handled and
stored by a third party, so there is a possibility of losing control of the data
or of it being hacked
(b) Vendor lock-in - It is difficult to move data to another service provider
(c) Compliance - Issues with data protection laws
(d) Dependence on third party and possible delays in data access

Trust Management
Trust toward cloud providers is important to ensure the desired
level of privacy for applications hosted in the cloud
Cloud computing makes very large use of third-party services and infrastructure
to host important data and to perform critical (important) operations
Multiple users share the same cloud infrastructure (multi-tenancy), and users must
depend on the cloud provider's data security, privacy and reliability policies
The most critical (important) issue to address (solve) before adopting cloud
computing is that of establishing trust

Trust Management
Mechanisms (methods) to build and maintain trust between cloud computing
consumers and cloud computing providers, as well as between cloud computing
providers among themselves, are essential for the success of any cloud
computing offering
The need to build trust is essential, and probably the hardest to meet, because it
is not only a technical issue
For cloud computing to be ubiquitous (used everywhere) and trusted, we need
to build data security systems that are not hindered (made difficult) by the
application the data originates from, the mode of transport for that data, the
place the data is accessed from, or the device it is used upon

Trust Management
Important components of trust management are -
1) Trust Establishment - Develop trust between the user and the cloud provider
2) Trust Evaluation - Measuring trust using factors like performance,
security, availability etc. (a small scoring sketch follows after this list)
3) Trust Propagation - Sharing trust information among different parties
4) Trust Update - Continuously update trust values based on recent
evaluations
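
To make trust evaluation concrete, here is a minimal sketch of combining per-factor trust scores into one weighted value. The factor names, scores and weights are illustrative assumptions, not part of any standard model.

```python
# Minimal sketch of trust evaluation: combine per-factor trust scores
# (each in 0.0-1.0) into a single value using assumed, illustrative weights.

def trust_score(factors: dict, weights: dict) -> float:
    """Weighted average of trust factors; all names and weights are hypothetical."""
    total = sum(weights[name] for name in factors)
    return sum(factors[name] * weights[name] for name in factors) / total

# Assumed scores observed for one provider
factors = {"performance": 0.9, "security": 0.8, "availability": 0.95}
weights = {"performance": 0.3, "security": 0.5, "availability": 0.2}

print(f"trust score: {trust_score(factors, weights):.2f}")  # 0.86
```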

Metadata Security
Metadata means data about data
Metadata security means protecting information that describes cloud
resources, services, data, users, configurations from unauthorized access,
manipulation (change) and misuse
Important strategies for protecting metadata include strong
authentication and access control, encrypting metadata, and auditing and
monitoring
Threats to metadata in cloud include unauthorized access, metadata spoofing
(changing metadata), side-channel attacks and insider threats

Cloud Risk
When cloud providers host infrastructure, data and storage, there is always a
chance of risk in each type of service they offer; this is called cloud risk
When we select a cloud provider for a particular service, we must
consider the security of that cloud computing platform
Security of important (sensitive) data also depends on the physical location of
servers

Cloud Risk
Cloud risks can be divided into the following categories -
1) Policy and organisational risks
2) Technical risks
3) Legal risks
4) Other risks

1) Lock-in - Policy and Organisational Risk
When applications, data and services are dependent on only one cloud
provider, this is called the lock-in problem
Lock-in problems occur because of high customization of services as per user
demand
Examples - SaaS lock-in, PaaS lock-in, IaaS lock-in
Lock-in is one of the biggest problems in cloud computing

2) Loss of governance - Policy and Organisational Risk
The cloud provider may outsource a service to an unknown third party that may
not provide the same guarantees as issued by the cloud provider
If the control and operational management of the cloud changes, the terms
and conditions of its services may also change
Loss of control and governance may compromise security requirements such as
confidentiality, integrity and availability of data
It may deteriorate (decrease) performance and quality of service

3) Compliance challenges - Policy and Organisational Risk
Cloud providers make investments in certifications like SAS 70, PCI DSS and
HIPAA
These certifications show that cloud providers follow security practices, and they
also improve their reputation in the market
A missing certification can create problems in using cloud services
Example - if EC2 on the AWS cloud does not have PCI DSS certification, a client
cannot use the EC2 service for credit card transactions
The customer can use other AWS services for that purpose, but not EC2

4) Cloud service termination or failure - Policy and Organisational Risk
Cloud services should be available 24x7 to customers
But due to an improper business strategy, lack of financial support and other
factors, the cloud provider may shut down its services
It is also possible that some cloud computing services could be terminated for a
short or medium period of time

5) Supply chain failure - Policy and Organisational Risk
A cloud provider can outsource some services to a third party, and if there is an
interruption, corruption or lack of coordination between them, it will
cause inaccessibility of services and loss of data confidentiality, availability and
integrity
This can cause economic and reputation loss to the cloud provider, since it cannot
provide service to customers, as well as SLA violations

1) Isolation Failure - Technical Risks
In cloud computing capacity, storage, and network resources are shared
between multiple customers (multi-tenancy)
This multi-tenancy can cause threats like failure of the logical and physical
separations between memory stacks, storage and routing tables, where data of
multiple customers can be maliciously stored in the same table
Possible attacks include - side-channel attacks, SQL injection attacks, guest-
hopping attacks etc.

2) Resource exhaustions - Technical Risks
There is a risk in the proper allocation of resources to cloud users
Many algorithms are used for proper allocation of resources to cloud services
However, insufficient resource provisioning and investment in infrastructure
may lead to service unavailability or degradation (reduction) in
performance

3) Cloud provider malicious insider - Technical Risks
Malicious actions of an insider could affect the confidentiality, integrity and
availability of data and services
This damages the organisation's reputation, customer expectations and the
experience of employees

4) Intercepting data in transit - Technical Risks
Cloud services have a distributed architecture, so data is transferred many
times from one physical machine to another, from one VM to another VM, to
remote web clients and across VPN environments
Data is more at risk (vulnerable) when it is transferred from on-premises to
cloud storage, or from cloud storage back to on-premises
When data is transferred there is a possibility of cyber attacks like - spoofing,
man-in-the-middle attacks and sniffing attacks

5) Insecure or ineffective deletion of data - Technical Risks
Whenever a cloud provider is changed, resources are scaled down or physical
hardware is moved, client data may remain available beyond the lifetime
mentioned in the data security policy
Full data deletion is very difficult, as it can be achieved only by destroying the hard
disk, which may also store data of other clients
Whenever a client requests deletion of a cloud resource, the data may not
be completely deleted
Special procedures must be followed for this, which may not be supported by
any API, and a strong encryption mechanism is required to reduce the risk

6) Conflict between customer hardening procedures and cloud environment - Technical Risks
Cloud providers use different server hardening methods than traditional
server hardening procedures
Server hardening means changing server configurations to reduce and resist
cyber attacks
Example - the AWS EC2 service uses security groups and IAM roles for securing
and authenticating instances, while physical Linux and
Windows servers follow traditional hardening methods
Clients must clearly understand the hardening procedures of cloud providers
and which best practices they are using for hardening

7) Loss of encryption keys - Technical Risks
This includes the disclosure of secret keys (e.g., file encryption keys, SSL keys,
customer private keys) or passwords to malicious parties
It also includes the loss or corruption of those keys, or their unauthorized use for
authentication and non-repudiation (digital signatures)

8) Malicious probes or scans - Technical Risks
Malicious probes or scanning, and network mapping, are indirect threats to the
assets of the cloud
They can be used to collect information as part of a hacking effort
They can cause loss of confidentiality, integrity and availability of services and
data

9) Compromise service engine - Technical Risks
All cloud providers use service engines such as hypervisors or hosted application
engines
Hacking the service engine breaks the isolation (separation) between different
customer environments and gives access to the data inside them
Hackers can then monitor and manipulate information in a transparent way (without
direct interaction with the application inside the customer environment)
They can also decrease the resources assigned to a customer, causing a denial of
service (DoS)

1) Risk from changes in jurisdiction - Legal Risks
Customer data may be kept in several jurisdictions, some of which may be
high risk
If datacentres are located in high-risk countries (e.g., lacking the rule of law, with
unpredictable laws, a dictatorship or police state, or a state that does not respect
international agreements), sites could be raided by local authorities and data or
systems subjected to forced disclosure or seizure (capture)

2) Licensing - Legal Risks
Licensing conditions such as per-seat agreements and online licensing checks
may become unworkable in a cloud environment
Example - for software charged on a per-instance basis, as the number of
cloud-based instances increases, the cost of the software increases with it

3) Data protection risk - Legal Risks
It is tough for cloud customers to check the data processing done by cloud
providers and be sure that the data is handled in a proper (lawful) way
There may be data security breaches (breaks) that are not reported (intimated)
to the data controller by the cloud provider
The cloud customer may lose control of the data administered by the
cloud provider
This issue is amplified in the case of multiple transfers of data (e.g., between
federated cloud providers)

1) Backup lost or stolen - Other Risks
This risk is possible due to inadequate physical security procedures
It is also possible due to AAA vulnerabilities, user provisioning vulnerabilities
and user de-provisioning vulnerabilities

2) Unauthorized access to premises - Other Risks
If there are no proper physical security procedures, unauthorized access to
datacentres is possible
As cloud datacentres are large, physical control of these datacentres must be
stronger; even a small breach (break) can have a big impact

3) Theft of computer equipment - Other Risks
If there is no proper physical security at datacentres, there can be theft of
computer equipment
Only authenticated persons must be allowed to enter the physical datacentres
Dual authentication mechanisms should be used to access those machines

4) Natural disasters - Other Risks
Natural disasters can happen at any time, so there should be a solid disaster
recovery plan
Redundancy and fault tolerance are used by cloud providers
Example - AWS has multiple physical regions, and multiple Availability Zones
within each region

Cloud Security Services
1) Authentication
2) Authorization
3) Auditing

1) Authentication
Authentication means digitally confirming the identity of the entity requesting
access to some protected information
It establishes the identity of the user and checks that users are who they claim
to be ("Who are you?")
In cloud computing, applications and data are accessed over the internet, so
digital authentication becomes very complex
Alteration (change) of authentication and authorization policies requires
involvement of the cloud provider's systems and services
The process of authentication involves validating (checking) at least one factor
of identification of the user who is to be authenticated
The factor can be something the user knows (password/PIN), something the user
has (smart card), or something that uniquely identifies the user (fingerprints)
In multi-factor authentication, a combination of more than one factor is used for
user identification (a sketch follows below)
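
As a sketch of these factors in code, the following combines a knowledge factor (a salted password hash) with a possession factor (a time-based one-time code in the style of RFC 6238). The enrollment values are invented for the demo; a real deployment would use a vetted authentication library.

```python
import hashlib, hmac, os, struct, time

# Knowledge factor: verify a salted password hash (never store plaintext).
def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Possession factor: time-based one-time code (RFC 6238 style), normally
# generated on the user's phone or hardware token.
def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    counter = struct.pack(">Q", int(time.time()) // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# Enrollment (demo values, assumed)
salt, totp_secret = os.urandom(16), os.urandom(20)
stored_hash = hash_password("s3cret-pin", salt)

# Login: multi-factor authentication requires BOTH factors to pass.
password_ok = hmac.compare_digest(stored_hash, hash_password("s3cret-pin", salt))
entered_code = totp(totp_secret)          # in reality typed in by the user
code_ok = hmac.compare_digest(entered_code, totp(totp_secret))
print("authenticated:", password_ok and code_ok)   # True
```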

2) Authorization
Authorization in cloud computing requires the use of the cloud provider's services
for specifying access policies
Authorization means the privileges, permissions and rights granted to a user
who is allowed to access information and computing resources ("What
permissions do you have?")
Once the user is correctly identified (authenticated), authorization
decides the privileges, permissions and rights held by that user

3) Auditing
Auditing is required in cloud computing for operational assurance
For cloud applications, auditing mechanisms are required to gain visibility into
the application and into data accesses by users (a minimal logging sketch
follows after this list)
Auditing also checks the actions performed by application users, including
mobile users and devices such as wireless laptops and smartphones
Auditing also checks the following functions of the cloud -
- System and transaction controls
- Backup controls
- Data library procedures
- Data centre security
- Contingency plans
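
A minimal sketch of such an audit mechanism: each user action is appended as one structured record so a later review can establish who did what, when, and with what outcome. The file name and fields are illustrative.

```python
import json, time

# Minimal audit trail: append one structured record per user action.
def audit(log_path: str, user: str, action: str, resource: str, success: bool) -> None:
    record = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "user": user,
        "action": action,        # e.g. "read", "write", "delete"
        "resource": resource,
        "success": success,
    }
    with open(log_path, "a") as log:          # append-only audit log
        log.write(json.dumps(record) + "\n")  # one JSON record per line

audit("audit.log", "alice", "read", "customers.db", True)
audit("audit.log", "bob", "delete", "backups/2024-01", False)
```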

Data Security Technologies
Database Outsourcing and Query Integrity Assurance
Data Integrity in Untrustworthy Storage
Web-Application-Based Security
Multimedia Data Security

Database Outsourcing and Query Integrity Assurance
Database outsourcing has become an important part of cloud computing
Due to advancements in technology, large volumes of data (terabytes) can be
transferred at very low cost
Fig: The system architecture of database outsourcing - the database owner applies
dataTransform(T), clients send queryRewrite(Q) to the service provider, and the
service provider returns the query results

The previous figure shows the general architecture of a database
outsourcing environment with clients
The database owner outsources its data management tasks, and clients send
queries to the untrusted service provider
Let 'T' denote the data to be outsourced. The data 'T' is pre-processed,
encrypted, and stored at the service provider
For evaluating queries, a user rewrites a set of queries 'Q' against 'T' into queries
against the encrypted database
There are two security issues (problems) in database outsourcing -
1) Data privacy protection
2) Query integrity assurance

1) Data Privacy Protection
To protect data privacy, SQL queries are executed over the encrypted database
The strategy is to process as much of a query as possible at the service
provider, without having to decrypt the data
Decryption and the remaining part of the query processing are performed at
the client side
Another method proposes an order-preserving encryption scheme for numeric
values that allows any comparison operation to be applied directly on
encrypted data
This technique is able to handle updates, and new values can be added without
requiring changes in the encryption of other values

Generally, existing methods enable direct execution of encrypted queries on
encrypted data sets and allow users to ask identity queries over data of
different encryptions
The main goal of data privacy protection is to make queries on encrypted
databases as efficient as possible while preventing adversaries from learning
any useful knowledge about the data; a small sketch of this server/client split
follows below
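
The sketch below illustrates the split, assuming the third-party cryptography package is available: the client encrypts records before outsourcing and attaches a keyed equality token per searchable field, so the provider can match equality predicates without decrypting; the client decrypts the hits and finishes the query locally. Record contents and key handling are illustrative.

```python
import hashlib, hmac, json
from cryptography.fernet import Fernet  # third-party: pip install cryptography

data_key = Fernet.generate_key()          # stays with the client
index_key = b"client-held-index-key"      # secret HMAC key for search tokens
f = Fernet(data_key)

def eq_token(value: str) -> str:
    # Deterministic token: supports equality matching only, reveals no plaintext.
    return hmac.new(index_key, value.encode(), hashlib.sha256).hexdigest()

# Client encrypts records before outsourcing; only tokens are searchable.
records = [{"name": "Asha", "city": "Pune"}, {"name": "Ravi", "city": "Nagpur"}]
server_store = [  # what the untrusted provider holds
    {"city_token": eq_token(r["city"]), "blob": f.encrypt(json.dumps(r).encode())}
    for r in records
]

# Query "city == Pune": server filters by token; client decrypts and finishes.
hits = [row["blob"] for row in server_store if row["city_token"] == eq_token("Pune")]
print([json.loads(f.decrypt(b)) for b in hits])   # [{'name': 'Asha', 'city': 'Pune'}]
```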

2) Query Integrity Assurance
Query integrity checks the trustworthiness of the hosting environment
When a client receives a query result from the service provider, it wants to be
assured that the result is both correct and complete
Correct means that the result originates from the owner's data and has not been
tampered (changed) with; complete means that the result includes all records
(data) satisfying the query
One method uses a Merkle hash tree, based on the idea of placing a signature on
the root of the tree to generate a proof of correctness (see the sketch below)
Another method utilizes an aggregated signature to sign each record with
information from neighbouring records, assuming that all the records are
sorted in a certain order
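
A minimal sketch of the Merkle-hash-tree idea: the owner needs to sign only the root, and any returned record can later be proven against it with a logarithmic number of sibling hashes (the proof path itself is omitted here for brevity).

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Hash the records pairwise, level by level, up to a single root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

records = [b"row1", b"row2", b"row3", b"row4"]
print(merkle_root(records).hex())  # the owner signs only this root value
```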

This method ensures the completeness of a selection query by checking the
aggregated signature
Another method proposes a mechanism called the challenge token and uses it
as a probabilistic proof that the server has executed the query over the entire
database
This method can handle any type of query, including joins, and does not
assume that the stored data is ordered (sorted)
A dual encryption method ensures query integrity without requiring the
database engine to perform any special function beyond query processing

Dual encryption enables cross-examination of the outsourced data, which
consists of (a) the original data stored under a certain encryption scheme and
(b) a small percentage of the original data stored under a different
encryption scheme
Users generate queries against the additional piece of data and analyze the
results to obtain integrity assurance

Data Integrity in Untrustworthy Storage
Loss of control over data is a major security issue for users of a cloud storage
service
The cloud storage infrastructure provider may become untrustworthy or even
malicious - for example, making mistakes in operations or denying a
vulnerability in the system
Various remote data storage checking protocols have been suggested
Remote data checking protocols should satisfy the following five
requirements
Verifier - the data owner or a trusted third party
Prover - the storage service provider, storage medium owner or system
administrator

Data Integrity in Untrustworthy Storage
Requirement #1
It should not be a pre-requirement that the verifier holds a complete
copy of the data to be checked
It is not necessary for the verifier to keep a duplicate copy of the content to
be verified
Storing a more concise digest of the data at the verifier is
enough; a naive version of this idea is sketched below
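
A naive illustration of Requirement #1: the verifier keeps only a fixed-size digest rather than a full copy. Note that verifying this way still requires retrieving the whole dataset, which is exactly what the PDP-style protocols described later avoid.

```python
import hashlib

# The verifier keeps only a 32-byte digest, not a copy of the data.
dataset = b"terabytes of outsourced data..."         # held by the provider
stored_digest = hashlib.sha256(dataset).hexdigest()  # held by the verifier

# Naive check: re-hash what the prover returns and compare digests.
# (This needs the full dataset back, so it fails Requirement #3.)
returned = dataset
assert hashlib.sha256(returned).hexdigest() == stored_digest
print("digest matches")
```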

Data Integrity in Untrustworthy Storage
Requirement #2
The protocol has to be very robust, because the prover is untrustworthy
A malicious prover is motivated to hide any violation of data integrity
The protocol should be robust enough that such a prover fails to
convince the verifier
Requirement #3
The amount of information exchanged during the verification operation
should not lead to high communication overhead

Data Integrity in Untrustworthy Storage
Requirement #4
The protocol should be computationally efficient - it should not consume very
large computing resources such as CPU and memory
Requirement #5
It should be possible to run the verification an unlimited number of times

PDP Based Integrity Checking Protocol
This protocol is based on provable data possession (PDP) technology, which
allows users to obtain a probabilistic proof from the storage service provider
This proof is used as evidence that their data have been stored there
One advantage of this protocol is that the proof can be generated
by the storage service provider by accessing (using) only a small portion of
the whole dataset
At the same time, the amount of metadata that end users are required
to store is also small - that is, O(1)
Additionally, such a small data-exchange procedure also lowers the
overhead on the communication channels

PDP Based Integrity Checking Protocol
The steps in the protocol are as below -
The data owner (the client in the figure) executes the protocol to verify that a
dataset is stored on an outsourced storage machine as a collection of 'n' blocks
Before uploading the data into the remote storage, the data owner pre-
processes the dataset and a piece of metadata is generated
The metadata are stored at the data owner's side, and the dataset is
transmitted to the storage server
The cloud storage service stores the dataset and sends data back to the user in
response to future queries from the data owner

PDP Based Integrity Checking Protocol
As part of the pre-processing procedure, the data owner (client) may conduct
operations on the data, such as expanding it or generating additional
metadata to be stored at the cloud server side
The data owner can execute the PDP protocol before the local copy is
deleted, to ensure that the uploaded copy has been stored at the server
machines successfully
The data owner may encrypt the dataset before transferring it to the storage
machines
While the data are stored in the cloud, the data owner can generate
a "challenge" and send it to the service provider to verify that the storage
server still holds the dataset

PDP Based Integrity Checking Protocol
The data owner requests that the storage server generate metadata based
on the stored data and send it back
Using the previously stored local metadata, the owner verifies the response
The PDP protocol randomly accesses only a few sub-blocks when it samples
the stored dataset
Hence, the PDP protocol probabilistically guarantees data integrity; a toy
version of this challenge-response idea is sketched below
It is mandatory (compulsory) to access the whole dataset if a deterministic
guarantee is required by the user
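
The sketch below is a toy version of this challenge-response idea, not the actual PDP construction: the owner keeps only a secret key (O(1) metadata), stores a keyed tag next to each block, and spot-checks a few randomly chosen blocks. Real PDP uses homomorphic tags so the server can return a short proof instead of the sampled blocks themselves.

```python
import hashlib, hmac, secrets

# Owner-side secret: the only metadata the data owner must keep (O(1)).
KEY = secrets.token_bytes(32)

def tag(index: int, block: bytes) -> bytes:
    """Keyed tag binding each block to its position."""
    return hmac.new(KEY, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

# Pre-process: owner tags each block, then uploads blocks + tags to the server.
blocks = [f"block-{i}".encode() for i in range(1000)]
server = [(b, tag(i, b)) for i, b in enumerate(blocks)]   # held by the provider

# Challenge: sample a few random indices instead of reading everything.
for i in (secrets.randbelow(len(server)) for _ in range(10)):
    block, stored_tag = server[i]                          # server's response
    assert hmac.compare_digest(stored_tag, tag(i, block)), f"block {i} corrupted"
print("spot-check passed: integrity probabilistically verified")
```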

PDP Based Integrity Checking Protocol
Fig (a): Pre-process and store - the client takes the input file F, generates metadata
(m) and a modified file (F'); the client keeps m in its local store, the server stores
F', and no server processing is needed

Fig (b): Verify server possession - (1) the client generates a random challenge 'R'
and sends it to the server, (2) the server computes a proof of possession 'P' over
its stored F', (3) the client verifies the server's proof against its local metadata
(result: 0/1)

An Enhanced Data Possession Checking Protocol
The PDP-based protocol does not satisfy Requirement #2 with 100% probability
An enhanced protocol has been proposed based on the idea of the Diffie-
Hellman scheme
This protocol satisfies all five requirements and is computationally more efficient
than the PDP-based protocol
The verification time has been shortened at the setup stage by taking advantage of
the trade-off between the computation time required by the prover and the storage
required at the verifier

Web-Application-Based Security
Cloud computing provides resources as a service to users over the internet
Users access these business applications and computing resources online
from a web browser
But the applications (software) and data are stored on cloud servers
When any web security vulnerability (weakness) is identified, attackers will use
techniques that take advantage of that vulnerability; this is called a web attack

Web-Application-Based Security
Fig: Types of web attack -
- Authentication
- Authorization
- Client-Side Attacks
- Command Execution
- Information Disclosure
- Logical Attacks

Authentication
Authentication is the process of verifying that a user is who he claims to be
Authentication attacks target a web site's method of validating the identity of
a user, service, or application, and include Brute Force, Insufficient
Authentication, and Weak Password Recovery Validation
Brute force attacks try all possible combinations to guess a username
and password; it is a trial-and-error method
In the Insufficient Authentication case, sensitive content or functionality is
protected only by "hiding" its location in an obscure (encoded/encrypted) string,
but it remains accessible directly through a specific URL

Authentication
The attacker can discover those URLs through brute-force probing of files and
directories
Many web sites provide a password recovery service based on answering some
questions; if these questions can be easily guessed or skipped, the web
site is considered to have Weak Password Recovery Validation

Authorization
Authorization means whether an authenticated user has permission to perform
certain operations (tasks); authentication should be done before authorization
Insufficient Authorization occurs when a web site does not protect sensitive
content or functionality with proper access control restrictions
Authorization attacks include Credential/Session Prediction, Insufficient Session
Expiration, and Session Fixation
In many web sites, after a user successfully authenticates for the first
time, the web site creates a session and generates a unique "session ID" to identify
it. This session ID is used in subsequent requests as proof of an authenticated session
A Credential/Session Prediction attack deduces or guesses the unique value of a
session in order to hijack or impersonate a user

Authorization
Insufficient Session Expiration occurs when an attacker is allowed to reuse old
session credentials or session IDs for authorization
For example, on a shared computer, after a user accesses a web site and then
leaves, with Insufficient Session Expiration an attacker can use the browser's back
button to access web pages previously accessed by the victim
Session Fixation forces a user's session ID to an arbitrary value via Cross-Site
Scripting or by peppering the web site with previously made HTTP requests
Once the victim logs in, the attacker uses the predefined session ID value to
impersonate the victim's identity; a sketch of the usual defences follows below
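
A sketch of the usual counter-measures to the three attacks above: session IDs drawn from a cryptographically secure generator (against prediction), a short idle timeout (against insufficient expiration), and a fresh ID issued at login (against fixation). The timeout value and in-memory storage are illustrative.

```python
import secrets, time
from typing import Optional

SESSIONS: dict = {}
TTL_SECONDS = 900   # assumed 15-minute idle timeout

def new_session(user: str) -> str:
    sid = secrets.token_urlsafe(32)        # CSPRNG: resists session prediction
    SESSIONS[sid] = {"user": user, "expires": time.time() + TTL_SECONDS}
    return sid

def on_login(old_sid: Optional[str], user: str) -> str:
    SESSIONS.pop(old_sid, None)            # drop any pre-login (possibly fixed) ID
    return new_session(user)               # fresh ID defeats session fixation

def valid(sid: str) -> bool:
    session = SESSIONS.get(sid)
    if not session or session["expires"] < time.time():
        SESSIONS.pop(sid, None)            # prompt expiry counters stale-ID reuse
        return False
    return True

sid = on_login(None, "alice")
print(valid(sid), valid("forged-or-stale-id"))   # True False
```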

Client-Side Attacks
Client-Side Attacks lure (trick) victims into clicking a link in a malicious web page
and then take advantage of the victim's trust in the real web site
In Content Spoofing, the malicious web page can trick a user into typing a username
and password, and then use this information to impersonate the user
Cross-Site Scripting (XSS) launches attacker-supplied executable code in the victim's
browser. The code is usually written in browser-supported scripting languages such as
JavaScript, VBScript, ActiveX, Java, or Flash
This code has the ability to read, modify, and transmit any sensitive data accessible
by the browser, such as cookies

Client-Side Attacks
Cross-Site Request Forgery (CSRF) is an attack on a vulnerable site that does
not perform CSRF checking on HTTP/HTTPS requests
If the URLs of the vulnerable site are not protected by CSRF checking, then after
luring the victim into clicking a link in a malicious web page, the attacker can forge
the victim's identity and access the vulnerable web site on the victim's behalf

Command Execution
Command Execution attacks exploit server-side vulnerabilities to execute remote
commands on the web site
If a web application does not properly check user-supplied input before using it within
application code, an attacker can alter (change) command execution on the server
For example, if the length of input is not checked before use, a buffer overflow could
happen and result in denial of service
Or, if the web application uses user input to construct statements such as SQL, XPath,
C/C++ format strings, OS system commands, LDAP queries, or dynamic HTML, an attacker
may inject (insert) arbitrary executable code into the server if the input is not properly
filtered; the sketch below contrasts unsafe and safe query construction
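
The sketch below contrasts unsafe and safe handling of user input for the SQL case, using Python's built-in sqlite3 module; the table and payload are invented for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"            # classic injection payload

# UNSAFE: string concatenation lets the payload rewrite the query.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())      # returns a row it should not

# SAFE: a parameterized query treats the payload as a literal string.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())   # returns []
```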

Information Disclosure
Information Disclosure attacks acquire sensitive information about a web site
revealed by developer comments, error messages, or well-known file name conventions
For example, a web server may return a list of files within a requested directory if the
default file is not present
This supplies an attacker with the information needed to launch further attacks
against the system
Other types of Information Disclosure include using special paths such as "." and ".." for
Path Traversal, or uncovering hidden URLs via Predictable Resource Location

Logical Attacks
Logical Attacks involve the exploitation of a web application's logic flow
A common Logical Attack is Denial of Service (DoS). DoS attacks attempt to consume
all available resources in the web server, such as CPU, memory, and disk space, by
abusing (exploiting) the functionality provided by the web site
When any system resource reaches a utilization threshold (level), the
web site will no longer be responsive to normal users
DoS attacks are often enabled by Insufficient Anti-automation, where an attacker is
permitted to automate a process repeatedly
An automated script can be executed thousands of times a minute, causing potential
loss of performance or service

Multimedia Data Security Storage
With the rapid development of multimedia technologies, more and more multimedia
content is being stored and delivered over many kinds of devices, databases, and
networks
Multimedia data means digital information such as audio, video, graphics and text
Multimedia data security plays an important role in protecting stored
multimedia data
Important security issues in multimedia data security are as below -
- Protection from unauthorized replication
- Protection from unauthorized replacement
- Protection from unauthorized pre-fetching

Protection from Unauthorized Replication
Content replication is used to create multiple copies of certain multimedia
content
For example, content distribution networks (CDNs) manage content
distribution to large numbers of users by keeping replicas (copies) of the same
content on a group of geographically distributed servers
Unauthorized replication means replication (creating copies) without permission
Although replication can improve system performance, unauthorized
replication causes problems such as content copyright violations, wasted replication
cost, and extra control overhead

Protection from Unauthorized Replacement
As storage capacity is limited, a replacement process must be carried out when
the capacity exceeds its limit
This means that a currently stored content item must be removed from the
storage space in order to make room for newly arriving content
If an unauthorized replacement happens, content which the user does not want to
delete will be removed, resulting in accidental data loss
How to decide which content should be removed is therefore very important
Furthermore, if important content such as system data is removed by
unauthorized replacement, the result will be even more serious

Protection from Unauthorized Pre-fetching
Pre-fetching is widely used in multimedia storage network systems between
server databases and end users' storage disks
This means that if a content item can be predicted to be requested by the user in
future requests, it is fetched from the server database to the end user before the
user requests it, in order to decrease response time
Although pre-fetching is efficient, unauthorized pre-fetching should
be avoided so that the system fetches only the necessary content

Data Security Risks
In cloud computing, user data are stored and maintained by a third-party cloud
provider such as Google, Amazon, Microsoft, and so on
When data is uploaded into a database in the cloud data centre, it is important
that the data is not hijacked (intercepted) during this process
It is necessary for the data centre storing the data to ensure that it is
encrypted at all times (a client-side sketch follows below)
Access to the data needs to be controlled; this control should also apply to
the hosting company, including the administrators of the data centre
Data resources should also be protected during use
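
A minimal sketch of keeping data encrypted before it ever reaches the provider, assuming the third-party cryptography package: the client encrypts locally, uploads only ciphertext, and keeps the key, so even data-centre administrators see nothing readable.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

key = Fernet.generate_key()   # stays with the data owner, never with the cloud
f = Fernet(key)

ciphertext = f.encrypt(b"customer record: Asha, Pune")   # what the cloud stores
plaintext = f.decrypt(ciphertext)                        # only the key holder can do this
print(plaintext)
```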

Data Security Risks
In cloud computing it is possible that data from one country is hosted in
another country, so the data security acts and rules of that country
must be followed
As multiple users share the cloud system, access control becomes a much more
fundamental issue
One way to address this is by adding an element of control, in the form of access
control, to afford a degree of risk mitigation (reduction)
Information-centric access control (as opposed to access control lists) can help to
balance improved accessibility with risk, by associating access rules with different
data objects within an open and accessible platform, without losing the inherent
usability of that platform

Data Security Risks
In cloud computing there is also a risk of misuse of content after access
Protection of information throughout its life cycle (beginning to end), encryption,
and access control of data are important components of cloud security

Digital Identity
Digital identity is important for flexible data security within a cloud environment
Identity is closest to the heart of the individual; our identity is our most personal
possession, and a digital identity represents who we are and how we interact with
others online
Digital identity can be used to form the basis for data security, not only in the cloud
but also at the local network level
Access, identity and risk are connected with each other in data security: if access
increases, then the risk to data security also increases

Digital Identity
Access can be controlled if the user trying to access the data can be identified
Digital identity can be programmatically linked to security policies controlling the post-
access usage of data
A digital identity, in a security system, must be verified by some trusted third
party
Even if your digital identity is verified by a trusted host, it can still be under the
individual's management and control
A digital identity can carry many identifiers (attributes) about an individual
that make identity theft a problem, but identity should also be kept private for the
simple reason of respect

Digital Identity and Access Management (DIAM/IAM)
Identity management provides consistent methods for digitally identifying persons
and maintaining associated (related) identity attributes for users across multiple
organisations
Access management deals with user identities, their authentication, authorization and
access policies
Standardized access control policies ensure confidentiality of data

Role-based access control in cloud
The role-based access control method is used to restrict access to confidential
information to authorized users only
In this method, different roles are defined for different users

Digital Identity and Access Management (DIAM/IAM)
All users from a specific department within an organisation can be given one role, and
there can be different roles for different departments
When a user wants to access application data in the cloud, they must send
their details to the system administrator, who assigns (gives) permissions and access
control policies; these are stored in the User Roles and Data Access Policies databases
respectively
Role-based access control in cloud
The role-based access control method then allows users access to application data
based on their assigned roles and the data access policies; a minimal sketch follows below
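
A minimal sketch of the role-based check described above; the roles, permissions and users are illustrative stand-ins for the User Roles and Data Access Policies databases.

```python
# Permissions attach to roles, users get roles, and every access is
# checked against both tables. All names here are hypothetical.
ROLE_PERMISSIONS = {
    "hr":      {"read:employee_records"},
    "finance": {"read:invoices", "write:invoices"},
    "admin":   {"read:employee_records", "read:invoices", "write:invoices"},
}
USER_ROLES = {"asha": {"hr"}, "ravi": {"finance"}, "root": {"admin"}}

def can(user: str, permission: str) -> bool:
    """Allow only if one of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(can("asha", "read:employee_records"))  # True
print(can("asha", "write:invoices"))         # False: the HR role lacks it
```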

Fig: Role-based access control in cloud - the administrator handles user management,
user role assignment and access policy management; the user's credentials are
verified against the Credentials DB, and the Role Based Access Control component
consults the User Roles DB and Data Access Policies DB before granting the user
access to the Cloud App

Case Study - AWS Identity and Access Management (IAM)
AWS Identity and Access Management (IAM) enables you to create multiple Users
Permissions for each of these Users are managed within your AWS Account
A User is an identity (within your AWS Account) with unique security credentials that can
be used to access AWS Services
IAM eliminates the need to share passwords or access keys, and makes it easy to enable
or disable a User's access as required
IAM enables you to implement security best practices, such as least privilege, by granting
unique credentials to every User within your AWS Account
Grant only the permissions needed for Users to access the AWS Services and resources
required to perform their job

Case Study - AWS Identity and Access Management (IAM)
IAM is secure by default; new Users have no access to AWS until permissions are
explicitly granted
IAM is natively (originally) integrated into most AWS Services
No service APIs have changed to support IAM, and applications and tools built on top
of the AWS service APIs continue to work when using IAM
Applications only need to begin using the access keys generated for a new User
You should minimize the use of your AWS Account credentials as much as possible
when interacting with AWS Services, and instead take advantage of IAM User
credentials to access AWS Services and resources; a small sketch using the AWS SDK
follows below
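
As a hedged sketch of these practices with the AWS SDK for Python (boto3): create a User, then attach an inline policy granting only the access the job needs. The user name, policy name and bucket are invented for the example, and the calls assume AWS credentials are already configured.

```python
import json
import boto3  # pip install boto3; assumes AWS credentials are configured

iam = boto3.client("iam")

# Create a User; it has no access at all until a policy is attached
# (IAM is secure by default).
iam.create_user(UserName="report-reader")

# Least privilege: allow only reading objects from one (hypothetical) bucket.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],                     # only what the job needs
        "Resource": "arn:aws:s3:::example-reports/*",   # only where it needs it
    }],
}
iam.put_user_policy(
    UserName="report-reader",
    PolicyName="read-reports-only",
    PolicyDocument=json.dumps(least_privilege_policy),
)
```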

Content Level Security
Content-level or information-centric security means that the content that makes up
any given data object (for example, a Word document) is protected, as opposed to the
file - that is, the carrier of that information
Here we control the document content itself and improve on the access control
measures
Content-centric security is also digital-identity-led: the identity used to access the
content dictates (decides) the security policy applied to that content and allows
us to control who can access the document
Even when the document is held on third-party servers in the cloud, it cannot be
accessed even by the system administrator of that server, because access is controlled
at the content level and does not depend on access to the database holding the data

Content Level Security
Advantages (Pros)
- Greater control over data
- More focused access control
- Increased granular protection over content, and assurance within a cloud-hosted
system
Disadvantages (Cons)
- Content-centric security measures need to be compatible with both database
security and secure transfer of data within a cloud environment
- Content-level security may be problematic across different storage types
and with query engines, which is particularly pertinent (relevant) with
dynamic data updating, as required by modern data storage operations

Features of Security-As-A-Cloud Service
Security as a Cloud Service, or Security as a Service (SECaaS), is a business model
where a service provider delivers cybersecurity solutions on a subscription basis via
the cloud. Organizations use these cloud-delivered services to protect their data,
applications, and networks
Data Encryption - Data is protected both while it is stored ("at rest") and while it is
being transferred ("in motion")
Data Loss Prevention (DLP) - Services designed to prevent sensitive data from leaving
the protected environment
Vulnerability Management - Includes vulnerability checks and the automatic patching of
vulnerable areas and misconfigurations
Identity and Access Management (IAM) - Ensures only authorized users access specific
resources, often through mechanisms like multi-factor authentication and role-based
access control

Features of Security-As-A-Cloud Service
Centralized Control - Provides a single point for managing user identities and access
policies across various cloud environments
Advanced Threat Protection - Includes real-time detection of intrusions, distributed
denial-of-service (DDoS) attacks, and other malicious web-based activity
Automated Monitoring and Response - Continuously monitors for threats and can
automatically remediate issues, minimizing the time to respond and reducing impact
AI-Driven Threat Intelligence - Takes advantage of artificial intelligence to identify and
mitigate sophisticated and zero-day threats
Centralized Visibility - Offers a single point to monitor and manage the security
posture of all cloud resources

Features of Security-As-A-Cloud Service
Compliance Management - Provides features and reporting to help ensure the
organization meets security and regulatory compliance standards
Disaster Recovery - Includes plans and mechanisms to prevent data loss and ensure
service continuity (availability) in the event of a cloud outage
Redundancy and Failover - Ensures that services remain available by using redundant
systems and automated failover processes

THANK YOU