Lecture 3: Insertion Sort and Complexity Analysis

Lecture 3: Analysis of Algorithms & Insertion Sort
Jayavignesh T
Asst. Professor
SENSE

Time Complexity
•The amount of computer time required by an algorithm to run to completion
•It is difficult to compute time complexity in terms of physically clocked time.
•Drawbacks of measuring running time in terms of seconds, milliseconds, etc.:
–Dependence on the speed of the underlying hardware
–Number of other programs running (system load)
–Dependence on the compiler used to generate the machine code

How to calculate running time then?
•Time complexity is given in terms of FREQUENCY COUNT
•The count denotes the number of times each statement is executed

for (i = 0; i < n; i++) {   // init: 1, test: n + 1, increment: n
    sum = sum + a[i];       // body: n
}
Total = 3n + 2 = O(n), neglecting constants and lower-order terms
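
As a quick check (a minimal sketch, not from the slides), the same loop can be instrumented with a counter so the reported step count matches 3n + 2:

#include <stdio.h>

int main(void) {
    int n = 10;
    int a[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    long steps = 0, sum = 0;

    steps++;                         /* i = 0 executes once     */
    for (int i = 0; ; ) {
        steps++;                     /* test i < n: n + 1 times */
        if (!(i < n)) break;
        sum = sum + a[i]; steps++;   /* body: n times           */
        i++; steps++;                /* increment: n times      */
    }
    printf("n=%d steps=%ld sum=%ld (3n+2=%d)\n", n, steps, sum, 3 * n + 2);
    return 0;
}

For n = 10 this prints 32 steps, matching 3n + 2.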

How to calculate running time then?
for (i = 0; i < n; i++)        // init: 1, test: n + 1, increment: n
{
    for (j = 0; j < n; j++)    // init: n, test: n(n + 1), increment: n·n
    {
        c[i][j] = a[i][j] + b[i][j];   // body: n·n
    }
}
Total = 3n² + 4n + 2 = O(n²)
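
The same instrumentation idea (again a sketch, not slide material) confirms the nested-loop total:

#include <stdio.h>
#define N 8

int main(void) {
    static int a[N][N], b[N][N], c[N][N];
    long steps = 0;

    steps++;                              /* i = 0: once            */
    for (int i = 0; ; ) {
        steps++;                          /* i < N: N + 1 times     */
        if (!(i < N)) break;
        steps++;                          /* j = 0: N times         */
        for (int j = 0; ; ) {
            steps++;                      /* j < N: N(N + 1) times  */
            if (!(j < N)) break;
            c[i][j] = a[i][j] + b[i][j];
            steps++;                      /* body: N·N times        */
            j++; steps++;                 /* j++: N·N times         */
        }
        i++; steps++;                     /* i++: N times           */
    }
    printf("N=%d steps=%ld (3N^2+4N+2=%d)\n", N, steps, 3 * N * N + 4 * N + 2);
    return 0;
}

For N = 8 this prints 226 steps, matching 3N² + 4N + 2.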

Time Complexity
•Number of steps required by an algorithm varies
with the size of the problem it is solving.

•Normally expressed as an order of magnitude
–e.g. O(n²)
–If the size of the problem doubles, the algorithm takes about 4 times as many steps to complete

How to calculate running time then?
•All algorithms run longer on larger inputs
•An algorithm's efficiency is expressed as a function f(n) of the input size
•Identify the most important operation of the algorithm – the BASIC operation
•The basic operation contributes most of the total running time
•Compute the number of times the basic operation is executed (usually in the innermost loop)
Ex: sorting algorithms – comparisons (<, >)
matrix multiplication, polynomial evaluation – arithmetic operations (*, +)
= (assignment), == (equality test), etc.

Order of Growth of Algorithm
•Measures the performance of an algorithm in relation to the input size n
•We cannot say the running time equals n², only that it grows like n²

EFFICIENCY COMPARISONS

Rate of Growth of Algorithm as a Function of Input Size

Determination of Complexities
•How do you determine the running time of a piece of code?

Ans: It depends on the kinds of statements used.

1. Sequence of Statements
Statement 1;
Statement 2;
...
Statement k;
•Independent statements in a piece of code, not an unrolled loop
•Total time: add the times for all statements
•Total Time = Time(Statement 1) + Time(Statement 2) + … + Time(Statement k)
•If each statement is simple (a basic operation), its time is constant, so the total time is also constant: O(1)

1 (Constant Time)
•When the instructions of a program are executed once, or at most only a few times, the running-time complexity of the algorithm is constant.

•It is independent of the problem size.
•It is represented as O(1).

•For example, the best-case complexity of linear search is O(1)

Log n (Logarithmic)
•The running time of an algorithm that solves a large problem by repeatedly transforming it into smaller subproblems is said to be logarithmic in nature.
•It becomes only slightly slower as n grows.
•It does not process all the data elements of an input of size n.
•The running time does not double until n increases to n².
•It is represented as O(log n).
•For example, the running-time complexity of binary search is O(log n).
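
The slides cite binary search but include no code; a minimal C sketch of the algorithm they refer to:

/* Each iteration halves the remaining range, hence O(log n) steps. */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (a[mid] == key) return mid;
        if (a[mid] < key)
            lo = mid + 1;               /* discard the lower half */
        else
            hi = mid - 1;               /* discard the upper half */
    }
    return -1;                          /* not found */
}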

2. For loops
for (i = 0; i < N; i++)
{
    sequence of statements
}

•The loop executes N times, so the sequence of statements also executes N times.
•Total time for the for loop = N · O(1) = O(N)

3. If-then-else statements
if (cond) {
    sequence of statements 1
}
else {
    sequence of statements 2
}
•Either sequence 1 or sequence 2 will execute.
•The worst-case time is the slower of the two possibilities:
–max{ time(sequence 1), time(sequence 2) }
–If sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for the if-then-else would be O(N)

n (Linear)
•The complete set of instructions is executed once for each element of the input, i.e. an input of size n is fully processed.

•It is represented as O(n).

•This is the best one can do when the whole input has to be processed.

•In this situation the time requirement increases linearly with the size of the problem.

•For example, the worst-case complexity of linear search is O(n).
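
Linear search is the slides' running example for both the O(1) best case and the O(n) worst case; a minimal C sketch:

/* Best case O(1): key sits at a[0]. Worst case O(n): key is at
   a[n-1] or absent, so all n elements are examined. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;   /* found after i + 1 comparisons */
    }
    return -1;          /* n comparisons made: the worst case */
}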

4. Nested Loops
for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements;
    }
}

Total complexity = O(N·M), which is O(N²) when M = N

5. Statements with function calls
•for (j = 0; j < N; j++) g(N); has complexity O(N²)
–The loop executes N times
–Each call g(N) has complexity O(N)
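
A sketch of the pattern the slide describes (the name g is the slide's; its body here is an assumption):

/* g does O(n) work, so calling it N times costs O(N^2) overall. */
void g(int n) {
    for (int i = 0; i < n; i++) {
        /* O(1) work per iteration */
    }
}

void caller(int N) {
    for (int j = 0; j < N; j++)
        g(N);   /* N calls, each O(N) */
}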

n² (Quadratic)
•The running time of an algorithm is quadratic in nature when it processes all pairs of data items.
•Such an algorithm typically has two nested loops.
•For input size n, the running time will be O(n²).
•Practically, this is useful only for problems with small input sizes, or for elementary sorting problems.
•In this situation the time requirement increases quickly with the size of the problem.
•For example, the running-time complexity of insertion sort is O(n²).

Performance Classification

Efficiency comparisons

Function of Growth Rate

Prob 1. Calculate the worst-case complexity!
•Nested loop + non-nested loop
for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements;
    }
}
for (k = 0; k < N; k++) {
    sequence of statements;
}
•O(N²) + O(N) = O(max(N², N)) = O(N²)

Prob 2. Calculate the worst-case complexity!
•Nested loop whose inner index starts at i
for (i = 0; i < N; i++) {
    for (j = i; j < N; j++) {
        sequence of statements;
    }
}
•N + (N − 1) + (N − 2) + … + 1 = N(N + 1)/2 = O(N²)
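
A short check (a sketch, not slide material) that counts the inner-loop iterations and confirms the triangular-number total:

#include <stdio.h>

int main(void) {
    int N = 10;
    long count = 0;
    for (int i = 0; i < N; i++)
        for (int j = i; j < N; j++)
            count++;            /* inner body runs N - i times per i */
    printf("N=%d count=%ld N(N+1)/2=%d\n", N, count, N * (N + 1) / 2);
    return 0;
}

For N = 10 this prints 55, i.e. 10·11/2.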

Approaches to Designing Algorithms
•Incremental approach
–Insertion sort: in each iteration one more element joins the sorted array
•Divide-and-conquer approach
–Recursively break the problem down into 2 or more subproblems until each becomes easy to solve; the subproblem solutions are then combined to give the solution to the original problem
–Merge sort
–Quick sort

Insertion Sort
[Figure: the array 3 4 6 8 9 7 2 5 1 with indices 1 … n; j marks the current key and i scans the sorted prefix to its left]
Strategy

•Start empty-handed
•Insert each card into the right position in the already-sorted hand
•Continue until all the cards are inserted/sorted
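
The slides' own code listing did not survive extraction; a standard C implementation of the strategy just described (0-based indexing here, whereas the analysis slides use the 1-based j = 2 … n convention):

void insertion_sort(int a[], int n) {
    for (int j = 1; j < n; j++) {
        int key = a[j];              /* the next "card" to insert   */
        int i = j - 1;
        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];         /* shift larger elements right */
            i--;
        }
        a[i + 1] = key;              /* drop the key into its slot  */
    }
}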

Analysis – Insertion Sort

Insertion Sort – Tracing Input

Analysis – Insertion Sort
•Assume that the i-th line takes a constant time c_i. (Since the third line is a comment, it takes no time.)

•For j = 2, 3, …, n, let t_j be the number of times the while-loop test is executed for that value of j.

•Note that when a for or while loop exits in the usual way – due to the test in the loop header – the test is executed one more time than the loop body.

Analysis – Insertion Sort – Running time
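
The figure for this slide did not survive extraction. Assuming the slides follow the standard CLRS-style line costs c_1 … c_8 (line 3 being the zero-cost comment), the total running time presumably shown was:

T(n) = c_1·n + c_2·(n − 1) + c_4·(n − 1) + c_5·Σ_{j=2..n} t_j
     + c_6·Σ_{j=2..n} (t_j − 1) + c_7·Σ_{j=2..n} (t_j − 1) + c_8·(n − 1)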

Best case Analysis
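
This slide's figure was also lost; the standard result under the same costs: if the array is already sorted, the while-test fails immediately, so t_j = 1 for every j, and

T(n) = (c_1 + c_2 + c_4 + c_5 + c_8)·n − (c_2 + c_4 + c_5 + c_8)

which is a linear function a·n + b, i.e. Θ(n).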

Worst case Analysis
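
Again filling in the lost figure with the standard result: if the array is in reverse sorted order, each key is compared against every element of the sorted prefix, so t_j = j. Using Σ_{j=2..n} j = n(n + 1)/2 − 1 and Σ_{j=2..n} (j − 1) = n(n − 1)/2, T(n) becomes a quadratic function a·n² + b·n + c, i.e. Θ(n²).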

Average Case
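
The figure is missing here too; the usual argument: on average the key is compared against about half of the sorted prefix, so t_j ≈ j/2. The sums remain quadratic in n, so the average case, like the worst case, is Θ(n²).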

Divide-and-Conquer
•The most well-known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller instances
2. Solve the smaller instances recursively
3. Obtain the solution to the original (larger) instance by combining these solutions

•The running time is described by a recurrence relation, as sketched below
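
The recurrence-relation figure was lost; as an illustrative sketch (not from the slides), the general divide-and-conquer form and merge sort's instance of it:

T(n) = a·T(n/b) + f(n)    (a subproblems of size n/b; f(n) to divide and combine)
Merge sort: T(n) = 2·T(n/2) + Θ(n), which solves to Θ(n log n)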

Divide-and-Conquer Technique (cont.)