Introduction to Dynamic Programming Session No.: 20 Course Name: Design & Analysis of Algorithms Course Code: R1UC407B Instructor Name: Akhilesh Tripathi Duration: 50 minutes Date of Conduction of Class: -05-2025
Recap Single Source Shortest Path: Dijkstra's Algorithm
Session Objectives: At the end of this session, students will be able to:
Dynamic Programming? Dynamic programming (DP) is a problem-solving approach that breaks down a complex problem into smaller, overlapping subproblems. It then solves each subproblem once and stores the results to avoid redundant calculations, leading to efficient solutions, especially for optimization problems and problems with a recursive structure.
Subproblem? A subproblem is a smaller version of the original problem. It is created by breaking the original problem down into smaller and smaller pieces until it cannot be broken down any further.
Divide & Conquer v/s DP
Feature | Divide and Conquer | Dynamic Programming
Subproblem Nature | Independent (no shared sub-subproblems) | Overlapping (same subproblems computed multiple times)
Solution Reuse | Generally does not reuse solutions of subproblems | Stores and reuses solutions of subproblems (memoization/tabulation)
Approach | Top-down (recursive) | Can be top-down (memoization) or bottom-up (tabulation)
Efficiency | Can be inefficient if overlapping subproblems exist | More efficient for problems with overlapping subproblems
Problem Type | Often for general problem decomposition | Primarily for optimization problems
Key Principle | Breaking down and combining | Avoiding redundant calculations and optimal substructure
Example | Merge Sort, Quick Sort, Strassen's Matrix Multiplication | Fibonacci Sequence, 0/1 Knapsack
How Dynamic Programming Works
1. Identify the Subproblems: Determine the smaller, overlapping subproblems that make up the overall problem.
2. Solve Subproblems Once: Solve each subproblem and store its solution in a lookup table or other data structure.
3. Build the Solution: Use the stored solutions to the subproblems to construct the solution to the overall problem.
Example (nth Fibonacci Number): Pure Recursive (naive approach). The time complexity of this approach is exponential (roughly O(2^n)), because the same subproblems are recomputed many times.
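A minimal Python sketch of this pure recursive approach (the function name fib is an illustrative choice, not taken from the slide):

```python
def fib(n):
    # Base cases: F(0) = 0, F(1) = 1
    if n < 2:
        return n
    # Each call spawns two more recursive calls and nothing is stored,
    # so the same subproblems are recomputed again and again.
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```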
Recursive Tree (recursion tree of the naive Fibonacci computation, showing the same subproblems, such as F(3) and F(2), appearing multiple times)
Ways to solve the problem: plain recursion, memoization (top-down), tabulation (bottom-up), and bottom-up DP without explicit memory (space-optimized), covered in the following slides.
Steps to design DP solutions: Characterize the Optimal Substructure. Definition: The optimal solution to the problem can be constructed from the optimal solutions of its subproblems. Example: F(n) can be found from F(n-1) and F(n-2). If F(n) is optimal, then F(n-1) and F(n-2) must also be optimal for their respective subproblems.
Steps to design DP solutions: Characterize the Overlapping Subproblems. The problem can be broken down into subproblems, which are reused multiple times in the recursive solution. Example: When computing F(5), you need F(4) and F(3). For F(4), you need F(3) and F(2). Notice F(3) is computed twice. This confirms overlapping subproblems (see the counting sketch after these steps).
Steps to design DP solutions: Define the DP State. What information do you need to store to solve a subproblem? This forms the "state" of your DP table. Example: dp[i] will represent the i-th Fibonacci number.
Steps to design DP solutions Formulate the Recurrence Relation: It's an equation that shows how to compute the solution for a larger subproblem using the solutions of smaller subproblems. Example (Fibonacci): F(n) = F(n-1) + F(n-2) with base cases F(0) = 0, F(1) = 1.
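To make the overlapping-subproblems step concrete, the sketch below instruments the naive recursion with a call counter; the Counter-based bookkeeping is an illustrative addition, not part of the slides:

```python
from collections import Counter

calls = Counter()  # how many times each F(k) is requested

def fib_naive(n):
    calls[n] += 1
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

fib_naive(5)
print(dict(calls))  # {5: 1, 4: 1, 3: 2, 2: 3, 1: 5, 0: 3}
```

The output confirms the example on the slide: F(3) is computed twice when evaluating F(5), and smaller subproblems are recomputed even more often.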
Memoization (Top-Down Approach)
Memoization is an optimization technique used in dynamic programming that involves storing the results of expensive function calls and reusing them when the same inputs occur again.
Storage: Uses data structures like dictionaries, arrays, or hash maps to store results of subproblems.
Recursive Approach: Implemented in a top-down manner, where the function checks for stored results before computing new ones.
Efficiency: Reduces redundant calculations, significantly lowering time complexity (often from exponential to polynomial).
Memoization (Top-Down Approach): Use Cases
Problems with overlapping subproblems.
Inefficient recursive solutions due to repeated calculations.
Situations where the overhead of memoization storage is justified by performance gains.
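A minimal sketch of the memoized (top-down) Fibonacci computation in Python; the dictionary cache named memo is an assumed, illustrative detail:

```python
def fib_memo(n, memo=None):
    if memo is None:
        memo = {}              # lookup table for already-solved subproblems
    if n in memo:
        return memo[n]         # reuse the stored result instead of recomputing
    if n < 2:
        return n               # base cases F(0) = 0, F(1) = 1
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(50))  # 12586269025, computed in O(n) time instead of exponential
```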
Tabulation (Bottom-Up Approach)
Tabulation is an optimization technique in dynamic programming that involves solving problems by filling up a table (usually an array) iteratively, starting from the base cases and building up to the desired solution.
Table Structure: Typically uses arrays or matrices to store results of subproblems.
Iterative Approach: Uses a bottom-up approach, solving all subproblems and storing their results in a table.
Efficiency: Directly builds the solution by iterating through all possible subproblems, reducing time complexity.
Tabulation (Bottom-Up Approach): Use Cases
Problems that benefit from a bottom-up solution.
Situations where an iterative approach is more efficient or easier to implement than recursion.
Problems where all subproblems need to be solved to get the final solution.
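A minimal sketch of the tabulation (bottom-up) approach for the same Fibonacci example, using the dp[i] state defined earlier; the function name fib_tab is illustrative:

```python
def fib_tab(n):
    if n < 2:
        return n
    dp = [0] * (n + 1)                   # dp[i] holds the i-th Fibonacci number
    dp[1] = 1                            # base cases: dp[0] = 0, dp[1] = 1
    for i in range(2, n + 1):
        dp[i] = dp[i - 1] + dp[i - 2]    # recurrence F(i) = F(i-1) + F(i-2)
    return dp[n]

print(fib_tab(50))  # 12586269025
```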
Bottom-Up DP without Explicit Memory
Bottom-up dynamic programming without an explicit memory table involves solving problems iteratively, starting from the base cases and progressing to the final solution, while minimizing the use of additional storage.
Iterative Approach: Uses a bottom-up method to solve subproblems iteratively.
Memory Optimization: Minimizes or avoids extra storage by keeping track of only the necessary current and previous states.
Efficiency: Reduces both time and space complexity, making the algorithm more efficient.
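A sketch of this space-optimized bottom-up variant for Fibonacci, keeping only the two most recent values; the variable names prev and curr are illustrative:

```python
def fib_space_optimized(n):
    if n < 2:
        return n
    prev, curr = 0, 1                    # F(0) and F(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr   # keep only the two most recent values
    return curr

print(fib_space_optimized(50))  # 12586269025, using O(1) extra space
```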
Summary
Post Session Activity: You need to attempt the post-assessment activity on the LMS. Give your feedback for this session on your LMS by the end of the day. Next session: "0/1 Knapsack Problem". Read the pre-class study materials for the next session.