Understanding Dynamic Programming in Algorithms and Data Structures


Dynamic programming, as presented by Shmuel Wimer in March 2022, solves optimization problems efficiently by breaking them into simpler sub-problems and avoiding repeated computation. The presentation covers naive recursive solutions, top-down and bottom-up dynamic-programming techniques, and retrieval of the optimal solution itself (the cuts, the parenthesization, or the common subsequence). By memoizing sub-problem solutions in arrays, dynamic programming keeps the running time polynomial where naive recursion is exponential.





Presentation Transcript


1. Dynamic Programming
Prepared by Shmuel Wimer, March 2022

2. Rod Cutting Optimization
Length: 1   2   3   4   5
Price:  3   6   8  10  12
Example cuts of a length-5 rod: 1+2+2 gives 3 + 6 + 6 = 15; 1+4 gives 3 + 10 = 13; 2+3 gives 6 + 8 = 14.
What cut of the rod maximizes the revenue r_n? For an n-length rod there are 2^{n-1} possible cuts (fewer are distinct, but the number is still non-polynomial in n).

3. Recursive solution: $r_n = \max_{1 \le i \le n} (p_i + r_{n-i})$.
Very inefficient solution! Run time is exponential in n.
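The recurrence can be transcribed directly. A minimal Python sketch, using the price table of slide 2 (lengths 1 to 5); the name cut_rod and the list layout p[0..5] are my choices, not the lecture's:

```python
def cut_rod(p, n):
    """Naive recursion: r_n = max over 1 <= i <= n of (p[i] + r_{n-i}), with r_0 = 0."""
    if n == 0:
        return 0
    best = float("-inf")
    for i in range(1, n + 1):                       # try every length i of the first piece
        best = max(best, p[i] + cut_rod(p, n - i))  # re-solves the same sub-problems repeatedly
    return best

p = [0, 3, 6, 8, 10, 12]    # p[i] = price of a length-i piece (table of slide 2)
print(cut_rod(p, 5))        # 15, matching the 3 + 6 + 6 cut
```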

4. Same sub-problems are solved many times!
(Figure: the recursion tree of sub-problem sizes for a length-5 rod; identical sub-trees recur many times.)
Avoid the repetitions by memoizing sub-problems. The number of recursive calls satisfies $T(n) = 1 + \sum_{j=0}^{n-1} T(j) = 2^n$.

5. Top-down Solution by Dynamic Programming
// initialize revenues of sub-problems
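The slide's pseudocode itself did not survive transcription; below is a minimal Python sketch of the top-down memoized approach that the surviving comments describe (the names memoized_cut_rod and r are mine):

```python
def memoized_cut_rod(p, n):
    r = [float("-inf")] * (n + 1)        # initialize revenues of sub-problems
    return memoized_cut_rod_aux(p, n, r)

def memoized_cut_rod_aux(p, n, r):
    if r[n] >= 0:                        # avoid repetition: use the memoized solution
        return r[n]
    if n == 0:
        q = 0
    else:
        q = float("-inf")
        for i in range(1, n + 1):        # solve this sub-problem (only once)
            q = max(q, p[i] + memoized_cut_rod_aux(p, n - i, r))
    r[n] = q                             # memoize the result
    return q
```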

6. // avoid repetition of sub-problems
// use the memoized solution
// solve this sub-problem
$T(n) = \Theta(n^2)$, because calls to previously solved sub-problems return immediately.

7. Bottom-up Solution by Dynamic Programming
// max-revenue memoization array
// update max cost
// memoize max revenue
$T(n) = \Theta(n^2)$, because solving sub-problems in ascending size, together with memoization, avoids re-solving sub-problems of the same size.
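A Python sketch of the bottom-up version the comments describe; sub-problems are solved in increasing size, so every r[j - i] is already available (names are mine):

```python
def bottom_up_cut_rod(p, n):
    r = [0] * (n + 1)                    # max-revenue memoization array, r[0] = 0
    for j in range(1, n + 1):            # solve sub-problems in ascending size
        q = float("-inf")
        for i in range(1, j + 1):
            q = max(q, p[i] + r[j - i])  # update max revenue for length j
        r[j] = q                         # memoize max revenue of a length-j rod
    return r[n]
```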

8. Cut Retrieval of Optimal Solution
// max-revenue and optimal-1st-cut memoization arrays
// find which 1st cut maximizes the revenue for the current length
// is a better cut found?
// update max revenue
// record where the cut occurs
// memoize max revenue
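A sketch of the extended bottom-up version these comments describe, where a second array records the optimal first cut for every length (the array name s is mine):

```python
def extended_bottom_up_cut_rod(p, n):
    r = [0] * (n + 1)                    # max-revenue memoization array
    s = [0] * (n + 1)                    # optimal-1st-cut memoization array
    for j in range(1, n + 1):
        q = float("-inf")
        for i in range(1, j + 1):        # find which 1st cut maximizes revenue for length j
            if q < p[i] + r[j - i]:      # is a better cut found?
                q = p[i] + r[j - i]      # update max revenue
                s[j] = i                 # record where the cut occurs
        r[j] = q                         # memoize max revenue
    return r, s
```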

9. The maximum-revenue cuts are retrieved as follows.
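A sketch of the retrieval loop: print the recorded first cut, then continue with what remains of the rod.

```python
def print_cut_rod_solution(p, n):
    r, s = extended_bottom_up_cut_rod(p, n)
    while n > 0:
        print(s[n])           # length of the next piece in an optimal cut
        n -= s[n]

print_cut_rod_solution(p, 5)  # prints 1 five times: five unit pieces also reach revenue 15
```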

10. Optimal Matrix-Chain Multiplication
Given a matrix chain A_1 A_2 ⋯ A_n with #col(A_{i-1}) = #row(A_i), what multiplication order yields the smallest number of scalar products?
A_1 A_2 A_3 A_4 can be parenthesized as follows (multiplication is associative):
(A_1(A_2(A_3A_4))), (A_1((A_2A_3)A_4)), ((A_1A_2)(A_3A_4)), ((A_1(A_2A_3))A_4), (((A_1A_2)A_3)A_4).
Let A_1 be 10×100, A_2 be 100×5 and A_3 be 5×50.
((A_1A_2)A_3): 10·100·5 + 10·5·50 = 7500 scalar products.
(A_1(A_2A_3)): 10·100·50 + 100·5·50 = 75000 scalar products!
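A quick check of the slide's arithmetic (the helper name mult_cost is mine): multiplying a (p×q) matrix by a (q×r) matrix costs p·q·r scalar products.

```python
def mult_cost(p, q, r):
    return p * q * r                     # scalar products of a (p x q) times (q x r) multiplication

# ((A1 A2) A3): first 10x100 times 100x5, then 10x5 times 5x50
print(mult_cost(10, 100, 5) + mult_cost(10, 5, 50))    # 7500
# (A1 (A2 A3)): first 100x5 times 5x50, then 10x100 times 100x50
print(mult_cost(100, 5, 50) + mult_cost(10, 100, 50))  # 75000
```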

11. Number of parenthesizations:
$P(n) = 1$ if $n = 1$, and $P(n) = \sum_{k=1}^{n-1} P(k)\,P(n-k)$ if $n \ge 2$.
$P(n)$ grows non-polynomially, at least $2^n$ (homework).
What is the structure of an optimal solution? Notation: A_{i..j} = A_i A_{i+1} ⋯ A_j.
If i < j, the parenthesization splits at some k, i ≤ k < j, so A_{i..j} = A_{i..k} A_{k+1..j}.
Multiplication cost: cost(A_{i..k}) + cost(A_{k+1..j}) + cost(A_{i..k} · A_{k+1..j}).
In an optimal solution, the parenthesizations of A_{i..k} and A_{k+1..j} must each minimize the number of scalar products.
The above suggests a recursive solution.

12. Notation: m[i,j] = the minimum, over all parenthesizations, of cost(A_{i..j}).
The size of A_i is p_{i-1} × p_i, so the multiplication A_{i..k} · A_{k+1..j} takes p_{i-1} p_k p_j scalar products. Consequently, m[i,j] = m[i,k] + m[k+1,j] + p_{i-1} p_k p_j. Since k can be any index with i ≤ k < j,
$$m[i,j] = \begin{cases} 0 & \text{if } i = j, \\ \min_{i \le k < j} \{\, m[i,k] + m[k+1,j] + p_{i-1} p_k p_j \,\} & \text{if } i < j. \end{cases} \qquad (1)$$
Retrieval of the optimal parenthesization uses s[i,j], which stores the optimal k.
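Recurrence (1) can be transcribed directly; a sketch (the name m_cost is mine) where p is the dimension list, so A_i is p[i-1] × p[i]. Without memoization this recursion is exponential, as the next slide notes.

```python
def m_cost(p, i, j):
    if i == j:
        return 0                          # a single matrix needs no multiplication
    return min(m_cost(p, i, k) + m_cost(p, k + 1, j) + p[i - 1] * p[k] * p[j]
               for k in range(i, j))      # try every split point i <= k < j

print(m_cost([10, 100, 5, 50], 1, 3))     # 7500 for the three-matrix example of slide 10
```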

13. The optimal cost could be computed by straightforward recursion, but as in the rod-cutting problem this takes at least exponential time. (As with rod cutting, memoization gives an efficient top-down solution; see CLRS 15.3.) Instead, repeated solution of the same sub-problems is avoided by memoizing m[i,j] in an n × n matrix initialized to ∞ (only the entries with i ≤ j matter). Eqn. (1) shows that computing m[i,j], whose chain length is j − i + 1, depends only on the shorter sub-chains yielding m[i,k] and m[k+1,j], which calls for a bottom-up solution. The following algorithm runs in $\Theta(n^3)$ time and uses $\Theta(n^2)$ memory.

14. // m[i,i] = 0: no multiplication, zero cost
// set the sub-chain length
// compute the min cost for all sub-chains of this length
// set the sub-chain border k
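A Python sketch of the bottom-up algorithm these comments outline; indices 1..n follow the slides (row/column 0 is unused), m[i][j] stores the minimum scalar-product count for A_i..A_j and s[i][j] the optimal split k.

```python
def matrix_chain_order(p):
    n = len(p) - 1                                   # number of matrices in the chain
    m = [[0] * (n + 1) for _ in range(n + 1)]        # m[i][i] = 0: no multiplication, zero cost
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for l in range(2, n + 1):                        # set sub-chain length l
        for i in range(1, n - l + 2):                # min cost for all sub-chains of length l
            j = i + l - 1
            m[i][j] = float("inf")
            for k in range(i, j):                    # set the sub-chain border
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j] = q
                    s[i][j] = k
    return m, s
```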

15. Retrieval of Optimal Parenthesization
Each entry s[i,j] of the table s[1..n-1, 2..n] records the k at which the optimal parenthesization of A_i A_{i+1} ⋯ A_j splits, i.e. between A_k and A_{k+1}. Hence A_{1..s[1,n]} A_{s[1,n]+1..n} starts the optimal parenthesization, and from there the splits can be retrieved recursively.
// prints "(" and ")" around the two recursively printed sub-chains
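A sketch of the recursive retrieval described above; s is the split table returned by the bottom-up algorithm.

```python
def print_optimal_parens(s, i, j):
    if i == j:
        print(f"A{i}", end="")                       # a single matrix: print "A_i"
    else:
        print("(", end="")                           # open the parenthesis of A_i..A_j
        print_optimal_parens(s, i, s[i][j])          # left sub-chain A_i..A_k
        print_optimal_parens(s, s[i][j] + 1, j)      # right sub-chain A_{k+1}..A_j
        print(")", end="")

m, s = matrix_chain_order([10, 100, 5, 50])
print_optimal_parens(s, 1, 3)                        # prints ((A1A2)A3)
```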

16. Example: let the split table below be the one obtained for the optimal parenthesization of A_1 ⋯ A_6. Calling PRINT-OPTIMAL-PARENS(s, 1, 6) yields ((A_1(A_2A_3))((A_4A_5)A_6)).
Dynamic programming comprises:
1. Optimal substructure
2. Overlapping sub-problems
3. Reconstruction of the optimal solution
4. Memoization

17. Longest Common Subsequence (LCS)
S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA
S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAA
S1 and S2 describe DNA. Their longest common subsequence S3 defines their similarity:
S3 = GTCGTCGGAAGCCGGCCGAA
A sequence Z = ⟨z_1, z_2, …, z_k⟩ is a subsequence of X = ⟨x_1, x_2, …, x_m⟩ if there is a strictly increasing sequence of indices ⟨i_1, i_2, …, i_k⟩ such that x_{i_j} = z_j for 1 ≤ j ≤ k.

18. Since X = ⟨x_1, x_2, …, x_m⟩ has 2^m distinct subsequences, finding an LCS by enumeration is impractical.
Definition: X_i = ⟨x_1, x_2, …, x_i⟩ is called the i-th prefix of X.
Theorem (optimal substructure of LCS): Let Z = ⟨z_1, z_2, …, z_k⟩ be any LCS of X = ⟨x_1, x_2, …, x_m⟩ and Y = ⟨y_1, y_2, …, y_n⟩. Then:
1. if x_m = y_n, then z_k = x_m = y_n and Z_{k-1} is an LCS of X_{m-1} and Y_{n-1};
2. if x_m ≠ y_n, then z_k ≠ x_m implies Z is an LCS of X_{m-1} and Y;
3. if x_m ≠ y_n, then z_k ≠ y_n implies Z is an LCS of X and Y_{n-1}.
Proof: (1) If z_k ≠ x_m, then x_m = y_n could be appended to Z, yielding a common subsequence of length k + 1, contradicting Z being an LCS.

19. Hence z_k = x_m = y_n, and Z_{k-1} is a (k−1)-length common subsequence of X_{m-1} and Y_{n-1}. If Z_{k-1} were not an LCS of X_{m-1} and Y_{n-1}, there would be an LCS W of length at least k; appending x_m = y_n to W would yield a common subsequence of X and Y of length at least k + 1, a contradiction to Z being an LCS.
(2) If z_k ≠ x_m, then Z is a common subsequence of X_{m-1} and Y, but if it were not an LCS, there would be another common subsequence W of X_{m-1} and Y of length at least k + 1. W is also a common subsequence of X (= X_m) and Y, contradicting that Z is an LCS of X and Y.
(3) Symmetric to (2).

20. Consequence: an LCS of two sequences contains an LCS of prefixes of the two, so the LCS problem possesses the optimal-substructure property.
Consequence: given X and Y, if x_m = y_n then the LCS is obtained from an LCS of X_{m-1} and Y_{n-1}; otherwise the LCS is the longer of an LCS of X_{m-1} and Y and an LCS of X and Y_{n-1}.
Let c[i,j] be the LCS length of X_i and Y_j. Then
$$c[i,j] = \begin{cases} 0 & \text{if } i = 0 \text{ or } j = 0, \\ c[i-1,j-1] + 1 & \text{if } i,j > 0 \text{ and } x_i = y_j, \\ \max\{c[i,j-1],\, c[i-1,j]\} & \text{if } i,j > 0 \text{ and } x_i \ne y_j. \end{cases}$$
A straightforward implementation of this recursion solves exponentially many sub-problems (why?), yet only mn distinct LCS sub-problems exist.

21. // 0-length X prefix
// 0-length Y prefix
// case (1): LCS growing
// case (2): LCS not growing
// case (3): LCS not growing
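A Python sketch of the bottom-up LCS-LENGTH computation these comments describe; c[i][j] is the LCS length of the prefixes X_i and Y_j, and b[i][j] records which case produced it (used for retrieval on the next slide). The direction labels are my own encoding.

```python
def lcs_length(X, Y):
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]     # row 0 / column 0: 0-length prefixes
    b = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:              # case (1): x_i = y_j, LCS growing
                c[i][j] = c[i - 1][j - 1] + 1
                b[i][j] = "diag"
            elif c[i - 1][j] >= c[i][j - 1]:      # case (2): LCS not growing
                c[i][j] = c[i - 1][j]
                b[i][j] = "up"
            else:                                 # case (3): LCS not growing
                c[i][j] = c[i][j - 1]
                b[i][j] = "left"
    return c, b
```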

22. LCS Retrieval
The equation defining c[i,j] enables LCS retrieval in reverse order. Starting at c[m,n], examining c[i-1,j-1], c[i-1,j] and c[i,j-1] determines how c[i,j] was obtained; the LCS grows whenever x_i = y_j. Notice that the LCS is not unique!
(The slide shows the c table for X = ⟨A,B,C,B,D,A,B⟩ and Y = ⟨B,D,C,A,B,A⟩, with LCS length c[7,6] = 4.)
Q: Does the order of cases (2) and (3) in LCS-LENGTH matter?
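A sketch of the reverse-order retrieval: starting from b[m][n], a "diag" entry means x_i = y_j belongs to the LCS, while "up"/"left" move to the sub-problem the value came from. The call below uses the sequences from the slide's table.

```python
def print_lcs(b, X, i, j):
    if i == 0 or j == 0:
        return
    if b[i][j] == "diag":
        print_lcs(b, X, i - 1, j - 1)
        print(X[i - 1], end="")                   # this symbol is part of the LCS
    elif b[i][j] == "up":
        print_lcs(b, X, i - 1, j)
    else:
        print_lcs(b, X, i, j - 1)

c, b = lcs_length("ABCBDAB", "BDCABA")
print_lcs(b, "ABCBDAB", 7, 6)                     # prints BCBA (one of several LCSs)
```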
