Dynamic Programming
 Introduction
 Two Ways of Solving Problems
 Example: Length of the Longest Increasing Subsequence
 Famous DP Problems
 Online Resources
 Quiz
Introduction
Dynamic Programming is a powerful technique used for solving a particular class of problems. The concept is based on a very simple overarching idea:
If you have solved a problem with a particular input, save the result for future reference, so as to avoid solving the same problem again!
Always remember!
“Those who cannot remember the past are condemned to repeat it.” — George Santayana
Two Ways of Solving Problems

Top-Down: Start solving the given problem by breaking it down. If you see that a subproblem has already been solved, just return the saved answer; if it has not been solved, solve it and save the answer. This approach is usually easy to think of and very intuitive. It is referred to as memoization.

Bottom-Up: Analyze the problem, determine the order in which the subproblems are solved, and start from the trivial subproblems, working up towards the given problem. In this process, it is guaranteed that each subproblem is solved before it is needed. This is referred to as tabulation, or simply dynamic programming.
Example: Length of the Longest Increasing Subsequence
 Given a sequence, the goal of the longest increasing subsequence (LIS) problem is, as the name suggests, to find the longest increasing subsequence. Formally, given a sequence \(S = \{a_1, a_2, a_3, \ldots, a_{n-1}, a_n\}\), we have to find a longest subsequence such that for all indices \(i\) and \(j\) in the subsequence with \(j < i\), we have \(a_j < a_i\).
 We start off by computing, for every index \(i\), the length \(LS_i\) of the longest increasing subsequence whose last element is \(a_i\). The largest \(LS_i\) is then the length of the longest increasing subsequence of the whole sequence. Initially \(LS_i\) is set to one, since \(a_i\) by itself is a subsequence (of length one). Then, over all \(j\) such that \(j < i\) and \(a_j < a_i\), we find the largest \(LS_j\) and set \(LS_i = LS_j + 1\).
 The algorithm is quadratic in time; in other words, it has a time complexity of \(O(n^2)\).
Pseudocode

    for i = 0 to n-1
        LS[i] = 1
        for j = 0 to i-1
            if (a[i] > a[j] and LS[i] < LS[j] + 1)
                LS[i] = LS[j] + 1
    largest = 0
    for i = 0 to n-1
        if (largest < LS[i])
            largest = LS[i]

The time complexity can be reduced to \(O(n \log n)\) by using a better data structure than a plain array: maintain, for each possible subsequence length, the smallest element that can end a subsequence of that length, and locate positions with binary search. Storing a predecessor array and its indices additionally allows the subsequence itself to be reconstructed.

A similar concept can be applied to finding the longest path in a Directed Acyclic Graph (DAG).
Famous DP Problems
 Floyd-Warshall Algorithm - Tutorial and C Source
 Integer Knapsack Problem - Tutorial and C Source
 Longest Common Subsequence - Tutorial and C Source
Online Resources
 MIT 6.006: Video Lectures - Lessons 19, 20, 21, 22
 LeetCode Dynamic Programming Patterns
 TopCoder: Dynamic Programming from Novice to Advanced
 CodeChef
 InterviewBit
 GeeksForGeeks: How to write DP solutions
Quiz
 A sample quiz to test your knowledge.