Using a Greedy Approach in Python to Solve the Longest Increasing Subsequence Problem

I’m trying to solve a longest-increasing-subsequence problem using a greedy approach in Python, following the algorithm outlined in this reference. I’ve written some code to find the longest increasing subsequence of a given array, but I’m getting the wrong result. I’m not sure whether my code is incorrect or whether I’m misunderstanding something about the algorithm. Here’s my code:

def LIS(arr):
    n = len(arr)
    lis = [1] * n
    # Compute LIS values in a bottom-up manner
    for i in range(1, n):
        for j in range(0, i):
            if arr[i] > arr[j] and lis[i] < lis[j] + 1:
                lis[i] = lis[j] + 1
    # Pick the maximum of all LIS values
    maximum = 0
    for i in range(n):
        maximum = max(maximum, lis[i])
    return maximum

# Test case
arr = [10, 22, 9, 33, 21, 50, 41, 60]
print(LIS(arr))

Any help would be much appreciated! Thanks!

Start by trying to solve a simpler problem. How would you find the length of the increasing subsequence that starts at the first element? (Hint: would you compare every element no matter what, or would you stop at some point? Would you keep comparing the elements to the first element, or would you compare them in some other way?)
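
If you want something concrete to check your thinking against later, here’s a minimal sketch of that simpler problem (the function name increasing_from_first is just for illustration). It greedily builds an increasing subsequence starting at arr[0] by comparing each element to the last element taken so far, not to arr[0]:

def increasing_from_first(arr):
    # Greedily extend a subsequence that starts at arr[0]:
    # compare each later element to the last element taken,
    # and take it only when it is larger.
    if not arr:
        return 0
    last = arr[0]    # last element taken so far
    length = 1       # arr[0] itself counts
    for x in arr[1:]:
        if x > last:
            last = x
            length += 1
    return length

print(increasing_from_first([10, 22, 9, 33, 21, 50, 41, 60]))  # 5 (10, 22, 33, 50, 60)

Note that this greedy pass finds an increasing subsequence starting at the first element, but not necessarily the longest one: on [1, 100, 2, 3, 4] it returns 2, even though [1, 2, 3, 4] has length 4. That gap is exactly what’s worth thinking about before committing to a purely greedy solution.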
