Problem Statement:

Given an array \(A\) of unsorted integers and an integer \(C\), find a contiguous subsequence of \(A\) that sums to \(C\).

Firstly, remember that a contiguous subsequence is a subarray: the elements have to appear consecutively and in order. For example, the answer for `A = [1,2,-3,1], C = 0` would be either `[1,2,-3]` or `[2,-3,1]`.

The naive solution is to check all possible contiguous subsequences. Take the first element and check every subsequence starting with it - there are \(n\) of them. Repeat for the second element (\(n-1\) subsequences), and so on. The total number of subsequences is \(n + (n-1) + (n-2) + \dots + 1 = \frac{n(n+1)}{2}\), so by keeping a running sum as each subsequence is extended, the whole search takes \(O(n^2)\) time.
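As a sketch of this brute-force approach (the function name is mine, not from the original post):

```python
def find_subarray_naive(A, C):
    """Try every starting index; extend a running sum to the right.

    O(n^2) time, O(1) extra space.
    """
    n = len(A)
    for start in range(n):
        total = 0
        for end in range(start, n):
            total += A[end]  # running sum of A[start..end]
            if total == C:
                return A[start:end + 1]
    return None  # no contiguous subsequence sums to C


print(find_subarray_naive([1, 2, -3, 1], 0))  # → [1, 2, -3]
```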

Define a partial sum \(S_n = \sum_{i=0}^{n} A[i]\). It then follows that if \(S_i - S_j = C\) with \(i > j\), the subsequence \(A[j+1], A[j+2], \ldots, A[i]\) sums to \(C\).
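To make this concrete with the example array, the partial sums can be computed in one pass (a small helper of my own for illustration):

```python
def partial_sums(A):
    """Return [S_0, S_1, ..., S_{n-1}] where S_i = A[0] + ... + A[i]."""
    sums = []
    total = 0
    for x in A:
        total += x
        sums.append(total)
    return sums


# For A = [1, 2, -3, 1] the partial sums are [1, 3, 0, 1].
# S_3 - S_0 = 1 - 1 = 0 = C, so A[1..3] = [2, -3, 1] sums to 0.
print(partial_sums([1, 2, -3, 1]))
```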

In code, this means we traverse the array while keeping a hashmap from each partial sum to the first index at which it appears. If at any point we see that \(curSum - C\) was encountered before, we have found our subsequence. Seeding the map with the sum \(0\) at index \(-1\) handles subsequences that start at the beginning of the array.

In Python code:
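The linked gist isn't reproduced here; a minimal sketch of the hashmap approach described above (function and variable names are my own) might look like this:

```python
def find_subarray(A, C):
    """Find a contiguous subsequence of A summing to C in O(n) time.

    seen maps each partial sum to the first index where it occurs;
    seen[0] = -1 covers subsequences starting at index 0.
    """
    seen = {0: -1}
    cur_sum = 0
    for i, x in enumerate(A):
        cur_sum += x
        if cur_sum - C in seen:
            # A[j+1..i] sums to C, where j = seen[cur_sum - C]
            return A[seen[cur_sum - C] + 1 : i + 1]
        seen.setdefault(cur_sum, i)  # keep the earliest index only
    return None


print(find_subarray([1, 2, -3, 1], 0))  # → [1, 2, -3]
```

This trades the \(O(n^2)\) scan for \(O(n)\) time at the cost of \(O(n)\) extra space for the map.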

Also available as a gist