The wording of the problem is not quite precise.
What the author means is that when partitioning the array, the order of elements stays the same. Man, I was solving a totally different thing...
DP isn't needed at all here. I wasted a lot of time implementing a DP approach with a table before realising it's pointless.
As we only ever take the first possible cut, and thus only ever have one choice of cut, we never re-use our solutions (re-using subproblem results is the whole point of DP), making a DP approach pointless.
Let k be the location of our cut, so that we have two arrays, a[i]..a[k] and a[k+1]..a[j].
Let opt(i, j) denote our solution for the array running from i to j.
Our solution is given by opt(i, j) = max(opt(i, k), opt(k+1, j)).
It should be clear now that, because k takes on only one value for a particular i and j, with i <= k < j, we never hit opt(i, j) more than once, and the recursion forms a simple binary tree structure.
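The recursion above can be sketched in Python. This is a hypothetical reconstruction from the comments in this thread, assuming the problem is: repeatedly split a contiguous array into two parts of equal sum, score one point per split, and recurse into either part (hence the `1 +` per successful cut, which the formula above leaves implicit). The function name is mine, not from the editorial.

```python
def max_score(a):
    """Max number of equal-sum splits, taking the first valid cut each time.

    Assumes all elements are non-negative.
    """
    total = sum(a)
    if total == 0:
        # All-zeros case: every cut is valid and each split scores a point,
        # so the answer is simply len(a) - 1. Handling it directly avoids
        # deep recursion on long runs of zeros.
        return len(a) - 1
    if total % 2:
        return 0  # no equal-sum cut can exist
    prefix = 0
    for k in range(len(a) - 1):
        prefix += a[k]
        if prefix * 2 == total:
            # First (and, for positive elements, only) valid cut:
            # recurse on both halves and keep the better one.
            return 1 + max(max_score(a[:k + 1]), max_score(a[k + 1:]))
    return 0
```

The slicing here copies subarrays for clarity; an index-based version over a shared array avoids the copies.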
I am not sure about the first claim of the editorial. Why can we only consider the first valid split?
HINT for those getting timeouts: watch out for the all-zeros case.
The best solution (O(n log n)) has nothing to do with dynamic programming.
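One way an O(n log n) version might look: precompute prefix sums once and binary-search for the cut point in each subarray. This sketch assumes non-negative elements (so prefix sums are non-decreasing and `bisect` applies); the structure is my guess at the intended approach, not the editorial's code.

```python
from bisect import bisect_left

def max_score_nlogn(a):
    """Equal-sum splitting via prefix sums and binary search.

    Assumes all elements are non-negative, so `prefix` is non-decreasing.
    """
    # prefix[i] = sum of a[0..i-1]
    prefix = [0]
    for x in a:
        prefix.append(prefix[-1] + x)

    def solve(i, j):  # half-open range [i, j)
        total = prefix[j] - prefix[i]
        if total == 0:
            return j - i - 1  # all zeros: every cut is valid
        if total % 2:
            return 0
        # Find k with prefix[k] == prefix[i] + total // 2, i < k < j.
        target = prefix[i] + total // 2
        k = bisect_left(prefix, target, i + 1, j)
        if prefix[k] != target:
            return 0  # no equal-sum cut in this range
        return 1 + max(solve(i, k), solve(k, j))

    return solve(0, len(a))
```

Each recursive call does O(log n) work, and there are at most O(n) calls in total (each successful split consumes a cut position), giving O(n log n) overall.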