Wood Cutting
There is a wooden stick of length L >= 1 that we need to cut into pieces, where the cutting positions are given in an int array A. The positions are guaranteed to be in ascending order, within the range [1, L - 1]. The cost of each cut is the length of the stick segment being cut. Determine the minimum total cost to cut the stick into the defined pieces.
Examples
L = 10, A = {2, 4, 7}: the minimum total cost is 10 + 4 + 6 = 20 (cut at 4 first, which costs 10; then cut the left piece [0, 4] at 2, which costs 4; and cut the right piece [4, 10] at 7, which costs 6).
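For contrast, cutting left to right costs more: cut at 2 (cost 10), then cut at 4 on the remaining piece [2, 10] (cost 8), then cut at 7 on [4, 10] (cost 6), for a total of 10 + 8 + 6 = 24. The order of cuts changes the total cost, which is why we must search over all orders.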
Solution: 2D DP
Key insights:
Cutting the left piece never affects the cost of cutting the right piece, so the two sides are independent subproblems
Treat the cuts as segment boundaries and work with indices only: pad A with 0 on the left and L on the right, so the segment between indices i and j has length A[j] - A[i]
Treat it as a merging problem: the first cut made on a segment corresponds to the last merge of its two halves
Base Case:
Adjacent indices i and i + 1 bound a piece with no cut position inside, so it cannot be cut further: M[i][i + 1] = 0
Inductive Rule:
Iterate through all possible split positions k (i < k < j) of the current segment (i, j), filling the table from the shortest spans to the longest
Cost of the current merge: M[i][j] = (A[j] - A[i]) + min over all k of (M[i][k] + M[k][j]), i.e. the length of the sub-stick created plus the left stick creation cost plus the right stick creation cost
ex: a segment spanning 4 indices has 3 possible split positions
--- -
-- --
- ---
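A minimal Java sketch of this DP, assuming we pad the cut positions with the two stick ends (the class/method names and the pos helper array are illustrative, not from the original notes):

```java
import java.util.Arrays;

public class WoodCutting {
  // Returns the minimum total cost to cut a stick of length L
  // at the positions in cuts (ascending, each within [1, L - 1]).
  public static int minCost(int L, int[] cuts) {
    int n = cuts.length;
    // Pad with the two stick ends: pos[0] = 0, pos[n + 1] = L.
    int[] pos = new int[n + 2];
    pos[0] = 0;
    for (int i = 0; i < n; i++) {
      pos[i + 1] = cuts[i];
    }
    pos[n + 1] = L;

    // M[i][j] = min cost to fully cut the segment between pos[i] and pos[j].
    // Base case: adjacent indices (span 1) hold no cut position, so cost 0.
    int[][] M = new int[n + 2][n + 2];

    // Fill by increasing segment span so smaller subproblems are ready first.
    for (int span = 2; span < n + 2; span++) {
      for (int i = 0; i + span < n + 2; i++) {
        int j = i + span;
        M[i][j] = Integer.MAX_VALUE;
        // Try every position k strictly between i and j as the first cut
        // made on this segment (the last merge undone).
        for (int k = i + 1; k < j; k++) {
          M[i][j] = Math.min(M[i][j], (pos[j] - pos[i]) + M[i][k] + M[k][j]);
        }
      }
    }
    return M[0][n + 1];
  }

  public static void main(String[] args) {
    // Example from above: L = 10, A = {2, 4, 7} -> minimum cost 20.
    System.out.println(minCost(10, new int[]{2, 4, 7})); // prints 20
  }
}
```

Filling the table by increasing span guarantees that M[i][k] and M[k][j] are already computed when M[i][j] needs them.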
Time Complexity: O(N * N / 2) table cells to fill * O(N) split positions checked per cell = O(N^3), where N is the number of cut positions after padding
Space Complexity: the 2D table M[][] = O(N * N) = O(N^2)