Recently I came across an interview question. I tried solving it, but the interviewer was looking for a better solution. The question is:
Given a source array containing all zeroes and a target array containing numbers, return the smallest number of operations needed to obtain the target from the source. Only one kind of operation is allowed: in a single operation, you can increase every element of the source array from index L to index R by 1.
My thoughts:
Let the array be [4,2,3,5]
cnt=0;
For each maximal sub-array consisting only of non-zero elements, subtract the minimum of that sub-array from each of its elements. The count increases by the sum of those minimums, since each unit of a minimum corresponds to one L..R operation covering that sub-array.
So the array evolves as follows:
After the first step: [2,0,1,3], cnt += 2 (only one sub-array, with minimum 2)
After the second step: [0,0,0,2], cnt += 2 + 1 (two sub-arrays, [2] and [1,3], with minimums 2 and 1)
After the third step: [0,0,0,0], cnt += 2, giving cnt = 7 in total
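To make the approach concrete, here is a minimal C++ sketch that simulates exactly this process (the function name countOps and the overall structure are my own, just for illustration):

    #include <algorithm>
    #include <iostream>
    #include <vector>

    // Simulates the approach above: repeatedly find each maximal
    // sub-array of non-zero elements, subtract its minimum from every
    // element of that sub-array, and add that minimum to the count
    // (each unit of the minimum is one L..R increment operation).
    long long countOps(std::vector<int> a) {
        int n = (int)a.size();
        long long cnt = 0;
        bool changed = true;
        while (changed) {
            changed = false;
            for (int i = 0; i < n; ) {
                if (a[i] == 0) { ++i; continue; }
                // [i, j) is a maximal sub-array of non-zero elements
                int j = i, mn = a[i];
                while (j < n && a[j] != 0) { mn = std::min(mn, a[j]); ++j; }
                for (int k = i; k < j; ++k) a[k] -= mn;  // subtract the minimum
                cnt += mn;  // mn operations, each covering this whole sub-array
                changed = true;
                i = j;
            }
        }
        return cnt;
    }

    int main() {
        // Example from above: 2 + (2 + 1) + 2 = 7 operations
        std::cout << countOps({4, 2, 3, 5}) << '\n';
    }

This is only a simulation of the idea, roughly O(n^2) in the worst case, so it is the baseline I am trying to improve on rather than the faster solution the interviewer wanted.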
Can someone help me find a better algorithm? I was also wondering whether a segment tree / binary indexed tree could be used here, but I couldn't come up with a solution.