
I have a problem that requires transforming one string into another by appending copies of its initial value to itself. The rules also allow removing single characters at some positions.

Explanation

let x = "abba"; // First string
let y = "aba";  // Second (initial) string

y("aba") => remove last "a" => y("ab") => y + initialY = "ab" + "aba" =>
y("ababa") => remove char at index 2 => y("abba") => y == x => success

My algorithm successfully solves the problem:

let x = "abbbbcccaaac"
let y = "abc"

let xArr = x.split('')
let yArr = y.split('')

let count = 0;

for (let i = 0; i < xArr.length; i++) {
  // Ran out of characters in y: append another copy of the initial y
  if (yArr[i] == undefined) {
    yArr = yArr.concat(y.split(''));
    count++;
  }
  // Mismatch: remove this character and re-check the same index
  if (xArr[i] != yArr[i]) {
    yArr.splice(i, 1);
    i--;
  }
}

console.log("Output is:", yArr.join(''))
console.log("Appends in order to transform:", count)

The algorithm works as intended; however, I am uncertain about its time and space complexity and, most importantly, its efficiency.

  1. Is this algorithm in O(n) time complexity where n is the length of x?

  2. If this is not O(n), can the problem be solved in O(n) time?

  3. Do .concat(), .splice() or .split() change the time complexity, given that they are nested in a for loop? If they weren't nested, would they still affect the time complexity of an algorithm, and by how much?

  4. Given the rules of this problem, is this an efficient way to solve it?

  5. What is the space complexity of this algorithm?

kaya3
  • Actually, if we really knew the time complexity of splice, split, or concat, then it would obviously affect the overall time complexity of your algorithm. But since we don't, we regard them as constant-time operations, O(1), which in effect keeps the loop at O(n) worst case. The space complexity is O(1), because we are not creating new storage (in this case, count) on every loop iteration; we just update it. IMO it's a fair solution, since I don't know what the constraints for your given problem are. – Caleb Lucas Dec 04 '18 at 00:08
  • @CalebLucas "But since we don't, we would regard them as a constant operation" - should we? Would this problem still be meaningful in that case? Besides, the ECMA specification provides hints as to what they are. – meowgoesthedog Dec 04 '18 at 08:07
  • @meowgoesthedog "should we?" - I feel it depends on one's objective. These functions are optimized at runtime by the JavaScript engine. If the aim is to really dig into the specifics of each function and its application in a given scenario, then sure, considering their time complexity would be paramount. – Caleb Lucas Dec 04 '18 at 08:58
  • @CalebLucas You are right that split(), concat(), etc. have been optimized by the JS engine. However, that doesn't mean those operations have no time/space complexity. There are two answers to this, depending on the intention of the question. If this is for an interview, then we need to consider the time and space complexity of those methods; if it's for an app, you don't need to worry about it. Looking at the question, it seems like an interview question, so the OP needs to consider those things, because the next question will be what the complexity of this algorithm is. – Viral Mar 02 '19 at 18:57

1 Answer


Normally a question like this is quite difficult to give a definite answer to, because different implementations of Javascript have different time complexities for basic array operations (such as creating a new array of size n). Javascript arrays will typically be implemented either as dynamic arrays or hashtables, and these data structures have different performance characteristics.

So, there is no definitive time complexity for splice to remove one element from an array. What we can say is that removing one element takes linear time for a dynamic array, and as @Ry- points out in the comments, also linear time for a hashtable, because of the need to renumber the later indices. We can also say that it's highly likely one of these two data structures is used, and no sensible implementation will take more than linear time to do splice.
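
To make the shifting concrete, here is a small sketch (mine, not from the original post) showing that `splice(i, 1)` renumbers every element after index `i`, which is why it costs linear time on a dynamic array:

```javascript
// splice(1, 1) removes one element and shifts every later element
// down by one index -- O(k) work for the k trailing elements.
const arr = ['a', 'b', 'c', 'd'];
arr.splice(1, 1); // removes 'b'

console.log(arr);        // [ 'a', 'c', 'd' ]
console.log(arr[1]);     // 'c' -- previously at index 2
console.log(arr.length); // 3
```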

Either way, the worst case for your algorithm is when x = 'aa...aa' and y = 'abb...bb', i.e. x is n copies of 'a', and y is 'a' followed by (m - 1) copies of 'b', where n and m are the lengths of x and y respectively.
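
To see the blow-up, one can instrument the algorithm from the question with counters (a sketch; the function wrapper and the counter names are mine):

```javascript
// The question's algorithm wrapped in a function, counting how many
// times splice and concat are called.
function transform(x, y) {
  const xArr = x.split('');
  let yArr = y.split('');
  let appends = 0, splices = 0;

  for (let i = 0; i < xArr.length; i++) {
    if (yArr[i] == undefined) {
      yArr = yArr.concat(y.split(''));
      appends++;
    }
    if (xArr[i] != yArr[i]) {
      yArr.splice(i, 1);
      splices++;
      i--;
    }
  }
  return { appends, splices };
}

// Worst case: x = 'aaaa' (n = 4), y = 'abb' (m = 3).
// Every character of x after the first forces (m - 1) splices,
// so splices = (n - 1)(m - 1) = 6 and appends = n - 1 = 3.
console.log(transform('aaaa', 'abb')); // { appends: 3, splices: 6 }
```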

For a dynamic array or a hashtable, then the time complexity for just the splice operations is O(nm²). This is because the outer loop iterates O(nm) times (note the i-- inside the loop, which happens every time the letter 'b' needs to be removed), and the splice operation requires shifting or renumbering O(m) elements in yArr after index i.

But suppose some more exotic data structure is used which supports removing an element in sub-linear time (e.g. a skip list). In that case, the above only gives O(nm) times the complexity of the "remove" operation. But we haven't counted concat yet; this creates a new data structure and copies every item into it, which will still take linear time. concat is called O(n) times and takes an average of O(n + m) time per call, so the complexity of just the concat operations is O(n² + nm).

So the time complexity is very likely O(n² + nm²), and certainly at least O(n² + nm); not linear.
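
For question 2, a much faster approach exists if the goal is just to count the minimum number of appends: since removals may happen anywhere and each append adds a pristine copy of the initial y, the target x must be a subsequence of y repeated (1 + appends) times, so a greedy two-pointer scan suffices. This is my own sketch, not part of the original answer; it runs in O(nm) in the worst case (each full pass over y matches at least one character of x), and could be pushed further with a precomputed next-occurrence table.

```javascript
// Hypothetical alternative (my sketch): minimum number of appends
// needed so that x is a subsequence of y repeated (1 + appends)
// times. Returns -1 when x contains a character that never occurs
// in y, in which case no number of appends helps.
function minAppends(x, y) {
  const inY = new Set(y);
  for (const c of x) {
    if (!inY.has(c)) return -1;
  }
  let copies = 1; // copies of y consumed so far (the initial string)
  let j = 0;      // read position within the current copy of y
  for (const c of x) {
    while (true) {
      if (j === y.length) { j = 0; copies++; } // start a fresh copy
      if (y[j] === c) { j++; break; }          // match: consume it
      j++;                                     // mismatch: skip ("remove")
    }
  }
  return copies - 1; // appends beyond the initial copy
}

console.log(minAppends('abba', 'aba')); // 1, matching the question's example
console.log(minAppends('aaaa', 'abb')); // 3
```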


The space complexity is O(n), since the length of yArr is never more than twice as long as xArr.

kaya3
  • `splice(i, 1)` isn’t O(1) for an index -> value hashtable, because it has to renumber everything. – Ry- Jan 28 '20 at 05:46