I'm trying to implement the longest common subsequence algorithm in Ruby, but I'm getting a "stack level too deep" error. I know this probably means I have infinite recursion, but I cannot spot it. Below is my best attempt; where am I going wrong?

def lcs(string1, string2)
  if !(string1.chars.any? { |x| string2.include?(x)})
    return ""
  elsif string1[-1] == string2[-1]
    return lcs(string1[0..-2], string2[0..-2]) + string1[-1]
  else
    x = lcs(string1, string2[0..-1])
    y = lcs(string1[0..-1], string2)
    x.length > y.length ? x : y
  end
end

n.b. I'm trying to return the subsequence itself, not its length.

achalk

1 Answer

Try it with a pair of sample strings, say abcd and bc, and step through it.

You'll realise that these two lines have issues:

x = lcs(string1, string2[0..-1])
y = lcs(string1[0..-1], string2)

Because 'abcd'[0..-1] -> 'abcd', the moment execution reaches the x = ... line it calls lcs again with exactly the same arguments, so the recursion never makes progress and eventually blows the stack.

It should be [0..-2] instead.
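You can confirm the difference quickly in irb:

"abcd"[0..-1]  # => "abcd"  (the whole string, so nothing shrinks)
"abcd"[0..-2]  # => "abc"   (drops the last character)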

Also, you can replace if !(string1.chars.any? { |x| string2.include?(x) }) with the more readable if string1.chars.none? { |x| string2.include?(x) }.
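Putting both changes together, here is a sketch of the corrected method (your original logic, just with the [0..-2] slices and none?):

def lcs(string1, string2)
  # Base case: no character of string1 appears anywhere in string2
  if string1.chars.none? { |x| string2.include?(x) }
    ""
  elsif string1[-1] == string2[-1]
    # Last characters match: they are part of the LCS, recurse on the rest
    lcs(string1[0..-2], string2[0..-2]) + string1[-1]
  else
    # Last characters differ: drop the last character of one string at a time
    x = lcs(string1, string2[0..-2])
    y = lcs(string1[0..-2], string2)
    x.length > y.length ? x : y
  end
end

lcs("abcd", "bc")  # => "bc"

Every recursive call now shortens at least one of the strings, so the recursion bottoms out at the empty-string base case instead of overflowing the stack.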

Ho Man