
Today I came across an interesting piece of code. It's more of a theoretical question about the Ruby parser. We know everything in Ruby is an object and every expression evaluates to something, at worst nil. But how is the following "assignment" parsed:

somevar
 NameError: undefined local variable or method 'somevar' for main:Object
somevar = "test" if false
 => nil
somevar
 => nil

As you can see, the variable is undefined until it appears in an assignment. But the assignment itself never happens because of the condition. Or does it happen, with the condition evaluating to nil? I tried something that would break in that case, but it just works:

a = {}
a[1/0]
 ZeroDivisionError: divided by 0
a[1/0] = "test" if false
 => nil

So is this meant to work the way it does? Or does it make sense to test the variable with defined?(somevar) before accessing it, in case a future version of Ruby breaks this behaviour, for example by no longer reserving a slot for the never-assigned variable? I'm currently using Ruby 3.0.2.

JohnRoot

2 Answers


This is expected behavior in Ruby. Quote from the Ruby docs:

The local variable is created when the parser encounters the assignment, not when the assignment occurs:

a = 0 if false # does not assign to a

p local_variables # prints [:a]

p a # prints nil
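A minimal variation of the documented example (assumed to run as a fresh top-level script) makes the same point with a branch that is never executed: the parser registers the variable while reading the source, so it shows up in local_variables even though the assignment never runs.

```ruby
# This branch never executes at runtime...
if false
  b = 42
end

# ...but the parser saw `b = 42` and created the local variable.
p local_variables.include?(:b)  # => true
p b                             # => nil
```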
spickermann
  • Thank you :D. Shame on me that I wasn't able to find this part in the docs, even though it contains almost the same example :/ – JohnRoot Oct 19 '21 at 15:00

If you write somevar = "test" if false, the whole expression evaluates to nil and no assignment takes place. But by writing somevar = ... you told the parser to declare the name somevar. The two nils aren't the same thing (if that makes sense).

The [] operator, however, doesn't declare a variable (it only accesses one), and since if false isn't true there is no assignment, so the whole left side isn't evaluated.

Consider:

a = [1,2,3]
a[1] = "test" if false
a
=> [1,2,3]

a[1] is neither nil nor "test".

Not sure what you expect or how a future Ruby version would break this?
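The claim that the left side isn't evaluated can be demonstrated directly. This sketch (the method name idx and the counter are made up for illustration) uses a side effect to prove the index expression never runs when the condition is false:

```ruby
# Global counter so we can observe whether the index
# expression was ever evaluated.
$calls = 0

def idx
  $calls += 1  # side effect: record each evaluation
  0
end

a = [1, 2, 3]
a[idx] = "test" if false  # idx is never called

p $calls  # => 0 – the left side was not evaluated
p a       # => [1, 2, 3] – the array is untouched
```

Contrast this with a plain local variable: there the name is still created at parse time, but here there is no variable to create, so nothing at all happens.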

Max
  • The piece I was missing was the point where the interpreter creates the variable. To explain why I thought it could break in the future: when you define this variable, there is a reference in memory which is never used. It's a very small one, but it could still get "optimized" away. But as spickermann points out, this is expected and documented behaviour, so I can safely use it :D – JohnRoot Oct 19 '21 at 15:02