var romanToInt = function(s) {
  // Lookup table of Roman symbol values (defined elsewhere in the original solution)
  const symbols = { I: 1, V: 5, X: 10, L: 50, C: 100, D: 500, M: 1000 };
  let value = 0;
  for (let i = 0; i < s.length; i += 1) {
    // Subtract when a smaller symbol precedes a larger one (e.g. the I in IV), otherwise add
    symbols[s[i]] < symbols[s[i + 1]] ? value -= symbols[s[i]] : value += symbols[s[i]];
  }
  return value;
};

This is a LeetCode example. I am confused by the condition symbols[s[i]] < symbols[s[i + 1]]: I don't understand why s[i + 1] doesn't go out of range. Or is it out of range but evaluated as false?

Blacktea

4 Answers


In JavaScript, arrays are objects, so there is no such thing as an out-of-range access on a JavaScript array. Array indexes are stored as object keys. If an index doesn't exist in the array, undefined is returned.

const arr = [1, 2, 3];
console.log(typeof arr);    // output: "object"
console.log(arr[4]);        // output: undefined
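
Since the indexes are just object keys, you can list the ones that actually exist; continuing the snippet above:

console.log(Object.keys(arr));  // output: ["0", "1", "2"]
console.log(4 in arr);          // output: false (index 4 is not a key)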
Yousaf

Array on MDN

JavaScript arrays are zero-indexed: the first element of an array is at index 0, and the last element is at the index equal to the value of the array's length property minus 1.

*Using an invalid index number returns undefined.*

Emphasis added.

In your case, any < or > comparison between a number and undefined evaluates to false, because undefined is coerced to NaN and every comparison involving NaN is false.
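
You can check this quickly in a console (a minimal illustration):

console.log(5 < undefined);  // output: false
console.log(5 > undefined);  // output: false (undefined coerces to NaN, and every comparison with NaN is false)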

crashmstr

JavaScript arrays are just objects, and indexes are keys, like any other property.

You can observe this JavaScript behavior here:

let a = [0, 1, 2];

// Indexes 3 and 4 don't exist, so the last two iterations log undefined
for (let i = 0; i < 5; i++) {
  console.log(a[i]);
}

console.log("Same for other fields:");
console.log(a.foo);  // undefined: the property doesn't exist yet
console.log(a.bar);  // undefined

a.foo = "hi!";

console.log(a.foo);  // "hi!": the property now exists
Pac0

In JavaScript, arrays are not declared with a fixed size (and indexing a string such as s behaves the same way), so s[i + 1] won't be out of range; it will just be undefined. The comparison symbols[s[i]] < symbols[s[i + 1]] therefore evaluates to false, and the addition branch runs for the last symbol.
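
To make that concrete, here is a minimal sketch of the last loop iteration, assuming the usual symbols lookup table from the question:

const symbols = { I: 1, V: 5, X: 10, L: 50, C: 100, D: 500, M: 1000 };
const s = "IV";

// On the last iteration i = 1, so s[i + 1] is s[2], which is past the end of the string
console.log(s[2]);                           // output: undefined
console.log(symbols[s[2]]);                  // output: undefined (symbols[undefined])
console.log(symbols[s[1]] < symbols[s[2]]);  // output: false, so the value of V is added, not subtracted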