I'm trying to evaluate the expression (1 <= month <= 12) in an if condition. This expression seems valid in JavaScript, but not in Java.
In Java:
int month = 0;
boolean flag = (1 <= month <= 12);
It throws the following error:
The operator <= is undefined for the argument type(s) boolean, int
In JavaScript:
var month = 0;
console.log('Expression evaluates to: ', (1 <= month <= 12));
It always returns true, no matter what the value of month is.
Can someone please explain why it always evaluates to true in JavaScript? Also, I know I can get it to work this way: (1 <= month && month <= 12). So I'm not looking for a solution but an explanation.
Thanks. Also, let me know if my question is not clear.
The <= operator doesn't chain the way it does in mathematical notation, so you can't use it by repetition. You have to write the range check as:
1 <= month && month <= 12
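A minimal sketch of the intended range check in an if condition (assuming month holds the value to validate):
var month = 0;
if (1 <= month && month <= 12) {
  console.log('valid month');
} else {
  console.log('out of range'); // printed here, since month is 0
}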
The reason is that <= is left-associative in JavaScript, so the parser reads 1 <= month <= 12 as:
(1 <= month) <= 12
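You can see this in a console by spelling out the two steps (a minimal sketch, using month = 0 as in the question):
var month = 0;
var step1 = 1 <= month;   // false, because 1 <= 0 is false
var step2 = step1 <= 12;  // true: false is coerced to the number 0, and 0 <= 12 holds
console.log(step1, step2); // false true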
It's a consequence of JavaScript's grammar; it could have been defined otherwise, but that would complicate matters a bit. Most grammars define expressions roughly as:
expr -> [0-9]+
expr -> identifier
expr -> expr '<=' expr
(with an LALR parser).
And Java uses the following (approximate) grammar:
expr -> numExpr '<=' numExpr
expr -> numExpr
numExpr -> identifier
numExpr -> [0-9]+
(...and so on...)
Under this grammar it is thus impossible in Java even to parse such an expression. (In Java's real grammar it does parse, but the type checker then rejects it, because <= is undefined for a boolean and an int, which is exactly the compile error above. And Java has no boolean-to-int cast, so you cannot turn the boolean back into a numExpr the way JavaScript's coercion does; the closest you can get is a ternary like (1 <= month ? 1 : 0).)
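A minimal Java sketch contrasting the two (the ternary is only there to mimic JavaScript's implicit boolean-to-number coercion; it is not something you would write in real code):
int month = 0;
// boolean broken = (1 <= month <= 12); // compile error:
// "The operator <= is undefined for the argument type(s) boolean, int"
boolean likeJs = (1 <= month ? 1 : 0) <= 12;  // always true, like the JavaScript result
boolean intended = 1 <= month && month <= 12; // false for month = 0: the real range check
System.out.println(likeJs + " " + intended);  // prints: true false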
For the JavaScript part: why does it always return true? (1 <= month) is a boolean, and when that boolean is compared with 12 it is coerced to a number: true becomes 1 and false becomes 0. Since both 0 and 1 are always less than or equal to 12, the whole expression is true no matter what month is. Only a few programming languages (Python, for instance) support chained comparisons as a real range check.
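A minimal sketch that checks this over a range of values, confirming the expression is true no matter what month is:
for (var month = -2; month <= 14; month++) {
  // 1 <= month yields true or false, which is coerced to 1 or 0 before the second <=
  console.log(month, 1 <= month <= 12); // logs true for every single month
}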