Why does

```kotlin
var y = 5
y.inc()
```

act more like

```kotlin
var y = 5
y + 1
```

than `y++`? The operator function `inc()` maps to `++`, so I would assume the behavior on `Int` would be the same. Why is it different?
Internally, `y++` is turned into something more like:

```kotlin
y = y.inc()
```

That is, `inc()` itself only *returns* the incremented value; the compiler generates the assignment back to `y`. Calling `y.inc()` directly discards the result, which is why it behaves like `y + 1` rather than `y++`. One reason it works this way is so that the increment operator can function on immutable types (`Int` is an example of just such a type). You can read more about how the `++` and `--` operators are overloaded here.
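A minimal sketch of the difference, using only standard Kotlin and the snippets above:

```kotlin
fun main() {
    var y = 5

    y.inc()        // returns 6, but the result is discarded; y itself is not mutated
    println(y)     // prints 5

    y++            // the compiler rewrites this to roughly: y = y.inc()
    println(y)     // prints 6

    val z = y + 1  // like inc(), plus produces a new Int; y is unchanged
    println(z)     // prints 7
}
```

Because `Int` is immutable, neither `inc()` nor `+` can change `y` in place; only the compiler-generated reassignment in `y++` updates the variable.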