Yeah, I’ve seen a lot of those videos where they do things like {} + [], but why would anyone care what JS does in that case? Unless you’re a shit-ass programmer, you’re never going to be running code like that.
The idea behind that kind of type conversion was that JS was originally designed to be extremely lenient. If a script ever crashed, the web page would freeze, so JS lets you do things that would crash other languages outright, like dividing by zero.
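You can see that leniency for yourself; paste this into any browser console (nothing here is project-specific):

    // Division by zero doesn't throw; JS numbers are IEEE 754 doubles.
    console.log(1 / 0);  // Infinity
    console.log(0 / 0);  // NaN
    // Many other languages raise an error or crash here instead.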
A language’s deficiencies are rarely obvious when everyone is writing it perfectly.
But a coherent type system gives the programmer confidence - for free. Do you know what [1] + [2] is in JavaScript? Do you know what type it is? JavaScript teaches you that it has operator overloading for built-in types but then it behaves in such a dumb way you can’t use it.
That’s explained by a desire to be extremely lenient, but it’s not justified by it. Programming languages are generally not made by idiots, so every bad decision has an explanation.
I would assume [1] + [2] would give you either 0 or 2, but maybe "12". But why would you ever write that? I’ve never bothered to memorize what happens there, because I would never write that. The plus operator is not for arrays. It’s for numbers and strings. If you’re trying to concatenate arrays, there’s a function for that. Would you do that in Java or C? Making JavaScript do silly things just because it refuses to crash, and then calling the language bad for it, is just silly to me.
Operator overloading is a perfectly reasonable feature for a language to have, to make use of, and to assume works. If it is not going to behave sensibly, it should be an error, not nonsense: having it work for strings but not for other sequence types is surprising, and surprising is bad.
As I said, the fact that you didn’t know the result means that JavaScript’s type system is opaque and hard to understand. You might have understood that there are some things you “shouldn’t do” but being hard to understand is a bad aspect of a language even if it doesn’t prevent you from writing correct, good code.
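For the record, here is what actually happens; you can check any of it in a console:

    // Each array is converted to a primitive (via toString), then + concatenates.
    console.log([1] + [2]);          // "12"
    console.log(typeof ([1] + [2])); // "string"
    console.log([1, 2] + [3]);       // "1,23"
    // The actual function for joining arrays:
    console.log([1].concat([2]));    // [1, 2]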
By way of analogy, think of a language which, like JavaScript, doesn’t require semicolons but accepts them. Except, if you use a semicolon after the last statement in a block, that statement never gets executed. Your reply is like saying, “just don’t use semicolons - they’re not needed” instead of acknowledging that an element of the language that is prone to causing mistakes is bad design.
I mean, how can you define a sensible way to subtract Infinity from an array, or add an object to a string? The way JavaScript defines it is predictable, easy to compute, and handles bad code gracefully, which is a good tradeoff between doing something heavyweight like element-wise array arithmetic and just straight-up crashing. If you’re doing silly things like that, you should know how JavaScript coerces types, but I don’t do silly things like that, so I don’t really care how JavaScript would handle it. Every language will do silly things if you force it to. That doesn’t make it a bad language.
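And those rules are written down; every one of those cases can be checked in a console:

    // "Subtract Infinity from an array": ToPrimitive, then ToNumber, then float math.
    console.log([] - Infinity);      // -Infinity  ("" coerces to 0)
    console.log([1, 2] - Infinity);  // NaN        ("1,2" isn't a number)
    // "Add an object to a string": ToPrimitive, then string concatenation.
    console.log({} + "!");           // "[object Object]!"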
Do you feel the same about C because C lets you take pointers of pointers of pointers until you’re addressing random memory in an unpredictable way? No, because it’s silly to do that.
> I mean how can you define a sensible way to subtract Infinity from an array, or add an object to a string?
TypeError.
There are also various sensible ways. For example, if you have an array of floats, subtracting Infinity from the array could result in an array of the same length as the original, with each value being negative Infinity. But in general, inhomogeneous types should not be addable without careful thought given to creating a type system that is consistent and coherent. That is exactly what JavaScript did not do.
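To sketch what I mean in plain JS (subtractScalar is a hypothetical helper, not something the language ships with):

    // One coherent option: define array-minus-number element-wise, and make
    // anything else a TypeError instead of silent coercion.
    function subtractScalar(array, scalar) {
      if (!Array.isArray(array) || typeof scalar !== "number") {
        throw new TypeError("expected (number[], number)");
      }
      return array.map((x) => x - scalar);
    }

    console.log(subtractScalar([1.5, 2.5], Infinity)); // [-Infinity, -Infinity]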
It doesn’t “handle bad code gracefully”; it handles it in a way that’s hard to reason about and hence hard to work with.
> The way JavaScript defines it is predictable
You literally just failed to predict it, so I don’t think there’s any point continuing this conversation.
> Yeah, I’ve seen a lot of those videos where they do things like {} + [], but why would anyone care what JS does in that case? Unless you’re a shit-ass programmer, you’re never going to be running code like that.
By this same logic, memory safety issues in C/C++ aren’t a problem either, right? Just don’t corrupt memory or dereference null pointers. Only “a shit-ass programmer” would write code that does something like that.
Real code has complexity. Variables are written to and read from all sorts of places, and if you have to audit several functions deep to make sure that no variable gets set to some special value like that, then that’s a liability of the language that you will always have to work around carefully.
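A small, hypothetical example of how that plays out:

    // The mistake (a missing field) doesn't fail where it happens...
    const user = { firstName: "Ada" };          // lastName never set
    const greeting = "Hello, " + user.lastName; // no error here
    console.log(greeting);                      // "Hello, undefined"
    // ...it surfaces later, far from the line that caused it.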
No.
By that same logic, memory safety issues in C/C++ don’t make them bad programming languages.
If you’re worried about it, like when you’re accepting input from the user, sanitize it:

    if (typeof userProvidedData !== "string") {
      throw new Error("Only works on strings.");
    }

Better yet, put that in a function called assertString.
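Something like this (a sketch; userProvidedData stands in for whatever input you received):

    function assertString(value) {
      // Centralize the check so every call site reads the same way.
      if (typeof value !== "string") {
        throw new Error("Only works on strings.");
      }
      return value;
    }

    const userProvidedData = "Ada"; // stand-in for real input
    const name = assertString(userProvidedData);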