• FishFace@piefed.social · 13 hours ago

    A language’s deficiencies are rarely obvious when everyone is writing it perfectly.

    But a coherent type system gives the programmer confidence - for free. Do you know what [1] + [2] is in JavaScript? Do you know what type it is? JavaScript teaches you that it has operator overloading for built-in types, but then it behaves in such a dumb way that you can’t use it.

    That’s explained by a desire to be extremely lenient, but it’s not justified by it. Programming languages are generally not made by idiots, so every bad decision has an explanation.

    • hperrin@lemmy.ca · 5 hours ago

      I would assume [1] + [2] would give you either 0 or 2, but maybe "12". But why would you ever write that? I’ve never bothered to memorize what happens there because I would never write that. The plus operator is not for arrays; it’s for numbers and strings. If you’re trying to concatenate arrays, there’s a function for that. Would you do that in Java or C? People making JavaScript do silly things just because it refuses to crash, then calling the language bad for it, is just silly to me.
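
      For what it’s worth, here is a quick sketch of those concatenation tools (plain standard JavaScript; the spread form needs ES2015 or later):

          const a = [1];
          const b = [2];

          // Array.prototype.concat returns a new array and never coerces to a string.
          a.concat(b);    // [1, 2]

          // Spread syntax does the same thing.
          [...a, ...b];   // [1, 2]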

      • FishFace@piefed.social · 36 minutes ago

        Operator overloading is a perfectly reasonable language feature to use and to assume works. If it is not going to behave sensibly, it should be an error, not nonsense, because having it work for strings but not for other sequence types is surprising, and surprising is bad.

        As I said, the fact that you didn’t know the result means that JavaScript’s type system is opaque and hard to understand. You might have understood that there are some things you “shouldn’t do” but being hard to understand is a bad aspect of a language even if it doesn’t prevent you from writing correct, good code.

        By way of analogy, think of a language which, like JavaScript, doesn’t require semicolons but accepts them. Except, if you use a semicolon after the last statement in a block, that statement never gets executed. Your reply is like saying, “just don’t use semicolons - they’re not needed” instead of acknowledging that an element of the language which is prone to causing mistakes is bad design.
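
        Real JavaScript even has a pitfall in a similar spirit, via automatic semicolon insertion (a well-known example, runnable in any JS engine):

            function broken() {
              return          // ASI inserts a semicolon here, so this returns undefined
                { value: 42 } // parsed as an unreachable block statement, not a return value
            }

            broken();         // undefined, not { value: 42 }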

        • hperrin@lemmy.ca · 10 minutes ago

          I mean how can you define a sensible way to subtract Infinity from an array, or add an object to a string? The way JavaScript defines it is predictable, easy to compute, and handles bad code gracefully, which is a good tradeoff between doing something like matrix arithmetic on a CPU and just straight up crashing. If you’re doing silly things like that, you should know how JavaScript coerces types, but I don’t do silly things like that, so I don’t really care how JavaScript would handle it. Every language will do silly things if you force it to. That doesn’t make it a bad language.

          Do you feel the same about C because C lets you take pointers of pointers of pointers until you’re addressing random memory in an unpredictable way? No, because it’s silly to do that.

          • FishFace@piefed.social · 1 minute ago

            > I mean how can you define a sensible way to subtract Infinity from an array, or add an object to a string?

            TypeError.

            There are also various sensible ways. For example, if you have an array of floats, subtracting Infinity from the array could yield an array of the same length as the original, with every element being negative Infinity. But in general, inhomogeneous types should not be addable without the careful thought it takes to create a type system which is consistent and coherent. That is exactly what JavaScript did not do.
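
            To make that concrete, here is a sketch of the elementwise option next to what JavaScript actually does (the map call is ordinary JavaScript standing in for a hypothetical built-in behaviour):

                const xs = [1.5, 2.5, 3.5];

                // The "sensible" elementwise reading, spelled out by hand:
                xs.map(x => x - Infinity);  // [-Infinity, -Infinity, -Infinity]

                // What JavaScript actually does: coerce the array to the string
                // "1.5,2.5,3.5", then to a number, which fails, giving NaN.
                xs - Infinity;              // NaN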

            It doesn’t “handle bad code gracefully”; it handles it in a way that’s hard to reason about and hence hard to work with.

            > The way JavaScript defines it is predictable

            You literally just failed to predict it, so I don’t think there’s any point continuing this conversation.
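
            For the record, the actual results, easy to verify in any JavaScript console:

                [1] + [2]        // "12": each array coerces to a string ("1", "2"), which then concatenate
                [1, 2] + [3, 4]  // "1,23,4"
                [] + []          // "" (the empty string)

            The right answer was among the three guesses above, but only as one of three, which is the point: the result has to be memorized from the coercion rules, not inferred from a coherent design.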