The Sad State of Web Development

Does that image really suggest Ruby invented arrow functions?

Every now and then I forget what it’s like to develop non-meteor node applications.
So every now and then I give it a try and by the end of the day I’m suicidal. Crying, I go back to Meteor. If it was really bad, even Meteor won’t do and I need to go back to good old C# for a while (which is very awesome these days). From there I try to forget about the crazy Node world that’s out there, until the next time I think I can handle it and make the same mistake again.

All of this to say I understand the mental state the guy must have been in when he wrote this blog post and for that I’m ready to forgive the huge exaggerations and half truths in there.
Node is just a very young project with huge growing pains. I’m an optimist so I hold on to what’s nice about it and believe the rest will get better. And it actually seems to get better quite fast.

4 Likes

You see the Node.js philosophy is to take the worst fucking language ever designed and put it on the server.

I think it was the universal prevalence of JavaScript in browsers that led to the choice to use it.

But I agree that JavaScript is a bad language, especially after having come from years of ASP.NET. We are spending so much time building things in a language where, to this day, 0.1 + 0.2 - 0.3 === 0 evaluates to false. I still sometimes try it in my browser console and shake my head at the insanity.
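You can see it directly in Node or any browser console:

```javascript
// IEEE-754 double precision: 0.1, 0.2 and 0.3 are not exactly representable in binary
console.log(0.1 + 0.2);             // 0.30000000000000004
console.log(0.1 + 0.2 - 0.3);       // 5.551115123125783e-17 (tiny, but not zero)
console.log(0.1 + 0.2 - 0.3 === 0); // false
```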

2 Likes

I think it was the universal prevalence of JavaScript in browsers that led to the choice to use it.

From what I’ve read/watched, Dahl created Node because of its asynchronous nature and event loop. Specifically, he was inspired by how annoying it is to implement a real upload progress bar.

Here’s an excerpt from an interesting article in which Ryan Dahl explains his motivation for making node

“Turns out, a lot of the frameworks were designed in a way that they made the assumption a request — response is something that happens instantaneously and that your entire web development experience should be abstracted as a function. You get a request, you return a response. That is the extent of your context.”

“Node was originally born out of this problem — how can you handle two things at the same time? Non-blocking sockets is one way. Node is more or less the idea, or exploring the idea: what if everything was non-blocking? What if you never waited for any IO to happen?”

“For example, the non-blocking IO: what falls out of that? And pairing that with JavaScript, it turns out you can make a web upload progress bar with this, among other things.”

It’s not a bad language, it’s just misunderstood.

I think perhaps you don’t know enough about other languages, because floating point equality checks are insane in all languages, .NET included; that’s why it introduced decimal, to make things a bit better.

see …


and google for more examples in nearly any language that supports floating point. Some languages come up with their own ways of storing numbers that make things better, especially for storing things like money (never do that with a float).

So why would anyone use the insane floating point numbers? Because the CPU computes them natively, and for many applications they work really well. If you ever try == with a native floating point number, you are doing the wrong thing :slight_smile:
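The usual alternative to == is an epsilon comparison; a minimal sketch (the helper name is my own):

```javascript
// Compare floats within a tolerance instead of exactly.
// Number.EPSILON is the gap between 1 and the next representable double,
// scaled here to the magnitude of the operands.
function nearlyEqual(a, b, eps = Number.EPSILON) {
  return Math.abs(a - b) <= eps * Math.max(1, Math.abs(a), Math.abs(b));
}

console.log(0.1 + 0.2 === 0.3);           // false
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```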

2 Likes

I’ve been programming since around 10 years of age and do know about other languages.

I use decimal in .NET and BigDecimal in Java for monetary calculations. They’ve been around for almost a decade. JavaScript has been around since 1995 (over 2 decades), yet we still don’t have a type other than imprecise floating point for non-integer numerical calculations. I’d think that’s one reason to call it a bad language. Yet now here we are trying to build serious applications with it (or at least I am).

1 Like

I know, which is why I’ve been using BigNumber, for all monetary calculations, but it’s such a hassle and it’s not compatible with some other packages that I need.
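When pulling in a library isn’t workable, one common fallback is to keep monetary amounts in integer minor units (cents) and only convert at the display edge. A minimal sketch, with helper names of my own:

```javascript
// Integer cents stay exact as long as values fit in the safe integer range.
const toCents = (dollars) => Math.round(dollars * 100);
const formatCents = (cents) => (cents / 100).toFixed(2);

// 0.1 + 0.2 drifts in floats, but 10 + 20 cents does not:
const total = toCents(0.1) + toCents(0.2); // 30
console.log(formatCents(total));           // "0.30"
```

The trade-off is that division (splitting a bill, applying a percentage) still needs an explicit rounding policy, which is exactly what the decimal libraries handle for you.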

Indeed, I don’t understand why people keep repeating the 0.1 + 0.2 != 0.3 problem. That’s the very basics of numerical analysis, which is included in about every CS program around the world.

I think it’s even worse if the language tries to help you. In PHP, for example, 0.1 + 0.2 will print as 0.3 (the display rounds 0.30000000000000004), but ceil((0.1 + 0.2) * 10) will give you 4, which IMO is more confusing, since you need to think about it only some of the time instead of all of the time.
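JavaScript has the same ceil gotcha, just without the friendly display rounding:

```javascript
console.log((0.1 + 0.2) * 10);             // 3.0000000000000004
console.log(Math.ceil((0.1 + 0.2) * 10));  // 4, not the 3 you might expect
console.log(Math.round((0.1 + 0.2) * 10)); // 3 — round absorbs the tiny error, ceil does not
```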

1 Like

But those are statically typed languages. Are there any dynamically typed languages that solve this?
Well, maybe those that have operator overloading make it easier…
But this is often used to diss JavaScript, even though it’s true of many languages.

Arguably the fact that JavaScript is dynamically typed may be a more general reason why it’s bad. It may provide flexibility, but that can come at a later cost. See The Case Against Dynamic Typing.

Now we have Aldeed’s SimpleSchema and Meteor’s ValidatedMethod that introduce some type-checking functionality, but they would not have been necessary if JavaScript were statically typed.

ASP.NET and Java don’t try to “help you”. As long as you declare the variables as decimal or BigDecimal, any monetary calculations will automatically be precise. That saves time and mental energy for other, more important work.

So what is the “best-practice” solution to have accurate monetary and foreign currency exchange rate calculations using JavaScript?

I’m finding dynamic typing a very hard thing to come to terms with after having spent 15 years working in strongly typed languages, but when I look at the hoops I have to jump through in C# to get code fully testable (implementing interfaces that will only ever be used once just so you can mock something and do dependency injection for example) I do wonder whether strong typing, and even OO, might actually be the wrong solution altogether.

I haven’t got as far as testing in JS yet, but it seems like it should be simpler to get tests around things in JS, as you won’t need to create endless interfaces. Although, I guess you will potentially need a lot more tests to deal with what happens if your arguments don’t have the properties/methods expected.

Still, as far as strong typing goes, as was said earlier, the solution to that is probably TypeScript. If it’s good enough for Google…

1 Like

TypeScript introduces the “number” data type. The joke is that it’s just the same floating point type used by JavaScript, so it inherits its imprecision.
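A quick check confirms it: annotating with `number` changes nothing at runtime.

```typescript
// TypeScript's `number` compiles straight to a JavaScript IEEE-754 double,
// so the static annotation adds no precision whatsoever.
const a: number = 0.1;
const b: number = 0.2;

console.log(a + b);         // 0.30000000000000004
console.log(a + b === 0.3); // false — same imprecision as plain JavaScript
```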

The fact that TypeScript even exists with a growing user base and is now being used by Google for Angular 2 (when previously they just used JavaScript) adds weight to the notion that JavaScript itself is a bad language and needed a lot of fixing.

2 Likes

But you can’t just say a language is bad. You can say a language is bad for a specific use case. Many people forget the “script” in JavaScript. For the use cases it was designed for, it does a pretty good job.
If you’re working with 20 developers on a huge financial business application, then yeah, javascript is bad for the job. But if you’re doing that, then it’s rather you who’s bad at making technology choices. Granted, in a web front-end you don’t have much choice.

1 Like

Oh that’s genius :facepalm:

And therein lies the problem, I guess. Wrong tool for the job, but also the only one that can be used.

It was designed by Netscape for front-end web page scripting. Whether it did a good job is questionable. What can we expect from a language that was developed in just 10 days? Did you program in JavaScript in the dark ages when we had to always test using at least 2 different browsers? It was annoying to have to write code that would work on all of them. We had to often write workarounds just so it would work in a particular browser even if it worked fine in the others. I hated it. Yet now I’m deeply immersed in JavaScript for what it was not originally designed for - server-side scripting with high-frequency database access and monetary calculations.

That’s right - it’s a bad language yet we are all stuck with it because it’s the only standard scripting language with the universal browser support. This is one of the points contributing to “the sad state of web development”.

@tab00 have you looked into the Elm language at all? Its strong typing and ML syntax remind me of F# for the browser. The only con is that it doesn’t interop as nicely as TypeScript.

1 Like

You are mixing up the language with the ecosystem and making judgments on wrong premises.

  1. Watch this
    - YouTube

  2. Use this (or any other)
    http://mathjs.org/
    ( type 0.1 + 0.2 != 0.3 into the demo )

  3. Use TypeScript
    http://www.typescriptlang.org/

Eventually DEC64 will make it into ES, but also, asm.js and WebAssembly are coming. That will further blur the line between subcomponent routines and the application-layer glue for which JS is well known.

1 Like

Does TypeScript have a decimal type appropriate for monetary calculations? I could only find the number type in the language specification.

I know that there are libraries / packages in the JavaScript / TypeScript ecosystem. Maybe you missed my post above in which I said that I chose to use BigNumber. So I have not mixed up the languages with the ecosystem.

My point is that it would have been better to have a numerical data type appropriate for precise monetary calculations as part of the language instead of needing to rely on libraries or packages.

Crockford says that using the floating point type that JavaScript now has was the wrong language design choice because of the imprecision, but it was done because other programming languages at the time used it too. But around a decade ago Java added BigDecimal and ASP.NET added decimal to deal with the imprecision problem. Nothing was done to solve it in JavaScript and it appears that TypeScript has not solved it either.

His DEC64 solution could be great, but we don’t know how long it will take to become available for us to use.