About the enshittification of web dev.
Ah, fuck off with these dumbass, utterly vacuous anti-JavaScript rants.
I’m getting so sick of people being like “I keep getting hurt by bullets, clearly it’s the steel industry that’s the problem”.
Your issue isn’t with JavaScript; it’s with advertising and data tracking and profit-driven product managers and everything else that forces developers to churn out bad UXs.
I can build an insanely fast and performant blog with Gatsby or Next.js, have the full power of React to build a modern, pleasant component hierarchy, and still have it be entirely statically rendered and load instantly.
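And before anyone asks how: roughly like this (a minimal sketch using the Next.js pages router; getStaticProps is the real API, but the post list here is made up - in a real blog it would come from markdown files or a CMS):

```tsx
// pages/index.tsx - rendered once at build time, shipped to the browser as plain HTML.
import type { GetStaticProps } from "next";

type Post = { slug: string; title: string };
type Props = { posts: Post[] };

// Runs at build time only; none of this code is sent to the client.
export const getStaticProps: GetStaticProps<Props> = async () => {
  // Hypothetical data source: hard-coded here, but it could be markdown files or a CMS.
  const posts: Post[] = [
    { slug: "hello-world", title: "Hello, world" },
    { slug: "js-is-fine", title: "JS is fine, actually" },
  ];
  return { props: { posts } };
};

export default function Home({ posts }: Props) {
  return (
    <main>
      <h1>My insanely fast blog</h1>
      <ul>
        {posts.map((post) => (
          <li key={post.slug}>
            <a href={`/posts/${post.slug}`}>{post.title}</a>
          </li>
        ))}
      </ul>
    </main>
  );
}
```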
And guess what, unlike the author apparently, I don’t find it a mystery. I understand every aspect of the stack I’m using and why each part is doing what it does. And unlike the author’s tech stack, I don’t need a constantly running server just to render my client’s application and provide basic interactivity on their $500 phone with a GPU more powerful than any that existed 10 years ago.
This article literally says absolutely nothing substantive. It just rants about how websites are less performant and React is complicated, and ignores the reality that if every data-tracking script ran on the backend instead, there would still be performance issues, because those issues exist for the sole reason that those websites do not care to pay to fix them. Full stop. They could fix those performance issues now, while still including JavaScript and data tracking, but they don’t, because they don’t care and never would.
Thank you!
Almost everything the author complains about has nothing to do with JS. The author is complaining about corporate, SaaS, ad-driven web design. It just so happens that web browsers run JavaScript.
In an alternate universe, where web browsers were designed to use Python, all of these same problems would exist.
But no, it’s fun to bag on JS because it has some quirks (as if no other languages do…), so people will use the word in the title of their article as nerd clickbait. Honestly, it gets a little old after a while.
Personally, I think JS and TS are great. JS isn’t perfect, but I’ve written in 5 programming languages professionally, at this point, and I haven’t used one that is.
I write a lot of back end services and web servers in Node.js (and Express) and it’s a great experience.
So… yeah, the modern web kind of sucks. But it’s not really the fault of JS as a language.
Well, JS is horrible, but TS is really pleasant to work with.
Exactly, even if you had no front-end language at all, and just requests to backend servers for static HTML and CSS content, those sites would still suck, because they would ship the first shitty server that made them money out the door and not care that it got overloaded or was coded like garbage.
Now it takes four engineers, three frameworks, and a CI/CD pipeline just to change a heading. It’s inordinately complex to simply publish a webpage.
Huh? I mean, I get that compiling a webpage that includes JS may appear more complex than uploading some unchanged HTML/CSS files, but I’d still argue you should use a build system, because what you want to write and what is best delivered to browsers are usually 2 different things.
Said build systems easily make room for JS compilation in the same way you can compile Sass to CSS and, say, Pug or Nunjucks to HTML. You’re serving 2 separate concerns if you care at all about BOTH optimisation and devx.
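For anyone who hasn’t seen one, a toy version of such a build script might look like this (sketch only: the sass.compile and pug.renderFile calls are the documented APIs, but the file layout is invented, and a real pipeline would add minification, hashing, etc.):

```ts
// build.ts - compile Sass to CSS and a Pug template to HTML, write both to dist/.
import { mkdirSync, writeFileSync } from "node:fs";
import * as sass from "sass";
import pug from "pug";

mkdirSync("dist", { recursive: true });

// Sass -> CSS: authors get variables and nesting, browsers get plain CSS.
const { css } = sass.compile("src/styles/main.scss");
writeFileSync("dist/main.css", css);

// Pug -> HTML: authors get a terse template language, browsers get plain HTML.
const html = pug.renderFile("src/pages/index.pug", { title: "Opening times" });
writeFileSync("dist/index.html", html);

console.log("Built dist/");
```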
Serious old grump or out of the loop vibes in this article.
I straddle the time between dumping HTML and CSS files over SFTP and using a pipeline to deliver content.
The number of times a deployment failed over SFTP vs. CI/CD is like night and day.
You’re always one bad npm package away from annihilation.
Yep.
On the rare occasion I hit a website that loads just like “boom”, it surprises me.
Why is that? Because now we are used to having to wait for javascript to load, decompress, parse, JIT, transmogrify, rejimble and perform two rinse cycles just to see the opening times for the supermarket.
(And that’s after you dismissed the cookie, discount/offer and mailing list nags with obfuscated X buttons and all other manner of dark patterns to keep you engaged)
Sometimes I wish we’d just stopped at gopher :)
See also: https://motherfuckingwebsite.com/
having to wait for javascript to load, decompress, parse, JIT, transmogrify, rejimble and perform two rinse cycles
This whole sentence is facetious nonsense. Just-in-time compilation is not in websites, it’s in browsers, and it was a massive performance gain for the web. Sending files gzipped over the wire has been going on forever, and decompressing them on receipt is nothing compared to the gains in load time. I’m going to ignore the made-up words. If you don’t know, you don’t know. Please don’t confidently make shit up.
EDIT: I’m with you about the nags though. Fuck them nags.
Another continual irritation:
The widespread tendency for JavaScript developers to intercept built-in browser functionality and replace it with their own poor implementation, effectively breaking the user’s browser while on that site.
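The classic version of this is a global click handler that hijacks link navigation for client-side routing (just a sketch; renderRoute here is a made-up placeholder for whatever the app’s router actually does):

```ts
// The anti-pattern: intercept every link click and replace the browser's navigation.
// renderRoute() is hypothetical - it stands in for the app's own routing logic.
declare function renderRoute(url: string): void;

document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement).closest("a");
  if (!link) return;

  event.preventDefault();                // the browser's built-in navigation is now gone
  history.pushState({}, "", link.href);  // fake the URL change
  renderRoute(link.href);                // and hope the app renders the right thing
});

// Collateral damage: middle-click, ctrl+click, "open in new tab", and sometimes
// the back button no longer behave the way the browser intended.
```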
And then there’s the vastly increased privacy & security attack surface exposed by JavaScript.
It’s so bad that I am now very selective about which sites are allowed to run scripts. With few exceptions, a site that fails to work without JavaScript (and can’t be read in Firefox Reader View) gets quickly closed and forgotten.
Having 2 loads gives the illusion that it’s fast, i.e. you’re not staring at something that’s not doing anything for too long.
From a business perspective, isn’t it best to just yeet most stuff to the front end to deal with?
Hahahahhah.
See also: https://motherfuckingwebsite.com/
See also: http://bettermotherfuckingwebsite.com/
And: https://thebestmotherfucking.website/
Both of which are vastly better.
See also: https://evenbettermotherfucking.website/
The key idea remains though. Text on a page, fast. No objection to (gasp) colours, if the author would like to add some.
I prefer the original. The “better” one has a bit of a lag loading (only a fraction of a second, but in this context that’s important), and the “best” one has the same lag plus unreadable colours.
The original is terrible. It works ok on a phone, but on a wide computer screen it takes up the full width, which is terrible for readability.
If you don’t like the colours, the “Best” lets you toggle between light mode and dark mode, and toggle between lower and higher contrast. (i.e., between black on white, dark grey on light grey, light grey on dark grey, or white on black)
OK, I was on my phone. Just checked on my desktop, and I agree the original could do with some margins. I stand behind the rest of what I said - the default colours for the “best” are awful - the black black and red red are really garish. If I didn’t notice the dark/light mode switch and contrast adjustment, does it really matter whether they were there or not? There is also way too much information on the “best” one - if I’m going to a website cold, with no expectation at all of what I might find, I’m not going to sit there and read that much text - I need a gentle introduction that may lead somewhere.
I actually really like the black black. And they didn’t use red red (assuming that term is supposed to mean #FF0000); it’s quite a dull red, which I find works quite well. I prefer the high-contrast mode though, with white white on black black, rather than the slightly lower-contrast light grey text. I’m told it’s apparently evidence-based to use the lower-contrast version, but it doesn’t appeal to me.
Though I will say I intensely dislike the use of underline styling on “WRONG”. Underline, on the web, has universally come to be a signal of a hyperlink, and should almost never be used otherwise. It also uses some much nicer colours for both unclicked and visited hyperlinks.
What’s the difference between 1 and 2? And 3’s colors hurt my eyes and flicker while scrolling (though the color weirdness may come from Dark Reader).
What’s the difference between 1 and 2?
“7 fucking [CSS] declarations” adjusting the margins, line height, font size, etc.
The most important difference between 1 and 2 is, IMO, the width limiter. You can actually read the source yourself, it’s extremely simple hand-written HTML & (inline) CSS.
max-width:650px;
stops you needing to crane your neck. It also has slightly lower contrast, which I’m told is supposedly better for the eyes according to some studies, but personally I don’t like it as much. That’s why “Best” is my favourite: it has a little button to toggle between light mode and dark mode, or between lower and maximum contrast.
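For what it’s worth, that kind of toggle is only a few lines of script (a sketch with invented class names and button IDs, not whatever the site actually uses):

```ts
// Toggle between light/dark and between normal/high contrast by flipping CSS classes.
// The IDs and class names below are assumptions for illustration.
const themeButton = document.querySelector<HTMLButtonElement>("#theme-toggle");
const contrastButton = document.querySelector<HTMLButtonElement>("#contrast-toggle");

themeButton?.addEventListener("click", () => {
  // The stylesheet defines colours for body.dark; without it you get the light defaults.
  document.body.classList.toggle("dark");
});

contrastButton?.addEventListener("click", () => {
  // body.high-contrast bumps the text to pure white-on-black (or black-on-white).
  document.body.classList.toggle("high-contrast");
});
```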
My usual online shop got a redesign (sort of). Now the site loads the header, then the account and cart icons blink for a while, and after a few seconds it loads the content.
Ah yes, and the old “flash some faded out rectangles” to prepare you for that sweet, sweet, information that’s coming any… moment… now…
No, now…
Now…
Is “rejimble” a real word for a real thing?
Who’s the genius who named it that?
I made it up, but I’d be happy for it to be adopted.
No, but it could be if we try hard enough!
Around 2010, something shifted.
I have been ranting about Javascript breaking the web since probably close to a decade before that.
Clearly that’s indicative of you two both being accurate in your assessments.
Totally couldn’t be an old man yells at cloud situation with you two separated by close to a decade…
Totally couldn’t be an old man yells at cloud situation
It literally couldn’t, because I was a teenager at the time.
Old man yells at cloud isn’t an age, it’s a bitter mindset.
Ðis is on point for almost everyþing, alþough ðere’s a point to be made about compiling websites.
Static site generators let you, e.g. write content in a markup language, raðer ðan HTML. Ðis requires “compiling” the site, to which ðe auþor objects. Static sites, even when ðey use JavaScript, perform better, and I’d argue the compilation phase is a net benefit to boþ auþors and viewers.
What’s going on with your keyboard? I’m curious, what’s your native language?
I don’t think I really understood the compilation portion.
Compiling in the web world can also include … type checking, which I think is good, minifying code, which is good, and bundling code, which is good. I understand that the article alludes to those being bad things because devs just abuse them, like expecting JavaScript to tree-shake: since they don’t understand how tree-shaking works, they’ll just assume it happens and accidentally bloat their output.
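The usual illustration is lodash (sketch; whether the whole-library import actually gets shaken out depends entirely on your bundler and config, which is exactly the trap):

```ts
// Two ways to get one function out of a big utility library.
import _ from "lodash";                 // imports the entire library; many bundlers
                                        // can't prove the rest of it is dead code
import debounce from "lodash/debounce"; // imports just the one module

function save(): void {
  console.log("saving…");
}

// Bloated: relies on the bundler successfully tree-shaking lodash (it often can't).
const saveSoon = _.debounce(save, 300);

// Lean: the import path itself is already scoped to a single function.
const saveSoonLean = debounce(save, 300);

saveSoon();
saveSoonLean();
```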
Also, some static site generators can handle things that authors don’t think about, like accessibility and all that.
Seems to be Icelandic, kind of incorporating Old English letters like þ, which makes a “th”-like sound and is the letter called thorn.
I think they intend to use one for voiced “th” and another for unvoiced, but they mess up a few times
Static site generators let you, e.g. write content in a markup language, raðer ðan HTML.
HTML is a markup language, goddamnit! It’s already simple when you aren’t trying to do weird shit that it was never intended for!
(Edit: not mad at you specifically; mad at the widespread misconception.)
You’re right, of course. HTML is a markup language. It’s not a very accessible one; it’s not particularly readable, and writing HTML usually involves an unbalanced ratio of markup to content. It’s a markup language designed more for computers to read than for humans.
It’s also an awful markup language. HTML was based on SGML, which was a disaster of a specification; so bad that they had to create a new, stricter subset called XML so that parsers could be reasonably implemented. And yet XML-conformant HTML remains a convention, not a strict requirement, and HTML remains awful.
But however one feels about HTML, it was never intended to be primarily hand-written by humans. Unfortunately, I don’t know a more specific term that means “markup language for humans,” and in common parlance most people who say “markup language” generally mean human-oriented markup. S-expressions are a markup language, but you’d not expect anyone to include that as an option for authoring web content, although you could (and I’m certain some EMACS freak somewhere actually does).
Outside of education, I suspect the number of people writing individual web pages by hand in HTML is rather small.
For its intended use case of formatting hypertext, HTML isn’t as convenient as Markdown (for example), but it’s not egregiously cumbersome or unreadable, either. If your HTML document isn’t mostly the text of the document, just with the bits surrounded by <p>...</p>s and with some <a>...</a>s and <em>...</em>s and such sprinkled through it, you’re doing it wrong. HTML was intended to be human-writable.

HTML wasn’t intended to be twenty-seven layers of nested <div>s and shit.

Uh, there’s still a shitload of websites out there doing SSR using stuff like PHP, Rails, Blazor, etc. HTML is alive and well, and frankly it’s much better than you claim.
You stopped using stupid characters that aren’t in the English alphabet.
Yeah, HTML is simple, and completely and utterly static. It’s simple to the point of not being useful for displaying stuff to the user.
Static pages have been perfectly fit for the purpose of displaying stuff to the user for literally thousands of years. HTML builds upon that by making it so you don’t have to flip through a TOC or index to look up a reference. What more do you want?
Lmao, oh yes bruv, let’s provide our users with a card catalog to find information on our website.
It worked for hundreds of years so it’s good enough for them right?
People want pleasant UXs that react quickly and immediately to their actions. We have decades of UX research very clearly demonstrating this.