The modern web is an insult to the idea of efficiency at practically every level.
You cannot convince me that isolation and sandboxing require a fat 4 GB slice of RAM for a measly 4 tabs.
For my home PC, sure. Running some Windows apps on my Linux machine in Wine is a little weird and sluggish. Discord is very oddly sluggish for known reasons. Proton is fine tho.
But for my work? Nah. My M3 MacBook Pro is a beast compared to even the last Intel MacBook. Battery is way better unless you’re like me and constantly running a front end UI for a single local service. But without that, it can last hours. My old one could only last 2 meetings before it started dying.
I paid for the whole amount of RAM, I’m gonna use the whole amount of RAM.
/s
Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I’m a liiiittle better off now.
More like:
You paid for more RAM, so I’ll use the whole amount of RAM.
– All software developers
Was it a Raspberry Pi?
The program expands so as to fill the resources available for its execution
– C.N. Parkinson (if he were alive today)
That’s not “programmer_humor”, that’s an absolute fact. Had to dig out an ancient laptop recently to recover an old lost file, and it was WAY faster than a new, ultra-speedy, decked-out recent Win12 build. WTAF??
Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that any bit of efficiency or quality in your software is a clear indicator of a lost opportunity to sacrifice it on the altar of code churn.
The result is not “amazing”. I’d be more amazed had it turned out differently.
Fucking “features”. Can’t software just be finished? I bought App. App does exactly what I need it to do. Leave. It. Alone.
“More AI features”? Of course we can implement more AI features for you.
It’s kind of funny how eagerly we programmers criticize “premature optimization”, when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people’s machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).
Optomisation often has a cost, weather it’s code complexity, maintenance or even just salary. So it has to be worth it, and there are many areas where it isn’t enough unfortunately.
Exactly the mindset responsible for the state of modern software.
Your spelling is terrible
Bro just denied bro’s lemmy comment pull request
Oops, forgot the AI step
You really do feel this when you’re using old hardware.
I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most websites are unbearably slow, and some simply don’t work: JavaScript hangs, and some elements simply never load. The device is too old to get OS updates, which means I can’t update some of the apps. But, that’s a good thing, because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.
It’s the pages. It’s all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn’t exactly a computationally modest language.
Of the 200 kB loaded on a typical Wikipedia page, about 85 kB of it is JS and CSS.
Another 45 kB for a single SVG, which in complex cases is a computationally nontrivial image format.
I don’t agree. It’s both. I’ve opened basic no-JS sites on old tablets to test them out, and even those pages BARELY load.
What caused the latency in that case?
I can’t update YouTube on my iPad 2, which I got running again for the first time in years. It said it had been ~70,000 hours since its last full charge. I wanted to use it to watch videos when I’m going to bed, but I can’t actually log in to YouTube because the app is so old and I seemingly can’t update it.
I was using the web browser and yeah I don’t remember it being so damn slow. It’s crazy how that is.
I have an old YouTube app on my iPad, and it still works fine. One of the more responsive apps on the device. I get nagged nearly every time I use it to update to the newest YouTube release, but that’s impossible. I’d first have to upgrade my OS, and Apple no longer releases new OSes for this generation of iPads. So, I’m stuck with an old YouTube, which mostly works fine, and an occasional nag message.
I’m sure within a year or two mine will be like yours and YouTube will simply no longer work. But, for now it’s in a relatively good spot where I can use a version of YouTube designed for this particular hardware that doesn’t feel sluggish.
My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.
Everything bad people said about web apps 20+ years ago has proved true.
It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.
It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.
But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.
All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.
And that us poors still on limited bandwidth plans get charged for going over our monthly quotas, because everything has to be streamed or loaded from the cloud instead of installed (or at least cached) locally.
I’m dreading when poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.
Let me assure you this is already happening.
Already happening with Windows. Also supposedly with Nvidia GPU drivers, with some AMD execs pushing for the same now
Bloated electron apps are what makes Linux on desktop viable today at all, but you guys aren’t ready for that conversation.
Yes, in that the existence of bloated electron apps tends to cause web apps to be properly maintained, as a side effect.
But thankfully, we don’t actually have to use the Electron version to benefit.
I can only think of a couple Electron apps I use, and none that are important or frequently used.
Uhhh like what?
Note: I don’t know how comprehensive this wiki list is, it’s just from quick research:
https://en.wikipedia.org/wiki/List_of_software_using_Electron
From those, I’m only currently using a handful.
balenaEtcher, Discord, Synergy, and Obsidian
The viability of Linux isn’t dependent on them, though.
Agreed. I wasn’t the one that claimed that
If only bad people weren’t the ones who said it, maybe we would have listened 😔
I almost started a little rant about Ignaz Semmelweis before I got the joke. :P
Computer speed feels about the same as it was years ago.
Had to install (an old one, mind you, 2019) Visual Studio on Windows…
…
…
First it’s like 30GB, what the hell?? It’s an advanced text editor with a compiler and some …
Crashed a little less than I remember 🥴😁
Visual Studio is the IDE. VS Code is the text editor.
OP was clearly using a rhetorical reduction to make a point that VS is bloated.
Visual Studio Code is another project. Visual Studio is indeed an IDE, but VS Code integrates it all too: VS Code is also an integrated development environment. I don’t really know what more to say.
VS Code is considered a highly extensible text editor that can be used as an IDE, especially for web-based tools, but it isn’t an IDE. It’s more comparable to Neovim or Emacs than to IntelliJ in terms of the role it’s supposed to fill. Technically. VS Code definitely is used more as an IDE by most people, and those people are weak imo. I’m not one to shill for companies (I promise this isn’t astroturf), but if you need to write code, JetBrains probably has the best IDE for that language. Not always true, but more often than not it is, imo.
First it’s like 30GB, what the hell??
Just be grateful it’s SSD and not RAM.
The whole industry needs to be rebuilt from the foundations. GRTT, with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and the machine code spewed by your favorite C compiler.
It took me a long time to figure out that “GRTT” is “Graded Modal Type Theory”. Letting others know, if they want to look into it further.
Sorry. I didn’t pick the acronym, it comes from the paper: https://arxiv.org/pdf/2010.13163.pdf I’m not sure why there’s no “M” in the acronym, but I should probably spell things out when I actually want collaborators.
While I’m dropping links, I will also drop https://github.com/granule-project/ where Gerty and Granule live and where real research is done.
If one of us ever wins the lotto we better get on funding that
If someone wants to collab, I’ve been writing various bits of code around it: https://gitlab.com/bss03/grtt
Right now, it’s a bunch of crap. But, it’s published, and I occasionally try to improve it.
Also, Granule and Gerty are actual working implementations, tho I think some of the “magic” is in picking the right grading ring for the runtime, and they are more research-oriented, allowing for fairly arbitrary grading rings.
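To give a flavor of what a grading ring buys you, here’s a toy sketch in Haskell (all the names are mine, made up for illustration; this is not Granule’s or Gerty’s actual syntax). The grade on a value says exactly how many times it may be used, and duplicating a value has to pay for it with semiring addition:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, TypeOperators #-}

-- Toy sketch of semiring-graded usage tracking, loosely in the
-- spirit of graded modal types. Hypothetical illustration only,
-- not Granule's (or Gerty's) real API.

import GHC.TypeNats (Nat, type (+))

-- A 'Graded n a' is an 'a' that must be used exactly 'n' times;
-- here the grades live in the natural-number semiring.
newtype Graded (n :: Nat) a = Graded a

-- Introduce a value with grade 1: a single permitted use.
once :: a -> Graded 1 a
once = Graded

-- Contraction: duplicating a value splits its grade using
-- semiring addition, so every copy is accounted for.
split :: Graded (n + m) a -> (Graded n a, Graded m a)
split (Graded x) = (Graded x, Graded x)

-- Weakening: only a grade-0 value may be silently dropped.
discard :: Graded 0 a -> ()
discard _ = ()

-- Spend a single use.
use :: Graded 1 a -> a
use (Graded x) = x

-- Only typechecks because 1 + 1 = 2 in the grading semiring;
-- handing this a 'Graded 1 Int' is a compile-time error.
double :: Graded 2 Int -> Int
double g =
  let (a, b) = split g :: (Graded 1 Int, Graded 1 Int)
  in use a + use b

main :: IO ()
main = print (double (Graded 21)) -- prints 42
```

Swap the naturals for a different ring (intervals, security levels, actual resource costs) and the same bookkeeping tracks whatever you care about; that’s the kind of generality the linked papers are after.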
This entire thread is a perfect example of the paradox folks keep mentioning:
Nobody in either 🧵 pointed out that Ocean used Mastodon to post the banter.
Plenty of more-optimized federated slop software on the market. I am also on Jabber, if it means anything to Zoomies.
“Let them eat RAM”
I hate that our expectations have been lowered.
2016: “oh, that app crashed?? Pick a different one!”
2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”
I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.
That would be incredibly ironic given that they completely fucking gave up on mobile devices when the iPhone came out.