I paid for the whole amount of RAM, I’m gonna use the whole amount of RAM.
/s
Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I’m a liiiittle better off now.
The program expands so as to fill the resources available for its execution
– C.N. Parkinson (if he were alive today)
Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that if your software has any bit of efficiency or quality in it, that's a clear indicator of a lost opportunity to sacrifice it on the altar of code churning.
The result is not “amazing”. I’d be more amazed had it turned out differently.
It’s kind of funny how eagerly we programmers criticize “premature optimization”, when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people’s machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).
Optomisation often has a cost, weather it’s code complexity, maintenance or even just salary. So it has to be worth it, and there are many areas where it isn’t enough unfortunately.
Your spelling is terrible
Bro just denied bro’s lemmy comment pull request
Oops, forgot the AI step
You do really feel this when you’re using old hardware.
I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most web sites are unbearably slow, and some simply don’t work, javascript hangs and some elements simply never load. The device is too old to get OS updates, which means I can’t update some of the apps. But, that’s a good thing because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.
It’s the pages. It’s all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn’t exactly a computationally modest language.
Of the 200 kB loaded on a typical Wikipedia page, about 85 kB of it is JS and CSS.
Another 45 kB for a single SVG, which in complex cases is a computationally nontrivial image format.
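If you want to sanity-check a page-weight number like that yourself, curl's write-out variables give you the byte count of a single fetch. A minimal sketch (the Wikipedia URL is just an example; this measures only the top-level document, not the JS/CSS/images it pulls in):

```shell
# Report how many bytes a single fetch of a URL transfers.
page_weight() {
    # -s: quiet, -o /dev/null: discard the body,
    # -w '%{size_download}': print the byte count curl actually downloaded
    curl -s -o /dev/null -w '%{size_download}' "$1"
}

# e.g.: page_weight https://en.wikipedia.org/wiki/Main_Page
```

Summing the sizes of every sub-resource takes more work (browser dev tools do it for you), but this is enough to spot-check a single payload.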
I don’t agree. It’s both. I’ve opened basic no JS sites on old tablets to test them out and even those pages BARELY load
I can’t update YouTube on my iPad 2 that I got running again for the first time in years. It said it had been ~70,000 hours since the last full charge. I wanted to use it to watch videos when I’m going to bed. But I can’t actually log in to YouTube because the app is so old and I seemingly can’t update it.
I was using the web browser and yeah I don’t remember it being so damn slow. It’s crazy how that is.
I have an old YouTube app on my iPad, and it still works fine. One of the more responsive apps on the device. I get nagged nearly every time I use it to update to the newest YouTube release, but that’s impossible. I’d first have to upgrade my OS, and Apple no longer releases new OSes for this generation of iPads. So, I’m stuck with an old YouTube, which mostly works fine, and an occasional nag message.
I’m sure within a year or two mine will be like yours and YouTube will simply no longer work. But, for now it’s in a relatively good spot where I can use a version of YouTube designed for this particular hardware that doesn’t feel sluggish.
Computer speed feels about the same as it was years ago.
This entire thread is a perfect example of the paradox folks keep mentioning:
Nobody in either 🧵 pointed out that Ocean used Mastodon to post the banter.
Plenty of more-optimized federated slop software on the market. I am also on Jabber, if it means anything to Zoomies.
The whole industry needs to be rebuilt from the foundations: GRTT with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and the machine code spewed by your favorite C compiler.
It took me a long time to figure out that “GRTT” is “Graded Modal Type Theory”. Letting others know, if they want to look into it further.
If one of us ever wins the lotto we better get on funding that
If someone wants to collab, I’ve been writing various codes around it: https://gitlab.com/bss03/grtt
Right now, it’s a bunch of crap. But, it’s published, and I occasionally try to improve it.
Also, Granule and Gerty are actual working implementations, though I think some of the “magic” is in picking the right grading ring for the runtime, and they are more research-oriented, allowing for fairly arbitrary grading rings.
Had to install (an old mind you, 2019) visual studio on windows…
…
…
First it’s like 30GB, what the hell?? It’s an advanced text editor with a compiler and some …
Crashed a little less than I remember 🥴😁
Visual Studio is the IDE. VS Code is the text editor.
OP was clearly using a rhetorical reduction to make a point that VS is bloated.
VS Code is another project; Visual Studio is indeed an IDE, and it integrates it all. VS Code is also an integrated development environment. I don’t really know what more to say.
VS Code is considered a highly extensible text editor that can be used as an IDE, especially for web-based tools, but it isn’t an IDE. It’s more comparable to Neovim or Emacs than to IntelliJ in terms of the role it’s supposed to fill. Technically. VS Code definitely is used more as an IDE by most people, and those people are weak imo. I’m not one to shill for companies (I promise this isn’t astroturf), but if you need to write code, JetBrains probably has the best IDE for that language. Not always true, but more often than not it is, imo.
First it’s like 30GB, what the hell??
Just be grateful it’s SSD and not RAM.
My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.
Everything bad people said about web apps 20+ years ago has proved true.
It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.
It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.
But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.
All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.
And that us poors still on limited bandwidth plans get charged for going over our monthly quotas because everything has to be streamed or loaded from the cloud instead of installed (or at least cached) locally.
I’m dreading when poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.
Let me assure you this is already happening.
Already happening with Windows. Also supposedly with Nvidia GPU drivers, with some AMD execs pushing for the same now
Bloated electron apps are what makes Linux on desktop viable today at all, but you guys aren’t ready for that conversation.
Yes, in that the existence of bloated electron apps tends to cause web apps to be properly maintained, as a side effect.
But thankfully, we don’t actually have to use the Electron version, to benefit.
I can only think of a couple Electron apps I use, and none that are important or frequently used.
Uhhh like what?
Note, I don’t know how comprehensive this wiki list is, just quick research
https://en.wikipedia.org/wiki/List_of_software_using_Electron
From those, I’m only currently using a handful.
balenaEtcher, Discord, Synergy, and Obsidian
The viability of linux isn’t dependent on them though
Agreed. I wasn’t the one that claimed that
If only bad people weren’t the ones who said it, maybe we would have listened 😔
I almost started a little rant about Ignaz Semmelweis before I got the joke. :P
“Let them eat ram”
I hate that our expectations have been lowered.
2016: “oh, that app crashed?? Pick a different one!”
2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”
I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.
That would be incredibly ironic given that they completely fucking gave up on mobile devices when the iPhone came out.
I bought a desktop PC for a little over 2k in late 2011, and still use it. I’m a back-end developer, and certainly I would like to be able to upgrade my 16 GB RAM to 32 GB in an affordable way.
Other than that, it’s perfectly fine. IDE, a few docker containers, works.
And modern gaming is a scam anyway. Realistic graphics do not increase fun, they just eat electricity and our money. Retro gaming or not at all.
Imagine how things would be if they were built to be maintained for 15+ years.
2011 means it’s probably DDR3, which is still fairly affordable
wow, you are right! I didn’t bother to check this whole time of needless suffering, but for what I earn with it in less than an hour I could probably buy 2x8 GB DDR3, lol!
It just seemed a fair assumption that it would be insanely expensive …
The same? Try worse. Most devices have seen input latency going up. Most applications have a higher latency post input as well.
Switching from an old system with old UI to a new system sometimes feels like molasses.
I work in support for a SaaS product and every single click on the platform takes a noticeable amount of time. I don’t understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it’s far more responsive.
Except for KDE. At least compared to cinnamon, I find KDE much more responsive.
AI-generated code will make things worse. They’re good at providing solutions that generally give the correct output, but the code they generate tends to be shit by final-product standards.
Though perhaps performance will improve since at least the AI isn’t limited by only knowing JavaScript.
I still have no idea what it is, but over time my computer, which has KDE on it, gets super slow and I HAVE to restart. Even if I close all applications it’s still slow.
It’s one reason I’ve been considering upgrading from 6 cores and 32 GB to 16 and 64.
Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.
I have not, but also it’s not slow immediately, it takes time under use to get slow. Fresh boot is quite fast. And then once it’s slow, even if I close my IDE, browsers and everything, it remains slow, even if CPU usage is really low and there’s theoretically plenty of memory that could be freed easily.
Have you tried disabling all local Trojans and seeing if that helps?
Upgrade isn’t likely to help. If KDE is struggling on 6@32, you have something going on, and 16@64 is only going to make it last twice as long before choking.
Wait till it’s slow.
Check your RAM/CPU in top and the disk in iotop; hammering the disk/CPU (or a failing disk/SSD) can make KDE feel slow.
plasmashell --replace & # this just restarts plasmashell’s widgets/panels
See if you got a lot of RAM/CPU back or it’s running well again; if so, it might be a bad widget or panel.
If it’s still slow:
kwin_x11 --replace &
or
kwin_wayland --replace &
This dumps everything and restarts the compositor/window manager, refreshing the graphics driver.
If that makes it better, you’re likely looking at a graphics driver issue
I’ve seen some stuff where going to sleep and coming out degrades perf
Hmm, I haven’t noticed high CPU usage, but usually it only leaves me around 500MB actually free RAM, basically the entire rest of it is either in use or cache (often about 15 gigs for cache). Turning on the 64 gig swapfile usually still leaves me with close to no free RAM.
I’ll see if it’s slow already when I get home, I restarted yesterday. Then I’ll try the tricks you suggested. For all I know maybe it’s not even KDE itself.
Root and home are on separate NVMe drives and there’s a SATA SSD for misc non-system stuff.
GPU is nvidia 3060ti with latest proprietary drivers.
The PC does not sleep at all.
To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it’s faster to both write and read.
Don’t let me stop you from upgrading, that’s got loads of upsides. Just suspecting you still have something else to fix before you’ll really get to use it :)
It CAN be OK to have very low free RAM if it’s used up by buffers/cache (that memory is freeable). If buff/cache gets below about 3 GB on most systems, you’ll start to struggle.
If you have 16 GB, it’s running low, and you can’t account for it in top, you have something leaking somewhere.
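On Linux you can see the free-vs-freeable distinction straight from /proc/meminfo: MemFree can look tiny while MemAvailable (free plus reclaimable cache) is still large, and only a low MemAvailable means real memory pressure. A minimal sketch:

```shell
# Print the three /proc/meminfo fields that answer "am I actually low on RAM?".
# Values in meminfo are in kB; convert to MiB for readability.
mem_report() {
    awk '/^MemTotal:|^MemFree:|^MemAvailable:/ {
        printf "%s %d MiB\n", $1, $2 / 1024
    }' /proc/meminfo
}
```

`free -h` shows the same numbers in its “available” column; this just makes explicit where they come from.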
Lol I sorted top by memory usage and realized I’m using 12 gigs on an LLM I was playing around with to get local code completion in my JetBrains IDE. It didn’t work all that well anyway and I forgot to disable it.
I did have similar issues before this too, but I imagine blowing 12 gigs on an LLM must’ve exacerbated things. I’m wondering how long I can go now before I’m starting to run out of memory again. Though I was still sitting at 7 gigs buffer/cache and it hadn’t slowed down yet.
12/16, That’ll do it. Hopefully that’s all, good luck out there and happy KDE’ing
Have you tried disabling the file indexing service? I think it’s called Baloo?
Usually it doesn’t have too much overhead, but in combination with certain workflows it could be a bottleneck.
I want to avoid building react native apps.
Windows 11 is the slowest Windows I’ve ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it’s literally unusable.
Even Windows 10 is literally unusable for me. When pressing the windows key it literally takes about 4 seconds until the search pops up, just for it to be literally garbage.
Found out about this while watching “Halt and Catch Fire” (AMC’s effort to recreate the magic of Mad Men, but on the computer).
In 1982, Walter J. Doherty and Ahrvind J. Thadani published a research paper in the IBM Systems Journal that set the requirement for computer response time at 400 milliseconds, not 2,000 (2 seconds), which had been the previous standard. When a human being’s command was executed and returned an answer in under 400 milliseconds, it was deemed to exceed the Doherty threshold, and use of such applications was deemed to be “addicting” to users.
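You can even check an arbitrary command against that 400 ms bar from a shell. A rough wall-clock sketch (assumes GNU date for the nanosecond `%N` format):

```shell
# Time a command and compare its wall-clock duration against the
# 400 ms Doherty threshold. Not precise, but good enough to notice
# when an interaction has drifted into "waiting" territory.
doherty_check() {
    local start end ms
    start=$(date +%s%N)
    "$@" > /dev/null 2>&1
    end=$(date +%s%N)
    ms=$(( (end - start) / 1000000 ))
    if [ "$ms" -le 400 ]; then
        echo "under threshold: ${ms} ms"
    else
        echo "over threshold: ${ms} ms"
    fi
}

# e.g.: doherty_check ls /
```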
If it only occurs hours or days after boot, try killing the StartMenuExperienceHost process. That’s what I was doing until I switched to Linux.
I use Windows maybe once a week at most, and then it only takes about 10 minutes. So I don’t really care, and I’m glad I don’t need to use it more often.
The Windows bloat each new generation is way out of control.
It takes forever to boot, I know that, and that’s from fast food, which is extra pathetic.
fast food
Too many nuggies
Maybe if Windows quit pigging out on tendies and slimmed down it would be as fast
Probably that’s the folder explorer or whatever itself crashing.
yeah
and like why does it crash? it worked fine on Windows 10
I’ve given up trying to understand modern PC software. I can barely keep up with the little microcontrollers I work with. They aren’t so little.