

Trusting humans (to any extent) is vastly different from trusting corporations, to be honest. But if that’s your jingle, you can always do the work with a bulk tagger / gallery assembler. No AI, and no environment-killing, needed.
Organic Maps died a short while ago (they’d been enshittifying for a while). We go with CoMaps nowadays.
The Proton CEO praised Trump and his picks and policies on tech antitrust some months ago. People said it was a “one-off”, but then the Proton board backed the CEO up. The writing’s been on the wall for a while.
But not YouTube. Which is Google.
Curious.
I’m done with Google! Watch me rant on Google’s YouTube! Earns me money!
So… yeah.
Why are we giving neo-Nazis attention, again?
Have you tried hiring a couple humans? They’re much better than AI. And you help your local (physical or virtual) economy!
Plex has paywalled my server!
Skill issue tbh.
Platform.
Optional.
It’s on us (all of us).
Apparently.
Posting media.
Fediverse.
Yes.
Good catch. Still, that doesn’t make it true either: it’s not such a “fundamental use case” that it would even require the capability. The browser already reports the usable information in the user agent (even within that 1%, you rarely need more specificity than “Windows” on “Desktop Intel”).
No. It should be gated behind a permission, because not every site out there is going to offer you binaries to download. 1% of the web “requiring” this does not justify 99% of the web being able to violate that privacy.
Operating system and CPU architecture are useful for sites to serve the correct binaries when a user is downloading an application.
Barely. You could trim down the data to incredibly low granularity (“OS: Windows”, “CPU: Intel Desktop”) and you’d still get the exact same binary as 99% of the people 99% of the time, anyway.
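For illustration, here’s a rough sketch of how little a download page actually needs. It assumes Chromium’s User-Agent Client Hints (navigator.userAgentData) with the old navigator.platform as a fallback; pickBinary and the bucket names are made up:

```typescript
// Sketch: picking a download from coarse platform buckets only.
// Assumes Chromium's User-Agent Client Hints (navigator.userAgentData);
// falls back to navigator.platform, and otherwise lets the user choose.
type Bucket = "windows-x64" | "linux-x64" | "macos" | "unknown";

function pickBinary(): Bucket {
  const uaData = (navigator as any).userAgentData;
  const platform: string = uaData?.platform ?? navigator.platform ?? "";

  // Coarse buckets serve the exact same binary as 99% of people 99% of the time;
  // anything finer (OS build, exact CPU model) is fingerprinting surface, not value.
  if (/win/i.test(platform)) return "windows-x64";
  if (/linux/i.test(platform)) return "linux-x64";
  if (/mac/i.test(platform)) return "macos";
  return "unknown"; // show every download link and let the user pick
}

console.log(`Suggested download: ${pickBinary()}`);
```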
No need to report any sort of even remotely precise value, then. Just report “low” or “high”. Also, it’s bold of you to assume that just because I’m plugged into the wall I want to be served 400 MB of extra JavaScript and MPEG-4 instead of one CSS file and a simple PNG.
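Something in this spirit would be plenty, assuming the Battery Status API (navigator.getBattery(), Chromium-only); coarseBattery is a made-up name:

```typescript
// Sketch: coarsening the Battery Status API down to "low" / "high".
// navigator.getBattery() only exists in Chromium, so treat it as optional.
async function coarseBattery(): Promise<"low" | "high" | "unknown"> {
  const nav = navigator as any;
  if (typeof nav.getBattery !== "function") return "unknown";

  const battery = await nav.getBattery(); // BatteryManager { level, charging, ... }
  // No site needs "73.4%, discharging, 2h11m remaining"; one coarse bucket is
  // enough to choose between the heavy experience and the light one.
  return battery.level < 0.2 && !battery.charging ? "low" : "high";
}

coarseBattery().then((bucket) => console.log(`Battery: ${bucket}`));
```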
One of the biggest reasons websites need to run JS is submitting form data to a server. Like this website.
No. Forms function perfectly well without JS, thanks to the action= attribute.
Now, whether you want to get “desktop app” fancy with forms and pretend you’re a “first-class desktop citizen”, that’s a skill issue. But submitting form data, by itself, has not required JS since at least 1979. Maybe earlier.
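For the record, a minimal sketch of the whole thing: a plain HTML form posting to a tiny Node server via the built-in http module, with zero client-side JS anywhere (route, port, and field names are made up):

```typescript
// Sketch: form submission with zero client-side JS.
// A plain <form action="/submit" method="post"> does all the work;
// this tiny server only receives the urlencoded body the browser sends.
import { createServer } from "node:http";

const page = `
  <form action="/submit" method="post">
    <input name="message" placeholder="Say something">
    <button type="submit">Send</button>
  </form>`;

createServer((req, res) => {
  if (req.method === "POST" && req.url === "/submit") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      // Browsers send application/x-www-form-urlencoded by default.
      const data = new URLSearchParams(body);
      res.end(`You said: ${data.get("message")}`);
    });
  } else {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(page);
  }
}).listen(8080);
```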
They can stop telegraphing some of this information, but then the websites won’t render properly (they use this information to display the website properly),
Pretty much none of that information is ever necessary to render a site properly.
OS and CPU architecture? Irrelevant to whether you are sending a JPG or PNG background. Nearly irrelevant to whether you are using a vertical or horizontal screen (browsers advertise that info separately anyway; it’s even part of CSS media queries, and there’s a small sketch of it below).
Accelerometer and gyroscope? The only reason those could ever be needed for rendering is if the user is moving so incredibly fast that red pixels on their screen would turn green from Doppler shift. And at any point between 2025 and 2999, if you have someone moving that fast, you have worse problems than the site not rendering adequately.
Keyboard layout? If the rendering of a site depends on whether I’m pressing “g” vs “j” while it loads, then that’s quite stupid anyway, because it boldly assumes the app focus is on the page.
Proximity sensor? Again: absolutely useless unless the rendering environment is moving at incredibly high speed (at which point the sensor is probably reading wrong data anyway).
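As for the sketch promised above: orientation is already exposed on its own, via matchMedia mirroring the CSS orientation media query, without any OS or CPU data involved:

```typescript
// Sketch: orientation is advertised separately, no OS/CPU data required.
// Mirrors the CSS `@media (orientation: portrait)` query.
const portrait = window.matchMedia("(orientation: portrait)");

function applyLayout(isPortrait: boolean): void {
  // Hypothetical class name; swap the layout however you like.
  document.body.classList.toggle("portrait", isPortrait);
}

applyLayout(portrait.matches);
// Re-apply when the device is rotated.
portrait.addEventListener("change", (e) => applyLayout(e.matches));
```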
Shittiest post of the week in the Fediverse, up there ↑.
If you are going to be this shitty, dismissive and misinformative, you can head back to Twitter.
a decent chunk of coding is stupid boilerplate/minutia that varies
…according to a logic, which means LLMs are bad at it.
Well, the first and obvious thing to do to show that AI is bad is to show that AI is bad. If it provides such low-hanging fruit for the demonstration… that just further emphasizes the point.
Features
- [Proceeds to list social credit features]
No thanks, if I wanted that I’d go to the CCP, Reddit, or Twitter.
You skipped, like, three towns ahead and one to the right, mate. Actively defending against specific bad people in the world is not an “echo chamber”.
So it’s “You get off Google but not me! And keep subsidizing me!”?
It’s about 15% more understandable, but still the same crap I expect by default from that kind of person.