



I assume microcontrollers. Most of those are invisible to consumers.
I would not want anything that requires a cloud connection to be responsible for securing my house. The security record of these smart locks also isn’t great.
The final question you need to ask yourself is how it fails safe. There have been Tesla owners trapped in burning cars. If, God forbid, your house caught fire, could you get out of a door secured with a smart lock?
Once we summit the peak of inflated expectations and the bubble bursts hopefully we’ll get back to evaluating the technology on its merits.
LLMs definitely have some interesting properties, but they are not universal problem solvers. They are great at parsing and summarising language. Their ability to vibe code is entirely based on how closely your needs match the (vast) training data. They can synthesise tutorials and Stack Overflow answers much faster than you can. But if you are writing something new or specialised, the limits of their “reasoning” soon show up in dead ends and sycophantic “you are absolutely right, I missed that” responses.
More than the technology, the social context is a challenge. We are already seeing humans form dangerous parasocial relationships with token predictors, with some tragic results. If you abdicate your learning to an LLM you are not really learning, and that could have profound impacts on the current cohort of learners, who might be assuming they no longer need to learn because the computer can do it for them.
We are certainly experiencing a very fast technological disruption event and it’s hard to predict where the next few years will take us.
One of the things I like about Horizon Zero Dawn is they introduced cosmetics so you didn’t have to compromise your visual style for the right set of numbers for your current opponents.


Modern machines have a TPM, so we can do attested boot and validate that a system hasn’t been tampered with. They don’t need third-party kernel modules to do that.


Fundamentally, the reason they want to use kernel modules is to observe the system for other executables interfering with the game. This is a hacky solution at best.
The TPM hardware can support attested boot, so you can verify with the hardware that nothing but the verified kernel and userspace is running. That gives you the same guarantees without letting third parties mess with your kernel.
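The hash chain underpinning attested boot can be sketched in a few lines. This is a minimal illustration of the TPM 2.0 SHA-256 PCR extend operation, not real TPM code; the stage names are made up for the example:

```python
import hashlib

def pcr_extend(pcr: bytes, digest: bytes) -> bytes:
    """TPM 2.0 SHA-256 PCR extend: new PCR = H(old PCR || digest)."""
    return hashlib.sha256(pcr + digest).digest()

# PCRs start at all zeroes; each boot stage measures the next one
# into the PCR before handing over control.
pcr = bytes(32)
for stage in [b"bootloader image", b"kernel image", b"initrd image"]:
    pcr = pcr_extend(pcr, hashlib.sha256(stage).digest())

# A remote verifier that knows the expected measurements recomputes the
# same chain and compares it against a quote signed by the TPM hardware.
print(pcr.hex())
```

Because extend is order-dependent and one-way, a tampered component anywhere in the chain produces a different final PCR value, which is what the signed quote exposes.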


It’s nice to see Valve and Igalia recognise the benefit of the open GPU drivers that Proton and FEX utilise.
I would have thought unified memory would pay off; otherwise you spend your time shuffling data between system memory and VRAM. Isn’t the Deck unified memory?


Did you ever play with the audio visualiser? I believe it was built in with the CD-ROM drive? What about Tempest 2000?


I never got a Jaguar despite being a signed-up Atari fanboy at the time. The hardware was ridiculously complex, which made ports to it a hard sell, and Atari just didn’t have the first-party exclusive clout needed to sustain a console at launch.
I do wish I’d had a chance to play with some of Jeff Minter’s creations on it though. Apparently there was a nice audio visualiser that built on the Trip-A-Tron from the ST days, as well as some reboots of classic arcade games like Tempest 2000.


I’ve generally been up front when starting new jobs that nothing impinges on my ability to work on FLOSS software in my own time. Only one company put in a restriction on working on FLOSS software in the same technical space as my $DAYJOB.


The article mentioned there is a long history of forks in the open source Doom world. It seems the majority of the active developers just moved to the new repository.


Cost, the reason is cost.
He does?
I read the first link in the thread that examines his blog post about London. While I don’t agree with his politics, he wouldn’t be unusual amongst the significant minority of the population who vote for the likes of Reform. That seems to be enough for some to draw the conclusion he’s a Nazi who wants to arbitrarily murder people.
This automatic jump to accusing anyone you disagree with of being a Nazi just devalues the term.


I helped with the initial AArch64 emulation support for QEMU, as well as working with others to make multi-threaded system emulation a thing. I maintain a number of subsystems, but perhaps the biggest impact was implementing the cross-compilation support that enabled the TCG testing to be run by anyone, including eventually the CI system. This was greatly helped by it being a paid gig for the last 12 years.
I’ve done a fair bit of other stuff over my many decades of using FLOSS, including maintaining a couple of moderately popular Emacs packages. I’ve got drive-by patches in loads of projects, as I like to fix things up as I go.


I have paid for NewsBlur ever since Google cancelled Google Reader. I also use elfeed in various Emacs instances for project and update feeds of various types.


It’s all relative I guess. I can see why the original GPTs used the Reddit corpus for training. However, I’ve always been a little sceptical about the quality of training data drawn from any social media, given how much it exaggerates the extremes of people’s behaviour.


I don’t need to get through winter, I just need to get from dusk to when the cheap energy rate starts. Currently that’s about 4 kWh, or a small portion of my car battery, before it recharges on the cheap rate.
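As a back-of-the-envelope check, the dusk-to-cheap-rate gap really is a small slice of an EV pack. The 60 kWh pack size below is an assumed figure for illustration; only the 4 kWh gap comes from the comment:

```python
# 4 kWh gap from the comment; the pack size is a hypothetical example.
gap_kwh = 4.0
pack_kwh = 60.0  # assumed EV battery capacity
fraction = gap_kwh / pack_kwh
print(f"covering the gap uses {fraction:.1%} of the pack")  # ~6.7%
```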


There are large areas of open source that don’t rely on volunteer labour because companies with a vested interest pay people to work on them. They tend to be the obvious large projects that are continuously developed and gain new features. The trouble with something like xz is it was mostly “done” (as in it did the thing it was intended to do) but still needed maintenance to address the minor niggles, bug reports and updates to tooling and dependencies.
The foundations could do a better job here of supporting the maintainers. After Heartbleed the Linux Foundation started the Core Infrastructure Initiative to help fund those under-recognised projects. I would hope the people running that could be more proactive in identifying those critical understaffed components.
Edit: I think it’s now called the Open Source Security Foundation: https://openssf.org/
Now I’ve read the article: it cites unnamed industry analysts and it’s written by an AI. For all I know the AI has hallucinated the number.