it means that you have to manually reposition every single window, every single time. for any and all apps, by design
note: on most computers, it worked the opposite of how you'd expect. Turning it on slowed your cpu to around 33 MHz
more like "move glacially and declare things as 'will not support' so technically we had nothing TO fix!"
it’s when the devs of a graphics stack suddenly feel the need to protect your computer from itself, so they say fuck you to any features they deem “insecure”, including accessibility features. (they’ll claim they fixed this, but it’s opt-in per app. old apps will just be completely unusable for some people with special needs.)
But they eliminated tearing on the desktop! woo!!!
you know that the confidence value is generated by the ai itself, right? So it could still spew out bullshit with high confidence. The confidence score doesn’t really help much
change one pixel and suddenly it doesn’t match. Do the comparison based on similarity instead and now you’re back to false positives
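the exact-match half of that is easy to see with a cryptographic hash. a minimal sketch (the file names and "image" bytes here are made up for illustration):

```shell
# toy "image": a file of raw bytes standing in for real pixel data
printf 'AAAA BBBB CCCC DDDD' > original.img
printf 'AAAA BBBB CCCC DDDX' > tweaked.img   # one byte ("one pixel") changed

sha256sum original.img tweaked.img   # two completely different digests
cmp -l original.img tweaked.img      # confirms the files differ in exactly one byte
```

any exact digest flips entirely on a one-byte change (that's the avalanche effect, by design), which is exactly why matching systems fall back to fuzzy/perceptual comparison instead, and that's what reopens the false-positive door.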
ever notice how most of them look like a rectangle and have glass on the front?
control-z, kill %1
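for anyone who hasn't met it: that's shell job control. Ctrl-Z sends SIGTSTP to suspend the foreground process, and `kill %1` signals job number 1. a minimal non-interactive sketch (bash, using `kill -TSTP` in place of actually pressing Ctrl-Z):

```shell
set -m             # enable job control (interactive shells already have this on)
sleep 1000 &       # becomes job [1]
kill -TSTP %1      # SIGTSTP is what Ctrl-Z sends; the job is now stopped
jobs               # should list job [1] as Stopped
kill %1            # default SIGTERM, addressed by jobspec, ends the job
```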
you mean it doesn’t work when the device is turned off? weird! /s
rule of thumb: multiply by 4 to convert from years to quarters. You’re welcome.
no, the truth is it’s impossible even then. If the result involves randomness at its most fundamental level, then it’s not reliable whatever you do.
suffers from all the same ~~problems~~ features. It’s inherent to the tech itself.
much sad. very grieve. wow
celsius or fahrenheit?
the results are random, therefore the dataset is useless.
tell that to any fpga toolchain
you released it under a non-open-source license. So very clearly: no, it is not
it is only open source if i can build it myself. Which I can’t if you just give me the weights.
The weights are the “compiled” version of the dataset. It’s the dataset that’s the source, not the weights
it never knows what it’s saying
not stole. Were given.
If code is law, then they just found the right way to ask. And the code gave the money to them, because they asked nicely.
it’s opt-in, per app. Meaning unless old apps are patched and recompiled, they will be inaccessible.