

OCR isn’t normally considered AI, and I’ve never used or wanted to use such a feature. I’ve literally never received an image with non-trivial text on it that I needed to copy on my phone.
You failed to explain what you were talking about.


I didn’t downvote you. You got downvoted because you seemed to be saying that highlighting text is somehow an example of AI. You now seem to be saying that Samsung has a feature whereby you can copy text out of a picture. This is probably still wrong, because OCR isn’t necessarily AI, but nonetheless I didn’t vote either way.


You do realize that you just said “highlighting”, and most people aren’t reading this on a Samsung, so they don’t know what you are talking about.


AI is, in fact, not meaningfully repeatable in actual usage.


What does highlighting have to do with AI?
KDE by default doesn’t have a Mac-style global menu. If the third-party extension that provides this looks fancy but doesn’t work perfectly, ask the devs to do more free work or roll up your sleeves. In any case, it’s not part of KDE.
If you use a proper style sheet/dark theme for both Qt and GTK and set Flatpak to use it, you really shouldn’t have any complaints about dark themes, save for websites. Trying to make every website dark-themed is a fool’s errand; you’ll eventually find that some don’t style right if you force it.
Use integer scaling. Buy devices that are 4K at 24-32" or 1080p at 11-14", you know, the most common sizes?


Because your family tree is a circle: your mom fucked your granddad, now I’m my own uncle, and then you came along. Sup, cuz.


Computation is arranging the structure of the universe to reflect itself; it is not a mind, but it is a necessary component of one. All computing is literally computing on the structure of the universe. A well-expressed thought, to which you reply what, “dumby”? Not even a word anyone uses. It’s not merely that you are incapable of expressing an alternative position; you are not even capable of calling someone stupid correctly.


Were they already badly short-staffed, with no hope of hiring enough for years, when they fired a bunch of them?


You have gotten close to understanding something profound but failed to realize that the thing in your hand also embodies that profundity.


You mean like all computers?


It’s funny that most posts, including the one you are responding to, are fully incoherent, written by people who not only didn’t read the article but are incapable of doing so.


Unless ash is curvy?
Ooh, burn.


Because if you draw things with very few pixels they tend to look blocky and unrealistic, because the universe, like your mom, has curves. The more pixels you use, the more realistically we can represent both real and virtual pictures. Cambridge says people can see up to 94 PPD. This means that 4K monitors on your desk are trivially within the range that people can distinguish, but it’s dubious that 8K TVs are useful. The more you know!


If you can’t tell the difference, your corrective eyewear may be insufficiently corrective. The people who can see can tell the difference between FHD and 4K.


The study says that users can appreciate resolutions up to 94 pixels per degree. An FHD 27" monitor at an 18" distance is 29 PPD; at 4K it’s 58 (see the sketch at the end of this comment). Users can appreciate the fact that a 4K display is much better.
And no, 4k desktops do not “look nicer”, it is stupid and tiny for no reason. Unless you have like 250 shortcuts on your desktop what is the point?
Couldn’t find the setting called “scale” on your Windows desktop? OK, Mr. Manager. Do you also call IT when your monitor is turned off to tell them your CPU is broken?
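If anyone wants to check those PPD numbers, here’s a rough sketch. The 16:9 aspect ratio, the horizontal-FOV definition of PPD, and the 65"/8 ft TV example at the end are my own assumptions, not from the study:

```python
import math

def ppd(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Rough pixels per degree across the horizontal field of view."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # physical panel width in inches
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

# 27" desktop monitor viewed from 18"
print(round(ppd(27, 1920, 18)))   # ~29 PPD for FHD
print(round(ppd(27, 3840, 18)))   # ~58 PPD for 4K, still under the ~94 PPD figure

# Hypothetical 65" TV viewed from about 8 feet (96")
print(round(ppd(65, 3840, 96)))   # ~117 PPD for 4K, already past ~94
print(round(ppd(65, 7680, 96)))   # ~234 PPD for 8K, which is why 8K TVs are dubious
```

Same formula both times: a desk-distance 4K monitor is still comfortably below the ~94 PPD ceiling, while a 4K TV at couch distance is already past it.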


Basically every modern OS in existence, including Linux, supports proper scaling for higher-resolution displays. You don’t just have to make the text bigger; proper scaling is implemented, and integer scaling is the best supported (sketch at the end of this comment).
https://linux-hardware.org/?view=mon_resolution&colors=10
Let’s look at desktop users:
4K = 13.7% of Linux users
QHD = 12.4%
3440x1440 = 3.9%
Roughly 30% of desktop users are running above FHD.
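To make the scaling point concrete, here’s a minimal sketch (the panel sizes are just illustrative): at an integer scale factor the UI is laid out as if the screen were smaller, and each logical pixel is drawn with extra physical pixels, so nothing ends up tiny.

```python
def logical_size(physical_w, physical_h, scale):
    """Layout size the UI 'sees' for a given physical resolution and scale factor."""
    return physical_w / scale, physical_h / scale

print(logical_size(3840, 2160, 2))    # (1920.0, 1080.0): same layout as FHD, 4x the pixels
print(logical_size(3840, 2160, 1.5))  # (2560.0, 1440.0): fractional scaling, QHD-like layout
print(logical_size(2880, 1800, 2))    # (1440.0, 900.0): a typical 2x laptop panel
```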


On the internet, where you go by “Moonpoo”, you in fact have no credentials, because nobody can verify anything.
It is, in a way, hilarious to imagine that IBM is so broken that its employees can’t figure out how to make fonts not tiny on 4K. You must have been a manager.


Credentials like “made my living in hardware” are both non-specific and non-verifiable; they mean nothing. I have two 27" 4K 60 Hz monitors because last-gen hardware just isn’t that expensive.
When not gaming, this looks nicer than 2x FHD, and I run games at either 1080p or 4K depending on what settings need to be set to get a consistent 60 FPS. My hardware isn’t poverty level, nor is it expensive; an entry-level Mac would cost more.
Leaving gaming aside, isn’t it obvious to you that 4K looks nicer in desktop use, or are your eyes literally failing?
Actually, in most of Western Europe it’s about half who think he’s their enemy, forty-some percent who think he gives zero fucks either way, and only 7-9% who think he’s on their side.