• 0 Posts
  • 19 Comments
Joined 11 months ago
Cake day: August 14th, 2023

  • Yeah, timestamps should always be stored in UTC, but actual planning of anything needs to be conscious of local time zones, including daylight saving. A description of when a place is open might be simple in local time but gets clunky in UTC once you account for daylight saving transitions, local holidays, etc.
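
    For example, a minimal Python sketch of that split (the venue’s timezone and opening hours here are invented for illustration):

    ```python
    from datetime import datetime, time, timezone
    from zoneinfo import ZoneInfo  # Python 3.9+

    # Store the timestamp in UTC...
    event_utc = datetime(2023, 11, 5, 14, 30, tzinfo=timezone.utc)

    # ...but answer "is the place open?" in the venue's local time,
    # which transparently handles the DST transition on this date.
    local = event_utc.astimezone(ZoneInfo("America/New_York"))
    opens, closes = time(9, 0), time(17, 0)  # hypothetical hours
    print(local, opens <= local.time() < closes)
    ```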


  • They did, eventually. The first PlayStation was relatively easy to pirate (with a mod chip), but it took a while for that stuff to become available. Someone had to manufacture the chips, or reverse engineer the check.

    By the time that scene matured, Sega released the Dreamcast right into a more sophisticated piracy scene that could apply its lessons to the Dreamcast right away.

    On paper, Sega had more sophisticated copy protection than the first PlayStation did. But it also launched four years later.


  • “you could go to your local library and carry a USB stick.”

    I don’t remember it this way. Nothing else came close to the portable storage capacity of CDs (and thus CD-R and CD-RW). The Iomega Zip drive was still a popular medium, with rewritable 100 MB or 250 MB cartridges. That was the preferred way to get big files to and from a computer lab when I was an engineering student in 2000.

    USB flash drives had only just been released in 2000, and their capacities were like 8/16/32 MB, nowhere near enough to meaningfully move CD images.

    Then again, as a college student with on-campus broadband on the completely unregulated internet (back when HTTP and the WWW weren’t necessarily considered the most important protocols on the internet), it was all about shared FTP logins PMed over IRC to download illegal shit. The good stuff never touched an actual website.





  • I’d say the real world doesn’t reward being actually gifted.

    More accurately, the real world punishes being below average at any one of like a dozen skill sets. You can’t min/max your stats, because being 99th percentile at one thing won’t make up for being 30th percentile at another. Better to be 75th percentile at both.

    The real world requires cross-disciplinary coordination, which means thriving requires both soft skills and multiple hard skills.





  • It basically varies from chip to chip, and program to program.

    Speculative execution is when a program hits some kind of branch (like an if-then statement) and the CPU just goes ahead and calculates as if it’s true, and progresses down that line until it learns “oh wait it was false, just scrub all that work I did so far down this branch.” So it really depends on what that specific chip was doing in that moment, for that specific program.

    It’s a very real performance boost for normal operations, but for cryptographic operations you want every function to perform in exactly the same amount of time, so that something outside that program can’t see how long it took and infer secret information.

    These timing/side-channel attacks generally work like this: imagine a program that tests whether a variable X is prime by checking whether each number from 2 up to X divides it evenly. The bigger X is, the longer that function takes, so if it takes a really long time, you’ve got a pretty good idea of how big X is. A separate program that isn’t allowed to read the value of X, but can watch another program operate on X, might therefore be able to learn bits of information about X (toy sketch below).
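
    A toy Python version of that leak (the “secret” values are arbitrary; 999983 is just a large prime so the loop runs long):

    ```python
    import time

    def is_prime_naive(x: int) -> bool:
        # Trial division from 2 up to x: the runtime grows with x,
        # and that growth is the information leak.
        for d in range(2, x):
            if x % d == 0:
                return False
        return x > 1

    # An observer who can't read the secret, but can time the
    # function, still learns roughly how big the secret is.
    for secret in (101, 999_983):
        start = time.perf_counter()
        is_prime_naive(secret)
        print(secret, round(time.perf_counter() - start, 6))
    ```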

    Patches for these vulnerabilities change the software to make those programs/functions run in fixed time, but then you lose all the efficiency gains of being able to finish faster; you’ve slowed the program down to its weakest link, so to speak.
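
    The classic small-scale version of that trade-off, sketched in Python: an early-exit comparison leaks how much of a guess was right, while the fixed-time version (here via the standard library’s hmac.compare_digest) gives up the early exit:

    ```python
    import hmac

    def leaky_equal(a: bytes, b: bytes) -> bool:
        # Bails out at the first mismatch, so the runtime reveals
        # how many leading bytes of the guess were correct.
        if len(a) != len(b):
            return False
        for x, y in zip(a, b):
            if x != y:
                return False
        return True

    def fixed_time_equal(a: bytes, b: bytes) -> bool:
        # Inspects every byte no matter where mismatches occur,
        # trading the early exit for a uniform runtime.
        return hmac.compare_digest(a, b)
    ```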


  • This particular class of vulnerabilities, where modern processors try to predict what operations might come next and perform them before they’re actually needed, has been found in basically all modern CPUs/GPUs. Spectre/Meltdown, Downfall, Retbleed, etc., are all hardware vulnerabilities in this class that can leak cryptographic secrets. Patching them generally slows performance considerably, because the actual hardware vulnerability can’t be fixed directly.

    It’s not even the first one for the Apple M-series chips. PACMAN was a vulnerability in M1 chips.

    Researchers will almost certainly continue to find these, in all major vendors’ CPUs.







  • Notice that your comment is framed from the perspective of what Libertarians believe, and analyzes from that context. Mine is different: it analyzes a specific type of personality common in tech careers, and why that type of person tends to be much more receptive to libertarian ideas.

    I’m familiar with libertarianism and its various schools/movements within that broader category. And I still think that many in that group tend to underappreciate issues of public choice, group behaviors, and how they differ from individual choice.

    Coase’s famous paper, “The Nature of the Firm,” tries to bridge some of that tension. But it’s also not hard to see that human association into groups lies on a spectrum of voluntariness, with many more social situations being coercive than Libertarians tend to appreciate, and Coase’s observations about the efficiencies of association apply to involuntary associations, too.

    Then at that point you’re having a discussion about public choice theory: what the group owes to defectors, minority views, or free riders within it; what it owes to those outside the group in terms of externalities; and how to build a coalition within that framework of group choice. Your nuanced position might have started as libertarianism, but it ends up looking a lot like mainstream political, social, and economic views, to the point where the libertarian label isn’t that useful.


  • I think technical-minded people tend to gravitate towards libertarian ideologies because they tend to underestimate the importance of human relationships to large-scale systems. You can see it in the stereotype of the lone programmer who dislikes commenting, documentation, and collaboration with other programmers, and who holds strongly negative views of their own project managers or their company’s executives. They also tend to have a negative view of customers/users, and don’t really believe in spending much time on user interfaces/experiences. They have a natural skepticism of interdependence, because it brings on social overhead they don’t particularly believe they need. So they tend to view the legal, political, and social world through that same lens.

    I think the modern world of software engineering has moved away from that, as code complexity has grown to the point where maintainability/documentation and collaborative processes have obvious, up-front benefits, but you still see streaks of that in some personalities. And, as many age, they get firsthand experience with projects that were technically brilliant but doomed for financial/business reasons, or even social/regulatory ones. The maturation of technical academic disciplines around design, user experience, architecture, maintainability, and security makes that “overhead” more immediately visible, so they’re more likely to understand that interdependence is part of the environment they must operate in.

    A lot of these technical-minded people then see the two-party system as a struggle between MBAs and PhDs, neither of whom they actually like, and prefer that problems be addressed organically at the lowest possible level with the simplest, most elegant rules. I have some disagreements with typical libertarians on what weight should be assigned to social consensus, political/economic feasibility, and elegant simplicity in policymaking, but I think I get where most of them are coming from.