• 0 Posts
  • 153 Comments
Joined 1 year ago
Cake day: June 21st, 2023


  • There’s some slight technical reason for it, but I think they swung a bit too far in the asshole direction by blocking too many devices.

    The LTE rollout was completely botched from the start. LTE voice is technically supported on all LTE chipsets, but early on the voice spec kept changing. Early phones used LTE for data and fell back to 2G or 3G for voice.

    Complicating matters further, AT&T and Verizon each run separate, slightly tweaked versions of the spec, since they didn’t want to wait for it to be finalized, and of course they’re each different in different ways. It’s also why T-Mobile allows so many devices: they just rode their HSPA+ network, which was very fast for the time, until LTE was finalized, got generic hardware on the network, and flipped the switch.

    To top it off, AT&T was sued at one point over 911 not working due to a handset bug, and they got very controlling after that to avoid future lawsuits.

    VoLTE is essentially VoIP over cellular data at its core. All phones have to speak the correct SIP signaling over VoLTE for voice calls to work. With 2G and 3G, the circuit-switched method of signaling was much more standardized (although not necessarily simpler; the WCDMA spec by the end spanned literal volumes of books). This made phones and networks more easily compatible for basic things like voice, 911, etc.
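
    To make the “it’s basically VoIP” point concrete, here’s a toy sketch of the kind of SIP INVITE a phone’s IMS stack has to get exactly right for a VoLTE call to set up. Everything in it (numbers, addresses, tags) is made up, and a real call also involves IMS registration, authentication, and SDP codec negotiation on top of this:

    ```python
    # Illustrative only: a toy SIP INVITE of the sort a phone's IMS stack must
    # emit for VoLTE call setup. Numbers, addresses, tags, and branch values are
    # made up, and a real call also involves IMS registration, authentication,
    # and an SDP body negotiating codecs, none of which is shown here.
    CRLF = "\r\n"

    invite = CRLF.join([
        "INVITE sip:+15555550123@ims.example.net SIP/2.0",
        "Via: SIP/2.0/UDP 10.0.0.2:5060;branch=z9hG4bK-toy-branch",
        "Max-Forwards: 70",
        "From: <sip:+15555550100@ims.example.net>;tag=abc123",
        "To: <sip:+15555550123@ims.example.net>",
        "Call-ID: toy-call-id@10.0.0.2",
        "CSeq: 1 INVITE",
        "Contact: <sip:+15555550100@10.0.0.2:5060>",
        "Content-Type: application/sdp",
        "Content-Length: 0",  # a real INVITE carries an SDP body with codecs and RTP ports
        "",
        "",
    ])

    print(invite)
    ```

    Every one of those headers is a place where a carrier or handset vendor can (and did) interpret things slightly differently, which is why “supports LTE” never automatically meant “makes calls on this network.”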

    Now, on top of Verizon and AT&T thinking that rolling their own flavors of LTE was a good idea, every phone maker also had their own idea about how the VoLTE SIP signaling was supposed to work. Due to flaws in the LTE spec, carriers going rogue, and companies interpreting things wrong, it has turned quite literally into a clusterfuck.

    TL;DR: It took a long time for LTE to standardize enough across product lines, and there are a whole bunch of phone models that don’t speak the language quite right. So carriers chose to ban them rather than build workarounds or work with vendors to roll out a software fix to the phones.








  • My best guess: I know one of them uses Facebook, and they’re on Apple phones. Facebook, Uber, and a few others have had pretty deep access to APIs not available to other software companies. Sometimes they’re caught, like when Uber was caught using a screen-scraping API; sometimes they aren’t. The other guess that glues it together is that Facebook has indeed scraped audio to text for a long time. It was almost 10 years ago that I had the EE conversation.

    Google and Meta pay Apple money to gain access to their user metrics; it’s likely a symbiotic relationship. Facebook once had hooks directly in iOS. Likewise, the little mic/camera indicators the OS displays when they’re “active” are completely software-controlled and can be overridden.

    At one time, I worked at a company that had (and still has) deep access to other aspects of iOS. Apple always required that the source code be available to them so they could inspect it, and I doubt that has changed. It also means they would be complicit. External tools wouldn’t really be able to figure this out; for someone to black-box this, they’d need a jailbroken iPhone and some specialized tooling or MITM decryption capabilities.

    Not to sound hyperbolic: I’m connecting dots with no evidence, it’s pure speculation. But the compute seems to be there, and with no regulation of source code, anything goes if you want the money badly enough. Especially with the mad dash every tech company has been on for the last 20-ish years to harvest everything they can, ever since smartphones became powerful and commonplace enough.



  • Was on some United flights recently with their new seatback media systems. The user experience is much better than Delta’s, but they also actively harvest your information at your seat to build a “profile” on you. They even ask you to choose the type of flight profile you want, like “relax” or “fun,” and it modifies the content filters for you.

    The kicker, though, was on the last flight, when the lighting was just right and I noticed they have a pinhole camera installed on the lower left of the display, along with some IR blasters powering a proximity sensor around a software button.

    The blasters likely produce enough light that the camera can see you even when the screen is off and the cabin is dark. So they’re likely building passenger profiles with visual data now as well; it’d be trivial to run facial-expression recognition (“happy, sad, sleepy,” etc.) on top of capturing your movement in the seat. Did you just use your phone? Did you use the seatback screen? Are you reading a book? What food did you choose?
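
    Just to show how low the bar is, here’s a generic sketch using OpenCV’s stock face detector. The camera index is a placeholder, real “mood” classification would need a trained expression model on top of this, and none of it is claimed to be what United actually runs:

    ```python
    # Illustrative sketch of how little it takes to detect a face per frame.
    # The camera index is a placeholder, and nothing here is claimed to be what
    # any airline actually runs; real "mood" profiling would need a trained
    # expression model stacked on top of this.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(0)  # a seatback unit would read its own pinhole camera

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # A hypothetical profiler would crop each face and hand it to an
        # expression/attention classifier, logging results against the seat
        # number; this sketch only counts detections.
        print(f"faces in frame: {len(faces)}")

    cap.release()
    ```

    Per-frame detection like this is cheap enough for embedded hardware; the only expensive part is whatever analysis you choose to stack on top of it.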



  • And the health apps know when you’re sleeping; they know your heart rate throughout the day, your O2 sats. They can take all this mortality-risk data and factor it in: advertise drugs to you, advertise foods they know you’ll eat even though they’re bad for you, manipulate how your insurance pays out for your next treatment because it would have been preventable if you hadn’t eaten those donuts. The phone manufacturers know what apps you run, for how long, what you do in them (yes, even Apple, especially Apple; they hide behind “privacy” so you feel OK with what they do to you), what web pages you open, how long you view them.

    They could biometrically paint a picture of your day, your movement; there’s an entire profile of data available on many humans. I wouldn’t be surprised if they’re already tying heart-rate data to viewership of media and advertising.


  • It’s surprisingly easy to use adtech without voice and make a connection to serve a targeted ad. Had a friend ask me about what I was drinking. They were on my guest wifi network. They searched for it. Next day, I’m getting ads, because geoIP pinned my IP address as having an interest.
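
    A crude sketch of the join that makes this work, using nothing fancier than the shared public IP. The names and structure here are invented; real adtech leans on bid-stream data and device graphs, but the principle is the same:

    ```python
    # Toy illustration of interest pooling by public IP. All names and structure
    # are invented; real adtech joins bid-stream data and device graphs, but the
    # underlying principle is the same: devices behind one NAT share a signal.
    from collections import defaultdict

    interest_by_ip: dict[str, set[str]] = defaultdict(set)

    def record_search(public_ip: str, query: str) -> None:
        """Any device behind the NAT contributes interests to the shared IP."""
        interest_by_ip[public_ip].add(query)

    def pick_ad(public_ip: str) -> str:
        """Any other device behind the same NAT later gets those interests back."""
        interests = interest_by_ip.get(public_ip, set())
        return f"ad for {next(iter(interests))}" if interests else "generic ad"

    # The friend searches for the drink while on the host's guest wifi...
    record_search("203.0.113.7", "small-batch rye whiskey")
    # ...and the next day the host's phone, behind the same public IP, gets the ad.
    print(pick_ad("203.0.113.7"))
    ```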

    Also had someone who lives off the grid, with no active network or devices, watch a DVD of a movie; the entirety of their Internet connectivity was two cell phones in the room. They started seeing things related to the movie. They’re older and not constantly on their phones; the phones just sit somewhere in the room.

    Had a discussion with some tech friends a few years back and remarked that keeping a device awake to do this would take a lot of power. The EE mentioned that running audio recording would take basically nothing. I expanded from there: the device uploads audio for off-phone transcription to text, or queues batch jobs to process locally when the battery is high enough or the phone is on a charger, etc.

    It is 100% probable that code runs on phones and just ships off amalgamated text frequency charts or entire conversations and the user won’t even notice the battery dent.
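
    Purely as a sketch of the scheme described above (every name, threshold, and stub here is invented, not any real phone OS API): record tiny audio chunks constantly, defer the expensive transcription to times when the battery cost is invisible, and ship only a small summary.

    ```python
    # Speculative sketch of the pipeline described above: cheap always-on audio
    # capture, deferred batch processing, tiny upload. Every name, threshold, and
    # stub here is invented; nothing refers to any real phone OS API.
    import queue
    import time

    audio_chunks: "queue.Queue[bytes]" = queue.Queue()

    def capture_chunk() -> bytes:
        """Stand-in for a low-power mic buffer; capturing costs almost nothing."""
        return b"\x00" * 16000  # pretend: one second of 16 kHz 8-bit audio

    def on_charger_and_idle() -> bool:
        """Stand-in for 'plugged in overnight, screen off, nobody watching.'"""
        return time.localtime().tm_hour in range(2, 5)

    def transcribe(chunk: bytes) -> str:
        """Stand-in for on-device speech-to-text run as a deferred batch job."""
        return "<text>"

    def upload(word_counts: dict) -> None:
        """Stand-in for shipping an amalgamated frequency chart, not raw audio."""
        pass

    for _ in range(3600):  # stand-in for an always-on capture loop
        audio_chunks.put(capture_chunk())      # cheap part: runs constantly
        if on_charger_and_idle():
            counts: dict = {}
            while not audio_chunks.empty():    # expensive part: deferred batch
                for word in transcribe(audio_chunks.get()).split():
                    counts[word] = counts.get(word, 0) + 1
            upload(counts)                     # tiny payload, no battery dent
        time.sleep(1)
    ```

    The capture side is the only part that has to run all the time, and it’s the cheap part; the battery-visible work hides behind “charging and idle,” which is exactly when nobody is looking at the battery stats.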

    That being said, I’d like to think that even the greediest capitalist money-claw giving this the go-ahead would stop and think, “well, I can’t trust my own device anymore…” and maybe go, “yeah, I shouldn’t do this.” Maybe I’m too optimistic though.