• 0 Posts
  • 419 Comments
Joined 5 months ago
Cake day: February 10th, 2025

  • The problem with trying to increase the signal-to-noise ratio is that you don’t know all of the data points being collected, and some of those data points could be used to filter the real from the fake.

    Like, in your example, if you made all of these accounts from the same browser then they could be linked together. If they were made on the same IP, they could be linked together. If you were using the same phone, they could be linked together. Those are just the data points that we know to try to protect; it’s the data points that you don’t know about that get you.

    Like, maybe your phone or desktop is screenshotting itself every 5 seconds (“for AI purposes”) or maybe the app that you’re trying to fool also secretly sends your GPS location during account creation or maybe the adversary has malware running on your PC which is keylogging you.

    If you knew all of the ways that they were collecting data on you, then you could take countermeasures. Since you don’t, you have to assume that any of your identities can be linked to your person unless you take unusual measures, such as not using Microsoft/Google/Meta/Amazon/etc. products at a minimum. Depending on your security needs, this could also mean things like using burner hardware, non-commercial VPNs, physically disabling sensors/radios/ports, traffic/network monitoring, etc.
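The linking described above can be sketched in a few lines. This is a toy model with entirely made-up records and account names; real brokers work with far richer fingerprints, but the clustering logic is the same: any single shared data point is enough to merge two “separate” identities into one cluster.

```python
# Toy sketch (all data hypothetical): how accounts that share any one data
# point (IP, browser fingerprint, device ID) can be linked into one cluster.
from collections import defaultdict

# Hypothetical signup records: account -> data points observed at creation.
signups = {
    "alt_account_1":  {"ip:203.0.113.7", "ua:Firefox/128", "dev:phone-abc"},
    "alt_account_2":  {"ip:198.51.100.2", "ua:Firefox/128", "dev:phone-abc"},
    "real_identity":  {"ip:203.0.113.7", "ua:Chrome/126"},
    "unrelated_user": {"ip:192.0.2.9", "ua:Safari/17"},
}

# Invert the records: data point -> accounts that exhibited it.
seen = defaultdict(set)
for account, points in signups.items():
    for p in points:
        seen[p].add(account)

# Union-find: merge any accounts that share a data point.
parent = {a: a for a in signups}

def find(a):
    while parent[a] != a:
        parent[a] = parent[parent[a]]  # path halving
        a = parent[a]
    return a

def union(a, b):
    parent[find(a)] = find(b)

for accounts in seen.values():
    accounts = list(accounts)
    for other in accounts[1:]:
        union(accounts[0], other)

# Collect the resulting identity clusters.
clusters = defaultdict(set)
for a in signups:
    clusters[find(a)].add(a)

print(sorted(sorted(c) for c in clusters.values()))
# The alts share a device with each other and an IP with the real identity,
# so all three land in one cluster; only the unrelated user stays separate.
```

Note that neither alt shares every attribute with the real identity; one overlapping data point anywhere in the chain is enough to collapse the whole set.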


  • How are they gonna trace that to you?

    The modern Internet is essentially about spying on you as much as possible and then selling the data to whoever wants to buy it. Linking identities with devices/browsers is worth a lot of money, and so almost every website/app has a way of linking you to the devices and software that you use.

    Unless the user took some pretty extreme measures to create the account, they’ve likely logged in from a phone/ip/browser that has been linked to their real identity at some point in its lifetime. That link will be sold to data brokers and used to tie the random handle to you, the person. Then the State Department just buys that information.

    Alternatively, you should assume that sovereign entities with the means are reading all public network data. There’s a lot of information you can learn from that as well. Like, over time, the posts from the ‘random’ account could be strongly correlated with the times that you were accessing the site, even if all of the traffic was encrypted with HTTPS.

    Alternatively, alternatively: there is a threat known as Store Now, Decrypt Later (SNDL). The idea is basically: quantum computers are coming, and they will break some cryptographic primitives. If someone saves all of the encrypted traffic that they want to read, in a few years they will have the means to read it. We won’t know when this moment occurs, because it will likely be kept secret, but we do know that it will happen. So you should additionally assume that anything which transited a public network without post-quantum encryption will eventually be read and used to link you to your identities.

    This is, essentially, the core thing that the Privacy community is attempting to mitigate.
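The timing-correlation attack mentioned above is simple enough to sketch. All timestamps here are fabricated, and real analyses use proper statistics over months of data, but the core idea fits in one function: HTTPS hides *what* you sent, not *when* your IP talked to the site, and post times that consistently follow your connections give you away.

```python
# Toy illustration (fabricated timestamps): a passive observer correlates an
# account's public post times with the times an IP contacted the site.
post_times = [100, 340, 910, 1500, 2200]         # public timestamps of "anon" posts
your_traffic = [95, 338, 905, 1497, 2195, 3000]  # when your IP hit the site
other_traffic = [50, 700, 1900]                  # some other user's connections

def match_rate(posts, traffic, window=10):
    """Fraction of posts preceded by a connection within `window` seconds."""
    hits = sum(any(0 <= p - t <= window for t in traffic) for p in posts)
    return hits / len(posts)

print(match_rate(post_times, your_traffic))   # 1.0: every post follows your traffic
print(match_rate(post_times, other_traffic))  # 0.0: no consistent alignment
```

A single coincidence proves nothing, but a match rate near 1.0 sustained over many posts is strong evidence that the account and the connection belong to the same person.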





  • The language about collecting and using data has been in the Terms of Service of basically every online service since the early ’00s.

    I’m not saying that this is okay. The data that these services collect, which we’ve given them unlimited rights to, has only become more valuable and the incentives for these companies are always for them to gather more data about you.

    You can use archive.org if you want to look at older policies from the same company. But if you pull up any other game with an online component, you will see that they are all essentially “don’t cheat our services or hide your identity, we’re going to collect your data and use it how we want, and you have to enter into binding arbitration,” with various levels of detail and verbosity.








  • It is also worth noting that no Borderlands game uses anti-cheat, much less kernel-level anti-cheat. I’d even go as far as to say that no Gearbox, Take-Two, or 2K game uses kernel-level anti-cheat.

    This is boilerplate language for games which include an online service component. Publishers often use the same Terms of Service across all of their games, so they include language that is often irrelevant for any specific game.

    The only thing that’s different this time is that there are a bunch of bored people who consume engagement-farming content, which often makes outrageous claims in order to earn money from that engagement. This “story” is not an actual story, but it is a great example of how a mob can be summoned with some creative writing and a credulous audience.