Is there something that can generate random Internet usage to make the real sites I go to a bit obfuscated?
I'm thinking something that runs on my server, and simply visits a random website. It probably shouldn't actually be random, and some sort of tweaking would be great. Like the ability to have it visit every news site there is. That way the ISP will have a harder time telling my political bias.
The threat model for this sits below using a VPN for normal usage, although getting a dedicated VPN IP address is a project for one day.
As I mentioned, I have a server, and I always connect to it over a VPN. That makes using a paid VPN on top a bit harder. A dedicated VPN IP should fix this issue, but I haven't looked into how difficult that'd be.
Yeah, it really slims down your VPN choices, since having an IP address associated with your account makes it much more identifiable. So some providers won't offer them (such as Mullvad).
It also usually costs more. The one I know of that offers a static IP is ExpressVPN, and I've heard Proton has plans to offer it. It looks like PIA offers it too.
I know with Tailscale you can set Mullvad as the exit node for all clients within a subset. I imagine you can do something similar with a private VPN, with a ton more effort.
If you're on a VPN then the thousand other people using that server provide that type of obfuscation.
What you seem to want is a web crawler that is perceived to be a real human and navigates alternate sites in real time. That's a near-impossible cat-and-mouse game, and ultimately not worth the effort. All you have to do is be harder to track than the majority. The value for ad tech is in efficiently profiling the 99%, not the 1% of paranoid folk.
You can run an ArchiveTeam Warrior on your server and choose the URLs project. If I understand correctly, the Warrior will continuously visit randomly discovered websites to download their contents and upload them to a server that later feeds the data into the Internet Archive. Best of both worlds: your ISP has a harder time distinguishing your real traffic from the ArchiveTeam-generated traffic, and your server is actively contributing to IA.
God, I only use Ublock Origin on Firefox. No TOR, VPNs, or anything like that.
Despite that, there are a handful of Google-related websites like VirusTotal that now permanently trap me in repeating captchas. YouTube will occasionally decide to block my IP entirely for a week.
Let me tell you, this shit doesn't make me more inclined to disable ad blocking. Instead, I've started finding alternatives and using a sandboxed vanilla Chromium for problem pages.
First off, if you're concerned about ISPs selling your data (I couldn't tell whether that's part of your concern), switching to a private DNS provider and enabling DNS over HTTPS/TLS can significantly cut down on that, since much of what ISPs sell comes from DNS requests. That said, they can still tell what sites you visit if you don't use a VPN/Tor, but they're less likely to care unless you're doing something illegal.
In terms of your obfuscation plan, I'm not sure that'd do much; if anything, it'd make you stand out more. A bunch of random traffic, even tweaked to fit your browsing habits, probably would look suspicious on their end and it wouldn't actually hide or disguise anything.
So ideally, you're just going to want to set up some sort of VPN at some point. Switching DNS providers might help a bit in reducing the sale of your traffic data, however. My recommendation is Quad9, but any privacy-friendly provider is fine.
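For Firefox specifically, DNS over HTTPS can be toggled via the `network.trr.mode` and `network.trr.uri` prefs in about:config; a sketch assuming Quad9's standard DoH endpoint:

```
network.trr.mode = 2    (2 = DoH with fallback to system DNS, 3 = DoH only)
network.trr.uri  = https://dns.quad9.net/dns-query
```

The same thing is exposed in the Settings UI under Privacy & Security, if you'd rather not touch about:config.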
It'd be pretty quick to write a script that loads a randomly selected URL from a prepopulated list at random intervals. You could probably even do it with Greasemonkey directly in Firefox, so you could combine it with other tools like AdNauseam and a user-agent spoofer.
Just start listing the most popular and generic sites, then Google a topic like technology and copy whatever sites come up. I imagine you could have a pretty decent list populated in 15 minutes. You could also just ask ChatGPT to create lists of the top 100 sites for "x".
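As a sketch of that list-building step (the topic names and domains below are placeholders, not recommendations), you could keep one list per topic and merge them into a single deduplicated pool:

```python
# Hypothetical per-topic seed lists; swap in whatever you collect.
SEED_LISTS = {
    "news": ["https://example-news.com", "https://example-times.com"],
    "tech": ["https://example-tech.com", "https://example-news.com"],
}

def build_url_pool(seed_lists):
    """Merge per-topic URL lists into one deduplicated, ordered pool."""
    seen = set()
    pool = []
    for topic in sorted(seed_lists):
        for url in seed_lists[topic]:
            if url not in seen:
                seen.add(url)
                pool.append(url)
    return pool
```

Keeping the topics separate makes it easy to weight them later (e.g. visit news sites more often than the generic filler).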
What would you write it in? I might be willing to help, because this interests me as well.
Just curl a bunch of sites at random times? Under HTTPS, everything in the URL except the domain is encrypted, so it'll look roughly like a regular user requesting a page.
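A minimal stdlib-only sketch of that idea (the URL list and timing bounds are placeholders; a real version would want better error handling and a realistic User-Agent string):

```python
import random
import time
import urllib.request

URLS = [  # placeholder list; substitute your own
    "https://example.com",
    "https://example.org",
]

def next_delay(rng, low=30.0, high=600.0):
    """Pick a human-ish pause between requests, in seconds."""
    return rng.uniform(low, high)

def fetch_once(url):
    """GET one page. Under HTTPS only the domain is visible to the ISP."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.status

if __name__ == "__main__":
    rng = random.Random()
    while True:
        try:
            fetch_once(rng.choice(URLS))
        except OSError:
            pass  # dead site, DNS failure, etc. -- just move on
        time.sleep(next_delay(rng))
```

As noted elsewhere in the thread, bare GETs at random intervals are probably distinguishable from real browsing; this is the "naive solution" tier.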
It usually isn't super hard to tell apart randomized junk like this from real human patterns. That is why Tor Browser for example tries its best to make everyone look the same instead of randomizing everything.
That said, for the mere purpose of throwing off the ISP's profiling algorithms, you could write a relatively simple Python program to solve this. A naive solution would just do an HTTP GET to each site, but a better solution would mimic human web browsing:
Get a list of various news sites and political forum sites
Set up headless Firefox or Chromium
Use Selenium or similar to crawl links on each site. Make sure the pages fully load, and wait a random, human-like amount of time before going to the next page.
If you have no programming capability this will be rough. If you have at least a little you can follow tutorials and use an LLM to help you.
The main issue with this goal is that it isn't possible to tell how advanced your ISP's profiling is, so you have no way to know if your solution is effective.