Is there something that can generate random Internet usage to make the real sites I go to a bit obfuscated?
I'm thinking of something that runs on my server and simply visits a random website. It probably shouldn't actually be random, though, and some sort of tweaking would be great, like the ability to have it visit every news site there is. That way the ISP will have a harder time telling my political bias.
The threat model for this sits below using a VPN for normal usage, although getting a dedicated VPN IP address is a project for another day.
It'd be pretty quick to write a script that loads a randomly selected URL from a prepopulated list at random intervals. You could probably do it in Greasemonkey directly in Firefox, so you could also use other tools alongside it, like AdNauseam and a user-agent spoofer.
Just start listing the most popular and generic sites, then Google a topic like technology and copy whatever sites come up. I imagine you could have a pretty decent list populated in 15 minutes. You could also just ask ChatGPT to create lists of the top 100 sites for "x".
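For the prepopulated list, here's a sketch of what a starter websites.csv might look like — one site per line, URL in the first field, with an optional tag in the second (the specific sites and tags are just placeholders, swap in whatever list you build):

```shell
# Write a small sample websites.csv. The second field is an optional
# category tag; anything after the first comma gets ignored when the
# script extracts the URL with cut.
cat > websites.csv <<'EOF'
https://www.bbc.com,news
https://www.reuters.com,news
https://arstechnica.com,technology
https://www.wikipedia.org,reference
EOF

# Sanity check: count the entries
wc -l < websites.csv
```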
What would you write it in? I might be willing to help, because this interests me as well.
#!/bin/bash
# Random_Curl_Request.sh
# CSV file containing websites (URL in the first field of each line)
CSV_FILE="/home/user/Documents/randomSiteVisitor/websites.csv"

# Make a curl request to a randomly selected website every minute
while true; do
    # Pick a random line from the CSV file and extract the URL (first field)
    WEBSITE=$(shuf -n 1 "$CSV_FILE" | cut -d ',' -f 1)
    # -s silences progress output, -L follows redirects; discard the body
    curl -sL "$WEBSITE" > /dev/null
    sleep 60
done
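One caveat with that loop: a request every 60 seconds on the dot is itself a recognizable pattern. A minimal sketch of adding jitter instead — the 30–120 second window here is just an assumption, tune it to taste:

```shell
# Sketch: compute a randomized delay instead of a fixed `sleep 60`, so
# the traffic doesn't tick at an exact one-minute clock.
# $RANDOM is bash's built-in 0..32767 integer; modulo 91 gives 0..90,
# so DELAY lands somewhere in [30, 120] seconds.
DELAY=$((30 + RANDOM % 91))
echo "$DELAY"
```

Inside the loop you'd then use `sleep "$DELAY"` (recomputing DELAY each iteration) in place of `sleep 60`.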