Because it's a garbage proprietary format that needs extra software on every OS. But for some inane reason it's become the standard for piracy stuff. I think that's the only reason it's still alive.
For a file of a few hundred kilobytes, sure, the difference is pocket change. For a larger one you'd choose the right tool for the job, though, especially for things like a split archive or a database.
Nowadays it matters whether you use a compression algorithm that can utilize multiple cores for packing/unpacking larger data. For a multi-GB archive that can be the difference between "I'll grab a coffee until this is ready" and "I'll go for lunch and hope it's done when I come back".
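A minimal sketch of what that looks like in practice, assuming pigz and zstd are installed (directory and archive names are just examples):

    tar -cf - big_dir/ | pigz -p 8 > big_dir.tar.gz     # gzip-compatible output, 8 threads
    tar -cf - big_dir/ | zstd -T0 -o big_dir.tar.zst    # zstd, all available cores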
I personally prefer bzip2, but archives need to be packed with pbzip2, not regular bzip2, to be extractable on multiple cores. Not a good option if you have to think about Windows users, though.
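Something like this, assuming the pbzip2 package is installed (paths are made up):

    tar -cf - data/ | pbzip2 -c -p4 > data.tar.bz2    # compress on 4 cores
    pbzip2 -d -c -p4 data.tar.bz2 | tar -xf -         # parallel decompression only works on pbzip2-made archives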
Because gzip and bz2 exist. 7z is almost always a plugin, addon, or extra application, while the first two work out of the box pretty much everywhere. It also depends on frequency of access, frequency of addition, size, type of data, etc. If you have an archive you frequently add new files to, 7z is going to start grating on you with the compression times. But it is OK if you are going to extract very frequently from an archive that will never change. gz and bz2, meanwhile, are the overall “good enough at every use case” formats.
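To make the add-files point concrete, a rough comparison (filenames hypothetical; 7z here is the p7zip command-line tool):

    tar -rf archive.tar newfile.txt    # append to an uncompressed tar: quick, no recompression
    7z u archive.7z newfile.txt        # update a 7z: solid archives can trigger expensive recompression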
For archiving/backing up *NIX files, tar.whatever still wins, as it preserves permissions while 7z, zip, and rar don't.
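For example (note that the -p flag matters on the extraction side; restoring ownership additionally needs root):

    tar -czf backup.tar.gz project/    # modes and owners are always stored in the archive
    sudo tar -xpzf backup.tar.gz       # -p restores permission bits; running as root also restores owners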
Oh, and while 7z is FOSS and supported out of the box on most Linux desktop OSes and on macOS, Windows users will complain they need to install stuff to open your archive. Somehow, tar.gz is supported out of the box on Linux, macOS, and yes, Windows 10 and 11!
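If I remember right, a bsdtar-based tar.exe has shipped with Windows 10 since the April 2018 update, so this works straight from cmd or PowerShell with no extra software:

    tar -xzf archive.tar.gz    # built into Windows 10/11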
In the early days of the internet, WinZip was a must-have tool. My college had a fast internet connection. I say fast, but I bet it was less than 1 Mbit shared between everyone. Still way faster than the 33k modem I had at home.
I used my college connection to download so much and then took it home on floppy disks. For files larger than 1 MB, I'd use WinZip to split them up across disks.
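The plain command-line equivalent of that trick, for anyone curious (filenames hypothetical):

    split -b 1440k bigfile.zip part_    # floppy-sized chunks
    cat part_* > bigfile.zip            # reassemble on the other machine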
Still using 7z. Less space, and easier to browse, since the operating system doesn't have to deal with all the individual files; easier for the cloud to tag, too.
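For reference, a typical workflow with the p7zip CLI (archive and folder names hypothetical):

    7z a -t7z -mx=9 archive.7z folder/    # create with maximum compression
    7z l archive.7z                       # list contents without extracting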
Not caring about space makes storage more expensive; even games are bigger now with little to no content.
How about when people's websites would put the sizes of linked images and files so you could estimate how long it would take to download a given image? Basically anything 30 KB and above would have a size warning attached.
Depends on what you're doing. Dumps of multitrack CD media should always be bin+cue or a compressed version thereof, such as chd (see the chdman commands below). DVDs and Blu-rays can be dumped as iso. There are also some extremely niche cases, such as specific copy protection, that require mdf+mds for a proper dump, but that won't be something the average user ever encounters. Basically, those formats exist and are still used for a reason, whether you understand them or not.
I do reserve some hatred for people who dump PS1 games as iso, or who use ccd+img+sub for things where the subchannels have no valid usage.
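If anyone wants the chd route mentioned above, chdman from the MAME project does the conversion losslessly (filenames hypothetical):

    chdman createcd -i game.cue -o game.chd     # compress a bin+cue dump
    chdman extractcd -i game.chd -o game.cue    # round-trips back to bin+cue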
A 1:1 copy of the bits on the disc is a valid option that some people prefer, especially if you want to make your own physical disc or make compressed files encoded in a very specific way. It's also the most reliable way to archive a disc for long-term storage.
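A minimal sketch for single-track data discs (DVDs, data CDs); multitrack CDs need a proper bin+cue rip instead, as noted above. The device path is an assumption:

    dd if=/dev/sr0 of=disc.iso bs=2048 status=progress    # raw 1:1 dump, 2048-byte sectors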