Jesus Christ, it feels like every day I learn some new, dreadful horror about American life. How the fuck have we not run out of these creepy facts yet? The country is so fucked up, I swear to god... It's honestly hard to believe it's a real place. It didn't really sink in how much of a shithole it is until I saw it myself... It's a twilight world where pain is a virtue for some reason.
Edit: misread. I thought you asked what this was about. Yeah, I'm sure it wasn't just here.
Men would sometimes have doctors add an extra stitch to women after childbirth to make them tighter. I think I've even heard that sometimes doctors did it without the consent of either party. Some women didn't even know it had been done until they had complications and found out.