I'm actually still working on a project kinda related to this, but I'm currently in a serious "is this embarrassingly stupid?" stage, because I'm designing something without enough technical knowledge to know what's possible, while trying to stay focused on the purpose and desired outcome.
Our position on this is that we don't include any AI tools in our apps, and we let users choose where to back up their work, so they can pick services that don't allow AI access. Thanks :)
I spent a good chunk of my 20s obsessed with building a co-writing web platform I called PlotPlant. I really want to riff off what you did here, but I'm scared it will reignite my interest in the project and I'll just add to the pile of unfinished work.
I joined a writing meetup here in Amsterdam that gathers every week in a bar to write, talk about our writing, bounce ideas around, etc. I kinda got tired of going because a worrying number of people were using ChatGPT to generate ideas. I was the only one trying to write non-fiction, and most of what I was writing was critique of tech (sometimes genAI), so talking about my writing was always fun. Nonetheless, their use of ChatGPT seemed extra weird because we were there, together, to write and support each other, for free.
It's strange to cite solidarity, support, and just general helpfulness from others as an explanation for how AI opens writing up to people across classes or abilities, when that's probably one of the top things that social media (and pre-social-media social media) gave us on the internet.
I ask you this hoping it isn't insulting, but how are you with OS kernel-level stuff?