A terrible day to have ears: Google’s NotebookLM auto-generates AI podcasts
  • I tried it with a bunch of stuff already and shared with friends.

    I hope your friends find a better friend.

  • Featured
    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2025
  • Found while poking around today: the Wikipedia club for cleaning up after AI.

    Example: the article Leninist historiography was entirely written by AI and previously included a list of completely fake sources in Russian and Hungarian at the bottom of the page.

  • Featured
    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2025
  • "My name is Scroder Cher. I take care of the place while the Master is away."

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 6 October 2025

    Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

    > The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
    >
    > Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    Last week's thread

    (Semi-obligatory thanks to @dgerard for starting this)

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • "Space opera's the same, but they call it le space opera."

    It's been a long time since I lived in France, so my sense of what is idiomatic has no doubt grown rusty, but "Théâtre D'opéra" doesn't sound right. The word "Théâtre" doesn't belong in a reference to the place where operas are performed. It's "L'opéra Garnier" and "L'opéra Bastille" in Paris and "L'opéra Nouvel" in Lyon, for example. I'd read "théâtre d'opéra" as more like "operatic theatre" in the sense of a genre (contrasted with, e.g., spoken-word theatre). I could be completely wrong here, but the title feels like a naive machine translation.

  • Routledge nags academics to finish books ASAP — to feed Microsoft’s AI
  • Masto reply guy: "is it not better to train LLM on proper academic papers rather than random rants on Twitter and Reddit?"

    well, actually, it's better to train them not at all, and if anyone advocates their use, to kick them in the nads repeatedly

  • threads is cookin tonite
  • The only use I've had for writing cursive in 30 years has been to copy out an anti-cheating pledge on a standardized test, because some fucker thought cursive magically makes a pledge 300% more honest.

  • Routledge nags academics to finish books ASAP — to feed Microsoft’s AI
  • I don't know of one that has definitively said that they won't.

    Fuck it, we're going to have to found a publisher.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • Surprisingly, not a sexbot!

    Well, not with that attitude

  • under categories "LEADERSHIP -> SAM ALTMAN": making drugs extremely uncool again
  • At some point, that just becomes unfair to the grass.

  • under categories "LEADERSHIP -> SAM ALTMAN": making drugs extremely uncool again
  • You guys aren’t gonna like to hear this, but being super wealthy and successful is always going to confer some degree of cool.

    "Suck a dick, dumb shit," the administrator said.

  • a16z picks the next tech hype after Web3 and AI! It’s … anime?
  • We trained a neural network on 29 years of Evangelion merchandise, and it screamed and dissolved into TANG.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • Twitter — Elon Musk’s X — may be the most fruitful platform for this kind of search thanks to its sub-competent moderation services.

    Zing.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • iterating on new core loops

    We're sorry, but the brainrot is too far advanced. Amputation is your only hope.

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • Large Reasoning Models

    May the coiners of this jargon step on Lego until the end of days

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024
  • From an article about a boutique brand that sells books to rich people:

    > Assouline has made its name publishing tomes that sell for $1,000 or more.

    Oh, so they publish textbooks.

    > They represent stealth wealth, intended to tell you what your hosts are about and to provide visual evidence: that the owners are people of wealth, education and taste.

    🎶 Please allow me to introduce myself 🎶

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 29 September 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024
  • For some reason, the news of Red Lobster's bankruptcy seems like a long time ago. I would have sworn that I read this story about it before the solar eclipse.

    > Of course, the actual reasons Red Lobster is circling the drain are more complicated than a runaway shrimp promotion. Business Insider’s Emily Stewart explained the long pattern of bad financial decisions that spelled doom for the restaurant—the worst of all being the divestment of Red Lobster’s property holdings in order to rent them back on punitive leases, adding massive overhead. (As Ray Kroc knows, you’re in the real estate business!) But after talking to many Red Lobster employees over the past month—some of whom were laid off without any notice last week—what I can say with confidence is that the Endless Shrimp deal was hell on earth for the servers, cooks, and bussers who’ve been keeping Red Lobster afloat. They told me the deal was a fitting capstone to an iconic if deeply mediocre chain that’s been drifting out to sea for some time.
    >
    > [...] “You had groups coming in expecting to feed their whole family with one order of endless shrimp,” Josie said. “I would get screamed at.” She already had her share of Cheddar Bay Biscuit battle stories, but the shrimp was something else: “It tops any customer service experience I’ve had. Some people are just a different type of stupid, and they all wander into Red Lobster.”

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024
  • Yeah, Krugman appearing on the roster surprised me too. While I haven't pored over everything he's blogged and microblogged, he hasn't sent up red flags that I recall. E.g., here he is in 2009:

    > Oh, Kay. Greg Mankiw looks at a graph showing that children of high-income families do better on tests, and suggests that it’s largely about inherited talent: smart people make lots of money, and also have smart kids.
    >
    > But, you know, there’s lots of evidence that there’s more to it than that. For example: students with low test scores from high-income families are slightly more likely to finish college than students with high test scores from low-income families.
    >
    > It’s comforting to think that we live in a meritocracy. But we don’t.

    And in 2014:

    > There are many negative things you can say about Paul Ryan, chairman of the House Budget Committee and the G.O.P.’s de facto intellectual leader. But you have to admit that he’s a very articulate guy, an expert at sounding as if he knows what he’s talking about.
    >
    > So it’s comical, in a way, to see [Paul] Ryan trying to explain away some recent remarks in which he attributed persistent poverty to a “culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working.” He was, he says, simply being “inarticulate.” How could anyone suggest that it was a racial dog-whistle? Why, he even cited the work of serious scholars — people like Charles Murray, most famous for arguing that blacks are genetically inferior to whites. Oh, wait.

    I suppose it's possible that he was invited to an e-mail list in the late '90s and never bothered to unsubscribe, or something like that.

  • "The Subprime AI Crisis" - Ed Zitron on the bubble's impending collapse
  • From the documentation:

    > While reasoning tokens are not visible via the API, they still occupy space in the model's context window and are billed as output tokens.

    Huh.
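    In other words: you pay for tokens you never get to see. A minimal sketch of the arithmetic, assuming the `usage` payload shape OpenAI documents for its reasoning models (the field names are that assumption; the numbers are invented):

    ```python
    # Reasoning tokens are invisible in the response but still counted
    # (and billed) as output tokens, per the quoted documentation.
    # Field names below follow OpenAI's documented `usage` object for
    # reasoning models; treat them as an assumption, not gospel.

    def billed_output_tokens(usage: dict) -> int:
        """Total output tokens you pay for, visible or not."""
        return usage["completion_tokens"]

    def invisible_share(usage: dict) -> float:
        """Fraction of billed output tokens you never see."""
        reasoning = usage["completion_tokens_details"]["reasoning_tokens"]
        return reasoning / usage["completion_tokens"]

    # Made-up example payload:
    usage = {
        "completion_tokens": 1000,  # what the bill is based on
        "completion_tokens_details": {"reasoning_tokens": 800},
    }

    print(billed_output_tokens(usage))      # 1000
    print(f"{invisible_share(usage):.0%}")  # 80%
    ```

    With numbers like that, four-fifths of the output you're charged for never leaves the black box.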

  • Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 22 September 2024

    Off-Topic: Music Recommendation Thread

    So, here I am, listening to the Cosmos soundtrack and strangely not stoned. And I realize that it's been a while since we've had a random music recommendation thread. What's the musical haps in your worlds, friends?

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 7 July 2024

    Honest Government Ad | AI

    Bumping this up from the comments.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 16 June 2024

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 June 2024

    Neil Gaiman on spicy autocomplete
    www.tumblr.com Neil Gaiman

    I apologize if you’ve been asked this question before I’m sure you have, but how do you feel about AI in writing? One of my teachers was “writing” stories using ChatGPT then was bragging about how go…

    > Many magazines have closed their submission portals because people thought they could send in AI-written stories.
    >
    > For years I would tell people who wanted to be writers that the only way to be a writer was to write your own stories because elves would not come in the night and do it for you.
    >
    > With AI, drunk plagiaristic elves who cannot actually write and would not know an idea or a sentence if it bit their little elvish arses will actually turn up and write something unpublishable for you. This is not a good thing.

    Cybertruck owners allege pedal problem as Tesla suspends deliveries
    arstechnica.com Cybertruck owners allege pedal problem as Tesla suspends deliveries

    Owners will have to wait until April 20 for deliveries to resume.

    > Tesla's troubled Cybertruck appears to have hit yet another speed bump. Over the weekend, dozens of waiting customers reported that their impending deliveries had been canceled due to "an unexpected delay regarding the preparation of your vehicle."
    >
    > Tesla has not announced an official stop sale or recall, and as of now, the reason for the suspended deliveries is unknown. But it's possible the electric pickup truck has a problem with its accelerator. [...] Yesterday, a Cybertruck owner on TikTok posted a video showing how the metal cover of his accelerator pedal allegedly worked itself partially loose and became jammed underneath part of the dash. The driver was able to stop the car with the brakes and put it in park. At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem.

    Meanwhile, layoffs!

    Google Books Is Indexing AI-Generated Garbage
    www.404media.co Google Books Is Indexing AI-Generated Garbage

    Google said it will continue to evaluate its approach “as the world of book publishing evolves.”

    > Google Books is indexing low quality, AI-generated books that will turn up in search results, and could possibly impact Google Ngram viewer, an important tool used by researchers to track language use throughout history.

    Elon Musk’s Tunnel Reportedly Oozing With Skin-Burning Chemical Sludge
    futurism.com Elon Musk’s Tunnel Reportedly Oozing With Skin-Burning Chemical Sludge

    Elon Musk's Boring Company has only built a few miles of tunnel underneath Vegas — but those tunnels have taken a toxic toll.

    [Eupalinos of Megara appears out of a time portal from ancient Ionia] Wow, you guys must be really good at digging tunnels by now, right?

    New York taxpayers are paying for spicy autocomplete to tell landlords they can discriminate
    themarkup.org NYC’s AI Chatbot Tells Businesses to Break the Law – The Markup

    The Microsoft-powered bot says bosses can take workers’ tips and that landlords can discriminate based on source of income

    > In October, New York City announced a plan to harness the power of artificial intelligence to improve the business of government. The announcement included a surprising centerpiece: an AI-powered chatbot that would provide New Yorkers with information on starting and operating a business in the city.
    >
    > The problem, however, is that the city’s chatbot is telling businesses to break the law.

    Chris Langan and the "Cognitive Theoretic Model of the Universe"? Oh boy!

    a lesswrong: 47-minute read extolling the ambition and insights of Christopher Langan's "CTMU"

    a science blogger back in the day: not so impressed

    > [I]t’s sort of like saying “I’m going to fix the sink in my bathroom by replacing the leaky washer with the color blue”, or “I’m going to fly to the moon by correctly spelling my left leg.”

    Langan, incidentally, is a 9/11 truther, a believer in the "white genocide" conspiracy theory and much more besides.

    Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 31 March 2024

    Elsevier keeps publishing articles written by spicy autocomplete

    If you've been around, you may know Elsevier for surveillance publishing. Old hands will recall their running arms fairs. To this storied history we can add "automated bullshit pipeline".

    In Surfaces and Interfaces, online 17 February 2024:

    > Certainly, here is a possible introduction for your topic:Lithium-metal batteries are promising candidates for high-energy-density rechargeable batteries due to their low electrode potentials and high theoretical capacities [1], [2].

    In Radiology Case Reports, online 8 March 2024:

    > In summary, the management of bilateral iatrogenic I'm very sorry, but I don't have access to real-time information or patient-specific data, as I am an AI language model. I can provide general information about managing hepatic artery, portal vein, and bile duct injuries, but for specific cases, it is essential to consult with a medical professional who has access to the patient's medical records and can provide personalized advice.

    Edit to add this erratum:

    > The authors apologize for including the AI language model statement on page 4 of the above-named article, below Table 3, and for failing to include the Declaration of Generative AI and AI-assisted Technologies in Scientific Writing, as required by the journal’s policies and recommended by reviewers during revision.

    Edit again to add this article in Urban Climate:

    > The World Health Organization (WHO) defines HW as “Sustained periods of uncharacteristically high temperatures that increase morbidity and mortality”. Certainly, here are a few examples of evidence supporting the WHO definition of heatwaves as periods of uncharacteristically high temperatures that increase morbidity and mortality

    And this one in Energy:

    > Certainly, here are some potential areas for future research that could be explored.

    Can't forget this one in TrAC Trends in Analytical Chemistry:

    > Certainly, here are some key research gaps in the current field of MNPs research

    Or this one in Trends in Food Science & Technology:

    > Certainly, here are some areas for future research regarding eggplant peel anthocyanins,

    And we mustn't ignore this item in Waste Management Bulletin:

    > When all the information is combined, this report will assist us in making more informed decisions for a more sustainable and brighter future. Certainly, here are some matters of potential concern to consider.

    The authors of this article in Journal of Energy Storage seem to have used GlurgeBot as a replacement for basic formatting:

    > Certainly, here's the text without bullet points:

    SneerClub Classic: Big Yud's Mad Men Cosplay


    Yudkowsky writes,

    > How can Effective Altruism solve the meta-level problem where almost all of the talented executives and ops people were in 1950 and now they're dead and there's fewer and fewer surviving descendants of their heritage every year and no blog post I can figure out how to write could even come close to making more people being good executives?

    Because what EA was really missing is collusion to hide the health effects of tobacco smoking.

    Billionaires push AI apocalypse risk through college student groups

    > Last summer, he announced the Stanford AI Alignment group (SAIA) in a blog post with a diagram of a tree representing his plan. He’d recruit a broad group of students (the soil) and then “funnel” the most promising candidates (the roots) up through the pipeline (the trunk).

    See, it's like marketing the idea, in a multilevel way

    Talking about a ‘schism’ is ahistorical
    medium.com Talking about a ‘schism’ is ahistorical

    In two recent conversations with very thoughtful journalists, I was asked about the apparent ‘schism’ between those making a lot of noise…

    Emily M. Bender on the difference between academic research and bad fanfiction

    "If you think you can point to an unnecessary sentence within [HPMoR], go ahead and try."

    In the far-off days of August 2022, Yudkowsky said of his brainchild,

    > If you think you can point to an unnecessary sentence within it, go ahead and try. Having a long story isn't the same fundamental kind of issue as having an extra sentence.

    To which MarxBroshevik replied,

    > The first two sentences have a weird contradiction:
    >
    >> Every inch of wall space is covered by a bookcase. Each bookcase has six shelves, going almost to the ceiling.
    >
    > So is it "every inch", or are the bookshelves going "almost" to the ceiling? Can't be both.
    >
    > I've not read further than the first paragraph so there's probably other mistakes in the book too. There's kind of other 'mistakes' even in the first paragraph, not logical mistakes as such, just as an editor I would have... questions.

    And I elaborated:

    I'm not one to complain about the passive voice every time I see it. Like all matters of style, it's a choice that depends upon the tone the author desires, the point the author wishes to emphasize, even the way a character would speak. ("Oh, his throat was cut," Holmes concurred, "but not by his own hand.") Here, it contributes to a staid feeling. It emphasizes the walls and the shelves, not the books. This is all wrong for a story that is supposed to be about the pleasures of learning, a story whose main character can't walk past a bookstore without going in. Moreover, the instigating conceit of the fanfic is that their love of learning was nurtured, rather than neglected. Imagine that character, their family, their family home, and step into their library. What do you see?

    > Books — every wall, books to the ceiling.

    Bam, done.

    > This is the living-room of the house occupied by the eminent Professor Michael Verres-Evans,

    Calling a character "the eminent Professor" feels uncomfortably Dan Brown.

    > and his wife, Mrs. Petunia Evans-Verres, and their adopted son, Harry James Potter-Evans-Verres.

    I hate the kid already.

    > And he said he wanted children, and that his first son would be named Dudley. And I thought to myself, what kind of parent names their child Dudley Dursley?

    Congratulations, you've noticed the name in a children's book that was invented to sound stodgy and unpleasant. (In The Chocolate Factory of Rationality, a character asks "What kind of a name is 'Wonka' anyway?") And somehow you're trying to prove your cleverness and superiority over canon by mocking the name that was invented for children to mock. Of course, the Dursleys were also the start of Rowling using "physically unsightly by her standards" to indicate "morally evil", so joining in with that mockery feels ... It's aged badly, to be generous.

    Also, is it just the people I know, or does having a name picked out for a child that far in advance seem a bit unusual? Is "Dudley" a name with history in his family — the father he honored but never really knew? His grandfather who died in the War? If you want to tell a grown-up story, where people aren't just named the way they are because those are names for children to laugh at, then you have to play by grown-up rules of characterization.

    The whole stretch with Harry pointing out they can ask for a demonstration of magic is too long. Asking for proof is the obvious move, but it's presented as something only Harry is clever enough to think of, and as the end of a logic chain.

    > "Mum, your parents didn't have magic, did they?" [...] "Then no one in your family knew about magic when Lily got her letter. [...] If it's true, we can just get a Hogwarts professor here and see the magic for ourselves, and Dad will admit that it's true. And if not, then Mum will admit that it's false. That's what the experimental method is for, so that we don't have to resolve things just by arguing."

    Jesus, this kid goes around with L's theme from Death Note playing in his head whenever he pours a bowl of breakfast crunchies.

    > Always Harry had been encouraged to study whatever caught his attention, bought all the books that caught his fancy, sponsored in whatever maths or science competitions he entered. He was given anything reasonable that he wanted, except, maybe, the slightest shred of respect.

    Oh, sod off, you entitled little twit; the chip on your shoulder is bigger than you are. Your parents buy you college textbooks on physics instead of coloring books about rocketships, and you think you don't get respect? Because your adoptive father is incredulous about the existence of, let me check my notes here, literal magic? You know, the thing which would upend the body of known science, as you will yourself expound at great length.

    > "Mum," Harry said. "If you want to win this argument with Dad, look in chapter two of the first book of the Feynman Lectures on Physics.

    Wesley Crusher would shove this kid into a locker.

    blakestacey @awful.systems
    Posts 21 · Comments 205