They didn't start it with rocks. The first calculators used gears. Those were hard to reprogram. So they started using relays. That worked, but was very slow. Then they found out that lamps (vacuum tubes) could take the place of relays, but those wore out too fast. Then someone figured out that rock stuff (silicon) could do the same job as a vacuum tube. After that it became a race to make them as small as possible to cram more of them together.
I took a course in computing systems engineering which was basically going all the way from semiconductors up to operating systems and it was incredibly interesting.
One of the things that surprised me was how easy it was to abstract away the lower-level complexity as soon as you got one step up. It's kind of like recursive Lego pieces: you only have to design one piece, then you can use a bunch of those to design another piece, then use a bunch of those to design another, and so on. By the end you're working with several orders of magnitude more of the fundamental pieces, but you don't really think about them anymore.
The thing about real world processor design though is that all those abstractions are leaky.
At higher levels of design you end up having to consider things like the electrical behavior of transistors, thermal density, the molecular dynamics of strained silicon crystals (and how they behave under thermal cycling), antenna theory, and the limits and quirks of the photolithography process you're using (which is a whole other can of worms with a million things to consider).
Not everyone needs to know everything about every part of the process (that's impossible), but when you're pushing the limits of high performance chips each layer of the design is entangled enough with the others to make everyone's job really complicated.
Generally it's only the wizards that deal with the physical side - such as rock shaping and rock combining - that get magic smoke, though if they did their part wrong, the wizards that make rocks think might get it too, as can the people playing Skyrim on the thinking rocks.
When I learned how they make the new CPUs it blew my mind. Dropping a microscopic droplet of metal and blasting it with lasers, then shining the resulting light through a stencil-like thing to create the nanometer circuitry.
I was like, how the fuck did you even think of doing that?..
Technologies like these are really marvelous.
Exactly, and "we need this as small and precise as possible" means "can lasers do it?" As an engineer I default to: fast and precise means computer-guided laser, if possible.
One switch can have two states. Switch on is a 1 and switch off is a 0. Group 8 switches together and you get a byte. Miniaturize the switches and put 8 trillion of them into the size of a fingernail, and ta-da, you have a 1TB micro SD card.
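The switches-to-numbers jump can be sketched in a few lines of Python (the particular bit pattern here is just illustrative):

```python
# Eight switches, each on (1) or off (0), read as one byte.
switches = [1, 0, 1, 1, 0, 0, 1, 0]

# Treat the row of switches as a base-2 number, most significant first.
value = 0
for bit in switches:
    value = value * 2 + bit

print(value)  # 178: one of the 256 states a single byte can hold
```

Scale that up by trillions of switches and you get the storage sizes we take for granted.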
Wire up two switches so that a light bulb will only go on when both switches are on (1). This wiring creates an AND gate. Adjust the wiring so that the light turns on if either of the switches is on. That wiring is an OR gate.
Chaining the output of the light bulb and treating it like a new switch allows you to combine enough AND and OR gates to make other logic blocks: NOT, NAND, XOR, etc.
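The light-bulb wiring above maps straight onto boolean functions. A tiny Python sketch of building the bigger gates out of the basic pieces (function names are mine, not standard hardware terms):

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# NAND and XOR are just particular wirings of the basic gates.
def NAND(a, b): return NOT(AND(a, b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

# XOR lights up only when exactly one input is on.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))
```

Each composite gate is used exactly like the "new switch" in the comment: its output feeds the next gate's input.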
Combine enough logic blocks and you can wire up a circuit so that you can add the value of two switches together, and now you can start to perform addition.
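Addition really does fall out of those gates. A sketch of the classic construction (a half adder, then a full adder, then a 4-bit ripple-carry adder; the helper names are mine):

```python
def half_adder(a, b):
    # Sum bit is XOR of the inputs, carry bit is AND.
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add4(x, y):
    # x and y are lists of 4 bits, least significant bit first.
    carry, out = 0, []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

# 5 + 3 = 8: [1,0,1,0] + [1,1,0,0] -> [0,0,0,1] with no final carry
print(add4([1, 0, 1, 0], [1, 1, 0, 0]))
```

Real adders use the same idea, just with faster carry tricks and many more bits.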
This all naturally evolves to the point where you can play Skyrim with the most degenerate porn mods.
Simply put, the switching doesn't do anything by itself. It's the meaning we assign to the arrangement of on-off switches. Much like flag signals, the flags don't do anything besides be visible and locatable. Yet, we can establish a communication protocol with flags, lights, fingers on a hand, etc. this signaling is done electronically with many layers of meaning and complexity, and nowadays at unfathomable scale and speed.
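The "meaning we assign" point in concrete form: the very same byte is a number, a letter, or a yes/no flag depending only on which protocol you read it with (the flag rule here is made up for illustration):

```python
bits = "01001000"      # one byte of on/off switches
n = int(bits, 2)

print(n)               # read as an unsigned number: 72
print(chr(n))          # read with the ASCII protocol: 'H'
print(n > 64)          # read as a flag under some arbitrary rule: True
```

The switches never changed; only the agreed-upon interpretation did.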
Watch this. It's a guy who shows how computers work using dominoes. It really helps explain how calculating something works at its most fundamental level.
Well... since you put it that way, it is quite staggeringly improbable, isn't it?
"Through these terse, inter-connected runes, an invisible magic flows. You cannot change the rune, as then the spell will be broken." "Where does the magic come from, mommy?"
"From the highest point in the invisible topology of this magic, Billy: the Hoover Dam/Niagara Falls".
No it's even worse. We taught the rock how to think, and now force it to think what we want it to think. Millions of thoughts that we want, every second.
Software is a necessary component, just like screws are a necessary component in an engine. Screws don't exist only in engines, have existed since long before engines, and can be used in other ways. Just like software.
How is software without a CPU useful? It's literally a list of instructions for a CPU.
Also a CPU can still calculate stuff if you just send electrical signals to the right connections. Software is just a way for the CPU to keep going and do more calculations with the results.
Software is algorithmic instructions. We wrote and executed algorithms by hand long before we had calculating machines; and when we did get computers that could run more complex algorithms, they didn't have CPUs. They had vacuum tubes (there were even simpler programmable, purely mechanical computers before vacuum tubes). CPUs didn't come along until much later; we'd been writing software and programming computers for decades before the first CPU.
And even if you try to argue that vacuum tube computers had some collection of tubes that you could call a "CPU" - which would be a stretch - then it still wouldn't have been made from silicon (rocks) as in the OP's post.
But even before the first calculating machine, people were writing algorithms - which is literally what software is - and executing them by hand. Look up how the ranging tables for artillery were calculated in WWII. Algorithms. Computed by hand.
The word "computer" literally comes from the word for the people (often women) who would execute algorithms using their brains to compute results.
My guy, megabytes of executable binary are just about as usable as a lone CPU. Try reverse engineering more 1s and 0s than you've ever read into something usable, when you don't even know how it converts into assembly and logical operations because you lack the architecture knowledge of a CPU.
I'm lost on how a transistor can just stay 0 or 1 when it's just a super teeny tiny circle of wire, basically. Like, I know the typical explanation, but it doesn't really make it any clearer. Electricity moves in a magic shape, and stuff happens. 🤷
I'm not sure what the typical explanation is, but a transistor is not a wire.
A wire is a conductor. It conducts electricity from one end to the other.
A transistor is a device made from semi-conducting materials, so it conducts electricity between two ends with a variable electrical resistance. That resistance can be controlled by putting a voltage on the third leg. In this way a transistor is basically a resistor with a variable resistance which, unlike a regular resistor, is controllable by a third input.
This ability is a property of the material. It cannot be constructed by a regular wire.
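For digital logic, that "controllable resistance" gets simplified all the way down to an on/off switch. A crude toy model in Python - real transistors are analog, and the 0.7 V threshold here is purely illustrative, not a spec:

```python
# Toy digital-level model of an n-type transistor used as a switch.
# Real devices conduct gradually; logic design mostly rounds that
# off to "conducting" vs "not conducting" above a gate threshold.
V_THRESHOLD = 0.7  # illustrative value, not a real device parameter

def transistor_conducts(gate_voltage):
    return gate_voltage > V_THRESHOLD

print(transistor_conducts(0.0))  # False: switch open, no current
print(transistor_conducts(1.2))  # True: switch closed, current flows
```

That rounding-off is exactly why the analog messiness below can be abstracted into clean 0s and 1s above.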
The thing is, it never stays in a constant state! It's more like a water dam with a steady water flow that you can open and close. You said you know the typical explanation already, so I won't try to explain it again.
The memory is generally done by something called a capacitor (though there are other techniques), which can just hold a little electrical charge - roughly, holding a charge means it's a 1, otherwise it's a 0.
Get 8 of those things and you have a byte.
It's generally easier to think of it as water: electrical lines are tubes moving water, capacitors are little containers that can hold water (meaning bit = 1) or be empty (bit = 0), and transistors are one-way water valves which are themselves controlled by water (imagine they have a pressure button that opens the gate if there is water running in the tube passing by that button, putting pressure on it).
From this simple basis you can actually create a lot of complexity by having a LOT of these things combined in weird ways.
Further, there's also a lot of complexity because the physics of the real world is less than perfect. For example, the "capacitors" leak water, so not only do you have to say that bit = 1 means "water above a certain level" rather than "full" - since as soon as a container is filled it starts losing water - but you also have to check them once in a while and top up the ones that are supposed to be full, before so much water leaks out that the level falls below the threshold treated as a "1". This is actually how DRAM memory works, though with electric charge rather than water.
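The leak-and-top-up cycle described above (the essence of DRAM refresh) can be toy-modelled in a few lines; every constant here is made up for illustration:

```python
# Toy model of a leaky DRAM cell: charge drains a little each tick,
# the bit reads as 1 only while charge stays above a threshold, and
# a periodic refresh tops full cells back up. All numbers arbitrary.
LEAK_PER_TICK = 0.1
READ_THRESHOLD = 0.5
REFRESH_EVERY = 4   # refresh interval, in ticks

charge = 1.0  # cell starts "full", storing a 1
for tick in range(1, 13):
    charge -= LEAK_PER_TICK  # the leak
    if tick % REFRESH_EVERY == 0 and charge > READ_THRESHOLD:
        charge = 1.0         # the top-up
    print(tick, round(charge, 2), int(charge > READ_THRESHOLD))
```

With the refresh in place the bit keeps reading as 1 forever; comment out the top-up line and it decays to a 0 within a handful of ticks, which is exactly why DRAM without refresh forgets.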
Related to the development of semiconductor devices, especially transistors: the original computers used punched holes. In some ways it's an extension of that same idea.