They didn’t start it with rocks. The first calculators used gears, which were hard to reprogram. So they moved to relays. Those worked but were very slow. Then they found out that lamps (vacuum tubes) could take the place of relays, but those wore out too fast. Then someone figured out that rock stuff (silicon) could do the same job as a vacuum tube. After that it became a race to make them as small as possible so you could cram more of them together.
I took a course in computing systems engineering which was basically going all the way from semiconductors up to operating systems and it was incredibly interesting.
One of the things that surprised me was how easy it was to abstract away the lower-level complexity as soon as you got one step up. It’s kind of like recursive Lego pieces, you only have to design one piece then you can use a bunch of those to design another piece, then use a bunch of those to design another, and so on. By the end you have several orders of magnitude of the fundamental pieces but you don’t really think about them anymore.
The thing about real world processor design though is that all those abstractions are leaky.
At higher levels of design you end up having to consider things like the electrical behavior of transistors, thermal density, the molecular dynamics of strained silicon crystals (and how they behave under thermal cycling), antenna theory, and the limits and quirks of the photolithography process you’re using (which is a whole other can of worms with a million things to consider).
Not everyone needs to know everything about every part of the process (that’s impossible), but when you’re pushing the limits of high performance chips each layer of the design is entangled enough with the others to make everyone’s job really complicated.
EDIT: Some interesting links:
https://www.youtube.com/watch?v=U885cIhOXBM
When I learned about how they make the new CPUs it blew my mind. Dropping a microscopic droplet of metal and blasting it with lasers through a stencil-like thingy to create the nanometer circuitry. I was like, how the fuck did you even think of doing that?.. Technologies like these are really marvelous.
You start with macroscopic photolithography, add material science of semiconductors and then iterate a million times. It didn’t start at nanoscale.
Give me a break… I’m still trying to wrap my head around how transistors work. For a layman this is like magic.
Photolithography started as a printing technique and is pretty basic.
It’s essentially the same as somebody with a couple of cans of spray paint and a handful of cardboard sheets with cutouts spray painting a multi-color tag on a wall.
As the logo kept getting smaller and smaller, just holding that cardboard in front of the wall and spraying the whole thing had too much imperfection for tiny logos, so they had to come up with more and more tricks to keep making tiny logos without those logos ending up too distorted.
Like gluing the cardboard to the wall and then dissolving it after spraying :)
Exactly, and “we need this as small and precise as possible” means “can lasers do it?” As an engineer, my default is that fast and precise means computer-guided laser, if possible.
They use electron beams and extreme UV light nowadays. Lasers are not necessarily the best light source, even at other wavelengths.
I’m still stuck on how you get from “switch go on, switch go off” to, well, anything
One switch can have two states: switch on is a 1 and switch off is a 0. Group 8 switches together and you get a byte. Miniaturize the switches and put 8 trillion of them into the size of a fingernail, and ta-da, you have a 1TB micro SD card.
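That grouping can be sketched in a few lines of Python (the particular switch pattern here is just for illustration):

```python
# Eight on/off switches, most significant first.
switches = [1, 0, 1, 1, 0, 0, 1, 0]

# Read the group as one byte: each switch contributes a power of two.
byte_value = 0
for s in switches:
    byte_value = byte_value * 2 + s

print(byte_value)  # 178
```

The same eight on/off states, read in a different order or under a different convention, would mean a different number; the hardware doesn’t care which convention you pick.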
Wire up two switches so that a light bulb will only go on when both switches are on (1). This wiring creates an AND gate. Adjust the wiring so that the light turns on if either switch is on, and you have an OR gate.
Chaining the output of the light bulb and treating it like a new switch lets you combine gates into other logic blocks. Add an inverter (NOT) and you can build NAND, XOR, and the rest.
Combine enough logic blocks and you can wire up a circuit so that you can add the value of two switches together, and now you can start to perform addition.
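The wiring described above can be sketched in Python. This is a toy model, not real hardware, but the structure is the same: AND and OR are the two bulb circuits, XOR is built by chaining them, and a half adder adds two one-bit switches:

```python
def AND(a, b):  # bulb lights only when both switches are on
    return a & b

def OR(a, b):   # bulb lights when either switch is on
    return a | b

def NOT(a):     # inverter: flips the switch's state
    return 1 - a

def XOR(a, b):  # chained from the gates above
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # sum bit and carry bit for two one-bit inputs
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chain the carry of one adder into the next and you can add numbers of any width, which is exactly the “combine enough logic blocks” step.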
This all naturally evolves to the point where you can play Skyrim with the most degenerate porn mods.
I like your explanation, but I don’t understand it. Keep up the good work.
This game may help, but I didn’t play it myself (yet)
In addition to Turing Complete, which is really good, Code: The Hidden Language of Computer Hardware and Software is a fantastic book that literally goes from two kids trying to talk to each other at night with flashlights, to a fully working Z80 clone, while not being hard to understand and using a really good conversational teaching method. It’s how I figured out a lot about CPU design, microarchitectures, assembly and machine language and a lot of other things.
Simply put, the switching doesn’t do anything by itself. It’s the meaning we assign to the arrangement of on-off switches. Much like flag signals: the flags don’t do anything besides be visible and locatable, yet we can establish a communication protocol with flags, lights, fingers on a hand, etc. This signaling is done electronically with many layers of meaning and complexity, and nowadays at unfathomable scale and speed.
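A tiny Python illustration of that point: the very same bit pattern is meaningless on its own, and only a protocol (here, plain binary numbers and the ASCII table) gives it meaning:

```python
bits = "01000001"      # eight on/off states: the switches themselves

value = int(bits, 2)   # read under the "binary number" protocol: 65
letter = chr(value)    # read under the ASCII protocol: 'A'

print(value, letter)   # 65 A
```

Same switches, two different meanings, purely because we agreed on a convention in advance.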
Watch this. It’s a guy who shows how computers work using dominoes. It really helps explain how calculating something works at its most fundamental level
Transistors (electrically controlled switches) allow for logic gates. Logic gates allow for wild bullshit
Well… since you put it that way, it is quite staggeringly improbable, isn’t it?
“Through these terse, inter-connected runes, an invisible magic flows. You cannot change the rune, as then the spell will be broken.”
“Where does the magic come from, mommy?”
“From the highest point in the invisible topology of this magic, Billy: the Hoover Dam/Niagara Falls.” Blessed be the white magic that relies not on corruption of the elements
sighs Okay, to start, you’re going to need some amber and sheep’s wool…
It was later called the stone age.
No it’s even worse. We taught the rock how to think, and now force it to think what we want it to think. Millions of thoughts that we want, every second.
It’s only a matter of time until the rock fights back.
I’m a little offended that this utterly skips over software, as if a CPU would do anything without the component that was invented before any CPU.
Software without a CPU is still useful. The reverse is not true.
Isn’t software the “trick rock into thinking” part?
Software is a list of instructions; you can execute it with pen and paper, or just using your brain if you want.
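To make that concrete, here’s a toy three-instruction machine in Python. The instruction names are made up for illustration; the point is that the “software” is literally just the list, and you could step through it on paper exactly as the loop does:

```python
# A made-up instruction set; "software" is just this list.
program = [
    ("LOAD", 2),   # put 2 in the accumulator
    ("ADD", 3),    # add 3
    ("MUL", 4),    # multiply by 4
]

acc = 0  # the accumulator: the machine's only register
for op, arg in program:
    if op == "LOAD":
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "MUL":
        acc *= arg

print(acc)  # 20
```

Whether a Python loop, a CPU, or a person with a pencil executes the list, the result is the same.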
Try doing that with my code.
Or with a Skyrim porn mod
Insert Matrix quote " All I see is blonde, brunette, redhead…"
Try doing it with IOCCC code.
Yuck
Software is a necessary component, just like screws are a necessary component in an engine. Screws don’t exist only in engines, have existed since long before engines, and can be used in other ways. Just like software.
How is software without a CPU useful? It’s literally a list of instructions for a CPU.
Also a CPU can still calculate stuff if you just send electrical signals to the right connections. Software is just a way for the CPU to keep going and do more calculations with the results.
Software is algorithmic instructions. We wrote and executed algorithms by hand long before we had calculating machines; and when we did get computers that could run more complex algorithms, they didn’t have CPUs. They had vacuum tubes (there were even simpler programmable purely mechanical computers before even vacuum tubes). CPUs didn’t come along until much later; we’d been writing software and programming computers for decades before the first CPU.
And even if you try to argue that vacuum tubes computers had some collection of tubes that you could call a “CPU” - which would be a stretch - then it still wouldn’t have been made from silicon (rocks) as in the OP post.
But before the first calculating machine, people were writing algorithms - which is what software literally is - and executing them by hand, long before we had calculating machines to do it for us. Look up how we calculated the ranging tables for artillery in WWII. Algorithms. Computed by hand.
The word “computer” literally comes from the word for the people (often women) who would execute algorithms using their brains to compute results.
I think you’re conflating “algorithm” with “software”. You’re right in saying that algorithms can be computed by hand, but I don’t think anyone would refer to that as “running software”. The word “software” implies that it’s run on “hardware”, and hardware usually implies some sort of electronic (or even mechanical*) circuit, not pen and paper and a human brain.
Software runs on processing power. Doesn’t matter if it’s mechanical, electrical or biological computing power.
The important part is, that something is processing it.
And although by now software development through abstraction feels disconnected from just specialised algorithms: everything will break down into numbers and some form of algorithm to process the information.

Say I agree with your distinction - or restriction. There was still software written for, and programmed into, general-purpose, Turing-complete calculating machines long before there were CPUs.
So let’s look at the technical details of the word. The term “software” was coined in 1958 by John Tukey. The computers in use at that time were machines like the IBM 704, the IBM 709, and the UNIVAC I; these are all vacuum tube computers that contained no silicon microchips and no CPUs. Even technically, the term “software” predates silicon and CPUs.
Non-technically, I disagree with your premise on the basis that it’s often been argued - and I agree with the argument - that humans are just computers with software personalities programmed by social conditioning, running on wetware and a fair bit of firmware. And there’s increasing evidence that there’s no real CPU, just a bunch of cooperating microorganisms and an id that retroactively convinces itself that it’s making the decisions. Even if the term “software” wasn’t coined until 1958, software has been a thing since complex organisms capable of learning from experience arose.
Unless we’re all living in a simulation, in which case, who knows if software or hardware really exist up there, or whether there’s even a distinction.
They called the box with all the tubes in it that executed instructions a “CPU”; memory, CPU, and IO subsystems were distinct and well-defined.
I feel like you mean “microprocessor”
We also had machines and computers based on relays and other electro mechanical devices earlier than even vacuum tubes. If you follow Technology Connections he breaks down the inner workings of a pinball machine using that technology, but programmable machines have also been made with it.
The Antikythera mechanism from Greek times.
My guy, megabytes of executable binary are just about as usable as a CPU. Try reverse engineering more 1s and 0s than you’ve ever read into something usable, when you don’t even know how it converts into assembly and logical operations because you lack the architecture knowledge of the CPU.
People who are coding Skyrim porn mods are smarter than the one who invented CPU /s
And the “inscribe ancient runes” step takes up to 4 months.
I’m lost on how a transistor can just stay 0 or 1 when it’s just a super teeny tiny circle of wire, basically. Like, I know the typical explanation, but it doesn’t really make it any clearer. Electricity moves in a magic shape, and stuff happens. 🤷🏻♂️
The thing is, it never stays in a constant state! It’s more like a water dam with a steady water flow that you can open and close. You said you know the typical explanation already, so I won’t try to explain it again.
I’m not sure what the typical explanation is, but a transistor is not a wire.
A wire is a conductor. It conducts electricity from end to the other.
A transistor is a device made from semi-conducting materials, so it conducts electricity between two ends with a variable electrical resistance. This resistance can be controlled by putting a voltage on the third leg. In this way a transistor is basically a resistor with a variable resistance which, unlike a regular resistor, is also controllable by a third input.
This ability is a property of the material. It cannot be constructed by a regular wire.
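A rough numeric sketch of that “voltage-controlled resistor” behavior, using the textbook square-law model of a MOSFET in saturation (the threshold and gain constants here are made-up illustrative values, not any real device):

```python
# Illustrative square-law model of a MOSFET (made-up constants).
V_THRESHOLD = 0.7   # volts: below this the channel barely conducts
K = 0.5             # transconductance parameter, A/V^2

def drain_current(v_gate):
    """Current through the 'variable resistor' for a given gate voltage."""
    if v_gate <= V_THRESHOLD:
        return 0.0                           # switch effectively off
    return K * (v_gate - V_THRESHOLD) ** 2   # conducts more as v_gate rises

for v in (0.5, 1.0, 2.0):
    print(v, round(drain_current(v), 3))
```

Below the threshold the “switch” is off; above it, raising the third-leg voltage smoothly lowers the effective resistance, which is exactly the property a plain wire can’t give you.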
I think OP is referring to a typical DRAM bit.
The memory is generally done by something called a capacitor (though there are other techniques), which can hold a little electrical charge - roughly, having an electrical charge means it’s a 1, otherwise it’s a 0.
Get 8 of those things and you have a byte.
It’s generally easier to think of it as water: electrical lines are tubes moving water, capacitors are little containers that can hold water (meaning bit = 1) or be empty (bit = 0), transistors are one-way water valves which are controlled by water (imagine they have a pressure button that opens the gate if there is water running in the tube passing by that button, putting pressure on it).
From this simple basis you can actually create a lot of complexity by having a LOT of these things combined in weird ways.
Further, there’s also a lot of complexity due to the physics of the real world being less than perfect. For example, the “capacitors” leak water, so not only do you have to say that bit=1 is “water above a certain level” rather than “full” - since as soon as it’s filled it starts losing water - but you also have to check them once in a while and top up the ones which are supposed to be full, before so much water leaks out that the level falls below what’s treated as a “1”. This is actually how DRAM memory works, though with electric charge rather than water.
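That leak-and-refresh cycle is easy to simulate. This is a toy model with made-up leak rate, threshold, and refresh interval, not real DRAM timing, but it shows why the periodic top-up keeps a stored 1 from decaying into a 0:

```python
LEAK = 0.05        # fraction of charge lost each tick (made up)
THRESHOLD = 0.5    # charge above this still reads as a 1
REFRESH_EVERY = 8  # top the cell up every 8 ticks

charge = 1.0       # cell starts full: it stores a 1
for tick in range(1, 33):
    charge *= (1 - LEAK)           # the "water" leaks out
    if tick % REFRESH_EVERY == 0:  # periodic refresh cycle
        if charge > THRESHOLD:     # read the bit...
            charge = 1.0           # ...and rewrite it at full strength
    assert charge > THRESHOLD      # the stored 1 never reads as a 0

print("bit survived 32 ticks")
```

Drop the refresh step (or make the interval too long) and the assertion fails: the charge drifts below the threshold and the 1 silently becomes a 0, which is exactly the failure mode refresh exists to prevent.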
The classic response is that you have to capture lightning first to apply to the rock that you want to do the thinking.
What’s going on when it gets turned back into a rock?
Probably the silicon being made into a wafer.
Skyrim porn mods sound good which ones are those
Too many to count
It’s related to the development of semiconductor devices, especially transistors. The original computers used punched holes; in some ways it’s an extension of that same idea
Edit: for the curious, some accessible intros
https://www.livescience.com/20718-computer-history.html
https://news.stanford.edu/stories/2023/09/stanford-explainer-semiconductors