How is software without a CPU useful? It's literally a list of instructions for a CPU.
Also a CPU can still calculate stuff if you just send electrical signals to the right connections. Software is just a way for the CPU to keep going and do more calculations with the results.
Software is algorithmic instructions. We wrote and executed algorithms by hand long before we had calculating machines; and when we did get computers that could run more complex algorithms, they didn’t have CPUs. They had vacuum tubes (and there were even simpler, purely mechanical programmable computers before vacuum tubes). CPUs didn’t come along until much later; we’d been writing software and programming computers for decades before the first CPU.
And even if you try to argue that vacuum tube computers had some collection of tubes that you could call a “CPU” - which would be a stretch - it still wouldn’t have been made from silicon (rocks) as in the OP post.
But before the first calculating machine, people were writing algorithms - what software literally is - and executing them by hand. Look up how the ranging tables for artillery were calculated in WWII. Algorithms. Computed by hand.
The word “computer” literally comes from the word for the people (often women) who would execute algorithms using their brains to compute results.
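As a toy illustration (not the actual WWII procedure, which accounted for drag, wind, and more), a range table is just repeated arithmetic that a human computer could carry out step by step with pen, paper, and a trig table:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def vacuum_range(muzzle_velocity, elevation_deg):
    """Range of a projectile ignoring air resistance:
    R = v^2 * sin(2*theta) / g.
    Each evaluation is a couple of multiplications, a division,
    and one trig-table lookup - exactly the kind of step a human
    computer performed by hand."""
    theta = math.radians(elevation_deg)
    return muzzle_velocity ** 2 * math.sin(2 * theta) / G

# Fill in the table one row at a time, just as a human computer would.
table = {deg: round(vacuum_range(500.0, deg), 1) for deg in (15, 30, 45)}
```

The point isn't the physics; it's that the "program" here is a fixed sequence of arithmetic steps, and whether a person or a machine executes them changes nothing about the algorithm itself.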
I think you’re conflating “algorithm” with “software”. You’re right in saying that algorithms can be computed by hand, but I don’t think anyone would refer to that as “running software”. The word “software” implies that it’s run on “hardware”, and hardware usually implies some sort of electronic (or even mechanical) circuit, not pen and paper and a human brain.
Software runs on processing power. Doesn’t matter if it’s mechanical, electrical or biological computing power.
The important part is that something is processing it.
And although software development, through layers of abstraction, now feels disconnected from specialised algorithms, everything still breaks down into numbers and some form of algorithm that processes the information.
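To make that concrete, here's a toy sketch (names and helper invented for illustration): even a "high-level" operation like sorting names is, underneath, an algorithm comparing numbers - the character codes:

```python
def less_than(a, b):
    """Compare two strings the way the machine ultimately does:
    code point by code point, i.e. number by number."""
    for ca, cb in zip(a, b):
        if ord(ca) != ord(cb):
            return ord(ca) < ord(cb)
    return len(a) < len(b)  # shorter string sorts first on a tie

names = ["Turing", "Babbage", "Lovelace"]

# Simple insertion sort driven only by those numeric comparisons.
for i in range(1, len(names)):
    j = i
    while j > 0 and less_than(names[j], names[j - 1]):
        names[j], names[j - 1] = names[j - 1], names[j]
        j -= 1
```

Every abstraction above this (collation libraries, database ORDER BY, spreadsheet sort buttons) eventually bottoms out in comparisons of numbers like these.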
Say I agree with your distinction - or restriction. There was still software written for, and programmed into, general-purpose, Turing-complete calculating machines long before there were CPUs.
So let’s look at the technical details of the word. The term “software” was coined in 1958 by John Tukey. The computers in use at that time were machines like the IBM 704, the IBM 709, and the UNIVAC 1103; these are all vacuum tube computers that contained no silicon microchips and no CPUs. Even technically, the term “software” predates silicon and CPUs.
Non-technically, I disagree with your premise on the basis that it’s often been argued - and I agree with the argument - that humans are just computers with software personalities programmed by social conditioning, running on wetware and a fair bit of firmware. And there’s increasing evidence that there’s no real CPU, just a bunch of cooperating microorganisms and an id that retroactively convinces itself that it’s making the decisions. Even if the term “software” wasn’t coined until 1958, software has been a thing since complex organisms capable of learning from experience arose.
Unless we’re all living in a simulation, in which case, who knows if software or hardware really exist up there, or whether there’s even a distinction.
We also had machines and computers based on relays and other electromechanical devices even earlier than vacuum tubes. If you follow Technology Connections, he breaks down the inner workings of a pinball machine built with that technology, but programmable machines have also been made with it.
They called the box with all the tubes in it that executed instructions a “CPU”; memory, CPU, and IO subsystems were distinct and well-defined.
I feel like you mean “microprocessor”
The Antikythera mechanism from ancient Greek times.