
The Computer That Came First - And Was Erased From History

Started by GDeSantis, Mar 14, 2026, 03:53 PM


GDeSantis

[Opening post: an embedded YouTube video, "The Computer That Came First - And Was Erased From History"]

charliecoutas

It all depends on how you define "Computer". There were relay machines in the 1930s and mechanical versions possibly before that. The modern definition is electronic, with an electronically stored program: no relays, wheels or drums of rotating capacitors. This rules out both ENIAC and Colossus. Colossus ran before ENIAC, in 1944.

The first electronic, stored-program machine was the Manchester Baby, at Manchester University, England. It first ran in June 1948. It was inspired by Max Newman (ex Bletchley Park) and Alan Turing, who in 1936 first wrote down the design for such a machine, since called the Turing Machine. There is no doubt that the Americans were not far behind the UK.

Charlie

RGV250

Hi Charlie,
You can't let the facts get in the way of YouTube (probably AI slop) rubbish.

Last time you asked how I was: I got back from my woodturning trip to Newark and am now in hospital with Campylobacter, not sure how I got it. If you have never had it, you don't want it.

If you have never been, I can recommend a trip round IWM Duxford, most interesting.

Regards,
Bob

charliecoutas

Yikes Bob! Google says: "Campylobacter is most commonly contracted by eating raw or undercooked poultry, unpasteurized milk, or contaminated water". How did you get it? Not one of those fast-food joints in downtown Newark, is it?

Take care and get well soon. We can't do without you on the forum.

With respect to "The first computer", ENIAC takes it every time, so what's the point of chasing the truth? It seems that if you say something often enough, then that becomes the TRUTH. ChatGPT says:

First electronic digital computing machine: Atanasoff–Berry Computer (1939)
First programmable electronic digital computer: Colossus (1943)
First general-purpose electronic digital computer: ENIAC (1945)

But the ABC had the rotating mechanical capacitor drum, so it doesn't count.
Colossus wasn't programmable in the true sense.
ENIAC wasn't programmable in the same way either.

Strange that the Manchester Baby (which was programmable) doesn't get a look-in.

Take care.
Charlie

RGV250

Hi,
And then there's Donald Trump's "Truth" ;D  ;D

Funny how ChatGPT shows the American ones as the first, which shows it is not intelligent and only knows the crap that it is fed.

Hopefully I've not offended anyone; just my warped sense of humour.

Hi Charlie,
I have a couple of ideas, but it was probably my own packed lunch for the journey :-(

Have you asked ChatGPT where the Manchester Baby comes in the list?

Bob

charliecoutas

Good idea! Here's what he/she/it says:

"Year                 Computer                                  Key significance
1939–1942   Atanasoff–Berry Computer       Early electronic digital computer (not programmable)
1943–1944   Colossus                                    First programmable electronic digital computer (special-purpose)
1945           ENIAC                                    First general-purpose electronic digital computer (programmed manually with cables/switches)
1948           Manchester Baby                   First stored-program computer to run a program
1949           Manchester Mark 1                   First practical stored-program computer"

Well, that's a little bit better. I suppose another part of the definition of "Computer" should be "...and it actually does something of general practical use..."

Charlie

John Drew

That set the cat amongst the pigeons. It just points out the problem with AI.
As Charlie pointed out, it comes down to what fits the definition of a computer.
Does it come down to a counting stick, a mechanical adding machine, a digital calculating machine or ....

It seems the general consensus is that a computer, in modern terms, is recognisably an electronic digital stored-program machine. That was the Manchester Baby, a product of the University of Manchester that ran its first program on 21 June 1948. It met the criteria.

But if AI wants to make a counting stick the first, then so be it.
 
Just to be clear, the University of Manchester is in Manchester, UK.

The world of science 'stands on the shoulders of giants' and the Manchester scientists' achievements do not take anything away from the many scientists in many countries that contributed knowledge to the success of the University of Manchester team. Like many 'inventions', progress in a field is incremental. Well done 'The Baby'.

charliecoutas

John's modesty probably stopped him mentioning CSIRAC, Australia's valuable contribution. In my humble opinion, the first three truly general-purpose, stored-program, electronic, digital computers were:

The Manchester Baby, or Small-Scale Experimental Machine (SSEM), was the world's first stored-program computer, running its first program on June 21, 1948. Developed at the University of Manchester, England, it ran a program that searched for the highest proper factor of 2^18. It proved the viability of Williams-tube memory and led to the Manchester Mark 1 and the Ferranti Mark 1.

The EDSAC (Electronic Delay Storage Automatic Calculator) ran its first successful program on May 6, 1949, at Cambridge University, England. The team, led by Maurice Wilkes, had the machine calculate and print a table of squares, marking it as the first practical, full-scale stored-program computer to operate a regular computing service. [We have a rebuild of EDSAC at the National Museum of Computing, Bletchley]

Australia's first computer was the CSIRAC (Commonwealth Scientific and Industrial Research Automatic Computer), originally known as the CSIR Mark 1. It was designed in 1947, ran its first test program in November 1949, and was one of the world's first stored-program digital computers.

And again, in my opinion, the two individuals who had the biggest influence were Alan Turing and John von Neumann (USA).

Charlie

Fanie

The Computer That Came First - And Was Erased.
I disagree. The oldest computer is the one sitting between your ears, and it was erased throughout history with political, religious, psychological and chemical methods and substances, amongst other erasure tools.
You even have a type of wi-fi / bluetooth built in that is far superior to the computer stuff you get today.
My forefathers were those who were persecuted throughout Europe by the Christians; they fled to the Cape, where the Cape Dutch Afrikaner continued the persecution, which led to the Great Trek and a whole series of incidents that ended with the Boers having their country internationally recognized.
During this history there were many things achieved and done which are difficult to understand in the computerized and controlled (programmed) environment of today.
There were, for instance, Trekboers who made appointments to meet in an area; there were no maps, no roads, just a wilderness, and they did not know of any beacons to look for. They found each other.
Same during the War, with half a million British hunting the Boers, who were never caught because they somehow knew where and how to escape. After the War's so-called end, the Afrikaner military hunted and shot these Boer individuals who had certain abilities they did not understand.
Even today there are certain Boers who have certain talents which are difficult to explain considering the sh3t we live in.
It's like water witching: it is not recognized anywhere, but everyone uses it to find water. Because it works.
So what I'm saying is, you have the most amazing computer attached to you, at your disposal. It just depends on whether you choose to use it or not.
I saw my dad adding six-figure amounts with cents by dragging his finger down a column and writing the total down.
Most underestimate themselves; you are far more capable than you think if you can peel off the stuck rubbish attached to the thinking part.

Do some research on remote viewing, if that has not been censored on the internet too.

See_Mos

As I have written before, to my mind artificial and intelligence are polar opposites.

John Drew

As a youngster at Uni I remember a Physics class where a magnetic core memory was passed around. It was the late fifties, and it was in the form of a cube, maybe 75mm a side, with a matrix of wires and tiny ferrite beads. I have no idea how many bits of memory were available; I do remember wondering how they got all those things wired up. Little did I realise how that cube represented the early stages of a major industry and a future boon to science and modern life - social media excluded.
John

charliecoutas

My first exposure to digital computers was in the 1960s. The Elliott 803 had 8K words of ferrite core memory.
This comprised two 4K modules, a bit like the picture below. Each word was 40 bits: 40 x 8192 = 327,680 ferrite rings! They were tiny. I believe they were assembled by Spanish ladies from the lacemaking parts of Spain, because the rings were so very small.

Clever technology.

Charlie

top204

It has always amazed me how a person realised a ferrite core could be used as memory, as long as it is being scanned. Apparently, there were a few people who came up with the idea around the same time.

The rotating capacitor memory I can see the logic in, but to use the remanent magnetism of a ferrite core is staggering, and brilliant!

Is the Elliott 803 the computer you made the simulator for, Charlie?

Regards
Les

charliecoutas

Yes Les, the Elliott 803 is the lovely machine I made the simulator for. Using a 200MHz PIC, I had to slow it down by a factor of 35,000 to get the "correct" speed of the 1960s 803. If you think it might be of interest, I could post the Positron code?
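
For anyone curious, the pacing trick can be sketched in a few lines of C. This is only an illustration of the idea, not the Positron simulator itself, and the 576 microsecond instruction time is an assumed round figure for the 803:

/* Sketch: emulate each 803 instruction as fast as the host allows,
   then pad the elapsed time out to the instruction's original
   duration, hiding the host's ~35,000x speed advantage. */
#include <stdio.h>
#include <time.h>

#define INSTRUCTION_US 576L   /* assumed 803 instruction time, in us */

static long micros_now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000000L + ts.tv_nsec / 1000L;
}

int main(void)
{
    long start = micros_now();
    for (int i = 0; i < 1000; i++) {
        long t0 = micros_now();
        /* ... emulate one 803 instruction here (takes only a few
           microseconds on a modern host or a fast PIC) ... */
        while (micros_now() - t0 < INSTRUCTION_US)
            ;   /* spin until this instruction's time budget is spent */
    }
    printf("1000 instructions took %ld us (expected ~%ld)\n",
           micros_now() - start, 1000 * INSTRUCTION_US);
    return 0;
}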

I think the first realistic "RAM" was designed by Freddie Williams and Tom Kilburn in the 1940s. They realised that the glow on the face of a cathode ray tube lasted a short time. By some amazing thinking they worked out that if you could "read" that fading dot and refresh it fast enough, you had an early form of random access memory. But I agree with you, realising that the hysteresis of a magnetic material could be used for a memory is remarkable. The Manchester Baby computer was their test-bed.
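
The read-and-refresh idea is easy to model in a toy C program. All the constants here are invented for illustration (the real tube sensed a charge pattern with a pickup plate, not code), but the principle is the same: rewrite every spot before it fades below the read threshold and the data persists indefinitely:

#include <stdio.h>

#define SPOTS     32
#define DECAY     0.90   /* charge remaining after one time step */
#define THRESHOLD 0.30   /* below this a spot can no longer be read */

static double charge[SPOTS];   /* fading "dot" at each CRT position */

static void write_bit(int i, int bit) { charge[i] = bit ? 1.0 : 0.0; }
static int  read_bit(int i)           { return charge[i] > THRESHOLD; }

/* One refresh pass: read every spot and rewrite it at full strength. */
static void refresh(void)
{
    for (int i = 0; i < SPOTS; i++)
        write_bit(i, read_bit(i));
}

int main(void)
{
    write_bit(7, 1);
    for (int t = 0; t < 100; t++) {
        for (int i = 0; i < SPOTS; i++)
            charge[i] *= DECAY;   /* the glow fades each time step... */
        refresh();                /* ...but the refresh restores it */
    }
    /* Without refresh() the bit would be unreadable after ~12 steps. */
    printf("bit 7 after 100 steps: %d\n", read_bit(7));   /* still 1 */
    return 0;
}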

Charlie


JonW

I agree Les. Ferrites really are extraordinary materials, but once you understand their behavior they can be applied to all sorts of tech.

In the memory core application, that ultra-square BH curve was everything — the bistable remanence stored the bit, and the coincident-current trick exploited the sharp switching threshold to address individual cores in a matrix without disturbing the neighbours. The destructive readout was just an accepted consequence of the physics, dealt with by an immediate write-back cycle. Many moons ago, I worked on a Braille tablet project that used a beautiful variation on the same theme — the remanence interacted with a permanent magnet to hold a ferrite pin position with zero quiescent power. The magnet did the latching work: a short high-current pulse through a coil wound into multiple PCB layers just tipped the core pin polarity one way or the other. Elegant, fairly quick and very low power once latched. 
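
Here is a toy C model of that read/write-back cycle. The array and function names are illustrative, not any real driver code; it just captures the logic of coincident-current selection, destructive readout, and the immediate restore:

#include <stdio.h>
#include <stdbool.h>

#define ROWS 8
#define COLS 8

static bool core[ROWS][COLS];   /* remanent state of each ferrite ring */

/* Write: the selected X and Y half-currents add up past the switching
   threshold only at core[x][y]; an inhibit wire cancels one
   half-current when a 0 must be stored. */
static void core_write(int x, int y, bool bit)
{
    core[x][y] = bit;
}

/* Read: drive the selected core toward 0. A pulse on the sense wire
   means it flipped, i.e. it held a 1. Readout destroys the bit, so it
   is immediately written back. */
static bool core_read(int x, int y)
{
    bool sensed = core[x][y];   /* flux change iff the core held a 1 */
    core[x][y] = false;         /* destructive readout */
    core_write(x, y, sensed);   /* write-back cycle restores the bit */
    return sensed;
}

int main(void)
{
    core_write(3, 5, true);
    printf("%d\n", core_read(3, 5));   /* 1 */
    printf("%d\n", core_read(3, 5));   /* still 1: write-back worked */
    return 0;
}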

We still use ferrites in microwave applications in a completely different way. Here you're not really interested in the BH curve at all — instead, when you bias a ferrite with a DC magnetic field, the electron spins inside the material begin to precess at microwave frequencies, and this gives the material a directional quality where a wave behaves differently depending on which way it's travelling through it. In a Faraday rotator, the plane of polarisation of the wave physically rotates as it passes through the ferrite — flip the direction of the bias current, and you flip the direction of rotation of the incident wave. This is still used today in feeds to switch between horizontal and vertical polarisation electronically, with no moving parts and no wear.
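
The sense-flip is easy to show numerically. A small C illustration (the 45-degree rotation angle is an assumed figure, purely for demonstration): the same rotation matrix applied with opposite signs shows how reversing the bias current reverses the direction the polarisation plane turns:

#include <stdio.h>
#include <math.h>

/* Rotate a 2-D E-field vector (Ex, Ey) by theta degrees. */
static void rotate(double theta_deg, double *ex, double *ey)
{
    double th = theta_deg * 3.14159265358979323846 / 180.0;
    double x = *ex, y = *ey;
    *ex = x * cos(th) - y * sin(th);
    *ey = x * sin(th) + y * cos(th);
}

int main(void)
{
    /* Same ferrite, same wave: only the bias sign differs. */
    for (int bias = -1; bias <= 1; bias += 2) {
        double ex = 1.0, ey = 0.0;      /* horizontally polarised input */
        rotate(bias * 45.0, &ex, &ey);  /* bias sign sets rotation sense */
        printf("bias %+d: E = (%+.3f, %+.3f)\n", bias, ex, ey);
    }
    return 0;
}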

The fact that the same class of oxide material underpins 1950s computer memory, a Braille tablet pin actuator, and a modern RF/Microwave polarisation switch — through entirely different physical mechanisms — says everything about how rich the physics of ferrimagnetism really is.

There are some very cool films on YouTube showing how they wound the memory arrays.

ricardourio

From AI again:

"The word "computer" originates from the Latin computare, meaning "to calculate, count, or sum up". First used in English in 1613 by Richard Braithwaite to describe a person who performed calculations, it meant a human calculator—often women—until the late 19th century, when it transitioned to describe a calculating machine.

Key Facts on "Computer" Word Origin
Root Definition: Derived from com- (together) and putare (to reckon or think).
Initial Use (17th Century): Originally a job title for a person who computed mathematical tables for navigation, science, or finance.
Transition to Machine: By the late 1800s and early 1900s, the term began shifting toward mechanical calculators and, eventually, electronic devices.
First Recorded Use: English writer Richard Braithwaite used "computer" in his book The Yong Mans Gleanings in 1613.
Agent Noun: Just as a "teacher" teaches, a "computer" was originally a person who computed."

I guess this definition of "to reckon or think" leads us to another crossroads. For me, "to reckon" is something already done by mechanical devices, while "to think" refers to making logical decisions, using variable instructions.

In fact, the first question to be resolved is what the definition of a computer is from primitive perspectives, because modern definitions are based on existing devices.

Ricardo Urio

charliecoutas

During the war (Rodney), the people at Bletchley Park who did calculations and reckoning were known as "computers". They were mainly women.

Charles

GDeSantis

In the 19th century, the Harvard College Observatory faced the challenge of working through an overwhelming amount of astronomical data due to improvements in photographic technology. The observatory's director, Edward Charles Pickering, hired a group of women to analyze the astronomical data recorded on the growing collection of plate negatives. During his tenure as director, Pickering hired over eighty women, who were known as computers.

RGV250

Hi Charlie,
It was uncle Albert :)

Bob

top204

It would have definitely been 'Trigger', Bob. :-)

'During the woawa', Albert was too busy being sunk in ships. :-)