On two occasions, I have been asked [by members of Parliament], "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?"...I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

- Charles Babbage, *Passages from the Life of a Philosopher* (1864)

### Podcast of the Day

From the mobile phone to the office computer, mathematician Hannah Fry looks back at 70 years of computing history to reveal the UK's leading role in developing the technology we use today.

In the first episode, she travels back to the 1940s to hear the incredible story of the creation, in Britain, of computer memory.

Three teams from across the country - in Teddington, Manchester and Cambridge - were tasked with designing automatic calculating engines for university research. But which team would be first to crack the tricky problem of machine memory?

Meanwhile, tabloid headlines proclaimed that engineers were building 'electronic brains' that could match, and maybe surpass, the human brain, starting a debate about automation and artificial intelligence that still resonates today.

### Short Article of the Day

In 1936, whilst studying for his Ph.D. at Princeton University, the English mathematician Alan Turing published a paper, “On Computable Numbers, with an Application to the Entscheidungsproblem,” which became the foundation of computer science. In it, Turing presented a theoretical machine that could solve any problem that could be described by simple instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing demonstrated that you could construct a single *Universal Machine* capable of simulating any other Turing Machine. One machine solving any problem, performing any task for which a program could be written—sound familiar? He’d invented the computer...
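The idea is simple enough to sketch in a few lines of Python: a tape, a read/write head, and a table of rules mapping (state, symbol) to (symbol to write, direction to move, next state). The machine below is a hypothetical illustration, a unary "add one" machine, not one of Turing's own constructions:

```python
# A minimal Turing machine simulator: a tape, a head, and a table of
# transition rules mapping (state, symbol) -> (write, move, next state).
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", halt="halt", max_steps=1000):
    tape = defaultdict(lambda: "_", enumerate(tape))  # blank cells read as "_"
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read the visited cells back in order, dropping surrounding blanks.
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example rule table: scan right over a block of 1s, then write one more 1
# on the first blank cell (unary increment).
rules = {
    ("start", "1"): ("1", "R", "start"),  # skip existing 1s
    ("start", "_"): ("1", "R", "halt"),   # append a 1, then halt
}

print(run_turing_machine(rules, "111"))  # -> 1111
```

A Universal Machine is, in this picture, just a rule table whose tape holds a description of *another* machine's rule table alongside that machine's input: the stored-program idea in embryonic form.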

Continue reading Ian Watson's article: How Alan Turing Invented the Computer Age

### Further Reading

Before the 20th century, most calculations were done by humans. Early mechanical aids to calculation, such as the abacus, were referred to as "calculating machines", by proprietary names, or, as they are now, simply as calculators. The person who operated the machine was called the computer.

The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. Later, analog computers represented numbers in a continuous form, for instance as distance along a scale, rotation of a shaft, or a voltage. Numbers could also be represented as discrete digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. A series of breakthroughs, such as miniaturized transistor computers and the integrated circuit, caused digital computers to largely replace analog computers. The cost of computers gradually became so low that by the 1990s personal computers, and then, in the 2000s, mobile computers (smartphones and tablets), became ubiquitous in industrialized countries...

Continue reading the Wikipedia article: History of computing hardware

### Related Topics

The Internet | Logic | Mathematics

Each day I post the best introductory resources I can find on an important philosophical, scientific or historical topic. By collecting the best educational content the internet has to offer, I hope to make it as easy as possible for everyone to get into the habit of learning something valuable every day. If you’d like to join me, simply enter your email below:

Do you know of a great introductory video, podcast, or article that deserves to be featured? Get in touch on Twitter or Facebook.