Brains and machines

The (sub-)title of After Digital: Computation as Done by Brains and Machines by James Anderson intrigued me. The book grew out of a course taught by the author, a pioneer in the neural network research that was foundational for modern AI, and it explains (in perhaps too much detail for some readers, and at too basic a level for experts, but fine for me!) both how computers compute and how brains compute. It starts with the difference between analogue and digital computing – the former with hardware tailored to specific problems, the latter with generic hardware where the software is decisive – and then goes on to describe how neurons and the brain compute: analogue with digital characteristics, much slower than digital computers but massively more energy efficient.

I didn’t take away big messages, but did pick up lots of interesting snippets. Computers had to get smaller as they got faster, for example, because light takes about a nanosecond to travel a foot (roughly 30cm), which limits how far a signal can go in a clock cycle. Between the early 1800s and 1860 the time it took to get a message from New York to Boston dropped from about a week to effectively instantaneous, thanks to the telegraph. The book also made me ponder the relationship between Gödel’s Incompleteness Theorem and the eventual capabilities of AI.
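As a quick sanity check on that nanosecond figure (my own back-of-the-envelope calculation, not taken from the book), here is a minimal sketch:

```python
# How far does light travel in one nanosecond?
c = 299_792_458          # speed of light in a vacuum, m/s
t = 1e-9                 # one nanosecond, in seconds
distance_m = c * t       # ~0.2998 m
print(f"{distance_m * 100:.1f} cm")   # prints ~30.0 cm, i.e. roughly a foot
```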

Anyway, if you want an introductory course (as a reader or a teacher) in both computation and neuroscience, it’s excellent. But it’s not a general read. (I loved the cover.)
