Cognitive Science Resources

This list of resources is my working list of things I might like to consult. I have not yet worked through all of these, but as I do, I may link to a review or summary of my own.


Reading / Read

Consciousness Explained, by Daniel Dennett.

The Secret Life of the Mind, by Mariano Sigman.

How Emotions Are Made: The Secret Life of the Brain, by Lisa Feldman Barrett. I am reading this now. The main idea is the theory of the “construction of emotions”: an active process in which our concepts (driven largely by language) play a central role. I should write a longer review. There’s also a YouTube video by Dr. Barrett.


Plato’s Camera: How the Physical Brain Captures a Landscape of Abstract Universals, by Paul Churchland – on how the brain learns to categorize.

Second Nature: Brain Science and Human Knowledge – Gerald Edelman. A big influence on Barrett, but not well-reviewed.


One critical review of Gerald Edelman’s Second Nature: Brain Science and Human Knowledge contains this enticing thread to pull on:

  • “I’ve read loads of books now on the mind, from Searle, the Churchlands, Fodor, Pinker, Ramachandran, Gazzaniga, et al, and all of them are able to express themselves clearly even when discussing sophisticated concepts. But for me, reading anything by Gerald Edelman is a terrible chore - like some crazy reading comprehension test devised by a sadist - rather than an exciting journey through a brilliant mind.”


The first course I’m going to work through is Coursera’s Synapses, Neurons, and Brains. Here are my course notes. Coursera has many courses on neuroscience.

MIT OpenCourseWare has a course, The Human Brain.

Video Talks

How Do Neural Networks Grow Smarter?, by Peter Robin Hiesinger – see his book The Self-Assembling Brain. This video is much more accessible, and overviews the history of neuroscience and ANNs. My notes:

  • Perceptrons date from a 1958 paper by Frank Rosenblatt; the Mark I Perceptron machine came later. The idea was that the network was wired fairly randomly.
  • There is no consensus in the Wikipedia page on biological intelligence.
  • Hiesinger’s point is that Rosenblatt kind of got it wrong – we need a combination of connectivity and learning, not just a random network.
  • Birth of AI: the Dartmouth Workshop, 1956. Frank Rosenblatt wasn’t there; the focus was on symbolic logic using von Neumann machines. Leaders of that camp (Minsky and Papert) wrote the book “Perceptrons”, cataloguing all the things neural networks can’t do, which sidelined neural networks as an AI project.
  • Big progress in the last 10 years has been on the learning side, with a focus on self-learning – “Make the ANN learn by itself!” (reinforcement learning). The London-based startup DeepMind (now Google DeepMind) built MuZero, a self-learning, naive AI that can play against another player.
  • McCulloch and Pitts, 1943 – the artificial neuron.
  • Connectome diagrams are great, but don’t capture all the chemistry, etc.
  • The Algorithmic Information Theory of Ray Solomonoff.
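As an aside, Rosenblatt’s perceptron learning rule is simple enough to sketch in a few lines of Python. The toy AND-style dataset and the parameters below are my own illustration, not from the talk:

```python
# Sketch of the classic perceptron learning rule (Rosenblatt, 1958).
# Uses integer weights and learning rate 1 for determinism.

def train_perceptron(samples, labels, epochs=10, lr=1):
    """Learn weights w and bias b so that sign(w.x + b) matches labels (+1/-1)."""
    w = [0] * len(samples[0])
    b = 0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != y:  # misclassified: nudge the boundary toward this point
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Linearly separable toy data: output is +1 only when both inputs are 1 (AND)
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, -1, -1, 1]
w, b = train_perceptron(X, y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1 for x in X]
```

Minsky and Papert’s famous objection was that a single such unit cannot learn non-separable functions like XOR – hence the later shift to multi-layer networks.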

Since you can’t predict the results of an evolutionary process (cf. Rule 110), you need evolutionary programming to make AI smarter. Wolfram (the Mathematica guy) figured out Rule 110. :) See Wolfram, A New Kind of Science. Other topics: algorithmic growth and its consequences; human-level AI, or Artificial General Intelligence.
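Rule 110 itself is tiny to implement, which is part of the point – unpredictable behavior from a trivial local rule. Here is a minimal Python sketch; the fixed-zero boundary and the single-live-cell starting row are my own choices:

```python
# Rule 110 elementary cellular automaton: each cell's next state depends only
# on its (left, center, right) neighborhood. "110" in binary (01101110) gives
# the output for each of the eight neighborhoods, listed here explicitly.

RULE_110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
            (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}

def rule110_step(cells):
    """Advance one row of 0/1 cells by one step (cells outside the row are 0)."""
    padded = [0] + list(cells) + [0]
    return [RULE_110[(padded[i-1], padded[i], padded[i+1])]
            for i in range(1, len(padded) - 1)]

row = [0] * 10 + [1]   # single live cell at the right edge
for _ in range(5):
    print("".join("#" if c else "." for c in row))
    row = rule110_step(row)
```

Running it shows the characteristic leftward-growing triangle; despite the rule fitting in one lookup table, Rule 110 is Turing complete, so its long-run behavior can’t in general be predicted short of simulating it.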

Your Brain Hallucinates Your Conscious Reality – a TED Talk by Anil Seth. Quite good, and discusses different forms of the self.

Donald Hoffman’s talk Do We See Reality As It Is? argues that our perceptual system evolved to keep us alive, not to display reality accurately. This is apparently the topic of his book, The Case Against Reality: Why Evolution Hid the Truth from Our Eyes.

Useful Sites

One site sports an excellent online app for interactive gross brain anatomy.

Allen Institute’s Brain Modelling Toolkit is a cool Python library for “building, simulating and analyzing large-scale neural network models.”