Description:
This is the third time I've taught this course, and a
great deal has changed in the field of computational
neuroscience since 2006. With the recent press releases
about large projects to simulate the brain, there seems
to be a widespread popular opinion that we now know
enough to have some chance of succeeding in the endeavor
in the near term. Most experts I know disagree with
this viewpoint and, among other areas where our current
knowledge is inadequate, often cite our meager
understanding of neuronal gene expression, the
incredible diversity of signaling mechanisms, and the
highly dynamic character of neural communication across
multiple scales. I believe that success will come
sooner or later, but it will likely hinge on progress in
a number of critical enabling technologies that are just
beginning to mature, and no doubt on a few that have not
yet been invented. These technologies include the
ability to stimulate and record simultaneously from large
to stimulate and record simultaneously from large
populations of neurons, to determine gene expression
levels within individual neurons and across ensembles,
and to read off the wiring diagram of neural networks
right down to the molecular level.
Working at Google, I am acutely aware of the advantages
and the challenges of operating at large scales, as well
as the importance of educating and motivating the next
generation of scientists and engineers to work at
unprecedented scales of engineering. This year students
in CS379C will have the opportunity to interact with
some of the most innovative scientists and engineers
working in systems neuroscience today. We will study
their methods and hear directly from them about the
challenges they face, some of which students can
actually help address now. We will look at
state-of-the-art computing technologies, see whether they
are up to the considerable computational challenges
facing systems neuroscience, and, if not, consider what
can be done to influence the relevant technology
roadmaps. Finally, we
will apply what we've learned to projects that exercise
our understanding of the key problems or contribute
directly to solving them.
Students will be graded on their presentation, class
participation, and a project to be determined in
collaboration with the teaching staff. Projects will
include replicating and evaluating existing
computational models and implementing novel models that
extend or combine the features of existing ones. Small
interdisciplinary group projects are encouraged. The
projects will be graded on the basis of an initial
proposal, due around midterm, and a final report and
demonstration, due during the exam period.
There will be no traditional midterm or final exams.
Location and Time:
MW, 4:15-5:30pm, Gates 100
Staff:
Instructor: Thomas Dean
Email: tld [at] google [dot] com
Office hours: by appointment
Course Assistant: Rohan Kamath
Email: rdkamath [at] stanford [dot] edu
Textbooks:
There are no required textbooks for this course, but
you are expected to do a lot of reading on your own,
and these three texts are good to have around for
reference. I've yet to meet anyone who has read them
cover to cover, but over the years I've probably read
most of the chapters in one edition or the other and
found them consistently
useful. A copy of each book will be put on the
reference desk should you want to read a selection,
and, in the case of the latter two, you can also
often find preprint versions of individual chapters
on the web pages of the contributing authors:
- Neuroscience: Exploring the Brain (Third Edition), Bear, Connors and Paradiso.
- The Cognitive Neurosciences (Third Edition), Gazzaniga.
- Principles of Neural Science (Fourth Edition), Kandel, Schwartz and Jessell.
Grading:
- Class participation including presentation (20%)
- Project proposal due around midterm (20%)
- Project documentation and demonstration (60%)