Neuromorphic Computing Guide
A guide covering Neuromorphic Computing including the applications, libraries and tools that will make you better and more efficient with Neuromorphic Computing development.
Note: You can easily convert this markdown file to a PDF in VSCode using the Markdown PDF extension. Also, check out the mdBook version, Neuromorphic Computing Guide mdBook (special thanks to jonathanwoollett-light).
Types of Neural Networks
Table of Contents
- Electric charge, field, and potential
  - Charge and electric force (Coulomb's law)
  - Electric field
  - Electric potential energy, electric potential, and voltage
- Circuits
  - Ohm's law and circuits with resistors
  - Circuits with capacitors
- Magnetic forces, magnetic fields, and Faraday's law
  - Magnets and Magnetic Force
  - Magnetic field created by a current
  - Electric motors
  - Magnetic flux and Faraday's law
- Electromagnetic waves and interference
  - Introduction to electromagnetic waves
  - Interference of electromagnetic waves
Getting Started with Neuromorphic Computing
Neuromorphic Computing is the use of very-large-scale integration (VLSI) systems containing electronic analog circuits to simulate the neuro-biological architectures present in the human brain and nervous system.
Intel Loihi 2, Intel's second-generation neuromorphic research chip.
The Akida Neuromorphic System-on-Chip (NSoC) developed by BrainChip.
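The neuron circuits that chips like Loihi 2 and Akida implement in silicon are commonly modeled as leaky integrate-and-fire (LIF) units. The following is a minimal plain-Python sketch of that model; the parameter values are illustrative and not taken from any particular chip.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the canonical spiking
# model that neuromorphic chips implement in hardware. All parameter
# values here are illustrative, not taken from any specific chip.

def simulate_lif(inputs, v_decay=0.9, threshold=1.0, v_reset=0.0):
    """Discrete-time LIF: leak the membrane voltage, integrate the
    input current, and emit a spike (1) on threshold crossing."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = v_decay * v + current      # leak, then integrate input
        if v >= threshold:
            spikes.append(1)           # spike event
            v = v_reset                # reset membrane potential
        else:
            spikes.append(0)
    return spikes

# A constant input current of 0.3 drives the neuron to spike periodically.
print(simulate_lif([0.3] * 10))   # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because the neuron only emits discrete events when its state crosses a threshold, a network of such units can stay silent (and consume little energy) whenever its inputs carry no new information.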
Developer Resources
- Next-Level Neuromorphic Computing: Intel Lab's Loihi 2 Chip | Intel
- Light-Emitting Artificial Synapses for Neuromorphic Computing (Research Paper PDF)
Online Training Courses
- Computational Neuroscience: Neuronal Dynamics of Cognition Course Online | edX
- Fundamentals of Neuroscience, Part 2: Neurons and Networks | Harvard Online Learning
- Fundamentals of Neuroscience, Part 3: The Brain | Harvard Online Learning
- Introduction to Computational Neuroscience | MIT OpenCourseWare
- Brain and Cognitive Sciences Online Course | MIT OpenCourseWare
- PyTorch on Azure - Deep Learning with PyTorch | Microsoft Azure
Books
- Neuromorphic Computing Principles and Organization by Abderazek Ben Abdallah and Khanh N. Dang
- Memristors for Neuromorphic Circuits and Artificial Intelligence Applications by Jordi Suñé
- Neuromorphic Photonics by Bhavin J. Shastri and Paul R. Prucnal
YouTube videos
- Neuromorphic Computing Explained | Jeffrey Shainline and Lex Fridman
- Brains Behind the Brains: Mike Davies and Neuromorphic Computing at Intel Labs | Intel
- How Neuromorphic Computing Uses the Human Brain as a Model | Intel Labs
- ESWEEK 2021 Education - Introduction to Neuromorphic Computing
- Stanford Seminar: Neuromorphic Chips: Addressing the Nanotransistor Challenge
- Photonic Neuromorphic Computing: The Future of AI? | ExplainingComputers
- Machine learning + neuroscience = biologically feasible computing | Benjamin Migliori | TEDxSanDiego
Neuromorphic Computing Tools, Libraries, and Frameworks
Lava is an open-source software framework for developing neuro-inspired applications and mapping them to neuromorphic hardware. Lava provides developers with the tools and abstractions to develop applications that fully exploit the principles of neural computation. Constrained in this way, like the brain, Lava applications allow neuromorphic platforms to intelligently process, learn from, and respond to real-world data with great gains in energy efficiency and speed compared to conventional computer architectures.
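A central idea in this programming model is that an application is built from independent processes that interact only through message-passing channels carrying spike events. The sketch below is *not* the Lava API; it is a hypothetical plain-Python illustration of that event-driven, channel-based style (all class and method names are invented for this example).

```python
from collections import deque

# Hypothetical sketch (NOT the Lava API) of a neuro-inspired programming
# model: independent processes communicate only through message channels,
# so computation is driven by discrete spike events rather than shared
# memory. All names below are invented for illustration.

class Channel:
    """A one-way spike channel between two processes."""
    def __init__(self):
        self.queue = deque()

    def send(self, event):
        self.queue.append(event)

    def recv_all(self):
        events = list(self.queue)
        self.queue.clear()
        return events

class SpikeGenerator:
    """Emits a spike event on the timesteps listed in `spike_times`."""
    def __init__(self, out, spike_times):
        self.out = out
        self.spike_times = set(spike_times)

    def run_step(self, t):
        if t in self.spike_times:
            self.out.send(t)

class SpikeCounter:
    """Consumes spike events from its input channel and counts them."""
    def __init__(self, inp):
        self.inp = inp
        self.count = 0

    def run_step(self, t):
        self.count += len(self.inp.recv_all())

# Wire two processes together and step them through six timesteps.
channel = Channel()
gen = SpikeGenerator(channel, spike_times=[1, 3, 4])
counter = SpikeCounter(channel)
for t in range(6):
    gen.run_step(t)
    counter.run_step(t)
print(counter.count)   # → 3
```

Because each process touches only its own state plus its channels, the same description can in principle be mapped onto parallel neuromorphic cores without changing the application logic.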
Lava DL is an enhanced version of SLAYER. Some enhancements include support for recurrent network structures, a wider variety of neuron models and synaptic connections (complete list of features here). This version of SLAYER is built on top of the PyTorch deep learning framework, similar to its predecessor.
Lava Dynamic Neural Fields (DNF) are neural attractor networks that generate stabilized activity patterns in recurrently connected populations of neurons. These activity patterns form the basis of neural representations, decision making, working memory, and learning. DNFs are the fundamental building block of dynamic field theory, a mathematical and conceptual framework for modeling cognitive processes in a closed behavioral loop.
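The attractor dynamics described above can be illustrated with a small discrete-time simulation of a 1-D field: each site receives local Gaussian excitation from active neighbors and global inhibition from the total output, so a localized stimulus relaxes into a stabilized activity peak. This is a hedged sketch of the general dynamic-field equation with invented parameter values, not code from the Lava DNF library.

```python
import math

# Illustrative 1-D dynamic neural field (DNF): local excitation plus
# global inhibition stabilizes a peak of activity around a stimulus.
# All parameter values are invented for this sketch.

def dnf_step(u, stimulus, h=-1.0, dt=0.1, tau=1.0,
             w_exc=2.0, sigma=2.0, w_inh=0.5):
    """One Euler step of tau*du/dt = -u + h + stimulus + exc - inh."""
    n = len(u)
    f = [1.0 if ui > 0 else 0.0 for ui in u]   # hard-threshold output
    total_out = sum(f)                          # drives global inhibition
    u_new = []
    for i in range(n):
        # Gaussian-weighted excitation from nearby active sites
        exc = sum(w_exc * math.exp(-((i - j) ** 2) / (2 * sigma ** 2)) * f[j]
                  for j in range(n))
        du = (-u[i] + h + stimulus[i] + exc - w_inh * total_out) / tau
        u_new.append(u[i] + dt * du)
    return u_new

# Drive the field with a localized stimulus and let it relax.
n = 21
u = [0.0] * n
stimulus = [3.0 if abs(i - 10) <= 1 else 0.0 for i in range(n)]
for _ in range(100):
    u = dnf_step(u, stimulus)
peak = max(range(n), key=lambda i: u[i])
print(peak)   # → 10, the activity peak forms at the stimulus center
```

The negative resting level `h` keeps unstimulated sites below threshold, which is what makes the resulting peak a self-stabilized decision rather than a passive copy of the input.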
Neuromorphic Constraint Optimization is a library of solvers that