We are entering the era of cognitive computing, which holds great promise for deriving intelligence and knowledge from huge volumes of data. In today’s computers, based on the von Neumann architecture, data must be shuttled back and forth between memory and processor at high speed, a task at which this architecture is inherently inefficient.

It is becoming increasingly clear that to build efficient cognitive computers, we need to transition to non-von Neumann architectures in which memory and processing coexist in some form. At IBM Research – Zurich, we explore various such computing paradigms, from in-memory computing to brain-inspired neuromorphic computing. Our research spans from devices and architectures to algorithms and applications.

Ask the experts

Abu Sebastian


Principal Research Staff Member

In-memory computing

In-memory computing could be a significant first step towards non-von Neumann computing. Here the key idea is to perform certain tasks, such as bit-wise operations and arithmetic operations, in memory. We call such a memory unit computational memory, where resistive memory devices, in particular phase-change memory (PCM) devices, could play an important role as building blocks. We have found that, if data is stored in PCM devices, the physical attributes of those devices can be exploited to achieve in-place computation. When organized in a cross-bar array, PCM devices can be used to perform matrix-vector multiplications with very low computational complexity. An application of this concept is the problem of compressed sensing.
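The crossbar matrix-vector multiplication described above can be sketched in a few lines. The linear weight-to-conductance mapping and the Gaussian read noise below are illustrative assumptions, not a device model from our work:

```python
# Sketch of analog matrix-vector multiplication on a resistive crossbar.
# Assumptions (illustrative): matrix entries map directly to device
# conductances, and device imperfections appear as additive Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)

def crossbar_matvec(A, x, noise_std=0.02):
    """Emulate Ohm's-law/Kirchhoff's-law matvec: currents I = G @ V.

    Each entry A[i, j] is stored as a device conductance; programming and
    read noise make the analog result approximate.
    """
    G = A + noise_std * rng.standard_normal(A.shape)  # imperfect conductances
    return G @ x  # currents sum along each row (Kirchhoff's current law)

A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
exact = A @ x
approx = crossbar_matvec(A, x)
print(np.max(np.abs(exact - approx)))  # small but nonzero analog error
```

The multiply-accumulate happens "for free" in the physics of the array, which is why the computational complexity is so low; the price is the bounded precision that the mixed-precision scheme below addresses.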

We are also exploring the concept of mixed-precision computing to counter the lack of precision arising from such matrix-vector multiplication operations. One example is that of solving systems of linear equations. Detecting temporal correlations in an unsupervised manner is another intriguing application of computational memory that exploits the crystallization dynamics of PCM devices.
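As an illustration of the mixed-precision idea for linear systems, the sketch below pairs an inexact inner solver (standing in for the imprecise in-memory unit, modeled here simply as a perturbed copy of A) with a high-precision residual iteration; the noise level and problem sizes are assumptions:

```python
# Sketch of mixed-precision iterative refinement for solving A x = b.
# Assumption (illustrative): the low-precision in-memory solver is modeled
# as an exact solve with a noisy copy of A.
import numpy as np

rng = np.random.default_rng(1)

def solve_mixed_precision(A, b, iters=20, noise_std=0.05):
    A_lp = A + noise_std * rng.standard_normal(A.shape)  # inexact copy of A
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x                  # residual computed in high precision
        dx = np.linalg.solve(A_lp, r)  # correction from the inexact solver
        x = x + dx                     # accumulate in high precision
    return x

A = rng.standard_normal((5, 5)) + 10 * np.eye(5)  # well-conditioned example
b = rng.standard_normal(5)
x = solve_mixed_precision(A, b)
print(np.max(np.abs(A @ x - b)))  # residual shrinks despite the noisy solver
```

The point of the construction is that the low-precision unit only ever has to produce a rough correction; the high-precision residual loop drives the error down regardless.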

Spiking neural networks

Despite our ability to train deep neural networks with brute-force optimization, the computational principles of neural networks remain poorly understood. Hence, significant research is aimed at unravelling the principles of computation in large biological neuronal networks.

It is widely believed that, because of the added temporal dimension, spiking neural networks are computationally more powerful. However, a killer application that transcends conventional deep learning is still missing. Moreover, specialized non-von Neumann computational platforms are needed to implement these novel spike-based algorithms efficiently.

PCM devices could also play a key role in this space. A particularly interesting application is the emulation of neuronal and synaptic dynamics. The ability to alter the conductance levels in a controllable way makes PCM devices particularly well-suited for synaptic realizations. It is also possible to exploit the crystallization dynamics to emulate neuronal dynamics. Such phase-change neurons exhibit intrinsic randomness and can be used for the representation of high-frequency signals via population coding. They can also be used to realize spiking neural networks and the associated learning rules in a highly efficient manner. We are also exploring new architectures and learning algorithms to exploit the potential computational advantages of spiking neural networks.
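The accumulate-and-fire behavior of such a phase-change neuron can be caricatured as follows; the growth step and jitter are purely illustrative stand-ins for the crystallization dynamics and their intrinsic randomness:

```python
# Minimal caricature of a stochastic phase-change neuron: each input spike
# grows the crystalline fraction (the "membrane potential"); when it crosses
# a threshold the neuron fires and the device is reset (re-amorphized).
# The growth and jitter values are illustrative assumptions, not device data.
import random

def run_neuron(inputs, threshold=1.0, growth=0.1, jitter=0.02, seed=42):
    rng = random.Random(seed)
    state, spikes = 0.0, []
    for t, spike_in in enumerate(inputs):
        if spike_in:
            state += growth + jitter * rng.gauss(0.0, 1.0)  # noisy crystallization step
        if state >= threshold:
            spikes.append(t)   # output spike
            state = 0.0        # reset: melt-quench back to amorphous
    return spikes

# A constant input spike train makes the neuron fire roughly every ten steps,
# with spike-to-spike variation from the intrinsic randomness.
print(run_neuron([1] * 50))
```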

Exploratory memory

A critical element in the emerging non-von Neumann computing paradigms is a very-high-density, low-power, variable-state, programmable and non-volatile nanoscale memory device. The exploration of such a memory element for non-von Neumann computing forms a significant part of our research.

We are at the forefront of research on multi-level phase-change memory (PCM). A significant part of our work is focused on understanding the rich dynamic behavior of PCM devices comprising an intricate feedback interconnection of electrical, thermal and structural dynamics. We are also exploring new device concepts that can reduce the programming energy and counter undesirable attributes such as resistance drift.
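Resistance drift, mentioned above, is commonly described by an empirical power law R(t) = R0 (t/t0)^ν; the sketch below evaluates it with illustrative values for R0 and the drift exponent ν (not measured data):

```python
# Sketch of the empirical resistance-drift law for amorphous PCM:
#   R(t) = R0 * (t / t0) ** nu
# The drift exponent nu and initial resistance R0 below are illustrative.
def drifted_resistance(r0, t, t0=1.0, nu=0.05):
    return r0 * (t / t0) ** nu

r0 = 1e6  # ohms just after programming (illustrative)
for t in (1.0, 1e3, 1e6):  # seconds after programming
    print(f"t = {t:>9.0f} s -> R = {drifted_resistance(r0, t):.3e} ohm")
```

Even a small exponent steadily widens the stored resistance over time, which is why drift is so problematic for multi-level storage and why device concepts that counter it matter.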

Another resistive memory concept we pursue is based on carbon. Carbon-based memory could be a significant complement to the rapid advances in carbon-based nano-electronics and could pave the way for potential all-carbon computing devices of the future. The elemental nature of carbon would enable a carbon-based memory to be scaled down to very small feature sizes and to be immune to the compositional changes that typically plague alternative multi-elemental non-volatile memory materials. Moreover, the high resilience of carbon to a variety of external stimuli would ensure the robustness and endurance of such a carbon-based memory.


Publications

[1] C. Rios et al.,
“In-memory computing on a photonic platform,”
Science Advances, 2019 (open access).

[2] A. Sebastian et al.,
“Multi-level storage in phase-change memory devices,”
IBM RZ 3947, 2019.

[3] S. Hamdioui et al.,
“Applications of Computation-In-Memory Architectures based on Memristive Devices,”
Proc. Design, Automation and Test in Europe (DATE), 2019.

[4] M. Le Gallo et al.,
“Compressed Sensing With Approximate Message Passing Using In-Memory Computing,”
IEEE Trans. Electr. Dev. 65(10), 2018 (open access), IBM RZ 3944.

[5] I. Boybat et al.,
“Neuromorphic computing with multi-memristive synapses,”
Nature Communications 9, 2514, 2018 (open access).

[6] A. Sebastian et al.,
“Tutorial: Brain-inspired computing using phase-change memory devices,”
J. Appl. Phys. 124, 111101, 2018 (open access), IBM RZ 3946.

[7] I. Giannopoulos et al.,
“8-bit Precision In-Memory Multiplication with Projected Phase-Change Memory,”
Proc. IEDM, 2018 (not yet open due to embargo period).

[8] M. Salinga et al.,
“Monatomic phase change memory,”
Nature Materials 17, 681–685, 2018 (cover article, open access).

[9] M. Le Gallo et al.,
“Collective structural relaxation in phase-change memory devices,”
Adv. Electronic Materials 4(9), 2018.

[10] M. Le Gallo et al.,
“Mixed-precision in-memory computing,”
Nature Electronics 1, 246–253, 2018; arXiv preprint arXiv:1701.04279 (open access).

[11] N. Gong et al.,
“Signal and noise extraction from analog memory elements for neuromorphic computing,”
Nature Communications 9(2102), 2018.

[12] T. Moraitis et al.,
“Spiking neural networks enable two-dimensional neurons and unsupervised multi-timescale learning,”
Int’l Joint Conference on Neural Networks (IJCNN), 2018.

[13] N. Papandreou et al.,
“Exploiting the non-linear current-voltage characteristics for resistive memory readout,”
Int’l Symposium on Circuits and Systems (ISCAS), 2018.

[14] T. Moraitis et al.,
“The role of short-term plasticity in neuromorphic learning,”
IEEE Nanotechnology Magazine 12(3), 45–53, 2018.

[15] S. Woźniak et al.,
“Deep Networks Incorporating Spiking Neural Dynamics,”
arXiv preprint arXiv:1812.07040, 2018.

[16] S. Woźniak et al.,
“Online Feature Learning from a non-iid Stream in a Neuromorphic System with Synaptic Competition,”
Int’l Joint Conference on Neural Networks (IJCNN), 2018.

[17] S.R. Nandakumar et al.,
“Mixed-precision training of deep neural networks using computational memory,”
arXiv preprint arXiv:1712.01192, 2017 (open access).

[18] T.A. Bachmann et al.,
“Memristive Effects in Oxygenated Amorphous Carbon Nanodevices,”
Nanotechnology 29(3), 2017.

[19] S. Woźniak et al.,
“Neuromorphic architecture with 1M memristive synapses for detection of weakly correlated inputs,”
IEEE Transactions on Circuits and Systems II: Express Briefs 64(11), 2017.

[20] S. Woźniak et al.,
“Neuromorphic system with phase-change synapses for pattern learning and feature extraction,”
Int’l Joint Conference on Neural Networks (IJCNN), 2017.

[21] T. Moraitis et al.,
“Fatiguing STDP: Learning from spike-timing codes in the presence of rate codes,”
Proc. Int’l Joint Conf. on Neural Networks (IJCNN), 2017.

[22] S. Sidler et al.,
“Unsupervised learning using phase-change synapses and complementary patterns,”
In: A. Lintas et al. (eds), Artificial Neural Networks and Machine Learning, ICANN 2017. Lecture Notes in Computer Science 10613. Springer, Cham, 2017.

[23] A. Sebastian et al.,
“Temporal correlation detection using computational phase-change memory,”
Nature Communications 8, article 1115, 2017.

[24] M. Le Gallo et al.,
“Mixed-precision in-memory computing,”
arXiv preprint arXiv:1701.04279, 2017.

[25] G.W. Burr et al.,
“Neuromorphic computing using non-volatile memory,”
Advances in Physics: X 2(1), 2017.

[26] J. Secco et al.,
“Flux-charge memristor model for phase change memory,”
IEEE Trans. Circuits and Systems II: Express Briefs, 2017.

[27] T.A. Bachmann et al.,
“Temperature evolution in nanoscale carbon-based memory devices due to local Joule heating,”
IEEE Trans. Nanotechnology 16(5), 806–811, 2017.

[28] M. Le Gallo et al.,
“Compressed sensing recovery using computational memory,”
Proc. IEEE Int’l Electron Devices Meeting (IEDM), 2017.

[29] S.R. Nandakumar et al.,
“Supervised learning in spiking neural networks with MLC PCM synapses,”
75th Annual Device Research Conf. (DRC), 2017.

[30] I. Boybat et al.,
“Stochastic weight updates in phase-change memory-based synapses and their influence on artificial neural networks,”
13th Conf. on PhD Research in Microelectronics and Electronics (PRIME), 2017.

[31] T. Tuma et al.,
“Stochastic phase-change neurons,”
Nature Nanotechnology 11, 693–699, 2016.

[32] T. Tuma et al.,
“Detecting correlations using phase-change neurons and synapses,”
IEEE Elec. Dev. Lett. 37(9), 1238–1241, 2016.

[33] A. Pantazi et al.,
“All-memristive neuromorphic computing with level-tuned neurons,”
Nanotechnology 27(35), 355205, 2016.

[34] M. Le Gallo et al.,
“Evidence for thermally assisted threshold switching behavior in nanoscale phase-change memory cells,”
J. Appl. Phys. 119, 025704, 2016.

[35] G.W. Burr et al.,
“Recent progress in phase-change memory technology,”
IEEE J. Emerging and Selected Topics in Circuits and Systems 6(2), 146–162, 2016.

[36] M. Le Gallo et al.,
“The complete time/temperature dependence of I–V drift in PCM devices,”
Proc. IEEE Int’l Reliability Physics Symposium (IRPS), 2016.

[37] M. Le Gallo et al.,
“Inherent stochasticity in phase-change memory devices,”
Proc. European Solid-State Device Conf. (ESSDERC), 2016.

[38] W.W. Koelmans et al.,
“Carbon-based resistive memories,”
Proc. Int’l Memory Workshop (IMW), 2016.

[39] S. Woźniak et al.,
“Learning spatio-temporal patterns in the presence of input noise using phase-change memristors,”
Proc. Int’l Symposium on Circuits and Systems (ISCAS), 2016.

[40] C.A. Santini et al.,
“Oxygenated amorphous carbon for resistive memory applications,”
Nature Communications 6, article 8600, 2015.

[41] W.W. Koelmans et al.,
“Projected phase-change memory devices,”
Nature Communications 6, article 8181, 2015.

[42] M. Le Gallo et al.,
“Subthreshold electrical transport in amorphous phase-change materials,”
New J. Phys. 17, 093035, 2015.

[43] P. Hosseini et al.,
“Accumulation-based computing using phase change memories with FET access devices,”
IEEE Elec. Dev. Lett. 36(9), 975–977, 2015.

[44] M. Kaes et al.,
“High field electrical transport in amorphous phase-change materials,”
J. Appl. Phys. 118, 135707, 2015.

[45] A. Sebastian et al.,
“A collective relaxation model for resistance drift in phase change memory cells,”
Proc. IEEE Int’l Reliability Physics Symposium (IRPS), 2015.

[46] A. Athmanathan et al.,
“A finite-element thermoelectric model for phase-change memory devices,”
Proc. Int’l Conf. on Simulation of Semiconductor Processes and Devices (SISPAD), 2015.

[47] A. Sebastian et al.,
“Crystal growth within a phase change memory cell,”
Nature Communications 5, article 4314, 2014.