We are entering the era of cognitive computing, which holds great promise for deriving intelligence and knowledge from huge volumes of data. In today’s computers, based on the von Neumann architecture, data must be shuttled back and forth between memory and processor at high speed, a task at which this architecture is inefficient.

It is becoming increasingly clear that to build efficient cognitive computers, we need to transition to non-von Neumann architectures in which memory and processing coexist in some form. At IBM Research – Zurich, we explore various such computing paradigms, from in-memory computing to brain-inspired neuromorphic computing. Our research ranges from devices and architectures to algorithms and applications.

Ask the experts

Abu Sebastian

IBM Research scientist

Evangelos Eleftheriou

Department head, IBM Fellow

In-memory computing

In-memory computing could be a significant first step towards non-von Neumann computing. The key idea is to perform certain tasks, such as bit-wise operations and arithmetic operations, directly in memory. We call such a memory unit a computational memory, and resistive memory devices, in particular phase-change memory (PCM) devices, could play an important role as its building blocks. We have found that, if data is stored in PCM devices, the physical attributes of those devices can be exploited to achieve in-place computation. When organized in a crossbar array, PCM devices can be used to perform matrix-vector multiplications with very low computational complexity. One application of this concept is compressed sensing.
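The crossbar matrix-vector multiplication can be sketched numerically. The following is a minimal simulation, not device physics: the function and parameter names are illustrative, and Gaussian programming noise stands in for the imprecision of writing PCM conductances.

```python
import numpy as np

rng = np.random.default_rng(0)

def program_crossbar(A, sigma=0.02):
    """Program matrix A (assumed non-negative, in normalized conductance
    units) onto a PCM crossbar. Writing a conductance is imprecise, so
    each device lands near its target with stochastic error sigma."""
    return np.clip(A + rng.normal(0.0, sigma, A.shape), 0.0, None)

def crossbar_matvec(G, v):
    """Apply voltages v to the rows. Ohm's law gives a current g*v in
    each device; Kirchhoff's current law sums the currents on each
    column wire, so the column readout is G.T @ v in a single step,
    regardless of the matrix size."""
    return G.T @ v

A = rng.uniform(0.0, 1.0, (4, 3))   # target matrix (rows x columns)
x = rng.uniform(0.0, 1.0, 4)        # input voltages on the row wires
G = program_crossbar(A)
y = crossbar_matvec(G, x)           # approximate A.T @ x
print(np.abs(y - A.T @ x).max())    # small, noise-limited error
```

The result is approximate: the accuracy is limited by how precisely the conductances can be programmed, which is exactly what motivates the mixed-precision approach discussed next.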

We are also exploring the concept of mixed-precision computing to counter the lack of precision arising from such matrix-vector multiplication operations. One example is that of solving systems of linear equations. Detecting temporal correlations in an unsupervised manner is another intriguing application of computational memory that exploits the crystallization dynamics of PCM devices.
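One way to realize such mixed-precision computing for linear systems is iterative refinement: an inexact inner solver runs at low precision, while the residual is computed at high precision so that the inner solver's errors are corrected over successive iterations. A minimal sketch, assuming float16 rounding as a stand-in for the limited precision of computational memory (all names and parameters illustrative):

```python
import numpy as np

def low_precision(z):
    """Stand-in for the limited precision of analog in-memory hardware:
    round every value to float16 and back to float64."""
    return np.asarray(z, dtype=np.float16).astype(np.float64)

def solve_mixed_precision(A, b, inner_iters=10, outer_iters=50, tol=1e-6):
    """Iterative refinement: a cheap, inexact inner solver runs at low
    precision, while the residual r = b - A x is computed at high
    precision, so the solution converges beyond the inner precision."""
    x = np.zeros_like(b)
    for _ in range(outer_iters):
        r = b - A @ x                      # high-precision residual
        if np.linalg.norm(r) < tol:
            break
        # Inexact inner solve of A d = r via low-precision Richardson
        # iteration (omega chosen small enough for convergence).
        d = np.zeros_like(r)
        omega = 0.5
        for _ in range(inner_iters):
            d = d + omega * low_precision(r - A @ d)
        x = x + d
    return x

rng = np.random.default_rng(1)
A = np.eye(6) + 0.1 * rng.standard_normal((6, 6))  # well-conditioned
x_true = rng.standard_normal(6)
b = A @ x_true
x = solve_mixed_precision(A, b)
print(np.linalg.norm(x - x_true))  # far below float16's ~1e-3 precision
```

The design point is the split of labor: the bulk of the arithmetic (the inner matrix-vector products) tolerates low precision, while only the cheap residual computation needs full precision.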

Spiking neural networks

Despite our ability to train deep neural networks with brute-force optimization, the computational principles of neural networks remain poorly understood. Hence, significant research is aimed at unraveling the principles of computation in large biological neuronal networks.

It is widely believed that, because of the added temporal dimension, spiking neural networks are computationally more powerful. However, a killer application that transcends conventional deep learning is still missing. Moreover, specialized non-von Neumann computational platforms are needed to implement these novel spike-based algorithms efficiently.

PCM devices could also play a key role in this space. A particularly interesting application is the emulation of neuronal and synaptic dynamics. The ability to alter the conductance levels in a controllable way makes PCM devices particularly well-suited for synaptic realizations. It is also possible to exploit the crystallization dynamics to emulate neuronal dynamics. Such phase-change neurons exhibit intrinsic randomness and can be used for the representation of high-frequency signals via population coding. They can also be used to realize spiking neural networks and the associated learning rules in a highly efficient manner. We are also exploring new architectures and learning algorithms to exploit the potential computational advantages of spiking neural networks.
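A minimal sketch of such a phase-change neuron, treating the "membrane potential" as an abstract crystalline-fraction state variable and modeling the cycle-to-cycle randomness of crystallization as Gaussian jitter (the class and parameter names are illustrative, not a device model):

```python
import numpy as np

rng = np.random.default_rng(42)

class PhaseChangeNeuron:
    """Integrate-and-fire neuron emulated by PCM crystallization.

    The 'membrane potential' is the crystalline fraction of the cell:
    each input spike applies a crystallizing pulse that grows it by a
    stochastic increment, the neuron fires when a threshold is crossed,
    and a melt-quench reset pulse re-amorphizes the cell. The intrinsic
    randomness of crystallization makes the firing stochastic.
    """

    def __init__(self, threshold=1.0, mean_step=0.05, step_jitter=0.3):
        self.threshold = threshold
        self.mean_step = mean_step
        self.step_jitter = step_jitter   # cycle-to-cycle variability
        self.state = 0.0                 # crystalline fraction (0 = amorphous)

    def spike_in(self, weight=1.0):
        """Apply one crystallizing pulse; return True if the neuron fires."""
        step = weight * self.mean_step * (
            1.0 + self.step_jitter * rng.standard_normal())
        self.state += max(step, 0.0)     # crystallization only grows
        if self.state >= self.threshold:
            self.state = 0.0             # reset: melt-quench to amorphous
            return True
        return False

neuron = PhaseChangeNeuron()
# Drive the neuron with a constant input train and record firing times.
fires = [t for t in range(500) if neuron.spike_in()]
intervals = np.diff(fires)
print(intervals.mean(), intervals.std())  # roughly 20 steps apart, with spread
```

The spread of the inter-spike intervals is the intrinsic randomness mentioned above; a population of such neurons driven by the same input produces the distributed spike patterns used in population coding.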

Exploratory memory

A critical element in the emerging non-von Neumann computing paradigms is a very-high-density, low-power, variable-state, programmable and non-volatile nanoscale memory device. The exploration of such a memory element for non-von Neumann computing forms a significant part of our research.

We are at the forefront of research on multi-level phase-change memory (PCM). A significant part of our work is focused on understanding the rich dynamic behavior of PCM devices comprising an intricate feedback interconnection of electrical, thermal and structural dynamics. We are also exploring new device concepts that can reduce the programming energy and counter undesirable attributes such as resistance drift.
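The resistance drift mentioned above is commonly described by an empirical power law, R(t) = R0 (t/t0)^ν, where t0 is the time of the initial read after programming and ν is the drift exponent (on the order of 0.1 for melt-quenched amorphous states, and much smaller for crystalline states). A minimal sketch with illustrative parameter values:

```python
import numpy as np

def drift_resistance(R0, t, t0=1.0, nu=0.1):
    """Empirical power-law model of resistance drift in amorphous PCM:
    R(t) = R0 * (t / t0)**nu, with R0 the resistance read at reference
    time t0 and nu the drift exponent."""
    return R0 * (t / t0) ** nu

R0 = 1e6  # ohms, read at t0 = 1 s after programming (illustrative)
for t in [1, 10, 100, 1000]:
    print(f"t = {t:5d} s  ->  R = {drift_resistance(R0, t):.3e} ohm")
# Each decade in time multiplies the resistance by 10**nu, about 1.26.
```

Because the intermediate resistance levels of a multi-level cell drift apart at slightly different rates, countering this behavior is essential for reliable multi-level storage and for computational-memory applications.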

Another resistive memory concept we pursue is based on carbon. Carbon-based memory could be a significant complement to the rapid advances in carbon-based nano-electronics and could pave the way for potential all-carbon computing devices of the future. The elemental nature of carbon would enable a carbon-based memory to be scaled down to very small feature sizes and would make it immune to the compositional changes that typically plague alternative multi-elemental non-volatile memory materials. Moreover, the high resilience of carbon to a variety of external stimuli would ensure the robustness and endurance of such a carbon-based memory.
