Today’s data-agile economy is characterized by annual growth of roughly 60% in data generation and processing, driven by the continual creation of content, the relentless expansion of cloud applications and the growth of the Internet of Things. At this pace, the share of ICT in the global energy footprint is projected to reach 21% by 2030.
An important aspect of ICT is data processing. Standard computing architectures, however, fail to represent the information structures observed in the real world, which instead have to be virtualized at a large cost to overall efficiency. At the same time, the continual progress of Moore’s law enjoyed over the past decades is starting to flatten out as microelectronic node sizes shrink, and a digital energy wall is being approached, with a required energy on the order of 100 pJ per multiply-accumulate operation (Nature 561, 163 (2018)). This caps the overall scale of what can be achieved in high-performance computing.
On the other side is the human brain: a masterpiece of biology that breaches this energy wall with an astonishing nearly 100 quadrillion (10^17) operations per watt, i.e. a sub-attojoule energy per operation. The brain is thus some eight orders of magnitude more efficient, by virtue of interconnecting its roughly 100 billion (10^11) neurons through a “processor” that fundamentally differs on both the physical and the architectural level.
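The eight-orders-of-magnitude gap can be checked with simple arithmetic: 100 pJ per digital multiply-accumulate versus a sub-attojoule brain operation. A minimal sketch, where the 1 aJ figure is an assumed representative value for "sub-attojoule", not a measured number:

```python
import math

# Energies per operation, in joules.
e_digital_mac = 100e-12  # 100 pJ per multiply-accumulate (digital energy wall)
e_brain_op = 1e-18       # ~1 aJ, assumed stand-in for "sub-attojoule"

# Efficiency gap expressed in orders of magnitude.
gap = math.log10(e_digital_mac / e_brain_op)
print(f"Efficiency gap: {gap:.0f} orders of magnitude")  # prints 8
```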
JOLLYBEE is a research-driven project that builds on this bio-inspired computing architecture to mimic the natural processing capabilities of the brain. It does so by blending two key enabling technologies – artificial intelligence and photonics – aiming to develop a neuromorphic platform propelled by the GHz operation of optical neural networks and their ultra-low time-of-flight inference latencies. Through the experimental demonstration of a chip-scale optical neural network, JOLLYBEE will provide the means for distributed artificial intelligence in mission-critical applications at the edge of today’s ICT infrastructure.
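The time-of-flight latency mentioned above can be made concrete with a back-of-the-envelope estimate: in an integrated photonic circuit, an inference result is available roughly once light has traversed the on-chip optical path. A hedged sketch, where the path length and group index are illustrative assumptions rather than JOLLYBEE specifications:

```python
C = 3.0e8           # speed of light in vacuum, m/s
N_GROUP = 4.0       # assumed group index of the waveguide platform
PATH_LENGTH = 0.01  # assumed on-chip optical path length: 1 cm

# Inference latency ~ propagation time of light through the chip.
latency_s = PATH_LENGTH * N_GROUP / C
print(f"Time-of-flight latency: {latency_s * 1e12:.0f} ps")
```

Under these assumptions the latency comes out at roughly 130 ps, i.e. well below a nanosecond, which is consistent with the GHz-rate operation targeted by the project.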