Kathleen Martin
Guest
In one corner of the AI/ML world, NVIDIA does its best to convince anyone who wants to dip their toes into training that more raw power is the answer. In the other, CEA-Leti recently announced that Elisa Vianello, an Edge AI program coordinator, nabbed a €3 million grant from the European Research Council (ERC) to develop new edge AI systems inspired by insect nervous systems.
According to Vianello, one of the biggest challenges in bringing AI directly to IoT devices, which would allow them to make autonomous decisions by comparing sensor input with trained data and decision trees, is that current chip architectures waste up to 90% of their total energy consumption on moving data around, not processing it.
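A rough way to see how a figure like that arises is to compare the energy cost of an arithmetic operation with the cost of fetching its operands from off-chip memory. The sketch below is a minimal back-of-envelope illustration in Python; the per-operation energy numbers are assumptions chosen to be broadly plausible for a conventional chip, not measurements from any specific device.

```python
# Back-of-envelope sketch of why data movement dominates energy on a
# conventional (von Neumann) chip. The per-operation energies below are
# illustrative assumptions; real values vary widely by process and memory type.

PJ_PER_MAC = 4.0           # assumed energy for one multiply-accumulate, in picojoules
PJ_PER_DRAM_WORD = 640.0   # assumed energy to fetch one 32-bit word from off-chip DRAM


def energy_breakdown(num_macs: int, dram_words_fetched: int) -> dict:
    """Split total energy into compute vs. data movement for a tiny workload."""
    compute_pj = num_macs * PJ_PER_MAC
    movement_pj = dram_words_fetched * PJ_PER_DRAM_WORD
    total_pj = compute_pj + movement_pj
    return {
        "compute_pj": compute_pj,
        "movement_pj": movement_pj,
        "movement_share": movement_pj / total_pj,
    }


# A small fully connected layer: 256 inputs x 128 outputs, weights streamed from DRAM.
stats = energy_breakdown(num_macs=256 * 128, dram_words_fetched=256 * 128 + 256)
print(f"Data movement share of energy: {stats['movement_share']:.0%}")  # ~99% here
```

Under these assumed numbers, nearly all of the energy goes to fetching weights rather than computing with them, which is exactly the imbalance in-memory computing tries to eliminate.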
See also: Researchers Surpass Speech Recognition Methods with Help From Honey Bees
Because of this waste, IoT devices are either hindered in their AI capabilities or have to be physically tethered to a stable power supply, which makes them far less flexible than many organizations would like. That constraint matters doubly for one proposed application: implantable medical diagnostic microchips, which depend heavily on the user's trust in the device's reliability.
What’s the hiccup when it comes to small-scale devices? There simply isn’t a type of memory that is high-density, high-resolution, non-volatile, and effectively unlimited in endurance. Vianello says many industry labs and research centers have tried developing nanoscale in-memory computing architectures, but the results have been mixed at best. DRAM, for example, is volatile, meaning its contents are lost when power is cut, a likely occurrence in many IoT settings. Non-volatile memory types, such as NVRAM, have dramatically improved in endurance over the years, but they still wear out eventually.
Vianello and her team will use the grant funding to research how the functions of specific insect nervous systems resemble the functions performed by deterministic, probabilistic, volatile, and non-volatile memory and then explore how those could be recreated in “high-performance, energy-efficient, silicon-based nanosystems.” Vianello says, “Crickets make accurate decisions based on sluggish, imprecise, and unreliable neurons and synapses in order to escape their predators. Looking closely at their biology, we identified a diversity of memory-like functions at play in their sensory and nervous systems. By combining these different functions, the cricket’s internal computing system achieves amazing performance and energy efficiency.”
For example, crickets have several sensors on their body, along with numerous local processing units in their abdomen, capable of continuous learning and decision-making without involving the central brain. Because the cricket effectively has a distributed computing system, it can make decisions faster, without having to transfer data from one place to another before processing it.
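To make that architectural contrast concrete, here is a minimal sketch of the difference between shipping every raw sample to a central processor and letting each node decide from its own data. The node classes, thresholds, and "escape" decision are hypothetical illustrations, not a model of actual cricket neurobiology or of CEA-Leti's design.

```python
# Minimal sketch contrasting centralized vs. distributed ("cricket-style") processing.
# All names and thresholds are hypothetical; the point is how much data must move.

from statistics import mean


class CentralizedSystem:
    """Every sensor ships raw samples to one central unit, which then decides."""

    def decide(self, all_samples: list[list[float]]) -> bool:
        # All raw data crosses the system before any decision is made.
        flattened = [s for samples in all_samples for s in samples]
        return mean(flattened) > 0.5


class LocalNode:
    """A local processing unit that decides from its own sensor only."""

    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold

    def decide(self, samples: list[float]) -> bool:
        return mean(samples) > self.threshold


class DistributedSystem:
    """Each node votes locally; only one bit per node needs to be communicated."""

    def __init__(self, num_nodes: int):
        self.nodes = [LocalNode() for _ in range(num_nodes)]

    def decide(self, all_samples: list[list[float]]) -> bool:
        votes = [node.decide(samples) for node, samples in zip(self.nodes, all_samples)]
        return any(votes)  # e.g. trigger an escape if any node detects a threat


readings = [[0.1, 0.2], [0.9, 0.95], [0.6, 0.7]]   # per-node sensor samples
print(CentralizedSystem().decide(readings))         # central unit needs all raw data
print(DistributedSystem(3).decide(readings))        # nodes decide locally, send 1 bit each
```

In the distributed version, the only data that moves is one decision bit per node, which is the kind of saving a distributed, sensor-local architecture is meant to capture.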
Continue reading: https://www.rtinsights.com/are-bugs-the-future-of-ai-in-the-internet-of-things/