Kathleen Martin (Guest)
Lately, I’ve been reporting more and more on algorithms that parse incoming data locally in order to make some sort of decision. Last week, for example, I wrote about a company doing voice analysis locally to detect Alzheimer’s. I’ve also covered startups that process machine vibrations locally to detect equipment failures. Each of these examples benefits from machine learning (ML) algorithms that run on microcontrollers, or what we call TinyML.
Running machine learning algorithms locally helps reduce latency. That means an emerging problem in a machine can be detected and the machine turned off quickly if needed. Local processing also protects privacy, something especially important in the medical sector. Indeed, I would prefer that neither Google nor Alexa be aware if I develop Alzheimer’s.
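As a concrete illustration, a minimal sketch of what on-device inference can look like with TensorFlow Lite for Microcontrollers (one common TinyML runtime) is below. The model array g_vibration_model, the registered ops, the input length, and the 0.8 alert threshold are hypothetical placeholders rather than details from the article.

// Minimal on-device inference sketch using TensorFlow Lite for Microcontrollers.
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_vibration_model[];  // hypothetical compiled-in model

constexpr int kArenaSize = 10 * 1024;    // fixed scratch memory for all tensors
static uint8_t tensor_arena[kArenaSize];

// Returns true when the model flags the latest vibration window as anomalous.
bool MachineLooksUnhealthy(const float* window, int window_len) {
  const tflite::Model* model = tflite::GetModel(g_vibration_model);

  // Register only the ops this model actually uses, keeping the binary small.
  tflite::MicroMutableOpResolver<3> resolver;
  resolver.AddFullyConnected();
  resolver.AddRelu();
  resolver.AddLogistic();

  tflite::MicroInterpreter interpreter(model, resolver, tensor_arena, kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) return false;

  // Copy the sensor window into the input tensor (assumes window_len matches
  // the model's expected input length).
  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < window_len; ++i) input->data.f[i] = window[i];

  if (interpreter.Invoke() != kTfLiteOk) return false;

  // Single sigmoid output: an anomaly score in [0, 1]; 0.8 is an illustrative cutoff.
  return interpreter.output(0)->data.f[0] > 0.8f;
}

Because the model and the sensor window both live in a fixed on-chip buffer, no raw data has to leave the device, which is exactly the latency and privacy benefit described above.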
But as companies push sensitive and necessary algorithms out to the edge, ensuring they perform as intended becomes essential. That is why I spent time this week learning about the security risks facing our future sensor networks running TinyML.
Continue reading: https://staceyoniot.com/how-can-we-make-tinyml-secure-and-why-we-need-to/
 
