Can TinyML really provide on-device learning?
By Stacey Higginbotham
Imagine if your smart speaker could be trained to recognize your accent, or if a pair of running shoes could alert you in real time if your gait changed, indicating fatigue. Or if, in the industrial world, sensors could parse in real time the vibration data from a machine that changes location and function often, halting the machine if that data suggested a problem.
We often write about the value of on-device machine learning (ML), but what we're generally discussing is running existing models on a device and matching incoming data against the established model. This is known as inference. So when you say the name "Alexa," your smart speaker matches the pattern and wakes up. Inference is great, and there is a robust community of researchers and product managers adding on-device machine learning to phones, cameras, wearables, and more. But the next big research goal is on-device learning.
— A slide from a Qualcomm presentation on the benefits and challenges of machine learning. Image courtesy of Qualcomm.
In the ML world, on-device learning is generally referred to as "training." Training, in which a researcher feeds data into computers running different types of models in order to produce a usable algorithm, typically takes place in the cloud.
But training in the cloud requires a lot of data and a lot of compute power, which is why getting a small device, such as a sensor or wearable, to take in data locally and adjust its algorithm accordingly feels impossible. If researchers can make on-device learning real, however, it would open up a lot of use cases.
One is personalization. So in the earlier example, if an individual says "Awexa" instead of "Alexa," the wake-word recognition algorithm could adapt over time, learning that in this individual's home, "Awexa" is the wake word.
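To make that concrete, here is a toy sketch of the idea. Everything in it (the feature vectors, the threshold, the update rule) is my own illustration, not how Alexa or any real wake-word engine actually works: a template matcher that slowly nudges its stored template toward the pronunciations that actually trigger it.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / ((na * nb) or 1.0)  # guard against silent (all-zero) frames

class WakeWordTemplate:
    """Toy wake-word matcher that adapts locally (hypothetical sketch)."""

    def __init__(self, template, threshold=0.85, rate=0.2):
        self.template = list(template)  # stored acoustic features for "Alexa"
        self.threshold = threshold      # similarity needed to wake up
        self.rate = rate                # how fast the template adapts

    def hears(self, features):
        """Return True if this utterance wakes the device, adapting as we go."""
        if cosine_sim(self.template, features) >= self.threshold:
            # Nudge the stored template toward this speaker's pronunciation.
            self.template = [
                (1 - self.rate) * t + self.rate * f
                for t, f in zip(self.template, features)
            ]
            return True
        return False
```

After enough near-miss "Awexa" utterances clear the threshold, the template drifts toward the speaker's pronunciation, so later "Awexa"s score higher than they did on day one. A real system would use richer signals (a repeated attempt, an explicit confirmation) before adapting, but the shape of the idea is the same.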
While personalization is a compelling reason to focus on local on-device learning, it's not the only one. Remember that, broadly speaking, doing anything locally with machine learning can save on bandwidth and connectivity costs, reduce power consumption, and cut latency. Because data isn't heading up to the cloud, it also protects privacy. Add learning to the mix and you can protect privacy even further, because identifying data doesn't need to head up to the cloud for training, either.
So in areas where connectivity is expensive or intermittent, on-device learning might make it easier to, say, train camera traps in a rainforest to recognize different animals. Or it might allow the personalization of anomaly detection on a machine running in a mine in remote Western Australia.
For all of these reasons, this week I spent two days virtually watching a series of presentations put on by the TinyML Foundation discussing on-device learning at the edge.
In one presentation, a researcher from Qualcomm showed how pulling biometric data from local devices to create a hash value would let companies use local on-device learning to perform biometric authentication without needing to send personal data to the cloud. (Here is the related research paper, because there's no way I can explain this succinctly in a paragraph or two.)
After watching these presentations, it's clear to me that the research community sees promise in on-device machine learning. But the technology is still in its early stages. There are numerous challenges associated with training models on devices with little computing power and memory. Local on-device learning in particular introduces new security challenges, such as adversarial attacks on sensors. Imagine if, for example, you were able to access the Amazon speaker I referenced earlier and train it to respond only to "Alesta."
There are also challenges associated with testing and scaling models that run on local devices. So how does one ensure that an on-device model that's trained locally is performing as intended?
Let's look at some of the solutions proposed in this week's presentations to address the limitations of microcontrollers. TinyML is the practice of running machine learning algorithms on very constrained hardware, including microcontrollers, which have limited computing power and memory measured in kilobytes. So before you can train at the sensor edge, you need an ML framework designed to run on such constrained devices.
Among the presentations I watched was one from Google Sr. Program Manager Bill Luan, who showed off TensorFlow Lite for Microcontrollers, a framework with a runtime core that fits in just 16 KB and can run on battery power. Luan said that an accompanying Coral Dev Board Micro would be out in mid-October. This developer board includes an Arm Cortex-M4 core to run TensorFlow Lite for Microcontrollers and a beefier Arm Cortex-M7 core that could run TensorFlow Lite. He demonstrated on-device learning using a Raspberry Pi, which is a far more robust computer than a microcontroller, but said he was working on a demonstration of an on-device learning model for shape and color sorting that would run on the Coral Dev Board Micro.
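Demos like that shape-and-color sorter often implement on-device learning in a deliberately lightweight way: a frozen model turns each input into a small embedding vector, and the only thing the device actually learns is a running mean embedding per class. I don't know the details of Luan's demo, so treat this as a minimal sketch of the general strategy (the class name and labels are my own inventions):

```python
class CentroidLearner:
    """Nearest-centroid classifier over embeddings from a frozen model.

    Only the per-class mean vectors are learned on device, which keeps
    both the compute and the memory cost tiny. Hypothetical sketch, not
    Coral's actual API.
    """

    def __init__(self):
        self.centroids = {}  # label -> (sample count, mean embedding)

    def learn(self, label, embedding):
        """Fold one labeled example into that class's running mean."""
        if label not in self.centroids:
            self.centroids[label] = (1, list(embedding))
            return
        count, mean = self.centroids[label]
        count += 1
        # Incremental mean update: mean += (x - mean) / n
        mean = [m + (e - m) / count for m, e in zip(mean, embedding)]
        self.centroids[label] = (count, mean)

    def classify(self, embedding):
        """Return the label whose centroid is closest (squared distance)."""
        def dist2(label):
            mean = self.centroids[label][1]
            return sum((m - e) ** 2 for m, e in zip(mean, embedding))
        return min(self.centroids, key=dist2)
```

A few examples per class are enough to start sorting, and adding a brand-new class at runtime is just another call to `learn`, which is exactly the kind of flexibility cloud-trained models can't offer on their own.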
Having a framework for tiny training models is only one step toward making on-device learning for MCUs real. Researchers also have to find the right neural network strategies for each use case. For example, in another presentation I watched, Valeria Tomaselli of STMicroelectronics proposed using echo state networks (ESNs) to detect changes in time series data for anomaly detection. ESNs are a form of recurrent neural network that can readily parse such data. A researcher from Siemens gave a presentation in which he suggested retraining the top layer of a neural network at the edge while keeping the bottom layers frozen.
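To get a feel for why ESNs suit constrained devices, here's a minimal sketch (the sizes, rates, and online update rule are my own illustrative choices, not STMicroelectronics' design). The recurrent reservoir weights are random and never trained; only the linear readout adapts, here with a cheap least-mean-squares update, and a spike in prediction error flags a candidate anomaly:

```python
import math
import random

class TinyESN:
    """Minimal echo state network for time-series anomaly detection.

    Only the linear readout is trained, with an online least-mean-squares
    update; the recurrent reservoir weights stay fixed and random. That
    split is what makes ESNs attractive on constrained hardware.
    """

    def __init__(self, n_reservoir=20, leak=0.3, lr=0.05, seed=0):
        rng = random.Random(seed)
        self.n = n_reservoir
        self.leak = leak
        self.lr = lr
        # Fixed random input and reservoir weights (never trained).
        self.w_in = [rng.uniform(-0.5, 0.5) for _ in range(self.n)]
        self.w_res = [[rng.uniform(-0.1, 0.1) for _ in range(self.n)]
                      for _ in range(self.n)]
        self.state = [0.0] * self.n
        # Trainable linear readout predicting the next input sample.
        self.w_out = [0.0] * self.n

    def _update_state(self, u):
        """Leaky-integrator reservoir update with tanh nonlinearity."""
        new = []
        for i in range(self.n):
            pre = self.w_in[i] * u + sum(
                self.w_res[i][j] * self.state[j] for j in range(self.n))
            new.append((1 - self.leak) * self.state[i]
                       + self.leak * math.tanh(pre))
        self.state = new

    def step(self, u):
        """Feed one sample; return its prediction error (high = candidate
        anomaly) after adapting the readout online."""
        pred = sum(w * s for w, s in zip(self.w_out, self.state))
        err = u - pred
        # LMS update of the readout only; the reservoir never changes.
        self.w_out = [w + self.lr * err * s
                      for w, s in zip(self.w_out, self.state)]
        self._update_state(u)
        return abs(err)
```

Feed it a periodic, vibration-like signal and the readout learns to predict the next sample; an out-of-pattern sample then produces a much larger prediction error than recent in-pattern samples, which is the anomaly signal.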
Instead of focusing on the math behind an algorithm, other researchers envisioned new hardware. Kaoutar El Maghraoui, principal research scientist at IBM, proposed in her presentation specialized hardware, including chips that use in-memory processing or analog processing, as well as probabilistic computing, to conserve power. She added that IBM is also researching quantum AI, but that is much farther out than analog or probabilistic computing.
For many in business, the idea of personalization and privacy through on-device machine learning is exciting, but it's also still far off. Today, companies are just starting to use ML at the edge, and only for inference, running incoming data against an existing algorithm. Using incoming data to locally teach an AI is still the stuff of science fiction and research labs. But the benefits it could offer make it worth keeping an eye on.
Infineon makes IoT work SPONSORED
The Internet of Things makes the way we live, work, commute, and communicate easier by connecting devices worldwide – each equipped with powerful electronics to enable contextual awareness, reliable wireless connectivity and robust security.
But building intelligent, connected and secured products for the IoT can be challenging.
Infineon's capabilities in sensing, computing, actuating, connecting and securing create the backbone of each IoT system and make the IoT what it needs to be: secure, easy, and real.
Click here to learn more.
It's not yet fall, but Matter is already here
If you know where to look, the Matter smart home protocol is already here. At the IFA show in Berlin, which took place this week, Eve Systems is showing off its sensors interoperating with other Matter-compatible devices, while Google has a sign-up sheet for people to access Matter as part of a private developer preview program. If you can get access to that SDK, you could likely run Matter in your own home before the official launch.
I also just got a notice from Shortcut Labs, the maker of Flic smart buttons and a connected dial, that its second-generation buttons and its Flic Twist product will support Matter. This is terribly exciting to me because it opens up a new world of devices that I can control with my Flic buttons, likely without needing to go through IFTTT, as I currently do for certain use cases.
— The Eve booth at IFA has several Matter demonstrations. Image courtesy of Eve Systems.
None of this should come as a surprise. Back in March, the Connectivity Standards Alliance, the organization that controls the Matter protocol, said it would delay the release of the standard until the fall. It's not quite astronomical fall, which begins on the autumnal equinox, but September does mark the start of meteorological fall.
In the meantime, I've heard from a dozen different sources that the plug fests and testing events held over the summer went really well, with most devices interoperating and needing only a few tweaks. This is great news for those of us who have been eagerly waiting for Matter since it was first announced in December of 2019.
Matter was created to address the interoperability challenges of the smart home. The last decade has been full of frustration for users as they wondered whether their Nest thermostat would work with their Alexa speaker or whether their Eve sensor would work with their Hue light bulbs. The answer to the first one was yes, and the answer to the second one is no, unless you're running it through HomeKit. You can see why people were hesitant to spend hundreds of dollars on these gadgets.
With the full standard expected some time this month, it's probably time to clarify what Matter will and won't do. For example, Matter is only going to cover a small subset of devices at first. Big devices that aren't covered include video cameras, video doorbells, appliances, and robotic vacuum cleaners. It will also likely cover a small set of use cases as part of the data models associated with individual devices.
For example, devices such as light bulbs might share state, light level, and color using Matter, but not full-on scenes. In other cases, a smart plug might share state, but not electricity usage, with Matter devices. We're going to talk about some of this during the panel I'm hosting at Silicon Labs' Works With event on Sept. 13. (Silicon Labs is a sponsor, but this panel will be essential for anyone who cares about Matter.)
I also have some concerns about how easy it will be for developers to build for Matter and have it work seamlessly with Google and Alexa. Both tech firms have additional programs and SDKs that developers will want to use to troubleshoot their device integrations with each digital assistant, and also to implement additional functionality associated with setting up a device to work with voice assistants.
And when it comes to troubleshooting, I'm curious how consumers will be able to find problems in their Matter smart home and assign those problems to the right vendor to fix. For example, if I have a Matter light bulb that turns on with Google, but not Alexa, is the issue with Amazon or the light bulb maker? For more complex routines, finding broken links could be even more frustrating.
So while I am really excited to be able to take my Flic 2 buttons and have them work with more devices, I'm also preparing for some inevitable glitches as the technology rolls out at scale. Maybe I should sign up for that private beta to see what works and what doesn't.
Matter Panel at Works With Developer Conference SPONSORED
Matter is coming.
Stacey Higginbotham will lead a keynote panel discussion with IoT experts from Google, Comcast, Samsung SmartThings, Connectivity Standards Alliance (CSA), ARM and Silicon Labs on Matter, the highly anticipated new wireless standard. Now that its launch is just around the corner, what can we expect, and what should we as an industry do to take full advantage of it?
Register for Works With to watch this panel at workswith.silabs.com.
Episode 387: Is Kickstarter still relevant for smart devices?
This week we start off talking about the Federal Trade Commission suing a data broker for sharing sensitive location data. It’s a topic we’re following closely, in part because location information can’t be anonymized even when companies promise that they strip identifying information from it. With that in mind, Fight for the Future, a nonprofit focused on consumer privacy, is asking the FTC to prevent large tech firms from getting access to car data. In more data-sharing news, we talk about Adrich, a Pennsylvania company that has found some success selling Bluetooth tags that track how much of a product has been used and can reorder it for consumers. But it also shares product usage data with the company making the product. Then we kick off the IFA conference with some news bits from the Home Connectivity Alliance, which added new members and held a plug fest, as well as updated products from Eve. Also, Tado has created a subscription plan to optimize for low energy prices. For those interested in the evolution of the security business, check out ADT’s deal with Uber to monitor drivers and riders on request. And for those who want to understand the consolidation happening in the IoT connectivity sector, we talk about Telit’s latest acquisition.
— Image courtesy of Woosh.
This week’s guest is Winston Mok, the founder and product lead of Woosh, a company making a connected air filter. We talk about how Woosh works, its focus on sustainability, and how it plans to integrate with existing smart home services. We also talk about Mok’s decision to use Kickstarter to launch the connected air filter, a decision that would have been a no-brainer back in 2014 but seems almost quaint now. Mok explains why he thinks Kickstarter was a good option for Woosh and shares some of the benefits he got from launching on the platform. He also discusses how it helped him prepare for manufacturing at scale amid the chip shortage, and shares advice on dealing with that situation. It’s a really useful interview.
This week on the IoT Podcast Hotline, we answer a listener’s question about what’s needed to run Hue bulbs even when the internet is out. The IoT Podcast Hotline is brought to you by Works With.
Works With by Silicon Labs has emerged as the go-to developer conference for building the skills needed to create impactful connected devices. On Sept. 13–15, Silicon Labs is bringing together the most influential technology brands, ecosystem partners, and developers for three days of technical training, keynotes, and expert panels. Learn more at workswith.silabs.com.
News of the Week
Hello Kitty, meet Raspberry Pi: There’s yet another use for the inexpensive Raspberry Pi compute board. And you’re not likely to guess it, so I’ll just tell you: Someone made a cat doorbell using a Pi. No, the cat doesn’t press a button or show up on video. Instead, the Raspberry Pi listens at the front door and uses the open source TensorFlow Lite machine learning platform to determine if it hears a “meow.” I guess that’s to keep away any dogs posing as door-to-door salesmen. If the cloud-connected Raspberry Pi confirms a feline voice, it sends a text message, alerting you to let the kitty in the house. Clever! (Tom’s Hardware) — Kevin C. Tofel
Ring’s new product is an intercom: There’s a new addition to Ring’s hardware lineup and it sounds great for apartment dwellers. The Ring Intercom integrates with a legacy intercom system, adding mobile access to it. Ring’s add-on has a Wi-Fi radio, so it can route visitors’ voices to your phone, even if you’re not at home. And if your current system lets you buzz people into your building, you can do that remotely, too. Starting later this month the product will be available in the U.K. for £119.99, with additional European country support to follow. Look for U.S. availability next year. (The Verge) — Kevin C. Tofel
Withings is going big with a new subscription and scale: We have a range of smart scales to choose from these days, so what’s one more? This Body Comp from Withings stands out for two reasons. First, it adds sensors and algorithms for a nerve health assessment. That’s in addition to the body composition and cardiovascular assessment you’ll find on competing products. Second, Withings is launching a new Health+ subscription program that comes with the Body Comp scale. And by “comes with” I mean you’ll pay $209.95 for the scale and a year of Health+. Given that I’ve paid $25 for a smart scale, I can see why Withings is going the subscription route. All that health data being analyzed provides additional value, but if you only charge for the hardware, you’re giving that value away for free. Look for the Withings Body Comp to go on sale next month. (9to5 Mac) — Kevin C. Tofel
Automotive data company Otonomo to lay off employees: This is my least favorite type of news to write, but times are tough and the bloom is off the rose when it comes to the years of easy money for tech companies. Among the latest examples is connected car company Otonomo, which has lost 90% of its public value and will lay off dozens of employees. Otonomo went public via a special purpose acquisition company (SPAC) last year and peaked at a valuation of $1.26 billion. As of Thursday, it is worth just $54.3 million. For more on the company, check out a podcast I did with one of its executives two years ago. (CTech) — Stacey Higginbotham
Intel has launched new edge chips: Intel has released its 12th Gen Intel Core SoC processors for IoT Edge. These aren't designed for the battery-powered sensor edge, but rather for gateways and computers that will run locally, taking in data from sensors and other connected devices. Intel has lowered the power consumption and beefed up the graphics capabilities with this generation of chips, enabling them to power kiosks and other devices that have screens. (Intel) — Stacey Higginbotham
Are you ready to think about the privacy implications of the metaverse? Even if you aren't, check out this research paper, which lays out some of the potential concerns around privacy in augmented and virtual reality and then provides some legislative options for protecting user privacy. It's full of terrifying information, such as the fact that commercial extended reality systems can track body movements 90 times per second and that 20 minutes in a VR simulation creates just under 2 million unique body language recordings that can include verbal utterances, the location of items in a room, body movement patterns, and more. (SSRN) — Stacey Higginbotham
The FCC hears back from telcos on privacy inquiry: A few weeks ago the FCC sent letters to the major wireless carriers asking about the location data they collect and what they do with it. The results are in, and most admit to tracking users' locations to manage their network and provide quality service. Many also have various programs where other parties can get your data. Sometimes those programs are opt-in options like person-tracking services for families, while others are opt-out such as marketing messages from the carrier. In all cases, the carriers provide location information to law enforcement when faced with a warrant. Check out the responses and see how your cell phone provider shares your location data. (FCC) — Stacey Higginbotham
Carriers aren't the only source of location data: While cell phone carriers do track a user's location and will share it with law enforcement when presented with a warrant, not every cop needs to get one. The Electronic Frontier Foundation has discovered that police agencies across the country have contracted with a company called Fog Data Sciences to buy location information gathered from 250 million devices. The location data is often gathered from applications that people download onto their phones. Those app developers then take the location data and sell it to data brokers and companies like Fog Data. The EFF says Fog Data lets law enforcement subscribe to a service that lays out the location of these devices on a map that officers can use to see what devices are in the vicinity of a crime or follow specific devices as they roam about a city. Officers don't need a warrant for this, although some do get one. This is terrifying and should be clearly disclosed to users. It probably should also be illegal or at least inadmissible in court. (EFF) — Stacey Higginbotham
The taxman cometh ... with computer vision and a drone: The French equivalent of the IRS is using a drone running a computer vision algorithm to scan the countryside for homes with undisclosed swimming pools. When it finds them, it can then serve the homes with higher tax bills based on those homes having a higher value. It's similar to cities in the U.S. that are using drones to scan for well-watered lawns during droughts in order to caution the homeowners or fine them for using too much water. I admire the ability to get additional tax revenue from scofflaws, but it's still a new level of surveillance wrought by AI and IoT. (Ars Technica) — Stacey Higginbotham
Wyld Networks will work with Miromico AG for next generation IoT satellite service: Wyld Networks, a company providing LoRaWAN coverage via low-earth satellites, has signed a deal with Swiss firm Miromico. Miromico will help Wyld design and manufacture sensor-to-satellite LoRaWAN terminals and modules. Miromico will also resell Wyld terminals and connectivity, while Wyld will sell Miromico sensors and use those sensors in solutions it develops for clients. (Via Satellite) — Stacey Higginbotham
We have three ad spots remaining this year and just opened Q1 of next year. Request a media kit for more details. Thanks!