Avoiding the coming IoT dystopia
Bradley Kuhn works for the Software Freedom Conservancy (SFC) and part of what that organization does is to think about the problems that software freedom may encounter in the future. SFC worries about what will happen with the four freedoms as things change in the world. One of those changes is already upon us: the Internet of Things (IoT) has become quite popular, but it has many dangers, he said. Copyleft can help; his talk is meant to show how.
It is still an open question in his mind whether the IoT is beneficial or not. But the "deep trouble" that we are in from IoT can be mitigated to some extent by copyleft licenses that are "regularly and fairly enforced". Copyleft is not the solution to all of the problems, all of the time—no idea, no matter how great, can be—but it can help with the dangers of IoT. That is what he hoped to convince attendees of in his talk.
A joke that he had seen at least three times at the conference (and certainly before that as well) is that the "S" in IoT stands for security. As everyone knows by now, the IoT is not about security. He pointed to some recent incidents, including IoT baby monitors that were compromised by attackers in order to verbally threaten the parents. This is "scary stuff", he said.
![Bradley Kuhn](https://static.lwn.net/images/2019/lca-kuhn-sm.jpg)
The IoT web cameras that he uses at his house to monitor his dogs have a "great way" to avoid any firewalls they may be installed behind. They simply send the footage to the manufacturer's servers in China so that he can monitor his dogs from New Zealand or wherever else he might be. So there may be Chinese "hackers" watching his dogs all day; he hopes they'll call if they notice the dogs are out of water, he said to laughter.
As a community, we are quite good at identifying problems, which is why that joke has been repeated so often, for example. And even though we know that the IoT ecosystem is a huge mess, we haven't really done anything about it. Maybe the task just seems too large and too daunting for the community to be able to do something, but Linux has always provided the ultimate counter-example.
Many of us got started in Linux as hobbyists. There was no hardware with Linux pre-installed, so we had to get it running on our desktops and laptops ourselves. That is how he got his start in the Linux community; when Softlanding Linux System (SLS) didn't work on his laptop in 1992, he was able to comment out some code in the kernel, rebuild it, and get it running. Now it is not impossible to find pre-installed Linux on laptops, but, in truth, most people install Linux on their laptops themselves, like in 1992.
The hobbyist culture, where getting Linux on your laptop meant putting it there yourself, is part of what made the Linux community great, Kuhn said. The fact that nearly everyone at his talk was running Linux on their laptop (and only a tiny fraction of those had it come pre-installed) is a testament to the hard work that many have done over the years to make sure we can buy off-the-shelf hardware and run Linux on it. It was the hobbyist culture that made that happen and they did so in spite of, not with the help of, the manufacturers.
If he said that he didn't think it was important that users be able to install Linux on their laptops, no one would think that was a reasonable position—"not at LCA, anyway". But that is exactly what we are being told in IoT-land. The ironic thing is that most of those devices do come with Linux pre-installed. So you still can't buy a laptop with Linux on it, for the most part, but you can't buy a web camera (or baby monitor) without Linux on it.
If, in 1992, we had heard that 90+% of small devices would come with Linux pre-installed in 2019, we would have been ecstatic, he said. But the problem is that to a large extent, there are no re-installs. Some people do install alternative firmware on their devices, but the vast majority do not. And, in fact, most IoT device makers seek to make it impossible for users to re-install Linux.
Help from the GPL
But the GPL is "one of the most forward-looking documents ever written—at least in software", Kuhn said. While the GPL did not anticipate the advent of IoT, it already contains the words needed to help solve the problems that come with IoT: "the scripts used to control compilation and installation of the executable". The freedom to study the code is not enough; the GPL requires more than just the source code so that users can exercise their other software freedoms, which includes the freedom to modify and to install modified versions of the code.
The Linksys WRT54G series of home WiFi routers is one of the first IoT devices to his way of thinking. These devices brought a major change to people's homes. Like Rusty Russell (whose keynote earlier that day clearly resonated with Kuhn), he remembers going to conferences that had no WiFi. He was not surprised to hear that someone printed out the front page of Slashdot at the first LCA.
The WRT54G was the first device where "we did things right": a large group of people and organizations got together and enforced the GPL. That led to the OpenWrt project that is still thriving today. The first commit to the project's repository was the source code that the community had pried out of the hands of Linksys (which, by then, had been bought by Cisco).
But the worry is that OpenWrt is something of a unicorn. There are few devices that even have an alternative firmware project these days. It is, he believes, one of the biggest problems we face in free software. We need to have the ability to effectively use the source code of the firmware running in our devices.
Another example of a project that came about from GPL-enforcement efforts is SamyGO, which creates alternative firmware for Samsung televisions. That project is foundering at this point, he said, which is sad. He does not understand why the manufacturers don't get excited by these projects; at the very least, they will sell a few more devices. As an example, the WRT54G series is the longest-lived router product ever made; it may, in fact, be the longest-lived digital device of any kind.
GPL enforcement can make these kinds of projects happen. But there is a disconnect between two different parts of our community. The Linux upstream is too focused on big corporations and their needs, to the detriment of the small, hobbyist users, he said to applause. Multiple key members of the Linux leadership take the GPL "cafeteria style"; they really only want the C files that have changed and do not care if the code builds or installs.
But all of those leaders, and lots of others, got their start by hacking on Linux on devices they had handy. Kuhn pointed to a 2016 post by Matthew Garrett that described this problem well.
The next generation of developers will come from the hobbyist world, not from IBM and other corporations, Kuhn said. If all you need to hack on Linux is a laptop and an IoT device, that will allow lots of people, from various income levels, to participate.
Users and developers
In the software world, there is a separation between users and developers, but free software blurs that distinction. It says that you can start out as a user but become a developer if you ever decide you want to; the source code is always there waiting. He worries that our community will suffer if, someday, the only places you can install Linux are in big data centers and, perhaps, in laptops. If all the other devices that run Linux can only be re-installed by their manufacturers, where does the next crop of upstream developers come from? That scenario would be a dystopia, he said; we may have won the battle on laptops, but we're losing it for IoT.
Linux is the most important GPL program ever written, Kuhn said; it may be the most important piece of software ever written, in fact. It was successful because of, not in spite of, its GPL license; he worries that history is being rewritten on that point. Linux was also successful because users could install it on the devices they had access to. He believes that the leaders of upstream Linux are making a mistake by not helping to ensure that users can install their modified code on any and every device that runs Linux. "Tinkering is what makes free software great"; without being able to tinker, we don't have free software.
Upstream does matter, but it does not matter as much as downstream. He said he was sorry to any upstream developers of projects in the audience, but that their users are more important than they are. "It's not about you", he said with a grin.
It is amazing to him, and was unimaginable to him in 1992, that there are many thousands of upstream Linux developers today. But he really did not anticipate that two billion people would have Linux on their devices; that is even more astounding than the number of developers. But most of those two billion potential Linux developers aren't actually able to put code on their devices because those devices are locked down, the process of re-installing is too difficult, or the alternative firmware projects haven't found the time to do the reverse engineering needed to be able to do so.
The upstream Linux developers are important; they are our friends and colleagues. But upstream and downstream can and do disagree on things. Kuhn strongly believes that there is a silent plurality and a loud minority that really want to see IoT devices be hackable, re-installable, changeable, and "study-able". That last is particularly important so that we can figure out what the security and privacy implications of these devices are. Once again, that was met with much applause.
Being able to see just the kernel source is not going to magically solve all of these problems, but it is a "necessary, even if not sufficient condition". We have to start with the kernel; if some device maker wants to put spyware in, the kernel is an obvious place to do so. The kernel's license is fixed, and extremely hard to change as we frequently hear, but it is the GPLv2, which gives downstream users the rights they need. Even if upstream does not prioritize those rights the same way as its downstream users do, it has been quite nice and licensed its code in a way that assures software freedom.
There is no need for a revolution to get what we need for these IoT devices. The GPL already has the words that we need to ensure our ability to re-install Linux on these devices. We simply need to enforce those words, he said.
Call to action
Kuhn put out a call to action for attendees and others in our community. There are some active things that people can do to help this process along. The source code offer needs to be tested for every device that has one. Early on, companies figured out that shipping free software without an offer for the source code was an obvious red flag indicating a GPL violation. So they started putting in these offers with no intention of actually fulfilling them. They believe that almost no one will actually make the request.
That needs to change. He would like everyone to request the source code using the offer made in the manual of any device they buy. It is important that these companies start to learn that people do care about the source code, so even if you do not have the time or inclination to do anything with it, it should still be requested. Every time you buy a Linux-based device, you should have the source code "or something that looks like it" or you should request it.
If you have the time and skills, try to build and install the code on the device. If you don't, see if a friend can help or ask for help on the internet. Re-installing can sometimes brick your device, but that probably means that the company is in violation of the GPL. The idea is to create a culture within our community that publicizes people getting source releases and trying to use them; if they do not work, which is common, that will raise the visibility of these incomplete source releases.
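For readers who want to try this, a rough first pass over a source release might look something like the sketch below. It is only an illustration: the file patterns it checks are assumptions about what a usable release tends to include (kernel configuration, build scripts, installation notes), not a test of GPL compliance.

```python
# Illustrative sketch only: a quick triage of a vendor source release before
# attempting a real build. The patterns are assumptions about what a complete
# release usually ships with; passing this check proves nothing by itself.
import sys
from pathlib import Path

EXPECTED = {
    "kernel build configuration": ["*defconfig*", ".config", "config-*"],
    "build scripts": ["Makefile", "makefile", "*.mk", "build.sh"],
    "build/install instructions": ["README*", "INSTALL*", "*flash*"],
}

def triage(release_dir: str) -> int:
    root = Path(release_dir)
    problems = 0
    for label, patterns in EXPECTED.items():
        matches = [m for pattern in patterns for m in root.rglob(pattern)]
        if not matches:
            print(f"no obvious {label} found")
            problems += 1
    if problems == 0:
        print("nothing obviously missing; now try to actually build and install it")
    return problems

if __name__ == "__main__":
    sys.exit(triage(sys.argv[1] if len(sys.argv) > 1 else "."))
```

The real test, of course, is whether the release actually builds and installs on the device; a release that merely looks complete is exactly the kind of thing Kuhn wants people to try, document, and publicize.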
"If it doesn't work, they violated the GPL, it's that simple", Kuhn said. You were promised a set of rights in the GPL and you did not get them. You can report the violation but, unfortunately, SFC is the only organization that is doing enforcement for the community at this point. It has a huge catalog of reports, so it may not be able to do much with the report, but the catalog itself is useful. It shows the extent of the problem and it helps the community recognize that there is a watchdog for the GPL out there; for better or worse, he said, that watchdog is SFC.
But he had an even bigger ask, he said. He is hoping that at least one person in the audience would step up to be the leader of a project to create an alternative firmware for some IoT device. It will have to be done as a hobbyist, because no company will want to fund this work, but it is important that every class of device has an alternative firmware project. Few people are working on this now, but if you are interested in the capabilities of some device, you could become the leader of a project to make it even better. "Revolutions are run by people who show up."
It feels like an insurmountable problem, even to him most days, but it does matter. It only requires that we exercise the rights that we already have, rights that were given to us by the upstream developers by way of the GPL.
Being able to rebuild and re-install Linux on these devices won't magically fix the numerous privacy and security flaws that they have—but it is a start. The OpenWrt project started by getting the source code it was provided running; it then started adding features. Things like VPN support and bufferbloat fixes have improved those devices immeasurably. We can restore the balance of power by putting users back in charge of the operating systems on their devices; we did it once with laptops and servers and we can do it again for IoT devices.
A WebM format video of the talk is available, as is a YouTube version.
[I would like to thank LWN's travel sponsor, the Linux Foundation, for
travel assistance to Christchurch for linux.conf.au.]
| Index entries for this article | |
| --- | --- |
| Conference | linux.conf.au/2019 |
Posted Feb 12, 2019 19:34 UTC (Tue)
by smurf (subscriber, #17840)
[Link] (21 responses)
If you use your own firmware they will do no such thing, thereby causing the TV maker to lose money on your sale, thus "selling a few more units" is a non-starter argument here.
Posted Feb 12, 2019 20:09 UTC (Tue)
by pj (subscriber, #4506)
[Link] (6 responses)
Posted Feb 12, 2019 20:31 UTC (Tue)
by ay (guest, #79347)
[Link] (5 responses)
Posted Feb 13, 2019 8:11 UTC (Wed)
by zdzichu (subscriber, #17118)
[Link]
Posted Feb 13, 2019 16:20 UTC (Wed)
by NYKevin (subscriber, #129325)
[Link] (3 responses)
Posted Feb 13, 2019 18:50 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link] (2 responses)
Posted Feb 19, 2019 17:59 UTC (Tue)
by jccleaver (guest, #127418)
[Link] (1 responses)
Posted Feb 19, 2019 18:34 UTC (Tue)
by mathstuf (subscriber, #69389)
[Link]
Posted Feb 12, 2019 20:32 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
Posted Feb 12, 2019 20:34 UTC (Tue)
by ay (guest, #79347)
[Link] (1 responses)
Posted Feb 12, 2019 21:20 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link]
TVs don't show ads (at least so far) and the built-in Crappstores are mostly unused. Though I suspect TV manufacturers might get referral fees from Netflix/Roku if their apps are installed through TV's crappstore.
Now, these couple of dollars might affect the _margin_ quite a bit. The TV market is so commoditized that the margin is often close to zero.
Posted Feb 12, 2019 21:37 UTC (Tue)
by perennialmind (guest, #45817)
[Link]
And wouldn't that be a great thing? If everyone took the subsidized hardware and subverted it for their own ends, the tactic would backfire, as it did with AOL's free floppies and CueCat's free barcode scanners. Let customers be customers and products be products, not the bizarre inversion.
I don't think the pitch is more sales. I think the pitch should be avoiding the race to the bottom which prices out the scrupulous and devalues qualities of engineering, design, and production.
Posted Feb 14, 2019 8:12 UTC (Thu)
by jimbo (subscriber, #6689)
[Link] (9 responses)
There's also the vexed (vexing?) issue of digital restriction management, also known as "DRM" or "digital rights management".
How many content providers would be happy about giving out digital restriction keys to a manufacturer who uses full-stack GPL software? Not very many, I suspect.
Posted Feb 16, 2019 3:39 UTC (Sat)
by murukesh (subscriber, #97031)
[Link] (8 responses)
Posted Feb 16, 2019 13:17 UTC (Sat)
by excors (subscriber, #95769)
[Link] (7 responses)
That means the DRM implementation isn't just a TPM, it includes parts of video decoding, display composition, HDCP negotiation, etc, which is a significant amount of functionality and has a large and complex interface with the rest of the software. It's probably too complex to do entirely in hardware, so it needs some kind of trusted firmware that can be certified to follow the DRM system's security requirements. That certification would be meaningless if the user could modify the firmware, so the firmware can't be usefully user-replaceable and can't be GPLv3 (because of anti-tivoization). (Also it may be required to include proprietary code from the DRM provider, so it can't be GPLv2 either, but that's a separate issue.)
Posted Feb 18, 2019 16:13 UTC (Mon)
by nybble41 (subscriber, #55106)
[Link] (6 responses)
In principle all of that could be open-source and user-replaceable, as long as the TPM can verify the software in use and only expose its internal keys for use by officially approved binaries. Users could use modified software with non-DRM media or with media encrypted with keys they install into the TPM themselves. It's still a DRM scheme, and thus evil, but it doesn't seem incompatible with open source—even GPLv3.
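As a rough illustration of that arrangement (my own sketch, not a description of any real TPM interface), the policy could be modeled as follows: the secure element releases the content provider's key only for vendor-certified firmware measurements, while the owner can enroll measurements of their own builds and get keys that they control.

```python
# Toy model of "the TPM releases keys only to approved firmware". Real TPMs use
# PCRs, sealed objects and vendor certificates; this only mimics the policy.
import hashlib
import secrets

class ToySecureElement:
    def __init__(self, vendor_approved_hashes):
        self._drm_key = secrets.token_bytes(32)       # provisioned by the content provider
        self._approved = set(vendor_approved_hashes)  # firmware the provider certified
        self._owner_key = secrets.token_bytes(32)     # key under the owner's control
        self._owner_approved = set()                  # firmware the owner enrolled

    @staticmethod
    def measure(firmware_image: bytes) -> str:
        return hashlib.sha256(firmware_image).hexdigest()

    def enroll_owner_firmware(self, firmware_image: bytes) -> None:
        """The owner trusts their own build: it gets the owner key, never the DRM key."""
        self._owner_approved.add(self.measure(firmware_image))

    def release_key(self, firmware_image: bytes) -> bytes:
        measurement = self.measure(firmware_image)
        if measurement in self._approved:
            return self._drm_key       # certified firmware: DRM'd media can play
        if measurement in self._owner_approved:
            return self._owner_key     # modified firmware: only the owner's own media
        raise PermissionError("unrecognized firmware measurement")

vendor_fw = b"vendor-signed-firmware-image"
se = ToySecureElement([ToySecureElement.measure(vendor_fw)])
assert se.release_key(vendor_fw) is not None               # certified build gets the DRM key

my_fw = b"my-own-rebuilt-firmware"
se.enroll_owner_firmware(my_fw)
assert se.release_key(my_fw) != se.release_key(vendor_fw)  # owner build gets owner keys only
```

Nothing in this split removes the DRM itself; it just isolates it so that replacing the rest of the firmware does not need the provider's blessing.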
Posted Feb 18, 2019 17:28 UTC (Mon)
by excors (subscriber, #95769)
[Link] (5 responses)
GPLv3 says:
> If you convey an object code work under this section [...] the Corresponding Source conveyed under this section must be accompanied by the Installation Information. [...] "Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.
I'm not sure what "continued functioning" legally means, but if your TV is intentionally designed to stop supporting Netflix solely because you made an insignificant modification to the firmware, it seems plausible to argue that the TV has not continued functioning.
On the other hand https://events.linuxfoundation.org/wp-content/uploads/201... argues that GPLv3's anti-tivoization clause doesn't actually prevent what Tivo did - it's okay if a vital part of the product is proprietary code that stops functioning if the GPLv3 code is modified, because it's not the GPLv3 code that has stopped functioning. But that sounds like it's exploiting a bug in the GPLv3, because the rationale was obviously to prevent what Tivo did, so it seems pretty dodgy to rely on that argument.
Posted Feb 18, 2019 18:11 UTC (Mon)
by rgmoore (✭ supporter ✭, #75)
[Link] (3 responses)
> In the context of a smart TV, I don't think telling customers "you can replace the firmware, as long as you don't want to watch Netflix or any subscription channels" is much good.
I think a lot of people would be OK with that. My personal experience is that the "smart" TV is pretty useless, and I'm much more likely to use features through an add-on device than through the TV. Popular ways of getting smart features not through the TV include dedicated devices like Roku or Chromecast, video game consoles, home media PCs, and smart DVRs. In my experience, all of those things have better UIs than smart TVs, and many people are going to get them for the convenience and added features without even considering the issue of TVs spying on them.
Posted Feb 18, 2019 18:19 UTC (Mon)
by zdzichu (subscriber, #17118)
[Link] (1 responses)
Posted Feb 21, 2019 22:47 UTC (Thu)
by Wol (subscriber, #4433)
[Link]
Now binned because it broke, but I had a Logik / JVC TV that used to crash when you tried to record to USB. It did it somewhat at random but that's obviously very annoying.
My car stereo now seems to spend an inordinate amount of time "scanning USB" which impacts quite seriously on its usability - I think that's a bug ...
And trying to get this sort of stuff fixed is a nightmare, as it's almost impossible to contact the manufacturer, and the retailer's attitude is "well almost everything works". The fact that the bit that doesn't is very important to you, isn't important to them.
Cheers,
Wol
Posted Feb 18, 2019 18:49 UTC (Mon)
by excors (subscriber, #95769)
[Link]
The same issues apply to the add-on device. I don't think telling customers of a Roku streaming stick "you can freely replace the firmware, as long as you don't want to stream anything from Netflix" would be useful. There's nothing particularly special about the hardware; if you just wanted a small programmable box with HDMI output, you could run your code on a Raspberry Pi instead. The interesting part is the device's original fully-functioning firmware; that's the thing you'd want to modify out of curiosity or for security.
> without even considering the issue of TVs spying on them
If you're concerned about that, the streaming sticks could spy on you just as easily as your TV could - there's often a microphone in the remote control (for voice search).
Posted Feb 18, 2019 21:07 UTC (Mon)
by nybble41 (subscriber, #55106)
[Link]
I'm not sure *anyone* knows what "continued functioning of the modified object code" legally means, including the people who wrote that clause. However, I would say that both the modified object code and the TPM are each functioning exactly as designed. I would consider this analogous to the case of a GPLv3 client for a private third-party web service: You can modify and execute the software as you please, but you'll have to make your own agreement with the third-party service to use their APIs. That third party might require you to use only approved software to access their APIs as part of their terms of service. You can either implement your own API-compatible service (use only DRM-free media) or negotiate to get your modified software approved.
Whether "the TV has not continued functioning" is beside the point. The GPLv3 doesn't talk about the functionality of the device as a whole, just the modified object code. Systems with modified code may not have access to the keys in the TPM which would be necessary to play DRM'd media, but the system isn't actually blocking the modified code or interfering with its execution.
> In the context of a smart TV, I don't think telling customers "you can replace the firmware, as long as you don't want to watch Netflix or any subscription channels" is much good.
What you're really objecting to here is that Netflix and subscription channels require DRM. I agree with you on this, but in practice they're never going to entrust their content to software they don't control (even though the empirical evidence says that whatever they're obsessing over can almost certainly already be found online without the DRM). The best you can reasonably hope for is that the parts which handle their content are sufficiently isolated from the front end that the front end software can be open-source and user-replaceable without getting involved with the "protected" media path. Since that requires more expensive hardware, I wouldn't recommend holding your breath for such a design in mass-produced consumer electronics.
Posted Feb 12, 2019 21:39 UTC (Tue)
by excors (subscriber, #95769)
[Link] (32 responses)
On a laptop or PC you can use UEFI Secure Boot and allow someone with physical access to enter the BIOS and type in their own keys, so that it will only run kernels and kernel modules signed by that person.
But on an IoT device, physical access is much less strongly related to trust: the devices are sometimes outdoors, and often spread all around your house, where it's fairly easy for an attacker to reach them without being noticed. And you're lucky if the device has even a single physical button on it; there's no physical way to enter a new key.
Their main UI is probably a smartphone app, connected over a local network or via the cloud, with some setup process to pair the device and phone. But I doubt the pairing process is trustworthy enough that you'd want to count it as permission to install custom firmware. It's good enough for regular use - if I'm visiting your house and I try to pair your IoT camera with my phone, you'll quickly notice that your phone can't access the camera any more (hopefully it'll send you a notification immediately), so you'll re-pair it and I'll lose access, and little harm is done. But if I can replace the firmware while I've temporarily got access, I can make it send me the video feed even after it's been paired back to your phone, and I can make it lie about what firmware it's currently got installed, so I have permanent undetectable access.
(Of course an attacker with physical access could, in principle, swap your device for an identical-looking one with custom hardware and custom firmware that spies on you, but that's orders of magnitude harder than simply modifying the public firmware source and reflashing it, and almost impossible to keep hidden if the user gets suspicious and starts investigating the device.)
Obviously lots of current IoT devices have terrible security, but some manufacturers do try to get it right - and that includes using some form of secure boot, probably with the keys in one-time-programmable memory (or at least in write-protected flash that's really hard to get physical access to), so you can guarantee that simply power-cycling the device will get it into a known-good state (according to the manufacturer's definition of "good"). The ones who are trying to do the right thing shouldn't be asked to compromise security for hackability. So is there any practical way to get both hackability and security?
Posted Feb 12, 2019 21:54 UTC (Tue)
by ay (guest, #79347)
[Link] (8 responses)
Posted Feb 12, 2019 22:52 UTC (Tue)
by excors (subscriber, #95769)
[Link] (5 responses)
Maybe it needs some kind of remote attestation mechanism or something? Let an attacker replace the firmware signing keys if they want, but when you (the legitimate user) try to pair the device with your phone, the device sends a signed copy of those keys (plus nonce provided by the phone, to avoid replay attacks), computed by some small non-user-replaceable trusted code with a secret per-device private key, so your app can check whether it's still running clean firmware. And avoid MITM attacks with some out-of-band way to verify the device's identity, like an ID number printed on the device that the user has to scan into their phone, which is compared against the public key used for the signature. I suppose that doesn't sound too bad? But I don't know if that would actually work and be secure enough (or at all).
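A toy version of that flow might look like the sketch below. It is purely illustrative: the message format, the use of Ed25519 via the Python cryptography package, and the truncated hash used as the printed ID are all assumptions made for the example.

```python
# Toy sketch of the attestation idea above; not any real product's protocol.
# Assumes the third-party "cryptography" package for Ed25519 signatures.
import hashlib
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Device side: a per-device key held by the small, non-replaceable trusted code.
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
printed_id = hashlib.sha256(device_pub).hexdigest()[:16]   # the ID printed on the case

def device_attest(nonce: bytes, trusted_fw_keys: bytes) -> bytes:
    """Sign the phone's nonce plus the firmware-signing keys currently trusted."""
    return device_key.sign(nonce + trusted_fw_keys)

# Phone side: check identity out of band (the scanned ID) and freshness (the nonce).
def phone_check(scanned_id: str, claimed_pub: bytes, nonce: bytes,
                reported_fw_keys: bytes, signature: bytes,
                expected_fw_keys: bytes) -> bool:
    if hashlib.sha256(claimed_pub).hexdigest()[:16] != scanned_id:
        return False        # the claimed device key does not match the printed ID
    try:
        Ed25519PublicKey.from_public_bytes(claimed_pub).verify(
            signature, nonce + reported_fw_keys)
    except InvalidSignature:
        return False        # forged report, or a replay of an old one
    return reported_fw_keys == expected_fw_keys   # still trusts only the vendor's key

nonce = os.urandom(16)
vendor_keys = b"fingerprint-of-vendor-firmware-signing-key"
report = device_attest(nonce, vendor_keys)
assert phone_check(printed_id, device_pub, nonce, vendor_keys, report, vendor_keys)
```

Whether the small trusted signer can be kept simple enough to trust, and whether users would notice a failed check, are exactly the open questions raised here.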
Posted Feb 13, 2019 6:50 UTC (Wed)
by felixfix (subscriber, #242)
[Link] (4 responses)
I knew a components engineer whose entire job consisted of scouring hundreds of data books in different languages for equivalent components which were cheaper; taking even a full week to save a penny per unit paid his salary and more.
Any manufacturer who tried to do right by the GPL would go out of business from all the price shoppers.
I suspect that, if GPL rights actually threatened their bottom line that way, all that would happen when buyers insisted on those rights is that manufacturers would go out of business and evaporate, and everyone would reappear in a brand new company doing the same thing. The economics just don't add up.
Posted Feb 13, 2019 8:09 UTC (Wed)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
If people were to start really enforcing GPL, manufacturers would just switch to FreeBSD or in many cases to a smaller RTOS (FreeRTOS by Amazon is nice). In a few years they are going to use Fuchsia for that.
Posted Feb 13, 2019 20:33 UTC (Wed)
by ay (guest, #79347)
[Link]
However, we absolutely ruled out GPLv3; in fact, yocto makes that easy (and with buildroot you can write your own filters). Again, nothing of value was lost; there's just nothing compelling licensed that way that we ever needed in a product.
I think that's the way it's going to continue to be and this whole battle is "lost", but I wouldn't blame insecure crappy IoT entirely. I can assure you the better-engineered devices are doing secure boot for security reasons, you're not meant to subvert that without a lot of effort, and indeed there's no business justification for even adding a jumper for you to own the thing.
Posted Feb 14, 2019 15:33 UTC (Thu)
by bfields (subscriber, #19510)
[Link] (1 responses)
> I knew a components engineer whose entire job consisted of scouring hundreds of data books in different languages for equivalent components which were cheaper; taking even a full week to save a penny per unit paid his salary and more.
One moral to take from that is that per-unit costs really dominate when you're selling a massive number of units.
So if the cost of GPL compliance is a fixed cost (some up-front legal and developer review), and the risks are per-unit (per-unit penalties, or products taken off the market), then that might motivate higher GPL compliance in such cases?
> Any manufacturer who tried to do right by the GPL would go out of business from all the price shoppers.
But, point taken that given that extreme price pressure, it could be hard to comply unless the risks of noncompliance are real.
Posted Feb 15, 2019 14:59 UTC (Fri)
by NAR (subscriber, #1313)
[Link]
> So if the cost of GPL compliance is a fixed cost
The legal review, download link for the source, etc. is fixed cost. The extra USB port, button, connector, pin, etc. that would tell the device it's OK to download and use a new image from the owner (so the owner can actually install the modified GPL software) - that's a per-unit cost.
Posted Feb 13, 2019 2:52 UTC (Wed)
by faramir (subscriber, #2327)
[Link] (1 responses)
Posted Feb 13, 2019 11:22 UTC (Wed)
by excors (subscriber, #95769)
[Link]
> all bets are off if someone has physical access to the device
It's still important to consider the time, skill, money, equipment etc required by an attacker with physical access, and how easily the victim can discover the attack.
If you invite your neighbours around and their kid sneaks into your bedroom and reprograms your IoT camera with some undetectable off-the-shelf spyware after thirty seconds with a screwdriver and a phone, that's not acceptable security. If they have to desolder the flash chip, plug it into a reprogrammer, then solder it back on again (which requires specialised equipment since it's far too small to do by hand), that's a very different category of attack; maybe that's good enough for a consumer product, though it still doesn't seem great. And if the only way to make it undetectable is to steal the device's private key by spending days with the chip under an electron microscope, that's another category again and is probably good enough. The difficulty is in achieving one of those stronger forms of security while still allowing the legitimate user to replace the firmware.
Posted Feb 13, 2019 8:05 UTC (Wed)
by eru (subscriber, #2753)
[Link] (3 responses)
To hamper the physical access by an attacker on slightly larger devices, you could include a set of DIP switches (8 should be enough) that have to encode some number, specific to each device instance, before a firmware update is allowed. Not entirely unrealistic on things like baby monitors: I once owned one (not Internet-connected!) that had DIP switches in the transmitter and receiver for selecting the channel to use.
Posted Feb 14, 2019 16:40 UTC (Thu)
by felixfix (subscriber, #242)
[Link] (2 responses)
Posted Feb 14, 2019 18:05 UTC (Thu)
by smurf (subscriber, #17840)
[Link] (1 responses)
The cost of hooking random insecure IoT things into your WLAN isn't as high to you-the-homeowner (not yet anyway), but these insecurities enable DDoS attacks and whatnot, thus are costly to somebody else. In any other area of modern life, that would cause even more stringent regulations.
Posted Feb 15, 2019 18:51 UTC (Fri)
by aaronmdjones (subscriber, #119973)
[Link]
What you *do* need to do UL testing for, is if you want to brand a product as UL *Listed*. Since it's a trademark, only the UL can do that, not you, or you'd be liable for trademark infringement.
Posted Feb 13, 2019 12:38 UTC (Wed)
by cstanhop (subscriber, #4740)
[Link] (18 responses)
However this gets into threat modeling, and I don't think there is a one answer that will satisfy everybody. From some perspectives the owner is not authorized to update the firmware and would be considered an attacker. I don't find this an acceptable model for the devices I own and use in my home. In most of my personal scenarios, physical access would be a high enough bar to allow firmware updates. Having a way to remotely attest the firmware running is the firmware I installed would be a nice to have, but not strictly necessary.
Posted Feb 13, 2019 21:02 UTC (Wed)
by nybble41 (subscriber, #55106)
[Link] (17 responses)
It's up to the owner to decide who is authorized and who is not. That's a key part of why they're called "the owner". If you are not authorized to update the firmware, or to decide who else is allowed to update the firmware, then as far as the device is concerned you don't own it.
For devices with Secure Boot or equivalent, it should be considered fundamental that the device has not been properly delivered to its new owner so long as the previous owner controls the keys needed to sign the lowest levels of firmware. Part of the process of legally transferring ownership for such a device should be causing the device to "imprint" on its new owner and recognize their absolute and exclusive authority over it, by reprogramming it with a new set of master keys. Failure to do so should be classified as fraud—you're claiming to turn over ownership but leaving the device in a state where it still recognizes you as its owner.
Posted Feb 13, 2019 22:10 UTC (Wed)
by Cyberax (✭ supporter ✭, #52523)
[Link] (16 responses)
- You buy a device.
- You book an appointment at your nearest Department Of Internet Thingies.
- You come during the appointed time, stand in a short line (just 2-3 hours) and then file an ownership transfer form.
- In a couple of weeks you receive your QR code with the signed keys to transfer ownership.
I think everybody would LOVE this system!
Posted Feb 14, 2019 0:54 UTC (Thu)
by nybble41 (subscriber, #55106)
[Link] (15 responses)
- Buyer provides the seller with their public key.
- Seller imprints the device with the buyer's key.
- Seller delivers the device to the buyer.
- Buyer confirms that the device recognizes their key.
There's no need to involve anyone else unless the seller fails to properly update the key.
Alternatives would include delivering the device "blank" and letting the buyer load their own key, which might be a good option for brand-new devices, or including a factory reset switch which would allow anyone with sufficient physical access to clear the device and load their own key (if physical security is less of a concern).
However it's implemented, the one option which is *not* acceptable is for the seller to retain the master keys to a device they no longer own, with no option for the buyer to replace them. The device's idea of who its owner is should be aligned with the buyer's. If you want to sell someone *access* to a device while retaining control over it you're not selling them the device, you're only leasing it; and if that isn't made perfectly clear up front then the contract is invalid.
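To make the imprinting idea concrete, here is a minimal sketch (my own illustration, again using Ed25519 from the Python cryptography package; the message format is invented): the device accepts firmware only from its current owner key, and an ownership transfer is just the current owner signing the buyer's key.

```python
# Minimal sketch of owner "imprinting"; message formats invented for illustration.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def raw(public_key) -> bytes:
    return public_key.public_bytes(serialization.Encoding.Raw,
                                   serialization.PublicFormat.Raw)

class Device:
    def __init__(self, owner_pub: bytes):
        # The key the device currently recognizes as its owner's.
        self.owner = Ed25519PublicKey.from_public_bytes(owner_pub)

    def install_firmware(self, image: bytes, signature: bytes) -> bool:
        """Only firmware authorized by the current owner is accepted."""
        try:
            self.owner.verify(signature, image)
            return True
        except InvalidSignature:
            return False

    def transfer_ownership(self, new_owner_pub: bytes, signature: bytes) -> bool:
        """Handing over: the *current* owner signs the buyer's public key."""
        try:
            self.owner.verify(signature, b"transfer:" + new_owner_pub)
        except InvalidSignature:
            return False
        self.owner = Ed25519PublicKey.from_public_bytes(new_owner_pub)
        return True

seller, buyer = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
device = Device(raw(seller.public_key()))                      # seller imprints the device
handover = seller.sign(b"transfer:" + raw(buyer.public_key()))
assert device.transfer_ownership(raw(buyer.public_key()), handover)
assert device.install_firmware(b"buyer-build", buyer.sign(b"buyer-build"))
assert not device.install_firmware(b"stale", seller.sign(b"stale"))  # seller is locked out
```

A device sold "blank", or one with a factory-reset switch, would simply be one whose owner-key slot is empty and can be set by whoever holds it.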
Posted Feb 14, 2019 1:43 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (4 responses)
> - Seller delivers the device to the buyer.
Sorry, no. These two steps will add something like $100 per device. You'll have to either link individual shipments to consumers or establish local "customization centers". Both are not feasible. Then there's the issue of lost keys and repair/replacement. In short, this scheme is not going to work, especially for various $20 IoT light bulbs or baby monitor cameras.
Realistically all you can hope for is the sale of blank devices without warranty of any kind.
For large IoT devices like TVs the best way forward is to move all the "smart" functionality into standardized modular components that can be replaced by the user.
Posted Feb 14, 2019 2:36 UTC (Thu)
by excors (subscriber, #95769)
[Link] (3 responses)
Posted Feb 14, 2019 2:44 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
Posted Feb 14, 2019 3:32 UTC (Thu)
by excors (subscriber, #95769)
[Link] (1 responses)
Posted Feb 14, 2019 3:34 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Posted Feb 26, 2019 14:06 UTC (Tue)
by omgold (guest, #109541)
[Link] (9 responses)
They could just do the same as router vendors currently do with Wifi-Passwords.
- store a random (per-device) key/password in the device memory
- print the key/password on a sheet of paper which is delivered to the customer together with the device
Posted Feb 26, 2019 15:56 UTC (Tue)
by nybble41 (subscriber, #55106)
[Link] (8 responses)
If it's acceptable to reset the device back to a blank factory-default state prior to the sale (or after, assuming physical access is sufficient proof of ownership) then the new owner can program in their own secret when they receive the device. If not, the existing owner needs to transfer control to the new owner prior to the sale, and doing that without letting the existing owner know the new owner's secret requires a public-key authentication system. Reasons not to perform a factory reset include deterring theft or tampering while the device is in transit and/or wanting to preserve any data already on the device.
Posted Feb 27, 2019 7:28 UTC (Wed)
by omgold (guest, #109541)
[Link] (7 responses)
My reasoning about having the first is that it protects dumb users (who never change the password) from fun guys coming to visit, who want to reflash the firmware unnoticed. It is simply prevented by the owner storing the sheet with the preconfigured pw somewhere unknown to the attacker (which is likely).
About pubkey, it should be enough for the new owner to change the password, no? No pubkey needed:
- previous owner provides current password, or just the vendor-preconfigured one (the latter would need a factory reset)
- new owner types in current password, which allows him to set his own one
Posted Feb 27, 2019 13:07 UTC (Wed)
by excors (subscriber, #95769)
[Link] (6 responses)
I guess that's okay for ISP-provided routers - the customer has a contract with the ISP, so customer support has enough information to verify the identity of the caller then tell them the original password. And the ISP already provides customer support in the user's language, and is being paid like $50/month, so it's relatively cheap to deal with the occasional lost password.
But that solution doesn't scale to many IoT devices. Maybe you're buying the device for $10 from a small Chinese manufacturer. They can't afford to provide that level of individual customer support. And you're definitely going to lose track of all these password sheets if you have one per lightbulb. The manufacturer might still care about security, so they'd like a cleverer technical solution that provides a decent level of security without those support costs.
Posted Feb 27, 2019 18:19 UTC (Wed)
by nybble41 (subscriber, #55106)
[Link] (4 responses)
Posted Feb 28, 2019 7:24 UTC (Thu)
by omgold (guest, #109541)
[Link] (3 responses)
About printing the default key on separate paper, sure it has both advantages and drawbacks.
Advantage: visitors with temporary physical access cannot reflash the device unnoticed
Disadvantage: the paper may be lost
In case that default password is just used for flashing firmware, with a separate password for management of the device (the latter would be set to empty on factory reset), it seems it would be okay. The former use case is rare enough for not causing the vendor much extra work, or customers losing much by being unable to do that (anyone wanting to mess with the firmware should make sure to obtain and not lose the password). The latter case would just be protected against by making the device lose the wifi pw on factory reset, so it won't go unnoticed.
Also I believe that obtaining a PK certificate from the vendor on request would not protect from the "visitor attack" scenario, as the visitor might just do that for his own pubkey. How would the vendor determine whether the request comes from the rightful owner, when the only proof is physical access?
Posted Feb 28, 2019 17:19 UTC (Thu)
by nybble41 (subscriber, #55106)
[Link] (2 responses)
That isn't the goal. Before the device has been handed over it hasn't been configured yet, doesn't contain any of the new owner's information, and isn't trusted by other devices in the owner's network. By the time the new owner starts using the device, however, they should have control over it—and the prior owner should not. This is incompatible with per-device passwords which could be retained by the previous owner(s), or other parties, and not replaced by the actual owner of the device.
> In case that default password is just used for flashing firmware, with a separate password for management of the device (the latter would be set to empty on factory reset), it seems it would be okay.
Sure, but it's an unnecessary complication. If you're going to allow anyone who ever had access to the per-device password to reflash the firmware then you might as well just open it up to anyone with physical access and skip the password. Especially if the password is attached to the device, as others have suggested.
> (Anyone wanting to mess with the firmware should make sure to obtain and not lose the password).
The current owner might not care, but what about the next one? If devices are sold with a separate paper which is only needed to install non-vendor-approved firmware, the odds that paper will survive long enough to be included with any resale of the device are slim. The idea here is that a single key (most likely embodied in a smart card or other hardware token) which you use to identify yourself to all your devices, and which is required for setting up any new device, would be more likely to be retained than per-device default passwords.
> Also I believe that obtaining a PK certificate from the vendor on request...
I don't recall anyone suggesting that. If physical access is sufficient to reset the device then the vendor doesn't need to be involved; one would simply reset it and load new credentials. This doesn't prevent the "visitor attack" but such attacks can be made detectable. The most obvious approach would be to identify the device itself using a unique internal key which is destroyed and regenerated as part of the factory reset, making it appear as a new device. At that point the "visitor attack" becomes equivalent to a visitor simply removing the original device and replacing it with one of their own. They can mimic the original configuration but the absence of the original device will be immediately noticeable, and none of the other devices on the network will trust the new device.
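The detectability argument at the end can be sketched the same way (again only an illustration): if a factory reset destroys and regenerates the device's internal identity key, a reflashed device unavoidably shows up to the rest of the network as a stranger.

```python
# Illustration only: a reset that regenerates the identity key makes tampering visible.
import hashlib
import secrets

class DeviceIdentity:
    def __init__(self):
        self._secret = secrets.token_bytes(32)   # unique internal key

    @property
    def fingerprint(self) -> str:
        # What the owner's other devices pin during initial setup.
        return hashlib.sha256(self._secret).hexdigest()

    def factory_reset(self) -> None:
        # Wipes configuration *and* destroys the old identity key.
        self._secret = secrets.token_bytes(32)

camera = DeviceIdentity()
pinned = camera.fingerprint          # recorded when the owner first sets the camera up

camera.factory_reset()               # the step a "visitor attack" cannot avoid
assert camera.fingerprint != pinned  # so the tampered camera now looks like a new device
```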
Posted Mar 4, 2019 8:43 UTC (Mon)
by omgold (guest, #109541)
[Link] (1 responses)
"allow anyone who ever had access to the per-device password" < "anyone with physical access". I believe the most likely case will be devices which are never resold and the password will never be changed from the default, in which case it is a protection.
Okay, when assuming that a device is no longer a threat after wiping all personal information, the following mechanism should suffice:
- a factory reset wipes all personal information and reinstalls the original firmware
- before doing a firmware flashing operation the device enforces wiping of all personal information
> Especially if the password is attached to the device, as others have suggested.
True, not a good idea to do that.
> The idea here is that a single key (most likely embodied in a smart card or other hardware token) which you use to identify yourself to all your devices, and which is required for setting up any new device, would be more likely to be retained than per-device default passwords.
Where would the hardware token come from?
- included with the device? -> expensive
- needs to be bought separately? -> impractical for unskilled users
What if it gets destroyed or lost, or if the previous owner does not include it or reset the association?
I'd say, if this is implemented, it would still be necessary that a factory reset clears the association and allows anyone with physical access to associate it to an arbitrary other token.
In that case the security would not be any better than allowing the user to set his own access password (which is also wiped on factory reset). That was my original argument/question. What kind of attack scenario, a PK authentication mechanism (with hardware token or not) would protect from?
Posted Mar 4, 2019 16:28 UTC (Mon)
by smurf (subscriber, #17840)
[Link]
Why? If you can access the device's firmware flashing page then presumably you can also download the settings and re-upload them afterwards.
Posted Feb 27, 2019 22:33 UTC (Wed)
by andresfreund (subscriber, #69562)
[Link]
For a lot of devices having the default PW printed onto the device itself is viable...
Posted Feb 13, 2019 2:41 UTC (Wed)
by lkundrak (subscriber, #43452)
[Link]
The story is pretty sickening.
On the other hand, there's at least one case where unauthorized access to something the article calls a "range-limited, walkie-talkie-esque" baby monitor uncovered an actual (and pretty bizarre) child abuse case: https://www.reddit.com/r/UnresolvedMysteries/comments/71e...
Posted Feb 13, 2019 7:53 UTC (Wed)
by mfuzzey (subscriber, #57966)
[Link] (10 responses)
Of course vendors would be under no obligation to provide source code of their proprietary firmware but even that should be replaceable by a clean room implementation.
As others have pointed out the absolute cost of providing a user firmware update mechanism is fairly low *but* in a highly competitive market it can make a prohibitive difference.
By making it a legal requirement (like EMC testing say) the playing field is levelled.
Posted Feb 14, 2019 16:46 UTC (Thu)
by felixfix (subscriber, #242)
[Link] (9 responses)
The first thing to go wrong will be the government adding all its own security theater so it can "share" co-ownership with you.
The second thing to go wrong will be the government specifying the exact allowed and required technology to make sure evil businesses don't find loopholes to comply with the letter of the law and not the spirit. The regulations will be so onerous and cumbersome that the government will have a thousand ways of showing non-compliance for any manufacturers it deems too uppity, ie, too concerned with what actual customers might want, as opposed to the customers' new co-owners.
The third thing to go wrong will be locking in technology which quickly becomes obsolete.
Posted Feb 14, 2019 17:57 UTC (Thu)
by nix (subscriber, #2304)
[Link] (8 responses)
> "Require by law" -- so helpful, solves all the problems -- just pass a law! "So let it be written, so let it be done."
Well, that actually is how one normally solves this problem. If e.g. safety features in cars are optional, a race to the bottom has historically resulted that leads to all best-selling models that don't sell explicitly on the basis of safety using none of them to save costs, or getting outsold by those that do so. Regulations requiring those safety features level the playing field, preventing outcompetition of safer models, and the resulting world is better for everyone.
It's not 'just pass a law', it's 'just pass a law and implement an enforcement mechanism' (for cars, regular legally-mandated checks that include routine verification that e.g. this car has seatbelts, etc). There's a reason the UK MOT test is named after the (historical) government ministry that mandated it (leaving actual checking up to third-party service providers that were periodically inspected).
This works. It is why we don't have cars that routinely burst into flames, cookers and heaters and in-house electrical networks that don't catch fire, buildings that aren't firetraps etc. Regulation makes requiring features economical where otherwise they would be competitively impossible to implement.
Posted Feb 15, 2019 0:08 UTC (Fri)
by felixfix (subscriber, #242)
[Link] (7 responses)
What I meant is that far too many people pass ill-conceived laws in a rush with no thought to consequences, wanted or not, predicted or not. I then listed three ways any such law would go wrong.
I'd rather you answered those three predictions than the perceived-political preamble. But since you thought I went there, and then did go there yourself with a counter-example instead of responding to my three predictions, I will answer your counter-example.
Your last paragraph is as wrong as can be. Car advertising has always had manufacturers touting their safety, and if you look at graphs of vehicle safety per mile traveled, without seeing any years to make it easy, you will be hard-pressed to tell from the graph when government-mandated safety features came into play. Seat belts were available as options long before governments mandated them. Disc brakes, more efficient engines, many many such features were sold before governments mandated them.
Common sense and any reading of history shows that as people become more wealthy, as economies improve and as societies have more spare cash and more free time, they like products which don't blow up, burn them, fall apart, or endanger their children. Pick any product you wish; look at its history over time; and you will see that people demanded better products, always, continuously. When products were new and dangers unknown, people died and were injured; demand brought better products, and cheaper products; more customers meant more demand for safer and cheaper products.
Governments have *always* been late to the game. Maybe you should research saccharine, first declared carcinogenic in spite of all evidence, and finally declared non-carcinogenic 30 or 40 years later. Coffee is good -- bad -- good -- well heck who knows!
To have blind faith in government regs to be either timely or correct is not supported by history.
Now if you please, answer my three predictions with some substance. They may be partly flippant and sarcastic, but that should have made them easier to refute.
Posted Feb 15, 2019 0:47 UTC (Fri)
by sfeam (subscriber, #2841)
[Link] (3 responses)
Determining the properties of a commodity IoT gizmo pulled from a bargain bin may be more feasible than determining the health benefit or harm of a box of fad diet supplement with no ingredient list, but only for someone with a lot more technical savvy and test gear than any normal shopper. The seat-belt example is not parallel because anyone can see whether they are in fact present, and favor a vehicle with seatbelts regardless of whether that feature was required or voluntarily offered. The recent VW diesel emission scandal is maybe a better parallel, since individual purchasers had no way to detect that the manufacturer was lying about the behavior of an advertised feature. Yeah, they chose the "safer" product, but failure of what was supposed to be a government certification of compliance vitiated that choice.
Posted Feb 24, 2019 20:27 UTC (Sun)
by Wol (subscriber, #4433)
[Link]
I don't know what the dates are for front seatbelts, but obviously they're earlier. But those dates mean that, for almost my entire lifetime, rear seatbelts have been an option on new cars. However, until they became mandatory, many cars didn't have them. I know - in my 1985 car - I had them fitted by the dealer before delivery, so they weren't that common.
Cheers,
Wol
Posted Feb 25, 2019 2:52 UTC (Mon)
by nybble41 (subscriber, #55106)
[Link] (1 responses)
Not true. *All else being equal*, safe food and medicine are obviously preferable to unsafe food and medicine. However, that condition almost never holds. Safe food and medicine cost more—and given the choice, quite a few people would choose slightly more risky products over safe ones with higher prices. The FDA makes people safer (by some metrics) by forcing them to either pay the higher price or go without, thus trading one highly visible form of harm for another, more subtle kind.
Posted Feb 25, 2019 7:28 UTC (Mon)
by smurf (subscriber, #17840)
[Link]
I've got a more tangible harm of not doing that for you: the same people who are unlikely to buy safe food are also unlikely to voluntarily buy health insurance, therefore (when the unsafe food bites them or the unsafe drugs don't cure their disease) they end up as non-paying customers of the hospital's ER, thus forcing increased healthcare costs on the rest of us.
This is not hypothetical. In countries with mandatory insurance, going to a hospital costs an order of magnitude less than in the US.
Posted Feb 15, 2019 1:04 UTC (Fri)
by karkhaz (subscriber, #99844)
[Link]
> blow up, burn them, fall apart, or endanger their children
it's very unlikely that IoT devices will cause these dangers, and certainly not at the scale at which car fatalities would happen without safety features.
The real problem is that IoT devices are dangerous to populations _other than their users_ at massive scale---what in economics is termed a 'negative externality'. Comparable examples are environmental damage by companies, and this is exactly the situation where governmental regulation is required. Companies typically don't regulate themselves for harm caused to their non-customers, except to prevent damage that is so spectacular that it causes them to lose even customers who are not affected.
Consumers don't care that their IoToaster participated in a DDoS attack against a DNS server or whatever, in fact they don't know what that even means. They're not going to pay more money for a product that promises not to do that.
Posted Feb 15, 2019 15:42 UTC (Fri)
by nix (subscriber, #2304)
[Link] (1 responses)
> Maybe you should research saccharine, first declared carcinogenic in spite of all evidence, and finally declared non-carcinogenic 30 or 40 years later.
What on earth does changing scientific understanding have to do with any of this?!
Posted Feb 15, 2019 16:47 UTC (Fri)
by felixfix (subscriber, #242)
[Link]
Posted Feb 13, 2019 15:13 UTC (Wed)
by PengZheng (subscriber, #108006)
[Link] (7 responses)
The Chinese camera joke is silly trolling. A home camera will send footage to the nearest server, mostly a local one; otherwise it will be economically inefficient. Not to mention there is the Great Firewall making the network quite jerky.
Posted Feb 14, 2019 2:17 UTC (Thu)
by ssmith32 (subscriber, #72404)
[Link]
Depends on your threat model. If the seller is considered one of the primary malicious actors (see comments above about TVs selling data), then reinstallability is a *key* part of IoT security.
Not taking a side here, just saying, in security, such absolute statements are often, but not always, unwise.
Posted Feb 14, 2019 14:20 UTC (Thu)
by mjthayer (guest, #39183)
[Link] (5 responses)
There is an argument to be made (and I will not actually try to make it further, because there are lots of issues on both sides) that it should be possible for the owner to hire an independent professional to fix their device, that is, anyone with the necessary skills. That does not have to mean that any random person should be allowed to fix any device (radio transmitters?), just as anyone with the necessary skills is able to fix the electrical wiring in their house, but some parts are only allowed to be fixed by recognised professionals. Making it hard for someone without certain qualifications to fix (who decides who they are?) might be reasonable.
Posted Feb 14, 2019 16:00 UTC (Thu)
by smurf (subscriber, #17840)
[Link] (4 responses)
A parallel might be drawn from electrical installation. It's easy enough to screw some wires into some terminals, but (a) you get inoculated from when you're three years old that this electrical shit is damn dangerous, (b) the wires have colors and the terminals have letters, if that, so it's quite obvious that things will go wrong when you don't know your stuff, (c) even when you start really learning about power installations they hit you with safety precautions left and right.
How to translate that to ioT is obvious. The first step is to legislate that manufacturers (or tinkerers) shall carry 100% liability for whatever happens when somebody manages to exploit their bugs. Second would be to raise awareness, which will be way easier as soon as there are choices other than "add some insecure crap to your network" and "don't buy that stuff in the first place" because there'll be way less cognitive dissonance.
Posted Feb 15, 2019 14:49 UTC (Fri)
by NAR (subscriber, #1313)
[Link] (1 responses)
> The first step is to legislate that manufacturers (or tinkerers) shall carry 100% liability for whatever happens when somebody manages to exploit their bugs.
That would surely solve the "GPL'd software on IoT devices" problem (GPL means no liability, but the manufacturer has to provide liability, so it won't be able to use GPL'd software), but I'm not sure this is what the speaker wants to achieve...
Posted Feb 15, 2019 16:54 UTC (Fri)
by farnz (subscriber, #17727)
[Link]
Not true - you can use GPL'd software, it's just that you have no-one upstream of you to handle the liability. It all falls on your shoulders as the IoT device maker; if upstream software has an exploitable bug, and you put that on your device, then you are liable for the results.
Same applies if you use Microsoft Windows Embedded, FWIW - the licence says that Microsoft aren't liable, but it's still possible to use it in places where you are liable, you just have to insure against the risk that there's a bug that results in liability for you.
Posted Feb 15, 2019 17:18 UTC (Fri)
by mjthayer (guest, #39183)
[Link] (1 responses)
Has there been any serious attempt anywhere to hold people to account for damage caused to other people (e.g. manufacturer taken to court when lots of their exploitable devices DDoS someone's important service) using existing legislation?
Posted Feb 25, 2019 16:23 UTC (Mon)
by Wol (subscriber, #4433)
[Link]
European legislation on the other hand, as someone else pointed out, is more interested in outcomes. So you should legislate that "businesses must be able to demonstrate robust, working procedures for dealing with the bugs in the code in their products". In other words, if your reaction to a bug report is to sweep it under the carpet, or if it's a serious bug to run around like a headless chicken, then you're in breach of the legislation and liable for the consequences.
If, on the other hand, you have a mechanism for creating, and rolling out, bugfixes to your products to deal with problems then it's much easier to show compliance with the legislation - you just point to your security team and say "watch them triage bugs, create fixes, and roll them out". The system is there, it can be watched in action, and while it may not (indeed, pretty definitely won't) fix all the problems it will result in far fewer of them.
Cheers,
Wol
Posted Feb 14, 2019 5:06 UTC (Thu)
by pj (subscriber, #4506)
[Link]
Posted Feb 19, 2019 8:55 UTC (Tue)
by tdz (subscriber, #58733)
[Link] (4 responses)
This is certainly true, but it's not like there's a conspiracy. Big corporations pay developers for what they need. I'd say that a neutral organization (e.g., the Linux Foundation) should hire a number of people to keep the Linux ecosystem in good and hackable shape.
Posted Feb 19, 2019 11:48 UTC (Tue)
by nix (subscriber, #2304)
[Link] (3 responses)
Posted Feb 19, 2019 16:59 UTC (Tue)
by tdz (subscriber, #58733)
[Link] (1 responses)
Posted Feb 20, 2019 8:26 UTC (Wed)
by smurf (subscriber, #17840)
[Link]
The main problem is, however, that these needs are not a well-defined deliverable that'd be easy to organize some funding around, crowd- or otherwise.
Posted Feb 20, 2019 2:20 UTC (Wed)
by pabs (subscriber, #43278)
[Link]