Sensors embedded in everyday objects are transforming the way that we interact with technology. This trend will soon be inescapable. Pavement will warn approaching cars of ice, fridges will scan their shelves and reorder essential groceries, and lighting and heating will adjust to the occupants of a room. Collectively, these connected devices make up the Internet of Things (IoT), and experts predict there could be over 50 billion of them before the end of the decade. The smartphone market, by comparison, is only projected to reach 6 billion. “There will be so many IP addresses, so many devices, sensors, things that you are wearing, things that you are interacting with, that you won’t even sense it,” explained Google chairman Eric Schmidt during a panel at the World Economic Forum. “It will be part of your presence all the time.”
IoT devices have the potential to revolutionize commerce and industry (although the value of a “smart” saltshaker is debatable). But the ubiquity of such devices has privacy and security advocates worried. In terms of information security, a complex network is only as secure as its weakest node. This means that hackers can target one insecure device to gain access to many others. Even if someone, say a journalist or activist, has taken great pains to protect her privacy, she may be betrayed by a nearby “smart” object that communicates with her phone or computer. “Vulnerabilities on one system cascade into other systems,” wrote security expert Bruce Schneier for VICE Motherboard, “and the result is a vulnerability that no one saw coming and no one bears responsibility for fixing.”
Even without the spread of IoT devices, hackers have many tools in their toolbox: they might try spear phishing, hacking into insecure wireless routers, or cracking weak account passwords. But the growth of insecure IoT devices has made their job far easier. Many IoT devices come with default passwords (like “password” or “1234”) that users cannot change. Others send and receive information in an unencrypted format, leaving their messages vulnerable. And once a hacker takes control of one insecure device, your home network router, TV, and computer are all within reach. Recognizing the danger of these potential exploits, the Department of Homeland Security published a report last year urging IoT manufacturers to adopt standard security practices. Legislators should work to codify these suggestions, among others, into sensible regulation that would protect consumers from insecure IoT devices. Allowing easily fixable security flaws to persist will both harm consumers and lower their confidence in IoT technology.
In fact, some legislators have begun to address the issue. In January, U.S. Senator Deb Fischer (R-NE) introduced the DIGIT Act. The bill is a small step, calling for a panel to study the problem, but it is a start. The challenge for legislators is to come up with rules that will protect consumers without crushing innovation.
But “regulation” is a dirty word for some in Washington, especially when it comes to technology. Many lawmakers believe that the growth of the internet is the result of a lack of government interference, and that regulating IoT will smother technologies in their infancy. Senator John Thune (R-SD), chairman of the U.S. Senate Committee on Commerce, Science, and Transportation, voiced this concern during a hearing on the Internet of Things on February 11, 2015: “I encourage policymakers to resist the urge to jump head first into regulating this dynamic marketplace. Let’s tread carefully and thoughtfully before we consider stepping in with a ‘government knows best’ mentality that could halt innovation and growth.”
The federal government’s top consumer protection agency shows even less interest in IoT security. In March, the acting head of the Federal Trade Commission (FTC), Maureen Ohlhausen, said that the FTC has not decided whether it should impose mandatory IoT security regulations, and that in fact the FTC is “primarily not a regulator” when it comes to mandating best cybersecurity practices. Ohlhausen prefers a light-touch approach, whereby the agency does not “speculate about harm five years out” but waits until that harm materializes. Under President Obama, the FTC took only modest steps to address the problem. In 2013, the FTC settled a case with TRENDnet over its failure to secure its internet-connected cameras. In January 2015, the FTC released a report that provided companies with suggestions on how to improve IoT privacy and security. Yet in the same report, the FTC opposed legislation codifying security requirements into law. The report stated that “there is great potential for innovation in this area, and IoT-specific legislation at this stage would be premature.”
This is a short-sighted position. At the 2016 DEF CON hacking conference in Las Vegas, hackers found forty-seven new vulnerabilities in just twenty-three IoT devices, including door locks, furnaces, and wheelchairs. A “build first, patch later” approach means that these vulnerabilities will only be patched after thefts, fires, and other accidents take place. Even worse, many devices use firmware that cannot be patched or upgraded. When vulnerabilities are exposed in these devices, consumers will be stuck with the choice between discarding expensive technology and leaving it vulnerable to hackers. This is a particularly harrowing possibility for those who rely on IoT devices for their health or transportation.
There are genuine risks with regulation, but it need not be overly burdensome. For example, the United States does not currently have a dedicated data security law requiring companies to use reasonable protections to safeguard personal information. Instead, the FTC relies on general consumer protections under Section 5 of the FTC Act. Congress could pass a more specific law mandating breach notification procedures and stopping companies from pre-programming IoT devices with laughably bad login credentials.
In a November 2016 report, the Department of Homeland Security (DHS) outlined a number of steps manufacturers could take to improve IoT security. Many of these could also be codified into law, including:
- Using “opt-out” security measures that are enabled by default.
- Designing IoT devices to “age” gracefully and go offline when outdated.
- Building IoT devices using industry-standard security practices.
- Red-teaming (or deliberately attempting to hack into) devices before their release to test their vulnerability.
- Only connecting IoT devices to the internet when necessary for functionality.
- Limiting IoT exposure to sensitive data.
The diversity of IoT makes regulation difficult. “Regulating a Fitbit is very different from regulating an automobile or regulating an implantable medical device like a defibrillator,” pointed out Lee Tien, a senior attorney at the Electronic Frontier Foundation. As a result, lawmakers may be better off developing broad protections rather than trying to pass industry-specific regulation. Alternatively, they could create smaller regulatory agencies tasked with regulating specific sectors, similar to the way the National Highway Traffic Safety Administration works within the Department of Transportation.
In other cases, government regulation may be less effective than an industry-led solution. For example, the lack of consensus among manufacturers on a standard communication system has made cross-platform IoT integration difficult. In 2013, a group of influential technology corporations known as the AllSeen Alliance released an open-source software framework called AllJoyn specifically designed to deal with this problem. With strong corporate backing, it could become the industry standard. Similarly, “bug bounty” programs and “hackathons” hosted by technology companies routinely expose security flaws that manufacturers can then patch.
Given the size of the IoT market, legislators should not mistake these ad-hoc programs for an adequate response to the problem. After all, for most Americans, the thought of a hacker gaining control over their pacemaker or vehicle is frightening. To the greatest extent possible, legislators should induce companies to adopt strong security policies before this possibility becomes a reality.