The following four security design patterns appear often in the Internet of Things (IoT), and they usually result in less secure devices and less trustworthy IoT services. Designers should be wary of these patterns and understand how each one undermines the security of a device.
Baby Duck Authentication
In animals, imprinting is when a newborn assumes that the first creature it sees must be its mother, and therefore an animal of its own kind. In IoT, we see devices do things like:
- accept a connection on a USB at boot time;
- join a network with a well-known SSID;
- accept a connection on a well-known URL, socket, port, etc.; or
- trust any device that connects to a special debugging port (e.g., a JTAG port).
This is important because a device trusts its “mother” to do firmware updates, change configurations, reset the root of trust, and more. Convincing a device that you are its “mother” is usually the first step in undermining its innate security.
Preventing someone from attacking a device via Baby Duck Authentication is almost impossible for the average consumer-grade electronic device. The money, time, and effort that went into Blu-ray disc security or satellite-television set-top box protection illustrate what it takes to make something robust in the consumer market. That level of effort is rarely economical for consumer-grade hardware.
Kung-Fu Grip
A television-style remote control, a 12-digit keypad, and a video game controller are just some examples of devices with a limited user interface (UI). The Kung-Fu Grip design pattern has the user press a well-known set of keys or buttons to put the device into an administrative or configuration mode. The Kung-Fu Grip often involves specific timing (e.g., right after power-on, or holding the buttons for a minimum amount of time). With modern devices incorporating sensors like accelerometers, the Kung-Fu Grip can be a special series of taps. With GPS radios in devices, a “thing” could even be convinced to permit reconfiguration because it believes it is in a “safe place.”
It is important to remember that we often do not know who has performed the Kung-Fu Grip — it could be a malicious person who is going to reset the device to be exactly as the legitimate user had it, but perhaps with a few extra features turned on or malicious software loaded into it.
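The boot-time variant of the Kung-Fu Grip can be sketched as a simple firmware loop: sample the button each tick from power-on and drop into configuration mode only if it stays pressed for the whole window. This is a hypothetical illustration, not code from any real product; the tick count and function names are assumptions.

```python
# Hypothetical sketch of boot-time Kung-Fu Grip detection.
# Firmware samples a reset button once per tick starting at power-on;
# holding the button for the first HOLD_TICKS ticks enters config mode.

HOLD_TICKS = 5  # illustrative: ticks the button must stay pressed

def entered_config_mode(button_samples):
    """button_samples: booleans sampled each tick from power-on.
    Returns True only if the button was held for the full window."""
    window = button_samples[:HOLD_TICKS]
    return len(window) == HOLD_TICKS and all(window)
```

Note that nothing in this check identifies *who* is holding the button, which is exactly the weakness described above.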
Secret Handshake
The Secret Handshake is similar to the Kung-Fu Grip, but it does not involve physical manipulation like pressing a special button. While the device is online and working normally, it is always ready to complete a Secret Handshake that indicates membership in the club of privileged users. Secret Handshakes can include specially crafted packets based on timing, ports, IP addresses, and payloads; they can also include holding a power line at a certain voltage for a specific interval. After validating the Secret Handshake, the device might be willing to accept a firmware update, reset to its last known good configuration, enter a special mode (such as sleep or wake), or trigger a Postcard Home (see below).
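A packet-based Secret Handshake often amounts to nothing more than comparing a payload against a fixed "magic" value. The sketch below is hypothetical; the magic bytes and mode names are illustrative, not taken from any real device.

```python
# Hypothetical sketch of a static Secret Handshake: any payload that
# matches a fixed magic byte string is treated as proof of membership
# in the club of privileged users.

MAGIC = b"\xde\xad\xbe\xef-open-sesame"  # illustrative value

def handle_packet(payload):
    """Return the mode the device enters for this payload."""
    if payload == MAGIC:
        return "privileged"  # accept firmware updates, resets, etc.
    return "normal"
```

Because the magic value is static, anyone who observes it once on the wire can reproduce it forever.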
Secret Handshakes are a very insecure design pattern because they are trivial to spoof: if a Secret Handshake can be captured, it can be replayed. Anti-replay design patterns exist, but they often add complexity to a process or workflow that does not tolerate much of it, such as the reset procedure, the reconfiguration procedure, or the initialization process.
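One common anti-replay pattern replaces the static handshake with a challenge-response: the device issues a fresh random nonce and accepts only a keyed MAC of that nonce, so a captured response is useless against the next challenge. This is a hedged sketch under simplified assumptions (a pre-shared key, no key management); the names are illustrative.

```python
# Hedged sketch of an anti-replay Secret Handshake using a
# challenge-response with HMAC-SHA256 over a shared key.
import hashlib
import hmac
import os

SHARED_KEY = b"device-provisioning-key"  # illustrative only

def new_challenge():
    """Fresh random nonce issued by the device per handshake attempt."""
    return os.urandom(16)

def respond(challenge, key=SHARED_KEY):
    """Client proves key possession by MACing the device's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge, response, key=SHARED_KEY):
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A response captured for one challenge fails verification against any later challenge, which is the property the static handshake lacks; the cost is the extra round trip and state that fragile reset and initialization flows often cannot afford.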
Postcards Home
Technical professionals often say “phoning home” as an homage to the film E.T. The term “Postcards Home” is a better analogy for TCP/IP, Bluetooth LE, Wi-Fi, and most of the other communications that IoT devices use.
If you mailed a postcard home saying “Dear mom, I’m fine, how are you?” you don’t expect that postcard to be private. And if you received a typewritten reply “All is fine here. You have a new brother in New York. Please send $500 to this address. Love Mom” you would be appropriately skeptical. Devices frequently use plain HTTP or other clear-text communications to check for software updates, new configurations, license keys, and more — that communication is like mailing a postcard home, and the reply the device gets is just a postcard reply. Devices rarely have what it takes to detect and reject spurious replies from “home.” Networks can be spoofed; DNS can be spoofed; servers can be impersonated, and so on.
Adding proper HTTPS, properly paired Bluetooth, WPA-secured Wi-Fi, or other protection on the communications helps, but it comes at a cost. Cryptography adds computational complexity to the request, adds time to the connection, and burns extra battery through the device’s CPU. Furthermore, verifying cryptographic signatures and connections introduces potential failures in the communications. Devices often have a limited UI with which to communicate such a failure to the user, and the user usually has few options for dealing with it.
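The defense against a spoofed "postcard reply" is to verify the downloaded content before acting on it. The sketch below is hypothetical and simplified: it checks a detached tag computed with HMAC under a key installed at manufacture. Real designs typically use public-key signatures instead, so the verification key on the device need not stay secret; all names here are illustrative.

```python
# Hedged sketch: reject an unauthenticated update "postcard reply"
# by checking a keyed tag before applying the downloaded blob.
import hashlib
import hmac

UPDATE_KEY = b"burned-in-update-key"  # illustrative only

def verify_update(blob, tag):
    expected = hmac.new(UPDATE_KEY, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

def apply_update(blob, tag):
    if not verify_update(blob, tag):
        # On a limited-UI device the only safe option is usually to
        # keep the current firmware and retry later.
        return "rejected"
    return "applied"
```

Even this minimal check adds the verification time, battery draw, and hard-to-surface failure mode described above.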
The bottom line
Although the form factors and environments of “things” may present unique challenges, there is no real need to pioneer new designs. The challenge is applying what we already know how to do to a new domain. The IEEE Center for Secure Design identifies security flaws in design and promotes fixing those flaws. That kind of systematic thinking about flaws in software leads to more secure software overall.
Are there other design patterns that you would add to this list? If so, post your suggestions in the comments.
Note: TechRepublic and ZDNet are CBS Interactive properties.