Scientists at New York University’s Tandon School of Engineering may be onto something big: a way to make both digital equipment manufacturers and chip developers happy by ensuring that microchips do what they are supposed to do, and nothing else.

Vetting microchips never seemed to be an issue until a few years ago, when Sergei Skorobogatov, a University of Cambridge researcher, and Christopher Woods of Quo Vadis Labs wrote a paper titled Breakthrough silicon scanning discovers backdoor in military chip (PDF). Although the anomaly was eventually attributed to a software problem rather than a deliberate attempt at sabotage, the episode brought to light the need for equipment manufacturers to somehow ensure that microchips operate as advertised and nothing more.

Around the same time, Emil Protalinski wrote on ZDNet, “A former Pentagon analyst reports the Chinese government has pervasive access to about 80 percent of the world’s communications, and it is looking currently to nail down the remaining 20 percent.”

Protalinski goes on to note that Michael Maloof, a former senior security policy analyst in the Office of the Secretary of Defense, believes that kind of access is possible because the Chinese government ordered backdoors to be installed in devices made by Huawei and ZTE Corporation. Maloof’s claim is bolstered by the recent passage of China’s new anti-terrorism law, which requires telecommunications operators and internet service providers to give the Chinese government “backdoor” access to their products.

“The ability to verify has become vital in an electronics age without trust,” mentions Siddharth Garg, an assistant professor of electrical and computer engineering at the NYU Tandon School of Engineering, in this NYU press release. “Gone are the days when a company could design, prototype, and manufacture its chips. Manufacturing costs are now so high that designs are sent to offshore foundries, where security cannot always be assured.”

“Under the current system, I can get a chip back from a foundry with an embedded Trojan,” continues Garg. “It might not show up during post-fabrication testing, so I’ll send it to the customer. But two years down the line it could begin misbehaving.”


External verification

Garg and fellow researchers have developed a way to corroborate a chip’s operation using verifiable computing. In this arrangement, chips manufactured for sale contain an embedded verification module that produces proofs that the chip’s calculations are correct, and a separate external module validates those proofs.

What makes this exciting is that the external module is fabricated separately from the chip. “Employing an external verification unit made by a trusted fabricator means I can go to an untrusted foundry to produce a chip having the computational circuitry and a module that presents proofs of correctness,” says Garg. In other words, the simpler external module, an Application-Specific Integrated Circuit (ASIC), is the only part that has to be made by a trusted and validated chip foundry.

Garg mentions this arrangement provides a safety net for the chip maker and the end user, adding, “The nice thing about our solution is that I don’t have to trust the chip because every time I give it a new input, it produces the output and the proofs of correctness, and the external module lets me continuously validate those proofs.”
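The press release does not describe the team’s actual proof system, but the basic idea of verifiable computing, checking a result far more cheaply than recomputing it, can be illustrated with a classic textbook primitive, Freivalds’ algorithm. Here an untrusted routine (standing in for the chip) multiplies two matrices, while a trusted verifier (standing in for the external module) validates the claimed product with a few cheap randomized checks. This is only a toy sketch of the concept, not the NYU researchers’ method.

```python
import random

def multiply(a, b):
    """Untrusted computation (the 'chip'): n x n matrix product, O(n^3)."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def freivalds_check(a, b, c, rounds=20):
    """Trusted verifier (the 'external module'): checks a*b == c using
    only O(n^2) work per round, without redoing the multiplication."""
    n = len(a)
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        # Compare a*(b*r) with c*r; these are vectors, cheap to compute.
        br = [sum(b[i][j] * r[j] for j in range(n)) for i in range(n)]
        abr = [sum(a[i][j] * br[j] for j in range(n)) for i in range(n)]
        cr = [sum(c[i][j] * r[j] for j in range(n)) for i in range(n)]
        if abr != cr:
            return False  # caught an incorrect (possibly tampered) result
    # A wrong result slips through each round with probability <= 1/2,
    # so 20 rounds leave roughly a one-in-a-million chance of a miss.
    return True

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
good = multiply(a, b)          # honest output: [[19, 22], [43, 50]]
bad = [[19, 22], [43, 99]]     # tampered output
print(freivalds_check(a, b, good))  # True
print(freivalds_check(a, b, bad))   # False (except with probability 2**-20)
```

The asymmetry is the point: the verifier never pays the full cost of the computation, just as Garg’s external ASIC can stay small and simple while still continuously validating the untrusted chip’s proofs.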

The press release also mentions that for certain types of computations, the research team’s validation method can perform all of the analysis right on the trusted external module.

Overhead costs

Generating and verifying proofs does create some overhead, and the press release mentions the researchers are currently looking at ways to reduce it. “And with hardware, the proof is always in the pudding: we plan to prototype our ideas with real silicon chips,” says Garg.

Interest from Microsoft, Google, the Air Force, and the Navy

A testament to the value of this research is the number of grants the research team is receiving. Besides a five-year, $3 million National Science Foundation Large Grant, the group has received funding from the Air Force Office of Scientific Research, the Office of Naval Research, a Microsoft Faculty Fellowship, and a Google Faculty Research Award.

It is not difficult to see why interest is so high. Imagine what it would mean to all parties concerned if an integrated circuit could monitor itself, plus flag defects and malicious activity.