Alex Howard recaps the key takeaways from three recent tech events held in Washington, D.C., and contends that data and IT innovations shouldn't come at the expense of U.S. citizens' freedoms.
I recently attended several forums in Washington, D.C., that left me thinking about larger issues I've promised to consider in this column.
Data Innovation Day 2014
I was at Data Innovation Day 2014 to talk about the increasing amount of bits and bytes being generated and what people are doing with them. (That's one reason that "data-ism" and data-driven policy and commerce have been top-of-mind for me.) Data Innovation Day attendees heard about open data efforts from the deputy U.S. CTO, data analytics in healthcare, Google Translate, and other highlights.
There are two broad takeaways from this forum. One is straightforward: Much as "software is eating the world," there is now mainstream awareness in the business world that the collection and strategic use of data is important to every industry and sector. While this might seem obvious to people immersed in the technology world, the importance, impermanence, privacy, security, and utility of data (big, meta, open, personal, or otherwise) have become issues that business leaders, not just their IT staff, need to know about.
The second takeaway is more nuanced but important: The primary factor governing the quality of a given service can be how much data is available. That's true for Google Translate, as a talk on making language data useful by Macduff Hughes, the product manager for the service, highlighted. Hughes described how gathering immense quantities of data has enabled Google to deliver integrated translation to its many users. (Google Translate currently does more than a billion translations every day across 80 languages.) Interestingly, the rate of improvement goes asymptotic at about 100 billion words collected, suggesting a limit to how well even the most powerful machine learning systems can master translation.
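The diminishing-returns pattern Hughes described can be illustrated with a toy saturating curve. This is purely a sketch of the shape of such a learning curve; the function, the quality ceiling, and the 100-billion-word scale parameter are illustrative assumptions, not Google's actual model or metric.

```python
import math

def translation_quality(words):
    """Toy saturating learning curve (illustrative only, not Google's
    actual model): quality approaches a ceiling as training data grows,
    echoing the roughly 100-billion-word plateau Hughes described."""
    ceiling = 1.0   # hypothetical maximum quality score (assumption)
    scale = 100e9   # word count where gains have largely flattened (assumption)
    return ceiling * (1 - math.exp(-words / scale))

# Each tenfold increase in data buys less additional quality than the last.
for n in [1e9, 10e9, 100e9, 1000e9]:
    print(f"{n:>16,.0f} words -> quality {translation_quality(n):.3f}")
```

Any curve of this general shape, rising steeply at first and then flattening, makes the same point: past some data volume, collecting more words yields vanishing improvements.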
This kind of result is going to drive debates about the slippery issue of correlation vs. causation derived from data analysis for years to come. In the meantime, Google will keep collecting data and improving translation. Eventually, we'll see real-time translation in chats and Hangouts, which really will be magical.
State of the Net
The State of the Net conference annually convenes government officials, congresspeople and staff, and technology policy wonks. Fittingly, Akamai's State of the Internet report came out the same day, highlighting both how far the world has yet to go in getting connected and how much spam and how many online attacks flow across the network every day.
The agenda for the conference generally reflects what's on the mind of officials and regulators, offering a preview of Congress' legislative agenda and regulatory headaches. This year was no different, from a conversation with FCC chairman Tom Wheeler about network neutrality to discussions and speeches about NSA reform, Bitcoin, academic technology and student privacy, online regulation, and cloud computing startups.
All of these issues matter today and will in the days ahead, though the larger takeaway for me is that the current state of the online world and the associated technologies, platforms, and companies that underpin it have raced far ahead of the legal and regulatory confines that lawmakers have created. To note that technology outpaces legislation isn't a novel insight, but I've been able to find few times in history when the gap between the two is as great as it is today.
That's unfortunate given the state of Washington, where the generation of lawmakers that hold chairmanships in the committees that control the movement of legislation all too often do not have direct experience with the technologies in question. I'll never forget seeing Senator Jay Rockefeller (D-WV) ask Apple chief scientist Bud Tribble about where the settings app was on the iPhone during a hearing and Tribble patiently explain the user interface to him.
While Congress could become smarter on technology and Internet policy if the Office of Technology Assessment were reconstituted and funded, today's staffers have no in-house nonpartisan entity to consult regarding technology. That puts more pressure on Congressional Research Service reports and leaves a vacuum for lobbyists, the technology policy press, activists, trade groups, and public interest advocates to fill. It's a heady, complex, powerful (and heavily suited) mixture, particularly given the increased lobbying money from tech companies in recent years and their notable gains in soft power. For instance, if almost all of Congress is on Facebook, how will that shift a congressperson's thinking about how to approach online privacy legislation and data breach laws?
The recent historic unproductivity in the 113th Congress has also meant that many decisions regarding privacy and security regulation, standards, and enforcement, like a national data breach law, have been made by the Federal Trade Commission under its existing authority, or proposed and enacted by the White House using executive actions.
A "science fair" convened by the ACT-IAC
The most recent event I attended was a "science fair" convened by the American Council for Technology (ACT)-Industry Advisory Council (IAC), a nonprofit public-private partnership focused on the improvement of government through the application of information technology. I'll highlight five of the interesting projects showcased at the event.
Dwellr is a new mobile application from the U.S. Census Bureau.
Another exhibit showed the technology behind the mobile alert system that every new cellular handset in the U.S. now contains.
The Environmental Protection Agency demonstrated a mobile application that enables users to learn more about local waterways.
The U.S. Customs and Border Patrol exhibit caught my eye, and their presentation about a biometric identification pilot that's already in operation at the U.S. border was intriguing.
While the Border Patrol hasn't yet begun using the database of facial images it has been collecting for identification, it's apparently something the agency will look into in the future. That's especially interesting given this week's announcement that the Obama administration is exploring standards for the use of facial recognition by industry and law enforcement agencies.
While the U.S. has clearly lost some moral authority over the past year after revelations regarding massive electronic surveillance by the NSA, it still matters how the federal government handles facial recognition, given the global reach of the platforms and companies created here.
This brings to mind what science fiction author William Gibson famously said, "The future is already here: it's just not evenly distributed yet."
As Dropbox founder Drew Houston noted at the State of the Net conference, the decisions made in D.C. have an impact on how and which technologies can and will be used around the world, from censorship filters to drones to genetic engineering and vaccinations.
It falls to the men and women elected to lead the world's remaining superpower, and to the public servants sworn to protect and defend its Constitution, to ensure that the Bill of Rights applies online. In the rush to "innovate," we must not lose the freedoms of association, expression, and the pursuit of happiness that generations before us fought in the courts and on battlefields to gain.
Join the discussion
What data innovations do you think are most promising? What are your thoughts about how data is used, especially for surveillance? Weigh in on these topics. Also, I'm reflecting on which tech events I should attend this year, so please post your recommendations.