Somehow I came across this writeup of Dan Geer’s RSA conference talk. I was blown away by it, not by the information contained in the talk but by the questions he was able to ask — questions I hadn’t even thought to ask. You should absolutely read it! I don’t have time for a proper writeup, but I’ll pull out a few items I particularly appreciated from the talk:
‘The Gordian Knot of such tradeoffs — our tradeoffs — is this: As society becomes more technologic, even the mundane comes to depend on distant digital perfection. Our food pipeline contains less than a week’s supply, just to take one example, and that pipeline depends on digital services for everything from GPS driven tractors to robot vegetable sorting machinery to coast-to-coast logistics to RFID-tagged livestock. Is all the technologic dependency, and the data that fuels it, making us more resilient or more fragile?’
Almost everyone here has some form of ingress filtering in place by whatever name — firewall, intrusion detection, whitelisting, and so forth and so on. Some of you have egress filtering because being in a botnet, that is to say being an accessory to crime, is bad for business. Suppose you discover that you are in a botnet; do you have an obligation to report it? Do you have an obligation to report the traffic that led you to conclude that you had a problem? Do you even have an obligation to bother to look and, if you don’t have or want an obligation to bother to look, do you want your government to require the ISPs to do your looking for you, to notify you when your outbound traffic marks you as an accomplice to crime, whether witting or unwitting? Do you want to lay on the ISPs the duty to guarantee a safe Internet? They own the pipes and if you want clean pipes, then they are the ones to do it. Does deep packet inspection of your traffic by your ISP as a public health measure have your support? Would you want an ISP to deny access to a host, which might be your host, that is doing something bad on their networks? Who gets to define what is “bad?”
If you are saying to yourself, “This is beginning to sound like surveillance” or something similar, then you’re paying attention.
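To make the egress-filtering idea above concrete: in its simplest form it means comparing outbound flows against an allowlist of expected destinations and flagging everything else for review. Here is a minimal sketch in Python — the hostnames, ports, and sample traffic are all invented for illustration, not taken from the talk:

```python
# Minimal egress-filtering sketch: flag outbound flows that fall outside
# an allowlist of expected (destination, port) pairs. Anything flagged is
# a candidate for the "are we in a botnet?" review Geer describes.
# All hosts, ports, and flows below are hypothetical.

ALLOWED = {
    ("mail.example.com", 25),  # outbound SMTP only via our relay
    ("dns.example.com", 53),   # DNS only via our resolver
}

def flag_suspect_flows(flows):
    """Return outbound flows not covered by the egress allowlist."""
    return [f for f in flows if (f["dst"], f["port"]) not in ALLOWED]

flows = [
    {"dst": "mail.example.com", "port": 25},  # expected traffic
    {"dst": "203.0.113.9", "port": 6667},     # unknown host, classic IRC C2 port
]
print(flag_suspect_flows(flows))
# → [{'dst': '203.0.113.9', 'port': 6667}]
```

Of course, real egress filtering happens at the firewall (e.g. a default-deny OUTPUT policy), and real botnet traffic tries hard not to look like IRC to an unknown host — which is exactly why the reporting questions in the quote have teeth.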
Relevant, as Skybox is doing well:
All we have to go on now is the hopeful phrase “A reasonable expectation of privacy” but what is reasonable when one inch block letters can be read from orbit? What is reasonable when all of your financial or medical life is digitized and available primarily over the Internet?
An interesting semi-prediction:
By now it is obvious that we humans can design systems more complex than we can then operate. The financial sector’s “flash crashes” are the most recent proof-by-demonstration of that claim; it would hardly surprise anyone were the fifty interlocked insurance exchanges for Obamacare to soon be another.
I love the phrasing on this:
Let me ask a yesterday question: How do you feel about traffic jam detection based on the handoff rate between cell towers of those cell phones in use in cars on the road?
Let me ask a today question: How do you feel about auto insurance that is priced from a daily readout of your automobile’s black box?
Let me ask a tomorrow question: In what calendar year will compulsory auto insurance be more expensive for the driver who insists on driving their car themselves rather than letting a robot do it? How do you feel about public health surveillance done by requiring Google and Bing to report on searches for cold remedies and the like? How do you feel about a Smart Grid that reduces your power costs and greens the atmosphere but reports minute-by-minute what is on and what is off in your home? Have you or would you install that toilet that does a urinalysis with every use, and forwards it to your clinician?
And he passed on a really interesting point that Joel Brenner made:
During the Cold War, our enemies were few and we knew who they were. The technologies used by Soviet military and intelligence agencies were invented by those agencies. Today, our adversaries are less awesomely powerful than the Soviet Union, but they are many and often hidden. That means we must find them before we can listen to them. Equally important, virtually every government on Earth, including our own, has abandoned the practice of relying on government-developed technologies. Instead they rely on commercial off-the-shelf, or COTS, technologies. They do it because no government can compete with the head-spinning advances emerging from the private sector, and no government can afford to try.
When NSA wanted to collect intelligence on the Soviet government and military, the agency had to steal or break the encryption used by them and nobody else. The migration to COTS changed that. If NSA now wants to collect against a foreign general’s or terrorist’s communications, it must break the same encryption you and I use on our own devices… That’s why NSA would want to break the encryption used on every one of those media. If it couldn’t, any terrorist in Chicago, Kabul, or Cologne would simply use a Blackberry or send messages on Yahoo!
But therein lies a policy dilemma, because NSA could decrypt almost any private conversation. The distinction between capabilities and actual practices is more critical than ever… Like it or not, the dilemma can be resolved only through oversight mechanisms that are publicly understood and trusted — but are not themselves … transparent.
(spacing added for readability)
Generally, this made me very worried about the rest of our industry. Living in San Francisco, it seems that everything tech-wise is under control: self-driving cars are safer than humans, our own governance is far more effective than the US government and corporations elsewhere in the country, and if only our ideas were more widely implemented, everyone’s lives would be better. Or so the party line goes. We really don’t stop to think about these things, however; we simply don’t have time. We’re either working, cranking things out, or desperately trying to make sure we haven’t squandered our 20s. Perhaps we really should slow down a bit, think about where we are going, and try to make some informed decisions.