Last night, I volunteered with FIRST FTC, a high-school competitive robotics program. I was previously a member of not one but two FTC teams, both from my high school. I'm mentoring those teams this year, and a team coach suggested that I register as a volunteer.
I did register, and was assigned the role of Pit Runner. The role description table indicates that Pit Runners aren't generally necessary, but that was fine, since this was my first volunteer role. When I arrived, I got a badge and lanyard and was issued a copy of the match schedule. My job was to notify teams when a match they were involved in was coming up. That ensured everybody was queued up when it was time for the match, which made everything run more smoothly.
Almost immediately, somebody came up and asked me where to find something. I had no idea (though I directed her to someone who did), and now I understand how the volunteers I've encountered from the standpoint of a team member must feel!
I didn't get much of a chance to watch the matches, since I was on the move most of the time - which I suppose means my presence was useful.
The whole event ran three and a half hours, starting at 6 PM. Despite the tiredness, it was a positive experience. I'll almost certainly volunteer at the next nearby FTC event.
Saturday, December 19, 2015
Friday, December 18, 2015
Cryptography wins
Several US presidential candidates are calling for restrictions on encryption in one form or another. Such restrictions or weakenings are either impossible to enforce or would do great harm to the security of law-abiding citizens.
Primer: what's so strong about strong encryption?
Encryption obscures data by transforming it using a secret key, or in some cases one half of a key pair. An encryption algorithm is the computational procedure that performs this transformation, and "cryptography" refers to the science of protecting data this way. Strong encryption algorithms are designed so that, as far as anyone can tell, the only way to reverse them without the secret key is to try every possible key - of which there can be unfathomably many. My personal favorite algorithm, AES, can use any of about 10^77 keys when run with 256-bit keys. (It takes 78 digits to write that number out in full.) Cracking strongly encrypted data by brute force would take eons even if you had billions of computers.
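If you want to see that keyspace number for yourself, a couple of lines of integer arithmetic will do it (a quick sketch; nothing here is specific to any particular AES implementation):

```python
# Number of possible 256-bit AES keys
keyspace = 2 ** 256

# Writing it out in decimal takes 78 digits (about 1.16 * 10^77)
print(keyspace)
print(len(str(keyspace)))  # 78
```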
Restrictions on strength of cryptography
Many cryptographic algorithms can be used with keys of different lengths, usually measured in bits. For example, AES-256 is the form of the Advanced Encryption Standard that uses 256-bit keys. As you might guess, longer keys mean stronger security.
One possible approach to restricting encryption would be to cap the length of the key. The major problem is that the code that performs the encryption is already on millions of people's phones. Maybe you can somehow get law-abiding citizens to destroy every program that uses too-strong encryption (hint: probably not), but then you're just begging bad guys (not necessarily terrorists) to attack the weakened algorithms that everybody is suddenly using. Then nobody is safe.
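A back-of-the-envelope calculation shows just how much a key-length cap weakens things. The guess rate below is a deliberately generous assumption - a billion computers each testing a billion keys per second:

```python
# Assumed attacker power: 1e9 computers * 1e9 guesses/second each
GUESSES_PER_SECOND = 10**9 * 10**9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits):
    """Worst-case time to try every key of the given length, in years."""
    return 2 ** key_bits / GUESSES_PER_SECOND / SECONDS_PER_YEAR

# A capped 40-bit key falls in about a microsecond...
print(f"40-bit key: {2**40 / GUESSES_PER_SECOND:.9f} seconds")
# ...while a 256-bit key outlasts the universe many times over.
print(f"256-bit key: {years_to_search(256):.3e} years")
```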
Restrictions on export of cryptographic software
In fact, the bit above about restricting key length was actually done - though not for us at home - back in the 1990s, the era of export-grade encryption. Some might think we can go back to that. Too late! Millions of computers around the world already have code that can do all sorts of strong encryption. The algorithms are public, with open-source implementations available to anybody for perusal or modification; there are copies everywhere. Think terrorists will just play along and destroy their copies of the code? Ha, no.
Sure, you might be able to stop new cryptographic software from being shipped out of the country (hint: you won't), but the mathematical knowledge will soon get out, or somebody overseas could invent a brilliant new algorithm. Or, you know, somebody could just stow a computer away on an outbound plane. Besides, existing algorithms are doing just fine; AES-256 will remain strong for a long time.
Backdooring otherwise-strong algorithms
Some people advocate the creation of algorithms that are strong (i.e. long key length, no vulnerabilities) but have a backdoor. A backdoor is an intentionally-introduced vulnerability that is known only to its inventor. Such a thing would allow the NSA/government to use their secret special key to decrypt any communication that used the algorithm.
My first concern is: what do you do when somebody discovers the backdoor? Changing a key is easy; swapping out an algorithm, not so much. If the secret key leaks to a bad guy, everybody's security is toast. Perhaps more practically, the terrorists aren't going to start using a new algorithm until they're sure it's safe. Again, there are many very strong, very well-vetted, open-source algorithms already in existence.
Banning encryption outright
I hope nobody is seriously considering this; encryption is essential to the safety and security of everyone. To actually ban encryption, it would be necessary to outlaw the storage of random data, because good ciphertext is indistinguishable from line noise. Besides, clear-text transmissions can be intercepted, read, and changed by attackers. A hacker sitting near you in a coffee shop could, say, grab your credit card details as you buy something online over the shared Wi-Fi.
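That "indistinguishable from line noise" claim can be illustrated by measuring byte-level entropy. This is only a sketch, not a rigorous statistical test, and random bytes stand in here for real ciphertext:

```python
import math
import os
from collections import Counter

def entropy_bits_per_byte(data):
    """Shannon entropy of the byte distribution, in bits per byte (max 8)."""
    counts = Counter(data)
    total = len(data)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

english = b"the quick brown fox jumps over the lazy dog " * 200
random_like_ciphertext = os.urandom(len(english))  # stands in for strong ciphertext

# English text is highly structured; good ciphertext looks like pure noise.
print(f"English text: {entropy_bits_per_byte(english):.2f} bits/byte")
print(f"Random bytes: {entropy_bits_per_byte(random_like_ciphertext):.2f} bits/byte")
```

A censor who wanted to ban "random-looking" data would be banning compressed files, random number generators, and much of modern computing along with it.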
Backdooring phones/devices
"Okay," you say, "we can't do anything about encryption, but we can backdoor people's phones or other devices! We'll just force Silicon Valley to do it!" First up, that's a great way to make profitable tech companies leave the country. Secondly, forcing companies to do something is sliding down a fairly slippery slope when it comes to American values and the Constitution and what have you. Skilled users will certainly discover the backdoor sooner or later. Asking the companies to cooperate is a much better plan, but with the exact same technical issues.
See, not everybody uses a device in which we can hide a backdoor or other feature. There are several open-source phone OSes, and technically skilled users can replace their phone's software with something they control. Alternatively, they could just use a non-mobile device, like a standard PC with an open-source OS, and then you can't do anything. Keep in mind also that attackers can use your backdoor to see all the stuff you can; it's only a matter of time.
Blocking sections of the Internet
It would be rather convenient if we could just stop all network connections to Internet addresses that we believe to be bad. Maybe we could, but we would have to set up our own version of China's Great Firewall - and stoop to that level of authoritarianism - to accomplish it effectively.
Even if a connection to a certain address is blocked, data can flow to it indirectly through proxies. For example, Tor is a network of thousands of computers that bounces traffic around the world in an encrypted fashion, making it so the destination is known only to the last server on the path (the "exit node") and the source is known only to the first (the "entry node"). Beyond those, only the user knows where the traffic is actually going. To let users find these relays, Tor maintains directory servers that list each relay's address. But even if you constantly scrape that list and block every address on it, users can connect through an unlisted Tor bridge that gets them into the network - and from there to anywhere they want to be.
If even one entry node or bridge is accessible, determined users can get anywhere they want to go. The only way to stop such distributed networks is to switch our country's edge filters from allow-by-default to deny-by-default (allowing only a pre-approved list of destinations), and suddenly we're North Korea.
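The layered "onion" encryption that makes this work can be sketched in a toy example. Here XOR with per-hop keys stands in for Tor's real per-hop encryption, and the fixed three-relay circuit is a simplification:

```python
import os

def xor_bytes(data, key):
    # Toy stand-in for real encryption: XOR with a same-length random key
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet me at the usual place.....!"
hop_keys = [os.urandom(len(message)) for _ in range(3)]  # entry, middle, exit

# The client wraps one layer per relay; the entry node's layer goes on last,
# so it is the first one peeled off.
wrapped = message
for key in reversed(hop_keys):
    wrapped = xor_bytes(wrapped, key)

# Each relay peels exactly one layer and forwards what remains; no single
# relay ever sees both the sender and the plaintext destination.
for key in hop_keys:
    wrapped = xor_bytes(wrapped, key)

assert wrapped == message  # only after the exit node is the message recovered
```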
Conclusion: cryptography wins
Encryption is a fact of life. We will never be able to read the transmissions of careful senders and receivers unless we get the keys they're using. Every policy option I've presented here would be ineffective at best and a danger to all citizens' security at worst. Cryptography wins - we will need to use other tools, other methods, other channels to monitor and stop terrorist activity.
Monday, December 7, 2015
Different at all costs
With a new OS version comes a new design, a new visual style. Windows 10 is no exception; icons have been flattened and the Metro (oh, excuse me, "modern") look has been brought to more aspects of the desktop.
We reached the pinnacle of desktop beauty a while ago, in my opinion, with Windows 7. It took full advantage of the hardware of its day; previous versions of Windows had to make sacrifices to accommodate more limited specs. At this point, the only thing driving designers to change is the need to make the new product look new. In other words: be different at all costs.
Windows 10 is where this becomes a problem for me. It's the first version of Windows whose icons I actively dislike. That's just, well, like, my opinion, man - but the point is that companies love coming up with a new theme that looks sufficiently different from the previous one, and that new theme creates unnecessary surprise and confusion for non-technically-adept users.