JB: Greg Jackson has spent 15 years on the faculty of Harvard and Stanford. He's active in numerous information technology efforts including Internet2, Educause (the new Educause, I might add), and the Common Solutions Group.
Welcome, Howard and Greg. Very glad to have you both here today.
GJ: Hi there, Judith.
HS: Thank you, Judith, and thanks again to CREN for choosing me as the technology anchor for Tech Talk. The job of the technology anchor is to engage our guest expert in a lively technical dialogue that will answer the questions you'd like answered, and ask those very important follow-up questions, which we're going to try to do. You can ask our guest expert, Greg Jackson, your own questions by sending E-mail to firstname.lastname@example.org anytime during this Webcast. If we don't get to your question during the Webcast, we'll provide an answer in the Webcast archives.
Yesterday, the Internet was just a place where we sent each other simple email and looked at each other's Web pages, so no one was all that concerned about security, authentication or authorization -- all issues that Greg's going to be talking about today. Today, we use the Internet to check and modify sensitive personal data, to pass credit card numbers around, to buy and sell stocks and bonds, and to reallocate our investment nest eggs. It is no surprise that we are feeling the need for more security on the Internet.
Just today, IBM and Intel announced a partnership to promote Intel's Common Data Security Architecture, or CDSA (another new acronym that I guess we're going to have to keep track of here). Intel and IBM also say they are working on a standard interface for identifying people based on passwords, smart cards or biometric characters such as fingerprints.
Is it safe to surf the Internet? Do our private systems really give us any privacy? Can we be sure we're communicating with our bank and not a 14-year-old pretending to be our bank? And is someone reading our confidential email and planning to publish it in a tell-all exposé on the Web? In today's Webcast, Greg Jackson will address these threats and discuss some ways to stop them. By understanding these issues, you'll probably sleep better at night after you've logged off the Internet.
JB: Well, thanks so much, Howard. As part of the introduction here, I'd like to mention that I heard Greg talk about these very important issues at a session last January, and at that point he really clarified for me some of the differences between authentication and authorization. We keep hearing people talk about those, and I found out that they're quite different kinds of things, Greg.
GJ: If we have a problem out there right now that stops us from moving forward, it's the fact that people confuse figuring out who someone is and establishing that with deciding whether that person can have access to something or other. Every time we get them mixed up, we find ourselves with something that resembles my wallet -- and my wallet right now is full of confusing pieces of plastic and paper which mix this up. Part of the problem we also face is that we don't do very well at distinguishing authentication and authorization when we're not on the network, and because of that, it's a little naive to expect we'll get it right on the network right away.
HS: Are you saying that you have things in your wallet that are authentication things and other things that are authorization things?
HS: Could you give us an example? I think it would be useful to take a peek at your wallet before we take a peek at the Internet.
JB: Now, hold that up to that video camera that we don't have, Greg!
HS: Right. Next year.
GJ: Let's talk about three things that are in there right now that are all flat and all paper or plastic. One of the objects I have in there is my University of Chicago ID card, and I'll come back to that one. Another thing I have is a pile of my business cards, and a third thing is a sensor card that has a big blue arrow on it and is otherwise blank, which I swipe through a lock to get into the secure pieces of our machine rooms. The interesting thing about those three is that they're all about the same size and appear to do the same things, but one of them -- the key card -- is pure authorization. Whoever has that card is authorized to enter this room, and so I have to be awfully careful about it, because no one is double-checking to make sure that I'm me.
HS: That card, when you go in the room, doesn't record the fact that it's you?
GJ: Well, it records the fact that that card went in, but it doesn't do anything to find out whether I'm really me. Now, there are much fancier systems that go beyond that, but as it happens right now, there happens to be a piece of our machine room (which I will not identify further) which depends only on whether someone has that card.
HS: But they believe you own that card?
GJ: They believe I own that card. But in fact, if someone steals my wallet and happens to know where that door is, they're in there.
If you look at my business card, at the other extreme, it is purely an attempt to identify me. It's not a very good attempt because, in fact, anybody can print up a business card that says they're me. But all that business card does is say, "This is Greg Jackson, or somebody who took the pains to put together a business card that says it's Greg Jackson." It has some things on it that tend to corroborate that, like phone numbers, the fact that I have it, and the right logo for the University. But the fact that somebody has that card -- or even that I have that card -- doesn't get me anything at all except establishing my identity.
The ID card for the University is the interesting hybrid between those two. It's one of these medium-fancy campus cards, which means it has a picture of me on the front and a lovely blue photograph of the quadrangles here at the University to identify that it's from the University. It has a little bar code that can be read by a bar code reader, and if I flip it over, on the back it has a magnetic stripe encoded to standard banking specifications.
HS: How come you have a bar code and a mag stripe?
GJ: Well, you have a bar code and a mag stripe because there's an awful lot of cases where mag stripes take more machinery than you want to use. For example, the people in the library very much like using bar codes.
HS: They do here, too.
GJ: They have all their books bar coded, and so since they have to have bar code readers already (and bar code readers work much better for books), they would just as soon read a bar code off the ID card. And it doesn't really cost anything extra to have that on there, so we do that.
And then the mag stripe on the back can be read by swiping through a turnstile -- for example, to get into the libraries. If I swipe through there, the attribute is that whoever has that card can get into the library. But since there's a picture of me on it, a guard who grew suspicious could look at the card and see that the person carrying it was not me, so there's an additional precaution there. And of course, we could have a system where, as soon as I've scanned, a little keypad wants me to enter something else to prove that I'm me. But the card itself doesn't require that.
So this is just an example of three very different levels. In addition to my ID card, I have a driver's license. I have several credit cards. I have a card that lets me charge things at our faculty club. I have a little card with a number on it that I can use to make, for example, this long distance call, which I can't make without knowing a particular number that I have written down. And there are ten or so other kinds of things that let me into one thing or another.
HS: Okay, now that we've gone through your wallet, I think we all know exactly what's in there and how it works. Could you tell us what these authentication versus authorization things look like on the Internet?
GJ: Well, they're pretty familiar to most of us who go out there, only we see them in two different ways. Let me play out what I did this morning when I came in. I came in, the computer sitting here on my desk was on, and sitting in front of me was a Windows NT prompt that wanted me to say what my username and password were. And so I had to say, "Here's my username, which is gjackson." That tends to be my username for many things. And then there's a password, which I typed in.
HS: So that's authentication?
GJ: All that did at that point was this: the fact that I knew those two pieces of information in our system was sufficient to convince the machine that, okay, I have Greg Jackson here trying to get in.
HS: Those are kind of authentication things that are of the nature of what-you-know.
GJ: Exactly. That's what's called "what-you-know." Typically, people talk about three different kinds of authentication. They talk about things that involve who-you-are, they talk about things that involve what-you-know, and they talk about things that involve what-you-have.
An example of what-you-know is a username-password pair. That's something that one can transmit very easily. An example of who-you-are is something like a photograph or a fingerprint or a retinal scan or something that involves my physical person. And an example of what-you-have is that there are certain instances -- for example, this key card that gets me into the machine room --where I have to actually have something. The best authentication involves all three of those things.
HS: Anything like that in your University?
GJ: In my University, so far as I know, we have no place that does all three. We have a batch of places that will do two, so that you have to have a card and know a code that corresponds to that card.
HS: That's sort of the way your ATM card works, right?
GJ: Exactly. In fact, most banking things involve a card plus a PIN or some other kind of number that you need to know.
HS: So most banking things are more secure than most university things?
GJ: Most banking things that actually require that PIN are, indeed, more secure. And more, typically, is at stake, so that makes sense.
But realize that we have to, again, separate the fact that my ATM card authenticates me from which accounts it actually lets me see. So back to my computer when I turned it on: once I've given it the matching password and username, the only thing the computer has established is, "Okay, I have this person who I know is gjackson." It then goes and looks up in a database someplace on itself to see what it should do, faced with this person named gjackson.
One option, it could say, "I know who you are, but you're not allowed to use anything here." Another option is it could say, "Okay, I know who you are. You're the system administrator. I will bring you in and let you do whatever you want on this computer," which is, in fact, what it will do since I am my own system administrator. And there could be any number of options in between, where it let me in and didn't let me change anything but let me run Microsoft Word, for example, so that I could write documents.
That's the distinction. Having identified me, it has not, by virtue of that, actually figured out what I'm able to do. That's a separate look-up -- or should be.
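The two-step pattern Greg is describing can be sketched in a few lines of Python. The username, password, and permission names here are invented for illustration, and a real system would store password hashes rather than plaintext:

```python
# Hypothetical user database. A real system would store password hashes,
# never plaintext passwords as done here.
USERS = {"gjackson": "a-made-up-password"}

# A separate table: what each identified person is allowed to do.
PERMISSIONS = {"gjackson": {"run_word", "administer"}}

def authenticate(username, password):
    """Step 1: establish who this is; says nothing about what they may do."""
    return USERS.get(username) == password

def authorize(username, action):
    """Step 2: a separate look-up deciding what this identity may do."""
    return action in PERMISSIONS.get(username, set())

if authenticate("gjackson", "a-made-up-password"):
    print(authorize("gjackson", "administer"))  # True
    print(authorize("gjackson", "reboot_lab"))  # False
```

Note that the key card in Greg's wallet skips step 1 entirely, and his business card skips step 2.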
HS: Do you always go through an authentication step before you go through an authorization step, or are those things really independent?
GJ: Well, in any well-designed system you have to, because ordinarily for any resource -- whether it's a network-based resource or bank accounts or who can chain a bicycle up on a rack or whatever else -- there is some classification of people or organizations into those that may and those that may not. And in order to decide whether to authorize someone, you have to be able to categorize them as either someone who can or someone who cannot.
You don't necessarily need to know who someone is. You can go back to my favorite authorization mechanism from my childhood -- the little sign you walk under in order to drive the race cars at Disneyland. The point where my head actually bumped into it, so I was allowed to go by myself, was a big deal, and that's an authorization mechanism. Any kid whose head bumps into that sign is allowed to go in and use the cars. But Disneyland does not in any respect find out who I am in order to do that, so there's such a thing as pure authorization.
In universities and in most places, that's not the way we usually do it. We ordinarily want to find out who somebody is, then look up that person's attributes, and then based on those attributes, decide whether to let that person in or not.
HS: One way, Greg, that people talk about keeping people out is by the use of firewalls, and they seem to be used in a little different way in universities than they do in corporations. Could you talk a little bit about the use of firewalls?
GJ: Well, firewalls in general don't keep people out. They keep particular kinds of communication out of a network or out of a machine or out of something. That is, you build a wall and you instruct that wall to only let certain kinds of information through -- sometimes coming in, sometimes going out, depending on which way the firewall works.
Sometimes, what you do in order to decide which information goes in or out is that you require that that information -- whether it's the packets or some other kind of stream -- identify who the person receiving or sending the information is. But more typically, what a firewall does is to actually protect against a particular kind of traffic.
So in a business, what typically will happen is an institution will have a network inside. It will then build a firewall around so that information traveling on the inside network is not visible outside, and so that people cannot send information to or receive information from machines inside. And having built that wall around the organization, it then relaxes and lets all kinds of stuff flow over the network without really worrying that the bad guys (or the Dark Side, as my old colleague Jeff Schiller likes to refer to it) are watching.
In a university, typically we have a campus that is pretty open and we have a network that is pretty open, in the sense that we don't spend a whole lot of time really making sure that everything inside the campus is nailed down. And that means that building a firewall around the university isn't terribly useful because if there are bad guys or double agents or corrupted machines around, they're already inside.
HS: So we wind up building the moat around the enemy?
HS: Not too good.
GJ: Now, we do--most universities do, in fact, have firewalls, but they are typically not around the entire campus to keep the hoi polloi out, but they are rather around particular places on the network that have secure machines or sensitive information. What you then do is you make sure you don't let anybody unauthorized inside that firewall.
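Greg's point that a firewall filters kinds of traffic rather than people can be sketched like this; the rule set and port numbers are invented for the example:

```python
# Invented rule set: each rule is (direction, port-or-None, verdict);
# None matches any port. The first matching rule wins.
RULES = [
    ("inbound", 80, "allow"),     # web traffic may come in
    ("inbound", 25, "allow"),     # mail may come in
    ("inbound", None, "deny"),    # all other inbound traffic is blocked
    ("outbound", None, "allow"),  # anything may go out
]

def filter_packet(direction, port):
    """Decide a packet's fate from its kind alone -- never from who sent it."""
    for rule_direction, rule_port, verdict in RULES:
        if rule_direction == direction and rule_port in (port, None):
            return verdict
    return "deny"  # default-deny if nothing matches

print(filter_packet("inbound", 80))     # allow
print(filter_packet("inbound", 6000))   # deny
print(filter_packet("outbound", 6000))  # allow
```

Nothing in the decision asks who the sender is, which is exactly why a firewall alone cannot help against bad guys who are already inside.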
HS: So we've got to do something beyond firewalls. Is that something else encryption?
GJ: Encryption certainly is a help. For example, here at the University of Chicago we have a campus and we own a hospital, which used to be part of the University and now is a separate corporation that we own. The hospital, of course, has a great amount of very sensitive information that moves around on its network. The University, by and large, doesn't. The University's network generally is open, which is to say we don't have firewalls around the entire University network, and so the presumption we operate on is that there might be people watching traffic.
That means that anybody on the University side who doesn't want his or her traffic observed has to do something about it. For example, when I'm corresponding over the Internet with my network security officer -- since we're often discussing a particular security incident in progress -- I usually don't want that traffic moving in the clear. And so he and I will agree on a way to encrypt our transmissions to one another. Sometimes that's just code words that we know, and sometimes it's a fairly sophisticated encryption mechanism.
The hospitals don't do that. What they've chosen to do is actually to build firewalls around the hospital network, and so if you try to reach a computer inside the hospitals from outside, the firewall simply won't let your traffic through. There are a few specially regulated gateways which are quite secured, but their general approach is to build a wall around the hospital network and then they can be much freer and easier with the data moving over that network, much as they are with carts that are being driven around in the hall.
HS: Greg, we have a question from Nancy Zofen (I hope I'm pronouncing her name correctly) from Michigan Technological University about encryption and key recovery. Nancy says, "How are colleges and universities ensuring they have access to the keys (these are the encryption keys) if and when they need to access University data that's been encrypted? And who's the keeper of the keys?"
GJ: Well, there's a "should" and a "do" answer there. The cold hard fact is that most universities at this point are not worrying about this problem very much. They're not worrying about it terribly much because, for the most part, they haven't gone very far down this path.
There's an important distinction here, which is that one uses key technologies of one sort or another for two different purposes. One purpose is to encrypt a piece of communication. If you do that with a public key system -- say you, Howard, wanted me to send you something encrypted -- you would have a secret key and you would publish a public key. I would use your published key to encrypt what I wanted to send you. I would then send that over the network, and basically, it is unintelligible to anyone who isn't you -- anyone who doesn't have your secret key.
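The flow Greg describes can be illustrated with textbook RSA and deliberately tiny primes. This is only a toy -- real public-key systems use enormous keys and padding schemes -- but it shows the shape: encrypt with the published key, decrypt with the secret one:

```python
# Toy RSA. These primes are absurdly small; real keys are hundreds of
# digits long, and real systems add padding before encrypting.
p, q = 61, 53                # Howard's secret primes
n = p * q                    # modulus, part of the public key
phi = (p - 1) * (q - 1)
e = 17                       # public exponent: Howard publishes (n, e)
d = pow(e, -1, phi)          # secret exponent: Howard keeps this

message = 65                        # what Greg wants to send (a small number)
ciphertext = pow(message, e, n)     # Greg encrypts with Howard's PUBLIC key
recovered = pow(ciphertext, d, n)   # only Howard's SECRET key undoes it

print(ciphertext)             # 2790 -- unintelligible to an eavesdropper
print(recovered == message)   # True
```

If Howard loses `d`, the ciphertext stays unintelligible to everyone, including Howard -- which is exactly the truck scenario that comes up next.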
HS: Right, and if I get run over by a truck with my private key tucked away in my pocket--
GJ: Then we have a problem, because it means no one can read that traffic. Someone here was on vacation for two weeks, and it turned out one of our systems was sending him email with instructions to prepare, for faculty, collections of photographs of the people in their classes. He was on vacation, and there was the email going into his mailbox, and we actually had to break into his email account to retrieve those messages -- which is something that we do only under the direst of circumstances and with lawyers watching over our shoulders. And if you got run over by a truck and your secret key was gone, yet some critical communication at Princeton that was needed was encrypted that way--
HS: Which happens all the time. We always have critical communications here.
GJ: Your boss would be out of luck because right now, unless you've put a copy of that key someplace just in case, there's pretty much no way to break that encrypted communication.
So the right thing to do with encryption keys is that secret keys that individuals are using on university business ought to be collected someplace.
HS: This is this whole key escrow?
GJ: This is the whole key escrow.
HS: Issue. And you don't think many people are doing it, then?
GJ: Well, I don't think many people are doing it because relatively few institutions actually have a whole lot of encrypted stuff moving over their networks.
This is an idea where the technology is perhaps two or three years old and the idea that this is important really is only about a year old. And so it takes a while, to the point where everyone has public keys and private keys and knows how to get them and knows how to encrypt and has the right tools to do all of this. This is complicated by the fact that there are a couple or three different ways of doing this which aren't terribly compatible with one another.
But once a campus has agreed on a way of doing this, the secret encryption keys, the things that let you decrypt a message really need to be held by somebody. Now, who that should be is the same kind of question as who has a key to my office here on campus, and the answer in that case is two people do. The facilities department can come into my office because they have a set of master keys and the campus police have a set of master keys and they can get into my office. And what you always want to do with technology is do something that is as consistent with--
HS: You probably don't want to give those people your private key.
JB: The key escrow?
GJ: Well, you may or may not give those particular people, but you certainly want to think about why did I choose those people and choose in a similar way.
So, for example, one might want to think about the general counsel and the chief technology people as the appropriate analogy to the police and the facilities department in the electronic world. Some people are very uncomfortable with the idea that I would have the keys and be able to decrypt everyone's messages, since one of their favorite things to do is complain about me. And so I could see them perhaps trusting the general counsel's office more to hold this. As long as it's something like that, it will work reasonably well. There always will be people who say, "Dammit, those people have no right to this! This is my private business!" And frankly, if it's university business, that's simply not correct. If it's university business, it's university business.
Now the other use of secret key and public key technology is to put digital signatures on things, so that when I write to our local FBI office--
HS: So that's an authentication thing.
GJ: That's an authentication thing. They need to know that the message really came from me, and so I will sign it with my secret key and then they can use my public key to see whether it really came from me. My secret key that I use for signing I should not be escrowing someplace else. If I'm run over by a bus, messages should stop coming from me.
JB: Unless, of course, there's something about the Other World that we don't know yet, right?
HS: We'll keep a watch on that. That will be another Webcast that we'll do.
GJ: And I would have the key with me in that case, so I would be okay.
But it gets very dicey, because one can use the same keys both for encryption and signature, and people have gradually come to understand that if you're going to start down this path, you have to get people to understand that they need an encryption key and a signature key. If you go looking at standard PGP servers for my key, for example, you'll find one of each. If you want to encrypt things to me, you should use my encryption key. And if you want to see whether I signed something, you'll discover that I've used my signature key.
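The signature side can be shown with the same toy RSA numbers, with the key roles reversed: sign with the secret exponent, verify with the public one. The message and the simplistic "digest" are invented for the sketch; real signatures use a cryptographic hash and padding:

```python
# The same toy RSA numbers as the encryption sketch, now used for signing.
p, q = 61, 53
n = p * q
phi = (p - 1) * (q - 1)
e = 17                 # public exponent, published
d = pow(e, -1, phi)    # secret exponent, kept by the signer

# A stand-in "digest" of the message; a real signature scheme would use
# a cryptographic hash such as SHA-256 here.
digest = sum(b"meet me at noon") % n

signature = pow(digest, d, n)   # made with the SECRET signing key

# Anyone can verify with the PUBLIC key:
print(pow(signature, e, n) == digest)      # True: it really came from the signer

# A tampered signature fails the same check:
print(pow(signature + 1, e, n) == digest)  # False
```

And this is why a signing key, unlike an encryption key, should never be escrowed: the whole point is that nobody else can produce `signature`.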
HS: Greg, from what you're saying and from a lot of the people I've talked to, it sounds like many universities are not very far down the road to implementing appropriate security -- network security -- on campus. I wonder if you could address the issue of what steps, if there are just a few steps, that universities who are not far down the road (which sounds like most of them) ought to take to deal with network security? What are the first couple things they should do to get their security plan in place?
GJ: Yeah, that's a good question and actually, the second piece of this will lead me to quarrel with the statement that most places are doing nothing, because that's not quite right.
The first point in any security scheme is that there are certain very, very critical things where you want to put a lot of effort into securing them. But by and large, those are a small number of communications among a small set of people. For example, the chief financial officer communicating with the comptroller is something that we want to be very, very careful about.
Let me put that aside, because I think controlling very small flows like that is something where there are well-established technologies, the same kinds of things that banks use to communicate with one another.
The second thing you want to do is you want to take any very widespread authentication procedure, something that happens all the time where people again and again and again are putting their usernames and passwords out over the network--
HS: Like email.
GJ: Like email authentications. And you want to make sure that you secure those. I was just talking to my provost yesterday, and he said, "Well, you know, I send mail around the campus all the time in envelopes, and people can steam them open and look inside, so why should I be worried about my mail?" And I said to him, "It's not your mail that we're interested in. It's the fact that you authenticate in order to get your mail, and it's that transaction of authentication that's the problem." Someone who uses Eudora and POP mail, which is a fairly common thing on campuses, typically will set Eudora to check the mail every ten minutes or 20 minutes, which means all day long, every ten minutes, out go the username and password.
HS: So you're concerned just that those passwords are the same passwords they're using for other things?
HS: And that people can pick those passwords up off the network pretty easily?
GJ: Exactly. It's not that you could pick up a password and then read someone's mail.
HS: So nobody's reading your email. They're reading your password.
GJ: They're reading your password and that's a problem because people have the habit of using the same password and username pair for other things.
HS: Well, because they're easy to remember. Who wants to remember a hundred passwords?
GJ: Exactly. If you have to remember 40 passwords, as I do, the great temptation is to use the same one for everything. And of course, it would be much easier if I were gjackson for every service that I used -- and if I succeed in both of those and someone gets my email password, they've got everything. Some of it means they can read the New York Times while pretending they're me, which is no big deal. But it also means they can go in and actually order merchandise to be sent to them, and that is a big deal and worth paying attention to.
So one of the things you want to do fairly quickly is to try to get a handle on very widespread authentication transactions, and as you pointed out, the fact that people repeatedly try to read their email is the one you go after.
So the first thing, bar none, is to get to the point where the repeated authentications in email are secure. That means that when the username-password exchange between your computer and the server takes place, that transaction should be encrypted, so that a bystander who sees it gets nonsense.
There are a few ways of doing that. One that's very commonly used at universities is to use something called Kerberos, which was developed at MIT actually, some years back, and the Kerberos protocol is one that basically has me put my username and password in on the computer here and then the computer here does something to that in a prescribed way and sends this hashed-up piece of nonsense off to a server. The server then can replicate this hashed-up process and compare what it comes up with with the nonsense that I sent, and if those two pieces of nonsense match, then it can send a message back to my computer saying, "Okay, looks like you're the right person." And someone who intercepts that has nothing of use because the next time I do it, the nonsense will come out differently.
HS: It'll use a different key.
GJ: And so securing these kinds of transactions with something like Kerberos is something that one wants to move to very quickly.
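A much-simplified sketch of the hashing step Greg describes: the password never crosses the wire, only a one-time proof derived from it that the server can reproduce and compare. Real Kerberos involves tickets and a key distribution center; the account name and secret here are invented:

```python
import hashlib
import hmac
import secrets

# What the server already knows (hypothetical account and secret).
PASSWORDS = {"gjackson": b"a-made-up-secret"}

def make_proof(password, challenge):
    """Hash the secret together with a fresh per-login challenge."""
    return hmac.new(password, challenge, hashlib.sha256).hexdigest()

# The server issues a fresh random challenge for this login attempt:
challenge = secrets.token_bytes(16)

# The client turns the typed password into a one-time piece of "nonsense":
proof = make_proof(b"a-made-up-secret", challenge)

# The server replicates the computation and compares the two pieces:
expected = make_proof(PASSWORDS["gjackson"], challenge)
print(hmac.compare_digest(proof, expected))  # True -- and next login the
                                             # challenge, hence the proof,
                                             # will be different
```

An eavesdropper who captures `proof` has nothing reusable, because the next login uses a different challenge.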
HS: But in your example of using Eudora and POP mail -- or in my case, where I use IMAP -- is Kerberos easily available or easily integrated into these systems?
GJ: Well, it depends. One of the reasons I happen to use Eudora -- and I should say right at the outset, I own no stock in Qualcomm, nor, so far as I know, do any of my mutual funds. Qualcomm happened, some years ago, to send a representative to a major meeting of university chief information officers, and a bunch of us made it real clear to Qualcomm that we would really like to see Kerberos smoothly integrated into that product. And lo and behold, the Qualcomm people did that. So Eudora has built into it a fairly smooth implementation of Kerberos, so long as the server that you're using is able to do this and you have the right infrastructure behind it. So a lot of institutions that want to use Kerberos will, in fact, follow that path.
Now, there's another way of doing this, which is with something called X.509 certificates -- a different form of digital certificate that depends on a somewhat different hierarchy, but the same general idea. And that's something that--
HS: These are the VeriSign digital certificates you're talking about?
GJ: Something like that, yes. VeriSign is an example of that. There's also CREN. Judith, who is on the phone with us, has been behind a project to have CREN step in and serve as a trusted source for certifying that universities are who they say they are; then universities can turn around and certify that their faculty, students and staff are who they say they are. And then you can use those certificates, very similarly to the way you would use Kerberos, to authenticate who you are without sending your password over the network.
HS: So once you add Kerberos--
GJ: And that's built into things -- like Navigator and Explorer are often capable of doing that kind of thing. It's not terribly hard to do this except that you have to train everyone that once you have one of these secure passwords, it's sort of like safe sex. You don't want to use it anyplace that's not secure.
HS: Once you get your email system secured, that is, we plug Kerberos or digital certificates or something into it, right, then it seems like everybody has to change their password because we should assume at that point that all the old passwords are compromised.
GJ: Yeah, you always want to assume when you bring up a new secure system, assume that the bad guys know everything that came before. And that's hard because it means that everybody has to go through one of these huge transitions, but if you don't do that, then the new system is compromised as well, and you don't want that.
HS: But one doesn't have to change one's ID. You can just go out to everybody and tell them to change their passwords.
GJ: Yes. The ideal thing is you would change both, and you would change--
HS: Why does it help to change your ID?
HS: Since I never plan to give mine up!
GJ: The less information that's predictable or recorded someplace, the better.
HS: I mean, aren't IDs pretty well known? I mean, they're almost published. They are published!
GJ: They are published but that means that one of the two pieces of information you need to know isn't a secret. A far better thing is to have something where both pieces are a secret.
There's an architecture now called challenge-response, where there's an algorithm for generating a challenge and a right answer to that challenge, and you carry around in your pocket a little card that has the algorithm built into it. When you go to log in, the system you're going to issues a challenge, which will be, you know, 17PQR7F! You then punch that in on your little card, and it calculates the right response, which might be, you know, 7732Q. So you put that in where it says PASSWORD, and now the challenge has been met with the appropriate response, and it authenticates you. The next time you do that, both the challenge and the response will be different. And that's better security. Now, that's overkill for most universities, but it's quite commonplace in high-security network infrastructures.
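The card-and-challenge exchange might be sketched like this; the shared secret and the challenge strings are invented for the example:

```python
import hashlib

# The shared secret "burned into" the card and known to the server.
# It, and the challenges below, are made up for this sketch.
SECRET = b"a-made-up-card-secret"

def card_response(challenge):
    """What the little card computes from a challenge."""
    raw = hashlib.sha256(SECRET + challenge.encode()).hexdigest()
    return raw[:6].upper()  # short enough to punch in by hand

for challenge in ("17PQR7F!", "9ZZK01B!"):
    print(challenge, "->", card_response(challenge))
# Each login gets a fresh challenge, so an eavesdropper who captures
# today's response has nothing useful against tomorrow's challenge.
```

The server runs the same function with the same secret and compares; only someone who has the card (what-you-have) can answer.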
HS: You mentioned to us when we talked to you before how you managed to keep track of 40 passwords. You said you used some kind of password algorithm. Could you say a few words about that? I think, I mean, it was a new idea to me. I think it will be a new idea to the listeners.
GJ: I go to places where they've gotten picky about not letting me use my wife's name as a password because it'll show up in a dictionary, and so I had to gradually start finding things that wouldn't show up in dictionaries and that I would be able to remember. And as most people who have ever had a young child know, somewhere in the history of that child there's usually some absolutely memorable piece of nonsense that's meaningless to anyone else. Sure enough, in our family history, there is such a thing. So I said, "Aha! I'm going to use that as my password!" And then at some point I went to change the password, because I was on a system that said you have to change it once a month, and I went to use the same password. It said, "Sorry, you can't use the same password." So I said, "Aha, I know how to solve this problem. I will modify my chunk of nonsense each month with something that depends on what month it is." So it was the chunk of nonsense followed, for example, by the last digit of the month. Sure enough, that worked, and now I had a pattern where I could not only remember my password, but it would be different every month and I could figure out what it was by reasoning backwards.
And then I realized that I use all these different systems, and I could, at the other end of my password or in the middle, put something that had to do with the system I was using -- some attribute of the name of the system or service that I would stick into my password. Over the years, I discovered this was really useful. Now basically all of my passwords follow this method. So if someone gets one of them, they won't know which piece of my password is the nonsense that repeats and which pieces change, so they really don't have anything terribly useful.
And I can remember what all my passwords are without writing them down, which is the critical thing to do. Budget officers here at the university need on the order of 30 different passwords to do their business, and so what they all do is they use the same password or they write them all down on a post-it and stick it on their monitor.
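Greg's scheme reduces to a small derivation rule. The composition below (nonsense, then a fragment of the system name, then the last digit of the month) is one guessed ordering; the actual pieces and their order are whatever the person can remember.

```python
# Illustrative sketch of the password pattern described above: a private
# chunk of nonsense plus a system-dependent piece plus a month-dependent
# piece. NONSENSE and the composition rule are placeholder assumptions.
import datetime

NONSENSE = "glorp"  # stand-in for the memorable family nonsense

def derive_password(system, when):
    """Rebuild the password for a given system and month by rule."""
    system_part = system[:3].lower()      # some attribute of the system name
    month_digit = str(when.month)[-1]     # last digit of the month number
    return NONSENSE + system_part + month_digit

print(derive_password("payroll", datetime.date(1999, 11, 4)))  # glorppay1
```

The point is that nothing needs writing down: every password is recomputable from the rule, yet differs across systems and changes every month.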
HS: If we got to the point where we were using biometric data like fingerprints or something like that, could we -- we then would have effectively just one password. Put your thumb on the screen or whatever, and then you'd be done. You wouldn't have to remember 30 passwords.
GJ: Certainly, if you can come up with a reliable what-you-are piece of the authentication process, that can reasonably well replace a what-you-know piece.
HS: Is there any sense that we're going to move in that direction? Or we're just going to keep using passwords and things?
GJ: There's a lot of talk about it. The difficulty is, if you want to do this, then, on a campus, for example, you have to equip all the places where someone might want to authenticate with whatever the reading device is. And these biometric things require something other than a keyboard.
JB: And couldn't we get to the point, Greg, that the screen on some of these devices could serve for that? I mean, we've got screens on almost everything like, you know, cell phones, Pilots, computers.
GJ: Well, it could, but remember, you don't want to take something off the screen. You want to put something into the screen. And most of our monitors these days don't have such screens.
HS: Yeah, they're one-way.
JB: Oh, I forgot about--we had touch screens years ago.
GJ: University of Chicago has on the order of 10,000 entities on its network that aren't routers, or something like that. And if I want this to work, basically I have to equip all of these with something. Now, a gadget that, for example, will take a fingerprint or a retinal scan or something like that probably costs me something like what a video camera costs -- which means, if I bought 10,000 of them at once, it's probably going to cost me a hundred and some-odd bucks apiece. And then I've got to go install them all. And this is now a new piece of hardware--
JB: And maintain them, right?
GJ: That can break. Yes. Hardware that can break, which is fairly sensitive. And so this is a big deal.
JB: We do have a question coming in that perhaps we might want to take at this point, as we're getting towards the last half of the program, Greg. There's a question coming in from Randall Winchester from the University of Maryland at College Park, and he asks, "What do you think about the future of integrating what-you-have with what-you-know technology?" And then he's got a couple of examples here. For example, what do you think about using a password before you can get to a certificate or certificate server? And then the second part of that is using a certificate like SSL to encrypt a tunnel to type a password?
GJ: These are all good. Again, we have to be careful not to confuse--
HS: What's a tunnel?
GJ: A tunnel is where, I mean, SSL -- Secure Sockets Layer--
HS: That I know. Tunnels, I don't know. Tunnels are what I use to go into New York City.
GJ: The idea behind a tunnel is, you know, it's a metaphor for the idea that you create some channel where you can go through it and you're blocked off from everything else, which is basically what SSL communications are doing. There's a thing called SSH -- Secure Shell -- which is often used for places that still want you to log into a computer somehow. They will set up something like that. The difficulty there is that what that is doing is encrypting a particular stream of communications. That's not doing anything to actually authenticate you, unless your authentication is used to set up the tunnel, which sometimes is the case, sometimes isn't.
SSL, for example, typically sets up a tunnel between a machine and a machine, not between a person and a machine. My computer here on my desk will always set up a similar kind of tunnel to another server, regardless of whether I'm using it or my colleague here in the office is using it. That has to do with protecting the password interaction from snooping eyes, and obviously, one way to do it is to secure the thing all the way to the server.
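A minimal sketch of such a machine-to-machine tunnel, using Python's standard ssl module (TLS, the modern descendant of SSL). The point it illustrates is Greg's: the tunnel authenticates a server machine and encrypts the stream; it says nothing about which person is typing at either end.

```python
# Sketch: wrap a TCP connection in TLS so that everything sent through
# it, a password included, is protected from snooping in transit.
import socket
import ssl

# A default context verifies the server's certificate against trusted roots.
context = ssl.create_default_context()

def open_tunnel(host, port=443):
    """Return an encrypted socket to host:port. Note what this proves:
    the machine at the far end holds a certificate for `host`; it does
    not identify the person using either end of the connection."""
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)
```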
A different way to do it is with Kerberos, so that the password never leaves your machine, but that's really not adding a what-you-have to a what-you-know. What-you-are has to involve something biometric. Something that comes out of your mind doesn't count as that; that's a what-you-know. So you have to figure out some way of getting that fingerprint or the retinal scan or how hard you push on the side of your computer.
Some people like to use a test phrase: you type it, and the idea is that you'll always pause in exactly the same ways between the words. And, you know, if I type "these are the times that try men's souls," it'll measure the intervals between my keystrokes, and that's biometric. That doesn't work very well, but it's a nice idea. It's a nice idea because you don't need a new piece of technology to do it.
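The keystroke-timing idea comes down to comparing inter-key intervals against a stored profile. Here is a toy sketch with an arbitrary tolerance; a real system would use statistical models rather than a fixed threshold.

```python
# Toy sketch of keystroke-timing biometrics: enroll the gaps between
# keystrokes for a known phrase, then accept a later attempt only if
# every gap falls within a (here arbitrary) tolerance of the profile.
def intervals(timestamps):
    """Gaps between successive key-down times, in seconds."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def matches(profile, attempt, tolerance=0.05):
    """True if the attempt's typing rhythm tracks the enrolled profile."""
    gaps_p, gaps_a = intervals(profile), intervals(attempt)
    if len(gaps_p) != len(gaps_a):
        return False
    return all(abs(p - a) <= tolerance for p, a in zip(gaps_p, gaps_a))

enrolled    = [0.00, 0.18, 0.31, 0.52]  # key-down times at enrollment
same_typist = [0.00, 0.19, 0.33, 0.50]  # similar rhythm
impostor    = [0.00, 0.40, 0.45, 0.90]  # different rhythm
```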
So combining encryption of the password interaction with a password interaction doesn't really add anything to authentication, although it's a good thing to do. The more interesting idea is this idea that you have to have a password to get to a certificate. But there, realize all you have at that point are in effect two what-you-know transactions rather than a what-you-know and a what-you-have. What-you-have always refers to a physical item of some sort, not a logical one, not a collection of bits.
JB: What-you-have is that authentication card, or pardon me, the authorization card that you use to get into the lab, for example.
GJ: That's a card.
HS: Or a physical key. A physical key is (inaudible.)
JB: Or a physical key.
GJ: I have a whole chain of authorizations.
HS: We forget that we have these physical keys that we carry around, and that's actually a very sophisticated -- I guess newer ones in cars that have little chips on them are even more sophisticated, where your engine just won't start unless the right chip is there.
GJ: And part of the security is in how you handle that. For example, my keys -- and I've got a lot of them. I have all these different places that I have to go that aren't electronic. So I hang them on my belt, which means periodically they fall off my belt. And one of the things I'm careful to do is to put no identification on my keys.
HS: Yeah, that's become a problem, actually, in hotels. I'm always given these keys and I forget what room I'm in. And I look at the key, and it's a key.
GJ: And it doesn't even say what hotel it is!
JB: That's right!
HS: At a conference, you're wandering around the halls of the wrong hotel.
GJ: But this is back to Randy, is that what his name was?
GJ: That's analogous to what Randy is proposing. If you're going to use a key to get into something, then do some things so that that key isn't useful to someone else. But that's not a substitute for the key itself.
JB: Okay. Greg, we started talking about, or you started mentioning what are some of the basic steps that a campus needs to be thinking about to get ready.
GJ: And step one -- which, when you look at what you actually have to do, is really step two. The thing to focus on most is to get reasonably secure authentication for a huge fraction of the authentication transactions, if you can. If you're going to do that, it turns out you have to do something else first: you have to figure out who everybody is on campus, if we're talking about a campus.
Many campuses have this sort of smoothly solved so that -- for example, I taught at Harvard for many years and actually my doctorate's from Harvard. And after I got my doctorate, I went off and taught at Stanford for a couple of years and then I came back to teach. And when I went to get my ID card, I was somewhat intrigued by the fact that my Harvard ID card showed up, now that I was an officer of the university, with exactly the same number on it that I had had as a student. So I was sort of curious about that, and went over and discovered there's this thing at Harvard called the Name and Address Office. And I'm not sure whether it's still there, but the Name and Address Office's sole function was to identify everyone who ever has any contact with Harvard, assign that person a number, and then to the best of its ability make sure that if that person has further contact with Harvard at any point, you use the same number.
And what you basically have to do is something like that because otherwise, you start having people with multiple identities, which is one problem. And you start having difficulty figuring out, you know, whether a given person should be authenticated at the university. And so you have to somehow figure out what is either the campus-wide directory problem or the unique identifier problem or the linkage identifier or the net ID. It goes by all these different names. So you have to have something--
HS: What do people use? I mean, it used to be, when we were naive about this whole thing, we used social security numbers as a unique identifier.
GJ: Well, we did except if you were at an institution that had a large number of international students. Each institution would make up fictional social security numbers.
HS: Right. Typically started with three nines, as I recall.
GJ: And those would differ -- so if MIT, for example, had had an undergraduate and made up a number and then that person went to Princeton as a graduate student, that person would now have two different fake social security numbers.
HS: Well, they could have two different social security numbers here, if they were a foreign student. Because different places would just make up different numbers for different purposes.
GJ: True, yes. So we used to use social security numbers, and the dirty little secret, of course, is that we still do. We just don't print it on everything because we're not allowed to disclose it, but many, many unique identifier systems depend critically on a social security number as part of the identification vector, the set of things you use to figure out, do I know who this person is, unambiguously.
Here in Chicago, for example, Greg Jacksons are a dime a dozen. There's something like 80 or 90 of them in the metropolitan area, so it's not a terribly unique name. So that doesn't do much good, and actually, there's something like 15 other Gregory A. Jacksons in the Chicago area, so that doesn't do much good. There aren't any other Gregory A. Jacksons with my social security number, so at that point, I'm fully identified. And I would guess that there aren't very many that have my same birth date and birthplace. So if you just knew my name and my birthplace and my birth date, you can do a pretty good job of figuring out whether you know who I am already. And most of these systems are based on some collection of five or so pieces of information.
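The "identification vector" Greg describes can be sketched as matching an incoming person against existing records on several fields at once, since name alone collides. Every record, field name, and the three-of-four threshold below is invented for illustration.

```python
# Sketch: decide whether we already know this person by counting how
# many identifying fields agree with an existing record. All the data
# and the threshold here are made up for illustration.
RECORDS = [
    {"id": 1001, "name": "Gregory A. Jackson", "birth_date": "1955-01-02",
     "birth_place": "Chicago", "ssn_last4": "1234"},
    {"id": 1002, "name": "Gregory A. Jackson", "birth_date": "1961-07-19",
     "birth_place": "Boston", "ssn_last4": "9876"},
]

FIELDS = ("name", "birth_date", "birth_place", "ssn_last4")

def find_match(person, threshold=3):
    """Return an existing id if enough fields agree; None means this is
    a new person, who should be assigned a fresh identifier."""
    for rec in RECORDS:
        hits = sum(person.get(f) == rec[f] for f in FIELDS)
        if hits >= threshold:
            return rec["id"]
    return None
```

A threshold below the full field count lets one mistyped or changed field (a new address, a transposed digit) still resolve to the right person.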
HS: So we collect enough data so that we hope that there's not going to be a duplicate anywhere.
GJ: Okay, and so now, you know, when someone says, "Okay, I'm Greg Jackson," you can go to this database and say, "Okay, now I've figured out who Greg Jackson is and then I can use that to look up what kinds of privileges to give this person." So you need to solve this integrated directory problem. And then, once you've done these two things so you can now identify everybody and you can authenticate them securely, now you actually need to build that secure authentication into whatever this big set of applications is, and if email is the dominant one--which it usually is--that means you actually have to bring up an email system that uses secure authentication, which is the easy part. And then the hard part is you have to do the massive public education project to get everyone to understand what it means to have a secure authentication system and respect it. Those are the three big things. The directory thing, the security thing and an application thing.
HS: When I first started introducing this session, I talked about this announcement I'd seen from IBM and Intel about this CDSA, this Common Data Security Architecture. Can you tell us anything about that? Is this an old thing being warmed up again or is it a new thing, and how is this going to fit into this whole scheme?
GJ: Well, rather than embarrass myself, I'm just going to have to say I actually don't know very much about it. My fervent hope is that what it is is an implementation of protocols that are coming to be accepted out there in much the way that Microsoft, after an initial effort to develop its own authentication system, went back and in what is now -- let's see, Howard, you know what this is called. Microsoft Windows 2000 Pro, which some of us used to call NT5.
HS: Yes. Well, in fact, as you know, there's four flavors of it: a workstation flavor and three server flavors.
GJ: But anyhow, the authentication scheme that is built into this now actually is pretty much built on the Kerberos scheme, which was not true earlier. And although there are all the usual suspicions about, you know, will there be some little proprietary wiggle at the edge at the last minute -- if that doesn't happen, there's a general feeling that that was a good thing. Microsoft saying that this is important was an important step, and their saying "We're going to build what we do on a protocol that's already widely in use" was a very good thing, because it'll encourage this to be integrated rather than encourage some kind of trade war about what the right way to do it is. And my hope is that the CDSA stuff has that same attribute -- that it's building on X.509 architectures and it's building on Kerberos architectures and other things that are commonly used already.
HS: So you suspect that one of the impetuses to this is this whole E-commerce thing on the Web, then? There seems to be a lot of concern about the fact that there's a lot of electronic commerce on the Web, and it seems to me that's what's driving a lot of this.
GJ: Yeah, certainly when we get to institutional E-commerce, this is going to be critical. For personal E-commerce, it's not terribly important because there, what you want to do is ask someone for their credit card and then use it to charge them. And the key thing (back to Randy's point) is to make sure that you construct some reasonable tunnel through which that credit card number moves. And this has to do with the advice to everyone: if you're going to send a credit card number and you're filling out a form on the screen, look at the lower left-hand corner of your browser and make sure that either the key isn't broken or the little lock is closed, or whatever your browser's symbol is for a secure interaction. And that's a reasonable protection, because credit cards already have liability limits and things like that behind them.
HS: Your concern about the lock is that if you do see the little lock, it means that your credit card number is going across the network encrypted.
GJ: It doesn't mean that it's really going to L.L. Bean. This is the other thing you have to be careful about: you could get a piece of email that says, "Oh, wow, look, you can buy a car for $42. Just type in this URL and give us your credit card number!" You shouldn't do that. Whereas, if you actually followed a reasonable public path to get to a vendor, then you can be much surer that what you actually got to was that vendor. So if I go to www.llbean.com, it's a pretty good bet I'm there, because otherwise L.L. Bean would have sued on a trademark basis and gotten that back.
There was an interesting problem for a while when Digital brought out the AltaVista search service, which was that there was another company called Alta Vista Technologies out there that then brought out a Website, which was www.altavista.com, and a lot of people went to that site, thinking they were going to the digital search engine, which they weren't.
HS: Yeah, which is www.altavista.digital.com, for folks who were wondering how to get to the real one.
GJ: Exactly, and so you have to be careful about where you're going. But the communication between you and wherever you've gotten to is secured by that little lock or the unbroken key or whatever it has. Or the fact that the URL at the top says "https" at the beginning, not just "http".
JB: That's a good reminder, since we see those little locks on the Web quite often, Greg. So, back to the question that Randy asked earlier: if we see that little lock, it means there really is a little tunnel being set up.
GJ: What it means is that you're reasonably protected against what you're sending being intercepted in transit. What it is not is a guarantee that you're actually sending it to the place where you think you're sending it. If the site you went to has a certificate, then your browser will often tell you, "I actually have a certificate from this site," and then you look at the hierarchy of trust. So you see that the certificate is signed by the California Certificate Authority, which is then certified by the U.S. Postal Service or something like that. And you say, "Well, okay, that looks like a chain of trust I can believe in."
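The hierarchy of trust can be modeled as a walk up signer links until you reach an authority you already trust. The names below echo Greg's example and are entirely fictional; real validation also checks signatures, expiry dates, and revocation.

```python
# Toy model of a certificate chain of trust: each certificate names its
# signer; validation walks upward until it hits a trusted root, or fails.
TRUSTED_ROOTS = {"U.S. Postal Service"}  # roots you already believe in

# subject -> who signed that subject's certificate (fictional chain)
SIGNED_BY = {
    "www.example-bank.com": "California Certificate Authority",
    "California Certificate Authority": "U.S. Postal Service",
}

def chain_is_trusted(subject):
    """Follow signer links; True only if they end at a trusted root."""
    seen = set()
    while subject not in TRUSTED_ROOTS:
        if subject in seen or subject not in SIGNED_BY:
            return False  # a loop, or an unknown signer: no basis for trust
        seen.add(subject)
        subject = SIGNED_BY[subject]
    return True
```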
GJ: That's what this X.509 architecture is trying to do.
JB: Well, Greg, listening--you know, we could go on just a very long time. I noticed, I think we're only through, what? About half the questions we had talked about today.
HS: Right. I believe we're over the edge of the broadcast here now.
JB: Yes, we are. So Howard or Greg, a final comment or question?
HS: I'd just like to point out that when I asked Greg about CDSA, he said, "Well, you know, I don't know a great deal about it." I think that points up the fact that the technology in this whole area is changing very, very rapidly and new stuff is popping up every day. Even people who are involved in this thing quite deeply discover that it's quite difficult and challenging to keep up with this stuff. So no matter how many of these broadcasts we do or how many experts we bring out here, you ought to be aware that the technology changes very rapidly. So I urge you to tune in the next time we do something on security. I'll bet that lots of things will have changed.
GJ: Let me just say a word on top of that which is there's a great temptation to wait until everything settles down and there's a perfect right answer. And if we do that, we'll wait forever. The trick is to be taking steps to make things better, even if they're not perfect.
JB: And with that, let me just close and thank all of our listeners and the people who sent in questions. And please send follow-up questions as well to email@example.com.
Also be sure to mark your calendars for the next TechTalk event on November 19, two weeks from today, at the same time. This session features Ted Hahns and the Internet2 project. For all of you who tried to visit and participate in this event in September, this is the rescheduled session -- the one where we ran into technical difficulties. As you can tell, we have solved, or at least addressed, those difficulties and things are going well.
Thanks to all who helped make this possible today: the Board of CREN; our guest expert, Greg; Howard, technology anchor; Paul Bennett at UM Web Services for the encoding; and all of you for being here. You were here because it's time. 'Bye, Howard.
HS: Thank you, Judith. 'Bye, Judith. 'Bye, Greg.
GJ: 'Bye, Judith, 'bye, Howard.
JB: 'Bye, all. Take care.