JB: Welcome to the CREN Expert Event Webcast for late spring 1998, and to
this session with Mark Bruhn on Privacy, Security and Handling Academic Business. You are
here because it is time to discuss the leading core technologies in your future. I'm
Judith Boettcher of CREN, one of your hosts for today's session. Our co-host today is Ken
Horning, who is filling in for our regular co-host, Greg Marks. Thanks for being here,
Ken, and welcome. Thanks for filling in at the last moment.
KH: It's my pleasure. It's a good thing I was fortunate enough to have
done one previously, so it wasn't totally a new thing to me.
JB: That's right, you were all trained, right?
JB: Our guest expert today, as I mentioned, is Mark Bruhn from Indiana
University. Mark is well-versed in network security and is currently in charge of
Technology Policy Development in Education, in the Policy Office of the VP for Information
Technology at Indiana University. That is a long title, isn't it, Mark?
MB: Even longer with the "acting" part in there.
JB: Thanks for being here to talk about one of the really hot topics. Is
there anything of particular interest that has happened recently about privacy and
security issues on your campus?
MB: Well, I have to say, business has been really good, and mostly, that's
bad. We're winding down now because the spring term is over here, but we still have some
technology abuse and security cases pending with various agencies -- the Dean of Students
office and the University Police Department. We even have a couple at the local
prosecutor's office, in fact, related to outsiders sending illegal material via e-mail to
a couple of internal users.
Other than that, this is the time of the year when people
kind of relax a little bit. The students head home for break. The ones that are in
summer school (since both sessions are a little bit shortened) don't have much time
to do much of anything else. So it is a little quieter here now.
JB: I just remembered, I saw in the paper that this is the time of the year,
like you say, that young people do weird things, and there was one place where the seniors
played a practical joke and let loose a bunch of roaches into the school. Did you all see
that?
Anyway, Ken, would you like to remind everyone how we're going to be handling
questions today for our session?
KH: Yes, thanks, Judith.
Before we launch into our discussion today, I'd
like to remind everyone as to how we will handle your questions and feed them to Mark and
get them into the archive RealAudio event that people can access later.
As opposed to
previous expert events where we did try and take live telephone calls as well as e-mail
questions, we're now doing it just via e-mail. We invite you to send your questions to
Mark during the Webcast this afternoon via e-mail by addressing your e-mails to
firstname.lastname@example.org. Mark will respond to many of your questions during this session, but as
is usually the case, there's not time to answer all the questions that have
come in. But Mark will also be responding to those questions via the website, so you can
check the website later and, as well as finding this archived event, find additional
questions and answers that may not have made it into this afternoon's session while we're
live on the net.
Let me also remind you that if you or any of your information technology
friends may have missed this afternoon's Webcast for some reason, they can pick up the
archive session at the CREN website. That's http://www.cren.net, and the transcripts from
the earlier Webcast series are also available up on the site.
JB: Thank you very much, Ken, and thanks, too, for remembering the e-mail
that I almost forgot to remind you of. You're really good at thinking on your feet here.
MB: Pardon me if I interrupt, but at least one of you guys is also
receiving those e-mails?
KH: Yes, I'm hoping to be the person who accesses the e-mails as soon as
they come in and looks for an appropriate place to get them into the discussion.
MB: That'll be just fine because my laptop battery light is blinking in
my eyeball as we speak, so I don't know how that's going to last.
KH: Let's hope the battery on your phone is better than that!
JB: That means that you'll be able to really concentrate on what you're
saying, Mark, right?
Let's go ahead and start our session on this topic today. I find
that it's a topic that more and more people are talking about. Let's perhaps note that
when we're talking about privacy and security, we often find that
it's really a trade-off between the two. I find that I often give up some of
my privacy in return for convenience and speed. And I often find myself giving
information on the Web at a site so that site will remember who I am, for example. What
comment do you have, Mark, about that particular aspect of the privacy and security
trade-off?
MB: Well, it's actually very simple. This is actually a simple answer to
a pretty complicated situation, but simply put -- you just have to be careful. You have to
know who you're dealing with on the other end.
I assume, perhaps wrongly, Judith -- I don't
know -- that you wouldn't buy something from the trunk of someone's car at the mall. In
the same way, you don't interact with some Internet site that you don't know anything
about. If you belong to a CD club like BMG or Columbia or any number of those kinds of
things, you're pretty much aware of the relationship that you've had via mail. You know
what they're going to do with the information for the most part, and you've built kind of
a trust with that agency and know what they're going to do.
But if you go to a site that
you don't know anything about, you need to know what it is that they're going to do with
the information that you're going to provide. If it's a secure website -- that's easy to
tell because the little key or the little lock (depending on what you're using) at the
bottom of your screen will be intact. And that will tell you that you're
communicating with that site in a secure manner.
The question that people have after
that, though, is what happens to the information that those people collect -- that that site
collects from you after it leaves that website, after you put it into that Web form?
That's something that you're going to have to be proactive in finding out. Maybe it says
it on the webpage. If you can find their policy somewhere on the webpage, make sure
that it says that they will only use the information that you provide for the purposes for
which you're providing it. They won't sell it, they won't market it, they won't inundate
you with unrelated material unless they give you an opportunity to ask for it. You
have to be proactive about finding out what that site is all about. You want to ask them,
if you get an opportunity -- and I guess I'm suggesting you make the opportunity -- what they
do with the data.
If you're going to give them your credit card number, if you're going
to supply them with personal information about yourself, then ask them how they store that
information in the background. If they take it off the secure Web server and they put it
into a database that's open and visible for the network, that kind of defeats the purpose.
So you should know.
Another thing that people worry about are these cookies. There's a
lot of concern about the fact that you visit a website and that website virtually writes
things to your desktop, to your PC drive. Well, that's generally okay, because it will be just
exactly what you supplied. The next time you go to that site, you conveniently want them
to know who you are and what your interests are -- especially if you go there often. If you
do go to a music club page, for example, and you're a fan of jazz music, you supply that
when you go into their site, and of course, they're going to tailor that site to meet your
interests. And next time you don't want to have to say, "I'm interested in jazz again."
It'd be really neat if you could go there and they know who you are, and that's okay,
because this idea that someone can really invade your privacy via that mechanism isn't
well-founded. We, I'm sure, don't have time to go into it in detail now, but there's some
really good information about cookies, naturally, on the Netscape website. People need to
go there, read about what they are, read about how they work, and I think people will be
more comfortable with them.
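The cookie round-trip described above can be sketched with Python's standard library. The "genre" preference and the values here are made-up illustrations, not anything from an actual site:

```python
from http.cookies import SimpleCookie

# A server's response can include a Set-Cookie header recording a preference
# the visitor supplied (here, a hypothetical music-genre choice).
set_cookie = SimpleCookie()
set_cookie["genre"] = "jazz"
header = set_cookie["genre"].OutputString()   # "genre=jazz"

# On the next visit, the browser sends the stored value back in a Cookie
# header, so the site can tailor its pages without asking again.
returned = SimpleCookie()
returned.load(header)
print(returned["genre"].value)  # -> jazz
```

The point is that the site only gets back what the visitor chose to store, which is why the mechanism by itself is not the privacy invasion people fear.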
JB: Mark, maybe you could tie that back, then, to some of the questions
and some of the areas that we want to evaluate websites on for trust and for privacy and
for those kinds of issues. What does that mean, then, for our campuses and our campus
websites then, when we start interacting with our students?
MB: Coincidentally, I met yesterday with various people, most importantly
representatives of the Indiana University Treasurer's Office, related to departments
wanting to collect -- and some of them already are collecting -- credit card information
for various things on their departmental websites. My task (I always seem to leave meetings with
action items) from that meeting is to write a narrative of the circumstances under which
it's okay for a department to do that, and then the Treasurer's Office will take that.
They will obviously review what I've said there, and that will become what they hand to
departments when those department heads ask them if they can do that on the Web, or if
there's a better way to do that on the Web and what-have-you. And the Treasurer's Office
will then adopt that as a financial policy, so if a department does want to
do that, there's going to be a list of things that they're going to have to take a look
at: physical security, logical security, encrypting the traffic, placement of the
database server, placement of the Web server, all of that stuff. They will have a list of
things that they really need to consider.
That will also eventually get into the
hands (because it is a University policy at that point, if it gets that far) of the
internal audit department. So when the audit department schedules a review of a certain
department or a departmental function, they will know that that's a University policy and
one of the questions I assume that they'll ask that department is, "Are you collecting
credit cards on the Web?" If they are, then the auditor will pull out that document and
say, "Are you doing it this way?"
That's one of the ways that we're going to make sure
that's done as safely as we can within the University environment. We're looking for
other formal ways to do that, more -- I guess elaborate's not the right term, but we're
going to try to put that in the hands of agencies that are there now, in place, and have
experience in handling that kind of thing on the net.
One of those, as an example, is
CyberCash -- with that, the credit card information, in fact, will not reside on an Indiana
University server. It will reside on that external agency's server, and then they will
interact with the financial institutions and what-have-you, and we won't have to do that.
We'll just see the money come in the other end magically, I guess. So that's how that's
going to work.
But a goal of the security office within the Vice President for IT's
office here is to establish good templates, I guess -- good configurations, good top-ten
ways to better the security on specific platforms. We've got some of that on the security
office website right now, where you can go there if you -- I think Linux is one that's up
there, because the guy that's putting this together just happens to be a Linux user, so
that's probably why it's there first. But they can go there and they can look: "Well, I
have a Linux box on my desk, and these are the things that I really need to pay attention
to." And we want to do that for as many of the most common operating systems and hardware
platforms as we can of those that we have within the University environment.
So we want
to tell people the best way to be and help them get there. We're not going to be able to
go into the field and do that for them. Obviously, we don't have the staffing to do that.
In fact, we don't even have the expertise within the security and policy areas to do
that. We have to rely (even ourselves) on technical staff -- technical gurus, experts
around us within the computing department and within the office of the Vice President.
We're going to help them do that as best we can.
KH: And we'd like to help our listeners this afternoon get their questions
to Mark here on the virtual expert event sponsored by CREN. You can do that by sending an
e-mail message to email@example.com. And we'll be looking for your e-mail and ready to feed
your question into the discussion.
Mark, I'd like to ask something: Do we have any
understanding of what percentage, if you will, of security breach attempts are done just
by people who find joy in making the attempt (especially successfully), versus the people who
actually make those attempts with malicious use of the data in mind? I hearken back to
about a year ago, when you were here in Ann Arbor to tape your portion of the CREN virtual
seminar. The first time you came, you had to dash off and go back to Indiana because
you had had a security breach that I think involved releasing some social security numbers
in a public place.
MB: And I appreciate your reminding me of that incident. That's something I
would much rather forget! A percentage? No, I don't think we can put a number on it.
KH: In that instance, do you think the intent of getting that data and
releasing it or publishing it was just to say, "Hey, look at me! I cracked this
system and I did it!"? Or was the perpetrator's intent to really make some sort of
illegal use of those numbers?
MB: This particular incident is an interesting one, and let me just very
quickly review what that was all about. One of our departments had created a faculty
information database. It was going to be used in support of those faculty members putting
together grant proposals. It had information about their interests, about their research
areas, it had office address, office phone number in there. It had social security number
in there. Various other pieces of information like that. It was inadvertently visible to
the network. It was in a directory where it should not have been, behind a gateway that
should not have allowed access outside of that department.
An individual who describes
himself as a privacy expert, an Internet entrepreneur and a talk-show host -- I'm not sure how those
three go together -- apparently the talk-show hosting was done by ham radio, which I'm not
even sure is legal. In any case, and I'm going to try to stay away from --
JB: Multi-talented person, in any case!
MB: Yeah, in any case, I'm going to try and stay away from rendering my
very best (inaudible) opinion about this person, but let me just say that what this person
was doing, and is still doing, by the way, is running Net searches using various search
engines for strings like "SSN =" or "Social Security Number" or anything like that. And what
he found was this file -- this faculty information database on one of our servers. What he
did subsequently, then, was copy that data out of that location -- which was visible to
the Internet, yes, but took a bit of doing to get to -- and publish that list, I think in
its entirety -- I'm not sure at this point, I don't remember -- on his website, which
obviously raised its visibility to the Internet community much, much higher than it was before.
And then he started sending notes to the University President. He sent notes to some of
the faculty members in that list. He called some of those faculty members in that list at
their office and at home. I asked him to take the list off. He's refused. He got a
bunch of complaints. (I'm putting that lightly!) Complaints from faculty members here
about his methods. Nobody questions his privacy motive, if that's all it was, because
certainly we needed to know that that information was there that way, and we needed to
have it removed. And obviously, that was the result of that incident.
But I think his
purpose in that was -- I guess I really need to be careful here. I don't think his motives
were purely as a privacy advocate. I think there was more to it than that, and I'm not
even going to go any further than that. If that was the case, then certainly all he had
to do was call somebody here and tell us that that material was there and we really needed
to get it off of there, and certainly we would have done that right away. But he did then
take that and put it on his website. He made such a big deal about it. He wrote many,
many different major media outlets and told them about it. It was in the local papers.
It was in the regional papers. It was on the AP wire. There was a little piece about it
on Dateline on network television. There was a little snippet of it in the USA Today
newspaper. He did what he could to make sure everyone knew that he had found that thing.
JB: Which puts another light on the whole issue of privacy. If one makes
an unwise decision or even an unwise placement of content on the Web, it can easily be blown out of
proportion.
MB: Absolutely. We would much rather have just taken that off. And of
course, it wasn't supposed to be there. I mean, the bottom line was that it was
not supposed to be there, and we never said that we didn't have any fault in that. It was
just the way that it was handled.
Now, to answer the overall
question about the motives of the people that are attempting to access this kind of
information or hack into systems: it varies very widely. You would think, within certain
environments, there's not going to be much of anything useful there. They're going to try
to break into that, and maybe once they discover that there's nothing useful there,
nothing that they can actually use, they'll go away and they'll try something else. The
people that would be doing that for self-glorification will still do it and they will hold
that pelt above their head and say, "Look, I broke into this institution." Those people
don't do it for financial gain. They'll do it for personal ego gain, but there are those,
then, of course, that search out things that they can make use of. If they find a bank of
credit card numbers, then they want to use those. If they find ATM card numbers with
their PINs, they want to use those. So really, you can't say that 80% of them are
for ego building and 20% are for personal gain because it just depends on where you are.
KH: Which is why we have to treat each event seriously.
MB: Correct. You have to take that back, and a lot of times, when we do
that, we are still not able to find out why that person did what they did because we
cannot get that information from them.
JB: Perhaps, Mark, that takes us into an area that we did want to talk about
today because of all of your expertise in it, and that is the whole issue of developing
technology policy on our campuses. What kind of content, in fact, does go up on the Web?
How do we, in fact, alert people to the sensitivity and the dangers that are there? Would
you like to talk about how we get that kind of awareness out to the campus communities and
how do you go about developing technology policy to address these issues?
MB: More and more, we are understanding that it's important that someone
be around to do that full time, and I'm talking about policy development and education at
a level within the institution, within the university, that has some reasonable amount of
authority. The technology changes so rapidly, and the things that people can do (and will
do, by the way) multiply just as fast. Someone has to be around to evaluate the
environment and the climate and coordinate the setting of the rules. And, of course,
publicize and educate the community on the rules and standards of behavior, expectations,
and so on. There are access-eligibility questions. There are appropriate-use questions.
There are legal questions that come up all of the time. Someone's going to have to look at
these things and make some determination about how they impact our users.
JB: Indiana University is a fairly large university, Mark, and you said
that you were in charge of policy development. I think when you and I talked, you also
mentioned that there's also a security office. Both of them are fairly high level within
your campus. Would you like to mention a little bit about why that's important? And
also, is there something that we can say about providing guidelines for smaller campuses
in this area?
MB: Sure. It's generally the case that the people setting the policies
shouldn't be directly involved in their implementation. It's kind of a
separation-of-functions thing. Basically, the fear would be -- and I'm making general
statements here, obviously; the people actually involved may prove this nonsense -- that the
policy makers will take the easiest path if they then have to implement what they mandate.
Does that make sense?
MB: For example, one of the things that we're struggling with here -- and this has
not been stated this way yet: no device can be connected, and no person can connect, to
the Indiana University network without authenticating their identity.
That's a can of worms. That's a large can of
worms, in fact. A security person knows that that's a large can of worms and very
difficult to implement technically. But, if you step back from the implementation, if you
step back from the technical difficulties, that's a very reasonable policy to have.
That's a very safe thing to do.
So the example there would be, certainly, that if a
security person is responsible for making that kind of statement or that kind of
policy and then has to turn around and implement it, they're maybe going to be
dragged kicking and screaming toward making that policy statement to begin with. Whereas
a policy person -- a separate policy person that's involved in setting policy does that at
the institutional values-and-goals level and doesn't have anything to do with the
implementation. There are people out there right now that are jumping up and down, I'm
sure, wanting to throw pencils at me for making that statement. But a lot of times, if
you set a policy and a particular technology will not support what that policy requires,
you should not be using that technology. Now they're picking up heavier things and
wanting to throw them as well!
JB: Before, you were being optimistic, as if they were only going to throw
pencils!
MB: Yes, but you understand what I mean. The idea is that you don't set
policies based on the technology -- you choose technologies to implement the policies. You don't put your
policies in place based on the technologies that you have available. And that's not
popular everywhere, but that just happens to be my opinion.
It's the same kind of
thing -- I guess it's analogous to giving check-writing approval to the same person that
actually submits that batch or submits the job that's going to write that check. You
don't want the person approving it to be the person that's putting it in, because
obviously, somewhere along there, a person with the inclination would have the opportunity
to do bad things in that process. It's not exactly the same thing, but I guess you get
the idea.
JB: Let us go back, perhaps, to that statement that you made that is
something that the policy office, in fact, might mandate. And that would be that no one
can connect to the campus backbone without authenticating themselves. Certainly, on one
level, that sounds a little bit like, "No one can come into my house unless I've given
them a key."
MB: I've heard that before. Exactly.
JB: So certainly, then, is that a policy that is now in place or going to
be in place at IU?
MB: Well, we have at this point a list of recommendations that the
Vice-President's going to have to evaluate, and that statement is in there. We
are recommending that the policy be issued -- that use of any device on the network
requires some form of authentication. In addition to that (or as a caveat to that, I
guess), there are some instances where that authentication would be supplying a guest
account or a public account or something like that. And in those cases, they're still
authenticating. They're on a device that we know provides service where that's okay, and
they can't do anything else. So they're still authenticating, but we know where they're
doing that, and we can find out when that thing happens.
JB: And there are fences and walls around that.
MB: Yes. It's a security bubble, I like to call it, around that particular
function, and then they can't do anything else.
One of the major problems that we have is
people using devices on the network that don't require them to supply any credentials at
all, doing things -- sending hate mail or posting inappropriately to internal or external
websites, that sort of thing. So we just want to know where that stuff is coming from.
And we're not going to track that stuff. We're not going to keep that stuff around for
long. What we want to be able to do is, if we get a complaint, then we're able to
evaluate that and we know where that thing came from, and then we can hand that to another
university agency to decide if a sanction should be levied and what that might be.
JB: Maybe we ought to look at one of these questions that we had thought
would be a good one. That is, just what is the impact that you're finding and that we're
finding generally on technology policy and security because of the explosion in Internet
access by the overall general public?
MB: Let me put it this way. Many more settlers -- which is a really good
thing, I think, but many more gunslingers. We get many, many more reports now about
anonymous and inappropriate mail coming into our users from anonymous mailers, various
ones that I'm not going to name here because anybody that's in this business knows exactly
what I'm talking about.
And also about mail leaving our technology environment destined for those
users elsewhere. I'm not saying it was real quiet before, mind you, but obviously, the
more people that are using a particular medium -- that medium in this case called the
Internet and the Web -- the more interaction problems occur.
A key point with this, and
maybe an obvious one, is that locally we have (and I'm sure they wouldn't like to be
referred to this way) basically a captive audience here. If we get a
complaint or if we identify someone using our technology resources inappropriately within
our environment, we can target education toward that person or those persons. But if a
person external to our environment, using an anonymous mailer or posting things to a
website, is not a member of our community -- the only thing that we can do is whine,
basically, and complain to that service provider. Maybe they do something and maybe they
don't. Things come to mind. We get an unbelievable amount of mail coming into our
environment that is pornographic in nature, obviously unsolicited electronic mail. Some
of the people that get this here are just flabbergasted that they would be targeted for
such a thing. They're not really targeted individually for it. There may be 4,000 other
people within the environment that got the same message, but that doesn't make it any less
significant for that one person. So we get those complaints.
The fact of the matter is
that we can educate our users as much as we can spend time doing it, but we have families
now that are on the Internet. Within families, you have teenage children. Those teenage
children are either -- how can I put this? Either experienced in the wrong way or not
experienced at all. Now, I have to say that we've come across a lot that know what the
environment is all about. They've been taught correctly, if you will. We touch upon
those on occasion, but the majority of the ones we see, unfortunately, are the ones that
either don't know what they're doing or have learned bad habits. So we come across
those -- students or children of IU students -- on occasion. Unfortunately, the
parents will sometimes share those accounts with their children, and then we -- if
something's done inappropriately with that account, who do we go after? We go after the
account holder and we say, "Something happened with your account, and you know you're
responsible." Once we get into it, they're very surprised to know that their son or
daughter was up at 3:00 AM posting some abusive language to some site somewhere. In
addition to the anonymous e-mailers, many, many websites now allow for anonymous postings.
We talked about a couple of these the last time that we gathered like this. We've gotten
involved in a couple of others recently where users post things about other users at IU
and, of course, the users here take offense. Or IU users post things about someone else
and they take offense at what was said about them -- even to the point where legal action
has been threatened. The newspapers a lot of times have discussion groups about local sports
teams or racing or whatever it might be. That's an opportunity for some of these
gunslingers, if you will, to post things inappropriately to those sites. And then, of
course, they turn around and complain that one of our users was involved.
JB: Mark, what about in terms of protecting us from some of those kinds
of things? How do you go about providing gating functions on campuses so that you can, in
fact, provide for this authentication? Do you have rules in your public labs, for
MB: Well, yes and no. We don't allow mail to be sent from our public
labs directly. It has to go through our central PO servers, our central mail relay
servers. So we know where those messages originate, and we know they have to go through those
servers. They can't send mail directly from those devices, for one thing. Now generally,
in our PO machines or central mail relay servers, we don't allow mail relay. That is,
someone external to IU can't send a piece of mail through our central mail relay servers
to someone else outside of IU. Does that make sense?
MB: We do let external people send to internal folks, of course. We
let internal folks send to external folks. The problem that we get is when we have an
internal person at home, dialed into their ISP, for example (their Internet Service
Provider), sending mail to an acquaintance somewhere else not at IU and they've got their
laptop set to go through IU. That obviously causes a problem because that becomes
external to external.
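The relay rule just described -- accept a message only when at least one end of it is local -- might be sketched like this. The domain names are illustrative, not IU's actual configuration:

```python
# Sketch of a no-open-relay check: external-to-internal and
# internal-to-external mail is fine; external-to-external is refused.
LOCAL_DOMAIN = "indiana.edu"  # illustrative local domain

def _domain(address: str) -> str:
    """Return the domain part of an e-mail address, lowercased."""
    return address.rsplit("@", 1)[-1].lower()

def _is_local(address: str) -> bool:
    dom = _domain(address)
    return dom == LOCAL_DOMAIN or dom.endswith("." + LOCAL_DOMAIN)

def relay_allowed(sender: str, recipient: str) -> bool:
    # Refuse only when neither side is local (the open-relay case).
    return _is_local(sender) or _is_local(recipient)
```

For example, `relay_allowed("friend@aol.com", "prof@indiana.edu")` passes, while two external addresses on both ends would be refused.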
One of the things that we can't do is block mail from sites that
are notorious for sending bulk mail -- unsolicited mail, UCE, UBE, spam, whatever you
want to call it. Because a lot of times, we're going to lose legitimate mail from those
sites. For example, there's no way that we can block mail from AOL.com. That is just not
going to work. We can't block mail from Hotmail.com or Juno.com or AT&T or Worldnet.com,
or Netcom.com. We can't block mail from those major service providers because our users
have legitimate reasons for interacting with users of those services.
So what we have to
do is we have to be very specific. We have to get down to more qualification. For
example, there wouldn't be any reason for someone dialing into the AT&T modem pool to
access the AT&T service to use that modem pool to send mail to our site. So we can
identify that address and we can block mail from that address. We're not blocking mail
from AT&T, because that's the way they should be sending it anyway. So we do that. If we
recognize a particular address is inundating us with junk mail, unsolicited mail, we
filter that address out. And if it's a very small site -- if we don't see much of any
other kind of mail from it, and all we get from that site is unsolicited bulk mail --
we'll block the whole site.
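The graduated filtering described here -- block a specific abusive sender, or a small all-spam site, but never a major provider wholesale -- could be sketched as follows. Every address and domain in this sketch is invented:

```python
# Graduated mail filtering: specific senders first, then whole small
# domains, with major providers always exempt from wholesale blocking.
BLOCKED_ADDRESSES = {"bulkmailer@tiny-spam-site.example"}  # specific abusers
BLOCKED_DOMAINS = {"tiny-spam-site.example"}               # small all-spam sites
MAJOR_PROVIDERS = {"aol.com", "hotmail.com", "juno.com", "netcom.com"}

def accept_mail(sender: str) -> bool:
    dom = sender.rsplit("@", 1)[-1].lower()
    if dom in MAJOR_PROVIDERS:
        return True   # too much legitimate mail to block these wholesale
    if sender.lower() in BLOCKED_ADDRESSES or dom in BLOCKED_DOMAINS:
        return False  # known abuser or a site that sends nothing but spam
    return True
```

A real mail system would key such rules to connecting IP addresses rather than sender strings (which are forgeable), as the discussion of headers below the joke makes clear; this is only the shape of the policy.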
KH: And it's a domain called Get Rich Quick.com.
MB: Well, you know, except sometimes that doesn't work either. You've
got to really analyze the headers. And this is an important thing, because you can send
complaints and nastygrams to the wrong service provider. You really need to analyze the
headers, because they can use whatever name they want. You just have to make sure that
you've identified the right Internet address in that way. So you've got to know how to
read the headers to make sure that you're not flaming the wrong provider for allowing
their users to do something like that.
And I have no idea whether I answered your
question, Judith, so you can remind me.
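The header analysis Mark describes can be sketched in a few lines: walk the Received: headers from the bottom up (the bottom header is the earliest hop) and pull out the bracketed IP address from each, ignoring the friendly hostname, which a sender can forge. The sample headers here are invented for illustration.

```python
# Minimal sketch of tracing a message's path through its Received: headers.
# The hostname before the brackets can be forged; the bracketed address
# recorded by the receiving relay is the one to act on.
import re

def received_ips(headers):
    """Return the bracketed IP from each Received: header, earliest hop first."""
    ips = []
    for line in reversed(headers):  # bottom header = first hop
        m = re.search(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]", line)
        if m:
            ips.append(m.group(1))
    return ips

headers = [
    "Received: from relay.example.edu (relay.example.edu [192.0.2.10]) by mail.example.edu",
    "Received: from friendly-name.aol.com (unknown [203.0.113.99]) by relay.example.edu",
]
print(received_ips(headers))  # ['203.0.113.99', '192.0.2.10']
```

Here the earliest hop claims to be an aol.com machine, but the recorded address belongs elsewhere, so a complaint to AOL would go to the wrong provider.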
KH: Before you do, Judith, let me remind our listeners that we are well
past the halfway point in our broadcast. We have a little less than 10 minutes
left, but there's still time for you to send your e-mail questions in for Mark.
JB: Thanks for the reminder, Ken. We've got all kinds of questions
that we want to ask Mark, but we would also like to incorporate a few more from our
audience. Mark, what about this question here? We've talked a little bit about the
public labs and what you do or don't do there. Is there something that campuses should do
in dealing with the local ISPs? Is there a way that the campus community and the local
ISPs can work together to help with security strategies?
MB: Well, we and the local ISPs are in an interesting situation, because we
don't consider ourselves an Internet service provider. We provide computing and network
services to members of our community for the purposes that they are here. So you're not
connecting to Indiana University only to have Internet access. You're doing that because
you are involved in research or in learning or in instruction or whatever it might be.
But we do have information about the users of our resources, and there is a reasonable
expectation on their part that we're not going to use that information inappropriately.
In other words, we're not going to sell it and we're not going to hand it over willy-nilly
to whoever asked us for it.
The ISPs are in a similar situation, because the people who
are connecting to their services are either paying for that service, or they are using
the service and the ISP is reaping the benefit of advertising that the user sees.
In either case, it's a business for them. They need to make sure that the users are
comfortable with that service.
The point I'm getting to is that having a free exchange of
user information between Indiana University, or whatever university, and a local ISP is
an iffy proposition. We're not going to be willing to share perhaps the information that
they want about a particular user, and they're not willing to share information about
their particular users. Our motives may be different, but the answer is the same.
Obviously, they carry some liability in the privacy area, but they're trying to
maintain a business. They're trying to make money. If they become known to willy-nilly
release information about their users to other outfits, then obviously someone's going to
go somewhere else.
There are plenty of other service providers. So establishing a
relationship between a public institution like IU and a local ISP is an interesting one.
The response that we generally get from them, by the way, is, "Well, we'd really like to
help you out, but we've got to protect the privacy of our users. Absolutely, we'll give
you the information that you're asking for if you hand us a court order." We've gotten
that response many, many times. And in a couple of situations, in fact, we said, "Okay,
we're going to do that," and we've got a couple of those, as I said earlier, down in the
Monroe County Prosecutor's Office.
JB: And these are instances, just to clarify the situation here, Mark, where
there have been inappropriate messages coming from the local ISPs into the campus
community, is that correct?
MB: Correct. And in these two particular situations, "inappropriate" is
way too light a term.
JB: My mother taught me not to use some of those other terms.
MB: Obviously, if we're going to go downtown and start talking to judges
about forcing these outfits to supply information, we're looking at some pretty serious
stuff. With some of these things, we kind of weigh what it's going to take us to do it,
and we talk to the individual here who's involved. If they want to pursue it, then we'll
help them. But if they're wishy-washy, then we'll just write it off as
experience and press on.
So that's kind of iffy. But we have had conversations with
managers of our local ISPs, and they understand that we are here and we understand that
they are there. They understand that our users are not all happiness and light and don't
all do the right things, and we understand that theirs are not, either. But the
relationship is tenuous at best, because they've got to protect their interests, and
certainly we've got to protect ours.
JB: Particularly as the numbers and the communities that are using these
new technologies continue to increase, I think our challenges are going to continue as
well.
MB: You can't wander around on the Internet any more without bumping into
somebody. If they're bad, that will spill out. If they're good, that will spill out.
KH: Mark, we do have an e-mail question from Fred Hurst, who is the
Executive Director of the Florida Public Post-Secondary Distance Learning Institute down in
Ft. Myers, Florida. He asks, "Many distance learning Web clients cannot function behind a
firewall. What strategies may be used to protect institutional data while allowing
legitimate uses of the new technology?"
MB: That is one of the reasons why firewalling is a very difficult thing
to strategize about and implement, because that's not the only situation where that's
going to cause a problem.
We have implemented hereabouts very limited firewalling
techniques. We don't have a machine that we can point at and say, "That's our firewall."
We do TCP wrappers on services that have to be provided in and among the
networks within an institution. Obviously, you have to map those networks out, determine
what services must be protected at what level, and then place the protections accordingly.
Having a firewall at one point on your security perimeter (if I
can use my Air Force-remembered jargon) -- one firewall limiting access to and from your
network -- probably is not going to work. A corporation that has very limited interaction
with the outside world may be able to do that very nicely, but for institutions like
ours, which have much, much more communication with like institutions, with governmental
institutions, with laboratories, you name it, that approach is probably not going to work.
However, what you need to do is define specific security perimeters. If you
have an administrative computing complex, for example, if you have one or two machines
that support the business of the university, including student records and student
registration and those sorts of things, then put a perimeter around those and make sure
that those are firewalled. Most times, there's no reason for anyone outside of the
university or the institutional environment to access that information, so don't allow it.
Just put a perimeter up there.
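A perimeter like the one Mark describes might be enforced with a router access list in front of the administrative machines. This is a hedged, Cisco-style sketch; the addresses are documentation placeholders, not real IU networks.

```
! Hypothetical perimeter filter for an administrative host.
! 192.0.2.0/24 stands in for the campus network and 192.0.2.200 for an
! administrative machine; both are placeholder addresses.
access-list 101 permit ip 192.0.2.0 0.0.0.255 host 192.0.2.200
access-list 101 deny   ip any host 192.0.2.200
access-list 101 permit ip any any
! Only on-campus sources reach the administrative host; all other
! traffic passes unaffected.
```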
JB: Mark, you mentioned a term somewhat new to me. You called it putting a
"TC wrapper" around a server?
MB: TCP wrapper. What that does is put an IP address filter on the
machine -- a per-host, per-service restriction. So on a UNIX host, for example,
you can limit the source IP addresses for which FTP is available, or Telnet access, or
whatever other services that you want to do. (I just got to the very edge of my knowledge
in that area, by the way.)
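The per-host, per-service filtering Mark describes is configured, on a classic TCP-wrapper setup, through the `hosts.allow` and `hosts.deny` files. This is an illustrative sketch with a placeholder campus network, not IU's actual configuration.

```
# Hypothetical /etc/hosts.allow -- permit FTP and Telnet only from the
# campus network (192.0.2. is a placeholder prefix).
in.ftpd:    192.0.2.
in.telnetd: 192.0.2.

# Hypothetical /etc/hosts.deny -- refuse every wrapped service from
# anywhere not matched above.
ALL: ALL
```

The wrapper checks `hosts.allow` first, then `hosts.deny`, so the net effect is "campus only" for those two services.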
JB: But the situation with distance learning in particular does create
challenges that are difficult, because many distance learning students, for example,
would have IP addresses from all the places that you mentioned earlier today --
AT&T WorldNet and AOL and all the rest. Is there any way
to -- given that kind of client population, what are some strategies that you might
recommend for ensuring security and protection of data?
MB: You have to use other authentication mechanisms. You can't use network
topology and you can't use filtering to take care of that because of what you just said.
Not only might a particular student be accessing the class materials on the Web at a
particular site from AT&T connectivity; two days later, they might be accessing the same
material as they wander around Canada from some other ISP.
So you can't say to a student,
"Okay, where are you going to be coming from? That's the only place where you're going to
be doing that from."
Sometimes you can do that, because if IU Bloomington is producing
materials and providing classes for students at IU Purdue/Fort Wayne, then you know that's
where they're going to be and you can use that kind of technology. If not, you're going
to have to make sure that you use other authentication mechanisms -- a secure user name and
password, depending on the materials. If it's fairly sensitive proprietary materials,
that may not be enough. You may have to issue them some token, some password. You may
want to use something like Kerberos to make sure that the password traverses the network
encrypted. You just have to take additional steps outside of network topology to take
care of that problem.
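The one-time-password idea Mark raises can be illustrated with a toy hash-chain scheme in the spirit of S/Key: the server stores only the n-th hash of a secret, the user reveals the (n-1)-th, and the server verifies by hashing once more. This is a simplified sketch, not any production token's actual protocol.

```python
# Toy one-time-password sketch (S/Key-style hash chain). A captured
# password is useless for replay because the server's stored value
# advances after each successful use.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def chain(secret: bytes, n: int) -> bytes:
    """Apply the hash n times to the secret."""
    value = secret
    for _ in range(n):
        value = h(value)
    return value

class OTPServer:
    def __init__(self, secret: bytes, n: int):
        self.stored = chain(secret, n)  # server never keeps the secret itself

    def verify(self, candidate: bytes) -> bool:
        if h(candidate) == self.stored:
            self.stored = candidate  # walk one step down the chain
            return True
        return False

server = OTPServer(b"s3cret", 3)
otp = chain(b"s3cret", 2)    # the user's next one-time password
print(server.verify(otp))    # True the first time...
print(server.verify(otp))    # ...False on replay
```

As Mark notes, pairing something like this with encryption of the session (Kerberos, for instance) both protects the data in transit and strongly authenticates the person accessing it.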
JB: To summarize what you were saying, then, one way is to look at your
network topology and the second way is to look at these levels of software authentication.
MB: Yeah. In fact, we use a password token to generate a one-time password,
and we deploy that on systems and applications that maintain sensitive
information about people or about the institution. You can't get to those without that
token. So you can be in Hawaii on vacation, and if you're a payroll clerk, maybe you're
the only payroll clerk or grades recording clerk in that department. You don't want that
stuff floating around across half the world, so we want that data encrypted and we want
those people to be using that card. That does two things. It protects the information
and it also strongly authenticates the identity of the person accessing it. Then you can
drop away some of the other network topology concerns.
JB: I think when you've got a population as varied as some of our distance
learning institutions will have, that will probably be the preferred strategy.
MB: And let me finish off that answer by saying that we have not addressed
that -- the distance learning question, specifically -- here at IU.
JB: Ken, do you have anything else? I've lost track of my time here a
little.
KH: No, we're running a little bit over. I think it is time to wrap it
up.
JB: I want to thank everyone who joined us, and Ken, I'd like to thank
you as well for jumping in at the last moment for Greg Marks. Let me just say again to
remind everyone to take a look at the website and all the references and URLs that Mark
has recommended. And Mark, you had mentioned just before we started the particular place
that might be good for novices to start. Would you mind saying that one more time, and
then we'll do the final close.
MB: From an appropriate policy and usage standpoint, Cornell has some
very good information, and that's listed in the reference list. If you're talking about
appropriate use, that would be the place to start. They have a Policy and Law Center.
They do seminars. Good place to start right there.
JB: Right, very good. Also, we will be taking follow-up questions. Again,
send them to firstname.lastname@example.org. We will be posting them also on the Web.
Also, put the
next two expert event Webcasts on your calendar. The schedule is only a click away from
the CREN homepage. Our next session is one week from today, on Wednesday, June 3 at 4:00.
Our guest expert at that time will be Ken Klingenstein, and he will be taking questions
on the area of Middleware, Authentication, Authorization and Directory Issues. CREN
is pleased to announce the upcoming release of a new virtual seminar called "Creating
Internet2." This seminar will be available in two formats, both on the Web and on CD.
Again, thanks to everyone who made this possible today: the board of CREN; our expert,
Mark Bruhn; Ken Horning; Brian Vaughn at UM Online for encoding; and all of you. You were
here because it's time. 'Bye, Mark.
MB: See you.
JB: 'Bye, Ken.
KH: 'Bye, Judith.
JB: 'Bye, all.