Conference Report: USENIX 2001

The following is Josh Simon's general trip report from the USENIX Annual Technical Conference in Boston, MA, held June 25-30, 2001.


Friday, June 22

Today was a travel day. I managed to get to the hotel without incident — the plane was more or less on time and had no screaming children aboard, and given that it was 5pm on a Friday, getting to the hotel from Logan in only 90 minutes is not bad.

I called a local friend and we went to dinner at Marché. Ate way too much, but had fun anyhow.


Saturday, June 23

Today was a very relaxed day. Hung out in and around the hotel for most of it, with the occasional side-trip to the attached malls (Copley Place and Prudential Center). Did some swimming in the pool and relaxing in the hot tub (where else?), and had a nice relaxed early dinner at Legal Seafood before crashing.


Sunday, June 24

Today started with a bos.motss dim-sum at China Pearl in Chinatown. I hopped on the local T and wandered around Chinatown for a while (and was somewhere between horrified and amused that McDonald's is selling a lobster roll ("Now with more 100% real lobster!")). Met up with the local motssisti who could make it and had a great time pigging out on beaucoup Chinese food.

After dim-sum and a quick trip to the fabric store (someone wanted a bolt of fabric to make a shirt, and we perused the prints and patterns and so on), I went back to the hotel and napped for a while before going down to registration.

Wound up working the tutorial handouts — mainly because the staff at Registration decided to open early and nobody else was there to do it besides me and Moose. When Dan Klein arrived (still before the scheduled open) I went to get more volunteers to help out. Registered myself (and straightened out a problem where I was not registered for the technical sessions — swapped out the badges, got my Proceedings, and had the badge reprinted to use the correct form of my name and email address), then went back to help hand out more tutorial notes.

After working there for a while, chatted with folks at the Welcome Reception and then went out to tapas dinner with Esther Filderman, Bob Gill, and Trey Harris. Taepo, a short cab ride or longish walk away, was a bit expensive but very tasty. We split a few smaller tapas (the duck in a wild berry reduction was very nice indeed) and shared a nice big paella.

After dinner, did some hottubbing and then crashed.


Monday, June 25

Today was a vacation day. In the morning I helped out by handing out tutorial notes before the sessions started at 9 (including walking a set into Eric Allman's Sendmail tutorial since he didn't have time to grab his own set), then doing miscellaneous stuff until a small lunch, an afternoon nap, dinner at Legal Seafood, and hottubbing to close out the day.


Tuesday, June 26

Today I attended tutorial T8, day 1 of Steve Romig's Forensic Computer Investigations course. The first day was a great overview of the topic (I didn't actually take day 2, on Wednesday).

For dinner, I was invited to join the SAGE Executive Committee and the USENIX office staff at Fire and Ice, a place where you choose your meats, vegetables, and sauces and they cook them for you right there on a big Mongolian-style grill. Apparently I'm officially the communal spouse of the SAGE Executive Committee (ask Ellie Young to explain it), and since spouses were welcome at the dinner I got invited. :-)

After dinner I collapsed in the hot tub 'til closing.


Wednesday, June 27

Today was my other vacation day. I spent most of it either wandering around Boston and shopping or napping. In the evening, I attended the regular GLBTUVWXYZ BOF followed by a mediocre dinner (with terrible karaoke) at Clary's Irish Pub. I bailed and went to the hot tub after dinner.


Thursday, June 28

Today the conference itself started.

1. Introductory Remarks and Keynote Address

1.1. Introductory Remarks

The conference began with Dan Geer, the president of the USENIX Association, thanking Clem Cole and Yoonho Park for their work in putting the conference together. Yoonho Park then talked about the general track and Clem Cole talked about the Freenix track:

                          USENIX      Freenix
Papers submitted, 2001    82          58
Papers submitted, 2000    90          61
Delta                     down 11%    down 5%
Papers accepted, 2001     24          27
Papers accepted, 2000     24          27
Non-USA papers accepted   4           8 of 18
Student papers accepted   15          8 of 15
Attendance, 2001          1400+
Attendance, 2000          2000+
Delta                     down 7%

Both track chairs thanked their respective program committees, the USENIX conference and office staffs, the IT and Guru coordinators, the volunteers, their current and previous employers and the attendees.

Following the usual general announcements (meet your session chairs and speakers, where to go for WIP submissions, BOF scheduling, and next year's conference location), the best paper awards were presented:

Following this, USENIX Vice President Andrew Hume presented the USENIX Lifetime Achievement Award (also known as the "Flame") to the GNU Project for the ubiquity, breadth, and quality of its freely available software. [Ed. note to USENIX — pull the actual quote from Andrew, since it vanished too soon for me to grab it.] Robert Chassell, a founding director of the Free Software Foundation, accepted. Andrew then presented the Software Tools User Group (STUG) Award to the Kerberos development team for its secure, scalable, and relatively simple-to-administer suite of tools. Ted Ts'o accepted on behalf of the team and donated the $1,000 cash award to USENIX to be used for student stipends for August's USENIX Security Symposium.

1.2. Keynote Address

Dan Frye, Director of IBM's Linux Technology Center, spoke about Linux as a disruptive technology. The term isn't intended to have any derogatory connotations; rather, the talk focused on how the growth of Linux has disrupted the status quo of how businesses choose IT products. This year alone IBM is pouring $1 billion into Linux development, working within the public development community, because of business decisions (it makes the company money, it makes the shareholders money) instead of purely technical ones.

A disruptive technology is one where the skills, the desire, and an open culture of significant size all meet. The desire for Linux and the openness of the community are well documented. Further, over time, the skills in computing have moved from mainly academia (meaning colleges and universities) to all levels of education as well as hobbyists and even industry, thanks in part to the explosion of games, the Web, the growth in technology, and so on. The increasing commoditization of technology has also fueled the explosion of skills.

IBM believes that Linux as a technology is sustainable in the long term. It has growing marketplace acceptance, doesn't lock the customer into a particular vendor for hardware or software, is industry-wide, runs on multiple platforms, and is a basis for innovation. Linux has become critical for e-business due to the confluence of desire, skills, and the open culture, with an ever-growing community size.

Dr. Frye went on to dispel some rumors about Linux in the enterprise environment:

Myth: Open Source is undisciplined.
Fact: The community is very disciplined, with code reviews, clear assignments, and making sure things are "right" before rolling them into a major distribution.

Myth: Open Source is less secure.
Fact: Open Source is as or more secure, because of the public review and comment to prevent security holes from getting released (or from staying in released code unpatched for long).

Myth: Community doesn't do enterprise features.
Fact: The community wants good designs, but is not against enterprise features. Designing good, scalable solutions — whether for multiple processors (threading code) or different architectures or clusters, or backing up over high-speed devices (networks) — is a major goal of the community.

Myth: The Open Source community will fragment.
Fact: While possible, IBM believes this is unlikely.

Myth: Traditional vendors cannot participate.
Fact: Untrue; IBM is competing quite well, and other vendors such as Compaq and Dell are making Open Source OSes available on their hardware platforms as a customer-orderable option.

Myth: Open Source doesn't scale.
Fact: Definitely untrue. Open Source works on the enterprise scale and in clustering environments.

Myth: Open Source has no applications.
Fact: Open Source has over 2,300 business applications available for it, not counting the many non-business applications that run under the various versions of Linux and *BSD.

Myth: Open Source is only a niche market.
Fact: Open Source OSes are on nearly a quarter of the servers in production, according to Dr. Frye's slide.

Myth: Open Source is never used in mission-critical applications.
Fact: While this may have been true in the past, it's less and less true over time. It should be mission-critical-capable in the very near term.

The IBM Linux Technology Center's mission is to help make Linux better, working within the community. Their URL is http://oss.software.ibm.com/developerworks/opensourcelinux/.

2. Terminal Room

During this time block, I went to the terminal room to read my mail, catch up on news, and do some web development work. After that, it was time for lunch.

3. Vendor Floor

After lunch I visited the vendor floor. Didn't get a lot of gimmes this conference; nothing jumped out at me this time.

After the vendor floor I did some Hallway Track sessions until the final session of the day.

4. Security Aspects of Napster and Gnutella

Steve Bellovin began his talk by describing the many functions common to Napster and Gnutella, and by extension every other P2P network. Without central servers controlling the data, the clients are free to decide what to share and what to keep secret. The protocol itself supplies the index of peers and connectivity information, allowing direct connection from peer to peer without going through any intermediary.

Napster uses a central server that users query for files; it also supplies chat functions. A compiled index keeps track of who has what, at what speed they are connected, and so on. When a user selects a file of interest, he gets connection information from the server and then initiates a direct connection to the peer who is sharing the file. Also available is a "hot-list" function, which lets a user keep a private list of specific users' connection status.
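
To make the central-server model concrete, here's a tiny sketch (my own illustration, not actual Napster code; the names and structures are assumptions) of an index that maps shared filenames to the peers offering them and their connection speeds. The server answers searches; the download itself goes peer to peer.

    # Hypothetical sketch of a Napster-style central index (illustrative only;
    # the names and structures are my assumptions, not the real protocol).
    from dataclasses import dataclass

    @dataclass
    class PeerInfo:
        address: str     # where the requester connects for the transfer
        port: int
        link_speed: str  # e.g. "56k", "DSL", "T1"

    class CentralIndex:
        def __init__(self):
            self.index = {}   # filename -> list of peers sharing it

        def register(self, filename, peer):
            self.index.setdefault(filename, []).append(peer)

        def search(self, term):
            """Return matching (filename, peer) pairs; the actual file
            transfer then happens peer-to-peer, not through the server."""
            return [(name, peer)
                    for name, peers in self.index.items() if term in name
                    for peer in peers]

    # The server only brokers the lookup and tracks connection speeds.
    idx = CentralIndex()
    idx.register("song.mp3", PeerInfo("198.51.100.7", 6699, "DSL"))
    print(idx.search("song"))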

The Gnutella protocol is different in that there is no central server whatsoever. Every user has their own index, which is updated from the indexes of the users they are connected to. This creates a very large network of users connecting to users connecting to users, etc. It is not uncommon for any single user to have up to 10 connections. The Gnutella protocol is an open specification.

The search strength of Gnutella resides in its flooding protocol, wherein a user can effectively reach every connected machine. When a user is searching for a file, he sends a request to all his neighbors, who in turn forward it to their neighbors. When there is a match, the user connects directly to the user who has the file, and the download begins. Aside from basic IP address information, there is no authentication of any type.

The talk focused primarily on Gnutella, and at this point Mr. Bellovin discussed at length the specifics of the Gnutella protocol's five message types: ping, pong, push, query, and query hits.
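
Here's a similarly hypothetical sketch of the flooding search described above: a node answers queries it can satisfy and forwards everything else to its other neighbors with the TTL decremented. The class names, TTL default, and structure are my assumptions for illustration, not the actual Gnutella specification.

    # Hypothetical sketch of a Gnutella-style flooding search (illustrative
    # only). Descriptor types in the protocol: ping, pong, query, query hit,
    # and push; only query forwarding is shown here.
    from dataclasses import dataclass

    @dataclass
    class Query:
        descriptor_id: str   # unique ID so duplicates can be dropped
        search_terms: str
        ttl: int = 7         # hops remaining before the query dies out
        hops: int = 0

    class Node:
        def __init__(self, shared_files):
            self.neighbors = []            # directly connected peers (often ~10)
            self.shared_files = shared_files
            self.seen = set()              # descriptor IDs already handled

        def handle_query(self, query, came_from=None):
            # Drop duplicates and expired queries so the flood terminates.
            if query.descriptor_id in self.seen or query.ttl <= 0:
                return
            self.seen.add(query.descriptor_id)

            # Answer locally; a real client would route a query hit back
            # along the reverse path, and the requester would then connect
            # directly to this node to download.
            hits = [f for f in self.shared_files if query.search_terms in f]
            if hits:
                print(f"hit at this node: {hits}")

            # Flood to all other neighbors with the TTL decremented.
            forwarded = Query(query.descriptor_id, query.search_terms,
                              ttl=query.ttl - 1, hops=query.hops + 1)
            for peer in self.neighbors:
                if peer is not came_from:
                    peer.handle_query(forwarded, came_from=self)

    # Tiny demo: two connected nodes, one sharing a matching file.
    a, b = Node([]), Node(["freenix-paper.pdf"])
    a.neighbors.append(b)
    a.handle_query(Query("id-1", "freenix"))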

Gnutella suffers from the openness of its protocol in several obvious ways. First, there is no authentication of IP addresses, so the network could conceivably be used in a flooding attack: if packets claimed that cnn.com:80 was sharing 10,000 files, there would be a lot of attempts to connect to CNN.com. Also, Gnutella packet headers generated on a computer running Windows 95, 98, or NT can contain its MAC address. This could be used to link requests to requesters and is an obvious privacy violation.

Using a central authority to authenticate, as Napster does, makes it very difficult to fake an IP address. The privacy issues are much more apparent there, however, as the central site could conceivably keep track of every single session for every single user.

The conclusion was that there is no right way: although Gnutella is the wave of the future, there are significant privacy concerns. Authentication of some kind would also make the Gnutella network more legitimate. Clients for both need to be well written to avoid the buffer overflows that are all too prevalent in some kludgy Gnutella clients.


Friday, June 29

The confusion begins — is it Friday or the second day of the conference, which is usually Thursday? Both, of course. So, on to the sessions!

1. Security for e-Voting in Public Elections

With the controversy surrounding the last US election, there is increased interest in improving our aged punch cards and voting booths. Many people are raising the idea of using the Internet for voting, but what kind of risks would that entail? Avi Rubin, who has studied this area extensively, shared his insights and research.

Avi Rubin was invited by the Costa Rican government to investigate the possible use of electronic voter registration systems in their 1997 election. In Costa Rica, voting is mandatory and must occur in the same district where a person first voted. This creates unique logistical problems the government was hoping to solve with computer systems. Their goal was to register people at any polling site using computers borrowed from schools. Several significant challenges were discovered during the trial. First, the high proportion of computer-illiterate voters required the use of light pens for input instead of mice. Trust was another problem, since the population would not necessarily trust a US-developed system. This was compounded by the fact that US cryptography export laws prevented the use of most encryption systems. In the end, Costa Rica's voting tribunal became worried about challenges to the new system and decided to cancel the trial.

There have been several other groups looking into this issue lately. The NSF hosted an e-voting workshop that brought together technologists, social scientists, election officials, and the Dept. of Justice. The workshop concluded that we are unprepared for remote electronic voting systems, but that modernizing poll sites holds promise.

One of the most important characteristics of e-voting systems is voter confidence. There must be confidence that all votes were counted, counted only once, and remain private. All of these are simple in paper systems and quite difficult in electronic systems. Additionally, electronic systems suffer from new problems, such as selective denial of service. What if a subtle denial-of-service attack were aimed at a carefully picked geographic area? In a close election this could be enough to change the outcome. Another significant threat is Trojan horses and viruses. With the proliferation of this malicious software, how could we trust the integrity of our computers for something as important as a national election?

Cryptographic protocols are a key component of any online voting system. Rubin described a system called "Sensus," developed by Lorrie Cranor. Sensus uses blind signatures and a public key infrastructure (PKI) to provide many of the properties of a good voting system. Unfortunately it is still vulnerable to non-cryptographic problems, such as denial of service and loss of anonymity. This illustrates some inherent problems with voting over the Internet.
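
For the curious, here is a toy sketch of the blind-signature primitive that schemes like Sensus build on. This is textbook RSA blinding with tiny demonstration numbers, not Sensus itself and nothing you'd use in a real election: the validator signs a blinded ballot without ever seeing its contents, and the voter unblinds the result into an ordinary signature on the ballot.

    # Toy sketch of RSA blind signing, the primitive behind blind-signature
    # voting schemes such as Sensus. Textbook-sized numbers for illustration
    # only; a real system needs large keys, padding, and a full PKI.
    import secrets
    from math import gcd

    p, q = 61, 53                      # validator's toy RSA key
    n = p * q                          # modulus (3233)
    e = 17                             # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

    def blind(ballot):
        """Voter blinds the ballot with a random factor r."""
        while True:
            r = secrets.randbelow(n - 2) + 2
            if gcd(r, n) == 1:
                return (ballot * pow(r, e, n)) % n, r

    def sign(blinded):
        """Validator signs without learning the ballot's contents."""
        return pow(blinded, d, n)

    def unblind(blind_sig, r):
        """Voter strips the blinding factor, leaving a normal RSA signature."""
        return (blind_sig * pow(r, -1, n)) % n

    ballot = 42                        # an encoded vote (toy value < n)
    blinded, r = blind(ballot)
    signature = unblind(sign(blinded), r)
    # Anyone can verify the signature against the ballot with the public key.
    assert pow(signature, e, n) == ballot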

A longer (2-3 week) voting period to combat the risk of DDoS attacks on voting systems would still not prevent selective denial attacks that subtly reduce service to targeted areas. Additionally, there is always the possibility of large-scale network failures over any time period, which could prevent the election from happening.

2. SNAC Meeting

As a member of the Program Committee for the oft-discussed but not-yet-implemented System and Network Administration Conference (SNAC), I was invited to the "short" meeting of those of us present at USENIX. I got to take minutes so here they are:

Overview

The conference was to be held in July 2001, but we delayed it when the economy slowed down. The new date of March 2002 was chosen, but there's some concern that that would be too soon. There's also concern that we may be missing a window of opportunity if we delay it more. Finally, there's the question of whether or not this should be a training experience for us.

What we want to accomplish

We want to target system and network administrators who do not currently attend USENIX or LISA. The thought is "SANS for SysAdmins," where we're trying to get more junior folks, tool users and button pushers, not the architects or designers.

There's no real competition from SANS in this market, but we may be in competition with vendor training. We might want to look at getting multiple vendors to participate to reduce the competition charge.

We'd provide preferred (if not best) practices. We want to build a community and provide professional development. We want to balance with USENIX's traditional strengths.

Program notes

The program will consist of tutorials (Dan Klein) and invited talks (program committee), but no refereed papers. We definitely want to provide more than one "thing" for better perceived value or one-stop shopping. One possibility was vendors' technical people at 2-hour BOF-like sessions, or nighttime sessions. Vendors can include marketing people in the back of the room to target questioners later; the speakers up front MUST be technical. Vendors need to include Microsoft (though they've previously shown little interest in participating in non-Seattle shows).

We might consider BOF or workshop leaders, such as Curtis Preston leading the backup BOFs. This seems to me to be something like the Guru-Is-In sessions on a larger scale. We should consider practicums and have lots of breadth, with short sessions but many tracks. The sessions need to include how to train people in doing research to figure out which vendor best meets their needs. In addition we need to remain vendor neutral, not necessarily recommending one over the other, though all are welcome to attend and present (on a technical basis).

Micro-topic sessions, 1-2 hours long, would provide the rudiments of doing something reasonably well. Example topics are "Installing new hardware," "Building a new office of 50 people," "...of 100 people," and so on.

(Vendors, Paradigms, Surveys — rob)

We want to include a connect-a-thon to let vendors prove their interoperability claims, sort of a "sandbox" or "playground" so people could test things. We want to promote diversity not sterility.

We want to provide information the audience doesn't already have: step-by-step "how things work" paradigms, and a supervised sandbox to connect things together (such as a FOO switch with BAR boxes), possibly even including database administrators.

Some other possibilities include distance learning, piping in experts, building communities (both within the attendees and between them and the USENIX regular attendee base).

Costs

We need about 500 people to break even; having only 100 attendees will leave us about $300,000 in the hole. The venue — Dallas, Texas — can hold about 1,000 people.

If we do NOT follow through with the March 2002 date it will cost us $60,000 to break the contract with the hotel now, or $200,000 to cancel as late as November. We estimate a total $500,000 in losses if we completely tank.

The marketing budget is $150,000 right now.

March is generally a good month (most fiscal years end in December or June). We don't necessarily want to wait until an economic uptick since that puts us behind the curve in case anyone else is thinking of this market segment.

Marketing

November through January is traditionally a bad time for marketing, due to the (US) Thanksgiving, Christmas, and New Year's holidays. Since this would be a new conference we don't have any of the benefits of an established name or reputation (beyond the USENIX name). However, since our target audience is people who've never been to a USENIX-sponsored conference or workshop before that doesn't necessarily mean anything.

There's a marketing question as to whether this is a conference or a training opportunity. We want to market not just to the target audience but also to their managers (who have spending authority to send their junior folks to SNAC).

We definitely want to push cost savings — possibly by selling sponsorships, possibly by using the catalog vendors to sponsor events (get juniors hooked with brand loyalty now, include copies of the current catalogs in the registration package). We can also work the regional angle (local-to-Dallas technical people and companies).

Questions asked

Are we reaching out only to self-identified systems administrators?

Challenge: How do we get people to come to us: money and time as well as desire (some folks don't like to travel)?

How do we get post-college "kids" trained in advance? Will vendors pony up cash to pay for student stipends? Are SAs from CS or non-CS majors eligible? This might not be the right venue for this discussion.

How do we convince managers? Emphasize the cost savings and continuing education angles. Pitching the ROI is probably the best approach.

When do we have to reach a decision? Now. Marketing needs 9-15 months in order to successfully market the conference to its targeted audiences. In terms of attendance, the decision can be deferred until the early registration deadline, typically 4-6 weeks before the conference itself.

Does being "first to market" buy us anything here? Probably not.

Can we time-shift to be adjacent to LISA 2002 in Philadelphia? No, because of space issues there.

Summary

We took a straw poll; most people favored March 2003 and eating the $60,000 cost now. Only two people thought March 2002 would be a good idea still.

We have a lot of excellent ideas, both technical and marketing, and are confident that we can put together a great program.

3 and 4. Quiz Show Preparation, Part I

This afternoon was the Quiz Show preparation. In this case, Rob Kolstad, Dan Klein, and I met over lunch, wrote the qualifying questionnaires, had them printed (and delivered to the Member Services booth by the 3:30pm break), and then ran through the categories to make sure things were correct. They were, except that they'd been run through at LISA 2000 and SANS 2001. This resurfaces tomorrow.

5. Evening activities

The reception was this evening. From my standpoint they got it right: Many food stations and lots of places to sit and talk, without too-loud music preventing conversation. The food was generally good (though the beef was gristly and the turkey too dry), very diverse (for omnivores, carnivores, and herbivores alike), and the drinks plentiful.

After the reception I collected some water and then tended bar at the scotch BOF. We had some really good scotches this year, including a very nice cask-strength pale scotch that was both fiery and smooth.


Saturday, June 30

Finally, virtual Friday — the last day of the conference. The schedule shift (tutorials Monday through Wednesday and sessions Thursday through Saturday) was necessary because when we added the third tutorial day we had to add it at the far end (Saturday the 30th) instead of the near end (Sunday the 24th), since the space wasn't available otherwise.

1. Slept In

One advantage of the scotch BOF is it gives me an excuse to sleep in on the last day. So I did. Then went on to prepare for the Quiz Show again.

2 and 3. Quiz Show Preparation, Part II

You see, at the reception I mentioned to Trey Harris that if we ran the same questions as we had loaded, he couldn't play in the Quiz Show. Then he reminded me that the Quiz Show from LISA was online on the Dr. Dobb's TechNetCast web site. Oops.

So I contact Rob and Dan and we agree to rewrite the entire Quiz Show — that's 5 questions in 6 categories in 6 games (the 3 elimination rounds, the finals, a tie-breaker if we need it, and a tournament-of-champions so Aaron Mandel and Trey Harris can have a rematch with the champ-du-jour). But first we have to score the qualifying questionnaires (the top 6 scorers and 3 random participants get selected for the 9 contestant slots and everyone else becomes an alternate based on scoring), print up the name tags for them all, print the announcements, and hand them off to the student slaves to hang around the conference floor.

Once we've finished all that (by 3pm), we get to rewrite 36 categories, or 180 questions total, before the 5pm start time. And then we have to make sure everything fits on the screens, tear down, run downstairs, and finish our printing before the last printer gets packed up, since the terminal room closes down at 2pm.

At 4 we take a quick break for lunch; at 5:15 we do a 5-minute setup, then launch into the game itself.

4. Quiz Show

The contestants and scores are:

Game 1:                  Christopher Davis (3500), Steve McIntyre (2900), Perry Metzger (900)
Game 2:                  Matt Crosby (700), Mark Langston (2000), Andy Tannenbaum (2100)
Game 3:                  Michael Buselli (1400), Jim Larson (2900), Ethan Miller (3500)
Finals:                  Christopher Davis (1700), Andy Tannenbaum (1300), Ethan Miller (5700)
Tournament of Champions: Trey Harris (1900), Aaron Mandel (2100), Ethan Miller (2100)

So in the tie-breaker Aaron scored 500 and Ethan — who turns out to be a professor at the University of California, Santa Cruz — scored 1000 to be the grand winner.

The USENIX Quiz Show has been produced by Rob Kolstad, Dan Klein, Dave Parter, and Josh Simon. Testers were Rik Farrow and Greg Rose. Our Prize Mistress, Vanna Off-White, was played by Pat Wilson. Special thanks to MSI for audio/video assistance. Prizes were provided by USENIX, Radware, O'Reilly, Prentice Hall, Addison-Wesley, ActiveState, Tandberg, SEI/CERT, and Integrated Computer Solutions. This has been a Klein/Kolstad Production. Copyright © 2001.

5. Evening activities


Sunday, July 1

After finishing up packing, coordinating checkout of the hotel (since I shared the room with someone who would pay 1.5 nights on his own and have USENIX pick up 2 whole nights for him, leaving 6.5 nights for me, plus my phone calls), and having a lovely brunch buffet with Trey and JD, I was off to the airport and a slightly-delayed flight home to Chicago. (Boston was closed down due to thunderstorms in New York, but we finally got clearance to take off, went north into Canada to get around them, then had to hold above O'Hare since it was stacked up with east-coast traffic from those same storms. Sigh.) I finally got home to check my mail, pay my bills, do some laundry, and get a few hours' sleep before an early-morning flight to Tampa, Florida, to bill on an audit.




Last update Feb01/20 by Josh Simon (<jss@clock.org>).
The "Security Aspects of Napster and Gnutella" talk was summarized by Chris Hayner. The "Security for E-Voting in Public Elections" talk was summarized by Adam Hupp.