Workshop for Intrusion Detection and Response Data Sharing

Matt Bishop and Stephen Northcutt, co-hosts

Logistics

Schedule:
July 14, 7-9 pm          Reception
July 15, 7:15-8 am       Continental breakfast
July 15, 8 am-8 pm       Working sessions with working meals

Where: The workshop will be held in Room 1003, Engineering Unit II, at the University of California at Davis. From the lobby, the room is to the right, almost kitty-corner from the Dean's Office.
Who: The sponsors of this workshop are the SANS Institute and the Department of Computer Science, UC Davis.

What It Is

The purpose of the IDRDS workshop is to identify, define, and prioritize the problems that make IDSes cumbersome and ineffective in day-to-day use. Additionally, the workshop will focus on channels, or mechanisms, that enable users to provide data and continuing feedback to help researchers ameliorate those problems, and that enable researchers to provide users with experimental or prototype systems for testing the efficacy of new algorithms and interfaces.

As the workshop has a limit of about 25 participants, it will not be possible to invite everyone who expresses interest. We are very interested in learning about your experiences, your preferences, and the problems you have run into with intrusion detection systems. Your responses will be kept confidential. The replies to the questionnaire will be stored off-line, and if the information is provided to any party other than the co-hosts of the workshop or Alan Paller from SANS, the identification information will not be sent with the rest of the file.

If you are interested, please fill out the questionnaire at the end of this page, and send it to any of: Matt Bishop, Stephen Northcutt, or Alan Paller.

What We Will Do

Each invitee will be asked to present a 15-20 minute talk. The talk should contain the following information.

For researchers:

For practitioners:

Here, "researcher" means a developer of IDSes and "practitioner" means a user of IDSes. If you fall into both classes, feel free to talk about both parts!

Following the talks, we will determine how to transfer data and prototypes so that everyone has access to as much as possible, with minimal overhead. We will also try to develop a list of desirable features for IDSes and of common problems in practice that IDSes need to overcome. The focus here will be on the "end result" and not on the best technique for reaching it (although we anticipate some discussion of the latter topic as well).

As the workshop approaches, we will put out a more detailed agenda and a list of invitees who have accepted.


Intrusion Detection and Response Data Sharing Questionnaire

Identification information (optional)
Name:
Organization:
Position or Title:
Preferred E-mail (to clarify answers):

If any question is too intrusive, please answer "pass." If you do not know the answer to a question, please answer "don't know."

  1. Does your site utilize an intrusion detection system (IDS)?
  2. What system(s) do you use?
  3. How long has it been in place?
  4. Is your system "manned" twenty-four hours a day, seven days a week?
  5. How many detects did your site collect, and over what time period?
  6. What do you estimate your IDS's false-alarm (false-positive) rate to be?
  7. Are there attacks that you know your IDS cannot detect (T/F)?
  8. Does your IDS have an automatic response capability (T/F)?
    1. If so, have you found it to be effective?
    2. What is the biggest failure of the response capability?
  9. Does your IDS have a database that allows it to watch out for "bad IPs" (hosts/networks that attacked you in the past) (T/F)?
    1. If so, has this capability been effective; that is, has it contributed to additional detects?
  10. Does your IDS supplier, or CERT, maintain lists of known "bad IPs"?
  11. Does your IDS have a historical database that allows it to do trend analysis (T/F)?
  12. If you could add three features or capabilities to your IDS, what would they be?
  13. What feature(s) of your IDS are its strongest points?
  14. Has your organization ever participated with your IDS supplier, or with researchers in the intrusion detection or incident response field, in testing new designs of IDSes (T/F)?
    1. If so, did you find this partnership to be beneficial to your organization?
  15. What do you see as the most pressing problem in using intrusion detection systems?
  16. Does your organization's incident response team find the output of your IDS to be helpful in incident response (T/F)?
    1. What additional features or capabilities do they request from your IDS?
  17. Does your IDS collect raw traffic or content data that can be examined off-line?
  18. Does your organization archive the raw traffic or content data collected by your IDS?
  19. What will you be able to contribute to the workshop? Please be as specific and detailed as possible.
  20. What do you expect to get out of the workshop? Again, please be as specific and detailed as possible.
  21. Would your organization be able and willing to make (possibly sanitized) real data available to research groups?
  22. Would your organization be able and willing to make your IDS prototypes available to practitioners for early testing?