The Center for Education and Research in Information Assurance and Security (CERIAS)

CERIAS Blog

Panel #1: Traitor Tracing and Data Provenance (Panel Summary)


Tuesday, April 5, 2011

Panel Members:

  • David W. Baker, MITRE
  • Chris Clifton, Purdue
  • Stephen Dill, Lockheed Martin
  • Julia Taylor, Purdue

Panel Summary by Nikhita Dulluri

In the first session of the CERIAS symposium, the theme of ‘Traitor Tracing and Data Provenance’ was discussed. The panelists spoke extensively about the various aspects relating to tracing the source of a given piece of data and the management of provenance data. The following offers a summary of the discussion in this panel.

With increasing amounts of data being shared among organizations such as health care centers, academic institutions, financial firms, and government agencies, there is a need to ensure the integrity of data so that decisions based on it are sound. Securing the data at hand does not suffice; it is also necessary to evaluate the source of the data for its trustworthiness. Questions such as which protection method was used, how the data was protected, and whether it was vulnerable to attack in transit may influence how a user treats the data. It is also necessary to keep track of different types of data, which may be spread across various domains. The context of data usage, i.e., why a user wants to access a particular piece of data and the intent behind that access, is also an important piece of information to track.

Determining the provenance of data is important for evaluating its trustworthiness, but this may in turn pose a risk to privacy. In some systems, it may be important to hide the source of information in order to protect its privacy. Also, data or information transfer does not necessarily happen on a file-to-file basis; the data might have been paraphrased. Data with a particular meaning in one domain may mean something entirely different in another. Data might also be given away unintentionally. The question then is how to trace back to the original source of information. A possible solution suggested was to pay attention to the actual communication, move beyond the regions where we are comfortable, and put a human perspective on the data, for that is how we communicate.

Scale is one of the major issues in designing systems for data provenance. The problem can be solved effectively for a single system, but the more one tries to scale the solution up, the less effective it becomes. Deciding how much provenance is required is also not an easy question to answer, as one cannot assume to know in advance how much information a user will need. Providing the same amount of provenance as in a previous transaction may yield more (or less) than is actually required.

To answer the question of how to set and regulate policies governing access to data, it is important to monitor rather than control that access. Policies imposed at a higher level work well if there is a reasonable expectation that people will act according to them. It is also important not to be completely open about what information will be tracked or monitored; a determined attacker could use that knowledge to find a way around the monitoring.

The issue of data provenance, and of building systems to manage it, is important in several different fields. In domains where conclusions are drawn from a set of data and any alteration of that data would change the decisions made, provenance is of critical importance. The DoD, health care institutions, finance, control systems, and the military are some examples.

To conclude, the problem of data provenance and of building systems to manage it is not specific to a domain or a type of data. If this problem can be solved effectively in one domain, then the solution can be extended and adapted to other domains as well.

Opening Keynote: Neal Ziring (Symposium Summary)


Tuesday, April 5, 2011

Keynote Summary by Mark Lohrum

Neal Ziring, the current technical director for the Information Assurance Directorate at the NSA, was given the honor of delivering the opening keynote for the 2011 CERIAS Symposium on April 5th at Purdue University. He discussed the trends in cyber threats from the 1980s to today and the shifts in defenses in response to those threats. He noted that, as a society, we have built a great information network, but unless we can trust it and defend it against possible threats, we will not see its full potential. Ziring's focus, as an NSA representative, was primarily on preserving national interests regarding information security.

Ziring discussed trends in threats to information security. In the 1980s, the scope of cyber threats was rather simple: opposing nations wished to obtain information from servers belonging to the U.S., and the NSA wished to stop them. Since then, threats have become far more complex. The opponents may not simply be opposing countries; they may be organized criminals, rogue hackers, hacktivists, or others. In years past, considerable expertise was required to carry out attacks. Now much less expertise is needed, which results in more threat actors. In the past, attacks were not very focused; someone would write a virus and see how many computers on a network it could infect, almost as if it were a competition. Now attacks are far more focused on achieving a specific goal against a specific target. Ziring cited a statistic that around 75% of viruses are targeted at fewer than 50 individual computers. Experts in information security must understand the specific goals of a threat actor so that attacks can be predicted.

Ziring also discussed shifts in information security. The philosophy used to be to simply protect assets, but now the philosophy includes defending against known malicious code and hunting for not yet known threats. Another shift is that the NSA has become increasingly dependent upon commercial products. In the past, defenses were entirely built internally, but that just does not work against the ever-changing threats of today. Commercial software advances at a rate far faster than internal products can be developed. The NSA utilizes a multi-tiered security approach because all commercial products contain certain shortcomings. Where one commercial product fails to protect against a threat, another product should be able to counter that threat; this concept is used to layer security software to fully protect assets.

A current concern in information security is the demand for mobility. Cell phones have become part of everyday life, as we as a society carry them everywhere. Because these are mobile networked computers, the potential shortcomings of security on these devices are a concern. If they are integrated with critical assets, a security hole is exposed. Similarly, cloud computing creates a concern: the integrity of information on servers that the NSA does not own must be ensured.

Ziring brought up a couple of general points to consider. First, information security requires situational awareness. Knowing the current status of critical information is necessary to defend it properly, and the status of the security system must be known consistently. Currently, many security systems are audited every several years, but it may be better to check the status of the security system continuously. Second, operations must be able to continue on a compromised network. The old philosophy was to recover from a network compromise and then resume activity. The new philosophy, because networks are so massive, is to be able to run operations while the network is in a compromised state.

Ziring concluded by discussing the need to create academic partnerships. Academic partnerships can help the NSA have access to the best researchers, newer standards, and newer technologies. Many of the current top secure systems would not have been possible without academic partnerships. It is impossible for the NSA to employ more people than the adversaries, but it is possible to outthink and out-innovate them.

A Recent Interview, and other info


I have not been blogging here for a while because of some health and workload issues. I hope to resume regular posts before too much longer.

Recently, I was interviewed about the current state of security. I think the interview came across fairly well, and captured a good cross-section of my current thinking on this topic. So, I'm posting a link to that interview here with some encouragement for you to go read it as a substitute for me writing a blog post:

Complexity Is Killing Us: A Security State of the Union With Eugene Spafford of CERIAS

Also, let me note that our annual CERIAS Symposium will be held April 5th & 6th here at Purdue. You can register and find more information via our web site.

But that isn't all!

Plus, all of the above are available via RSS feeds.  We also have a Twitter feed: @cerias. Not all of our information goes out on the net, because some of it is restricted to our partner organizations, but eventually the majority of it makes it out to one of the above outlets.

So, although I haven't been blogging recently, there has still been a steady stream of activity from the 150+ people who make up the CERIAS "family."   

Can Your IPv4 Firewall Be Bypassed by IPv6 Traffic?

Do you have a firewall? Maybe it's not as useful as you think it is. I was surprised to discover that IPv6 was enabled on several hosts with default firewall policies of ACCEPT and no rules. This allowed IPv6 traffic to completely bypass the numerous IPv4 rules!

IPv6 makes me queasy security-wise due to features such as making all IPv6 hosts into routers that obey source routing, as well as the excessively eager and accepting autoconfiguration. More recent doesn't imply more secure, especially if it's unmanaged because you don't realize it's ON. The issue is IPv6 being enabled by default in a fully open mode. Not everyone realizes this is happening, as we're very much still thinking in terms of IPv4. Even auditing tools such as Lynis (for Linux/UNIX systems) don't report this; it only checks if the IPv4 ruleset is empty. There are going to be a lot of security problems because of this. I know it's been so for some time, but awareness lags. I'm not the only one who thinks it's going to be a bumpy ride, as pointed out elsewhere.
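
If you want to check whether this applies to your own hosts, the quick test is to see whether any interfaces have IPv6 addresses and what the default IPv6 firewall policies are. The commands below are a minimal sketch for a Linux host using the classic iptables tools; the exact output will vary by distribution:

# Show any IPv6 addresses assigned to your interfaces
ip -6 addr show

# Show the IPv6 ruleset; default policies of ACCEPT with no rules
# mean IPv6 traffic bypasses your IPv4 rules entirely
sudo ip6tables -L -n -v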

You can mitigate this issue in several ways, besides learning how to secure IPv6 (which you'll have to do sometime) and using your plentiful spare time to do so enterprise-wide. Changing all the default IPv6 policies to DROP without adding any ACCEPT rules breaks things. For example, Java applications try IPv6 first by default and take several minutes to finally switch over to IPv4; this can be perceived as broken. If you have Ubuntu on your desktop, you can use ufw, the Uncomplicated FireWall, to configure your firewall with a click of the mouse. When "turned on", it changes the default policy to DROP but also adds rules accepting local traffic on the INPUT and OUTPUT chains (well done and thanks, Canonical and Gufw developers). This allows Java applications to contact local services, for example. You can also disable IPv6 in sysctl.conf (and have Java still work) if you have a recent kernel (e.g., Ubuntu 10):

net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 1

followed by a reboot. You can also do this immediately, which will be good only until you reboot (note: sudo alone doesn't work, you need to do "sudo su -"):

echo 1 > /proc/sys/net/ipv6/conf/all/disable_ipv6
echo 1 > /proc/sys/net/ipv6/conf/default/disable_ipv6
echo 1 > /proc/sys/net/ipv6/conf/lo/disable_ipv6


This removes the IPv6 addresses assigned to your network interfaces, and then Java ignores IPv6. If you have an "old" kernel (e.g., the most recent Debian) and need to support Java applications, the above kernel configurations are not available at this time. However, there are other ways to disable IPv6 for Debian, well documented elsewhere. You can also manually add firewall rules like those done by ufw, as described above.
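
For reference, here is a rough sketch of ip6tables commands in the spirit of what ufw does: default-deny policies plus exceptions for loopback traffic, so local applications (such as the Java example above) can still reach services on the same host. This is only an illustrative minimum, not a complete IPv6 policy; depending on your environment you may also need to allow some ICMPv6 traffic, and the rules must be saved with your distribution's usual mechanism to survive a reboot.

# Default-deny all IPv6 traffic (run as root)
ip6tables -P INPUT DROP
ip6tables -P FORWARD DROP
ip6tables -P OUTPUT DROP

# Still accept traffic on the loopback interface so local
# applications can reach services on the same host
ip6tables -A INPUT -i lo -j ACCEPT
ip6tables -A OUTPUT -o lo -j ACCEPT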

Centers of ... Adequacy, Revisited


Almost two years ago I wrote in this blog about how CERIAS (and Purdue) was not going to resubmit for the NSA/DHS Centers of Academic Excellence program.

Some of you may notice that Purdue is listed among this year's (2010) group of educational institutions receiving designation as one of the CAEs in that program. Specifically, we have received designation as a CAE-R (Center of Academic Excellence in Research).

"What changed?" you may ask, and "Why did you submit?"

The simple answers are "Not that much," and "Because it was the least-effort solution to a problem." Somewhat more elaborate answers follow. (It would help if you read the previous post on this topic to put what follows in context.)

Basically, the first three reasons I listed in the previous post still hold:

  1. The CAE program is still not a good indicator of real excellence. The program now has 125 designated institutions, ranging from top research universities in IA (e.g., Purdue, CMU, Georgia Tech) to 2-year community colleges. To call all of those programs "excellent" and to suggest they are equivalent in a meaningful way is unfair to students who wish to enter the field, and unfair to the people who work at all of those institutions. I have no objection to labeling the evaluation as a high-level evaluation of competence, but "excellence" is still not appropriate.   
  2. The CNSS standards are still used for the CAE and are not really appropriate for the field as it currently stands. Furthermore, the IACE program used to certify CNSS compliance explicitly notes "The certification process does not address the quality of the presentation of the material within the courseware; it simply ensures that all the elements of a specific standard are included." How the heck can a program be certified as "excellent" when the quality is not addressed? By that measure, a glass of water is insufficient, but drowning someone under 30 feet of water is "excellent."
  3. There still are no dedicated resources for CAE schools. There are several grant programs and scholarships via NSF, DHS, and DOD for which CAE programs are eligible, but most of those don't actually require CAE status, nor does CAE status provide special consideration.

What has changed is the level of effort required to obtain or renew at least the CAE-R stamp. The designation is now good for 5 academic years, and that is progress. Also, the requirements for the CAE-R designation were easily satisfied by a few people in a matter of several hours mining existing literature and research reports. Both of those were huge pluses for us in submitting the application and reducing the overhead to a more acceptable level given the return on investment.

The real value in this, and the reason we entered into the process, is that a few funding opportunities have indicated that an applicant's institution must be certified as a CAE member, or else the applicant must document a long list of items to show "equivalence." As our faculty and staff compete for some of these grants, the cost-benefit tradeoff suggested having a small group go through the process once, for the CAE-R. Of course, this raises the question of why the funding agencies suggest that XX Community College is automatically qualified to submit a grant, while a major university that is not CAE certified (MIT is an example) has to prove that it is qualified!

So, for us, it came down to a matter of deciding whether to stay out of the program as a matter of principle or submit an application to make life a little simpler for all of our faculty and staff when submitting proposals. In the end, several of our faculty & staff decided to do it over an afternoon because they wanted to make their own proposals simpler to produce. And, our attempt to galvanize some movement away from the CAE program produced huge waves of ...apathy... from other schools; they appear to have no qualms about standing in line for government cheese. Thus, with somewhat mixed feelings by some of us, we got our own block of curd, with an expiration date of 2015.

Let me make very clear -- we are very supportive of any faculty willing to put in the time to develop a program and to work to educate students to enter this field. We are also very glad that there are people in government who are committed to supporting that academic effort. We are in no way trying to denigrate any institution or individual involved in the CAE program. But the concept of giving a gold star to make everyone feel good about doing what should be the minimum isn't how we should be teaching, or how we should be promoting good cybersecurity education.

(And I should also add that not every faculty member here holds the opinions expressed above.)