GFIRST 2012 Conference, Atlanta, Georgia

The annual “Government Forum of Incident Response and Security Teams” (GFIRST) 2012 conference was held in Atlanta, Georgia. The week-long conference started on August 19th and ended on August 24th. GFIRST is a group of technical practitioners focused on securing government information systems. The purpose of the conference is to gather these people so they can share their security and incident response knowledge and experiences with each other, providing a level of transparency and improving coordination among the agencies that deal with security challenges on a daily basis.

Each day of the conference, except the very last one, started with a plenary session featuring influential speakers from the information security industry, who gave their insight on today's security challenges and presented ideas for dealing with them. Some of the most notable plenary speakers were Alan Paller, Director of Research at the SANS Institute; Art Coviello, Executive Chairman of RSA; and Tony Sager, former head of NSA's Information Assurance program and now with SANS.

In his speech, Alan Paller presented the approach of fixing the “known bad” first instead of spending time hunting for the unknown bad while known issues remain unfixed. He suggested applying this approach by focusing on and prioritizing the Top 20 Critical Security Controls. He also emphasized that, in order to fix security issues, it is important to get the information to where, and to whom, it matters. One thing Alan mentioned that I could not agree with more was to focus on properly implementing the tools and technologies already in hand rather than buying new ones. He also noted a significant increase in attacks on DHS: it sees a new attack every 90 seconds, of which about 50% are malware.

Art Coviello brought up some great points about how security budgets are spent today: roughly 80% goes to preventative measures, 15% to detection and monitoring, and 5% to response. He suggested that we need to take a more risk- and intelligence-based approach to information and cyber security. One of the most common themes I kept hearing over and over from various speakers, and one I find myself agreeing with, was the lack of well-trained security professionals in the industry. We need good people in cyber security: people who understand information technology in general and have a deeper understanding of security issues. Art Coviello also expressed his concern that Congress has not acted on the Rogers-Ruppersberger cyber security bill. The bill focuses on the U.S. government sharing threat information with private-sector businesses so that those businesses can protect themselves from the same cyber threats the government is protecting itself against.

Defense in depth is always considered a technology issue rather than a matter of adding another person, said Tony Sager, former head of NSA's IA program. He said that adding another person who brings ideas and implements them is often more beneficial than adding technology to achieve defense in depth. He also mentioned the Pareto Principle in the context of cyber security. This principle, also known as the 80/20 rule, states that for many events roughly 80% of the effects come from 20% of the causes. This ties back to what Alan Paller said in his speech: “Fix the known bad first.” Fixing the few known bad issues prevents most of the harm that would otherwise result from leaving them open.
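
To make the idea concrete, here is a minimal Python sketch of 80/20-style remediation prioritization; the causes and incident counts are invented purely for illustration.

    # Minimal sketch of the 80/20 idea applied to remediation priorities.
    # The incident counts below are made up purely for illustration.
    incidents_by_cause = {
        "unpatched software": 410,
        "default credentials": 230,
        "phishing": 160,
        "misconfigured services": 90,
        "sql injection": 40,
        "zero-day exploits": 25,
        "insider misuse": 20,
        "physical access": 15,
        "dns tampering": 6,
        "other": 4,
    }

    total = sum(incidents_by_cause.values())
    covered = 0
    fixed = []

    # Fix the "known bad" in order of impact until 80% of incidents are covered.
    for cause, count in sorted(incidents_by_cause.items(), key=lambda kv: -kv[1]):
        fixed.append(cause)
        covered += count
        if covered / total >= 0.8:
            break

    print(f"Fixing {len(fixed)} of {len(incidents_by_cause)} causes "
          f"covers {covered / total:.0%} of incidents: {fixed}")

With these made-up numbers, fixing just 3 of the 10 causes covers 80% of incidents, which is exactly the dynamic Sager and Paller were pointing at.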

Another very interesting talk that I thoroughly enjoyed, and whose speaker I agreed with, was “Hiring the Un-hirable.” Winn Schwartau gave a very entertaining and thought-provoking presentation on the immense need to shift the human resources mentality of hiring only candidates who look and present themselves like cyber-security professionals, even though they might not carry the right credentials to do the job. Conversely, there are people who do not fit that profile by their looks, but looks can be deceiving: these people actually know how to write code, understand the technology to its core, and can be an asset to an organization.

After each plenary session during the week, the rest of the day was dedicated to sessions on various information security topics. A large portion of these sessions was dedicated to continuous monitoring and big data. For continuous monitoring, a few presenters talked about the NIST continuous monitoring reference architecture, known as CAESARS (Continuous Asset Evaluation, Situational Awareness, and Risk Scoring). In short, the CAESARS framework consists of a sensor subsystem, a database and repository subsystem, an analysis and risk-scoring subsystem, and a presentation and reporting subsystem.
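
To illustrate the shape of that architecture, here is a toy Python sketch of the four subsystems chained together; the names and the risk-scoring rule are my own assumptions for illustration, not part of the NIST specification.

    # Toy sketch of the CAESARS data flow: sensors feed a repository,
    # an analysis stage scores risk, and a presentation stage reports it.
    # Names and the scoring rule are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        host: str
        check: str          # e.g., a missing patch or weak configuration
        severity: int       # 1 (low) .. 10 (critical)

    def sensor_subsystem() -> list[Finding]:
        """Stand-in for scanners/agents that continuously collect findings."""
        return [
            Finding("web01", "missing patch KB123", 8),
            Finding("web01", "weak TLS config", 5),
            Finding("db01", "default account enabled", 9),
        ]

    def database_subsystem(findings: list[Finding]) -> dict[str, list[Finding]]:
        """Stand-in for the repository: index findings by host."""
        repo: dict[str, list[Finding]] = {}
        for f in findings:
            repo.setdefault(f.host, []).append(f)
        return repo

    def analysis_subsystem(repo: dict[str, list[Finding]]) -> dict[str, int]:
        """Stand-in for risk scoring: here, simply sum severities per host."""
        return {host: sum(f.severity for f in fs) for host, fs in repo.items()}

    def presentation_subsystem(scores: dict[str, int]) -> None:
        """Stand-in for reporting: worst hosts first."""
        for host, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{host}: risk score {score}")

    presentation_subsystem(analysis_subsystem(database_subsystem(sensor_subsystem())))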

The presentations were mainly from vendors, who presented case studies on how their solutions have helped various federal agencies solve security issues. In addition, there were panel discussions on continuous monitoring and on incident and security response activities.

Here is a quick overview of the sessions that I had a chance to attend:

Welcome to McSecurity, would you like fries with your scan?

The presenter showed practical examples of the differences between network penetration testing and vulnerability assessment, and stressed how important it is not to rely solely on automated off-the-shelf tools but to approach the process critically and analytically in order to understand the true security issues.
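
A small example of the gap between the two: a scanner may flag a service from its version banner alone, while a tester verifies the finding by hand. A hypothetical Python sketch (the host, port, and version string are made up):

    # Illustration of why banner-based scanner output needs manual verification.
    # Host, port, and version strings are hypothetical.
    import socket

    def grab_banner(host: str, port: int, timeout: float = 3.0) -> str:
        """Connect and read whatever banner the service volunteers."""
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.settimeout(timeout)
            return s.recv(1024).decode(errors="replace").strip()

    banner = grab_banner("203.0.113.10", 22)   # TEST-NET address; use a lab host
    print("Banner:", banner)

    # An automated scanner might flag "OpenSSH_5.3" as vulnerable purely on the
    # version string, even though the vendor may have backported the fix.
    # A penetration tester goes further: confirming exploitability, checking
    # patch levels on the host, and weighing the finding in context.
    if "OpenSSH_5.3" in banner:
        print("Scanner view: potentially vulnerable (version match only)")
        print("Tester view: verify backported patches before reporting")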

Demystifying Continuous Monitoring – Real World Solutions

This was a panel discussion in which representatives of various federal agencies and the private sector discussed implementing solutions to continuous monitoring problems, including a framework for rolling out continuous monitoring and remediation solutions.

Grey Hats and Responsible Disclosure

This was another very interesting panel discussion, on the ethics of fuzzing software to look for vulnerabilities and then publishing them on the Internet. The discussion tried to find a balance on responsible disclosure of the discovered vulnerabilities: whether to walk the fine ethical line of publishing them to the public before the vendor knows about them, or to take the alternative path of informing the vendor and other parties so the flaws can be fixed before they are disclosed in the wild.
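
For context on the technique being debated, here is a minimal mutation-fuzzing sketch in Python; the target binary ./parser and the seed file sample.bin are hypothetical stand-ins.

    # Minimal mutation-fuzzing sketch: flip random bytes in a valid input and
    # watch the target for crashes. "./parser" and sample.bin are hypothetical.
    import random
    import subprocess

    seed = open("sample.bin", "rb").read()   # a known-good input file

    for i in range(1000):
        data = bytearray(seed)
        # Mutate a handful of random bytes.
        for _ in range(random.randint(1, 8)):
            data[random.randrange(len(data))] = random.randrange(256)
        with open("fuzzed.bin", "wb") as f:
            f.write(data)
        result = subprocess.run(["./parser", "fuzzed.bin"], capture_output=True)
        # A negative return code on POSIX means the process died on a signal
        # (e.g., SIGSEGV), which is what a fuzzer is hunting for.
        if result.returncode < 0:
            print(f"iteration {i}: crash (signal {-result.returncode})")
            open(f"crash_{i}.bin", "wb").write(bytes(data))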

Managing Advanced Security Problems Using Big Data Analytics

“Big Data” is the buzzword of the day. With the ever-increasing volume of security and threat data being collected by a multitude of security tools, the current model's reliance on conventional databases is showing its limitations in storing, processing, and analyzing what is collected. The speaker made the case that it is time to switch to a storage and processing architecture designed for large volumes of data, one that allows optimization, normalization, and analysis using technologies built for big data.
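
As a rough sketch of the normalization step such an architecture needs, the following Python example flattens two invented log formats into one common event schema suitable for bulk loading; the field names and formats are assumptions, not any vendor's actual schema.

    # Toy normalization step: map heterogeneous security events onto a single
    # schema before bulk loading. Formats and field names are invented.
    import json

    def normalize_firewall(line: str) -> dict:
        # e.g. "2012-08-20T10:01:02 DENY 10.0.0.5 -> 192.0.2.9:443"
        ts, action, src, _, dst = line.split()
        return {"ts": ts, "source": "firewall", "action": action,
                "src_ip": src, "dst": dst}

    def normalize_ids(record: dict) -> dict:
        # e.g. {"time": "...", "sig": "ET SCAN", "attacker": "..."}
        return {"ts": record["time"], "source": "ids",
                "action": record["sig"], "src_ip": record["attacker"],
                "dst": record.get("victim", "")}

    events = [
        normalize_firewall("2012-08-20T10:01:02 DENY 10.0.0.5 -> 192.0.2.9:443"),
        normalize_ids({"time": "2012-08-20T10:01:05", "sig": "ET SCAN",
                       "attacker": "10.0.0.5", "victim": "192.0.2.9"}),
    ]

    # One record per line (JSON Lines) suits bulk loading into HDFS-style stores.
    for e in events:
        print(json.dumps(e))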

“Big Data”: beyond the buzzword

Another interesting talk on big data and what it means for security: how big-data technologies like Apache Hadoop can be used to perform data mining and other analytical tasks and to solve problems in security monitoring.
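
As a concrete example of that kind of task, a Hadoop Streaming job can count events per source address across a large log set. The mapper/reducer pair below is a minimal Python sketch; the whitespace-separated log format with the source IP in the third field is an assumption.

    # mapper.py -- Hadoop Streaming mapper: emit "src_ip<TAB>1" per log line.
    # Assumes whitespace-separated logs with the source IP in the third field.
    import sys

    for line in sys.stdin:
        fields = line.split()
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")

    # reducer.py -- Hadoop Streaming reducer: sum counts per source IP.
    # Streaming guarantees lines arrive sorted by key.
    import sys

    current_ip, count = None, 0
    for line in sys.stdin:
        ip, _, n = line.partition("\t")
        if ip != current_ip:
            if current_ip is not None:
                print(f"{current_ip}\t{count}")
            current_ip, count = ip, 0
        count += int(n)
    if current_ip is not None:
        print(f"{current_ip}\t{count}")

The two scripts would be submitted with the hadoop-streaming jar, passing them as the -mapper and -reducer options against an input directory of logs (exact flags vary by Hadoop version).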

Cyberspace Is Not Flat Either: Moving to 3D

This talk focused on various cyber analysis tools and on understanding the differences between the 2D and 3D outputs these tools generate. I did not see much benefit in the 3D tools, since humans are more accustomed to reading data in 2D, but the topic generated an interesting discussion between the presenter and the audience.

I think this conference was a great learning experience. It was valuable to see how other people are tackling security issues in their environments on a daily basis, and how similar or different their approaches to resolving cyber security problems are. It was also refreshing to see that the paper exercise of certification and accreditation is finally fading away and continuous monitoring is taking over, which I think is a much better way to secure the country's critical information technology infrastructure.