ISSA Journal – January 2011
From the President
Kevin L. Richards, International President
Go confidently in the direction of your dreams.
Live the life you have imagined. – Henry David Thoreau
On behalf of the ISSA International Board, I wish you a happy and prosperous new year! May
the new year provide all those things for which you dream.
It is an exciting time for the ISSA. As we kick off 2011, there are a number of items to share with you.
ISSA International Board Elections
Make a difference and help guide the future of the ISSA. Nominations are now open for the
following International Board roles: Vice President, CFO/Treasurer, COO/Secretary, and Director (four positions).
Board nominations will be accepted through March 15, 2011. Click HERE to learn more about
the roles and how to nominate candidates.
ISSA International Annual Awards
We are also accepting nominations for the ISSA International Annual Awards. This is admittedly
one of my favorite items of the year – recognizing those of you who contribute so much to the
ISSA and the information security community. The award categories are as follows:
• Hall of Fame
• Honor Roll
• Security Professional of the Year
• Chapter of the Year, awarded in three categories: fewer than 100 members, 100-200 members, and more than 200 members
• Chapter Communications Program
• President’s Award for Public Service
• Organization of the Year
Award nominations are accepted through February 15, 2011. Click HERE for more information.
Discounts on World Class Training
Are you planning out your 2011 personal training goals? From RSA and BlackHat to the ISSA
International Conference later this year, ISSA members have access to discounts for many of the
leading training events our industry has to offer – saving literally hundreds of dollars on these
training opportunities. The members-only section on the issa.org website is maintained with the
current discount offers.
The Power of the ISSA Community
With 151 chapters worldwide and members in 38 different countries, ISSA members continue to
drive the information security profession. Recently, SecurityMagazine.com published its list of the top 500 security programs, which features a number of ISSA members, including Eric Cowperthwaite
(Puget Sound Chapter/Providence Health & Services), Stephen Scharf (Orange County
Chapter/Experian), and Dave Tyson (San Francisco Chapter/Pacific Gas & Electric) – just to
name a few.
The full list can be found HERE. Congratulations to all who had their security programs recognized.
In addition to the public recognition, our chapters continue to provide leading training and
excellent networking opportunities. If you haven’t attended a local chapter meeting recently, I
invite you to get re-acquainted with the local membership. I’m sure you’ll gain tremendous value
from the collaboration.
In closing, I am planning to be at RSA in San Francisco this February. I would greatly enjoy
meeting with you!
Thank you for making the ISSA the pre-eminent, trusted, global information security community!
Welcome to the January Journal
Thom Barrie – Editor, the ISSA Journal
I watched with interest the recent FCC ruling on, among other things, net neutrality. Now, I’m a
big fan of “open and free.” Let me download and upload to my heart’s content and bandwidth
constraints. But I do feel the pain of the broadband providers whose networks are clogged with
bit-torrents and streaming movies at the cost of their own content offerings. Reminds me of the
home network (now down to just three constant users). I pay for the infrastructure, but when it
comes time for me to surf the bits and bytes, the incessant Hulu watchers, WOW players, and
YouTube aficionados are clogging my bandwidth, even with QoS dialed all the way up (for me).
Which brings me around to the issue at hand: The impact of legislation on the security
professional, our theme this month. Three of our resident attorneys – Randy Sabett, David
Navetta, and Steven Teppler – have stepped up to give their take on the pervasive
legal/regulatory environment, and one thing is clear: the information security practitioner is
poised to play a greater role than ever before – if he or she is willing to step up to the plate. As
Mike Ahmadi, who is not a lawyer, explains: “Knowledge is power. Legal knowledge is legal power.”
Happy and Prosperous New Year,
Bullet Dodged or Opportunity Missed?
By Randy V. Sabett – ISSA member, Northern Virginia, USA Chapter
My full article on page 12 in this issue contains a brief mention of something that has been
tangentially debated for years – whether security professionals should be required to be
credentialed and, if so, which credentials would pass muster. Sweeping cybersecurity legislation
originally proposed by Sen. Rockefeller, Sen. Snowe, and Sen. Nelson would have had a
significant impact on our profession, since it contained just such a professional certification
requirement. In a revision to S.773 (the Cybersecurity Act of 2010) [1], that requirement has been removed. Depending on how you are situated, you may view this development with a sigh of relief or a dejected “harrumph.”
While the substance of this provision of S.773 cannot be ignored (I’ll come back to that in a
minute), the process by which it entered and exited the public dialog bears further examination
because it exemplifies the impact of the legislative process on security professionals. When
S.773 was first introduced in the Senate, it represented a good first draft of provisions that would
presumably help improve the state of cybersecurity, particularly in the U.S. government. I know
some of the staffers that worked on the bill and can honestly say that they truly care about
making things better. For various reasons, however, many of the provisions were not well
received. The so-called “kill switch” and the professionalization requirement were two examples
of things that did not make it into the next version of the bill, which was officially introduced on
December 10, 2010. Here is the main part of the text of the professionalization requirement:
Beginning three years after the date of enactment of this Act, it shall be unlawful for any
individual to engage in business in the United States, or to be employed in the United States,
as a provider of cybersecurity services to any federal agency or an information system or
network designated by the President, or the President’s designee, as a critical infrastructure
information system or network, who is not licensed and certified under the program.
Does the absence of this provision in the updated draft mean professionalization is not a good
idea? Not necessarily. Does this mean professionalization should not start out as a government-
only requirement? Definitely not. It simply means that many vocal respondents convinced the
committee responsible for this bill that the current drafting was not adequate. Some critics stated
that it would simply be a “windfall for those involved in cybersecurity certification” and a
“dangerous and intrusive expansion of government power.” Instead of trying to redraft it, the
authors decided to leave out this provision from the new draft.
While on their face such criticism may seem well-placed, let’s take a look at the rest of the provision:
Within one year after the date of enactment of this Act, the Secretary of Commerce shall
develop or coordinate and integrate a national licensing, certification, and periodic
recertification program for cybersecurity professionals.
So, we have a one-year window within which a certification program would be established or
integrated, followed by a requirement after three years that anyone hired to work as a
cybersecurity professional would need to have been certified. Notice I didn’t mention the
government anywhere in the previous sentence. I am still trying to understand how a certification
program by the government intended only for government workers (and those working on
critical infrastructure systems) can be characterized as dangerous and intrusive. If a private
company were to implement such a requirement, would there have been such an outcry? I doubt it.
In any event, S.773 no longer contains such a requirement. While professionalization may be a
laudable goal, many did not see the Rockefeller-Snowe approach as being a workable solution.
Legislation may not be the answer. If not, however, then what? Sighing with relief or
“harrumphing” will do little to advance the ball.
The debate rages on…
About the Author
Randy V. Sabett, J.D., CISSP, is the co-chair of the Internet & Data Protection (IDP) practice
group at SNR Denton US LLP, an adjunct professor at George Washington University, and a
member of the Commission on Cybersecurity for the 44th Presidency. He may be reached at
Laws! It’s CHAOS
By Branden R. Williams – ISSA member, North Texas, USA Chapter
We’ve come a long way since the Computer Fraud and Abuse Act (CFAA) of 1986, which was
the amended version of the Counterfeit Access Device Act of 1984 [2]. When this piece of
legislation was enacted, we were still on the leading edge of the electronic part of our
information revolution. It’s not that we didn’t have lots of information in the ’80s; we just
realized that we needed better ways to analyze, search through, store, and access it. The
commercialization and evolution of the computer chips designed in the ’60s and ’70s, and
massive disk arrays such as the IBM 3380 – released in 1980, the first commercial storage device with one gigabyte of capacity, yours for the low, low price of $40,000 [3] – presented a solution to
companies looking to make things easier. As the prices came down, corporate America’s appetite
for storage and electronic information boomed, and our habit of using paper began to slowly
change to an electronic equivalent.
Fast forward to today. Fewer processes rely on paper-based workflows, and those that remain
are quickly becoming digitized for fast, reliable access. But with convenient access comes easy
and frequent accidental disclosure. Look out world, here comes LEGISLATION!
[2] What a horrible name!
[3] Just over $100K in today’s dollars.

Legislation on information security is terrifying to me. I’ve watched enough congressional hearings to know that the vast majority of legislators in the U.S. House and Senate do not have the requisite base of knowledge to do anything more than recite prepared statements from those
that are in the know. I understand the challenge that legislators have, but I also realize that
criminals change their attack strategies faster than the government can enact legislation to
combat the hack of the day. When legislation comes down, it tends to be either sweeping with
financial incentives (like HITECH) or watered down with limited concrete detail (like Gramm-
Leach-Bliley). Combine those constraints with fifty individual states that may enact their own legislation to protect their residents [5] and you end up with an extremely complex legal landscape,
against which security professionals are regularly measured.
In some respects, we should be glad that someone realized the carrot method isn’t working and
it’s time to bring out the stick. Information security spending as a component of overall IT spend
has grown tremendously over the last decade. Without some legislation and the very public
dissection of the legal ramifications of breaches, I am not sure companies would spend much
beyond the “Availability” and “Integrity” legs [6] of the CIA tripod. I recently compared the failings of information security to the same issues we have in physical security [7], and even our
physical world mirrors the same “it won’t happen to me” mentality that information security has
harbored since the first computers were operated. If a business gets robbed, a new security
system is installed. If a company is compromised, all of a sudden executives are serious about all those proposals you pitched over the last decade.
As you readers know, one of my mantras is that we are to blame for the failings of information security in the boardroom, not the board. Our failure to communicate security’s value and importance in a language that the board can understand has forced legislation upon the private sector to cattle
prod those guys into action. Even that does not work uniformly as not every company complies
with all of the relevant information security regulations on all of the books from every sovereign
nation with which they interact. It’s painful, and it’s only going to get worse.
What can you do? First off, get friendly with a good lawyer that you trust. The information
security team should be a good partner with the legal team to better understand exactly what you
need to do to stay ahead of, and in compliance with, the legislation on the books. For the most
part, it means building a mature information security program based on something like ISO
27002, and adding more controls around regulated data. Do your best to define your information
security program independent of legislation, but allow legislation to shape only the areas it governs.
About the Author
Branden R. Williams, CISSP, CISM, is the Director of the Global Security Consulting practice at
RSA, the Security Division of EMC, and regularly assists top global retailers, financial
institutions, and multinationals with their information security initiatives. Read his blog, buy his
book, or reach him directly at http://www.brandenwilliams.com.
[4] Arguably this had more to do with privacy than security, but I saw many financial organizations make changes in their information security activities based on this legislation.
[5] See all the disclosure notification legislation, or the Nevada, Minnesota, and Massachusetts laws.
[6] With a much heavier emphasis on availability.
The Secret History of Public-Key Cryptography
By Luther Martin – ISSA member, Silicon Valley, USA Chapter
Back in 2007, the “Thirty Years of Public-Key Cryptography” event at the Computer History Museum in Mountain View, California, celebrated the 30th anniversary of the invention of public-key cryptography.
The timing of this event was based on the publication of the paper “New Directions in Cryptography” [8] by Whitfield Diffie and Martin Hellman in 1976. This paper described what’s now known as the Diffie-Hellman key exchange, and its publication is often described as the birth of public-key cryptography. The truth is more complicated, however, and it wasn’t actually known until 1997, when government documents that described the true origin of the technology were declassified and made public.
It turns out that James Ellis, Clifford Cocks, and Malcolm Williamson, cryptographers at the UK’s Government Communications Headquarters (GCHQ), actually invented public-key encryption well before the academic world did.
As early as 1970, James Ellis proposed the idea of public-key cryptography in a paper entitled “The Possibility of Secure Non-Secret Digital Encryption” [9]. Three years later, in 1973, Clifford Cocks showed that Ellis’ idea was practical when he invented what’s now known as the RSA cryptosystem [10], the security of which is based on the idea that it’s easy to multiply big numbers together but hard to factor a big number. Only four months later, in early 1974, Malcolm Williamson invented what’s now known as the Diffie-Hellman key exchange [11], the security of which is based on the idea that it’s easy to calculate exponentials, like finding g^a from g and a, but hard to calculate discrete logarithms, like finding a from g and g^a.
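That asymmetry (modular exponentiation is cheap, recovering the exponent is not) is what makes the key exchange work, and it can be sketched in a few lines of Python. This is a toy illustration only: the prime below is deliberately tiny and the values are arbitrary, whereas real deployments use moduli of 2048 bits or more.

```python
import random

# Toy Diffie-Hellman key exchange (tiny, insecure parameters for illustration).
p = 2087   # a small public prime modulus
g = 5      # a public base

# Each party picks a private exponent and publishes g^secret mod p.
a = random.randrange(2, p - 1)   # Alice's secret
b = random.randrange(2, p - 1)   # Bob's secret
A = pow(g, a, p)                 # easy: fast modular exponentiation
B = pow(g, b, p)

# Both sides derive the same shared secret without ever transmitting it.
alice_key = pow(B, a, p)   # (g^b)^a mod p
bob_key = pow(A, b, p)     # (g^a)^b mod p
assert alice_key == bob_key

# Recovering a from g and g^a (the discrete logarithm) has no comparably
# cheap shortcut; brute force must try exponents one at a time.
def discrete_log(g, target, p):
    x = 1
    for exp in range(1, p):
        x = (x * g) % p
        if x == target:
            return exp
    return None

assert pow(g, discrete_log(g, A, p), p) == A
```

Even at this toy scale the eavesdropper's job (the loop) takes thousands of steps while each party's job (three `pow` calls) is nearly instant; the gap grows astronomically at real key sizes.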
While the GCHQ cryptographers were busy inventing public-key cryptography in the early ’70s, the outside world apparently hadn’t made the same connections that Cocks and Williamson had, and it didn’t catch up until Diffie and Hellman published their famous paper in 1976. A little over a year later, in early 1978, Ron Rivest, Adi Shamir, and Len Adleman described what’s now known as the RSA cryptosystem [12].
[8] W. Diffie and M. Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory, Vol. IT-22, pp. 644–654, Nov. 1976 – http://groups.csail.mit.edu/cis/crypto/classes/6.857/papers/diffie-hellman.pdf.
[9] J. Ellis, “The Possibility of Secure Non-Secret Digital Encryption,” Jan. 1970.
[10] C. Cocks, “A Note on ‘Non-Secret Encryption’,” Nov. 1973 – http://www.cesg.gov.uk/publications/media/notense.pdf.
[11] M. Williamson, “Non-Secret Encryption Using a Finite Field,” Jan. 1974.
[12] R. Rivest, A. Shamir and L. Adleman, “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,” Communications of the ACM, Vol. 21, pp. 120-126, Feb. 1978 – http://people.csail.mit.edu/rivest/Rsapaper.pdf.
Besides the fact that there were a few years between the classified inventions and the ones in the
academic world, there was another interesting difference between the two sets of inventions. In
particular, in the classified world, RSA was invented before Diffie-Hellman, while in the
academic world Diffie-Hellman was invented first.
But while cryptographers at GCHQ may have invented public-key technology first, the private
sector seemed to adopt it much more quickly than governments did. In 1982, Rivest, Shamir, and Adleman founded RSA Data Security to commercialize the RSA technology. This can be used to
both encrypt data and to create digital signatures, and RSA technology was used in many of the
early commercial applications of public-key encryption, including most of the technologies that
powered the dot-com boom and the subsequent development of e-commerce.
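The dual use described above (one key pair serving both encryption and digital signatures) can be sketched with textbook-sized numbers. This is a toy illustration only: real RSA uses primes hundreds of digits long and a padding scheme, and the small values below are chosen purely for readability.

```python
# Toy RSA with textbook-sized primes (illustration only, insecure).
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+)

message = 65

# Encryption: anyone can use the public key (n, e);
# only the private-key holder can recover the message.
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message

# Signing reverses the roles: sign with d, verify with e.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

The same modular-exponentiation primitive performs both operations; only the order in which the public and private exponents are applied changes.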
In 1984, Jim Omura founded the Cylink Corporation to commercialize the Diffie-Hellman
technology. This doesn’t actually let you encrypt or sign anything. Instead, it lets you create a
shared secret that can then be used for other purposes, like a key that’s used to encrypt data that’s
being transmitted over a network link. And while RSA technology supported the beginnings of
dot-com-era e-commerce, Cylink’s link encryptors protected the financial transactions.
Eventually both RSA Data Security and the Cylink Corporation were acquired by larger
companies, and their technologies are now part of the infrastructure for the global economy, but
much of what’s still used today can be traced back to the first ideas of a few cryptographers at
GCHQ over 30 years ago.
In the past 30 years there has been lots of innovation in the field of cryptography. Maybe we’ll
one day learn that some of the more recent innovations were invented by government
cryptographers long before, but that’s probably getting less likely. The time when governments
made significant breakthroughs in cryptography may now be part of history that we’re unlikely
to see again.
About the Author
Luther Martin is the Chief Security Architect for Voltage Security. You can find his daily thoughts on information security at http://superconductor.voltage.com and can reach him at
Are You Up for the Cyber Security School Challenge?
By Joyce Brocaglia
Within our extensive network of information security, IT risk, and GRC professionals, I have
found a common thread unrelated to work roles and responsibilities. Despite the vast collective
knowledge that we all share about technology and protecting organizations and enterprises, there
are many opinions, various solutions, and much interest over how to educate our children to be
safe, secure, and considerate online.
Interestingly enough, at numerous gatherings of information security executives, the topic that
frequently comes up is what these professionals who are also parents are allowing their children
to access online and how they are protecting them. Some feel it’s dangerous territory for children
under a certain age; some feel that not allowing them online is somehow limiting their social
development with their peers; others just install software that spies on their kids without their
knowledge, which opens up a whole other can of worms.
What is universally true, however, is that we are all passionate about the subject and keeping our
kids both informed and safe. What is also true is that most of us have been approached by friends
– non-security professionals – and asked how they should be addressing the issues with their children.
In response to this need and desire to positively influence children, the Executive Women’s
Forum (EWF) initiated a community education outreach program at their recent national
conference. Members went to Arcadia Neighborhood Learning Center, and through individual
classroom presentations and school assemblies volunteers spoke to all 550 students and many of
their parents about Internet safety, ethics, and security. The reaction from the school was
overwhelmingly positive and the volunteers felt it was an incredibly rewarding experience to
interact with the students and later that evening with a group of parents as well.
The EWF is now continuing to develop and increase participation by launching the Cyber
Security School Challenge at the upcoming RSA Conference 2011. Everyone is invited to attend
the EWF Meet & Greet where the challenge will kick off [13].
The Cyber Security School Challenge is an outreach program that challenges IT security, risk,
and GRC executives to get out into their communities to help educate students on online
security, safety, and ethics. Lesson plans, games, and instructions from leading academic and
industry authorities have been developed to enable volunteers to teach our children best practices
of “cyber awareness,” encouraging safe, secure, and responsible use of computers.
If you are interested in joining the challenge or would like more information, visit the EWF
website [14], which has a section aggregating information on lesson plans, games, handouts, and
additional valuable teaching information. The following are among the contributors to the Cyber Security School Challenge:

[13] 5:30pm-7:30pm, Tuesday, February 15, 2011, at the Moscone Center, North Room 132.
MySecureCyberspace: an initiative of Carnegie Mellon University’s Information Networking
Institute. Carnegie Cadets [15], the MySecureCyberspace game, teaches elementary school students
how to spot spam, protect personal information online, avoid inappropriate web content, and deal
with cyberbullying. The game is offered as a free download.
Safe and Secure Online Program [16]: (ISC)²-certified members volunteer to help children ages 11-
14 learn how to protect themselves online.
Cyber Security Awareness Volunteer Education Program [17]: an outreach of the National Cyber Security Alliance.
Over the years working with professional organizations – whether EWF or ISSA – I have
recognized how powerful it is to harness our members’ individual efforts and bring people with
similar passions together to collaborate and create unified actions to achieve change and
influence the future. Imagine the influence that an entire information security industry could
have. What if we got these kids to understand the ramifications of their actions at a much earlier age? What if we made them pause for a moment, consider their options, and make more rational or educated decisions? What if we saved a life? Please join us in this Challenge!
About the Author
Joyce Brocaglia is the CEO of Alta Associates, the industry’s trusted advisors specializing in
information security recruiting, and the founder of the Executive Women’s Forum (www.ewf-
usa.com). Joyce may be reached at www.altaassociates.com and Joyce@altaassociates.com.
Choosing Your Leaders:
The 2011 International Election
Nominations are now open for the International Board of Directors. The Nominating and
Election Committee, chaired by William Tompkins, will be accepting nominations through
March 15 for the June election.
You will be electing the International Vice President, Chief Operations Officer/Secretary,
CFO/Treasurer, and four directors to lead the association for the next two years. Candidates are
nominated by their chapters or a member of the International Board. At-large members may
submit a petition signed by 25 members in good standing. Nomination documents and election
guidelines can be found on ISSA Connect. All nominations should be sent to the Elections Chair.
Voter credentials will be sent to active General, CISO Executive, Lifetime, and assigned
Corporate and Government Organizational members in good standing who have a current email
address in their membership profiles. Visit www.issa.org to review and update your profile by
May 1. If your membership is scheduled to lapse, you should renew by May 31 to ensure you
receive credentials. If you have questions regarding your membership status, please contact Dana
Paulino, Manager of Member and Chapter Relations.
Anyone interested in being nominated as a Director can learn about the International Board of
Directors responsibilities in Article VI of the ISSA By-Laws, which are posted on ISSA
Connect. There are three ways a candidate can be nominated:
1. ISSA members who belong to a chapter should be nominated by their chapter. See
sample chapter nomination statement.
2. At-Large members are nominated by petition, which must be signed by a minimum of 25
members in good standing.
3. Sitting ISSA Board members may submit a nomination. Board members may not
The following documents must be provided by each candidate. Documents can be emailed to the
Election Chair or faxed to ISSA International, +1 206 299 3366.
• A Statement of Commitment form
• A biography
• Goals the candidate wishes to achieve as a member of the ISSA Board
• A digital photo, 1½″ × 2″, in JPEG format (optional)
If elected, all candidates must complete the Disclosure of Relationships form prior to taking office, and the form must be completed annually thereafter. Should any relationship change, a new form must be filed within thirty days of the change. Feedback in the form of suggested activities to obviate conflicts of interest will be delivered to each submitter within four weeks.
Candidate biographies and goals will be published in the ISSA Journal. Candidates will be
allotted one column (375-400 words including photo). Content is subject to editing to conform to
the publication’s style guide. There is no word limit for candidate bios and goals posted on the ISSA website.
ISSA Education Foundation Announces Scholarship Awards
The ISSA Education Foundation has awarded two academic scholarships this year. Steve
Haydostian, President of the Board of Directors of the Foundation, notes that while this year
ISSA will help two individuals to attain their educational goals, he anticipates increasing the
number of scholarships awarded in coming years.
A significant number of applications were received from around the world. The Foundation
Scholarship Committee, consisting of Dr. Daniel Manson (Chair), Rich Owen, Mark Spencer,
George Proeller, and Don Evans, evaluated each of the applications against the published
criteria. Upon the recommendation of the committee, the Foundation Board awarded
scholarships to the following applicants:
Undergraduate Study Scholarship: Sarah Kennedy, Murray State University, Murray, KY.
Graduate Study Scholarship: Mindy Bollinger, Capella University, Minneapolis, MN.
Haydostian said, “It is an honor to award these scholarships to two well-qualified candidates with the vision that they will help contribute to and advance the information security profession in the future.”
About the ISSA Education Foundation
The ISSA Education Foundation offers scholarships of up to $1,200 per person to support
undergraduate study and up to $10,000 per person to support graduate level studies of
individuals pursuing approved studies in the field of information security. The Foundation also
provides grants to professionals wishing to expand their knowledge and skills in the information security field.
The Foundation was founded in 2003 by the Information Systems Security Association (ISSA)
as a 501(c)(3) public charity, qualified to receive tax-deductible bequests, devises, transfers, or
gifts under Section 2055, 2106, or 2522 of the U.S. Tax Code. The Foundation is made up
exclusively of volunteers from the ranks of information security professionals.
For more information, please visit www.issaef.org. Contact Steve Haydostian, President ISSA
Education Foundation, 818-363-3118, firstname.lastname@example.org.
Make Your Nominations for International Awards
Is there someone whose accomplishments you would like to see recognized? An industry leader?
A chapter leader or member who has had a significant impact? An organization that has
contributed broad-based support through ISSA for the profession and the industry?
The Awards Committee is accepting nominations until February 15 at midnight US Pacific time
for the award categories listed below. Criteria and the appropriate nomination form for each
award are available on ISSA Connect as documents in the Association Business community or
by searching for “awards nomination.” Past recipients are listed on the International website.
Click on the links for nomination forms.
• Hall of Fame
• Honor Roll
• Security Professional of the Year
• Chapter(s) of the Year
• Outstanding Chapter Communications Program
• President’s Award for Public Service
• Outstanding Organization of the Year
Nominations may be made by your Chapter President, a Chapter Presidents Advisory Council
(CPAC) representative, or a member of the International Board of Directors. You can help by
suggesting candidates and providing information for the nomination form.
Questions should be sent to Mark Williams, Awards Chair, email@example.com.
The Evolving Legal Duty to Securely Maintain Data
By Randy V. Sabett – ISSA member, Northern Virginia, USA Chapter
Baking in security and focusing on risk management are messages we have all heard
before, but when coupled with the notion of an accelerating shift in liability, those messages
bear greater attention.
Many articles have been written about the intersection of law, technology, and information
security (including several by yours truly). Like other articles, this piece to ring in 2011 will talk
about new laws, pending laws, and recent court cases. A broader message exists, however, and
you should keep the following in mind: we continue to see shifts in liability allocation taking
place amongst the numerous stakeholders involved in electronic commerce. These liability shifts
will eventually touch everyone in the ecosystem. Some view this to mean that we are trending
toward a default or implied duty of security, which could get codified into law. This evolution
creates a business environment where you may want to (a) actually bake security in from the
beginning, instead of bolting it on after the fact, and (b) increase your focus on risk management,
instead of just pure infosec. You can play an important role within your organization by tracking
these trends and having the ability to be the information security risk translator.
Baking in security and focusing on risk management are messages we have all heard before, but
when coupled with the notion of an accelerating shift in liability, those messages bear greater
attention (particularly following some of the events we experienced in 2010). Before we try to
sort through the state of flux in which the legal environment finds itself, let’s first take a look at
some of the primary theories of liability and policy trends applicable to information security. In
particular, contracts, “wrongful acts,” or legislation could each lead to liability with respect to
information that should be kept secure.
Contract liability trends
A contract establishes what amounts to private law between the contracting parties. The contract
generally spells out the obligations of each of the parties (such as performance metrics or
security requirements) and, in many cases, the repercussions if something goes wrong (such as
indemnification obligations or liquidated damages).
Whereas a decade or so ago very few technology-focused contracts had information security-specific provisions, a significant majority do today [18]. Some examples of these types of
provisions include information security addenda or schedules, audit provisions that focus
specifically on information security resources, and requirements to comply with specific
information security regulatory requirements. In addition to the general trend toward greater
contractual attention to information security, there has also been more robust negotiation over
these provisions, often coupled with a discussion of applicable amounts of insurance.
One other trend worth noting has emerged in our work with clients on contractual matters: PCI
has taken on a significant role. In certain situations, various PCI provisions have been
incorporated into a contractual arrangement between parties even though the parties were not
dealing with credit card information. In effect, PCI is becoming a de facto information security
standard.
Tort liability trends
A tort is a wrongful or illegal act by one party against another. Tort liability can exist in the
absence of any contractual obligations between parties. For this reason, plaintiffs’ attorneys often
assert claims related to the tort of negligence in data breach cases on behalf of affected data
subjects. To prevail in such a case, the plaintiff must show that (a) a duty existed from the
defendant to the plaintiff, (b) the duty was breached, (c) a causal connection existed between that
breach and some harm to the plaintiff, and (d) the plaintiff suffered actual damages. In a series of
cases, plaintiffs have consistently lost on the ground that the exposure of their personal
information did not amount to compensable damages.19
In the business-to-business arena, a variety of cases have been brought by various stakeholders
against the entity that suffered the data breach. In certain situations, these cases settle with the
presumptive reason being that the parties would rather not spend the money on, or expose any
kinds of sensitive information in the course of, litigation.20 For the ones that do go forward, two
trends seem to be emerging. Plaintiffs are (a) bringing cases that are based on something more
than just speculative damages, and (b) doing more careful investigations to determine potential
liability and assert appropriate claims.
18. It should be noted, however, that most technology-related agreements do contain general confidentiality provisions that would,
in some cases, cover data protection issues.
19. See, e.g., Willey v. J.P. Morgan Chase, N.A., No. 09 Civ. 1397, 2009 WL 1938987 (S.D.N.Y. July 7, 2009); Randolph v. ING
Life Ins. & Annuity Co., No. 07-CV-791 (D.C. Jun. 18, 2009); Belle Chasse Auto. Care, Inc. v. Advanced Auto Parts, Inc., No. 08-
1568, 2009 WL 799760 (E.D. La. Mar. 24, 2009); Pisciotta v. Old Nat’l Bancorp, 499 F.3d 629 (7th Cir. 2007); Stollenwerk et al.
v. Tri-West Health Care, 254 Fed. Appx. 664 (9th Cir. 2007); Ponder v. Pfizer Inc., 522 F. Supp. 2d 793 (M.D. La. 2007); Forbes
v. Wells Fargo Bank, N.A., 420 F. Supp. 2d 1018 (D. Minn. 2006); Bell v. Acxiom Corp., 4:06CV00485-WRW, 2006 U.S. Dist.
LEXIS 72477 (E.D. Ark. Oct. 3, 2006); Key v. DSW, Inc., 454 F. Supp. 2d 684 (S.D. Ohio Sept. 27, 2006); Giordano v.
Wachovia Sec., LLC, Civ. No. 06-476, 2006 U.S. Dist. LEXIS 52266 (D.N.J. July 31, 2006); Guin v. Brazos Higher Ed. Service
Corp., 2006 WL 288483 (D. Minn. Feb. 7, 2006); Stollenwerk v. TriWest Healthcare Alliance, No. Civ. 03-0185-PHX-
SRB, 2005 U.S. Dist. LEXIS 41054, at *10 (D. Ariz. Sept. 8, 2005).
20. Perhaps the most prominent example of this trend involved TJX, which, over the span of several months following its
infamous data breach discovered in mid-2007, settled with Visa in November 2007, a group of banks in December
2007, and MasterCard and the FTC in April 2008.
In one case, a judge in the U.S. District Court for the Northern District of Illinois permitted a
negligence claim against Citizens Financial Bank to survive a summary judgment motion.21 The
plaintiffs claimed that the bank was negligent in securing their account, thereby allowing a
hacker to obtain a $26,500 loan using the customers’ user name and password. Stating that banks
generally have a duty to protect customer information, the judge went on to say that “[i]f this
duty not to disclose customer information is to have any weight in the age of online banking,
then banks must certainly employ sufficient security measures to protect their customers’ online
accounts.”22 Looking at all the available evidence, including the lack of two-factor
authentication, the judge denied summary judgment on the negligence claims. Let me repeat: a
U.S. District Court judge analyzed evidence related to authentication technology for online
banking, and a negligence case against a bank was allowed to continue as a result. This definitely
represents a trend that deserves watching.
In a second case involving the Hannaford breach, all claims brought by plaintiffs in a class action
were dismissed except for one. The sole claim that was allowed to continue involved a woman
whose bank refused to reverse allegedly fraudulent charges made on her account. The court, in
allowing her claims to continue, stated that “[i]f Hannaford’s negligence has caused fraudulent
postings to [the plaintiff’s] account that have not been corrected, her ability, if any, to sue her
bank under her credit or debit card contract does not eliminate Hannaford’s potential liability to
her.”23
In one more case, an online retailer (Paintball Punks) recently filed a class action lawsuit against
U.S. Bank that was removed to the U.S. District Court for the District of Minnesota on December
6, 2010.24 In this case,
the plaintiff retailer alleged that over $10,000 in orders were received that were fraudulently
billed to credit cards issued by the defendant bank. The defendant bank instituted chargebacks
against the plaintiff retailer to recover the fraudulently charged funds. In its complaint, the
retailer alleged that the bank failed to protect it and other merchants by not correcting a known
vulnerability in the bank’s computer systems and that the data breaches resulted from a
compromise of those systems. Despite knowledge of those breaches, the bank allegedly allowed
compromised card accounts to remain active, which led to fraudulent credit card transactions
with the plaintiff retailer. The plaintiff contends that the defendant bank could have corrected
the data breach prior to the occurrence of any losses.
Regulatory liability trends
Other than in a few limited industry verticals (such as financial services with the Gramm-Leach-
Bliley Act (GLBA) and health care with the Health Insurance Portability and Accountability Act
(HIPAA) and HITECH Act), most activity involving information security laws has occurred at
the state level. A definite trend has emerged there – reactive laws dealing with cybercrime and
hacking have given way to proactive laws requiring companies to take affirmative steps to secure
personal information.
21. Shames-Yeakel v. Citizens Financial Bank, Case No. 07-c-5387, U.S.D.C., Northern District of Illinois.
22. Ibid., Memorandum Opinion and Order, at 20 (N.D. Illinois August 21, 2009).
23. In re Hannaford Bros. Co. Customer Data Security Breach Litigation, MDL Docket No. 2:08-MD-1954, Decision and Order on
Defendant Hannaford Bros. Co.’s Motion to Dismiss (D. Maine, May 12, 2009).
24. E-Shops Corp. d/b/a Paintball Punks v. U.S. Bank, NA, Case 0:10-cv-04822-DSD-JJK (D. Minn.
December 6, 2010).
Reactive cybercrime laws
Early state laws related to information security operate by criminalizing various activities that
today would collectively be referred to as “hacking.”25 While important and still necessary, these
reactive laws focus on the hacker – an elusive entity that, even if found, would not be able to
make a victim whole again. These laws also come into effect only after a bad event has occurred.
Other than the rather amorphous deterrent effect that they might have, the laws do not do
anything to help prevent hacking or cybercrime from occurring in the first place.
Reactive data breach notification laws
As the next step in the evolution of state information security laws, state legislatures realized the
need to focus on other parties in the chain of liability. By passing SB 138626 in 2003, California
became the first state with a data breach notification law. Now, not only would the actual
wrongdoer be criminally liable, but also companies that allow a breach to occur would be liable.
Numerous other states soon followed. In some cases, bright line tests exist for determining
whether a breach has occurred. In other states, a subjective risk-based standard exists. Some
states have safe harbors for GLBA- or HIPAA-compliant entities. Others do not. Arguably, all
data breach notification laws are still reactive since they do not kick in until a breach has already
occurred, but they have created a negative incentive (i.e., fear of public exposure) that has
definitely heightened the awareness of many in the business community of the need for
information security.
Proactive reasonable security laws
Continuing with its lead role in information security legislation, California passed AB 195027 in
2004. Unlike the reactive data breach notification laws, AB 1950 focuses on whether an entity
has in place “reasonable security procedures and practices.”28 This was one of the first of its kind
– a broad-reaching proactive data security statute that places obligations on parties before
anything bad has occurred with personal information.29 Many states have now followed suit with
similar proactive laws that require reasonable security measures.
If we view cybercrime laws as the first wave, data breach notification laws as the second wave,
and reasonable security measures laws as the third wave, a new, fourth wave of state information
security laws is emerging. The laws in this fourth wave represent an attempt by state legislatures
to pass much more granular provisions. Oregon, Massachusetts, and Nevada have the most
prominent examples of these granular laws.
Granular information security laws
25. Statutes vary by state, but all states have some type of computer crime law.
26. See California Civil Code § 1798.82, available at http://www.leginfo.ca.gov/cgi-bin/displaycode?section=civ&group=01001-
27. See California Civil Code § 1798.81.5, available at http://www.leginfo.ca.gov/cgi-bin/displaycode?section=civ&group=01001-
29. Although both HIPAA and GLBA have a similar structure, they are limited to specific industry verticals and are not broadly
applicable to all businesses that collect or maintain sensitive personal information.
In Oregon, SB 58330 requires companies to implement an information security program that
includes administrative safeguards, physical safeguards, and technical safeguards (just like other
similar laws). It then goes further by specifying particular measures for each class of safeguards
deemed to be in compliance with the law. For example, acceptable administrative safeguards
include designating an employee to coordinate the security program; assessing “reasonably
foreseeable” risks and the sufficiency of implemented safeguards; employee training on
information security policies and procedures; and requiring service providers, by contract, to
maintain appropriate security safeguards. Likewise, acceptable technical safeguards include risk
assessment of network and software design; mechanisms to detect, prevent, and respond to
attacks; and regular testing and monitoring of key controls, systems, and procedures.
Detailed data security regulations31 in Massachusetts took effect in March 2010 and require
companies to implement a comprehensive information security program. The regulations also
require the implementation of certain administrative, technical, and physical controls to protect
sensitive personal information. Highlights include overseeing service providers by selecting and
retaining third-party service providers that are capable of maintaining appropriate security
measures and contractually requiring such third-party service providers to implement and
maintain appropriate security measures for personal information. There are also a number of
specific computer security requirements that are to be implemented “to the extent technically
feasible,” including secure user authentication protocols, secure access control measures, and
encryption of all sensitive information in transit.32
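The regulation names controls such as “secure user authentication protocols” without prescribing an implementation. As a purely illustrative sketch (not drawn from the regulation itself), one widely used building block is salted, iterated password hashing rather than plaintext credential storage; the iteration count below is an assumption you would tune to your environment:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to your hardware


def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest suitable for storage."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest


def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

The constant-time comparison matters: a naive `==` on digests can leak timing information to an attacker probing the authentication endpoint.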
The Nevada law generally requires entities to implement “reasonable controls” (and to
contractually require vendors to implement reasonable controls) to protect personal information.
It also requires businesses to encrypt sensitive personal information when transmitted across
public networks and private wireless networks and when stored on “data storage devices” taken
outside the company’s security perimeter, but only when involving “electronic, non-voice
transmissions other than a facsimile.”
Perhaps the most interesting trend besides more significant granularity is the incorporation of an
industry standard into state law. Specifically, two states (Nevada and Minnesota) have codified
or partially codified the Payment Card Industry Data Security Standard (“PCI DSS”). In Nevada,
businesses accepting payment cards must comply with the PCI DSS. This would almost seem to
create a type of “safe harbor”: if the entity is PCI-compliant and the breach is not caused by “the
gross negligence or intentional misconduct” of the entity, the entity will not be liable under the
law for damages for a security breach.
The Minnesota law reflects only one part of the PCI DSS and, in many respects, codifies
obligations already contained in card association contracts. The law forbids entities that handle
credit card information from retaining the card security code, PIN, or contents of any track of
magnetic stripe data after the transaction is authorized. Companies not in compliance with the
statute are liable for any fraudulent transactions that result from such noncompliance, as well as
the costs of replacing compromised cards.
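The Minnesota prohibition translates naturally into a data-handling rule: once a transaction is authorized, the sensitive authentication data must be dropped. A minimal sketch, with hypothetical field names:

```python
# Data elements that may not be retained after authorization under the
# Minnesota statute (the field names here are hypothetical illustrations).
PROHIBITED_AFTER_AUTH = {"card_security_code", "pin", "track1_data", "track2_data"}


def scrub_after_authorization(transaction):
    """Return a copy of the transaction record with prohibited fields removed."""
    return {k: v for k, v in transaction.items() if k not in PROHIBITED_AFTER_AUTH}
```

Running such a scrub at the authorization boundary (and verifying it in logs and backups) is the kind of control the statute effectively demands.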
30. See Oregon Rev. Stat. § 646A.600 et seq., available at http://www.leg.state.or.us/ors/646a.html.
31. See “Standards for the Protection of Personal Information of Residents of the Commonwealth.”
32. Ibid. at Section 17.04.
Renewed federal activity
Perhaps due to a confluence of different events, including several massive data breaches, the
varied activity by the states in this area, and heightened attention to security over personal
information and greater awareness of privacy, Congress has become very active in the areas of
privacy and data security. According to the bill tracking that we do, there are over 40 bills
currently pending at the federal level. Some of these are quite discrete, focused on a specific topic
(such as the creation of a national uniform data breach notification law). Others contain a broad
variety of provisions, effectively acting as an omnibus bill that affects all stakeholders. One of
those, the Rockefeller-Snowe bill, has recently been updated.
Exemplary pending federal bill: Rockefeller-Snowe
Senators Rockefeller, Snowe, and Nelson first introduced S. 77333 in April of 2009. The original
draft of the bill contained both operational and strategic provisions that address perceived
shortcomings in the current cybersecurity landscape. Perhaps of greatest concern to the
commercial sector was Section 18, which would have provided the President with the power to
order “the disconnection of any Federal government or United States critical infrastructure
information systems or networks in the interest of national security.” Known as the “kill switch,”
this provision generated considerable controversy, since most of the critical infrastructure
systems are in the hands of the private sector. Another provision that generated some controversy
was Section 14, which established within the Department of Commerce a “clearinghouse of
cybersecurity threats and vulnerability information” including information from both federal and
private sector owned networks and that would have provided the Department of Commerce with
authority to compel production of “all relevant data concerning such networks without regard to
any provision of law, regulation, rule, or policy restricting such access.”
On December 17, 2010, Senator Rockefeller reported an updated version of S.773 to the Senate,
and it was placed on the Senate’s legislative calendar. The new version, which went through
intense negotiations with sponsors of other related bills, no longer contains the “kill switch” and
goes so far as to state that “[t]his section does not authorize, and shall not be construed to
authorize, an expansion of existing presidential authorities.”34 The new version also eliminated
the licensure requirement for information security professionals. Other notable provisions
include Section 204, which directs the National Institute of Standards and Technology to
recognize and promote auditable, private sector developed cybersecurity risk measurement
techniques, risk management measures, and best practices for all federal government and U.S.
critical infrastructure networks. Sections 210 and 211 call for reports on authentication
functionality and civil liberties. These will include a review and analysis of risk issues
in the context of previous OMB guidance.35 Section 205 of the bill requires a comprehensive
review of the statutory and legal framework applicable to cyberspace.
33. See The Cybersecurity Act of 2010, S.773, available at http://www.gpo.gov/fdsys/pkg/BILLS-111s773rs/pdf/BILLS-
34. Ibid. at Section 201(c).
35. See OMB e-Authentication Guidance Memorandum 04-04 (OMB 04-04).
What about privacy?
One further regulatory trend relates to the continuing expansion of the FTC’s reach under
Section 5 of the FTC Act, which gives the FTC authority over unfair and deceptive trade
practices. Early actions by the FTC focused on companies that had deceived consumers by
violating the express terms of their privacy policies. When companies responded by writing
privacy policies with greater latitude, the FTC began taking action against companies under the
unfairness prong of the FTC Act. Now, in a Preliminary Staff Report released on December 1,
2010 entitled “Protecting Consumer Privacy in an Era of Rapid Change”36 (the “Report”), the
FTC has proposed a sweeping new draft “framework for businesses and policy makers.” The
Report lays out a broad agenda centered on three principles:
1. Privacy by design – companies should build in privacy protections from the beginning,
including security for consumer data, procedures to protect the accuracy of information,
and collecting only what is needed
2. Simplified choice – companies should present choices to consumers at the point of data
collection, and may forego choice altogether for commonly accepted uses of consumer data
3. Greater transparency – consumers should be provided with clearer, shorter, and more
standardized privacy notices, and with more information about the practices of non-consumer-facing entities
The FTC suggests in the Report that companies collecting consumer data “that can be
reasonably linked to a specific consumer, computer, or other device”37 incorporate mechanisms
to protect privacy into their everyday business practices. Amongst other things, the FTC cited
data security, reasonable collection limits, appropriate retention practices, and accuracy of the
stored data. It reiterated the need for companies to utilize “reasonable safeguards”38 to protect
consumer data. In addition, the FTC specifically called out location-based data as being
something that should not be retained by companies longer than reasonably necessary (to
complete a transaction) because of the ability to build consumer profiles from it.39
The Report also broaches the subject of Do Not Track and observes that many consumers are
unaware of the choice mechanisms that do exist in current browsers and of their ability to control
those mechanisms.40 The view of the FTC clearly points to Do Not Track as a way of simplifying
such choice. One significant concern about the Do Not Track proposal involves its
implementation. The FTC states that “[s]uch a universal mechanism could be accomplished by
legislation or potentially through robust, enforceable self-regulation.”41 Presenting legislation as
the primary mechanism for directing the conduct of companies, in combination with earlier
comments that self-regulation has not been effective, raises concerns that new legislation may be
on the way.
36. Federal Trade Commission, “Protecting Consumer Privacy in an Era of Rapid Change,” available at
http://www.ftc.gov/os/2010/12/101201privacyreport.pdf (December 1, 2010) (hereinafter the “Report”).
37. Ibid. at 41.
38. Ibid. at 44.
39. Ibid. at 47.
40. Ibid. at 65.
41. Ibid. at 66.
Though Do Not Track has proven to be the most talked about feature of the Report, the FTC
seems to have fallen somewhat short. In its discussion, the Report describes only a binary
mechanism, stating that “[t]he most practical method of providing uniform choice for online
behavioral advertising would likely involve placing a setting similar to a persistent cookie on a
consumer’s browser and conveying that setting to sites that the browser visits, to signal whether
or not the consumer wants to be tracked or receive targeted advertisements.”42 Although the
Report hinted about something other than a binary “Track/Do Not Track” decision mechanism
and asked for input on this issue,43 it did not provide much detail or guidance on how such a
system could be developed or deployed. Many had hoped that the FTC’s research would have
resulted in more robust discussion of this concept.
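A header-based mechanism of the kind the Report describes is technically trivial on the server side, which is partly the point: the hard questions are policy questions about granularity, not implementation. A sketch, assuming a “DNT”-style request header where a value of "1" signals an opt-out:

```python
def tracking_permitted(headers):
    """Honor a binary Do Not Track preference.

    `headers` is a mapping of HTTP request header names to values.
    A DNT value of "1" means the user opts out of tracking; any other
    value, or the header's absence, expresses no opt-out.
    """
    return headers.get("DNT") != "1"
```

Anything richer than this yes/no signal (per-category advertising preferences, per-site exceptions) would require the more granular mechanism the Report only hints at.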
In light of the proposals by the FTC in the Report, there will likely be a number of comments and
other proposals by both the consumer and business communities. Comments are due to the FTC
by January 31, 2011.44
What about you?
So where does all of this uncertainty and lack of legal uniformity leave you, the security
professional? In many respects, you should view this as an opportunity. By learning about the
legal landscape and being able to translate for the various stakeholders within your organization,
you become very valuable. The ability to understand and discuss security technology is
undoubtedly important. Having the ability to synthesize that with some of the diverse legal
underpinnings described above will mean that your organization can turn to you for answers that
very few (if any) people will have.
Some specific actions to consider:
• Help achieve management buy-in. Work directly or even indirectly with the CxO suite to
make risk reduction related to information security a priority.
• Focus not just on increasing security but more globally on reducing risk.
• Stay on top of trends in the industry, including trends in the legal arena that have an effect on
information security policy. Track new bills as they get introduced (both federal and state).
Become involved in organizations that analyze legislation.
• Establish an information security “tiger team” consisting of all interested stakeholders (which
could include a CxO, the HR director, the IT manager, the legal counsel’s office, and anyone
else critical to handling information security issues).
• Use the tiger team to establish a “united front” that will be useful when dealing with outside
entities, such as the FTC, external auditors, or business partners that have expectations
related to information security. A united front means that all parts of an organization know,
understand, and follow a written security plan. This will help further reduce risk.
42. Ibid. at 66.
43. Ibid. at 68, stating that the “Commission staff seeks comment on whether a universal choice mechanism should include an
option that enables consumers to control the types of advertising they want to receive and the types of data they are willing to
have collected about them, in addition to providing the option to opt out completely.”
44. Comments can be submitted at https://ftcpublic.commentworks.com/ftc/consumerprivacyreport.
Perhaps most importantly, having a good understanding of the risks associated with information
security will allow you to become a critical resource in the risk assessment and decision making
process. When the legal team or the audit team or the risk team within your organization needs to
decide on a course of action with respect to cyberrisk, you will be one of the first people to
whom they turn.
— The views expressed herein are those of the author and do not necessarily reflect the positions of any current or former clients of SNR
Denton or Mr. Sabett. In addition, nothing above should be interpreted as legal advice. For any specific questions, please talk with an
attorney.
About the Author
Randy V. Sabett, J.D., CISSP, is the co-chair of the Internet & Data Protection (IDP) practice
group at SNR Denton US LLP, an adjunct professor at George Washington University, and a
member of the Commission on Cybersecurity for the 44th Presidency. He may be reached at
Cloud Computing Customers’ “Bill of Rights”
By David Navetta – ISSA member, Denver, USA Chapter
The author sets forth a Bill of Rights to be used as a tool for cloud customers and cloud
service providers to engage in spirited debate about the issues of information security in the
Cloud.
revolution, 2010 was a big year for cloud computing. Mainly driven by the promise of significant
cost savings and instant scalability, organizations of all shapes and sizes have entered into the
Cloud. In exchange for these savings, these organizations have ceded increasing amounts of control over
their information technology to third parties.
The tension between cost savings and loss of control is best demonstrated when considering
information security in the Cloud. When an organization goes into the Cloud, it often finds itself
relying heavily on its cloud providers to protect the confidentiality, integrity, and availability of
its sensitive information. In fact, for some organizations concerns about information security and
compliance obligations around security are enough to scuttle a cloud deal. Security-sensitive
companies that do want to enter into the Cloud often will conduct security assessments and other
due diligence activities before considering a cloud provider.
In addition to security assessments, many organizations are also looking carefully at contract
terms that cloud providers are offering them around security. However, in many cases cloud
providers offer one-sided contract terms that do not adequately address these security-related
concerns. In fact, some cloud providers will not even allow the negotiation of such terms.
What are the security-related contract terms that cause the most consternation? What follows is a
“Bill of Rights” for customers entering into the Cloud that are worried about information security
and the issues surrounding it. This Bill of Rights is intended to serve as the foundation of a cloud
relationship, allow for more transparency, and enable a better understanding of potential legal
and security risks associated with the Cloud. While the strong term “rights” is used, cloud
arrangements vary and every transaction has its own issues and circumstances that impact the
nature and scope of a negotiation. Moreover, as with the real Bill of Rights, none of these rights
discussed below are absolute and may appropriately be subject to reasonable limitations by
service providers in certain contexts. As such, this document should be viewed less as a universal
mandate, and more as a tool for cloud customers and providers to engage in spirited debate about
the issues addressed in this Bill of Rights.
Annotated Cloud Customer’s Bill of Rights
The following provisions make up the Cloud Customer’s Bill of Rights:
Article I – Data Location Transparency
Cloud service providers shall reveal the physical location of the servers that will be processing their
cloud customers’ data, and shall provide reasonable advance notice if those physical locations change;
cloud service providers shall coordinate with their customers to assure compliance with local laws and
any applicable restrictions on the transfer of certain categories of data from one jurisdiction to
Comments: The bottom line for this right is that in this day and age, for better or worse, the
nature of the data and the physical location where it is processed dictate the privacy and data
security legal obligations of cloud customers. Transborder data flow issues are not new, but they
are magnified in the cloud context where the free flow of data across borders is the norm.
The classic example is the EU Data Protection Directive. A company that moves data made up of
personal information of EU residents outside of the EU to certain countries (like the U.S.) risks a
violation of EU law. The recent privacy law passed by the Canadian province of Alberta
prohibits the transfer of Canadian personal information outside of Canada without providing
certain notices to the data subject. Another example is the desire for some entities to avoid
having their data processed on U.S. soil because of the USA Patriot Act. The processing of data
in an unexpected country might also generally implicate jurisdictional issues over a particular
cloud customer (e.g., it would allow a customer to get “hauled into court” or investigated by a
regulator). Finally, in another twist in the litigation context, having to disclose certain data that is
subject to a discovery request could run afoul of privacy laws in certain jurisdictions.
Cloud service providers that fail or refuse to reveal where their customers’ data is being
processed risk exposing their customers to significant regulatory and data security and privacy
legal risk. Unfortunately some providers simply refuse to provide this information (either
because they do not want to, or perhaps because they do not know or cannot keep track of it).
Other cloud providers are more sensitive to this issue and will actually promise that their
customers’ data will be processed only in certain countries or locations. Nonetheless, for cloud
customers to truly understand the privacy and security legal risks of the Cloud, they need this
information, and they need to have the ability to contractually constrain the processing location
of their data.
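At its core, the compliance concern behind Article I is mechanical: for each category of data, only certain processing jurisdictions are acceptable, so a customer needs to know (and constrain) where processing occurs. A hypothetical sketch of such a policy check (the policy table and category names are invented for illustration, not drawn from any statute):

```python
# Hypothetical per-category allow-lists of processing jurisdictions
# (ISO country codes). None means the category is unrestricted.
TRANSFER_POLICY = {
    "eu_personal_data": {"DE", "FR", "IE"},   # e.g., keep within the EU
    "alberta_personal_data": {"CA"},          # e.g., keep within Canada
    "marketing_content": None,                # no restriction
}


def transfer_allowed(category, country_code):
    """Check whether data of a given category may be processed in a country.

    Categories absent from the policy table are treated as unrestricted;
    a real policy engine would likely fail closed instead.
    """
    allowed = TRANSFER_POLICY.get(category)
    if allowed is None:
        return True
    return country_code in allowed
```

The point of the sketch is that the check is only possible if the provider discloses server locations in the first place, which is exactly what Article I demands.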
Article II – Security Transparency
Cloud service providers shall provide full information and access to documentation concerning their
security policies and measures, including the ability for cloud customers to conduct periodic security
assessments and obtain relevant security-related information and documents from the service
provider; this information and documentation should address data integrity and availability as well
as the confidentiality of customer data.
Comments: Cloud customers may be ultimately liable for security breaches suffered by their
cloud service providers. Moreover, cloud customers may have legal obligations to maintain
certain security measures. This obligation does not disappear just because a customer’s data is
being processed by a cloud service provider. Yet, in many cloud transactions, getting good
information about security can be very difficult. While many cloud service providers are willing
to provide SAS 70 reports, those reports, if not tied to established data security standards such as
ISO 27002, may provide only a limited picture of security (and often only the picture the
provider desires to reveal). Unless the cloud customer is a large entity (and even then), most
cloud providers will not allow for an independent security assessment by the customer.
Moreover, in long-term relationships, a cloud provider’s security stance may change. Even if in-
depth information is provided at the outset of a relationship, if security is not allowed to be
revisited, cloud customers may be at risk. Similar to the data location issue, this can result in
very unpleasant surprises in the form of security breaches, lawsuits, and regulatory actions. As
such, from the cloud customer point of view, transparency around a cloud provider’s security is
of paramount importance.
Article III – Subcontractor Transparency
Cloud service providers shall provide cloud customers with notice as to which third parties will have
the ability to access the customer’s data and for what purposes, including subcontractors,
subcontractors of subcontractors, and so on.
Comments: It is not uncommon for cloud customers to discover that the cloud service
provider with whom they are entering into an agreement is not the sole entity that will be
processing their data. The classic example is a SaaS running on a third-party cloud. These
relationships may be more attenuated than meets the eye as there may be third and fourth levels
of cloud providers processing customer data, and the cloud customer may have no idea who is
actually handling its data. Even if a cloud provider has revealed its subcontractors at the outset, it
is not unusual for a cloud provider to switch subcontractors. From the cloud customer’s point of
view, it is important to know exactly who will have access to its data, and whether those entities
pose additional risk. Unfortunately, these relationships may not be revealed up front by cloud
providers, and are even less likely to be revealed in the middle of a cloud relationship. Rather,
many cloud contracts contain clauses that provide the service provider with the right to use third
parties, or are silent on the issue. Some cloud providers may want to impose certain contract
conditions to govern the use of subcontractors.
Article IV – Subcontractor Due Diligence and Contractual Obligations
Cloud service providers shall conduct reasonable due diligence and security assessments of
subcontractors or other third parties that will have access to customer data or systems, and shall enter
into contracts with such third parties that hold those third parties to substantially similar obligations
as in their cloud agreements with their customers; cloud service providers shall manage and similarly
limit the ability of their subcontractors to utilize other subcontractors.
Comments: As a corollary to Article III above, to the extent that cloud providers do utilize third
parties to process customer information, a proper vetting of such third parties is appropriate, as
well as contractual obligations. The providers’ due diligence should include data security and
privacy assessments of their subcontractors, but also more generally ensure that their
subcontractors are capable of carrying out the promises made by the cloud provider to its
customers. This due diligence should be buttressed by contractual obligations imposed on
subcontractors that match those made by the cloud provider to its customers. Finally, both for
their own protection and the protection of their customers, cloud providers need to worry about
and limit their subcontractors’ ability to use subcontractors.
Article V – Customer Data Ownership and Use Limited to Services
Cloud customers shall have the right to solely “own” the data they put into a cloud service provider’s
cloud, and cloud service providers shall use their customers’ information solely for the purposes of
providing services to the customer, unless otherwise explicitly agreed.
Comments: Certain types of data flowing through cloud provider systems are extremely valuable
(e.g., personal information of users) and there may be some temptation to use or exploit this data.
Customers will expect that their cloud providers acknowledge that the customers are the sole
owners of that data relative to the providers, and that the data should only be used to provide
services to the cloud customer. In fact, this was one of the key requirements of the City of Los
Angeles when it agreed to use Google cloud services. If service providers are going to use data
beyond the purpose of providing services, notice to their customers should be provided. Service
providers that engage in such activities risk hurting their customers’ relationships with their own
clients and customers, and risk rendering their customers in violation of their own privacy
policies or data privacy laws.
Article VI – Response to Legal Process
Cloud service providers shall provide notice (within hours, not days) of the service of any subpoena or
other legal process seeking customer data, and shall assist and cooperate with their customers in
responding to such legal process.
Comments: The ability of a cloud customer to understand when the government is seeking its
data is crucial for managing legal risk. If a cloud service provider sits on a subpoena or other
legal process, it could harm the target customer and hamper its ability to adequately respond to
such a request and develop legal positions. Cloud service providers should consider developing a
process for promptly dealing with these requests and providing notice to their customers. In the
cloud context, with data potentially distributed across multiple geographically distant data
centers, developing an efficient process and information flow may be challenging.
Article VII – Data Retention and Access
Cloud service providers shall reveal their data search, retention, and destruction practices to their
cloud customers; and shall develop and enable data search, retention, and destruction capabilities in
order to allow their customers to implement their own data retention programs, efficiently effectuate
litigation holds, and locate, collect, and preserve relevant data, including metadata; cloud service
providers shall build in processes and controls that allow for the efficient authentication of data (e.g.,
accurate time-stamping, metadata, chain-of-custody indicators, etc.).
Comments: Most sophisticated organizations have data retention policies and procedures in
place for executing a litigation hold and preserving data. Implementing these policies and
procedures internally can be a challenge, and that challenge is magnified significantly in a cloud
environment where the customer must rely on a third party, and the flow of data is very fluid
(and where data may be intertwined with data of multiple cloud customers). In an environment
where proper eDiscovery and electronic evidence practices can make or break a lawsuit, the
search, retention, and preservation capabilities of a cloud provider are very important. Cloud
customers will be seeking to ensure that their own internal policies can be followed in their cloud
provider’s environment. This requires transparency on the front end and technologies that enable
the efficient identification, collection, and preservation of data. On the back-end, service
providers will be expected to cooperate with and assist their customers with obtaining electronic
evidence and responding to electronic discovery requests. As discussed with respect to Article
VIII, this may be tricky in the cloud context.
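As one concrete illustration of the authentication controls Article VII calls for, a collection process might record a cryptographic hash alongside a UTC timestamp for each item it preserves. This is only a sketch of the general technique, not any particular provider's implementation; the record fields and the `custody_record` helper are hypothetical:

```python
# Hypothetical sketch: a minimal chain-of-custody record combining a
# SHA-256 hash (data integrity) with a UTC timestamp and a collector ID.
import hashlib
from datetime import datetime, timezone

def custody_record(data: bytes, collector: str) -> dict:
    """Return a record that lets the data's integrity be re-verified later."""
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "collected_utc": datetime.now(timezone.utc).isoformat(),
        "collector": collector,
    }

evidence = b"customer invoice #1042"
record = custody_record(evidence, "ediscovery-agent-01")

# Later, anyone holding the record can confirm the preserved copy is intact:
assert record["sha256"] == hashlib.sha256(evidence).hexdigest()
```

A production system would also need a trusted time source and tamper-evident storage for the records themselves, which is precisely why the Article asks providers to build these controls in rather than bolt them on after litigation begins.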
Article VIII – Incident Response
In the event a cloud provider suffers a security breach, cloud providers shall provide prompt notice of
the security breach to their affected cloud customers; shall coordinate, cooperate, and assist their
customers with the investigation, containment, and mitigation of the breach; and shall allow their
cloud customers to conduct their own forensic assessment and investigation of the security breach.
Comments: Similar to issues around litigation holds and data preservation, cooperation and
coordination is crucial when a cloud service provider suffers a security breach. Again, it is the
service provider’s customers whose business will suffer due to a breach, especially if procedures
are not in place for the containment and mitigation of a breach. This again requires service
providers to provide transparency as to their internal incident response processes so that cloud
customers can ensure that their own internal incident response policies match up. Also of
significance is the ability of cloud customers to access their service provider’s facilities and
systems in order to conduct their own forensic security assessment of a security breach. This is
important not only for data preservation, but also for substantive defense issues. Cloud customers
need to be able to conduct such assessments to determine what went wrong, whether any laws
may have been violated, the defenses that may be available to the company, and who was
responsible for the breach. On the latter question, in some cases it may be the service provider
who was at fault, which makes getting access an interesting proposition. Moreover, the multi-
tenancy nature of cloud computing poses challenges: some cloud providers claim that
independent forensic assessment is not possible because it could expose the data of the
provider’s other customers and potentially result in a violation of a non-disclosure agreement.
Needless to say this is a very tricky issue, and finding the right balance is a challenge.
Article IX – Indemnification and Limits of Liability
Cloud service providers shall engage their customers in meaningful discussions and negotiations
around indemnification and limitations of liability arising from security breaches, including
consideration of exceptions to limits of liability for security breaches suffered by the cloud service provider.
Comments: From the customer perspective, the cloud customer is ceding control of some of its most precious assets: its ability to provide its goods or services, and its data. When a customer suffers a breach
internally, its incentives are to mitigate the breach and potential adverse consequences to the
organization. In the cloud context the service provider’s interests may not be aligned with those
goals (in fact, to the extent the service provider was at fault, its interests may run counter to its
customers’). Service providers may choose to put their own interests first. Also, to
the extent a breach involves multiple cloud customers, cloud service providers may also favor
the interest of particular customers over others. This lack of control and reliance on the providers
justifies serious consideration of indemnification clauses, consequential damages disclaimers,
and limitations of liabilities. In some cases, service providers may provide higher limits of
liability (or even no limits of liability) for confidentiality breaches or security breaches.
When it comes to information security, companies considering the Cloud should proceed
carefully. Unfortunately, cloud service providers do not always make it easy to fully understand
and manage information security risk in the Cloud. Cloud customers should take steps to
understand the security of potential cloud providers, including security assessments and proper due diligence.
In addition, the contract terms governing security are also of significant importance. These terms
ultimately dictate the security-related and compliance risks that the cloud customer takes on.
Increasingly security professionals will need to have a better understanding of these contract
terms and the level of risk associated with the Cloud. Security professionals will be looked on to
educate their organization’s legal counsel in order to enable them to negotiate these terms in a
manner that sets the risk at an appropriate level.
This Bill of Rights should be seen as a starting point to gain that understanding and provide some
preliminary tools to begin addressing these contract issues.
About the Author
David Navetta, Esq., CIPP, is one of the founding partners of the Information Law Group
(www.infolawgroup.com). David has practiced law for over twelve years, focusing on technology, privacy, information security, and intellectual property law, and currently serves as a co-chair of
the American Bar Association’s Information Security Committee. He has spoken and written
frequently concerning technology, privacy, and data security legal issues and can be reached at
The Security Professional and the Legal Environment
By Mike Ahmadi – ISSA member, Silicon Valley, USA Chapter
The security practitioner must develop an understanding of the legal environment as most
(if not all) proactive security decisions an organization makes are driven by the law – more
precisely, a dire need to prevent the law from bleeding funds from the organization’s coffers.
I am not a lawyer, nor do I play one on TV. I am a security professional. As a security
professional, most of my interest with respect to the law has arisen within the last decade. This
was not initiated because I felt that understanding the law would help me be a better security
professional (although I do believe it does help me today). The reason I became more interested
in understanding the law is because those who pay me to do my job are keenly interested in the
law. In fact, I would surmise that most (if not all) proactive security decisions an organization
makes are driven by the law - more precisely, a dire need to prevent the law from bleeding funds
from the organization’s coffers.
Security is part of a risk mitigation strategy in all organizations today. At the very least,
physical security (e.g., a lock on a door) is part of their security strategy. Organizations that
have felt the pain of security-related incidents generally react by implementing more robust
security. This is usually not because the law requires more robust security (although it can), but
because the organization simply wants to put an end to something that is depleting the bottom
line. I say “usually” because some organizations are indeed more proactive…but only to a point.
Proactive security measures are notoriously difficult to justify from a return on investment (ROI)
perspective at stockholder meetings. If security measures are implemented and in return for the
measures nothing bad happens, then perhaps it is a job well done. Stockholders do not always see
it this way. Sometimes they simply see it as a waste of money. There was nothing wrong with
respect to security when the project started, and there is nothing wrong after it is put in place.
The only net change is that the cost of doing business went up.
Do you see the picture?
This is where the legal side of things gets interesting. The legal industry is well versed in the
concepts of “due diligence” and “due care.” Some people mistakenly believe they are the same,
but (despite being related) they are not. Due diligence refers to actions/activities of discovery.
With respect to security, it means that the guy in charge (someone at the C-level) has engaged in
activities to determine what potential vulnerabilities exist, what threats may possibly exploit the
vulnerabilities, what is the likelihood of occurrence, and how much will it cost the company if it
happens. This activity is generally referred to as a Business Impact Analysis, which is a critical
component of business continuity planning. Anyone who has studied for the CISSP is indeed familiar with this. Going through this exercise is certainly indicative of some level of due diligence.
…but what comes next?
What does the prudent C-level manager do with this information?
This is where the concept of due care comes into play. After reviewing the potential risks, it is
time to make a decision about what the organization is going to do about it. Do you accept the
risk, or do you implement safeguards to mitigate the risk? From a legal perspective, one thing
that you do not want to do is ignore the risk (especially not after you are made aware of it),
because any decent lawyer can use that to prove negligence. A key decision maker at an
organization has to show that he or she is doing something to address the security issues. One
popular method is plugging the information into a risk formula. A typical one would be:
Asset Value (a $ amount)
x Exposure Factor (a percentage)
= Single Loss Expectancy
This is often abbreviated AV x EF = SLE.
In other words, this is how much we can expect to lose when one bad thing happens to one asset.
This seems straightforward enough, since asset value can be estimated fairly accurately (in many
cases). Exposure factor may be a bit tricky, but for the sake of this discussion let’s assume it is
100% (in other words, one bad thing destroys the entire asset).
It is this next part where things get a little tricky. In order to determine how much ($) a company should invest in protecting an asset from security issues, the SLE is plugged into the following formula:
SLE x ARO (Annualized Rate of Occurrence)
= ALE (Annualized Loss Expectancy)
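The two formulas above can be sketched in a few lines of code. This is a minimal illustration only; the dollar figures, exposure factor, and ARO are invented for the example:

```python
# Minimal sketch of the quantitative risk formulas above
# (AV x EF = SLE, SLE x ARO = ALE). All figures are illustrative.

def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """SLE: expected loss from a single occurrence of the bad thing."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """ALE: expected loss per year, given the annualized rate of occurrence."""
    return sle * aro

# A $2,000,000 asset, fully destroyed by one incident (EF = 100%),
# expected to occur once every four years (ARO = 0.25):
sle = single_loss_expectancy(2_000_000, 1.0)   # $2,000,000
ale = annualized_loss_expectancy(sle, 0.25)    # $500,000 per year
print(f"SLE = ${sle:,.0f}; ALE = ${ale:,.0f}")

# And if the bad thing has never happened (ARO = 0), the ALE is $0:
print(annualized_loss_expectancy(sle, 0))
```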
This may also seem straightforward enough, but let’s talk about the ARO for a moment.
In the world of finance, the ARO is a fairly easy number to come up with. Banks get ripped off
all the time. They will not tell you they do, but they do…to the tune of millions of dollars every
month. They have plenty of data and can easily come up with an ARO that can lead to an ALE
that gives them a pretty good idea of how much they need to spend – and this is important – to
avoid appearing negligent. Losing a few million to hackers is bad. Losing many more millions to
people who sue you for negligence on top of that is worse…and can be a lot worse if the
plaintiffs have good lawyers. If the ALE is $10 million, and a bank spends $10 million annually
in security countermeasures, then that is considered prudent. In fact, $8 million will probably
suffice, since the remaining residual risk is sometimes considered acceptable (and prudent from a legal perspective).
…but what about situations where the security issues are only theoretical? What if the bad
things have not yet happened? What is the ARO then?
Did you guess 0? Well, then you are correct.
When you plug the 0 into the equation, the ALE conveniently ends up being $0 as well. So that
means that a company needs to spend how much to mitigate the risk, and not appear negligent?
From a due diligence perspective the job is done. From a due care perspective, all that really has
to happen is spending a little to keep an eye on things. From a legal perspective, in the event of a
security mishap, the plaintiff would have to prove negligence. In the absence of regulations, the
risk formula argument certainly looks good to the defense, especially since it is commonly used
in industries that face risks on a regular basis.
Fortunately, the world has not simply accepted the woeful inadequacy of this approach. As
technology has continued to explode, we have become painfully aware of the inevitable security
issues that seem to tag along with every new technological breakthrough and have taken some
steps (albeit very small steps) to pro-actively deal with the issues. We have tried voluntary
compliance with some arbitrary (yet often well thought out) guidelines in some industries. That
does not ever work out as well as we would like it to…if at all. Typically the principals who promote self-regulated compliance start out very gung-ho about the idea, but the enthusiasm fizzles out after a short time. When finances get tight, as they always do at some point, security
compliance is treated as a discretionary activity. This eventually leads to some level of
mandatory compliance through a regulatory body with some legal teeth, and that is when the
legal profession gets involved again. Lawyers love regulations because regulations need to be
interpreted and those who are regulated generally do not want to spend any more than they must
to be compliant. Spending money on a legal professional who, ostensibly, will help guide you
towards compliance is generally considered wiser than simply throwing money at a problem with
hopes that you are covered. Lawyers write regulations, and lawyers are who you want
interpreting them, so get the checkbook out.
So where does the security professional fit into all of this? In the case of organizations that have
determined the ALE based upon the risk formula, it is fairly obvious. Security professionals are
brought in to help determine the threats and vulnerabilities, and then to recommend
countermeasures. This does not necessarily involve much interaction with lawyers (at least not in
my experience), since the inputs are not generally legally focused. Under the regulated scenario,
however, it is all about the lawyers, and lawyers do not like to be caught by surprise, or ask
questions they do not know the answer to, or go into any situation knowing less than anyone else
in the room. Knowledge is power, and a lack of knowledge is a weakness.
Approximately a year ago I became involved with the American Bar Association Information
Security Committee (ABA ISC). This group is composed of lawyers and security professionals
who work together to gain a better understanding of security and the law. Lawyers do not like
going into a courtroom with a perceived understanding of a technological security issue and then
have to face an expert witness with a very comprehensive understanding of security, who will
generally have no problem making a lawyer look like a fool…if the lawyer is not prepared. This
is why we now have lawyers who specialize in Information Security Law. When I first heard of
this group, I was under the impression that they were not going to be very well informed about
security, but this is certainly not the case. The lawyers I have met are all quite knowledgeable,
and are constantly working to gain more knowledge. As a security professional, I have gleaned
quite a bit of knowledge about information security and the law by listening to what they have to
say as well.
As someone who has taken the time to understand both environments (security and legal) over
the last year, I have had to rethink a lot of what I once considered firm maxims about some
concepts. One of these is the concept of ownership, and specifically the ownership of data. If I
own $100,000, and it is stored in a bank account, who owns that data? If I withdraw the money
in cash and take it home with me, I certainly assume I own that money. If I buy something with
it, I clearly own what I buy. And if I destroy what I buy? As the owner of the purchased item I am certainly within my rights to do so. So what if I choose to destroy the information a bank holds
saying I have the $100,000 to begin with? Is that my right as the owner of the $100,000, or does
the bank own that information? If they own it, can they destroy it? Do we mutually own it? If so,
does a mutual agreement to destroy the information make it okay to destroy it? What if the IRS
wants the information?
Why is this important to security professionals?
Well, because as security professionals we deal with data on a regular basis. We create it,
manage it, protect it, archive it, and destroy it. We are also tasked with
determining who else has engaged in these activities. During the course of any given day in any
business environment this goes on in a continuous manner. The choices a security professional
makes with respect to these activities can (and do) have a profound effect on the legal issues an
organization may have to deal with.
…and this is only one example
Besides the questions of information ownership and custodianship, we also have the issues of
privacy, conflict of interest, ethical vs. non-ethical penetration testing, information disclosure,
awareness, activity monitoring…the list goes on and on. Like the issue of information
ownership, the answers are not “black and white” in nature. A security professional who does not
bother to achieve, at the very least, a basic understanding of the legal implications of his or her
actions is certainly engaging in extremely insecure activities from a personal perspective, and
may potentially create a deluge of legal liabilities for his or her client/employer. This holds true
for both an environment where security is regulated and one where it is not regulated, because in
either case the environment is subject to regulations with respect to information and the systems
and people who interact with it.
In the last year I have spent quite a bit of time working with groups that are focused on defining
regulatory standards and requirements for information security. This is not because I particularly
like the idea of more regulations, but because I understand that they are necessary, and most
importantly, because I realize that not understanding how they came about and how they affect
me as a security professional means that I am potentially putting myself and my clients at risk.
As a security professional and an ordinary guy, I am also concerned with ensuring that I am secure before anyone else.
Knowledge is power.
Legal knowledge is legal power.
About the Author
Mike Ahmadi is Vice President of Operations and co-owner of GraniteKey LLC, which is a
consulting firm specializing in embedded systems security, secure mobile application
development, security regulatory consulting, and security business development. Mike is
currently serving on the Privacy and Security Advisory Board Security Steering Committee for
the California Office of Health Information Integrity (CalOHII PSAB). Mike is also serving on the NIST AMI and NIST Testing and Certification sub-groups under the NIST Cyber
Security Working Groups (CSWG), as well as the Department of Homeland Security Industrial
Control Systems Joint Working Group (DHS ICSJWG), and the UCAIUG OpenSG working
groups. He may be reached at firstname.lastname@example.org.
Intersection of Law and Information Security
By Steven W. Teppler – ISSA member, Tampa Bay, USA Chapter
Data security laws and regulations have inaugurated a new era of increased potential
liability affecting operations of domestic and multinational organizations operating in the
United States, underscoring the critical role of data security professionals as stakeholders
in enterprise success.
Recently enacted data security laws and regulations or enhancements to existing data protection
and security laws have inaugurated a new era of increased potential liability affecting operations
of both domestic enterprises and multinational organizations operating in the United States.
Existing laws such as Sarbanes-Oxley and HIPAA, when taken together with recently enacted
Dodd-Frank and HITECH Act laws and regulations, expose the enterprise to significant
financial, reputational, and even criminal liability. Litigation focusing on cloud computing and
social networking privacy and data breach issues is on the rise. Throw into the mix the onslaught of electronic discovery obligations, and the critical role of data security professionals as stakeholders in enterprise success is only underscored.
The Sarbanes-Oxley Act (“SOX”) was enacted into law on July 30, 2002, largely as the result of financial scandals at corporations such as Enron and at the now-defunct accounting firm Arthur Andersen, and it ushered in an age of intensified scrutiny of corporate governance. In pertinent
part, SOX imposes disclosure and certification requirements on senior management of covered
publicly traded companies, and, when certain requirements are met, SOX’s reach extends to
international as well as domestic enterprises. These certifications are designed to ensure the
accuracy of a covered entity’s public filings, and create heightened responsibilities for attorneys
and accountants who work with these entities. SOX violations carry both financial and criminal penalties.
Two of SOX’s sections directly implicate data security, and should be of continuing concern to
data security professionals. Section 302 and the regulations promulgated thereunder play a
central role in how a company’s senior management certifies the accuracy and fairness of
their periodic public filings. That section also imposes on senior management the requirement
that they acknowledge their responsibility for establishing and maintaining their enterprise’s
“internal controls.” Section 404 provides generally that the annual report filed by a covered
public company contains an “internal controls” report. Both sections implicate data security practices and procedures, and together they form the basis for
compelling data security stakeholders to be ever more vigilant in architecting, implementing, and
maintaining defensibly robust data security policies and processes.
This increasing convergence of governance and information security issues is underscored in the regulations promulgated under SOX, which implicate key data security concerns of access and integrity. Those regulations define “internal control over financial reporting” as:
A process designed by, or under the supervision of, the registrant’s principal executive and
principal financial officers, or persons performing similar functions, and effected by the
registrant’s board of directors, management and other personnel, to provide reasonable
assurance regarding the reliability of financial reporting and the preparation of financial
statements for external purposes in accordance with generally accepted accounting principles
and includes those policies and procedures that:
• Pertain to the maintenance of records that in reasonable detail accurately and fairly
reflect the transactions and dispositions of the assets of the registrant
• Provide reasonable assurance that transactions are recorded as necessary to permit
preparation of financial statements in accordance with generally accepted accounting principles, and that receipts and expenditures of the registrant are being made only in accordance with authorizations of management and directors of the registrant
• Provide reasonable assurance regarding prevention or timely detection of unauthorized acquisition, use or disposition of the registrant’s assets that could have a material effect on the financial statements.
Sarbanes-Oxley Act of 2002, Pub. L. No. 107-204, 116 Stat. 745 (codified as amended in scattered sections of 15 and 18 U.S.C.).
The ubiquity of digital accounting records in public companies covered by SOX continues to
present significant compliance challenges. Governments and courts have long recognized that internal controls requirements subsume issues relating to information security. Internal controls problems can be caused by “material weaknesses.” One example of such a material weakness is insider access to enterprise environmental controls, which can include unnecessarily heightened levels of administrator access to critical enterprise data infrastructure. Put very simply, such heightened administrator access can include control over the time source for data generation in the enterprise. Control over time in a covered entity’s data-generating environment was at the heart of the hundreds of SOX-related charges for backdating of stock option grants. Although criminal convictions for SOX violations are rare to date, the financial toll for non-compliance can run into the tens of millions of dollars, and may cumulatively total in the hundreds of millions.
But wait, there’s more.
The Dodd-Frank Wall Street Reform and Consumer Protection Act (H.R. 4173) was signed into
law July 21, 2010 and introduced new and extensive financial industry reforms. These reforms significantly enhance the whistle-blower provisions already provided by SOX. The Dodd-Frank Act creates a private right of action (think class action, or securities class action) for violations of the whistle-blower provisions of SOX. The statute of limitations for these actions runs six years from the date the violation occurred or six years from the date the whistle-blowing employee knew or should have known of the violations asserted, with a maximum of ten years after the date on which the violation occurred. While the objective of Dodd-Frank is laudable
(financial industry reform and transparency), the collateral financial and reputational damage
even to a blameless covered organization can be enormous, and compels additional involvement
by enterprise data security stakeholders who must now enhance, modify, or improve information
security policies and processes to address two areas of concern: practices by insiders who are engaging in activities that violate either SOX or Dodd-Frank, and, perhaps more important, where no issues of SOX or Dodd-Frank violations are present, the deployment of heightened data-generating-environment monitors to prevent unauthorized information access and retrieval that can result in millions of dollars of potentially frivolous litigation.
17 C.F.R. §§ 240.13a-14, 15d-15 (2005) (for Section 302); Final Rule: Management’s Report on Internal Control Over Financial Reporting and Certification of Disclosure in Exchange Act Periodic Reports, 68 Fed. Reg. 36,636, 36,642 (June 18, 2003) (for Section 404).
See U.S. Treasury, Treasury Directive 87-05, Electronic Commerce Initiatives (Apr. 21, 2001).
For example, former Brocade CEO Gregory Reyes’ conviction in connection with backdating was reversed by the Ninth Circuit in 2009. See http://www.law.com/jsp/article.jsp?id=1202433151691.
Research In Motion (RIM) was fined $62.5 million for backdating stock option grants; see http://www.nytimes.com/2009/02/06/technology/companies/06rim.html. Former Comverse CEO Jacob Alexander agreed to settle backdating charges with the SEC for $53 million; see http://www.sec.gov/news/press/2010/2010-232.htm.
Data breaches and privacy
Data breaches are increasingly costly. DataLossDB56 reports that 27% of the data breaches it tracks
involve stolen computers. Stolen laptops alone account for the single largest source of data
breaches – 20%. An additional 10% of breaches involve lost or stolen drives or other storage
media. A Ponemon Institute study57 released in 2010 concludes that costs associated with data
breaches went up in 2009, with an average cost per incident of $6.7 million. The Ponemon study
further reports that “direct costs soared by 20 percent due to increased legal defense costs.” The
study attributes these costs to “increasing fears of successful class actions resulting from
customer, consumer, or employee data loss.” Although privacy-related claims have had limited
success to date,58 both federal and state legislatures are starting to put teeth into data protection
laws and regulations that bring with them more and more specific requirements for data security,
and make real, and immediate, the legal risks of inadequate data security.
Moreover, and in distinction to the more unified approach in Europe, privacy regulation in the
U.S. is a patchwork of state and domain-specific federal laws. The Gramm-Leach-Bliley Act of 1999 (GLBA), for example, regulates personal information held by federally regulated financial
institutions. The Health Insurance Portability and Accountability Act of 1996 (HIPAA), and now
the Health Information Technology for Economic and Clinical Health (HITECH) Act, regulate
health information. Forty-six U.S. states, plus the District of Columbia, Puerto Rico, and the U.S.
Virgin Islands have data breach notification (DBN) laws, in addition to other data- or domain-
specific privacy statutes. This patchwork of state and federal laws and regulations adds a generous serving of compliance complexity for data security professionals at both domestic and multinational organizations.
These ever more specific laws and regulations governing data security provide clearer standards
for data protection practices, which will in turn provide guidance to organizations’ risk
management practices. However, this specificity also provides well-defined targets for lawsuits
or other enforcement actions that include specific allegations of practices and technology that
have fallen short. The thoughtful legal counsel, risk manager, or information security
professional will understand that in terms of legal risk, the compliance or effectiveness of a data
protection program is a snapshot-in-time measurement taken after the fact of a security incident,
at which time some part of the program is ipso facto proven inadequate. Information security
professionals are now stakeholders in the legal compliance universe, and will need to understand how this joinder of law and technology will affect the development, deployment, auditing, and assessment of enterprise information security infrastructure.
It should be noted that significant costs associated with data breaches (i.e., badly architected, implemented, or maintained information security eco-systems) are not limited to HIPAA/HITECH violations. The Hannaford/TJX complex of data breach issues resulted in combined damages nearing one billion dollars. See In re TJX Companies Retail Sec. Breach Litigation, 524 F.Supp.2d 83 (D. Mass. 2007); In re Hannaford Bros. Co. Customer Data Security Breach Litigation, 613 F.Supp.2d 108 (D. Me. 2009); and In re Heartland Payment Systems, Inc., 626 F.Supp.2d 1336 (U.S.Jud.Pan.Mult.Lit. 2009).
Press Release: http://www.ponemon.org/news-2/23; study available from http://www.encryptionreports.com/index.html.
See, e.g., Hammond v. The Bank of New York Mellon Corp., 2010 WL 2643307 (S.D.N.Y. 2010).
One area that remains murky to both technologists and lawyers is the requirement in various statutes and regulations that certain information be encrypted. Although the various data security laws and regulations speak to encrypted vs. unencrypted information, not one addresses either provability or defensibility of encryption deployments. In other words, merely having encryption capability will not be enough to avoid potential legal liability. Any assertion that data was encrypted must be testable, on a reliable and periodic basis, or it will be laid open to legal challenge and successful attack. With potentially enormous financial and reputational stakes at risk, the issue of provability is more than likely to rear its head in litigation. The enterprise’s challenge before a court or government tribunal will be to make a defensible showing to a judge or jury (or government investigative agency) that data was provably encrypted at the time of the breach. Meeting that challenge necessitates demonstrating the existence of adequate auditable controls (tests) over the encryption. Such auditable controls include knowing who can remove encryption and knowing if and when encryption has been enabled or removed.
Once again, information security professionals will need to understand how this joinder of law
and technology will affect the development, deployment, auditing, and assessment of enterprise
information security infrastructure.
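The auditable controls described above (knowing who can remove encryption, and if and when it was enabled or removed) can be sketched in code. The following Python example is purely illustrative: the class, volume names, and field names are the editor's hypothetical inventions, not drawn from any statute, regulation, or product. It records encryption state changes in a hash-chained log so an after-the-fact audit can both reconstruct a data store's recorded encryption state and detect tampering with the log itself.

```python
import hashlib
import json
from datetime import datetime, timezone


class EncryptionAuditLog:
    """Hash-chained log of encryption state changes for a data store.

    Each entry commits to the previous entry's hash, so any after-the-fact
    edit to an earlier entry breaks the chain and is detectable in an audit.
    """

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, volume, actor, action, when=None):
        """Record who enabled/removed encryption on which volume, and when."""
        assert action in ("encryption_enabled", "encryption_removed")
        entry = {
            "volume": volume,
            "actor": actor,
            "action": action,
            "time": (when or datetime.now(timezone.utc)).isoformat(),
            "prev": self._last_hash,
        }
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._last_hash = digest

    def verify_chain(self):
        """Re-derive every hash; True only if no entry was altered."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

    def state_at(self, volume):
        """Latest recorded encryption state for a volume (point-in-time proof)."""
        for entry in reversed(self.entries):
            if entry["volume"] == volume:
                return entry["action"]
        return None
```

In an audit, a verifiable chain plus the latest recorded state is the kind of testable evidence the passage above contemplates; a broken chain is itself a finding.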
HIPAA and HITECH
Title II of the Health Insurance Portability and Accountability Act (HIPAA) of 1996 (P.L.104-
191) includes “administrative simplification” provisions and charges the U.S. Department of Health and Human Services (HHS) with adopting rules for the privacy and security of personal
health information (PHI). The Health Information Technology for Economic and Clinical Health
(HITECH) Act, passed in 2009 as part of a major economic stimulus bill, includes “improved”
privacy and security provisions for health information.
In general, HIPAA requires HIPAA-covered entities to maintain “reasonable and appropriate
administrative, technical, and physical safeguards”59 to protect PHI. Pursuant to HIPAA, HHS
has promulgated rules60 which require encryption both in storage and transmission of PHI.
Encryption is defined in the rules as “an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key.”
The HITECH Act adds considerably stronger incentives to adopt encryption. It includes, among
other things, a notification requirement following the breach of “unsecured” PHI. PHI is
“unsecured” when it has not been rendered “unusable, unreadable, or indecipherable to
unauthorized individuals.”61 In other words, “unsecured” means “unencrypted,” and where a data
breach includes the loss or disclosure of covered unencrypted information, liability for
HIPAA/HITECH may ensue. For the enterprise, ultimate responsibility for liability avoidance
(i.e., architecting, deploying, maintaining, and upgrading HIPAA/HITECH-compliant data generation, storage, and transmission environments) will fall upon the shoulders of enterprise information security stakeholders.
42 U.S.C. § 1320d-2(d)(2).
45 C.F.R. 164.
74 Fed. Reg. 42,740, 42,741.
The HITECH enhancements to HIPAA significantly expand the legal (read criminal and
financial) liability for violations of HIPAA/HITECH Privacy and Security Rules. In addition to
expanding the universe of entities covered by HIPAA to extend to “business associates,”
HITECH also creates a new right of action (with a right to seek monetary penalties) that allows a state to bring civil actions on behalf of its residents who face or have experienced adverse effects as a result of a HIPAA/HITECH violation by a covered entity. The State of Connecticut commenced the first lawsuit under the new HITECH Act in early 2010.62
Electronic discovery and information security
The introduction of the recently amended Federal Rules of Civil Procedure (addressing in pertinent part the discovery – or acquisition by a party to a lawsuit – of relevant electronically stored information, referred to as “ESI”63) adds additional complexity to information governance and legal compliance efforts.64 Layered into information security stakeholder compliance and
governance concerns about accurate information generation are increasingly strict obligations to
defensibly preserve, search, collect, and produce ESI for litigation in general.
These new rules impose certain obligations on parties to a lawsuit that directly implicate
information security professionals. For example, Fed.R.Civ.P. Rule 26 requires that the lawyers
for the parties to a lawsuit “meet-and-confer” to discuss discovery (evidence requests and
acquisition) issues that include ESI. These “meet-and-confer” sessions include technology
representatives from the litigating parties. These representatives are expected to have thorough
knowledge of the party’s computing environment, which includes location of data silos,
identification of data custodians, business continuity, data recovery, access, and integrity issues.
Where disputes about ESI acquisition or integrity arise, the testifying witness may well be the
enterprise information security officer, or an individual who works with that officer. More
recently, the U.S. Court of Appeals for the Seventh Circuit adopted an Electronic Discovery Pilot
Program for cases arising in federal district courts in the states of Illinois, Wisconsin, and
Indiana.65 After such “meet-and-confers” the parties will serve discovery requests, which
typically include requests for ESI in native format, and will also inquire as to the type of data
custodianship and integrity protection mechanisms utilized by that party. Parties to litigation are
also required to preserve certain ESI relevant to the litigation, and this duty to preserve may be
triggered well in advance of litigation. Generally, and in the absence of a good faith operation of
a document retention policy, permitting ESI to be destroyed (i.e., failing to preserve ESI with
integrity intact when a duty to preserve has been triggered) may result in sanctions that can be
costly.66 These sanctions for failure to preserve ESI (also known as “spoliation” of evidence) can range from monetary penalties to dismissal or default (also appropriately termed ‘terminating’ or ‘outcome dispositive’ sanctions).67
For more information about the Federal Rules of Civil Procedure, and a link to the rules, see
Many states, including New York and California, have enacted electronic discovery amendments to rules of civil procedure that are either identical or substantially similar to the Federal Rules. See,
See, e.g., Zubulake v. UBS Warburg, 220 F.R.D. 212 (S.D.N.Y. 2003), and Pension Committee of University of Montreal Pension Plan v. Banc of America Securities, 685 F.Supp.2d 456 (S.D.N.Y. 2010).
The frequency with which law and data security meet (or collide) at the crossroads is on the rise.
As a result of new (or enhanced) laws and regulations and with increasing frequency, legal issues
involving information security are moving to the forefront of enterprise risk management.
Information security and information security professionals are (and really always have been)
key players in enterprise liability mitigation efforts, and this involvement is expected to increase.
About the Author
Steven W. Teppler is a partner at Edelson McGuire, LLC, leading the firm’s electronic discovery
and digital evidence litigation and risk management activities, Co-Chair of the eDiscovery and
Digital Evidence Committee of the A.B.A., chair of the Florida Bar Professional Ethics
Committee (2010-2011), and a member of the Sedona Conference WG1. He may be reached at
Information Loss in Enterprise Networks: Mini-Botnets
By Areej Al-Bataineh and Gregory White – ISSA member, Alamo, USA Chapter
This article sheds light on the substantial threat of information loss to enterprise networks
including government, academia, and industry from mini-botnets and provides
recommendations for possible countermeasures.
Cybercrime has been facilitated by the use of botnets because they offer anonymous, distributed,
and automated means for cyberattacks. Mini-botnets are the next generation of botnets with a
smaller size and targeted nature. They are mainly used to steal sensitive information that has high
financial or political value, and their effectiveness depends primarily on being quiet, stealthy,
and better hidden. The malware used for these types of botnets is generated automatically by
specialized software, which enables customization of the malware according to the
cybercriminal’s needs and desires. This software is often created using do-it-yourself (DIY)
software kits such as the infamous Zeus. This article sheds light on this substantial threat to
enterprise networks including government, academia, and industry from mini-botnets and
provides recommendations for possible countermeasures.
Cybercrime has increased exponentially in recent years due to the availability of botnets, which
are networks of compromised machines. Botnets are generated by distributing malware via email
or social networking sites. Once the victim machine is infected, the bot communicates with the
master server to provide an update on its status and to receive commands. General purpose
botnets have been very effective in launching attacks such as spamming, distributed denial of
service, and most recently identity theft.
The latest generation of botnets is called the mini-botnet.68 These botnets have fewer members
compared to the general purpose ones and are often built to target a specific organization or
individual. They mostly specialize in stealing personal information, banking credentials, user
names and passwords, and corporate data. According to Damballa, a company specializing in
protecting companies from botnet attacks, mini-botnets accounted for 57% of 600 different
botnets found within global enterprises during a period of three months in 2009.69
The technology of mini-botnets is advancing while defense mechanisms are falling behind.
Cybercriminals use do-it-yourself (DIY) software kits from which a mini-botnet can be created
and maintained. One of the most popular botnet creation kits among cybercriminals is called
Zeus. Some estimates suggest that Zbots (Zeus malware) have infected 3.6 million hosts in the U.S.
Other and newer DIY kits emerged following the success of Zeus, such as SpyEye and Butterfly.
SpyEye is considered the successor or competitor of Zeus. It has similar functionalities but is
able to look for Zeus samples and disable them. Butterfly was used to generate the Mariposa
botnet in which 50 of the world’s Fortune 100 companies were targeted along with hundreds of
government agencies, financial institutions, universities, and corporations worldwide.71
The term botnet refers to a network of compromised machines (bots) that are under the command
and control (C&C) of a single owner (master). The malware used for a particular botnet is often
used as the name of the botnet, such as the infamous Storm and Conficker. This is the case for large general-purpose botnets, whose size can number in the millions of bots distributed worldwide.
Mini-botnets, on the other hand, are a collection of autonomous, customized botnets under the
C&C of different owners, although they use the same family of malware. In addition, these
botnets are often targeted toward a particular government or enterprise network.
           General botnets                  Mini-botnets
Owner      Same group or individual         Different groups or individuals
Purpose    General (Spam, DDoS, etc.)       Specific (Information stealing)
Target     General (any vulnerable host     Specific (vulnerable hosts
           on the world wide net)           within an enterprise network)
Examples   Storm, Conficker                 Zeus, SpyEye
K. J. Higgins, “Attack Of The Mini-Botnets,” 2009 –
E. Wu and G. Ollmann, “My Bots Are Not Yours! A case study of 600+ real-world living botnets,” in Virus Bulletin, Geneva, 2009.
Damballa, “Sizing a botnet – You’re doing it wrong!,” 2009 – http://blog.damballa.com/?p=326.
Defence Intelligence, “Half of Fortune 100 Companies Compromised by New Information Stealing Trojan,” 2009 –
In order for the malware to be used by different masters to create different independent botnets, it has to be
customized such that the address of the C&C server refers to the one owned by a particular master. This has been
achieved by crimeware DIY kits. With these kits a criminal with minimal technical expertise can customize his malware according to his needs. Often these kits come with two programs: one for creating the malware itself and another for running the C&C server. The criminal in this case has to generate the malware, distribute it, and set up a server with a domain name – all of which is relatively easy to accomplish.
The distribution of malware can be done in several ways. The most frequently used medium is
email through attachments including media files, PDF files, and Microsoft documents. The
second is through link spam on social networking sites and forums. The criminal has to entice
users into clicking on these links to download the malware. A newer method of infection is
through compromising legitimate websites, so when visited, the users’ computers become
infected. In the case of targeted attacks, the criminal typically has a list of email addresses and
websites of individuals and corporate agencies. This list can be obtained by social engineering or
through insiders that cooperate with the criminals to provide the needed information.
Once a machine becomes infected inside the network, the criminal has the power to navigate
through the network and look for other targets. Depending on the capabilities of the malware
used, the criminal may instruct the malware to steal information and exfiltrate it out of the
network to a collection server, often called the drop zone. The value of the information gathered
from victim machines depends primarily on the user or the owner of the machine; those of top
executives and managers provide more valuable information. Figure 1 illustrates the process of
creating a mini-botnet within a network.
The types of information being sought vary according to the criminal’s intent. Online banking
credentials are top value since the criminal can use them to withdraw money from the victims’
accounts. Monetary gain from these types of attacks is very high, but the operational effort is not
trivial. To avoid prosecution, the criminal has to set up a network of money “mules” to export
money abroad. For instance, the criminal group arrested in the multi-national investigation called
“Trident Breach” attempted to steal US$220 million but was detected and succeeded in stealing only US$70 million from victims around the world.72 Over 150 charges or arrests were
made across the U.S., UK, and Eastern Europe in relation to this crime. Some of those arrested
were responsible for spreading the malware and others served as mules for stolen Zeus funds.
Proprietary information is also a target, as seen in the attacks targeting Google and other
companies in what is called “Operation Aurora.” In the case of Google, the computer of a Google employee who had administrative privileges was compromised. The criminals were able to steal
Gmail account credentials and other proprietary information. Some researchers speculate that the
malware used in this attack was not generated by a crimeware kit but rather the botnet operators
built many mini-botnets using different families of malware developed by different authors.73
FBI Press Release, “International Cooperation Disrupts Multi-Country Cyber Theft Ring,” 2010 –
Damballa, “The Command Structure of the Aurora Botnet,” 2010 – http://www.damballa.com/research/aurora.
Zeus DIY malware kit
Zeus, the name given to a DIY kit for the creation and management of information stealing mini-
botnets, appeared in 2007 and since then has grown to be one of the top banking Trojans in the
wild.74 Since its appearance it has gone through numerous changes and improvements and
remains one of the most effective pieces of crimeware criminals are using. It was developed by
Eastern European organized criminals and entered the underground market as a commodity. The
price of the latest version of this kit is relatively high (US$4000 to US$10,000); nonetheless, it
has proliferated because older versions can be obtained free.
Zeus consists of two applications: a binary generation module (builder) and a webserver
application (control panel). The builder generates binary samples that are detected as Zbot by
most AV vendors. The control panel runs on the C&C server and provides easy administration
for the botnet, such as viewing the status of bots. In addition, it allows the master to issue commands to individual bots such as “view screenshots” or even “kill OS.” The other part of
the control panel maintains a database of the data stolen from victim machines.
Mini-botnets and cybercrimes linked to Zeus
Zeus allows criminals to create and manage their mini-botnets easily. They can target a specific
organization or network, or a collection of networks. In the second case, criminals create the
malware and try to spread it using spam email, forums, or social networking sites. The most
prevalent way of spreading Zeus is “drive-by-downloads,” in which merely visiting a website,
malicious or legitimate, causes the victim machine to be infected by the malware. This results in
a mini-botnet that is scattered around the globe and not limited to a specific network. While these
botnets are often called Zeus, some were labeled using different names and drew media attention.
The Kneber botnet is an example of a Zeus mini-botnet that infected 75,000 systems in 2,500
organizations around the world.75 Kneber targeted 374 U.S.-based firms, including government
agencies, and stole login credentials for online social networks, email accounts, and online
financial services of the infected victims. In terms of government agencies, US-CERT did not
comment on this particular botnet but reported that Zeus is the top financial Trojan infecting
government networks (41%).76
A recent report by the antivirus vendor AVG investigated another Zeus botnet called “Mumba”77
and found that it was responsible for 60GB of stolen data such as accounting details for social
networking sites, banking accounts, credit card numbers, and intercepted emails. It infected
around 55,000 computers, half of which are in the U.K. and Germany. Such massive infections easily attract media attention, but the actual danger lies in those that go undetected and draw little attention.
Symantec, “Zeus: King of the Bots,” 2010 –
NetWitness, “Kneber Botnet Targets Corporate Networks and Credentials,” Feb 2010 –
B. Bain, “Massive botnet may have snared some agency systems,” in Federal Computer Week, Feb 2010.
AVG AntiVirus, “The “Mumba” Botnet Disclosed,” retrieved Feb 2010 – http://avg.typepad.com/files/revised-mumba-botnet-
Data exfiltration and credential stealing can lead to massive monetary losses. According to Brian Krebs,78 who followed actual money-loss cases related to Zeus, some firms went bankrupt as a result. For example, a dental practice in Missouri lost $205,000; an IT firm in New
Hampshire lost $100,000; an insurance firm in Michigan lost $150,000; and a New York firm
filed for bankruptcy after losing $164,000. More recently, as mentioned earlier, the FBI and
international law enforcement agencies discovered a ring of criminals that used Zeus to steal
around $70 million from victims’ bank accounts.
Information stealing capabilities
Though tempting, comparing Zbot to a mere Trojan or spyware is naïve because its information stealing capabilities far exceed those of any keylogger or spyware. Zbot79 can steal account credentials
of FTP and POP and even those stored in the Windows Protected Storage. It can also steal client-
side X.509 PKI certificates and HTTP and Flash cookies. It can intercept HTTP forms to steal
data from them and modify or redirect webpages for that purpose. It can take screen shots of the
victim machine as well as search and upload files from it. It can download and execute arbitrary
programs including other types of malware and modify the hosts file enabling the redirection of
even DNS queries.
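Defenders can at least check for the hosts-file tampering mentioned above. The following is a minimal, hypothetical Python sketch (the monitored domains and the notion of a "benign" loopback mapping are the editor's illustrative assumptions, not part of any Zbot analysis) that flags hosts-file entries redirecting sensitive domains to non-loopback addresses.

```python
import re

# Purely illustrative: domains an enterprise might monitor.
MONITORED_DOMAINS = {"bank.example.com", "mail.example.com"}

# Loopback/null entries are normal; anything else mapping a monitored
# domain is suspicious.
BENIGN_IPS = {"127.0.0.1", "::1", "0.0.0.0"}


def suspicious_hosts_entries(hosts_text):
    """Return (ip, hostname) pairs that redirect monitored domains."""
    findings = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        parts = re.split(r"\s+", line)
        ip, names = parts[0], parts[1:]
        for name in names:
            if name.lower() in MONITORED_DOMAINS and ip not in BENIGN_IPS:
                findings.append((ip, name.lower()))
    return findings
```

Run periodically against `/etc/hosts` or `%SystemRoot%\System32\drivers\etc\hosts`, such a check catches only this one redirection trick, but it is cheap and signature-free.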
What makes Zbot especially dangerous is its “man-in-the-browser” ability. When a user opens a
browser to log into his bank website, for example, Zbot will grab all values filled in the form. It
will inject additional form fields into the page to lure the user into entering more information, such as a Social Security number or ATM PIN. Some Zeus variants also
contain a feature called “Jabber Zeus,” which will relay the victim’s credentials to the criminals
in real-time via Instant Messenger (IM). This real-time interception allows them to login into the
account while the user is using it and wire money to third parties.
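One server-side heuristic against this kind of form-field injection is to flag submissions containing fields the genuine form never asks for. The sketch below is a hypothetical illustration: the field names and the assumption that the real login form's field set is known are the editor's, not a description of any bank's actual defenses.

```python
# Fields the genuine login form actually contains (illustrative).
EXPECTED_LOGIN_FIELDS = {"username", "password", "csrf_token"}

# Fields a man-in-the-browser inject commonly phishes for; their presence
# in a submission is a red flag, since the real form never requests them.
NEVER_REQUESTED = {"ssn", "atm_pin", "card_number", "mothers_maiden_name"}


def check_login_submission(form_fields):
    """Classify a submitted field set: 'ok', 'unexpected', or 'likely_mitb'."""
    extras = set(form_fields) - EXPECTED_LOGIN_FIELDS
    if extras & NEVER_REQUESTED:
        return "likely_mitb"
    if extras:
        return "unexpected"
    return "ok"
```

A "likely_mitb" result does not prove infection, but it is a strong signal that the page the user saw was not the page the server sent.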
The security vendor Kaspersky published a report80 about the number of Zeus variants
discovered each month since 2007. As shown in Figure 2, Zeus was insignificant before October 2007, but a spike occurred in January 2008, with 546 different samples detected that month. This marked the initial use of the Zeus toolkit by criminals. It continued to grow in popularity
in the underground market until May 2009 when it reached the highest number of variants thus
far. In February 2010 the Kneber botnet was discovered, which represented a massive Zeus
infection. Another security vendor, FireEye, published a similar report in mid 2010 on the top
malware families infecting their customers’ networks and ranked Zeus in third place.81
On the defensive side, a group of security researchers founded a website called “Zeus Tracker”
to track Zeus C&C servers around the world and to provide domain- and IP-blocklists. In
addition to the lists, the properties of each C&C server are provided such as online/offline, AS
number, country, and level. This website is an essential tool in the fight against Zeus, and it is
actively updated with new information about the spread of this mini-bot.
SecureWorks, “ZeuS Banking Trojan Reports,” March 2010 – http://www.secureworks.com/research/threats/zeus.
D. Tarakanov, “ZeuS on the Hunt,” 2010 – http://www.securelist.com/en/analysis/204792107.
FireEye, “World’s Top Malware,” 2010 – http://blog.fireeye.com/research/2010/07/worlds_top_modern_malware.html.
Other DIY Malware Kits
The success of Zeus opened new venues for other similar toolkits like SpyEye and Butterfly.
Version 1.0 of SpyEye appeared in late December 2009 and has been continuously updated since
then.82 It is similar to Zeus, having a binary builder module that will create the Trojan and a
control panel application for the C&C server. The information stealing capabilities of SpyEye are
comparable to those of Zeus, in addition to being able to spy on Zeus itself if it has also infected
the system. It reports on Zeus activity including the C&C communication and the stolen
information stored in Zeus logs. More interestingly, it has a feature called “Kill Zeus” that
deletes Zeus from the infected system or disables it, rendering it idle.83
The Butterfly kit (some refer to it as “Mariposa,” the Spanish word for butterfly) surfaced in 2009 and
gained attention when the Mariposa botnet was discovered to have infected 12 million systems
worldwide.84 This botnet was extremely dangerous not only because of the massive infection
numbers but because, in addition to information stealing capabilities, it had a DDoS attack
capability. Fortunately it was partially taken down in December 2009 and three months later the
author of the kit was arrested in the Spanish province of Vizcaya.85 Mariposa is only one of the
many botnets that were created using this kit. Researchers from Panda Security revealed that this kit was used to generate more than 10,000 distinct variants of the malware (some AV vendors identify it as Palevo) and to create over 700 different mini-botnets.86
Conclusions and recommendations
Mini-botnets pose a great threat to the privacy of individuals and the security of networks. When a bot infects a machine, the botmaster effectively becomes its owner and is capable of doing anything the legitimate user can do. One or more infections within an enterprise network punch a hole in the security layers, leaking personal information as well as sensitive corporate data. Creating and managing a mini-botnet is no longer limited to experienced attackers; it is now available to novice cybercriminals using DIY software kits. This is causing a widespread proliferation of mini-botnets, resulting in huge financial losses to enterprises and individuals.
One way to minimize the impact of this threat is to apply the following recommendations and
guidelines on different levels.
As a user, it is vital to keep anti-virus, operating system, and software patches up to date. For additional protection, it may be useful to conduct online banking and financial transactions on isolated machines that are not used for general Internet use. As general safety measures, the user
should not open suspicious emails or download software from untrusted sources, and should
avoid clicking on links advertised on social networking websites.
Symantec, “SpyEye Bot versus Zeus Bot “ 2010 – http://www.symantec.com/connect/blogs/spyeye-bot-versus-zeus-bot.
Malware Intelligence, “SpyEye Bot: Analysis of a new alternative scenario crimeware,” Feb 2010 –
Defence Intelligence, “Half of Fortune 100 Companies Compromised by New Information Stealing Trojan,” Sep 2009 –
J. Leyden, “Monster botnet held 800,000 people’s details,” in The Register, March 2010 –
Panda Security, “The Mariposa saga goes on,” July 2010 – http://www.pandainsight.com/en/the-mariposa-saga-goes-on.
As a system administrator, traditional layers of defense are crucial, including anti-malware on
workstations, web and email content filtering, least-privilege access for all users, intrusion detection or prevention systems (IDS/IPS), and firewalls where appropriate. Blacklists of
discovered botnet C&C servers, such as the ones provided by Zeus Tracker, can be utilized as
well to prevent currently infected machines from communicating with their servers.
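Putting such a blocklist to work can be sketched briefly. The Python example below is illustrative only: the one-IP-per-line blocklist format and the simplified `src,dst` CSV log format are assumptions, not the actual Zeus Tracker feed or any specific firewall's export format.

```python
import csv
import io
import ipaddress


def load_blocklist(text):
    """Parse a one-IP-per-line blocklist; '#' lines are comments."""
    ips = set()
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            ips.add(ipaddress.ip_address(line))
    return ips


def flag_infected_hosts(log_csv, blocklist):
    """Return internal hosts seen talking to blocklisted C&C addresses.

    Expects CSV rows of src_ip,dst_ip (a simplified firewall-log format).
    """
    infected = set()
    for row in csv.reader(io.StringIO(log_csv)):
        if len(row) < 2:
            continue
        src, dst = row[0].strip(), row[1].strip()
        if ipaddress.ip_address(dst) in blocklist:
            infected.add(src)
    return infected
```

Beyond blocking at the perimeter, replaying recent logs against a freshly updated list in this way surfaces machines that were already infected before the C&C address became known.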
These conventional solutions cannot guarantee maximum protection. For example, the traffic
generated by these botnets is almost always encrypted, preventing traffic content inspection from
being effective. In addition, it is difficult for anti-malware vendors to keep their products up-to-date with the new serial variants generated using the DIY kits. Therefore, alternative solutions
must be sought; for example, security companies must be active in gaining access to the latest versions of the various kits and in building generic, behavior-based detection signatures. Protecting web services is another area of innovation, as web users expect to be able to perform online
transactions securely. This includes a web server’s protection against anomalies and client
protection using trusted environments running on the user’s machine.
And finally, organized collaboration between the security and law enforcement communities,
ISPs, and the owners of networks on an international level is critical and vital in combating this
fast growing threat.
About the Authors
Areej Al-Bataineh is a Ph.D. candidate at the Department of Computer Science, The University
of Texas at San Antonio. Her research includes in-depth analysis of current malicious software
(or malware) and botnets. Areej is currently working on finding effective defense mechanisms
against data stealing botnets by utilizing different anomaly detection methods. She can be
reached at email@example.com.
Dr. Gregory White has been involved in computer and network security since 1986. He is
currently the Director for the Center for Infrastructure Assurance and Security, The University
of Texas at San Antonio where he also serves as an Associate Professor of Computer Science.
His current research interests include efforts in high-speed intrusion detection, infrastructure
protection, botnet detection, and methods to detect and respond to computer security incidents in
states and communities. He can be reached at Greg.White@utsa.edu.
Common ISO 27001 Gaps
By Bil Bragg – ISSA member, UK Chapter
Based on a review of 20 gap audit reports for a variety of organizations, this article should
help your organization if you are considering ISO 27001, or wish to ensure you comply
with best practice
Companies considering getting certified to the international information security standard ISO
27001 often commission a gap audit to find out what they are missing at a high-level. Many of
these gap audits have common areas that are not yet in place, such as reviewing user access
rights and security in supplier agreements. This article should help your organization if you are
considering ISO 27001, or wish to ensure you comply with best practice.
This article is based on a review of 20 gap audit reports for a variety of organizations, including
public sector organizations, global enterprises, financial, manufacturing, and technology
companies. Most organizations have many of the controls in place already, such as security in
Human Resources, password management systems, and physical security controls. However,
these audits show that many of the organizations shared gaps in their information security
controls. This is certainly not an exhaustive list of gaps, but it may help give you an
understanding of the broader requirements of implementing an Information Security
Management System (ISMS).
The references in parentheses refer to “Annex A: Controls and Objectives” in the ISO
27001:2005 standard and the section reference in the code of practice ISO 27002:2005.
Common System Gaps
4.2 – Establishing and managing the ISMS
ISO 27001 has basic structural requirements for an ISMS. These include what you want to have
in your ISMS (the scope) and a risk assessment.
Few organizations had a formal statement of scope (4.2.1(a)) but often had a good idea of what
would be in scope of the ISMS. For larger organizations this is usually a department, service, or
location such as the IT Department or a Data Center; whereas smaller organizations usually
include the whole organization. Where the scope is a part of the organization, it is important to
define it in order to understand where the boundaries are and what is included in and excluded from the ISMS.
The risk assessment is a key part of an effective ISMS (4.2.1(c)-(h)). Many organizations had a
form of risk assessment. However, in most cases it did not meet the specific requirements of the
standard. Generally, existing risk assessments either did not consider assets first, did not consider
all important assets in scope, or did not consider impacts to confidentiality, integrity, and
availability. Often, the risk assessment methodology was not documented, along with the criteria for accepting risks.
It is not surprising that at the gap-audit stage, many ISO 27001-specific requirements are not in
place, but they are worth mentioning. Following on from the risk assessment, management
should approve the proposed residual risks; the organization should implement a risk treatment
plan (4.2.2) and produce a corresponding statement of applicability (4.2.1(j)).
6.0 – Internal ISMS audits
Only one organization had an internal ISMS audit program, and none of the organizations had
undertaken a management review of the ISMS. Many organizations had an internal audit
function that covered IT and some compliance requirements such as Sarbanes Oxley, so less
work would have been needed for those to meet the requirements. Organizations with a Quality
Management System would be able to extend their existing internal audits and management
reviews to cover the requirements of ISO 27001.
A.6 – Organization of information security
A.6.1 Internal organization
Objective: To manage information security within the organization.
An information security committee or forum that meets regularly was not yet in place
(A.6.1.1), particularly in smaller organizations. This is best practice rather than a specific
requirement; however, implementing and running an ISMS is difficult without it. Additionally,
a staff member had not been formally assigned an ISMS manager-type role (A.6.1.2). These
would be key to getting an ISMS up and running. Often organizations have existing regular
management meetings that can be extended to include the ISMS.
A.6.2 External parties
Objective: To maintain the security of the organization’s information and information processing
facilities that are accessed, processed, communicated to, or managed by external parties.
Identification of risks relating to suppliers and customers was generally sporadic or not in place
(A.6.2.1). Many organizations use suppliers with logical or physical access, such as IT support
companies, security guards, and cleaners, or provide systems access to customers. Following on
from this, agreements with these suppliers and clients (A.6.2.3) that have access to important
information assets did not include key provisions such as an information security policy, asset
protection, staff screening and training, access control policy, reporting security incidents,
monitoring and auditing, service continuity arrangements, and use of subcontractors.
Many of the smaller organizations outsourced IT functions, which gave these IT support
companies full access to their information. Although there was a high level of trust for these
companies and individuals supporting them, there had been no formal risk assessment and no
agreement with the expected information security provisions.
A.7 – Asset management
A.7.1 Responsibility for assets
Objective: To achieve and maintain appropriate protection of organizational assets.
The standard requires an inventory of all important assets (A.7.1.1). Many organizations had an
inventory of hardware assets maintained by the Finance department. Some IT departments also
had software inventories through using discovery tools. The standard requires an inventory of
important assets that typically includes non-physical and information assets such as systems,
databases, documentation, services, people, and intangibles such as reputation. These assets
would also be used in the risk assessment.
A.9 – Physical and environmental security
A.9.1 Secure areas
Objective: To prevent unauthorized physical access, damage and interference to the organization’s
premises and information.
Many of the gaps in physical security were specific to the organizations. However, most if not all
should be identified as part of the ISMS risk assessment. Common examples of these were
CCTV cameras that were obscured or not working, and fire doors used as normal doors, which
meant that locks were broken or ineffective, or the fire doors were often wedged open.
A.10 – Communications and operations management
A.10.7 Media handling
Objective: To prevent unauthorized disclosure, modification, removal, or destruction of assets, and
interruption to business activities.
There was a widespread lack of any formal procedures for media handling and media disposal
(A.10.7). This would include use of USB flash memory sticks, external hard drives, DVDs, and
printed media. Often, where organizations issued USB flash memory, there was no requirement
for encryption or restriction on the use of personal USB flash memory.
As expected, smaller organizations tended to have no media handling policy whereas larger
organizations did, but with no procedures that would meet the requirements. For example, one
large organization used a company to destroy hard disks, but this was not formally recognized.
A.10.8 Exchange of information
Objective: To maintain the security of information and software exchanged within an organization
and with any external entity.
Many organizations did not have an information exchange policy (A.10.8.1) for how to send
confidential information over email, for example, whether to send confidential information at all,
or use a specific level of encryption. Related to this, many organizations did not have agreements
with customers or suppliers on how to exchange confidential information (A.10.8.2).
One small company received regular, confidentially classified information from a large financial
institution via email. Despite how hard the company tried, the financial institution was not
willing to agree to send the information encrypted! On the whole, most organizations did tend to
encrypt information that individuals determined as confidential, using ad hoc means of
encryption, rather than based on a company-wide policy.
Objective: To detect unauthorized information processing activities.
Clocks on Microsoft Windows servers and desktops on the internal network were generally
synchronized with a public NTP server (A.10.10.6); however, servers in DMZs, CCTV systems,
and some network devices were often not synchronized. Most organizations did not know if
clocks on servers and network devices not on the internal network were synchronized. Accurate
date and time stamps are important when troubleshooting, and inaccurate clocks can undermine
the credibility of audit logs used as evidence.
One smaller organization had desktops that synchronized with a domain controller, but the
domain controller did not synchronize with an external time source. Two organizations had
CCTV system clocks that were out by over 10 minutes.
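As a sketch, pointing a standalone DMZ server or a domain controller at external time sources is usually only a few lines of configuration. The pool hostnames below are the public NTP pool and stand in for whichever sources your policy approves:

```
# /etc/ntp.conf (sketch) -- synchronize against public pool servers
server 0.pool.ntp.org iburst
server 1.pool.ntp.org iburst

# Windows domain controller equivalent (run as administrator):
#   w32tm /config /manualpeerlist:"0.pool.ntp.org" /syncfromflags:manual /update
#   w32tm /resync
```

Devices that cannot run an NTP client, such as some CCTV systems, at least warrant a documented manual check as part of the regular review.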
A.11 – Access control
A.11.1 Business requirement for access control
Objective: To control access to information.
Most organizations had an access control policy that was inferred for each system through the
way Active Directory was configured, or the way roles within an application were setup
(A.11.1.1). The standard requires a documented access control policy that identifies common
roles for each business application. The access control policy should specify rules ensuring the
concept of least privilege.
All organizations tended to have well-defined Active Directory groups and applications with
well-defined roles with owners (sometimes informal) responsible for access authorization.
Smaller organizations on the whole did not have an access control policy, whereas larger
organizations mostly had a very high-level access control policy without specifying systems or roles.
A.11.2 User access management
Objective: To ensure authorized user access and to prevent unauthorized access to information systems.
Most organizations did not have an effective, regular review of user access rights (A.11.2.4).
Reviews of access rights were usually ad-hoc, and only covered a few systems such as Active
Directory and the core business applications, rather than a formal review across all systems.
Larger organizations were more likely to have a regular review of user accounts for Active
Directory and the main applications. For those that did not have a formal regular review of
access rights, a sampling of different operating systems, databases, and applications showed old
active test accounts, accounts for people who had left, and generic accounts for which the
purpose was unclear.
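A regular review does not have to be elaborate. As a sketch, given a CSV export of accounts and last-login times (an assumed format; most directories and databases can produce something equivalent), flagging anything unused for 90 days is a one-liner:

```shell
# Sketch: flag accounts unused for more than 90 days.
# Input format (assumed): account,last_login_epoch -- sample data only.
cat > /tmp/accounts.csv <<'EOF'
svc_backup,1262304000
jsmith,1293840000
test01,1230768000
EOF

# Anything whose last login predates the 90-day cutoff is stale.
cutoff=$(( $(date +%s) - 90*24*3600 ))
stale=$(awk -F, -v c="$cutoff" '$2 < c {print $1}' /tmp/accounts.csv)
echo "$stale"
```

The harder organizational work is making the export cover all systems, not just Active Directory and the core applications, and having an owner sign off on each flagged account.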
A.11.3 User responsibilities
Objective: To prevent unauthorized user access and compromise or theft of information and
information processing facilities.
Many of the less obvious systems had accounts with very weak or default passwords. These
included network devices, databases such as Microsoft SQL Server and Oracle, physical access
control systems, and local accounts on older servers (A.11.2.3 and A.11.3.1). For example, one
large organization had a physical access control system with a default administrator password,
and another large organization had an SQL Server database with a blank ‘sa’ password.
Although most organizations had clear desk and clear screen policies (A.11.3.3), multiple
breaches of these policies were often observed, most often by screens left unlocked with staff
away from their desks.
A.11.7 Mobile computing and teleworking
Objective: To ensure information security when using mobile computing and teleworking facilities.
Some organizations had effective technical controls for mobile computing and teleworking
(A.11.7.1 and A.11.7.2), such as encrypted hard disks for laptops, encrypted smart phones,
two-factor authentication for VPNs, and endpoint protection. The gap found in the majority of
organizations was the lack of a formal policy for mobile computing and of a policy and
procedures for teleworking. These should cover physical protection, rules and advice for
connections used in public areas, and possible access by friends and family.
As a typical example, one organization had a procedure for assigning laptops and blackberries,
which included an agreement by the staff members that they would look after them. The
organization also enforced some security controls for remote access and hard disk encryption.
However, there was no guidance on how staff should protect information on the assets.
A.12 – Information systems acquisition, development and maintenance
A.12.3 Cryptographic controls
Objective: To protect the confidentiality, authenticity, or integrity of information by cryptographic means.
Many organizations did use cryptography to protect emails, information on removable media,
and laptop hard disks (A.12.3.1). However, there was generally no central policy to ensure a
consistent management approach: appropriate levels of encryption chosen through risk
assessment, and keys and passwords that are protected and recoverable.
Examples of cryptographic controls in use even without a policy were two-factor authentication
for VPNs, a variety of full-disk encryption software products for laptops, PGP encryption for
emails, and e-wallets for password storage.
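For illustration, the kind of ad hoc encryption the audits found often amounted to something like the following. The openssl command is used here purely as an example, and the passphrase is a placeholder; the part a company-wide policy must standardize, namely how keys and passphrases are agreed, protected, and recovered, is exactly what this sketch leaves out.

```shell
# Sketch: encrypt a confidential file before attaching it to an email.
# AES-256-CBC with PBKDF2 key derivation; the passphrase is illustrative.
echo "quarterly figures" > /tmp/report.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in /tmp/report.txt -out /tmp/report.txt.enc -pass pass:example-passphrase

# The recipient decrypts with the same passphrase, exchanged out of band.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in /tmp/report.txt.enc -pass pass:example-passphrase
```

Without a policy, each individual ends up choosing a different tool and a different way of sharing the passphrase, which is the inconsistency the audits flagged.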
Some of the organizations used external companies for outsourced software development
(A.12.5.5). In many cases contracts did not stipulate who had the intellectual property rights of
the code, escrow arrangements in case of dispute or business failure, requirements for quality and
security functionality of the code, or a right to audit the company. In one example, some of the
code was copyrighted to the organization and the rest was copyrighted to the external
company, even though the code was only used by the organization.
Many organizations had effective technical vulnerability management (A.12.6) for Microsoft,
Linux, and database software but did not manage vulnerabilities for some other software in use,
especially on desktops, such as Adobe Reader and Adobe Flash. A typical example: publicized
vulnerabilities in Adobe Reader had not been considered, and a check of the desktop estate
showed many older versions of Adobe Reader with no central configuration management.
A.13 – Information security incident management
A.13.1 Reporting information security events and weaknesses
Objective: To ensure information security events and weaknesses associated with information systems
are communicated in a manner allowing timely corrective action to be taken.
Many organizations did not have a formal procedure for reporting security events (A.13.1.1), nor
a mechanism to ensure that types, volumes, and costs of information security incidents could be
quantified and monitored. For example, a sample of staff members was not clear on what a
security event was and how it would need to be reported.
A.14 – Business continuity management
A.14.1 Information security aspects of business continuity management
Objective: To counteract interruptions to business activities and to protect critical business processes
from the effects of major failures of information systems or disasters and to ensure their timely resumption.
Some organizations did not have a business continuity plan (A.14.1). For those that did, it was
generally a bit dusty. Current business processes should be assessed to determine acceptable
maximum downtime and business continuity plans created to ensure business processes can be
back in place within that time frame, given a variety of scenarios.
Most organizations that had business continuity plans did not test them regularly or with a wide
enough coverage (A.14.1.5). For example, in one case the only test was that backup tapes were
restored to a remote location. A variety of techniques and scenarios should be used to give
assurance that plans will operate in real life.
A.15 – Compliance
A.15.1 Compliance with legal requirements
Objective: To avoid breaches of any law, statutory, regulatory, or contractual obligations, and of any security requirements.
None of the organizations had identified all applicable legislation within the scope of their ISMS
(A.15.1.1), such as data protection legislation and computer misuse laws. Organizations had also
not established a mechanism to ensure they were kept up to date on relevant legislation and regulation.
Organizations share many gaps in their information security management systems. The most
important shared gap is that key staff who would be involved in implementing ISO 27001 had
not yet been given training on what ISO 27001 is and how to implement it.
Although many smaller organizations did not have the policies and procedures that the larger
organizations had, they still had informal practices that met many of the requirements of the
standard that could be formalized. Smaller organizations generally did not have much in the way
of incident management or business continuity management. Due to other compliance
requirements, financial institutions usually had fewer gaps than others.
Organizations may share these common gaps because the significant information security risks
are not obvious until the gaps have been addressed. For example: considering risks to assets
such as applications, staff, and suppliers, not just hardware; making all staff aware of the
information exchange policy, so that a CD or email containing unencrypted personal records is
less likely to be sent; regularly reviewing access rights, clearing up defunct domain
administrator accounts with weak passwords that also allow remote access; and testing business
continuity plans effectively, showing how much they need to be updated.
About the Author
Bil Bragg, ISSA member UK Chapter, is a penetration tester with Dionach Ltd and an ISO
27001 lead auditor with Certification Europe. He may be reached at firstname.lastname@example.org.
Reading Outside the Lines…
The Good, the Bad and the Ugly
In this 1966 classic western, three soldiers of fortune alternately plot to kill or collaborate with
each other in pursuit of $250,000 of buried gold. Ultimately, Clint Eastwood rides off with the
money, leaving one dead and the other stranded in the desert with no horse. Professor Tim Wu
tells a similar tale in his new book about government regulation and control of content and the
telecom industries. Virtually all security professionals deal with the requirements of government
regulations on a daily basis. The Master Switch describes how 20th century information empires
shaped – and were shaped by – regulation with emphasis on the back room dealings of the
outsized personalities that created the empires and the regulations. This book is an exciting read
and will be of interest to anyone who wants to be an active part of the evolving 21st-century information economy.
Today, we all face the likelihood of increased regulation of the Internet. Recently, the FCC
passed new “net neutrality” rules covering cable and telecom providers. In the coming year we
may see new legislation regarding supply chain security and consumer privacy. Will any of these
regulations have their intended effect?
Professor Wu, who coined the term “net neutrality,” advocates a “Separations Principle” as a
guide for the information economy: all parties, including government, should keep their
distance from the others in order to prevent concentration of power. I am reminded of the
standoff scene in the movie. What will lead to acceptance of this principle? One important factor
is individual involvement to help establish community norms and accepted practices. I am not
sure we have that yet, but with the latest FCC ruling, I feel a little more confident that I will be
able to continue to stream classic movies without interference from my ISP.
About the Reviewer
Fred Scholl, PhD, CISSP, CISM, CHP, is a security consultant based in Nashville, Tennessee. A
member of ISSA Middle Tennessee Chapter, he may be reached at email@example.com.
ISSA Honor Roll
Roger Younglove holds a BBA and an MBA in Information Systems. Prior to retirement he held
the following certifications: Certified Information Systems Security Professional (CISSP),
Certified Information Security Manager (CISM), and an Information Assurance Methodology
(IAM). He is internationally published on the subjects of Virtual Private Networking (VPN) and
Public Key Infrastructure (PKI). Roger’s professional security experience includes serving as
the senior consultant for General Motors in the Automotive Network eXchange (ANX) program
office, where he was responsible for the technical implementation of the ANX. He designed the
PKI system for Ford Motor’s Sync implementation and for many other companies. Roger has
provided security consulting for INS, Lucent, AT&T, TEK Systems, Fishnet, and many other
organizations. For the past ten years he has been an active speaker at national and regional security conferences.
In 1999 as a new member of the ISSA, Roger founded the ISSA Motor City Chapter in Detroit,
Michigan. Since that time he has sponsored and mentored new chapters in Lansing, Grand
Rapids, and North Oakland County, Michigan and Toledo, Ohio. He has held offices at both
local and international levels of the ISSA and has continued to be a readily available mentor for
all succeeding chapter officers.
What do you consider to be your most significant accomplishment as an information security professional?
As the senior consultant for General Motors on the Automotive Network
eXchange program, I was responsible for the technical implementation of the ANX. This
position allowed me to be on the concept team for the creation of IPSec, the basis of Virtual
Private Networking. It also allowed me to work with an early implementation of Public Key
Infrastructure. These two technologies have been the basis for my writing and speaking career.
What is the most important issue facing the industry and how would you like to see it addressed?
The implementation of secure cloud computing. Attacks from the outside are handled no
differently than any other network. However, one of the problems that I see is the authorization
and authentication of individuals utilizing their portion of the Cloud. Once inside the Cloud they
have pierced the external security. The only thing protecting the other cloud members is
authorization and authentication of some sort. PKI is the solution: Cloud providers can operate a
Root Certificate Authority and provide issuing Certificate Authorities to each individual cloud member.
What is the biggest challenge currently consuming your time and energy?
Well, it’s been a long and fruitful road and I am now retired. I keep busy with family, fishing,
skiing, yard work, and volunteering with my church.
What would you like to say to your peers?
When security professionals stop learning about technology, they stop being effective. When that
happens, they should leave the field because they give a false sense of security.
– Roger Younglove
ISSA Phoenix Chapter
Chapter of the Year: 100-200 Members
Chapter President, ISSA International Director
Debbie Christofferson has worked for 20 years in information security management in the global
Fortune 500 environment. With nine years on the ISSA Phoenix Chapter leadership team and
current chapter president, Debbie is serving her second 2-year term as an ISSA International
director. She is a columnist, published author, speaker, and workshop leader for CISM certification training.
What do you consider to be your chapter’s most significant accomplishment in serving
your members and your local information security community?
We create excellent value by offering a superior program. Paid chapter members attend regular
events at no cost, and guests, who are always welcome, attend at a very low cost. This draws
high attendance, which in turn draws strong financial support from our sponsors. In fact, we have
a waiting queue of speakers on current topics. We demonstrate consistent value, and we have a
strong board who support each other seamlessly.
We have an excellent facility sponsor and standing location at ITT Technical Institute. This has
been a key to success, our recognizing them as a sponsor and building a positive relationship for
venues. Our members and board greatly appreciate them. They provide door prizes, all catering,
and a welcoming environment. They make it very easy for our board, so we can focus our efforts
elsewhere for increased value.
What is the most important issue facing the industry? Does your chapter have plans to address it?
I see three vital issues facing information security professionals:
1. Staying on top of ever-changing threats and technology
2. Selling security to management at the executive level and understanding their real
business needs at our own level
3. Being seen and hired as more than just technical security administrators
Education helps, but we have a long way to go. Our chapter is hosting a conference again in
2011 to address business challenges; we are looking to form a CISO Round Table formally in
2011 that links to ISSA locally; and we network tirelessly at the local level.
What does it mean to you for your chapter to be selected?
It means we are serving our members and ISSA well. In a time when many associations have
dropped in membership, we have grown. Reviewing award criteria sets a bar that forces you to
look at yourself, the chapter, and the leadership and determine exactly who and what you are,
where you excel, and where there is room to improve. The Phoenix Chapter is good…we can be
What would you like to say to your peers about your experience leading an ISSA chapter
and why ISSA’s mission is meaningful to you?
Getting involved in ISSA has connected me with professionals throughout our industry and
helped me grow in ways I could not have imagined. Involvement at the board level lets you
know these peers better than your colleagues, and even some of your family members! They
cross boundaries of industry and expertise – all at your fingertips. You learn to appreciate
volunteer leadership and how to work most effectively for the good of the team and the mission.
You meet the speakers, vendors, recruiters, sister organizations, and members – up close and
personal. These industry professionals all connect in different ways – it really is a small world
and you increase your pool of experience and connections 50-fold. If you want to get involved
on a deeper level, get involved. You’ll discover new open doors and new opportunities.
The privilege of serving has been mine. I have loved every minute of it. – Debbie Christofferson
By Russ McRee – ISSA member, Puget Sound (Seattle), USA Chapter
Virtualization platform or dedicated physical host for BT4R2
First, Happy New Year.
I hope 2011 will be a good one for you all.
You have likely heard of Metasploit, if not used it, as part of ethical hacking training exercises
or penetration testing engagements. Depending on your background or the availability of
commercial tools in your environment (Core, Canvas, etc.), your comfort with Metasploit likely
varies with the depth of your experience. Armitage is designed to help close some of the
experience or comfort gaps, described by its developer as useful for “non-hackers.”
Raphael Mudge, Armitage’s developer, has “met too many people involved with the defense of
large networks who do not understand hacking and what’s possible today. Some smart people
think if they don’t know how to do something, then it must be difficult, so they’re willing to
assess the risk of it as lower. This is very dangerous in network defense. Armitage exists to make
it easier for non-hackers to understand what today’s tools are capable of.”
Raphael proposed a few use cases for Armitage, first as a learning tool for people who are new
to Metasploit and find themselves struggling with three questions:
1. What can I do?
2. Which exploit do I use?
3. Ok, I compromised that host, now what can I do?
Per question 1: Armitage is logically organized around the vulnerability discovery and
exploitation process. The documentation explains the Metasploit workflow and orients the user
accordingly.
Per question 2: Armitage uses Metasploit’s capabilities to help out where possible. Armitage
recommends exploits to help narrow the number of exploits a user must search through. For
services that have many exploits associated with them, Armitage can run each exploit’s check
command to help the user find the right exploit to use.
Per question 3: Raphael put a lot of effort into making post-exploitation easy to manage
through Armitage. Users can escalate privileges, capture hashes, or take a screenshot with one
click. Armitage also allows users to browse files and interact with a Windows command shell
simultaneously.
A second Armitage use case is as a demonstration tool. Have you ever watched a demonstration
of Metasploit? It can be painful for a non-techie; imagine a lot of gray text scrolling on a black
background. Once the demo is over, it usually requires several slides to explain what happened.
Armitage captures the action in a way anyone comfortable with computers can follow.
According to Raphael, professional penetration testers seeking to replace commercial products
may need to wait a while. Armitage lacks the reporting and auditing features that these tools
provide. Raphael would like to hear your needs before addressing these features. For now, he’s
focused on empowering non-professional penetration testers and making it possible for system
administrators to test their own networks.
During red team exercises, Raphael noticed a few problems that today’s tools do not solve well:
it can be difficult for a red team to coordinate efforts and share sessions and information. His
original goal for Armitage was to make a UI for Metasploit that made team cooperation possible
for these CTF/exercise environments. He’s hit milestone 0 by providing an effective local client
for Metasploit. The next milestone is to make it possible to manage Metasploit remotely as well
as Armitage does locally. Imagine the possibilities for coordinating multiple Metasploit instances
collocated in the cloud. Beyond that, he hopes to implement the team cooperation features.
As for additional Armitage functionality improvements, Raphael would like to see it handle
attacks against web applications as well as it handles attacks against the OS and client-side
applications. Metasploit includes WMAP, but it requires additional development before Armitage can leverage it.
Setting up Armitage
I used the opportunity to test Armitage to also test BackTrack 4 R2: I downloaded the VMware image and installed Armitage with ease. Metasploit 3.5.1-dev is native to BackTrack 4 R2 (run msfupdate to update it to the current version, 3.6.0-dev as I write this), as is MySQL, which makes Armitage setup very simple.
Unpack the Armitage archive, then cd /pentest/exploits/framework3, followed by
./msfrpcd -f -U msf -P test -t Basic. This will fire up the Metasploit RPC
daemon with the user msf, password test, and an SSL listener on the default port 55553. You can
modify this as you see fit. Be sure to start MySQL: /etc/init.d/mysql start. Change
directory back to your Armitage installation and run ./armitage.sh; be sure you check the
Use SSL box when connecting as seen in Figure 1.
Once Armitage is connected and running, define a workspace via Workspaces, then Create. You
scan targets via the Hosts menu; enter the IP addresses of the host(s) you seek to “explore.” You
can opt to run an Nmap scan from Armitage, but it is recommended that you import Nmap
results from a direct client scan rather than a scan called from the Armitage UI. Armitage does
not report results back to the console in real-time, thus leaving you in the dark on scan progress.
That said, the console from which you launched msfrpcd will report Nmap activity to you. You
can also choose to run MSF scans which will launch 19 discovery modules.
Armitage includes extensive import functionality, consuming scan results and host lists from
THC-Amap, Nessus, NeXpose, and Qualys amongst others.
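To see what such an import consumes, consider the XML that Nmap emits with -oX. The sketch below is not Armitage's importer; it is a minimal illustration of the format, using a hand-made sample fragment rather than real scan output.

```python
import xml.etree.ElementTree as ET

# Hand-made fragment in the shape of Nmap's -oX output; illustrative only.
SAMPLE = """<nmaprun>
  <host>
    <address addr="192.168.122.10" addrtype="ipv4"/>
    <ports>
      <port protocol="tcp" portid="445"><state state="open"/></port>
      <port protocol="tcp" portid="3389"><state state="closed"/></port>
    </ports>
  </host>
</nmaprun>"""

def open_ports(xml_text):
    """Return {host_ip: [open TCP/UDP ports]} from Nmap XML output."""
    results = {}
    for host in ET.fromstring(xml_text).iter("host"):
        addr = host.find("address").get("addr")
        results[addr] = [int(p.get("portid"))
                         for p in host.iter("port")
                         if p.find("state").get("state") == "open"]
    return results

print(open_ports(SAMPLE))  # {'192.168.122.10': [445]}
```

Importing a file like this is what populates the Target UI with host icons and gives Find Attacks something to work from.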
My test network (192.168.122.0/24) included a couple of vulnerable Windows Server 2003
virtual machines to exemplify host pivoting where one compromised host can then be used as a
jump-off platform for exploration and further exploitation.
Once your scans are finished, you will be advised to use Attacks, then Find Attacks by port or vulnerability. You'll note icons for host IPs populate in the Target UI. I used Find Attacks by port; once analysis is complete you will be advised of a right-clickable Attack menu attached to the host(s). One note: if you use Find Attacks by vulnerability and no vulnerabilities have been identified, no attack menu will be populated.
(Interview feedback provided by Raphael Mudge.)
As seen in Figure 2, I opted to exploit the Server service vulnerability typically exploited by the Conficker worm, specifically MS08-067. The related Metasploit module "exploits a parsing flaw in the path canonicalization code of NetAPI32.dll through the Server Service." After launching the exploit, Armitage will report the compromised host as a lightning-struck red icon, inclusive of a Meterpreter session with access, interact, explore, pivoting, and MSF scan options, as seen in Figure 3.
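The bug class the module description names, a parsing flaw in path canonicalization, can be illustrated in miniature. The sketch below is purely conceptual (it is not the NetAPI32.dll code, and both function names are mine): collapsing ".." components without an underflow check versus doing so defensively.

```python
# Conceptual illustration of the path-canonicalization bug class; NOT the
# actual NetAPI32.dll logic exploited by MS08-067.
def canonicalize_naive(path):
    """Collapse '..' components with no underflow check (the bug class)."""
    stack = []
    for part in path.split("\\"):
        if part == "..":
            stack.pop()            # underflows on excess '..' components
        elif part not in ("", "."):
            stack.append(part)
    return "\\" + "\\".join(stack)

def canonicalize_safe(path):
    """Same collapse, but excess '..' is discarded at the root."""
    stack = []
    for part in path.split("\\"):
        if part == ".." :
            if stack:
                stack.pop()
        elif part not in ("", "."):
            stack.append(part)
    return "\\" + "\\".join(stack)

print(canonicalize_safe(r"\a\..\..\b"))
# canonicalize_naive(r"\a\..\..\b") raises IndexError: the unchecked case
```

In the real vulnerability the unchecked case corrupts memory rather than raising a tidy exception, which is what makes it exploitable.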
Note that Access options include privilege escalation and hash dumping for later use as part of
pass-the-hash attacks. First set up a pivot by right-clicking your initial compromised host. Select
Meterpreter n, then Pivoting, then Setup to define the subnet to pivot through. In the hierarchical
target view you’ll note a dim green connector arrow after setting up the pivot; the same arrow
will become bright green once achieving a successful compromise of a secondary host.
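Conceptually, the pivot adds a routing rule: traffic destined for the pivot subnet traverses the compromised host's session, while everything else goes direct. A rough sketch of that decision (the subnet mirrors my test network; the session labels are illustrative, not Metasploit's internal representation):

```python
import ipaddress

# Illustrative pivot table: subnet -> session to route through.
PIVOTS = {ipaddress.ip_network("192.168.122.0/24"): "session 1"}

def route_for(target):
    """Return the session traffic to target should traverse, if any."""
    addr = ipaddress.ip_address(target)
    for subnet, session in PIVOTS.items():
        if addr in subnet:
            return session
    return "direct"

print(route_for("192.168.122.15"))  # session 1
print(route_for("10.0.0.5"))        # direct
```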
One can take advantage of the hash dump seen in Figure 3 to attack the secondary host. Right-click the secondary host, select Attack, then smb, then pass the hash. Select one of the hashes grabbed from the initial host and click the Launch button; you can review the hashes prior to passing them via View, then Credentials. Figure 4 shows the Pass the Hash UI and a fully completed pivot session for posterity.
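For context, a hashdump returns pwdump-style lines of the form user:RID:LM hash:NT hash:::, and pass-the-hash replays the NT hash without ever cracking it. A minimal sketch of that format (the sample line uses the well-known empty-password LM and NT hashes, not real captured credentials):

```python
# pwdump-style line as returned by a Meterpreter hashdump; the hashes below
# are the well-known empty-password values, used here purely as sample data.
SAMPLE = ("Administrator:500:aad3b435b51404eeaad3b435b51404ee:"
          "31d6cfe0d16ae931b73c59d7e0c089c0:::")

def parse_hashdump(line):
    """Split a user:RID:LM:NT::: line into its named fields."""
    user, rid, lm, nt = line.split(":")[:4]
    return {"user": user, "rid": int(rid), "lm": lm, "nt": nt}

cred = parse_hashdump(SAMPLE)
print(cred["user"], cred["nt"])
```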
Once you’ve established a Meterpreter session, there are endless possibilities, many of which are
well suited to the above-mentioned demonstration functionality. There's nothing like compromising a system as part of an authorized engagement, then using Armitage to grab a screenshot of the active user's desktop and return it to the Armitage tabs, as seen in Figure 5.
Other host exploration options include running processes, browsing the filesystem, or spawning a
command shell via the Interact menu. You can even run VNC if you wish, not that you couldn't enable RDP with escalated privileges.
The Armitage manual is comprehensive and includes information well beyond what we've discussed here. Read through it before getting started; it clearly explains details that will otherwise become time sinks (why would I read the manual first?).
Armitage delivers exactly as promised. I'm looking forward to continued feature enhancements and heartily suggest you give it a close look as a learning or demonstration tool. I can already see the upside of being able to show clients or senior managers how one of their systems falls due to a woeful patch state or a vulnerable application.
Use Armitage and similar tools carefully; have your "get out of jail free" card in hand at all times.
Cheers…until next month.
—Raphael Mudge, Armitage developer and project lead.
About the Author
Russ McRee, GCIH, GCFA, GPEN, CISSP, is team leader and senior security analyst for
Microsoft’s Online Services Security Incident Management team. As an advocate of a holistic
approach to information security, Russ’ website is holisticinfosec.org. Contact him at