Information Security Awareness, Training and Motivation — Native Intelligence, Inc.


Do you plan for the Mission Impossible attack scenario while ignoring simple threats?

Johnny Long's book, "No Tech Hacking," brings new attention to overlooked aspects of information security. In his book, Long reveals how simple threats can cause serious problems, even in organizations prepared for a Mission Impossible-style attack scenario.

Long recounts how he and his team of ethical hackers consistently access sensitive information with no special equipment or technical skills. In fact, Long reveals how the ordinary (coat hangers, hand towels, drinking straws, baby powder, and aluminum cans) can result in extraordinary breaches of organizational security.

[Image: a mouse observed by a large eye outside its mouse hole. "Protect your information. You never know who's watching."]

You Can Observe a Lot Just by Watching

While Long doesn't use the famous words of Yogi Berra, the message in "No Tech Hacking" is clear: "You can observe a lot just by watching." Long shares real-world stories and cell-phone photographs from his adventures in people-watching, shoulder surfing, dumpster diving, and vehicle observation.

Long and his colleagues go to great, conspicuous lengths to collect non-public information. Their targets should notice almost all of this activity, yet most do not. The closest thing to a consequence or confrontation they encounter is a glare from an airline passenger.

Why isn't Long confronted when others observe him surreptitiously taking pictures? Some people don't like to confront an unfamiliar person or don't know whom to report their concerns to. Others are complacent and don't expect negative events to occur. Action invites risk: the risk of an awkward or unwarranted accusation, of not being taken seriously, and of personal embarrassment. Sometimes, people feel that the safest action is no action at all. Unfortunately, that feeling of security is deceptive.

Johnny, How Should We Stop You?

Thankfully, Long offers useful advice. He recommends that companies:

  1. Provide incentives for reporting suspicious activities, and
  2. Make the desired response well known and easy to do.

To follow these recommendations, organizations need to ensure that everyone knows what information may be disclosed and what information requires protection. Foremost, all organizations should create policies for verifying the identity of anyone who requests non-public information and adequately train all employees to recognize these situations and take appropriate action.

What Else?

Long's success reminds me of other stories told by ethical social engineers. Todd Snapp of Rocket Ready keeps audiences on the edge of their seats when he recounts the time he made a social engineering call to an employee while claiming to be her coworker, Mike. Snapp's team had learned Mike's title and full name in an earlier step of the social engineering process. When Snapp attempted to pass as Mike, his mark was justifiably dubious. After a long pause, Mike's coworker answered, "You don't sound like Mike." As the call progressed, it became clear that this individual knew Mike well. Snapp remained on the line and gracefully ended the call. Snapp and his team expected this call to blow their cover and end their penetration test in a matter of minutes; however, the suspicious employee never reported her concerns. Because of this inaction, Rocket Ready continued their penetration test for two more weeks and obtained all the information they were after.

The skeptical employee's behavior underscores people's tendency to avoid confrontation and not report their suspicions. Sometimes inaction is not willful but the result of a lack of attention. Shoulder surfers gain information by peering into places that they shouldn't; this can be done up close or at a distance with a camera, binoculars, or similar devices. In one chapter of his book, Long describes a banker who failed to notice Long standing outside the banker's office window photographing his office and work papers. The banker had simply "tuned out" his environment. Long also describes security guards who monitor video screens for hours on end. These guards learn to expect nothing to happen and, in turn, may fail to notice when something unusual does occur.

(In e-learning, we call this "attenuation." Brains crave novelty and find monotony boring, but that's a story for another day.)

What If?

Long offers great advice on decreasing risk through awareness — what to look for and what to do about it — and on positive reinforcement by rewarding incident reporting. Unannounced penetration tests by firms like Long's are sobering affairs. Employees who fail to deter or report penetration attempts may feel foolish and suffer embarrassment and guilt. Without adequate training, these employees may not know enough about the risks to respond successfully to these advances. Penetration test reports may be a much-needed eye-opener for senior managers who don't grasp the "people problem" in security. At the same time, the negative connotations around these tests do little to build employee morale or encourage better habits.

A more positive approach includes teaching employees about social engineers and their tactics. Once employees are empowered with this knowledge, engage an ethical hacking team to test them. Inform employees of the social engineering test beforehand, revealing only agreed-upon details such as the duration of the test (e.g., three months). Motivate employees with a point system that rewards individuals or offices that successfully recognize and report social engineering attempts and can explain what aspect of the attempt caught their attention.

[Image: gold cup trophy for catching social engineers]

Reward points to an employee who stops a tailgater. Reward points to an employee who confirms the identity of a support technician and verifies that the technician is authorized to perform the work underway. The first employee to report or respond to a test could receive the most points, while fewer points can be awarded to several other vigilant individuals: go for the gold, but take pride in the silver and bronze, too! Give bonus points to any employee who thwarts an actual social engineering attack that was not part of the penetration test. Real-world tests should lead to real-world results.

Similarly, dock points from employees or offices that fail to protect sensitive information or succumb to a social engineering challenge. Emphasize that the perimeter of information security extends beyond the office: identification badges left exposed outside the office or non-public information discussed over an off-site lunch undermine security efforts. Demerits should be instructive, not punitive. Along with any loss of points, explain how the information was not adequately protected and offer supplemental training resources to the employee(s) in question.

Accrued points can be turned into actual awards either at the end of the test or incrementally when employees cross specific thresholds. Positive reinforcement results in positive actions. Engaging employees in an action-oriented exercise helps counter our reluctance to take action in uncomfortable situations.
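To make these mechanics concrete, here is a minimal sketch of how a security team might track such a point system. It is only an illustration, not anything prescribed by Long or by this article: the Python class name (AwarenessLedger), the point values, and the redemption threshold are all invented for the example.

    from dataclasses import dataclass, field

    # Illustrative values only; tune these to your own program.
    TIERED_AWARDS = [100, 50, 25]   # gold, silver, bronze points per test incident
    REAL_ATTACK_BONUS = 200         # thwarting an actual, non-test attack
    DEMERIT = -25                   # failing a social engineering challenge
    REDEMPTION_THRESHOLD = 250      # points needed to claim a reward

    @dataclass
    class AwarenessLedger:
        """Tracks awareness points per employee or office during the test."""
        scores: dict[str, int] = field(default_factory=dict)
        reporters_per_incident: dict[str, int] = field(default_factory=dict)

        def report_test_incident(self, who: str, incident_id: str, observation: str) -> int:
            """Award tiered points: first reporter gets gold, then silver, then bronze."""
            rank = self.reporters_per_incident.get(incident_id, 0)
            points = TIERED_AWARDS[min(rank, len(TIERED_AWARDS) - 1)]
            self.reporters_per_incident[incident_id] = rank + 1
            self.scores[who] = self.scores.get(who, 0) + points
            print(f"{who}: +{points} points for noticing: {observation}")
            return points

        def thwart_real_attack(self, who: str) -> None:
            """Bonus for stopping a real attack outside the penetration test."""
            self.scores[who] = self.scores.get(who, 0) + REAL_ATTACK_BONUS

        def record_demerit(self, who: str, lesson: str) -> None:
            """Instructive, not punitive: dock points and point to training."""
            self.scores[who] = self.scores.get(who, 0) + DEMERIT
            print(f"{who}: {DEMERIT} points. Suggested review: {lesson}")

        def can_redeem(self, who: str) -> bool:
            """True once a participant crosses the reward threshold."""
            return self.scores.get(who, 0) >= REDEMPTION_THRESHOLD

    ledger = AwarenessLedger()
    ledger.report_test_incident("Dana", "tailgate-03", "stopped a tailgater at the side door")
    ledger.report_test_incident("Lee", "tailgate-03", "reported the same tailgater")
    ledger.record_demerit("Pat", "badges should be removed before leaving the building")
    print(ledger.can_redeem("Dana"))    # False until Dana reaches 250 points

Whether this lives in a script or a spreadsheet matters less than the design choices it encodes: tiered awards reward speed without discouraging later reporters, and every demerit carries a training pointer rather than a bare penalty.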

Implementation details can vary, but positive, experiential security training leads to:

  • Employees who learn to recognize social engineering attacks
  • Employees who learn how best to respond to those attacks
  • Employees who are more aware of and alert to their surroundings
  • Secure behaviors that become habits
  • A greater probability of catching real social engineers and industrial spies

[Image: No Tech Hacking book cover]

Want to make the world a better place? Buy the No Tech Hacking book. Each purchase generates a donation to Hackers for Charity [http://www.hackersforcharity.org] that will feed one African child for an entire month.


Is no tech hacking a concern at your organization?

Let us know!


Article by K Rudolph, CISSP © Native Intelligence, Inc.  All rights reserved.