Osterman Research Blog


What Threats Should You Be Concerned About? (Part 2)
April 1, 2015, 5:22 pm
Filed under: Uncategorized

Last week, I enumerated a list of things that decision makers should be concerned about with regard to potential security holes, focused both on malicious content that might make its way into your organization and on valuable data assets that might make their way out. We continue that list of issues here:

Employee errors: Employees will sometimes inadvertently install malware or compromised code on their computers. This can occur when they download a codec, install ActiveX controls, install various applications intended to address some perceived need (such as a capability that IT does not support or that a user feels they must have), or respond to scareware (rogue or fake anti-virus software). Scareware is a particularly dangerous form of malware because it preys on users who are attempting to do the right thing – to protect their platforms from viruses and other malware. Even users who are quite experienced can be fooled by a well-crafted scareware message.

Malvertising: Malicious Internet advertising is intended to distribute malware through advertising impressions on Web sites. An Online Trust Alliance brief discussed how a single malvertising campaign can generate 100,000 impressions, with approximately 10 billion malvertising impressions occurring in 2013 via more than 200,000 malvertising incidents. Underscoring just how serious the malvertising problem has become, a study by RiskIQ for the period January to September 2013 found that 42% of malvertising was carried out via drive-by exploits that required no interaction by end users, while the remaining 58% involved users clicking on malicious advertisements.

Mobile malware: The growing use of smartphones and tablets, particularly personally owned devices, is increasingly being exploited by cyber criminals. For example, Alcatel-Lucent found that 16 million mobile devices were infected with malware during 2014, an increase of 25% from 2013. This represents an infection rate of 0.68%, meaning that in an organization of 1,000 employees, each of whom has an average of 1.5 mobile devices, roughly 10 of those 1,500 mobile platforms will be infected at any given time. The vast majority of infections impact Android devices – the Alcatel-Lucent research suggests that under 1% of iPhone and BlackBerry devices are infected with malware.
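To make the arithmetic behind that estimate explicit, here is a minimal back-of-the-envelope sketch in Python; the 0.68% infection rate comes from the Alcatel-Lucent figure above, while the 1,000-employee organization and 1.5 devices per employee are the illustrative assumptions used in this paragraph:

    # Rough estimate of infected mobile devices in an illustrative organization.
    EMPLOYEES = 1_000           # illustrative organization size
    DEVICES_PER_EMPLOYEE = 1.5  # assumed average number of mobile devices per employee
    INFECTION_RATE = 0.0068     # 0.68% infection rate reported by Alcatel-Lucent

    devices = EMPLOYEES * DEVICES_PER_EMPLOYEE
    infected = devices * INFECTION_RATE
    print(f"{devices:.0f} devices -> roughly {infected:.0f} infected at any given time")
    # Output: 1500 devices -> roughly 10 infected at any given time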

Mobile copycat applications: Many developers distribute their mobile apps through vendor and third-party stores that offer varying levels of security, much of it inadequate. Some app stores are highly secure operations and require that developers satisfy rigorous standards before their apps can be offered. Others’ standards, however, are less stringent and create the opportunity for serious security risks. The result is that many third-party app stores are susceptible to a number of security and related problems, such as the distribution of copycat apps and malware.

Compromised search engine queries: Valid search engine queries can be hijacked by cybercriminals to distribute malware. This form of attack relies on poisoning search queries, resulting in the display of malware-laden sites during Web searches. Search engine poisoning is particularly effective for highly popular search terms, such as information on celebrities, airline crashes, natural disasters and other “newsy” items.

Botnets: Botnets are the cause of a large number of successful hacking and phishing attacks against many high-profile targets. For example, Sony, Citigroup, the US Senate, Lockheed Martin, the International Monetary Fund, Northrop Grumman, and RSA have all been victimized by botnet attacks. Millions of records have been exposed as a result, leading not only to the disclosure of personal and sensitive information, but also to lawsuits and other expensive remediation efforts.

Hacking: This is a form of specialized cyberattack in which cybercriminals use a number of techniques in an attempt to breach corporate defenses. An example of a successful hacking attack is the recent incursion against Sony Pictures that may have been carried out by an operation of the North Korean government.

Gullible users: Users can represent a major security threat because of a combination of their specific personality types and inadequate training. For example, 100 students from an undergraduate psychology course at the Polytechnic Institute of New York were sampled. These students a) completed a survey focused on their beliefs and habits with regard to online behavior; b) were asked how likely they thought they were to become the victim of an online crime, such as password theft; and c) completed a personality assessment survey. After completing these activities, the students were sent obvious phishing emails.

One out of six of those tested – most of whom were engineering or science majors – fell for the scam emails. Ignoring the gender differences of those who were most likely to fall for the phishing emails in this study, the researchers found that those with the most “open” personalities – i.e., those who are most extroverted – were more likely to fall for phishing scams. The findings strongly suggest that people who overshare on Facebook or Twitter, for example, are more likely to become victims of phishing scams and other online fraud than those who are more introverted, share less or who don’t have social media accounts. Another study found that younger students (aged 18-25) were more likely to fall for phishing scams than their older counterparts.

Ransomware: One of the more common recent examples of ransomware is the CryptoLocker malware that encrypts victims’ files and then demands a ransom to decrypt them. Victims who choose not to pay the ransom within a short period of time are left with permanently encrypted files. CryptoLocker typically extorts a few hundred dollars per incident and is normally delivered through email with a PDF or .zip file disguised as a shipping invoice or some other business document.

We have just published a white paper focused on addressing these issues – you can download it here.



The Importance of Good Authentication and Data Asset Management
December 3, 2014, 8:27 pm
Filed under: Uncategorized

Stories about the use of easy-to-guess passwords based on common words, consecutive numerical strings, or simply the use of “password” are fairly common. Millions of users, in an effort to make their passwords easy to remember, fall prey to this problem, or they will write their passwords down on sticky notes, not change them periodically, or use the same password for multiple applications.

I wanted to see how the strength of a password, by itself, affects how quickly it can be guessed by brute force using a PC, so I went to howsecureismypassword.net. I am not affiliated with the host of this site or its sponsor, and so cannot vouch for the security of any content they manage. So, as a precaution, don’t use any site like this to test your actual passwords.

For the test, I chose five passwords: rabbit, rabbit9, rabbit99, rabbit99K and rabbit99K) – that final closing parenthesis is part of the password. I ran each password through the site’s checker and found the following lengths of time that would be required to guess each one:

  • rabbit: a desktop PC could guess this password more or less instantly
  • rabbit9: 19 seconds
  • rabbit99: 11 minutes
  • rabbit99K: 39 days
  • rabbit99K): 58 years

Obviously, the longer and more complex the password, the longer it will take to guess it through brute force. Yhn-P9q9Km4-9UtQw)7*, for example, would require 425 quintillion years according to howsecureismypassword.net.
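To illustrate why both length and character-set size matter, here is a minimal brute-force estimate in Python. It simply divides the size of the keyspace by an assumed guess rate; the figure of 10 billion guesses per second is an arbitrary assumption for a fast offline attack, not the model used by howsecureismypassword.net, so the absolute numbers will differ from the site’s estimates:

    def brute_force_years(length, charset_size, guesses_per_second=1e10):
        """Worst-case years needed to exhaust the keyspace by brute force."""
        keyspace = charset_size ** length
        return keyspace / guesses_per_second / (365 * 24 * 3600)

    # Approximate character-set sizes: lowercase only, + digits, + uppercase, + symbols.
    examples = [("rabbit", 6, 26), ("rabbit99", 8, 36),
                ("rabbit99K", 9, 62), ("rabbit99K)", 10, 95)]
    for password, length, charset_size in examples:
        years = brute_force_years(length, charset_size)
        print(f"{password:11s} ~{years:.2e} years (worst case)")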

But strong passwords are just part of the security story. Organizations should undertake other steps, as well:

  • Use multi-factor authentication that will require, for example, the entry of a password and a code that a user receives on his or her smartphone.
  • Impose password expiration requirements so that users must create a new password at regular intervals. The more sensitive or critical the data asset or application being accessed, the more frequently IT might want passwords to change.
  • Lock out user accounts that have been inactive for a certain number of days.
  • Implement strict lockout limits for sensitive data assets or applications that allow only a small number of failed authentication attempts (a minimal sketch follows this list).
  • Don’t allow passwords to be reused.
  • Implement self-service password reset functionality, but only if two-factor authentication or similar controls are in place.
  • Employ risk-based authentication that imposes stricter requirements based on the sensitivity of the data assets being accessed, the location of those accessing them, the time of day they are being accessed, etc.
  • Finally, establish policies that define which data assets really need to be accessible online and which can or should be disconnected from the Internet.
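As a concrete illustration of the lockout-limit item above, here is a minimal sketch in Python of a failed-attempt counter. The five-attempt threshold and 30-minute window are illustrative assumptions, not prescriptions, and a production implementation would store this state in a directory service or identity provider rather than in memory:

    import time
    from collections import defaultdict

    MAX_FAILED_ATTEMPTS = 5      # illustrative threshold for sensitive applications
    LOCKOUT_SECONDS = 30 * 60    # illustrative 30-minute lockout window

    _failed_attempts = defaultdict(list)   # username -> timestamps of recent failures

    def record_failed_attempt(username):
        """Record a failed authentication attempt for this user."""
        _failed_attempts[username].append(time.time())

    def is_locked_out(username):
        """True if the user has exceeded the failed-attempt threshold recently."""
        cutoff = time.time() - LOCKOUT_SECONDS
        recent = [t for t in _failed_attempts[username] if t > cutoff]
        _failed_attempts[username] = recent            # prune stale entries
        return len(recent) >= MAX_FAILED_ATTEMPTS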

These are all fairly simple steps that would go a long way toward improving corporate security.



How Have Email and Instant Messaging Use Changed Since 2001?
November 13, 2014, 7:44 pm
Filed under: Uncategorized

Osterman Research has been in business nearly 14 years and has tracked the growth and changes in the use of email, instant messaging and other communication tools since our inception. We have focused continually on how individuals and organizations communicate and collaborate using these tools, and how they plan to do so in the future.

Not surprisingly, current Osterman Research surveys have found that email is the dominant communications and collaboration tool in most organizations, and that it serves as the primary method for transporting files – in fact, 98% of the bits that flow through the typical email system are the files that are attached to emails. Our research reveals that the typical information worker currently spends 167 minutes per business day doing work in their email client or Webmail system, such as sending or receiving email, looking for attachments, managing tasks, searching for contacts, and the like. The typical information worker receives 100 emails on a normal workday and sends 30. Moreover, users currently spend about 30 minutes per day working in an instant messaging system, whether a standalone instant messaging solution or one that is integrated with a collaboration platform.

Bucking the conventional wisdom, numerous Osterman Research surveys have found that email is actually becoming more important to users over time – both in the sheer volume of content sent and received through email, and in the use of email as the starting point for many of the tasks that information workers undertake during the normal course of their workday.

So, how have things changed for email and instant messaging use since 2001? In some ways things have changed dramatically. For example:

  • An Osterman Research survey conducted in December 2001 found that the average employee sent and received a total of just under 16 Internet-based emails on a typical workday. Compared to today’s traffic volume, which averages 130 emails sent and received per day, that represents a dramatic increase in message volume of roughly 730% over the past 13 years. Given that email systems today include much more functionality and integration with other capabilities than they did in 2001, the amount of time spent in email has risen at an even faster pace.
  • While instant messaging is almost universally accepted as a business communications medium today, that was not the case in 2001. For example, a July 2001 Osterman Research survey found that only 21% of organizations were using any form of instant messaging, only 23% of email users employed it, and only 22% of IT organizations supported its use in a workplace context.

In some ways, however, things have not changed all that much:

  • Microsoft Exchange was the market leader in business-grade email in an October 2001 Osterman Research survey, followed by Lotus Notes/Domino – much the same as the market shakes out today – albeit with tools like Lotus cc:Mail and Microsoft Mail still accounting for some market share 13 years ago.
  • While Lotus Sametime was the dominant enterprise-grade instant messaging system in mid-2001, the dominant instant messaging systems in use were consumer-grade tools (AOL Instant Messenger, Microsoft MSN Messenger and Yahoo! Messenger were the “big three” in 2001).

What we have seen in email and instant messaging use since 2001 has been a steady progression in the volume of emails that users send, an increased reliance on the use of email for both communications and file transport, and growing use of instant messaging. Despite the view by some that email and instant messaging use in the workplace will diminish as social media solutions replace them, as well as the notion that younger workers consider email and instant messaging passé, Osterman Research forecasts that email and instant messaging will remain critical tools in a business context for many years to come.



You Don’t Need to Know Everything
October 15, 2014, 12:03 am
Filed under: Uncategorized

Many years ago when I was early in my career, I worked for one of the leading market research companies in the San Francisco Bay Area. Shortly after I joined the company it was acquired by DRI, a small subsidiary of McGraw-Hill. After the acquisition, a DRI employee was transferred into our offices to serve as a liaison between corporate and their new acquisition. He was a very bright guy with an exceptionally quirky sense of humor. Among the various things I learned from him, the most notable was a comment he once made: “You don’t have to know everything, but you do have to know how to find everything”.

The first iteration of what our liaison was describing, in a sense, became Google and other search engines. In theory, at least, there is a collection of all relevant information somewhere in the cloud and all you have to do is type in a few words to gain access to it. In short, you don’t have to know everything, but you at least have the opportunity to find everything. There continue to be shortcomings in modern search engines, driven by incomplete information, intentional biases from prioritizing some information sources over others, the desire of search engine companies to generate revenue from search, and user limitations in not being able to adequately describe exactly what they’re seeking. Moreover, as useful as search engines are, they don’t give us the answer we’re looking for – instead, they give us an enormous number of answers that might – or might not – be right. This makes search engines extraordinarily useful tools, but not really a panacea for finding what we need to know.

What would be useful, then, is a way to do two things: a) have access to every bit of knowledge and information that is possible to have on a subject, and b) get the answer that is most likely to be right in as short a time as possible. That, in essence, is what IBM Watson does. Using natural language processing, “reasoning” capabilities, and voluminous amounts of data, Watson sifts through enormous amounts of data in a manner somewhat akin to a search engine, but it does so using natural language inputs. More importantly, Watson is focused on delivering the answer that is most likely to be correct. It’s not always right, of course, but it has demonstrated the ability to be mostly right – for example, on Jeopardy and in a contest with various members of the US House of Representatives.

Why is Watson important? Simply because it can receive inputs using natural language and process vast quantities of information to come up with an answer in a way that humans might if they had the capacity to sift through hundreds or thousands of terabytes of data in a very short amount of time. There are numerous potential applications in a wide variety of fields like medicine and law, among others. In the communications and collaboration realm, Watson could be used for things like analyzing who in a company is most likely to commit fraud by asking who is being abused verbally by their managers and correlating this with employee sentiment expressed in social media, email, text messages and the like.

In short, Watson could be enormously useful in providing direction for a wide variety of business activities like investigations, early case assessments, eDiscovery, fraud detection and mediation, and a host of related efforts. While we will never know everything, Watson will help us get closer to being able to find everything.



Security 101: Securing the Malware Ingress Points in SMBs
September 18, 2014, 1:01 am
Filed under: Uncategorized

Our research, as well as that of many other firms, has revealed that malware infiltration has impacted most organizations and that the problem is getting worse over time, particularly for small and mid-sized businesses (SMBs). While it is essential that every potential ingress point for malware be monitored, many organizations have holes in their defenses that could allow malware to enter the corporate network. Here are a few areas to address, although the list is by no means exhaustive:

  • Personal Webmail
    Many users employ personal Webmail when they need to send files that exceed the mailbox-size quotas that IT has established for the corporate email system, or when the corporate system goes down. While both are valid reasons for using a personal alternative to continue sending emails, doing so bypasses corporate scanning defenses and can allow malware to sneak onto employees’ computers, such as in a phishing email.
  • Non-business-grade file sync and share
    Tools like Dropbox are widely used by employees so that all of their relevant content can be available from every device they use. These tools are incredibly useful for traveling employees, those who work from home, and those who want their files handy from a mobile device when they’re away from a desktop computer. However, they can also provide an entry point for malware. For example, if a Word or Excel file is edited on an employee’s home computer, becomes infected there, and is then synced via Dropbox to the employee’s work computer, malware can enter the corporate network without ever having been scanned for malicious content.
  • Mobile devices
    Any mobile device – whether supplied by an employer or one owned by an employee – is a potential source of malware infiltration. One of the ways this can occur is when employees download applications that have not been developed with security as a critical design consideration. Another way for data leakage – and malware infiltration – to occur is when employees download copycat apps thinking they are bona fide apps.
  • Web surfing
    The Web has become an essential tool for individuals to do their job – and the primary way that malware infiltrates a corporate network. There are numerous ways that malware can infiltrate an organization through the Web, including browsing to valid but infected sites as in a watering hole attack, through drive-by attacks or via compromised search engine queries.
  • Social media
    Tools like Twitter and Facebook can be used to distribute malware through short URLs or Facebook chat, among other ways. Social media can also be an invaluable intelligence-gathering tool for cybercriminals intent on spearphishing high-profile victims like corporate CFOs.

So what do you do about it? Here are four things:

  • First and foremost, understand what your users are doing, the tools they’re employing and why they are using these tools. Personal Webmail may be used only because of inadequacies in your corporate email system; Dropbox may be used because employees want to be more productive when they’re working after hours.
  • Next, develop policies about the use of personally owned devices, cloud applications and mobile applications. While a policy will not guarantee that a particular cloud service or app will not be downloaded or used, it will reduce the number of these potential malware ingress points available on your network.
  • Train users about what to do and what not to do with regard to things like phishing attempts, mobile apps and cloud applications. Follow this up with regular refresher courses and reminders, and test users to see if they’re really learning anything.
  • Provide useful alternatives to the applications that users need to do their job. This means doing things like replacing consumer-focused file sync and share tools with enterprise-grade alternatives that will enable more secure management of corporate data.

Finally, deploy very good anti-malware defenses from a leading vendor that can support its tools with excellent threat intelligence.



Cost Justifying an Archiving Solution
August 13, 2014, 12:21 am
Filed under: Uncategorized

There are a number of ways to justify the cost of an archiving solution. We have just published a new white paper in which we present three “before and after” scenarios that illustrate how archiving can help to reduce corporate costs. Here is one of the examples we included in the white paper.

End users sometimes delete content that they will need at a future date, such as word processing documents they have taken a considerable amount of time to write, an email with an important communication from a customer, or a presentation. Let’s again assume a 500-person organization in which each employee needs to recover just one document each month. This results in a total of 6,000 documents that need to be recovered each year (500 employees x one document per month x 12 months). We will also conservatively assume that IT requires an average of only 15 minutes to recover each document from a backup tape. Assuming that IT even has the bandwidth to recover all of these documents, IT staff members will spend a total of 1,500 hours annually (6,000 documents x 15 minutes per document) recovering this content. At a fully burdened cost of $50 per hour, the total IT cost of document recovery will be $75,000 – the equivalent of three-quarters of a full-time IT staff member.

We will now assume that an archiving solution has been configured to allow individual users to access their own content. Assuming that five minutes will be needed to recover a document and that the average employee salary is identical to that of IT staff members ($50 per hour), then the total cost of employees recovering their own documents will be $25,000 annually (6,000 documents x five minutes of recovery per document x $50 per hour). The total annual savings compared to IT recovering the documents will be $50,000. Factor in the cost of the archiving system (average of $20,000 per year) and the cost savings from end-user access to the archive is still a significant $30,000 annually.
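To make the before-and-after arithmetic explicit, here is a minimal sketch in Python using the assumptions stated above (500 employees, one recovered document per employee per month, 15 minutes per recovery by IT versus five minutes via self-service access to the archive, $50 per hour for both IT staff and end users, and an average of $20,000 per year for the archiving system):

    EMPLOYEES = 500
    DOCS_PER_EMPLOYEE_PER_MONTH = 1
    HOURLY_COST = 50                # assumed fully burdened cost per hour, IT and end users alike
    ARCHIVE_COST_PER_YEAR = 20_000  # average annual cost of the archiving system

    docs_per_year = EMPLOYEES * DOCS_PER_EMPLOYEE_PER_MONTH * 12   # 6,000 documents

    def annual_recovery_cost(minutes_per_document):
        """Annual labor cost of recovering all documents at the given pace."""
        return docs_per_year * minutes_per_document / 60 * HOURLY_COST

    it_cost = annual_recovery_cost(15)            # $75,000 via IT and backup tape
    self_service_cost = annual_recovery_cost(5)   # $25,000 via end-user archive access
    net_savings = it_cost - self_service_cost - ARCHIVE_COST_PER_YEAR
    print(f"Gross savings: ${it_cost - self_service_cost:,.0f}; net savings: ${net_savings:,.0f}")
    # Output: Gross savings: $50,000; net savings: $30,000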

While many consider an archiving solution to be a primarily defensive tool – allowing organizations to support eDiscovery or regulatory compliance efforts, for example – it can also be a tool to enhance employee productivity and provide other benefits that can actually pay for the archiving solution in a relatively short period of time.

You can download the white paper here.



Preventing Attacks Through the Web
July 31, 2014, 7:55 am
Filed under: Uncategorized

The Web is a dangerous place. A recent Osterman Research survey found that 73% of mid-sized and large organizations have had malware infiltrate their corporate networks through the Web during the previous 12 months. By contrast, malware has successfully infiltrated through email in 59% of organizations and through social media in 17%. Our data is corroborated by Palo Alto Networks’ research that finds 90% of malware attacks come through Web browsers.

What should you do to protect your corporate network from the bad stuff that can be (and probably will be) delivered through your Web browser? The traditional approach is to adopt a defense-in-depth approach of intrusion detection, intrusion prevention, URL filtering, anti-virus, sandboxing and other technologies that will create something of a gauntlet through which bad stuff must pass before reaching users. This works to a great extent, but is by no means a guarantee that all malware will be stopped.

Another approach is offered by Spikes Security, a new company that isolates Web traffic in a centralized server. Instead of trying to detect malware or pass through only “safe” content to Web users, the solution makes the assumption that all content is bad and so passes through nothing. Instead, the AirGap solution converts Web traffic to compressed and optimized pixels that are then delivered to users who view them through a lightweight client that the company claims installs easily, requires no special configuration, and offers good video and audio performance. In essence, Web users are simply viewing a video feed of Web content instead of the actual Web content itself. AirGap provides end-to-end encryption for Web traffic and claims that its proprietary client/server protocol cannot be compromised by malware. Each user session is isolated via a hardware-assisted virtual machine.

Pricing for AirGap ranges from $5.13 to $9.00 per user per month depending on the number of users (sessions) and the length of the software license.

The concept of AirGap is a simple one and should be completely effective at preventing attacks that come through Web browsers. The only downside – and it might be a significant one for some organizations – is that at this point only the AirGap client can be used to view Web traffic, not individual browsers via a plug-in. While this won’t be a showstopper for most organizations, it could be for some that depend on plug-ins for some Web functionality.

All in all, AirGap is a fairly elegant approach to the increasingly perilous issue of Web-borne malware.



