Filed under: Uncategorized
There are a number of ways to justify the cost of an archiving solution. We have just published a new white paper that presents three “before and after” scenarios illustrating how archiving can help to reduce corporate costs. Here is one of the examples we included in the white paper.
End users sometimes delete content that they will need at a future date, such as word processing documents they have taken a considerable amount of time to write, an email with an important communication from a customer, or a presentation. Let’s again assume a 500-person organization in which each employee needs to recover just one document each month. This results in a total of 6,000 documents that need to be recovered each year (500 employees x one document per month x 12 months). We will also conservatively assume that IT requires an average of only 15 minutes to recover each document from a backup tape. Assuming that IT even has the bandwidth to recover all of these documents, IT staff members will spend a total of 1,500 hours annually (6,000 documents x 15 minutes per document) recovering this content. At a fully burdened IT cost of $50 per hour, the total IT cost of document recovery will be $75,000, the equivalent of three-quarters of a full-time IT staff member.
We will now assume that an archiving solution has been configured to allow individual users to access their own content. Assuming that five minutes will be needed to recover a document and that the average employee salary is identical to that of IT staff members ($50 per hour), then the total cost of employees recovering their own documents will be $25,000 annually (6,000 documents x five minutes of recovery per document x $50 per hour). The total annual savings compared to IT recovering the documents will be $50,000. Factor in the cost of the archiving system (average of $20,000 per year) and the cost savings from end-user access to the archive is still a significant $30,000 annually.
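The arithmetic above is easy to sanity-check. Here is a quick back-of-the-envelope sketch using the white paper’s assumptions (headcount, recovery times, hourly rate, and archive cost are all taken from the scenario above; variable names are ours):

```python
# Back-of-the-envelope check of the recovery-cost scenario above.
EMPLOYEES = 500
DOCS_PER_EMPLOYEE_PER_YEAR = 12      # one document recovered per month
HOURLY_RATE = 50.0                   # $/hour, assumed equal for IT staff and employees

docs_per_year = EMPLOYEES * DOCS_PER_EMPLOYEE_PER_YEAR   # 6,000 documents

# Scenario 1: IT restores each document from backup tape (15 minutes each).
it_hours = docs_per_year * 15 / 60                       # 1,500 hours
it_cost = it_hours * HOURLY_RATE                         # $75,000

# Scenario 2: end users recover their own documents from the archive (5 minutes each).
user_hours = docs_per_year * 5 / 60                      # 500 hours
user_cost = user_hours * HOURLY_RATE                     # $25,000

# Net savings after the assumed $20,000/year cost of the archiving system.
ARCHIVE_COST = 20_000
net_savings = it_cost - user_cost - ARCHIVE_COST         # $30,000

print(it_cost, user_cost, net_savings)
```

Running the numbers confirms the figures in the scenario: $75,000 for tape-based recovery by IT, $25,000 for end-user self-service, and $30,000 in net annual savings after paying for the archive.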
While many consider an archiving solution to be a primarily defensive tool – allowing organizations to support eDiscovery or regulatory compliance efforts, for example – it can also be a tool to enhance employee productivity and provide other benefits that can actually pay for the archiving solution in a relatively short period of time.
You can download the white paper here.
Filed under: Uncategorized | Tags: airgap, defense-in-depth, ids, ips, malware, web security
The Web is a dangerous place. A recent Osterman Research survey found that 73% of mid-sized and large organizations have had malware infiltrate their corporate networks through the Web during the previous 12 months. By contrast, malware has successfully infiltrated through email in 59% of organizations and through social media in 17%. Our data is corroborated by Palo Alto Networks’ research that finds 90% of malware attacks come through Web browsers.
What should you do to protect your corporate network from the bad stuff that can be (and probably will be) delivered through your Web browser? The traditional approach is to adopt a defense-in-depth approach of intrusion detection, intrusion prevention, URL filtering, anti-virus, sandboxing and other technologies that will create something of a gauntlet through which bad stuff must pass before reaching users. This works to a great extent, but is by no means a guarantee that all malware will be stopped.
Another approach is offered by Spikes Security, a new company that isolates Web traffic in a centralized server. Instead of trying to detect malware or pass through only “safe” content to Web users, the solution makes the assumption that all content is bad and so passes through nothing. Instead, the AirGap solution converts Web traffic to compressed and optimized pixels that are then delivered to users who view them through a lightweight client that the company claims installs easily, requires no special configuration, and offers good video and audio performance. In essence, Web users are simply viewing a video feed of Web content instead of the actual Web content itself. AirGap provides end-to-end encryption for Web traffic and claims that its proprietary client/server protocol cannot be compromised by malware. Each user session is isolated via a hardware-assisted virtual machine.
Pricing for AirGap ranges from $5.13 to $9.00 per user per month depending on the number of users (sessions) and the length of the software license.
The concept of AirGap is a simple one and should be completely effective at preventing attacks that come through Web browsers. The only downside – and it might be a significant one for some organizations – is that, at this point, Web traffic can be viewed only through the AirGap client; there is no plug-in that lets users keep their existing browsers. While this won’t be a showstopper for most organizations, it could be for some that depend on browser plug-ins for certain Web functionality.
All in all, AirGap is a fairly elegant approach to the increasingly perilous issue of Web-borne malware.
Filed under: Uncategorized
Obviously, information security and risk management are critical issues for any organization, regardless of its size or the industry in which it participates. But maintaining the security of your information and others’ information that you possess, as well as mitigating the risk associated with data breaches, is difficult and getting tougher all the time. This is particularly true in an era in which employees and contractors increasingly use their personal devices and applications to create and store corporate content.
There are some important questions about your organization’s information security status and practices that you should be asking – and that you should be able to answer quickly:
- Do you know how many users in your organization have installed and are using Dropbox, Microsoft OneDrive, Google Drive or a similar solution to store work-related documents? If so, do you know what data they are storing there? If so, does your corporate IT department have ready access to this content if, for example, an employee leaves the company?
- Are some of your employees sexually harassing other employees or sharing ethnic jokes through the corporate email system, instant messaging or social media? If so, can you readily identify these people in real time or near real time and take appropriate steps to ensure that it stops immediately?
- Are any of your employees sending sensitive or confidential information to your competitors?
- When the corporate email system goes down, do your employees use their personal Webmail accounts to continue sending work-related emails? If so, are these emails and their content easily recoverable by your IT department so that they can be scanned and archived in compliance with corporate policies?
- When employees leave the company, is there a formal and reliable process for decommissioning their access to corporate resources, including their access to personally managed repositories that store corporate content?
- Do ex-employees still have access to your corporate systems and/or data assets?
- Do users employ very strong passwords to access corporate resources? Do they change them periodically? Are corporate passwords managed by IT?
- When users need to send files that are larger than can be sent by your corporate email system, do they use a corporate-managed solution to do this?
- Do users encrypt emails when necessary, such as when sending customers’ personal financial information or employees’ protected health information?
- Have employees received formal training about protecting themselves and the organization from phishing or spearphishing attacks? If so, are they tested periodically to determine if the training has been effective?
- Is your organization archiving business records to satisfy eDiscovery, regulatory or other obligations? If so, are you archiving them in email only, or in every venue they might be found, such as instant messaging, social media, Dropbox, Salesforce Chatter, etc.?
- Is the content from employees’ smartphones and tablets – whether company or personally owned – archived on a continuous basis?
These questions are just the tip of the iceberg with respect to the types of questions you need to be asking – and that you should be able to answer quickly and accurately.
Filed under: Uncategorized
In late May 2014, Osterman Research conducted an in-depth survey of 164 organizations and their archiving system migration plans. We surveyed primarily mid-sized and large organizations across a wide range of industries. Key findings from the research include the following:
- The typical archiving solution has been in place for four years and eight months (the median is 36 months).
- There is not a high level of satisfaction with current archiving solutions. For example, only 60% of organizations are “pleased” or “extremely pleased” with their current archiving solution’s ability to place legal holds on content, only 52% are this pleased with the speed of the solution’s search performance, and only 44% are this pleased with the ability to delete content when necessary.
- Moreover, we found significant differences in the level of satisfaction with archiving solutions based on their age. For example, organizations with archiving systems that are more than three years old are nearly twice as likely “not to be pleased at all” with their ability to place legal holds on content (14.5% for older systems vs. 7.6% for more recent systems), the ability to establish different retention policies (16.7% vs. 11.0%), and the scalability of the system (15.2% vs. 11.2%).
- We also discovered a significant difference in the penetration of cloud-based archiving based on the age of the system: organizations with an archiving solution no more than three years old have placed 33.4% of their archived content in the cloud compared to only 13.2% for older solutions.
- Finally, we found that 7.6% of the organizations will “definitely” replace their archiving solution over the next 18 months while another 27.2% will “probably” do so, as shown in Figure 1. Not surprisingly, organizations with older archiving solutions in place are much more likely to definitely or probably replace their archiving solutions during the next 18 months (39.8% vs. 30.1%).
We published a white paper that goes in-depth on archiving migration that you can download here.
Filed under: Uncategorized
There has been substantial press coverage about how recruiters examine job candidates’ social media profiles to gain a bit more insight about prospective employees. While the merits and ethics of doing so are subject to substantial debate, there is evidence to suggest that social media can provide some interesting clues about how vulnerable some people are to phishing scams.
For example, one study sampled 100 students from an undergraduate psychology course at the Polytechnic Institute of New York. These students a) completed a survey focused on their beliefs and habits with regard to online behavior; b) were asked how likely they thought they were to become victims of online crime, such as password theft; and c) completed a personality assessment survey. After completing these activities, the students were sent obvious phishing emails.
One out of six of those tested – most of whom were engineering or science majors – fell for the scam emails. Ignoring the gender differences of those who were most likely to fall for the phishing emails in this study (nope, you’re not getting me into that Vietnam War), the researchers found that those with the most “open” personalities – i.e., those who are most extroverted – were more likely to fall for phishing scams. The findings strongly suggest that people who overshare on Facebook or Twitter, for example, are more likely to become victims of phishing scams and other online fraud than those who are more introverted, share less or don’t have social media accounts at all. Another study found that younger students (aged 18-25) were more likely to fall for phishing scams than their older counterparts.
So why the differences?
- Extroverts tend to be more optimistic overall and so may be less inclined to assume that suspicious emails are being sent to them for nefarious reasons. Introverts, on the other hand, are generally less optimistic and so may be more skeptical of the world around them, including of emails that don’t seem quite right.
- Extroverts may have a perception of upside benefit vs. downside risk that is at odds with the needs of the corporate security model. For example, the ability to gain some perceived benefit by responding to an offer in a phishing email or friending a stranger in social media may overwhelm whatever training users might have received about the risks of these kinds of behaviors.
The issue for corporate security managers is obviously good user training and robust security technology. However, the missing element may end up being the critical need to evaluate those personality types that are most vulnerable to being fooled by phishing scams, malicious social media contacts and the like.
Years ago I worked for Dr. John Ryan, a very bright man who is now a senior manager at Google. He would periodically mention in talks that fighter jets are getting lighter and faster over time, so much so that if you extrapolated their weight and speed far enough into the future, they would eventually weigh nothing and fly infinitely fast. He would then ask what that described…the answer was software.
I attended EMC World this week and came away reminded of that story. One of the key themes at this annual conference was “the third platform” – the growing movement toward lightweight applications and rapid application development focused on the needs of an increasingly mobile workforce and society. Although the first platform (mainframes) and second platform (client/server and Web) are still quite relevant, the third platform, characterized by increasingly rapid development and lighter applications as we migrate toward the Internet of Things, represents the direction that computing is moving, and rather quickly at that.
What are the implications?
- It means very rapid application development that integrates data, analytics and applications in a continuously evolving loop to generate applications and updates to them very quickly, sometimes in just a matter of hours instead of the months or years that traditional software development requires.
- It means zero tolerance for downtime, since applications are updated on the fly instead of the traditional model of bringing down a server, installing the update and then bringing it back up – or worse, having the server or the application break (this point was driven home in one session that showed the healthcare.gov Web site and its downtime message seen and enjoyed by millions). That doesn’t mean that servers won’t ever go down in the third platform, only that the third platform is designed to operate with no scheduled downtime.
- It means that every company becomes a software company (sort of) in the model of Google or Facebook, designing applications for customers to use as an interface to services instead of the traditional customer service model.
- It means that data volumes increase exponentially as large volumes of rich data replace the text-based systems of the first and second platforms.
- It means a continued shift toward massive amounts of CPU power and very cheap storage, all of which is allocated dynamically based on the workloads that need to be addressed at the moment.
I was impressed by EMC’s approach at the conference in a couple of ways. First, the company today derives at least 95% of its business from the second platform. Some companies might wait until they were bleeding profusely before entertaining a shift to a new business model, but EMC seems to be reasonably proactive about shifting its business away from its bread and butter. There’s something to be said for management that can not only read the handwriting on the wall, but also heed its advice before it’s too late. Second, EMC was quite frank about where it has not done a good job. That may have been because the company was talking to an analyst community that would have seen through fluffy platitudes anyway, but I got the impression that there is a new level of frankness on the part of the company’s management – quite refreshing for such a large company.
I was also impressed by EMC’s acquisition of DSSD, a seemingly well-funded, very stealthy, four-year-old startup focused on developing very high-speed flash memory arrays. I don’t know much about the company, and EMC was not overly forthcoming on the specs for its technology, but this certainly bears watching. GigaOM had a good article on DSSD last year that you can view here.
EMC, like all hardware companies, is making a somewhat painful set of transitions: most notably to the third platform and to a cloud-delivery model that often just means customers want to pay less for what they already have. On balance, EMC seems to be making the transition fairly well.