Filed under: Uncategorized | Tags: content, discovery, email, Microsoft, PST, storage
Email contains a wealth of critical business information. Because the typical information worker relies on email to create and manage a large proportion of business content, using and migrating email must be risk-free and non-intrusive to users.
Osterman Research surveys of end users have repeatedly found that the typical corporate email user spends approximately 150 minutes per day working within their email client – sending and receiving messages, searching for content, managing contacts and tasks, and using email as the default information filing system. Moreover, email remains the primary file transport system in most organizations, used to convey important business documents like purchase orders, contracts and proposals – as such, it often becomes a key repository of this content as well.
As a result, email is the most important single source of business content in most organizations.
In Exchange environments, .PST files are commonly employed by end users for a variety of reasons: to store email locally so that mailbox-size quotas are not exceeded, to allow messages to be easily transportable between mail systems, for purposes of email backup, or because users want to maintain a personal archive of corporate information. Microsoft effectively encouraged the use of .PST files by increasing the maximum size of these files tenfold to 20 gigabytes beginning with Outlook 2003.
Because .PST files are used extensively in Exchange environments, they are a significant repository of corporate content and house much of the critical business information to which organizations must have access.
A recent survey we conducted found that 36% of users in the organizations surveyed store email locally in .PST files. Further, we found that the median size of a .PST file in these organizations is 1.3 gigabytes, the equivalent of more than 100,000 email messages. However, some users maintain much larger .PST files – one large professional services firm, for example, maintains more than 4.5 GB of .PST content per user.
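As a quick sanity check, the median figure above implies an average message size of roughly 13 KB. A minimal sketch of the arithmetic (the 13 KB average is our assumption, not a survey finding):

```python
# Estimate how many messages a .PST file of a given size holds,
# assuming an average message size of ~13 KB (headers plus body;
# this average is an assumption, not a survey result).
AVG_MESSAGE_KB = 13

def estimated_messages(pst_size_gb: float, avg_kb: float = AVG_MESSAGE_KB) -> int:
    """Rough count of messages in a .PST of the given size (GB)."""
    return int(pst_size_gb * 1_000_000 / avg_kb)

print(estimated_messages(1.3))   # median PST: ~100,000 messages
print(estimated_messages(4.5))   # heavy users: ~346,000 messages
```

At 13 KB per message, the 1.3 GB median works out to almost exactly the 100,000-message figure cited above.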
Although good .PST management is essential, many organizations are not following best practices in two key areas.
- First, our research found that users store .PST files in a number of disparate locations, including their desktop machines, laptops, local file servers and cloud-based storage systems, among other locations.
- Second, our research found that only 29% of organizations back up local .PST files to a central location, despite 65% or more storing .PST files on laptops or desktops.
The immediate consequence of this highly distributed storage of .PST files is that the business content contained in these files is not accessible to those who need it, such as legal counsel, senior managers, compliance officers or information auditors – or, in many cases, to the individuals who created it.
For more information on our .PST research, please feel free to download a just-published white paper on the topic here.
Filed under: Uncategorized | Tags: analytics, big data, discovery, email, intelligence, investigations, messaging
Because email is used so extensively for communication, collaboration and content management by information workers and the organizations that employ them, it is the primary source of insight into how information flows within a company, and between companies and their business partners. For example, email data stores contain:
- Data on what information workers are doing during working hours: which emails they send and receive, to whom and from whom, which files they exchange, how they responded (or did not respond) to various communications, the tasks they assign to themselves or to others, the appointments they set, where they will be at specific times, the requests they make of others, etc. Moreover, because social media, real-time communications, voicemail and other content types are often integrated with email, email archives often contain a wealth of information on other modes of communication used by employees.
- Information about how they collaborate with fellow employees, customers, business partners and others.
- Information about how employees support internal workflows and key business processes across the organization.
- Information on when employees work.
- Information on how employees work, such as sharing content with others or sending content to their personal accounts.
- Information about whether or not employees are complying with corporate policies, such as appropriate use or data leakage policies.
Clearly, email is the primary source of information about content flows within an organization. Because decision makers rarely have the tools available to extract meaningful data from this rich content source, they lack much of the insight into their organizations that would help them to ask better questions, make better decisions about how to manage their companies, respond more effectively to customers, or satisfy their compliance obligations – just a few examples of how this information might be used. In short, decision makers need three fundamental capabilities:
- Insight about what is being said and transmitted via email; who is generating, receiving and responding to this information and content; and where this information is being sent and from whom it is being received.
- The ability to prioritize investigations based on these content flows.
- The ability to perform triage on email content at the beginning of an investigation in order to minimize the effort and intrusiveness required to complete it.
It is important to note that by “investigations”, we are not referring to the invasion of individuals’ privacy, nor are we talking about monitoring user behavior for the purpose of unreasonable or excessive control. Some may be sensitive to a misapplied notion of monitoring or investigating corporate email, particularly in light of the early June 2013 revelations about US government activities focused on widespread information gathering from email and other sources. What we are discussing here, however, is understanding how information flows through an organization’s email system and how decision makers can use this insight and intelligence more effectively to meet their legal, regulatory and best-practice obligations. The goal of improving insight through the appropriate application of Messaging Intelligence is to enable better decision-making and to understand the context of the organization’s activity without invading privacy.
We have written a white paper that provides more detail on this topic – you can download it here.
Filed under: Uncategorized
Any information-based economy needs sufficient Internet bandwidth in order for its businesses to remain competitive and grow. That doesn’t mean that low Internet speeds equate to zero economic growth, but there is a relationship between the two. For example, one study found that doubling Internet speeds in a large economy increased GDP by 0.3%[i]. Another report found that since 2002, access to the Internet has added $34 billion to the US GDP each year[ii].
Unfortunately, satisfaction with Internet performance and reliability in the US is quite poor. Consider:
- J.D. Power, in a major 2012 survey, found that satisfaction with Internet service providers (ISPs) was between 650 and 725 on a 1,000-point scale, with DSL providers scoring at the low end of the scale and cable Internet providers at 672 (fiber-to-the-home providers scored at the high end, but account for only a small proportion of US Internet customers)[iii]. Scoring between 65.0% and 72.5% is usually a D to C- if you’re in school.
- The American Customer Satisfaction Index found that US ISPs have the lowest customer satisfaction rating of any US industry[iv].
- Internet service prices in the US are generally much higher than in many other countries. For example, a Verizon 500 Mbps Internet connection (with 100 Mbps upload) will cost $299.99 per month, while the same download speed in Amsterdam (but with 500 Mbps upload) offered by KPN will cost 71% less[v].
- Nearly two-thirds of US consumers are subject to a cap on the total amount of Internet bandwidth available to them[vi].
- The US is 35th out of 148 countries in terms of high-speed Internet service according to the World Economic Forum[vii].
So why is Internet service in the US so poor compared to other countries? There are several reasons:
- Local governments and public utilities often impose a variety of barriers that ISPs must overcome before they can deploy infrastructure, typically in the form of fees for attaching to utility poles or crossing various rights-of-way[viii]. That means that only the largest and most well-funded ISPs can even consider deploying better services for consumers and businesses. When regulation and fees are reduced or eliminated, ISPs are motivated to deploy newer, better and faster infrastructure, as has been the case in Kansas City and Austin, for example.
- Infrastructure providers are “natural” monopolies in that an established provider can simply lower prices in order to fend off competition. By reducing their prices, existing providers can make it economically unfeasible for new vendors to enter the market.
- The US federal government has given substantially more lip service to enabling high-speed Internet service than it has to actually doing anything about it.
What are the implications of this?
- Internet bandwidth caps will stifle competition in the delivery of digital entertainment. For example, a 300-gigabyte monthly bandwidth cap will limit DVD-quality movie streaming to 69.7 hours per month or about 2.3 hours per day – Blu-Ray quality streaming will be limited to just 34 minutes per day. And for those who think that bandwidth caps are going away, the head of the National Cable and Telecommunications Association – and a former FCC chairman – wants ISPs to implement these caps more aggressively[ix].
- Companies, home-based businesses and others simply cannot operate as effectively with low Internet speeds. For example, a one-gigabyte file (five minutes of Blu-ray-quality video) uploaded at 0.875 Mbps (common for many DSL customers) will take roughly two and a half hours to upload. That means that many companies must instead opt for more expensive and slower physical delivery when sending large files.
- The “Internet of Things” won’t be practical if Internet bandwidth and performance are insufficient.
- The economic gulf between areas with high- and low-speed Internet will widen as companies in the engineering, graphics design, advertising, print and other industries that require fat Internet pipes won’t have the freedom to locate their facilities or hire employees, contractors and others in many areas of the country. Moreover, this is not just an urban vs. rural issue: some rural areas offer substantially better and cheaper Internet services than urban areas. For example, iFiber Communications offers Internet download speeds of nearly 110 Mbps[x] at a price of $49.95 per month in rural Grant County, Washington. Comcast, on the other hand, offers “up to 50 Mbps” download speeds for the same price in the much more densely populated Seattle suburbs[xi].
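The cap and upload figures above are straightforward to verify with back-of-the-envelope arithmetic. A minimal sketch, assuming per-stream bitrates of roughly 9.6 Mbps for DVD quality and 40 Mbps for Blu-ray disc rates (both assumptions for illustration, not measured values):

```python
# Back-of-the-envelope checks: how far a monthly bandwidth cap
# stretches for streaming, and how long a large upload takes.

def streaming_hours(cap_gb: float, mbps: float) -> float:
    """Hours of continuous streaming a monthly cap allows at a given bitrate."""
    return cap_gb * 8_000 / mbps / 3_600  # GB -> megabits -> seconds -> hours

def upload_minutes(size_gb: float, mbps: float) -> float:
    """Minutes needed to upload a file of size_gb at the given line rate."""
    return size_gb * 8_000 / mbps / 60

print(round(streaming_hours(300, 9.56), 1))        # ~69.7 h/month at DVD quality
print(round(streaming_hours(300, 40) / 30 * 60))   # ~33 min/day at Blu-ray disc rates
print(round(upload_minutes(1, 0.875)))             # ~152 min for a 1 GB file at DSL speed
```

These calculations ignore protocol overhead and other household traffic, so real-world numbers would be somewhat worse.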
Perhaps the solution is for municipalities to purchase the existing Internet infrastructure from telcos and cable TV companies, do the necessary bandwidth upgrades, and then lease the pipes to any and all ISPs that want to provide service, thereby eliminating the ability for existing providers to deter competitors. Government gets lots of income from franchise fees, right-of-way fees and other revenue generators with the current system – perhaps they could simply replace this income with the leasing fees they would realize.
[ii] Source: Four Years of Broadband Growth, Office of Science and Technology Policy and The National Economic Council
[v] Source: The Cost of Connectivity 2013, published by New America Foundation
Filed under: Uncategorized | Tags: Domino, GroupWise, ibm, Notes, novell, tesla
That’s a valid question. Because there are so few Teslas on the road, mechanics are not easy to find – it’s far easier to find a mechanic for a Chevy or Ford, for example. Teslas are sold by a company that may or may not be financially viable in five years – Chevrolet and Ford are much more likely to be in business at the end of 2018. Your decision to buy a Tesla will not be corroborated by a large number of like-minded owners – there are only a few thousand on the road today, whereas there are millions of Chevys and Fords. Tesla has a limited product line of only one model in three different configurations – Chevrolet and Ford offer a wide range of vehicles, running the gamut from small economy cars to muscle cars to large commercial trucks.
So why would anyone ever buy a Tesla? There are several reasons: they’re much less expensive to operate than many comparable gasoline-powered cars. Performance is phenomenal (0-60 in 4.2 seconds for the P85 configuration). They need very little maintenance. And they include features that Chevrolet and Ford don’t offer, such as an 11×17-inch flat panel display that runs HTML5 and offers a complete Web browsing experience, and the ability to seat five adults and two kids comfortably in a mid-sized sedan.
So why would anyone migrate to or stick with an email platform like GroupWise or Notes/Domino? GroupWise and Notes/Domino admins are not nearly as easy to find as Exchange admins. Arguably, Novell is less financially viable than Microsoft. There are many times more users of Exchange than GroupWise and at least 50% more Exchange users than Notes/Domino users. Novell offers a much more limited product line than Microsoft, whose offerings include on-premises and cloud-based email, desktop productivity applications and even a phenomenal gaming system. IBM has a broad and impressive product line in the messaging and collaboration space, but has nowhere near Microsoft’s share of the desktop productivity application market, for example.
So why stick with GroupWise or Notes/Domino? Our research shows that GroupWise is much less expensive to operate than on-premises Exchange and somewhat less expensive than Office 365. It has very low downtime and one admin can manage 15,000 or more users. And it has a very nice feature set that includes some things that Outlook and Exchange don’t. IBM is the clear leader in social technologies and offers impressive integration of email with social in a way that Microsoft does not. And, IBM offers an excellent cloud-based set of email, social and real-time communications tools that are arguably better than Office 365 (if not marketed as well).
This piece was not requested by Novell or IBM, nor is it a criticism of anyone who has decided to leave behind a legacy communications platform and migrate to Exchange or Office 365 – both are solid offerings.
But, if you’re a senior, non-IT manager who thinks that Exchange and Microsoft must be your future messaging platform, this is a call to ask why you think that: Will your company achieve lower downtime, lower TCO or require fewer IT staff to manage the new system? Will your users realize better performance? Will your users be more productive? Will you gain a competitive advantage if you migrate? Is Microsoft’s roadmap really that much better than Novell’s or IBM’s?
Moreover, do you realize that Outlook is not really an email system? That’s not meant to be a slight, but some non-technical decision makers want to migrate their companies to “Outlook”, not realizing that Exchange is the actual email platform behind it.
Migrate away from GroupWise or Notes/Domino or some other platform if you’re compelled to do so. But you should have good reasons for the decision.