Who really cares about BYOD?

The bring-your-own-device movement is well on its way to fundamentally reshaping enterprise communications. So why do so few organizations seem to care about device management? A fairly wide gap formed almost immediately between BYOD user excitement and enterprise policy engagement, and it's only going to expand.

Entrenched employee attitudes absolving them of responsibility create problems for IT, and many organizations let worker preferences overwhelm clear-cut business priorities. The central challenge of BYOD is a company's capacity to show that it cares – not only about the ways BYOD can be hazardous, but about creating strategies that cater to worker preferences while keeping security at the forefront.

BYOD is clearly important to employees. One recent LANDESK survey found that the average European worker spends more on BYOD every year than he or she does on tea or coffee. Workers care about having the devices, but not about protecting the data stored on them.

"It's not my problem" was a common refrain in a recent survey by Absolute Software about data security in the mobile enterprise. More than a quarter of those surveyed said they felt there should be no penalties for leaked or lost corporate data. Additionally, more than one third of respondents who lost their mobile devices said that they didn't make any changes to their security habits, while 59 percent estimated that their corporate data was barely worth more than the cost of replacing a phone.

Who is to blame for BYOD problems?
It's up to companies to exhibit the same passion for data security that employees have for using their own smartphones. Of those who acknowledged a penalty for data loss might be in order, most received nothing more than a slap on the wrist from employers, and often much less – 21 percent had a talking-to, 30 percent had to replace their lost device themselves and 34 percent reported that "nothing" happened when they lost company information. This reflects poorly on companies, observed Absolute mobile enterprise data expert Tim Williams, and will continue unless companies get proactive about BYOD management.

"If firms don't set clear policies that reflect the priority of corporate data security, they can't expect employees to make it a priority on their own," Williams said.

Establishing and enforcing BYOD practices is a good first step. Regulations have to acknowledge the ways personnel use BYOD and avoid limiting productivity as much as possible. There are several technological tools that can help a company secure mobile devices behind the scenes. Investing in managed infrastructure and IT support services provides a scalable, adaptable and continuous resource for effective network monitoring and data management.

You've got mail, and it's a virus: Why organizations need cloud storage services for email

Security researchers recently discovered a cache of personal records for sale on the Internet’s black market, including 1.25 billion email addresses, according to the Independent. Finding one email address for every seven people in the world in the care of hackers is alarming. Email continues to be the central repository for the digital transmission and storage of confidential information and remains one of cybercriminals’ prime targets. Cloud storage services are a must for organizations struggling to take control of email security and management.

Keeping on top of email storage and archival is challenging for organizations of any size. Smaller organizations lack the IT resources of their larger peers, making it difficult to process email and ensure that all files are stored safely. Bigger companies have dedicated IT departments, but they also have massive email systems generated by bigger user bases and more diverse device profiles. The expertise and resources required to maintain in-house email storage are usually too costly. Either way, upholding the integrity of protection and system management at all times is beyond the purview of virtually every organization.

Adhering to traditional models of email storage simply won’t suffice in the face of today’s threat landscape. Moving email to cloud storage services, on the other hand, allows organizations to outsource the hardware and storage support to a trusted third party provider, wrote Nashville Business Journal contributor Richard Pinson.

“Hosting your own email requires constant upgrading, patching, backing up and monitoring,” Pinson wrote. “Once email transitions to the cloud, the service provider is responsible for all storage maintenance tasks and provides the most-recent version of their product.”

Cloud storage services are scalable, meaning that an organization won't pay for capacity it doesn't use. Over the long term, this is a much more cost-effective option than updating legacy in-house environments every few years to respond to new security and productivity challenges. It only takes one malicious email ending up in a user's inbox to let hackers in. In this landscape, organizations need the help of a dedicated cloud provider to keep their confidential information safe.

Investing in the IoT? Consider data storage issues first

The Internet of Things is a game-changing force, not only in the technology sphere but with implications for numerous other industries. IDC research analysts projected that the IoT will consist of 212 billion connected devices by 2020, generating $8.9 trillion in global revenues. Cisco's forecasts are even rosier, with the tech giant and IoT cheerleader predicting that the market will be worth $19 trillion within the next few years. Any way it's sliced, the IoT is poised to make a massive, far-reaching impact on both enterprises and personal lifestyles.

While many organizations look to ramp up their investment in connected electronics over the next few years, fewer have mapped a course for the data storage issues that will arise from the influx of linked, information-producing devices, as well as applications and analytics tools used to evaluate them. Companies already dealing with limitations in infrastructure support, network connectivity and IT management could be in for a rude awakening during the IoT investment process. Understanding the implications the IoT has for data storage can help organizations ensure that they're prepared.

Redefining data storage
Organizations may have to revamp their data center configurations to deal with machine-generated data, wrote InformationWeek contributor George Crump. Typically, an enterprise data center would process one of two data types: The first is large-file data, such as videos and images, which is accessed sequentially. The second kind is small-file data, which might come from a sensor log, but its massive volume compels random access. Machine-generated data comes in both types. In order for an organization to benefit fully from its network of sensors, it would need to outfit two separate storage systems to deal with the dual data types.

A company planning to approach the IoT with piecemeal, ad hoc storage investments would be better served outsourcing its storage needs to a provider that supports quickly scaling infrastructure builds. Otherwise, a business risks limiting the value of its machine-generated data. As Crump noted, the point of the IoT is to use data to make better decisions. Investing in a managed data storage service enables a company to direct its attention away from the complexities of infrastructure management and toward improving its business model.

"The storage systems for these initiatives almost always start out ad hoc and then become a focal point," Crump wrote. "If you have sensors, or things, that are creating data, keep an eye on that data now. Protect it and be prepared for it to become more important to the organization."

Why the higher education sector needs ITaaS

Data management continues to be an issue in the education sector. The recent flurry of information breaches highlights the lack of adequate information security practices at U.S. colleges and universities. Besides the sheer number of records potentially compromised, the leaks brought to light the dearth of IT infrastructure and governance policies capable of coping with the realities of today's cyberthreat landscape. As long as these institutions adhere to outdated IT security policies and questionable data management practices, they will be increasingly attractive targets to cyber espionage agents. IT-as-a-service can offer universities and colleges advanced IT support.

The recent university data breaches include:

  • A University of Maryland leak that exposed more than 300,000 records containing Social Security numbers and other personal information. Some of these had been kept in a poorly maintained system since 1998, The New York Times reported.
  • Another recent leak compromised the information of 146,000 students and recent graduates at Indiana University, according to the Chicago Tribune. Follow-up on the breach revealed that the data had been stored on an insecure server for 11 months.
  • Employee tax return problems at the University of Northern Iowa may be related to a compromised database, according to the Omaha World Herald.

Several unique issues contribute to poor data management at higher education institutions, including budgetary restrictions, work-study students with little experience serving as ad hoc IT support and sprawling networks with high user turnover. Migrating data storage, information security and other strategic IT planning demands to an ITaaS solution makes sense for universities and colleges that need to upgrade their IT support on a massive scale. ITaaS providers offer real-time data security, establish more stringent access and user protocols, and customize IT strategies to respond directly to the institution's most pressing needs. 

"Universities are a focus in today's global assaults on I.T. systems," said Wallace Loh, University of Maryland president, in a statement following the breach. "Obviously, we need to do more and better, and we will."

Differentiating effective IT business continuity from disaster recovery

With constant threats posed by extreme weather and external attackers, companies have increasingly recognized the importance of protecting their IT assets in the wake of a disaster. But the nature of that protection plan is often up for debate. Recovering from disaster means leveraging tools like online backup services at the very least. However, true resilience in the face of a disaster requires a more all-encompassing business continuity approach.

The plan goes beyond data protection and recovery
While backing up data so it can be restored in the wake of an outage is the bedrock of any business continuity plan, it's only half the battle. Depending on a business's approach, its backup solution may do it little good in the event of an actual disaster. For instance, some businesses relying on off-site tape storage have found themselves unable to restore their files at a secondary location after a storm because they couldn't physically travel to the tape storage facility due to flooding, industry expert Jarrett Potts explained in a column for Data Center Knowledge. Having a plan that encompasses the full recovery process is essential.

"IT disaster recovery plans are very important when one considers how intertwined organizations are with technology, but it is important to note that IT disaster recovery plans are not, by themselves, a complete business continuity strategy," Continuity Central contributor Michael Bratton explained in a recent article.

The solution is oriented toward application uptime
A key differentiator between disaster recovery and business continuity is that the latter's focus is keeping core business operations running. As Bratton noted, this approach goes beyond simply IT. However, from a tech perspective, it primarily means keeping critical applications running with as little interruption as possible. Through technologies like virtualization and a distributed network of colocation facilities, businesses can establish a flexible application hosting model that can easily weather unexpected events. The exact nature of the plan is likely to vary from company to company, so working with a third-party solution provider to develop a custom response can also be beneficial.

Desktop virtualization: Why companies need to stop dragging their feet

Desktop virtualization is a necessary investment that reflects the changing technological paradigm. With employees increasingly mobile and companies more globalized, personnel need to be able to access their desktop operating system and applications from anywhere. Many organizations are eagerly sending data storage to the cloud and investing in as-a-service solutions to better manage and protect growing application environments. However, this accelerated investment wanes when it comes to desktop virtualization. Why? Shouldn't location-independent services extend to the level of the end user?

Cost continues to be an impediment to desktop virtualization in the eyes of many companies. While organizations acknowledge that the Internet offers a much more cost-effective and centralized medium through which to provide enterprise application and information access, they are worried about the expenses involved in reconfiguring enterprise infrastructure to make it compatible, according to a recent TechNavio report. While it's true that this can represent a sizeable capital investment, the long-term operational savings are enormous.

Bearing this in mind, ZDNet contributor Ken Hess wrote that it's surprising that companies are "still having this conversation" about the merits of desktop virtualization. Many companies worried about the costs of deploying virtual desktops and other infrastructure are the same ones clinging to hardware approaching or past its fifth year in use. Old equipment breaks down more frequently and often costs more to repair, and the more outdated hardware is, the more difficult it is to transition to a new IT program. Newer hardware likely has virtualization capacity. It makes sense to upgrade now rather than waiting until the transition is inevitable and far more complex.

Curing data management issues in the healthcare sector

Data management in the healthcare industry is reaching a tipping point. According to CDW Healthcare, the medical sector is gearing up for massive data growth – the 500 petabytes of data in 2013 are set to rise to 25,000 PB by 2020. By 2015, the average hospital could be producing around 665 terabytes of data.
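As a quick, illustrative back-of-envelope check (our arithmetic, not a figure from the CDW report), those projections imply an extraordinary annual growth rate:

```python
# Implied compound annual growth rate from the CDW Healthcare projection:
# 500 PB in 2013 rising to 25,000 PB by 2020 (seven years of growth).
start_pb, end_pb, years = 500, 25_000, 7

cagr = (end_pb / start_pb) ** (1 / years) - 1
print(f"{cagr:.0%}")  # roughly 75% growth per year
```

In other words, healthcare data volumes would need to grow by about three-quarters every single year to hit that 2020 figure – context for why storage planning is becoming urgent.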

It's not just the amount of data that's the issue, but the types of information organizations collect. About 80 percent of healthcare data is unstructured, with imaging, scans and video requiring huge amounts of server space. Also, many healthcare providers are storing redundant information – the average hospital has 800,000 total records, but as many as 96,000 are duplicates. Duplicate records are costly to store, making filing systems and data management efforts more complex without delivering additional security.

While big data offers potential benefits in patient care, research and treatment, the healthcare sector is flailing. In part, it's due to a relatively unique set of circumstances. The healthcare sector is traditionally fairly tech-averse – that acres of file cabinets containing patient records in manila folders still persist is a testament to how difficult it is to go digital. Initiatives such as electronic health records and healthcare information exchanges that increase the value of data have to contend with a slew of compliance, privacy and confidentiality issues.

Data management services can help healthcare organizations wield their vast information reserves in a cost-effective and secure way. Modern information technology infrastructure and business intelligence tools are critical to the effective utilization and protection of game-changing data-driven strategies, wrote Forbes contributor John Foley. Not only are massive file systems difficult to back up in a comprehensive way, but many medical providers have no idea how long it would take to make files available following an unplanned incident. A data management services provider can help an organization establish a customized storage and backup system that prioritizes continuity and compliance. With people's lives potentially hanging in the balance, it's vital that healthcare providers alleviate big data headaches.

Colocation provides balance in a precarious world

Colocation is an increasingly popular choice for companies that want to cut down on data center spending without relinquishing control over their equipment. The market for wholesale and retail colocation is expected to surpass $43 billion by 2018, according to MarketsandMarkets. This represents a compound annual growth rate of 11 percent from 2013 to 2018. Retail colocation, in which businesses lease space in a large data center that services multiple clients, is rising in demand, with retail colocation deals often topping 1 megawatt of critical power to satisfy scaling client needs.

Many organizations that have little experience with massive infrastructure needs are now faced with increasing convergence between business and IT. This dive into the deep end can quickly subvert budgeting, resourcing, tech support and data strategies that companies have carefully planned. Colocation provides an alternative to an endless cycle of purchasing new equipment, building additions to onsite data centers and retraining staff. As Computer Weekly contributor Clive Longbottom pointed out, it makes little sense to build a facility given so much uncertainty, when it’s nearly impossible to predict demand even a few years down the road.

Unlike managed services, in which a company outsources the oversight of its infrastructure to a provider, colocation enables it to use its own servers and retain control of installation, maintenance and management. This can be a good first step for an organization that may have less experience with IT outsourcing but knows that it can’t subsist much longer on the status quo.

3 ways cloud storage solves IT complexity issues

Cloud storage enables businesses to exert more control over increasingly complex IT environments. Many IT departments are struggling with the management-related issues and costs stemming from infrastructure expansion. It's a physical problem, in terms of the storage equipment and support needed for big data and application provisioning. It's also an issue of management, as rising device and networking demands put more pressure on IT resourcing and policymaking capacities. At the same time, pressure to keep costs down can leave IT systems fractured or bloated. 

Cloud storage is critical to reducing the costs and complications of IT for a better bottom line. Here are three ways it makes a difference:

  1. Simplifies backup and recovery: Many organizations struggle to get employees to back up files in anything approaching real time. This reality is compounded by growing IT environments, wrote ZDNet senior editor Jason Perlow. Cloud storage offers organizations scalable storage space that expands as a business's needs do, plus automated syncing and backup to ensure real-time recovery availability.
  2. Reduces CAPEX and OPEX: The cloud can reduce storage-related capital and operating expenses in one fell swoop, observed CSO Online contributor Gordon Makryllos. Cloud storage offers upfront advantages to organizations by drastically reducing the amount of equipment they need to buy. Its scalability also offers OPEX cost benefits through streamlined security management, greater flexibility and more centralized IT support that provides continuity as organizations' priorities change.
  3. Improves collaborative potential: Communication and collaboration are more critical than ever to establishing a vibrant, successful organization. By centralizing file storage in a cloud server instead of on individual devices, employees can view, edit and share documents and files easily and in real time. IT departments can also leverage cloud environments to provide enhanced encryption and other security measures, automating access and preserving data integrity in the face of cyberthreats.

As complexity and costs rise, cloud storage can help relieve IT departments of many of the daily tasks that take up an increasing amount of their time. It enables them to spend more time on business-critical projects, with this alignment serving as another way to boost margins and take control of changing technological imperatives.

Real-world business continuity: The soaring costs of downtime

Many organizations approach business continuity as an afterthought. When a company is building up its hardware footprint and application investments in support of its growing business model, contingency plans are often relegated to the backseat and linger there. These organizations discover the costs of prolonged downtime and the difficulty of righting the ship only in the aftermath of an unplanned event. One recent report offers some fairly chilling statistics about the widespread shortcomings and expensive consequences of ignoring business continuity planning.

The Ponemon Institute report on the cost of data center outages in 2013 found that organizations lose $7,900 per minute of downtime. The mean cost of a single data center outage is $627,418 and the maximum amount lost to a single incident was more than $1.7 million. The total and per-minute costs correlated to the size of the facility and the duration of the outage, while IT equipment failure represented the most expensive root cause of unplanned data center downtime. Financial hits were worse for companies in data center-dependent industries such as e-commerce, financial services and telecommunications.
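Those per-minute figures compound quickly. A minimal sketch of the arithmetic, using the report's numbers (the function name here is ours, for illustration only):

```python
# Back-of-envelope outage cost based on the Ponemon figure cited above.
COST_PER_MINUTE = 7_900  # USD lost per minute of data center downtime

def downtime_cost(minutes: float) -> float:
    """Estimated loss for an outage lasting the given number of minutes."""
    return minutes * COST_PER_MINUTE

# The report's mean outage cost of $627,418 works out to an average
# outage of roughly 627,418 / 7,900, or about 79 minutes.
print(f"${downtime_cost(79):,.0f}")  # about $624,100
```

Put differently, the mean outage cost implies the average incident lasts well over an hour – long enough for losses to reach well into six figures.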

Costs can quickly escalate as a business recovers from an unplanned incident. From detection and containment to lost revenue and diminished productivity, the expenditures can be immense. An organization will suffer more for each area of its business continuity planning that is lackluster or poorly thought out.

These findings convey the importance of having an effective business continuity approach in place. The approach is twofold – prevention and recovery. Eliminating root causes of downtime is vital, especially expensive ones like IT equipment failure, which can be mitigated through more effective management. Visibility and redundancy can help streamline efforts to get systems back on track following a surprise incident.

Virtualization can be a great asset to both aspects of business continuity planning, as a recent CIO.com webinar pointed out. It provides a more manageable, agile environment for continuity efforts, mitigates hardware vulnerabilities by slashing equipment needs and helps a company access its safely stored systems and applications immediately following an unplanned occurrence.