Businesses look toward converged infrastructures to boost data center performance

As collecting and storing data becomes an increasingly critical part of the enterprise, businesses are starting to pay more attention to the infrastructure needed to handle such key workloads. In order to ensure reliable operations with such an influx of information, data center operators are turning to a variety of innovative methods to improve data handling while lowering costs. One of the most popular of these methods is convergence.

Practically every major cloud platform provider now offers some type of converged infrastructure, and some are even going so far as to realign their business models to work more effectively with the concept. HP is one such company making major strides toward embracing converged architecture. The tech giant is looking to combine its blade server business with its converged infrastructure division to increase the speed of development and provide channel partners with more integrated solutions that make deployment and integration happen more quickly.

Convergence is the way of the future for data center operations.

Changes to networking essential for improved data management
When talking about a converged infrastructure, the key element is networking. Server and storage components function basically the same in a converged solution as they would traditionally, but they work in closer proximity to one another. However, as convergence gains more popularity among service providers, networking will evolve to become more of a fabric architecture, according to Information Age contributor Ben Rossi. This change will bring about a variety of challenges.

"As convergence gains more popularity, networking will have to evolve."

Providers will have to take a different approach to virtual networking. Provisioning and setup may be possible with only a simple overlay, but such a solution may inhibit performance as scale increases. A high degree of application awareness will also be necessary to optimize performance in key workloads, meaning simple automation won't be enough to deliver an optimal user experience. In order to address this issue, converged platforms will have to be provisioned to address specific workloads and support an overarching, integrated architecture that allows for simplified migration and data connectivity.
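As a rough illustration of workload-specific provisioning, the sketch below maps workload types to pre-integrated compute, network and storage profiles rather than applying one generic overlay to everything. The profile names and resource figures are invented for the example, not taken from any particular converged platform.

```python
# Minimal sketch: mapping workload types to pre-integrated provisioning
# profiles on a converged platform. Profile names and resource figures
# are hypothetical and chosen only for illustration.

WORKLOAD_PROFILES = {
    "database":  {"vcpus": 16, "memory_gb": 128, "network_gbps": 10, "storage_tier": "ssd"},
    "analytics": {"vcpus": 32, "memory_gb": 256, "network_gbps": 40, "storage_tier": "ssd"},
    "web":       {"vcpus": 4,  "memory_gb": 16,  "network_gbps": 1,  "storage_tier": "hdd"},
}

def provision(workload_type: str) -> dict:
    """Return the integrated compute/network/storage profile for a workload,
    rather than handing every request the same generic overlay settings."""
    try:
        return WORKLOAD_PROFILES[workload_type]
    except KeyError:
        raise ValueError(f"No profile defined for workload type '{workload_type}'")

print(provision("analytics"))
```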

One of the biggest mistakes enterprises make when changing their internal IT infrastructure is trying to do all of the work themselves despite a lack of training and expertise. In order to avoid this common problem, enterprise decision-makers should work alongside a trusted service provider to ensure a successful implementation. By working together with a reliable industry partner, companies can create a customized infrastructure that works for them.

How the cloud is like PCs: An IT history lesson

Technology has always played a role in creating freedom within an organization, either by breaking down boundaries or by providing an avenue through which to reach new horizons. For a long time, the most innovative force in enterprise technology was the computer. When the first PC was introduced, it changed everything by making real computing power affordable and available to businesses and individual employees. The analysis made possible by PCs resulted in increased operating efficiency, faster innovation and dramatically improved client experiences. While computers remain central to every organization, they are no longer driving the freedom of innovation they once did. Cloud computing, however, has taken up the mantle and has changed the face of enterprise IT in much the same way PCs did when they were first introduced. Businesses can learn from their own IT history and put the cloud to work for them the way they did with PCs in the following three ways:

“Cloud computing has changed the face of enterprise IT.”

1) Embrace the freedom to build
One of the reasons PCs became so popular so quickly was because they offered employees the ability to build applications, which freed them from the practical constraints of the IT department. Each user was able to pursue his or her own ideas independently and follow the ones that would make the biggest difference to the company. Any technology that expands a user’s possibilities is unstoppable, and cloud enables the same freedom as PCs before it.

Before the cloud arrived, innovative employees who wanted to create a new application to improve operating procedures had to go through an endless series of steps to get approval before anything could move forward. Now the cloud puts a massive number of resources right at users’ fingertips, allowing them to create, test and distribute programs that may never have gotten made otherwise.

The cloud is poised to change the enterprise the same way PCs did in the ’80s.

2) Focus on the value of data
One of the biggest benefits PCs offered businesses in the ’80s and ’90s was the ability to gather and use data at a level previously unheard of. Now, the cloud offers businesses a similar opportunity. Not only can massive amounts of data be created through countless apps and services, but an even greater amount can be collected and analyzed through those same features to offer insights into business processes and operations.

As a recent Forbes article noted, “This changes the way IT practitioners and leaders need to think about IT. Now it’s not just about building and running data centers. It’s about marshaling tools and applications that acquire, transform, apply and protect the data that runs the organization.”

3) Recognize the power to disrupt
After PCs crashed onto the tech scene in the early ’80s, network storage systems followed closely behind. After that, PC technology moved into the data center and created even more innovations. PCs quickly became a dominant force in the data center, fundamentally changing the economics of how those facilities were built and operated. Now the cloud is here to usher in the next wave of data center disruption.

Cloud is poised to create a deep and lasting impact on the future of IT. Hybrid cloud especially is becoming a defining trend. The majority of enterprises around the world are already using multiple cloud environments for at least part of their IT workloads, changing the way people think about data.

The bottom line is that the cloud won’t be going anywhere anytime soon, and organizations would do well to look at the examples set by earlier disruptive technologies and apply them now to make the most out of their technological investments.

Increased use of technology causing changes in the enterprise

A recent study by management consulting and technology services firm Accenture revealed that a growing number of enterprise decision-makers are relying more heavily on technology to make changes in their business.

“90% of senior decision-makers expect technology to transform their companies.”

According to the report – which surveyed nearly 2,000 senior decision-makers in 15 countries – 90 percent of respondents expect digital technologies to transform their companies. At the same time, 87 percent of participants said their organization had made significant inroads in adopting digital technologies within the last 12 months.

While enterprise technology is useful for numerous reasons, one of the key drivers for adoption was the ability to increase mobility. When participants were asked which technologies their companies had already successfully adopted, nearly two-thirds responded with mobility.

A variety of benefits can be realized with the adoption of digital technologies, but one of the most widely reported was the creation of new revenue opportunities, with 48 percent of respondents experiencing that advantage. Another 46 percent reported faster time to market for products and services, and 45 percent said they were now able to provide more rapid responses to client demands.

As technology becomes a more essential part of the enterprise, organizations are shifting their focus.

Use of technology causing companies to restructure 
Businesses are beginning to completely restructure their departments to make the best use of technology possible, as well as to ensure the most beneficial decisions are being made. The report found that 83 percent of organizations have implemented a holistic strategy and a central team to oversee the implementation and management of new technologies. Another 80 percent of companies have appointed a chief digital officer to help with large-scale adoption and ensure the technology being employed is being used in the best way.

“The benefits of digital technology are not just being talked about anymore, but are being put into action as organizations are reshaping themselves to take advantage,” said Jim Bailey, global managing director for Accenture Mobility. “A vast majority of respondents said their business had made significant inroads in using digital technologies over the past year – to grow their client base and/or to enhance their overall enterprise efficiency – but acknowledged there is still some way to go.”

One of the most reliable ways for businesses to ensure a smooth and successful adoption of new technological infrastructure is to partner with a trusted service provider. An organization like ISG Technology is able to offer decades of industry experience to create a customized program that will work for each individual business. ISG enables companies to access the support and network capacity necessary for a successful deployment.

CIOs look to find a balance between tech innovations, enterprise security

With technology playing a much more integral part in the enterprise, the role of the CIO has become more complicated in recent years. A variety of factors that previously didn't affect the position are now shaping everyday processes, and IT staff face a continually increasing pace of change. According to the 14th Annual State of the CIO survey conducted by CIO Magazine, 91 percent of CIOs say the role has gotten more challenging recently, and 74 percent say it is becoming increasingly difficult to find a balance between business innovation and operational excellence.

The rising frequency of data breaches has put a premium on strict security practices to protect critical infrastructure. But, at the same time, CIOs must be able to focus on just a few key priorities that will help to propel their organizations forward. In order to achieve this balance, there are a few main technology drivers that CIOs look to for guidance on IT priorities: cloud computing, big data analytics, enterprise mobility and data centers.

In many cases, the advantages of multiple areas are being combined to create solutions that benefit companies even more. Business continuity/disaster recovery and security will always be – or should always be, at least – a top priority for businesses, but innovations in cloud computing and data center design are helping to improve these processes by increasing overall security and enhancing recovery efforts so network intrusions cause as little disruption as possible.

Big data analytics and enterprise mobility are also teaming up to provide operational insights that were previously unavailable to most organizations. In the modern enterprise, data serves as a new form of currency, and the more information businesses can get out of their data, the richer they will become. Practically every company has some form of mobility or bring-your-own-device program by now, and many organizations also offer a mobile application for employees and clients to access information on the go. The data created through those programs is proving invaluable to enterprises hoping to learn more about their client base and streamline operating procedures.

Enterprises are experiencing numerous benefits with new technologies.

Tech innovations offer benefits to companies, but expertise is lacking
While these areas of IT are becoming the most important for many businesses, they are also some of the categories in which many CIOs are seeing skills shortages. According to the State of the CIO survey, big data, security and mobile technologies are three of the top five areas in which businesses are finding it difficult to find qualified candidates. The study also found that 56 percent of CIOs believe they will experience an IT skills shortage over the next year.

“ISG Technology offers expertise to help companies implement solutions right for them.”

In order to ensure they are able to experience the benefits of these technologies despite a lack of IT talent, many businesses are turning to third party service providers to receive the help they need. Organizations like ISG Technology offer expertise in data center management, security, enterprise mobility and cloud computing and can help companies implement solutions that are right for them quickly and conveniently.

New study finds Internet of Things continuing to expand

A new study recently released by Gartner has found that use of the Internet of Things is growing, and an increasing number of devices now have IoT capabilities.

According to the report, 4.9 billion connected things are expected to be in use next year, an increase of 30 percent from 2014. The number of IoT devices is believed to be on track to reach 25 billion by 2020. Gartner researchers estimated that total spending on services supported by the IoT will reach $70 billion in 2015 before rising dramatically to $263 billion in 2020.

Part of the reason connected devices have seen such dramatic growth recently is that the IoT has shown itself to be a powerful force for business transformation. The report discovered that while the increased number of connected things is being driven by consumer applications, enterprises will account for most of the revenue in the market.

"The number of connected intelligent devices will continue to grow exponentially, giving 'smart things' the ability to sense, interpret, communicate and negotiate, and effectively have a digital 'voice,'" said Steve Prentice, Gartner fellow and vice president. "CIOs must look for opportunities to create new services, usage scenarios and business models based on this growth."

Researchers also noted that traditional, mainstream products will start to be reinvented to include computing capabilities and provide them with a digital voice. The enhancement of objects once viewed as passive products will completely change their value propositions and create new services and business models. The study found that by 2020, the three industries with the highest level of IoT use will be utilities, manufacturing and government.

Security a major part of IoT expansion 
A major point touched on by the report is the security repercussions of the IoT, as dozens of new platform options are brought into enterprise digital security architecture. Increased use of the IoT will also bring new security standards to each industry individually and provide a new view of applications. These changes will cause IT leaders to create a more comprehensive technological approach to IoT risk and security going forward. According to the study, 20 percent of companies will have digital security services devoted to protecting business initiatives using IoT devices and services in the next two years.

"The IoT highlights the tight linkages between information security, information technology security, operational technology security and physical security like never before," a statement from Gartner noted. "Executives now face a decision regarding the future of security in their enterprise and who governs, manages and operates it."

New tests discover 'no-wait data center' technology

Researchers from the Massachusetts Institute of Technology recently announced that they have created what they are calling a 'no-wait data center.' According to ZDNet, the researchers were able to conduct experiments in which network transmission queue length was reduced by more than 99 percent. The technology, dubbed FastPass, will be fully explained in a paper being presented in August at the annual conference of the Association for Computing Machinery's Special Interest Group on Data Communication (SIGCOMM).

The MIT researchers were able to use one of Facebook's data centers to conduct testing, which showed reductions in latency that effectively eliminated normal request queues. The report states that even in heavy traffic, the latency of an average request dropped from 3.65 microseconds to just 0.23 microseconds.

While the system's increased speed is a benefit, the aim is not to use it for increased processing speeds, but to simplify applications and switches to shrink the amount of bandwidth needed to run a data center. Because of the minuscule queue length, researchers believe FastPass could be used in the construction of highly scalable, centralized systems to deliver faster, more efficient networking models at decreased costs.

Centralizing traffic flow to make quicker decisions
In current network models, packets spend a lot of their time waiting for switches to decide when each packet can move on to its destination, and the switches have to make those decisions with limited information. Instead of this traditional decentralized model, FastPass works on a centralized system and utilizes an arbiter to make all routing decisions. This allows network traffic to be analyzed holistically, with routing decisions made based on the information derived from that analysis. In testing, researchers found that a single eight-core arbiter was able to handle 2.2 terabytes of data per second.

The arbiter is able to process requests more quickly because it divides the processing power needed to calculate transmission timing among its cores. FastPass arranges workloads by time slot and assigns requests to the first available server, passing the rest of the work on to the next core, which follows the same process.

"You want to allocate for many time slots into the future, in parallel, " explained Hari Balakrishnan, an MIT professor in electrical engineering and computer science. " According to Balakrishnan, each core searches the entire list of transmission requests, picks on to assign and then modifies the list. All of the cores work on the same list simultaneously, efficiently eliminating traffic.

Arbiter provides benefits for all levels
Network architects will be able to use FastPass to make packets arrive on time and eliminate the need to overprovision data center links for traffic that can arrive in unpredictable bursts. Similarly, developers of distributed applications can benefit from the technology by using it to split up problems and send them to different servers around the network to be solved.

"Developers struggle a lot with the variable latencies that current networks offer," said the report's co-author Jonathan Perry. "It's much easier to develop complex, distributed programs like the one Facebook implements."

While the technology's inventors admit that processing requests in such a manner seems counterintuitive, they were able to show that using the arbiter dramatically improved overall network performance even after the lag necessary for the cores to make scheduling decisions.

The FastPass software is planned to be released as open source code, but the MIT researchers warn that it is not production-ready as of yet. They believe that the technology will begin to be seen in data centers sometime in the next two years.

BYOD policies support majority of Americans who can't go 24 hours without their phone

A recent survey from Bank of America found that 96 percent of Americans between the ages of 18 and 24 consider mobile phones to be very important. While that may not be so surprising, it is notable that only 90 percent of respondents in the same group said the same about deodorant. The report, based on interviews with 1,000 adults who own smartphones, found that respondents ranked their phones as more important than almost anything else, including toothbrushes, television and coffee.

The survey also discovered that 35 percent of Americans check their smartphones constantly throughout the day. Forty-seven percent of respondents said they wouldn't be able to last an entire day without their mobile phone, and 13 percent went so far as to say they couldn't even last an hour.

As the Bank of America report shows, people are more attached to their devices than ever. Millennials are especially dependent on their phones and tablets, and they also make up the biggest portion of new workers. Companies are increasingly able to benefit from implementing BYOD policies, as employees who have grown accustomed to a particular phone expect to be able to continue using it at work. Allowing workers to keep their own devices increases productivity, since they aren't constantly checking a second phone, and it also boosts employee satisfaction.

Determining bandwidth requirements in the data center

How much bandwidth does a data center really need? It depends on how many workloads and virtual machines are in regular operation, as well as what the facility is designed to support. For example, a data center providing resources to a public cloud requires much more bandwidth than one that is simply powering internal systems and operations shielded by the company firewall. The increasing uptake of remote data centers and colocation arrangements, in tandem with server virtualization, has added to organizations' bandwidth considerations.

How virtualization complicates bandwidth requirements
Server and desktop virtualization have made companies less reliant on physical infrastructure and the specific sites that house it. Here's how they work:

  • With desktop virtualization, or VDI, desktop operating systems are hosted centrally and delivered to end-user machines (even aging ones), simplifying management of both software and hardware while reducing costs.
  • Server virtualization turns a single physical server into multiple virtual machines. Each instance is isolated, and the end user cannot usually see the technical details of the underlying infrastructure.

By getting more out of IT assets via virtualization, companies have reshaped IT operations. More specifically, they have spread out their infrastructure across multiple sites and put themselves in position to move toward cloud computing.

With increased reliance on virtualization, organizations have looked to ensure that remote facilities receive the bandwidth needed to provide software, instances and data to users. However, liabilities still go overlooked, jeopardizing reliability – especially when data centers are too far apart from each other.

Ensuring low latency is just one piece of the data center optimization puzzle, though. Sufficient bandwidth must also be supplied to support the organization's particular workloads. In the past, Microsoft has advised Exchange users to think beyond round trip latency.

"[R]ound trip latency requirements may not be the most stringent network bandwidth and latency requirement for a multi-data center configuration," advised Microsoft's Exchange Server 2013 documentation. "You must evaluate the total network load, which includes client access, Active Directory, transport, continuous replication and other application traffic, to determine the necessary network requirements for your environment."

Knowing how much bandwidth is needed
Figuring out bandwidth requirements is a unique exercise for each enterprise. In a blog post, data center networking expert Ivan Pepelnjak looked at the nitty-gritty of assessing bandwidth-related needs, homing in on some of the problems that reveal a need to rethink how bandwidth is allocated and utilized.
These issues include:

  • Over-reliance on slow legacy equipment
  • Oversubscription to services
  • Miscalculation of how much traffic each virtual machine generates 

In addition, data center operators sometimes overlook bottlenecks such as how virtual machines can sometimes interact slowly with storage. If they have to frequently access data stored on an HDD, for example, quality of service may degrade. Networks may require extra bandwidth in order to avoid data transfer hiccups. 
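To make the arithmetic concrete, here is a back-of-the-envelope Python sketch of the kind of estimate described above. The VM count, per-VM traffic figure and burst factor are hypothetical values chosen for illustration, not recommendations for any particular environment.

```python
# Back-of-the-envelope bandwidth check. The VM counts, per-VM traffic
# figures and burst factor below are illustrative assumptions only.

def required_bandwidth_gbps(vm_count, avg_mbps_per_vm, burst_factor=2.0):
    """Estimate peak aggregate traffic: average per-VM load scaled by a
    burst factor to account for unpredictable traffic spikes."""
    return vm_count * avg_mbps_per_vm * burst_factor / 1000.0

def is_oversubscribed(vm_count, avg_mbps_per_vm, uplink_gbps, burst_factor=2.0):
    """Flag links whose estimated peak demand exceeds their capacity."""
    return required_bandwidth_gbps(vm_count, avg_mbps_per_vm, burst_factor) > uplink_gbps

# Example: 400 VMs averaging 25 Mbps each against a 10 Gbps uplink.
need = required_bandwidth_gbps(400, 25)
print(f"Estimated peak demand: {need:.1f} Gbps")
print("Oversubscribed" if is_oversubscribed(400, 25, 10) else "Within capacity")
```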

HealthKit, healthcare and managing BYOD

As smartphones become faster and increasingly capable of running sophisticated applications and services, health care organizations are faced with a dilemma. Do they allow doctors, nurses and staff to participate in bring-your-own-device policies and potentially unlock productivity gains that enable higher-quality care? Or do they hold back out of legitimate concerns about data security and compliance with regulations?

The growing interest of technology firms in health care tracking only complicates the situation. Individuals may now use devices such as wristbands, in addition to smartphones, to record and share health information, making it critical for providers to keep tabs on BYOD activity to ensure compliance.

HealthKit and the larger issue of sharing health information
At this year's Worldwide Developers Conference, Apple announced HealthKit, a platform built into iOS that underscores how healthcare on mobile devices is rapidly evolving and sparking questions about how sensitive data is handled. HealthKit isn't a discrete solution but a system of APIs that would allow, say, an application that tracks steps to share its information with medical software that could provide actionable advice.

Major health care organizations are already on board. The Mayo Clinic created an application that monitors vital signs and then relays anomalous readings to a physician. Given the already considerable presence of mobile applications in health care, HealthKit could give hospital and clinic staff additional tools for providing efficient care.

At the same time, HealthKit turns any iOS device into a potential compliance pain point. Data that is stored on an iPhone, for example, would not fall under the purview of the Health Insurance Portability and Accountability Act, but if shared with a provider or one of their business associates, HIPAA would likely apply. Stakeholders will need time to adjust to the nuances of how healthcare applications interact with each other in the HealthKit ecosystem.

"The question would be whether the app is being used by a doctor or other health care provider. For example, is it on their tablet or smartphone?," asked Adam Greene of Davis Wright Tremaine LLP, according Network World. "Where the app is used by a patient, even to share information with a doctor, it generally will not fall under HIPAA. Where the app is used on behalf of a healthcare provider or health plan, it generally would fall under HIPAA."

Tracking and securing privileged health information
HealthKit is just one platform on a single OS, but it is part of a broader shift in data control, away from centralized IT departments and organizations and toward end users. For healthcare, this change is particularly challenging since providers have to ensure that the same compliance measures are enforced, even as BYOD and cloud storage services become fixtures of everyday operation.

A recent Ponemon Institute survey of more than 1,500 IT security practitioners found that almost 60 percent of respondents were most concerned about where sensitive data was located. BYOD complicates compliance, and healthcare organizations will have to ensure that they have well-defined policies in place for governing security responsibilities.

"People trained in security also view IT as accountable for the security domain," Larry Ponemon, chair of the Ponemon Institute, stated in a Q&A session on Informatica's website. "But in today's world of cloud and BYOD, it's really a shared responsibility with IT serving as an advisor, but not necessarily having sole accountability and responsibility for many of these information assets."

It's no longer enough to rely on IT alone to enforce measures. Security teams and IT must work together and implement BYOD security as well as network monitoring to ensure that only authorized devices can connect to the system, and that data is safely shared.

Virtualization, open source switches changing the face of data centers

Data center technology moves quickly. With the emergence of wide-scale cloud computing over the past decade, enterprises have constructed new facilities and adopted cutting-edge equipment to keep up with demand and/or worked with managed services providers to receive capacity through colocation sites.

Virtualization drives strong growth of data center networking market
Last year, MarketsandMarkets estimated that the data center networking market alone could top $21 billion by 2018 as virtualization and specific technologies such as 40 Gigabit Ethernet continue to gain traction. Rather than rely on legacy physical switches that are challenging to upgrade and scale, enterprises are turning to virtual alternatives.

Virtualizing the network makes equipment and services much easier to modify. Since the fundamental advantage of cloud computing is the ability to get resources on demand, such extensibility is critical for helping companies keep pace with changing requirements.

"Virtualization being a disruptive technology is one of the major driving factors in [the] data center networking market," MarketsandMarkets analyst Neha Sinha told Network Computing. "The adoption of high-performance virtual switches is critical to support increasing number of virtual machines used in multi-tenant data centers. The virtual switches include programmatically managed and extensible capabilities to connect the virtual machines to both physical and virtual networks."

Down the road, such interest in mixing and matching legacy, physical and virtual assets may lead organizations to take up software-defined networking. This practice entails decoupling the network's control plane from the underlying forwarding hardware so that services can be programmed and managed centrally in software.
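As a toy illustration of that control-plane/data-plane split, the Python sketch below models a central controller that programs the forwarding tables of simple switches. The classes are conceptual and are not based on any particular SDN product or protocol.

```python
# Conceptual sketch of the SDN idea: switches only forward according to
# rules pushed to them, while a central controller makes the decisions.
# Not modeled on any specific SDN controller or the OpenFlow protocol.

class Switch:
    """Data plane: forwards traffic only according to installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> output port

    def install_rule(self, destination, port):
        self.flow_table[destination] = port

    def forward(self, destination):
        return self.flow_table.get(destination, "drop")

class Controller:
    """Control plane: holds the network-wide view and programs switches."""
    def __init__(self):
        self.switches = []

    def register(self, switch):
        self.switches.append(switch)

    def set_policy(self, destination, port):
        for sw in self.switches:  # one decision, pushed to every device
            sw.install_rule(destination, port)

ctrl = Controller()
edge = Switch("edge-1")
ctrl.register(edge)
ctrl.set_policy("10.0.0.5", port=3)
print(edge.forward("10.0.0.5"))  # prints 3
```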

However, SDN is still over the horizon for many companies right now. Neither the use cases nor the underlying technology are widely understood. Plus, enterprises are still trying to build enough in-house expertise in areas such as server virtualization to give them a solid foundation for future modifications of their networks and data centers.

Facebook announces open source data center switch
The demand for higher data center efficiency continues unabated, and tech giants such as Facebook are looking to get in on the action. PCWorld reported that the social network has confirmed an open source switch, released through the Open Compute Project, that could challenge longstanding incumbents such as Cisco.

Facebook's switch is a top-of-rack appliance that connects servers to other data center infrastructure. It has 16 individual 40 Gigabit Ethernet ports. The switch is designed for maximum flexibility for developers and data center operators, and it may contribute to broader efforts to make infrastructure more flexible.