Why cloud computing is safe

Cloud computing has been gaining popularity in the business space over the last couple of years. Organizations are abandoning server-based data centers in favor of third-party-provided solutions. Yet as more data is stored digitally, the danger of hacking grows. Companies are losing significant income to data breaches, and cybercriminals are developing new, sophisticated ways to steal data.

So why are companies taking their information to the cloud? Many executives want to push their businesses to the cloud but don’t fully understand how it works. As such, they may be wary of the idea of removing confidential information from complete corporate oversight. However, the cloud is not as penetrable as its name might imply.

Three factors driving cloud safety
According to Forbes, there are three principal factors helping to keep data secure when it is in a cloud platform. The first is redundancy. Losing data can be almost as harmful as having it stolen. When a server fails or a hacker gains access to a corporate network and deletes or attempts to ransom vital information, companies can lose months of productivity. Most cloud networks, however, typically keep data in at least three locations.

This means that lost data at one location, such as data loss caused by a server failure, will not have the disastrous impact that it could in an organization relying on an on-premise data center. By keeping copies of each file, cloud solutions ensure mission-critical data remains accessible until the user no longer wants it.
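The redundancy idea can be illustrated with a minimal sketch. This is not any vendor's actual replication protocol, just the core concept under simple assumptions: every write is copied to three replica "locations" (here, plain local directories), so losing one replica does not lose the data.

```python
import os
import shutil
import tempfile

REPLICA_COUNT = 3  # most cloud networks keep data in at least three locations

def replicated_write(root: str, name: str, data: bytes) -> list[str]:
    """Write `data` under `name` into each replica directory."""
    paths = []
    for i in range(REPLICA_COUNT):
        replica_dir = os.path.join(root, f"replica_{i}")
        os.makedirs(replica_dir, exist_ok=True)
        path = os.path.join(replica_dir, name)
        with open(path, "wb") as f:
            f.write(data)
        paths.append(path)
    return paths

def replicated_read(root: str, name: str) -> bytes:
    """Return the file from the first surviving replica."""
    for i in range(REPLICA_COUNT):
        path = os.path.join(root, f"replica_{i}", name)
        if os.path.exists(path):
            with open(path, "rb") as f:
                return f.read()
    raise FileNotFoundError(name)

root = tempfile.mkdtemp()
replicated_write(root, "report.txt", b"mission-critical data")
# Simulate a server failure wiping out one location...
shutil.rmtree(os.path.join(root, "replica_0"))
# ...the data is still recoverable from the remaining copies.
print(replicated_read(root, "report.txt"))  # b'mission-critical data'
```

Real cloud platforms also replicate across geographically separate facilities and repair lost replicas automatically; the sketch only shows why one failure is survivable.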

The second factor is the safe sharing policy. Anyone who has ever used the popular Google Docs knows how file sharing works. Rather than making a copy, the user must enter the email address of anyone they want to see the file. These extra users can’t share the file on their own (unless given express permission); they simply have access to the information. This is how safe sharing works. It prevents any unauthorized copies from being created or distributed. Users have access to their own data and can control exactly who sees it.
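The access model described above can be sketched in a few lines. This is an illustration of the concept, not any real product's API; the class and email addresses are invented for the example. The owner controls the viewer list, and invited users cannot extend access unless explicitly permitted.

```python
# Toy model of "safe sharing": one owner, invited viewers, and resharing
# rights granted only by express permission.
class SharedDocument:
    def __init__(self, owner: str, content: str):
        self.owner = owner
        self.content = content
        self.viewers = {owner}       # who may read
        self.can_reshare = {owner}   # who may invite others

    def share(self, actor: str, invitee: str, allow_reshare: bool = False):
        if actor not in self.can_reshare:
            raise PermissionError(f"{actor} may not share this document")
        self.viewers.add(invitee)
        if allow_reshare:
            self.can_reshare.add(invitee)

    def read(self, actor: str) -> str:
        if actor not in self.viewers:
            raise PermissionError(f"{actor} has no access")
        return self.content

doc = SharedDocument("alice@example.com", "Q3 revenue figures")
doc.share("alice@example.com", "bob@example.com")
print(doc.read("bob@example.com"))  # Bob can view the file...
try:
    doc.share("bob@example.com", "eve@example.com")  # ...but not reshare it
except PermissionError as e:
    print(e)
```

No copy of the content is ever handed out; access is granted and revoked centrally, which is the point of the safe sharing policy.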

The last factor driving cloud safety is encryption. Provided a user keeps track of their password, it is very difficult for a hacker to gain access to the files. They are stored either entirely in the cloud or at a secure, remote facility in an unknown location. Since the user’s connection to this information is encrypted, following it to gain access would be difficult, if not impossible, for a human hacker.

“Cybersecurity today is more about controlling access than managing data storage.”

It’s all about access
As TechTarget pointed out, cybersecurity today is more about controlling access than managing data storage. When hackers breach data, they typically do so because they have access to sensitive information. This can be a password or even a corporate email address. Cybercriminals infiltrate and steal information based on the access they’ve gained, typically from an unknowing authorized user.

Cloud solutions help monitor this access, keeping secure data under control. The providers offering these platforms have the expertise and the resources to keep cybersecurity evolving alongside the threats. In most cases, they have more resources than the client companies using their solutions.

The cybersecurity arms race
One popular cloud vendor is Microsoft. Each year the company invests over $1 billion into cybersecurity initiatives for its Azure platform. The money, explained Azure Government CISO Matthew Rathbun in an interview with TechRepublic, isn’t just about maintenance; it is about innovation:

“Ninety percent of my threat landscape starts with a human, either maliciously or inadvertently, making a mistake that somehow compromises security,” said Rathbun. “In an ideal state, we’re going eventually end up in a world where there’ll be zero human touch to an Azure production environment.”

Overseen by talented specialists with ample resources, cloud solutions are a safe form of data protection in today’s digital business space.

Is physical data destruction completely secure?

Cybersecurity is a paramount issue facing businesses in the digital world. The average costs of a successful cybercrime in 2017 were roughly $1.3 million for large enterprises and $117,000 for small- to medium-sized businesses, according to Kaspersky Lab. These figures include the cost of data theft but do not encompass the additional potential price of a damaged reputation and ensuing legal action. Data also indicates that cyberattacks will become only more expensive and damaging in the coming years.

Defending an organization against cybercrime requires a multi-channel approach. Companies should be open to software solutions, employee training and hardware upgrades whenever necessary. However, another avenue for cybercrime is occasionally overlooked. Physical theft of connected mobile devices, laptops and even desktop computers can lead to an open pathway for cyberattacks. In addition, some businesses simply sell their used electronics without first doing a proper data cleanse.

But can information be completely and permanently removed from a hard drive?

Hard drives are traditional data collection units that can be altered in a number of ways. However, the question is "can data be permanently removed?"

The levels of data destruction
Deleting data is not as secure as some might assume. In actuality, when information on a computer is "deleted," the files themselves are not immediately removed. Instead, the pathing to that information is expunged. The space is also designated as open, so the computer will eventually overwrite it. However, until that overwrite occurs, it is relatively easy for the information to be restored and accessed by any tech-savvy user.

Fortunately for organizations trying to permanently destroy their data, deletion is only the first step of the process. Lifewire recommended three additional methods to ensure that information remains lost.

First comes software – using a data destruction program on the hard drive. This method has been met with approval from the National Institute of Standards and Technology as a secure way to permanently remove information from a hard drive, according to DestructData. However, drawbacks include resource consumption, as this can be a time-intensive process. In addition, some overwriting tools can miss hidden data that is locked on the hard drive.
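The core of the software approach can be sketched as follows. This is a toy illustration of the overwrite idea only, under the assumption of a simple file on a conventional disk; real NIST-recognized destruction tools also handle hidden areas, slack space and drive-specific behavior, which this sketch deliberately ignores.

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's bytes in place before unlinking it, so the
    original contents are not left on disk awaiting recovery."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # one pass of random data
            f.flush()
            os.fsync(f.fileno())       # push the bytes to the device
    os.remove(path)                    # only now remove the file entry

# Demonstration with a throwaway file:
with open("secret.txt", "wb") as f:
    f.write(b"confidential")
overwrite_and_delete("secret.txt")
print(os.path.exists("secret.txt"))  # False
```

Note the contrast with plain deletion: `os.remove` alone would only drop the pathing to the file, leaving the bytes in place until the operating system happened to reuse the space.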

The most secure method to completely remove data is degaussing. Hard disk drives operate through magnetic fields, and degaussers scramble those fields. The result is a drive that can never be read again. In fact, the computer will not even register it as a hard drive from that moment on. However, the downside of this process is twofold: One, the drive is useless after degaussing. Two, this method works only on hard disk drives. Solid state drives and flash media do not use magnetism in the same way, so a degausser will be ineffective.

The final option is to physically destroy the data drive. While many people think that this task can be done with patience and a hammer, it is unfortunately not that simple. Hard drives can be rebuilt with the right tools and expertise. According to Computerworld, NASA scientists were able to recover data from the charred wreckage of the Columbia shuttle after it broke apart on re-entry in 2003.

Computers that are simply thrown out can still possess classified data, which can return to haunt the company.

The resiliency of hard drives
In short, it can be difficult to permanently expunge data from a hard drive. This reality is in part why businesses are opting for fewer internal data centers and more dependence on cloud solutions. According to TechTarget, cloud solutions represent a more secure method of data organization than traditional IT infrastructure.

While data can be safely deleted, the reality is, unless a degausser is used, there is always some chance of information recovery. Cybercriminals are becoming more sophisticated, and given the expensive nature of dealing with data breaches, it is understandable why the cloud is becoming the preferred solution.

Google joins the empowered edge with Cloud IoT Edge

The internet of things has been a rapidly growing segment of technology over the past decade. Ever since Apple made the smartphone a consumer success with its first iPhone, users have grown comfortable carrying technology in their hands and pockets. This IoT-filled world has created new opportunities and challenges.

According to IDC, connected devices will generate over 40 trillion gigabytes of data by 2025. This is too much of a good thing, especially if IoT devices remain only collectors and not processors. To help process this flood of data closer to where it is collected, Google has announced its Cloud IoT Edge platform, as well as a new hardware chip called the Edge tensor processing unit.

What are Google's new announcements?
Google described its decision to move forward on the Cloud IoT Edge platform as "bringing machine learning to the edge." Essentially, current edge devices, such as drones and sensors, transmit most of the data they collect back to a central location for processing. This procedure uses a lot of bandwidth and slows the speed at which decisions can be drawn from the data. It also depends on constant network connectivity, as any downtime can result in lost information.

Google's new software solution would allow this data processing to happen right at the data source. It will also enable advanced technology, such as machine learning and artificial intelligence, to operate on these edge devices. Enter the Edge TPU: This chip is designed to maximize performance per watt. According to Google, the Edge TPU can run TensorFlow Lite machine learning models at the edge, accelerating the "learning" process and making the software more efficient.
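The bandwidth argument behind edge processing can be made concrete with a small sketch. This is not Google's implementation; a simple threshold stands in for a real TensorFlow Lite model, and the sensor readings are randomly generated for illustration. The device evaluates data locally and transmits only the readings that matter.

```python
import random

THRESHOLD = 80.0  # stand-in for a trained model's decision boundary

def edge_filter(readings: list[float]) -> list[float]:
    """Process readings locally; forward only the anomalous ones."""
    return [r for r in readings if r > THRESHOLD]

random.seed(0)
raw = [random.uniform(20.0, 100.0) for _ in range(1000)]  # simulated sensor data
to_send = edge_filter(raw)

# Instead of streaming all 1,000 readings to the cloud, the device
# transmits only the small fraction that exceeds the threshold.
print(f"raw readings: {len(raw)}, transmitted: {len(to_send)}")
```

The same shape applies whether the local computation is a threshold, a statistical summary or an on-device neural network: the less raw data leaves the device, the less bandwidth and connectivity the deployment depends on.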

Google is seen as one of the big three when it comes to cloud infrastructure solutions.

How does this compare with the greater market?
In this announcement, Google is following in the path of Microsoft. Released globally in July, Azure IoT Edge accomplishes many of the same tasks that Cloud IoT Edge intends to. Both aim to empower edge devices with greater machine learning performance and reduce the amount of data that must be transmitted to a central location for processing.

However, as Microsoft has been in the hardware space much longer than Google, Azure IoT Edge launched without a companion TPU chip. Google may gain an advantage by releasing hardware designed to optimize its new platform's performance.

Amazon's AWS Greengrass also brings machine learning capabilities to IoT devices. However, unlike the other two, this platform has existed for a while and seen modular updates and improvements (rather than a dedicated new release).

The presence of all three cloud platform giants in the edge space signifies a shift to at-location data processing. Cloud networks have already been enjoying success for their heightened security features and intuitive resource sharing. As these networks become more common, it remains to be seen how Microsoft, Amazon and Google will deal with the increased vulnerabilities of many edge devices. However, with all three organizations making a sizeable effort to enter this market space, businesses should prepare to unlock the full potential of their edge devices and examine how this technology will affect workflows and productivity.

Why companies must change their infrastructure for optimal data visualization

The digital revolution has brought a host of new opportunities and challenges for enterprises in every sector of business. While some organizations have embraced the wave of change and made efforts to be at its forefront, others have held back. These institutions may not have sufficient staff to implement the changes or may be waiting for a proven added-value proposition.

In other words: No technology upgrades until the innovation is proven useful. There is some wisdom in this caution. Several reports have noted that productivity and efficiency are not rising at expected levels alongside this technology boom. However, as Project Syndicate noted, this lag may be a case of outgrowing the traditional productivity model, meaning that not every employee action is measured in the old system.

However, there is another reason to explain why more technology does not automatically equal greater efficiency and higher profits. If a company buys some new software, it will see a minor boost. However, it will reap the full rewards only if staff properly learn to use said platforms.

Part of this problem stems from the misunderstanding that technology can only improve current work processes. This is not true. When looking at a basic enterprise operation like data visualization, technology has created an entirely new workflow.

Examining the traditional business model
In the traditional business model, all data visualization was manual. Employees would gather data from various endpoints and then input it into a visual model. Common examples of this process included pie charts and bar graphs. The employee would then present the data to the executive, who would use it to make information-based decisions.

While acceptable, this process is far from optimized. Most data had to be generated in spreadsheets before it was collected, using formulas made by staff. Collecting and framing the information is a time-consuming process that will absorb at least one individual. As employees are involved at every step of this workflow, there is a potential for human error.

The time involved prevented companies from acting on real-time information. In the interim, intuition and "gut feeling" were used as substitutes for data-backed decisions. The manual handling at each step also raised the risk that the data in question was inaccurate or misleading.

Charts work because the human mind can understand pictures so much faster than words.

Unlocking data analytics 
Of course, with the arrival of the internet of things, companies have a lot more data collection at their disposal. Connected devices have provided a whole new network of information. This gold mine, also known as big data, has one downside: There is too much of it. A human cannot hope to categorize and analyze the information in any acceptable timeframe.

Enter data analytics. Using advanced technology like machine learning, companies can create and implement this software to study their data, creating automatic visualizations based on trends and prevalent patterns. According to Fingent, these software solutions employ mining algorithms to filter out irrelevant information, focusing instead on only what is important.
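The "filter out irrelevant information" step can be illustrated with a minimal sketch. The event records and field names here are invented for the example, and a simple list filter stands in for a real mining algorithm; the point is the pipeline shape, not the sophistication.

```python
from collections import Counter

# Raw event stream mixing relevant and irrelevant records.
events = [
    {"type": "page_view", "product": "boots"},
    {"type": "purchase", "product": "boots"},
    {"type": "purchase", "product": "scarf"},
    {"type": "page_view", "product": "scarf"},
    {"type": "purchase", "product": "boots"},
]

# Filtering step: keep only the events that matter for the question at hand.
purchases = [e for e in events if e["type"] == "purchase"]

# Aggregation step: produce the summary a visualization layer would chart.
top_products = Counter(e["product"] for e in purchases).most_common()
print(top_products)  # [('boots', 2), ('scarf', 1)]
```

In a production analytics solution the filter would be learned rather than hard-coded and the events would arrive continuously, but the flow is the same: discard the irrelevant, aggregate what remains, visualize automatically.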

However, companies cannot simply go from a traditional system to a fully fledged data analytics solution for one reason: data segmentation. Many enterprises divide their information based on different departments and specializations. Each group works internally, communicating primarily with itself. While this method is helpful for organization, it greatly impedes data analytics potential.

If companies have siloed their data, the program will have to reach into every source, work with every relevant software and bypass every network design. In short, it will have to work harder to communicate. While modern data analytics solutions are "smart," they cannot navigate barriers like this easily. They are designed to optimally read only the information that is readily available.

For organizations to fully capitalize on the potential of internal data analytics, infrastructure overhaul is needed. Departments – or at least their data – must be able to freely communicate with one another. This process entails implementing a common software solution that is in use across the entire company.

The good news is that many modern solutions fit this need. Solutions like cloud platforms store relevant data in accessible locations and train employees to not segment their work. By creating an infrastructure that is open to the data analytics program, organizations can start to act on information, rather than relying solely on their gut. 

Data analytics can give companies real-time answers to their challenges.

Should companies embrace Microsoft’s Azure IoT Edge?

As of late June 2018, one of Microsoft's newest software platforms, Azure IoT Edge, is generally available. This means that commercial enterprises and independent consumers now have access to it and, thanks to Microsoft's decision to take the platform open source, can begin modifying the technology to fit specific needs.

Every innovation brings new opportunity and unforeseen challenges, and there is no reason to suspect that Azure IoT Edge will be any different. Even programs created by technology industry leaders like Microsoft have their potential disadvantages. 

What exactly is Azure IoT Edge?
Simply put, Azure IoT Edge represents Microsoft's plan to move data analytics from processing centers to internet of things-enabled devices. This sophisticated edge computing technology can equip IoT hardware with cognitive computing capabilities such as machine learning and computer vision. It will also free up enormous bandwidth by moving data processing onto the device and allow IoT devices to perform more sophisticated tasks without constant human monitoring.

According to Microsoft, there are three primary components at play:

  1. A cloud-based interface will allow the user to remotely manage and oversee any and all Azure IoT Edge devices.
  2. IoT Edge runtime operates on every IoT Edge device and controls the modules deployed to each piece of IoT hardware.
  3. Every IoT Edge module is a container that operates on Azure services, third-party software or a user's personalized code. The modules are dispersed to IoT Edge machines and locally operate on said hardware.

Overall, Azure IoT Edge represents a significant step forward in cloud computing and IoT operations, empowering devices with functionality that wasn't previously possible.

Devices like drones will be able to carry out more sophisticated tasks using Azure IoT Edge.

The cybersecurity concerns of Azure IoT Edge
It is worth remembering that IoT hardware has a long and complicated history with cybersecurity standards. Because the bulk of IoT technology adoption has been driven by consumer, rather than enterprise, products, issues like security and privacy were placed second to interface design and price point.

Research firm Gartner found that 20 percent of organizations had already reported at least one IoT-centered data breach within the three years leading up to 2018. This risk has driven IoT security spending that is expected to reach $1.5 billion globally in 2018. Companies scrambling to make their IoT hardware more secure may want to prioritize that problem over incorporating Microsoft's newest software platform.

Another potential issue is Microsoft's decision to make the platform open source. The original code is public knowledge and now available to all to modify for personal use. While this flexibility will greatly help the product's user base expand, open source programs have not historically been the most secure from cybercriminals.

Magento, an open source platform that ran many ecommerce websites, became the target of a successful brute-force password attack in 2018. The resulting data breach led to thousands of compromised accounts and stolen credit information.

A Black Duck Software report tracked open source programs as they have become more widespread. While the overall quality of open source code is improving, the study found that many organizations do not properly monitor and protect the code once it has been put in place, leaving it vulnerable to exploitation from outside sources.

"Microsoft annually invests $1 billion in cybersecurity research."

The Microsoft advantage
However, Microsoft is arguably in a position to address the major security concerns with its Azure IoT Edge platform. The company invests over $1 billion in cybersecurity research each year. According to Azure Government CISO Matthew Rathbun, much of this money is spent with Azure in mind:

"Ninety percent of my threat landscape starts with a human, either maliciously or inadvertently, making a mistake that somehow compromises security," Rathbun told TechRepublic. "In an ideal state, we're going eventually end up in a world where there'll be zero human touch to an Azure production environment."

Azure IoT Edge represents a bold step forward in empowering IoT technology and improving automated productivity. While there are risks associated with every innovation, Microsoft remains committed to staying at the forefront and protecting its platforms. Companies should be willing to invest in Azure IoT Edge while remaining vigilant about the possible risks. 

Is blockchain the antidote to all cybersecurity woes?

Blockchain has been turning heads since it was first unveiled in 2008 as the backbone of the then relatively unknown cryptocurrency bitcoin. Since then, blockchain and bitcoin have skyrocketed in public awareness, with the latter becoming the most successful cryptocurrency in history. A large portion of bitcoin's success is due to its blockchain infrastructure, which prevents the duplication of funds (preventing double-spending) and automatically time-stamps every transaction.

The developer (or developers) behind blockchain created the software to be resistant to alteration or hacking, making it one of the more inherently secure systems that companies can use to manage secure infrastructures. Some have heralded blockchain as the ultimate tool to promote cybersecurity and reduce the risk of data breaches.

Then bitcoin and several other cryptocurrencies were hit by attacks. According to CNN, the attacks erased the equivalent of billions of dollars and sent the value of the affected cryptocurrencies plunging. The incident has many questioning just how secure blockchain is and whether the software is simply a temporary fix, like so many others, against the ever-present threat of cyberattacks.

"Blockchain can give each registered device a specific SSL certificate for authentication."

The case for blockchain
While buzzwords are common in the tech industry, there are several legitimate reasons why blockchain has been celebrated as a secure platform. According to Info Security Magazine, one of blockchain's primary appeals is its decentralized data storage. While users can access blockchain data on a computer or mobile device, the program itself is typically stored throughout the network.

If one access point – or block – is targeted by hackers, then the other blocks will react to it. The attempted cyberattack will likely alter the data on the block in a way that is immediately noticeable by the rest of the chain. This block will then simply be disconnected, isolating the malicious data before it can impact the system.

Another helpful advantage of blockchain is its effectiveness against distributed denial of service attacks. These cyberattacks target the domain name system, flooding it with so much data traffic that it essentially shuts down. Using blockchain software would allow the DNS to spread its contents across more nodes, reducing the effectiveness of a DDoS attack before it reaches a crippling stage.

Networks using a blockchain infrastructure can also bypass the need for passwords in certain situations. Instead of using the human-oriented password system, blockchain can give each registered device a specific SSL certificate. This mode of authentication is a lot more difficult for outside sources to access, reducing the likelihood of a hack.

Removing dependence on passwords may sound less secure, but it is actually seen as an improvement. Employees can be careless with their login information or choose passwords that can be easily deduced by third parties. Eliminating the human factor from authentication goes a long way by removing one of the most common exploit points.

However, no system is 100 percent secure.

The McAfee Report
While many companies preach the value of blockchain, global computer security software company McAfee recently released a critical report on the software, stating that industries have every reason to expect cyberattacks. McAfee looked at early blockchain adopters, namely cryptocurrencies, and studied the types of cyberattacks still occurring within these companies.

The report identified four primary attack types: implementation exploits, malware, phishing and general technology vulnerabilities. Certain cryptocurrencies themselves have been used to help the spread of advanced malware, including ransomware. Coin miner malware alone grew by 629 percent in the first quarter of 2018, according to McAfee data.

Cybercriminals have also been using cryptocurrencies to mask their identities, taking advantage of blockchain's secure features to help them evade the law.

Blockchain builds its infrastructure securely, but not in a manner that is invulnerable.

What companies can learn from the cryptocurrency attack
Ultimately, the attacks on cryptocurrencies themselves highlight the limitations of blockchain. While the program may be innately secure, that is not an excuse to abandon other forms of caution. Technology is spreading at a rapid pace, with information security specialists struggling to catch up.

In short, blockchain should be seen as just another tool and not a cure-all for cyberattacks. Its architecture can be helpful but must be implemented in a thorough, professional manner. Even then, it should also be paired with other programs and employee training to best reduce the risk of cybercrime.

How cloud infrastructure can help the retail sector

Cloud computing has caught on in a big way. A recent report from RightScale found that 81 percent of the enterprise sector has adopted a multi-cloud system in at least some way. Public cloud adoption rates have continued to climb as well, with the report noting that 92 percent of users now employ cloud technology (up from 89 percent in 2017). Across the board, cloud networks are gaining users thanks to improved interfaces, reduced dependence on in-house technical teams and flexible program structures.

However, some industry verticals continue to lag behind. The latest international Bitglass survey found that the retail sector has been slow to adopt cloud infrastructure. Only 47.8 percent of responding retail organizations had deployed the often-used Microsoft Office 365 suite, and Amazon Web Services – the most popular cloud system – was only used by 9 percent.

In short, retail is being left behind, and that lag is a serious problem for the industry – in part because retail is a sector that can profit immensely from successful cloud integration. However, cybersecurity concerns and technical knowledge limitations may be slowing down the adoption rate.

Taking advantage of mobile hardware
Almost everyone has a smartphone – that’s not an exaggeration. According to Pew Research Center data, 77 percent of Americans own one, and that number has been climbing steadily. Since smartphones are becoming cheaper and more user friendly, it is unlikely that this device will be replaced in the near future.

Because smartphones are so ubiquitous and convenient, consumers are using them for a wide variety of tasks, including shopping. OuterBox found that, as of early 2018, 62 percent of shoppers had made a purchase through their phones within the last six months. Another 80 percent had used their smartphones to compare products and deals while inside a store.

With a cloud infrastructure, retailers can better take advantage of this mobile world. Successful retail locations should consider maintaining at least two online networks – one for customers and another for employees. This setup will prevent bandwidth lag and help keep the consumer away from sensitive information. In addition, creating a mobile experience that is user friendly and seamlessly interwoven with the physical shopping experience is paramount.

Rather than building such a system from the ground up, retailers can take advantage of the numerous infrastructure-as-a-service cloud options available, leveraging a reliable third party rather than an in-house IT team.

Shoppers are already augmenting their experiences with external online information.

Getting ahead of the latest trends
Data drives business intelligence – this is true in every enterprise sector. In retail, housing the right products can mean the difference between turning a profit and going out of business. However, retailers still using traditional sales reporting will be slow to react to shopping trends, as these reports can take months to compile.

Data analytics is the actionable side of big data. In retail, customers convey valuable information about shopping habits before they even enter the store, but if this data is not being captured, it is essentially useless. Bringing in an encompassing data analytics solution, which can read information such as store purchases, response to sales and even social media reaction, can provide retailers with extra information to make actionable decisions.

“This analysis removes the guesswork about what will sell and which styles will flop on the shelves,” Roman Kirsch, CEO of fashion outlet Lesara, stated in an interview with Inc. “We don’t just know which new styles are popular, we can also identify retro trends that are making comebacks, which styles are on the way out, and that helps us to precisely manage our production.”

Improving inventory management
In addition, data analytics can be paired with a responsive inventory management program. Retail-as-a-service solutions exist and can be used to track stock availability, shipping orders and in-store details. With this software, retail companies can get a real-time image of how well products and even entire locations are performing.

These solutions can prevent item shortages before they occur and give retail chains a greater understanding of performance at every location.
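The shortage-prevention logic can be sketched simply. This is an invented illustration, not any retail-as-a-service product's API; the locations, stock levels and reorder point are made up. A reorder is flagged before stock runs out, and the next shipment is routed to the location most in need.

```python
REORDER_POINT = 20  # hypothetical threshold that triggers a restock

# Real-time stock levels per location (invented figures).
stock = {"downtown": 45, "airport": 12, "suburb": 31}

def reorder_alerts(stock: dict[str, int]) -> list[str]:
    """Flag locations whose stock has fallen below the reorder point,
    before an outright shortage occurs."""
    return [loc for loc, qty in stock.items() if qty < REORDER_POINT]

def neediest_location(stock: dict[str, int]) -> str:
    """Pick the location most in need of the next shipment."""
    return min(stock, key=stock.get)

print(reorder_alerts(stock))     # ['airport']
print(neediest_location(stock))  # 'airport'
```

A production system would feed these checks from live point-of-sale and shipping data, but the decision logic – alert below a threshold, route to the lowest stock – is the core of responsive inventory management.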

Using inventory management solutions can help retailers maximize their shipping profits. They can ship either directly to the customer or to the retail location most in need.

Concerning cybersecurity
Perhaps one of the factors slowing the adoption of cloud technology in the retail sector is cybersecurity. Retail organizations process multitudes of consumer credit card information every day, and the fallout from a data breach can be fatal in this sector. When faced with choosing between cloud technology and in-house data center solutions, retail executives may believe that the safest hands are still their own.

However, this may not be the case. Research firm Gartner predicted that through 2022, 95 percent of cloud security failures will be the customer’s fault, meaning that issues will not come from a software defect but through poor implementation. The firm also concluded that cloud structures will see as much as 60 percent fewer cyberattacks than those businesses with in-house servers.

Cloud infrastructure is secure but must be installed and operated properly. The only thing retailers have to fear from this new solution is technological ignorance, and many cloud providers and third-party services stand ready to aid in the implementation process.

Should companies embrace wearables?

Technology has grown far more mobile within the last decade. The laptop already allowed employees to maintain productivity on the go, and it has since been joined by the commercial smartphone, the tablet and, now, wearables. Each new hardware unveiling has increased the amount of work that can be done while mobile. This shift is leading some in the enterprise space to rethink office structure and workflow.

However, should businesses be embracing innovation at this pace? Rapid adoption of any new technology has downsides and, with cybersecurity concerns on the rise, utilizing innovative hardware can have serious repercussions. Since wearables represent the newest hardware and software infrastructure hitting industries, the question becomes: Should companies embrace this technology or exercise caution until it has become more mainstream?

"Mobile workplaces lead to improved employee retention."

The advantages of workplace mobility
A mobile workplace strategy provides several advantages. Many of these benefits, such as the greater likelihood for increased collaboration among employees, are straightforward. The more data that workers can store on their person, the less they'll have to retreat to their desks to retrieve information.

Another benefit that may not be so apparent is how mobile workplaces improve employee retention. Workers who sit at their desks all day may be busy, but they are not necessarily engaged in the workplace or its culture. Without that engagement, the work becomes just another job, and the employee may eventually leave for one that pays better or offers superior benefits. According to Deloitte data, however, engaged employees are 87 percent more likely to remain at their companies.

Mobile workflow allows workers to get up, be more flexible and do more, all of which can lead to higher levels of productivity and revenue for a business. In some ways, wearables represent the pinnacle of mobile workplace technology. With a device like augmented reality glasses, workers don't even have to glance down at a screen to see data. This flexibility means employees can update one another in real time with the most relevant data.

How to embrace BYOD for wearables
It feels strange to say now, but the smartphone did not begin with the iPhone. BlackBerrys and other enterprise devices existed for years prior to Apple's launch. However, in less than a decade, Apple and Samsung overthrew the BlackBerry and now enjoy immense adoption rates. The reason? People liked using the tech.

Likewise, workers brought this hardware to the office before many organizations had concrete "bring your own device" policies in place. Some businesses still resist given the information security concerns associated with BYOD. However, rejecting BYOD can be just as perilous because many employees will still use personal devices anyway.

The better option is to embrace the mobile nature of this new hardware and develop a comprehensive BYOD policy that covers and monitors every device. According to Tenable, many companies make BYOD available to all (40 percent) or some (32 percent) of their employees, so the goal is to design a strategy that reflects each employee's device usage.

Pew Research found that, unsurprisingly, 77 percent of Americans own a smartphone. Another 53 percent own a tablet. Wearables are newer, so their distribution is much lower. Even relatively common devices like the Fitbit have not reached the adoption level of tablets, and wearable glasses have yet to have their "iPhone moment," when a single consumer device catches on and enjoys wide commercial appeal.

That said, a lower number of these devices does not mean companies can ignore them. Valuable data can be stored on a smartwatch as easily as it can on a laptop. Companies using BYOD should plan for wearables now before the devices become mainstream, allowing IT teams to create and deploy a strategy that will be safe.

Most wearables are linked to a smartphone, meaning they share the same data library.

The problematic nature of cybersecurity
Cybersecurity has been struggling to keep pace with the internet of things in general, and, unfortunately, wearables are no exception. A product examination conducted by HP Fortify found no hardware with two-factor authentication, and every smartwatch tested stored confidential information that could be used for identity theft. These devices also received limited security updates.

Wearables will likely be driven by the same commercial appeal that spurs other recent technology, meaning the two factors stressed above all else will be price and usability. While this focus will make employees happy, it can create headaches for an IT team or chief information security officer.

To help improve the cybersecurity of these devices, businesses can treat them much like smartphones, placing them on a separate network that holds less sensitive information. Organizations can also implement custom multi-step authorization software wherever possible.
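Multi-step authorization of this kind is often built on time-based one-time passwords. As a rough sketch (not any vendor's implementation), the standard TOTP scheme from RFC 6238 can be written with Python's standard library; the secret below is the RFC's published test key, not a production credential:

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32, timestamp, digits=6, step=30):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of 30-second steps since the epoch.
    counter = struct.pack(">Q", int(timestamp) // step)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes at an offset taken from the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test key; at t = 59 seconds the expected code is 287082.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, 59))  # 287082
```

A wearable that cannot display such a code itself can still be gated behind one entered on the paired smartphone.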

Augmented reality glasses often have live feeds, meaning that, if hacked, outside sources can see operating data.

Know which wearables can make an impact
Lastly, businesses should not presume that all wearable technology will be viable in an enterprise setting. For instance, AR glasses will need a battery life of at least eight hours to last a full day of work, and smartwatches will have to be durable enough to withstand occasional bumps, even in an office environment.

Before investing in any official company-sanctioned hardware, thoroughly research and test devices to be sure they perform well in a typical environment. Wearables are cutting-edge technology, and many products now are designed for only niche markets rather than the mainstream.

So while companies can adopt wearables now, it makes sense to first have a policy in place. This isn't the iPhone. Businesses have a chance to get ahead of mass wearable adoption and create policies that make sense rather than reacting to the latest tech trend.

How cave fish may help prevent IoT jamming

Jamming is a potentially crippling blow to internet of things-enabled hardware. It can bring drones down from the sky, disrupt network connections and lead to costly downtime. In the cybersecurity arena, jamming is more commonly known as a distributed denial-of-service (DDoS) attack. According to a Corero DDoS trends report, this method of cyberattack increased by an incredible 91 percent in 2017.

IoT devices are behind this surge in DDoS attacks, as many lack comprehensive cybersecurity protocols and can be easily jammed. While this deterrent is not enough to slow the pace of IoT adoption, enterprises hoping to make use of mass IoT market penetration must be aware of the risks, as well as what is being done to prevent IoT jamming.

Luckily, a recent study published in Optics Express gives some hope against rampant DDoS cybercrime. As with many technological innovations, the potential salvation is inspired by a system that already works inside the animal kingdom.

Studying the Eigenmannia
The Eigenmannia are a species of cave fish that live in total darkness. Without light, these creatures need another way to hunt, communicate and otherwise "see" in the perpetual dark. Researchers studying these fish discovered that they emit an electric field to sense their environment and communicate with other fish.

Because two or more of these animals can emit their fields near one another, the species needed a way to keep the signals from disrupting each other; otherwise, the fish couldn't thrive. The scientists learned that Eigenmannia have the ability to alter their signals, a capability owed to a unique neural algorithm in their brain activity. The purpose and function of the field remain intact, but its frequency is changed just enough to avoid confusion.

This same trait can be harnessed to help create a light-based jamming avoidance response device.

If jammed, drones run the risk of damaging hardware and products.

Creating a jamming avoidance response device
When two IoT devices operating on the same frequency come close to each other, the fields become crossed, and jamming occurs. The closer the two pieces of hardware drift, the more the disruption intensifies.

However, with a JAR device, similar to the natural solution used by Eigenmannia, these IoT components could adjust their frequency, preserving the function of the signal while avoiding jamming. Using a light-based system would enable IoT devices to shift through a wide range of frequencies.
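The adjustment rule itself is simple enough to sketch in code: compare the two frequencies and nudge your own away from the interferer until the gap is wide enough. The threshold and step values below are hypothetical, chosen only to illustrate the behavior:

```python
# Illustrative sketch of a jamming avoidance response: an emitter nudges
# its frequency away from a nearby interfering signal, much as Eigenmannia
# shift their electric fields. All frequencies here are hypothetical.

def jar_adjust(own_hz, neighbor_hz, threshold_hz=50.0, step_hz=10.0):
    """Shift our frequency away from a neighbor's if the two are too close."""
    gap = own_hz - neighbor_hz
    if abs(gap) >= threshold_hz:
        return own_hz               # far enough apart; no change needed
    if gap >= 0:
        return own_hz + step_hz     # neighbor is below us: shift up
    return own_hz - step_hz         # neighbor is above us: shift down

f = 400.0
while abs(f - 420.0) < 50.0:        # interferer fixed at 420 Hz
    f = jar_adjust(f, 420.0)
print(f)  # 370.0
```

As in the fish, the signal keeps doing its job; only its frequency drifts, and always in the direction that widens the gap rather than crossing the interferer.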

The resulting machine, created by the research team, shows promise.

"This could allow a smarter and more dynamic way to use our wireless communication systems without the need for the complicated coordination processes that currently prevent jamming, by reserving whole sections of bandwidth for specific phone carriers or users such as the military," said team lead Mable P. Fok.

While it won't single-handedly eliminate the threat of DDoS attacks, JAR device usage on a large scale has some advantages. Essentially, it is a low-cost solution for any agency that utilizes a plethora of IoT content. In addition to the aforementioned military use case, health care facilities like hospitals, air traffic control towers and even educational institutions could find immediate value in this technology.

Since a JAR device would likely lower the bandwidth needed for IoT hardware interaction, defending against DDoS attacks could become less expensive. As these attacks grow more prevalent, the value of this research will only increase. Designing IoT devices on software that can shift frequency will reduce costs and, hopefully, create a more secure IoT landscape.

The potential of Project Kinect for Azure

When Microsoft first debuted its Kinect hardware in 2010, the product had nothing to do with edge computing, AI or machine learning. The Kinect served as a controller interface for Microsoft's Xbox 360 video game console. (Later versions were released for Windows PC and Xbox One.) Using cameras and sensors, it registered a player's body movements and translated those gestures into controls. While innovative, the Kinect struggled to gain a foothold.

Despite going through various upgrades, it was fully discontinued as a consumer product in 2017. However, Microsoft did not abandon its Kinect hardware. At this year's Build developer conference, the company revealed a new use for its one-time video game accessory: edge computing.

Specifically, the new Kinect project factors into the greater themes of Build 2018, namely combining cognitive computing, AI and edge computing. 

"Microsoft has ambitious plans to bring its Cognitive Services software to Azure IoT Edge."

Microsoft at Build 2018
Edge computing is at the forefront of technological innovation. Capitalizing on the internet of things, this method of data processing de-emphasizes a central hub. Remote sensors receive computer processing power to analyze the data near its source before sending it back, greatly reducing bandwidth needs. This system is also more dependable because the sensors store the data, at least for a limited time span. Network outages or dropped connections won't result in lost or fragmented information.
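The resilience described above comes from local buffering, which can be sketched in a few lines of Python. The sensor and its uplink are simulated here; this is an illustration of the pattern, not any real edge framework:

```python
from collections import deque

# Sketch of an edge sensor that stores readings locally and flushes them
# upstream only when the connection is up, so outages never lose data.

class EdgeSensor:
    def __init__(self, buffer_size=1000):
        self.buffer = deque(maxlen=buffer_size)  # bounded local store

    def record(self, reading):
        self.buffer.append(reading)              # always store locally first

    def flush(self, uplink_ok):
        """Send buffered readings upstream when the connection is available."""
        sent = []
        if uplink_ok:
            while self.buffer:
                sent.append(self.buffer.popleft())
        return sent

sensor = EdgeSensor()
for temp in (21.4, 21.6, 21.5):
    sensor.record(temp)

print(sensor.flush(uplink_ok=False))  # [] -- outage: data stays local
print(sensor.flush(uplink_ok=True))   # [21.4, 21.6, 21.5]
```

The bounded deque is the trade-off to note: the sensor holds data "at least for a limited time span," with the oldest readings dropped if an outage outlasts the buffer.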

However, these sensors are, at the moment, fairly basic equipment. Microsoft aims to change that. At Build 2018, the company announced ambitious plans to bring its Cognitive Services software to its edge computing solution, Azure IoT Edge. According to TechCrunch, the first of these programs will be the Custom Vision service.

Implementation of this software with Azure IoT Edge can allow unmanned aerial vehicles, such as drones, to perform more complex tasks without direct control from a central data source. It will give these devices the ability to "see" and understand the environment around them, analyzing new visual data streams. This technology can also be used to improve advanced robotics, autonomous vehicles and industrial machines.

This advanced method of machine learning can increase productivity because all of these devices will be able to continue to perform complicated, vision-based tasks even with network connection disruptions.

Microsoft has also partnered with Qualcomm to bring cognitive vision developer's tools to devices like home assistants, security cameras and other smart devices.

However, these technologies, the Qualcomm partnership and the Custom Vision service, useful as they are, only work with devices equipped with sensors and cameras that can process visual data. To increase the variety of edge sensors that can benefit from these new tools and software services, Microsoft resurrected the Kinect.

Allowing advanced robotics to "see" will enable them to perform far more complex actions, even without a constant relay of instructions.

The power of the Kinect 
In an introduction on LinkedIn, Microsoft Technical Fellow Alex Kipman discussed Project Kinect for Azure. In his piece, Kipman outlined the company's reasoning for opting to return to the commercial failure. First, Kinect has a number of impressive features that make it ideal as a sensor.

These benefits include its 1,024 x 1,024-pixel depth resolution, which Microsoft says is the highest of any such sensor camera. Kinect also comes with a global shutter that helps the device record accurately in sunlight. Its cameras capture images with automatic per-pixel gain selection, allowing the Kinect to capture objects at various ranges cleanly and without distortion. It features multiphase depth calculation to further improve image accuracy, even when dealing with power supply variation and the presence of lasers. Lastly, the Kinect is a low-power piece of hardware thanks to its high modulation frequency and contrast.

Utilizing the Kinect sensors for cognitive computing makes sense. Looking at the product's history, Microsoft had already developed more than half the specifications needed to create an effective sensor. The Kinect was designed to track and process human movement, differentiate users from animals or spectators in the room and operate in numerous real-world settings. It was also made to endure drops and other household accidents. Essentially, the Kinect was a hardy, specialized sensor competing in a market that demanded precise button pressing.

In an industrial space, the Kinect can fare far better. Augmenting existing data-collection sensors with this visual aid will increase the amount of actionable data recorded. The Kinect brings a set of "eyes" to any machine, an advantage that will let developers and engineers get creative as they build the advanced edge computing networks of the future.