Freddy’s is now ready for tremendous future growth with the flexibility and security of ISG’s cloud solution. Our straightforward approach to implementation, and our willingness to work outside of business hours, ensured both client satisfaction and business success.
VoIP (Voice over Internet Protocol) is a technology that allows businesses to make phone calls with a broadband internet connection, instead of using a telephone landline. As companies grow increasingly dependent on the internet to do their business, VoIP has become more and more appealing for organizations of all sizes and industries.
According to market research and advisory firm Zion Research, the global VoIP industry is expected to surge from $83 billion in 2015 to $140 billion by 2021.
Despite the move towards hosted VoIP, many companies have chosen to remain with an on-premises PBX (private branch exchange) solution.
In this article, we’ll discuss the definitions of hosted VoIP and on-premises PBX and then go over the pros and cons so that you can make the right choice for your business.
What is hosted voice?
The term “hosted” means that the VoIP provider is responsible for hosting the services in the cloud. In other words, the telephones at your business headquarters use the internet to connect to the equipment hosted by the VoIP provider at an off-site location.
Most hosted VoIP providers use a recurring monthly or annual pricing model, which includes a predetermined number of minutes as well as a given set of features. However, some providers offer a per-minute pricing model for additional flexibility.
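To see how the two pricing models trade off, here is a quick sketch of a breakeven comparison. All rates here are made up for illustration and are not any provider's actual pricing:

```python
def monthly_cost_flat(plan_fee: float, included_minutes: int,
                      minutes_used: int, overage_rate: float) -> float:
    """Cost of a flat monthly plan: fixed fee plus any overage minutes."""
    overage = max(0, minutes_used - included_minutes)
    return plan_fee + overage * overage_rate

def monthly_cost_metered(minutes_used: int, per_minute_rate: float) -> float:
    """Cost of a pure per-minute (metered) plan."""
    return minutes_used * per_minute_rate

# Hypothetical rates: $25/month with 1,000 minutes included and a
# 3-cent overage rate, versus a 2-cent-per-minute metered plan.
for used in (500, 1500, 3000):
    flat = monthly_cost_flat(25.0, 1000, used, 0.03)
    metered = monthly_cost_metered(used, 0.02)
    print(f"{used} min: flat ${flat:.2f} vs metered ${metered:.2f}")
```

Light callers come out ahead on metered pricing, while heavy callers are better served by the flat plan; running the numbers for your own call volume is the simplest way to choose.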
It’s worth noting that many hosted voice offerings are part of a “unified communications” solution that combines phone, email, fax, chat, and video capabilities. Indeed, some companies treat the terms “hosted voice” and “unified communications” almost synonymously.
What is on-premises PBX?
A private branch exchange (PBX) is a private telephone network that manages the internal and external phone calls of an enterprise.
As the name suggests, “on-premises” PBX means that your business is responsible for maintaining the necessary hardware on-site. You have ultimate ownership of, and responsibility for, the network.
Hosted voice vs. on-premises PBX: advantages and disadvantages
Let’s look at the pros and cons of each option.
Using an on-premises PBX solution is typically much more expensive when first starting out. You need to purchase your own hardware, set it up, and perform your own maintenance.
Meanwhile, hosted VoIP uses an OPEX (operating expense) cost model, so your monthly expenses are much more predictable (and often lower).
If you’re just starting out and are unsure which option is best for you, then you should likely choose a hosted VoIP solution.
Your VoIP provider shoulders all the load in terms of future work and expansion, including concerns such as maintenance and software updates. This gives you a great deal more flexibility.
The biggest question mark in terms of hosted voice is reliability.
Because VoIP relies on an internet connection in order to function, VoIP customers will be left without phone service when their internet goes down or when they experience a power failure.
Before opting for a hosted VoIP solution, be sure you have a reliable internet connection.
If your company has the IT expertise required to perform the installation and you’re willing to handle all of the responsibilities, then an on-premises PBX may not be a bad choice.
For most businesses, however, the lower costs, lessened maintenance obligations, and increased flexibility of a hosted VoIP solution are enough for them to make the switch.
Need some expert advice about whether hosted voice or on-premises PBX is right for your organization? The right managed services provider can help make the decision a lot easier. Contact your MSP for some good advice on the solution that best fits your situation.
[The pros and cons of hosted voice vs. on-premises PBX | ISG Tech | 2018-10-04]
The school year is underway, and Backup School is back! Together, ISG and Veeam focus on educating our clients and their organizations about how they can keep their business up and running and eliminate downtime – even when the unexpected happens.
Office 365 is a powerful suite of products – but it lacks a comprehensive backup of some of your most critical data. Learn how to protect yourself in this webinar.
[Webinar: Everything You Need to Know About Backup for Office 365 | ISG Tech | 2018-09-24]
Cloud computing has been gaining popularity in the business space over the last couple of years. Organizations are abandoning server-based data centers in favor of third-party-provided solutions. Yet as more data is stored digitally, the danger of hacking grows. Companies are losing significant income to data breaches, and cybercriminals are developing new, sophisticated ways to steal data.
So why are companies taking their information to the cloud? Many executives want to push their businesses to the cloud but don’t fully understand how it works. As such, they may be wary of the idea of removing confidential information from complete corporate oversight. However, the cloud is not as penetrable as its name might imply.
Three factors driving cloud safety

According to Forbes, there are three principal factors helping to keep data secure when it is in a cloud platform. The first is redundancy. Losing data can be almost as harmful as having it stolen. When a server fails or a hacker gains access to a corporate network and deletes or attempts to ransom vital information, companies can lose months of productivity. Most cloud networks, however, typically keep data in at least three locations.
This means that lost data at one location, such as data loss caused by a server failure, will not have the disastrous impact that it could in an organization relying on an on-premises data center. By keeping copies of each file, cloud solutions make sure mission-critical data is accessible until the user no longer wants it.
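The redundancy idea can be made concrete with a toy model. The location names and replicate-everywhere policy below are illustrative, not any particular provider's architecture:

```python
class ReplicatedStore:
    """Toy model of cloud redundancy: every object is written to several
    independent locations, so losing any one location loses no data."""

    def __init__(self, locations=("site-a", "site-b", "site-c")):
        self.nodes = {loc: {} for loc in locations}

    def put(self, key, value):
        for store in self.nodes.values():   # replicate to every location
            store[key] = value

    def fail(self, location):
        self.nodes[location].clear()        # simulate a server failure

    def get(self, key):
        for store in self.nodes.values():   # any surviving copy will do
            if key in store:
                return store[key]
        raise KeyError(key)

store = ReplicatedStore()
store.put("q3-report", "mission-critical data")
store.fail("site-a")                        # one location goes down...
print(store.get("q3-report"))               # ...the data is still readable
```

Only when every location holding a copy fails is the data actually gone, which is the point of keeping three or more copies.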
The second factor is the safe sharing policy. Anyone who has ever used the popular Google Docs knows how file sharing works. Rather than making a copy, the user must enter the email address of anyone they want to see the file. These extra users can’t share the file on their own (unless given express permission), they simply have access to the information. This is how safe sharing works. It prevents any unauthorized copies from being created or distributed. Users have access to their own data and can control exactly who sees it.
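A minimal sketch of that sharing model might look like the following. The class and method names are invented for illustration and are not Google's actual API:

```python
class SharedDocument:
    """Toy model of 'safe sharing': access is granted to identities,
    not distributed as copies of the file."""

    def __init__(self, owner: str, content: str):
        self.content = content
        self.viewers = {owner}   # who may read the document
        self.sharers = {owner}   # who may grant access to others

    def share(self, grantor: str, grantee: str, can_reshare: bool = False):
        if grantor not in self.sharers:
            raise PermissionError(f"{grantor} may not share this document")
        self.viewers.add(grantee)
        if can_reshare:          # express permission to pass access on
            self.sharers.add(grantee)

    def read(self, user: str) -> str:
        if user not in self.viewers:
            raise PermissionError(f"{user} has no access")
        return self.content
```

Because there is only one copy and the owner controls the viewer list, revoking a user or auditing who can see the file stays straightforward.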
The last factor driving cloud safety is encryption. Provided a user keeps track of their password, it is very difficult for a hacker to gain access to the files. They are being stored either entirely in the cloud or at a secure, remote facility in an unknown location. Since the user’s connection to this information is encrypted, following it to gain access would be difficult, if not impossible for a human hacker.
“Cybersecurity today is more about controlling access than managing data storage.”
It’s all about access

As TechTarget pointed out, cybersecurity today is more about controlling access than managing data storage. When hackers breach data, they typically do so because they have access to sensitive information. This can be a password or even a corporate email address. Cybercriminals infiltrate and steal information based on the access they’ve gained, typically from an unknowing authorized user.
Cloud solutions help monitor this access, keeping secure data under control. The providers offering these platforms have the expertise and the resources to keep cybersecurity evolving alongside the threats. In most cases, they have more resources than the client companies using their solutions.
The cybersecurity arms race

One popular cloud vendor is Microsoft. Each year the company invests over $1 billion in cybersecurity initiatives for its Azure platform. The money, explained Azure Government CISO Matthew Rathbun in an interview with TechRepublic, isn’t just about maintenance, it is about innovation:
“Ninety percent of my threat landscape starts with a human, either maliciously or inadvertently, making a mistake that somehow compromises security,” said Rathbun. “In an ideal state, we’re going eventually end up in a world where there’ll be zero human touch to an Azure production environment.”
Overseen by talented specialists with ample resources, cloud solutions are a safe form of data protection in today’s digital business space.
[Why cloud computing is safe | RJackson | 2018-08-08]
Cybersecurity is a paramount issue facing businesses in the digital world. The average costs of a successful cybercrime in 2017 were roughly $1.3 million for large enterprises and $117,000 for small- to medium-sized businesses, according to Kaspersky Lab. These figures include the cost of data theft but do not encompass the additional potential price of a damaged reputation and ensuing legal action. Data also indicates that cyberattacks will become only more expensive and damaging in the coming years.
Defending an organization against cybercrime requires a multi-channel approach. Companies should be open to software solutions, employee training and hardware upgrades whenever necessary. However, another avenue for cybercrime is occasionally overlooked. Physical theft of connected mobile devices, laptops and even desktop computers can lead to an open pathway for cyberattacks. In addition, some businesses simply sell their used electronics without first doing a proper data cleanse.
But can information be completely and permanently removed from a hard drive?
The levels of data destruction

Deleting data is not as secure as some might assume. In actuality, when information on a computer is "deleted," the files themselves are not immediately removed. Instead, the pathing to that information is expunged. The data is also designated as open space, so the computer will eventually overwrite it. However, until this overwrite occurs, it is relatively easy for the information to be restored and accessed by any tech-savvy user.
Fortunately for organizations trying to permanently destroy their data, deletion is only the first step of the process. Lifewire recommended three additional methods to ensure that information remains unrecoverable.
First comes software – using a data destruction program on the hard drive. This method has been met with approval from the National Institute of Standards and Technology as a secure way to permanently remove information from a hard drive, according to DestructData. However, drawbacks include resource consumption, as this can be a time-intensive process. In addition, some overwriting tools can miss hidden data that is locked on the hard drive.
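The overwrite approach these tools take can be sketched in a few lines. This is a simplification: real data destruction tools also handle hidden and remapped sectors, and journaling file systems or SSD wear-leveling can retain stale copies that a sketch like this never touches:

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's bytes in place before unlinking it, so the
    blocks handed back to the OS no longer hold the original data."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random-data pass
            f.flush()
            os.fsync(f.fileno())                # force bytes to disk
        f.seek(0)
        f.write(b"\x00" * size)                 # final zero pass
        f.flush()
        os.fsync(f.fileno())
    os.remove(path)                             # only now delete the file
```

Contrast this with a plain `os.remove`, which drops only the directory entry and leaves the original bytes sitting in free space until something happens to overwrite them.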
The most secure method to completely remove data is degaussing. Hard disk drives operate through magnetic fields, and degaussers scramble those fields. The result is a drive that can never be read again. In fact, the computer will not even register it as a hard drive from that moment on. However, the downside in this process is twofold: One, the drive is useless after degaussing. Two, this method works only on hard disk drives. Solid state drives and flash media do not use magnetism in the same way, so a degausser will be ineffective.
The final option is to physically destroy the data drive. While many people think that this task can be done with patience and a hammer, it is unfortunately not that simple. Hard drives can be rebuilt with the right tools and expertise. According to Computerworld, NASA scientists were able to recover data from the charred wreckage of the Columbia shuttle after it broke apart on re-entry in 2003.
The resiliency of hard drives

In short, it can be difficult to permanently expunge data from a hard drive. This reality is part of why businesses are opting for fewer internal data centers and greater dependency on cloud solutions. According to TechTarget, cloud solutions represent a more secure method of data organization than traditional IT infrastructure.
While data can be safely deleted, the reality is, unless a degausser is used, there is always some chance of information recovery. Cybercriminals are becoming more sophisticated, and given the expensive nature of dealing with data breaches, it is understandable why the cloud is becoming the preferred solution.
[Is physical data destruction completely secure? | RJackson | 2018-07-31]
The internet of things has been a rapidly growing segment of technology over the past decade. Ever since Apple made the smartphone a consumer success with its first iPhone, users have grown comfortable carrying technology in their hands and pockets. This IoT-filled world has created new opportunities and challenges.
According to IDC, connected devices will generate over 40 trillion gigabytes of data by 2025. This is too much of a good thing, especially if IoT devices remain only collectors and not processors. To help process data closer to where it is collected, Google has announced its Cloud IoT Edge platform, as well as a new hardware chip called the Edge tensor processing unit (TPU).
What are Google's new announcements?

Google described its decision to move forward on the Cloud IoT Edge platform as "bringing machine learning to the edge." Essentially, current edge devices, such as drones and sensors, transmit most of the data they collect back for internal processing. This procedure uses a lot of bandwidth and reduces the speed at which decisions can be drawn from the data. It also places a lot of stress on constant network connectivity, as any downtime can result in lost information.
Google's new software solution would allow this data processing to happen right at the data source. It will also enable advanced technology, such as machine learning and artificial intelligence, to operate on these edge devices. Enter the Edge TPU: This chip is designed to maximize performance per watt. According to Google, the Edge TPU can run TensorFlow Lite machine learning models at the edge, accelerating the "learning" process and making the software both faster and more efficient.
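The bandwidth argument behind edge processing is easy to demonstrate with a toy example, with simple statistics standing in for a real machine learning model:

```python
import json
import statistics

def raw_payload(readings):
    """Cloud-centric approach: ship every raw reading upstream."""
    return json.dumps({"readings": readings}).encode()

def edge_payload(readings):
    """Edge approach: process locally, transmit only a compact summary."""
    summary = {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }
    return json.dumps(summary).encode()

# e.g. one temperature reading per second for ~17 minutes
readings = [20.1 + 0.01 * i for i in range(1000)]
print(len(raw_payload(readings)), "bytes raw vs",
      len(edge_payload(readings)), "bytes summarized")
```

The summary is orders of magnitude smaller than the raw stream, and if the network drops, the device still holds the data it has not yet summarized and sent.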
How does this compare with the greater market?

In this announcement, Google is following in the path of Microsoft. Released globally in July, Azure IoT Edge accomplished many of the same tasks that the Cloud IoT Edge solution intends to. The two aim to empower edge devices with greater machine learning performance and reduce the amount of data that must be transmitted to be understood.
However, as Microsoft has been in the hardware space much longer than Google, no TPU chip needed to accompany the Azure IoT Edge release. It is possible that Google may gain an advantage by releasing hardware designed to optimize its new platform performance.
Amazon's AWS Greengrass also brings machine learning capabilities to IoT devices. However, unlike the other two, this platform has existed for a while and seen modular updates and improvements (rather than a dedicated new release).
The presence of all three cloud platform giants in the edge space signifies a shift to at-location data processing. Cloud networks have already been enjoying success for their heightened security features and intuitive resource sharing. As these networks become more common, it remains to be seen how Microsoft, Amazon and Google deal with the increased vulnerabilities of many edge devices. However, with all three organizations making a sizeable effort to enter this market space, businesses should prepare to unlock the full potential of their edge devices and examine how this technology will affect workflows and productivity.
[Google joins the empowered edge with Cloud IoT Edge | RJackson | 2018-07-31]
As of late June 2018, one of Microsoft's newest software platforms, Azure IoT Edge, is generally available. This means that commercial enterprises and independent consumers now have access to it and, thanks to Microsoft's decision to take the platform open source, can begin modifying the technology to fit specific needs.
Every innovation brings new opportunity and unforeseen challenges, and there is no reason to suspect that Azure IoT Edge will be any different. Even programs created by technology industry leaders like Microsoft have their potential disadvantages.
What exactly is Azure IoT Edge?

Simply put, Azure IoT Edge represents Microsoft's plan to move data analytics from processing centers to internet of things-enabled devices. This sophisticated edge computing technology can equip IoT hardware with cognitive computing technologies such as machine learning and computer vision. It will also free up enormous bandwidth by moving the data processing location to the device, and it will allow IoT devices to perform more sophisticated tasks without constant human monitoring.
A cloud-based interface will allow the user to remotely manage and oversee any and all Azure IoT Edge devices.
IoT Edge runtime operates on every IoT Edge device and controls the modules deployed to each piece of IoT hardware.
Every IoT Edge module is a container that operates on Azure services, third-party software or a user's personalized code. The modules are dispersed to IoT Edge machines and locally operate on said hardware.
Overall, Azure IoT Edge represents a significant step forward in cloud computing and IoT operations, empowering devices with functionality that wasn't possible before.
The cybersecurity concerns of Azure IoT Edge

It is worth remembering that IoT hardware has a long and complicated history with cybersecurity standards. Because the bulk of IoT technology adoption has been driven by consumer, rather than enterprise, products, issues like security and privacy were placed second to interface design and price point.
Research firm Gartner found that 20 percent of organizations had already reported at least one IoT-centered data breach within the three years leading up to 2018. This risk has led to IoT security spending that is expected to reach $1.5 billion globally in 2018. Some companies scrambling to make their IoT hardware more secure may prioritize that problem over incorporating Microsoft's newest software platform.
Another potential issue is Microsoft's decision to make the platform open source. The original code is public knowledge and now available to all to modify for personal use. While this flexibility will greatly help the product's user base expand, open source programs have not historically been the most secure from cybercriminals.
Many ecommerce websites run on the Magento platform, an open source solution that became the target of a brute force password attack in 2018, which ultimately proved successful. The resulting data breach led to thousands of compromised accounts and stolen credit information.
A Black Duck Software report tracked open source programs as they have become more widespread. While the overall quality of open source code is improving, the study found that many organizations do not properly monitor and protect the code once it has been put in place, leaving it vulnerable to exploitation from outside sources.
"Microsoft annually invests $1 billion in cybersecurity research."
The Microsoft advantage

However, Microsoft is arguably in a position to address the major security concerns with its Azure IoT Edge platform. The company invests over $1 billion in cybersecurity research each year. According to Azure Government CISO Matthew Rathbun, a lot of this money is spent with Azure in mind:
"Ninety percent of my threat landscape starts with a human, either maliciously or inadvertently, making a mistake that somehow compromises security," Rathbun told TechRepublic. "In an ideal state, we're going eventually end up in a world where there'll be zero human touch to an Azure production environment."
Azure IoT Edge represents a bold step forward in empowering IoT technology and improving automated productivity. While there are risks associated with every innovation, Microsoft remains committed to staying at the forefront and protecting its platforms. Companies should be willing to invest in Azure IoT Edge while remaining vigilant about the possible risks.
Blockchain has been turning heads since it was first unveiled in 2008 as the backbone of the then relatively unknown cryptocurrency bitcoin. Since then, blockchain and bitcoin have skyrocketed in public awareness, with the latter becoming the most successful cryptocurrency in history. A large portion of bitcoin's success is due to its blockchain infrastructure, which prevents the duplication of funds (that is, double-spending) and automatically time-stamps every transaction.
The developer (or developers) behind blockchain created the software to be resistant to alteration or hacking, making it one of the more inherently secure systems that companies can use to manage secure infrastructures. Some have heralded blockchain as the ultimate tool to promote cybersecurity and reduce the risk of data breaches.
Then bitcoin, along with several other cryptocurrencies, was hacked. According to CNN, the attack erased the equivalent of billions of dollars and sent the value of the affected cryptocurrencies plunging. The incident has many questioning just how secure blockchain is and whether the software was simply a temporary fix, like so many others, against the ever-present threat of cyberattacks.
"Blockchain can give each registered device a specific SSL certificate for authentication."
The case for blockchain

While buzzwords are common in the tech industry, there are several legitimate reasons why blockchain has been celebrated as a secure platform. According to Info Security Magazine, one of blockchain's primary appeals is its decentralized data storage. While users can access blockchain data on a computer or mobile device, the program itself is typically stored throughout the network.
If one access point – or block – is targeted by hackers, then the other blocks will react to it. The attempted cyberattack will likely alter the data on the block in a way that is immediately noticeable by the rest of the chain. This block will then simply be disconnected, isolating the malicious data before it can impact the system.
Another helpful advantage of blockchain is its effectiveness against distributed denial of service (DDoS) attacks. These cyberattacks target the domain name system, flooding it with so much data traffic that it essentially shuts down. Using blockchain software would allow the DNS to spread its contents to more nodes, reducing the effectiveness of the DDoS attack before it reaches a crippling stage.
Networks using a blockchain infrastructure can also bypass the need for passwords in certain situations. Instead of using the human-oriented password system, blockchain can give each registered device a specific SSL certificate. This mode of authentication is a lot more difficult for outside sources to access, reducing the likelihood of a hack.
Removing dependence on passwords may sound less secure but it is actually seen as an improvement. Employees can be careless with their login information or choose passwords that can be easily deduced by third parties. Eliminating the human factor from authentication actually goes a long way by removing one of the most common exploit points.
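The idea of authenticating devices by credential fingerprint instead of passwords can be sketched as follows. This is a toy allowlist, not a real SSL/TLS certificate exchange, and the names are invented for illustration:

```python
import hashlib
import secrets

class DeviceRegistry:
    """Toy model of certificate-style device authentication: each device
    is enrolled with a fingerprint of a random credential, so no
    human-chosen password is ever involved."""

    def __init__(self):
        self._fingerprints = {}

    def enroll(self, device_id: str) -> bytes:
        cred = secrets.token_bytes(32)   # stands in for a certificate's key
        self._fingerprints[device_id] = hashlib.sha256(cred).hexdigest()
        return cred                      # installed on the device once

    def authenticate(self, device_id: str, cred: bytes) -> bool:
        expected = self._fingerprints.get(device_id)
        if expected is None:
            return False
        provided = hashlib.sha256(cred).hexdigest()
        return secrets.compare_digest(provided, expected)  # timing-safe check
```

A 32-byte random credential cannot be guessed or reused the way a weak password can, which is the gist of why machine credentials are considered an improvement over human ones.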
However, no system is 100 percent secure.
The McAfee report

While many companies preach the value of blockchain, global computer security software company McAfee recently released a critical report on the software, stating that industries have every reason to expect cyberattacks. McAfee looked at early blockchain adopters, namely cryptocurrencies, and studied the types of cyberattacks still occurring within these companies.
The report identified four primary attack types: implementation exploits, malware, phishing and general technology vulnerabilities. Certain cryptocurrencies themselves have been used to help the spread of advanced malware, including ransomware. Coin miner malware alone grew by 629 percent in the first quarter of 2018, according to McAfee data.
Cybercriminals have also been using cryptocurrencies to mask their identities, taking advantage of blockchain's secure features to help them evade the law.
What companies can learn from the cryptocurrency attack

Finally, the attack on the cryptocurrencies themselves should highlight the limitations of blockchain. While the program may be innately secure, that is not an excuse to abandon other forms of caution. Technology is spreading at a rapid pace, with information security specialists struggling to catch up.
In short, blockchain should be seen as just another tool and not a cure-all for cyberattacks. Its architecture can be helpful but must be implemented in a thorough, professional manner. Even then, it should also be paired with other programs and employee training to best reduce the risk of cybercrime.
[Is blockchain the antidote to all cybersecurity woes? | RJackson | 2018-06-14]
Cloud computing has caught on in a big way. A recent report from RightScale found that 81 percent of the enterprise sector has adopted a multi-cloud system in at least some way. Public cloud adoption rates have continued to climb as well, with the report noting that 92 percent of users now employ cloud technology (up from 89 percent in 2017). Across the board, cloud networks are gaining users thanks to improved interfaces, less dependence on in-house technical teams and flexible program structures.
However, some industry verticals continue to lag behind. The latest international Bitglass survey found that the retail sector has been slow to adopt cloud infrastructure. Only 47.8 percent of responding retail organizations had deployed the often-used Microsoft Office 365 suite, and Amazon Web Services – the most popular cloud system – was only used by 9 percent.
In short, retail is being left behind, and that lag is a serious problem for the industry – in part because retail is a sector that can profit immensely from successful cloud integration. However, cybersecurity concerns and technical knowledge limitations may be slowing down the adoption rate.
Taking advantage of mobile hardware

Almost everyone has a smartphone; that's not an exaggeration. According to Pew research data, 77 percent of Americans have this hardware, and that number has been climbing steadily. Since smartphones are becoming cheaper and more user friendly, it is unlikely that this device will be replaced in the near future.
Because smartphones are so ubiquitous and convenient, consumers are using them for a wide variety of tasks, including shopping. OuterBox found that, as of early 2018, 62 percent of shoppers had made a purchase through their phones within the last six months. Another 80 percent had used their smartphones to compare products and deals while inside a store.
With a cloud infrastructure, retailers can better take advantage of this mobile world. Successful retail locations should consider maintaining at least two online networks – one for customers and another for employees. This setup will prevent bandwidth lag and help keep the consumer away from sensitive information. In addition, creating a mobile experience that is user friendly and seamlessly interwoven with the physical shopping experience is paramount.
Rather than building such a system from the ground up, retailers can take advantage of the numerous infrastructure-as-a-service cloud options available, leveraging a reliable third party rather than an in-house IT team.
Getting ahead of the latest trends

Data drives business intelligence; this is true in every enterprise sector. In retail, housing the right products can mean the difference between turning a profit and going out of business. However, retailers still using traditional sales reporting will be slow to react to shopping trends, as these reports can take months to compile.
Data analytics is the actionable side of big data. In retail, customers convey valuable information about shopping habits before they even enter the store, but if this data is not being captured, it is essentially useless. Bringing in an encompassing data analytics solution, which can read information such as store purchases, response to sales and even social media reaction, can provide retailers with extra information to make actionable decisions.
“This analysis removes the guesswork about what will sell and which styles will flop on the shelves,” Roman Kirsch, CEO of fashion outlet Lesara, stated in an interview with Inc. “We don’t just know which new styles are popular, we can also identify retro trends that are making comebacks, which styles are on the way out, and that helps us to precisely manage our production.”
Improving inventory management

In addition, data analytics can be paired with a responsive inventory management program. Retail-as-a-service solutions exist and can be used to track stock availability, shipping orders and in-store details. With this software, retail companies can get a real-time image of how well products and even entire locations are performing.
These solutions can prevent item shortages before they occur and give retail chains a greater understanding of performance at every location.
Concerning cybersecurity

Perhaps one of the factors slowing the adoption of cloud technology in the retail sector is cybersecurity. Retail organizations process vast amounts of consumer credit information every day, and the fallout from a data breach can be fatal in this sector. When choosing between cloud technology and in-house data center solutions, retail executives may believe that the safest hands are still their own.
However, this may not be the case. Research firm Gartner predicted that through 2022, 95 percent of cloud security failures will be the customer's fault, meaning that issues will come not from a software defect but from poor implementation. The firm also concluded that cloud infrastructures will see as much as 60 percent fewer cyberattacks than businesses with in-house servers.
Cloud infrastructure is secure but must be installed and operated properly. The only thing retailers have to fear when it comes to this new solution is technological ignorance, and many cloud providers and third-party services stand ready to aid in the installation process.
[How cloud infrastructure can help the retail sector | RJackson | 2018-06-11]
When Microsoft first debuted its Kinect hardware in 2010, the product had nothing to do with edge computing, AI or machine learning. The Kinect served as a controller interface for Microsoft's Xbox 360 video game console. (Later versions were released for Windows PC and Xbox One.) Using cameras and sensors, it registered a player's body movements and inputted these gestures as controls. While it was innovative, Kinect struggled to gain a foothold.
Despite going through various upgrades, it was fully discontinued as a consumer product in 2017. However, Microsoft did not fully abandon its Kinect hardware. At this year's Build developer conference, the company revealed a new use for its one-time video game accessory: edge computing.
Specifically, the new Kinect project factors into the greater themes of Build 2018, namely combining cognitive computing, AI and edge computing.
"Microsoft has ambitious plans to bring its Cognitive Services software to Azure IoT Edge."
Microsoft at Build 2018
Edge computing is at the forefront of technological innovation. Capitalizing on the internet of things, this method of data processing de-emphasizes the central hub: remote sensors receive enough computing power to analyze data near its source before sending results back, greatly reducing bandwidth needs. The system is also more dependable because the sensors store data locally, at least for a limited time, so network outages or dropped connections won't result in lost or fragmented information.
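The pattern described above, summarizing data at the source and buffering results through outages, can be sketched in a few lines. This is a toy model, not any vendor's SDK; the class and method names are invented for illustration.

```python
import statistics
from collections import deque

class EdgeNode:
    """Toy model of an edge sensor that summarizes readings locally
    and buffers results while the network is down (illustrative only)."""

    def __init__(self, buffer_size=100):
        self.buffer = deque(maxlen=buffer_size)  # local store survives outages
        self.uplink_log = []                     # stands in for the central hub

    def process(self, readings, online=True):
        # Analyze near the source: send one summary, not every raw sample.
        summary = {"mean": statistics.mean(readings), "n": len(readings)}
        self.buffer.append(summary)
        if online:
            self.flush()
        return summary

    def flush(self):
        # On reconnect, forward everything accumulated during the outage.
        while self.buffer:
            self.uplink_log.append(self.buffer.popleft())

node = EdgeNode()
node.process([1.0, 2.0, 3.0], online=False)  # outage: summary is buffered
node.process([4.0, 6.0], online=True)        # reconnect: both summaries sent
print(len(node.uplink_log))                  # 2
```

The bandwidth saving comes from sending one small summary per batch instead of every raw sample, and the deque gives the node a bounded local store so a dropped connection loses nothing until the buffer itself overflows.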
However, these sensors are, at the moment, fairly basic equipment. Microsoft aims to change that. At Build 2018, the company announced ambitious plans to bring its Cognitive Services software to its edge computing solution, Azure IoT Edge. According to TechCrunch, the first of these programs will be the Custom Vision service.
Implementation of this software with Azure IoT Edge can allow unmanned aerial vehicles, such as drones, to perform more complex tasks without direct control from a central data source. It will give these devices the ability to "see" and understand the environment around them, analyzing new visual data streams. This technology can also be used to improve advanced robotics, autonomous vehicles and industrial machines.
This advanced method of machine learning can increase productivity because all of these devices will be able to continue to perform complicated, vision-based tasks even with network connection disruptions.
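The resilience described above comes from deciding locally and escalating to the cloud only when a link is available. The sketch below illustrates that control flow under stated assumptions: the function names, the 0.8 confidence threshold, and the toy model are all invented for the example and are not Azure IoT Edge or Custom Vision APIs.

```python
def classify_frame(frame, local_model, cloud_client=None):
    """Run vision inference at the edge; escalate to the cloud only when
    a link exists and local confidence is low (names are illustrative)."""
    label, confidence = local_model(frame)
    if confidence >= 0.8 or cloud_client is None:
        return label            # decide locally: no round trip needed
    return cloud_client(frame)  # low confidence and a link exists: escalate

# A stand-in "model" that is confident about bright frames only.
def toy_model(frame):
    return ("obstacle", 0.9) if sum(frame) > 10 else ("unknown", 0.3)

print(classify_frame([5, 6], toy_model))                       # obstacle
print(classify_frame([1, 2], toy_model))                       # unknown
print(classify_frame([1, 2], toy_model, lambda f: "terrain"))  # terrain
```

A drone running this loop keeps making decisions during a network disruption; it simply loses the option to escalate ambiguous frames until the connection returns.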
Microsoft has also partnered with Qualcomm to bring cognitive vision developer's tools to devices like home assistants, security cameras and other smart devices.
However, these tools, both the Qualcomm partnership and the Custom Vision service, only work with devices equipped with cameras and sensors that can process visual data. To increase the variety of edge sensors that can benefit from the new tools and software services, Microsoft resurrected the Kinect.
The power of the Kinect
In an introduction on LinkedIn, Microsoft Technical Fellow Alex Kipman discussed Project Kinect for Azure. In the piece, Kipman outlined the company's reasoning for returning to the commercial failure. First, the Kinect has a number of impressive features that make it ideal as a sensor.
These benefits include its 1024×1024 pixel (roughly 1 megapixel) depth resolution, the highest of any sensor camera in its class. The Kinect also comes with a global shutter that helps the device record accurately in sunlight. Its cameras capture images with automatic per-pixel gain selection, which allows the Kinect to capture objects at various ranges cleanly and without distortion. It features multiphase depth calculation to further improve image accuracy, even in the presence of power supply variation and stray laser light. Lastly, the Kinect is a low-power piece of hardware thanks to its high modulation frequency and contrast.
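The modulation frequency mentioned above matters because continuous-wave time-of-flight cameras like the Kinect's recover distance from the phase shift of the modulated light: d = c·Δφ / (4πf). The sketch below works through that relationship; the 100 MHz value is illustrative, since the article does not state the Kinect's actual modulation frequencies, and multiphase operation (combining several frequencies to extend the unambiguous range) is only noted in a comment.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_rad, mod_freq_hz):
    """Distance from a measured phase shift for a continuous-wave
    time-of-flight camera: d = c * phase / (4 * pi * f)."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

def ambiguity_range(mod_freq_hz):
    """Maximum unambiguous distance: beyond this, the phase wraps around.
    Multiphase designs combine several frequencies to extend this range."""
    return C / (2 * mod_freq_hz)

# At an illustrative 100 MHz modulation frequency, a phase shift of
# pi radians corresponds to half the ambiguity range.
print(round(tof_depth(math.pi, 100e6), 4))   # 0.7495 (meters)
print(round(ambiguity_range(100e6), 4))      # 1.499  (meters)
```

The trade-off the formula exposes is that a higher modulation frequency improves depth precision (a given phase error maps to a smaller distance error) while shrinking the unambiguous range, which is exactly what multiphase calculation compensates for.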
Utilizing the Kinect's sensors for cognitive computing makes sense. Looking at the product's history, Microsoft had already developed more than half the specifications needed to create an effective sensor. The Kinect was designed to track and process human movement, differentiate users from animals or spectators in the room, and operate in numerous real-world settings. It was also made to endure drops and other household accidents. Essentially, the Kinect was a hardy, specialized sensor competing in a market that rewarded precise button pressing.
In an industrial space, the Kinect can fare far better. Augmenting existing data collection sensors with this visual aid will increase the amount of actionable data recorded. The Kinect brings a set of "eyes" to any machine, an advantage that will let developers and engineers get creative as they build the advanced edge computing networks of the future.
RJackson, May 9, 2018: The potential of Project Kinect for Azure