The cloud — as in, the offsite storage of data — is becoming as vast and ever-changing as clouds in the sky, but DeepField Networks announced an analytics tool that could allow the flow of cloud-based data to be mapped.
This would include tracking traffic patterns, application performance and cost structure of networks for cloud computing companies as well as content providers and carriers.
This mapping of the cloud has already been compared to that of the genome — and in some ways, the cloud has developed its own DNA.
“Comparing the cloud to genes is a bit of a stretch, and all analogies break down after a while,” said Craig Labovitz, cofounder and CEO of DeepField Networks. “However, websites are built up from various components, and there are a lot of complexities in how these are built.”
It is these complexities that have made the mapping increasingly challenging.
“The way the Internet and the servers work has really changed,” Labovitz told TechNewsWorld. “Servers used to be in the closet or in a room in the building.”
Instead of sitting in a closet or a company’s basement, storage has left the enterprise data center. As with much of the Internet, the result is that it is now everywhere. This is especially true for large businesses and many online sites.
This isn’t a problem when things work — in fact, it can ensure that a single outage doesn’t bring a whole system down. However, when things go wrong, it can be difficult to find out why. Thus, mapping the cloud becomes all the more essential.
“If you can accurately map and attribute traffic by type and volume, you can identify bottlenecks, often locate spreading viruses, locate entities that are abusing or stressing the network, and then optimize the result for performance and cost,” said Rob Enderle, principal analyst for the Enderle Group.
“Basically, a tool like this ensures the bottom line — profit — for companies providing complex networking services internally or externally,” he pointed out, “and profit is what keeps the company afloat and the executives’ bonuses intact.”
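The kind of traffic attribution Enderle describes can be illustrated with a minimal sketch. The flow records, service names, and threshold below are hypothetical examples, not DeepField’s actual method; real tools would derive such records from NetFlow or sFlow exports.

```python
from collections import defaultdict

# Hypothetical flow records: (service, bytes transferred).
flows = [
    ("video-cdn", 9_000_000),
    ("web", 1_200_000),
    ("video-cdn", 7_500_000),
    ("dns", 40_000),
    ("web", 800_000),
]

def attribute_traffic(flows):
    """Aggregate traffic volume by service type."""
    totals = defaultdict(int)
    for service, nbytes in flows:
        totals[service] += nbytes
    return dict(totals)

def top_talkers(totals, threshold=0.5):
    """Flag services exceeding a share of total traffic,
    as candidates for bottlenecks or network abuse."""
    grand_total = sum(totals.values())
    return [s for s, b in totals.items() if b / grand_total > threshold]

totals = attribute_traffic(flows)
print(top_talkers(totals))  # the video CDN dominates this sample
```

Once traffic is attributed this way, the heavy hitters become obvious candidates for optimization, whether the goal is performance or cost.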
This also addresses the underlying issue that by its nature, the performance of the cloud depends on how well the infrastructure is managed and even maintained. An outage in the cloud can bring down sites in a way that outages of whole server farms wouldn’t have done just a few years ago.
“The fact is that many — if not most — cloud service providers do not offer deep insight into the cloud infrastructures they utilize,” Charles King, principal analyst of Pund-IT, told TechNewsWorld. “In worst case scenarios, like the outage of Amazon’s EC2 in June, which impacted Netflix, Pinterest, Instagram and other Web services, companies may not have realized they were dependent on EC2 until their businesses went offline.”
In essence, DeepField’s solutions give its customers ways to map the cloud infrastructures they and their service providers depend on, and allow them to monitor the health and performance of that infrastructure, said King.
“By using DeepField’s solutions, customers will gain deeper insight into cloud dependencies that will help them better plan the management and rollout of their cloud-based services,” he added.
Cloud Mining and Big Data
The demand for this technology could also increase. Today it matters mostly to companies actively using public cloud services to support critical business processes, a modest group, but one likely to grow over time.
“Many of the biggest complaints about cloud computing relate to transparency into performance and security issues,” said King.
“There’s no reason that cloud service providers couldn’t offer similar services of their own — but unless they do, DeepField could become a trusted ‘seal of approval’ provider for the cloud,” he continued. “Given the modest size of the company, the real question may be whether the company remains independent or becomes an acquisition target for cloud-happy large vendors.”
In the meantime, however, DeepField Networks’ analytics tools also arrive as a new buzzword has crept into the lexicon, namely “big data.” While not really new, it is a term that is being used to help users understand exactly how much data is being created, in part because of what the cloud has allowed.
“Big data has become synonymous with the cloud,” said Labovitz, “but it is also an enabling technology. It means everything and nothing at the same time.”
The data problem itself has been compounded by the growth in computing power, which fortunately has also enabled the solution.
“There has really been an incredible evolution with the tool set and technology,” added Labovitz. “This could not have existed four years ago, and we couldn’t do what we are doing.”