The hot (albeit not necessarily sexy) segment of IT operations — the analysis and intelligence-gathering from logs and performance management data — is showing increasing signs of an on-demand future.
First, Paglo came out last month (in beta) with a free and open source (GPL) crawler service that scours the reams of log files and other electronic records users point it at inside their data centers and server farms. From the resulting index, IT operators can then view and search the analysis and metrics of IT use and performance data as an online service via a browser.
Paglo provides IT administrators and operators the free crawler service to gain information, or metadata, on all sorts of assets on their networks, including across VPNs to remote offices. Because the crawler is open source, users are free to write scripts that probe various modules and anything else on their networks they want to gather data from. Other users can then benefit from these scripts via the community. Pretty quickly the Paglo community ought to be able to index just about anything of import on its members' networks. The only cost to users is their time.
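The community-script idea can be sketched roughly as follows. This is a hypothetical illustration only: the function names, record layout and simulated scan data are my own, not Paglo's actual plug-in API, which the company has not detailed here.

```python
# Hypothetical sketch of a community crawler script: turn raw per-host
# software inventories into metadata records ready for a hosted index.
# All names and the record schema are illustrative assumptions.

def crawl_host(hostname, inventory):
    """Convert one host's software inventory into index-ready records."""
    records = []
    for name, version in inventory.items():
        records.append({
            "host": hostname,
            "asset_type": "software",
            "name": name,
            "version": version,
        })
    return records

def crawl_network(hosts):
    """Crawl every host and pool the metadata for upload to the index."""
    index = []
    for hostname, inventory in hosts.items():
        index.extend(crawl_host(hostname, inventory))
    return index

# Simulated scan results standing in for a live network probe.
hosts = {
    "desk-01": {"Microsoft Office": "2003", "Firefox": "2.0"},
    "desk-02": {"Microsoft Office": "2007"},
}
index = crawl_network(hosts)
print(len(index))  # number of metadata records gathered across the network
```

The point of the design is that each script only has to emit simple records; the hosted index does the aggregation and search.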
Crunch the Numbers
The metadata is then sent (securely, they assure me) to an index instance in the cloud managed by Paglo. The managers of the crawler and hosted data can then securely search the logs using all sorts of queries, charts, views and dashboards to gather quantitative and qualitative business intelligence on their IT systems' use and use patterns.
The analysis can initially help with such chores as determining how many Microsoft Office suites are actually in use, or performing quick audits of this or that element on a network. That can help identify straggler application installations and track down when users have installed things they should not. Later, the service could spawn premium services for operations analytics and troubleshooting. Furthermore, by aggregating and (one hopes) anonymizing the data from many IT sites, Paglo could create definitive market research on just what constitutes IT use and context based on just the facts, ma’am.
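The kind of tactical query described above, counting Office installations for a quick software audit, might look something like this over indexed metadata. Again, the record schema and function are my own illustrative assumptions, not Paglo's actual query language.

```python
# Hypothetical audit query over indexed metadata records: tally installed
# versions of one product across the network. The record layout below is
# an illustrative assumption, not Paglo's actual schema.
from collections import Counter

def count_installs(index, product):
    """Tally installed versions of one product across the indexed records."""
    return Counter(
        r["version"] for r in index
        if r["asset_type"] == "software" and r["name"] == product
    )

# A few indexed records standing in for a real crawl of the network.
index = [
    {"host": "desk-01", "asset_type": "software", "name": "Microsoft Office", "version": "2003"},
    {"host": "desk-02", "asset_type": "software", "name": "Microsoft Office", "version": "2007"},
    {"host": "desk-03", "asset_type": "software", "name": "Firefox", "version": "2.0"},
]
print(count_installs(index, "Microsoft Office"))
```

A per-version tally like this is exactly what an audit needs to spot straggler installations, say, an old Office 2003 seat that should have been retired.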
Rather than rely on quasi-annual surveys by IT analyst firms (always on the vanguard of accurate and objective guesses), a broad Paglo audit of large swaths of IT use and habits — based on valid and scientific samplings (if not actual empirical censuses) — could take the guesswork out of what IT is actually being used in certain types of companies and regions. That would be some mighty fine data, and it could hold IT vendors’ feet to the fire on their real penetration and use patterns.
Value in the Aggregate
If Paglo gets sufficient volume adoption and the data is good and comprehensive, we could end up with a comScore for IT components and infrastructure bits. Perhaps Paglo will make its money from selling the use patterns and market share data, while giving away the means to the tactical analysis for each company. So far, Paglo is mum on where its remuneration will come from.
Suffice it to say, such a service will generate a lot of page views that only an IT systems administrator could love. That in itself could spell advertising gold for those selling to IT shops.
And, hey, free insight into IT ops — as long as you feel OK about someone else’s crawler sniffing around your network and servers — could be an offer some cheapo outfits can’t refuse. If the chief information officer (CIO) won’t pay for analytics products, what else could an operations manager do to prevent those awful Monday mornings?
New LogLogic CEO
On another IT analysis front, LogLogic announced today that longtime IT infrastructure thought leader Pat Sueltz has joined as CEO. Pat has been marching upward in title (while perhaps sliding a bit in employer size) over the past seven years. You may recall Pat as the gal who managed the Java relationship for IBM, back when Sun Microsystems and IBM saw eye to eye, at least on a common foe: Microsoft.
Then Pat went to Sun — after making a lot of noise at IBM on why Java ought to be overseen by a standards body (if not open sourced). And this was back in the mid-1990s! After a stint at Sun in charge of software (not great timing, it turns out) and then Sun services, she did a well-timed stint at Salesforce.com. And therein lies the rub on the intersection of LogLogic, SaaS and on-demand models.
She won’t commit, of course (this being her first week), on how on-demand and LogLogic come together. But I’ll wager a new chapter of growth potential for LogLogic lies in some of the interesting things Paglo has been trying, not to mention following the Salesforce ecology thing. There’s also Splunk’s example to follow, with its online open repository of analytics data, known as SplunkBase. [Disclosure: Splunk has been a sponsor of BriefingsDirect podcasts.]
Pat comes to LogLogic from SurfControl, where she was also CEO. I’ll be keeping an eye on Pat, with keen interest in how research, trends, data and online business models come into play in the perhaps no longer esoteric log file management arena. I’m also looking for real business intelligence as applied to IT, culled from this log data. Between those values and the compliance and virtualization imperatives, this is a high-growth area.
In other words, there’s gold in them thar logs.
Dana Gardner is president and principal analyst at Interarbor Solutions, which tracks trends, delivers forecasts and interprets the competitive landscape of enterprise applications and software infrastructure markets for clients. He also produces BriefingsDirect sponsored podcasts.