
EXPERT ADVICE

What Are All Those Logs Trying to Tell You?

Gaining more insight, early and often, into what vast arrays of servers, routers and software stacks are actually doing has long been at the top of the IT wish list. Traditional IT management approaches force a trade-off between depth and comprehensive reach, meaning you can’t get a full, integrated picture across mixed systems with sufficient clarity.

Splunk’s approach to this problem has been to index the flood of log files constantly emitted by IT systems, make them searchable, and then align their time stamps to draw business intelligence inferences about actual IT performance.
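To illustrate the underlying idea of time stamp alignment — this is a minimal sketch of the concept, not Splunk’s implementation, and the log samples and format strings are invented for the example — consider two systems that record events in different native formats. Parsing each source’s time stamp into a common key lets the events interleave onto one timeline:

```python
import heapq
from datetime import datetime

# Hypothetical samples from two systems with different native time formats.
web_log = [
    ("2007-11-06 10:15:02", "GET /checkout 200"),
    ("2007-11-06 10:15:09", "GET /cart 500"),
]
router_log = [
    ("06/Nov/2007 10:15:05", "link flap on ge-0/0/1"),
]

def normalize(entries, fmt, source):
    """Parse each source's native time stamp into a common datetime key."""
    for stamp, msg in entries:
        yield (datetime.strptime(stamp, fmt), source, msg)

# heapq.merge interleaves the already-sorted streams into one timeline,
# so a router event can be seen in context between two web events.
timeline = list(heapq.merge(
    normalize(web_log, "%Y-%m-%d %H:%M:%S", "web"),
    normalize(router_log, "%d/%b/%Y %H:%M:%S", "router"),
))
for when, source, msg in timeline:
    print(when.isoformat(), source, msg)
```

Once events share a timeline, the 500 error on the web tier can be correlated with the router’s link flap seconds earlier — the kind of cross-system inference the article describes.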

The San Francisco company took this IT information assembly and digestion process a step further two years ago by creating SplunkBase, an open reservoir of knowledge about searched IT systems for administrators to share and benefit from. (Disclosure: Splunk is a sponsor of BriefingsDirect podcasts, including this one on SplunkBase.)

A New Direction

Now, recognizing the power of mashed-up services and Enterprise 2.0 tools for associating applications, services and data, Splunk has gone “platform.” Instead of only providing the fruits of IT search to sysadmins and IT operators, Splunk has created the means to offer developers easy access to that data and the powerful inferences gleaned from comprehensive IT search. That means the data can go places no log file has gone before.

Through a common set of services and APIs (application programming interfaces), the Splunk Platform now allows developers and equipment makers to build and integrate applications that include IT-search generated data. Because Splunk collects and manages logs, configurations, messages, traps and alerts — compiling statistics from nearly every IT component — the makers of IT equipment can build better management and maintenance applications (not to mention billable services).

In trial use, OEMs (original equipment manufacturers) and systems integrators have already leveraged the Splunk Platform by bundling and embedding Splunk with their own hardware, software and services. The opportunity is for these OEMs and integrators to pursue new business in ongoing maintenance and support offerings for their products and services.

The Snowball Effect

What’s more, the more applications the various OEMs, service providers, hosting organizations and service bureau outsourcers build on Splunk, the more those applications can be used in coordination, with the findings integrated for faster problem solving, greater threat response, heightened compliance reporting, and business intelligence insight into user activity and transactions.

I like this approach because gaining insight into total data center behavior in near real-time has been so difficult, yet its importance is growing with advances in virtualization, mixed-hosting arrangements, colocation, and SOA (service-oriented architecture)-based systems and infrastructure. In effect, both the complexity and the heterogeneity of systems have kept growing, while the ability to gain common-denominator metadata about systems behaviors hasn’t kept pace. We’ve long needed a way to make all systems “readable” in common ways.

With the Splunk Platform and the applications it will spawn, IT information can now much better support and interact with distributed management applications. We certainly need more innovative applications that can leverage this common metadata about systems to produce better management and quicker feedback from systems and users.

Priming the Pump

Taking this a step further, many of these applications and services can and should support an ecosystem. IT managers and the makers of IT equipment alike will benefit from easily distributing their own applications and downloading applications created by anyone else in the Splunk ecosystem. To kick-start the effort, the first Splunk-built application on the platform was announced this week: Splunk for PCI Compliance, available for download from SplunkBase.

The application provides 125 searches, reports and alerts to help satisfy PCI DSS (Payment Card Industry Data Security Standard) requirements, including secure remote access, file integrity monitoring, secure log collection, daily log review, audit trail retention, and PCI control reporting, says Splunk. The goal is to make it simpler and faster for IT managers to comply, to answer auditor questions, and to control access to sensitive systems data. Splunk has taken pains to provide security and access control for the sensitive data, while opening up access to the non-sensitive information for better analysis.

Fittingly, Splunk’s foray into the developer world and applications ecosystems coincides with the company’s release of Splunk 3.2, which now includes a Splunk for Windows version (built on the same single code base that runs on Linux, Mac OS X, Solaris, FreeBSD and AIX). New features in Splunk 3.2 include transaction search and interactive field extraction to create easier ways for end users to generate their own applications. The update also extends the platform’s capabilities with file system change monitoring, flexible roles, data signing and audit trails. A new REST (representational state transfer) API and SDKs (software development kits) for .Net and Python further open the platform to more developers.
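A REST API means a search can be submitted as an ordinary HTTP request. The sketch below, in Python, shows the general shape of such a call — the host name, port, endpoint path, parameter names and token are all illustrative assumptions, not Splunk’s documented API — and it only constructs the request rather than sending it:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical management endpoint; real deployments would differ.
BASE = "https://splunk.example.com:8089"

def build_search_request(query, earliest="-1h", token="changeme"):
    """Construct (but do not send) an HTTP request that submits a
    search job over a REST-style interface.  The endpoint path and
    parameter names here are assumptions for illustration only."""
    body = urlencode({"search": f"search {query}", "earliest_time": earliest})
    return Request(
        f"{BASE}/services/search/jobs",           # hypothetical endpoint
        data=body.encode("utf-8"),
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

req = build_search_request("sourcetype=syslog error", earliest="-24h")
print(req.method, req.full_url)
```

The point of the platform move is exactly this: anything that can form an HTTP request — a dashboard, a monitoring agent, an OEM appliance — can pull IT-search results into its own application.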

Demand Is There

The Splunk Platform and its associated ecosystem should quickly grow the means to bridge the gap between runtime actualities and design-time requirements. When developers can easily know more about what applications and systems do in the real world, in real time, they can make better decisions and choices in the design and test phases. This has obvious time- and money-saving implications.

The need for such transparency will quickly grow as virtualization and a services-based approach to applications gain steam and acceptance. We have seen some very powerful productivity improvements as general enterprise data has been mined for business intelligence. Now it’s time to better mine systems data for better IT intelligence.


Dana Gardner is president and principal analyst at Interarbor Solutions, which tracks trends, delivers forecasts and interprets the competitive landscape of enterprise applications and software infrastructure markets for clients. He also produces BriefingsDirect sponsored podcasts. Disclosure: Genuitec sponsored this podcast.

