PODCAST

Corporate Darwinism: Q&A With IBM’s Irving Wladawsky-Berger

Darwin’s principle of “survival of the fittest” is in full effect in the corporate world.

Over the years, many companies, especially technology firms, have risen to great heights only to fall to great depths.

As Darwin’s theory implies, when change swept through their industries, these once-great firms couldn’t keep pace and eventually faded, whether gobbled up by more powerful foes, reduced to shells of their former selves, or driven out of business altogether.

Keeping Busy


Podcast: Listen to the entire interview (24:08 minutes).


Irving Wladawsky-Berger makes this observation in his blog, and in an interview with the ECT News Network.

Wladawsky-Berger is the chairman emeritus of the IBM Academy of Technology. He is also a strategic technology advisor to Citigroup, and a professor at the Massachusetts Institute of Technology.

In his interview, Wladawsky-Berger cites former tech heavyweights such as Digital Equipment, Compaq and Silicon Graphics as great examples of corporate Darwinism. DEC and Compaq were absorbed by other firms, and today, Silicon Graphics is merely a shell of its “Jurassic Park” heyday.

At the heart of it all is the lack of innovation, according to Wladawsky-Berger. Sustained innovation is the key to maintaining a competitive edge and corporate relevance, he says.

Smart Planet Initiative

He also discusses the newly announced Smart Planet Initiative, spearheaded by IBM. This initiative will utilize technology to help solve real-world challenges related to energy, transportation and other areas, Wladawsky-Berger said.

One example he cites is the use of mobile technology in San Francisco that alerts drivers to available parking spaces.

Finally, Wladawsky-Berger gives his thoughts on the fast-emerging tech sector called “cloud computing.” Among other players, he sees Google establishing a leadership position in the category.

Here are some excerpts of the interview:

E-Commerce Times: I’d like to start off with a blog post you recently wrote about the “Darwin Effect” in corporations, and I think this is particularly apropos given what’s happening in the automobile industry and the trouble those companies are in. Can you just tell the audience what you mean by the “Darwin Effect” in corporations?

Irving Wladawsky-Berger:

Yeah, absolutely. By the “Darwin Effect,” I really mean that if a company doesn’t embrace innovation at all levels, both in the technologies and products it uses and in the way it manages itself, it won’t be able to survive. The marketplace is so competitive, it’s such a jungle out there, that if a company falls behind because it cannot adapt to the new environment, it won’t make it. I think of that in terms of Darwinian evolution. And Blake, you see the evolution in action, including a mass extinction. It’s not pretty to say, but you’ve got whole industries under siege because there is so much change, and the companies that are not able to adapt to that change are not making it. What’s amazing is that many of them are once-powerful companies, and the marketplace doesn’t care how powerful you once were — if you cannot make it, you cannot make it.

ECT: What are some good examples of these once-powerful companies that have either gone extinct or are teetering on the brink of extinction right now?

Wladawsky-Berger:

It is all really apparent in the IT industry, the information technology industry — and that’s the industry I know best, having been there for a long time. Digital Equipment was one of the most successful companies; they invented the whole category of minicomputers. They were riding really high in the late 1980s, and Blake, they are gone. They don’t exist anymore.

Compaq was once the most successful PC company in the industry. For many years they were very successful, and now they’re gone, merged into HP. I still remember when “Jurassic Park” came out in the early ’90s; they used Silicon Graphics workstations to do a lot of the special effects. Silicon Graphics is still there, but it’s a shadow of what it once was. And the list goes on and on.

A really timely example is the financial industry. Look at Lehman; look at Citigroup — it’s a company I’m consulting for — it’s been to the brink. Hopefully, Citigroup will be fine, make it, and continue to be a leading bank. You see Bear Stearns disappearing. The number of banks that are no longer here is very large, and that number will keep growing. Then look at the media industry and the big changes that are going on there. So I think there is really no industry that can claim to be isolated from this survival imperative that every business needs to embrace in order to move into the future.

ECT: Two reasons come to mind for why companies cease to exist. One is that they are bought out by — or merge with — another company and their brand name disappears; the other is that they simply go out of business for whatever reason. Do you make a distinction, in terms of the Darwin Effect, between those two situations?

Wladawsky-Berger:

Not really, although I can see why maybe one should. For example, Compaq didn’t go out of business, but the reason they merged with HP and essentially disappeared is that they weren’t doing well. Often, the reason a company gets absorbed is so that at least its assets continue to survive. I’m sure there are counter-examples where a company was thriving and then got absorbed. But for the most part, when a company gets absorbed, it’s because it is deemed that it can no longer survive under its own power, and it’s best for its assets to be integrated into other companies.


EXCLUSIVE INTERVIEW

Cryptocurrency Custody Concerns: Who Holds the Digital Storage Keys?

Got Crypto? Make sure you own and have access to it in a secure digital stronghold.

Having self-custody of your crypto keys and managing your digital assets can help stave off digital bankruptcy or loss through theft, warns cryptocurrency storage provider CompoSecure.

Cryptocurrency has become an increasingly familiar term since Bitcoin emerged in 2009. Since then, numerous cryptocurrencies have joined the digital asset marketplace, and despite the recent decline in valuations, the cryptocurrency market value has skyrocketed.

Market watchers valued the global cryptocurrency market size at $1.49 billion in 2020. Some project it will reach $4.94 billion by 2030, rising at a compound annual growth rate (CAGR) of 12.8 percent from 2021 to 2030.

The cryptocurrency market represents the start of a new phase of technology-driven markets that can potentially challenge traditional market strategies, longstanding practices in business organizations, and determined regulatory perspectives, according to Vantage Market Research.

Control of Crypto

Cryptocurrencies have the potential to give people access to a global payment system in which participation is limited only by access to technology, rather than by traditional requirements such as having a bank account or a credit history.

However, buying and selling crypto coins and using digital currency to pay for products in the physical world is not the same as opening a bank account and depositing a paycheck. An announcement by Coinbase may have dislodged the elephant in the crypto storage room.

Coinbase is an app that lets people buy and sell various cryptocurrencies — Bitcoin, Ethereum, Litecoin, and many others — and lets users convert one cryptocurrency to another. Users can also send and receive cryptocurrency to and from other people.

In its 10-Q filing last month, Coinbase disclosed that, if the company were to file for bankruptcy, the crypto assets it holds for retail users could be treated as property of the bankruptcy estate.

So, what about crypto providers and digital storage centers that hold your crypto funds?

That disclosure is driving awareness and highlighting the importance of self-custody, according to Adam Lowe, chief innovation officer of CompoSecure and creator of Arculus.

“As cryptocurrency is becoming more mainstream, many people are jumping in feet first and not properly researching and educating themselves. It’s important users know how their cryptocurrency works, who owns it, and what control they have with their digital assets,” Lowe told the E-Commerce Times.

Crypto Cold Storage Solution

CompoSecure is a pioneer in the premium payment cards industry. The company also developed and provides Arculus, an emerging cryptocurrency and digital asset storage and security solution.

The new cold storage wallet for securing crypto takes its name from the ancient Roman god Arculus, regarded as the guardian of the safes and strongboxes the Romans relied upon to protect their cherished possessions.

The company applies that nomenclature today: Arculus is the contemporary incarnation of that vigilant deity, ensuring strong security for critical digital assets and identity.

Think of this storage solution as a token, much like the physical device some people rely on to keep their computers under lock and key. For crypto, ownership is directly linked to the owner’s private key.

For example, if you purchase crypto through an exchange and leave it there, you are trusting the exchange to give you your digital assets when you ask. But since the exchange keeps custody of the private keys, it has full control over whether or not to comply, Lowe cautioned.

“This is why self-custody wallets are important. By storing your private keys in a self-custodied wallet, such as a hardware wallet, only you have full ownership and control of your cryptocurrency and other digital assets. As we say, your keys, your crypto,” he explained.
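
The “your keys, your crypto” principle comes down to public-key cryptography: whoever holds the private key can produce valid signatures authorizing transfers, and nobody else can. The minimal Python sketch below, which uses the third-party ecdsa package and a deliberately simplified address derivation rather than any real blockchain’s exact scheme, illustrates the idea. An exchange that holds this private key for you can sign, or decline to sign, on your behalf; a self-custody wallet keeps the key in your hands alone.

    import hashlib
    from ecdsa import SigningKey, SECP256k1, BadSignatureError

    # The private key is the sole proof of ownership; self-custody means only you hold it.
    private_key = SigningKey.generate(curve=SECP256k1)
    public_key = private_key.get_verifying_key()

    # A simplified stand-in for address derivation; real chains add extra hashing and encoding.
    address = hashlib.sha256(public_key.to_string()).hexdigest()[:40]

    # Only the private-key holder can authorize a transfer by signing it.
    transaction = ("send 0.5 coins from " + address + " to recipient").encode()
    signature = private_key.sign(transaction)

    # Anyone (an exchange, a node, a wallet app) can verify the signature with the public key,
    # but without the private key no one can forge a valid one.
    try:
        public_key.verify(signature, transaction)
        print("transfer authorized by the key holder")
    except BadSignatureError:
        print("invalid signature: transfer rejected")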

Fuss-Free Ownership and Access

Dealing with digital assets is not the same as walking into your local bank. Crypto security works much differently. At a traditional bank insured by the Federal Deposit Insurance Corporation (FDIC), deposits are protected up to at least $250,000 per depositor if the bank is robbed, defaults, or goes bankrupt.

Not so with cryptocurrencies. Those digital assets belong to an unregulated asset class that lacks the safeguards of traditional fiat currency. Crypto is currently not subject to FDIC protection, noted Lowe.

“As of now, if your cryptocurrency is hacked, it is gone. This is the main reason why properly securing and protecting your digital assets offline is important,” he advised.

No comprehensive regulations governing cryptocurrency yet exist, which is part of why it remains a highly volatile asset.

“The Biden administration is discussing U.S. regulations. While we expect to see movement in that direction, it could be a while until widely accepted regulations are in place,” he added.

Holding the Right Card

CompoSecure’s recently launched storage hardware wallet enables consumers to have self-custody and manage all their digital assets in one offline place. This approach gives ownership of the crypto keys only to the user.

Arculus Wallet product capabilities now include NFT support. (Image Credit: Business Wire)

The company’s innovative solution is the Arculus Key Card, which uses a CC EAL6+ certified secure element to encrypt and store your digital keys. It is not connected to anything. If you lose it or it gets stolen, no one else can use it.

When a crypto owner makes a transaction in the Arculus Wallet App, the app requires the user to tap the key card to their mobile device. This is an important security step in the three-factor authentication that Arculus uses to keep crypto keys safe and secure.

The card communicates with the wallet app over a secure tap-to-transact near-field communication (NFC) connection. It involves no Bluetooth, no Wi-Fi, no USB, and no cords.

CompoSecure on Tuesday announced the same approach for non-fungible token (NFT) support.

Cashing In on Crypto

Dealing with cryptocurrency issues can become much like a rabbit hole. The more you dig, the further into a financial abyss you fall. To ease the transition into crypto banking, we asked Adam Lowe to shine a light on the subject.

E-Commerce Times: Do crypto platforms provide digital protections?

Adam Lowe: Some cryptocurrency platforms do provide types of cyber or crime insurance, but like most insurance policies there are limitations and loopholes.

So, what must consumers understand about the basic guidelines for digital asset ownership and who owns the keys to the crypto?

Lowe: The most important thing to understand is that whoever owns your keys owns your cryptocurrency. Consumers need to educate themselves on custodial versus non-custodial assets.

Additionally, utilizing exchanges or hot wallets that use a continuous internet connection keeps the door open to threats of hacking and theft.

It is also vital to utilize multifactor authentication (MFA). Three-factor authentication is extremely valuable because it ideally combines something you are, such as a biometric like a fingerprint or facial recognition, with something you know, such as a personal identification number, or PIN.

Lastly, it needs something you have, such as our Arculus Key Card. This added step of security is crucial to ensure only you have access to your assets.
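
As a rough illustration of how those three factors might combine in software (this is not CompoSecure’s actual implementation), a wallet could refuse to release any signing operation unless all three checks pass. In the Python sketch below, the biometric and card-presence flags are hypothetical stand-ins for results the phone and the NFC reader would report.

    import hashlib
    import hmac

    def verify_pin(entered_pin: str, stored_pin_hash: bytes, salt: bytes) -> bool:
        # Something you know: compare a salted hash of the PIN, never the raw PIN.
        candidate = hashlib.pbkdf2_hmac("sha256", entered_pin.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored_pin_hash)

    def three_factor_check(biometric_ok: bool, entered_pin: str,
                           stored_pin_hash: bytes, salt: bytes, card_present: bool) -> bool:
        # Something you are: the phone's fingerprint/face match result (hypothetical flag).
        # Something you know: the PIN, checked against its stored hash.
        # Something you have: the hardware key card detected over NFC (hypothetical flag).
        return biometric_ok and verify_pin(entered_pin, stored_pin_hash, salt) and card_present

    # All three factors must pass before the wallet would sign anything.
    salt = b"per-user-salt"
    stored = hashlib.pbkdf2_hmac("sha256", b"123456", salt, 100_000)
    print(three_factor_check(True, "123456", stored, salt, card_present=True))   # True
    print(three_factor_check(True, "123456", stored, salt, card_present=False))  # False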

How does self-custody work?

Lowe: That means you own your private keys. The keys grant full control of your digital assets, rather than trusting a third party to be their custodian and arbiter.

Utilizing a hardware wallet, such as Arculus, provides self-custody, as only you can access your private keys and manage your digital assets.

What makes this method different from other custody arrangements with crypto brokers?

Lowe: Crypto brokers and centralized exchanges are third-party custodians. They have control of and access to your private keys to purchase, move, and invest your digital assets accordingly. Non-custodial arrangements, by contrast, hand the keys, and the responsibility for protecting them, over to the end user.

How can self-custody protect consumers from online hackers and help them retain their digital assets even if they go bankrupt?

Lowe: With self-custody, no one can access your digital assets without your consent. This provides the necessary level of protection from hacks.

When it comes to an individual user going bankrupt, cryptocurrency is not considered income but rather property. Bankruptcy law is complex and very fact-specific, so I cannot give you guidance on what could happen to cryptocurrency in a user’s bankruptcy.

Is crypto investing for everyone or just those who can afford to lose?

Lowe: Cryptocurrency is currently being adopted at a faster rate than the internet. It is becoming mainstream. For some, it is their first investment venture. But like any investment, there is a risk of loss. As long as people understand the lack of regulations and high volatility, they can invest according to their level of comfort.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.

EXCLUSIVE INTERVIEW

Data Observability’s Big Challenge: Build Trust at Scale

The cost of cleaning data is often beyond the comfort zone of businesses swamped with potentially dirty data. That clogs the pathways to trustworthy and compliant corporate data flow.

Few companies have the resources needed to develop tools for challenges like data observability at scale, according to Kyle Kirwan, co-founder and CEO of data observability platform Bigeye. As a result, many companies are essentially flying blind, reacting when something goes wrong rather than proactively addressing data quality.

A data trust provides a legal framework for managing shared data. It promotes collaboration through common rules for data security, privacy, and confidentiality, and it enables organizations to securely connect their data sources in a shared repository.

Bigeye brings data engineers, analysts, scientists, and stakeholders together to build trust in data. Its platform helps companies automate monitoring and anomaly detection and create service-level agreements (SLAs) to ensure data quality and reliable pipelines.

With complete API access, a user-friendly interface, and automated yet flexible customization, data teams can monitor quality, proactively detect and resolve issues, and ensure that every user can rely on the data.

Uber Data Experience

Two early members of the data team at Uber — Kirwan and Bigeye co-founder and CTO Egor Gryaznov — set out to use what they learned building at Uber’s scale to create easier-to-deploy SaaS tools for data engineers.

Kirwan was one of Uber’s first data scientists and the first metadata product manager. Gryaznov was a staff-level engineer who managed Uber’s Vertica data warehouse and developed several internal data engineering tools and frameworks.

They realized the tools their teams were building to manage Uber’s massive data lake and thousands of internal data users were far ahead of what was available to most data engineering teams.

Automatically monitoring and detecting reliability issues within thousands of tables in data warehouses is no easy task. Companies like Instacart, Udacity, Docker, and Clubhouse use Bigeye to keep their analytics and machine learning working continually.

A Growing Field

When they founded Bigeye in 2019, they recognized the growing problem enterprises face in deploying data into high-ROI use cases like operations workflows, machine learning-powered products and services, and strategic analytics and business intelligence-driven decision making.

The data observability space saw a number of entrants in 2021. Bigeye separated itself from that pack by providing users the ability to automatically assess customer data quality with more than 70 unique data quality metrics.

Thousands of separate anomaly detection models are trained on these metrics to ensure data quality problems — even the hardest to detect — never make it past the data engineers.
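
Bigeye’s metrics and models are proprietary, but a stripped-down sketch conveys what a per-metric anomaly check does: track a statistic about a table over time, such as its daily row count or a column’s null rate, and flag new observations that fall far outside recent history. The basic z-score rule in the Python sketch below is purely illustrative; production models are far more sophisticated.

    from statistics import mean, stdev

    def is_anomalous(history: list, latest: float, threshold: float = 3.0) -> bool:
        # Flag the latest observation if it sits more than `threshold` standard
        # deviations away from recent history (a basic z-score rule; real models are richer).
        if len(history) < 2:
            return False
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return latest != mu
        return abs(latest - mu) / sigma > threshold

    # Two illustrative data quality metrics tracked for one table.
    daily_row_counts = [10_120, 10_340, 9_980, 10_210, 10_050]
    daily_null_rates = [0.011, 0.012, 0.010, 0.013, 0.011]

    print(is_anomalous(daily_row_counts, 10_180))  # False: within the normal range
    print(is_anomalous(daily_row_counts, 1_250))   # True: the pipeline likely dropped rows
    print(is_anomalous(daily_null_rates, 0.41))    # True: a column suddenly went mostly null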

Last year, data observability burst onto the scene, with no fewer than ten startups in the space announcing significant funding rounds.

This year, data observability will become a priority for data teams as they seek to balance the demand of managing complex platforms with the need to ensure data quality and pipeline reliability, Kirwan predicted.

Solution Rundown

Bigeye’s data platform is no longer in beta. Some enterprise-grade features, like complete role-based access control, are still on the roadmap. But others, like SSO and in-VPC deployments, are available today.

The app is closed source, and so are the proprietary models used for anomaly detection. Bigeye is a big fan of open-source options but decided to develop its own models to meet its internally set performance goals.

Machine learning is used in a few key places to bring a unique blend of metrics to each table in a customer’s connected data sources. The anomaly detection models are trained on each of those metrics to detect abnormal behavior.

Three features built in at the end of 2021 automatically detect and alert on data quality issues and enable data quality SLAs.

The first, Deltas, makes it easy to compare and validate multiple versions of any dataset.

Issues, the second, brings multiple alerts together into a single timeline with valuable context about related problems. This makes it simpler to document past fixes and speed up resolutions.

The third, Dashboard, provides an overall view of the health of the data, helping to identify data quality hotspots, close gaps in monitoring coverage, and quantify a team’s improvements to reliability.
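
To make the Deltas idea concrete, comparing two versions of a dataset often amounts to diffing per-column summary statistics and flagging the ones that drift. The pandas sketch below is a hypothetical illustration of that kind of comparison, not Bigeye’s implementation.

    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        # Per-column summary: row count, fraction of nulls, and number of distinct values.
        return pd.DataFrame({
            "rows": len(df),
            "null_frac": df.isna().mean(),
            "distinct": df.nunique(),
        })

    def delta(old: pd.DataFrame, new: pd.DataFrame) -> pd.DataFrame:
        # Put the two versions' profiles side by side and compute the null-rate shift.
        report = profile(old).join(profile(new), lsuffix="_old", rsuffix="_new")
        report["null_frac_shift"] = report["null_frac_new"] - report["null_frac_old"]
        return report

    v1 = pd.DataFrame({"user_id": [1, 2, 3, 4], "country": ["US", "DE", "US", "FR"]})
    v2 = pd.DataFrame({"user_id": [1, 2, 3, 4, 5], "country": ["US", None, "US", None, "FR"]})
    print(delta(v1, v2))  # the null_frac_shift column shows `country` gained nulls in v2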

Eyeballing Data Warehouses

TechNewsWorld spoke with Kirwan to demystify some of the complexities his company’s data sniffing platform offers data scientists.

TechNewsWorld: What makes Bigeye’s approach innovative or cutting edge?

Kyle Kirwan, co-founder and CEO of Bigeye

Kyle Kirwan: Data observability requires constant and complete knowledge of what is happening inside all the tables and pipelines in your data stack. It is similar to what SRE [site reliability engineering] and DevOps teams use to keep applications and infrastructure working around the clock. But it is reimagined for the world of data engineering and data science.

While data quality and data reliability have been issues for decades, data applications are now critical to how many leading businesses run, because any loss of data, outage, or degradation can quickly result in lost revenue and customers.

Without data observability, data teams must constantly react to data quality issues and wrangle the data as they go to use it. A better approach is identifying the issues proactively and fixing the root causes.

How does trust impact the data?

Kirwan: Often, problems are discovered by stakeholders like executives who do not trust their often-broken dashboards, or by users who get confusing results from in-product machine learning models. Data engineers can get ahead of the problems and prevent business impact if they are alerted early enough.

How is this concept different from similar-sounding technologies such as unified data management?

Kirwan: Data observability is one core function within data operations (think: data management). Many customers look for best-of-breed solutions for each of the functions within data operations. This is why technologies like Snowflake, Fivetran, Airflow, and dbt have been exploding in popularity. Each is considered an important part of “the modern data stack” rather than a one-size-fits-none solution.

Data observability, data SLAs, ETL [extract, transform, load] code version control, data pipeline testing, and other techniques should be used in tandem to keep modern data pipelines working smoothly, just as high-performing software engineering and DevOps teams use their sister techniques.

What role do data pipelines and DataOps play in data visibility?

Kirwan: Data observability is closely related to DataOps and the emerging practice of data reliability engineering. DataOps refers to the broader set of all operational challenges that data platform owners will face. Data reliability engineering is a part of DataOps, but only a part, just as site reliability engineering is related to, but does not encompass all of, DevOps.

Data observability could have benefits to data security, as it could be used to identify unexpected changes in query volume on different tables or changes in behavior to ETL pipelines. However, data observability would not likely be a complete data security solution on its own.

What challenges does this technology face?

Kirwan: Those broader operational challenges cover problems like data discovery and governance, cost tracking and management, and access controls. They also cover how to manage an ever-growing number of queries, dashboards, and ML features and models.

Reliability and uptime are certainly challenges for which many DevOps teams are responsible, though those teams are often also charged with other aspects, like developer velocity and security considerations. Within the areas of reliability and uptime, data observability enables data teams to know whether their data and data pipelines are error-free.

What are the challenges of implementing and maintaining data observability technology?

Kirwan: Effective data observability systems should integrate into the workflows of the data team. This enables them to focus on growing their data platforms rather than constantly reacting to data issues and putting out data fires. A poorly tuned data observability system, however, can result in a deluge of false positives.

An effective data observability system should also take much of the maintenance out of testing for data quality issues by automatically adapting to changes in the business. A poorly optimized system, however, may undercorrect or overcorrect for those changes, requiring manual tuning, which can be time-consuming.
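
A toy example of the difference between a poorly tuned static threshold and an adaptive one: a limit hand-picked when a table was small keeps firing false alerts as the business grows, while a threshold recomputed over a rolling window adapts on its own. The Python sketch below is illustrative only and does not reflect Bigeye’s actual tuning logic.

    from statistics import mean, stdev

    def static_alert(value: float, fixed_limit: float) -> bool:
        # Fires whenever the metric exceeds a hand-picked limit, even after normal growth.
        return value > fixed_limit

    def adaptive_alert(history: list, value: float, window: int = 14, k: float = 3.0) -> bool:
        # Recomputes the expected range from the most recent `window` observations,
        # so ordinary growth moves the threshold instead of triggering alerts.
        recent = history[-window:]
        return abs(value - mean(recent)) > k * stdev(recent)

    # Daily row counts for a steadily growing table.
    history = [1_000 + 50 * day for day in range(30)]  # 1,000 up to 2,450
    today = 2_500

    print(static_alert(today, fixed_limit=2_000))  # True: the stale limit cries wolf
    print(adaptive_alert(history, today))          # False: in line with the recent trend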

Data observability can also be taxing on the data warehouse if it is not optimized properly. The Bigeye team has experience optimizing data observability at scale to ensure the platform does not impact data warehouse performance.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.
