Ghosts of E-Business Past, Present & Future

As philosopher and cultural critic George Santayana famously observed, those who cannot remember the past are condemned to repeat it. Ebenezer Scrooge, the protagonist of Charles Dickens’ classic novella A Christmas Carol, likely would have agreed with the sentiment.

In a similar vein, as the holiday season progresses and the prospect of an economic recovery looms, e-businesses would do well to heed the lessons of their fallen predecessors. Like ghosts of e-business past, infamous flameouts such as Webvan and Pets.com may hold valuable lessons for today’s survivors about where the industry has been, what issues e-commerce companies now face, and what challenges lie ahead.

Can e-tailers learn from past failures, navigate the challenges of the present and face the future with renewed prospects for success?

Chilling Past

Amid the economy’s hard downward turn a few years ago, dot-coms were in the thick of the chaos. Many companies that had been founded on shaky business models faded away to nothing, leaving behind only remnants like leftover T-shirts and Aeron chairs. However, other Internet companies had such intriguing rises and falls that it seemed they had been created just to serve as warnings to others.

For example, Webvan turned out to be an example of poor arithmetic, since the company was paying more for its products than it was charging customers. Go.com proved to be a portal that led nowhere, and Iam.com lost US$48 million in trying to convince models and actors to put their portfolios online.

Some dot-coms tried to cash in on markets that did not need online components, such as HeavenlyDoor.com, which tried to sell caskets and burial plots and ended up losing $26 million in the process. The list, from Flooz to iMotors.com, goes on and on.

As IDC analyst Jonathan Gaw told the E-Commerce Times: “It was just a huge implosion, and all of the dot-com stocks went down. But the ones that survived are the stronger for it.”

Root of the Problem

However, it is not enough just to look at the past and shake one’s head in disbelief at e-business blunders. Gaw noted that for e-businesses in today’s environment, it is vital to understand past mistakes to avoid repeating them.

“The problem with the dot-com era was that we had a mantra that the rules [had] changed, that everything was different,” he said. “In some ways, that was true, because the rules did change. But in other ways, it was false, because the principles remained the same. We confused rules [with] principles.”

Gaw added that the Internet’s rise was similar to the invention of the airplane, which did change the world and the rules of travel. However, it did not change the underlying principles of flight. “Gravity still works the same, whether there’s an airplane or not,” he noted.

Similarly, business principles are a constant, and it is up to e-commerce companies to learn — and successfully apply — the basics.

Present Tense

Indeed, if there is a bright side to the dot-com mass extinction, it is that companies now are open to learning from the past, rather than insisting that everything has changed and old rules do not apply. The loss of so much money has not been forgotten by survivors of the e-business shakeout — and neither have the pain of layoffs, unreturned phone calls from venture capitalists and derision from consumers.

Philip Kaplan, founder of dot-com deathwatch site F***edCompany.com, told the E-Commerce Times that when he visits a company now, management and employees are keen to show they are not squandering money as in days past.

“They almost brag about how crappy their offices are and how bad their parties are,” he said. “It’s like they’re proud of it. But I think they’re just trying to distance themselves from what happened.”

Some lessons remain to be learned, though. Although the bad old days seem to have passed, e-commerce companies still face significant challenges in the present. Some of these hurdles are standard business difficulties, including retaining customers, determining adequate pricing strategies, maintaining proper staffing levels and keeping up with the pace of technology changes.

Other issues are more e-commerce specific, such as perfecting online usability and earning customer trust in an age of identity theft.

Future Perfect?

If today’s e-commerce companies could be visited by a Ghost of E-Business Future, they likely would want to know what challenges they would face in coming months and years, so they could prepare to meet them.

One area that still presents a challenge and will continue to do so in years ahead is usability and navigation. Jared Spool, founding principal of usability firm User Interface Engineering, told the E-Commerce Times that some e-tailers still do not know why customers come to their sites. He said this lack of insight likely will not be overcome soon.

“We see it all the time, that one little piece of information turns out to be tremendously important,” he said. “Making a site usable is a process that takes understanding and time. There is still a great deal to be learned.”

Pendulum Ready To Swing?

Some contemplative e-business executives may be able to gain a deeper understanding of usability, basic business practices and customer management as time passes, but not all of them will succeed, Gaw warned.

“We’re kind of like dogs,” he said. “We have really poor memories. That both helps and hurts…. You need to have a short memory to take risks and be aggressive. But it hurts when those risks do not pan out.”

He noted that in the e-commerce industry, as in other sectors, the pendulum swing of business means that the aggressiveness of the dot-com era is currently being counteracted by conservatism. In the future, the pendulum may begin swinging back toward exuberance.

“You see this swinging in everything, from business to politics,” Gaw said, “and pretty soon you realize that this is just the way life is.”


Attacks on Cloud Service Providers Down 25% During First 4 Months of 2022

New research from Atlas VPN shows that cloud-native exploits on major cloud service providers (CSPs) declined during the first four months of 2022.

Cloud-native exploits dropped by 25%, from 71 exploits in the first four months of 2021 to 53 exploits in the first four months of this year, Atlas researcher Ruta Cizinauskaite told the E-Commerce Times.

Although those numbers may seem small, they are significant, maintained Paolo Passeri, a cyber intelligence principal at Netskope, a Security Service Edge provider in Santa Clara, Calif., and author of the Hackmageddon blog, from which Atlas obtained the data for its report.

“This is only the so-called tip of the iceberg, that is, campaigns that have been unearthed and disclosed by security researchers,” he told the E-Commerce Times.

One of the most targeted CSPs during the period was Amazon Web Services (AWS), Cizinauskaite wrote in the report released June 8. “[AWS] suffered the most cloud-native exploits among cloud service providers as of April 2022,” she reported. “In total, it experienced 10 cloud-native exploits accounting for nearly a fifth (18.9%) of all such events in the first four months of this year.”

She explained that cloud-native threats refer to cyber events that exploit the cloud in one or more stages of the “kill chain,” a cybersecurity model that identifies the typical steps taken by hackers during a cyberattack.
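
To make that definition concrete, the minimal Python sketch below (an illustration, not Atlas VPN’s methodology) models the seven stages of the Lockheed Martin kill chain and flags an event as cloud-native when any stage abuses a cloud service; the sample event is hypothetical.

    # Illustrative model of the kill chain; not Atlas VPN's methodology.
    from dataclasses import dataclass, field
    from enum import Enum, auto

    class KillChainStage(Enum):
        RECONNAISSANCE = auto()
        WEAPONIZATION = auto()
        DELIVERY = auto()
        EXPLOITATION = auto()
        INSTALLATION = auto()
        COMMAND_AND_CONTROL = auto()
        ACTIONS_ON_OBJECTIVES = auto()

    @dataclass
    class CyberEvent:
        name: str
        # Maps each observed stage to the cloud service abused at that
        # stage, or None if no cloud service was involved.
        stages: dict = field(default_factory=dict)

        def is_cloud_native(self) -> bool:
            return any(csp is not None for csp in self.stages.values())

    # Hypothetical campaign: payload hosted on, and controlled from, a CSP.
    event = CyberEvent("sample-campaign", {
        KillChainStage.DELIVERY: "AWS S3",
        KillChainStage.COMMAND_AND_CONTROL: "AWS EC2",
        KillChainStage.EXPLOITATION: None,
    })
    print(event.is_cloud_native())  # True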

Tool for Mischief

For hackers, Amazon — which, with a third of the CSP market, is top dog — is a robust battleground where an attacker can never run out of targets, Alon Gal, co-founder and CTO of Hudson Rock, a threat intelligence company in Tel Aviv, Israel, told the E-Commerce Times.

AWS is also a flexible tool that can be used for multiple purposes, Passeri added. For example, AWS can be used to host a malicious payload delivered during an attack, as a command-and-control center for malware or to provide the infrastructure to exfiltrate data, he explained.

“As trust in cloud service providers has increased, so has the attraction for cybercriminals that target selected external services with sophisticated yet expected techniques,” Gal observed.

“Once a playbook for a technique is developed,” he continued, “it usually results in a quick win for them across multiple companies.”

Tempting Targets

David Vincent, vice president of product strategies at Appsian Security, an ERP security application provider in Dallas, explained that more and more organizations are moving their critical business systems into the cloud for obvious advantages.

“As long as these business systems contain valuable targets such as data and personally identifiable information or enable financial transactions, like payments, that criminals want access to, these cloud solutions will continue to be targeted by malicious actors,” he told the E-Commerce Times.

With 60% of corporate data stored in the cloud, CSPs have become a target for hackers, Passeri added.

“Besides,” he continued, “a compromised cloud account can provide the attackers multiple tools to make their attacks more evasive.” For example, it can supply a platform such as AWS, OneDrive or Google Drive to host malicious content, or an embedded email service such as Exchange or Gmail to deliver that content past web security gateways.

Fishers of Bytes

The report noted that trailing behind AWS were five services with five exploits each: Microsoft OneDrive, Discord, Dropbox, Google Drive, and GitHub.

Other services had a thinner slice of the exploit pie: Pastebin (5.7%); Microsoft 365 and Azure (3.8%); and Adobe Creative Cloud, Blogger, Google Docs, Google Firebase, Google Forms, MediaFire, and Microsoft Teams (1.9%).


A majority of the exploits (64.8%), the report found, were aimed at delivering a malware strain or a phishing page.

Other exploits used the CSPs to set up command-and-control infrastructure for malicious activity elsewhere (18.5%) or to steal data and launch other attacks (16.7%).

“Successful hackers are like fishermen: they have different lures in the tackle box to attack a victim’s weakness, and they often must change the lure or use multiple lures because the victims become informed and won’t bite,” Vincent explained.

Exploiting CSP Infrastructure

Passeri explained that malware distributed through CSPs is not designed to compromise their systems but to piggyback on their infrastructure, which the victims and the organizations that use it regard as trusted.

In addition, he continued, the CSPs offer a flexible platform that is resilient and simplifies hosting. For example, there is no need to allocate an IP space and register a domain.

Advantages to hackers of using a CSP’s infrastructure cited by Passeri include:

  • It is considered trusted by the victim, who sees a legitimate domain and, in the case of a phishing page, a webpage hosted on a cloud service with a legitimate certificate.
  • It is often considered trusted by organizations as well: many treat CSP infrastructure as inherently safe and whitelist the corresponding traffic, so the security controls normally enforced on traditional web traffic are never applied.
  • It is resilient: if the malicious content is taken down, the attackers can spin up a new instance almost instantly.
  • Traditional web security technologies are blind to context; they cannot tell whether a connection to AWS is heading to a legitimate corporate instance or to a rogue instance controlled by the attackers (a defensive sketch of such a check follows this list).
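
That last point lends itself to a small illustration. The following Python sketch, with hypothetical bucket names, contrasts whitelisting all AWS traffic with checking which S3 bucket a URL actually points to; production gateways do this with far richer context, and regional endpoint formats are omitted for brevity.

    # Minimal context check: allow only known corporate S3 buckets instead
    # of whitelisting all AWS traffic. Bucket names are hypothetical.
    # Requires Python 3.10+.
    from urllib.parse import urlparse

    CORPORATE_BUCKETS = {"acme-prod-assets", "acme-backups"}

    def bucket_from_url(url: str) -> str | None:
        """Extract the bucket from virtual-hosted- or path-style S3 URLs."""
        parsed = urlparse(url)
        host = parsed.hostname or ""
        if host.endswith(".s3.amazonaws.com"):  # virtual-hosted style
            return host.removesuffix(".s3.amazonaws.com")
        if host == "s3.amazonaws.com":          # path style
            first = parsed.path.lstrip("/").split("/", 1)[0]
            return first or None
        return None

    def is_trusted(url: str) -> bool:
        """Allow only known corporate buckets, not all of AWS."""
        bucket = bucket_from_url(url)
        return bucket is not None and bucket in CORPORATE_BUCKETS

    print(is_trusted("https://acme-prod-assets.s3.amazonaws.com/report.pdf"))  # True
    print(is_trusted("https://attacker-drop.s3.amazonaws.com/payload.bin"))    # False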

Info-Stealers

One form of malware distributed through CSPs is information-stealing software. “Info-stealers are a quick win for hackers, as they are able to capture all the sensitive data from a compromised computer in a matter of seconds while leaving almost no traces behind,” Gal said.

“They can then use data like corporate credentials and cookies that were captured by the stealer to cause significant data breaches and ransomware attacks,” he added.

While hackers are willing to use CSP infrastructure for nefarious ends, they’re less inclined to attack that infrastructure itself. “Most exploits from CSPs are a result of misconfigured public internet-facing resources, like AWS S3 buckets,” explained Carmit Yadin, CEO and founder of DeviceTotal, a risk management company in Tel Aviv, Israel.

“Malicious actors target these misconfigurations rather than looking for a vulnerability in the CSP’s infrastructure,” he told the E-Commerce Times. “CSPs often maintain a more secure infrastructure than their customers can manage alone.”
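
As a hedged illustration of what auditing for such misconfigurations can look like, the Python sketch below uses the AWS boto3 SDK to flag S3 buckets that lack a complete public-access block or whose ACL grants access to all users. It assumes credentials are already configured, and it is a starting point only; dedicated tools such as AWS Config cover far more ground.

    # Sketch of an S3 misconfiguration audit using boto3 (pip install boto3).
    # Assumes AWS credentials are configured in the environment.
    import boto3
    from botocore.exceptions import ClientError

    ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"
    s3 = boto3.client("s3")

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
            if not all(cfg.values()):
                # One or more of the four block-public-access flags is off.
                print(f"{name}: public-access block incomplete: {cfg}")
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                print(f"{name}: no public-access block configured")
        acl = s3.get_bucket_acl(Bucket=name)
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") == ALL_USERS:
                print(f"{name}: ACL grants {grant['Permission']} to all users")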

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.

EXCLUSIVE INTERVIEW

The Business Case for Clean Data and Governance Planning

Do you know if your company’s data is clean and well managed? Why does that matter anyway?

Without a working governance plan, you might not have a company to worry about — data-wise.

Data governance is a collection of practices and processes establishing the rules, policies, and procedures that ensure data accuracy, quality, reliability, and security. It ensures the formal management of data assets within an organization.

Everyone in business understands the need to have and use clean data. But ensuring that it is clean and usable is a big challenge, according to David Kolinek, vice president of product management at Ataccama.

That challenge is even greater when business users must rely on scarce technical resources. Often, no one person oversees data governance, or that individual lacks a complete understanding of how the data will be used and how to clean it.

This is where Ataccama comes into play. The company’s mission is to provide a solution that even people without technical knowledge, such as SQL skills, can use to find the data they need, evaluate its quality, understand how to fix any issues, and determine whether that data will serve their purposes.

“With Ataccama, business users don’t need to involve IT to manage, access, and clean their data,” Kolinek told TechNewsWorld.

Keeping Users in Mind

Ataccama was founded in 2007 and was essentially bootstrapped.

It started as part of Adastra, a consulting company that is still in business today. Ataccama, however, was focused on software rather than consulting, so management spun off that operation as a product company addressing data quality issues.

Ataccama started with a simple approach: an engine that performed basic data cleansing and transformation. But it still required an expert user, because the configuration had to be supplied by hand.

“So, we added a visual presentation for the steps that enable data transformation and things like cleansing. This made it a low-code platform since the users were able to do the majority of the work just by using the application user interface. But it was still a thick-client platform,” Kolinek explained.

The current version, however, is designed with a non-technical user in mind. The software includes a thin client, a focus on automation, and an easy-to-use interface.

“But what really stands out is the user experience, which is built off the seamless integration we were able to achieve with the 13th version of our engine. It delivers robust performance that’s tuned to perfection,” he offered.

Digging Deeper Into Data Management Issues

I asked Kolinek to discuss the data governance and quality issues further. Here is our conversation.

TechNewsWorld: How does Ataccama’s concept of centralizing or consolidating data management differ from other cloud systems such as Microsoft, Salesforce, AWS, and Google Cloud?

David Kolinek: We are platform agnostic and do not target one specific technology. Microsoft and AWS have their own native solutions that work well, but only within their own infrastructure. Our portfolio is wide open so it can serve all the use cases that must be covered across any infrastructure.

Further, we have data processing and metadata management capabilities that not all cloud providers possess. Metadata drives automated processing, which generates more metadata that in turn can be used for additional analytics.

We developed both of these technologies in-house so we can provide native integration. As a result, we can deliver a superior user experience and a whole lot of automation.

How is this concept different from the notion of standardization of data?

David Kolinek, VP of Product Management, Ataccama

Kolinek: Standardization is just one of many things we do. Usually, standardization can be easily automated, the same way we can automate cleansing or data enrichment. We also provide manual data correction when solving some issues, like a missing social security number.

We cannot generate the SSN, but we could come up with a date of birth from other information. So, standardization is not different. It is a subset of things that improve quality. But for us, it is not only about data standardization. It is about having good quality data so information can be properly leveraged.
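
As a generic illustration of the kind of rule-based standardization and manual-correction flagging Kolinek describes (this is not Ataccama’s engine; the record layout and rules are hypothetical), a few lines of Python suffice:

    # Generic rule-based cleansing/standardization sketch; not Ataccama code.
    # Hypothetical record layout and rules. Requires Python 3.10+.
    import re

    SSN_RE = re.compile(r"^\d{3}-\d{2}-\d{4}$")

    def standardize_phone(raw: str) -> str | None:
        """Normalize US phone numbers to the form 555-123-4567."""
        digits = re.sub(r"\D", "", raw)
        if len(digits) == 11 and digits.startswith("1"):
            digits = digits[1:]
        if len(digits) != 10:
            return None  # cannot be automated; flag for manual correction
        return f"{digits[0:3]}-{digits[3:6]}-{digits[6:]}"

    def validate_record(record: dict) -> list[str]:
        """Return a list of data quality issues found in one record."""
        issues = []
        if not SSN_RE.match(record.get("ssn", "")):
            issues.append("ssn: invalid or missing, needs manual correction")
        if standardize_phone(record.get("phone", "")) is None:
            issues.append("phone: cannot be standardized")
        return issues

    record = {"ssn": "123-45-678", "phone": "(555) 123 4567"}
    print(validate_record(record))              # ['ssn: invalid or missing, ...']
    print(standardize_phone(record["phone"]))   # 555-123-4567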

How does Ataccama’s data management platform benefit users?

Kolinek: The user experience is really our biggest benefit, and the platform is ideal for handling multiple personas. Companies need to enable both business users and IT people when it comes to data management. That requires a solution for business and IT to collaborate.

Another enormous benefit of our platform is the strong synergy between data processing and metadata management it provides.

The majority of other data management vendors cover only one of these areas. We also use both machine learning and a rules-based approach to validation and standardization; again, many other vendors support only one or the other.

Also, because we are technology agnostic, users can connect to many different technologies from the same platform. With edge processing, for instance, you can configure something once in Ataccama ONE, and the platform will translate it for different platforms.

Does Ataccama’s platform lock in users the way proprietary software often does?

Kolinek: We developed all the core components of the platform ourselves. They are tightly integrated together. There has been a huge wave of acquisitions lately in this space, with big vendors buying smaller ones to fill in gaps. In some cases, you are not really buying and managing one platform, but many.

With Ataccama, you can purchase just one module, like data quality/standardization, and later expand to others, such as master data management (MDM). It all works together seamlessly. Just activate our modules as you need them. This makes it easy for customers to start small and expand when the time is right.

Why is a unified data platform so important in this process?

Kolinek: The biggest benefit of a unified platform is that companies are not looking for a point solution to solve just a single problem, like data standardization. It is all interconnected.

For instance, to standardize you must validate the quality of the data, and for that, you must first find and catalog it. If you have an issue, even though it may look like a discrete problem, it more than likely involves many other aspects of data management.

The beauty of a unified platform is that in most use cases, you have one solution with native integration, and you can start using other modules.

What role do AI and ML play today in data governance, data quality, and master data management? How is it changing the process?

Kolinek: Machine learning enables customers to be more proactive. Previously, you would identify and report an issue. Someone would have to investigate what went awry and see if there was something wrong with the data. Then you would create a rule for data quality to prevent a recurrence. That is all reactive and is based on something breaking down, being found, reported, and then fixed.

Again, ML lets you be proactive. You give it training data instead of rules. The platform then detects differences in patterns and identifies anomalies, alerting you before you even realize there is an issue. This is not possible with a rules-based approach, and it is much easier to scale when you have a huge number of data sources. The more data you have, the better the training and its accuracy will be.
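
A minimal sketch of that proactive, training-data-driven approach (not Ataccama’s implementation) uses scikit-learn’s IsolationForest to flag anomalous batches without hand-written rules:

    # Anomaly detection on a data quality metric; illustrative only.
    # Requires: pip install scikit-learn numpy
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Training data: historical, mostly healthy daily row counts per feed.
    normal = rng.normal(loc=10_000, scale=250, size=(300, 1))

    model = IsolationForest(contamination=0.01, random_state=0)
    model.fit(normal)

    # New batches: one normal, one that silently dropped most of its rows.
    batches = np.array([[9_900.0], [1_200.0]])
    labels = model.predict(batches)  # 1 = normal, -1 = anomaly

    for value, label in zip(batches.ravel(), labels):
        status = "anomaly, alert before users notice" if label == -1 else "ok"
        print(f"rows={value:.0f}: {status}")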

Other than cost savings, what benefits can enterprises gain through consolidating their data repositories? For instance, does it improve security, CX outcomes, etc.?

Kolinek: It does improve security and mitigates potential future leaks. For example, we had customers who were storing data that no one was using. In many cases, they did not even know the data existed! Now, they are not only unifying their technology stack, but they can also see all the stored data.

Onboarding new people onto the platform is also much easier with consolidated data. The more transparent the environment, the sooner people can use it and start gaining value.

It is not so much about saving money as it is about leveraging all your data to generate a competitive advantage and generate additional revenue. It provides data scientists with the means to build things that will advance the business.

What are the steps in adopting a data management platform?

Kolinek: Begin with the initial analysis. Focus on the biggest issues the company wants to tackle and select the platform modules to address them. Defining goals is key at this stage. What KPIs do you want to target? What level of ID do you want to achieve? These are questions you need to ask.

Next, you need a champion to advance execution and identify the main stakeholders who could drive the initiative. That requires extensive communications among different stakeholders, so it is vital to have someone focused on educating others about the benefits and helping teams onboard the system. Then comes the implementation phase where you address the key issues identified in the analysis, followed by rollout.

Finally, think about the next set of issues that need to be addressed, and if needed, enable additional modules in the platform to achieve those goals. The worst thing to do is purchase a tool, hand it over, and offer no service, education, or support; that all but guarantees low adoption. Education, support, and service are very important during the adoption phase.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.
