PODCAST

AT&T’s Take on Shifting Cloud Challenges and Opportunities

Migrating to the cloud is all the rage, but companies large and small are finding that they need to move a wider variety of their daily operations to network-delivered services — if they are available and within their budgets. One of the world’s largest service providers, global telecommunications giant AT&T, has created advanced cloud services for its business customers. AT&T has developed the ability to provide virtual private clouds and other computing capabilities as integrated services at scale.

Chris Costello, assistant vice president of AT&T Cloud Services, knows firsthand the challenges and opportunities involved in implementing cloud technology to deliver and commercialize an adaptive and reliable cloud services ecosystem. Costello sees which cloud computing needs are being requested by smaller companies — bundled services, on-demand capabilities — and knows why the enterprise is emphasizing state-of-the-art security features.

Costello shared her story on building top-performing infrastructure, and how the cloud services trend could open the door to more opportunities for women in technology. The interview was conducted by Dana Gardner, Principal Analyst at Interarbor Solutions.


Download the podcast (33:23).


Here are some excerpts:

Dana Gardner: Just to help us understand, why are business cloud services such an important initiative for you?

Chris Costello: AT&T has been in the hosting business for more than 15 years, and so it was only a natural extension for us to get into the cloud services business to evolve with customers’ changing business demands and technology needs.

We have cloud services in several areas. The first is our AT&T Synaptic Compute as a Service. This is a hybrid cloud that allows VMware clients to extend their private clouds into AT&T’s network-based cloud using a virtual private network. And it melds the security and performance of VPNs with the economics and flexibility of a public cloud. So the service is optimized for VMware’s more than 350,000 clients.

If you look at customers who have internal clouds today or private data centers, they like the control, the security, and the leverage that they have, but they really want the best of both worlds. There are certain workloads where they want to burst into a service provider’s cloud.

We give them that flexibility, agility and control, where they can simply point and click, using free downloadable tools from VMware, to instantly turn up workloads into AT&T’s cloud.

Another capability that we have in this space is AT&T Platform as a Service. This is targeted primarily to independent software vendors (ISVs), IT leaders, and line-of-business managers. It allows customers to choose from 50 pre-built applications, instantly mobilize those applications, and run them in AT&T’s cloud, all without having to write a single line of code.

So we’re really starting to get into more of the informal buyers, those line-of-business managers, and IT managers who don’t have the budget to build it all themselves, or don’t have the budget to buy expensive software licenses for certain application environments.

Examples of some of the applications that we support with our platform as a service (PaaS) are things like salesforce automation, quote and proposal tools, and budget management tools.

The third key category of AT&T’s Cloud Services is in the storage space. We have our AT&T Synaptic Storage as a Service, and this gives customers control over storage, distribution, and retrieval of their data, on the go, using any web-enabled device. In a little bit, I can get into some detail on use cases of how customers are using our cloud services.

This is a very important initiative for AT&T. We’re seeing demand from customers of all shapes and sizes. We have a sizable business and effort supporting our small- to medium-sized business customers, and we have capabilities that we have tailor-developed just to reach those markets.

As an example, in SMB, it’s all about the bundle. It’s all about simplicity. It’s all about on-demand. And it’s all about pay-per-use and having a service provider they can trust.

In the enterprise space, you really start getting into detailed discussions around security. You also start getting into discussions with many customers who already have private networking solutions from AT&T that they trust. When you start talking with clients around the fact that they can run a workload, turn up a server in the cloud, behind their firewall, it really resonates with CIOs who we’re speaking with in the enterprise space.

Also in enterprises, it’s about having a globally consistent experience. So as these customers are reaching new markets, it’s all about not having to stand up an additional data center, compute instance, or what have you, and having a very consistent experience, no matter where they do business, anywhere in the world.

Gardner: The fact is that a significant majority of CIOs and IT executives are men, and that’s been the case for quite some time. But I’m curious, does cloud computing and the accompanying shift towards IT becoming more of a services-brokering role change that? Do you think that with the consensus building among businesses and partner groups being more important in that brokering role, this might bring in a new era for women in tech?

Costello: I think it is a new era for women in tech. Speaking specifically to my experience working in technology at AT&T, this company has really provided me with an opportunity to grow both personally and professionally.

I currently lead our Cloud Office at AT&T and, prior to that, ran AT&T’s global managed hosting business across our 38 data centers. I was also lucky enough to be chosen as one of the top women in wireline services.

What drives me as a woman in technology is that I enjoy the challenge of creating offers that meet customer needs, whether they be in the cloud space, things like driving e-commerce, high performance computing environments, or disaster recovery solutions.

I love spending time with customers. That’s my favorite thing to do. I also like to interact with many partners and vendors that I work with to stay current on trends and technologies. The key to success of being a woman working in technology is being able to build offers that solve customers’ business problems, number one.

Number two is being able to then articulate the value of a lot of the complexity around some of these solutions, and package the value in a way that’s very simple for customers to understand.

Some of the challenge and also opportunity of the future is that, as technology continues to evolve, it’s about reducing complexity for customers and making the service experience seamless. The trend is to deliver more and more finished services versus complex infrastructure solutions.

I’ve had the opportunity to interact with many women in leadership, whether they are my peers, managers on my own team, or mentors within AT&T who are senior leaders in the business.

I also mentor three women at AT&T, across technology, sales, and operations roles. So I’m starting to see this trend continue to grow.

Dana Gardner is president and principal analyst at Interarbor Solutions, which tracks trends, delivers forecasts and interprets the competitive landscape of enterprise applications and software infrastructure markets for clients. He also produces BriefingsDirect sponsored podcasts. Follow Dana Gardner on Twitter. Disclosure: VMware is a sponsor of BriefingsDirect podcasts.


Linux Security Study Reveals When, How You Patch Matters

Computer security only happens when software is kept up to date. That should be a basic tenet for business users and IT departments.

Apparently, it isn’t, at least for some Linux users who neglect to install patches, critical or otherwise.

A recent survey sponsored by TuxCare, a vendor-neutral enterprise support system for commercial Linux, shows companies fail to protect themselves against cyberattacks even when patches exist.

Results reveal that some 55 percent of respondents had a cybersecurity incident because an available patch was not applied. Once a critical or high-priority vulnerability was found, 56 percent of organizations took between five weeks and one year, on average, to patch it.

The goal of the study was to understand how organizations are managing security and stability in the Linux suite of products. Sponsored by TuxCare, the Ponemon Institute in March surveyed 564 IT staffers and security practitioners in 16 different industries in the United States.

Data from respondents shows that companies take too long to patch security vulnerabilities, even when fixes already exist. Even as they left patches unapplied, many respondents reported feeling a heavy burden from a wide range of cyberattacks.

This is a fixable issue, noted Igor Seletskiy, CEO and founder of TuxCare. It is not because the solution does not exist. Rather, it is because it is difficult for businesses to prioritize future problems.
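The prioritization problem Seletskiy describes is often tackled with simple triage rules that rank a patch backlog by severity and by how long a fix has sat unapplied. The sketch below is hypothetical and illustrative only; the fields and ordering are assumptions, not something from the Ponemon study:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float      # severity score, 0-10
    days_open: int   # how long a patch has been available but unapplied

def triage(vulns):
    # Rank by severity first, then by how long the fix has gone unapplied,
    # so long-standing criticals bubble to the top of the patch queue.
    return sorted(vulns, key=lambda v: (v.cvss, v.days_open), reverse=True)

backlog = [
    Vulnerability("CVE-A", cvss=9.8, days_open=120),
    Vulnerability("CVE-B", cvss=5.3, days_open=400),
    Vulnerability("CVE-C", cvss=9.8, days_open=10),
]

for v in triage(backlog):
    print(v.cve_id, v.cvss, v.days_open)
```

Even a crude ordering like this turns "future problems" into a concrete work queue, which is the shift the survey suggests many organizations never make.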

“The people building the exploit kits have gotten really, really good. It used to be 30 days was best practice [for patching], and that is still an ideal best practice for a lot of regulations,” TuxCare President Jim Jackson told LinuxInsider.

Main Takeaways

The survey results expose the misconception that the Linux operating system is inherently rigorous and foolproof without intervention. Working under that assumption, unaware users often don’t even activate a firewall. Consequently, many of the pathways for intrusion come from vulnerabilities that could have been fixed.

“Patching is one of the most important steps an organization can take to protect themselves from ransomware and other cyberattacks,” noted Larry Ponemon, chairman and founder of Ponemon Institute.

Patching vulnerabilities is not just limited to the kernel. It needs to extend to other systems like libraries, virtualization, and database back ends, he added.

In November 2020, TuxCare launched the company’s first extended lifecycle support service for CentOS 6. It was wildly successful right off the bat, recalled Jackson. But what continues to trouble him is new clients coming for extended lifecycle support who had not done any patching.

“I always ask the same question. What have you been doing for the last year and a half? Nothing? You haven’t patched for a year. Do you realize how many vulnerabilities have piled up in that time?” he quipped.

Labor-Intensive Process

Ponemon’s research with TuxCare uncovered the trouble organizations have with patching vulnerabilities in a timely way. That was despite spending an average of $3.5 million annually, and more than 1,000 hours weekly, monitoring systems for threats and vulnerabilities, patching, and documenting and reporting the results, according to Ponemon.

“To address this problem, CIOs and IT security leaders need to work with other members of the executive team and board members to ensure security teams have the resources and expertise to detect vulnerabilities, prevent threats, and patch vulnerabilities in a timely manner,” he said.

The report found that respondents’ companies that did patch spent considerable time in that process:

  • The most time spent each week patching applications and systems was 340 hours.
  • Monitoring systems for threats and vulnerabilities took 280 hours each week.
  • Documenting and/or reporting on the patch management process took 115 hours each week.

For context, these figures relate to an IT team of 30 people and a workforce of 12,000, on average, across respondents.

Boundless Excuses Persist

Jackson recalled numerous conversations with prospects who repeat the same sordid tale. They mention investing in vulnerability scanning. They look at the vulnerability report the scanning produced. Then they complain about not having enough resources to actually assign somebody to fix the things that show up on the scan reports.

“That’s crazy!” he said.

Another challenge companies experience is the ever-present whack-a-mole syndrome. The problem gets so big that organizations and their senior managers just do not get beyond being overwhelmed.

Jackson likened the situation to homeowners trying to secure their houses. Plenty of adversaries lurk outside as potential break-in threats, and they are coming to look for the things you have in your house.

So people invest in an elaborate fence around the property and monitoring cameras to try to keep an eye on every angle, every possible attack vector, around the house.

“Then they leave a couple of windows open and the back door. That is kind of akin to leaving vulnerabilities unpatched. If you patch it, it is no longer exploitable,” he said.

So first get back to the basics, he recommended. Make sure you do that before you spend on other things.

Automation Makes Patching Painless

The patching problem remains serious, according to Jackson. Perhaps the only thing that is improving is the ability to apply automation to manage much of that process.

“Any known vulnerability we have needs to be mitigated within two weeks. That has driven people to automation for live patching and similar tools so you can cover tens of thousands of workloads. You can’t restart everything every two weeks. So you need technologies to get you through that and automate it,” he explained as a workable solution.
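For conventional (non-live) patching, the kind of automation Jackson describes can start with tooling that ships with the distribution itself. As one example, on RPM-based systems the stock dnf-automatic tool can apply security updates unattended; the excerpt below is a minimal configuration sketch using standard dnf-automatic options, not anything TuxCare-specific:

```ini
# /etc/dnf/automatic.conf (excerpt)
[commands]
# Only pull updates flagged as security fixes
upgrade_type = security
# Download and install them without operator intervention
apply_updates = yes

[emitters]
# Report what was applied to stdout / the system journal
emit_via = stdio
```

Enabling the accompanying timer (`systemctl enable --now dnf-automatic.timer`) then applies security patches on a schedule, though reboots or service restarts are still needed for changes to running code, which is the gap live patching addresses.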

Jackson said he finds the situation getting better. He sees more people and organizations becoming aware of automation tools.

For example, automation can apply patches to OpenSSL and glibc libraries while services are using them, without having to bounce the services. Database live patching is now available in beta, allowing TuxCare to apply security patches to MariaDB, MySQL, MongoDB, and other kinds of databases while they’re running.

“So you do not have to restart the database server or any of the clients they use. Continuing to drive awareness definitely helps. It seems like more people are becoming aware and realizing they need that kind of a solution,” said Jackson.

Jack M. Germain

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.
