Will super chips disrupt the 'everything to the cloud' IT mentality?

Credit to Author: eschuman@thecontentfirm.com | Date: Wed, 10 Jan 2024 03:00:00 -0800

Enterprise IT for the last couple of years has grown disappointed in the economics — not to mention the cybersecurity and compliance impact — of corporate clouds. In general, with a few exceptions, enterprises have done little about it; most found the scalability and efficiencies too seductive.

Might that change in 2024 and 2025?

Apple has begun talking about efforts to add higher-end compute capabilities to its chips, following similar efforts from Intel and NVIDIA. Although those new capabilities are aimed at enabling more large language model (LLM) capabilities on-device, anything that can deliver that level of data-crunching and analytics can also handle almost every other enterprise IT task.

Given that enterprise CIOs are already less than thrilled with cloud costs — and that Amazon, Google, and Microsoft are unlikely to do anything other than raise cloud rates this year — will these souped-up CPUs be the proverbial straw that breaks the cloud camel's back?

The percentage of enterprise data moving to the cloud from on-prem rose steadily every year until roughly February 2020, when COVID-19 forced most companies to shutter offices, leaving computer rooms empty. Enterprises had to make emergency moves to the cloud, and once they did, few were willing to meaningfully return to the on-prem levels of 2019.

“Companies and their CISOs and their CIOs are recognizing that the cloud has not turned out to be the all-encompassing panacea that some had hoped it would be,” said Brian Levine, the managing director for cybersecurity and data privacy at Ernst & Young, which now prefers to be called EY. “There are benefits and drawbacks of the cloud, including a lot of efficiencies that come from leveraging the cloud. But it also brings with it a whole new array of security issues, such as that criminals have the opportunity to go after the big kahuna, the big fish. That’s a really big bull’s-eye on your back.”

Levine’s point is that the cloud creates a single point of failure. That’s a good thing to the extent that cloud security is top-notch, which is true for the major cloud environments. But it also means that if an attacker can break through, the bad guy has access to the top corporate secrets of hundreds, if not thousands, of major firms. That’s an attractive target, making it worth substantial effort and investment on the part of the attackers — especially if the secrets would interest state actors.

Another factor involves cloud economics. The initial cloud sales pitches argued that enterprises could turn over their data needs to the cloud provider and reinvest IT dollars elsewhere.

Not only have cloud costs consistently grown, but the promised efficiencies didn’t materialize. Because enterprises typically have multiple cloud environments for different purposes (backup, disaster recovery, geographic issues involving data sovereignty, etc.), they typically need to hire specialists in each cloud flavor, such as AWS, Google Cloud, and Azure.

To be fair, that tradeoff might make fiscal sense for a small business without the ability to deliver the high-end security big cloud providers offer. But in the enterprise space, especially among the Fortune 100, companies often can deliver powerful security on their own. 

“I have a lot of clients who are unhappy with the amounts they are spending on cloud. They are, in fact, surprised by the amount it has cost over time,” Levine said. “You need a lot of specialists in your environment and you often are going to run into the ‘too many cooks’ problem, where the one hand doesn’t necessarily know what the other hand is doing.”

This can happen because enterprise IT works to make sure all settings and configurations precisely align with their needs. And yet, cloud staffers can make a universal change for clients that messes up that IT implementation. Even worse, not only do cloud teams often not ask permission from corporate customers before making a settings change, they typically don’t even tell them of the change. 

“This delivers a situation where your IT team is taking actions and then spending time checking to see if someone didn’t undo their actions. It’s a complexity problem,” Levine said, noting that the very efforts to secure an environment can inadvertently undermine protections. “Shrinking your environment (by bringing data in house, such as on-prem and on-device) makes it easier to secure your environment. Sometimes when you think you have created a better mousetrap, such as running LLMs on a chip, it creates blind spots because you are not thinking about it as much.”

Barry Scannell, a technology law and data protection attorney with the William Fry law firm in Ireland, said he’s intrigued by the Apple move for security and compliance reasons, especially with the EU’s rules on privacy and AI. 

“On-device AI processing offers notable privacy benefits over cloud-based methods and enables offline functionality,” Scannell said. “This approach not only enhances user privacy in accordance with GDPR’s requirements, but also emphasizes the need for stringent data security protocols at the device level to safeguard against data breaches and unauthorized access.”

Scannell’s argument is not limited to Apple’s generative AI efforts. The same is true for any data handling on the device. If data doesn’t leave the device — or at least never leaves the enterprise environment — it is a much smaller target. Even better, that environment is controlled by enterprise IT. (That may or may not make it more secure, but if IT is going to get hit with compliance violations and fines, at least the fault won’t lie with a cloud staffer.)

Another perspective was best articulated by Malcolm Harkins, the chief security and trust officer at HiddenLayer. Harkins argued that while cybersecurity and compliance considerations are important, budget concerns will underlie the decision.

“At the end of the day, I am a strong believer that economics always wins,” Harkins said. “And if it ends up being economically more efficient/effective on-device, it will happen there and, if not, it will be in the cloud. I also think you have to look beyond the processing location. If I just process locally, how do I get the benefit of the broader AI advancements and flow of data?  So for some items, I think even if data is processed locally, the value — yes, the economics again — of having the data shared/leveraged with other information will still mean a lot of information will flow into the cloud.”

The issue really isn’t whether enterprise IT is going to bring things back from the cloud. IT has always had the ability to do that (other than during the pandemic) and, for the most part, has not — and I doubt that will change. But what may very well happen is a small reduction in how much new data is sent to the cloud.

Instead of sending 95%-plus of new data to the cloud, which is what appears to be happening with most enterprises, that figure might drop to perhaps 75% or 80%.

The key question: Will these new promised chip capabilities make a difference?
