
Head in the clouds: A look at on-premises, cloud and hybrid architecture


Virtually every day I find myself in an in-depth discussion of the latest opinions on the benefits and pitfalls of different hybrid-cloud deployment architectures. Each conversation either puts forward an entirely new approach or argues over which approach is most suitable for a plethora of different technologies, industries and use cases. Without a doubt, cloud computing has become one of the biggest technology disruptors of the 21st century, and its rapid growth and adoption have created challenges of their own.

Many organizations struggle to determine what their cloud strategy should be, and confusion persists about what being on cloud actually means. Software vendors add to this confusion by describing their software as cloud ready, powered by the cloud or on cloud, but in reality they offer different deployment topologies under these various headings.

Looking to the cloud

Based on my observations, every organization, large or small, needs to look seriously at the benefits that cloud technologies can bring. Cloud can take away the challenges of security, resilience, backups, scalability and upgrades while reducing costs. These benefits matter as much to mom-and-pop shops as to large-scale enterprises. Small organizations generally find it easier to move to and execute a comprehensive cloud strategy than large organizations do; however, the benefits an effective strategy brings to a large organization can be exponential.

To further demonstrate the value of a cloud strategy, I’ll break the various topologies that I see into three well-known categories: on premises, cloud and hybrid.

On-premises

This category likely needs no explanation. You own the hardware, it’s in your data center and you are responsible for all of it. If you apply a patch and it breaks something, you have to fix it. No problem. You have a backup, right? Oh, you don’t? Perhaps the backup didn’t run last night. Either way, you are going to lose a full 24 hours of data, not to mention the cost of the downtime while you figure out how to fix it. That scenario is dramatic, but on more than one occasion in my career I have heard of similar incidents, and none of us wants to be the person in the data center when it happens.

Cloud

This category is where things get interesting. Being on cloud doesn’t mean giving up control. Clients sometimes overlook the distinctions between public and private clouds, and even more often the distinction between private, hosted clouds and private, on-premises clouds. I have spoken to many clients who have stated categorically that they will never put their data into the cloud. In fact, a retailer I recently worked with stated it will never put its client data anywhere but in its own basement, not even in an on-premises private cloud. Yet competing clients in the same industry are pressing ahead aggressively with a private, hosted-cloud strategy for exactly the same type of data.

In all but the most highly sensitive scenarios, such as government environments, I take a slightly different view. Rolling out a private, on-premises cloud can still offer significant benefits to organizations such as this retailer, in the form of scalability, tenancy and so forth. However, such organizations can be spooked by the possibility of external access to their data and conclude that cloud technologies are not for them. Perhaps they simply haven’t understood the full range of options available to them. This backs up my theory that despite the disruption cloud has caused, we are still very much at the beginning of the maturity curve, which is probably why discussions of differing cloud topologies take up so much of my day.

Even for hosted, private-cloud deployments, an environment in which your data sits in a completely secure, private-cloud instance still gives you all the benefits of a cloud architecture with a level of security and protection comparable to a traditional on-premises implementation. I see it becoming increasingly common for hosted, private-cloud implementations to include specialist agreements with the cloud provider regarding physical access to the machines. Do these types of arrangements really pose any more risk than, say, a disgruntled employee in the IT department accessing your physical machines on premises? My personal opinion aside, a large percentage of clients are still not prepared to move their data out of their basements.

Hybrid

The hybrid category was the real driver for me wanting to write this post. The trouble with the on-premises and cloud categories is that each assumes you’re all in on that topology. In reality, most cloud deployments take on some kind of hybrid topology. A hybrid-cloud topology can mix on-premises, private-cloud and public-cloud components to build an architecture that takes advantage of the benefits of cloud computing while providing a realistic way to keep control of the components or data an organization doesn’t want to take off premises. Using compute resources in the cloud for high-intensity tasks while keeping your most sensitive data locked up tight is a valuable proposition for clients facing the concerns described above.

One of the most compelling points of view I come across is the notion of a head-in-the-cloud architecture. In this scenario, the business applications that users interact with (the head) are hosted on cloud, where they can take advantage of on-demand scalability and offer excellent performance to end users. Application programming interfaces (APIs), designed around the needs of the consuming application, provide secure access into the organization’s on-premises or private-cloud data center to retrieve the relevant data as required. These APIs have data governance baked in, ensuring that access to the data is protected and that the data is used in compliance with regulatory requirements.
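To make that idea a little more tangible, here is a minimal Python sketch of what a governed API in front of on-premises master data might look like: a hypothetical /customers endpoint that returns only the fields the consuming cloud application is allowed to see and masks anything sensitive before it leaves the data center. The route, field names, masking policy and in-memory lookup are illustrative assumptions, not any particular product’s API.

```python
# Minimal sketch of a "governed" API fronting on-premises master data.
# Field names, the masking policy and the backing lookup are illustrative
# assumptions, not a real MDM or IBM product interface.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Governance policy: which fields the consuming cloud application may see,
# and which must be masked before they leave the on-premises boundary.
ALLOWED_FIELDS = {"customer_id", "name", "city", "segment"}
MASKED_FIELDS = {"national_id", "date_of_birth"}

# Stand-in for the on-premises MDM lookup (assumed data shape).
_MDM_RECORDS = {
    "C-1001": {
        "customer_id": "C-1001",
        "name": "Jane Example",
        "city": "Hoboken",
        "segment": "retail",
        "national_id": "123-45-6789",
        "date_of_birth": "1980-01-01",
    }
}


def apply_governance(record: dict) -> dict:
    """Return only permitted fields and mask anything sensitive."""
    governed = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    for field in MASKED_FIELDS & record.keys():
        governed[field] = "***"  # masked, never returned in clear text
    return governed


@app.route("/customers/<customer_id>")
def get_customer(customer_id: str):
    record = _MDM_RECORDS.get(customer_id)
    if record is None:
        abort(404)
    return jsonify(apply_governance(record))


if __name__ == "__main__":
    app.run(port=8080)
```

The design point is that the governance rules live with the API itself, so every cloud-hosted consumer gets the same protections by default rather than each application re-implementing them.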

I have seen a big spike in interest in head-in-the-cloud topologies. They offer a progressive step to cloud for business-to-business (B2B) and business-to-consumer (B2C) applications at organizations that want the benefits of cloud topologies but are not yet comfortable going all in. For tasks requiring high levels of compute power that can be called upon as required, such as running deep analytics, a head-in-the-cloud topology makes total sense.

Consider a typical operational master data management (MDM) implementation, such as IBM MDM. These large integration projects result in an integrated, single view of the important business entities that exist across an enterprise. Critical business processes consume this single view, ensuring that they operate on the most accurate data available.

Trying a head-in-the-cloud approach

But when it comes to understanding the data within these MDM systems, integrating the right tools to provide deep data discovery without affecting the performance of what is a critical system can be challenging. A head-in-the-cloud approach to this problem allows portions of the data to be extracted into a landing zone, where cloud-based tools can run deep analytics on the MDM data. Analytical tools give data scientists the compute power needed to discover and understand the data within their MDM system without impacting business operations. The MDM data can be augmented in the cloud with other sources, such as those from cloud data providers, to uncover new insights from what would otherwise be dark data. Governed access to the MDM data through APIs helps ensure that sensitive data is handled with proper restrictions. In a head-in-the-cloud topology, all these benefits can be achieved without interfering with the operation of the existing MDM system and without risking either a long-running IT project or destabilizing the existing environment.
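As a sketch of how such an extraction might work, the hypothetical batch job below pulls records through the governed API from the earlier sketch and writes them as JSON Lines into a directory standing in for a cloud object-storage landing zone. The endpoint, record IDs and file layout are assumptions for illustration only.

```python
# Hypothetical batch extract from the governed API into a landing zone.
# In practice the landing zone would be cloud object storage; a local
# directory stands in here so the sketch stays self-contained.
import json
from pathlib import Path

import requests  # assumes the governed API sketch above is running locally

API_BASE = "http://localhost:8080"      # assumed endpoint
LANDING_ZONE = Path("landing_zone")     # stand-in for an object-store bucket
CUSTOMER_IDS = ["C-1001"]               # in reality, driven by a change feed


def extract_batch() -> Path:
    """Pull governed records and land them as JSON Lines for analytics."""
    LANDING_ZONE.mkdir(exist_ok=True)
    out_file = LANDING_ZONE / "customers.jsonl"
    with out_file.open("w") as fh:
        for customer_id in CUSTOMER_IDS:
            resp = requests.get(f"{API_BASE}/customers/{customer_id}", timeout=10)
            resp.raise_for_status()
            # Only governed (permitted or masked) fields arrive here, so the
            # landing zone never holds raw sensitive data.
            fh.write(json.dumps(resp.json()) + "\n")
    return out_file


if __name__ == "__main__":
    print(f"Wrote governed extract to {extract_batch()}")
```

Because the extract goes through the governed API rather than the MDM database itself, the analytics workload in the cloud never touches the operational system directly.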

A head-in-the-cloud, hybrid-cloud strategy can provide a mechanism to extract value from existing on-premises IT investments by separating where the work gets done. This approach enables new ways of working with the data in on-premises systems and supports self-service usage and consumption of that data, freeing analysts and data scientists to discover insights from it.

Used with permission from IBM Big Data & Analytics Hub
by Jay Limburn, Senior Technical Staff Member and Offering Manager, IBM
