Tuesday 14 September 2010

A Cloud Discussion With Dimension Data

Working as I do at EMC, I get the opportunity to work with some of the world's best partners, including Dimension Data. Recently acquired by NTT, they're in an interesting position to do well as the world shifts from traditional IT to newer models.

I was recently asked to contribute a thought piece for a new customer publication they're creating. They asked the questions, I gave my best answers.

Graciously, they allowed me to post the piece here in advance of their publication. I'll supply a link when it gets published -- the publication shows every sign of being a good and valuable read.



There is a lot of industry hype around cloud computing – can you talk about virtualisation and private and public cloud computing, what they mean to clients today, and why clients should pay attention (if they are not already)?

Great question – thanks. I’d like to start with a bit of context as to what might be going on here.

At a high level, many of us believe that the IT business is going through an important structural shift, to a world where the majority of IT is delivered as a dynamic, flexible and cost-effective service.

Historically, IT organizations would acquire chunks of discrete-yet-incompatible hardware and software, put it on their data center floor, and try to manage it all efficiently. That sort of approach is giving way to a pooled approach for infrastructure and software.

“Cloud” comes into the discussion as a convenient short-hand to identify what’s different here: IT is now built differently (dynamic pools of fully virtualized resources), operated differently (as an end-to-end service vs. individual technology specialties) and consumed differently (convenient consumption models for IT users and IT organizations alike).

“Private cloud” generally refers to any cloud that is under IT control: resource usage, service delivery, security and the like. “Public cloud” generally refers to any cloud where most of the control lies with the service provider, and not the consumer.

Many organizations operate at enough scale where cloud economics could easily apply to their internal operations – a cloud of your own, so to speak. Smaller organizations are looking to the new crop of service providers to provide the economics and the flexibility of the cloud, yet retain key aspects of control that they need.

I happen to think that we’ll see a lot of both going forward: cloud-like internal IT operations complemented by compatible service providers with IT still in control of the end result.

Virtualization is the technology that is largely responsible for making this all happen. By decoupling logical functions from physical resources, it makes the entire cloud discussion a realistic goal rather than some sort of unobtainable future state.

What does this mean to clients today?

So much emphasis has been put on lowering the costs of delivering IT services, and – while that is inarguably important – there’s far more to the story to consider.

Cloud delivery models are inherently flexible and dynamic. That means that anyone who uses IT services can generally get what they want far faster, and with less planning or commitment, than before. That sort of speed and agility can be compelling if your business is moving fast – and most of them are!

Finally, there is a class of IT services where external providers have a unique advantage that’s hard to match with internal capabilities – certain industry applications are a good example, as are parts of the security and compliance stack, and data protection more broadly.

Simply put – it’s not just about cheaper IT, in many cases it’s about better IT.

What should clients pay attention to?

For me, it’s all about the journey from where you are to where you’d like to be in the near future. Simply put, “cloud” has redefined where we’d all like the end state to be. Now that there’s general agreement on that, the work lies in the projects and initiatives that get us closer to that destination.

Put another way, the journey is mostly about people and process, and less about enabling technology. As an example, we use a three-part model to describe how we think most IT organizations will progress. I’ve included a high-level graphic that EMC’s IT organization is using to describe the process.

[Graphic: EMC IT’s three-phase journey]

The first phase is virtualizing the non-business-critical applications that IT owns: development and test, infrastructure support apps, and the like.

Here, you’re just introducing new technologies (mostly server virtualization, such as VMware) and not really touching either your operational processes or your relationship with the business. Big savings typically result, but there’s more to be had.

The second phase is virtualizing the business-critical applications that people really care about: you know, the ones with specific names and business owners. Here, you’re not only introducing new technology to protect, secure and manage applications, you’re also fundamentally changing the operational processes you use to run IT. And, as we all know, doing this inevitably impacts roles and responsibilities within the IT organization.

But there’s a new payoff in addition to capex and opex savings: the obvious potential for better-operated IT – availability, protection, security, efficiency and the like.

The third phase involves creating a catalog of services for the business: infrastructure as a service, application platforms as a service, popular applications as a service and – more recently – desktop experiences as a service. Each of these can be provided internally, externally or through a combination of both.

If you think about it, this phase is all about establishing a new relationship between the business and IT. IT is now the internal service provider, and business users learn to be intelligent consumers of those services.
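
To make the idea of a service catalog a little more concrete, here’s a minimal sketch of what a single catalog entry might capture. The service names, tiers and prices are purely illustrative assumptions for this example – they aren’t drawn from any particular vendor’s catalog product.

```python
# A minimal, hypothetical sketch of one entry in an IT service catalog.
# The services, tiers and prices below are made up for illustration only.
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    name: str            # e.g. "Linux VM - Standard"
    category: str        # infrastructure, platform, application, desktop
    service_level: str   # availability tier promised to the business
    unit: str            # what the consumer is charged back on
    monthly_cost: float  # fully loaded price per unit

catalog = [
    CatalogEntry("Linux VM - Standard", "infrastructure", "99.9%",  "per VM",       120.0),
    CatalogEntry("Managed Database",    "platform",       "99.95%", "per instance", 450.0),
    CatalogEntry("Hosted CRM Seat",     "application",    "99.9%",  "per user",      35.0),
    CatalogEntry("Virtual Desktop",     "desktop",        "99.5%",  "per user",      40.0),
]

# A business consumer "shops" the catalog rather than specifying hardware.
for entry in catalog:
    print(f"{entry.name:22} {entry.category:14} {entry.service_level:7} "
          f"{entry.monthly_cost:>7.2f} {entry.unit}")
```

The point isn’t the code itself – it’s that the business consumer picks from named, priced services with stated service levels, rather than specifying servers and storage devices.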

As with any journey, planning and orchestration are essential. The skills required at every juncture are usually not found within the existing IT organization, necessitating the use of one or more external consultancies.

That being said, we are working with many hundreds of organizations who have started down this path. It’s no longer theory – people are making considerable progress, often with spectacular results.

As clients accelerate their adoption of virtualisation towards cloud computing, what are some of the major challenges they are facing, and what steps can they take today to get ready for private and public cloud computing?

A lot can be said on this topic, but in the interest of brevity, I’ll keep it to more manageable sound bites:

• Adopt a “virtualization first” policy. Don’t invest a single dollar in the physical legacy unless all alternatives have been exhausted. Make sure that every piece of technology that lands on the data center floor moves you closer to the goal, and not away from it. No U-turns.

• Create a “next-gen” environment, and point it at your loudest critics. By “next gen”, we’re not only talking integrated virtualized technology, we’re talking service provider processes and procedures that create a cloud-like experience for end users. By “loudest critics”, I’m referring to the people who routinely criticize how IT goes about delivering their services. These can be application developers, business analysis professionals or just about any cadre of demanding knowledge workers in the organization.

• Don’t underestimate the impact of organizational change. The underlying proposition here is to do IT in a new way, using people who’ve made a career of doing it the old way. The larger and more entrenched your IT organization is, the more of a challenge this can be.

• Enlist the business. So many IT organizations have created walls between themselves and the people they serve. Those walls are going to have to come down for meaningful change to occur at an accelerated pace. Invest in the time and resources needed to have ongoing discussions with key business stakeholders before, during and after the journey.

• Focus on what matters. It’s so easy for this discussion to devolve into a beauty pageant around who’s got the best technology, simply because that’s a topic that most IT professionals feel comfortable discussing. Many IT leaders are using pre-integrated IT infrastructure (such as the Vblock from the VCE Coalition) to accelerate the transition, and focus the organization on achieving immediate results.

• Learn from others. If you’re an IT leader, you’re not alone. Just about every IT organization will have to make this sort of transition over the next several years, so it’s easy to find people who are in your situation that you can learn from.

With all this market hype around cloud computing, can you tell us how storage and data management will need to change to support cloud computing?

We’re fond of saying “cloud changes everything”: how IT is built, operated and consumed. Storage and data management topics aren’t immune from this discussion; indeed, most of the anxiety around cloud has to do with security and data management topics.

Like other parts of the technology landscape, storage has to become fully virtualized alongside servers, networks and other components. It too has to dynamically deliver a wide range of service levels (and cost points) from the same underlying infrastructure.

Storage has to be increasingly managed in concert with other technologies, and less as a standalone discipline. New control planes need to be established to make sure that information is consistently protected and secured even where it might be moving from place to place.

Using EMC as an example, you can see those themes already evident in our portfolio. Storage virtualization is now the de facto approach within individual arrays, and increasingly across multiple arrays in potentially multiple locations. Auto-tiering technologies (such as EMC’s FAST) automatically deliver a wide range of service levels (and new cost points) from the same shared storage infrastructure.
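
For those who like to see the mechanics, here’s a deliberately simplified sketch of the general idea behind auto-tiering: watch how busy each slice of data is, and keep the busiest slices on the fastest (most expensive) tier. The tier names, capacities and the greedy placement below are assumptions made up for illustration – this is not EMC FAST’s actual algorithm.

```python
# Toy illustration of automated storage tiering: hottest data lands on the
# fastest tier. Tier names, capacities and thresholds are illustrative only.

TIERS = ["flash", "fibre_channel", "sata"]                 # fastest first
CAPACITY = {"flash": 2, "fibre_channel": 4, "sata": 10}    # slots per tier (toy numbers)

# (extent_id, I/Os observed in the last measurement window)
extents = [("e1", 900), ("e2", 20), ("e3", 750), ("e4", 5),
           ("e5", 300), ("e6", 60), ("e7", 10), ("e8", 450)]

def place_extents(extents):
    """Greedy placement: hottest extents fill the fastest tier first."""
    placement = {}
    free = dict(CAPACITY)
    for extent_id, heat in sorted(extents, key=lambda e: e[1], reverse=True):
        for tier in TIERS:
            if free[tier] > 0:
                placement[extent_id] = tier
                free[tier] -= 1
                break
    return placement

for extent_id, tier in place_extents(extents).items():
    print(f"{extent_id} -> {tier}")
```

Real implementations obviously track far more extents, move data incrementally and weigh the cost of relocation – but the underlying “hot data on fast media” principle is the same.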

New storage management models increasingly coordinate with the server virtualization layer (think VMware’s vCenter) and the newer crop of integrated infrastructure managers (think EMC Ionix UIM). And not only are there new tools for protecting and securing data, but new ways of managing them as an end-to-end service – whether implemented in the data center, or through a compatible service provider.

One thing is for sure: there won’t be any less information to deal with in this new world, and information won’t be any less important.

Virtual storage is fast becoming a reality with the release of VPLEX from EMC – can you talk about why VPLEX is so revolutionary in the storage market?

One key aspect of cloud is the ability to progressively aggregate larger and larger pools of resources that you can use dynamically: primarily servers and storage. The larger the dynamic pool, the better – generally speaking. You can run more efficiently, you can respond more quickly to changing conditions, and you can protect applications and data more effectively.

From a storage perspective, we’ve done a good job at creating these dynamic storage pools within a single storage frame. I’d use Symmetrix VMAX as an example of this. A single VMAX can be petabytes in size, creating a considerable dynamic and virtualized pool of resources.

But that’s not enough. We need to be able to create pools across multiple storage frames from potentially multiple vendors, and – ideally – allow these pools to extend over progressively larger and larger distances.

That’s what VPLEX is about – creating these dynamic, flexible and non-disruptive pools of storage resources that can incorporate multiple frames from multiple vendors, and – over the next year or so – progressively incorporate storage resources in multiple locations separated by considerable distance.

Taken in its entirety, VPLEX works alongside server and network virtualization to create very appealing clouds that can not only ignore geographical distance, but leverage it in interesting ways – for example, dynamically relocating applications and information closer to users to improve the user experience, or taking advantage of cost-effective capacity in different locations.

More pragmatically, customers are using VPLEX today to simply move storage workloads around from device to device without having to take downtime or otherwise impact users. It’s pretty exciting stuff.
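
To give a feel for why such moves don’t require downtime, here’s a toy sketch of the general “mirror, synchronize, cut over” pattern that most non-disruptive migrations follow. The names and trivial data structures are hypothetical – this is the shape of the idea, not how VPLEX itself is implemented.

```python
# Toy sketch of the mirror-and-cutover pattern behind non-disruptive
# storage migration. Names and structures are hypothetical.

class Volume:
    def __init__(self, name, blocks):
        self.name = name
        self.blocks = list(blocks)

def migrate(app_path, source, target):
    """Move an application's storage from source to target with no downtime."""
    # 1. Start mirroring: every new write goes to both devices (elided here).
    # 2. Copy existing data in the background while the application keeps running.
    target.blocks = list(source.blocks)
    # 3. Once the copies are identical, atomically switch the application's
    #    path to the new device and retire the old one.
    assert target.blocks == source.blocks
    app_path["device"] = target.name
    return app_path

old = Volume("array_A_lun_7", ["b0", "b1", "b2"])
new = Volume("array_B_lun_3", [])
path = {"application": "billing", "device": old.name}
print(migrate(path, old, new))   # the application never sees an outage
```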

Can you talk about storage in the cloud and how real it is for clients today?

It’s more real than many people imagine, once you start focusing on real-world use cases rather than generic theory.

For example, doing backup and other forms of data protection to external service providers is quite popular, simply because the service provider is physically and logically separated from the primary application location. Technically speaking, that’s “storage in the cloud”, although most people don’t think about it that way. Many of EMC’s data protection products (Mozy, Avamar, Data Domain, Data Protection Advisor and others) are being used for this purpose today.

A related popular application is long-term information repositories and archiving, especially where compliance requirements apply. We’ve been sending physical information records off-site for decades; digital records aren’t really all that different if you think about it. In the EMC portfolio, that would be Centera, the newer SourceOne information governance platforms and portions of the Documentum portfolio.

There’s a growing requirement to gather and dynamically distribute information content globally for many customers. They need multiple points of presence around the globe, managed as an integrated service that conforms to policy objectives. Many of these “storage clouds” are being built with EMC’s Atmos platform.

Those are just a few examples where there’s a standalone storage discussion. Far more frequently, the storage is associated with an application, and both are run externally: think SaaS, hosting, outsourcing, etc.

Taken that way, there’s already a considerable amount of storage “in the cloud”!

Security concerns come up frequently as an objection to cloud computing – what is your stance on this and is it really as big a concern as some believe?

Being secure in the cloud isn’t particularly difficult. However, changing established policy and procedure to incorporate this new IT delivery model can be very difficult indeed.

From a pure technology perspective, it’s fairly easy to show that fully virtualized cloud environments can be far more secure and compliant than the traditional physical environments they replace. It’s like night and day in terms of new underlying capabilities.

That being said, people’s perceptions tend to lag the reality, often by many years. The good news is that we’ve recently had good success by focusing the security and compliance teams on the high level GRC management of the IT environment (cloud or otherwise) as the required control point for security and compliance, using the EMC RSA Archer environment.

It’s not surprising that giving people an integrated dashboard to monitor and control risk can make things move forward in a much more decisive manner. As these high-level GRC dashboards proliferate, I think we’ll see much less resistance going forward.

Compliance and data protection are high on the agenda for clients – when people move to cloud computing, what needs to be taken into account to ensure compliance and data protection?

It’s actually even trickier than the question implies – the rules change all the time, often without prior notice!

As a result, we’re strong advocates of policy definition and enforcement capabilities that are built in, rather than bolted on later. From an architecture perspective, virtualized entities (servers, storage, applications, databases, etc.) are relatively self-contained, making them easy to tag and identify from a compliance or data protection perspective.

This tagging or labelling capability makes a fully automated approach far more implementable: as policies are established for labelled objects (for example, information that can’t leave Belgium, or that must always be remotely mirrored at least 200 km away), policy enforcement is automatic. Compliance with existing policies can be monitored in real time, non-compliant pieces can be easily identified and remediated, and the inevitable new policies can be implemented with a minimum of drama.
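
As a simple illustration of what tag-driven enforcement looks like, here’s a small sketch that checks the two example policies above – data that must stay in Belgium, and data that must be mirrored at least 200 km away. The object names, tag format and site list are assumptions invented for this example, not any particular product’s policy language.

```python
# Hypothetical sketch of tag-driven policy checking. Dataset names, tags
# and the site list are made up for illustration.

datasets = [
    {"name": "hr_records", "tags": {"residency": "BE"}, "location": "Brussels",
     "mirror_distance_km": 250},
    {"name": "web_logs", "tags": {}, "location": "Dublin",
     "mirror_distance_km": 50},
    {"name": "payroll_db", "tags": {"residency": "BE", "min_mirror_km": 200},
     "location": "Amsterdam", "mirror_distance_km": 300},
]

BELGIAN_SITES = {"Brussels", "Antwerp", "Ghent"}

def violations(ds):
    """Return the policy violations for one tagged dataset."""
    problems = []
    if ds["tags"].get("residency") == "BE" and ds["location"] not in BELGIAN_SITES:
        problems.append("data has left Belgium")
    required = ds["tags"].get("min_mirror_km")
    if required and ds["mirror_distance_km"] < required:
        problems.append(f"mirror closer than {required} km")
    return problems

# Continuous compliance monitoring is then just a loop over tagged objects.
for ds in datasets:
    issues = violations(ds)
    print(f"{ds['name']:12} {'OK' if not issues else '; '.join(issues)}")
```

Once objects carry their policies as tags, continuous compliance monitoring really is just a loop like this one, run at scale.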

Harking back to a previous discussion around security, it’s not hard to show that compliance and data protection capabilities can be far better in a fully virtualized cloud model -- it’s more a matter of changing established perceptions and protocols.

As the world comes to embrace all that is ‘cloud’ where do you see the market going and how quickly?

I don’t think any part of the IT landscape will emerge unscathed from this industry transition. Perhaps the last transition of similar magnitude was the wholesale adoption of the internet – and we’re still feeling the repercussions of that one!

Vendors will need to build products that can be delivered as a service, rather than standalone physical entities. These products and technologies will increasingly go to market through a new generation of solution and service providers – as well as IT organizations that think of themselves as internal service providers.

Value-added solution providers (such as Dimension Data, now acquired by NTT) will be well served by helping customers manage the transition to the new model in an intelligent and rational fashion, as well as providing not only technology, but also the new breed of services that will be in such demand.

IT organizations will find themselves strongly encouraged to invest in the new model, while being obligated to keep the old model running during the transition. We believe that IT organizations will come to look very different in the next few years – new skills, new roles, and new relationships with the business.

Consumers of IT – whether they be individuals, or in an organizational setting – are increasingly starting to demand a “cloud experience” – a catalog of useful services available on demand. And there’s no reason to believe that they won’t eventually get what they want – if not from the established IT organization, then certainly elsewhere.

In some sense, all of this IT change really isn’t that different to what’s already happened in other forms of infrastructure we take for granted: power, communications, transportation, facilities, manufacturing, labor forces, etc.

Taken that way, the transition is somewhat inevitable.

With anything new there are winners and losers – what are the winners going to look like in cloud computing, both clients and IT providers?

Any time there’s a big change in an industry, the winners tend to be those that recognize the change, and start accelerating their efforts towards the new model. Conversely, those that resist the change without good reason tend to suffer the consequences.

I think this statement is broadly applicable to just about any industry (not just IT). There’s little debate remaining that cloud concepts will significantly change the IT industry for just about every participant: vendor, solution provider, IT organization and user.

Any closing thoughts or comments?

Personally, I think Dimension Data is uniquely well positioned to serve their clients well through this transition.

The pieces are pretty easy to see – Dimension Data has access to a broad range of relevant technologies (including EMC’s!), they have the intellectual prowess to develop the new services and skills that customers will demand, they have the important “trusted advisor” relationship with their clients, and – now as part of NTT – they’ve got access to a world-class service provider infrastructure as part of their offer.

Indeed, the business of IT really hasn’t changed – but how we go about doing it is certainly changing. And I think Dimension Data serves as the prototypical example of what customers are going to want from their partners going forward.

Thanks for the opportunity to share a few thoughts!
