Capacity planning and IT operations trends for 2016

The IT industry is heading for significant growth in 2016, with the pace of technological change continuing to intensify and an ever-increasing deluge of data. As organisations look for ways to transform their service management capabilities and deliver against rising service level expectations in this landscape, capacity planning is enjoying a resurgence of focus.

Underlining this focus, earlier this year Gartner stated in its ‘Market Guide for Capacity Management Tools’ report, published on January 30, 2015, that “through 2018, more than 30 per cent of enterprises will use IT infrastructure capacity management tools for their critical infrastructures to gain competitive advantage, up from less than five per cent in 2014.”

So capacity planning is clearly an important business focus moving forward and one the channel needs to be aware of. To capitalise on the opportunities that this presents for channel partners, I wanted to outline the top four emerging technology trends that I believe are likely to affect IT operations and capacity planning efforts in the year ahead.

Each of these predicted trends stems from the ongoing growth in the complexity of IT infrastructures, driven by factors including the continuing expansion of big data, new technologies such as wearables and the Internet of Things (IoT), and the mix of physical, virtual and cloud-based resources.

Capacity management will continue to loom large on the IT agenda

In 2015 we predicted that capacity management would be back on the agenda as a way to cut through the complexity of managing the IT environment. We were right, but this is a trend that will carry on into 2016 and beyond as complexity and automation continue to grow, and accurate visibility becomes even more essential.

Modern capacity planning tools will continue to adopt the latest advances in new techniques, such as predictive analytics, and focus more on reporting in metrics and a language the business understands. Businesses do not typically talk in terms of CPU, memory, GHz and GBs, so IT teams need to be able to converse and report in terms of the IT impact of adding 'x' new customers, or a 'y' per cent increase in expected payments or sales.

The challenge, as it always has been, is the translation layer – understanding the correlation between customers, transactions and compute resources. The latest advances in capacity planning software will enable this and transform what is currently a very challenging and manual process.
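
To make that translation layer concrete, below is a minimal sketch in Python of the kind of model a capacity planner might start with: a simple linear fit relating a business metric (transactions per hour) to an infrastructure metric (CPU utilisation). The figures, and the assumption that the relationship is linear, are purely illustrative; real capacity planning tools use far richer models.

    # A minimal sketch of the "translation layer": fit a simple linear
    # model relating a business metric (transactions per hour) to an
    # infrastructure metric (CPU utilisation). All figures here are
    # illustrative, not real measurements.
    import numpy as np

    # Historical observations: transaction volume alongside average CPU use.
    transactions = np.array([1000, 2000, 3000, 4000, 5000])
    cpu_percent = np.array([12.0, 21.5, 32.0, 41.0, 52.5])

    # Least-squares fit of CPU% as a linear function of transaction volume.
    slope, intercept = np.polyfit(transactions, cpu_percent, 1)

    def predicted_cpu(tx_per_hour):
        """Estimate CPU utilisation for a given transaction volume."""
        return slope * tx_per_hour + intercept

    # Answer the business question: what would a 20 per cent increase
    # in transactions mean for the infrastructure?
    current = 5000
    print("CPU at %d tx/h: %.1f%%" % (current, predicted_cpu(current)))
    print("CPU at %d tx/h: %.1f%%" % (current * 1.2, predicted_cpu(current * 1.2)))

Even a crude model like this lets IT answer questions posed in the business's own terms, rather than in GHz and GBs.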

Virtualisation of applications will become the next big thing

There seems to be no end in sight to the growth in complexity of the typical IT environment, driving organisations to seek ways of achieving high-performing and agile infrastructures. To this end, we have already seen the virtualisation of hardware components, but it will be the virtualisation of applications, allowing them to run regardless of the underlying platform, that will be another important trend in 2016.

Indeed, there are already service providers out there who are able to automatically package any Windows application to remove platform dependencies. This virtualisation of applications enables centralised software distribution and configuration management, as well as access and entitlement, which reduces on-going management costs and simplifies the IT environment.

All this is helped by Docker, the free open-source software that lets developers easily take an app from creation on a PC, through testing and into production in the cloud, without losing code or collecting bugs along the way. Docker, whose containers have been likened to shipping containers, has been downloaded more than 300 million times, with its commercial version, introduced in June, used by around 10,000 groups.

So it’s clear that virtualisation is the future, but when planning a virtualisation programme I would recommend letting your data inform you. Predictive analytics has a key role to play here, as it can help to identify the relationship between expected user and transaction volumes and the capacity required, much as in the sketch above.

Hybrid cloud will become the norm

The movement of data from on-premise to the cloud, in order to exploit the cloud’s flexibility and scalability, remains an important trend and we expect to see a continued move in that direction.

However, there is still an element of nervousness associated with cloud-based services, and in many industries companies are still at the experimentation stage. For some, the ability to rapidly spin up additional compute power is fundamental to their business model, and for start-ups the cloud is still the default option.

We are, however, starting to see some early cloud adopters bring services back from the cloud as they come to fully understand the cost implications and believe they can run the services more efficiently within their own data centres. Meanwhile, nervousness around data security, together with operational concerns about span of control and added layers of complexity, makes cloud less attractive to many industries, especially those in heavily regulated sectors.

With this in mind, 2016 will see hybrid cloud becoming the preferred business choice due to its ability to provide the best of both worlds – combining physical onsite infrastructure with both public and private hosted cloud services.

It makes sense that a mixture of platforms works best for most organisations: the public cloud lends itself to the hosting of predictable services and of systems that might need to be rapidly scaled up or down, or even require cloud bursting at times of peak demand. Conversely, private clouds suit scenarios where regulations, latency and security demand that workloads are kept on-site; and on-premise infrastructure is still the preferred choice for the most sensitive data.

The rise of SIAM

Another continuing trend for 2016 reflects the challenge of remaining in control when data is held across the kinds of hybrid environments mentioned above and large parts of the environment are outsourced, with organisations relying on a myriad of service providers. In this kind of multi-service-provider environment it is particularly challenging to create a unified view of the demand for and consumption of IT services, making it easy for inefficiencies and poor performance to emerge and much harder to coordinate change activities.

Against this backdrop, SIAM (service integration and management) is coming to the fore – adding a thin layer of governance over the top of such environments, with the aim of creating a framework that engenders collaboration, co-ordination and communication between all parties involved. This enables organisations to manage their service providers in a consistent and efficient way, ensuring that services perform well, that they meet users’ needs and that only the services consumed are paid for.

For the SIAM model to work, IT needs to find a way of bringing together a vast amount of data from many sources. Services need to be well defined with clearly understood boundaries between business units and service providers. Each party needs to know what they are accountable for and how they fit into the overall IT provision landscape.

The challenge is in aggregating all this data into a cohesive view that underpins effective collaboration across multiple stakeholders – enabling the current state to be clearly understood and future strategy to be catered for. When this is achieved, areas of over- or under-provisioning can be quickly identified, and complex changes touching multiple systems and suppliers can be effectively co-ordinated. Capacity planning has a huge role to play here: it provides the mechanism for organisations to predict the impact that complex changes in business demand will have on IT supply – and vice versa. It can provide that single pane of glass and help to assess and orchestrate what needs to happen across multiple systems and stakeholders to respond to change.
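
As a toy illustration of that aggregation step, the hypothetical Python sketch below rolls usage records from several imaginary providers up into a single per-service view. In practice every provider reports in its own format, so normalising the data into a common shape is where most of the effort lies; the provider names, record fields and figures here are all invented for illustration.

    # A hypothetical sketch of the SIAM aggregation problem: combining
    # consumption records from several service providers into one
    # unified view per business service. Provider names, record fields
    # and figures are all illustrative.
    from collections import defaultdict

    # Usage records as each provider might report them, already
    # normalised here to a common shape for simplicity.
    records = [
        {"provider": "HostingCo", "service": "payments",  "cpu_hours": 410, "cost": 820.0},
        {"provider": "CloudA",    "service": "payments",  "cpu_hours": 190, "cost": 510.0},
        {"provider": "CloudA",    "service": "reporting", "cpu_hours": 75,  "cost": 160.0},
        {"provider": "HostingCo", "service": "reporting", "cpu_hours": 120, "cost": 230.0},
    ]

    # Roll the records up by business service, keeping a per-provider
    # trail so over- or under-provisioning can be traced back.
    summary = defaultdict(lambda: {"cpu_hours": 0, "cost": 0.0, "providers": set()})
    for r in records:
        s = summary[r["service"]]
        s["cpu_hours"] += r["cpu_hours"]
        s["cost"] += r["cost"]
        s["providers"].add(r["provider"])

    for service, s in sorted(summary.items()):
        print("%s: %d CPU-hours, %.2f, via %s" %
              (service, s["cpu_hours"], s["cost"], ", ".join(sorted(s["providers"]))))

Once consumption is visible per service rather than per contract, the single-pane-of-glass view described above becomes possible.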

In conclusion – getting ahead of the game

In increasingly complex IT environments, where the pace of change (both business and technological) keeps accelerating and the volume, velocity and variety of data keep growing, smart organisations are getting on the front foot by strengthening their ability to plan ahead.

By implementing capacity planning, it is possible to extend your planning horizon and stay ahead of the game. Adopting technologies like predictive analytics and statistical modelling enables IT departments to de-risk IT service performance and plan for change, no matter what the next 12 months may bring.
