As an IT architect over the past 15 years, I’ve gathered a lot of knowledge and experience. I’ve read plenty of IT books, blogs, and product documentation with the aim of keeping up to date. Operationally, I’ve seen the inside of plenty of Data Centres, been involved in deployments of many products, and worked through the often difficult integration of new tech into old-world systems. Through all of it, the industry that has given me so much offers one constant reminder: change, continual change.
Eventually all new tech becomes old tech, made redundant by constant evolution, and nowhere more so than the beloved “box”, or server as most call it. There are just so many obstacles and touch points in getting a server (in whatever form factor) deployed into your shiny rack, row, or entire DC: the cabling, the network, the storage, the OS. We’ve become pretty good at all of it over time, and that’s before I even mention the quoting, ordering, and shipping fun that goes on top. The writing is now on the wall, and many of my peers would say, “Hey mate… it’s been on the wall for 10+ years, surely you’ve noticed.” But I wasn’t witnessing it across the board; I was seeing limited deployments or adoption from existing enterprises.
I spend my working week talking to enterprise customers, and in just about every one of these meetings the customers are looking at ways to move their data out of their existing Data Centre and asking how they can move it to the cloud. Data is the lifeblood of these organisations, and it’s that data which they value, not the brand of tin or colour of box where it runs or is stored. Customers are now focussed on how they manage their data over its entire lifecycle, from birth to grave, but with a view to leveraging that data for all the secrets it may hold.
So with that discussion point seemingly always on the table over the last few years, I thought I would discuss more on the ‘how’ of enterprises starting to leverage the cloud with their data, so that you too can spend less time architecting racks and rows of blades to avoid hot spots in your cold aisle.

1. Non-production systems, or Test/Dev/UAT, in the cloud are a really great way to leverage the benefits of the cloud while not putting all your production eggs in one basket. It also serves as a highway on-ramp for your development teams to implement DevOps processes and start building CI/CD pipelines for code releases. Rather than relying on shared access and manual processes in your Data Centre or premises, the development team can leverage API-orchestrated builds and deployments of code and applications onto infrastructure on demand. I would say this use case has the most “bang for buck”, as it moves the organisation forward on many levels while also maturing your operations and your code deployment frequency. Not to forget that reporting and analytics workloads can also begin to leverage elastic compute to handle peak periods efficiently.

2. Using the cloud for offsite backups is a very straightforward solution for most customers to implement, and can normally be in place overnight. It offers an immediate reduction in risk, even if you have two or more data centres today, because that backup data is now held in a third location, spreading the risk across a different region. In the longer term, most customers embark upon this solution with the main aim of reducing their Data Centre footprint or consumption. An additional benefit of replicating backup data to the cloud is being able to test the migration of each system to the cloud, from the backed-up copy already there, prior to the actual migration event.

3. Long-term retention in the cloud should be the simplest solution of them all to implement; at last you can stop storing years of tapes in your data centre store rooms! Yet it’s often a difficult decision to make due to vendor lock-in when retaining long-term data in a cloud object storage vault, as you need to consider how you will restore that data in the years to come. To implement the solution you might need a compute instance running full-time in the cloud, and you will almost always need a media server licensed for the version you wrote the data in, from now until the end of the retention period. That’s a lot of risk, and maybe too many reasons not to leverage the cloud for your long-term retention needs.
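To make the offsite backup use case concrete, here is a minimal sketch of staging a nightly backup set, recording a checksum, and replicating it to cloud object storage. The paths, bucket name, and backup contents are all hypothetical, and the upload lines (left commented, as they need credentials) assume the AWS CLI’s `s3 cp` command; most backup products also offer a native cloud target that replaces this step entirely.

```shell
#!/bin/sh
# Sketch only: stage a nightly backup archive, checksum it,
# then replicate it offsite. Paths and bucket are hypothetical.
set -eu

STAMP=$(date +%F)
STAGE=$(mktemp -d)
ARCHIVE="/tmp/backup-$STAMP.tar.gz"

# Stand-in for last night's backup set (e.g. database dumps).
echo "example db dump" > "$STAGE/db.dump"

# Package and checksum the set so a restore can be verified later.
tar -czf "$ARCHIVE" -C "$STAGE" .
sha256sum "$ARCHIVE" > "$ARCHIVE.sha256"

# Replicate to a third location (commented out: needs credentials).
# Infrequent-access storage classes suit backups that are rarely read.
# aws s3 cp "$ARCHIVE" "s3://example-offsite-backups/$STAMP/" --storage-class STANDARD_IA
# aws s3 cp "$ARCHIVE.sha256" "s3://example-offsite-backups/$STAMP/"

echo "staged $ARCHIVE"
```

Run nightly from cron, this also gives you the copy in the cloud from which a later migration can be rehearsed, as described above.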
The good news is that there are many mature solutions to help you navigate your use of public clouds today, and undoubtedly there will be more in the years to come; it’s just a matter of when the time is right for you. But don’t leave it too long. In the years to come, do you really think your enterprise can sustain the lead times for tin to be shipped, racked, stacked, connected and provisioned before it’s useful, while your competitors have 100 systems provisioned and ready to deliver data to the business before you’ve even faxed the order to your supplier?
From my days managing Novell servers 20-odd years ago, the skill of troubleshooting an abend is long forgotten, but I can guarantee that a simpler and more agile business solution will make data centre management less of a pain, as there will be far less tin to manage inside.
To the cloud…