CEO of Rookout. Has led data-driven businesses, products and R&D teams over the last two decades, from startups to government organizations.
Cloud. Microservices. Containers. Serverless.
These are buzzwords everyone in the software industry has become familiar with. That’s not even getting into the world of “machine learning” and “AIOps” (artificial intelligence for IT operations). While it’s true that many cutting-edge companies, particularly in the tech industry, are adopting modern software architectures and methodologies, the fact is that the vast majority of companies are still running legacy applications responsible for millions, if not billions, of dollars in revenue.
The pandemic has shown just how much we rely on these aging legacy IT systems. According to a recent report from AppDynamics, 66% of IT professionals say that “the pandemic has exposed weaknesses in their digital strategy, driving an urgent need to push through initiatives which were once a part of multiyear digital transformation programs.” We can hope this moment becomes a forcing function for businesses to reflect and modernize, but history shows that change is hard: if things return to normal, so will old processes.
Governments, banks, airlines — nearly every major industry is dealing with aging IT, hardware and legacy code that make moving fast impossible, resolving issues difficult and troubleshooting applications expensive. These old systems are more prone to bugs, cause more outages and waste software engineers’ time.
One of the major reasons many of these organizations are slow to modernize is that they don’t want to jeopardize the stability of their core applications. Ask their engineers, and many will tell you they wish they could snap their fingers and make those applications cloud-native, but the fact is that migrating to new technologies is cumbersome and often messy. While legacy code is a pain, it’s often responsible for the very revenue the business depends on.