I have been working on-and-off with "legacy" systems for decades. The exact definition of such a thing may come across as vague and ill-defined, but that's ok. The next generations of software developers, data engineers, data scientists and, in fact, anyone working in tech will present you with this idea, and then you'll have to work out how real their perspective is.
For any twenty- or thirty-something in tech these days, anything created before they were born or before they started their career is likely labeled legacy. It's a fair perspective. Any system has successors. Yet if something "old" is still clicking and whirring in the background as a key piece of technology holding business data together, it might reasonably be considered part of some legacy.
The term is loaded though.
For those who haven't quite retired yet (myself included), legacy connotes some sort of inflexible, unbendable technology that cannot be modernized or made more contemporary. In some industries or contexts though, legacy implies heritage, endurance and resilience. So which is it? Both views have their merits, but they carry very different tones: one very negative, the other almost revered.
In my early working years, I had the pleasure of seeing the rise of the PC, at a time when upstart technologies were trying to bring personal computing into the home and the workplace. The idea of computing at home was for hobbyists and dismissed by the likes of IBM. Computing at work often involved being bound to a desk with a 30 kg beige-and-brown CRT "green screen" dumb terminal, with a keyboard that often weighed as much as, or more than, the heaviest of modern-day laptops.
One of the main characteristics of these systems was that they were pretty consistent in the way they operated. Yes, they were limited, especially in terms of overall functionality, but for the most part the journeys were constrained and the results and behaviours were consistent. Changes to these systems seemed to move glacially. The whole idea of even quarterly software updates, for example, would have been somewhat of a novelty. Those that had in-house software development teams laughably took the gestation period of a human baby to get pretty much "anything" done. Even bugs, once detected and root-cause analysed, would often take months to be remediated, not because they were complex to solve, but because of the approaches to software change and update.
I suppose, in some industries, the technology was a bit more dynamic, but the friends and colleagues I worked with in other industry sectors certainly didn't communicate that there was a high velocity of change in these systems. Many of them were mainframe- and mini-mainframe-based, often serviced by one or more of the big tech brands that dominated in those days.
A characteristic of modern systems and modern practices, then, is probably encapsulated in the idea of handling greater complexity: dealing with higher volumes of data and the need for greater agility. The need for integrated solutions, for example, has never pressed harder than it does today. We need, and in fact demand, interconnectedness, and we need to be able to trace numerous golden threads of system interoperability and application interdependence at the data level, at unprecedented scale.
In the past we could get away with manual curation of all kinds of things, including describing what we had and where it was, but the volumes, complexities and dependencies of systems today make the whole idea of doing these things manually seem futile and fraught with the risk of being incomplete and quickly out of date. Automation is now more than a buzzword, it's table stakes, and many will rightly assume that automation has already been considered in the design.
Legacy Systems and Their Limitations
As anyone who regularly uses office applications will attest, even a cursory look at your presentations, documents, spreadsheets, folders and shared content demonstrates just how quickly things can get out of hand.
Unless you are particularly OCD, perhaps, you likely have just one steaming heap of documents that you're hoping your operating system or cloud provider is able to adequately index for a random search.
If not, you're bound to your naming conventions (if you have any), the recency timestamps, or some other criteria. In some respects, even these aspects make all this smell suspiciously like a "legacy problem".
The growth of interest and focus in modern data management practices means that we need to consider how business and operational demands are reshaping the future of data governance.
Despite all this, I still don't have a good definition of what a "Legacy System" really is. The general perspective is that it is something that predates what you work with on a daily basis, and that seems as good a definition as any. We have to acknowledge, though, that legacy systems remain entrenched as the backbone of a great many organizations' data management strategies. The technology may have advanced and data volumes may have surged, but many legacy systems endure, despite their inadequacies for contemporary business needs.
Inability to Handle Modern Data Complexity
One of the most significant challenges posed by legacy data systems is often their inability to cope with data volumes and the inherent complexities of contemporary data. Pick your favourite system and consider how well it handles those documents I described earlier, either as documents or as links to those documents in some cloud repository.
Many of the solutions that people think of as legacy were designed more than a generation ago, when technology choices were more limited, there was less globalization, and we were still weaning ourselves off paper-based, manually routed content and data. Often the data itself was conveniently structured with a prescribed meta model and stored in relational databases. These days, businesses face a deluge of new data types: structured, semi-structured, and unstructured, emanating from an ever-growing number of sources including social media, IoT, and applications.
Legacy transaction and master data systems now have to manage tens to hundreds of millions of records spread across function- and form-specific siloed systems. This fragmentation, in turn, leads to inconsistencies in data content, quality and reliability. All this makes it difficult for organizations to know what to keep and what to discard, what to pay attention to and what to ignore, what to use for action and what to simply consider supplementary.
If there is enough metadata to describe all these systems, we may be lucky enough to index it and make it findable, assuming we know what to look for. The full or even partial adoption of the hybrid cloud has simply perpetuated the distributed silos problem. Now, instead of discrete applications or departments acting as data fiefdoms, we have the triple threat of unindexed data in legacy systems, data in local system stores, and data in cloud systems. Any technical or non-technical user understandably finds it challenging to know what to look for and what to care about, because there are very few fully integrated, seamless platforms that describe everything in a logical and accessible way.
Rigidity and Lack of Agility
Legacy and traditional systems are also characterized by some inherent rigidity. Implementing or running them often involves elongated processes that can take months or even years, and requires regimented discipline for daily operations. New initiatives hooked to legacy applications are typically expensive and have high failure rates, due to their inherent complexity and the need for extensive customization to integrate with more contemporary technologies.
For example, prominent ERP software company SAP announced in February 2020 that it would provide mainstream maintenance for core applications of SAP Business Suite 7 (ECC) until the end of 2027.
But according to The Register, as recently as June 2024, representatives of DACH customers suggested that they don't believe they will even meet the 2030 cut-off, when extended support ends.
Research by DSAG, which represents SAP customers in the DACH region, found that 68% still use the "legacy" platform, with 22% suggesting that SAP ECC/Business Suite influenced their SAP investment strategy for 2024. Many are reluctant to upgrade because they have invested so heavily in customizations. All this makes for some tough calls.
The rigidity of the legacy system, compounded by the reticence of customers to upgrade, presents a challenge in understanding just how responsive any business can be to changing business needs. SAP wants you to use shinier, glossier versions of its technology in order to maintain a good relationship with you and to ensure that it can continue adequately supporting your business into the future, but if you won't upgrade, what are they to do?
Modern digital economies expect businesses to be able to pivot quickly in response to market trends or customer demands. Being stuck on legacy solutions may be holding them back. Companies running on legacy may need significant time and resources to adapt further or scale to meet new expectations. Apparent system inflexibility will likely hinder innovation and limit one's ability to compete effectively.
Unification is a possible answer
If you recognise and acknowledge these limitations, then you're likely already shifting away from the traditional siloed approaches to data management towards more unified platforms.
Integrated solutions like SAP provide a holistic view of organizational data, and they have been paramount for years. But even here, not all the data is held in these gigantic systems. SAP would segment the platforms by business process: Order to Cash, Procure to Pay, Hire to Retire and so on. But businesses are multidimensional, and business processes aren't necessarily the way a business thinks about its data.
A multinational running on SAP may think about its data and systems in a very regional fashion, or by a specific industry segment like B2C or B2B; it may fragment further depending on how it is set up. Channel-focused businesses, for example, are not unusual: eCommerce vs retail stores, D2C… The combinations and permutations are seemingly limitless. Yet each of these areas is likely just another data silo.
A break with data silos fosters cross-divisional collaboration, allowing the business to enhance decision-making and improve overall operational efficiency. ERP doesn't necessarily promote this kind of thinking. Such a shift is not just reactive to the shortcomings of legacy systems; it is also driven by a broader trend towards digital transformation.
In commercial banking, for example, thinking through the needs and wants of the different regional representations, the in-market segments and then the portfolio partitions means that some data is common and some is not, but most importantly, all of the data likely needs to be in one unifying repository and definitely needs to be handled in a consistent, aligned, compliant and unified way. Through the lens of risk and compliance, everyone's behaviours and data are viewed in the same way, irrespective of where their data is held and who or what it relates to.
Incorporating modern capabilities like artificial intelligence (AI), machine learning (ML), and big data analytics requires solutions that can support these initiatives effectively, and it is a popular topic of discussion. You can pooh-pooh AI and ML as fads with relatively limited real applicability and value right now, but like yesteryear's personal computers and mobile phones, these kinds of things have an insidious way of permeating our daily lives in ways we may never have considered, and before we know it, we are hooked on them as essential capabilities for getting through our day.
Lessons in retail
In modern retail in the developed world, for example, every product has a barcode and every barcode is attached to a master data record entry that is tied to a cost and pricing profile.
When you check out at the grocery store, the barcode is a key to the record in the point-of-sale system and pricing engines, and that's the price you see on the checkout receipt. Just 25 years ago, stores were still using pricing "guns" to put stickers on merchandise, something that persists in many developing countries to this day. You might laugh, but in times of high inflation it was not uncommon for consumers to scratch about on the supermarket shelves looking for older stock with the old price.
Sticker-based pricing may still prevail in places, but often the checkout process is cashless, auto-reconciling checkout and inventory, and above all auto-pricing, all with the beep of a barcode scanner.
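To make that lookup concrete, here is a minimal sketch of barcode-keyed pricing, assuming an in-memory price book; the barcodes, descriptions and prices are hypothetical illustration data, and a real point-of-sale system would of course query a master data service rather than a local dictionary.

```python
# A minimal sketch of barcode-keyed pricing lookup at checkout.
# The barcodes and prices below are hypothetical illustration data.
PRICE_BOOK = {
    "4006381333931": {"description": "Ballpoint pen", "price": 1.49},
    "5000112637922": {"description": "Soft drink 330ml", "price": 0.99},
}

def scan(barcode: str) -> float:
    """Resolve a scanned barcode to a price via its master data record."""
    record = PRICE_BOOK.get(barcode)
    if record is None:
        # The familiar checkout failure: no master data entry for the item.
        raise KeyError(f"No master data entry for barcode {barcode}")
    print(f"{record['description']:<20} {record['price']:.2f}")
    return record["price"]

total = scan("4006381333931") + scan("5000112637922")
print(f"Total: {total:.2f}")
```

The barcode is nothing more than a foreign key; everything the receipt shows hangs off the quality of the master data record it points to.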
As these technologies become more affordable and accessible to businesses of all sizes, even the most cost-conscious, and as their use in all aspects of buying, handling, merchandising and selling grows, the idea of individually priced merchandise will probably disappear altogether. We'll still be frustrated by the missing barcode entry in the database at checkout, or by that grocery item sold by weight that needs its own personal pricing barcode because the checkout doesn't have a scale. This then becomes a legacy problem in itself, where we straddle the old way of doing things and the new.
In much the same way, transitioning from legacy to something more contemporary doesn't mean that an organization has to completely abandon heritage systems, but it does mean that the decision to retain, maintain and extend existing systems should be continuously evaluated. The point here is that once these systems move beyond their "best-by" date, an organization encumbered by them should already have a migration, transition or displacement solution in mind or underway.
This would typically be covered by some sort of digital transformation initiative.
Modern Solutions and Approaches
In stark contrast to legacy systems, modern solutions are typically designed with flexibility and scalability in mind.
One could argue that perhaps there's too much flexibility and scale sometimes, but they do take advantage of contemporary advanced technologies, which means they potentially secure a bit more of a resiliency lifeline.
A lifeline in the sense that you will continue to have software developers available to work on the system, users who actively use it because of its more contemporary look and feel, and a few more serviceable versions before it is surpassed by something newer and shinier, at which point it too becomes classified as "legacy".
Cloud-Native Solutions
One of the most significant advancements in data systems these days is the prevalence of cloud-native solutions: not solutions ported to the cloud, but solutions built from the ground up using a cloud-first design paradigm. I make this distinction because so many cloud offerings are nothing more than "moved" technologies.
Cloud-native systems may use a microservices architecture, a design approach allowing individual components to be developed, deployed, and scaled independently. They may also make use of on-demand "serverless" technologies. By taking advantage of the modularity afforded by microservices, organizations can adapt their data management capabilities relatively quickly in response to changing business requirements, whether through technology switch-outs or incremental additions. The serverless elements provide compute on demand, which in theory means lower operational cost and reduced wastage from overprovisioned, idle infrastructure.
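As an illustration of that modularity, here is a minimal sketch of one independently deployable microservice using Flask; the service name, route and data are hypothetical assumptions, not any particular vendor's design. The point is that each such service owns its own data and can be redeployed or scaled without touching its neighbours.

```python
# A minimal sketch of one independently deployable microservice,
# assuming Flask is installed; all names and fields are hypothetical.
from flask import Flask, jsonify

app = Flask("customer-master-service")

# In a real deployment this would be the service's own datastore,
# not shared with any other service.
CUSTOMERS = {
    "C001": {"name": "Acme GmbH", "region": "DACH"},
}

@app.route("/customers/<customer_id>")
def get_customer(customer_id):
    """Expose this service's slice of the data over a stable contract."""
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(customer)

if __name__ == "__main__":
    # Each service runs, scales and redeploys on its own schedule.
    app.run(port=5001)
```

Swapping out the datastore, or the whole service, changes nothing for its consumers as long as the route contract holds; that is the switch-out flexibility described above.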
Many cloud-native data management solutions can also more easily harness artificial intelligence and machine learning technologies to enhance data processing and analysis. Such tooling facilitates real-time data integration from diverse sources, allowing businesses to maintain accurate and up-to-date data records with less effort.
Instead of being bound to geographies and constraining hardware profiles, users only need an internet connection and suitable software infrastructure to securely authenticate. The technology supporting the compute can be switched out in a seemingly limitless number of combinations, according to the capabilities and inventory of the hosting providers' offerings.
Scalability is one of the most pressing concerns associated with legacy systems, and one that contemporary technologies seem to have largely overcome. Cloud-native solutions purport to handle growing data volumes with almost no limits.
A growing data footprint also presses on organizations that continue to generate vast amounts of data daily. The modern data solution claims it can scale horizontally, adding more resources as needed, with minimal disruption.
The concept of the data mesh is also growing in popularity, gaining traction as an alternative to traditional centralized data management frameworks. At face value, this seems not dissimilar to the debate surrounding all-in-one versus best-of-breed solutions in the world of data applications. Both debates revolve around fundamental questions about how organizations should structure their data management practices to best meet their needs.
Data Mesh promotes a decentralized approach to data management by treating individual business domains as autonomous entities responsible for managing their own data as products. This domain-oriented strategy empowers teams within an organization to take ownership of their respective datasets while ensuring that they adhere to standardized governance practices. By decentralizing data ownership, organizations achieve greater agility and responsiveness in managing their information assets.
The concept also emphasizes collaboration between teams through shared standards and protocols for data interoperability. This collaborative approach fosters a culture of accountability while enabling faster decision-making driven by real-time insights. Set the policies, frameworks and approaches centrally, but delegate execution to the peripheral domains to self-manage.
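To sketch what "central policy, domain execution" might look like in code, here is a minimal, hypothetical data product contract; the class, field names and classification scheme are illustrative assumptions rather than any standard data mesh implementation.

```python
# A minimal sketch of a data-mesh-style "data product" contract:
# standards are set centrally, execution stays with the domain.
# All names, fields and values here are hypothetical.
from dataclasses import dataclass, field

# A central governance standard shared by every domain.
ALLOWED_CLASSIFICATIONS = {"public", "internal", "restricted"}

@dataclass
class DataProduct:
    name: str              # e.g. "orders.daily_summary"
    owner_domain: str      # the domain team accountable for this product
    classification: str    # must come from the central standard
    schema: dict = field(default_factory=dict)

    def validate(self) -> None:
        """Apply the centrally defined policy check."""
        if self.classification not in ALLOWED_CLASSIFICATIONS:
            raise ValueError(
                f"{self.name}: classification must be one of "
                f"{sorted(ALLOWED_CLASSIFICATIONS)}"
            )

# The e-commerce domain publishes and self-manages its own product,
# while still conforming to the shared standard.
orders = DataProduct(
    name="orders.daily_summary",
    owner_domain="ecommerce",
    classification="internal",
    schema={"order_id": "string", "total": "decimal"},
)
orders.validate()
```

The domain owns the dataset and its schema; the centre owns only the rules every product must pass, which is the division of labour the data mesh idea promotes.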
The Evolutionary Path Forward
Evolving from legacy to modern data management practices starts to reflect the broader transformations that come with embracing all things digital. Such a shift is not merely about adopting new tooling; it represents a fundamental change in how businesses view and manage their data assets. Centralized, constrained control gets displaced by distributed accountability.
Along the way, there will be challenges to consider. Among them is the cost of all these threads of divergence and innovation. Not all business areas will run at the same pace; some will be a little more lethargic than others, and their appetite for change or alternative ways of working may be very constrained.
Another issue will be cost. With IT budgets remaining heavily constrained in most businesses, the idea of investing in technology-bound initiatives is nowadays wrapped up in elaborate return-on-investment calculations and expectations.
The burden of supporting evidence for investment now falls to the promoters and promulgators of new ways of working and new tech: to provide proof points and timelines, and a willingness to qualify the effort and the justification before the investment flows. With all the volatility that might exist in the business, these calculations, forecasts and predictions can sometimes be very hard to make.
Buying into new platforms and technologies also requires a candid assessment of the viability or likelihood that any particular innovation will actually yield a tangible or meaningful business benefit. ROI is one thing; the ability to convince stakeholders that the prize is worthwhile is another. Artificial intelligence, machine learning and big data analytics present as a trio of capabilities that hold promise, even as some continue to doubt their utility.
History is littered with market misreads: RIM's BlackBerry underestimating the iPhone, and Kodak failing to comprehend the significance of digital photography. Big Tech's Alphabet (Google), Amazon, Apple, Meta and Microsoft may get a bunch of things wrong, but the more vulnerable businesses that depend on these tech giants cannot really afford to make too many mistakes.
Organizations need to invest as much in critically evaluating next-generation data management technologies as in their own ongoing market research, so as to understand evolving preferences and advancements, observe the competition, and track shifts in demand.
Those that foster a culture of innovation, encourage experimentation and embrace new technologies need to be prepared to reallocate resources, or risk having whatever position of strength they hold displaced, especially by newer, more agile entrants to their markets. Agility means being able to adapt quickly, a crucial characteristic for responding effectively to market disruptions. Being trapped with a legacy mindset and legacy infrastructure retards an organization's ability to adapt.
Driving Toward a Modern Data-Driven Culture
To maximize the benefits of modern data management practices, organizations must foster a culture that prioritizes data-driven decision-making at all levels. In such a culture, the organization's data management environment is key: decisions, strategies and operations at all levels need to be bound to data.
For this to work, data needs to be accessible, and the evaluators, users and consumers of the data need to be data literate, with the requisite access and an implicit dependency on data as part of their daily routines. Effective data management also needs a philosophy of continuous improvement tied to performance metrics and KPIs, such as data quality measures, accompanied by true accountability.
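As a sketch of what such data quality KPIs could look like in practice, here are two common measures, completeness and freshness, computed over records held as plain dictionaries; the field names, thresholds and sample data are hypothetical assumptions.

```python
# A minimal sketch of two data quality KPIs over hypothetical records.
from datetime import datetime, timedelta, timezone

def completeness(records, required_fields):
    """Share of records with every required field populated."""
    if not records:
        return 0.0
    ok = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return ok / len(records)

def freshness(records, max_age=timedelta(days=1)):
    """Share of records updated within the allowed window."""
    if not records:
        return 0.0
    now = datetime.now(timezone.utc)
    ok = sum(1 for r in records if now - r["updated_at"] <= max_age)
    return ok / len(records)

records = [
    {"id": 1, "name": "Widget",
     "updated_at": datetime.now(timezone.utc)},
    {"id": 2, "name": "",  # incomplete and stale record
     "updated_at": datetime.now(timezone.utc) - timedelta(days=3)},
]
print(f"completeness: {completeness(records, ['id', 'name']):.0%}")
print(f"freshness:    {freshness(records):.0%}")
```

Tracked over time and owned by someone accountable, even simple percentages like these turn "data quality" from a sentiment into a performance metric.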
The building blocks of this data-driven culture hinge not only on the composition of the people and their work practices, but also on the infrastructure, which needs to be scalable, reliable, secure and performant.
The data contained therein needs to be comprehensive, rich and accessible in efficient and cost-effective ways. The quality of the data needs to stand up to all kinds of scrutiny, from regulatory and ethical standpoints through auditability and functional suitability. Efforts to make the whole approach more inclusive, embracing a whole-of-organization mindset, should also be promoted. Allowing individual business units to manage their own data while still contributing to the data more holistically will ultimately make the data more valuable.
If legacy has not failed us already, it will. The failure may not be obvious; it could be a slow, degraded experience that hampers business innovation and progress. Organizations that do not have renewal and reevaluation as an integral part of their operating model will be caught out by it.
To effectively transition from legacy systems to modern data management practices, organizations must recognize the critical limitations posed by outdated technologies and embrace the opportunities presented by contemporary solutions.
Legacy systems, while once foundational to business operations, often struggle to manage the complexity and volume of data generated in today's digital landscape. Their rigidity and inability to adapt hinder innovation and responsiveness, making it imperative for organizations to evaluate their reliance on such systems.
The shift towards modern solutions, characterized by flexibility, scalability, and integration, presents a pathway for organizations to enhance their operational efficiency and decision-making capabilities. Cloud-native solutions and decentralized data management frameworks like Data Mesh empower businesses to harness real-time insights and foster collaboration across departments. By moving away from siloed approaches, organizations can create a holistic view of their data, enabling them to respond swiftly to market changes and customer demands.
As I look ahead, I see it as essential that organizations cultivate their own distinctive data-driven culture.
A culture that prioritizes accessibility, literacy, and continuous improvement in data management practices. Such a shift would not only enhance decision-making but also drive innovation, positioning any organization more competitively in an increasingly complex environment.
All organizations must take proactive steps to assess their current data management strategy and identify areas for modernization.
They should begin by evaluating the effectiveness of existing legacy systems and exploring integrated solutions that align with their business goals.
They should invest in training programs that foster data literacy among employees at all levels, ensuring that the workforce is equipped to leverage data effectively.
They should commit to a culture of continuous improvement, where data quality and governance are prioritized. By embracing these changes, organizations can unlock the full potential of their data assets and secure a competitive advantage for the future.