Have legacy systems failed us?


I have been working on-and-off with “legacy” systems for decades. The exact definition of such a thing may come across as vague and ill-defined, but that’s OK. The next generations of software developers, data engineers, data scientists and in fact anyone working in tech will present you with this idea, and you’ll have to work out how real their perspective is.

For any twenty- or thirty-something in tech these days, anything created before they were born or before they started their career is likely labeled legacy. It’s a fair perspective. Every system has successors. Yet if something ‘old’ is still clicking and whirring in the background as a key piece of technology holding business data together, it might reasonably be considered part of some legacy.

The term is loaded though.

For those who haven’t quite retired yet (myself included), legacy connotes some sort of inflexible, unbendable technology that cannot be modernized or made more contemporary. In some industries or contexts, though, legacy implies heritage, endurance and resilience. So which is it? Both views have their merits, but they carry very different tones: one very negative, one almost reverent.

In my early working years, I had the pleasure of seeing the rise of the PC, at a time when upstart technologies were trying to bring personal computing into the home and the workplace. The idea of computing at home was for hobbyists and was dismissed by the likes of IBM. Computing at work often meant being bound to a desk with a 30kg beige-and-brown CRT “green screen” dumb terminal, paired with a keyboard that often weighed as much as, or more than, the heaviest of modern-day laptops.

One of the main characteristics of these systems was that they were pretty consistent in the way they operated. Yes, they were limited, especially in terms of overall functionality, but for the most part the journeys were constrained and the results and behaviours were consistent. Changes to these systems moved glacially. Even quarterly software updates, for example, would have been something of a novelty. Organizations with in-house software development teams could, laughably, take the gestation period of a human baby to get pretty much anything done. Even bugs, once detected and root-cause analysed, would often take months to remediate, not because they were complex to solve, but because of the prevailing approaches to software change and release.

I suppose that in some industries the technology was a bit more dynamic, but the friends and colleagues I worked with in other industry sectors certainly didn’t report a high velocity of change in these systems. Many of them were mainframe and mini-mainframe based, often serviced by one or more of the big tech brands that dominated in those days.

A defining characteristic of modern systems and modern practices, then, is probably the handling of greater complexity: dealing with higher volumes of data and the need for greater agility. The need for integrated solutions, for example, has never pressed harder than it does today. We need, and in fact demand, interconnectedness, and we need to be able to trace numerous golden threads of system interoperability and application interdependence at the data level, at unprecedented scale.

In the past we could get away with manually curating all kinds of things, including descriptions of what we had and where it was, but the volumes, complexities and dependencies of today’s systems make doing these things manually seem futile, fraught with the risk of being incomplete and quickly out of date. Automation is now more than a buzzword; it’s table stakes, and many will rightly assume that automation has already been considered in the design.

Legacy Systems and Their Limitations

As anyone who regularly uses office applications will attest, even a cursory look at your presentations, documents, spreadsheets, folders and shared content will demonstrate just how quickly things can get out of hand.

Unless you are particularly fastidious, you likely have just one steaming heap of documents that you’re hoping your operating system or cloud provider is able to adequately index for a random search.

If not, you’re bound to your naming conventions (if you have any), recency timestamps or some other criteria. In some respects, even these aspects make all this smell suspiciously like a “legacy problem”.

The growth of interest in modern data management practices means that we need to consider how business and operational demands are reshaping the future of data governance.

Despite all this, I still don’t have a good definition of what a “Legacy System” really is. The general perspective is that it is something that predates what you work with on a daily basis, and that seems as good a definition as any. We have to acknowledge, though, that legacy systems remain entrenched as the backbone of a great many organizations’ data management strategies. The technology may have advanced and data volumes may have surged, but many legacy systems endure despite their inadequacies for contemporary business needs.

Inability to Handle Modern Data Complexity

One of the most significant challenges posed by legacy data systems is their inability to cope with the volumes and inherent complexity of contemporary data. Pick your favourite system and consider how well it handles the documents I described earlier, either as documents or as links to those documents in some cloud repository.

Many of the solutions that people think of as legacy were designed more than a generation ago, when technology choices were more limited, there was less globalization, and we were still weaning ourselves off paper-based, manually routed content and data. Often the data itself was conveniently structured with a prescribed meta model and stored in relational databases. These days, businesses face a deluge of new data types—structured, semi-structured, and unstructured—emanating from an ever growing number of sources including social media, IoT, and applications.

Legacy transaction and master data systems now have to manage tens or hundreds of millions of records spread across function- and form-specific siloed systems. This fragmentation, in turn, leads to inconsistencies in data content, quality and reliability. All this makes it difficult for organizations to know what to keep and what to discard, what to pay attention to and what to ignore, what to act on and what to treat as merely supplementary.

If there is enough metadata to describe all these systems, we may be lucky enough to index it and make it findable, assuming we know what to look for. The full or even partial adoption of hybrid cloud has simply perpetuated the distributed-silos problem. Now, instead of discrete applications or departments acting as data fiefdoms, we have the triple threat of unindexed data in legacy systems, data in local system stores, and data in cloud systems. Any technical or non-technical user understandably finds it challenging to work out what they want and what they should care about, because very few fully integrated platforms describe everything in a logical and accessible way.

Rigidity and Lack of Agility

Legacy and traditional systems are also characterized by an inherent rigidity. Implementing or running them often involves elongated processes that can take months or even years, and their daily operation demands regimented discipline. New initiatives hooked to legacy applications are typically expensive and have high failure rates, due to their inherent complexity and the extensive customization needed to integrate with more contemporary technologies.

For example, the prominent ERP software company SAP announced in February 2020 that it would provide mainstream maintenance for core applications of SAP Business Suite 7 (including ECC) until the end of 2027.

But according to The Register, as recently as June 2024 representatives of DACH customers suggested that they don’t believe they will even meet the 2030 cut-off, when extended support ends.

Research by DSAG, which represents SAP customers in the DACH region, found that 68% still use the “legacy” platform, with 22% saying that SAP ECC/Business Suite influenced their SAP investment strategy for 2024. Many are reluctant to upgrade because they have invested so heavily in customizations. All this makes for some tough calls.

The rigidity of the legacy system, compounded by customers’ reticence to upgrade, presents a challenge in understanding just how responsive any business can be to changing needs. SAP wants you on shinier, glossier versions of its technology, to maintain a good relationship with you and to ensure it can continue adequately supporting your business into the future, but if you won’t upgrade, what are they to do?

Modern digital economies expect businesses to pivot quickly in response to market trends or customer demands, and being stuck on legacy solutions may be holding them back. Companies running on legacy may need significant time and resources to adapt or scale to meet new expectations. Such inflexibility will likely hinder innovation and limit a company’s ability to compete effectively.

Unification is a possible answer

If you recognise and acknowledge these limitations, then you’re likely already shifting away from the traditional siloed approaches to data management towards more unified platforms.

Integrated solutions like SAP provide a holistic view of organizational data, and they have been paramount for years. But even here, not all the data is held in these gigantic systems. SAP segments its platforms by business process: Order to Cash, Procure to Pay, Hire to Retire and so on. But businesses are multidimensional, and business processes aren’t necessarily the way a business thinks about its data.

A multinational running on SAP may think about its data and systems in a very regional fashion, or by a specific industry segment like B2C or B2B; it may fragment further depending on how it is set up. Channel-focused structures, for example, are not unusual: eCommerce vs retail stores, D2C and so on. The combinations and permutations are seemingly limitless, yet each of these areas is likely just another data silo.

Breaking with data silos fosters cross-divisional collaboration, allowing the business to enhance decision-making and improve overall operational efficiency. ERP doesn’t necessarily promote this kind of thinking. Such a shift is not just a reaction to the shortcomings of legacy systems; it is also driven by the broader trend towards digital transformation.

In commercial banking, for example, thinking through the needs and wants of the different regional representations, the in-market segments and then the portfolio partitions means that some data is common and some is not. Most importantly, all of the data likely needs to sit in one unifying repository and definitely needs to be handled in a consistent, aligned, compliant and unified way. Through the lens of risk and compliance, everyone’s behaviours and data are viewed the same way, irrespective of where their data is held and who or what it relates to.

Incorporating modern capabilities like artificial intelligence (AI), machine learning (ML), and big data analytics requires solutions that can support these initiatives effectively, and it is a popular topic of discussion. You can pooh-pooh AI and ML as fads with relatively limited real applicability and value right now, but like yesteryear’s personal computers and mobile phones, these kinds of things have an insidious way of permeating our daily lives in ways we may never have considered, and before we know it we have become hooked on them as essential capabilities for getting through the day.

Lessons in retail

In modern retail in the developed world, for example, every product has a barcode and every barcode is attached to a master data record entry that is tied to a cost and pricing profile.

When you check out at the grocery store, the barcode is a key to the record in the point-of-sale system and pricing engines, and that’s the price you see on the checkout receipt. Just 25 years ago, stores were still using pricing “guns” to put stickers on merchandise, something that persists in many developing countries to this day. You might laugh, but in times of high inflation it was not uncommon for consumers to scratch about on the supermarket shelves looking for older stock with the old price.
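To make the mechanics concrete, here is a toy sketch of that lookup: the barcode acts as the key into a price master, and a missing entry produces the familiar “price check” moment. The barcodes, records and prices below are invented for illustration.

```python
# A toy illustration (invented data): the scanned barcode is the key into a
# price master record, exactly as at a point-of-sale terminal.
PRICE_MASTER = {
    "5012345678900": {"description": "Whole milk 2L", "price": 1.89},
    "5098765432109": {"description": "Wholemeal loaf", "price": 1.25},
}

def scan(barcode: str) -> str:
    record = PRICE_MASTER.get(barcode)
    if record is None:
        # The familiar failure mode: a missing master data entry at checkout.
        return f"{barcode}: item not found, price check required"
    return f"{record['description']}: {record['price']:.2f}"

print(scan("5012345678900"))  # Whole milk 2L: 1.89
print(scan("0000000000000"))  # item not found, price check required
```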

Sticker-based pricing may still prevail in places, but often the checkout process is cashless and auto-reconciling for checkout, inventory and especially pricing, all with the beep of a barcode scanner.

As these technologies become more affordable and more accessible to businesses of every size, even the most cost-conscious, the idea of individually priced merchandise will probably disappear altogether. We’ll still be frustrated by the missing barcode entry in the database at checkout, or by the grocery item that is sold by weight and needs its own personal pricing barcode because the checkout doesn’t have a scale. This then becomes a legacy problem in itself, where we straddle the old way of doing things and the new.

In much the same way, transitioning from legacy to something more contemporary doesn’t mean an organization has to completely abandon heritage systems, but it does mean that the case for retaining, maintaining and extending existing systems should be continuously evaluated. The point is that once these systems move beyond their “best-by” date, an organization encumbered by them should already have a migration, transition or displacement solution in mind or underway.

This would typically be covered by some sort of digital transformation initiative.

Modern Solutions and Approaches

In stark contrast to legacy systems, modern solutions are typically designed with flexibility and scalability in mind.

One could argue that there’s sometimes too much flexibility and scale, but modern solutions do take advantage of contemporary advanced technologies, which means they potentially secure a bit more of a resiliency lifeline.

A lifeline in the sense that you will continue to have software developers available to work on the system, users who actively use it because of its more contemporary look and feel, and a few more serviceable versions before it is surpassed by something newer and shinier, at which point it too becomes classified as “legacy”.

Cloud-Native Solutions

One of the most significant advancements in data systems these days, is the prevalence of cloud-native solutions. Not solutions ported to the cloud but rather solutions built from the ground up using the cloud-first design paradigm. I make this distinction because so many cloud offerings are nothing more than ‘moved’ technologies.

Cloud-native systems may use a microservices architecture — a design approach allowing individual components to be developed, deployed, and scaled independently. They may also make use of on-demand “serverless” technologies. By taking advantage of the modularity afforded by microservices, organizations can adapt their data management capabilities relatively quickly in response to changing business requirements, whether through technology switch-outs or incremental additions. The serverless elements mean compute is consumed on demand, which in theory means lower operational cost and less wastage from overprovisioned, idle infrastructure.
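As a rough illustration of the serverless, single-purpose style, here is a minimal sketch of a function in the shape of an AWS Lambda handler. The event fields and the stubbed lookup are illustrative assumptions for the example, not any vendor’s actual contract.

```python
# A minimal sketch of a serverless, single-purpose function in the style of an
# AWS Lambda handler. The event shape and the stubbed lookup are illustrative
# assumptions, not any vendor's actual contract.
import json

def handler(event, context):
    """Return one customer record: a small, independently deployable unit."""
    customer_id = (event.get("pathParameters") or {}).get("customerId")
    if not customer_id:
        return {"statusCode": 400, "body": json.dumps({"error": "customerId required"})}
    # A real deployment would query a managed datastore here; the record is
    # stubbed so the sketch stays self-contained.
    record = {"customerId": customer_id, "status": "active"}
    return {"statusCode": 200, "body": json.dumps(record)}

if __name__ == "__main__":
    print(handler({"pathParameters": {"customerId": "42"}}, None))
```

Each such function can be deployed, scaled and billed independently, which is the modularity the paragraph above describes.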

Many cloud-native data management solutions also have the ability to more easily harness artificial intelligence and machine learning technologies to enhance data processing and analysis capabilities. Such tool use facilitates real-time data integration from diverse sources, allowing businesses to more easily maintain accurate and up-to-date data records with less effort.

Instead of being bound to geographies and constraining hardware profiles, users only need an internet connection and suitable software infrastructure to securely authenticate. The technology supporting the compute can be switched out in a seemingly limitless number of combinations, according to the capabilities and inventory of the hosting providers.

Scalability is one of the most pressing concerns associated with legacy systems, and one that these contemporary technologies seem to have largely overcome. Cloud-native solutions purport to handle growing data volumes with almost no limits.

A growing data footprint also weighs on organizations that continue to generate vast amounts of data daily. The modern data solution promises to scale horizontally—adding more resources as needed, with minimal disruption.

The concept of a data mesh is also growing in popularity, gaining traction as an alternative to traditional centralized data management frameworks. On the face of it, this seems not dissimilar to the debate between all-in-one and best-of-breed solutions in the world of data applications. Both debates revolve around fundamental questions about how organizations should structure their data management practices to best meet their needs.

Data Mesh promotes a decentralized approach to data management by treating individual business domains as autonomous entities responsible for managing their own data as products. This domain-oriented strategy empowers teams within an organization to take ownership of their respective datasets while ensuring that they adhere to standardized governance practices. By decentralizing data ownership, organizations achieve greater agility and responsiveness in managing their information assets.

The concept also emphasizes collaboration between teams through shared standards and protocols for data interoperability. This collaborative approach fosters a culture of accountability while enabling faster decision-making driven by real-time insights. Set the policies, frameworks and approaches centrally, but delegate the execution to the peripheral domains to self-manage.
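As a sketch of what “data as a product” with central policy and local execution might look like in practice, here is a hypothetical data-product contract a domain team could publish. The field names and SLO keys are illustrative assumptions, not a standard schema.

```python
# A hypothetical "data product" contract a domain team might publish under a
# data mesh: central policy, domain-local execution. Field names and SLO keys
# are illustrative assumptions, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                        # e.g. "orders.daily_shipments"
    owner_domain: str                # the business domain accountable for it
    schema_version: str              # consumers pin against explicit versions
    quality_slos: dict = field(default_factory=dict)  # e.g. {"freshness_hours": 24}
    access_policy: str = "governed"  # policy set centrally, enforced locally

shipments = DataProduct(
    name="orders.daily_shipments",
    owner_domain="fulfilment",
    schema_version="2.1",
    quality_slos={"freshness_hours": 24, "completeness_pct": 99.5},
)
print(shipments)
```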

The Evolutionary Path Forward

Evolving from legacy to modern data management practices starts to reflect the broader transformations that come with embracing all things digital. Such a shift is not merely about adopting new tooling; it represents a fundamental change in how businesses view and manage their data assets. Centralized, constrained control gets displaced by distributed accountability.

Along the way there will be challenges to consider, among them the cost of all these threads of divergence and innovation. Not all business areas will run at the same pace; some will be more lethargic than others, and their appetite for change or alternative ways of working may be very constrained.

Another issue will be cost. With IT budgets remaining heavily constrained in most businesses, the idea of investing in technology-bound initiatives is nowadays wrapped up in elaborate return-on-investment calculations and expectations.

The burden of supporting evidence for investment now falls to the promoters and promulgators of new ways of working and new tech: to provide proof points, timelines and a willingness to quantify the effort and the justification before the investment flows. With all the volatility that may exist in the business, these calculations, forecasts and predictions can be very hard to make.

Buying into new platforms and technologies also requires a candid assessment of the viability or likelihood that any particular innovation will actually yield a tangible, meaningful business benefit. While ROI is one thing, the ability to convince stakeholders that the prize is worthwhile is another. Artificial intelligence, machine learning and big data analytics present as a trio of capabilities that hold promise, yet some will continue to doubt their utility.

History is littered with market misreads, like RIM’s BlackBerry underestimating the iPhone and Kodak’s failure to grasp the significance of digital photography. Big Tech’s Alphabet (Google), Amazon, Apple, Meta and Microsoft may get plenty wrong, but the more vulnerable businesses that depend on these tech giants cannot really afford to make too many mistakes.

Organizations need to invest as much in critically evaluating next-generation data management technologies as in their ongoing market research, in order to understand evolving preferences and advancements. This includes observing the competition and shifts in demand.

Those that foster a culture of innovation, encourage experimentation and embrace new technologies must be prepared to reallocate resources, or risk having whatever position of strength they hold displaced, especially by newer, more agile entrants to their markets. Agility means being able to adapt quickly, a crucial characteristic for responding effectively to market disruption. Being trapped with a legacy mindset and legacy infrastructure retards an organization’s ability to adapt.

Driving Toward a Modern Data-Driven Culture

To maximize the benefits of modern data management practices, organizations must foster a culture that prioritizes data-driven decision-making at all levels. In such a culture, the organization’s data management environment is key: decisions, strategies and operations all need to be bound to data.

For this to work, data needs to be accessible; the evaluators, users and consumers of the data need to be data literate; and they need the requisite access and a genuine dependency on data as part of their daily work. Effective data management also needs a philosophy of continuous improvement, tied to performance metrics and KPIs such as data quality measures, and accompanied by true accountability.

The building blocks for this data-driven culture hinge not only on the people and their work practices, but also on infrastructure that is scalable, reliable, secure and high-performing.

The data contained therein needs to be comprehensive, rich and accessible in efficient, cost-effective ways. The quality of the data needs to stand up to all kinds of scrutiny, from regulatory and ethical standpoints through auditability and functional suitability. Efforts to make the whole approach more inclusive of the entire organization should also be promoted. Allowing individual business units to manage their own data while contributing to the data more holistically will ultimately make that data more valuable.

If legacy has not failed us already, it will. The failure may not be obvious; it could be a slow, degraded experience that hampers business innovation and progress. Organizations that do not have renewal and reevaluation as an integral part of their operating model are the most exposed.

To effectively transition from legacy systems to modern data management practices, organizations must recognize the critical limitations posed by outdated technologies and embrace the opportunities presented by contemporary solutions.

Legacy systems, while at some point foundational to business operations, often struggle to manage the complexity and volume of data generated in today’s digital landscape. Their rigidity and inability to adapt hinder innovation and responsiveness, making it imperative for organizations to evaluate their reliance on such systems.

The shift towards modern solutions—characterized by flexibility, scalability, and integration—presents a pathway for organizations to enhance their operational efficiency and decision-making capabilities. Cloud-native solutions and decentralized data management frameworks like Data Mesh empower businesses to harness real-time insights and foster collaboration across departments. By moving away from siloed approaches, organizations can create a holistic view of their data, enabling them to respond swiftly to market changes and customer demands.

As I look ahead, I see it as essential that organizations cultivate their own distinctive data-driven culture.

A culture that prioritizes accessibility, literacy, and continuous improvement in data management practices. Such a shift would not only enhance decision-making but also drive innovation, positioning any organization more competitively in an increasingly complex environment.

All organizations must take proactive steps to assess their current data management strategy and identify areas for modernization.

They should begin by evaluating the effectiveness of existing legacy systems and exploring integrated solutions that align with their business goals.

They should invest in training programs that foster data literacy among employees at all levels, ensuring that the workforce is equipped to leverage data effectively.

They should commit to a culture of continuous improvement, where data quality and governance are prioritized. By embracing these changes, organizations can unlock the full potential of their data assets and secure a competitive advantage for the future.


Author: Clinton Jones

The Duality of Ideas: Multiplication and Execution


Ideas are the seeds that blossom into groundbreaking products, services, and solutions. The belief that “ideas are like rabbits” is a long-embraced one; it celebrates the notion that the more ideas one generates, the greater the potential for breakthroughs. However, as Apple founder and innovator Steve Jobs astutely observed, the mere abundance of ideas can also breed a dangerous “disease”: the misguided belief that a brilliant concept alone is sufficient for success, without the arduous journey of execution.

The metaphor of ideas as rabbits captivates, conjuring images of rapid multiplication and boundless potential. Just as rabbits are known for prolific breeding, ideas have the capacity to spawn new thoughts, concepts, and perspectives at an astonishing rate, in the right circumstances and in the company of the right audiences.

Idea proliferation through ‘brainstorming’ is often celebrated in creative circles, where such sessions and ideation workshops are designed to unleash a torrent of possibilities, each one building upon the last.

Author and entrepreneur James Altucher eloquently captures this sentiment in his book “Choose Yourself,” stating, “Ideas are the multiplicative force that allows a human to combine, recombine, and create new ideas from old ideas.” This resonates with the concept of “idea sex,” a process of combining existing ideas to generate novel ones attributed to science writer Matt Ridley, who explored it in his 2010 book The Rational Optimist and in his TED talk “When ideas have sex”.

Apple’s Steve Jobs also cautioned that the unbridled multiplication of ideas can lead to a dangerous pitfall: the “disease of thinking that a really great idea is 90% of the work.” The rabbit metaphor plays in this space too. Overbreeding rabbits can lead to various health issues for both the mother rabbits and their offspring: pregnancy toxemia, uterine cancer, mastitis, exhaustion and malnutrition. For the offspring there is the risk of genetic defects and weakened immunity.

Jobs emphasized the immense effort required to transform even the most brilliant idea into a tangible, successful product or service. “There’s a huge gulf,” he proclaimed, “between a great idea and its ultimately becoming a phenomenal success in the real world.”

Jobs understood that the path from conception to realization is fraught with challenges, requiring relentless problem-solving, teamwork, and a willingness to make countless tradeoffs and refinements along the way.

Such sentiments echo the words of renowned management consultant Peter Drucker, who described ideas as not dissimilar to babies: they need to be born and nurtured. Just as a newborn requires constant care and attention to thrive, an idea must be meticulously cultivated, refined, and executed to reach its full potential.

Jobs warned against the “disease” as the false belief that simply having a great idea is enough – that the mere act of sharing or discussing a brilliant concept is tantamount to success. This misconception can lead to complacency, a lack of follow-through, and a failure to recognize the immense effort required to bring an idea to fruition.

In contrast, Jobs championed a balanced approach, one that embraced the rapid multiplication of ideas while recognizing the necessity of diligent execution. He understood that true innovation lies not only in the generation of ideas but also in the ability to identify the most promising concepts and nurture them through a rigorous process of refinement, collaboration, and problem-solving.

Guy Kawasaki, author and speaker, states, “Ideas are easy. Implementation is hard,” akin to the adage that “ideas are cheap.” He too emphasizes the importance of execution, noting that even the most groundbreaking ideas are worthless without the dedication and perseverance required to bring them to life. This duality of ideas, their curated multiplication and the necessity of considered execution, forms the balance that product managers, designers and architects must strike.

So if your thinking is that “ideas are like rabbits”, which do you celebrate? The boundless potential of human creativity, or the carefree strewing of concepts without due consideration for the immense effort required to transform those ideas into tangible successes?

Per Jobs, the true path to innovation lies not in the mere abundance of ideas but in the ability to identify the most promising ones and nurture them through a relentless pursuit of excellence, collaboration, and attention to detail.

The “disease” of overvaluing ideas can be cured, and the full potential of human ingenuity realized, if we accept the tax of execution.

The key message for product managers is to strike a balance between fostering idea generation and ensuring rigorous execution. While the rapid multiplication of ideas is essential for innovation, overvaluing ideas alone leads to the proverbial pitfall.

Execution is King

Here are some ideas that product managers should be considering in this context:

Implement systematic approaches like the SIT (Systematic Inventive Thinking) formula, which provides techniques for acquiring skills and generating original ideas.

  • Subtraction: Removing an essential component from a product or service and finding new uses for it.
  • Multiplication: Repeating or multiplying a component that was previously considered non-essential.
  • Division: Separating a product or service into smaller components and rearranging them.
  • Task Unification: Assigning new tasks or functions to existing components.
  • Attribute Dependency: Linking two independent attributes or components to create a new value proposition.

Invest time and effort in developing and maintaining some sort of strategic product roadmap that translates the visionary product strategy into actionable plans, defining milestones and timelines aligned with the vision of the product(s) and the business.

Set and agree on clear objectives, priorities, and key performance indicators (KPIs) based on customer needs, market research, and the overall product strategy.

Influence and collaborate on the efficient allocation of resources, budgets, and team members to maximize value and productivity.

Continuously evaluate and refine your product management strategies based on data-driven decision-making, user feedback, and market dynamics.


Author: Clinton Jones

To the cloud no more? That is the question.


Cloud computing has undergone a remarkable transformation over the past decade.

What was once hailed as a panacea for companies struggling with the high costs and unsustainability of on-premise IT infrastructure has now become a more nuanced and complex landscape. As businesses continue to grapple with the decision to migrate to the cloud or maintain a hybrid approach, understanding the complexity, costs and risks is essential to navigating the evolving dynamics and the potential pitfalls that lie ahead.

The initial appeal of cloud solutions was undeniable.

By offloading the burden of hardware maintenance, software updates, and data storage to cloud providers, companies could focus on their core business activities and enjoy the benefits of scalability, flexibility, and cost optimization. The cloud promised to revolutionize the way organizations managed their IT resources, allowing them to adapt quickly to changing market demands and technological advancements.

However, not all businesses have fully embraced the cloud, especially when it comes to their mission-critical systems. Companies that handle sensitive or proprietary data have often been more cautious in their approach, opting to maintain a significant portion of their operations on-premise. These organizations may have felt a sense of vindication as they watched some of their cloud-first counterparts grapple with the complexities and potential risks associated with entrusting such critical systems to third-party providers.

The recent news from Basecamp, for example, was driven by spiraling costs, irrespective of the cloud provider (they tried AWS and GCP). Basecamp decided to leave the cloud computing model and move back to on-premise infrastructure to contain costs, reduce complexity, avoid hidden charges, and retain margin. This way, they felt they had more control over delivery and sustainment outcomes.

The Ongoing Costs of Cloud-First Strategies

Cloud bills, for example, can comprise hundreds of millions or even billions of rows of data, making them difficult to analyze in traditional tools like Excel. At the same time, cloud computing reduces upfront startup costs, including setup and maintenance costs, with 94% of IT professionals reporting this benefit. Accenture, for example, found cloud migration leads to 30-40% Total Cost of Ownership (TCO) savings.
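For a sense of why analysts reach past spreadsheets, here is a minimal pandas sketch that aggregates a bill by service and by cost-allocation tag, flagging untagged spend. The column names and the handful of rows are invented stand-ins for a provider export that would normally run to millions of rows.

```python
# A minimal pandas sketch of cloud bill analysis; columns and rows are
# invented stand-ins for a real cost-and-usage export.
import pandas as pd

bill = pd.DataFrame({
    "service":        ["compute", "storage", "compute", "network"],
    "team_tag":       ["checkout", None, "search", "checkout"],
    "unblended_cost": [1200.50, 310.00, 845.25, 98.40],
})

# Surface untagged spend explicitly, then aggregate by service and by team.
bill["team_tag"] = bill["team_tag"].fillna("UNTAGGED")
print(bill.groupby("service")["unblended_cost"].sum().sort_values(ascending=False))
print(bill.groupby("team_tag")["unblended_cost"].sum())
```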

As many as 60% of C-suite executives also cite security as the top benefit of cloud computing, ahead of cost savings, scalability, ease of maintenance, and speed.

The private cloud services market for example, is projected to experience significant growth in the coming years. According to Technavio, the global private cloud services market size is expected to grow by $276.36 billion from 2022 to 2027, at a CAGR of 26.71%. 

The cloud of course supports automation, reducing the risk of the human errors that cause security breaches, and accordingly the platforms help capture the cost of tagged, untagged, and untaggable cloud resources, as well as allocate 100% of shared costs. Yet for those organizations that have wholeheartedly adopted a cloud-first strategy, the operational budgets for cloud technologies have often continued to climb year-over-year.

Instead of fully capitalizing on the advances in cloud technology, these companies may find themselves having to maintain or even grow their cost base to take advantage of the latest offerings. The promise of cost savings and operational efficiency that initially drew them to the cloud may not have materialized as expected.

As this cloud landscape continues to evolve, a critical question arises: is there a breaking point where cloud solutions may become unviable for all but the smallest or most virtualized cloud-interwoven businesses?

This concern is particularly relevant in the context of customer data management, where the increasing number of bad actors and risk vectors, coupled with the growing web of regulations and restrictions at local, regional, and international levels, can contribute to a sense of unease about entrusting sensitive customer data to cloud environments.

The Evolving Regulatory Landscape & Cyber threats

The proliferation of data privacy regulations, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States, has added a new layer of complexity to the cloud adoption equation.

These regulations, along with a growing number of industry-specific compliance requirements, have placed significant demands on organizations to ensure the security and privacy of the data they handle, regardless of where it is stored or processed. For businesses operating in multiple jurisdictions, navigating the web of regulations can be a daunting task, as the requirements and restrictions can vary widely across different regions.

Failure to comply with these regulations can result in hefty fines, reputational damage, and even legal consequences, making the decision to entrust sensitive data to cloud providers a high-stakes proposition.

Alongside the evolving regulatory landscape, the threat of cyber attacks has also intensified, with bad actors constantly seeking new vulnerabilities to exploit.

Cloud environments, while offering robust security measures, are not immune to these threats, and the potential for data breaches or system compromises can have devastating consequences for businesses and their customers.

The growing sophistication of cyber attacks, coupled with the increasing value of customer data, has heightened the need for robust security measures and comprehensive risk management strategies. Companies must carefully evaluate the security protocols and safeguards offered by cloud providers, as well as their own internal security practices, to ensure the protection of their most valuable assets.

Balancing Innovation and Risk Management

In light of these challenges, many businesses are exploring hybrid approaches that combine on-premise and cloud-based solutions.

This strategy allows organizations to maintain control over their mission-critical systems and sensitive data, while still leveraging the benefits of cloud computing for less sensitive or more scalable workloads.

Some companies are also taking a more selective approach to cloud adoption, carefully evaluating which workloads and data sets are suitable for cloud migration.

By adopting a risk-based approach, they can balance the potential benefits of cloud solutions with the need to maintain a high level of control and security over their most critical assets.

As the cloud landscape continues to evolve, it is essential for businesses to carefully evaluate their cloud strategies and adapt them to the changing circumstances.

This may involve regularly reviewing their cloud usage, cost optimization strategies, and the evolving regulatory and security landscape to ensure that their cloud solutions remain aligned with their business objectives and risk tolerance. Regular monitoring and assessment of cloud performance, cost-effectiveness, and security posture can help organizations identify areas for improvement and make informed decisions about their cloud investments.

Collaboration with cloud providers and industry experts can also provide valuable insights and best practices to navigate the complexities of the cloud ecosystem.

As the cloud landscape continues to evolve, it is clear that the path forward will not be a one-size-fits-all solution.

Businesses must be careful in weighing the potential benefits of cloud adoption against the risks and challenges that come with entrusting their critical data and systems to third-party providers. The future of cloud solutions will likely involve a more nuanced and balanced approach, where organizations leverage the power of cloud computing selectively and strategically, while maintaining a strong focus on data security, regulatory compliance, and risk management.

Collaboration between businesses, cloud providers, and regulatory bodies will likely be crucial in shaping the next chapter of the cloud revolution, ensuring that the benefits of cloud technology are realized in a secure and sustainable manner.


Author: Uli Lokshin

Inspiring Others: The Art of Transforming Passion into Shared Vision


Leaders are often driven by a deep, unwavering passion for their cause, political position, product, or objective.

This dedication can be a powerful force, fuelling determination and propelling them towards their goals. However, this very passion can also become a source of frustration when others fail to share the same level of enthusiasm.

It can be disheartening for leaders when employees are not as excited about new initiatives, when family members do not fully support their efforts, or when outsiders seem disinterested in engaging with the leader’s agenda.

Such a lack of shared passion can leave the leader feeling alone, angry, misunderstood, and under-appreciated. While it is tempting to argue, debate, or pressure others into aligning with one’s vision, this approach often proves counterproductive.

Overcoming this frustration lies in one’s ability to inspire others, rather than simply asserting one’s own desires. Inspiring leaders understand that the most effective way to achieve their goals is to focus on what others want, rather than solely on their own agenda.

By explaining one’s passion in the language and perspective of those you seek to influence, you can create a shared vision that resonates with the needs and desires of your audience.

Connecting with the collective’s needs and desires

One of the hallmarks of an inspiring leader is their ability to shift the focus of the conversation from themselves to the collective.

Instead of simply extolling the virtues of their own position or product, they take the time to understand the needs, concerns, and aspirations of those they wish to engage. In the absence of this, you have the master-serf relationship, where employees in particular are just wage-slaves; or, in clubs or societies, where the leader is the prophet and the others are just acolytes or disciples.

By doing so, a leader can craft a narrative that speaks directly to the interests and motivations of those around them; but it needs to be grounded in everyone’s reality.

“The more scarce and valuable commodity is cold-shower-self-honesty”

– Joel Runyon

Rather than relying just on passion alone, inspiring leaders are willing to step back and critically examine their own circumstances, their assumptions, biases, and communication strategies. They recognize that their personal enthusiasm, while genuine, may not be enough to sway others who have different priorities and perspectives.

The Shared Vision

By focusing on the needs and desires of the team, an inspiring leader is able to craft a shared vision that aligns with the goals and personal and collective aspirations of those around them.

This shared vision becomes a powerful tool for overcoming the frustration that can arise when others do not immediately embrace the leader’s passion. Rather than simply pushing their own agenda, inspiring leaders take the time to understand what motivates their employees, family members, or external stakeholders. They then weave these insights into a narrative that highlights how the leader’s vision can help others achieve their own objectives.

Such an approach creates a sense of mutual investment and shared purpose, fostering a collaborative environment where everyone feels invested in the success of the endeavor.

Empathy and Emotional Connection

Inspiring leaders often understand that passion alone is not enough to drive meaningful change.

Recognizing the importance of empathy and emotional connection in engaging others and cultivating a shared vision, they actively listen to the concerns and perspectives of others, and so are able to tailor their message in a way that resonates on a deeper level.

An emotional connection is crucial in overcoming the frustration that can arise when others do not immediately share the same passion. By demonstrating a genuine understanding of the challenges and aspirations of their audience, inspiring leaders are able to build trust and foster a sense of shared purpose. This, in turn, helps to overcome resistance and create a collaborative environment for all.

The Art of Inspiration

Inspiring leaders understand that their role is not simply to assert their own desires, but to create a compelling vision that aligns with the needs and aspirations of those they seek to influence.

This requires a delicate balance of passion, empathy, and strategic communication.

A focus on “them” speaks directly to the concerns and motivations of others. The most valuable commodity is not just passion, but the willingness to engage in that “cold-shower-self-honesty”: critically examining one’s own assumptions, biases, and communication strategies.

Through a process of self-reflection and audience-centric communication, inspiring leaders are able to overcome the frustration and potential impatience that can arise when others do not immediately share their passion.

A shared vision that resonates with the needs and desires of their audience fosters a collaborative environment where everyone feels invested in the success of the endeavor.

Ultimately, the art of inspiration is not about forcing others to conform to the leader’s agenda, but about cultivating a shared sense of purpose and mutual investment, and having everyone else naturally come together on the journey of exploration, discovery and execution. Only by connecting with the needs and desires of others can inspiring leaders achieve greater success and create lasting change.


Author: Flaminio

The Impact of Artificial Intelligence on Business Operations


Artificial Intelligence (AI), today more than ever before, stands out as a transformative force reshaping the way businesses operate.

Like all modern technologies, it has infiltrated many aspects of business, enhancing efficiency, improving customer experiences, and driving innovation. Its touch is felt from customer service to data analytics.

AI is revolutionizing traditional approaches and propelling organizations into a new era of possibilities, but it is challenged by concerns about bias, transparency and its tendency to hallucinate.

Some history

The Turing Test, proposed by British mathematician, computer scientist and codebreaker Alan Turing in 1950, was considered a measure of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human.

The test serves as a rudimentary benchmark for assessing a machine’s capability to display human-like intelligence in natural language conversation, but the latest developments with Large Language Models (LLMs), and how naively they behave, may have all but broken the fundamentals of this test, and we may need to think of new ways to assess AI.

The basic premise of the Turing Test, assessing a machine’s ability to engage in human-like conversation, is still relevant, but its applicability and limitations have become more pronounced in the context of LLMs. LLMs don’t actually understand what you’re saying or asking.

Despite all this, one of the most significant impacts of AI on business operations is evident in customer service. The very space where we want a conversation may be better served by an AI.

Chatterbots

The reason may be quite simple. We’re not actually looking for a social conversation with an AI when we use a chatbot or a virtual assistant; instead we’re looking for information, or answers to solve the thing that brought us to the chatbot in the first place.

The first “chatterbot” is reputed to be ELIZA, created in the mid-1960s by Joseph Weizenbaum, a computer scientist at the Massachusetts Institute of Technology (MIT).

ELIZA operated by processing user responses to supplied prompts and generating pre-defined, contextually appropriate replies.

Using a combination of pattern matching and simple keyword recognition techniques, it simulated a Rogerian psychotherapist.

Although the interactions were relatively basic, ELIZA’s ability to mimic human conversation and provide responses that seemed meaningful and engaging was groundbreaking at the time.

If you’re interested, there is a JavaScript version of ELIZA, originally written by Michal Wallace and significantly enhanced by George Dunlop, that you can try out at the CSU Fullerton Psychology Department.
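For a flavour of how little machinery ELIZA-style conversation actually needs, here is a minimal Python sketch of the pattern-matching approach: a few regex rules with canned response templates and a default prompt. The rules are invented for illustration and are not Weizenbaum’s original script.

```python
# A minimal sketch of ELIZA-style conversation: regex pattern matching with
# canned response templates and a default prompt. Rules are invented for
# illustration, not Weizenbaum's original script.
import re

RULES = [
    (re.compile(r"\bI am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.*)", re.I), "What makes you feel {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."  # fallback when nothing matches

print(respond("I am worried about my project"))
print(respond("My manager never listens."))
```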

When an application is integrated with NLP capabilities, it “understands” and processes human language. This can augment chatbots and virtual assistants, facilitating interactions with customers, employees, and others. Chatbots and virtual assistants powered by AI-driven RPA can engage in natural language conversations, answer queries, and provide assistance, enhancing customer service and user experience.

AI-powered chatbots and virtual assistants have come a long way and are just starting to revolutionize the way businesses interact with their customers. With instant responses to customer queries, personalized recommendations, and routine task handling, they can deliver a relatively seamless customer experience.

The process robots are coming

An area I have dipped in and out of at various points in my career since Y2K is robotic process automation (RPA), the goal of which is to automate mundane and repetitive tasks: tasks that were previously low-value and time-consuming for employees. Early RPAs were very prescriptive and simplistically programmed, but today they are more adaptive. One of the earliest examples of RPA-like automation can be traced back to the introduction of screen-scraping software in the 1990s.

AI-driven RPA goes beyond basic task automation by incorporating so-called cognitive capabilities. With machine learning (ML) algorithms, RPA systems can analyze vast amounts of data, recognize patterns, and make decisions based on historical and real-time information. This “cognitive” automation allows businesses to automate complex tasks that require decision-making, such as data analysis, customer service interactions, and fraud detection.

In fraud detection, risk management, and algorithmic trading, machine learning algorithms analyze financial data in real time, identifying unusual patterns and potential bad-actor activity, thereby enhancing security and minimizing financial losses.

RPA integrated with AI can excel at processing unstructured data, such as invoices, forms, and emails. Through Optical Character Recognition (OCR) and machine learning, such systems can extract relevant information from documents more accurately than people, and faster. This capability streamlines document-based processes, such as invoice processing and claims management, reducing manual errors and improving overall document-handling efficiency.
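As a hedged sketch of the OCR step, the snippet below uses pytesseract (a Python wrapper for the Tesseract engine) to pull text from a scanned invoice and then applies simple regexes to lift out two fields. The file name and the regex patterns are assumptions for illustration; production pipelines use far more robust extraction and validation.

```python
# A hedged sketch of OCR-based document capture using pytesseract. The input
# file name and regex patterns are illustrative assumptions.
import re
from PIL import Image
import pytesseract

# OCR the scanned page into plain text (hypothetical input file).
text = pytesseract.image_to_string(Image.open("invoice_scan.png"))

# Naively lift two fields out of the recognized text.
invoice_no = re.search(r"Invoice\s*(?:No\.?|#)\s*([A-Z0-9-]+)", text, re.I)
total = re.search(r"Total\s*[:$]?\s*([\d,]+\.\d{2})", text, re.I)

print("Invoice:", invoice_no.group(1) if invoice_no else "not found")
print("Total:", total.group(1) if total else "not found")
```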

Automation liberates human resources, allowing employees to focus on more strategic and creative aspects of their roles; applications such as data entry, invoice processing, and report generation are now handled efficiently by AI-driven systems, leading to higher productivity and reduced operational costs.

Smart reporting

AI has been transforming data analysis for a while now by enabling businesses to glean improved insights from vast datasets.

Machine learning algorithms analyze historical data, identify patterns, and predict future trends with remarkable accuracy. Such predictive analytics can help a business make better-informed decisions, optimize inventory practices, forecast customer demand more precisely, and enhance overall operational efficiency.
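As a minimal, assumption-laden sketch of that predictive idea, the snippet below fits a simple regression to twelve weeks of invented demand figures and extrapolates four weeks ahead. Real demand forecasting would use richer features such as seasonality, promotions and pricing.

```python
# A minimal sketch of trend prediction: fit a simple regression to invented
# weekly demand and extrapolate. Real forecasting would add seasonality,
# promotions, pricing and more.
import numpy as np
from sklearn.linear_model import LinearRegression

weeks = np.arange(1, 13).reshape(-1, 1)            # 12 weeks of history
units = np.array([120, 126, 131, 128, 140, 145,    # observed weekly demand
                  150, 149, 158, 163, 170, 174])

model = LinearRegression().fit(weeks, units)
future = np.arange(13, 17).reshape(-1, 1)
print(model.predict(future).round())               # naive 4-week forecast
```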

AI-driven applications optimizing supply chain operations look to historical sales data, market trends, and weather patterns, for example, to predict demand more accurately.

This multi-threaded predictive capability aids businesses in avoiding stock-outs, reducing inventory holdings, and minimizing waste. AI-powered algorithms are also used to optimize route planning and delivery scheduling, which can all improve the effectiveness and cost profile of logistics operations.

By combining data analytics with AI, businesses automate their data analysis and generate more precise actionable insights. AI-driven analytics systems process vast datasets, identify trends, and provide answers in near real-time. Decision-makers now have timely and accurate information, enabling them to make better informed choices to drive business growth and innovation.

More business focus areas

The examples cited above are probably the areas where I have most commonly seen benefits from AI in a business setting, but there are almost a dozen more that can be considered.

AI algorithms that analyze customer behavior and preferences enable businesses to create highly targeted marketing campaigns. These campaigns might include personalized recommendations, content, and advertisements to enhance customer engagement and increase conversion rates.

Healthcare professionals have started to consider the use of AI in diagnosing diseases, analyzing medical images, and predicting patient outcomes. Machine learning algorithms can process vast amounts of medical data, leading to more accurate diagnoses and personalized treatment plans.

AI can analyse medical images, such as X-rays, CT scans, MRIs, lab slides and mammograms, at speeds much faster than human medical professionals. Algorithms can quickly identify patterns, anomalies, and potential areas of concern.

Subtle changes in medical images that might not be immediately apparent to human eyes are more easily identified by AI. This early detection can lead to the diagnosis of diseases at their nascent stages, improving the chances of successful treatment and recovery. This is particularly crucial in diseases like cancer, where early detection significantly improves patient outcomes, and in critical cases rapid analysis can be life-saving.

Intelligent tutoring and educational systems adapt to learner styles, providing customized educational content and feedback. AI also aids in automating the administrative tasks for educational institutions, improving efficiency.

In manufacturing and operations, the use of AI can assist businesses in anticipating equipment failures, reducing downtime and maintenance costs.

In talent acquisition, automating resume screening, candidate matching, and even initial interviews can accelerate candidate evaluation. With AI-powered chatbots handling routine HR inquiries, HR professionals can focus on more strategic, higher-value tasks like employee engagement and development.

AI is employed in environmental monitoring and conservation efforts to predict natural disasters, monitor pollution levels, and aid in wildlife conservation, contributing to more effective environmental preservation strategies.

Legal assistance tools that are AI-powered can help legal professionals in document review, contract analysis, and legal research. Natural Language Processing algorithms enable these tools to process and analyze large volumes of legal documents efficiently, improving accuracy and saving time for lawyers and paralegals.

Artificial Intelligence (AI) has become a transformative force revolutionizing many aspects of business operations, from customer service to data analytics.

AI-driven technologies have significantly enhanced efficiency, improved customer experiences, and driven innovation across diverse sectors.

However, the rapid integration of AI in business processes has raised concerns regarding bias, transparency, and the ability of AI systems to comprehend human-like conversations, especially in the context of Large Language Models (LLMs).

The traditional Turing Test, once a benchmark for assessing machine intelligence, now faces challenges due to the complex behavior of LLMs, prompting the need for new evaluation methods.

Despite these challenges, AI-powered chatbots and virtual assistants have reshaped customer interactions, providing instant responses and personalized recommendations, thereby ensuring seamless customer experiences. AI-driven Robotic Process Automation (RPA) has automated mundane tasks, liberating human resources and enabling employees to focus on strategic and creative aspects of their roles.

AI has revolutionized data analysis, supply chain optimization, healthcare diagnostics, education, talent acquisition, environmental monitoring, and legal assistance, showcasing its vast potential in diverse business focus areas.

As businesses continue to harness the power of AI, it is imperative to address the ethical concerns and develop innovative solutions, ensuring that AI remains a valuable asset in shaping the future of business operations.


Author: Clinton Jones

Nurturing a Culture of Empowerment for Innovation in Business Leadership


In my view, effective leadership involves empowering employees and fostering a culture of innovation amongst teams. This is particularly important in industry sectors where the route to an outcome is imprecise, whether that outcome is specific or only vaguely specified. These may include technology and software development, entertainment and media, fashion and design, advertising and marketing, renewable energy, and sustainability.

The concept of autonomous yet thoughtful decision-making is a powerful leadership strategy that helps to drive desired productive outcomes. Many may understand the significance of autonomy and empowerment but fail to acknowledge its importance across different business settings.

This often means emphasizing the need for a shift away from very traditional command and control (C&C) models. C&C is prevalent in more traditional and bureaucratic organizations; it has often been associated with industries where standardization, efficiency, and compliance are crucial, such as the military, manufacturing, and certain government sectors.

Some of the key characteristics of C&C include centralized decision-making, where decision-making power is concentrated in the hands of those at the top. This approach often leaves little room for input from employees at lower levels. There’s a chain of command, and decisions are typically passed down it.

A second common characteristic is the strictness of the hierarchy. Organizationally the structure is typically like a pyramid with clearly delineated lines of authority and control. Each level reports to the one above it, and instructions flow from the top down. There may be an emphasis on discipline and control to ensure that employees adhere to prescribed and somewhat predictable processes in order to meet performance expectations.

C&C is often characterized by rigid adherence to rules, procedures, and protocols. Employees are expected to follow specific guidelines as prescribed, without deviation. In line with the pyramid, communication follows formal channels, such as through managers and supervisors, and information or insights may be limited as communication flows up and down the hierarchy. Everyone is assigned specific roles and responsibilities, and tasks are clearly defined. Employees have little autonomy to make decisions or exercise creativity; the focus is on carrying out assigned tasks as directed.

While this leadership model can be effective in certain circumstances as I previously described, it is often criticized for its limitations in an ambiguous, dynamic, and fluid business environment.

In industries that require adaptability, creativity, and innovation, like the tech sector, the command and control model, in my experience, actually hinders employee engagement, limits the flow of ideas, and inhibits organizational agility. I am more in favor of participative and collaborative leadership that empowers employees and fosters a culture of innovation and a genuine desire for ownership and accountability.

Instead, I advocate a more informal, relaxed, and collaborative leadership approach that encourages creativity and innovation, where the leadership team functions as player-coaches and builds genuine consensus and collective agreement on decisions big and small through dialog and negotiation.

Growth through empowerment

If you want growth, then empowering employees goes beyond simple delegation; it requires trusting individuals to make informed decisions and providing them with the necessary resources and autonomy to act. In so doing you foster a sense of ownership and accountability within the workforce which then leads to higher job satisfaction and improved overall productivity.

The core of successful empowerment lies in striking the right balance between autonomy and thoughtful decision-making. You should want autonomous yet well-considered decision-making from your employees. Autonomy allows teams and individuals to leverage their expertise and creativity to address complex challenges effectively. However, it must be complemented with considered decision-making, where employees gather information, seek advice, and analyze potential outcomes before acting. Remember you’re paying these people for their expertise and to perform a particular job. If you’re only interested in barking instructions at them then you may as well just hire unskilled people with no particular specialisms or experience.

Tailoring empowerment models to your business setting is important, since the benefits of empowerment and autonomy are not universal and don’t necessarily manifest in all cultures or work settings. The application should therefore be tailored to suit your specific business context. There are a few different implementation models to consider: task-based, team-based, and individualized.

Task-based empowerment is typical for industries with routine tasks, where it can streamline processes and enhance productivity. By granting employees authority over specific responsibilities, business leaders enable them to make decisions related to their assigned tasks, boosting efficiency. Employees can, for example, rotate and resequence their tasks according to their preferences and what they observe works best, without disrupting efficiency and effectiveness.

Team-based empowerment is most appropriate in dynamic environments which ultimately benefit from improved collaboration and collective decision-making and where these activities take center stage. By allowing teams to pool diverse perspectives and expertise, leaders have the potential to tap into the opportunities afforded by collective innovation.

In roles requiring specialized skills, individual-based empowerment can be highly effective: leaders empower subject matter experts to make decisions in their areas of proficiency, fostering innovation and excellence in technology-driven fields.

C&C, with its centralized decision-making and strict protocols, works reasonably well in highly regulated industries, but it stifles creativity and limits adaptability in technology development. Employees may feel restricted, resulting in decreased motivation, innovation, and engagement.

Conversely, the informal and relaxed leadership style promotes open communication, trust, and collaboration. By empowering employees to make autonomous decisions, leadership fosters an essential culture of innovation and agility. This approach is particularly effective in software development and technology-driven operations, where creativity thrives in a flexible environment.

Getting the best out of teams and individuals

Getting the best out of these two quite different approaches still requires you to set clear objectives with agreed, measurable outcomes. Well-defined goals using the SMART framework (Specific, Measurable, Achievable, Relevant, Time-bound) help individuals self-manage their work efficiently.


Another facet is effective time management: autonomy allows individuals to manage their own time, but discipline is essential to ensure that time is used well. Encouraging employees to prioritize tasks, set deadlines, and avoid distractions maintains their productivity.

Autonomous employees must also be accountable for their work. Encouraging ownership and transparent progress reporting foster a sense of responsibility.

Self-Management in Project and Program Management

In technology development, adopting an agile methodology enhances self-management. Empowered teams self-organize, collaborate, and make quick decisions to adapt to changing requirements effectively.

Leaders can further empower teams by providing autonomy in decision-making. Open communication and input from team members drive a self-managed and collaborative environment.

Project management itself involves ongoing learning and improvement, allowing employees to reflect on progress and take initiative. These empowering approaches support positive change and have a greater likelihood of driving success.

As already suggested, empowerment also requires balancing discipline with flexibility. Research suggests that innovation thrives in more flexible environments. Leaders must therefore be open to diverse methods and ideas, trusting teams to find effective solutions. Open channels of communication facilitate not only bidirectional trust but also employee self-management, leading to continuous improvement.

A few words of caution

Sometimes, in our earnest efforts to empower others and provide autonomy, we may inadvertently deceive ourselves into believing that we have relinquished command and control.

Despite statements of intent, the empowerment we claim to have granted can be a self-deceiving illusion. We might unknowingly perpetuate micro-management overreach by behaving in exactly the opposite way to what we say we are thinking. This can occur when we continuously bombard teams with questions, second-guess their independent decisions, and frequently challenge their judgment. Individuals and teams occasionally need to fail in order to learn. While our intention may be to offer support and ensure success, our actions may inadvertently stifle creativity and autonomy.

True empowerment necessitates trust, and allowing individuals the space to take ownership of their work is critical. Constantly questioning decisions sends mixed signals: who is actually making the decisions? Undermining the confidence of team leads and team members impedes their ability to innovate freely. To genuinely empower others, we must genuinely let go of control, offer guidance only when sought, celebrate successes, recognize missteps, and offer encouragement, coaching, and reassurance.

Occasional mistakes are part of the learning process. By fostering a culture of trust and granting autonomy, we can break free from the C&C mindset, unleashing the full potential of our teams to drive creativity and achieve remarkable results.

Nurturing a culture of empowerment is essential for fostering innovation in business leadership. By tailoring empowerment models to the specifics of the business setting and adopting informal leadership styles, leaders can cultivate creativity and adaptability, particularly in software development and technology-driven operations. Encouraging discipline and self-management in tasking and project and program management enables employees to thrive in an environment that values autonomy while maintaining focus and efficiency. Striking the right balance between discipline and flexibility empowers teams to innovate, drive success, and contribute to sustainable growth.



Author: Clinton Jones

The “on the side” hustlers


In recent years, the trend of remote work has gained popularity, especially in the technology industry.

With modern technology, certain kinds of employees are no longer required to be physically present in an office to perform their job functions.

This flexibility is praised by many for its benefits, but it also comes with a set of challenges, relating not just to the intermingling of work and home life but also to the shifting of real estate and other costs typically associated with office-bound workers.

Benefits particularly lauded by those working remotely are the avoidance of long commutes and, of course, of health and safety concerns in times of pandemic.

One particular challenge is the potential for employees to engage in “moonlighting”, which can be a significant concern for certain employers, especially in the technology industry. For the uninitiated, moonlighting refers to the practice of working a second job in addition to one’s primary job.

The origin of “moonlighting” dates back to the early 1800s, when it was commonly used to describe the practice of working at night by the light of the moon.

In the early 1900s, the term began to be used more broadly to refer to working a second job in addition to the primary job. Moonlighting became increasingly prevalent in the United States during the Second World War when workers were encouraged to take on additional jobs to support the war effort.

After the war, moonlighting continued to be a common practice, especially among blue-collar workers who were looking to earn extra income to support their families.

In the 1960s and 70s, moonlighting gained even more widespread acceptance as a way for employees to pursue their passions or supplement their income. However, during this time, moonlighting also became a great source of controversy, with some employers expressing concerns about conflicts of interest and decreased productivity.

With the rise of the “gig economy” in recent years, moonlighting has become even more prevalent, especially in industries like technology, where employees have in-demand skills that can be used for side projects and freelance work.

For employees, it’s a way to earn extra income or pursue their passions. For employers, it can create a conflict of interest and pose a significant perceived and actual risk to their business. In the tech industry, where employees have access to sensitive and confidential information for example, moonlighting poses the risk of unintentional or deliberate data breaches and this in turn jeopardizes a company’s reputation and introduces avoidable potential security risks.

An employee working on a project for a competing company may unintentionally share confidential information, leading to a data breach. Policies on the personal use of equipment often mitigate this, but confidential details might also slip out in conversations and other work-related circumstances.

There are several reasons why tech workers may be particularly more prone to moonlighting. Firstly, the nature of their work often involves flexible hours and remote working arrangements, which can make it easier for them to take on additional work.

Secondly, there is a relative shortage of competent tech workers, who are in high demand at a given price point, and the skills they possess can be valuable to other companies.

From the employer’s perspective, it could be argued that there is the potential for decreased productivity, missed deadlines, and poorer quality of work as well as potential legal and previously cited reputational risks.

To mitigate the risks associated with moonlighting, employers often take contractual, policy, and control steps. Firstly, they may include moonlighting clauses in employee contracts that prohibit employees from taking on additional work without prior approval.

Monitoring and tracking systems may also be considered on work-assigned equipment, like keyboard and screen monitors, to ensure that employees are not engaging in unauthorized moonlighting or behaviors. These systems not only monitor employee activity but also flag suspicious behavior, such as accessing unauthorized websites or sharing confidential information.

It’s a difficult balancing act: employees value the flexibility and freedom to pursue side projects, and a blanket ban on moonlighting can lead to increased staff turnover and decreased job satisfaction.

Employers can consider a more nuanced approach, such as allowing employees to engage in moonlighting as long as it doesn’t create a conflict of interest or compromise the company’s security, though exactly how this is measured may be difficult to establish.

Prohibiting moonlighting altogether may seem like an easy solution but can also create some interesting disbenefits.

Employees barred from pursuing side projects or freelance work may feel stifled and demotivated in their primary job. This can result in decreased job satisfaction and productivity, ultimately harming the quality of their work and their commitment to the business.

Prohibition can also lead to talent loss as employees not allowed to pursue side projects or freelance work may be more likely to seek employment elsewhere, where they have more flexibility and autonomy. Employers who prohibit moonlighting may find themselves struggling to attract and retain top talent as a result of this restrictiveness.

Side projects and freelance work can provide valuable indirect learning experiences that help employees develop new skills they can bring back to their primary job.

The prohibition of moonlighting can also create legal risks for employers. In some states and countries, laws protect employee rights to engage in lawful off-duty activities, which may include moonlighting. Employers who prohibit moonlighting without a clear and compelling reason may be at risk of legal action from employees.


Author: Uli Lokshin

Playing at work


It has been a while since a colleague raised their eyebrows at me and my light-hearted disposition to work. But make no mistake: work itself is serious stuff, yet it doesn’t have to be boring and dull if you tackle it with the right mindset.

The idea that work should be considered as play might seem counterintuitive at first glance. We typically associate play with leisure and work with productivity and output. However, there are many compelling reasons to consider work as play, and doing so can lead to a more enjoyable and fulfilling work experience.

When we think of play, we think of activities that we enjoy and look forward to doing; the same can be true for work if we approach it with a positive mindset. By finding ways to make work enjoyable and engaging, we can change our perception of work from something that we have to do to something that we really want to do.

Albert Einstein is often quoted as saying, “Play is the highest form of research.” This statement suggests that Einstein believed that play was not only enjoyable but also an important part of the learning and discovery process. Einstein also believed that work and play were not mutually exclusive, and that one could approach work with a playful and creative mindset.

In addition to his famous quote, Einstein also wrote about the importance of play in his personal life. He enjoyed playing the violin, sailing, and hiking, and he often credited his hobbies with helping him to think creatively and come up with new ideas. Einstein believed that play was a way to stimulate the imagination and that it could lead to new insights and discoveries.

Play is also often associated with creativity and innovation; when we engage in play, we are more likely to experiment and try new things, which can lead to new ideas and insights. Similarly, when we approach work with a playful mindset, we are more likely to come up with creative solutions to problems and find new ways to approach complex and simple tasks.

Play fosters intrinsic motivation such that when we engage in play, we do it because we enjoy it, not because we expect to get a reward or avoid punishment. Similarly, when we approach work as play, we are more likely to be intrinsically motivated to do our best possible work. We are not just working for a paycheck or promotion, but because we enjoy the work itself.

Many consider play a social activity, one that can help build relationships and foster a sense of community. Similarly, when we approach work with a playful mindset, we are more likely to build positive relationships with our colleagues and enjoy working together. This can lead to a more supportive and collaborative work environment.

Play is a great way to reduce stress and promote relaxation, many a keen sportsman will take to the golf course or the outdoors to destress. Similarly, when we approach work with a playful mindset, we are more likely to feel less stressed and more relaxed. This can lead to a more positive work environment and improved overall well-being.

There are many reasons why work should be considered play.

By finding ways to make work enjoyable and engaging, we can approach work with a positive mindset and enjoy the work itself.

Approaching work as play can foster creativity, build relationships, reduce stress, and promote intrinsic motivation.

As we spend a significant portion of our lives at work, it is essential to find ways to make work more enjoyable and fulfilling. By considering work as play, we can achieve this and create a more positive and productive work experience.


Author: Clinton Jones

You can’t work from home


While the rise of remote work and hybrid working models have made it possible for many employees to work from home, there are still some types of work that are not suitable for remote work.

Some jobs require specific equipment or tools that are not easily accessible outside of the workplace, while others require close collaboration and communication with colleagues.

Some jobs, such as those in manufacturing or laboratory work, require specialized equipment that cannot be easily transported or replicated in a home environment.

For example, a chemical engineer who works in a lab may require access to specific materials and tools that are not available at home, making it impossible to work remotely.

Some jobs require a physical presence, such as those in the hospitality or healthcare industries. For example, a nurse cannot provide care to patients from home, and a restaurant worker cannot prepare and serve food remotely.

Some jobs require frequent face-to-face interaction, such as those in sales or customer service. For example, a salesperson may need to visit clients in person, and a customer service representative may need to speak with customers directly to address their concerns.

Some jobs require close collaboration and communication with colleagues, such as those in research and development or project management. For example, a team of engineers working on a new product may need to work together in person to share ideas, troubleshoot problems, and make decisions.

Jobs with client-facing interactions, such as those in the legal or financial industries, may also preclude working from home. For example, a lawyer may need to meet with clients in person to discuss legal matters, and a financial advisor may need to provide advice to clients face-to-face.

There is also a class of jobs that require on-site supervision, such as those in construction or manufacturing. For example, a construction worker needs to be supervised by a manager who can oversee the work and ensure that it is done safely and effectively.

In short, while remote work and hybrid working models have made it possible for many employees to work from home, some types of work remain unsuitable for remote work.

Jobs that require specialized equipment, physical presence, frequent face-to-face interaction, team collaboration, client-facing interaction, and on-site supervision are among the types of work that are not suitable for working from home.

Organizations need to assess the requirements of each job role and determine whether it can be done remotely, on-site, or in a hybrid model to ensure that employees can work effectively and efficiently.


Author: Flaminio