Have legacy systems failed us?


I have been working on and off with “legacy” systems for decades. The exact definition of such a thing may come across as vague and ill-defined, but that’s OK. The next generations of software developers, data engineers, data scientists and, in fact, anyone working in tech will present you with this idea, and you’ll have to work out how real their perspective is.

For any twenty- or thirty-something in tech these days, anything created before they were born or before they started their career is likely labeled legacy. It’s a fair perspective. Any system has successors. Yet if something ‘old’ is still clicking and whirring in the background as a key piece of technology holding business data together, it might reasonably be considered part of some legacy.

The term is loaded though.

For those who haven’t quite retired yet (myself included), legacy connotes some sort of inflexible, unbendable technology that cannot be modernized or made more contemporary. In some industries or contexts, though, legacy implies heritage, endurance and resilience. So which is it? Both views have their merits, but they carry different tonalities: one very negative, the other almost revered.

In my early working years, I had the pleasure of seeing the rise of the PC, at a time when upstart technologies were trying to bring personal computing into the home and the workplace. The idea of computing at home was for hobbyists and was dismissed by the likes of IBM. Computing at work often meant being bound to a desk with a 30 kg beige-and-brown CRT “green screen” dumb terminal, paired with a keyboard that often weighed as much as, or more than, the heaviest of modern-day laptops.

One of the main characteristics of these systems was that they were pretty consistent in the way they operated. Yes, they were limited, especially in terms of overall functionality, but for the most part the journeys were constrained and the results and behaviours were consistent. Changes to these systems moved glacially. Even quarterly software updates, for example, would have been something of a novelty. Organizations with in-house software development teams laughably took the gestation period of a human baby to get pretty much anything done. Even bugs, once detected and root-cause analysed, would often take months to remediate, not because they were complex to solve, but because of the prevailing approaches to software change and update.

I suppose that in some industries the technology was a bit more dynamic, but the friends and colleagues I worked with in other industry sectors certainly didn’t communicate that there was a high velocity of change in these systems. Many of them were mainframe- or minicomputer-based, often serviced by one or more of the big tech brands that dominated in those days.

A characteristic of modern systems and modern practices, then, is probably encapsulated in the idea of handling greater complexity: dealing with higher volumes of data and the need for greater agility. The need for integrated solutions, for example, has never pressed harder than it does today. We need, and in fact demand, interconnectedness, and we need to be able to trace numerous golden threads of system interoperability and application interdependence at the data level, at unprecedented scale.

In the past we could get away with manually curating all kinds of things, including descriptions of what we had and where it was, but the volumes, complexities and dependencies of today’s systems make doing these things manually seem futile and fraught with the risk of being incomplete and quickly out of date. Automation is now more than a buzzword; it’s table stakes, and many will rightly assume that automation has already been considered in the design.

Legacy Systems and Their Limitations

As anyone who regularly uses office applications will attest, just a cursory look at your presentations, documents, spreadsheets, folders and shared content will demonstrate how quickly things can get out of hand.

Unless you are particularly obsessive about organization, you likely have just one steaming heap of documents that you hope your operating system or cloud provider can adequately index for a random search.

If not, you’re bound to your naming conventions (if you have any), recency timestamps or some other criteria. In some respects, even these aspects make all this smell suspiciously like a “legacy problem”.

The growth of interest and focus in modern data management practices means that we need to consider how business and operational demands are reshaping the future of data governance.

Despite all this, I still don’t have a good definition of what a “legacy system” really is. The general perspective is that it is something that predates what you work with on a daily basis, and that seems as good a definition as any. We have to acknowledge, though, that legacy systems remain entrenched as the backbone of a great many organizations’ data management strategies. The technology may have advanced and data volumes may have surged, but many legacy systems endure, despite their inadequacies for contemporary business needs.

Inability to Handle Modern Data Complexity

One of the most significant challenges posed by legacy data systems is their inability to cope with the volumes and inherent complexities of contemporary data. Pick your favourite system and consider how well it handles those documents I described earlier, either as documents or as links to those documents in some cloud repository.

Many of the solutions that people think of as legacy were designed more than a generation ago, when technology choices were more limited, there was less globalization, and we were still weaning ourselves off paper-based, manually routed content and data. Often the data itself was conveniently structured with a prescribed meta model and stored in relational databases. These days, businesses face a deluge of new data types—structured, semi-structured, and unstructured—emanating from an ever-growing number of sources including social media, IoT, and applications.

Legacy transaction and master data systems now have to manage tens or hundreds of millions of records spread across function- and form-specific siloed systems. This fragmentation in turn leads to inconsistencies in the data’s content, quality and reliability. All this makes it difficult for organizations to know what to keep and what to discard, what to pay attention to and what to ignore, what to use for action and what to treat as merely supplementary.

If there is enough metadata to describe all these systems, we may be lucky enough to index it and make it findable, assuming we know what to look for. Full or even partial adoption of the hybrid cloud has simply perpetuated the distributed-silos problem. Now, instead of discrete applications or departments acting as data fiefdoms, we have the triple threat of unindexed data in legacy systems, data in local system stores, and data in cloud systems. Any technical or non-technical user understandably finds it challenging to work out what they want and what they should care about, because there are very few fully integrated, seamless platforms that describe everything in a logical and accessible way.
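
To make this concrete, here is a minimal, purely illustrative sketch in Python of the kind of unified metadata catalog that would have to span all three tiers; the asset names, locations and tags are invented for the example, not drawn from any real product.

    from dataclasses import dataclass

    @dataclass
    class DataAsset:
        name: str
        location: str   # illustrative tiers: "legacy", "local" or "cloud"
        tags: list

    class MetadataCatalog:
        """One index spanning all three tiers of data silos."""
        def __init__(self):
            self._assets = []

        def register(self, asset):
            self._assets.append(asset)

        def search(self, keyword):
            kw = keyword.lower()
            return [a for a in self._assets
                    if kw in a.name.lower() or any(kw in t.lower() for t in a.tags)]

    catalog = MetadataCatalog()
    catalog.register(DataAsset("customer_master", "legacy", ["crm", "pii"]))
    catalog.register(DataAsset("orders_2024.csv", "local", ["sales"]))
    catalog.register(DataAsset("clickstream_events", "cloud", ["web", "behaviour"]))
    print([a.location for a in catalog.search("sales")])   # ['local']

Even a toy like this exposes the hard part: someone still has to register every asset, which is exactly where manual curation breaks down at scale.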

Rigidity and Lack of Agility

Legacy and traditional systems are also characterized by an inherent rigidity. Implementing or running them often involves elongated processes that can take months or even years, and daily operations demand regimented discipline. New initiatives hooked to legacy applications are typically characterized by high costs and high failure rates, owing to their inherent complexity and the need for extensive customization to integrate with more contemporary technologies.

For example, the prominent ERP software company SAP announced in February 2020 that it would provide mainstream maintenance for core applications of SAP Business Suite 7 (ECC) until the end of 2027.

But according to The Register, as recently as June 2024, representatives of DACH customers suggested that they don’t believe they will even meet the 2030 cut-off, when extended support ends.

Research by DSAG, which represents SAP customers in the DACH region, found that 68% still use the “legacy” platform, with 22% suggesting that SAP ECC/Business Suite influenced their SAP investment strategy for 2024. Many are reluctant to upgrade because they have invested so heavily in customizations. All this makes for some tough calls.

The rigidity of the legacy system, compounded by customers’ reticence to upgrade, presents a challenge in understanding just how responsive any business can be to changing business needs. SAP wants you to use shinier, glossier versions of its technology in order to maintain a good relationship with you and to ensure it can continue adequately supporting your business into the future, but if you won’t upgrade, what is it to do?

Modern digital economies expect businesses to pivot quickly in response to market trends or customer demands, and being stuck on legacy solutions may be holding them back. Companies running on legacy may need significant time and resources to adapt or scale to meet new expectations. Such inflexibility will likely hinder innovation and limit one’s ability to compete effectively.

Unification is a possible answer

If you recognise and acknowledge these limitations, then you’re likely already shifting away from the traditional siloed approaches to data management towards more unified platforms.

Integrated solutions like SAP provide a holistic view of organizational data, and they have been paramount for years. But even here, not all the data is held in these gigantic systems. SAP would segment the platforms by business process: Order to Cash, Procure to Pay, Hire to Retire and so on. But businesses are multidimensional, and business processes aren’t necessarily the way the business thinks about its data.

A multinational running on SAP may think about its data and systems in a very regional fashion, or by a specific industry segment like B2C or B2B; it may even fragment further depending on how it is set up. A channel-focused business, for example, is not unusual: eCommerce vs retail stores, D2C and so on. The number of combinations and permutations is seemingly limitless, yet each of these areas is likely just another data silo.

A break with data silos fosters cross-divisional collaboration, allowing the business to enhance decision-making processes and improve overall operational efficiency. ERP doesn’t necessarily promote this kind of thinking. Such a shift is not just a reaction to the shortcomings of legacy systems; it is also driven by a broader trend towards digital transformation.

In commercial banking, for example, thinking through the needs and wants of the different regional representations, the in-market segments and then the portfolio partitions means that some data is common and some data is not. Most importantly, all of the data likely needs to be in one unifying repository and definitely needs to be handled in a consistent, aligned, compliant and unified way. Through the lens of risk and compliance, everyone’s behaviours and data are viewed in the same way, irrespective of where their data is held and who or what it relates to.

Incorporating modern capabilities like artificial intelligence (AI), machine learning (ML), and big data analytics requires solutions that can support these initiatives effectively, and it is a popular topic of discussion. You can pooh-pooh AI and ML as fads with relatively limited real applicability and value right now, but like yesteryear’s personal computers and mobile phones, these kinds of things have an insidious way of permeating our daily lives in ways we may never have considered, and before we know it, we are hooked on them as essential capabilities for getting through the day.

Lessons in retail

In modern retail in the developed world, for example, every product has a barcode and every barcode is attached to a master data record entry that is tied to a cost and pricing profile.

When you check out at the grocery store, the barcode is a key to the record in the point-of-sale system and pricing engines, and that’s the price you see on the checkout receipt. Just 25 years ago, stores were still using pricing “guns” to put stickers on merchandise, something that persists in many developing countries to this day. You might laugh, but in times of high inflation it was not uncommon for consumers to scratch about on supermarket shelves looking for older stock bearing the old price.
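
As a toy illustration of that lookup (real point-of-sale systems are vastly more elaborate, and the barcodes and prices below are made up), the mechanism is essentially a key-to-record resolution in Python:

    # Illustrative master data only; barcodes, descriptions and prices are invented.
    product_master = {
        "5012345678900": {"description": "Olive oil 500ml", "price": 6.49},
        "4098765432109": {"description": "Wholemeal bread", "price": 1.89},
    }

    def price_at_checkout(barcode):
        """The scanner's beep resolves the barcode to a master data record."""
        record = product_master.get(barcode)
        if record is None:
            # The familiar "unknown item" stall at the till.
            raise KeyError(f"no master data entry for barcode {barcode}")
        return record["price"]

    print(price_at_checkout("5012345678900"))  # 6.49, as printed on the receipt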

Sticker-based pricing may still prevail in places, but the checkout process is now often cashless, automatically reconciling the sale, inventory and, above all, pricing, all with the beep of a barcode scanner.

As these technologies become even more affordable and accessible to businesses of all sizes, including the most cost-conscious, and as their use in buying, handling, merchandising and selling grows, the idea of individually priced merchandise will probably disappear altogether. We’ll still be frustrated by the missing barcode entry in the database at checkout, or by the grocery item that is sold by weight and needs its own personal pricing barcode because the checkout doesn’t have a scale. This then becomes a legacy problem in itself, where we straddle the old way of doing things and the new.

In much the same way, transitioning from legacy to something more contemporary doesn’t mean an organization has to completely abandon heritage systems, but it does mean that continuing to retain, maintain and extend existing systems should be continuously evaluated. The point here is that once these systems move beyond their “best-by” date, an organization encumbered by them should already have a migration, transition or displacement solution in mind or underway.

This would typically be covered by some sort of digital transformation initiative.

Modern Solutions and Approaches

In stark contrast to legacy systems, modern solutions are typically designed with flexibility and scalability in mind.

One could argue that perhaps there’s too much flexibility and scale sometimes, but modern solutions do take advantage of contemporary advanced technologies, which means they potentially secure a bit more of a resiliency lifeline.

A lifeline in the sense that you will continue to have software developers available to work on the system, users who actively use it because of its more contemporary look and feel, and a few more serviceable versions before it is surpassed by something newer and shinier, at which point it too becomes classified as “legacy”.

Cloud-Native Solutions

One of the most significant advancements in data systems these days is the prevalence of cloud-native solutions. Not solutions ported to the cloud, but solutions built from the ground up using the cloud-first design paradigm. I make this distinction because so many cloud offerings are nothing more than ‘moved’ technologies.

Cloud-native systems may use a microservices architecture, a design approach allowing individual components to be developed, deployed, and scaled independently. They may also make use of on-demand “serverless” technologies. By taking advantage of the modularity afforded by microservices, organizations can adapt their data management capabilities relatively quickly in response to changing business requirements, whether through technology switch-outs or incremental additions. The serverless element means compute is consumed on demand, which in theory means lower operational cost and less waste from overprovisioned, idle infrastructure.
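
As a rough sketch of the pay-per-invocation idea, here is a minimal event handler in Python written in the style of an AWS Lambda handler; the event shape and the record-cleaning logic are assumptions made purely for illustration:

    import json

    def handler(event, context):
        """Runs only when an event arrives; nothing is provisioned while idle."""
        records = event.get("records", [])
        cleaned = [r.strip().lower() for r in records if isinstance(r, str)]
        return {"statusCode": 200, "body": json.dumps({"processed": len(cleaned)})}

    # Local smoke test; in production the cloud platform invokes handler() on demand.
    print(handler({"records": ["  Alpha", "Beta "]}, None))

Because the function holds no state and provisions nothing, the provider can bill per invocation and scale instances up or down behind the scenes.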

Many cloud-native data management solutions can also more easily harness artificial intelligence and machine learning technologies to enhance data processing and analysis. These capabilities facilitate real-time data integration from diverse sources, allowing businesses to maintain accurate and up-to-date records with less effort.

Instead of being bound to geographies and constraining hardware profiles, users only need an internet connection and suitable software infrastructure to securely authenticate. The technology supporting the compute can be switched out in a seemingly limitless number of combinations, according to the capabilities and inventory of the hosting providers’ offerings.

Scalability is one of the most pressing concerns associated with legacy systems, and one that these contemporary technologies seem to have largely overcome. Cloud-native solutions purport to handle growing data volumes with almost no limits.

A growing data footprint also presses on organizations that continue to generate vast amounts of data daily. The modern data solution promises to scale horizontally, adding more resources as needed, without impairment and with minimal disruption.

The concept of the data mesh is also growing in popularity, gaining traction as an alternative to traditional centralized data management frameworks. On face value at least, this seems not dissimilar to the debate surrounding all-in-one versus best-of-breed solutions in the world of data applications. Both debates revolve around fundamental questions about how organizations should structure their data management practices to best meet their needs.

Data Mesh promotes a decentralized approach to data management by treating individual business domains as autonomous entities responsible for managing their own data as products. This domain-oriented strategy empowers teams within an organization to take ownership of their respective datasets while ensuring that they adhere to standardized governance practices. By decentralizing data ownership, organizations achieve greater agility and responsiveness in managing their information assets.

The concept also emphasizes collaboration between teams through shared standards and protocols for data interoperability. This collaborative approach fosters a culture of accountability while enabling faster decision-making driven by real-time insights. Set the policies, frameworks and approaches centrally, but delegate execution to the peripheral domains to self-manage.
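
A hedged sketch of what “centrally set, peripherally executed” might look like in code: the descriptor fields and the governance rule below are invented for illustration, not drawn from any data mesh standard.

    from dataclasses import dataclass

    @dataclass
    class DataProduct:
        domain: str           # the owning business domain, e.g. "sales"
        name: str
        owner: str            # the accountable team within that domain
        schema_version: str   # the published contract consumers rely on
        contains_pii: bool

    # A centrally defined governance rule, applied by every peripheral domain.
    def meets_central_policy(product: DataProduct) -> bool:
        return bool(product.owner) and bool(product.schema_version) and not (
            product.contains_pii and product.domain == "public"
        )

    orders = DataProduct("sales", "orders_daily", "sales-data-team", "2.1", contains_pii=False)
    print(meets_central_policy(orders))  # True: the domain self-manages, the policy is shared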

The Evolutionary Path Forward

Evolving from legacy to modern data management practices, then, starts to reflect the broader transformations that occur through embracing all things digital. Such a shift is not merely about adopting new tooling; it represents a fundamental change in how businesses view and manage their data assets. Centralized, constrained control gets displaced by distributed accountability.

Along the way, there will be challenges to consider. Among them is the cost of all these threads of divergence and innovation. Not all business areas will run at the same pace; some will be more lethargic than others, and their appetite for change or alternative ways of working may be very constrained.

Another issue will be cost. With IT budgets remaining heavily constrained at most businesses, investing in technology-bound initiatives is nowadays wrapped up in elaborate return-on-investment calculations and expectations.

The burden of supporting evidence for investment now falls to the promoters and promulgators of new ways of working and new tech: to provide proof points, timelines and a willingness to qualify the effort and the justification before the investment flows. With all the volatility that might exist in the business, these calculations, forecasts and predictions can sometimes be very hard to make.

Buying into new platforms and technologies also requires a candid assessment of the viability, or likelihood, that any particular innovation will actually yield a tangible or meaningful business benefit. ROI is one thing; the ability to convince stakeholders that the prize is worthwhile is another. Artificial intelligence, machine learning and big data analytics present as a trio of promising capabilities whose utility some will continue to doubt.

History is littered with market misreads, such as RIM’s BlackBerry underestimating the iPhone and Kodak failing to comprehend the significance of digital photography. Big Tech’s Alphabet (Google), Amazon, Apple, Meta and Microsoft may get a bunch wrong, but the more vulnerable businesses that depend on these tech giants cannot really afford to make too many mistakes.

Organizations need to invest as much in critically evaluating next-generation data management technologies as in their own ongoing market research, so that they understand evolving preferences and advancements. This includes observing the competition and shifts in demand.

Those that foster a culture of innovation, encourage experimentation and embrace new technologies need to be prepared to reallocate resources, or risk having whatever position of strength they hold displaced, especially by newer, more agile entrants to their markets. Agility means being able to adapt quickly, a crucial characteristic for responding effectively to market disruptions. Being trapped with a legacy mindset and legacy infrastructure retards an organization’s ability to adapt.

Driving Toward a Modern Data-Driven Culture

To maximize the benefits of modern data management practices, organizations must foster a culture that prioritizes data-driven decision-making at all levels. In such a culture, the organization’s data management environment is key: decisions, strategies and operations at every level need to be bound to data.

For this to work, data needs to be accessible; the evaluators, users and consumers of the data need to be data literate; and they need to have the requisite access and an implicit dependency on data as part of their daily work. Effective data management also requires a philosophy of continuous improvement, tied to performance metrics and KPIs such as data quality measures, accompanied by true accountability.
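
As one small, illustrative example of such a KPI (the records and the choice of metric here are hypothetical), a field-completeness measure can be computed cheaply in Python and tracked over time:

    # Hypothetical customer records; the KPI is completeness of one critical field.
    records = [
        {"customer_id": "C1", "email": "a@example.com"},
        {"customer_id": "C2", "email": None},
        {"customer_id": "C3", "email": "c@example.com"},
    ]

    def completeness(rows, field):
        """Share of rows where the field is populated: a simple, trackable quality KPI."""
        if not rows:
            return 0.0
        return sum(1 for r in rows if r.get(field)) / len(rows)

    print(f"email completeness: {completeness(records, 'email'):.0%}")  # 67%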

The building blocks for this data-driven culture hinge not only on the composition of the people and their work practices but also on the infrastructure, which needs to be scalable, reliable, secure and high-performing.

The data contained therein needs to be comprehensive, rich and accessible in efficient and cost-effective ways. The quality of the data needs to stand up to all kinds of scrutiny, from regulatory and ethical standpoints through auditability and functional suitability. Efforts to make the whole approach more inclusive of the entire organization should also be promoted. Allowing individual business units to manage their own data while still contributing to the data more holistically will ultimately make the data more valuable.

If legacy has not failed us already, it will. The failure may not be obvious; it could be a slow, degraded experience that hampers business innovation and progress. Organizations that do not have renewal and reevaluation as an integral part of their operating model are the ones most exposed to it.

To effectively transition from legacy systems to modern data management practices, organizations must recognize the critical limitations posed by outdated technologies and embrace the opportunities presented by contemporary solutions.

Legacy systems, while at some point foundational to business operations, often struggle to manage the complexity and volume of data generated in today’s digital landscape. Their rigidity and inability to adapt hinder innovation and responsiveness, making it imperative for organizations to evaluate their reliance on such systems.

The shift towards modern solutions—characterized by flexibility, scalability, and integration—presents a pathway for organizations to enhance their operational efficiency and decision-making capabilities. Cloud-native solutions and decentralized data management frameworks like Data Mesh empower businesses to harness real-time insights and foster collaboration across departments. By moving away from siloed approaches, organizations can create a holistic view of their data, enabling them to respond swiftly to market changes and customer demands.

As I look ahead, I see it as essential that organizations cultivate their own distinctive data-driven culture.

A culture that prioritizes accessibility, literacy, and continuous improvement in data management practices. Such a shift would not only enhance decision-making but also drive innovation, positioning any organization more competitively in an increasingly complex environment.

All organizations must take proactive steps to assess their current data management strategy and identify areas for modernization.

They should begin by evaluating the effectiveness of existing legacy systems and exploring integrated solutions that align with their business goals.

They should invest in training programs that foster data literacy among employees at all levels, ensuring that the workforce is equipped to leverage data effectively.

Finally, they should commit to a culture of continuous improvement, where data quality and governance are prioritized. By embracing these changes, organizations can unlock the full potential of their data assets and secure a competitive advantage for the future.


Author: Clinton Jones

Air Travel is a miserable experience these days


I have done a fair bit of travel in the past six months both in and out of North America, across Asia and in and out of Europe. The carriers have included Alaska Airlines, AirAsia, British Airways, Finnair, Ryanair, Scoot, Singapore Airlines and United Airlines. I have also booked flights for others on British Airways with an Aer Lingus codeshare and Icelandair.

I have to say that with pretty much all of them, the experience of booking online and then checking in has been pretty awful, and the cost of the tickets pretty steep. In terms of value for money, I am frustrated. And given how much further the value has declined since COVID, during which many airlines received government survival incentives and gouged the few remaining passengers, I am even more disappointed.

My earliest recollection of air travel was in the mid-1970s, on a Vickers Viscount; the plane’s top cruising speed was about 500 kph at an altitude of 25,000 ft. I know these things because, as a youth, I was allowed into the cockpit to see the plane on a night run while it was in flight. Seating was 3 + 2, with a single aisle running down the plane, and the plane had gigantic oval porthole windows.

Airline tickets were a waxy, carbonized booklet, often typed up with the details, and the boarding pass was, according to souvenir evidence, handwritten. You were required to confirm your flight the day before departure by calling the airline, and at check-in, luggage was weighed on a beam scale. In the country and era in which I was traveling, passengers had to explicitly identify their luggage on the tarmac before it was loaded into the hold, for safety reasons. Family members stood on an open balcony in the airport building and waved to you as you walked across the tarmac and climbed the stairs.

My next freshest memory is my first long-haul flight, to London in 1980: mostly unremarkable, aside from the fact that we left late at night and seemingly arrived in the morning despite flying for what seemed like a whole day. This journey was on a significantly more substantial aircraft, a Boeing 707, which flew into London Gatwick. Otherwise, what I distinctly remember is that the plane had a smoking section!

Air travel was a luxury for many, at least up until the late 1980s. For us, tickets were bought by my parents on layaway and planned up to a year in advance. These days, flying somewhere, even just for a couple of hours, is pretty much available to a broad swathe of people and is certainly not a luxurious experience. If you think commercial air travel is glamorous, think again.

The Golden Age of Air Travel

Post-WWII, airlines competed to provide exceptional service, and passengers were treated to a level of comfort and luxury that has seemingly become a distant memory for all except those flying for obscene amounts of money.

Flying was an experience to be savoured, marked by exotic meals served on fine china, attentive cabin crew, and spacious seating. Passengers often dressed up, adding to the atmosphere of sophistication and excitement.

Flying was an exotic experience capped by arrival at some far-flung destination. Travellers would board these flying machines with a sense of anticipation, ready to enjoy the amenities that came as part of their ticket. Long-haul flights provided a collective cinema experience; passengers, especially the younger ones, were given keepsakes, games, puzzles, crayons and the like, and music or audio programming was piped to every seat. The experience was designed to make passengers feel some kind of privilege, a far cry from today’s reality.

The Shift in Airline Economics

As the airline industry evolved, so did its economic landscape. The deregulation of the airline industry in the United States in the late 1970s marked a significant turning point. It led to increased competition among airlines, which ultimately drove ticket prices down. While this made air travel more accessible to the general public, it also set the stage for a shift in how airlines operated. The US Airline Deregulation Act, signed by Jimmy Carter in 1978, saw the cost of air travel go down, accompanied by a decline in the quality of service. Other regions would soon follow suit.

To remain competitive, airlines began to adopt cost-cutting measures that would fundamentally change the passenger experience. The focus shifted from providing an exceptional journey to maximizing profits. As a result, many of the amenities that once defined air travel were eliminated or reduced. The once-coveted in-flight meals were replaced by snack boxes reminiscent of military rations, and complimentary beverages have all but evaporated.

The Decline of Comfort and Service

Today, the experience of flying is characterized by discomfort and a complete lack of personal service.

Airlines have crammed more seats into aircraft, which has led to reduced seat pitch, reduced legroom and narrower aisles, as described in the WSJ article “The Incredible Shrinking Plane Seat”.

An average economy class seat now offers less space than it did decades ago: seat width is down as much as four inches over the last 30 years, and seat pitch has shrunk from about 35 inches to 31, in some cases as little as 28 inches. On some airlines, seats don’t even recline, all of which allows airlines to add more seats they can then sell.

Many passengers now find themselves wedged between strangers for hours on end. The once spacious cabins have become cramped, and any sense of personal space has effectively been dissolved.

In-flight service has also suffered. Cabin crew are often stretched thin, serving hundreds of passengers with limited resources. The personal touch that once defined air travel has been replaced by a more transactional approach. Passengers are now often treated as numbers rather than individuals, leading to a sense of impersonal service.

The Rise of Low-Cost Carriers

The emergence of low-cost carriers has further exacerbated the decline of air travel’s glamour. Airlines such as Ryanair, easyJet, Wizz Air, Scoot, AirAsia and JET have revolutionized the industry by offering significantly lower fares. However, this has come at a cost. Passengers now face a plethora of additional fees for services that were once included in the ticket price.

Fees for checked bags, carry-on bags, seat selection, paper boarding passes, and in-flight refreshments like water, tea and coffee all add up quickly, turning what initially appears to be a bargain into something much more expensive.

The low-cost model has led to a homogenization of the flying experience. Passengers are herded like cattle, boarding and disembarking in an industrial flow that prioritizes efficiency over comfort.

The thrill of flying has been replaced by a long list of anxiety-creating circumstances: worries about overweight luggage, getting a middle-seat allocation, being unable to find overhead stowage, having to arrive hours ahead of departure, inadequate lounge seating, being unsure whether the plane will leave on time, not being able to pay for anything with cash, ungodly departure and landing times, inconveniently located airports, crowded terminals, and long and arduous security lines.

The Impact on Passenger Experience

The cumulative effect of these changes has been a significant shift in how passengers perceive air travel. While flying is now more accessible, the magic and excitement of the journey are simply not there. Travellers approach air travel with a sense of dread rather than anticipation. The stress of getting to the airport, navigating security, and enduring cramped seating has overshadowed the joy of reaching a new destination.

There is a general lack of amenities and the service is highly impersonal. Surveys indicate that a significant percentage of travellers feel that the in-flight experience has deteriorated over the years. The once-coveted experience of enjoying a meal at 30,000 feet has been replaced by the reality of overpriced snacks and limited F&B options.

The rise of technology has not necessarily improved the passenger experience. While online check-in and mobile boarding passes have streamlined some processes, they have also contributed to a more transactional relationship between airlines and passengers. The human touch that once characterized air travel has been replaced by automated systems and self-service kiosks, not all of which are available or functioning, by queues everywhere, and by the proverbial cattle-station handling that passengers are subjected to at every stage.

Cancel your ticket, or have it cancelled for you, and you have no guarantee of a full refund, restitution or compensation. Instead, the industry has spawned a whole world of travel insurance and reinsurance, with middlemen and brokers selling you tickets, selling you travel protection, and everything in between.


Nostalgia for the Past

For me at least, the nostalgia for the golden age of air travel is palpable. I reminisce about the days when flying was truly an event, marked by novelty and excitement: a nice meal, spacious seating, and some level of personalized attention from flight attendants, even in coach or economy class. It evokes in me a sense of longing for a bygone era that I can only experience again if I am prepared to pay a massive premium.

The nostalgia is not just about comfort in flying; it’s a desire for the experience of non-utilitarian travel itself: the thrill of embarking on an adventure, the anticipation of exploring new geographies and cultures, and the joy of connecting with fellow passengers. It has all been overshadowed by an altogether more stressful modern air travel experience.

Looking Ahead: The Future of Air Travel

The airline industry may evolve further. Balancing cost-cutting measures with passenger expectations may keep fares low and air travel accessible, but there’s a growing demand for improved service and comfort. The airlines that can find a way to enhance the passenger experience while maintaining competitive pricing may stand out in an increasingly crowded market.

Improved in-flight entertainment systems won’t cut it; in fact, some airlines are cutting back on these too. Enhanced seating designs might help, but not if the designs continue to shrink personal space and add discomfort.

Improved customer service training could help restore some of the lost glamour of flying, but a renewed focus on customer satisfaction and personalized service is what is really needed if airlines are to regain the trust and loyalty of veteran travellers.


Author: Clinton Jones

The Duality of Ideas: Multiplication and Execution


Ideas are the seeds that blossom into groundbreaking products, services, and solutions. The belief that “ideas are like rabbits” is a long-embraced one; it celebrates the notion that the more ideas one generates, the greater the potential for breakthroughs. However, as Apple founder and innovator Steve Jobs astutely observed, the mere abundance of ideas can also breed a dangerous “disease”: the misguided belief that a brilliant concept alone is sufficient for success, without the arduous journey of execution.

The metaphor of ideas as rabbits captivates, conjuring up images of rapid multiplication and boundless potential. Just as rabbits are known for prolific breeding, ideas have the capacity to spawn new thoughts, concepts, and perspectives at an astonishing rate, in specific circumstances and in the company of specific audiences.

Idea proliferation through ‘brainstorming’ is often celebrated in creative circles, where such sessions and ideation workshops are designed to unleash a torrent of possibilities, each one building upon the last.

Author and entrepreneur James Altucher eloquently captures this sentiment in his book “Choose Yourself,” stating, “Ideas are the multiplicative force that allows a human to combine, recombine, and create new ideas from old ideas.” This resonates with the concept of “idea sex,” the process of combining existing ideas to generate novel ones, attributed to science writer Matt Ridley, who introduced it in 2010 in his book The Rational Optimist and in a TED talk under the theory of “ideas have sex.”

Apple’s Steve Jobs also cautioned that the unbridled multiplication of ideas can lead to a dangerous pitfall: the “disease of thinking that a really great idea is 90% of the work.” The rabbit metaphor applies here too. Overbreeding rabbits can lead to various health issues for both the mother rabbits and their offspring: pregnancy toxemia, uterine cancer, mastitis, exhaustion and malnutrition, and, for the offspring, the risk of genetic defects and weakened immunity.

Jobs’s Stanford commencement speech emphasized the immense effort required to transform even the most brilliant idea into a tangible, successful product or service. “There’s a huge gulf,” he proclaimed, “between a great idea and its ultimately becoming a phenomenal success in the real world.”

Jobs understood that the path from conception to realization is fraught with challenges, requiring relentless problem-solving, teamwork, and a willingness to make countless tradeoffs and refinements along the way.

Such sentiments echo the words of renowned management consultant Peter Drucker, who described ideas as not dissimilar to babies, in that they need to be born and nurtured. Just as a newborn requires constant care and attention to thrive, an idea must be meticulously cultivated, refined, and executed to reach its full potential.

Jobs warned against the “disease” of falsely believing that simply having a great idea is enough, that the mere act of sharing or discussing a brilliant concept is tantamount to success. This misconception can lead to complacency, a lack of follow-through, and a failure to recognize the immense effort required to bring an idea to fruition.

In contrast, Jobs championed a balanced approach, one that embraced the rapid multiplication of ideas while recognizing the necessity of diligent execution. He understood that true innovation lies not only in the generation of ideas but also in the ability to identify the most promising concepts and nurture them through a rigorous process of refinement, collaboration, and problem-solving.

Guy Kawasaki, author and speaker, states, “Ideas are easy. Implementation is hard,” akin to the adage that “ideas are cheap.” He too emphasizes the importance of execution, noting that even the most groundbreaking ideas are worthless without the dedication and perseverance required to bring them to life. The duality of ideas, their curated multiplication and the necessity of considered execution, forms the balance that product managers, designers and architects must consider.

So if your thinking is that “ideas are like rabbits”, which do you celebrate? The boundless potential of human creativity, or the carefree strewing of concepts without due consideration for the immense effort required to transform those ideas into tangible successes?

For Jobs, the true path to innovation lies not in the mere abundance of ideas but in the ability to identify the most promising ones and nurture them through a relentless pursuit of excellence, collaboration, and attention to detail.

The “disease” of overvaluing ideas can be cured, and the full potential of human ingenuity realized, if we accept the tax of execution.

The key message for product managers is to strike a balance between fostering idea generation and ensuring rigorous execution. While the rapid multiplication of ideas is essential for innovation, overvaluing ideas alone can lead to the proverbial fall into the pit.

Execution is King

Here are some ideas that product managers should be considering in this context:

Implement systematic approaches like the SIT (Systematic Inventive Thinking) formula, which provides techniques for acquiring skills and generating original ideas:

  • Subtraction: Removing an essential component from a product or service and finding new uses for it.
  • Multiplication: Repeating or multiplying a component that was previously considered non-essential.
  • Division: Separating a product or service into smaller components and rearranging them.
  • Task Unification: Assigning new tasks or functions to existing components.
  • Attribute Dependency: Linking two independent attributes or components to create a new value proposition.

Invest time and effort in developing and maintaining some sort of strategic product roadmap that translates the visionary product strategy into actionable plans, defining milestones and timelines aligned with the vision of the product(s) and the business.

Set and agree on clear objectives, priorities, and key performance indicators (KPIs) based on customer needs, market research, and the overall product strategy.

Influence and collaborate on the efficient allocation of resources, including budgets and team members, to maximize value and productivity.

Continuously evaluate and refine your product management strategies based on data-driven decision-making, user feedback, and market dynamics.


Author: Clinton Jones

In life, generally old keys cannot open new doors


To progress and achieve new goals, we must let go of old mindsets, habits, and behaviours that no longer serve us.

Holding onto past experiences and historical knowledge can hinder our growth and keep us stuck in familiar but unfulfilling patterns of behaviour and mindset. As one evolves, matures and takes on new challenges, one should adapt one’s approach accordingly.

As the saying suggests, trying to use the same “keys” from the past to unlock the doors of the future can be very frustrating and even futile.

To open new doors of opportunity, work, and endeavour, one must sometimes be willing to embrace radical change and new ways of thinking and acting. This may require completely unlearning some of our old and potentially limiting beliefs about ourselves, those around us, and our collective potential. We should not let yesterday’s experiences and mindset dictate today’s possibilities.

Discarding the “dead weight” of a bunch of old keys that jangle our lives, creates cognitive and emotional space for the new keys we need to succeed. This discard could mean breaking old habits, ending toxic relationships and associations, or adopting a fresh perspective.

This all takes work, but the potential rewards in terms of personal growth and new opportunities often make it very worthwhile. The key is to focus on the new vision you have for your life, and take small daily actions to make it a reality.

Swapping out unproductive habits for ones that serve your goals, making growth integral to your mindset, and applying consistent effort can open doors to a better future.

Embarking on this change is not easy, but it may well be necessary for your growth, releasing you from being stuck in a rut or dissatisfied with your current circumstances. If those feelings encumber your daily life, it’s a sign that it’s time to let go of the old and embrace the new.

It’s important to take an honest look at one’s life and identify the areas that need improvement. Dead-end job, unfulfilling relationship, bad habits? Once we pinpoint the issues, we can start to develop a plan for change.

Letting Go

One of the biggest obstacles to change is our attachment to the past. Humans by their very nature are mostly nostalgic and sentimental.

Nostalgia is such a common human experience that, according to some studies, most people reflect on the past as often as once a week. It involves a sentimental longing or affection for the past, typically for a period or place with happy personal associations, and is often triggered by something reminding an individual of a positive experience: songs, smells, photographs, or even loneliness. Often it is characterized by bittersweet or even painful memories.

The most nostalgic of us do tend to have certain personality traits, such as daydreaming frequently, being sentimental, overthinking, romanticizing the past, and disliking change. Nostalgia peaks during transitional age ranges, such as the teens through the 20s and over 50. It can have both positive and negative effects: positively, it boosts one’s mood, increases self-esteem, provides a sense of social support, and helps one cope with difficult life transitions; excessive nostalgia and dwelling too much on the past, though, can have corrosive consequences.

We may cling to old habits, relationships, or beliefs because they feel comfortable and familiar. However, this attachment can prevent us from moving forward and reaching our full potential. It’s important to acknowledge that the past is gone and that we can’t change it; what we can change is our perspective on it.

Instead of viewing the past as a burden, we can see it as a learning experience that has shaped us into who we are today. By letting go of the past, we create space for new opportunities and experiences. We free ourselves from the weight of old baggage and can focus on the present moment and the future.

Growth and Rewards

To open new doors in life, we need to adopt a growth mindset.

This means embracing challenges, learning from mistakes, and continuously striving for improvement. A growth mindset allows us to see obstacles as opportunities for growth rather than roadblocks. When we face a challenge, we can approach it with curiosity and a willingness to learn rather than fear and resistance. Developing a growth mindset also requires us to be receptive to feedback and criticism, despite it being hard to hear and accept. Feedback provides valuable insights into areas where we can improve; by letting it inform our growth, we can accelerate our progress and open new doors more quickly, with new keys.

Change is tied to action, so it’s not enough to simply think about the changes we want to make; we need to take concrete steps to make them happen. One way is to set clear goals for ourselves: understanding where we want to go, what we want to achieve, and what we think success looks like. By setting specific, measurable goals, we create a personal roadmap for growth that helps us stay focused on the path ahead.

Another important action is to develop a support system: supportive friends, family, or mentors can make a real difference, providing emotional support and perhaps other kinds too. These people can offer encouragement, advice, and accountability as we work towards our goals.

Finally, it’s important to celebrate one’s progress and successes along the way. Since change is a journey, acknowledging the small wins keeps us motivated and inspired, and by celebrating our achievements we can build momentum and stay focused on the bigger picture.


Author: Clinton Jones

To the cloud no more? That is the question.


Cloud computing has undergone a remarkable transformation over the past decade.

What was once hailed as a panacea for companies struggling with the high costs and unsustainability of on-premise IT infrastructure has now become a more nuanced and complex landscape. As businesses continue to grapple with the decision to migrate to the cloud or maintain a hybrid approach, understanding the complexity, costs and risks is essential to navigating the evolving dynamics and the potential pitfalls that lie ahead.

The initial appeal of cloud solutions was undeniable.

By offloading the burden of hardware maintenance, software updates, and data storage to cloud providers, companies could focus on their core business activities and enjoy the benefits of scalability, flexibility, and cost optimization. The cloud promised to revolutionize the way organizations managed their IT resources, allowing them to adapt quickly to changing market demands and technological advancements.

However, not all businesses have fully embraced the cloud, especially when it comes to their mission-critical systems. Companies that handle sensitive or proprietary data have often been more cautious in their approach, opting to maintain a significant portion of their operations on-premise. These organizations may have felt a sense of vindication as they watched some of their cloud-first counterparts grapple with the complexities and potential risks associated with entrusting such critical systems to third-party providers.

Basecamp’s recent decision, for example, was driven by spiraling costs, irrespective of the cloud provider (they tried both AWS and GCP). Basecamp chose to leave the cloud computing model and move back to on-premise infrastructure to contain costs, reduce complexity, avoid hidden charges, and retain margin. This way, they felt they had more control over delivery and sustainment outcomes.

The Ongoing Costs of Cloud-First Strategies

Cloud bills, for one, can comprise hundreds of millions or even billions of rows of data, making them difficult to analyze in traditional tools like Excel. At the same time, cloud computing reduces upfront startup costs, including setup and maintenance, with 94% of IT professionals reporting this benefit, and Accenture, for example, found that cloud migration leads to 30-40% Total Cost of Ownership (TCO) savings.

As many as 60% of C-suite executives also cite security as the top benefit of cloud computing, ahead of cost savings, scalability, ease of maintenance, and speed.

The private cloud services market for example, is projected to experience significant growth in the coming years. According to Technavio, the global private cloud services market size is expected to grow by $276.36 billion from 2022 to 2027, at a CAGR of 26.71%. 

The cloud, of course, supports automation, reducing the risk of the human errors that cause security breaches, and cloud cost-management platforms accordingly help capture the cost of tagged, untagged, and untaggable cloud resources, as well as allocate 100% of shared costs. Yet for those organizations that have wholeheartedly adopted a cloud-first strategy, the operational budgets for cloud technologies have often continued to climb year over year.

Instead of fully capitalizing on the advances in cloud technology, these companies may find themselves having to maintain or even grow their cost base to take advantage of the latest offerings. The promise of cost savings and operational efficiency that initially drew them to the cloud may not have materialized as expected.

As this cloud landscape continues to evolve, a critical question arises: is there a breaking point where cloud solutions may become unviable for all but the smallest or most virtualized cloud-interwoven businesses?

This concern is particularly relevant in the context of customer data management, where the increasing number of bad actors and risk vectors, coupled with the growing web of regulations and restrictions at local, regional, and international levels, can contribute to a sense of unease about entrusting sensitive customer data to cloud environments.

The Evolving Regulatory Landscape and Cyber Threats

The proliferation of data privacy regulations, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States, has added a new layer of complexity to the cloud adoption equation.

These regulations, along with a growing number of industry-specific compliance requirements, have placed significant demands on organizations to ensure the security and privacy of the data they handle, regardless of where it is stored or processed. For businesses operating in multiple jurisdictions, navigating the web of regulations can be a daunting task, as the requirements and restrictions can vary widely across different regions.

Failure to comply with these regulations can result in hefty fines, reputational damage, and even legal consequences, making the decision to entrust sensitive data to cloud providers a high-stakes proposition.

Alongside the evolving regulatory landscape, the threat of cyber attacks has also intensified, with bad actors constantly seeking new vulnerabilities to exploit.

Cloud environments, while offering robust security measures, are not immune to these threats, and the potential for data breaches or system compromises can have devastating consequences for businesses and their customers.

The growing sophistication of cyber attacks, coupled with the increasing value of customer data, has heightened the need for robust security measures and comprehensive risk management strategies. Companies must carefully evaluate the security protocols and safeguards offered by cloud providers, as well as their own internal security practices, to ensure the protection of their most valuable assets.

Balancing Innovation and Risk Management

In light of these challenges, many businesses are exploring hybrid approaches that combine on-premise and cloud-based solutions.

This strategy allows organizations to maintain control over their mission-critical systems and sensitive data, while still leveraging the benefits of cloud computing for less sensitive or more scalable workloads.

Some companies are also taking a more selective approach to cloud adoption, carefully evaluating which workloads and data sets are suitable for cloud migration.

By adopting a risk-based approach, they can balance the potential benefits of cloud solutions with the need to maintain a high level of control and security over their most critical assets.

As the cloud landscape continues to evolve, it is essential for businesses to carefully evaluate their cloud strategies and adapt them to the changing circumstances.

This may involve regularly reviewing their cloud usage, cost optimization strategies, and the evolving regulatory and security landscape to ensure that their cloud solutions remain aligned with their business objectives and risk tolerance. Regular monitoring and assessment of cloud performance, cost-effectiveness, and security posture can help organizations identify areas for improvement and make informed decisions about their cloud investments.

Collaboration with cloud providers and industry experts can also provide valuable insights and best practices to navigate the complexities of the cloud ecosystem.

As the cloud landscape continues to evolve, it is clear that the path forward will not be a one-size-fits-all solution.

Businesses must carefully weigh the potential benefits of cloud adoption against the risks and challenges that come with entrusting their critical data and systems to third-party providers.

The future of cloud solutions will likely involve a more nuanced and balanced approach, where organizations leverage the power of cloud computing selectively and strategically, while maintaining a strong focus on data security, regulatory compliance, and risk management.

Collaboration between businesses, cloud providers, and regulatory bodies will likely be crucial in shaping the next chapter of the cloud revolution, ensuring that the benefits of cloud technology are realized in a secure and sustainable manner.


Author: Uli Lokshin

Inspiring Others: The Art of Transforming Passion into Shared Vision


Leaders are often driven by a deep, unwavering passion for their cause, political position, product, or objective.

This dedication can be a powerful force, fuelling determination and propelling them towards their goals. However, this very passion can also become a source of frustration when others fail to share the same level of enthusiasm.

It can be disheartening for leaders when employees are not as excited about new initiatives, when family members do not fully support their efforts, or when outsiders seem disinterested in engaging with the leader’s agenda.

Such a lack of shared passion can leave the leader feeling alone, angry, misunderstood, and under-appreciated. While it is tempting to argue, debate, or pressure others into aligning with one’s vision, this approach often proves counterproductive.

Overcoming this frustration lies in the leader’s ability to inspire others, rather than simply asserting their own desires. Inspiring leaders understand that the most effective way to achieve their goals is to focus on what others want, rather than solely on their own agenda.

By explaining one’s passion in the language and perspective of those you seek to influence, you can create a shared vision that resonates with the needs and desires of your audience.

Connecting the collective’s needs and desires

One of the hallmarks of an inspiring leader is often their ability to shift the conversation focus from themselves to that of the collective.

Instead of simply extolling the virtues of their own position or product, they take the time to understand the needs, concerns, and aspirations of those they wish to engage. In the absence of this, you have the master-serf relationship, where employees in particular are just wage-slaves; or, in clubs or societies, the leader is the prophet and the others are just acolytes or disciples.

By doing so, a leader can craft a narrative that speaks directly to the interests and motivations of those around them; but it needs to be grounded in everyone’s reality.

“The more scarce and valuable commodity is cold-shower-self-honesty”

– Joel Runyon

Rather than relying just on passion alone, inspiring leaders are willing to step back and critically examine their own circumstances, their assumptions, biases, and communication strategies. They recognize that their personal enthusiasm, while genuine, may not be enough to sway others who have different priorities and perspectives.

The Shared Vision

By focusing on the needs and desires of the team, an inspiring leader is able to craft a shared vision that aligns with the goals and personal and collective aspirations of those around them.

This shared vision becomes a powerful tool for overcoming the frustration that can arise when others do not immediately embrace the leader’s passion.

Rather than simply pushing their own agenda, inspiring leaders take the time to understand what motivates their employees, family members, or external stakeholders. They then weave these insights into a narrative that highlights how the leader’s vision can help others achieve their own objectives.

Such an approach creates a sense of mutual investment and shared purpose, fostering a collaborative environment where everyone feels invested in the success of the endeavor.

Empathy and Emotional Connection

Inspiring leaders often understand that passion alone is not enough to drive meaningful change.

Recognizing the importance of empathy and emotional connection in engaging others and cultivating a shared vision, they actively listen to the concerns and perspectives of others and tailor their message in a way that resonates on a deeper level.

An emotional connection is crucial in overcoming the frustration that can arise when others do not immediately share the same passion. By demonstrating a genuine understanding of the challenges and aspirations of their audience, inspiring leaders are able to build trust and foster a sense of shared purpose. This, in turn, helps to overcome resistance and create a collaborative environment for all.

The Art of Inspiration

Inspiring leaders understand that their role is not to simply assert their own desires, but to create a compelling vision that aligns with the needs and aspirations of those they seek to influence.

This requires a delicate balance of passion, empathy, and strategic communication.

A focus on “them” speaks directly to the concerns and motivations of others. The most valuable commodity is not just passion, but the willingness to engage in that “cold-shower-self-honesty” – critically examining one’s own assumptions, biases, and communication strategies.

Through this process of self-reflection and audience-centric communication, inspiring leaders are able to overcome the frustration and impatience that can arise when others do not immediately share their passion.

A shared vision that resonates with the needs and desires of their audience fosters a collaborative environment where everyone feels invested in the success of the endeavor.

Ultimately, the art of inspiration is not about forcing others to conform to the leader’s agenda, but about cultivating a shared sense of purpose and mutual investment, having everyone else naturally come together on the journey of exploration, discovery and execution. Only by connecting with the needs and desires of others can inspiring leaders achieve greater success and create lasting change.


Author: Flaminio

January is a hunt for work month


If there is one thing that the recent shake-up in the employment market has brought home to many, especially those in tech, it is that work as traditionally understood by previous generations can be unstable. Late in 2023, TikTok’s parent company, ByteDance, dismissed about 1,000 employees in its gaming unit. Fortnite developer Epic Games parted ways with over 800 personnel, and Unity has continued a spree that started in 2023 and runs into 2024. The estimates for 2023 are 9,000 people impacted in the gaming entertainment sector alone.

I always find a look at layoffs.fyi kind of interesting; they have been around a couple of years now, with data going back to 2020. Frontdesk, Invision, Twitch, Lazada, Citrix, Audible, Flipkart, Trend Micro, Unity, New Work SE, Google. Some of the numbers are substantial. As of today, 51 tech companies and 7,528 known layoffs in 2024 alone.

The recent changes in the employment market have highlighted the instability of traditional work structures. This has been particularly evident in the tech industry but it is not isolated to tech.

Several factors contribute to this shift:

  • High job openings: Despite a slight decrease in November, job openings in the US remain high by historical standards. This indicates a strong demand for workers.
  • Job market resilience: The US economy added more jobs than expected, demonstrating the resilience in the labor market.
  • Inflation and interest rates: The Federal Reserve has raised its benchmark interest rate multiple times to combat inflation, which has led to a gradual decline in job openings since their peak in March 2022. This could potentially lead to a cooling of the job market. In Europe, inflation dogged many countries in 2022 and 2023, falling to 3.1% in the EU and 4.7% in the UK.
  • Occupational Shift: occupational mixes shifted, with the most highly skilled individuals enjoying the strongest job growth over the last decade, while middle-skill workers had fewer opportunities.
  • Geographic Concentration: Employment growth in general has been concentrated in a handful of regions.
  • Labor Mobility: Labor mobility in the EU has been rising as workers in the lower-income regions migrate to dynamic cities to fill jobs.
  • COVID-19 Impact: The COVID-19 pandemic led to a decrease in national employment rates for 23 of the EU Member States in 2020 compared with the previous year. In the US, the COVID-19 pandemic led to significant job losses, but many of these jobs have since been restored.
  • Fake Work: According to Fortune, some companies reportedly hired people simply to snub competitors and neutralize the likelihood of those valued resources being snatched up by them – “thanks to an over-hiring spree to satisfy the ‘vanity’ of bosses at the likes of Meta and Alphabet”.

These factors all lead to a reevaluation of traditional employment models, with many individuals and companies now exploring more flexible and resilient alternatives. This includes remote work, contract-based roles, and a greater emphasis on skills rather than specific job titles.

It’s a complex issue with many facets, and the job market will evolve in response to these and other pressures.

Organizations, much like living organisms, undergo cycles of growth and contraction influenced by economic conditions. During economic upturns, companies often seize expansion opportunities, hiring more talent to meet increased demands.

This growth phase can lead to a sense of abundance and optimism within the workforce. Conversely, economic downturns may prompt organizations to reassess their structures, resulting in layoffs, restructuring, or a more streamlined approach to operations.

Individual contributors must recognize these patterns to better navigate the shifts in their work environments. Understanding that these changes are often not personal but strategic responses to economic realities can provide a valuable perspective. By staying attuned to the broader organizational context, individuals can position themselves to adapt and contribute effectively during periods of change.

People-to-Manager Ratio

The people-to-manager ratio is a critical aspect of organizational dynamics. This ratio influences the effectiveness of management and the well-being of individual contributors. While there’s no universal formula for the perfect ratio, finding the right balance is essential.

In scenarios where the ratio is too high, individual contributors may feel a lack of guidance or support, leading to burnout and diminished performance. On the other hand, an excessively low ratio might result in micromanagement and hinder autonomy.

Organizations that strike the right balance empower managers to provide meaningful support to their teams while ensuring that individual contributors have the autonomy and resources needed to excel in their roles. This balance fosters a healthy work environment where everyone can thrive.

Imposter Syndrome

Imposter syndrome is a common challenge for individual contributors, especially very competent younger people, and women in particular, and it is especially prevalent during times of organizational change. It involves persistent feelings of inadequacy and a fear of being exposed as a fraud, despite evidence of competence or clear qualification.

To overcome imposter syndrome, one should actively reflect on achievements, skills, and the unique perspectives one brings to the role. Seeking constructive feedback from colleagues and mentors can provide valuable insights into one’s strengths. Acknowledging accomplishments, no matter how small, helps build confidence and dispel the irrational belief of being an imposter. Watch out for gaslighters though.

In the face of organizational shifts, individuals need to recognize their intrinsic value. Understanding that you were hired for a reason and have the skills to contribute meaningfully can be a powerful antidote to any imposter syndrome you may suffer from.

Professional Growth

Regular self-assessment is a cornerstone of professional growth.

Individual contributors should evaluate their roles in the broader organizational context, considering how their work aligns with overarching goals. This involves a critical examination of tasks, responsibilities, and the impact of their contributions.

An effective self-evaluation goes beyond job responsibilities; it delves into the quality of work, initiative, and the ability to collaborate with colleagues. By identifying areas for improvement and actively seeking growth opportunities, individuals position themselves as proactive contributors to the organization’s success.

This reflective process allows individuals to align their goals with the organization’s objectives, ensuring that their contributions remain relevant and valuable, even in the face of organizational changes.

Reinvention of the self

In times of uncertainty, like now, the ability to reinvent oneself becomes a strategic advantage. This reinvention can take various forms, including acquiring new knowledge, adapting behaviors to meet evolving challenges, and delivering tangible results.

Continuous learning is another cornerstone of professional development. If we end up with more than four cornerstones, consider that the building that is your occupation, career, and role, may not be a quadrilateral.

We should all actively seek opportunities to acquire new skills, stay informed about industry trends, and engage in relevant training programs. This not only enhances individual capabilities but also contributes to the organization’s overall resilience.

Adapting behaviors involves staying attuned to evolving workplace dynamics. This may include embracing collaborative technologies, refining communication skills, and fostering a mindset of adaptability. Being open to change and displaying a positive attitude can position you as an asset during times of organizational flux.

Delivering tangible results is also a fundamental aspect of proving one’s value. Individual contributors should focus on outcomes, highlighting achievements and the positive impact of work. This may involve setting measurable goals, taking ownership of projects, and consistently delivering high-quality results that contribute to the organization’s success.

Critical Thinking

Critical thinking is a catalyst for innovation and problem-solving. Those who cultivate such a skill can navigate uncertainties with agility. During periods of organizational change, critical thinking involves strategic analysis, identifying potential challenges, and proposing effective solutions.

Proactive engagement in critical thinking demonstrates leadership qualities. Individual contributors should actively participate in discussions, offer insights, and contribute to decision-making processes. This not only showcases value but positions one as an essential contributor to the organization’s resilience and adaptability.

Critical thinking involves anticipating future trends and challenges. By staying ahead of the curve, you can position yourself as a more valuable asset, contributing to the organization’s ability to navigate changing economic climates.

Good luck otherwise with the current storm that we seem to be sailing through and hope for calmer waters soon, but hopefully not so calm that you get bored.


Author: Clinton Jones

Data Governance playbooks for 2024


Back in 2020, I offered up some thoughts for consideration around generic or homogenous data governance playbooks. Revisit it if you care to.

This was in part fueled by frustrations with the various maturity models and potential frameworks available but also by the push, particularly from some software vendors, to suggest that a data governance program could be relatively easily captured and implemented generically using boiler-plated scenarios by any organization without necessarily going through the painful process of analysis, assessment and design.

Of course, there is the adage, “Anything worth doing, is worth doing well“, and that remains a truism as applicable to a data governance program as anything else in the data management space.

You can’t scrimp on the planning and evaluation phase if you want to get your data governance program to be widely adopted and effective irrespective of how many bucks you drop and irrespective of the mandates and prescripts yelled from the boardroom.

Like any change program, a DG initiative needs advocacy and design appropriate to the context and no vendor is going to do that perfectly well for you without you making a significant investment of time, people and effort to get the program running. If you’re evaluating a software vendor to do this for you, in particular, you need to be sure to check out their implementation chops and assess their domain knowledge, particularly relevant to your industry sector, market and organizational culture. This is a consulting focus area that “The Big Four” have started to look more closely at and are competing with boutique consultancies on. So if you have a passion for consulting and you feel all the big ERP and CRM projects have been done and you want to break into this space, then here is an area to consider.

What is it exactly?

The term “playbook” in a business context is borrowed from American football. In sports, a playbook is often a collection of a team’s plays and strategies, all compiled and organized into one book or binder. Players are expected to learn “the plays” and ahead of the game the coach and team work out the play that they are likely to run at the opposing team or the approach that they will use if the opposing team is observed to run a particular play of their own. Some plays may be offensive, some defensive and then there may be other plays for specialised tactical runs at a given goal or target.

A “business playbook” contains all your company’s processes, policies, and standard operating procedures (SOPs). Also termed a “company playbook”, it is effectively a manual outlining how your business does what it does, down to each business operations role, responsibility, business strategy, and differentiator. This should be differentiated from a RunBook where the latter is your “go-to” if a team needs step-by-step instructions for certain tasks. Playbooks have a broader focus and are great for teams that need to document more complex processes. It is a subtlety that is appreciated more when you are in the weeds of the work than when you are talking or thinking conceptually about new ways of optimizing organizational effectiveness and efficiency.

A data governance playbook is then effectively a library of documented processes and procedures that describe each activity in terms of the inputs and capture or adoption criteria, the processes to be completed, who would be accountable for which tasks, and the interactions required. It also often outlines the deliverables, quality expectations, data controls, and the like.
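
To make that structure concrete, here is a minimal sketch of how a single playbook activity might be captured as structured data; the field names and example values are illustrative assumptions, not a standard schema.

```python
# A minimal sketch of one data governance playbook activity as structured
# data; field names and values are illustrative assumptions, not a standard.
from dataclasses import dataclass, field

@dataclass
class PlaybookActivity:
    name: str
    inputs: list[str]                  # what triggers or feeds the activity
    process_steps: list[str]           # what gets done, in order
    accountable_role: str              # who owns the outcome
    deliverables: list[str]            # expected outputs
    data_controls: list[str] = field(default_factory=list)

onboarding = PlaybookActivity(
    name="Onboard a new data set",
    inputs=["data source description", "data sharing agreement"],
    process_steps=["classify sensitivity", "assign data owner",
                   "register in the data catalog"],
    accountable_role="Data Steward",
    deliverables=["catalog entry", "classification record"],
    data_controls=["access review", "retention rule applied"],
)
print(onboarding.name, "->", onboarding.accountable_role)
```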

Under the US President’s Management Agenda, the Federal Data Strategy offers up a Data Governance Playbook that is worth taking a look at as an example. Similarly, the Health IT Playbook is a tool for administrators, physician practice owners, clinicians and practitioners, practice staff, and anyone else who wants to leverage health IT. The focus is on the protection and security of patient information and ensuring patient safety.

So, in 2024, if you’re just picking up the concept of a playbook, and a data governance playbook in particular, it is likely that you’ll look at what the software vendors have in mind; you’ll evaluate a couple of implementation proposals from consultants and you’ll consider repurposing something from adjacent industry, a past project or a comparable organization.

Taking a “roll-your-own” approach

There’s plenty of reading content out there from books written by industry practitioners, and analysts, to technology vendors, as mentioned. Some are as dry as ditchwater and very few get beyond a first edition, although some authors have been moderately successful at pushing out subsequent volumes with different titles. A lot of the content though, will demonstrate itself to be thought exercises with examples, things/factors to consider, experiences and industry or context-specific understandings or challenges. Some will focus on particular functionality or expectations around the complementary implementation or adoption of particular technologies.

With the latest LLM and AI/ML innovations, you’ll also discover a great deal of content. Many of these publications, articles and posts found across the internet have already been parsed and assimilated into the LLM engines, so a good starting point is for you to ask your favourite chatbot what it thinks.

Using a large language model (LLM) like ChatGPT to facilitate the building of data playbooks might be feasible to a certain extent, but there will be challenges.

On the plus side, an LLM could generate content and provide templates for various sections of a data playbook, such as data classification, access controls, data lifecycle management, and compliance. It can also assist in drafting policy statements, guidelines, and procedures.

It could help in explaining complex data governance concepts, definitions, and best practices in more accessible language for use in, say, a business glossary or thesaurus. This could be beneficial for individuals who might not have a deep understanding of data governance – think about your data literacy campaigning in this context.

Users can also directly interact with an LLM in a question-answer format to seek clarity on specific aspects of data governance and help build an understanding of key data governance concepts and data management requirements.
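
As a minimal sketch of that interaction, the snippet below asks an LLM to draft one playbook section via the OpenAI Python client; the model name and prompt wording are assumptions, and, for the reasons discussed next, the output still needs expert validation.

```python
# A minimal sketch of drafting one playbook section with an LLM.
# Assumes the openai package (v1+) and an OPENAI_API_KEY in the environment;
# the model name and prompt wording are illustrative.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Draft a data governance playbook section on data classification "
    "for a mid-sized retail company. Include purpose, roles, process "
    "steps, and quality expectations. Keep it under 400 words."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; substitute whatever you use
    messages=[{"role": "user", "content": prompt}],
)

draft = response.choices[0].message.content
print(draft)  # a starting point only; validate with a domain expert
```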

Just as for generic playbooks, there are going to be problems with this approach. LLMs operate based on patterns learned from a diverse range of data, but they often lack domain specificity. A data management platform or data catalog might have an LLM attached to it, but has it been trained with data governance practice content?

Data governance often requires an understanding of industry-specific regulations, data types, and organizational contexts that might not be captured adequately by a generic model.

We’ve also heard about AI hallucinations, and some of us may have even experienced a chatbot hallucination. Without grounding in data governance practice and domain knowledge, there’s a risk that the AI might generate content that is wholly or partially inaccurate, incomplete, or not aligned with the actual organizational need. This, then, would have you second-guessing the results and having to dig into the details to ensure that the suggested content is appropriate. You’ll need to have a domain expert on hand to validate the machine-generated output.

Data governance practices and regulations are also ever-evolving. What the LLM might not be aware of is new regulations, new compliance expectations or new industry standards, so leaning purely on machine-generated content may be deficient in revealing emerging best practices unless the model gets trained with updates.

Each organization has its unique culture, structure, and processes. Data governance is intertwined with the various organizational processes, and understanding these interconnections is vital; that’s best achieved with careful analysis, process design and domain knowledge. The tool you use to help elaborate your playbook might simply provide information in isolation, without any grasp of the broader organizational context. Without appropriate training and prompting, the specific nuances of the organization will make it almost impossible to tailor the generated content to align with organizational goals and practices.

I guess my whole point is that you will not escape the human factor. If you insist on going it alone and relying on machine-generated content in particular, then that same content should undergo thorough validation by domain experts and organizational stakeholders to ensure that the results are accurate and aligned with organizational and industry requirements.

The use of modern-day tooling to assist human experts in drafting and refining data playbooks is a valuable acceleration approach that has merit but just as for generic playbooks and templates, you need to leverage the strengths of canned, automated generation and human expertise to arrive at a good result.

I’d love to hear what, if anything, you’ve done with chatbots, AI, ML and LLMs to generate content. If you are implementing any data management or data governance initiatives, I would love to know how successful you have been and any tips or tricks you acquired along the way.


Author: Clinton Jones

A Strategic Approach to Data Management


There is a delicate balance between the needs of data scientists and the requirements of data security and privacy.

Data scientists often need large volumes of data to build robust models and derive valuable insights. However, the accumulation of data increases the risk of data breaches, which is a concern for security teams.

This hunger for data and the need for suitable control over sensitive data creates a tension between the data scientists seeking more data and the security teams implementing measures to protect data from inappropriate use and abuse.

A strategic approach to data management is needed, one that satisfies the need for data-driven insights while also mitigating security risks.

There needs to be an emphasis on understanding the depth of the data, rather than just hoarding it indiscriminately.

In a Towards Data Science article, author Stephanie Kirmer reflects on her experience as a senior machine learning engineer and discusses the challenges organizations face as they transition from data scarcity to data abundance.

Kirmer highlights the importance of making decisions about data retention and striking a balance between accumulating enough data for effective machine learning and avoiding the pitfalls of data hoarding.

Kirmer also touches on the impact of data security regulations, which add a layer of complexity to the issue. Despite the challenges, Kirmer advocates for a nuanced approach that balances the interests of consumers, security professionals, and data scientists.

Kirmer also stresses the importance of establishing principles for data retention and usage to guide organizations through the decisions surrounding data storage.
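
One way to act on such principles is to codify them as executable rules. Below is a minimal sketch of that idea; the categories and retention periods are illustrative assumptions, not regulatory guidance.

```python
# A minimal sketch of data retention principles codified as rules.
# Categories and retention periods are illustrative assumptions.
from datetime import date, timedelta

RETENTION_DAYS = {
    "telemetry": 90,
    "transaction": 365 * 7,        # e.g., financial record-keeping horizons
    "customer_profile": 365 * 2,
}

def should_retain(category: str, collected_on: date, today: date) -> bool:
    """Retain only while a documented purpose (the rule) still applies."""
    limit = RETENTION_DAYS.get(category)
    if limit is None:
        return False  # no documented purpose: default to not hoarding
    return today - collected_on <= timedelta(days=limit)

print(should_retain("telemetry", date(2023, 1, 1), date(2024, 1, 15)))  # False
```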

Paul Gillin, technology journalist at Computerworld, raised this topic back in 2021 in his piece Data hoarding: The consequences go far beyond compliance risk. Gillin discusses the implications of data hoarding, which extend beyond just compliance risks, and highlights how the decline in storage costs has led to a tendency to retain information rather than discard it.

Pijus Jauniškis, a writer covering internet security at Surfshark, describes how the practice can lead to significant risks, especially with regulations like the General Data Protection Regulation in Europe and similar legislation in other parts of the world.

In a landscape where data is both a valuable asset and a potential liability, a balanced and strategic approach to data management is crucial to ensure that the needs of both groups are met.

The data community has a significant responsibility in recognizing both.

Data management responsibilities extend beyond the individual who created or collected the data. Various parties are involved in the research process and play a role in ensuring quality data stewardship.

To generate valuable data insights, people need to become fluent in data. Data communities can help individuals immerse themselves in the language of data, encouraging data literacy.

An organizational governing body is often responsible for the strategic guidance of a data governance program, prioritization of data governance projects and initiatives, and approval of organization-wide data policies and standards; if there isn’t one, one should be established.

Accountability includes the responsible handling of classified and controlled information, upholding data use agreements made with data providers, minimizing data collection, and informing individuals and organizations of the potential uses of their data.

In the world of data management, there is a collective duty to prioritize and respond to the ethical, legal, social, and privacy-related challenges that come from using data in new and different ways in advocacy and social change.

A balanced and strategic approach to data management is crucial to ensure that the needs of all stakeholders are met. We collectively need to find the right balance between leveraging data for insights and innovation, while also respecting privacy, security, and ethical considerations.


Author: Uli Lokshin

Unlocking Value through Data and Analytics


Organizations are constantly seeking ways to unlock the full potential of their data, analytics, and artificial intelligence (AI) portfolios.

Gartner, Inc., a global research and advisory firm, identified the top 10 trends shaping the Data and Analytics landscape in 2023 earlier this year.

These trends not only provide a roadmap for organizations to create new sources of value but also emphasize the imperative for D&A leaders to articulate and optimize the value they deliver in business terms.

Bridging the Communication Gap

The first and foremost trend highlighted by Gartner is “Value Optimization.”

Many D&A leaders struggle to articulate the tangible value their initiatives bring to the organization in terms that resonate with business objectives.

Gareth Herschel, VP Analyst at Gartner, emphasizes the importance of building “value stories” that establish clear links between D&A initiatives and an organization’s mission-critical priorities.

Achieving value optimization requires a multifaceted approach, integrating competencies such as value storytelling, value stream analysis, investment prioritization, and the measurement of business outcomes.

Managing AI Risk: Beyond Compliance

As organizations increasingly embrace AI, they face new risks, including ethical concerns, data poisoning, and fraud detection circumvention.

“Managing AI Risk” is the second trend outlined by Gartner, highlighting the need for effective governance and responsible AI practices.

This goes beyond regulatory compliance, focusing on building trust among stakeholders and fostering the adoption of AI across the organization.

Observability: Unveiling System Behaviour

Another trend, “Observability,” emphasizes the importance of understanding and answering questions about the behaviour of D&A systems.

This characteristic allows organizations to reduce the time it takes to identify performance-impacting issues and make timely, informed decisions.

Data and analytics leaders are encouraged to evaluate observability tools that align with the needs of primary users and fit into the overall enterprise ecosystem.

Creating a Data-Driven Ecosystem

Gartner’s fourth trend, “Data Sharing Is Essential,” underscores the significance of sharing data both internally and externally.

Organizations are encouraged to treat data as a product, preparing D&A assets as deliverables for internal and external use.

Collaborations in data sharing enhance value by incorporating reusable data assets, and the adoption of a data fabric design is recommended for creating a unified architecture for data sharing across diverse sources.

Nurturing Responsible Practices

“D&A Sustainability” extends the responsibility of D&A leaders beyond providing insights for environmental, social, and governance (ESG) projects.

It urges leaders to optimize their own processes for sustainability, addressing concerns about the energy footprint of D&A and AI practices. This involves practices such as using renewable energy, energy-efficient hardware, and adopting small data and machine learning techniques.

Enhancing Data Management

“Practical Data Fabric” introduces a data management design pattern leveraging metadata to observe, analyse, and recommend data management solutions.

By enriching the semantics of underlying data and applying continuous analytics over metadata, data fabric generates actionable insights for both human and automated decision-making. It empowers business users to confidently consume data and enables less-skilled developers in the integration and modelling process.

Emergent AI

“Emergent AI” heralds the transformative potential of AI technologies like ChatGPT and generative AI, although not everyone is convinced: as one AI researcher described it, “AI ‘Emergent Abilities’ Are A Mirage”, per a paper presented in May at the Stanford Data Science 2023 Conference on claims of emergent abilities in artificially intelligent large language models (LLMs), cited by Andréa Morris, a contributor on science, robots and the arts at Forbes.

This trend, however seemingly trivial, is expected to redefine how companies operate, offering scalability, versatility, and adaptability. As AI becomes more pervasive, it is poised to enable organizations to apply AI in novel situations, expanding its value across diverse business domains.

Gartner highlights another trend, “Converged and Composable Ecosystems,” an old topic from the start of the 2020s; it is focused on designing and deploying data and analytics platforms that operate cohesively through seamless integrations, governance, and technical interoperability.

The trend advocates for modular, adaptable architectures that can dynamically scale to meet evolving business needs.

“Consumers as Creators,” the ninth trend, is nothing particularly new; it envisions a shift from predefined dashboards to conversational, dynamic, and embedded user experiences.

Werner Geyser described 20 Creator Economy Statistics That Will Blow You Away in 2023 in his Influencer Marketing Hub piece.

A large percentage of consumers identify as creators: over 200 million people globally consider themselves “creators”.

Content creators can earn over $50k a year, and the global influencer market has grown to a potential $21 billion in 2023.

Organizations are encouraged to empower content consumers by providing easy-to-use automated and embedded insights, fostering a culture where users can become content creators.

Humans remain the key decision makers and not every decision can or should be automated. Decision support and the human role in automated and augmented decision-making remain as critical considerations.

Organizations need to combine data and analytics with human decision-making in their data literacy programs. While indicators from market analysts like Gartner may serve as a compass, guiding leaders toward creating value, managing risks, and embracing innovation, the imperative to deliver provable value at scale underscores the strategic role of data and analytics leaders in shaping the future of their organizations.

As the data and analytics landscape continues to evolve, organizations that leverage the trends strategically will be well-positioned to turn extreme uncertainty into new business opportunities.


Author: Jewel Tan

Balancing Cloud Transformation in Turbulent Times


The spectre of an impending economic downturn looms large, prompting business leaders to re-evaluate their strategic decisions, particularly regarding cloud transformation.

Simon Jelley, General Manager for SaaS Protection, Endpoint and Backup Exec at Veritas Technologies notes that despite the economic uncertainty, cloud migration remains a prevalent trend, with 60% of enterprise data already residing in the cloud.

However, the challenge lies in maintaining the cost benefits associated with the cloud, as evidenced by the fact that 94% of enterprises fail to stay within their cloud budgets.

To address this, businesses are encouraged to adopt a hybrid multicloud environment, necessitating careful data management strategies. Here are key steps organizations should take:

  • Establish Data Visibility: Gain a comprehensive understanding of where your data resides, whether on-premises or in the public cloud.
  • Enable Workload Migration/Portability: Facilitate seamless movement of workloads between on-premises infrastructure and various cloud service providers.
  • Leverage Software-Defined Storage: Embrace agile and scalable storage solutions to accommodate the dynamic nature of multicloud environments.
  • Prioritize Data Regulatory and Compliance Issues: Ensure compliance with data regulations across different cloud environments.
  • Eliminate Data Protection Silos: Streamline data protection processes to avoid fragmentation and enhance overall security.

By implementing these measures, organizations can fortify their data management capabilities, ensure resilience and meet compliance objectives amid economic uncertainties.

Cybercrime: A Persistent Threat, Demands Proactive Measures

As cybercrime continues to evolve, organizations must adapt their data management strategies to withstand increasingly sophisticated attacks. Ransomware, in particular, remains a potent weapon for cybercriminals seeking to exploit the value of organizational data.

While addressing cyber resilience is crucial, Jelley also advocates for a proactive approach to reduce the risk of attacks. The focus is on increasing data visibility, and the suggested steps include:

  • Create a Data Taxonomy or Classification System: Classify data based on sensitivity and importance to establish a clear understanding of information assets.
  • Establish a Single Source of Truth (SSOT) Location: Designate centralized locations for each category of data to streamline management and control.
  • Define and Implement Policies: Develop and enforce policies tailored to the specific requirements of identified data types.
  • Continually Update and Maintain Data Taxonomy, SSOT, and Policies: Keep data management strategies agile and responsive to evolving cyber threats.

By adhering to these proactive measures, organizations can limit exposure and enhance their ability to recover in the event of a cyber attack, ultimately safeguarding their critical data.
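
To illustrate the first two steps, here is a minimal sketch of a data taxonomy with a per-class policy and SSOT location; the class names, locations, and policy values are assumptions for a hypothetical organization.

```python
# A minimal sketch of a data taxonomy with per-class policies and SSOT
# locations; all names and values are illustrative assumptions.
from enum import Enum

class DataClass(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

POLICY = {
    DataClass.PUBLIC:       {"ssot": "s3://org-public",        "encrypt": False, "backup_days": 30},
    DataClass.INTERNAL:     {"ssot": "s3://org-internal",      "encrypt": True,  "backup_days": 90},
    DataClass.CONFIDENTIAL: {"ssot": "s3://org-confidential",  "encrypt": True,  "backup_days": 365},
    DataClass.RESTRICTED:   {"ssot": "vault://org-restricted", "encrypt": True,  "backup_days": 2555},
}

def policy_for(data_class: DataClass) -> dict:
    """Look up the single authoritative policy for a data class."""
    return POLICY[data_class]

print(policy_for(DataClass.CONFIDENTIAL))
```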

Digitization 3.0: Unleashing the Power of Usable Data

Digitization has undergone significant phases, with the current era—Digitization 3.0—focusing on extracting maximum value from data while ensuring security, resiliency, and privacy. Jelley emphasizes the importance of contextualizing data to enhance its usability, paving the way for user experience-driven workflows. Building upon the foundation of the preceding trends, organizations can achieve this by:

  • Consolidate Data Control: Utilize platforms capable of managing data across diverse environments, including on-premises, virtual, and multicloud.
  • Map Uses and Users: Conduct a thorough analysis of existing tools and users to seamlessly transition to a consolidated platform.
  • Implement Adequate Training: Ensure that teams are well-versed in utilizing the new consolidated platform to maximize its functionalities.

Digitization 3.0 represents a paradigm shift in data utilization, emphasizing the need for organizations to not only manage and protect their data but also harness its full potential to drive innovation and customer-centric experiences.

As businesses navigate the intricate landscape of data management in 2023, Simon Jelley’s insights shed light on the pivotal trends shaping the industry.

Economic uncertainty, cybercrime, and Digitization 3.0 collectively underscore the importance of proactive, adaptive data management strategies. By embracing data visibility, fortifying cybersecurity measures, and leveraging the power of contextualized data, organizations can not only weather the challenges of the present but also position themselves for success in the data-driven future.

Jelley reiterates the fundamental importance of caring about data—its management, protection, and the ability to address prevailing trends. In a world where information is a critical asset, businesses that prioritize effective data management will not only survive but thrive in the face of evolving challenges.

As we close out 2023, staying abreast of these trends and implementing strategic data management practices will be integral to achieving long-term success in a data-centric business landscape.


Author: Flaminio

The Impact of Artificial Intelligence on Business Operations


Artificial Intelligence (AI), today more than ever before, stands out as a transformative force reshaping the way businesses operate.

Like all modern technologies, it has infiltrated many aspects of business, enhancing efficiency, improving customer experiences, and driving innovation. Its touch is felt from customer service to data analytics.

AI is revolutionizing traditional approaches and propelling organizations into a new era of possibilities, but it is challenged by concerns about bias, transparency and its tendency to hallucinate.

Some history

The Turing Test, proposed by British mathematician, computer scientist and codebreaker Alan Turing in 1950, was considered a measure of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human.

The test serves as a rudimentary benchmark for assessing a machine’s capability to display human-like intelligence in natural language conversation, but the latest developments with Large Language Models (LLMs), and how they naively behave, may have all but broken the fundamentals of this test, and we may need to think of new ways to assess AI.

The basic premise of the Turing Test – assessing a machine’s ability to engage in human-like conversation – is still relevant, but its applicability and limitations have become more pronounced in the context of LLMs. LLMs don’t actually understand what you’re saying or asking.

Despite all this, one of the most significant impacts of AI on business operations is evident in customer service. The very space where we want a conversation may be better served by an AI.

Chatterbots

The reason may be quite simple. We’re not actually looking for a social conversation with an AI when we use a chatbot or a virtual assistant, instead we’re looking for information, or answers to solve the thing that has brought us to the chatbot in the first place.

The first “chatterbot” is reputed to be ELIZA, created in the mid-1960s by Joseph Weizenbaum, a computer scientist at the Massachusetts Institute of Technology (MIT).

ELIZA operated by processing user responses to supplied prompts and generating pre-defined, contextually appropriate replies.

Using a combination of pattern matching and simple keyword recognition techniques, it simulated a Rogerian psychotherapist.

Although the interactions were relatively basic, ELIZA’s ability to mimic human conversation and provide responses that seemed meaningful and engaging was groundbreaking at the time.

If you’re interested, there is a javascript version of ELIZA originally written by Michal Wallace and significantly enhanced by George Dunlop that you can try out at the CSU Fullerton Psychology Department.
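
The core pattern-matching idea is simple enough to sketch in a few lines of Python; the rules and reflections below are illustrative and are not Weizenbaum’s original script.

```python
# A minimal ELIZA-style sketch: pattern matching plus pronoun reflection.
# The rules are illustrative, not Weizenbaum's original script.
import re

REFLECT = {"my": "your", "i": "you", "am": "are", "me": "you"}

def reflect(fragment: str) -> str:
    """Swap first- for second-person words, as ELIZA-style bots do."""
    return " ".join(REFLECT.get(word.lower(), word) for word in fragment.split())

RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default when no pattern matches

print(respond("I am worried about my job"))
# -> How long have you been worried about your job?
```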

When applications are integrated with NLP capabilities, the application “understands” and processes human language. This capability can augment chatbots and virtual assistants and facilitates interactions with customers, employees, and others. Chatbots and virtual assistants powered by AI-driven RPA can engage in natural language conversations, answer queries, and provide assistance, enhancing customer service and user experience.

AI-powered chatbots and virtual assistants have come a long way and are just starting to revolutionize the way businesses interact with their customers. With instant responses to customer queries, personalized recommendations, routine task handling, they can ensure a relatively seamless customer experience.

The process robots are coming

An area I have dipped in and out of at various points in my work career since Y2K is robotic process automation (RPA), the goal of RPA being to automate mundane and repetitive tasks – tasks that were previously low value and time-consuming for employees. Early RPAs were very prescriptive and simplistically programmed, but today they are more adaptive. One of the earliest examples of RPA-like automation can be traced back to the introduction of screen scraping software in the 1990s.

AI-driven RPA goes beyond basic task automation by incorporating so-called cognitive capabilities. With machine learning (ML) algorithms, RPA systems can analyze vast amounts of data, recognize patterns, and make decisions based on historical and real-time information. This “cognitive” automation allows businesses to automate complex tasks that require decision-making, such as data analysis, customer service interactions, and fraud detection.

In fraud detection, risk management, and algorithmic trading, machine learning algorithms analyze financial data in real time, identifying unusual patterns and potential bad-actor activity, thereby enhancing security and minimizing financial losses.
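
As a minimal sketch of the unusual-pattern idea, the snippet below flags outlying transactions with scikit-learn’s IsolationForest; the synthetic features stand in for real financial data.

```python
# A minimal anomaly-detection sketch with scikit-learn's IsolationForest;
# the synthetic data stands in for real transaction records.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# features: transaction amount and hour of day
normal = rng.normal(loc=[50.0, 12.0], scale=[20.0, 3.0], size=(500, 2))
suspicious = np.array([[5000.0, 3.0], [4200.0, 4.0]])  # large, small-hours transactions
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)   # -1 marks likely anomalies
print(X[flags == -1])      # the planted outliers should surface here
```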

RPA integrated with AI can excel in processing unstructured data, such as invoices, forms, and emails. Through Optical Character Recognition (OCR) and machine learning, such systems can extract relevant information from documents more accurately than people and faster! This capability streamlines document-based processes, such as invoice processing and claims management, reducing manual errors and improving overall document handling efficiency.
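
A minimal sketch of that document pipeline might look like the following, assuming the pytesseract package with a local Tesseract install; the file name and the invoice-number pattern are illustrative.

```python
# A minimal OCR field-extraction sketch; assumes pytesseract and a local
# Tesseract install, with an illustrative file name and regex pattern.
import re

import pytesseract
from PIL import Image

text = pytesseract.image_to_string(Image.open("invoice.png"))

match = re.search(r"Invoice\s*(?:No\.?|#)\s*([A-Z0-9-]+)", text, re.I)
if match:
    print("Invoice number:", match.group(1))
else:
    print("No invoice number found; route for manual review")
```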

Automation liberates human resources, allowing employees to focus on more strategic and creative aspects of their roles; applications like data entry, invoice processing, and report generation are now handled efficiently by AI-driven systems, leading to higher productivity and reduced operational costs.

Smart reporting

AI has been transforming data analysis for a while now, by enabling businesses to glean improved insights from vast datasets.

Machine learning algorithms analyze historical data, identify patterns, and predict future trends with remarkable accuracy. These predictive analytics can help a business make better-informed decisions, optimize inventory practices, more precisely forecast customer demand, and enhance overall operational efficiency.

AI-driven applications optimizing supply chain operations look to historical sales data, market trends, and weather patterns, for example, to predict demand more accurately.

This multi-threaded predictive capability aids businesses in avoiding stock-outs, reducing inventory holdings, and minimizing waste. AI-powered algorithms are also used to optimize route planning and delivery scheduling, which can all improve the effectiveness and cost profile of logistics operations.
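
As a minimal sketch of this kind of forecasting, the snippet below fits a linear model to synthetic weekly sales with a stand-in weather signal; the features and data are illustrative assumptions.

```python
# A minimal demand-forecasting sketch: a linear model over synthetic
# weekly sales with a stand-in weather signal.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
weeks = np.arange(1, 27)                      # 26 weeks of history
temps = 15 + 10 * np.sin(weeks / 4)           # stand-in weather signal
demand = 200 + 3 * weeks + 5 * temps + rng.normal(0, 10, weeks.size)

X = np.column_stack([weeks, temps])
model = LinearRegression().fit(X, demand)

next_week = np.array([[27, 15 + 10 * np.sin(27 / 4)]])
print(f"Forecast demand for week 27: {model.predict(next_week)[0]:.0f} units")
```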

By combining data analytics with AI, businesses automate their data analysis and generate more precise actionable insights. AI-driven analytics systems process vast datasets, identify trends, and provide answers in near real-time. Decision-makers now have timely and accurate information, enabling them to make better informed choices to drive business growth and innovation.

More business focus areas

The examples cited above are probably the areas where I have most commonly seen benefits from AI in the business setting, but there are almost a dozen more that can be considered.

AI algorithms that analyze customer behavior and preferences enable businesses to create highly targeted marketing campaigns. The campaigns might include personalized recommendations, content, and advertisements to enhance customer engagement and increase conversion rates.

Healthcare professionals have started to consider the use of AI in diagnosing diseases, analyzing medical images, and predicting patient outcomes. Machine learning algorithms can process vast amounts of medical data, leading to more accurate diagnoses and personalized treatment plans.

When analysing medical images such as X-rays, CT scans, MRIs, lab slides and mammograms, AI can process these artefacts at speeds much faster than human medical professionals. Algorithms can quickly identify patterns, anomalies, and potential areas of concern.

Subtle changes in medical images that might not be immediately apparent to human eyes are more easily identified by AI. This early detection can lead to the diagnosis of diseases at their nascent stages, improving the chances of successful treatment and recovery. This is particularly crucial in diseases like cancer, where early detection significantly improves patient outcomes. In critical cases, rapid analysis can be life-saving.

Intelligent tutoring and educational systems adapt to learner styles, providing customized educational content and feedback. AI also aids in automating the administrative tasks for educational institutions, improving efficiency.

In manufacturing and operations, the use of AI can assist businesses in anticipating equipment failures, reducing downtime and maintenance costs.

In talent acquisition processes, automating resume screening, candidate matching, and even conducting initial interviews can accelerate candidate evaluation. With chatbots powered by AI handling routine HR inquiries, HR professionals can focus on more strategic and higher-value tasks like employee engagement and development.

AI is employed in environmental monitoring and conservation efforts to predict natural disasters, monitor pollution levels, and aid in wildlife conservation, contributing to more effective environmental preservation strategies.

Legal assistance tools that are AI-powered can help legal professionals in document review, contract analysis, and legal research. Natural Language Processing algorithms enable these tools to process and analyze large volumes of legal documents efficiently, improving accuracy and saving time for lawyers and paralegals.

Artificial Intelligence (AI) has become a transformative force revolutionizing various aspects of business operations, from customer service to data analytics.

AI-driven technologies have significantly enhanced efficiency, improved customer experiences, and driven innovation across diverse sectors.

However, the rapid integration of AI in business processes has raised concerns regarding bias, transparency, and the ability of AI systems to comprehend human-like conversations, especially in the context of Large Language Models (LLMs).

The traditional Turing Test, once a benchmark for assessing machine intelligence, now faces challenges due to the complex behavior of LLMs, prompting the need for new evaluation methods.

Despite these challenges, AI-powered chatbots and virtual assistants have reshaped customer interactions, providing instant responses and personalized recommendations, thereby ensuring seamless customer experiences. AI-driven Robotic Process Automation (RPA) has automated mundane tasks, liberating human resources and enabling employees to focus on strategic and creative aspects of their roles.

AI has revolutionized data analysis, supply chain optimization, healthcare diagnostics, education, talent acquisition, environmental monitoring, and legal assistance, showcasing its vast potential in diverse business focus areas.

As businesses continue to harness the power of AI, it is imperative to address the ethical concerns and develop innovative solutions, ensuring that AI remains a valuable asset in shaping the future of business operations.


Author: Clinton Jones

Get Smart


Becoming progressively smarter, or continuously improving one’s intelligence and personal success, is a goal that anyone can pursue. However, naturally smart people can easily sabotage their work or social progress through certain behaviors or thoughtless acts.

If you embrace lifelong learning by reading books, taking courses, attending seminars, and staying updated with the latest information in your field, you will know that this is an effective way to become progressively smarter.

Naturally smart individuals may occasionally become complacent, assuming they already know enough, which can hinder further growth. If you regularly challenge yourself with complex problems and puzzles, you will find this encourages critical thinking and problem-solving skills.

Over-reliance on natural talent can lead to relying solely on one’s innate abilities and neglecting the importance of practice and effort; over time, others will overtake you. Practices like mindfulness and meditation can improve focus, reduce stress, and enhance cognitive abilities over time.

Smart individuals may become so engrossed in their pursuits that they overlook their mental well-being, potentially leading to burnout. This can have a negative impact not only on their mental health but also on their physical health. Those who choose to focus on building a diverse social network of people with different perspectives find themselves in discussions with others on many topics which can broaden horizons and introduce new ideas and divergent thinking.

Overconfidence in one’s intelligence may lead to dismissing others’ ideas and missing out on valuable insights. Seek out feedback on your work and actively incorporate constructive criticism. This helps you identify areas for personal improvement and growth. Being naturally smart can make you sensitive to criticism, leading to you avoiding feedback or becoming defensive.

Efficient management of your time, wherein you allocate enough hours for learning, work, and relaxation, leads to a well-balanced schedule which can enhance personal productivity and creativity. Perfectionism is the enemy of good, and some smart people might become perfectionists, spending too much time on one task, which can hinder their overall progress.

Prioritize physical health through regular exercise, a balanced diet, and adequate sleep. Physical well-being positively impacts cognitive function. Getting something like 10,000 paces in daily is a simple ritual that can help. Neglecting self-care because you are focused on intellectual pursuits can have a bad effect on your general health, so keep this always in mind as you balance out your days.

Establish clear, achievable goals and track your progress. This provides motivation and direction for personal growth. However, keep in mind that setting unrealistic expectations and overly ambitious goals can lead to frustration when they’re not immediately achieved.

Embrace change and be willing to adapt to new technologies and methods. The ability to learn and adapt quickly is a key trait of smart individuals. Conversely, resistance to change and being attached to existing knowledge and methods can impede progress when better alternatives emerge.

Develop your emotional intelligence by understanding and managing your emotions and the emotions of others. This skill is crucial for success in various aspects of life. Recognize that a lack of empathy especially amongst highly intelligent individuals can harm relationships and limit success in teamwork and leadership roles.

Becoming progressively smarter requires a commitment to learning, adaptability, and self-improvement. Even naturally smart individuals can hinder their own progress by falling into the traps of arrogance and perfectionism, and by neglecting their well-being. It’s essential to strike a balance between leveraging natural intelligence and putting in the effort to continually grow and develop.


Author: Flaminio

Intelligence


Intelligence is a multifaceted and complex concept that has intrigued intellectuals for centuries. Its definition, measurement, and understanding have evolved over time, and it continues to be a subject of debate and research.

Can it be viewed as a product, a process, content, or style? Why is it often described as encompassing all of these aspects?

Intelligence as a Product

One way to think about intelligence is as a product, an outcome, or a result of cognitive processes. This perspective is often associated with the idea of intelligence quotient (IQ) and standardized intelligence tests. IQ tests are designed to measure a person’s cognitive abilities and compare them to a standardized population and the distribution across that population. Scores on such tests are often considered a product of one’s intellectual abilities.

IQ tests, such as the Stanford-Binet or Wechsler Adult Intelligence Scale, are designed to assess a range of cognitive abilities, including logical reasoning, problem-solving, verbal comprehension, and mathematical skills. The scores derived from these tests are used to classify individuals into categories of intellectual ability, such as “average,” “above average,” or “below average.”

The product-oriented view of intelligence involves assigning numerical scores to individuals based on their performance on these standardized tests. This scoring allows for the comparison of individuals’ cognitive abilities, and it can be used for various practical purposes, such as educational placement and job selection.
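
To make the product view concrete, the sketch below shows how a raw test score becomes a standard score, assuming the common convention of rescaling to a mean of 100 and a standard deviation of 15 points (the convention used by the Wechsler scales); the norming statistics in the example are hypothetical.

```python
# A minimal sketch of deviation-IQ scoring: a raw score is positioned
# within a norming population, then rescaled so the population mean maps
# to 100 and one standard deviation to 15 points.
# The norming statistics below (42 +/- 6) are hypothetical.

def deviation_iq(raw_score: float, norm_mean: float, norm_sd: float) -> float:
    z = (raw_score - norm_mean) / norm_sd   # standing within the norming population
    return 100 + 15 * z                     # rescale to the familiar IQ metric

if __name__ == "__main__":
    # One standard deviation above the hypothetical norm of 42 +/- 6
    print(deviation_iq(48, norm_mean=42, norm_sd=6))  # -> 115.0
```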

However, it’s important to recognize the limitations of this perspective; standardized tests may not capture the full spectrum of human cognitive abilities, and they can be influenced by cultural and socioeconomic factors. They also do not account for other important aspects of intelligence, such as creativity, emotional intelligence, or practical problem-solving skills.

Intelligence as a Process

Intelligence is not static; it is dynamic and ever-evolving. This perspective emphasizes the cognitive activities and mental processes involved in thinking, learning, and problem-solving. From this viewpoint, intelligence is the ability to adapt to new situations, acquire knowledge, and make informed decisions.

Intelligence as a process encompasses a wide range of cognitive processes, including memory, perception, attention, reasoning, and problem-solving. These processes work together to enable individuals to gather information, process it, and use it to make decisions.

Intelligence also involves the capacity to learn from experience and adapt to changing circumstances. It is not solely determined by innate abilities but is influenced by factors such as education, exposure to new ideas, and the ability to apply knowledge effectively.

Viewing intelligence as a process also allows for an understanding of its development over one’s life. Children may develop cognitive skills at different rates, and most adults can continue to learn and adapt throughout their lives.

Recognizing intelligence as a process has practical implications for education and training. Effective teaching methods should consider the cognitive processes involved in learning and adapt instruction accordingly.

Intelligence as Content

Intelligence includes the content of knowledge and information that we possess at a point in time. This aspect of intelligence relates to what one knows and understands about the world. However, having knowledge alone does not necessarily equate to intelligence; intelligence also involves the ability to use and apply that knowledge practically and effectively.

The content-oriented perspective of intelligence acknowledges that individuals differ in the depth and breadth of their knowledge. Some people may have extensive knowledge in specific domains, such as mathematics, history, and art, while others may have a more general knowledge base.

Intelligence can be domain-specific, meaning that an individual may excel in one area of knowledge but not in others. For example, a person could be highly knowledgeable about music but less knowledgeable about science; this doesn’t make them particularly more or less intelligent, the assessment largely depends on what they are being assessed against and for what purpose.

Intelligence is not just about what one knows, therefore, but also about how effectively one can apply that knowledge to solve problems, make decisions, and navigate real-world situations. This ability to apply knowledge is a crucial aspect of practical intelligence.

Intelligence as Style

Cognitive styles refer to individual differences in how people approach and process information. Styles can be considered as aspects of intelligence because they influence how individuals think, learn, and solve problems. Different cognitive styles can be seen as different approaches to using one’s cognitive abilities.

Analytical vs. creative is an interesting way to think about style: some individuals have an analytical cognitive style, characterized by a preference for systematic and logical thinking, while others have a creative cognitive style, which involves generating novel ideas and thinking outside the box. Both styles can be valuable in different contexts.

Another cognitive style relates to practical problem-solving. Some people excel at finding efficient solutions to everyday challenges, the proverbial “MacGyver”, demonstrating a practical intelligence style. This is particularly valuable in sticky real-world situations, where coming up with imaginative ways to extricate oneself is essentially a show of intelligence.

Emotional intelligence is a distinct cognitive style that involves the ability to recognize, understand, and manage one’s own emotions and the emotions of others. This has become a very popular trait in the corporate world; those who have high emotional intelligence are acknowledged as having an edge over those whose EQ is low. EQ plays a crucial role in social interactions and interpersonal relationships.

The “All Dimensions” intelligence

While each of these perspectives—intelligence as a product, process, content, and style—provides valuable insights into the nature of intelligence, it is essential to recognize that intelligence is complex and multidimensional. Intelligence cannot be fully captured by any one dimension alone. Instead, it is the integration of all, or at least many, of these dimensions that gives us a more comprehensive understanding of human intelligence.

A holistic view of intelligence acknowledges both cognitive abilities and the capacity to apply these abilities to real-world situations. Holistic assessment of intelligence recognizes that intelligence is not limited to a single aspect but involves a combination of cognitive processes, knowledge, cognitive styles, and practical problem-solving abilities.

Intelligence is highly context-dependent. What may be considered intelligent behavior in one situation may not be so in another. This contextual aspect of intelligence highlights the importance of adaptability and flexibility in using cognitive abilities effectively.

Individuals vary in which aspects of intelligence they excel at: some have superior analytical intelligence, while others are more creative or practical. Recognizing and valuing these individual differences is crucial for fostering diversity and innovation, especially when building teams.

Cultural and societal factors shape the way intelligence is defined and valued. Where certain aspects of intelligence are valued more highly than others, this leads to variations in what is considered “intelligent behaviour”.

Since intelligence is not fixed but can be developed and enhanced throughout one’s life, the educational and experiential opportunities offered to individuals often play a significant role in shaping and expanding their intelligence.


Read More
Author: Clinton Jones

Inclusion vs Integration


Diverse needs hold significant importance in modern education for a multitude of compelling reasons. First and foremost, contemporary education places a strong emphasis on inclusivity and equity.

Inclusivity entails recognizing and addressing the diverse needs of students to ensure that all individuals, regardless of their backgrounds, abilities, or disabilities, have equal access to a high-quality education.

Such a focus on equity is aligned with the principles of social justice and human rights.

Legal and ethical obligations also play a pivotal role in emphasizing the consideration of diverse needs. Numerous countries have enacted laws and regulations that mandate educational institutions to provide equal educational opportunities for all students.

This includes the Individuals with Disabilities Education Act (IDEA) in the United States, which necessitates the provision of services and accommodations to meet the diverse needs of students. Complying with these legal obligations is an integral aspect of contemporary education.

In addition to legal imperatives, the realities of today’s world further underscore the significance of recognizing diverse needs. Globalization and cultural diversity have made schools more diverse than ever before, with students hailing from various cultural, linguistic, and socio-economic backgrounds. It is essential to understand and address the diverse needs of these students to foster cross-cultural understanding, tolerance, and effective communication in an interconnected global society.

Advancements in educational research and knowledge have also heightened the awareness of diverse needs. Developments in educational psychology and neuroscience have provided educators with a deeper understanding of how students learn. This knowledge has underscored the wide variability in learning styles, cognitive abilities, and neurological profiles among students. Consequently, tailoring instruction to meet diverse needs is crucial for enhancing learning outcomes.

Contemporary educational theories, such as Howard Gardner’s theory of multiple intelligences, acknowledge that intelligence is not confined to a singular dimension. Instead, students possess a range of strengths and abilities. Consequently, education should be adaptable to accommodate these diverse talents and aptitudes.

Preparing students for a diverse and inclusive workforce is another paramount goal of modern education. To thrive in today’s job market, students must develop skills in collaboration, problem-solving, and communication. Embracing diverse needs within the classroom helps students build these essential skills.

Education is not solely concerned with academic development; it also plays a pivotal role in shaping students’ moral and social growth. Recognizing and respecting diverse needs fosters empathy, tolerance, and social responsibility, contributing to the development of well-rounded citizens.

Inclusive education is seen as the gold standard for students with disabilities. It promotes their integration into mainstream classrooms, offering them opportunities for socialization and access to a more comprehensive curriculum. This, in turn, can significantly improve their long-term outcomes.

The expectations of parents and communities have also evolved. They increasingly expect schools to provide inclusive education that caters to the diverse needs of their children, raising the bar for educational institutions to deliver it.

In education, inclusion and integration are two distinct approaches for accommodating students with diverse needs.

Both approaches aim to provide an equitable and supportive learning environment, but they differ in their philosophies and practices. Additionally, there are nuanced alternatives that blend elements of both approaches.

Inclusion

Inclusion is a philosophy that advocates for the full and active participation of all students, including those with disabilities or special needs, in regular education classrooms and activities. It promotes the idea that every student has a right to be part of the general education setting.

This approach typically involves modifying the curriculum, teaching methods, and classroom arrangements to accommodate the diverse needs of all students. Support services, such as special education teachers or aides, may be provided within the regular classroom to help students with disabilities.

The focus of inclusion is prioritizing the creation of a diverse and accepting learning environment where students of all abilities learn together; the goal is to minimize segregation and promote social interaction among students.

Integration

Integration’s philosophy emphasizes bringing students with disabilities into regular education classrooms on a temporary or partial basis. It may not necessarily involve a commitment to the full inclusion of all students, but rather a blending of students with and without disabilities for specific activities or lessons.

In practice, in integrated settings students with disabilities may spend some of their time in regular classrooms and the rest in special education classrooms or resource rooms. The degree of participation in the general education setting can vary widely.

Integration focuses on providing students with disabilities access to the regular curriculum and social experiences to the extent deemed appropriate, while still acknowledging the existence of separate special education programs.

Alternative approaches

Inclusive Integration: This approach combines elements of both inclusion and integration. It recognizes that students have varying needs and abilities, so it allows for flexibility. Some students may spend most of their time in regular classrooms (inclusion), whilst others may participate in specific subjects or activities in a more specialized setting (integration).

Differentiated Instruction: This involves tailoring teaching methods and content to meet the diverse needs of all students within a regular classroom. Teachers adjust their instruction to accommodate different learning styles and abilities, providing individualized support as needed.

Universal Design for Learning (UDL): UDL is a framework that promotes the design of educational materials, environments, and practices that are accessible to all students from the outset. It reduces the need for separate accommodations by creating inclusive learning experiences.

Co-Teaching: In co-teaching, a general education teacher and a special education teacher work together in the same classroom. This collaborative approach allows for a wide range of support within the regular classroom, catering to diverse needs.

Inclusion and integration represent different approaches to inclusive education, with inclusion being more focused on full participation and integration allowing for varying degrees of participation.

Some alternatives aim to strike a balance between these approaches to best meet the needs of diverse learners in an inclusive education setting. The choice of approach depends on the individual needs of students and the goals of the educational institution.


Read More
Author: Jewel Tan

Hype Cycle


Have you heard of this thing?

More formally, it is referred to as “The Gartner Hype Cycle”, a graphical representation and methodology developed by the research and advisory firm Gartner and first introduced by Jackie Fenn, a VP at Gartner, in the 1995 research report titled “The Hype Cycle”.

The objective is to depict the lifecycle stages that emerging technologies typically go through as they evolve from initial conception to mainstream adoption.

The cycle attempts to provide insights into how technologies are perceived and how they progress in terms of visibility, expectations, and maturity. The Hype Cycle was ultimately developed in response to Gartner’s observation that many new technologies were subject to exaggerated expectations followed by disappointment when those technologies failed to deliver immediate revolutionary changes.

Gartner recognized the need for a more structured way to manage these expectations and provide more realistic guidance for the adoption of emerging technologies. In the 25+ years that have passed since its introduction, it has become well-known and widely used as a framework in the technology industry to analyse and discuss the trajectories of new technologies, their adoption patterns, and the challenges they face.

TL;DR The Cycle is an adaptation of a view on technology life cycles and ultimately tries to deal with perceived discontinuities in adoption.

“One of the more interesting features of Gartner’s Hype Cycle is that it takes into account the unbridled and almost euphoric optimism that accompanies the introduction of some technologies and, of course, the inevitable precipitous decline of the next-best thing.” 1

Usage

The Hype Cycle has its critics, but has helped technology leaders, investors, and decision-makers make informed decisions about when and how to engage with emerging technologies based on a more balanced understanding of their potential benefits and risks.

Business leaders could use the Gartner Hype Cycle as a strategic tool to guide decisions about adopting and investing in specific emerging technologies in the absence of their own proof points.

For example, as part of a landscape assessment, the Cycle provides an overview of various technologies and their maturity levels and can be used to identify technologies relevant to a specific industry and set of business objectives. This helps one stay informed about the latest trends and innovations.

In strategic planning, the Hype Cycle can help leaders anticipate the potential impact of emerging technologies on their business, planning future technology adoption and integration around the expected trajectory of each technology along the cycle.

By understanding where a technology stands on the Hype Cycle, leaders can better assess the associated risks in adoption risk assessments, avoiding being caught up in the “Peak of Inflated Expectations” and instead making decisions based on a balanced view of the technology’s potential benefits and challenges.

When leaders engage in Resource Allocation, they can do this more effectively by considering which technologies are approaching the “Slope of Enlightenment” and “Plateau of Productivity.” These phases indicate that the technology is becoming more practical and can yield tangible results, making it a better candidate for investment.

The Hype Cycle helps leaders create an innovation strategy that aligns with an organization’s goals, guiding them in deciding whether to be early adopters or wait until a technology matures further before investing resources.

Being aware of where competitors are in terms of technology adoption can provide insights into potential competitive advantage. If a business successfully navigates the Hype Cycle, it can gain an edge by embracing transformative technologies at the right time.

When considering technology solutions or partnerships, business leaders can refer to the Hype Cycle to understand the vendor landscape. Businesses can make informed decisions about which vendors align with their technology adoption timeline and objectives.

The Hype Cycle can help leaders educate executive leadership teams, investors, and other stakeholders about the expected progression of technologies, promoting a more realistic understanding of the potential outcomes and timelines.

The Hype Cycle encourages leaders to think about long-term trends and weigh them against the short-term implications of technology adoption. This perspective is crucial for creating sustainable strategies.

Extra considerations

By recognizing that technologies go through different phases, leaders can adjust their expectations accordingly. This minimizes disappointment when a technology doesn’t deliver immediate transformative results.

The Hype Cycle was specifically designed to track the adoption of technologies, as mentioned. It might not capture the nuances of non-technological innovations or concepts that don’t fit the typical technology lifecycle. Different types of innovations may follow different adoption trajectories; for example, consumer products might have their own adoption patterns that are influenced by factors like fashion trends and lifestyle choices.

Some innovations might be deeply rooted in cultural or societal changes that don’t fit the linear progression of the Hype Cycle. The cycle might not adequately account for cultural shifts and their impact on adoption. Innovations in fields with strong regulatory frameworks (such as healthcare or finance) might also be influenced by legal and compliance factors in ways that the Hype Cycle might not fully capture.

Innovations that cater to niche markets or localized needs might not exhibit the same wide-ranging adoption patterns that technology trends often do, and ideas that are incremental improvements or evolutionary changes to existing solutions might not experience the same hype-disillusionment cycle as radical and transformative innovations. The Hype Cycle involves subjective perceptions and market trends, which might not be as relevant for innovations driven by other factors, such as scientific discoveries.

Industry-specific implications

The Hype Cycle has been used to analyse the adoption of new medical technologies, healthcare IT solutions, and digital health innovations and as such it helps healthcare organizations understand the stages of acceptance and integration for technologies like telemedicine, electronic health records, and medical wearables.

In educational technology (EdTech) it has helped educators and educational institutions assess the potential impact and adoption of new teaching tools, e-learning platforms, and educational software. The financial services sector has used the Hype Cycle to evaluate the adoption of financial technologies (fintech) such as blockchain, robo-advisors, and mobile payment solutions.

In automotive the Hype Cycle is often applied to emerging technologies like autonomous vehicles, electric vehicles, and connected car systems. Similar analysis has appeared in the adoption of renewable energy technologies, smart grid solutions, and sustainable practices within the energy sector. The Hype Cycle has been applied to emerging trends in environmental conservation and sustainability, such as circular economy practices, green technologies, and eco-friendly product innovations.

Retailers use the Hype Cycle to understand the adoption of technologies like augmented reality (AR) for shopping experiences, Internet of Things (IoT) devices for inventory management, and contactless payment methods. Media and entertainment have applied it to trends around virtual reality (VR) and augmented reality (AR) applications, streaming platforms, and content delivery technologies.

Applicability is also found in agritech, travel, tourism, real estate, and even the adoption of technologies by government and the public sector.

The hype cycle is an adaptable framework that has proven valuable in understanding the adoption and maturation patterns of innovations across a wide range of industries, but its applicability may need to be tailored to suit the specific characteristics of each industry.


Read More
Author: Flaminio

Mastering Remote Work Success


The shift to remote work has had a significant impact on B2B audiences and webinar attendance. Work-from-home, accelerated by the pandemic, brought several changes in how people engage with media, and with webinars and other online events in particular.

Imagine you’re a software company hosting a webinar for B2B clients. Instead of a generic topic, you decide to address a specific pain point your clients face – data security.

Your webinar perhaps offers insights into the latest data breach trends, real-life case studies of companies that suffered breaches, and actionable steps to strengthen data protection. Attendees leave with a comprehensive understanding of potential vulnerabilities and practical solutions they can implement.

Sounds pretty compelling, doesn’t it?

If you’re a software company and you’re not running regular webinars, you’re missing a trick.

For marketing agencies hosting webinars on content strategy, the standard title of “Effective Content Strategy Tips” is likely more compelling when rephrased as “Unlocking the Power of Content for Greater ROI”; this re-hook captures attention and conveys the value attendees can ultimately expect. So while the content hasn’t necessarily changed, the way it is positioned may change your signup rates.

Would you attend a medical technology webinar on telehealth where there is no expert on the agenda? Probably not; having an SME (Subject Matter Expert) on the agenda could make all the difference in terms of credibility. Medtech companies usually aim to present professionals of repute who have been at the forefront of fields like virtual healthcare. The SME’s reputation will likely draw in a better-sized audience interested in hearing insights from a recognized expert.

Management consulting firms that conduct webinars on leadership development ask attendees to share their top leadership challenges through live polls. This real-time interactivity with the audience makes participants feel their concerns are being addressed directly and keeps the audience engaged. Polls are always a great frame-up for speakers too!

If you run an e-commerce platform, then prospects and existing customers want to hear stories from the trenches – they want to hear other customers’ experiences. Combining slides, video content, and successful customer interaction scenarios with a live chat with customer support teams could create a multi-faceted experience that keeps attendees engaged.

Trying to engage a global B2B audience is tough across time zones. Instead of hosting a single webinar, consider multiple sessions at different times to cater to different time zones; hold a session during European business hours and another during Asian business hours to increase your international reach.
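
As a quick illustration, here is a minimal sketch of sanity-checking a single slot across target regions using Python’s standard zoneinfo module; the slot and the cities are purely illustrative.

```python
# A minimal sketch of checking what one webinar slot looks like across
# target regions, using Python's standard zoneinfo module.
# The slot and the cities are illustrative.

from datetime import datetime
from zoneinfo import ZoneInfo

slot = datetime(2023, 11, 15, 9, 0, tzinfo=ZoneInfo("Europe/London"))

for city in ("Europe/Berlin", "Asia/Singapore", "America/New_York"):
    local = slot.astimezone(ZoneInfo(city))
    # New York lands at 04:00 -- a clear signal to schedule a second session
    print(f"{city}: {local:%a %H:%M}")
```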

You’d think the day of the week doesn’t matter, but it does; selecting a Wednesday for a webinar on, say, streamlining logistics processes is an astute move, since midweek is generally when businesses are in “working mode” and more likely to allocate time for professional development.

A one-hour webinar is tidy, but your speakers will fill all the time you allocate. A two-hour webinar is a bit long and will likely see audiences drop off after sixty minutes anyway. Settle on a 45-minute session where you cover key strategies, present a real-life success story, and host a brief Q&A, ensuring participants’ attention is maintained throughout the event.

The webinar invitation is underestimated in terms of its impact. Invitations carry the message you want to convey and, depending on where your mailing list came from, could be just what existing customers are looking for or might spur prospects into action; use them to point out that you plan to showcase your company’s understanding of customer- and prospect-specific needs.

Something you can leverage as part of the early signup is an incentive. Mention this on the landing page, in the invitation, and in the various calls to action that you might send to customers and prospects. Tempt them with details of a white paper, a discount, a private consultation, or a tchotchke for early signup.

Running a webinar solo, even with an expert in tow, may not tell the whole story you want to tell. Consider finding a partner to join the webinar with you. This could be a reseller, an integrator, a promoter, or an organization with adjacent and vested interests. This will diversify the content, and you will likely harvest sign-ups from their stable of contacts too!

Observers love social proof; this can come in the form of customer or beneficiary testimonials that showcase the positive impact of past webinars, your product, or services. Speak about the value derived from webinars to encourage signups, build audience trust, and demonstrate what participants stand to gain.

After a webinar, always send attendees a follow-up email. The email should include a link to the webinar recording, a summary of key takeaways, and perhaps even something exclusive.

Provided you recorded that webinar, it could have a shelf life beyond your wildest expectations. Once you’ve held the live session, post and promote the past event online in order to harvest more viewers and contacts.

Each of these strategies, when thoughtfully executed, contributes to a successful B2B webinar campaign that holds the potential for attracting a willing and engaged audience.

By focusing on delivering genuine value and respecting the audience’s time and needs, your company can build strong connections and foster lasting relationships.


Read More
Author: Flaminio

Real-Time Data: Enhancing Customer Experience for Your Business


In today’s fast-paced world, customers demand immediate gratification and personalized experiences.

To meet these expectations, businesses must understand their customers’ needs and preferences to create memorable experiences that drive customer loyalty. In this article, we explore various ways that businesses can utilize real-time data to improve their customer experience.

Keeping Customer Data in the Cloud

Storing customer information in the cloud is a convenient and efficient way for businesses to access real-time data. By tracking customer activity, purchase history, and preferences, businesses can gain valuable insights into their customers’ behavior and needs. This information can help businesses personalize marketing campaigns, tailor promotions, and ultimately enhance their overall customer experience.

Inventory Management

In addition to improving customer satisfaction, real-time data can also assist businesses with inventory management. By tracking stock levels and sales patterns, businesses can optimize their inventory and ensure they have the right products available when customers need them. This not only reduces stockouts but also increases sales and ultimately leads to a better overall customer experience.

Real-Time Price Adjustments

Real-time data also provides businesses with the ability to adjust prices in real-time, which is particularly advantageous in the retail industry. With price fluctuations based on demand or supply, monitoring market trends and consumer behavior enables businesses to optimize profits and meet customer expectations. By utilizing real-time data for pricing, businesses can create a competitive edge in the marketplace and build customer loyalty.
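
As a rough sketch of what such an adjustment rule might look like: the demand signal, caps, and price bounds below are hypothetical, and real systems would also weigh inventory, competitor prices, elasticity models, and legal limits.

```python
# A minimal sketch of rule-based real-time price adjustment.
# The demand signal and the bounds are hypothetical.

def adjust_price(base_price: float, demand_ratio: float,
                 floor: float, ceiling: float) -> float:
    """demand_ratio = current demand / expected demand (1.0 = normal)."""
    factor = max(0.8, min(1.2, demand_ratio))   # cap each move at +/-20%
    # Clamp the result to agreed bounds so the automation can't run away
    return max(floor, min(ceiling, round(base_price * factor, 2)))

print(adjust_price(50.00, demand_ratio=1.35, floor=40.00, ceiling=65.00))  # -> 60.0
```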

Single Customer View

To provide personalized experiences, businesses must understand their customers fully. Using a single customer view, businesses can consolidate customer data from multiple sources into one central location. This provides a holistic view of the customer, including their behavior, preferences, and activity across all channels. With this information, businesses can provide personalized recommendations and tailored promotions that meet the customer’s needs.
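
Here is a minimal sketch of the consolidation step, assuming hypothetical source systems and field names; real pipelines add identity resolution, deduplication, and explicit precedence rules.

```python
# A minimal sketch of building a single customer view by merging records
# from several source systems on a shared customer identifier.
# The sources and field names are hypothetical.

from collections import defaultdict

crm = [{"customer_id": "C1", "name": "Ada Lane", "segment": "SMB"}]
web = [{"customer_id": "C1", "last_visit": "2023-09-01", "pages_viewed": 12}]
orders = [{"customer_id": "C1", "lifetime_value": 1840.50}]

single_view: dict[str, dict] = defaultdict(dict)
for source in (crm, web, orders):
    for record in source:
        # Later sources overwrite earlier ones on key collisions; a real
        # implementation would encode explicit precedence and conflict rules.
        single_view[record["customer_id"]].update(record)

print(single_view["C1"])
```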

Geolocation Services

Geolocation services, such as GPS and beacons, offer businesses an excellent opportunity to engage with customers more effectively. Location-based data enables businesses to send personalized promotions and alerts to customers when they are near their stores, which helps drive traffic and improve the overall customer experience. By leveraging location-based data, businesses can create targeted marketing campaigns that resonate with their customers, ultimately leading to improved customer satisfaction.
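
For illustration, here is a minimal sketch of a proximity trigger using the haversine great-circle distance; the coordinates and the 500 m radius are hypothetical, and a real system would also handle consent and rate-limiting.

```python
# A minimal sketch of a proximity trigger for location-based promotions,
# using the haversine formula for great-circle distance.
# Coordinates and the 500 m radius are hypothetical.

from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius is ~6371 km

store = (51.5074, -0.1278)                 # hypothetical store location
d = distance_km(51.5090, -0.1300, *store)  # hypothetical customer position
if d <= 0.5:                               # within 500 m of the store
    print(f"Customer is {d:.2f} km away - send the in-app promotion")
```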

Personalizing the Customer Experience

In today’s competitive marketplace, personalization has become a requirement for businesses that want to succeed. Real-time data provides businesses with the ability to create personalized experiences that resonate with their customers, including tailored marketing campaigns, personalized recommendations, and customized promotions. By utilizing real-time data for personalization, businesses can enhance customer loyalty, improve customer retention rates, and ultimately increase profits.

Improve Customer Service

Real-time data can also help businesses improve their customer service. By tracking customer behavior and preferences, businesses can anticipate their needs and provide proactive support. For example, if a customer has purchased a product, they might need help setting it up. By anticipating this need and offering support, businesses can enhance their customer experience and build loyalty.

Leveraging real-time data can be a game-changer for businesses looking to enhance their customer experience and stay ahead of the competition. With cloud-based storage of customer information, a single customer view, and personalized promotions, businesses can engage with customers more effectively, improving overall satisfaction and loyalty. By providing proactive support and anticipating customer needs through data analysis, businesses can take their customer service to the next level.


Read More
Author: Flaminio

Evolution of work: the office and home


I couldn’t help myself; I was so incensed by the latest article I read on the insistence of some execs that people return to the office that I felt a compulsion to write something a little extended on the topic.

Few would dispute the suggestion that office work has undergone a remarkable transformation in the last couple of years, transitioning from rigid hierarchies and traditional cubicles to flexible, collaborative spaces that embrace remote work and digital communication.

The transformation has been driven by technological advancements, changing societal norms, and a reevaluation of the nature of work itself.

The history and nature of office work is a curiosity. If you were to explore the kinds of jobs associated with office environments you might even be surprised, and if you analyze the shift away from the traditional office setting you start to recognise the implications for corporate culture, collaboration, and information security.

History
The concept of office work can be traced back to ancient civilizations where scribes and administrators maintained records and facilitated communication for rulers and governments. We often forget that clerical staff existed in the time of the Babylonians, the Pharaohs and the Emperors of Rome, and the Moghuls.

The modern office as we know it emerged only during the Industrial Revolution. With the rise of industrialization and the need for efficient administration, large corporations and government agencies established centralized offices to manage various tasks such as record-keeping, communication, and coordination.

Image: Women at work in the bookkeeping room at the Bank of America in 1970 (Hulton Archive/Getty Images).

By the mid-20th century, offices had become synonymous with rows of cubicles, typewriters, and paper files. The hierarchical structure was prominent, with managers overseeing clerical staff performing repetitive tasks. Communication was mostly face-to-face or conducted through interoffice memos.

Office work encompasses a wide range of roles across industries. Some of the typical office jobs include:

  • Administrative Assistants: These individuals provide administrative support, manage schedules, coordinate meetings, and handle correspondence.
  • Accountants and Finance Professionals: Responsible for managing financial records, budgeting, and preparing reports for the organization’s financial well-being.
  • Human Resources Personnel: Oversee recruitment, employee relations, benefits administration, and training programs.
  • Marketing and Sales Teams: Plan and execute marketing strategies, analyze customer data, and manage client relationships.
  • IT Professionals: Maintain the organization’s technological infrastructure, provide technical support, and ensure data security.
  • Project Managers: Coordinate tasks, timelines, and resources to ensure projects are executed efficiently.

There is a nice Indeed article that lays out a raft of occupations once associated with being office-bound.

All Change
The advent of technology, especially the internet, has brought about a significant shift in how work is done.

The rise of personal computers, email, and digital communication platforms redefined the traditional office space. The concept of remote work emerged, allowing employees to perform tasks from locations other than the physical office. Think “teleworkers”.

Many who still hold “office jobs” today witnessed a shift in their understanding of work-life balance. They sought more flexibility in their work arrangements, and employers recognized the benefits of remote work in attracting and retaining talent.

The COVID-19 pandemic accelerated this shift further, as many organizations were forced to swiftly transition to remote work to ensure business continuity.

Remote work not only offers employees the flexibility to balance personal and professional responsibilities but also reduces commute-related stress and expenses. Additionally, it allows organizations to tap into a global talent pool, leading to increased diversity and innovative thinking within teams.

However, remote work is not without its challenges. The lack of face-to-face interaction can hinder spontaneous collaboration, and feelings of isolation might affect employee morale. To overcome these challenges, organizations turn to digital collaboration tools, video conferencing, and project management platforms to foster communication and teamwork. People’s calendars now get filled up with potentially many more meetings.

Virtual team-building activities, online workshops, and regular check-ins help maintain a sense of community among remote workers. Organizations are also incorporating flexible work hours to accommodate different time zones and individual preferences, further enhancing employee satisfaction and well-being.

The shift toward remote work has also prompted a reevaluation of corporate culture (and expenditure). Traditional office cultures often focused on visible signs of productivity, such as time spent at a desk, rather than on outcomes. In contrast, remote work emphasizes results over mere presence. This change in focus should lead to a more results-oriented and trust-based work environment. But all this assumes that business leaders are actually focused on outcomes and are willing to measure and track outcomes and dare I say, even recognize what kind of outcomes they want to see. What is the baseline measure for productivity? Many can’t say for certain.

Focusing on Effectiveness
Productivity and effective communication often go hand in hand; it is quite obviously essential for any organization’s success that what needs to be done, and how, is communicated with clarity. It is equally important for team members to understand how they will be assessed and how they should self-assess.

The digital age has brought both opportunities and challenges: digital tooling facilitates quick communication but can also lead to information overload and misinterpretation.

It’s crucial for organizations to establish clear communication guidelines and encourage active listening and empathy, irrespective of whether employees are working remotely or in the office.

Clerical, administrative, and office work has moved squarely beyond the physical office but has brought another challenge front and center. That concern is ensuring information security.

Controlling information assets has become paramount. Remote work opens up new avenues for cyberattacks and data breaches. Organizations now have to invest in robust cybersecurity measures, including encryption, multi-factor authentication, and regular employee training on recognizing and responding to security threats.
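
To make one of these measures concrete, here is a minimal sketch of time-based one-time passwords (TOTP) for multi-factor authentication using the pyotp library; the secret is generated inline purely for illustration, whereas real deployments provision one per user and store it encrypted.

```python
# A minimal sketch of TOTP-based multi-factor authentication using the
# pyotp library (pip install pyotp). The secret is generated inline for
# illustration; real systems provision one per user and store it encrypted.

import pyotp

secret = pyotp.random_base32()         # provisioned once per user/device
totp = pyotp.TOTP(secret)

code = totp.now()                      # the 6-digit code an authenticator app shows
print("Current code:", code)
print("Verified:", totp.verify(code))  # server-side check of the submitted code
```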

Moonlighters
A further concern that has been raised is principally focused on offshore workers, but it is just as relevant onshore and near-shore too. The concern is about the potential for remote workers to engage in moonlighting activities.

Moonlighting refers to the practice of employees taking on additional jobs, often during their official work hours, without the knowledge or consent of their primary employer.

This phenomenon can lead to decreased productivity, compromised work quality, and conflicts of interest. Additionally, moonlighting might result in the leakage of sensitive company information to competitors or unauthorized parties. Addressing this risk is essential to maintain the integrity of remote work arrangements and uphold the organization’s values.

Clear communication and policies are the first steps in mitigating the risk of moonlighting; the organization’s position needs to be communicated unambiguously.

Employment contracts or remote work agreements should include explicit clauses that address moonlighting, specifying whether it’s allowed, prohibited, or requires prior approval. These policies should outline the potential consequences of engaging in unauthorized work during official work hours, emphasizing the importance of transparency and ethical behavior.

It will help if organizations implement robust performance metrics that allow managers to gauge remote workers’ productivity and work quality objectively.

Regular performance evaluations and goal-setting sessions can provide insights into an employee’s commitment and focus. When employees are held accountable for their output, they are more likely to prioritize their primary job responsibilities over moonlighting activities.

Cultivating a culture of trust and engagement within the organization is the fine line to walk here.

When employees feel valued and respected, they are less likely to seek external employment opportunities that might compromise their primary role. Encouraging open dialogue between managers and remote workers allows them to voice any concerns or challenges they might be facing. This approach fosters a sense of belonging and reduces the temptation to engage in moonlighting.

Business leaders should also consider offering flexible work arrangements that accommodate employees’ personal needs and interests.

By allowing flexible work hours, employees may have the opportunity to pursue personal projects or interests outside of their primary role without resorting to moonlighting during official work hours. This can strike a balance between promoting individual growth and ensuring productivity.

Technology can also be used to monitor employee productivity and engagement during work hours; it can provide insights into how employees spend their time while working remotely. However, it’s important to use these tools transparently and ethically, respecting employees’ privacy and autonomy and, of course, respecting local laws.

Leaders who maintain regular check-ins and communication channels between managers and remote workers, and who stay connected and informed about ongoing projects and challenges, have the advantage.

Engaged managers can identify potential signs of moonlighting and address them proactively. These interactions also reinforce the organization’s commitment to maintaining high work standards.

Dealing with executive fear
Executives have long associated the visibility of their employees’ physical presence in the office with a sense of control, productivity, and the traditional markers of success.

However, focusing solely on the visibility of physical bodies is not conducive to fostering a thriving and innovative work environment. Execs should overcome their concerns about visibility and consider rethinking the nature of corporate offices.

Success is measured not by the number of hours an employee spends at their desk but by the results they achieve. Rigid adherence to physical presence can create a culture where employees prioritize appearing busy over delivering meaningful outcomes. Execs who shift their focus to valuing tangible contributions rather than mere visibility encourage a more results-oriented approach among their teams.

I already mentioned flexible work arrangements, including remote and hybrid models. This demonstrates trust in employees’ ability to manage their work effectively regardless of their physical location. This trust, in turn, boosts employee morale, loyalty, and commitment to the organization. By allowing employees to balance work and personal responsibilities, executives show their dedication to employee well-being and work-life integration.

Rethinking the traditional office model allows organizations to tap into a diverse talent pool that might not be able to commute to a physical office location. Remote work and flexible arrangements open the doors to hiring individuals from different geographies, backgrounds, and perspectives. Diversity can drive innovation, creativity, and a broader range of ideas.

Redefining the nature of corporate offices can lead to cost savings related to office space, utilities, and facilities management. Such resources can be redirected toward investing in technology, employee development, and other areas that contribute to the organization’s growth and sustainability.

Execs should shift their perspective on corporate offices from being showcases of power and impressiveness to becoming hubs of collaboration, innovation, and employee well-being.

Rather than focusing on grandeur, consider designing office spaces that foster collaboration and teamwork. Create open and flexible workspaces that encourage spontaneous interactions, idea sharing, and cross-functional collaboration. These spaces can include communal areas, breakout rooms, and digital tools that facilitate virtual collaboration for remote team members.

Prioritize employee well-being by providing amenities that enhance their physical and mental health. Incorporate elements like ergonomic furniture, natural lighting, recreational spaces, and wellness programs. Execs who invest in employee well-being create an environment where employees feel valued and motivated to contribute their best.

Technology infusion throughout the office environment facilitates communication and enhances productivity. Video conferencing systems, interactive displays, and collaboration platforms connect teams across different locations and time zones. This approach ensures that both in-office and remote employees can participate in discussions and projects effectively and seamlessly.

Shift the focus from measuring success based on office hours to evaluating outcomes and contributions, and set clear performance metrics that align with business goals; this encourages employees to take ownership of their work and empowers them to demonstrate their value through tangible results.

Consider giving employees the ability to design their workdays and environments in ways that optimize their productivity and well-being. Allow them to choose when and where they work, within the boundaries of team collaboration and project deadlines. Execs who prioritize employee empowerment create a sense of ownership and accountability.

Remote work is here to stay; organizations need to reimagine their corporate culture, foster collaboration across distances, and address the challenges of effective communication and information security.

The office is no longer bound by physical walls; it’s now a dynamic space where technology and human creativity intersect to shape the future of work.

By embracing these changes, organizations create a harmonious balance between tradition and innovation, leading to increased productivity, employee satisfaction, and success in the ever-evolving landscape of office work.


Read More
Author: Uli Lokshin

Nurturing a Culture of Empowerment for Innovation in Business Leadership


In my view, effective leadership involves empowering employees and fostering a culture of innovation amongst teams. This is particularly important in industry sectors where the path to both specific and vaguely specified outcomes is imprecise. These may include technology and software development, entertainment and media, fashion and design, advertising and marketing, renewable energy, and sustainability.

The concept of autonomous yet thoughtful decision-making is a powerful leadership strategy that helps to drive desired productive outcomes. Many may understand the significance of autonomy and empowerment but not acknowledge its importance in various business settings.

This often means emphasizing the need for a shift away from very traditional command and control (C&C) models. C&C is prevalent in more traditional and bureaucratic organizations; it has often been associated with industries where standardization, efficiency, and compliance were crucial, such as military, manufacturing, and certain government sectors.

Some of the key characteristics of C&C include centralized decision-making where the decision-making power is concentrated in the hands of those at the top. This approach often leaves little room for input from the lower levels of employees. There’s a chain of command and decisions are typically passed down that chain.

A second common characteristic is the strictness of the hierarchy. Organizationally the structure is typically like a pyramid with clearly delineated lines of authority and control. Each level reports to the one above it, and instructions flow from the top down. There may be an emphasis on discipline and control to ensure that employees adhere to prescribed and somewhat predictable processes in order to meet performance expectations.

C&C is often characterized by rigid adherence to rules, procedures, and protocols. Employees are expected to follow specific guidelines as prescribed, without deviation. In line with the pyramid, communication follows formal channels, such as through managers and supervisors, and information or insights may be limited as communication slows on its way up and down the hierarchy. Everyone is assigned specific roles and responsibilities, and tasks are somewhat clearly defined. Employees have little autonomy to make decisions or exercise creativity; the focus is on carrying out assigned tasks as directed.

While this leadership model can be effective in certain circumstances as I previously described, it is often criticized for its limitations in an ambiguous, dynamic, and fluid business environment.

In industries that require adaptability, creativity, and innovation, like the tech sector, the command and control model, in my experience, actually hinders employee engagement, limits the flow of ideas, and inhibits organizational agility. I am more in favor of participative and collaborative leadership that empowers employees and fosters a culture of innovation and a genuine desire for ownership and accountability.

Instead, I advocate for a more informal, relaxed, and collaborative leadership approach that encourages creativity and innovation, where the leadership team functions as player-coaches and builds genuine consensus and collective agreement on big and small decisions through dialog and negotiation.

Growth through empowerment

If you want growth, then empowering employees goes beyond simple delegation; it requires trusting individuals to make informed decisions and providing them with the necessary resources and autonomy to act. In so doing you foster a sense of ownership and accountability within the workforce which then leads to higher job satisfaction and improved overall productivity.

The core of successful empowerment lies in striking the right balance between autonomy and thoughtful decision-making. You should want autonomous yet well-considered decision-making from your employees. Autonomy allows teams and individuals to leverage their expertise and creativity to address complex challenges effectively. However, it must be complemented with considered decision-making, where employees gather information, seek advice, and analyze potential outcomes before acting. Remember, you’re paying these people for their expertise and to perform a particular job. If you’re only interested in barking instructions at them, then you may as well just hire unskilled people with no particular specialisms or experience.

Tailoring empowerment models to your business settings is important, since the universal benefits of empowerment and autonomy don’t necessarily manifest in all cultures or work settings. The application should therefore be tailored to suit the specific business contexts you have. There are a few different implementation models to consider: task-based, team-based, and individualized.

Task-based empowerment is typical for industries with routine tasks; it can streamline processes and enhance productivity. By granting employees authority over specific responsibilities, business leaders enable them to make decisions related to their assigned tasks, boosting efficiency. For example, without disrupting efficiency and effectiveness, employees can rotate and resequence their tasks according to their preferences and what they observe works best.

Team-based empowerment is most appropriate in dynamic environments which ultimately benefit from improved collaboration and collective decision-making and where these activities take center stage. By allowing teams to pool diverse perspectives and expertise, leaders have the potential to tap into the opportunities afforded by collective innovation.

In roles requiring specialized skills, individual-based empowerment can be highly effective. Leaders empower subject matter experts to make decisions in their areas of proficiency, fostering innovation and excellence in technology-driven fields.

C&C, with its centralized decision-making and strict protocols, works somewhat well in highly regulated industries, but this approach stifles creativity and limits adaptability in technology development. Employees may feel restricted, resulting in decreased motivation, innovation, and engagement.

Conversely, the informal and relaxed leadership style promotes open communication, trust, and collaboration. By empowering employees to make autonomous decisions, leadership fosters an essential culture of innovation and agility. This approach is particularly effective in software development and technology-driven operations, where creativity thrives in a flexible environment.

Getting the best out of teams and individuals

Getting the best out of these two quite different approaches still requires you to set clear objectives with agreed, measurable outcomes. Empowerment means providing clear objectives and expectations; well-defined goals using the SMART framework (Specific, Measurable, Achievable, Relevant, Time-bound) help individuals self-manage their work efficiently.


Another facet is effective time management; autonomy allows individuals to manage their own time, but discipline is essential for ensuring its effective use. Encouraging employees to prioritize tasks, set deadlines, and avoid distractions maintains their productivity.

Autonomous employees must also be accountable for their work. Encouraging ownership and transparent progress reporting foster a sense of responsibility.

Self-Management in Project and Program Management

In technology development, adopting an agile methodology enhances self-management. Empowered teams self-organize, collaborate, and make quick decisions to adapt to changing requirements effectively.

Leaders can further empower teams by providing autonomy in decision-making. Open communication and input from team members drive a self-managed and collaborative environment.

Project management itself involves ongoing learning and improvement, allowing employees to reflect on progress and take initiative. These empowering approaches support positive change and have a greater likelihood of driving success.

As already suggested, empowerment also requires balancing discipline with flexibility. Research has found that innovation thrives in more flexible environments. Leaders must therefore be open to diverse methods and ideas, trusting teams to find effective solutions. Open channels of communication facilitate not only bidirectional trust; they also support employee self-management and lead to continuous improvement.

A few words of caution

Sometimes, in our earnest efforts to empower others and provide autonomy, we may inadvertently deceive ourselves into believing that we have relinquished command and control.

Despite statements of intent, the empowerment we claim to have granted can be a self-deceiving illusion. We might unknowingly perpetuate micro-management overreach by behaving in exactly the opposite way to what we say we are thinking. This can occur when we continuously bombard teams with questions, second-guess their independent decisions, and frequently challenge their judgment. Individuals and teams occasionally need to fail in order to learn. While our intention may be to offer support and ensure success, our actions may inadvertently stifle creativity and autonomy.

True empowerment necessitates trust, and allowing individuals the space to take ownership of their work is critical. Constantly questioning decisions sends mixed signals: who is actually making the decisions? Undermining the confidence of team leads and team members impedes their ability to innovate freely. To genuinely empower others, we must let go of control, offer guidance only when sought, celebrate successes, recognize missteps, and offer encouragement, coaching, and reassurance.

Occasional mistakes are part of the learning process. By fostering a culture of trust and granting autonomy, we can break free from the command-and-control mindset, unleashing the full potential of our teams to drive creativity and achieve remarkable results.

Nurturing a culture of empowerment is essential for fostering innovation in business leadership. By tailoring empowerment models to the specifics of the business setting and adopting informal leadership styles, leaders can cultivate creativity and adaptability, particularly in software development and technology-driven operations. Encouraging discipline and self-management in tasking and project and program management enables employees to thrive in an environment that values autonomy while maintaining focus and efficiency. Striking the right balance between discipline and flexibility empowers teams to innovate, drive success, and contribute to sustainable growth.

Author: Clinton Jones

Choosing Technology Solutions


Factors Influencing Decision-Making in Parity Competitive Products

It is interesting, when you work in a competitive technology space, to explore how decisions are arrived at, especially in terms of customer technology choices.

As individuals, we face these challenges regularly, perhaps without really thinking about the process much. We do it when choosing appliances, mobile phones, cars, houses, and so on. Our choices and decisions about products are influenced by a complex interplay of cognitive, emotional, social, and situational factors.

Researchers delve ever deeper into these dynamics to help businesses create more effective marketing strategies and to aid policymakers in promoting informed decision-making among consumers, but in the end there doesn't seem to be a magic formula for how we settle on a decision.

In the corporate world, the same challenge exists: selecting the most suitable technology solutions to meet specific needs and objectives. Whether it's for software, hardware, or other IT solutions, the decision-making process is often complex and critical for success, both personally and for the organization.

In cases where competing technologies exhibit similar functionality and capabilities, additional factors become crucial in influencing the final selection. Consider the significance of various factors in the decision-making process, including the character and personality of the customer account manager, the presales consultant, the engagement process, pricing, company size, customer references, and other pertinent aspects, all of which, in my experience, influence the final outcome.

Picking and choosing

Organizationally, defining a need and a budget is often the starting point. Something doesn't work, something is broken, or something is problematic. The idea might be to automate, to replace, or to renew. When organizations seek particular technology solutions, they will typically define requirements and objectives clearly.

The framing of the problem is often part of the budget-requesting process, but sometimes the solution is already conceived and the problem statement is relegated to an afterthought. Appropriate requirements definition involves understanding the specific problems to be solved, suggesting the desired outcomes, and arriving at the key performance indicators (KPIs) that will be used to measure the success of the chosen solution. If you don't have a problem statement and you don't have success measures, then you likely don't have a clear vision.

This may seem incredibly obvious, but if we again revert to thinking like a consumer: if you want to replace your refrigerator, you must be clear as to why you want to replace it. You may have experienced a windfall, but what is the problem with the fridge you have? You may have moved and have no room for your old fridge, or no fridge at all, or your existing fridge may actually be defective or broken. Your problem statement likely isn't "I have money to burn – I will buy a fridge"; it has to be something else. As an organization, there has to be a clear vision about what problems are to be solved through the acquisition, and this in turn initiates an evaluation and selection process.

Requests

For big corporate purchases, there is the RFI, RFP, and RFQ process. Some organizations use all of these, some just one or two, and the thresholds vary according to specific purchasing policies.

An organization may start by issuing a Request for Information (RFI) to potential vendors. This initial step helps gather general information about products and solutions available in the market. The RFI allows vendors to provide high-level overviews of their offerings, giving the organization a broad understanding of what is available. The trigger for an RFI is often preliminary or cursory research: industry analyst insights, comparison sites, or simply familiarity with particular technology vendors.

After the RFI phase (which might be skipped altogether), the organization may choose to issue a Request for Proposal (RFP) or Request for Quotation (RFQ) to a select group of vendors. These documents contain detailed specifications, requirements, and evaluation criteria.

Vendors respond with comprehensive proposals outlining how their technology solutions align with the organization's needs. Invited participants may need to sign non-disclosure agreements (NDAs), or may choose to ignore the RFP/RFQ entirely, either because they determine the prospective customer is not their ideal customer or because the expectations of the request are misaligned with their capabilities.

My first exposure to "tenders" was back in the late 1980s, when I participated in a closed technology consulting services tender for a governmental tourism department. Naively, I was pretty confident in my abilities, and those of my partners, to deliver a comprehensive and coherent service that was perfectly framed in the requirements document. What I hadn't considered was who I would be competing with and their approach to providing proof of past successes. The price of the offer, it seemed, would not be the only selection criterion.

This, then, brings us to the competitive selection process. It isn't always a bidding war. In cases where the RFP/RFQ responses reveal that multiple technology solutions demonstrate parity, with similar functionality and capabilities, the decision-making process is more intricate. Identifying the subtle differentiators among competing products requires careful consideration when selecting the best-fit solution.

The hard comparisons

Exposure to the UI often comes from live or recorded demonstrations. Though these are often delivered in a heavily managed way, the user experience (UX) is a significant factor that can sway the decision. A technology solution that appears intuitive and user-friendly, and purportedly requires minimal training for employees, will likely be preferred over one that is complex and seemingly difficult to navigate. Usability assessments, user testing, and interface evaluations may be the next step and may help gauge the product's intuitiveness and its potential impact on productivity. These typically occur during evaluations, Proof of Concept (PoC) demos, or the like.

The ability of a technology solution to seamlessly integrate with the organization's existing systems and infrastructure can significantly impact its perceived value. It is one thing to claim in a response that the integration is there; it is another when the integration is physically demonstrated or proven. Compatibility and interoperability are often essential considerations; they can reduce implementation complexity and overall cost. Organizations assess the extent of existing integrations, the ease of connecting with a new solution, and the potential for future integration needs or promises.

Scale is important too. Organizations often prioritize solutions that can grow and adapt to future needs. A technology solution that can accommodate expansion and changes in requirements ultimately offers longer-term value. Scalability assessments involve evaluating how the solution performs under various load scenarios, such as increased user counts, expanded datasets, or additional integrations; these may be tested directly or require detailed testing evidence.
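To make that concrete, here is a minimal sketch of the kind of load scenario a scalability assessment might script. Everything in it is an illustrative assumption – the endpoint URL, the concurrency steps, and the thresholds are stand-ins, not any particular vendor's system.

```python
# A minimal load-scenario sketch: ramp up concurrent simulated "users"
# against a hypothetical endpoint and record response times at each step.
# The URL and step sizes below are illustrative assumptions only.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

ENDPOINT = "https://example.com/api/health"  # hypothetical endpoint

def timed_request(_):
    """Issue one request and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    with urlopen(ENDPOINT, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

for users in (1, 10, 50, 100):  # increasing simulated user counts
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(timed_request, range(users)))
    print(f"{users:>4} concurrent users: "
          f"avg {sum(latencies) / len(latencies):.3f}s, max {max(latencies):.3f}s")
```

Even a crude ramp like this quickly shows whether latency degrades gracefully or collapses as concurrency grows, which is the evidence a buyer is really after.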

The level of support and maintenance offered by the vendor can heavily influence the decision. A responsive support team and well-defined service level agreements (SLAs) are often deemed critical, especially in complex technology implementations. The availability of a 24/7 help desk, an online knowledge base, and timely issue resolution are aspects that can significantly impact the organization’s overall satisfaction with the vendor.

Softer comparators

Execs often don't like to hear that people count. The preferred perspective is that the solution and the company's reputation stand for themselves. The idea that buying IBM, for example, was a safe technology bet ("nobody ever got fired for buying IBM") echoed through many IT department halls in the 1960s, '70s and '80s, even though it was never actually the company's tagline. Though IBM never used the phrase in advertising, some strap lines did allude to it: "At IBM, we sell a good night's sleep," per Origin.

“It was thought up by someone in corporate America cleverly commentating on the danger to one’s job when selecting pricey software that… didn’t quite pan out.” opines Cole Shafer on Honey Copy.

Another view might say: if you're looking for something that is likely tried and tested, go with one of the "big boys" as opposed to a Wang, RCA, Data General, Digital Equipment Corporation (DEC), Control Data Corporation (CDC), or Scientific Data Systems (SDS/Xerox), who, like Wang in the mid-1980s advert below, might use the phrase to explain away a failed tender to their own execs.

[Mid-1980s Wang advertisement]

But consider yet another perspective: the one where the prospective customer chases the underdog because they feel they have a greater ability to influence the design, feel, and roadmap of a particular product, or because they feel they have leverage. This may seem counterintuitive, but it is definitely a potential factor to consider. Assuming that because you're the underdog you don't have a chance may be the wrong assumption to make.

When technology solutions reach parity in functionality and capabilities, various other factors play pivotal roles in the decision-making process and some of these are unpalatably soft for execs to stomach.

The Midas touch

The personality, knowledge, expertise, and responsiveness of the customer account manager and presales consultant can leave a lasting impact on the prospect.

A strong relationship with the vendor’s representatives instills confidence in the partnership and can lead to better collaboration throughout the deal brokerage, the implementation, and beyond.

Organizations will often indirectly look for account managers and consultants who take the time to understand their unique challenges and propose appropriately tailored solutions or product positioning that align with their specific needs.

An empathetic approach that focuses on building trust and addressing concerns can ultimately be a deal-maker since it fosters a more positive and productive relationship between vendor and client.

The engagement process proposed by the vendor, including project management methodologies and communication practices, is also crucial.

A well-structured engagement plan can lead to smoother implementations and successful outcomes; it essentially describes to the buyer how you plan to present your organization and offerings to the prospect.

Organizations often evaluate the vendor's approach to project management by looking at how they manage the opportunity as a project, including the allocation of presales and supplementary support resources, communication frequency, and risk mitigation strategies.

Effective project management processes, even in the sales cycle, help ensure that the pitch stays on track and that potentially derailing issues are addressed promptly.

While pricing is always going to be a factor, it becomes even more crucial when technologies are at parity. Organizations may consider other factors such as the spread of costs, including upfront costs, ongoing maintenance expenses, licensing models, and potential hidden costs.

An unseen (by the vendor) comprehensive cost-benefit analysis may be conducted, considering both short-term and long-term financial implications. This means that, as the seller, you must strike a balance between budget constraints and the perceived value of the technology solution.

It's hard to be bigger than you are, but the size and reputation of the vendor's company are influential.

Established companies with a strong track record of successful implementations and a significant customer base may be perceived as more reliable. However, smaller companies with niche expertise may also offer unique advantages, such as personalized service and a higher level of attention to individual client needs. Organizations must evaluate their risk tolerance and assess the potential benefits and drawbacks associated with companies of different sizes.

A way to compensate for size is with in-person, written, and telephonic references. Customer references and case studies provide valuable insights into real-world experiences with the solution.

Organizations often seek feedback from existing customers to gauge satisfaction and success rates. Engaging in conversations with references allows the organization to ask specific questions related to its unique requirements, implementation challenges, and the vendor’s responsiveness to their needs. Additionally, case studies that showcase successful implementations in organizations with similar profiles offer valuable validation of the technology’s suitability.

Pudding time

Some technology vendors simply won't entertain a PoC at all, and the reasons for declining are legion: the endless PoC with seemingly infinite scope creep, poorly articulated success criteria, the cost to serve, deciding how to support it, and so on.

A Proof of Concept (PoC) can be a powerful tool in the decision-making process. It allows the organization to evaluate how the technology performs against specific expectations and for specific use cases before making a final commitment.

A well-designed PoC should focus on validating critical aspects, such as performance, security, scalability, and integration capabilities, but success hinges on organizations working closely with the vendor to define clear success criteria, ensuring that the evaluation process remains objective and aligned with their objectives.

Communication

This is really more of the softer stuff, but ultimately it needs to be acknowledged that in technology decision-making, the human element plays a significant enough role in shaping perceptions and building trust that it is not uncommon to see vendors roll in a larger cast of characters when the opportunity manager sees the deal teeter on a buy/abandon decision.

In my days as a customer and working on the vendor side, the "suits" would often arrive when special encouragement or reassurances were needed. I was always amused, for example, by the chocolate-brown suits that the IBM account execs would often wear, and the shiny blue suits that the guys from SAP would often be seen wearing. I don't remember the ones the Compaq and HP guys wore, but everyone was in a suit!

The most competent account managers and presales consultants, those who genuinely understand the organization's pain points and challenges, pitch and propose solutions that resonate with the organization's objectives. They're active listeners who respond when asked, as opposed to those who listen only in order to respond. Assuaging an organization's concerns, asking clarifying questions, and demonstrating empathy for the challenges the organization faces all help in developing a deeper understanding of the organization's unique context.

Effective communication and responsiveness build confidence in the vendor’s ability to address any concerns or issues that may arise. Timely responses to queries and proactive communication foster a sense of partnership and reliability. Candid and transparent communication about timelines, milestones, and potential risks helps manage expectations and allows the organization to plan accordingly.

Big purchases are all about partnerships rather than just transactions. A long-term commitment from the vendor fosters a sense of security and suggests a culture of collaboration. Vendors that demonstrate a vested interest in the organization’s success are perceived to be more likely to provide good ongoing support, upgrades, and enhancements, and commit to collaborate on future initiatives.

To foster a long-term partnership, organizations will always seek out vendors who prioritize customer success and demonstrate a commitment to continuous improvement.

What's your perspective, or do you think it is all a perfect confluence of timing, product, price, place, and people – essentially luck?


Author: Clinton Jones

A question of choice


The question of “choice” has been extensively explored in various fields such as psychology, behavioral economics, marketing, and decision-making research. Understanding how people choose products and what factors influence their decisions is of great interest to businesses, policymakers, and academics alike.

  1. Decision-making process: Research suggests that decision-making is a complex process influenced by numerous cognitive, emotional, and social factors. People often go through multiple stages in their decision-making journey, including problem recognition, information search, evaluation of alternatives, and finally, the actual choice.
  2. Rationality vs. heuristics: Traditional economic theory assumes that individuals make rational choices based on objective information and maximize their utility. However, numerous studies have shown that people often rely on mental shortcuts or heuristics to simplify decision-making, leading to biases and suboptimal choices.
  3. Cognitive biases: Various cognitive biases affect how people perceive information and make decisions. For example, the “availability heuristic” causes people to overestimate the likelihood of events based on how easily they can recall related instances. The “anchoring effect” occurs when an initial piece of information (an anchor) influences subsequent judgments.
  4. The paradox of choice: Studies have shown that an abundance of options can lead to decision paralysis and increased dissatisfaction with the chosen option. Having too many choices can overwhelm individuals, making it difficult for them to settle on a decision.
  5. Emotional factors: Emotions play a significant role in decision-making. People may choose products based on how they want to feel or how they perceive the product will enhance their emotions (e.g., happiness, status, security).
  6. Social influence: People are often influenced by the choices and opinions of others. Social proof, where individuals follow the actions of others in ambiguous situations, and peer recommendations can sway decision-making.
  7. Framing and presentation: The way choices are presented can significantly impact decision-making. For instance, whether a product is framed as a “gain” or “loss” can influence how people perceive its value and desirability.
  8. Branding and marketing: Effective branding and marketing strategies can influence consumer choices by creating strong associations between products and desirable attributes or emotions. Advertising and endorsements can sway consumer preferences.
  9. Habit and routine: In some cases, people may choose products out of habit or routine, without consciously considering alternatives. Habitual decision-making relies on automatic processes rather than a deliberate evaluation of options.
  10. Personal values and identity: Consumers often make choices that align with their values, self-concept, and identity. A product may be chosen because it represents a person’s ideal self-image or reflects their personal beliefs.

People’s choices and decisions about products are influenced by a complex interplay of cognitive, emotional, social, and situational factors. Researchers continue to delve deeper into understanding these dynamics to help businesses create more effective marketing strategies and to aid policymakers in promoting informed decision-making among consumers.


Author: Uli Lokshin

Crossing the Data Divide: Closing Gaps in Perception of Data as Corporate Asset
It’s my great pleasure and honor to begin as a columnist for TDAN.com. TDAN.com has a long and distinguished record of giving voice to ideas in the data space and I will do my best to continue that tradition. The Crossing the Data Divide column will be aimed at data leaders. That is to say, […]


Author: John Wills

Cracking the whip


The bullwhip effect is a phenomenon where small changes at one end of a system cause large fluctuations at the other end. You'll see it most frequently associated with supply chain and logistics, where imbalances and fluctuations in demand and supply are amplified as variability moves up the chain.

An adjacent area of thinking is the cobweb theorem in economics, which relates market price variability to those same supply and demand dynamics. The bullwhip effect can lead to excess inventory, lost revenue, and overinvestment in production, whereas the cobweb theorem can lead to radical swings in market prices.

The bullwhip effect and the cobweb theorem are related because both show how small changes in demand can cause large changes in supply. Ultimately, both stem from imperfect information and reactive behaviour in an end-to-end system that is not perfectly aligned or coordinated, leading to overreaction or underreaction by the system participants.

Bullwhip effects, and cobweb-based pricing in particular, often lead to inefficiency, waste, and instability.
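The amplification is easy to demonstrate in a few lines of code. The following is a minimal, entirely illustrative simulation (the stage names, noise level, and over-correction factor are invented for the sketch): each stage in a four-stage chain orders what it just observed plus a crude trend extrapolation, and the variance of orders grows stage by stage even though end-customer demand barely wobbles.

```python
# A toy bullwhip simulation: each stage orders the demand it observed plus
# half of the most recent change it saw (a crude trend extrapolation).
# Small noise at the customer end produces much larger swings upstream.
import random

random.seed(42)
stages = ["retailer", "wholesaler", "distributor", "factory"]
orders = {s: [] for s in stages}

for _ in range(52):  # one year of weekly ordering
    # Small wobble in end-customer demand around a steady 100 units.
    signal = max(0.0, 100 + random.gauss(0, 5))
    for s in stages:
        prev = orders[s][-1] if orders[s] else signal
        order = max(0.0, signal + 0.5 * (signal - prev))  # over-correction
        orders[s].append(order)
        signal = order  # this stage's orders become the next stage's demand

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

for s in stages:
    print(f"{s:>12}: order variance {variance(orders[s]):8.1f}")
```

Running it shows the factory's order variance ending up several times the retailer's: the whip cracks hardest at the far end of the chain.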

Effects on software development

One of the systems that can experience the bullwhip effect is software development. In the process of creating, testing, and deploying software applications that meet the needs and expectations of customers, I see many stages and participants: product managers, sponsors, developers, testers, development managers, and of course customers and users. At each stage, the participants have different information and incentives that affect their decisions and actions.

The bullwhip effect can occur in software development when there is a mismatch between the actual demand for the product or features in the product and the perception of demand in the minds of those within the software organization.

A customer may request a minor enhancement, for example, but those involved in the development process may interpret this as a major change, or vice versa. More or less time and resources may be invested in the development of a feature or capability than is necessary. Let's also be clear: changes may also be triggered by other imperfect information, such as the customer's position in the software house's revenue contribution chain or the software adoption lifecycle within the customer. Customer X may be a household brand but make a small bottom-line contribution, while customer Y may be an unknown brand yet have significant economic value to the business, with a host of other possible combinations in between.

All of these factors can lead to software product delivery delays, overruns, and potential waste. If the customer requests a major change in the software but this is underestimated, it can lead to inadequate results, errors, defects, customer dissatisfaction, and the risk of losing the account.

Interpretation of requirements is critical and yet so often overlooked beyond the face value of the ask. Sometimes "the ask" is founded on anecdotes and opinions instead of evidence, on insufficient modelling, an absence of prototyping, and minimal market feedback. It is almost as if we fear hearing the critique and are just eager to build a solution.

There is also the proverbial problem of poor communication or handoff of requirements among the various participants. This can lead to distorted information and inaccurate requirements as a whole.

When stakeholders place overly ambitious demands on product or development teams, or make decision pivots in an erratic or irregular way instead of making small, progressive, incremental changes regularly, you can end up with spikes of activity, abandoned initiatives in flight, and incomplete work. These lead to resourcing and planning confusion, delivery crises, and potentially wild cost variations. Everything becomes urgent, and predictability of delivery goes out the window in favour of the latest shiny new thing.

Delivery crises and cost or estimate variations introduce uncertainty and anxiety into delivery assurance and negatively impact the usefulness of the roadmap and delivery plans. Promises or suggestions of intent made today settle as dust by tomorrow.

Deals contingent on feature delivery; renewals contingent on feature delivery; omissions of detailed fact that allow unchallenged assumptions about the presence or capability of features; over-optimism about actual capabilities relative to real, present functionality. All of these may encourage, perhaps even induce, customer deals, but they create feature-alignment uncertainty and ruin the best-made roadmap plans, and planning in general.

Product and engineering managers have seen it time and time again; in start-up software houses it is perhaps the worst of all. Hunger for the commercial deal leads to over-promising, without due consideration of the impact on roadmap execution plans for existing commitments and other competing customer priorities, all in pursuit of business growth.

Demand information, such as feedback or analytics data, that is not shared or not used effectively by the organization as a whole has a direct impact on scheduling and resourcing and can result in poor coordination and planning.

Human behaviours such as greed, exaggeration, or panic can influence offers and commercial decisions, especially in times of economic uncertainty or at critical junctures in the calendar, like month, quarter, or financial year end, to meet quotas or squeeze budgets.

The product backlog

From a backlog perspective, timelines often become elongated, feature backlogs grow, and actual product output may crawl; developers may end up producing software features uniquely designed for particular customers or industry segments in response to commercial obligations rather than in alignment with the mission of the business or a given product line's vision.

There can be loss of revenue too, where features and capabilities are developed in a way that is misaligned with the market. Since developers are often order takers for product management or executives, they rarely have a real opportunity to dispute the relative priority of things. Opportunities get missed, features get rushed, and completeness of capability is overlooked in favour of a "get it out fast" mindset, all in pursuit of a box-checking exercise or of meeting the narrowest possible needs and expectations quickly.

Feature proliferation without market validation and qualification is effectively overinvestment or misplaced investment. A feature ROI analysis will reveal that some features that seemed great in principle become part of a wasteland of technical debt in production. The features may not be valued and, worse, may not be used or adopted at all, and removing them may be more expensive than simply letting them linger in the product in perpetuity.

Poor quality is often also a result of the bullwhip, with developers producing software that is inconsistent, buggy, unreliable, or incompatible with customer or user expectations. This in turn lowers customer satisfaction, as customers or users are unhappy with the software features they receive, or do not receive.

Remediation strategies

The bullwhip effect can be reduced or prevented in software development by adopting some neutralising strategies.

The first of these is an improvement in communication: the deliberate use of clear and consistent language, documentation, and feedback, without guile or hubris.

This means practical, down-to-earth descriptors that relate to state, opportunity, and need. These may concern opportunities and actual customer situations or, on the flip side, a changing situation in the technology landscape or the resources required to do the work. The communication has to go both ways, servicing the demand and supply aspects of the business. Developers tell technology stories and account managers tell customer stories; product managers play back both sides.

Smoothing demand is really about settling on a product execution plan that works for everyone. This is achieved by being forward-thinking and prescriptive about the product roadmap, features, and functions: providing enough detail to assuage concerns about progress, but not so much detail that it becomes a rod for your own back in terms of execution. General direction of travel and progress markers are what count.

Focus on intentions for the product based on the core values of the business and how the product lines up against those. The challenge here is for businesses with a small or emerging foothold in the market and a young or small product portfolio that they are trying to evolve. All that said, by choosing a narrow market segment rather than anything and everything with a potential pulse, your software business is focused where the sales effort investment is likely to yield the best possible opportunities. Customers that are too big may be overbearing; those that are too small may be too expensive to service.

Pricing is often very contentious. A combination of science and the dark arts, it is difficult to get the price for products perfectly right at the beginning. For this reason, many software products start with a relatively straightforward pricing model that becomes infinitely more complex and sophisticated as the size of the customer opportunity grows and the deals mature. You want to leave just enough money on the table not to feel that you undersold your product.

This may sometimes lead to hockey-stick pricing models that seem super affordable, or even free, at low levels but then grow steeply with usage or data volumes; these strategies often lead to debates about value-based pricing models. Attempts at price banding, price stepping, and the like sometimes help. When these are combined with multiyear discounts, contingencies, and other complex calculations, customers and salespeople alike may get confused and the value-to-cost understanding compromised. This in turn can lead to horse trading or bidding wars, price gouging or ballooning. Poor thinking on pricing can destabilize an investment runway; good thinking can extend it. Be prepared to always rethink your pricing.
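As a toy illustration of what price banding looks like in practice, here is a stepped model with a free entry tier and steeper rates above it. The band boundaries and per-unit rates are invented for the sketch, not a recommendation.

```python
# A toy banded-pricing calculator: cheap (or free) entry tiers, with
# steeper per-unit rates above, giving the classic "hockey stick" shape.
def banded_price(units: int) -> float:
    bands = [                  # (units in this band, price per unit)
        (1_000, 0.00),         # free tier
        (9_000, 0.05),         # next 9,000 units
        (90_000, 0.12),        # next 90,000 units
        (float("inf"), 0.30),  # everything beyond 100,000 units
    ]
    remaining, total = units, 0.0
    for band_size, rate in bands:
        in_band = min(remaining, band_size)
        total += in_band * rate
        remaining -= in_band
        if remaining <= 0:
            break
    return total

for u in (500, 5_000, 50_000, 500_000):
    print(f"{u:>7} units -> ${banded_price(u):>10,.2f}")
```

The bill stays gentle through the early bands and then climbs sharply, which is exactly the shape that triggers the value-based pricing debates mentioned above.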

Customer churn considerations usually only kick in after the first anniversary of the sale, but when considered in relation to pricing, feature development, cost to serve, and the relative value of the opportunity to the software vendor, they can have an extraordinary and disruptive effect on software development lifecycle management, as the customer's specific backlog requirements get dusted off and given elevated priority to ensure renewal, all within the limited context of a renewal at risk.

Sharing details on opportunities that have specific needs and expectations should happen regularly, with the account teams and the product management team as the main participants. Nothing should ever be promised to prospects without a clear understanding of the criticality of the requirement to the prospect, the relative position of that item in the backlog (it may not even exist), its alignment with the product vision, the cost to service the requirement, and an optimistic timeline. I'd encourage customer teams to regularly involve product management in their customer qualification process too. Product managers crave the indulgence of not just customers but also prospects: as bystanders in product demos and discovery sessions, they hear about the customer's context and problems first-hand and are best positioned to spot opportunities for product improvement or enhancement.

Finally, it is worth considering that all of this relates to managing human behaviours. By educating and motivating all the teams to make rational, informed decisions based on facts rather than emotions, one is better positioned to deliver superior software products consistently and affordably. By applying these strategies, software development houses can likely avoid or minimize the bullwhip effect and improve their operational and process efficiency, quality, and customer satisfaction.

Photo Credit: Pexels, photo by Müşerref İkizoğlu


Author: Clinton Jones

Give Your Customers a Personal Experience by Leveraging Customer Intent


As a business owner or marketing professional, you have to understand the concept of customer intent and how it can best be used to personalize the customer experience.

By understanding what it is, businesses can create more personalized marketing offers and tailor their strategies to customers on an individual level. Today, Jones Associates shares a quick post that opens the door for you to use customer intent in your marketing endeavors.

What Is Customer Intent?

Customer intent simply means the motivation behind a customer's action: what they want to accomplish. For example, if someone searches for women's shoes, they likely want to buy women's shoes. You can use Google Trends to get an idea of popular searches, but your webmaster can give you more information specific to your site. Intent does not always translate immediately into a purchase; sometimes a customer's intent is simply to gather information, such as pricing, size, or availability.

How To Leverage Customer Intent

There are many different ways that you can get to know your customers’ underlying motivation.

To fully understand customer intent, you have to do research. You might send surveys or create a focus group. You can also do what’s known as social listening, which is simply the act of monitoring and analyzing social media conversations. You can also utilize data culled from your website to get a snapshot of your customers’ journeys. For example, if they put a pair of shoes in their shopping cart, you know that their intent is, ultimately, to purchase shoes.

If you extrapolate this to an ecommerce store, obviously the site has to be user-friendly with customizable content and an easy-to-use payment section, as well as the ability to analyze sales and inventory data to better streamline each customer’s experience. Luckily, there are options for commerce software that allow you to do just that.

Tailored Content Converts

Grazitti Interactive says it best, “Custom content is the future of marketing.” Why? Because customers don’t want to feel like they are nothing but an open wallet. Customizing your content helps set you apart from the competition and helps your customers throughout their buying journey. Nine out of 10 users rate custom content as more helpful than generic information. Think of it this way: when you reach customers with the info and insight they want, you become a more trustworthy source that shows that you truly know what they need.

Customer Retention Matters

Custom content is not only important when targeting new customers. Constant Content explains that your customized marketing materials also reinforce your image and create a stronger following of loyal customers. Everyone wants more customers, but don’t ever lose sight of the fact that simply increasing your customer retention rate by 5% can boost your revenue by up to 95%. This is not only accomplished through increased sales but also because happy customers tell their friends and family.

Ultimately, offering your customers curated content based on their actions and intent is one of the best ways to offer real value to those that support your business. Remember, custom-tailored content converts more customers, and it’s widely considered more helpful than generic pop-ups and other content. To get the most out of your efforts, make sure to listen to your customers, use data to inform your business decisions, and keep organized and accurate records of your findings.

If you’re looking for a global management consultant who can help you meet and exceed your goals, contact Jones Associates today to get started.


Author: Uli Lokshin
