To the cloud no more? That is the question.


Cloud computing has undergone a remarkable transformation over the past decade.

What was once hailed as a panacea for companies struggling with the high costs and unsustainability of on-premise IT infrastructure has now become a more nuanced and complex landscape. As businesses continue to grapple with the decision to migrate to the cloud or maintain a hybrid approach, understanding the complexity, costs, and risks is essential to navigating the evolving dynamics and the potential pitfalls that lie ahead.

The initial appeal of cloud solutions was undeniable.

By offloading the burden of hardware maintenance, software updates, and data storage to cloud providers, companies could focus on their core business activities and enjoy the benefits of scalability, flexibility, and cost optimization. The cloud promised to revolutionize the way organizations managed their IT resources, allowing them to adapt quickly to changing market demands and technological advancements.

However, not all businesses have fully embraced the cloud, especially when it comes to their mission-critical systems. Companies that handle sensitive or proprietary data have often been more cautious in their approach, opting to maintain a significant portion of their operations on-premise. These organizations may have felt a sense of vindication as they watched some of their cloud-first counterparts grapple with the complexities and potential risks associated with entrusting such critical systems to third-party providers.

The recent news from Basecamp, for example, was driven by spiraling costs, irrespective of the cloud provider (the company tried both AWS and GCP). Basecamp decided to leave the cloud computing model and move back to on-premise infrastructure to contain costs, reduce complexity, avoid hidden charges, and retain margin. This way, the company felt it had more control over delivery and sustainment outcomes.

The Ongoing Costs of Cloud-First Strategies

Cloud bills, for example, can comprise hundreds of millions or even billions of rows of data, making them difficult to analyze in traditional tools like Excel. At the same time, cloud computing reduces upfront startup costs, including setup and maintenance costs, with 94% of IT professionals reporting this benefit. Accenture, for example, found that cloud migration leads to 30-40% savings in Total Cost of Ownership (TCO).
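To make the scale problem concrete, here is a minimal, hypothetical sketch of the roll-up a billing analysis needs; the field names and figures are illustrative, not any provider's actual cost-and-usage schema:

```python
from collections import defaultdict

# Hypothetical billing line items; real cloud cost-and-usage reports
# can run to hundreds of millions of such rows.
line_items = [
    {"service": "compute", "cost": 0.42},
    {"service": "storage", "cost": 0.07},
    {"service": "compute", "cost": 1.10},
]

def cost_by_service(items):
    """Roll up cost per service, the kind of aggregation spreadsheets choke on at scale."""
    totals = defaultdict(float)
    for item in items:
        totals[item["service"]] += item["cost"]
    return dict(totals)

print(cost_by_service(line_items))
```

At billions of rows the same roll-up is typically pushed down into a warehouse or query engine, but the shape of the computation is the same.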

As many as 60% of C-suite executives also cite security as the top benefit of cloud computing, ahead of cost savings, scalability, ease of maintenance, and speed.

The private cloud services market, for example, is projected to experience significant growth in the coming years. According to Technavio, the global private cloud services market size is expected to grow by $276.36 billion from 2022 to 2027, at a CAGR of 26.71%.
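For context, Technavio's figures (incremental growth plus CAGR over five years) imply a starting market size that can be back-solved; this is a rough arithmetic sketch, not Technavio's methodology:

```python
def implied_base(increment, cagr, years):
    """Back-solve the starting market size from incremental growth and CAGR:
    base * ((1 + cagr) ** years - 1) == increment."""
    return increment / ((1 + cagr) ** years - 1)

# Billions of USD, using the reported 2022-2027 figures.
base_2022 = implied_base(276.36, 0.2671, 5)
print(round(base_2022, 1))
```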

The cloud, of course, supports automation, reducing the risk of human errors that cause security breaches, and accordingly, cloud cost-management platforms help capture the cost of tagged, untagged, and untaggable cloud resources, as well as allocate 100% of shared costs. For those organizations that have wholeheartedly adopted a cloud-first strategy, however, the operational budgets for cloud technologies have often continued to climb year over year.
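Allocating 100% of shared costs typically means spreading a shared pool in proportion to each team's direct spend; a hedged sketch of that idea (team names and amounts are made up):

```python
def allocate_shared(direct_costs, shared_total):
    """Spread a shared cost pool across teams in proportion to their direct spend."""
    total_direct = sum(direct_costs.values())
    return {
        team: cost + shared_total * cost / total_direct
        for team, cost in direct_costs.items()
    }

# Hypothetical monthly figures: $1,000 of direct spend, $100 of shared services.
allocated = allocate_shared({"web": 600.0, "data": 400.0}, 100.0)
print(allocated)
```

Commercial platforms layer tagging rules and untaggable-resource handling on top, but proportional spread is the usual default.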

Instead of fully capitalizing on the advances in cloud technology, these companies may find themselves having to maintain or even grow their cost base to take advantage of the latest offerings. The promise of cost savings and operational efficiency that initially drew them to the cloud may not have materialized as expected.

As this cloud landscape continues to evolve, a critical question arises: is there a breaking point where cloud solutions may become unviable for all but the smallest or most virtualized cloud-interwoven businesses?

This concern is particularly relevant in the context of customer data management, where the increasing number of bad actors and risk vectors, coupled with the growing web of regulations and restrictions at local, regional, and international levels, can contribute to a sense of unease about entrusting sensitive customer data to cloud environments.

The Evolving Regulatory Landscape & Cyber Threats

The proliferation of data privacy regulations, such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States, has added a new layer of complexity to the cloud adoption equation.

These regulations, along with a growing number of industry-specific compliance requirements, have placed significant demands on organizations to ensure the security and privacy of the data they handle, regardless of where it is stored or processed. For businesses operating in multiple jurisdictions, navigating the web of regulations can be a daunting task, as the requirements and restrictions can vary widely across different regions.

Failure to comply with these regulations can result in hefty fines, reputational damage, and even legal consequences, making the decision to entrust sensitive data to cloud providers a high-stakes proposition.

Alongside the evolving regulatory landscape, the threat of cyber attacks has also intensified, with bad actors constantly seeking new vulnerabilities to exploit.

Cloud environments, while offering robust security measures, are not immune to these threats, and the potential for data breaches or system compromises can have devastating consequences for businesses and their customers.

The growing sophistication of cyber attacks, coupled with the increasing value of customer data, has heightened the need for robust security measures and comprehensive risk management strategies. Companies must carefully evaluate the security protocols and safeguards offered by cloud providers, as well as their own internal security practices, to ensure the protection of their most valuable assets.

Balancing Innovation and Risk Management

In light of these challenges, many businesses are exploring hybrid approaches that combine on-premise and cloud-based solutions.

This strategy allows organizations to maintain control over their mission-critical systems and sensitive data, while still leveraging the benefits of cloud computing for less sensitive or more scalable workloads.

Some companies are also taking a more selective approach to cloud adoption, carefully evaluating which workloads and data sets are suitable for cloud migration.

By adopting a risk-based approach, they can balance the potential benefits of cloud solutions with the need to maintain a high level of control and security over their most critical assets.

As the cloud landscape continues to evolve, it is essential for businesses to carefully evaluate their cloud strategies and adapt them to the changing circumstances.

This may involve regularly reviewing their cloud usage, cost optimization strategies, and the evolving regulatory and security landscape to ensure that their cloud solutions remain aligned with their business objectives and risk tolerance. Regular monitoring and assessment of cloud performance, cost-effectiveness, and security posture can help organizations identify areas for improvement and make informed decisions about their cloud investments.

Collaboration with cloud providers and industry experts can also provide valuable insights and best practices to navigate the complexities of the cloud ecosystem.

As the cloud landscape continues to evolve, it is clear that the path forward will not be a one-size-fits-all solution.

Businesses must be careful in weighing the potential benefits of cloud adoption against the risks and challenges that come with entrusting their critical data and systems to third-party providers. The future of cloud solutions will likely involve a more nuanced and balanced approach, where organizations leverage the power of cloud computing selectively and strategically, while maintaining a strong focus on data security, regulatory compliance, and risk management.

Collaboration between businesses, cloud providers, and regulatory bodies will likely be crucial in shaping the next chapter of the cloud revolution, ensuring that the benefits of cloud technology are realized in a secure and sustainable manner.


Author: Uli Lokshin

Data Governance playbooks for 2024


Back in 2020, I offered up some thoughts for consideration around generic or homogenous data governance playbooks. Revisit it if you care to.

This was in part fueled by frustrations with the various maturity models and potential frameworks available but also by the push, particularly from some software vendors, to suggest that a data governance program could be relatively easily captured and implemented generically using boiler-plated scenarios by any organization without necessarily going through the painful process of analysis, assessment and design.

Of course, there is the adage, “Anything worth doing, is worth doing well“, and that remains a truism as applicable to a data governance program as anything else in the data management space.

You can’t scrimp on the planning and evaluation phase if you want to get your data governance program to be widely adopted and effective irrespective of how many bucks you drop and irrespective of the mandates and prescripts yelled from the boardroom.

Like any change program, a DG initiative needs advocacy and design appropriate to the context and no vendor is going to do that perfectly well for you without you making a significant investment of time, people and effort to get the program running. If you’re evaluating a software vendor to do this for you, in particular, you need to be sure to check out their implementation chops and assess their domain knowledge, particularly relevant to your industry sector, market and organizational culture. This is a consulting focus area that “The Big Four” have started to look more closely at and are competing with boutique consultancies on. So if you have a passion for consulting and you feel all the big ERP and CRM projects have been done and you want to break into this space, then here is an area to consider.

What is it exactly?

The term “playbook” in a business context is borrowed from American football. In sports, a playbook is often a collection of a team’s plays and strategies, all compiled and organized into one book or binder. Players are expected to learn “the plays” and ahead of the game the coach and team work out the play that they are likely to run at the opposing team or the approach that they will use if the opposing team is observed to run a particular play of their own. Some plays may be offensive, some defensive and then there may be other plays for specialised tactical runs at a given goal or target.

A “business playbook” contains all your company’s processes, policies, and standard operating procedures (SOPs). Also termed a “company playbook”, it is effectively a manual outlining how your business does what it does, down to each business operations role, responsibility, business strategy, and differentiator. This should be differentiated from a RunBook where the latter is your “go-to” if a team needs step-by-step instructions for certain tasks. Playbooks have a broader focus and are great for teams that need to document more complex processes. It is a subtlety that is appreciated more when you are in the weeds of the work than when you are talking or thinking conceptually about new ways of optimizing organizational effectiveness and efficiency.

A data governance playbook is then effectively a library of documented processes and procedures that describe each activity in terms of the inputs and capture or adoption criteria, the processes to be completed, who would be accountable for which tasks, and the interactions required. It also often outlines the deliverables, quality expectations, data controls, and the like.
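Such a playbook entry can be captured as structured data so it stays consistent across activities; a minimal illustrative sketch (the fields and values are hypothetical, not a prescribed schema):

```python
from dataclasses import dataclass, field

@dataclass
class PlaybookEntry:
    """One documented activity in a data governance playbook (illustrative fields)."""
    activity: str
    inputs: list            # inputs and capture/adoption criteria
    process_steps: list     # the processes to be completed, in order
    accountable: str        # who is accountable for the tasks
    deliverables: list      # expected outputs and quality expectations
    controls: list = field(default_factory=list)  # data controls, reviews

entry = PlaybookEntry(
    activity="Business glossary term approval",
    inputs=["draft term definition", "steward review notes"],
    process_steps=["steward review", "council vote", "publish to catalog"],
    accountable="Data Governance Council",
    deliverables=["approved glossary term"],
    controls=["quarterly definition review"],
)
print(entry.activity)
```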

Under the US President’s Management Agenda, the Federal Data Strategy offers up a Data Governance Playbook that is worth taking a look at as an example. Similarly, the Health IT Playbook is a tool for administrators, physician practice owners, clinicians and practitioners, practice staff, and anyone else who wants to leverage health IT. The focus is on the protection and security of patient information and ensuring patient safety.

So, in 2024, if you’re just picking up the concept of a playbook, and a data governance playbook in particular, it is likely that you’ll look at what the software vendors have in mind; you’ll evaluate a couple of implementation proposals from consultants and you’ll consider repurposing something from adjacent industry, a past project or a comparable organization.

Taking a “roll-your-own” approach

There’s plenty of reading content out there, from books written by industry practitioners and analysts to material from technology vendors, as mentioned. Some are as dry as ditchwater and very few get beyond a first edition, although some authors have been moderately successful at pushing out subsequent volumes with different titles. A lot of the content, though, will demonstrate itself to be thought exercises with examples, things/factors to consider, experiences, and industry- or context-specific understandings or challenges. Some will focus on particular functionality or expectations around the complementary implementation or adoption of particular technologies.

With the latest LLM and AI/ML innovations, you’ll also discover a great deal of content. Many of these publications, articles and posts found across the internet have already been parsed and assimilated into the LLM engines so, a good starting point is for you to ask your favourite chatbot what it thinks.

Using a large language model (LLM) like ChatGPT to facilitate the building of data playbooks might be feasible to a certain extent but there will be challenges.

On the plus side, an LLM could generate content and provide templates for various sections of a data playbook, such as data classification, access controls, data lifecycle management, and compliance. It can also assist in drafting policy statements, guidelines, and procedures.

It could help in explaining complex data governance concepts, definitions, and best practices in more accessible language for use in, say, a business glossary or thesaurus. This could be beneficial for individuals who might not have a deep understanding of data governance – think about your data literacy campaigning in this context.

Users can also directly interact with an LLM in a question-answer format to seek clarity on specific aspects of data governance and help build an understanding of key data governance concepts and data management requirements.
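If you do experiment with this, grounding the prompt in your own industry and regulatory context helps keep the drafts organization-specific; below is a hedged sketch of a prompt builder, with no real LLM call shown and all names purely illustrative:

```python
def build_playbook_prompt(section, industry, regulations):
    """Assemble a grounded prompt so generated playbook drafts stay
    tied to the organization's context rather than generic patterns."""
    return (
        f"Draft the '{section}' section of a data governance playbook "
        f"for a {industry} organization. "
        f"Reference these regulations explicitly: {', '.join(regulations)}. "
        "Flag any statement you are unsure about for expert review."
    )

prompt = build_playbook_prompt(
    "data classification", "healthcare", ["HIPAA", "GDPR"]
)
print(prompt)
```

The resulting string would be passed to whichever chat interface or API you use; the point is that context and a request for self-flagged uncertainty go in up front.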

Just as for generic playbooks, there are going to be problems with this approach. LLMs operate based on patterns learned from a diverse range of data, but they often lack domain specificity. A data management platform or data catalog itself might have an LLM attached to it, but has it been trained with data governance practice content?

Data governance often requires an understanding of industry-specific regulations, data types, and organizational contexts that might not be captured adequately by a generic model.

We’ve also heard about AI hallucinations, and some of us may have even experienced a chatbot hallucination. Without the particular character of data governance practice and domain knowledge, there’s a risk that the AI might generate content that is wholly or partially inaccurate, incomplete, or not aligned with the actual organizational need. This then, would have you second-guessing the results and having to dig into the details to ensure that the suggested content is appropriate. You’ll need to have a domain expert on hand to validate the machine-generated output.

Data governance practices and regulations are also ever-evolving. What the LLM might not be aware of is new regulations, new compliance expectations, or new industry standards. So leaning purely on machine-generated content may fail to reveal emerging best practices unless the model gets trained with updates.

Each organization has its unique culture, structure, and processes. Data governance is intertwined with these various organizational processes, and understanding the interconnections is vital; that’s best achieved with careful analysis, process design, and domain knowledge. The tool you use to help elaborate your playbook might simply provide information in isolation, without any grasp of the broader organizational context. Without appropriate training and prompting on the specific nuances of the organization, it will be almost impossible to tailor the generated content to align with organizational goals and practices.

I guess my whole point is that you will not escape the human factor. If you insist on going it alone and relying on machine-generated content in particular then that same content should undergo thorough validation by domain experts and organizational stakeholders to ensure that the results are accurate and aligned with organizational and industry requirements.

The use of modern-day tooling to assist human experts in drafting and refining data playbooks is a valuable acceleration approach that has merit, but just as for generic playbooks and templates, you need to leverage the strengths of both canned, automated generation and human expertise to arrive at a good result.

I’d love to hear what if anything you’ve done with chatbots, AI, ML and LLM to generate content. If you are implementing any data management or data governance initiatives, I would love to know how successful you have been and any tips or tricks you acquired along the way.


Author: Clinton Jones

A Strategic Approach to Data Management


There is a delicate balance between the needs of data scientists and the requirements of data security and privacy.

Data scientists often need large volumes of data to build robust models and derive valuable insights. However, the accumulation of data increases the risk of data breaches, which is a concern for security teams.

This hunger for data and the need for suitable control over sensitive data creates a tension between the data scientists seeking more data and the security teams implementing measures to protect data from inappropriate use and abuse.

A strategic approach to data management is needed, one that satisfies the need for data-driven insights while also mitigating security risks.

There needs to be an emphasis on understanding the depth of the data, rather than just hoarding it indiscriminately.

In a Towards Data Science article, author Stephanie Kirmer reflects on her experience as a senior machine learning engineer and discusses the challenges organizations face as they transition from data scarcity to data abundance.

Kirmer highlights the importance of making decisions about data retention and striking a balance between accumulating enough data for effective machine learning and avoiding the pitfalls of data hoarding.

Kirmer also touches on the impact of data security regulations, which add a layer of complexity to the issue. Despite the challenges, Kirmer advocates for a nuanced approach that balances the interests of consumers, security professionals, and data scientists.

Kirmer also stresses the importance of establishing principles for data retention and usage to guide organizations through the decisions surrounding data storage.

Paul Gillin, technology journalist at Computerworld, raised this topic back in 2021. In his piece Data hoarding: The consequences go far beyond compliance risk, Gillin discusses the implications of data hoarding, which extend far beyond compliance risks alone. He highlights how the decline in storage costs has led to a tendency to retain information rather than discard it.

Pijus Jauniškis, a writer in internet security at Surfshark, describes how the practice can lead to significant risks, especially with regulations like the General Data Protection Regulation in Europe and similar legislation in other parts of the world.

In a landscape where data is both a valuable asset and a potential liability, however, a balanced and strategic approach to data management is crucial to ensure that the needs of both groups are met.

The data community has a significant responsibility in recognizing both.

Data management responsibilities extend beyond the individual who created or collected the data. Various parties are involved in the research process and play a role in ensuring quality data stewardship.

To generate valuable data insights, people need to become fluent in data. Data communities can help individuals immerse themselves in the language of data, encouraging data literacy.

Organizationally, a governing body is often responsible for the strategic guidance of a data governance program, prioritization of data governance projects and initiatives, and approval of organization-wide data policies and standards; if there isn’t one, one should be established.

Accountability includes the responsible handling of classified and controlled information, upholding data use agreements made with data providers, minimizing data collection, and informing individuals and organizations of the potential uses of their data.

In the world of data management, there is a collective duty to prioritize and respond to the ethical, legal, social, and privacy-related challenges that come from using data in new and different ways in advocacy and social change.

A balanced and strategic approach to data management is crucial to ensure that the needs of all stakeholders are met. We collectively need to find the right balance between leveraging data for insights and innovation, while also respecting privacy, security, and ethical considerations.


Author: Uli Lokshin

Unlocking Value through Data and Analytics


Organizations are constantly seeking ways to unlock the full potential of their data, analytics, and artificial intelligence (AI) portfolios.

Gartner, Inc., a global research and advisory firm, identified the top 10 trends shaping the Data and Analytics landscape in 2023 earlier this year.

These trends not only provide a roadmap for organizations to create new sources of value but also emphasize the imperative for D&A leaders to articulate and optimize the value they deliver in business terms.

Bridging the Communication Gap

The first and foremost trend highlighted by Gartner is “Value Optimization.”

Many D&A leaders struggle to articulate the tangible value their initiatives bring to the organization in terms that resonate with business objectives.

Gareth Herschel, VP Analyst at Gartner, emphasizes the importance of building “value stories” that establish clear links between D&A initiatives and an organization’s mission-critical priorities.

Achieving value optimization requires a multifaceted approach, integrating competencies such as value storytelling, value stream analysis, investment prioritization, and the measurement of business outcomes.

Managing AI Risk: Beyond Compliance

As organizations increasingly embrace AI, they face new risks, including ethical concerns, data poisoning, and fraud detection circumvention.

“Managing AI Risk” is the second trend outlined by Gartner, highlighting the need for effective governance and responsible AI practices.

This goes beyond regulatory compliance, focusing on building trust among stakeholders and fostering the adoption of AI across the organization.

Observability: Unveiling System Behaviour

Another trend, “Observability,” emphasizes the importance of understanding and answering questions about the behaviour of D&A systems.

This characteristic allows organizations to reduce the time it takes to identify performance-impacting issues and make timely, informed decisions.

Data and analytics leaders are encouraged to evaluate observability tools that align with the needs of primary users and fit into the overall enterprise ecosystem.

Creating a Data-Driven Ecosystem

Gartner’s fourth trend, “Data Sharing Is Essential,” underscores the significance of sharing data both internally and externally.

Organizations are encouraged to treat data as a product, preparing D&A assets as deliverables for internal and external use.

Collaborations in data sharing enhance value by incorporating reusable data assets, and the adoption of a data fabric design is recommended for creating a unified architecture for data sharing across diverse sources.

Nurturing Responsible Practices

“D&A Sustainability” extends the responsibility of D&A leaders beyond providing insights for environmental, social, and governance (ESG) projects.

It urges leaders to optimize their own processes for sustainability, addressing concerns about the energy footprint of D&A and AI practices. This involves practices such as using renewable energy, energy-efficient hardware, and adopting small data and machine learning techniques.

Enhancing Data Management

“Practical Data Fabric” introduces a data management design pattern leveraging metadata to observe, analyse, and recommend data management solutions.

By enriching the semantics of underlying data and applying continuous analytics over metadata, data fabric generates actionable insights for both human and automated decision-making. It empowers business users to confidently consume data and enables less-skilled developers in the integration and modelling process.

Emergent AI

“Emergent AI” heralds the transformative potential of AI technologies like ChatGPT and generative AI. Not everyone agrees: per a paper presented in May at the Stanford Data Science 2023 Conference, which challenged claims of emergent abilities in large language models (LLMs) and was cited by Forbes contributor Andréa Morris, some researchers argue that AI “emergent abilities” are a mirage.

However that debate settles, this emerging trend is expected to redefine how companies operate, offering scalability, versatility, and adaptability. As AI becomes more pervasive, it is poised to enable organizations to apply AI in novel situations, expanding its value across diverse business domains.

Gartner also highlights another trend, “Converged and Composable Ecosystems.” An old topic from the start of the 2020s, it is focused on designing and deploying data and analytics platforms that operate cohesively through seamless integrations, governance, and technical interoperability.

The trend advocates for modular, adaptable architectures that can dynamically scale to meet evolving business needs.

The ninth trend, “Consumers as Creators,” is nothing particularly new; it envisions a shift from predefined dashboards to conversational, dynamic, and embedded user experiences.

Werner Geyser described “20 Creator Economy Statistics That Will Blow You Away in 2023” in his Influencer Marketing Hub piece.

A large percentage of consumers identify as creators: over 200 million people globally consider themselves “creators.”

Content creators can earn over $50k a year, and the global influencer market has grown to an estimated $21 billion in 2023.

Organizations are encouraged to empower content consumers by providing easy-to-use automated and embedded insights, fostering a culture where users can become content creators.

Humans remain the key decision makers and not every decision can or should be automated. Decision support and the human role in automated and augmented decision-making remain as critical considerations.

Organizations need to combine data and analytics with human decision-making in their data literacy programs. While indicators from marketing analysts like Gartner may serve as a compass, guiding leaders toward creating value, managing risks, and embracing innovation, the imperative to deliver provable value at scale underscores the strategic role of data and analytics leaders in shaping the future for their organizations.

As the data and analytics landscape continues to evolve, organizations that leverage the trends strategically will be well-positioned to turn extreme uncertainty into new business opportunities.


Author: Jewel Tan

Balancing Cloud Transformation in Turbulent Times


The spectre of an impending economic downturn looms large, prompting business leaders to re-evaluate their strategic decisions, particularly regarding cloud transformation.

Simon Jelley, General Manager for SaaS Protection, Endpoint and Backup Exec at Veritas Technologies, notes that despite the economic uncertainty, cloud migration remains a prevalent trend, with 60% of enterprise data already residing in the cloud.

However, the challenge lies in maintaining the cost benefits associated with the cloud, as evidenced by the fact that 94% of enterprises fail to stay within their cloud budgets.

To address this, businesses are encouraged to adopt a hybrid multicloud environment, necessitating careful data management strategies. Here are key steps organizations should take:

  • Establish Data Visibility: Gain a comprehensive understanding of where your data resides, whether on-premises or in the public cloud.
  • Enable Workload Migration/Portability: Facilitate seamless movement of workloads between on-premises infrastructure and various cloud service providers.
  • Leverage Software-Defined Storage: Embrace agile and scalable storage solutions to accommodate the dynamic nature of multicloud environments.
  • Prioritize Data Regulatory and Compliance Issues: Ensure compliance with data regulations across different cloud environments.
  • Eliminate Data Protection Silos: Streamline data protection processes to avoid fragmentation and enhance overall security.
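The data visibility step above might begin with nothing more than an inventory grouped by environment; a toy sketch with made-up dataset records:

```python
# Hypothetical inventory records; real visibility tooling scans actual estates.
datasets = [
    {"name": "orders", "location": "on-prem"},
    {"name": "clickstream", "location": "aws-s3"},
    {"name": "backups", "location": "azure-blob"},
]

def by_location(inventory):
    """Group dataset names by where they live, the starting point for visibility."""
    out = {}
    for d in inventory:
        out.setdefault(d["location"], []).append(d["name"])
    return out

print(by_location(datasets))
```

An inventory like this also feeds the later steps: knowing what lives where is a precondition for workload portability and for spotting data protection silos.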

By implementing these measures, organizations can fortify their data management capabilities, ensure resilience, and meet compliance objectives amid economic uncertainties.

Cybercrime: A Persistent Threat, Demands Proactive Measures

As cybercrime continues to evolve, organizations must adapt their data management strategies to withstand increasingly sophisticated attacks. Ransomware, in particular, remains a potent weapon for cybercriminals seeking to exploit the value of organizational data.

While addressing cyber resilience is crucial, Jelley also advocates for a proactive approach to reduce the risk of attacks. The focus is on increasing data visibility, and the suggested steps include:

  • Create a Data Taxonomy or Classification System: Classify data based on sensitivity and importance to establish a clear understanding of information assets.
  • Establish a Single Source of Truth (SSOT) Location: Designate centralized locations for each category of data to streamline management and control.
  • Define and Implement Policies: Develop and enforce policies tailored to the specific requirements of identified data types.
  • Continually Update and Maintain Data Taxonomy, SSOT, and Policies: Keep data management strategies agile and responsive to evolving cyber threats.
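The taxonomy and SSOT steps above might be prototyped as a simple lookup before any tooling is bought; a naive, illustrative sketch (tiers, field names, and policies are all hypothetical):

```python
# Hypothetical tiers and policies; real taxonomies are organization-specific
# and far richer than a name-based heuristic.
TAXONOMY = {
    "restricted": {"ssot": "vault-store", "policy": "encrypt-at-rest, access-logged"},
    "internal":   {"ssot": "warehouse",   "policy": "role-based access"},
}
SENSITIVE_FIELDS = {"ssn", "salary", "diagnosis"}

def classify(field_name):
    """Map a field to a sensitivity tier plus its SSOT location and policy."""
    tier = "restricted" if field_name.lower() in SENSITIVE_FIELDS else "internal"
    return tier, TAXONOMY[tier]

tier, rules = classify("SSN")
print(tier, rules["ssot"])
```

Even a crude mapping like this makes the "define and implement policies" step concrete: every classified field arrives with a designated home and a policy attached.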

By adhering to these proactive measures, organizations can limit exposure and enhance their ability to recover in the event of a cyber attack, ultimately safeguarding their critical data.

Digitization 3.0: Unleashing the Power of Usable Data

Digitization has undergone significant phases, with the current era—Digitization 3.0—focusing on extracting maximum value from data while ensuring security, resiliency, and privacy. Jelley emphasizes the importance of contextualizing data to enhance its usability, paving the way for user experience-driven workflows. Building upon the foundation of the preceding trends, organizations can achieve this by:

  • Consolidate Data Control: Utilize platforms capable of managing data across diverse environments, including on-premises, virtual, and multicloud.
  • Map Uses and Users: Conduct a thorough analysis of existing tools and users to seamlessly transition to a consolidated platform.
  • Implement Adequate Training: Ensure that teams are well-versed in utilizing the new consolidated platform to maximize its functionalities.

Digitization 3.0 represents a paradigm shift in data utilization, emphasizing the need for organizations to not only manage and protect their data but also harness its full potential to drive innovation and customer-centric experiences.

As businesses navigate the intricate landscape of data management in 2023, Simon Jelley’s insights shed light on the pivotal trends shaping the industry.

Economic uncertainty, cybercrime, and Digitization 3.0 collectively underscore the importance of proactive, adaptive data management strategies. By embracing data visibility, fortifying cybersecurity measures, and leveraging the power of contextualized data, organizations can not only weather the challenges of the present but also position themselves for success in the data-driven future.

Jelley reiterates the fundamental importance of caring about data—its management, protection, and the ability to address prevailing trends. In a world where information is a critical asset, businesses that prioritize effective data management will not only survive but thrive in the face of evolving challenges.

As we close out 2023, staying abreast of these trends and implementing strategic data management practices will be integral to achieving long-term success in a data-centric business landscape.


Author: Flaminio

Incentivizing Consumers to Self-Serve Zero-Party Data and Consent


Privacy remains a big deal and there are several reasons why consumers may be hesitant to allow organizations to master their personal data.

Organizations keep records on consumers for various reasons, among them personalization, service, marketing, compliance, and fraud prevention.

They may use your data to personalize your experience with their products or services, for example by using your browsing and purchase history to recommend products that you are more likely to be interested in.

Keeping records of your interactions with customer service teams enables them to provide better support in the future and ensure that your needs are met quickly and efficiently.

Marketing campaigns may be annoying, but when they are personalized there may be a change in perception. By analysing behaviour and preferences, marketers can create more relevant and targeted advertising that is more likely to result in a conversion.

Especially in financial services, organizations need to keep records on consumers to comply with legal and regulatory requirements. For example, they may need to keep records of your transactions for tax or accounting purposes but also to minimize the likelihood of money laundering or illegal use of financial instruments and infrastructure.

In exchange for goods, services or funding, they may use consumer data to prevent fraudulent activity; by monitoring behaviour, usage profiles and transactions, they can identify suspicious activity and take action to prevent fraud.

On the flipside, consumers may feel that their personal data is sensitive and should be kept private.

They may worry that if an organization masters their personal data, it could be used for nefarious purposes or sold to third-party companies without their consent.

Consumers may also be concerned that if an organization masters their personal data, it could be at risk of being hacked or stolen by cybercriminals, resulting in potential identity theft, personal financial loss, and other undesirable consequences.

Another concern is that if an organization masters their personal data, consumers lose control over it; they worry that the data will be used in ways they do not approve of, or that they will not be able to access or delete their data as they see fit.

In particular, consumers worry that their personal data could be used to discriminate against them based on their race, gender, religion, or other personal characteristics. Using personal data to decide who to hire, who to offer loans to, or who to market products to is, for consumers at least, an undesirable use of that data.

Consumers have long felt that if an organization masters their personal data, it could also lead to unwanted intrusion into their personal lives: being constantly targeted with ads or other forms of marketing, with their behaviour monitored and analysed in ways that feel intrusive, uncomfortable and an invasion of privacy.

Zero-party data

An opt-in approach with first-party data can help to address some of the concerns that consumers may have about their personal data being mastered.

First-party data refers to the information that consumers willingly provide through interactions with a website, a product, or a service. An opt-in approach means that organizations only collect and use the consumer’s data with the explicit consent of the consumer. This can give consumers greater control over their data, and can help to build trust between consumers and organizations.

Those privacy concerns can be addressed through opt-in, meaning consumers must explicitly agree to allow the collection and use of the data in specific ways. This can give consumers greater control over their personal information and can help to ensure that their data is being used only for legitimate purposes.

By limiting the data that is collected to only what is necessary for specific purposes, the opt-in approach with first-party data helps to reduce the exposure risk associated with prospective data breaches. Organizations that collect first-party data are often also more invested in protecting that data, as it is valuable for building and maintaining the customer relationship.

An opt-in approach also gives consumers more control over their personal information, allowing them to choose which data to continue to share and supporting opt-out of specific data and its collection at any time.
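As a sketch, an opt-in consent store might track, per purpose, whether the consumer has granted or withdrawn consent. The class, purpose names and consumer IDs below are illustrative assumptions, not any particular CMDM's API:

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Minimal per-consumer consent store: consent is never assumed by default."""

    def __init__(self):
        self._grants = {}  # (consumer_id, purpose) -> timestamp of the opt-in

    def opt_in(self, consumer_id, purpose):
        self._grants[(consumer_id, purpose)] = datetime.now(timezone.utc)

    def opt_out(self, consumer_id, purpose):
        # Withdrawal removes the grant entirely; absence means "no consent".
        self._grants.pop((consumer_id, purpose), None)

    def may_use(self, consumer_id, purpose):
        return (consumer_id, purpose) in self._grants

ledger = ConsentLedger()
ledger.opt_in("c-1001", "personalization")
print(ledger.may_use("c-1001", "personalization"))  # True
ledger.opt_out("c-1001", "personalization")
print(ledger.may_use("c-1001", "personalization"))  # False
print(ledger.may_use("c-1001", "marketing"))        # False: never granted
```

The design choice matters: because absence of a record means "no consent", every use of the data has to pass an explicit `may_use` check, which is what an opt-in model requires.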

To reduce the risk of discrimination, organizations are required to obtain explicit consent before collecting data on personal characteristics. Although such data is typically used for personalization and targeted advertising, the consumer can decide how it should be used, especially in relation to important decisions that affect them.

An opt-in approach with first-party data also helps to reduce the feeling of intrusiveness. Because consumers have control over what data is collected and how it is used, personalization and customization can enhance the user experience rather than detract from it.

If an organization is considering implementing a customer master data management (CMDM) solution, it’s important to understand how this approach can address consumers’ concerns about their personal data.

A CMDM provides greater transparency into the data that an organization collects and how it is used; this in turn builds trust with consumers, as they can see exactly what information is being collected and why.

By centralizing customer data in a CMDM and implementing robust security measures, a customer master data management solution reduces the attack surface exposed in the event of a data breach. This can also provide reassurance to consumers who are concerned about the security of their personal information.

A CMDM also enables organizations to provide more personalized experiences to customers, which in turn helps to build stronger relationships, increases loyalty, and ultimately drives revenue growth.

Because an opt-in approach gives customers more control over their data, the CMDM can demonstrate that the organization respects the privacy of its customers. This is often an important differentiator in a competitive marketplace, where consumers are increasingly concerned about their data privacy.

A CMDM also helps with compliance. Organizations need to comply with data privacy regulations, such as GDPR and CCPA. CMDMs like that offered by Pretectum can help to avoid the legal and reputational risks associated with non-compliance by providing reassurance to customers and regulators that consumer data is being handled in a responsible and compliant manner.

Overall, a customer master data management solution can help to build trust with customers, enhance data security, deliver better customer experiences, and demonstrate respect for privacy and compliance with regulations.

Communicating with customers about how their personal data is being collected, used, and protected is increasingly important in good customer relationship management.

Consumers expect organizations to be transparent about the data they collect and how it is being used. They expect clear communication on the purpose of the data collection and what benefits customers can expect from it. They also expect organizations to provide easy-to-understand information about their data rights and options for managing that data.

Organizations should reassure customers that their personal data is being stored and protected securely, explaining the measures they have put in place to safeguard against data breaches, such as encryption, firewalls, and access controls.

Using an opt-in approach to data collection means that customers have control over the data that is collected and can choose to opt out at any time. The benefits of opting in are, of course, more personalized experiences or access to exclusive offers.

Emphasizing respect for customers’ privacy and a commitment to protecting personal data go hand in hand, and organizations should also explain their compliance with relevant data privacy regulations. The responsible organization also highlights any certifications or standards it has achieved in relation to governance and regulatory compliance.

The benefits that customers expect from data collection might seem obvious, such as an enhanced overall experience, but providing examples of how the data is being used to personalize products and services, improve customer service, and offer tailored promotions and discounts is important communication.

Overall, effective communication with customers about the implementation of a customer master data management solution is critical to building trust and addressing concerns.

Through transparency on intent and behaviours, emphasis on data security and privacy, an opt-in approach, highlighting customer benefits, and compliance with relevant regulations, organizations can reassure their consumers that their personal data is being handled responsibly and ethically.

In response, consumers should engage in self-service zero-party data and consent inquiries because it allows them to have greater control over their personal data and the experiences they have with an organization.

By providing preferences and consent, consumers can receive more relevant and personalized experiences, products, and services.

Ecommerce sites could show recommendations based on customers’ stated interests and preferences, and health apps could provide workout plans tailored to a user’s fitness level and selected goals.

Reduced clutter in inboxes can make interactions with an organization more efficient and enjoyable. When this is accompanied by the ability to decide what information is shared with an organization and how it is used, consumers may feel more in control of their personal data and more confident that it is being handled responsibly.

Keeping the interest alive

If data is collected but not used, it should be securely stored and deleted after a reasonable period of time to ensure compliance with relevant data privacy regulations. Businesses can incentivize consumers to provide their data in the context of self-service zero-party data and consent inquiry by offering exclusives, discounts, rewards and previews.

Offering exclusive content, such as whitepapers, eBooks, or reports only accessible to those who provide their data can be a powerful incentive, especially for customers who are interested in a particular topic.

Offering personalized discounts or coupons to customers who provide their data, especially in retail, could encompass discounts on their next purchases based on stated interests or style preferences.

A free cup of coffee, for example, is an obvious incentive at a coffee shop, but consider how Waitrose did the same for loyalty card holders and how other retailers do the same for their loyalty scheme members. The offer of a free drink after a certain number of visits, with additional rewards for sharing preferences and feedback, is an obvious option, but the others are a little more subtle.

Giving customers early access to new products, services, or features if they provide their data, as AMEX does in association with events or event tickets, is a great way to build excitement and loyalty among customers. Capital One and other financial institutions incentivize in similar ways.

Game or challenge events that encourage customers to provide their data are another option: Pokémon Go, a 2016 augmented reality mobile game, offers participants rewards for completing certain challenges. Additional rewards for sharing preferences and data are common in many card loyalty schemes as well as social apps.

In the end, it’s important to ensure that any incentives offered are aligned with the interests and preferences of customers, and that they are relevant and valuable.

Organizations today should ensure that they are transparent about their data collection practices and are respecting the privacy of their customers at all times.

Give customers the opportunity to self-serve and drive first-party data into the DNA of your business.


Author: Uli Lokshin