7 Data Security Best Practices for Your Enterprise


When it comes to cybersecurity, some enterprises still take it with a grain of salt. But make no mistake about it: No one is safe. Companies such as BBC made it into the list of victims of the most significant data breaches in 2023, contributing to a staggering 5.3 billion total breached records. So, it’s safe […]

The post 7 Data Security Best Practices for Your Enterprise appeared first on DATAVERSITY.


Read More
Author: Anas Baig

The Evolution of AI Graph Databases: Building Strong Relations Between Data (Part One)


We live in an era in which business operations and success are based in large part on how proficiently databases are handled. This is an area in which graph databases have emerged as a transformative force, reshaping our approach to handling and analyzing datasets.  Unlike the conventional structure of traditional methods of accessing databases, which […]

The post The Evolution of AI Graph Databases: Building Strong Relations Between Data (Part One) appeared first on DATAVERSITY.


Read More
Author: Prashant Pujara

What to Expect in 2024: The Dominance of Hybrid and Multi-Cloud Architecture


In 2024, the IT landscape will continue to shift and evolve. Driven by the relentless advancement of cloud computing and the growing demand for hybrid and multi-cloud environments, the next year will be marked by several trends that will become more mainstream. From the dominance of cloud-native virtual desktop infrastructure (VDI) solutions to the impact […]

The post What to Expect in 2024: The Dominance of Hybrid and Multi-Cloud Architecture appeared first on DATAVERSITY.


Read More
Author: Amitabh Sinha

How To Plan Your Data Roadmap For 2024 – Elevating Your Data Strategy


It’s that time of year again, when data leaders, VPs, and directors need to start planning out their data roadmap. Of course, this brings up an important question: How should you start planning your data roadmap? Especially if your data team has found itself stuck in the data service trap. Simply providing data and…
Read more

The post How To Plan Your Data Roadmap For 2024 – Elevating Your Data Strategy appeared first on Seattle Data Guy.


Read More
Author: research@theseattledataguy.com

The Actian Data Platform’s Superior Price-Performance

When it comes to choosing a technology partner, price and performance should be top of mind. “Price-performance” refers to the measure of how efficiently a database management system (DBMS) utilizes system resources, such as processing power, memory, and storage, in relation to its cost. It is a crucial factor for organizations to consider when selecting a DBMS, as it directly impacts the overall performance and cost-effectiveness of their data management operations. The Actian Data Platform can provide the price-performance you’re looking for and more.

Getting the most value out of any product or service has always been a key objective of any smart customer. This is especially true of those who lean on database management systems to help their businesses compete and grow in their respective markets, even more so when you consider the exponential growth in both data sources and use cases in any given industry or vertical. This might apply if you’re an insurance agency that needs real-time policy quote information, or if you’re in logistics and need the most accurate, up-to-date information about the location of certain shipments. Addressing use cases like these as cost-effectively as possible is key in today’s fast-moving world.

The Importance of Prioritizing Optimal Price-Performance

Today, CFOs and technical users alike are trying to find ways to get the best price-performance possible from their DBMSs. CFOs are interested not only in up-front acquisition and implementation costs, but also in all the downstream costs associated with utilizing and maintaining whichever system they choose.

Technical users of various DBMS offerings are also looking for alternative ways to utilize their systems to save costs. In the back alleys of the internet (places like Reddit and other forums), users of various DBMS platforms are discussing how to effectively “game” their DBMS platforms to get the best price-performance possible, sometimes leading to the development of shadow database solutions just to try and save costs.

According to a December 2022 survey by Actian, 56% of businesses struggle to maintain costs as data volumes and workloads increase. These increases drive up total cost of ownership: infrastructure maintenance, support, query complexity, the number of concurrent users, and management overhead all have a significant impact on the cost of running a database management system.

Superior Price-Performance

Established over 50 years ago, Actian was in the delivery room when enterprise data management was born. Since then, we’ve kept our fingers on the pulse of the market’s requirements, developing products that address a wide range of use cases across industries worldwide.

The latest version of the Actian Data Platform includes native data integrations with 300+ out-of-the-box connectors and scalable data warehousing and analytics that produce REAL real-time insights to support more confident decision-making. The Actian Data Platform can be used on-premises, in the cloud (including across multiple clouds), and in a hybrid model. The Actian Platform also provides no-code, low-code, and pro-code solutions to enable a multitude of users, both technical and non-technical.

The 2023 GigaOm TPC-H Benchmark Test

At Actian, we were curious about how our platform compared with other major players and whether it could deliver the price-performance the market is looking for. In June of 2023, we commissioned a TPC-H benchmark test with GigaOm, pitting the Actian Data Platform against both Google BigQuery and Snowflake. The test involved running 22 queries against a 30TB TPC-H data set. Actian’s response times were better than the competition’s in 20 of those 22 queries. Furthermore, the benchmark report revealed that:

  • In a test of five concurrent users, Actian was overall 3x faster than Snowflake and 9x faster than BigQuery.

  • In terms of price-performance, the Actian Data Platform produced even greater advantages when running the five concurrent user TPC-H queries. Actian proved roughly 4x less expensive to operate than Snowflake, based on cost per query per hour, and 16x less costly than BigQuery.

These were compelling results. Overall, the GigaOm TPC-H benchmark shows that the Actian Data Platform is a high-performance cloud data warehouse that is well-suited for organizations that need to analyze large datasets quickly and cost-effectively.
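To make the idea of price-performance concrete, here’s a minimal sketch of how a cost-per-query-per-hour style figure can be calculated. The platform names, run times, and hourly rates below are hypothetical placeholders, not GigaOm’s measurements.

```python
# Hypothetical illustration of a "cost per query per hour" style calculation.
# The figures below are invented placeholders, not GigaOm benchmark results.

runs = {
    # platform: (queries completed, elapsed hours, hourly platform cost in USD)
    "Platform A": (22 * 5, 1.2, 55.0),  # e.g., 22 TPC-H queries x 5 concurrent users
    "Platform B": (22 * 5, 3.8, 60.0),
}

for platform, (queries, hours, hourly_cost) in runs.items():
    queries_per_hour = queries / hours
    total_cost = hours * hourly_cost
    cost_per_query = total_cost / queries
    # "Price-performance" here: how much you pay for each query at a given throughput.
    print(f"{platform}: {queries_per_hour:.1f} queries/hour, "
          f"${cost_per_query:.3f} per query, ${total_cost:.2f} total")
```

The faster a platform finishes the same workload, the fewer billable hours it consumes, which is why throughput and cost move together in a figure like this.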

Actian customer the Automobile Association (AA), based in the United Kingdom, reduced its quote response time to 400 milliseconds. Without the speed provided by the Actian Platform, the AA couldn’t offer prospective customers the convenience of viewing insurance quotes on its various comparison pages, a capability that gives it a clear advantage over its competitors.

Let Actian Help

If price-performance is a key factor for you, and you’re looking for a complete data platform that will provide superior capabilities and ultimately lower your TCO, do these three things:

  1. Download a copy of the GigaOm TPC-H Benchmark report and read the results for yourself.
  2. Take the Actian Data Platform for a test-drive!
  3. Contact us! One of our friendly, knowledgeable representatives will be in touch with you to discuss the benefits of the Actian Data Platform and how we can help you have more confidence in your data-driven decisions that keep your business growing.

The post The Actian Data Platform’s Superior Price-Performance appeared first on Actian.


Read More
Author: Phil Ostroff

Forget p(Doom): Here’s How to Hold AI Accountable by Creating an Inference Economy


AI’s existential risk is no laughing matter: As much as the tech world tries to make sense of it all, nobody definitively knows whether the dystopian “control the world” hype is legit or just sci-fi. Before we even cross that bridge (real or not), however, we need to face a more immediate problem. We’re already […]

The post Forget p(Doom): Here’s How to Hold AI Accountable by Creating an Inference Economy appeared first on DATAVERSITY.


Read More
Author: Sishir Varghese

The Silver Bullet Myth: Debunking One-Size-Fits-All Solutions in Data Governance


Data Governance plays a crucial role in modern business, yet the approach to it is often mired in unhelpful misconceptions. While 61% of leaders indicate a desire to optimize Data Governance processes, only 42% think that they are on track to meet their goals. This disparity highlights a significant challenge: The need for effective strategies has been […]

The post The Silver Bullet Myth: Debunking One-Size-Fits-All Solutions in Data Governance appeared first on DATAVERSITY.


Read More
Author: Samuel Bocetta

Generative AI Trends for 2024


It sometimes seems like generative AI (GenAI) is sweeping the business world. Not so. We remain in the hype cycle, with forward-thinking organizations refraining from full-blown rollouts for the foreseeable future. Caution is important; organizations that rapidly deploy generative AI integrations have already faced blowback in the form of lawsuits and investor scrutiny. To avoid […]

The post Generative AI Trends for 2024 appeared first on DATAVERSITY.


Read More
Author: Brett Hansen

Proper Data Management Drives Business Success


Data Management used to be an enhancement to business operations. But as the datasphere continues to grow exponentially, it’s now a critical factor to business success. This is inclusive of data visibility, access controls, classification, compliance, and backups. Successful business leaders will recognize that the ability to manage their company’s data effectively has substantial financial […]

The post Proper Data Management Drives Business Success appeared first on DATAVERSITY.


Read More
Author: Sonya Duffin

Do You Have a Data Quality Framework?

We’ve shared several blogs about the need for data quality and how to stop data quality issues in their tracks. In this post, we’ll focus on another way to help ensure your data meets your quality standards on an ongoing basis by implementing and utilizing a data quality management framework. Do you have this type of framework in place at your organization? If not, you need to launch one. And if you do have one, there may be opportunities to improve it.

A data quality framework supports the protocols, best practices, and quality measures that monitor the state of your data. This helps ensure your data meets your quality threshold for usage and allows more trust in your data. A data quality framework continuously profiles data using systematic processes to identify and mitigate issues before the data is sent to its destination.
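As a concrete illustration of that “profile, then gate” idea, here’s a minimal sketch that checks a batch of records against a couple of simple rules before letting it continue to its destination. The field names and the 95% threshold are assumptions made for the example, not part of any particular framework.

```python
# Minimal sketch of profiling records and blocking a load when quality falls
# below a threshold. Field names and the 95% threshold are illustrative only.

records = [
    {"customer_id": "C001", "email": "a@example.com", "amount": 120.5},
    {"customer_id": "C002", "email": None,            "amount": 75.0},
    {"customer_id": None,   "email": "c@example.com", "amount": -10.0},
]

def profile(batch):
    total = len(batch)
    complete = sum(1 for r in batch if r["customer_id"] and r["email"])
    valid_amounts = sum(1 for r in batch if r["amount"] is not None and r["amount"] >= 0)
    return {
        "completeness": complete / total,
        "validity": valid_amounts / total,
    }

metrics = profile(records)
print(metrics)

QUALITY_THRESHOLD = 0.95
if all(score >= QUALITY_THRESHOLD for score in metrics.values()):
    print("Batch passes: send to destination")
else:
    print("Batch held back: route to remediation / cleansing")
```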

Now that you know a data quality framework is needed for more confident, data-driven decision-making and data processes, you need to know how to build one.

Establish Quality Standards for Your Use Cases

Not every organization experiences the same data quality problems, but most companies do struggle with some type of data quality issue. Gartner estimated that every year, poor data quality costs organizations an average of $12.9 million.

As data volumes and the number of data sources increase, and data ecosystems become increasingly complex, it’s safe to assume the cost and business impact of poor data quality have only increased. This underscores the growing need for a robust data quality framework.

The framework allows you to:

  • Assess data quality against established metrics for accuracy, completeness, and other criteria
  • Build a data pipeline that follows established data quality processes
  • Pass data through the pipeline to ensure it meets your quality standard
  • Monitor data on an ongoing basis to check for quality issues

The framework should make sure your data quality is fit for purpose, meaning it meets the standard for the intended use case. Various use cases can have different quality standards, yet it’s a best practice to have an established data quality standard for the business as a whole. This ensures your data meets the minimum standard.
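One way to express “fit for purpose” in practice is to give each use case its own minimum thresholds on top of a business-wide floor. The sketch below is a hypothetical illustration; the use cases and numbers are invented.

```python
# Hypothetical "fit for purpose" check: each use case has its own thresholds,
# on top of a business-wide minimum standard. All numbers are illustrative.

BUSINESS_MINIMUM = {"completeness": 0.90, "accuracy": 0.90}

USE_CASE_STANDARDS = {
    "regulatory_reporting": {"completeness": 0.99, "accuracy": 0.99},
    "marketing_analytics": {"completeness": 0.92, "accuracy": 0.90},
}

def fit_for_purpose(measured, use_case):
    # Use-case thresholds override the business-wide floor where defined.
    required = {**BUSINESS_MINIMUM, **USE_CASE_STANDARDS.get(use_case, {})}
    return all(measured.get(dim, 0.0) >= threshold for dim, threshold in required.items())

measured = {"completeness": 0.95, "accuracy": 0.93}
print(fit_for_purpose(measured, "marketing_analytics"))   # True
print(fit_for_purpose(measured, "regulatory_reporting"))  # False
```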

Key Components of a Data Quality Framework

While each organization will face its own unique set of data quality challenges, the essential components of a data quality framework are the same. They include:

  • Data governance: Data governance makes sure that the policies and roles used for data security, integrity, and quality are performed in a controlled and responsible way. This includes governing how data is integrated, handled, used, shared, and stored, making it a vital component of your framework.
  • Data profiling: Actian defines data profiling as the process of analyzing data, looking at its structure and content, to better understand how it’s relevant and useful, what it’s missing, and how it can be improved. Profiling helps you identify any problems with the data, such as any inconsistencies or inaccuracies.
  • Data quality rules: These rules determine if the data meets your quality standard, or if it needs to be improved or transformed before being integrated or used. Predefining your rules will assist in verifying that your data is accurate, valid, complete, and meets your threshold for usage.
  • Data cleansing: Filling in missing information, filtering out unneeded data, formatting data to meet your standard, and ensuring data integrity is essential to achieving and maintaining data quality. Data cleansing helps with these processes.
  • Data reporting: This reporting gives you information about the quality of your data. Reports can be documents or dashboards that show data quality metrics, issues, trends, recommendations, or other information.

These components work together to create the framework needed to maintain data quality.
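To picture how a few of these components fit together in code, here’s a small, hypothetical sketch in which predefined rules drive a cleansing pass and the outcome is summarized as a simple quality report. It’s illustrative only, not a description of any particular product.

```python
# Illustrative only: rules -> cleansing -> report, on a toy record set.

rows = [
    {"name": " Alice ", "country": "us", "signup_date": "2023-11-02"},
    {"name": "Bob",     "country": "",   "signup_date": None},
]

# Data quality rules: each maps a field to (check, fix).
rules = {
    "name": (lambda v: bool(v and v.strip()), lambda v: (v or "").strip()),
    "country": (lambda v: bool(v), lambda v: v.upper() if v else "UNKNOWN"),
}

report = {"checked": 0, "fixed": 0}
for row in rows:
    for field, (check, fix) in rules.items():
        report["checked"] += 1
        if not check(row[field]):
            row[field] = fix(row[field])
            report["fixed"] += 1
        else:
            row[field] = fix(row[field])  # normalize even when valid (trim, upper-case)

print(rows)    # cleansed data
print(report)  # simple data quality report
```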

Establish Responsibilities and Metrics

As you move forward with your framework, you’ll need to assign specific roles and responsibilities to employees. These people will manage the data quality framework and make sure the data meets your defined standards and business goals. In addition, they will implement the framework policies and processes, and determine what technologies and tools are needed for success.

Those responsible for the framework will also need to determine which metrics should be used to measure data quality. Using metrics allows you to quantify data quality across attributes such as completeness, timeliness, and accuracy. Likewise, these employees will need to define what good data looks like for your use cases.
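As a rough illustration of quantifying those attributes, the sketch below computes completeness and timeliness percentages for a toy dataset. The column names and the 30-day freshness window are assumptions made for the example.

```python
# Toy example: quantify completeness and timeliness as percentages.
# Column names and the 30-day freshness window are illustrative assumptions.
from datetime import date, timedelta

today = date(2024, 1, 15)
rows = [
    {"order_id": 1, "ship_date": date(2024, 1, 10), "carrier": "DHL"},
    {"order_id": 2, "ship_date": date(2023, 10, 1), "carrier": None},
    {"order_id": 3, "ship_date": None,              "carrier": "UPS"},
]

required = ["order_id", "ship_date", "carrier"]
complete = sum(1 for r in rows if all(r[c] is not None for c in required))
fresh = sum(1 for r in rows if r["ship_date"] and today - r["ship_date"] <= timedelta(days=30))

print(f"Completeness: {complete / len(rows):.0%}")  # rows with no missing required fields
print(f"Timeliness:   {fresh / len(rows):.0%}")     # rows updated within the last 30 days
```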

Many processes can be automated, making the data quality framework scalable. As your data and business needs change and new data becomes available, you will need to evolve your framework to meet new requirements.

Expert Help to Ensure Quality Data

Your framework can monitor and resolve issues over the lifecycle of your data. The framework can be used for data in data warehouses, data lakes, or other repositories to deliver repeatable strategies, processes, and procedures for data quality.

An effective framework reduces the risk of poor-quality data—and the problems poor quality presents to your entire organization. The framework ensures trusted data is available for operations, decision-making, and other critical business needs. If you need help improving your data quality or building a framework, we’re here to help.

Related resources you may find useful:

  • Mastering Your Data with a Data Quality Management Framework
  • What is Data Lifecycle Management?
  • What is the Future of Data Quality Management

The post Do You Have a Data Quality Framework? appeared first on Actian.


Read More
Author: Actian Corporation

What Trends to Expect in 2024 in Enterprise Storage? (Part One)


Looking ahead to the new year, we’ve identified seven trends in enterprise storage for 2024. In part one, we’ll define and explore the first four trends. The remaining three trends will be the focus of part two. This information will help equip you to prioritize and be successful in the new year.  Trend: Freeing up […]

The post What Trends to Expect in 2024 in Enterprise Storage? (Part One) appeared first on DATAVERSITY.


Read More
Author: Eric Herzog

Supercharging Value from Data in 2024


Without a doubt, initiatives such as generative AI (GenAI) and cloud migration have garnered the bulk of attention among influencers and data leaders this year, as organizations tried to determine how, and if, they made sense for their business. This trend looks like it will continue in 2024, as nearly all of Gartner’s top strategic […]

The post Supercharging Value from Data in 2024 appeared first on DATAVERSITY.


Read More
Author: Atanas Kiryakov

Data Protection: Trends and Predictions for 2024
Data protection, as the term implies, refers to the safeguarding of personal data from unauthorized access, disclosure, alteration, or destruction. Data protection revolves around the principles of integrity, availability, and confidentiality. Integrity ensures that data remains accurate and consistent during its lifecycle. Availability guarantees that data is accessible and usable when needed, while confidentiality ensures […]


Read More
Author: Gilad David Maayan

The Data-Centric Revolution: Best Practices and Schools of Ontology Design
I was recently asked to present “Enterprise Ontology Design and Implementation Best Practices” to a group of motivated ontologists and wanna-be ontologists. I was flattered to be asked, but I really had to pause for a bit. First, I’m kind of jaded by the term “best practices.” Usually, it’s just a summary of what everyone […]


Read More
Author: Dave McComb

Legal Issues for Data Professionals: A New Data Licensing Model
Setting the Stage: Data as a Business Asset This column presents a new model for licensing and sharing data, one that I call the “Decision Rights Data Licensing Model” (or the “Decision Rights Model,” in a shorter form) and one that has been met with acceptance in commercial transactions. The Decision Rights Model addresses current business […]


Read More
Author: William A. Tanenbaum

The Currency of Information: What Kind of Asset Is Data? (Part Two)
Data professionals often talk about the importance of managing data and information as organizational assets, but what does this mean? What is the actual business value of data and information? How can this value be measured? How do we manage data and information as assets? These are some of the questions that I intend to […]


Read More
Author: Larry Burns

The Role of Dark Data: Uncovering Insights in Unused Information
Dark data remains one of the greatest untapped resources in business. This is due to the vast amounts of usable data that exists within an organization, but is not utilized or analyzed to serve a specific purpose. These untapped sources could include customer information, transaction records, and more. Since dark data represents missed opportunities for […]


Read More
Author: Ainsley Lawrence

Enhancing Data Quality in Clinical Trials
One of the reasons why there’s always excess production in the textile sector is the stringent requirement of meeting set quality standards. It’s a simple case of accepting or rejecting a shipment, depending on whether it meets the requirements. As far as healthcare is concerned, surprisingly, only two out of five health executives believe they receive healthy data through […]


Read More
Author: Irfan Gowani

Data Governance in 2024


As we approach 2024, the landscape of data governance is set to undergo transformative changes. This article delves into the anticipated advancements and shifts in data governance, highlighting the integration of AI, the importance of metadata management, the emphasis on data quality and literacy, evolving operating models, and the integration of data warehousing and lake […]

The post Data Governance in 2024 appeared first on LightsOnData.


Read More
Author: George Firican

Is Your Data Quality Framework Up to Date?

A data quality framework is the set of systematic processes and protocols that continually monitor and profile data to determine its quality. The framework is used over the lifecycle of data to ensure the quality meets the standard necessary for your organization’s use cases.

Leveraging a data quality framework is essential to maintain the accuracy, timeliness, and usefulness of your data. Yet with more data coming into your organization from a growing number of sources, and more use cases requiring trustworthy data, you need to make sure your data quality framework stays up to date to meet your business needs.

If you’re noticing data quality issues, such as duplicate data sets, inaccurate data, or data sets that are missing information, then it’s time to revisit your data quality framework and make updates.

Establish the Data Quality Standard You Need

The purpose of the framework is to ensure your data meets a minimum quality threshold. This threshold may have changed since you first launched your framework. If that’s the case, you will need to determine the standard you now need, then update the framework’s policies and procedures to ensure it provides the data quality required for your use cases. The update ensures your framework reflects your current data needs and data environment.

Evaluate Your Current Data Quality

You’ll want to understand the current state of your data. You can profile and assess your data to gauge its quality, and then identify any gaps between your current data quality and the quality needed for usage. If gaps exist, you will need to determine what needs to be improved, such as data accuracy, structure, or integrity.
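That gap analysis can be as simple as comparing measured scores against required thresholds, as in the hypothetical sketch below (both sets of numbers are invented for illustration).

```python
# Hypothetical gap analysis: measured quality vs. the level a use case requires.

measured = {"accuracy": 0.91, "completeness": 0.97, "integrity": 0.88}
required = {"accuracy": 0.95, "completeness": 0.95, "integrity": 0.90}

gaps = {
    dimension: round(required[dimension] - score, 2)
    for dimension, score in measured.items()
    if score < required[dimension]
}

if gaps:
    print("Dimensions needing improvement:", gaps)  # e.g. {'accuracy': 0.04, 'integrity': 0.02}
else:
    print("Current data quality meets the required standard")
```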

Reevaluate Your Data Quality Strategy

Like your data quality framework, your data quality strategy needs to be reviewed from time to time to ensure it meets your current requirements. The strategy should align with business requirements for your data, and your framework should support the strategy. This is also an opportunity to assess your data quality tools and processes to make sure they still fit your strategy, and to make updates as needed. Likewise, this is an ideal time to review your data sources and make sure you are bringing in data from all the sources you need—new sources are constantly emerging and may be beneficial to your business.

Bring Modern Processes into Your Framework

Data quality processes, such as data profiling and data governance, should support your strategy and be part of your framework. These processes, which continuously monitor data quality and identify issues, can be automated to make them faster and scalable. If your data processing tools are cumbersome and require manual intervention, consider modernizing them with easy-to-use tools.
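Automation here can be as simple as registering checks and running them from a scheduler. The sketch below is a minimal, hypothetical illustration; in practice the checks would be triggered by a cron job or a pipeline orchestrator rather than run by hand.

```python
# Minimal, hypothetical illustration of automating recurring data quality checks.
# In practice a scheduler (cron, an orchestrator, etc.) would call run_checks().

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},
]

def no_missing_email(data):
    return all(r["email"] for r in data)

def unique_ids(data):
    ids = [r["id"] for r in data]
    return len(ids) == len(set(ids))

CHECKS = [("no_missing_email", no_missing_email), ("unique_ids", unique_ids)]

def run_checks(data):
    failures = [name for name, check in CHECKS if not check(data)]
    if failures:
        # Stand-in for alerting (email, Slack, ticket) in a real deployment.
        print("Data quality alert:", failures)
    else:
        print("All data quality checks passed")

run_checks(rows)  # -> Data quality alert: ['no_missing_email', 'unique_ids']
```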

Review the Framework on an Ongoing Basis

Regularly reviewing your data quality framework ensures it is maintaining data at the quality standard you need. As data quality needs or business needs change, you will want to make sure the framework meets your evolving requirements. This includes keeping your data quality metrics up to date, which could entail adding or changing your metrics for data quality.

Ensuring 7 Critical Data Quality Dimensions

Having an up-to-date framework helps maintain quality across these seven attributes:

  1. Completeness: The data is not missing fields or other needed information and has all the details you need.
  2. Validity: The data matches its intended need and usage.
  3. Uniqueness: The data set is unique in the database and not duplicated.
  4. Consistency: Data sets are consistent with other data in the database, rather than being outliers.
  5. Timeliness: The data set offers the most accurate information that’s available at the time the data is used.
  6. Accuracy: The data has values you expect and are correct.
  7. Integrity: The data set meets your data quality and governance standards.

Your data quality framework should have the ability to cleanse, transform, and monitor data to meet these attributes. When it does, this gives you the confidence to make data-driven decisions.
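A framework typically encodes dimensions like these as automated checks. The sketch below illustrates just two of them, uniqueness and consistency, on a toy table; the data and the expected value range are assumptions made for the example.

```python
# Illustrative checks for two of the dimensions above: uniqueness and consistency.
# The toy data and the expected value range are assumptions for the example.

orders = [
    {"order_id": "A1", "total": 100.0},
    {"order_id": "A2", "total": 104.0},
    {"order_id": "A2", "total": 98.0},    # duplicate key -> uniqueness issue
    {"order_id": "A4", "total": 9000.0},  # far outside the expected range -> consistency issue
]

# Uniqueness: every order_id should appear exactly once.
ids = [o["order_id"] for o in orders]
duplicates = sorted({i for i in ids if ids.count(i) > 1})

# Consistency: flag totals outside the range this table normally holds.
EXPECTED_RANGE = (0.0, 1000.0)  # assumed business rule for this illustration
outliers = [o["order_id"] for o in orders
            if not EXPECTED_RANGE[0] <= o["total"] <= EXPECTED_RANGE[1]]

print("Duplicate keys:", duplicates)   # ['A2']
print("Out-of-range rows:", outliers)  # ['A4']
```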

What Problems Do Data Quality Frameworks Solve?

An effective framework can address a range of data quality issues. For example, the framework can identify inaccurate, incomplete, and inconsistent data to prevent poor-quality data from negatively impacting the business. A modern, up-to-date framework can improve decision-making, enable reliable insights, and potentially save money by preventing incorrect conclusions or unintended outcomes caused by poor-quality data. A framework that ensures data meets a minimum quality standard also supports business initiatives and improves overall business operations. For instance, the data can be used for initiatives such as improving customer experiences or predicting supply chain delays.

Make Your Quality Data Easy to Use for Everyone

Maintaining data quality is a constant challenge. A current data quality framework mitigates the risk that poor quality data poses to your organization by keeping data accurate, complete, and timely for its intended use cases. When your framework is used in conjunction with the Actian Data Platform, you can have complete confidence in your data. The platform makes accurate data easy to access, share, and analyze to reach your business goals faster.

The post Is Your Data Quality Framework Up to Date? appeared first on Actian.


Read More
Author: Actian Corporation

Choosing Tools for Data Pipeline Test Automation (Part 2) 


In part one of this blog post, we described why there are many challenges for developers of data pipeline testing tools (complexities of technologies, large variety of data structures and formats, and the need to support diverse CI/CD pipelines). More than 15 distinct categories of test tools that pipeline developers need were described.  Part two delves […]

The post Choosing Tools for Data Pipeline Test Automation (Part 2)  appeared first on DATAVERSITY.


Read More
Author: Wayne Yaddow

Data Catalog, Semantic Layer, and Data Warehouse: The Three Key Pillars of Enterprise Analytics


Analytics at the core is using data to derive insights for measuring and improving business performance [1]. To enable effective management, governance, and utilization of data and analytics, an increasing number of enterprises today are looking at deploying the data catalog, semantic layer, and data warehouse. But what exactly are these data and analytics tools […]

The post Data Catalog, Semantic Layer, and Data Warehouse: The Three Key Pillars of Enterprise Analytics appeared first on DATAVERSITY.


Read More
Author: Prashanth Southekal and Inna Tokarev Sela

Benefitting from Generative AI: What to Expect in 2024


According to a recent McKinsey study, generative AI’s (GenAI) impact on productivity could add trillions of dollars in value to the global economy, an estimated $2.6 trillion to $4.4 trillion annually across the 63 use cases analyzed. About 75% of the value that GenAI’s use cases could […]

The post Benefitting from Generative AI: What to Expect in 2024 appeared first on DATAVERSITY.


Read More
Author: Ryan Welsh

A Strategic Approach to Data Management


There is a delicate balance between the needs of data scientists and the requirements of data security and privacy.

Data scientists often need large volumes of data to build robust models and derive valuable insights. However, the accumulation of data increases the risk of data breaches, which is a concern for security teams.

This hunger for data, and the need for suitable control over sensitive data, creates tension between the data scientists seeking more data and the security teams implementing measures to protect data from inappropriate use and abuse.

A strategic approach to data management is needed, one that satisfies the need for data-driven insights while also mitigating security risks.

There needs to be an emphasis on understanding the depth of the data, rather than just hoarding it indiscriminately.

In a Towards Data Science article, author Stephanie Kirmer reflects on her experience as a senior machine learning engineer and discusses the challenges organizations face as they transition from data scarcity to data abundance.

Kirmer highlights the importance of making decisions about data retention and striking a balance between accumulating enough data for effective machine learning and avoiding the pitfalls of data hoarding.

Kirmer also touches on the impact of data security regulations, which add a layer of complexity to the issue. Despite the challenges, Kirmer advocates for a nuanced approach that balances the interests of consumers, security professionals, and data scientists.

Kirmer also stresses the importance of establishing principles for data retention and usage to guide organizations through the decisions surrounding data storage.

Paul Gillin, technology journalist at Computerworld, raised this topic back in 2021. In his piece “Data hoarding: The consequences go far beyond compliance risk,” Gillin discusses the implications of data hoarding, which extend beyond compliance risks alone, and highlights how the decline in storage costs has led to a tendency to retain information rather than discard it.

Pijus Jauniškis, a writer on internet security at Surfshark, describes how the practice can lead to significant risks, especially under regulations like the General Data Protection Regulation (GDPR) in Europe and similar legislation in other parts of the world.

In a landscape where data is both a valuable asset and a potential liability, however, a balanced and strategic approach to data management is crucial to ensure that the needs of both groups are met.

The data community has a significant responsibility in recognizing both.

Data management responsibilities extend beyond the individual who created or collected the data. Various parties are involved in the research process and play a role in ensuring quality data stewardship.

To generate valuable data insights, people need to become fluent in data. Data communities can help individuals immerse themselves in the language of data, encouraging data literacy.

Organizationally, a governing body is often responsible for the strategic guidance of a data governance program, prioritization of data governance projects and initiatives, and approval of organization-wide data policies and standards; if no such body exists, one should be established.

Accountability includes the responsible handling of classified and controlled information, upholding data use agreements made with data providers, minimizing data collection, and informing individuals and organizations of the potential uses of their data.

In the world of data management, there is a collective duty to prioritize and respond to the ethical, legal, social, and privacy-related challenges that come from using data in new and different ways in advocacy and social change.

A balanced and strategic approach to data management is crucial to ensure that the needs of all stakeholders are met. We collectively need to find the right balance between leveraging data for insights and innovation, while also respecting privacy, security, and ethical considerations.


Read More
Author: Uli Lokshin

Unlocking Value through Data and Analytics


Organizations are constantly seeking ways to unlock the full potential of their data, analytics, and artificial intelligence (AI) portfolios.

Gartner, Inc., a global research and advisory firm, identified the top 10 trends shaping the Data and Analytics landscape in 2023 earlier this year.

These trends not only provide a roadmap for organizations to create new sources of value but also emphasize the imperative for D&A leaders to articulate and optimize the value they deliver in business terms.

Bridging the Communication Gap

The first and foremost trend highlighted by Gartner is “Value Optimization.”

Many D&A leaders struggle to articulate the tangible value their initiatives bring to the organization in terms that resonate with business objectives.

Gareth Herschel, VP Analyst at Gartner, emphasizes the importance of building “value stories” that establish clear links between D&A initiatives and an organization’s mission-critical priorities.

Achieving value optimization requires a multifaceted approach, integrating competencies such as value storytelling, value stream analysis, investment prioritization, and the measurement of business outcomes.

Managing AI Risk: Beyond Compliance

As organizations increasingly embrace AI, they face new risks, including ethical concerns, data poisoning, and fraud detection circumvention.

“Managing AI Risk” is the second trend outlined by Gartner, highlighting the need for effective governance and responsible AI practices.

This goes beyond regulatory compliance, focusing on building trust among stakeholders and fostering the adoption of AI across the organization.

Observability: Unveiling System Behaviour

Another trend, “Observability,” emphasizes the importance of understanding and answering questions about the behaviour of D&A systems.

This characteristic allows organizations to reduce the time it takes to identify performance-impacting issues and make timely, informed decisions.

Data and analytics leaders are encouraged to evaluate observability tools that align with the needs of primary users and fit into the overall enterprise ecosystem.

Creating a Data-Driven Ecosystem

Gartner’s fourth trend, “Data Sharing Is Essential,” underscores the significance of sharing data both internally and externally.

Organizations are encouraged to treat data as a product, preparing D&A assets as deliverables for internal and external use.

Collaborations in data sharing enhance value by incorporating reusable data assets, and the adoption of a data fabric design is recommended for creating a unified architecture for data sharing across diverse sources.

Nurturing Responsible Practices

“D&A Sustainability” extends the responsibility of D&A leaders beyond providing insights for environmental, social, and governance (ESG) projects.

It urges leaders to optimize their own processes for sustainability, addressing concerns about the energy footprint of D&A and AI practices. This involves practices such as using renewable energy and energy-efficient hardware, and adopting small data and machine learning techniques.

Enhancing Data Management

“Practical Data Fabric” introduces a data management design pattern that leverages metadata to observe, analyse, and recommend data management solutions.

By enriching the semantics of underlying data and applying continuous analytics over metadata, data fabric generates actionable insights for both human and automated decision-making. It empowers business users to confidently consume data and enables less-skilled developers in the integration and modelling process.

Emergent AI

“Emergent AI” heralds the transformative potential of AI technologies like ChatGPT and generative AI. Not everyone agrees: one AI researcher has argued that “AI ‘emergent abilities’ are a mirage,” in a paper presented in May at the Stanford Data Science 2023 Conference examining claims of emergent abilities in large language models (LLMs), as cited by Andréa Morris, a Forbes contributor covering science, robots, and the arts.

However it is characterized, this emerging trend is expected to redefine how companies operate, offering scalability, versatility, and adaptability. As AI becomes more pervasive, it is poised to enable organizations to apply it in novel situations, expanding its value across diverse business domains.

Gartner highlights another trend, “Converged and Composable Ecosystems,” an older topic from the start of the 2020s. It is focused on designing and deploying data and analytics platforms that operate cohesively through seamless integrations, governance, and technical interoperability.

The trend advocates for modular, adaptable architectures that can dynamically scale to meet evolving business needs.

The ninth trend, “Consumers as Creators,” is nothing particularly new; it envisions a shift from predefined dashboards to conversational, dynamic, and embedded user experiences.

Werner Geyser described “20 Creator Economy Statistics That Will Blow You Away in 2023” in his Influencer Marketing Hub piece.

A large percentage of consumers identify as creators: over 200 million people globally consider themselves “creators.”

Content creators can earn over $50K a year, and the global influencer market has grown to an estimated $21 billion in 2023.

Organizations are encouraged to empower content consumers by providing easy-to-use automated and embedded insights, fostering a culture where users can become content creators.

Humans remain the key decision-makers, and not every decision can or should be automated. Decision support and the human role in automated and augmented decision-making remain critical considerations.

Organizations need to combine data and analytics with human decision-making in their data literacy programs. While indicators from market analysts like Gartner may serve as a compass, guiding leaders toward creating value, managing risks, and embracing innovation, the imperative to deliver provable value at scale underscores the strategic role of data and analytics leaders in shaping the future of their organizations.

As the data and analytics landscape continues to evolve, organizations that leverage the trends strategically will be well-positioned to turn extreme uncertainty into new business opportunities.


Read More
Author: Jewel Tan
