Customer Fundamentals – time to take a big step

Master Data Management may be positioned as a “silver bullet” for the woes of poor customer master data but it doesn’t solve for long-standing systemic organizational deficiencies.

Any organization embarking on a Master Data Management (MDM) initiative needs to look long and hard at how its data is created, described, and managed, characteristics that are independent of the MDM technology itself, if it wishes to get the best value from the investment.

That highly desirable 360º view of the customer, for example, what is it exactly?

The benefits described by vendors and industry experts are numerous, but perhaps a small handful are the most important and most achievable.

  • Reducing the costs and risks of customer data ownership
  • Reducing friction in transacting or engaging with the customer
  • Improving customer segmentation and targeting lists

All three of these outcomes are, however, heavily contingent on a number of important behaviours and organizational culture shifts.

Continue reading at Pretectum.com

Some customer data is missing
The incomplete person

How often does this come up as a problem to solve? It may happen more frequently than you think.

Having clean, comprehensive, and consistent data is paramount to the most appropriate customer engagement and interaction. If your business is also an advocate and heavy user of automation, machine learning and artificial intelligence, then your technical teams will tell you that the results they achieve are commensurate not only with their efforts but also with the quality of the data they are working with.

Without the best possible customer data, your staff and systems are exposed to a partial picture which can result in bad decisions, model bias and skewed results.

A May 2013 article by Hyun Kang, available through the US National Library of Medicine's PubMed Central (PMC), describes three types of potential data deficiency in any given data set. While Kang's focus is on suitability for clinical studies, the framework is useful for considering customer master data in general.

The three types are Missing Completely at Random (MCAR), Missing at Random (MAR), and Missing Not at Random (MNAR), each with its own causes and potential remedies.
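To make the distinction concrete, here is a minimal Python sketch (pandas and NumPy over an entirely synthetic customer table) that simulates each mechanism; the fields and probabilities are invented for illustration, not taken from Kang's article.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Synthetic customer master records (illustrative only).
customers = pd.DataFrame({
    "age": rng.integers(18, 80, size=1_000),
    "income": rng.normal(60_000, 15_000, size=1_000).round(2),
    "email": [f"customer{i}@example.com" for i in range(1_000)],
})

# MCAR: every email has the same 10% chance of being missing,
# regardless of any other attribute.
mcar_mask = rng.random(len(customers)) < 0.10

# MAR: missingness depends on an *observed* attribute (age);
# younger customers are more likely to have skipped the email field.
mar_mask = rng.random(len(customers)) < np.where(customers["age"] < 30, 0.30, 0.05)

# MNAR: missingness depends on the *unobserved* value itself;
# here, high-income customers decline to share their income at all.
mnar_mask = customers["income"] > 80_000

for name, mask in [("MCAR", mcar_mask), ("MAR", mar_mask), ("MNAR", mnar_mask)]:
    degraded = customers.copy()
    col = "income" if name == "MNAR" else "email"
    degraded.loc[mask, col] = np.nan
    print(f"{name}: {degraded[col].isna().mean():.1%} missing in '{col}'")
```

The practical point: MCAR gaps can usually be handled with simple imputation or case deletion, while MAR and MNAR gaps will bias any analysis that ignores the cause of the missingness.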

We’ll look at this through the lens of a customer master data management system. Read more at Pretectum.com

Cumulocity, IoT, and Analytics


Today, all of our digital devices and sensors are interconnected in the Internet of Things. My company has built an extension for the KNIME Analytics Platform that enables you to connect to Software AG’s Cumulocity IoT platform so that you can use the more advanced analytics provided by KNIME on your Cumulocity data. The concept behind the Cumulocity platform […]

The post Cumulocity, IoT, and Analytics appeared first on DATAVERSITY.


Read More
Author:

Dietrich Wettschereck

The ROI of email and why it matters

Email has long been an essential weapon in the marketer's arsenal, and many businesses have increased their reliance on the channel during the pandemic.

In Validity and the DMA’s Marketer Email Tracker 2021, almost three-quarters (72%) of brands reported that email is their preferred and most used channel to engage with customers across the customer journey, compared to 66% who said social media.

While email's popularity and effectiveness aren't disputed, calculating exactly how valuable it is as a marketing channel can be tricky. In fact, Validity found that one in three marketers aren't confident calculating the ROI of email, which presents a significant challenge when it comes to reporting on performance and securing ongoing budget and resources. (This can become a vicious cycle – with less budget and fewer resources, performance is likely to decline.)
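For reference, the arithmetic itself is simple; the hard part is attributing revenue to the channel. A minimal sketch of the standard ROI calculation, with purely illustrative figures:

```python
# Illustrative figures only -- substitute your own campaign data.
campaign_cost = 5_000.00          # platform fees, design, copy, list management
attributed_revenue = 42_500.00    # revenue tracked back to the email campaign

roi = (attributed_revenue - campaign_cost) / campaign_cost
print(f"Email ROI: {roi:.1%}")    # Email ROI: 750.0%
```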

Read more

Developing your Data DNA

In a world where everything is changing too quickly, the evolution towards agile, interoperable data services is a welcome change. Data can now be delivered as a service, without the need for costly investments in data centers and the resources needed to manage them. As more companies embrace the cloud, data integration and data quality need to become more important considerations.

As a result, organizations are focusing on delivering products and services at a faster pace, and to achieve this, operational analytics is more critical than ever. Today, organizations are reliant on using their data, along with external data, to make better decisions.

And just as the cloud alleviated the expense and expertise needed to manage infrastructure, data is also seeing accelerated value from the cloud. Data lakes and cloud data warehouses make it affordable and easy to store and use all your data. So why are companies still struggling to maximize their data potential?

It’s probably due to one of these 3 culprits:

  1. You failed to align stakeholders and create a data-driven culture. This is, by far, the primary reason why most data projects fail. In fact, according to a 2021 survey of Fortune 1000 companies, “executives report that cultural challenges – not technology challenges – represent the biggest impediment to successful adoption of data initiatives and the biggest barrier to realizing business outcomes.” For any data project to succeed, there needs to be strong leadership at the top of the organization and a data culture that permeates throughout the organization.
  2. Your data is – literally – everywhere. I’m sure I’m not telling you anything new, but it really can’t be overstated – your data is living in places you don’t know about. It’s in third-party systems, spreadsheets on personal devices, and in public online repositories. It’s also in legacy systems which can pose a significant challenge since these are often proprietary and not always the most cooperative when you need to retrieve data regularly. These older systems are often also considered mission-critical, so if you don’t create a data-driven culture, there may be resistance from application owners. As you put together your Rockstar team of stakeholders, this is a good time to audit the systems in use by every department. This leads me to my last point on what is limiting data insights….
  3. Your data quality sucks. While it stands to reason that your data isn’t going to be perfect, it should be as accurate and consistent as possible to drive better business decisions. At a minimum, data quality requires the following (a rough sketch follows this list):
    • Discovery and Profiling. Know where your data lives and what it does. Understand the accuracy and completeness of your data and use that as a baseline. Data quality is like laundry: it never ends.
    • Standard, conformant and clean data. Once you’ve done the work to understand your data, it’s important to define what “good” looks like and create rules that maintain that definition going forward. If you have a team that is focused on this today, understanding what those rules are and why they exist is a critical component of a successful data project.
    • Deduplicated data. While no one wants to forecast revenue twice, with many databases and storage residing in the cloud, duplicate data can cause more than incorrect reports. Cloud costs can easily spiral if you’re storing and analyzing duplicate data.
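As a rough, hedged illustration of those three minimums, here is a small pandas sketch over an entirely synthetic customer table; the column names and cleanup rules are assumptions for the example, not a prescription.

```python
import pandas as pd

# Entirely synthetic records standing in for customer data scattered across systems.
raw = pd.DataFrame({
    "name":  ["Ada Lovelace", "ada lovelace ", "Grace Hopper", None],
    "email": ["ADA@Example.com", "ada@example.com", "grace@example.com", "grace@example.com"],
    "state": ["ca", "CA", "New York", "NY"],
})

# 1. Discovery and profiling: know what you have and how complete it is.
print(raw.isna().mean())          # share of missing values per column
print(raw.nunique())              # cardinality per column

# 2. Standard, conformant, clean data: encode "what good looks like" as rules.
clean = raw.assign(
    name=raw["name"].str.strip().str.title(),
    email=raw["email"].str.strip().str.lower(),
    state=raw["state"].str.strip().str.upper().replace({"NEW YORK": "NY"}),
)

# 3. Deduplicated data: one record per customer, keyed here on the cleaned email.
deduped = clean.drop_duplicates(subset="email", keep="first")
print(deduped)
```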

Today, more organizations than ever are facing the challenge of increasing data and technological complexity, but few are seeing a significant return. To thrive in the digital era, organizations must embrace new thinking. Infusing data obsession into the corporate DNA will allow data to start driving better decisions and better results. Check out how Actian’s DataConnect integration platform can help with your data quality goals.

The post Developing your Data DNA appeared first on Actian.


Read More
Author:

Traci Curran
What is a data domain? (examples included)


Determining your data domains is an important part of your data strategy. So what is a data domain?  It actually can mean a couple of things, depending if we look at it from the point of view of data management and database management, or if we look at it from the point of view of data […]

The post What is a data domain? (examples included) appeared first on LightsOnData.


Read More
Author:

George Firican
2022 Predictions: Cloud-Native Apps, the Edge, Ransomware, and More


As we look toward the new year, we expect some of the key trends will involve cloud-native apps, edge use cases, private clouds, ransomware, business continuity, and data sovereignty. All of these will have implications for how organizations manage, store, and protect their data in 2022. Gary Ogasawara, CTO, Cloudian: Cloud-native apps will go to […]

The post 2022 Predictions: Cloud-Native Apps, the Edge, Ransomware, and More appeared first on DATAVERSITY.


Read More
Author:

Gary Ogasawara and Jon Toor

Transforming Analytics with a Modern Data Stack


There’s been a lot of talk about the modern data stack recently. Much of this focus is placed on the innovations around the movement, transformation, and governance of data as it relates to the shift from on-premise to cloud data warehouse-centric architectures. When it comes to the analytics layer – which sits on top of the modern […]

The post Transforming Analytics with a Modern Data Stack appeared first on DATAVERSITY.


Read More
Author:

Ajay Khanna

Hybrid Cloud Data Warehouses: Don’t Get Stuck on One Side or the Other

Hybrid clouds seem to be the way the wind is blowing. According to the Enterprise Strategy Group, the share of organizations committed to or interested in a hybrid cloud strategy increased from 81% in 2017 to 93% in 2020. But what exactly is a hybrid cloud? Turns out, there are a lot of definitions. I’ll share a definition from Deloitte that I like:

“Hybrid cloud is cloud your way. It’s integrating information systems—from on-premises core systems to private cloud, public cloud, and edge environments—to maximize your IT capabilities and achieve better business outcomes. It’s designing, building, and accelerating your possible.”

Why Hybrid Cloud?

Data warehouse deployments on premises and in the public cloud play equally important roles in a hybrid cloud strategy. The Enterprise Strategy Group found that 89% of organizations still expect to have a meaningful on-premises footprint in three years. At the same time, Gartner predicts that public cloud services will be essential for 90% of data and analytics innovation by 2022. Accordingly, organizations are adopting a hybrid cloud strategy to leverage the right mix of locations to meet their needs.

Consider: The cloud provides the flexibility to build out and modify services in an agile manner, the potential to scale almost infinitely, the assurance of enhanced business continuity, and the ability to avoid capital expenditures (CapEx)—all of which continue to accelerate the adoption of cloud-based data warehouses. But data warehouses running on premises in your own data center deliver their own advantages:

  • Data gravity: Sometimes data is hard to move to public clouds since there’s so much of it and/or the data have interdependencies with other systems and databases.
  • More control over governance and regulatory compliance: You know where and under what geographic or other restrictions your data is operating.
  • More control over deployment infrastructure: You may want to use hardware, operating systems, databases, applications, and tools you already have.
  • Avoiding high operational expenditure (OpEx): Consumption-based pricing models in public clouds can lead to high OpEx when usage is frequent – particularly if that data is fluid, moving between public clouds and on-premises locations.

Hybrid Cloud Evaluation Criteria

To get optimal benefits from a hybrid cloud data warehouse, though, you’ll need a solution that can drive better business outcomes while using fewer resources. For starters, you’ll want a single-solution architecture that can operate in both public cloud and on-premises environments. Solutions from many data warehouse vendors either don’t do this well or don’t do this at all. Many vendors’ data warehouse solutions run in the public cloud or on-premises, and their “hybrid” versions have been cobbled together to meet the increase in demand. However, without the same data and integration services on-premises and in the cloud, the same data model, the same identity, and the same security and management systems, these solutions effectively saddle you with two siloed data warehouse deployments.

Why are common data and integration services, the same data model, the same identity, and the same security and management systems important? Let me tell you:

Same Data Services

It is essential that your data warehouse supports the same data services for public cloud and on-premises data warehouses. Without this, you will wind up with data redundancy and data consistency issues, duplications of effort and resources (human and technical), increased costs, and an inability to provide seamless access across environments.

Same Data Model

A data model determines how data is organized, stored, processed, and presented. Having a single data model for the on-premises and cloud-based sides of your data warehouse eliminates incongruencies across source systems. It also strengthens data governance by ensuring that data is created and maintained in accordance with company standards, policies, and business rules. As data is transformed within the data warehouse—on-premises or in the cloud—it continues to adhere to data definitions and integrity constraints defined in the data model.
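As a loose, vendor-neutral illustration of that idea, the sketch below defines one schema with simple integrity constraints in Python and applies it unchanged to records created on either side; the field names and rules are assumptions for the example, not part of any particular product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SaleRecord:
    """One data model, shared by the on-premises and cloud sides of the warehouse."""
    order_id: str
    region: str
    amount: float

    # Integrity constraints defined once, enforced wherever the record is created.
    VALID_REGIONS = frozenset({"EMEA", "AMER", "APAC"})

    def __post_init__(self):
        if not self.order_id:
            raise ValueError("order_id must not be empty")
        if self.region not in self.VALID_REGIONS:
            raise ValueError(f"unknown region: {self.region!r}")
        if self.amount < 0:
            raise ValueError("amount must be non-negative")

# Records loaded on-premises and in the cloud pass through the same checks.
on_prem_row = SaleRecord(order_id="A-1001", region="EMEA", amount=250.0)
cloud_row = SaleRecord(order_id="C-2002", region="AMER", amount=99.5)
print(on_prem_row, cloud_row)
```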

Same Identity Authentication

Your users should be able to sign on to on-premises and cloud data warehouses using the same login ID and password. Data warehouse support for single sign-on (SSO) access helps eliminate password fatigue for users and, more importantly, can help you ensure that your organization’s identity policies are extended to protect both data warehouse locations.

Same Security and Management Services

Shared security services are also critical for hybrid cloud data warehouses. I’ve already written two blog posts that provide details on security, governance, and privacy requirements for the modern data warehouse, one on database and one on cloud service security, so I will refer you to those for more details. But I would like to point out in this discussion that you will need integrated security services across your on-premises and public cloud environments to ensure a strong and consistent security posture for your hybrid data warehouse.

Finally, shared services for management tasks offer clear advantages in terms of cost, control, and simplicity:

  • You’ll need fewer staff members to develop, maintain, and monitor the components in your hybrid deployment.
  • You’ll improve control through consistent upgrades, patches, and backups.
  • You’ll simplify metering and licensing requirements across environments.

Actian Avalanche Hybrid Cloud Data Warehouse

It should come as no surprise that a hybrid cloud data warehouse that meets all these criteria does exist: the Actian Avalanche™ hybrid-cloud data warehouse, integration, and management platform is a true hybrid cloud data warehouse that can be deployed on-premises as well as in multiple public clouds, including AWS, Azure, and Google Cloud. You can read more about the Avalanche solution here.

 

This originally appeared on iTechnologySeries on October 12, 2021.

The post Hybrid Cloud Data Warehouses: Don’t Get Stuck on One Side or the Other appeared first on Actian.


Read More
Author:

Teresa Wingfield
A Beginner’s Guide to AI and Machine Learning in Web Scraping


With uses spanning personalized medicine to the creation of social media clickbait, the use of artificial intelligence (AI) and machine learning (ML) is expected to transform industries from health care to manufacturing. Web scraping is no exception – and while its use is definitely not the answer to every data collection challenge, simple applications of AI/ML can enhance the process and increase […]

The post A Beginner’s Guide to AI and Machine Learning in Web Scraping appeared first on DATAVERSITY.


Read More
Author:

Aleksandras Šulženko

How Manufacturers Can Get Started Selling Directly To Consumers

Let’s face it: Manufacturers’ traditional sales models can hurt both the company and the customer. With the pandemic wreaking havoc on supply chains, selling direct has become a more popular option for many manufacturers.

Looking ahead through the 2020s, the old model of selling through a distribution/broker/retailer channel may still be alive, but many company leaders are finding that their customers prefer to buy directly from them. On top of that, restrictive middleman margins can increasingly put a chokehold on manufacturers’ profits.

When Covid-19 hit, people ran to the internet to shop for just about everything. I think manufacturing as an industry reached a tipping point: the pandemic, a global supply chain breakdown in which production slowed or even stopped for some companies, and some online retailers refusing to change policies to adapt to these situations.

These factors led many manufacturers to ramp up and increase their investments in a direct-to-consumer (DTC) strategy, where they have 100% control over pricing, inventory levels, and — increasingly — access to critical customer data. As the co-founder of a company that creates digital sales channels for manufacturers, I have some advice for those just getting started with selling directly to consumers.

Read more

What Is SVM Classification Analysis and How Can It Benefit Business Analytics?


This article provides a brief explanation of the SVM classification method of analytics. What Is SVM Classification Analysis? SVM classification is based on the idea of finding a hyperplane that best divides a dataset into predefined classes. The goal is to choose a hyperplane with the greatest possible margin between the hyperplane and any point […]
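As a minimal illustration of that margin-maximizing idea, here is a scikit-learn sketch fitted on a synthetic two-class dataset; the data and parameters are invented for the example.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class dataset standing in for, e.g., churn / no-churn customers.
X, y = make_classification(n_samples=500, n_features=10, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# A linear SVM looks for the hyperplane separating the classes with the widest margin;
# C controls how strongly margin violations are penalized.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)

print("Held-out accuracy:", clf.score(X_test, y_test))
print("Hyperplane coefficients:", clf.coef_[0])   # normal vector of the separating hyperplane
```

For classes that are not linearly separable, the same margin principle applies in a transformed feature space by swapping the linear kernel for, say, an RBF kernel.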

The post What Is SVM Classification Analysis and How Can It Benefit Business Analytics? appeared first on DATAVERSITY.


Read More
Author:

Kartik Patel

Customer Data’s impact on Digital Transformation

When you hear the words “digital transformation”, the first thought that comes to mind might be the shift from a traditional manual office environment into one that leverages all things digital.

That perspective is valid to some extent. Many companies and businesses shift from traditional physical business practices to more modern digital ones, not just to improve operational efficiency but also to accentuate their digital presence.

They might do this either by creating a mobile app, revamping or launching their website to support e-commerce or even by ramping up their social media presence. All of these decisions and behaviours fall under the umbrella of digital transformation initiatives.

Continue reading at Pretectum.com

Email AI and Workflow Automation Saves Time and Money for Health Care


The current state of the labor market is imposing obstacles for employers across many industries trying to fill open positions – health care chief among them. The global shortage of health care workers has forced practitioners into undertaking additional duties such as manual data entry alongside their general patient care duties. When practitioners shift their time […]

The post Email AI and Workflow Automation Saves Time and Money for Health Care appeared first on DATAVERSITY.


Read More
Author:

Hoala Greevy

Net Promoter 3.0

As a consumer, you’ve probably encountered this sort of “how likely are you to recommend us?” question dozens of times—after an online purchase, at the end of a customer service interaction, or even after a hospital stay.

If you work at one of the thousands of companies that ask this question of their customers, you’re familiar with the Net Promoter System (NPS), which Reichheld invented and first wrote about in HBR almost 20 years ago. (See “The One Number You Need to Grow,” December 2003.)

NPS has spread rapidly around the world. It has become the predominant customer success framework—used today by two-thirds of the Fortune 1000. Why has it been embraced so enthusiastically?

Because it solves a vital challenge that our financial systems fail to address.

Financials can easily tell us when we have extracted $1 million from our customers’ wallets, but they can’t tell us when our work has improved customers’ lives.

Read more

The Future of Quantum Computing


Superposition, entanglement, and interference. Join us on this episode as we’re learning more about quantum computing and what the future holds for us. This episode’s guest is Sahar Ben Rached. She is a Research Master’s candidate in Nanotechnology at the Faculty of Sciences of Tunis. She is currently a Quantum Computing research intern at the IPE, […]

The post The Future of Quantum Computing appeared first on LightsOnData.


Read More
Author:

George Firican
Bloor Spotlight Highlights How Actian’s Ingres NeXt Strategy Avoids Common Modernization Pitfalls

Digital transformation requires use of the latest technologies. However, as you probably already know, modernizing a mission-critical database and the applications that interact with it can be risky and expensive, often turning into a long disruptive journey. But I have good news! According to a recent Bloor Spotlight report, Actian’s Ingres NeXt strategy for modernizing Ingres and OpenROAD applications either avoids or proactively addresses these potential pain points.

Bloor Senior Analyst Daniel Howard comments:

“Ingres NeXt is worth paying attention to because it acknowledges both the massive need and desire for digital transformation and modernization as well as the difficulties and shortcomings of conventional approaches to them, then takes steps to provide the former while mitigating the latter.”

Let’s look at the top four obstacles that stand in the way of modernization.

It’s Risky

Less than half of modernization projects are successful. Complex dependencies among databases, applications, operating systems, hardware, data sources, and other structures increase the likelihood that something will go wrong. In addition, organizations are likely to make poor decisions at some point since there are few modernization best practices to guide the way.

It’s Expensive

Modernization typically requires Capital Expenditure (CapEx) justification. Although modernization can potentially save money and increase revenue in the long run, it can be difficult to prove that this will significantly outweigh the costs of maintaining your legacy systems over time. It can also be challenging to get a modernization initiative approved as part of an innovation budget. Innovation budgets are often quite small. According to Deloitte’s analysis, the average IT department invests more than half of its technology budget on maintaining business operations and only 19% on building innovative new capabilities.

It’s a Long Journey

Modernization can involve replacing thousands of hours’ worth of custom-developed business logic. Code may be stable, but it is perceived as brittle if it cannot be changed without great pain. Missing documentation, third-party applications, and libraries that are often no longer available can add time and complexity to a modernization project. Plus, many developers are simply unaware of conversion tools for updating “green screen” ABF applications and creating web and mobile versions.

It’s Disruptive

Mission-critical databases and applications require near 100% availability, so modernization requires careful planning and execution. Plus, technical staff and business users will need to be retrained and upskilled to make the most of new technologies.

How exactly does Ingres NeXt avoid or address these pain points?

Read the Bloor Spotlight on the Ingres NeXt Database and Application Modernization program for the answer to that question. The report discusses how automated migration utilities, asset reuse, and a high degree of flexibility and customization—among other things—result in a solution that can streamline your organization’s path to a modern data infrastructure.

The post Bloor Spotlight Highlights How Actian’s Ingres NeXt Strategy Avoids Common Modernization Pitfalls appeared first on Actian.


Read More
Author:

Teresa Wingfield
The complete guide to data governance roles and responsibilities


If you’re at the beginning of your data governance journey, one of the very first steps you’ll need to take is to identify who in your organization will be part of the data governance team. Appointing the wrong people to key roles can cause the wheels to come off any well thought out initiative pretty quickly, […]

The post The complete guide to data governance roles and responsibilities appeared first on LightsOnData.


Read More
Author:

George Firican
Data Literacy: A New Requirement for Consumers and Businesses


Did you know that only one-third of us can confidently understand, analyze, and argue with data? That’s the essential question posited by the Data Literacy Project, an organization that wants to “ignite discussion and develop the tools we need to shape a successful, data-literate society.”  Achieving consumer Data Literacy at a mass scale is an ambitious […]

The post Data Literacy: A New Requirement for Consumers and Businesses appeared first on DATAVERSITY.


Read More
Author:

Scott Zoldi

Approaching Process Automation Using Virtualization Technology


Process control practitioners frequently use virtual machines (VMs) when deploying and managing process automation applications. However, developments in operational technologies have created new and better ways to streamline management on cloud data centers.  Orchestration and containerization – next-generation virtualization concepts – are transforming automation and freeing users from the frustrating and time-consuming process of wholesale digital […]

The post Approaching Process Automation Using Virtualization Technology appeared first on DATAVERSITY.


Read More
Author:

Nahla Davies

Scaling Airflow – Astronomer Vs Cloud Composer Vs Managed Workflows For Apache Airflow


Over the last 3 months, I have taken on two different migrations that involved moving companies from manually managing Airflow VMs to using Cloud Composer and MWAA (Managed Workflows For Apache Airflow). These are two great options when it comes to starting your first Airflow project. They help reduce a lot of issues…
Read more

The post Scaling Airflow – Astronomer Vs Cloud Composer Vs Managed Workflows For Apache Airflow appeared first on Seattle Data Guy.


Read More
Author: research@theseattledataguy.com

Why Adopting the Right Unstructured Data Management Strategy Is Critical


While many sectors have been under severe economic pressure in the past couple of years, the post-pandemic economy is offering significant opportunities for growth. However, many businesses must also currently contend with record domestic and global labor shortages, supply chain problems, and other challenging external constraints. As a result, enterprises everywhere are under intense pressure […]

The post Why Adopting the Right Unstructured Data Management Strategy Is Critical appeared first on DATAVERSITY.


Read More
Author:

Michael Jack
