Zen Edge Database and Ado.net on Raspberry Pi

Do you have a data-centric Windows application you want to run at the Edge? If so, this article demonstrates an easy and affordable way to accomplish this by using the Zen Enterprise Database through ADO.NET on a Raspberry Pi. The Raspberry Pi features a 64-bit ARM processor, can accommodate several operating systems, and costs around $50 (USD).

These instructions use Windows 11 for ARM64 installed on a Raspberry Pi 4 with 8 GB RAM. (You could consider using Windows 10, or another ARM64-based board, but you would first need to confirm that Microsoft supports your configuration.)

The steps and results are as follows.

  • Install Windows 11 for ARM64 on the Raspberry Pi using the Windows 11 ARM64 installer; Microsoft’s built-in emulation allows the OS to run x86 and x64 applications.
  • After the installer finishes, the Windows 11 directory structure should look like the figure below:

  • The installer creates Arm, x86, and x64 directories for Windows emulation.
  • Next, run a .NET Framework application using the Zen ADO.NET provider on Windows 11 for ARM64 on the Raspberry Pi.

Once the framework has been established, create an ADO.NET application using Visual Studio 2019 on a Windows platform where Zen v14 is installed and running.

To build the simple application, use a C# Windows Forms application, as seen in the following diagram.

Name and configure the project and point it to a location on the local drive (next diagram).

Create a form, add two command buttons and two text boxes, name the buttons “Execute” and “Clear,” and add a DataGridView as follows.

Add Pervasive.Data.SqlClient.dll under the project solution references by selecting the provider from the C:\Program Files (x86)\Actian\Zen\bin\ADONET4.4 folder. Add a “using” clause in the program code:

using Pervasive.Data.SqlClient;.

Add the following code under the “Execute” button.
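The code screenshot from the original post is not reproduced here, so the following is a minimal sketch of what the “Execute” handler might look like. The control names (txtConnection, txtSql, dataGridView1) are assumptions to adjust to your own form; PsqlConnection and PsqlDataAdapter are the connection and adapter classes in the Pervasive.Data.SqlClient namespace.

```csharp
// Sketch only: assumes two text boxes (txtConnection, txtSql),
// a DataGridView (dataGridView1), and the Zen ADO.NET provider.
// Requires: using System.Data; using Pervasive.Data.SqlClient;
private void btnExecute_Click(object sender, EventArgs e)
{
    try
    {
        using (var conn = new PsqlConnection(txtConnection.Text))
        {
            conn.Open();
            // Fill a DataTable from the SQL typed into the second text box.
            var adapter = new PsqlDataAdapter(txtSql.Text, conn);
            var table = new DataTable();
            adapter.Fill(table);
            dataGridView1.DataSource = table;
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message, "Query failed");
    }
}
```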

Add the following code under the “Clear” button.
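This screenshot is also omitted from the scrape; a “Clear” handler as simple as the following would do, again assuming hypothetical control names (txtSql, dataGridView1) that you should match to your own form.

```csharp
// Sketch only: resets the grid and the SQL text box.
private void btnClear_Click(object sender, EventArgs e)
{
    dataGridView1.DataSource = null;
    txtSql.Clear();
}
```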

Then, add the connection information and SQL statement to the text boxes added in the previous steps as follows.

Zen Edge

Now the project is ready to compile, as seen below.

Use “localhost” in the connection string to connect to the local system where the Zen engine is running. This example selects data from the “class” table in the Demodata database.
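As an illustration, values of the following shape could be entered in the two text boxes. The exact connection-string keys vary by provider version, so treat these as assumptions and verify them against the Zen ADO.NET provider documentation.

```
Connection: Server DSN=DEMODATA;Host=localhost;User ID=;Password=;
SQL:        SELECT * FROM class
```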

Selecting “Execute” will then return the data in the grid as follows.

Now the application is ready to be deployed on the Raspberry Pi. To do so, copy “SelectData.exe” from the C:\test\SelectData\SelectData\bin\Debug folder, along with the Zen ADO.NET provider “Pervasive.Data.SqlClient.dll”, to a folder on Windows 11 for ARM64 on the Raspberry Pi.

Next, register the Zen ADO.NET provider in the GAC using gacutil as follows.

gacutil /f /i <dir>\Pervasive.Data.SqlClient.dll

Zen Edge Database

Run the SelectData app and connect to a remote server where the Zen engine is running, as a client-server application.

Change the server name or IP address in the connection string to your server where the Zen V14 or V15 engine is running.

Now the Windows application is running in client-server mode using the Zen ADO.NET provider on a Raspberry Pi with Windows 11 for ARM64 installed.

And that’s it! Following these instructions, you can build and deploy a data-centric Windows 11 application on a Raspberry Pi ARM64. This or a similar application can run as a client or server for upstream or downstream data clients such as sensors or other devices that generate or require data from an edge database. Zen Enterprise uses standard SQL queries to create and manage data tables, and the same application and database will run on your Microsoft Windows-based (or Linux) laptops, desktops, or in the Cloud. For a quick tutorial on the broad applicability of Zen, watch this video.


The post Zen Edge Database and Ado.net on Raspberry Pi appeared first on Actian.


Read More
Author: Johnson Varughese

Move Over, Backup: It’s Time to Talk Data Resiliency


Your data is under attack more than ever before. The onslaught of cyber attacks around the world is unrelenting, and organizations of all sizes have found themselves on the other end of the ransom demand. As hard as it is to imagine, organizations like the Cybersecurity and Infrastructure Security Agency are warning us that things could get […]

The post Move Over, Backup: It’s Time to Talk Data Resiliency appeared first on DATAVERSITY.


Read More
Author: W. Curtis Preston

The IRS Embraces Big Data to Fight Tax Fraud


After years of hiding in plain sight, cyber entrepreneurs and investors have finally caught the attention of the IRS. This is due mainly to the rise of non-fungible tokens (NFTs) and the crypto bubble. In addition to tech millionaires secreting their wealth in cryptocurrencies and digital banks, increased incidences of identity theft and refund fraud […]

The post The IRS Embraces Big Data to Fight Tax Fraud appeared first on DATAVERSITY.


Read More
Author: Bernard Brode

The sin of planned obsolescence


Progress is almost universally agreed to be necessary, but progress does not mean that what is old is no good or useless, unless obsolescence is inherent to the design. We need to recognize the waste that progress can sometimes bring and try to minimize it at every turn.

I have worked with a number of product managers of digital and physical products over the years and one of the topics that doesn’t really come up is the question of obsolescence (planned or otherwise).

Planned obsolescence conceptually emerged almost a century ago with Bernard London’s pamphlet Ending the Depression Through Planned Obsolescence. His plan would have the government impose a legal obsolescence on personal-use items, to stimulate and perpetuate purchasing.

The concept was popularized in the 1950s by American industrial designer Clifford Brooks Stevens who used the term as the title of his talk at an advertising conference in Minneapolis in 1954.

By his definition, planned obsolescence was “Instilling in the buyer the desire to own something a little newer, a little better, a little sooner than is necessary.” Stevens saw this as a cornerstone to product evolution.

What is often recognized is the importance of customer retention, optimality in the customer experience, product reliability, fitness for purpose, and so on. How often does product durability get factored in? The answer seems to depend heavily on the industry segment and on the perceptions that exist around how long is reasonable before the style, performance, and capabilities of the product wane.

There’s an industry segment known as FMCG, Fast Moving Consumer Goods; it’s a popular acronym, taught widely at business school, but what does it really mean in our modern world? A Wikipedia definition frames it as “also known as consumer packaged goods (CPG)… products that are sold quickly and at a relatively low cost. Examples include non-durable household goods such as packaged foods, beverages, toiletries, candies, cosmetics, over-the-counter drugs, dry goods, and other consumables.” So, we get it: they’re the stuff we consume regularly and don’t expect to have sitting on the shelf or in the cupboard for a long time.

There’s another class of goods known as “consumer durables,” another business school term; the name is a clue to the expectation around these goods: they should last. Again, Wikipedia frames it nicely as “a category of consumer goods that do not wear out quickly, and therefore do not have to be purchased frequently. They are a part of core retail sales data and are known as “durable goods” because they tend to last for at least three years.” But there’s a problem here: who decided that three years was the minimum life span for these things?

It seems the question of product durability, particularly for physical products, has been a concern for a very long time; so much so that various nations have invested in studies and programs to try to drive product durability standards and ensure that manufacturers produce products that last.

Some studies have suggested that more than three quarters of consumers believe that products do not last as long now as in the past. The shape of consumer sentiment is clearly negative. It is also clear that, in our world of finite resources and a growing waste problem, there are persuasive arguments in favour of extended product life expectancy. These range from the direct savings of less frequent replacement to greater reliability and fewer repair costs. Extending product life is ultimately of benefit to society as a whole: save material, save energy, and save the environment.

Manufacturers will cite problems and limitations with this approach; for example, that increased life for certain products can be bought only through significantly increased production costs, such as using more costly materials like metal instead of plastic. They would also say that more expensive, more energy-intensive manufacturing methods would be required. If longer-lasting products are more expensive, the direct savings to consumers may never materialize.

How we calculate the cost

In our minds, if we buy something today with a forecast life of, say, 5 years, we work out that the cost is roughly a fifth of the total every year. The higher the price and the lower the durability, naturally, the higher the holding cost. This is effectively how we see the discount, or lack of it, in our minds. Buying a high-end mobile phone for $1,000 today and expecting it to last 5 years means $200 a year. If you tell me that it will only last two years, then $500 a year might be a price point that discourages me.

Obsolescence is fundamental to the experience of modernity, not simply one dimension of an economic system. 

Cultures of Obsolescence: History, Materiality, and the Digital Age – B. Tischleder, S. Wasserman

The main argument against durability, as presented by manufacturers, product designers, and managers, is that if a product lasts longer, its turnover rate will be slow, and this will inhibit the ability to fund new product design and related innovation, because capital will be tied up in inventory and old technology. This is also one of the catalysts behind just-in-time (JIT) manufacturing and other innovations around stock minimization.

With the burgeoning focus on more efficient transportation like EVs, and on water- and energy-efficient appliances with elevated levels of safety, you could say that we should not be trying to perpetuate existing products in their current form but should always be looking forward and improving. Besides, if you’re not continuously producing, then economic growth in industry slows, with the potential for unemployment or at the very least under-employment.

So, we’ll take longer product life as long as we don’t have to suffer unacceptable side effects. Society will opt for more durability in products if the negative impact is reasonably benign.

Planning for durability

Product managers and designers should consider an approach to product durability that carries at least two important aspects:

  • identify if a longer product life is a desirable goal
  • identify product life durability strategies that will have minimal adverse effects

From a design and manufacture standpoint this means focusing on just how much value is added through the raw materials to finished product production process, the value of the materials and energy involved in the production; the effects on the environment; and the actual potential impact of increased life.

Products that have a high level of value added, whether through labor, energy, or materials, are the obvious candidates for the greatest durability because of the value invested in them. Extending product life complements recycling as a conservation strategy.

The high-tech industry has only recently started to point out the social and scarcity costs of some of the rare metals associated with production, and while there is some acknowledgment of the energy resources depleted in production, these clearly don’t appear to be strong enough drivers to slow the release of ever faster, newer, more feature-rich high-tech products.

Additional burdens like toxic waste or aspects inherent to the product that need proper or specialized disposal might also be considered a part of this.

I guess the point of all this is to double-down on recognizing the overall sensitivity that we need to have, as consumers, as technologists and as technology advocates for the most appropriate orientation to product conception, design and production.

As consumers we want the latest, most powerful camera phone, for example. It won’t change the experiences we have already had or the moments in life already captured but it does have other costs. The existing camera phone we have, may have its limitations in contrast to the new ones, but are they really that materially significant? Perhaps not. More importantly, if I have a social conscience and I want to keep using my existing phone until it basically completely stops working, fails to hold a charge, gets lost or damaged irreparably, surely that is the point at which I should replace it?

I shouldn’t have to replace it because the piece of software or app that I use almost every day decides it is time to update itself and then declares that it will not work on the version of the operating system my phone has – that, to my mind, is unacceptable and tantamount to planned obsolescence.

This is also myopic thinking on the part of the developers, designers, architects, and product managers responsible for those lines of digital products that they’ve decided should now run exclusively on new hardware. Sure, backwards compatibility is hard, but failing to consider the broader implications for existing customers, particularly with software and digital products, is bad product management. Even if you back your position up with empirical data, be sure that you’re looking at the whole ecosystem before you decide to brick the experience.


Read More
Author: Jewel Tan

Data Platforms Vs Data Warehouses Vs Data Virtualization And Federated SQL Engine – How To Store Your Data


Photo by Nana Smirnova on Unsplash There is a lot of discussion on how you should implement your data stacks today. Should you bundle or unbundle, should you use a traditional data warehouse, lakes, lake houses, or mesh? Or a hybrid of all of them? With all these different ways you can put together your…
Read more

The post Data Platforms Vs Data Warehouses Vs Data Virtualization And Federated SQL Engine – How To Store Your Data appeared first on Seattle Data Guy.


Read More
Author: research@theseattledataguy.com

Top 10 Essentials for Modern Data Integration


Data integration challenges are becoming more difficult as the volume of data available to large organizations continues to increase. Business leaders clearly understand that their data is of critical value but the volume, velocity, and variety of data available today is daunting. Faced with these challenges, companies are looking for solutions with a scalable, high-performing […]

The post Top 10 Essentials for Modern Data Integration appeared first on DATAVERSITY.


Read More
Author: Erez Alsheich

Improving Team Collaboration for a Stronger Workplace


by guest contributor: Lucy Reed, founder of GigMine

Is employee collaboration the secret sauce to business success? That does seem to be the case if you look at the statistics: not only do collaborative companies have five times better performance rates, but they’re 30 percent more innovative and significantly less likely to have burnt-out employees. Collaboration between your employees and teams creates a healthier work environment, essentially, and boosts work productivity.

Eight ways to build collaborative teams offers some suggestions on how you – as a manager, business owner, or other leader – could increase employee collaboration in your company:   

Promote open communication and idea-sharing

Transparent communication and idea-sharing are two foundational pillars of collaboration. If you’d like your employees to work well together, they need to be able to communicate and share ideas without fear of reprisal. You can encourage open communication by establishing trust between your employees, managers, and other personnel. When talking to them, speak clearly and concisely, don’t assume employees will see things the same way you do, and hear them out. Open communication is a skill that may need building up. 

Use tech and tools that can aid collaboration

Face-to-face collaboration can be a challenge in an office space due to space and time constraints. That’s where technology comes in. With the right tools, your employees will be able to communicate with each other without leaving their desks. Some organizations create in-house collaborative software for this purpose. Others use existing commercial tools like Slack (communication), Todoist (to-do list), and Asana (task manager). PCMag offers a good list.

Encourage feedback

Have you heard of the “feedback culture”? It emphasizes everyone in the organization – employees, managers, and executives – offering feedback to each other periodically. This is a departure from the traditional formula, in which the only real feedback exchanged is the quarterly or annual performance review. A feedback culture is about a faster, more direct exchange of information: less hierarchical and less reliant on formal reviews. It can be difficult to implement and require much adjustment, but it can transform your organization for the better.

Encourage employees to get to know each other

How well do your employees know each other? It’s easier to collaborate with someone you know well, as opposed to someone you know on a professional level only. Often, especially in bigger “siloed” companies, most employees don’t know each other well enough to approach each other, let alone have a conversation. It makes collaboration between teams and departments impossible. As such, if you don’t already, encourage team bonding through team-building activities and games. Informal events can also bring everyone closer together. 

Share examples of the collaboration you’d like to see

It’s not enough to share your intention of creating a collaboration culture in your workplace – you should, ideally, back that up with real-world examples for your employees to learn from and emulate. Without examples, your employees may not understand what’s expected of them. Humanyze offers practical examples that are simple and easy to implement like document sharing, video conferencing, and peer training.

Be collaborative yourself

As a leader or mentor figure in your organization, your employees will look up to you for cues on how to behave and operate. By being collaborative yourself, you give them permission and encouragement to be collaborative. Make it a point to talk to your employees, share and receive ideas, give them feedback, work with them on tasks, and mentor them.

Conclusion

Forcing collaboration is not a good idea; strike a balance between allowing people to be individuals and encouraging them to work together. Collaboration won’t happen overnight: it takes time for people to get to know each other, become comfortable, and work together. Persist and you will succeed in creating a wonderful, productive culture that will take your company to new heights.


Read More
Author: Jewel Tan

5 Ways Data Leaders Can Improve Data Reliability


Businesses today collect and store an astonishing amount of data. According to estimates from IDC, 163 zettabytes of data will have been created worldwide by 2025. However, this data is not always useful to business leaders until it is organized to be of higher quality and reliability. Despite its importance to effective data analysis, most business leaders […]

The post 5 Ways Data Leaders Can Improve Data Reliability appeared first on DATAVERSITY.


Read More
Author: Loretta Jones

The Application of Synthetic Data Is Inevitable


Synthetic data is at an inflection point of utilization. The emerging technology is just beginning its adoption cycle and value to the enterprise, but change is on the horizon. According to my company’s recent survey, industry leaders believe that, on average, 59% of their industry will utilize synthetic data in five years, either independently or in combination with […]

The post The Application of Synthetic Data Is Inevitable appeared first on DATAVERSITY.


Read More
Author: Yashar Behzadi

What is a CMDM platform?

Answering the question of “what is a Customer MDM?”

Across the globe, in all industry segments, data drives business processes and systems.

The overall organization, its employees, and its customers benefit when this data is shared and accessible across all business units: a unified single point of access to the same customer lists and data used to run the business. On the whole, business data users within the organization generally assume that the customer data they have access to is consistent across the whole business, until they identify anomalies.

The reality, though, is that customer data evolves in a more organic and somewhat haphazard way than data management professionals would prefer. This is especially true in larger organizations. Mergers and acquisitions, projects and initiatives, and other general business activities often result in multiple systems that perform a similar or identical function; for a variety of reasons, these redundancies must coexist.

These conditions inevitably lead to inconsistencies in the data structures and data values between the various systems, and this variance increases data management costs and organizational risk.

The general consensus is that both data management and organizational costs and risk can be reduced through the dual strategies of Master Data Management and Reference Data Management.

Master Data Management is about the management of data that relates to organizational entities. These organizational entities are objects like logical financial structures, assets, locations, products, customers, suppliers, and employees.

These same structures provide the necessary context for smooth business transactions and for transactional and business activity analysis.

Within them are entities: real-world persons, organizations, places, and things represented as virtual objects. These entities are represented by entity instances; in digital form they are effectively digital entities, but really they are data records. Master Data should represent the authoritative, most accurate data available about key business entities.

When managed well, Master Data entities are trustworthy and can be used confidently by employees in partner engagement. Surrounding these entities are business rules that dictate the formats, allowable ranges, and characteristics that appropriately frame the master data values held.

Common organizational master data may include data that relates to partners that are made up of private individuals, organizations, and their employees. That same data may describe their role, their relationships, and essential attributes that might be useful for engaging with them as an organization.

Typical people-based master data entities are objects like customer, citizen, patient, vendor, supplier, agent, business partner, competitor, employee, or student.

Seeking the truth

When multiple repositories of these entities exist, there are potentially different versions of “the truth,” and it becomes difficult to work out which one is correct and whether, in fact, two or more entities refer to the same thing.

To work that out, one must understand the origins of the data. A defined System of Record (SoR) is often considered an authoritative system where that data is created or captured and maintained in a disciplined and controlled way.

The capture and maintenance are undertaken with defined rules and expected outcomes. Historically, this means the Point of Sale system supports selling activities, ERP supports make-to-sell or buy-to-sell, and CRM supports selling, servicing, and supporting customers.

For any of these systems to be deemed trustworthy sources, they need to be generally recognized as holding “the best version of the truth” in relation to records they hold, based on automated and manual data curation. That trusted source is sometimes also referred to as a Single View. Within that system, the entities are often referred to as Golden Records.

Systems of Reference similarly, are authoritative systems where reliable data is held and maintained to support proper transaction processing and analysis. Those systems of reference may or may not originate their own data.

Historically, Master Data Management (MDM) applications, Data Sharing Hubs, and Data Warehouses have often served as systems of reference.

The challenge is that different systems have different purposes, and no two systems describing the same entity need to describe exactly the same characteristics of that entity. The question then becomes: can any of these systems truly be “the single source of truth”?

Master data management efforts often pursue the consolidation of entities from the many sources that create and contain them, and then formulate a composite record that may be incomplete and only a partially accurate representation of all the entities held. For some users, that can mean less faith in the “golden records” the system presents. When this is the situation, the positioning may switch from “Single Source” to “Trusted Source,” suggesting that measures are in place to drive consistency, accuracy, and completeness in the entity records with minimal ambiguity and contention.

Gartner defines Master Data Management as “a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency, and accountability of the enterprise’s official shared Master Data assets.”

MDM is therefore a discipline made up of people, processes, and technology. There is often no single application solution, despite the fact that vendors use the acronym to describe products, systems, and platforms that manage master data. That does not mean those products effectively manage the master data, simply that they have characteristics that, when used correctly, can assist in proper master data management.

As you can imagine, then, when something is described as a Customer MDM, it is a practice that relates to the management of digital customer entities. That practice could be paper-based too, but we assume that at scale you are more interested in digital record-keeping.

CMDM systems, then, are the people, processes, and technology that support the customer master data management practice. The CMDM platform concept is therefore a composite software application, on-premises or in the cloud, that provides the metadata and data needed to manage customer entities.

CMDM platforms and related technologies for Customer Master Data Management are offered by many of the leading global software brands as part of multidomain MDM (SAP, Oracle, IBM, and Informatica), but there are also specialist offerings you might not have heard of, such as Ataccama, Pretectum, Profisee, Reltio, Riversand, Semarchy, and Stibo Systems.

The original version of this article was posted as What is a CMDM platform?

Types of Customer Master Data Management


Generally speaking, Master Data Management is thought of as being leveraged in four different ways in an organization. These approaches are, for the most part, the same whether you are managing customer, vendor, employee, product, or any other master data object.

Consider what you hope to achieve from implementing MDM. Pretectum observes that implementing customer MDM can be considered, on one hand, to help meet compliance and regulatory obligations. It can be considered as a mechanism that reduces risk and ensures business process optimization too. For some, it is simply recognized as an essential piece of the puzzle in having a better view and understanding of the customer. In reality, though, MDM and customer MDM, in particular, can serve to address all of these needs and potentially more.

Pretectum’s view is that one absolutely critical aspect of all of this is the establishment of an authoritative, consistent, and correct understanding of who and what a given customer is. The benefits of that managed, controlled understanding under a CMDM range from cost and operational savings in maintaining the customer master, to reduced loss and risk through better collections and fraud avoidance across the different buyer and prospect audiences, to increased confidence and trust among your employees and associates.

Traditional and Legacy

Typically an organization starts with a traditional approach to MDM: I need to sell something to customers, so I need to establish them so that I can provide a quotation, deliver to them, or bill them. For that, I need enough information to complete the task at hand. This type of MDM setup is focused purely on supporting the transaction and often starts in the record-keeping system used to perform the transactions.

That might be the Point of Sale (POS), the Customer Relationship Management (CRM), or Enterprise Resource Planning (ERP) system. In a medical setting, that might be the Patient Administration and Billing System (PAS) and in Banking it might be in the Retail, Credit, or Lending system. These types of systems typically have a way of defining all the attributes of the customer based on a definition in the system that prescribes what needs to be provided to perform the transaction.

Control and management of the data typically happen at the record level, with varying degrees of governance and control. The challenge with this approach is that the master record is often limited in its usefulness to other systems, and sometimes is not even shared with or made available to them. So instead of a customer master, it is simply a transactional master, which may be limiting for parts of the business whose needs the master is not elaborate enough to serve. Control here sits at the application level, dictating all capability within that application framework. Redundant listings are therefore inevitable, with the risk of wastage and misalignment between systems and divisions.

For most, this isn’t really considered MDM at all, though within a given System of Record (SoR) the customer entity may be considered the golden master if all other systems know about that system and what it contains. Your SAP ERP or CRM system, for example, might be nominated as the customer master.

Coexistence

The coexistence approach to master data management sees data from the transactional masters across several systems consolidated into a single view of the customer master. This is typically updated post hoc and periodically synchronized, reflecting all the external keys. The content of the master from the various systems may be inconsistent and, more importantly, no single entry is necessarily considered more correct or appropriate than any other.

Conflicts between records can and do occur, and because authoring happens in the different systems, the end result may be useful only for generalized reference and may not be considered a true or absolute authority. Further, if the redundancies between entries are not harmonized, there may well be duplicated and conflicting records.

The harmonized records become known as the “golden nominal” of the customer. They are held centrally but updated, in principle, by regular feeds from the surrounding systems of record; the problem remains deciding which updates are the most correct. This approach best suits organizations where no single party truly owns the whole description of the customer: mirroring the data is the main objective, with responsibility for what defines the golden nominal shared by all. It can be a risky approach if control of the master is deemed essential. The Pretectum CMDM supports this approach as one way to maintain the customer “golden nominal,” but keep in mind that the optimal configuration has the Pretectum C-MDM providing suggested updates back to the satellite systems.
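The per-attribute merge behind a coexistence “golden nominal” can be sketched in a few lines. Everything below is illustrative: the system names, keys, fields, and the “most recent non-empty value wins” survivorship rule are assumptions for the sketch, not Pretectum behavior.

```python
from datetime import date

# Candidate records for one real-world customer, as held by three
# hypothetical systems of record (names, keys, and fields are invented).
records = [
    {"system": "CRM", "key": "CRM-001", "name": "Alice Smith",
     "email": "a.smith@example.com", "phone": None, "updated": date(2022, 3, 1)},
    {"system": "ERP", "key": "ERP-789", "name": "A. Smith",
     "email": None, "phone": "+1-555-0100", "updated": date(2022, 1, 15)},
    {"system": "POS", "key": "POS-456", "name": "Alice Smith",
     "email": "alice@example.com", "phone": "+1-555-0100", "updated": date(2022, 4, 2)},
]

def golden_nominal(candidates):
    """Merge attribute by attribute, keeping the most recently updated
    non-empty value -- one possible survivorship rule among many."""
    merged = {"external_keys": {r["system"]: r["key"] for r in candidates}}
    for field in ("name", "email", "phone"):
        dated = [(r["updated"], r[field]) for r in candidates if r[field]]
        merged[field] = max(dated)[1] if dated else None
    return merged

master = golden_nominal(records)
# The POS record is newest, so its non-empty values survive the merge.
```

Real survivorship policies are usually richer (source-system trust rankings, per-attribute rules), but the shape of the problem is the same: pick one value per attribute while retaining every contributing system's external key.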

Registry

A registry master data management approach primarily serves as a central index linking master data: the master records are authored across the different systems, but a common linkage is referenced by each of them as an external key. As with the transactional and coexistence approaches, authoring remains in the satellite systems; at best, the registry serves as a skeleton containing the essential attributes of the master that are common across systems.

Often the data is used simply for interrogation, lookup, or reference. Because you are working with a skeleton of a definition, control over the customer master is relatively loose; this point of reference provides only a sliver of the customer's definition. The external keys, where held, become the most useful part.

The skeleton becomes an authority of sorts, but only if the peripheral systems reference it as an external key for cross-referencing and for minimizing duplication. This approach is fast to implement and provides some degree of central and distributed master data governance, provided all participants agree the skeleton can operate as a central reference point or authority. The Pretectum CMDM can be used in this way, never touching your source systems, but it may never fully operate as the latest and greatest version of your customer master.
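A registry-style index can be pictured as a thin structure keyed by master ID, holding little more than the external keys of each contributing system plus a few skeleton attributes. All names and keys below are hypothetical:

```python
# A registry master: a thin "skeleton" of common attributes plus the
# external keys of each contributing system. Authoring stays remote.
registry = {}

def register(master_id, system, external_key, name=None):
    """Link a satellite system's record to a central master ID."""
    entry = registry.setdefault(master_id, {"name": None, "external_keys": {}})
    entry["external_keys"][system] = external_key
    if name and not entry["name"]:
        entry["name"] = name  # first-seen value only; the registry never re-authors

register("CUST-0001", "CRM", "CRM-001", name="Alice Smith")
register("CUST-0001", "ERP", "ERP-789")

def lookup(system, external_key):
    """Cross-reference: which master ID does a satellite key map to?"""
    for master_id, entry in registry.items():
        if entry["external_keys"].get(system) == external_key:
            return master_id
    return None
```

The value of the registry is exactly this cross-referencing: given any satellite system's key, every other system can find the shared master ID and, through it, the other systems' keys.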

Consolidated

This style is used primarily to support business intelligence (BI) or data warehousing initiatives. It is generally referred to as downstream MDM, in that MDM is applied downstream of the operational systems where master data is originally created.

Here the MDM hub is the system of reference for all reporting, search, and reference purposes, and it likely stores all the master data needed without connecting to, or being connected with, the backend systems.

This consolidated customer master can operate in a complete vacuum because it is self-contained, simply taking feeds from the satellite systems – beneficiary applications beyond the CMDM enjoy the benefits of the collated records while sources maintain business as usual. The Pretectum CMDM can be used in this way.

Centralized Customer MDM supporting Transactional use

With the Pretectum CMDM, we believe the optimal way to approach MDM is in a centrally governed way, adopting a centralized customer MDM approach. Here the Customer MDM hub is both a system of reference and a system of entry: it can send and receive all necessary data updates to and from any and all backend systems, which we do by making use of APIs.

From within the centralized Pretectum CMDM, you have access to schema master data definitions, including data domains for reference and lookups, and to whatever subordinate datasets make full or partial use of those definitions. You can also create derivatives of existing data, with permission-based access throughout. Forms are available for manual data curation, and bulk replacement and append operations are also possible. Most importantly, everything can be done through a rich browser-based cloud UI or through API-based methods. All customer master data creation and curation are managed from the Pretectum CMDM hub; satellite systems outside it no longer create or amend the master but instead subscribe to syndicated data from the CMDM for any new records or updates.
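One way to picture a hub that is both system of entry and system of reference is a small publish/subscribe sketch: the hub is the only place records are authored, and every change is syndicated outward to subscribers. The class and method names are invented for illustration and are not the Pretectum API:

```python
# Illustrative centralized-hub sketch: authoring happens only here;
# satellite systems subscribe and receive syndicated updates.
class CustomerHub:
    def __init__(self):
        self.masters = {}       # master_id -> attribute dict
        self.subscribers = []   # callbacks representing satellite systems

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def upsert(self, master_id, **attrs):
        """Create or amend a master record, then syndicate the change."""
        record = self.masters.setdefault(master_id, {})
        record.update(attrs)
        for notify in self.subscribers:
            notify(master_id, dict(record))  # each satellite gets a copy

received = []  # stands in for a downstream CRM consuming the feed
hub = CustomerHub()
hub.subscribe(lambda mid, rec: received.append((mid, rec)))
hub.upsert("CUST-0001", name="Alice Smith", email="alice@example.com")
```

In practice the syndication would run over APIs, queues, or webhooks rather than in-process callbacks, but the governance property is the same: satellites consume the master, they never author it.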

As suggested by many in the industry, the centralized MDM is the ideal implementation but also the most challenging to implement because it requires discipline and a change in the data ownership philosophy of a given organization. You might often see the implementation of a centralized Customer MDM as part of a digital transformation initiative.

Customer Data Collection & Syndication with the Pretectum C-MDM


Read More
Author: Uli Lokshin

Enterprise Architects Can Be Indispensable in the Boardroom


Enterprise architecture (EA) can be a confusing term. To supply a business executive with a working definition, you might list that EA comprises the business functions, capabilities, processes, roles, physical and organizational structure, data stores and flows, applications, platforms, hardware, and communication. You might throw in any number of impressive acronyms and abbreviations: TOGAF, EIM, […]

The post Enterprise Architects Can Be Indispensable in the Boardroom appeared first on DATAVERSITY.


Read More
Author: Wilko Visser

The Impact of Data Governance in Cybersecurity


Data is a critical asset that supports operations, drives decision-making, and gives businesses a competitive advantage in the marketplace. Unfortunately, today’s data can also be easily targeted by cybercriminals looking to access and tamper with sensitive information. This is why cybersecurity is increasingly becoming a top strategic priority for enterprises of all sizes. As concerns of cyberattacks […]

The post The Impact of Data Governance in Cybersecurity appeared first on DATAVERSITY.


Read More
Author: Y’vonne Sisco – Ormond

When Data and IT Infrastructure Are Overly Complex


As efforts to simplify IT infrastructure through automation are ongoing, the action to pinpoint where complexity resides is key. Data complexity not only complicates the user experience but also wreaks havoc in the backend for administrators and IT decision-makers.  As the world of data has grown exponentially and transcended the borders of enterprises, the management of data […]

The post When Data and IT Infrastructure Are Overly Complex appeared first on DATAVERSITY.


Read More
Author: Eric Herzog

Why Hiring Managers Should Hire Military Veterans for IT Positions


by guest contributor: Lucy Reed, founder of GigMine

The tech industry is experiencing a severe IT skills shortage, reports Information Week, with no end in sight. The lack of talent is concerning, to put it mildly: it's a struggle to prop up existing tech processes, let alone implement new ones. With circumstances being what they are, IT managers have to think outside the box, and often go above and beyond, to locate and secure quality talent.

Here’s a solution you may not have thought of: taking on military veterans. Not only do veterans make for skilled IT workers, but they’re also easy to find and hire. Below, we cover everything important about recruiting military veterans for IT roles:        

Military vets are well-matched to IT 

Veterans are well-suited to IT roles. You can slot them into most processes with some training, from customer care to cybersecurity. To further sweeten the deal, the government sometimes offers tax breaks and even money for expenses related to the training and onboarding of veterans.

Here are some other reasons to hire veterans:

  • Transferable skills: Veterans have both soft and hard skills, such as communication, organization, and planning. These skills transfer well to IT, which revolves around problem-solving.
  • Trainability: Veterans have to be adaptable to thrive and survive. They aren’t afraid of new information and are more than capable of learning new tricks.   
  • Responsibility: Military service, with lives on the line, tends to beat out rashness and carelessness. You can expect veterans to be dependable workers.
  • Respect for authority: Veterans are respectful of authority and trained to follow orders (unless an order is unconscionable, of course).
  • Leadership and teamwork: Veterans make for great team leaders, not to mention they understand the value of teamwork. They can become core members of your organization in no time.  
  • Loyalty: Provided you treat them well, there are no workers more loyal than veterans.
  • Work ethic and self-discipline: Last, but not least, veterans aren’t afraid of putting their nose to the grindstone. They have self-discipline in spades.

How to find and hire veterans

Finding veterans is easy enough thanks to the internet and the presence of several veteran-focused organizations and government initiatives. Here are some suggestions:

  • Advertise your open position: Advertising locally (papers, magazines, flyers, etc.) and online always works. It’s important to make your advertisements credible, accurate, and descriptive.  
  • Use the U.S. Department of Labor: The Veterans’ Employment and Training Service offers a collection of employer tools, resources, and programs.   
  • Marketplaces: There are some exclusive marketplaces (aka job boards) for ex-members of the military.
  • Check veteran organizations: A local veteran organization is likely to be able to put you in touch with a vet who needs a job.   

Supporting and retaining veteran workers

Veterans have a high turnover rate, says Korn Ferry – 43 percent of veterans leave their job within a year, and 80 percent before the end of their second. The primary reason for this trend is a lack of personal development and career growth. To prevent this from happening to your veteran workers, here’s what you can do:

  • Consider an onboarding program: Veteran-specific onboarding programs can ease them into their new position, providing much-needed structure, support, and guidance initially.
  • Consider their needs: Veterans require autonomy and growth – make sure their tasks provide this. Education programs, training, socialization, and progression opportunities can fulfill their expectations of personal development.
  • Lead (by example): Veterans will stick around and respect people who have qualities they admire – integrity, honesty, and dependability. Being a good leader and manager – someone who leads from the front – is a great way to win them over. 

Hiring a veteran can do wonders for your business. Not only can they reliably fill in technical roles, but they also benefit the company culture by adding integrity, discipline, and pride. As long as you keep their needs in mind, the ROI for hiring veterans will work in your favor.


Read More
Author: Jewel Tan

The 4 Levels of Analytics and Data Literacy


We are living in a data-driven world where data is produced everywhere. It is a currency that is not going away. However, data is just data if it just sits there. Nothing happens unless action is taken, and that's analytics. In today’s episode, we speak to Jordan Morrow, the godfather of Data literacy, advisory board member […]

The post The 4 Levels of Analytics and Data Literacy appeared first on LightsOnData.


Read More
Author: George Firican

2022: Becoming the Year of Containers, Kubernetes, and SDP Security


A growing number of companies are continuing to launch significant digital transformation (DX) projects in 2022, and rightly so, since this can enhance their IT and business capabilities while resulting in cost savings. This has led to a large increase in the use of containers. The StackRox State of Container and Kubernetes Security Report—Winter 2020 flagged […]

The post 2022: Becoming the Year of Containers, Kubernetes, and SDP Security appeared first on DATAVERSITY.


Read More
Author: Don Boxley

Why you should consider Customer Master Data important


It has been suggested that companies that do not understand the importance of customer data management are less likely to survive in a post-pandemic modern economy.

Your customer data could be your business’s most valuable asset, but there are a few things to consider if your intention is to create a truly data-focused organization.

Pretectum feels that data is foundational for building business information.

We see it as essential for creating customer knowledge, and ultimately the driving force behind correct tactical and strategic decisions and business actions.

If your customer data is relevant, complete, accurate, meaningful, and actionable, it becomes instrumental in driving the growth of the organization. If it has deficiencies, it can prove useless or even harmful, inhibiting not just growth but business sustainment.

Customer data management initiatives should be put into play to systematically create and maintain customer master data and increase organizational potential; these initiatives should focus on quality and control.

The most successful organizations manage their customer data cycle well by managing the way customer data is created, stored, maintained, used, and retired.

When customer data management is truly managed effectively, the customer data life cycle commences even before any customer data is acquired.

Data management is the function of planning, controlling, and delivering data effectively in any organization.

Data management includes practicing a discipline in the development, execution, and supervision of plans, programs, policies, and practices that protect, control, deliver and enhance the quality and value of data and information in the organization.

Why you should do it

Risk mitigation

Not only is there an expectation arising from privacy and compliance obligations, but there is also the question of business reputation. An organization that fails to maintain and assess its customer master data can be subject to non-compliance prosecution or personal litigation.

Security of customer master data is therefore very important, and proper data management helps ensure that data is never accessed in an inappropriate or unauthorized way and is protected inside the organization.

Data security is an essential part of data management and Pretectum takes pride in the fact that it uses the latest methods and techniques to ensure that customer data remains secure and protected but is still able to be used by the business as required. 

Making use of a modern cloud platform like the Pretectum C-MDM protects not only the integrity of the data but also provides assurance to employees and companies that data loss, data breaches, and data thefts are less likely.

Effective data quality

There should be little doubt that a structured and planned approach to customer data collection, curation, and review will lead to better data quality. Improvements to data management practice, however, need to be treated as progressive rather than instantaneous.

It is rarely possible to simply implement a software solution and expect there to be an immediate change in the status of your customer master data. There likely needs to be a rethink of all the participants in the data management process, an evaluation of roles and responsibilities, and the establishment of some data quality measures. This is typically covered by a digital transformation project but can also be driven by a data management organization that functions at a level of maturity despite perhaps making use of largely manual controls, processes, and methods.

After the implementation of a platform like the Pretectum C-MDM, your business will see improved data management which in turn will help in improving data quality and data access. For the business as a whole, this often translates into less friction in engaging with customers and improved decision-making.

Doing things right is doing things better…

Peter Drucker is quoted as having said, “Efficiency is doing better what is already being done.” At Pretectum we feel there must be a better way to manage your customer data: data that is managed properly, updated, enhanced, and made accessible will enhance organizational efficiency, while inaccurate, mismanaged data will waste precious time and resources.

Eliminate errors, eliminate waste…

Only through effective customer data management will you minimize the occurrence of errors and reduce the damage that bad master data can cause. Transcription, poor integration, and legacy methods used for capturing customer data introduce a greater likelihood of customer master data errors. With centralized customer master data management underpinned by data validation and data quality, your business has the best chance at creating and retaining the most valuable business data asset – your customer master.
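The point-of-entry validation described above can be sketched as a small rule table applied to each incoming record. The specific rules, field names, and the reference domain of country codes below are illustrative assumptions, not a prescribed rule set:

```python
import re

# A few illustrative validation rules of the kind a centralized customer
# master might enforce at the point of entry (not an exhaustive set).
RULES = {
    "name": lambda v: bool(v and v.strip()),
    "email": lambda v: bool(v and re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v)),
    "country": lambda v: v in {"US", "GB", "DE", "SG"},  # reference-domain lookup
}

def validate(record):
    """Return the list of fields that fail their rule (empty list = valid)."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]
```

Enforcing rules like these centrally, before a record ever enters the master, is what prevents the transcription and integration errors described above from ever becoming duplicated downstream.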

Contact Pretectum to learn more about how they can help.


Read More
Author: Jewel Tan

Empower Your Third Line of Defense for Effective Data Governance


Data Governance practitioners must incorporate all aspects that bind data to the organization. Internal audit, referred to as the third line of defense against risk, should actually be top of mind for implementing effective governance programs.  The “Three Lines of Defense” model is an industry-recognized approach to enterprise risk management. The ultimate goal is to […]

The post Empower Your Third Line of Defense for Effective Data Governance appeared first on DATAVERSITY.


Read More
Author: Steve Zagoudis

Evaluating Data Lakes vs. Data Warehouses


While data lakes and data warehouses are both important Data Management tools, they serve very different purposes. If you’re trying to determine whether you need a data lake, a data warehouse, or possibly even both, you’ll want to understand the functionality of each tool and their differences. This article will highlight the differences between each and how […]

The post Evaluating Data Lakes vs. Data Warehouses appeared first on DATAVERSITY.


Read More
Author: Heine Krog Iversen

3 Vital Concerns for Companies Running Hybrid-Cloud Environments


The benefits of the cloud – reduced capital expenditures, greater IT flexibility, business efficiency, competitive advantage – are compelling. So much so that, not so long ago, people were predicting organizations would move their entire computing infrastructure to the cloud, and nothing would be left on-premises. It, of course, never happened. Instead, organizations have embraced […]

The post 3 Vital Concerns for Companies Running Hybrid-Cloud Environments appeared first on DATAVERSITY.


Read More
Author: Ivan Pittaluga

3 Key Types Of Data Projects You Will Need To Take On As A Head Of Data Analytics


Data and analytics teams are often responsible for several key pillars in a company. This can pose a challenge when you’re the head of data and analytics and you need to pick your next data project. Where do you even start when you’re constantly bombarded with requests from your management team? In this article I…
Read more

The post 3 Key Types Of Data Projects You Will Need To Take On As A Head Of Data Analytics appeared first on Seattle Data Guy.


Read More
Author: research@theseattledataguy.com

What Data Practitioners Need to Know (and Do) About Common Language


“Unless and until all members of a team have a common understanding of the problem, attempts to solve the problem are just so much wasted energy.” –Gerald M. Weinberg [1] In March 2019, one of us (Thomas C. Redman) served as the judge in a mock trial of a data architect (played by Laura Sebastian Coleman) […]

The post What Data Practitioners Need to Know (and Do) About Common Language appeared first on DATAVERSITY.


Read More
Author: David C. Hay, Thomas C. Redman, C. Lwanga Yonke, and John A. Zachman

Small and Medium-Sized Businesses Need Self-Serve Advanced Analytics


If your small or medium-sized business (SMB) is looking for ways to improve forecasting, problem-solving, and market opportunities, it must embrace self-service advanced analytics that allow business users to leverage their role, their knowledge of their business function, and their collaborative initiatives to gather, analyze, and share information and improve business results.  According to Gartner, natural […]

The post Small and Medium-Sized Businesses Need Self-Serve Advanced Analytics appeared first on DATAVERSITY.


Read More
Author: Kartik Patel

How to Prepare Data for AI and ML


Regardless of how clever the machine or how brilliant the algorithm, the success of intelligence-based solutions is intrinsically tied to the quality of the data that goes in. That’s why, aside from its people, data is the most important thing an organization owns. Data must be the first stop on the journey to implementing artificial […]

The post How to Prepare Data for AI and ML appeared first on DATAVERSITY.


Read More
Author: Jamie Cairns, Fluent Commerce, and Carole Kingsbury, Ted Baker