Actian’s Innovation Earns Prestigious IT4IT Award

Innovation is essential for meeting organizations’ business, IT, and technical needs. It’s why Actian invests more than 20% of our revenue in research and development. In addition to the positive responses we hear from customers for helping them solve their toughest business challenges, we also receive accolades from industry peers.

For example, we recently earned the Award of Distinction in the category “IT4IT Standard / IT Management and Planning” at The Open Group India Awards 2024. The jury panel recognized our effective employment of open standards and open source, calling the award a testament to our outstanding work and our clear path forward in this area.

At Actian, we use the IT4IT reference architecture to manage our business and the end-to-end lifecycles of all Actian products, such as the Actian Data Platform, Vector, and Zen.

This open standard is backed by around 900 members of The Open Group, including HCLTech and almost every other industry leader, as well as government institutions.

Bringing Ongoing Value to Customers

To earn the award, we provided a detailed assessment that focused on the value streams we deliver and showcased how these streams bring new and ongoing benefits to customers. The assessment included these eight key aspects of our offerings:

  1. Modern product management practices. Our teams successfully use IT4IT together with a scaled agile framework, DevOps, and site reliability engineering where appropriate for a modern, innovative approach to open standards and open source.
  2. Continuous improvement. We ensure strong management support for optimizing the lifecycles of our digital products and services with a focus on ongoing improvement and sustainable value.
  3. Mature product development. From gathering requirements to meet customers’ needs to releasing new products and updates, we optimize robust, value-centric processes to deliver modern, flexible, and easy-to-use products.
  4. Ongoing customer focus. The customer is at the heart of everything we do. We maintain a strong customer focus, ensuring our products meet their business needs to build confidence in the user and data management experience.
  5. An automation mindset. Operations are streamlined using automated order fulfillment to provide quick and easy delivery to the customer.
  6. Accurate billing. Established mechanisms for metering and billing customers provide a quick overview of the Actian Units used in the cloud while ensuring transparent and accurate pricing.
  7. Trusted reliability. We employ a proactive approach to system reliability using site reliability engineering.
  8. Tool rationalization initiative. With ongoing initiatives to optimize the software landscape in engineering and throughout our organization, we drive increased efficiency and reduce costs.

What Does the Product Journey Look Like?

Delivering industry-leading products requires detailed steps to ensure success. Our journey to product delivery is represented in detail here:

[Infographic: the IT4IT product journey at Actian]

This is how the four IT4IT value streams work together and are implemented:

  1. Strategy to Portfolio. In this planning process, Actian manages ISO 27001-compliant internal and external policies in Confluence. Strategic planning is handled by a dedicated team with regular reviews by the project management office and the executive leadership team, which aligns the plans with our vision and governance.

Based on these plans, the executive leadership team provides strategic funding and resource allocation for the development of projects. The development and governance of the architecture roadmap are managed by the architecture board.

  2. Requirement to Deploy. This building process entails sprint grooming to ensure a clear understanding of user stories and to facilitate the collection and tracking of requirements, which then benefit future products and features.

At Actian, we use efficient, automated deployments with small-batch continuous integration, robust testing, version control, and seamless integrations in our development processes. This is complemented by extensive test automation, version-controlled test cases, realistic performance testing, and shift-left practices integrated into continuous integration and continuous delivery (CI/CD) pipelines with defect management.

Of course, source code version control is used to ensure traceability through testing and comments, and to promote code reuse. Code changes remain traceable through build package promotion, automated validation, and a centralized repository.

  3. Request to Fulfill. In this process, during and after delivery, Actian provides strong user engagement with self-service resources, efficient ordering and fulfillment, integrated support, effective ticket management, and collaborative issue resolution.

The external service offering is efficient, with strong contract management, knowledge sharing, and automated deployment plans, along with Jira Service Desk and Salesforce integration. Customer instances are created via self-service with automated orchestration, deployment guidelines, Kubernetes provisioning, and continuous deployment. In addition, the billing system provides robust usage metering that calculates Actian Unit hours, with RabbitMQ integration and usage history generation.
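To make the metering idea concrete, here is a minimal sketch of how unit-hour billing of this kind could work. Everything in it is hypothetical: the rate table, instance sizes, and event fields are invented for illustration and are not Actian's actual rates or message format; in production, the usage events would arrive via the RabbitMQ integration mentioned above.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical per-hour unit rates by instance size; the real Actian
# Unit schedule is not described in this post.
RATE_UNITS_PER_HOUR = {"small": 4, "medium": 8, "large": 16}

@dataclass
class UsageEvent:
    """One metering record, e.g. as it might arrive from a message queue."""
    instance_size: str
    started: datetime
    stopped: datetime

def unit_hours(event: UsageEvent) -> float:
    """Convert a usage interval into billable unit hours."""
    hours = (event.stopped - event.started).total_seconds() / 3600
    return hours * RATE_UNITS_PER_HOUR[event.instance_size]

events = [
    UsageEvent("medium", datetime(2024, 9, 1, 8), datetime(2024, 9, 1, 12)),
    UsageEvent("small", datetime(2024, 9, 1, 9), datetime(2024, 9, 1, 10)),
]
print(f"Billable usage: {sum(unit_hours(e) for e in events):.1f} unit hours")  # 36.0
```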

  4. Detect to Correct. In this final process, which involves running the product, Actian provides collaborative SLA performance reviews in tiered service contracts (Gold, Silver, and Bronze) and Salesforce integration for SLA data. Knowledge is shared through a repository.

Actian offers a site reliability engineering framework with clear lifecycle stages, along with a rich knowledge base. A robust performance and availability monitoring system is also provided.
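As a rough illustration of how such tiered SLA checks might work: the Gold, Silver, and Bronze tier names come from the contracts described above, but the response-time targets and the check below are invented for this sketch.

```python
# Illustrative first-response targets (in hours) per support tier;
# these numbers are assumptions, not Actian's actual SLA terms.
SLA_TARGETS_HOURS = {"Gold": 1, "Silver": 4, "Bronze": 24}

def within_sla(tier: str, response_hours: float) -> bool:
    """Check a ticket's first-response time against its contract tier."""
    return response_hours <= SLA_TARGETS_HOURS[tier]

print(within_sla("Gold", 0.5))   # True
print(within_sla("Bronze", 30))  # False
```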

Identifying Opportunities for Improvements and Closing Gaps

As with any major assessment, it surfaced opportunities for improvement and gaps in services or capabilities. These are evaluated and addressed to further improve Actian products and offerings.

The assessment identified 12 opportunities to improve integration. These integration opportunities can benefit the development and delivery of products through increased usage and the linked exchange of data between departments and functions.

It also identified 18 opportunities to improve internal processes. These include a more consistent approach to standardization and best practices, which is expected to improve workflows during the development and deployment of products.

In addition, 14 opportunities for improvement can be addressed through better internal tooling. This includes introducing new tools as well as unifying and streamlining existing heterogeneous ones.

Curious how our products and services can help your business make confident, data-driven decisions? Let’s talk.

The post Actian’s Innovation Earns Prestigious IT4IT Award appeared first on Actian.


Read More
Author: Steffen Harre

It’s Not the Tools, It’s the Culture: How to Roll Out Data Democratization


You want to implement data democratization, so you deployed all the new tooling and data infrastructure. You have a data catalog to manage metadata and ensure data lineage and a data marketplace to enable data discovery and self-service analytics. You’ve invested in the latest technologies to enable full self-service operation.  The data management architecture you […]

The post It’s Not the Tools, It’s the Culture: How to Roll Out Data Democratization appeared first on DATAVERSITY.


Read More
Author: Marek Ovaceck

Understanding the Importance of Data Resilience 


In recent years, the frequency and sophistication of cyberattacks have surged, presenting a formidable challenge to organizations worldwide. The proliferation of interconnected devices, growing dependency on cloud services, and the shift to remote work have introduced new vulnerabilities, creating more opportunities for cybercriminals to exploit. My company’s 2024 Data Protection Trends report revealed that 75% of organizations experience […]

The post Understanding the Importance of Data Resilience  appeared first on DATAVERSITY.


Read More
Author: Dave Russell

How Data Is Used in Fraud Detection Techniques in Fintech Business


In the rapidly changing world of financial technology (fintech), fraud is a developing area seething with vigor. Digital banking and online financial services are booming every day, bringing with them new techniques for thieves to ply their trade – not ones that can be easily dismissed. Fintech firms must now relentlessly deploy data and artificial intelligence […]

The post How Data Is Used in Fraud Detection Techniques in Fintech Business appeared first on DATAVERSITY.


Read More
Author: Harsh Daiya

The Role of Data Security in Protecting Sensitive Information Across Verticals


Data is the fuel that keeps the engine of any organization running efficiently. Its growing importance is becoming a frequent topic of conversation in boardrooms and strategy meetings. Companies increasingly know the need to protect their sensitive information and continue investing heavily in cybersecurity measures. However, this approach has a critical oversight: The assumption that […]

The post The Role of Data Security in Protecting Sensitive Information Across Verticals appeared first on DATAVERSITY.


Read More
Author: Mario Vargas Valles

Data Governance Best Practices and Implementation Strategies

Data Governance Best Practices and Implementation Strategies

No matter what industry you work in, you know how important it is to collect data. Retail workers rely on customer data to inform their buying decisions, healthcare workers need comprehensive and accessible data on their patients to provide treatments, and financial professionals analyze large sets of market data to make predictions for their clients. But collecting data for your organization isn’t enough — it needs to be reliable, secure, accessible, and easy for the members of your company to use. That’s where data governance comes in.

Data governance is a term for an organization’s practices and processes that help it optimize its data usage. Why is data governance important? It includes plans to protect data systems against cybersecurity threats, streamline data storage solutions, set up data democratization rules, and implement products and data platforms that support greater data efficiency throughout an organization. The specific data governance policies professionals use greatly depend on their industry, the type of data they collect, how much data they use, and other factors. However, some data governance best practices can help professionals — whether they have data roles or not — create policies that optimize and simplify their data usage.

Data Governance vs. Data Compliance

Depending on your industry, you may hear the term data compliance commonly used alongside data governance. Data compliance refers to the policies and procedures that help you meet external legal requirements surrounding your data, and data governance has more to do with optimizing and securing your data for internal use. Data compliance doesn’t include industry standards or the requirements of partner companies, just what’s legally required. Data compliance laws may influence what data governance policies you implement, but you’ll mostly work with legal teams to ensure you meet these requirements.

For example, if your organization does business in countries that belong to the European Economic Area, you must abide by the General Data Protection Regulation. This law dictates how companies collect, process, and dispose of personal data. It has a huge impact on sharing data outside of your organization, data retention timelines, and data democratization and destruction policies.
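For illustration, a retention check driven by such timelines might look like the following minimal sketch. The data categories and retention periods here are invented; under the GDPR, actual retention periods depend on the lawful basis for processing and should come from your legal team.

```python
from datetime import datetime, timedelta

# Invented categories and retention windows, for illustration only.
RETENTION = {
    "marketing_contact": timedelta(days=730),
    "support_ticket": timedelta(days=365),
}

def due_for_deletion(category: str, collected: datetime) -> bool:
    """Flag records whose assumed retention window has expired."""
    return datetime.now() - collected > RETENTION[category]

print(due_for_deletion("marketing_contact", datetime(2021, 1, 1)))  # True
```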

Going Beyond the Data Governance Framework

A solid data governance program requires a well-structured data governance framework that addresses data quality, collection, management, privacy and security. Organizations manage these critical components by creating company-wide policies and forming departments of data professionals who work together to support the larger data governance framework. Some of the departments that contribute to overall data stewardship include:

  • Data
  • Analytics
  • Engineering
  • IT
  • Legal
  • Compliance
  • Executive Management

Data stewards consistently work with these departments to create and improve their policies and strategies. A governance program with high data trust never stays stagnant, so data stewards learn about the ever-changing needs and habits of these teams to make sure data remains the fuel of a well-oiled business.

While there may be some policies that are tailored to specific departments that use data, effective data governance requires cooperation from every team in the company. If a sales team creates a lead database outside of data governance policies, which isn’t accessible to the rest of the company, that data isn’t being used effectively. If there’s a team storing metadata in unprotected spreadsheets instead of utilizing an already-established data catalog used by the rest of the organization, it weakens the governance framework.

Data Governance Best Practices

Once you assess the needs of the department stakeholders and create a framework based on them, it’s time to create your data governance program. Here are some widely held best practices in data governance to help you begin a new program or refine one that’s fallen behind the times.

Establish Clear Roles

For any data governance program to succeed, data stewards must make sure that the major stakeholders know their individual and collective responsibilities. This includes who’s ultimately in charge of the data, who’s responsible for maintaining data quality, who takes charge of the data management strategy, and who’s responsible for protecting it from cyber threats. This organizational chart can get a little complex at larger organizations, but ensuring there are no gaps in responsibility is one of the most critical best practices in data governance.

Develop and Enforce Data Quality Policies

Collecting as much data as possible and sorting it out after isn’t always a good strategy for data governance. Effectively utilizing data in your industry only works if the data is accurate, reliable, and relevant. If data isn’t collected often enough or doesn’t include information that your organization relies on, then it’s not meeting its true potential.

Establishing a standard for data quality begins with learning the needs of stakeholders across your organization; collecting data that no one needs is a waste of valuable resources. Then, you must define your data quality dimensions: the criteria that qualify the data you use as high-quality. A few of these dimensions are measured in the sketch after this list. The most common data quality dimensions are:

  • Relevance
  • Completeness
  • Accuracy
  • Validity
  • Consistency
  • Uniqueness
  • Timeliness
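Several of these dimensions translate directly into measurable checks. The sketch below scores completeness, uniqueness, and timeliness on a toy table; the use of pandas and the freshness cutoff are assumptions made for illustration.

```python
import pandas as pd

# Toy customer table with deliberate quality problems.
df = pd.DataFrame({
    "email": ["a@x.com", None, "b@x.com", "b@x.com"],
    "signup_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-02-10", "2024-02-10"]),
})

# Completeness: share of non-null values in a column.
completeness = df["email"].notna().mean()

# Uniqueness: share of rows that are not exact duplicates.
uniqueness = 1 - df.duplicated().mean()

# Timeliness: share of records newer than an assumed freshness cutoff.
timeliness = (df["signup_date"] >= pd.Timestamp("2024-02-01")).mean()

print(f"completeness={completeness:.2f} "
      f"uniqueness={uniqueness:.2f} timeliness={timeliness:.2f}")
```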

Ensure Data Compliance & Security

High-quality data is a valuable commodity, and there’s no end to the bad actors and black hats developing new ways to steal it. IT and cybersecurity professionals are invaluable and should impact many of the data security best practices in your data governance plan. For example, they can make the most informed decision about what access control model to use for your data systems, which will affect how permissions to data are given. If they feel that data masking is appropriate for your data systems, they can walk you through the benefits of blurring vs. tokenization.
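To make that last distinction concrete, here is a minimal sketch contrasting masking (irreversibly blurring a value) with tokenization (a reversible substitution through a protected lookup). The functions and the in-memory vault are illustrative assumptions; real tokenization relies on a hardened, audited token store.

```python
import secrets

# In a real system this mapping would live in a secured token vault.
_vault: dict[str, str] = {}

def mask(card_number: str) -> str:
    """Masking: irreversibly blur all but the last four digits."""
    return "*" * (len(card_number) - 4) + card_number[-4:]

def tokenize(card_number: str) -> str:
    """Tokenization: swap the value for a random token that authorized
    systems can later exchange for the original."""
    token = secrets.token_hex(8)
    _vault[token] = card_number
    return token

def detokenize(token: str) -> str:
    return _vault[token]

pan = "4111111111111111"
print(mask(pan))                         # ************1111
print(detokenize(tokenize(pan)) == pan)  # True
```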

Plan Data Audits & Policy Checks

As we mentioned, a quality data governance program is constantly evolving and adapting to meet the changing needs of an organization — even when that feedback isn’t given directly to you. Performing regular data audits can provide insights into how well your data governance program bolsters data trust, whether there are any gaps in your procedures, who isn’t getting with the program, and more. If you notice that your strategy isn’t meeting the needs of your data governance framework, don’t worry — data governance policies should be streamlined and updated every so often, and it just means you’ve identified solid ways to improve data trust.

Strategy for Implementing Data Governance

Once you’ve developed your framework, spoken to stakeholders to assess their needs, developed strategic policies and processes based on data governance best practices, and received approval from the higher-ups, it’s time to put your plan into action. Here’s a step-by-step guide to help you get your data governance program off the ground.

1. Document Your Policies and Processes

Before you can expect members of your organization to follow your plan, they need to be made aware. Creating detailed documents that define your plan makes it easier to notify coworkers of the upcoming changes to their regular routines and creates a source of truth that everyone can refer to. Having these responsibilities outlined in a document ensures there’s no confusion and can keep you from having to frequently re-explain the finer points of your plan to critical stakeholders.

2. Discuss Roles & Responsibilities

You’ve likely talked to key members of your data governance plan about their role and responsibilities to make sure they’re able to perform their duties. However, explaining these things in-depth ensures that there’s no confusion or gaps in the plan. Encourage these members to ask questions so that they fully understand what’s required of them. It’s possible that they’ve agreed to what you’ve asked without fully understanding the processes or considering how their data governance role would conflict with other responsibilities.

3. Set Up Your Data Governance Tools

Your bold new data governance plan may require new tools — or reconfiguring existing solutions — to succeed. Perhaps the level of data analysis your organization requires can only be achieved with a NoSQL database, or your plan hinges on integrating multiple data sources. Once you’ve received buy-in from management, you’ll want to implement and configure these tools to your specific needs before allowing wider access to them.

Performing this step early can help ensure that these solutions are working the way you intended and that your coworkers aren’t using tools that are only partially working. Using the tools yourself also provides an opportunity to streamline and automate any processes that you weren’t very familiar with before.

4. Train Your Employees

Maintaining a data governance plan doesn’t just require buy-in from managers and executives — it takes effort from every member of the organization. Training employees about their role in the company’s data governance goes beyond how to use things like a new data archiving solution that you’ve implemented. Everyone needs to be aware of their role and how they fit into the larger framework of data governance to ensure that there are no gaps in your strategy.

5. Promote a Data-Driven Culture

Regularly reminding members of your organization of how crucial data is — as well as following the data governance plan — helps ensure that they don’t lapse in their responsibilities and your program runs smoothly. For example, it’s said that the biggest cybersecurity threat these days is a company’s least-informed employee. Sending company-wide updates each time a new threat or scam becomes known to the larger cybersecurity community helps keep data governance top-of-mind and ensures that the components of your plan function properly.

While data governance plans should be fairly rigid for other members of your organization, you should think of yours as fluid and flexible to meet changing needs. Company growth and evolving organizational needs are good things, and one can’t overstate the link between sustainable growth and a data governance program that grows and adapts alongside it. You can use these best practices in data governance to adapt or create new plans that make your organization more efficient, productive, and secure, no matter what changes come its way.

The post Data Governance Best Practices and Implementation Strategies appeared first on Actian.


Read More
Author: Actian Corporation

Actian’s Benchmark Dominance: A Price-Performance Powerhouse

Actian Shines in TPC-H Benchmark, Outperforming Major Competitors

In August of this year, Actian conducted a TPC-H benchmark test using the services of McKnight Consulting Group. While some companies perform and publish their own benchmarks, Actian prefers a third party for true, reliable, and unbiased testing. Based in Plano, Texas, McKnight Consulting Group has helped over 100 companies with analytics, big data, and master data management strategies and implementations, including benchmarking.

Actian conducted a similar TPC-H benchmark test last year, validating that it was indeed faster than key competitors: 11 times faster than Google BigQuery and three times faster than Snowflake. Since then, the Actian engineering team has continued to enhance the performance capabilities of the Actian Data Platform with the understanding that it needs to meet the requirements of its existing and prospective customer base.

This is especially important given the growth in business use cases and in the sources of data used in day-to-day operations. Actian always strives to keep ahead of the curve for its customers, and its ability to provide both rapid data processing and, in turn, unparalleled price-performance has been a key factor in its product roadmap.

In this recent TPC-H benchmark test, Actian decisively outperformed its competitors Snowflake, Databricks, and Google BigQuery.

Key Benchmark Findings

  • Raw Performance: Actian Data Platform’s execution speed was significantly faster than all three competitors tested in the benchmark. It achieved nearly eight times the performance of Databricks, over six times that of Snowflake, and an impressive 12 times the performance of BigQuery.
  • Concurrency: Even with five concurrent users, Actian Data Platform maintained its performance advantage, outperforming Databricks by three times, Snowflake by over seven times, and BigQuery by 9.6 times.
  • Price-Performance: Actian Data Platform’s combination of speed and affordability was unmatched. It offered a price-performance ratio that was over eight times better than both Snowflake and BigQuery (the sketch after this list shows how such a ratio is computed).
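A price-performance ratio of this kind is simply cost divided by throughput, so the platform with the lower cost per query wins. The figures below are placeholders chosen to show the arithmetic, not numbers from the benchmark report.

```python
# Placeholder inputs, not figures from the McKnight report.
def price_performance(cost_per_hour: float, queries_per_hour: float) -> float:
    """Cost per query: lower is better."""
    return cost_per_hour / queries_per_hour

platform_a = price_performance(cost_per_hour=40.0, queries_per_hour=8000)
platform_b = price_performance(cost_per_hour=50.0, queries_per_hour=1200)
print(f"Platform A is {platform_b / platform_a:.1f}x better")  # ~8.3x
```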

This is a significant improvement over last year’s fantastic results and is a testament to Actian’s commitment to database performance and price performance. Actian, with over 50 years of experience in data and database models, continues to show its prowess in the market.

What Does This Mean for Actian’s Current and Future Customers?

For businesses seeking a high-performance, cost-effective data warehouse or analytics platform, the benchmark results are a compelling reason to consider the Actian Data Platform. Here’s why:

  • Faster Insights: Actian’s superior performance means that businesses can get answers to their most critical questions faster. Actian has always aimed to provide REAL real-time analytics, and these results prove that we can get customers there. This can lead to improved decision-making, increased operational efficiency, and better customer experiences.
  • Lower Costs: Actian Data Platform’s favorable price-performance ratio translates into significant cost savings for businesses. By choosing Actian, organizations can avoid the high and sometimes unpredictable costs associated with other data platforms while still achieving exceptional results. This leads to long-term total cost of ownership benefits that other vendors cannot provide.
  • Scalability: Actian Data Platform’s ability to handle concurrent users and large datasets demonstrates its scalability. This is essential for businesses that need to support growing data volumes and user demands – two business needs that every organization is facing today.

Price Performance is Top of Mind

Today, CFOs and technical users alike are trying to get the best price performance possible from their database management systems (DBMS). CFOs are interested not only in up-front acquisition and implementation costs, but also in all downstream costs associated with utilizing and maintaining whichever system they choose.

Technical users of DBMS offerings are also looking for alternative ways to utilize their systems to save costs. In the back alleys of the internet (places like Reddit and other forums), users of various DBMS platforms talk with each other about how to effectively “game” their platforms to get the best price performance possible, sometimes leading to the development of shadow database solutions to try to save costs.

With the latest TPC-H benchmark results showing that the Actian Data Platform performs over eight times better than both Snowflake and BigQuery, companies looking for outstanding price performance in their current and future DBMS systems need to consider Actian.

Take the Next Step

Actian Data Platform’s dominance in the TPC-H benchmark is a clear indication of its exceptional capabilities. By delivering superior performance, affordability, and scalability, Actian offers a compelling solution for businesses seeking a powerful and cost-effective data platform. If organizations are looking to unlock the full potential of their data with confidence, Actian is worth a closer look.

To download the complete TPC-H report from McKnight, click here.

The post Actian’s Benchmark Dominance: A Price-Performance Powerhouse appeared first on Actian.


Read More
Author: Phil Ostroff

Ask a Data Ethicist: What Should Be the Limits of Biometric Data Collection in the Workplace?


There’s a movement underway to capture an increasing amount of data about employees – from facial recognition or fingerprint systems used for tracking time and attendance, to systems that monitor your every keystroke. All of these invasive data collection technologies raise this question: What should be the limits of biometric data collection in the workplace? […]

The post Ask a Data Ethicist: What Should Be the Limits of Biometric Data Collection in the Workplace? appeared first on DATAVERSITY.


Read More
Author: Katrina Ingram

Answering the Build vs. Buy Question for Generative AI


Building custom generative AI (GenAI) technology solutions is the best way to gain a competitive edge by leveraging GenAI tools and services tailored to your business. On the other hand, building GenAI models from scratch is a costly and complicated endeavor – which is why many businesses instead settle for a genAI strategy wherein they […]

The post Answering the Build vs. Buy Question for Generative AI appeared first on DATAVERSITY.


Read More
Author: Daniel Avancini

Generative AI Is Accelerating Data Pipeline Management


Data pipelines are like insurance. You only know they exist when something goes wrong. ETL processes are constantly toiling away behind the scenes, doing heavy lifting to connect the sources of data from the real world with the warehouses and lakes that make the data useful. Products like DBT and AirTran demonstrate the repeatability and […]

The post Generative AI Is Accelerating Data Pipeline Management appeared first on DATAVERSITY.


Read More
Author: Mike Finley

Data Backup Essentials: Zero Trust Strategies for System Administrators


Time is always of the essence in the case of system administration and IT operations teams. They address the countless issues coming in from other departments: “My network is acting funky, and I’m about to go on a meeting with an important client,” “I accidentally deleted every important email I have ever received from the […]

The post Data Backup Essentials: Zero Trust Strategies for System Administrators appeared first on DATAVERSITY.


Read More
Author: Anthony Cusimano

2024 Trends in AI and Customer Experiences


As we reflect on 2024’s customer experience trends, AI-driven personalization has been a key focus.

While promising, it has faced challenges: data quality issues persist, affecting AI effectiveness; trust concerns linger among executives and consumers; and cost-to-serve ratios remain high.

Despite this, some success stories have emerged: Amazon’s recommendation engine drives 35% of sales, and Bank of America’s AI assistant handles two million daily interactions. However, AI adoption for fraud detection has grown only 5% since 2019.

Looking ahead, businesses should prioritize:

  • Omnichannel self-service support
  • Balancing AI with human connection
  • Customer-centric strategies focused on real business impact

Remember, a strong Customer Master Data Management solution is crucial for success in this evolving landscape.

At Pretectum, we’re here to help you navigate these trends and optimize your customer experience strategy.
Learn more at www.pretectum.com

Unlocking Potentials with Pretectum CMDM


Pretectum’s Customer Master Data Management (CMDM) platform offers several key capabilities for organizations looking to improve their customer data management and enhance customer experiences:
Centralized Data Repository

Pretectum CMDM serves as a centralized repository for customer data, consolidating information from various sources across the organization. This provides a single source of truth for customer-related data, enabling more consistent and accurate information across departments.

Regulatory Compliance Support

The platform incorporates features to help organizations meet data protection regulations, including:

  • Encryption and access controls
  • Audit trails
  • Consent management
  • Self-service data verification options for customers

These capabilities assist in safeguarding sensitive customer information and demonstrating compliance with privacy laws.

Data Governance

Pretectum CMDM facilitates structured data governance through:

  • Defined data quality standards
  • Data stewardship roles
  • Access control policies
  • Segregation of data ownership

This governance framework helps ensure customer data is handled responsibly and consistently.

Cross-Functional Collaboration

The platform serves as a shared resource for diverse teams to access and contribute to customer data, fostering collaboration between departments like legal, compliance, IT, and customer service.

Real-Time Monitoring

Pretectum CMDM enables real-time compliance monitoring through alerts and notifications about potential risks or compliance issues. This allows for swift corrective actions when needed.

Integration Capabilities

The platform can integrate with risk management systems to provide a more comprehensive view of customer-related risks.

Improved Customer Experience

By centralizing and improving the quality of customer data, Pretectum CMDM enables organizations to gain deeper customer insights and deliver more personalized experiences. This can lead to enhanced customer satisfaction, increased sales opportunities, and streamlined operations.

By leveraging these capabilities, organizations can work towards creating a more customer-centric approach to data management, ultimately driving business success through improved customer experiences and operational efficiencies.

Integrating Pretectum CMDM with existing CRM systems for a unified customer view


The ability to seamlessly integrate and unify customer data is paramount for any organization that intends to make the best use of it. Companies are constantly striving to harness the power of advanced technologies to streamline their operations and gain a comprehensive understanding of their customer base.

One approach is the integration of Pretectum CMDM (Customer Master Data Management) with one’s existing CRM (Customer Relationship Management) systems, paving the way for a unified customer view that can revolutionize the way organizations interact with and understand their customers.

Learn more by visiting https://www.pretectum.com/integrating-pretectum-cmdm-with-existing-crm-systems-for-a-unified-customer-view/

What is a Golden Nominal?


Pretectum CMDM offers solutions for managing duplicates through two primary approaches: survivorship and derivation. Survivorship involves selecting the most comprehensive record from duplicates, while derivation combines information from multiple sources to create a unique reference record.

Each method presents challenges, including data quality issues, integration complexity, and the need for ongoing governance and compliance.

Organizations must navigate these complexities to maintain accurate Golden Nominal records, ensuring they reflect the most relevant and current information.

Pretectum CMDM provides the tooling to streamline golden nominal creation and management processes and enhance data management capabilities overall.
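As a rough sketch of the two approaches: the most-populated-record heuristic and the field-by-field merge below are illustrative assumptions, not Pretectum's actual matching logic.

```python
def survivorship(records: list[dict]) -> dict:
    """Survivorship: keep the single most complete duplicate as-is."""
    return max(records, key=lambda r: sum(v is not None for v in r.values()))

def derivation(records: list[dict]) -> dict:
    """Derivation: combine the first non-null value for each field
    across all duplicates into a new reference record."""
    golden: dict = {}
    for record in records:
        for field, value in record.items():
            if golden.get(field) is None and value is not None:
                golden[field] = value
    return golden

dupes = [
    {"name": "Ada Lovelace", "email": None, "phone": "555-0100"},
    {"name": "A. Lovelace", "email": "ada@example.com", "phone": "555-0100"},
]
print(survivorship(dupes))  # the second record (three populated fields)
print(derivation(dupes))    # name from the first, email from the second
```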

https://www.pretectum.com/golden-nominal-creation/

Mind the Gap: Start Modernizing Analytics by Reorienting Your Enterprise Analytics Team


… and your data warehouse / data lake / data lakehouse. A few months ago, I talked about how nearly all of our analytics architectures are stuck in the 1990s. Maybe an executive at your company read that article, and now you have a mandate to “modernize analytics.” Let’s say that they even understand that just […]

The post Mind the Gap: Start Modernizing Analytics by Reorienting Your Enterprise Analytics Team appeared first on DATAVERSITY.


Read More
Author: Mark Cooper

The Data Governance Wake-Up Call From the OpenAI Breach


Shockwaves reverberated throughout the political and tech ecosystems this summer when OpenAI – the creator of ChatGPT – admitted it had been breached. The breach, which involved an outsider gaining access to internal messaging systems, left many worried that a national adversary could do the same and potentially weaponize generative AI technologies. National security aside, the […]

The post The Data Governance Wake-Up Call From the OpenAI Breach appeared first on DATAVERSITY.


Read More
Author: Jessica Smith

Data Literacy: The $100 Million Insurance Policy You’re Probably Ignoring
In boardrooms across the globe, executives are gleefully signing off on multi-million-dollar investments in data infrastructure. Big data! AI! Machine learning! But here’s the inconvenient truth they’re overlooking: Without a data-literate workforce, these shiny new toys are as useful as a Ferrari in a traffic jam.  The Elephant in the Data Center  Let’s cut to […]


Read More
Author: Christine Haskell

Soft Skills for Data Professionals and Practical Ways to Learn Them
I have been in data, in some form or another, for over 20 years and have come a long way in both my technical and soft skills during that time. There are plenty of roles for data professionals that do not require soft skills, and if that’s what you are into, more power to you. […]


Read More
Author: Rebecca Gazda

Data Is Risky Business: Scaling Data Governance to the National Stage
I recently started a doctorate. And because I obviously have too much free time after running my business, teaching, and writing columns for august publications like this, I’m looking at data governance. But not at the level of the organization. My doctoral research will be a deep dive into some oft-neglected human factors that need to […]


Read More
Author: Daragh O Brien

The Data-Centric Revolution: Dealing with Data Complexity
There are many perennial issues with data: data quality, data access, data provenance, and data meaning. I will contend in this article that the central issue around which these others revolve is data complexity. It’s the complexity of data that creates and perpetuates these other problems. As we’ll see, it is a tractable problem that […]


Read More
Author: Dave McComb

Building Sustainable Data Centers for the Digital Age
Data centers are increasingly necessary in today’s digitally dependent world. However, these specialized facilities are incredibly resource-intensive, requiring the professionals designing them to make numerous sustainable decisions. Which steps should they follow in such processes?  Identify Areas for Improvement  Many data center projects focus on changing existing structures to make them maximally sustainable. Similarly, some […]


Read More
Author: Ellie Gabel

Fundamentals of AI Automation
Experts predict the AI market will grow from $184 billion in 2024 to $826 billion by 2030. And considering the wide range of use cases for AI tools, that’s not much of a surprise. However, while solutions like ChatGPT continue growing in popularity among everyday users, the most significant potential of artificial intelligence lies in […]


Read More
Author: Sarah Kaminski

Real-time Analytics Vs Stream Processing – What Is The Difference?


One of the holy grails that many data teams seem to chase is real-time data analytics. After all, if you can have real-time analytics, you can make better decisions faster. However, there often is a conflation between real-time data analytics and stream processing.  These are two different concepts that are crucial to understanding how to…
Read more

The post Real-time Analytics Vs Stream Processing – What Is The Difference? appeared first on Seattle Data Guy.


Read More
Author: research@theseattledataguy.com