What to Expect in AI Data Governance: 2025 Predictions


In 2025, preventing risks from both cyber criminals and AI use will be top mandates for most CIOs. Ransomware in particular continues to vex enterprises, and unstructured data is a vast, largely unprotected asset. AI solutions have moved from experimental to mainstream, with all the major tech companies and cloud providers making significant investments in […]

The post What to Expect in AI Data Governance: 2025 Predictions appeared first on DATAVERSITY.


Read More
Author: Krishna Subramanian

User-Friendly External Smartblobs Using a Shadow Directory

I am very excited about the HCL Informix® 15 external smartblob feature.

If you are not familiar with them, external smartblobs allow the user to store actual Binary Large Object (blob) and Character Large Object (clob) data external to the database. Metadata about that external storage is maintained by the database.

Note: This article does NOT discuss details of the smartblobs feature itself, but rather proposes a solution to make the functionality more user-friendly. For details on feature behavior, setup, and new functions, see the documentation.

At the time of writing, v15.0 does not have the ifx_lo_path function defined, as required below. This has been reported to engineering. The workaround is to create it yourself with the following command:

create dba function ifx_lo_path(blob)
  returns lvarchar
  external name '(sq_lo_path)'
  language C;

This article also does not discuss details of client programming required to INSERT blobs and clobs into the database.

The external smartblob feature was built for two main reasons:

1. Backup size

Storing blobs in the database itself can cause the database to become extremely large. Backups then take an inordinate amount of time, and level-0 backups can become impossible. Offloading the actual blob contents to an external file system lessens the HCL Informix backup burden by putting the blob data somewhere else. The database still governs the storage of, and access to, each blob, but the physical blob is housed externally.

2. Easy access to blobs

Users would like easy access to blob data, with familiar tools, without having to go through the database.

Using External Smartblobs in HCL Informix 15

HCL Informix 15 introduces external smartblobs. When you define an external smartblob space, you specify the external directory location (outside the database) where you would like the actual blob data to be stored. Then you assign blob column(s) to that external smartblob space when you CREATE TABLE. When a row is INSERTed, HCL Informix stores the blob data in the defined directory using an internal identifier for the filename.

Here's an example of a customer forms table: custforms (denormalized and hardcoded for simplicity). My external sbspace directory is /home/informix/blog/resources/esbsp_dir1.

CREATE TABLE custforms(formid SERIAL, company CHAR(20), year INT, lname CHAR(20), 
formname CHAR(50), form CLOB) PUT form IN (esbsp);

Here, I INSERT a 2023 TaxForm123 document from a Java program for a woman named Sanchez, who works for Actian:

try (PreparedStatement p = c.prepareStatement(
        "INSERT INTO custforms (company, year, lname, formname, form) "
        + "VALUES (?,?,?,?,?)");
     FileInputStream is = new FileInputStream("file.xml")) {
    p.setString(1, "Actian");
    p.setInt(2, 2023);          // year is an INT column
    p.setString(3, "Sanchez");
    p.setString(4, "TaxForm123");
    p.setBinaryStream(5, is);
    p.executeUpdate();
}

After I INSERT this row, my external directory and file would look like this:

[informix@schma01-rhvm03 resources]$ pwd
/home/informix/blog/resources
[informix@schma01-rhvm03 resources]$ ls -l esbsp*
-rw-rw---- 1 informix informix 10240000 Oct 17 13:22 esbsp_chunk1

esbsp_dir1:
total 0
drwxrwx--- 2 informix informix 41 Oct 17 13:19 IFMXSB0
[informix@schma01-rhvm03 resources]$ ls esbsp_dir1/IFMXSB0
LO[2,2,1(0x102),1729188125]

Here, LO[2,2,1(0x102),1729188125] is an actual file containing data that I could access directly. The problem is that if I want to directly access this file for Ms. Sanchez, I would first have to figure out that it belongs to her and is the tax document I want. It's very cryptic!

A User-Friendly Smartblob Solution

Informix customers I talk to love the new external smartblobs feature but wish it could be a little more user-friendly.

As in the above example, instead of putting Sanchez's 2023 TaxForm123 into a general directory called IFMXSB0, in a file called LO[2,2,1(0x102),1729188125], which together are meaningless to an end user, wouldn't it be nice if the file lived somewhere intuitive, like /home/forms/Actian/2023/TaxForm123/Sanchez.xml or something similarly meaningful, organized however YOU want?

Having HCL Informix do this automatically is easier said than done, primarily because the database cannot know how any one customer wants to organize their blobs. What exact directory substructure? Which column or columns form the file names, and in what order? Every use case is different.

Leveraging a User-Friendly Shadow Directory

The following solution shows how you can create your own user-friendly logical locations for your external smartblobs by automatically maintaining a lightweight shadow directory structure to correspond to actual storage locations. The solution uses a very simple system of triggers and stored procedures to do this.

Note: Examples here are shown on Linux, but other UNIX flavors should work also.

How to Set Up in 4 Steps

For each smartblob column in question:

STEP 1: Decide how you want to organize access to your files.

Decide what you want the base of your shadow directory to be and create it. In my case for this blog, it is: /home/informix/blog/resources/user-friendly. You could probably implement this solution without a set base directory (as seen in the examples), but that may not be a good idea because users would unknowingly start creating directories everywhere.

STEP 2: Create a create_link stored procedure and corresponding trigger for INSERTs.

This procedure makes sure that the desired data-driven subdirectory structure exists from the base (mkdir -p), then forms a user-friendly logical link to the Informix smartblob file. From the trigger, you must pass this procedure all the columns from which you want to form the directory structure and filename.

CREATE PROCEDURE

CREATE PROCEDURE create_link (p_formid INT, p_company CHAR(20), p_year INT,
p_lname CHAR(20), p_formname CHAR(50))
DEFINE v_oscommand CHAR(500);
DEFINE v_custlinkname CHAR(500);
DEFINE v_ifmxname CHAR(500);
DEFINE v_basedir CHAR(100);
-- set the base directory
LET v_basedir = '/home/informix/blog/resources/user-friendly';
-- make sure directory tree exists
LET v_oscommand = 'mkdir -p ' || TRIM(v_basedir) || '/' || TRIM(p_company) || '/' || 
TO_CHAR(p_year);
SYSTEM v_oscommand; 

-- form full link name 
LET v_custlinkname = TRIM(v_basedir) || '/' || TRIM(p_company) || '/' || TO_CHAR(p_year) 
|| '/' || TRIM(p_lname) || '.' || TRIM(p_formname) || '.' || TO_CHAR(p_formid);

-- get the actual location 
SELECT IFX_LO_PATH(form::LVARCHAR) INTO v_ifmxname FROM custforms WHERE formid = p_formid; 

-- create the os link 
LET v_oscommand = 'ln -s -f ' || '''' || TRIM(v_ifmxname) || '''' || ' ' || v_custlinkname; 
SYSTEM v_oscommand;

END PROCEDURE;

CREATE TRIGGER

CREATE TRIGGER ins_tr INSERT ON custforms REFERENCING new AS post
FOR EACH ROW(EXECUTE PROCEDURE create_link (post.formid, post.company,
post.year, post.lname, post.formname));

STEP 3: Create a delete_link stored procedure and corresponding trigger for DELETEs.

This procedure will delete the shadow directory link if the row is deleted.

CREATE PROCEDURE

CREATE PROCEDURE delete_link (p_formid INT, p_company CHAR(20), p_year INT,
p_lname CHAR(20), p_formname CHAR(50))
DEFINE v_oscommand CHAR(500);
DEFINE v_custlinkname CHAR(500); 
DEFINE v_basedir CHAR(100);
-- set the base directory
LET v_basedir = '/home/informix/blog/resources/user-friendly';
-- form full link name
LET v_custlinkname = TRIM(v_basedir) || '/' ||
TRIM(p_company) || '/' || TO_CHAR(p_year) || '/' || TRIM(p_lname) || '.'
|| TRIM(p_formname) || '.' || TO_CHAR(p_formid);
-- remove the link
LET v_oscommand = 'rm -f -d ' || v_custlinkname;
SYSTEM v_oscommand;

END PROCEDURE;

CREATE TRIGGER

CREATE TRIGGER del_tr DELETE ON custforms REFERENCING old AS pre FOR EACH ROW
(EXECUTE PROCEDURE delete_link (pre.formid, pre.company, pre.year, pre.lname, pre.formname));

STEP 4: Create a change_link stored procedure and corresponding trigger for UPDATEs, if desired. In my example, Ms. Sanchez might marry Mr. Simon, and an UPDATE to her last name in the database occurs. I may then want to change all my user-friendly names from Sanchez to Simon. This procedure deletes the old link and creates a new one.

Notice that the update trigger must fire only on the columns that form your directory structure and filenames.

CREATE PROCEDURE

CREATE PROCEDURE change_link (p_formid INT, p_pre_company CHAR(20), 
p_pre_year INT, p_pre_lname CHAR(20), p_pre_formname CHAR(50), p_post_company CHAR(20), 
p_post_year INT, p_post_lname CHAR(20), p_post_formname CHAR(50))

DEFINE v_oscommand CHAR(500);
DEFINE v_custlinkname CHAR(500);
DEFINE v_ifmxname CHAR(500);
DEFINE v_basedir CHAR(100);
-- set the base directory
LET v_basedir = '/home/informix/blog/resources/user-friendly';

-- get rid of old

-- form old full link name
LET v_custlinkname = TRIM(v_basedir) || '/' || TRIM(p_pre_company) || '/' || 
TO_CHAR(p_pre_year) || '/' || TRIM(p_pre_lname) || '.' || TRIM(p_pre_formname) || '.' 
|| TO_CHAR(p_formid) ;

-- remove the link and empty directories
LET v_oscommand = 'rm -f -d ' || v_custlinkname;
SYSTEM v_oscommand;

-- form the new
-- make sure directory tree exists
LET v_oscommand = 'mkdir -p ' || TRIM(v_basedir) || '/' || TRIM(p_post_company) || '/' || 
TO_CHAR(p_post_year);
SYSTEM v_oscommand;

-- form full link name
LET v_custlinkname = TRIM(v_basedir) || '/' || TRIM(p_post_company) || '/' || 
TO_CHAR(p_post_year) || '/' || TRIM(p_post_lname) || '.' || TRIM(p_post_formname) 
|| '.' || TO_CHAR(p_formid) ;

-- get the actual location
-- this is the same as before as id has not changed
SELECT IFX_LO_PATH(form::LVARCHAR) INTO v_ifmxname FROM custforms WHERE formid = p_formid;

-- create the os link
LET v_oscommand = 'ln -s -f ' || '''' || TRIM(v_ifmxname) || '''' || ' ' || v_custlinkname;
SYSTEM v_oscommand;

END PROCEDURE;

CREATE TRIGGER

CREATE TRIGGER upd_tr UPDATE OF formid, company, year, lname, formname ON custforms
REFERENCING OLD AS pre NEW as post

FOR EACH ROW(EXECUTE PROCEDURE change_link (pre.formid, pre.company, pre.year, pre.lname, 
pre.formname, post.company, post.year, post.lname, post.formname));

Results Example

Back to our example.

With this infrastructure in place, in addition to the Informix-named file, I now have these user-friendly links on my file system that I can easily locate and identify.

INSERT

[informix@schma01-rhvm03 2023]$ pwd
/home/informix/blog/resources/user-friendly/Actian/2023
[informix@schma01-rhvm03 2023]$ ls
Sanchez.TaxForm123.2

An ls -l shows that it is a link to the Informix blob file.

[informix@schma01-rhvm03 2023]$ ls -l
total 0
lrwxrwxrwx 1 informix informix 76 Oct 17 14:20 Sanchez.TaxForm123.2 -> 
/home/informix/blog/resources/esbsp_dir1/IFMXSB0/LO[2,2,1(0x102),1729188126]

UPDATE

If I then update her last name with UPDATE custforms SET lname = 'Simon' WHERE formid = 2, my file system now looks like this:

[informix@schma01-rhvm03 2023]$ ls -l
lrwxrwxrwx 1 informix informix 76 Oct 17 14:25 Simon.TaxForm123.2 -> 
/home/informix/blog/resources/esbsp_dir1/IFMXSB0/LO[2,2,1(0x102),1729188126]

DELETE

If I then DELETE this form with DELETE FROM custforms WHERE formid = 2, my directory structure looks like this:

[informix@schma01-rhvm03 2023]$ pwd
/home/informix/blog/resources/user-friendly/Actian/2023
[informix@schma01-rhvm03 2023]$ ls
[informix@schma01-rhvm03 2023]$

We Welcome Your Feedback

Please enjoy the new HCL Informix 15 external smartblob feature.

I hope this idea can make external smartblobs easier for you to use. If you have any feedback on the idea, especially on enhancements or experience in production, please feel free to contact me at mary.schulte@hcl-software.com. I look forward to hearing from you!

Find out more about the launch of HCL Informix 15.

Notes

1. Shadow directory permissions. In creating this example, I did not explore directory and file permissions, but rather just used general permissions settings on my sandbox server. Likely, you will want to control permissions to avoid some of the anomalies I discuss below.

2. Manual blob file delete. With external smartblobs, if permissions are not controlled, it is possible that a user might somehow delete the physical smartblob file itself from its directory. HCL Informix itself cannot prevent this from happening. If it does happen, HCL Informix does NOT delete the corresponding row; the blob file will just be missing. There may be aspects to links that can handle this automatically, but I have not investigated them for this blog.

3. Link deletion in the shadow directory. If permissions are not controlled, it is possible that a user might delete a logical link formed by this infrastructure. This solution does not detect that. If this is an issue, I would suggest a periodic maintenance job that cross-references the shadow directory links with the blob files to detect missing links. For those blobs with missing links, write a database program to look up the row's location with the IFX_LO_PATH function and re-form the missing link.
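As a sketch of such a maintenance job, the following script walks the blob directory and the shadow tree and reports blob files that no shadow link points to. The paths and the scheduling are assumptions (the directories below match this article's example), and re-forming the links would still be done with IFX_LO_PATH as described above:

```python
import os

def blobs_without_links(blob_dir, shadow_dir):
    """Cross-reference external smartblob files against the shadow
    directory: return blob files that no symlink points to."""
    # Collect the resolved targets of all symlinks in the shadow tree.
    targets = set()
    for root, _dirs, files in os.walk(shadow_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.islink(path):
                targets.add(os.path.realpath(path))
    # Any blob file whose path is not a known link target has no link.
    missing = []
    for root, _dirs, files in os.walk(blob_dir):
        for name in files:
            blob = os.path.join(root, name)
            if os.path.realpath(blob) not in targets:
                missing.append(blob)
    return missing

if __name__ == "__main__":
    # Directories from this article's example (adapt to your layout).
    for blob in blobs_without_links(
            "/home/informix/blog/resources/esbsp_dir1",
            "/home/informix/blog/resources/user-friendly"):
        print("no shadow link for:", blob)
```

Run periodically (e.g., from cron), this gives you the list of blobs whose user-friendly links need to be rebuilt.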

4. Unique identifiers. I highly recommend using unique identifiers in this solution. In this simple example, I used formid. You don't want to clutter things up, of course, but depending on how you structure your shadow directories and filenames, you may need to include more unique identifiers to avoid duplicate directory and link names.

5. Empty directories. I did not investigate if there are options to rm in the delete stored procedure to clean up empty directories that might remain if a last item is deleted.

6. Production overhead. It is known that excessive triggers and stored procedures can add overhead to a production environment. For this blog, it is assumed that OLTP activity on blobs is not excessive, so production overhead should not be an issue. That said, this solution has NOT been tested at scale.

7. NULL values. Make sure to consider the presence and impact of NULL values in columns used in this solution. For simplicity, I did not handle them here.

Informix is a trademark of IBM Corporation in at least one jurisdiction and is used under license.


The post User-Friendly External Smartblobs Using a Shadow Directory appeared first on Actian.


Read More
Author: Mary Schulte

Transforming Marketing Data into Business Growth: Key Insights and Strategies


Marketing leaders and data professionals often grapple with a familiar challenge: how to transform marketing data into tangible business growth. During a recent episode of The Lights on Data Show, I had the privilege of speaking with Kasper Bossen-Rasmussen, founder and CEO of Accutics, about this very topic. Together, we explored key takeaways for addressing […]

The post Transforming Marketing Data into Business Growth: Key Insights and Strategies appeared first on LightsOnData.


Read More
Author: George Firican

Why the Growing Adoption of IoT Demands Seamless Integration of IT and OT


Over the past year, cyberattacks on cyber-physical systems (CPS) have cost organizations around the world at least $500,000, highlighting the growing financial and operational risks of compromised security. As artificial intelligence (AI) continues to emerge as a key driver in nearly every sector, the need for trustworthy, secure data becomes even more crucial. To address these challenges, […]

The post Why the Growing Adoption of IoT Demands Seamless Integration of IT and OT appeared first on DATAVERSITY.


Read More
Author: Julian Durand

From Input to Insight: How Quality Data Drives AI and Automation


More and more enterprises are looking to automation and AI to deliver new efficiencies and give their organizations an edge in the market. Data is the engine that powers both automation and AI. But data must be clean and user-friendly for these systems to work effectively and deliver on their promise. Lots of organizations are […]

The post From Input to Insight: How Quality Data Drives AI and Automation appeared first on DATAVERSITY.


Read More
Author: Amol Dalvi

Data Monetization: The Holy Grail or the Road to Ruin?


Unlocking the value of data is a key focus for business leaders, especially the CIO. While in its simplest form, data can lead to better insights and decision-making, companies are pursuing an entirely different and more advanced agenda: the holy grail of data monetization. This concept involves aggregating a variety of both structured and unstructured […]

The post Data Monetization: The Holy Grail or the Road to Ruin? appeared first on DATAVERSITY.


Read More
Author: Tony Klimas

Beyond Ownership: Scaling AI with Optimized First-Party Data


Brands, publishers, MarTech vendors, and beyond recently gathered in NYC for Advertising Week and swapped ideas on the future of marketing and advertising. The overarching message from many brands was one we've heard before: First-party data is like gold, especially for personalization. But it takes more than "owning" the data to make it valuable. Scale and accuracy […]

The post Beyond Ownership: Scaling AI with Optimized First-Party Data appeared first on DATAVERSITY.


Read More
Author: Tara DeZao

Ask a Data Ethicist: How Can You Learn More About Data and AI Ethics?


It was about this time last year that I pitched the team at DATAVERSITY the idea of this monthly column on data ethics. There's certainly been no shortage of interesting questions to cover and I've enjoyed writing about both the practical and more philosophical aspects of this topic. As we wrap up this year and […]

The post Ask a Data Ethicist: How Can You Learn More About Data and AI Ethics? appeared first on DATAVERSITY.


Read More
Author: Katrina Ingram

Mind the Gap: Architecting Santa's List – The Naughty-Nice Database


You never know what's going to happen when you click on a LinkedIn job posting button. I'm always on the lookout for interesting and impactful projects, and one in particular caught my attention: "Far North Enterprises, a global fabrication and distribution establishment, is looking to modernize a very old data environment." I clicked the button […]

The post Mind the Gap: Architecting Santa's List – The Naughty-Nice Database appeared first on DATAVERSITY.


Read More
Author: Mark Cooper

5 Data Management Tool and Technology Trends to Watch in 2025


The market surrounding data management tools and technologies is quite mature. After all, the typical business has been making extensive use of data to help streamline its operations and decision-making for years, and many companies have long had data management tools in place. But that doesn't mean that little is happening in the world of […]

The post 5 Data Management Tool and Technology Trends to Watch in 2025 appeared first on DATAVERSITY.


Read More
Author: Matheus Dellagnelo

How to Foster a Cross-Organizational Approach to Data Initiatives


In today's business landscape, data reigns supreme. It is the cornerstone of effective decision-making, fuels innovation, and drives organizational success. However, despite its immense potential, many organizations struggle to harness the full power of their data due to a fundamental disconnect between IT and business teams. This division not only impedes progress but also undermines […]

The post How to Foster a Cross-Organizational Approach to Data Initiatives appeared first on DATAVERSITY.


Read More
Author: Abhas Ricky

Data Governance Defying Gravitas
"Defying Gravity," the show-stopping anthem from the musical "Wicked," captures the essence of breaking free from conventions and soaring beyond expectations. Just as Elphaba, the protagonist witch from "Wicked," refuses to be bound by the weight of societal norms, Non-Invasive Data Governance (NIDG) offers organizations a way to defy the gravitas of traditional governance frameworks. […]


Read More
Author: Robert S. Seiner

Through the Looking Glass: What Does Data Quality Mean for Unstructured Data?
I go to data conferences. Frequently. Almost always right here in NYC. We have lots of data conferences here. Over the years, I've seen a trend: more and more emphasis on AI. I've taken to asking a question at these conferences: What does data quality mean for unstructured data? This is my version of […]


Read More
Author: Randall Gordon

Data Governance Best Practices: Lessons from Anthemā€™s Massive Data Breach
In the insurance industry, data governance best practices are not just buzzwords; they're critical safeguards against potentially catastrophic breaches. The 2015 Anthem Blue Cross Blue Shield data breach serves as a stark reminder of why robust data governance is crucial. The Breach: A Wake-Up Call. In January 2015, Anthem, one of the largest health […]


Read More
Author: Christine Haskell

Data Insights Ensure Quality Data and Confident Decisions
Every business (large or small) creates and depends upon data. One hundred years ago, businesses looked to leaders and experts to strategize and to create operational goals. Decisions were based on opinion, guesswork, and a complicated mixture of notes and records reflecting historical results that may or may not be relevant to the future. Today, […]


Read More
Author: Kartik Patel

Combining IoT and Blockchain Technology to Enhance Security
The Internet of Things (IoT) technology has taken the world by storm. From smart homes and wearables to connected cars and fitness trackers, IoT devices are becoming prevalent across various industries and aspects of daily life. There are approximately 15.14 billion connected IoT devices in 2023, and this number is expected to grow to around […]


Read More
Author: Hazel Raoult

Securing Your Data With Actian Vector

The need for securing data from unauthorized access is not new. It has been required by laws for handling personally identifiable information (PII) for quite a while. But the increasing use of data services in the cloud for all kinds of proprietary data that is not PII now makes data security an important part of most data strategies.

This is the start of a series of blog posts that take a detailed look at how data security can be ensured with Actian Vector. The first post explains the basic concept of encryption at rest and how Actian Vector's Database Encryption functionality implements it.

Understanding Encryption at Rest

Encryption at rest refers to encryption of data at rest, that is, data that is persisted, usually on disk or in cloud storage. In a database system this is mainly user data in tables and indexes, but it also includes the metadata describing the organization of the user data. The main purpose of encryption at rest is to secure the persisted data from unauthorized direct access on disk or in cloud storage, that is, without a connection to the database system.

The encryption can be transparent to the database applications. In this case, encryption and decryption are managed by the administrator, usually at the level of databases. The application then does not need to be aware of the encryption: it connects to the database and works with the data as if there were no encryption at all. In Actian Vector, this type of encryption at rest is called database encryption.

Encryption at the application level, on the other hand, requires the application to handle the encryption and decryption. Often this means that the user of the application has to provide an encryption key for both encryption (e.g., when data is inserted) and decryption (e.g., when data is selected). While more complicated, this provides more control to the application and the user.

For example, encryption can be applied in a more fine-grained way to specific tables, columns in tables, or even individual record values in table columns. It may be possible to use individual encryption keys for different data values. Thus, users can encrypt their private data with their own encryption key and be sure that, without that key, no other user can see the data in clear text. In Actian Vector, encryption at the application level is referred to as function-based encryption.

Using Database Encryption in Actian Vector

In Actian Vector, the encryption that is transparent to the application works at the scope of a database and is therefore called database encryption. Whether a database is encrypted is determined at database creation and cannot be changed later. When a database is created with database encryption, all the persisted data in tables and indexes, as well as the metadata for the database, is encrypted.

The encryption method is 256-bit AES, which requires a 32-byte symmetric encryption key. Symmetric means that the same key is used to encrypt and decrypt the data. This key is generated individually for each encrypted database and is called the database (encryption) key.

To have the database key available, it is stored in an internal system file of the database server, where it is protected by a passphrase. This passphrase is provided by the user when creating the database. However, the database key is not used to encrypt the user data directly. Instead, it is used to encrypt, i.e. protect, yet another set of encryption keys that in turn encrypt the user data in the tables and indexes. This set is called the table (encryption) keys.
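This key hierarchy (a passphrase protects the stored database key, which in turn protects the table keys) can be sketched in a few lines. The sketch is purely illustrative, not Actian Vector's actual implementation: the PBKDF2 derivation and the XOR "wrap" below are stand-ins for the server's internal key protection.

```python
import hashlib
import secrets

def derive_wrapping_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte key from the passphrase (a stand-in for
    however the server protects the stored database key)."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def xor_wrap(key: bytes, wrapping_key: bytes) -> bytes:
    """Toy 'wrap': XOR is its own inverse, so the same call both wraps
    and unwraps. A real system would use an authenticated key wrap."""
    return bytes(a ^ b for a, b in zip(key, wrapping_key))

# One 256-bit database key is generated per encrypted database...
database_key = secrets.token_bytes(32)
salt = secrets.token_bytes(16)

# ...and stored only in passphrase-protected (wrapped) form.
stored = xor_wrap(database_key, derive_wrapping_key("my passphrase", salt))

# Unlocking the database amounts to unwrapping with the same passphrase.
assert xor_wrap(stored, derive_wrapping_key("my passphrase", salt)) == database_key

# Changing the passphrase just re-wraps the same database key, so the
# table keys and the encrypted user data need no re-encryption at all.
rewrapped = xor_wrap(database_key, derive_wrapping_key("new passphrase", salt))
```

This also shows why a passphrase change is cheap compared to a database key rotation: only the small wrapped key is rewritten, never the table keys or the encrypted data.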

Once the database is created, the administrator can use the chosen passphrase to ā€œlockā€ the database. When the database is locked, the encrypted data cannot be accessed. Likewise, the administrator also uses the passphrase to ā€œunlockā€ a locked database and thus re-enable access to the encrypted data. When the database is unlocked, the administrator can change the passphrase. If desired, it is also possible to rotate the database key when changing the passphrase.

The rotation of the database key is optional because it means that the whole container of table keys must be decrypted with the old database key and re-encrypted with the new one. Because this container also holds other metadata, it can be quite large, so rotating the database key can be a slow and computationally expensive operation. Database key rotation is therefore recommended only if there is reasonable suspicion that the database key was compromised. Most of the time, changing only the passphrase is sufficient, and it is done quickly.

With Actian Vector it is also possible to rotate the table encryption keys. This is done independently from changing the passphrase and the database key, and can be performed on a complete database as well as on individual tables. For each key that is rotated, the data must be decrypted with the old key and re-encrypted with the new key. In this case, we are dealing with the user data in tables and indexes. If this data is very large, the key rotation can be very costly and time consuming. This is especially true when rotating all table keys of a database.

A typical workflow of using database encryption in Actian Vector:

  • Create a database with encryption:
      1. createdb -encrypt <database_name>

This command prompts the user twice for the passphrase and then creates the database with encryption. The new database remains unlocked, i.e. readily accessible, until it is explicitly locked or until the database system shuts down.

It is important that the creator of the database remembers the provided passphrase because it is needed to unlock the database and make it accessible, e.g. after a restart of the database system.

  • Lock the encrypted database:
      1. Connect to the unlocked database with the Terminal Monitor:
        sql <database_name>
      2. SQL to lock the database:
        DISABLE PASSPHRASE '<user supplied passphrase>'; \g

The SQL statement locks the database. New connect attempts to the database are rejected with a corresponding error. Sessions that connected previously can still access the data until they disconnect.

To make the database lock also immediately effective for already connected sessions, additionally issue the following SQL statement:

      1. CALL X100(TERMINATE); \g
  • Unlock the encrypted database:
      1. Connect to the locked database with the Terminal Monitor and option "-no_x100":
        sql -no_x100 <database_name>
      2. SQL to unlock the database:
        ENABLE PASSPHRASE '<user supplied passphrase>'; \g

The connection with the "-no_x100" option connects without access to the warehouse data, but allows the administrative SQL statement to unlock the database.

  • Change the passphrase for the encrypted database:
      1. Connect to the unlocked database with the Terminal Monitor:
        sql <database_name>
      2. SQL to change the passphrase:
        ALTER PASSPHRASE '<old user supplied passphrase>' TO
        '<new passphrase>'; \g

Again, it is important that the administrator remembers the new passphrase.

After changing the passphrase for an encrypted database, it is recommended to perform a new database backup (a.k.a. "database checkpoint") to ensure continued full database recoverability.

  • When the database is no longer needed, destroy it:
      1. destroydb <database_name>

Note that the passphrase of the encrypted database is not needed to destroy it. The command can only be performed by users with the proper privileges, i.e. the database owner and administrators.

This first blog post in the database security series explained the concept of encryption at rest and how transparent encryption, called Database Encryption in Actian Vector, is used.

The next blog post in this series will take a look at function-based encryption in Actian Vector.

The post Securing Your Data With Actian Vector appeared first on Actian.


Read More
Author: Martin Fuerderer

Book of the Month: "AI Governance Comprehensive"


Welcome to December 2024's "Book of the Month" column. This month, we're featuring "AI Governance Comprehensive: Tools, Vendors, Controls, and Regulations" by Sunil Soares, available for free download on the YourDataConnect (YDC) website. This book offers readers a strong foundation in AI governance. While the emergence of generative AI (GenAI) has brought AI governance to […]

The post Book of the Month: "AI Governance Comprehensive" appeared first on DATAVERSITY.


Read More
Author: Mark Horseman

Technical and Strategic Best Practices for Building Robust Data Platforms


In the AI era, organizations are eager to harness innovation and create value through high-quality, relevant data. Gartner, however, projects that 80% of data governance initiatives will fail by 2027. This statistic underscores the urgent need for robust data platforms and governance frameworks. A successful data strategy outlines best practices and establishes a clear vision for data architecture, […]

The post Technical and Strategic Best Practices for Building Robust Data Platforms appeared first on DATAVERSITY.


Read More
Author: Alok Abhishek

Chatbot Quality Control: Why Data Hygiene Is a Necessity


The rush is on to deploy chatbots. Chatbots rely on data to power their outputs; however, companies that prioritize data quantity over quality risk creating systems that produce unreliable, inappropriate, and simply incorrect responses. Success in this field depends on rigorous data standards and ongoing quality control rather than simply accumulating more training data. When […]

The post Chatbot Quality Control: Why Data Hygiene Is a Necessity appeared first on DATAVERSITY.


Read More
Author: Todd Fisher

Unleashing the Power of Data: A Guide for CIOs and CDOs
In today's data-driven landscape, organizations have a wealth of information at their fingertips, but unlocking its full potential is a complex challenge. Many struggle to effectively leverage data for AI, analytics, and decision-making, often hindered by issues like accuracy, availability, and security. For CIOs and CDOs, prioritizing and addressing these obstacles is mandatory. From […]


Read More
Author: Myles Suer

Data Speaks for Itself: Data Validation – Data Accuracy Imposter or Assistant?
In my last article, "The Shift from Syntactic to Semantic Data Curation and What It Means for Data Quality," published in the August 2024 issue of this newsletter, I argued how the adoption of generative AI will change the focus and scope of data quality management (DQM). Because data quality is measured in the degree […]


Read More
Author: Dr. John Talburt

The Art of Lean Governance: A Systems Thinking Approach to Data Governance
A systems thinking approach to process control and optimization demands continual data quality feedback loops. Moving the quality checks upstream to the source system provides the most extensive control coverage. Data quality approaches not utilizing these loops will fail to achieve the desired results, often worsening the problem. Data Governance is about gaining trust and […]


Read More
Author: Steve Zagoudis
