Autonomous transactions in PostgreSQL

Let's start with a definition of what an autonomous transaction is in principle:
Autonomous transactions allow you to start new, independent transactions from within another transaction; their changes can be committed or rolled back regardless of what happens to the parent transaction.

They are implemented in the Oracle DBMS, where they are flexible and simple to use. The most popular (but not the only) use of autonomous transactions is logging. Consider a simple scenario: your database contains a long and complex stored procedure, such as monthly billing, and you want important events to be recorded in a logs table. Of course, you can simply insert them without any tricks, but there is one serious drawback: if an unhandled error occurs, the transaction is rolled back together with all the records written to the logs table. This is where autonomous transactions come to the rescue. They run independently of the parent transaction, so the log records are visible even before the billing procedure has finished.

Now to the point. Despite the pretentious title, autonomous transactions do not exist in PostgreSQL. But there is a way to obtain a similar result.


dblink


First, we need the contrib module dblink. Its functions let you connect to and run queries against another PostgreSQL database. A detailed description is beyond the scope of this article. Personally, I consider dblink one of the few contrib modules worth installing regardless of what the database is used for.
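If dblink is not set up yet, enabling it and checking that it works might look like this (a minimal sketch; the connection string assumes local authentication, so add host, user and password as your environment requires):
-- PostgreSQL 9.1 and later install dblink as an extension
-- (older versions load the contrib SQL script instead)
CREATE EXTENSION IF NOT EXISTS dblink;

-- Quick check: query the current database over a separate connection
SELECT *
FROM dblink('dbname=' || current_database(),
            'SELECT version()') AS t(remote_version text);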


Example


Let's demonstrate the described method on the example from the beginning of the article and implement logging in "autonomous" transactions.

Create the logs table:
CREATE TABLE "public"."logs" (
  "log_id" BIGSERIAL,
  "source" TEXT NOT NULL,
  "level" TEXT NOT NULL,
  "message" TEXT NOT NULL,
  "time_added" TIMESTAMP WITHOUT TIME ZONE DEFAULT now() NOT NULL,
  CONSTRAINT "logs_pkey" PRIMARY KEY ("log_id")
);




Create a function for logging:
CREATE OR REPLACE FUNCTION "public"."log"(IN p_source text, IN p_level text, IN p_message text)
RETURNS void AS
$$
DECLARE
  INSERT_SQL text := 'INSERT INTO public.logs (source, level, message) VALUES (''[SOURCE]'', ''[LEVEL]'', ''[MESSAGE]'')';
BEGIN

  IF p_level != 'ERROR' AND p_level != 'WARNING'
     AND p_level != 'INFO' AND p_level != 'DEBUG' THEN
    RAISE EXCEPTION 'Log level should be one of the following types: ERROR, WARNING, INFO, DEBUG';
  END IF;

  -- Substitute the actual values into the SQL template
  INSERT_SQL := replace(INSERT_SQL, '[SOURCE]', p_source);
  INSERT_SQL := replace(INSERT_SQL, '[LEVEL]', p_level);
  INSERT_SQL := replace(INSERT_SQL, '[MESSAGE]', p_message);

  -- Execute the INSERT over a separate dblink connection to the same database;
  -- it runs in its own transaction and is committed immediately
  PERFORM dblink_exec('dbname=' || current_database(), INSERT_SQL);
END;
$$ LANGUAGE plpgsql;
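One caveat: plain string substitution breaks as soon as the message contains a single quote, and it is open to SQL injection. A safer variant (just a sketch, available since PostgreSQL 9.1) is to build the statement with format() and the %L placeholder, which quotes each value as an SQL literal:
-- Sketch: replace the template-and-replace() logic with format(),
-- which safely quotes each value as an SQL literal
INSERT_SQL := format(
    'INSERT INTO public.logs (source, level, message) VALUES (%L, %L, %L)',
    p_source, p_level, p_message);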



From now on we can call the "log" function and the entry will appear in the logs table immediately, regardless of what happens to the transaction from which we called it:
PERFORM "public"."log"('monthly billing', 'INFO', 'Starting monthly billing');

(PERFORM is PL/pgSQL syntax; from plain SQL you would call it with SELECT instead.)



As you can see, the whole trick is that we write to the logs table through the "dblink_exec" function. Each call opens a new connection, with its own session and transaction context, and the record is committed there.
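A quick way to see this in action (a small sketch for psql) is to call the function inside a transaction and then roll that transaction back; the log row survives the rollback because it was committed over the separate dblink connection:
BEGIN;
SELECT "public"."log"('monthly billing', 'ERROR', 'Something went wrong');
ROLLBACK;

-- The parent transaction was rolled back, yet the log entry is there:
SELECT source, level, message FROM public.logs ORDER BY log_id DESC LIMIT 1;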


Notes


Unfortunately, this trick cannot fully replace Oracle's autonomous transactions. The main problem is performance: each call opens a new connection, so I would not advise using this technique left and right; think about where it is really needed.

This method is not exclusive to PostgreSQL: every DBMS I know of has a feature similar to "dblink_exec".

As was correctly noted in the comments, use dblink_connect to create a persistent connection and reuse it for subsequent calls; this speeds things up considerably (a sketch follows below).
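A minimal sketch of that optimization is shown here. The function name "log_fast" and the connection name 'logger' are illustrative assumptions, not part of the original article; the idea is simply to open a named dblink connection once and reuse it on subsequent calls:
-- Sketch: reuse one persistent named dblink connection ('logger').
-- Function name and connection name are illustrative only.
CREATE OR REPLACE FUNCTION "public"."log_fast"(IN p_source text, IN p_level text, IN p_message text)
RETURNS void AS
$$
BEGIN
  -- Open the named connection only if it is not open yet
  IF NOT coalesce(ARRAY['logger'] <@ dblink_get_connections(), false) THEN
    PERFORM dblink_connect('logger', 'dbname=' || current_database());
  END IF;

  -- Run the INSERT over the persistent connection; format()/%L quotes the values
  PERFORM dblink_exec('logger',
    format('INSERT INTO public.logs (source, level, message) VALUES (%L, %L, %L)',
           p_source, p_level, p_message));
END;
$$ LANGUAGE plpgsql;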
Article based on information from habrahabr.ru
