"Postgres" is the service.
"psql" is the command-line client.
Double-quotes denote table names and column names.
Single-quotes denote string literals.
select * from schema."user" where "name" in ('Jane Smith', 'Susan Roth')
Label a block of code:
<<myLabel>>
BEGIN
-- statements
END;
An EXIT statement can name a label to say which enclosing block to exit. The EXIT must be nested within that labeled block.
<<myLabel>>
BEGIN
LOOP
IF x > 10 THEN
EXIT myLabel; -- exits the <<myLabel>> block, ending the loop
END IF;
END LOOP;
END;
-- execution picks up here, after the <<myLabel>> block
A CONTINUE statement can name a label to say which enclosing loop should start its next iteration. The CONTINUE must be nested within that labeled loop.
<<labelA>>
LOOP
-- execution picks up here on the next iteration of the <<labelA>> loop
<<labelB>>
LOOP
IF x > 10 THEN
CONTINUE labelA; -- skips to the next iteration of the <<labelA>> loop
END IF;
END LOOP;
END LOOP;
Declaration:
x integer;
y varchar;
Assignment:
x := 0;
y := 'text';
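Declarations and assignments together in a complete anonymous block (a minimal sketch; the RAISE NOTICE is only there to show the values):
DO $$
DECLARE
x integer;
y varchar;
BEGIN
x := 0;
y := 'text';
RAISE NOTICE 'x = %, y = %', x, y;
END;
$$;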
CREATE USER <user_name> WITH
LOGIN
NOSUPERUSER
INHERIT
NOCREATEDB
NOCREATEROLE
NOREPLICATION;
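The user above has no password; one can be set at creation time or afterwards (the password value here is a placeholder):
ALTER USER <user_name> WITH PASSWORD 'change_me';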
CREATE DATABASE <database_name>
WITH
OWNER = <user_name>
ENCODING = 'UTF8'
LC_COLLATE = 'en_US.UTF-8'
LC_CTYPE = 'en_US.UTF-8'
TABLESPACE = pg_default
CONNECTION LIMIT = -1;
Insert entire result set:
INSERT INTO myTable (id)
SELECT customerId
FROM customerTable
WHERE Age > 65;
Postgres equivalent of SQL Server's OUTER APPLY (for CROSS APPLY, use CROSS JOIN LATERAL):
SELECT *
FROM tableA
LEFT JOIN LATERAL (SELECT COUNT(*) AS cnt FROM tableB WHERE tableB.column = tableA.column) b ON TRUE;
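For the CROSS APPLY behavior (tableA rows with no match in the lateral subquery are dropped), a sketch using the same placeholder tables:
SELECT *
FROM tableA
CROSS JOIN LATERAL (SELECT * FROM tableB WHERE tableB.column = tableA.column) b;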
Logical operators
AND
OR
Integer comparison
WHERE column = 1
Boolean comparison
WHERE column IS TRUE
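Combined in one WHERE clause (table and column names here are placeholders):
SELECT *
FROM myTable
WHERE age > 65
AND (status = 1 OR status = 2)
AND isActive IS TRUE;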
IF x > 10 THEN
-- statements
END IF;
IF x > 10 THEN
-- statements
ELSIF x > 5 THEN
-- statements
ELSE
-- statements
END IF;
CASE
WHEN x = 0 THEN
-- statements
WHEN x = 1 OR x = 2 THEN
-- statements
ELSE
-- statements
END CASE;
CASE x
WHEN 0 THEN
-- statements
WHEN 1, 2 THEN
-- statements
ELSE
-- statements
END CASE;
LOOP
-- statements
END LOOP;
Exit loop on condition:
LOOP
EXIT WHEN x > 10;
END LOOP;
Continue next iteration of loop:
LOOP
CONTINUE;
END LOOP;
Continue next iteration of loop on condition:
LOOP
CONTINUE WHEN x > 10;
END LOOP;
WHILE x < 10 LOOP
-- statements
END LOOP;
Iterate through a range of integers. 1..10 yields 1,2,3,4,5,6,7,8,9,10.
FOR i IN 1..10 LOOP
-- statements
END LOOP;
Specify a step other than the default of 1. This loops through 1,3,5,7,9.
FOR i IN 1..10 BY 2 LOOP
-- statements
END LOOP;
Reverse the order (with REVERSE, write the higher bound first). This loops through 10,9,8,7,6,5,4,3,2,1.
FOR i IN REVERSE 10..1 LOOP
-- statements
END LOOP;
FOR rec IN SELECT id, lastName FROM myTable LOOP
-- rec.id
-- rec.lastName
END LOOP;
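The loop variable must be declared as RECORD; a minimal runnable sketch, assuming a myTable with id and lastName columns:
DO $$
DECLARE
rec RECORD;
BEGIN
FOR rec IN SELECT id, lastName FROM myTable LOOP
RAISE NOTICE 'id = %, lastName = %', rec.id, rec.lastName;
END LOOP;
END;
$$;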
CREATE TABLE myTable (id INT NOT NULL);
INSERT INTO myTable VALUES (1);
INSERT INTO myTable VALUES (2);
INSERT INTO myTable VALUES (3);
CREATE OR REPLACE FUNCTION myFunction() RETURNS SETOF myTable AS
$BODY$ -- any tag can be used between the $$ delimiters (dollar quoting)
DECLARE
result myTable%rowtype;
BEGIN
FOR result IN SELECT * FROM myTable ORDER BY id ASC
LOOP
IF result.id > 2 THEN
EXIT;
END IF;
RETURN NEXT result;
END LOOP;
RETURN;
END
$BODY$
LANGUAGE plpgsql;
SELECT * FROM myFunction();
What non-standard collations are being used?
select table_schema,
table_name,
column_name,
collation_name
from information_schema.columns
where collation_name is not null
order by table_schema,
table_name,
ordinal_position;
Postgres supports the UTF8 encoding directly (see ENCODING = 'UTF8' above), which can store more special characters than the basic Latin encodings/collations can.
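To check which encoding each database is using (pg_encoding_to_char is a built-in function):
select datname, pg_encoding_to_char(encoding) as encoding
from pg_database;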
(1)
Export full database schema from MSSQL with "Generate Scripts".
- "Types of data to export" = "Schema Only"
(2)
Convert schema to Postgres
- "sqlserver2pgsql" is one tool for this
git clone https://github.com/dalibo/sqlserver2pgsql.git
cd sqlserver2pgsql
perl sqlserver2pgsql.pl -f {mssql_file.sql} -b run_first.sql -a run_second.sql -u schema_unsure.sql
run_first.sql will contain table definitions.
run_second.sql will contain constraint definitions.
Review schema_unsure.sql for anything to handle manually.
(3)
Load the Postgres schema into an empty database.
psql -U {user_name} -p 5432 -h localhost -d {database_name} -f run_first.sql
psql -U {user_name} -p 5432 -h localhost -d {database_name} -f run_second.sql
(4)
Transfer the data.
- Pentaho Data Integration (PDI) community edition is one tool for this. It connects to the source and destination databases and handles the data transfer and data type conversions.