postgresql-9.4

Combine two JSON objects in PostgreSQL

Submitted by 只愿长相守 on 2019-12-03 08:17:02
I have two JSON rows in a PostgreSQL 9.4 table:

```
     the_column
----------------------
 {"evens": [2, 4, 6]}
 {"odds": [1, 3, 5]}
```

I want to combine all of the rows into one JSON object. (It should work for any number of rows.) Desired output:

```
{"evens": [2, 4, 6], "odds": [1, 3, 5]}
```

Use json_agg() to get an array:

```sql
SELECT json_agg(the_column) AS the_column FROM tbl;
```

Or use json_each() in a LATERAL join with json_object_agg() to assemble the key/value pairs into a single object:

```sql
SELECT json_object_agg(key, value) AS the_column
FROM   tbl, json_each(the_column);
```

Arthur Nascimento: FYI, if someone's using jsonb in >= 9.5 and they only care about
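A minimal end-to-end sketch of the json_object_agg() approach; the table name and setup here are assumed for illustration:

```sql
-- Hypothetical setup mirroring the question's column
CREATE TABLE tbl (the_column json);
INSERT INTO tbl VALUES
  ('{"evens": [2, 4, 6]}'),
  ('{"odds": [1, 3, 5]}');

-- json_each() explodes each row's object into key/value pairs;
-- json_object_agg() folds all of those pairs back into one object
SELECT json_object_agg(key, value) AS the_column
FROM   tbl, json_each(the_column);
-- one row: {"evens": [2, 4, 6], "odds": [1, 3, 5]}
```

Note that this merges the rows' top-level keys, matching the desired output above.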

How to delete replication slot in postgres 9.4

Submitted by 最后都变了- on 2019-12-03 04:46:37
I have a replication slot which I want to delete, but when I try, I get an error saying I can't delete from a view. Any ideas?

```
postgres=# SELECT * FROM pg_replication_slots;
  slot_name   |    plugin    | slot_type | datoid | database | active | xmin | catalog_xmin | restart_lsn
--------------+--------------+-----------+--------+----------+--------+------+--------------+-------------
 bottledwater | bottledwater | logical   |  12141 | postgres | t      |      |       374036 | E/FE8D9010
(1 row)

postgres=# DELETE FROM pg_replication_slots;
ERROR:  cannot delete from view "pg_replication_slots"
DETAIL:  Views that do not select
```
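pg_replication_slots is a system view, so its rows can't be removed with DML; slots are dropped through a management function instead. Using the slot name from the output above:

```sql
-- Drop the logical replication slot shown in the view
SELECT pg_drop_replication_slot('bottledwater');
```

A slot can only be dropped while it is inactive, so if `active` is `t` (as above), terminate or disconnect the consumer first.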

CREATE VIEW specifies more column names than columns

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-02 08:55:08
If I run the following statements in PostgreSQL 9.4.8, I get this error message: "CREATE VIEW specifies more column names than columns." But why? Doesn't f1 return a table with 5 columns, and shouldn't v1 have 5 columns as well?

Also, if I remove the casts from the first SELECT statement, I get this error message: "Final statement returns unknown instead of character varying at column 1." But why? The correct type VARCHAR(20) is known from RETURNS, so why is there no implicit cast of strings such as 'a'?

```sql
CREATE OR REPLACE FUNCTION f1 (a1 INTEGER, a2 INTEGER)
RETURNS TABLE (c1 VARCHAR(20), c2
```
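The question's code is cut off above, but this error typically means the view's column list names more columns than its SELECT actually produces. A common trigger, sketched here with a hypothetical function standing in for the original f1: calling the function as `SELECT f1(...)` yields a single composite-typed column, whereas `SELECT * FROM f1(...)` expands it into its five declared columns:

```sql
-- Hypothetical stand-in for the question's f1
CREATE OR REPLACE FUNCTION f1(a1 integer, a2 integer)
RETURNS TABLE (c1 varchar(20), c2 varchar(20), c3 varchar(20),
               c4 varchar(20), c5 varchar(20))
LANGUAGE sql AS $$
  SELECT 'a'::varchar(20), 'b'::varchar(20), 'c'::varchar(20),
         'd'::varchar(20), 'e'::varchar(20);
$$;

-- Fails: the SELECT returns ONE composite column, but five names are listed
-- CREATE VIEW v1 (x1, x2, x3, x4, x5) AS SELECT f1(1, 2);

-- Works: SELECT * FROM expands the result row into five columns
CREATE VIEW v1 (x1, x2, x3, x4, x5) AS SELECT * FROM f1(1, 2);
```

The casts in the function body matter for the second error: a bare string literal such as 'a' initially has type `unknown`, and the function's final statement must return exactly the declared varchar(20) columns.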

PostgreSQL jsonb value in WHERE BETWEEN clause

Submitted by 痞子三分冷 on 2019-12-02 07:27:19
I've got a jsonb field in my database table (a_table) with an int value within, say:

```
{ "abc": { "def": { "ghk": 500 } } }
```

I want a SELECT that filters by this field ("ghk") in a WHERE clause:

```sql
SELECT * FROM a_table WHERE ghk BETWEEN 0 AND 1000;
```

How should I create such a query? I couldn't find a good tutorial for jsonb usage so far. Thanks in advance!

EDIT: I found this solution:

```sql
SELECT * FROM a_table WHERE a_field #> '{abc,def,ghk}' BETWEEN '0' AND '10000';
```

Is it correct?

The #> operator returns a JSONB document, which you cannot cast to an int. You need the #>> operator, which returns a scalar value
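Building on that answer, a sketch using #>> plus an explicit cast (table and sample row assumed for illustration):

```sql
-- Hypothetical setup matching the question
CREATE TABLE a_table (a_field jsonb);
INSERT INTO a_table VALUES ('{"abc": {"def": {"ghk": 500}}}');

-- #>> extracts the value at the path as text, which can be cast to int
SELECT *
FROM   a_table
WHERE  (a_field #>> '{abc,def,ghk}')::int BETWEEN 0 AND 10000;
```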

Postgresql crosstab query with multiple “row name” columns

Submitted by 你。 on 2019-12-02 06:22:07
I have a table that is a "tall skinny" fact table:

```sql
CREATE TABLE facts(
  eff_date     timestamp NOT NULL,
  update_date  timestamp NOT NULL,
  symbol_id    int4      NOT NULL,
  data_type_id int4      NOT NULL,
  source_id    char(3)   NOT NULL,
  fact         decimal,
  /* Keys */
  CONSTRAINT fact_pk PRIMARY KEY (source_id, symbol_id, data_type_id, eff_date)
);
```

I'd like to "pivot" this for a report, so the header looks like this:

```
eff_date, symbol_id, source_id, datatypeValue1, ... datatypeValueN
```

I.e., I'd like a row for each unique combination of eff_date, symbol_id, and source_id. However, the PostgreSQL crosstab() function only allow on
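One documented workaround: in the two-argument form of tablefunc's crosstab(), the source query may carry "extra" columns between the row_name and the category, and they are passed through to the output. That lets the remaining key parts (symbol_id, source_id) ride along with eff_date. A sketch against the table above, assuming three distinct data_type_id values for the output column list:

```sql
CREATE EXTENSION IF NOT EXISTS tablefunc;

SELECT *
FROM crosstab(
  $$ SELECT eff_date, symbol_id, source_id,  -- row_name + "extra" columns
            data_type_id,                    -- category
            fact                             -- value
     FROM   facts
     ORDER  BY 1, 2, 3 $$,
  $$ SELECT DISTINCT data_type_id FROM facts ORDER BY 1 $$
) AS ct (eff_date timestamp, symbol_id int4, source_id char(3),
         datatype_value1 decimal, datatype_value2 decimal,
         datatype_value3 decimal);  -- one column per category, names assumed
```

The column definition list in `AS ct (...)` must match the actual number of categories, so it has to be adjusted to the real set of data_type_id values.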

Postgresql - Regex split csv line with potentials quotes

Submitted by 我与影子孤独终老i on 2019-12-02 03:06:49
Question: I would like to split a column that represents a CSV line in Postgres. Fields in this text line are delimited by pipes; sometimes they are enclosed in quotes and sometimes not. In addition, we can have escaped characters.

```
field1|"field2"|field3|"22 \" lcd \| screen "
```

Is there a regex to split this column (i.e., using regexp_split_to_array(...))?

Answer 1: Not regexp-based, but it works:

```sql
create or replace function split_csv(
  line text,
  delim_char char(1) = ',',
  quote_char char(1) = '"')
returns setof text[]
```

Go sql - prepared statement scope

Submitted by 允我心安 on 2019-12-02 02:47:13
I'm building an API using the Go (1.6.x) sql package along with PostgreSQL (9.4). Should my prepared statements have application or request scope? Having read the docs, it would seem more efficient to scope them at the application level to reduce the number of preparation phases. However, perhaps there are other considerations, and prepared statements are not designed to live that long?

Prepared statements exist so that you can execute repetitive SQL commands which may differ only in parameter values, for example. They are not meant to live "long" as a prepared statement may (they do if called