postgresql-9.3

How to increment a value on a JSON key in a Postgres UPDATE statement

Submitted by 青春壹個敷衍的年華 on 2019-12-04 10:59:04
Question: Incrementing a numeric column in a relational table is straightforward:

```sql
CREATE TABLE foo (id serial PRIMARY KEY, credit numeric);

UPDATE foo SET credit = credit + $1 WHERE id = $2;
```

However, the equivalent on a JSON column doesn't work:

```sql
CREATE TABLE foo (id serial PRIMARY KEY, data json);

UPDATE foo SET data->'credit' = data->'credit' + $1 WHERE id = $2;
```

The error I get is `syntax error at or near "->"`, which is rather ambiguous. How do I do this? I am using Postgres 9.3.4. In light of @GordonLinoff's comment below, I have created a feature request.
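For reference, later PostgreSQL releases added direct support for this. A minimal sketch, assuming PostgreSQL 9.5+ and a `jsonb` (not `json`) column — neither of which applies to the 9.3 release the question targets:

```sql
-- jsonb_set (PostgreSQL 9.5+) rewrites one key inside a jsonb value.
-- Read the current number out as text, cast, add, and write it back.
UPDATE foo
SET    data = jsonb_set(
           data,
           '{credit}',
           to_jsonb((data->>'credit')::numeric + 1)
       )
WHERE  id = 2;
```

On 9.3 itself the whole JSON document has to be rebuilt in application code or with string/aggregation functions, which is why the asker filed a feature request.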

Is there a way to address all elements of JSON array when creating a constraint in PostgreSQL?

Submitted by 被刻印的时光 ゝ on 2019-12-04 10:29:02
Question: Does PostgreSQL provide any notation/method for putting a constraint on each element of a JSON array? An example:

```sql
create table orders(data json);

insert into orders values ('{
  "order_id": 45,
  "products": [
    { "product_id": 1, "name": "Book" },
    { "product_id": 2, "name": "Painting" }
  ]
}');
```

I can easily add a constraint on the order_id field:

```sql
alter table orders add check ((data->>'order_id')::integer >= 1);
```

Now I need to do the same with product_id. I can put a constraint on individual array items:

```sql
alter table orders add check ((data->'products'->0->>'product_id')::integer >= 1);
alter table orders add check ((data->'products'->1->>'product_id')::integer >= 1);
```
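Since a CHECK constraint cannot contain a subquery directly, the usual workaround is to wrap the per-element check in a helper function. A sketch, assuming the JSON shape from the question (the function name is illustrative):

```sql
-- True when every product_id in the products array is >= 1.
-- json_array_elements is available from PostgreSQL 9.3 onward.
create or replace function all_product_ids_valid(data json)
returns boolean as $$
  select bool_and((elem->>'product_id')::integer >= 1)
  from   json_array_elements(data->'products') as elem;
$$ language sql immutable;

alter table orders add check (all_product_ids_valid(data));
```

Note that Postgres cannot verify the function is truly immutable over your data; if the JSON structure varies (e.g. `products` is missing), the function should be extended to handle that case.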

How to dump an entire SQL Server 2014 database into a file, to be imported into a Postgres database?

Submitted by 假如想象 on 2019-12-04 05:14:56
Question: I have a SQL Server 2014 database from which I need to dump just the table data (no indexes, stored procedures, or anything else). This dump needs to be imported into a Postgres 9.3 database "as-is". What is the proper command line to create such a dump?

Answer 1: I must admit, this is more of a joke... You should rather follow the hint to use "Export" and write this to some kind of CSV. Just for fun: EDIT: create a column list to avoid binary columns... columns which are not directly
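One pragmatic route along the lines of the "export to CSV" hint — a sketch only, with placeholder server, database, table, and column names — is SQL Server's `bcp` utility to export character-mode CSV, then Postgres's `\copy` to load it:

```shell
# Export one table from SQL Server as comma-separated text
# (-c character mode, -t field terminator, -T trusted/Windows auth).
bcp "SELECT id, name FROM mydb.dbo.mytable" queryout mytable.csv \
    -c -t"," -S myserver -T

# Load the file into a Postgres table with matching columns.
psql -d mydb -c "\copy mytable FROM 'mytable.csv' WITH (FORMAT csv)"
```

Binary columns, embedded delimiters, and encoding differences all need per-table attention, which is why the answerer recommends building an explicit column list rather than dumping blindly.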

Heroku + Apartment PG::Error: ERROR: function pg_stat_statements_reset() does not exist

Submitted by 余生颓废 on 2019-12-04 03:55:21
Question: I use the Apartment gem in Rails 4 to support multi-tenancy on Postgres 9.3.3 on Heroku. An error occurs when the Apartment gem creates a new tenant. Deeper investigation showed that the schema was created, but with no tables inside. The Heroku logs showed the error:

PG::Error: ERROR: function pg_stat_statements_reset() does not exist

Answer: When a new schema is created, Postgres tries to reset statistics by executing the function pg_stat_statements_reset(). By default, this function can only be executed by superusers (per the official documentation), and Heroku does not grant superuser privileges. So you need to disable the extension pg_stat_statements
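Disabling the extension comes down to a single statement run against the Heroku database — a sketch; check first that nothing else in your stack depends on pg_stat_statements:

```sql
-- Removes the extension so schema creation no longer attempts
-- the superuser-only pg_stat_statements_reset() call.
DROP EXTENSION IF EXISTS pg_stat_statements;
```

On Heroku this can be run via `heroku pg:psql`; the extension can be re-created later with `CREATE EXTENSION pg_stat_statements;` if needed.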

Postgres: Best way to move data from public schema of one DB to new schema of another DB

Submitted by 徘徊边缘 on 2019-12-04 03:05:16
Question: I am new to Postgres and just discovered that I cannot access data from different databases in one SQL query, which led me to the concept of schemas in Postgres. I have two databases, db1 and db2, and both have tables with the same names in their public schema. I want to create a new schema in db1 named new_schema and move the data from db2.public into db1.new_schema. What is the easiest way to do this?

Answer: The simplest way to do that is to rename schemas. However, you must be sure you are the sole user of the db1 database. First, hide your public schema in db1:

```sql
alter schema public rename to original_public
```
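The rename trick typically continues along these lines — a sketch assuming default pg_dump/pg_restore behaviour and a user who owns both schemas:

```shell
# 1. In db1, park the existing public schema and make a fresh one:
psql -d db1 -c "ALTER SCHEMA public RENAME TO original_public;"
psql -d db1 -c "CREATE SCHEMA public;"

# 2. Dump db2's public schema and restore it into db1, where it
#    lands in the newly created (empty) public schema:
pg_dump -d db2 --schema=public -Fc -f db2_public.dump
pg_restore -d db1 db2_public.dump

# 3. Rename the restored schema and put the original back:
psql -d db1 -c "ALTER SCHEMA public RENAME TO new_schema;"
psql -d db1 -c "ALTER SCHEMA original_public RENAME TO public;"
```

While this runs, no other session should touch db1, since its public schema temporarily holds db2's data.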

How to parse JSON using json_populate_recordset in Postgres

Submitted by 99封情书 on 2019-12-04 02:54:00
Question: I have JSON stored as text in one of my database rows. The JSON data is the following:

```json
[{"id":67272,"name":"EE_Quick_Changes_J_UTP.xlsx"},
 {"id":67273,"name":"16167.txt"},
 {"id":67274,"name":"EE_12_09_2013_Bcum_Searchall.png"}]
```

To parse this I want to use the PostgreSQL function json_populate_recordset(). When I issue a command like

```sql
select json_populate_recordset(null::json,
  '[{"id":67272,"name":"EE_Quick_Changes_J_UTP.xlsx"},
    {"id":67273,"name":"16167.txt"},
    {"id":67274,"name":"EE_12_09_2013_Bcum_Searchall.png"}]')
```
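For reference, json_populate_recordset's first argument must be a value of the target record type — `null::json` will not work, which is likely the error here. A sketch, with an illustrative composite type matching the JSON keys:

```sql
-- Define a record shape matching the JSON objects, then expand
-- the array: each element becomes one row with columns (id, name).
create type attachment as (id integer, name text);

select *
from   json_populate_recordset(
           null::attachment,
           '[{"id":67272,"name":"EE_Quick_Changes_J_UTP.xlsx"},
             {"id":67273,"name":"16167.txt"},
             {"id":67274,"name":"EE_12_09_2013_Bcum_Searchall.png"}]'
       );
```

An existing table's row type (e.g. `null::mytable`) works equally well as the first argument.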

Writing to JSON column of Postgres database using Spring / JPA

Submitted by こ雲淡風輕ζ on 2019-12-03 11:36:12
Question: I have a table called "test" containing a column "sample_column" of type json in Postgres 9.3. I'm trying to write the following content into the column using Spring / JPA:

{"name":"Updated name"}

I read in other posts that I need to add a custom converter to map the string to the json type. This is the code I have now:

TestDAO.java:

```java
@Entity
@Table(name = "test")
public class TestDAO implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy
```

Indexing and querying high-dimensional data in PostgreSQL

Submitted by 故事扮演 on 2019-12-03 11:28:40
Question: I want to index high-dimensional data (128-dimensional vectors of integers in the range [0, 254]):

```
| id | vector             |
| 1  | { 1, 0, ..., 254 } |
| 2  | { 2, 128, ..., 1 } |
| .  | { 1, 0, ..., 252 } |
| n  | { 1, 2, ..., 251 } |
```

I saw that PostGIS implements R-trees. Can I use these trees in PostGIS to index and query multidimensional vectors in Postgres? I also saw that there is an index implementation for int arrays. Now I have questions about how to perform a query: can I perform a kNN search and a radius search on an integer array? I may also need to define my own distance function.
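One built-in option worth noting — a sketch, not from the original thread, and with caveats: the `cube` extension offers GiST-indexed nearest-neighbour and range queries on numeric vectors, but its default build caps dimensionality at 100 (short of the 128 mentioned here), and the `<->` kNN operator for cube arrived in a later release than the 9.3 era this question targets:

```sql
create extension if not exists cube;

create table vectors (id serial primary key, v cube);
create index on vectors using gist (v);

-- kNN: the 10 vectors nearest to the query point by Euclidean
-- distance, served from the GiST index via the <-> operator.
select id
from   vectors
order  by v <-> cube(array[1, 0, 254])
limit  10;
```

For true 128-dimensional data, either the cube source must be recompiled with a higher dimension limit or a dedicated vector-indexing extension used instead.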

Cast syntax to convert a sum to float

Submitted by 痴心易碎 on 2019-12-03 09:21:31
Question: Using PostgreSQL 9.3, I want to convert calculated values to the data type float. My first attempt:

```sql
SELECT float(SUM(Seconds))/-1323 AS Averag;
```

gives me this error: syntax error at or near "SUM". My second attempt:

```sql
SELECT to_float(SUM(Seconds))/-1323 AS Averag;
```

gives me this error: function to_float(bigint) does not exist.

Answer: You need to use the cast syntax:

```sql
SELECT CAST (SUM(Seconds) AS FLOAT)/-1323 AS Averag;
```

Erwin Brandstetter: There is also the shorthand cast syntax:

```sql
SELECT sum(seconds)::float / -1323 AS averag;
```

See the Postgres documentation on data type casts. It is not exact casting but a trick to do the job :) and
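The "trick" being alluded to at the truncated end is presumably multiplying by a fractional literal so the division is no longer integer division — a sketch:

```sql
-- 1.0 is a numeric literal, so sum(seconds) * 1.0 is numeric and
-- the division keeps its fractional part without an explicit cast.
SELECT sum(seconds) * 1.0 / -1323 AS averag;
```

The result type is `numeric` rather than `float`, which is often what is actually wanted for averages anyway.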

Querying inside Postgres JSON arrays

Submitted by 被刻印的时光 ゝ on 2019-12-03 06:57:28
Question: How would you go about searching for an element inside an array stored in a json column? (Update: also see the updated answer for 9.4 jsonb columns.) If I have a JSON document like this, stored in a json column named blob:

```json
{"name": "Wolf", "ids": [185603363281305602, 185603363289694211]}
```

what I'd like to be able to do is something like:

```sql
SELECT * FROM "mytable" WHERE 185603363289694211 = ANY("blob"->'ids');
```

and get all matching rows out. But this doesn't work because "blob"->'ids' returns JSON
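A common 9.3-era approach — a sketch; table and column names follow the question — is to unnest the array with json_array_elements and compare each element's text form:

```sql
-- json_array_elements turns the "ids" array into a set of json
-- values; comparing their text representation finds the match.
SELECT m.*
FROM   mytable m
WHERE  EXISTS (
    SELECT 1
    FROM   json_array_elements(m.blob->'ids') AS e
    WHERE  e::text = '185603363289694211'
);
```

On 9.4+ with a `jsonb` column, the containment operator (`blob @> '{"ids": [185603363289694211]}'`) is simpler and can use a GIN index.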