aggregate-functions

Rolling Daily Distinct Counts

天涯浪子 posted on 2019-12-06 01:34:09
We have a table with the following columns:

    SESSION_ID   USER_ID   CONNECT_TS
    ----------   -------   -------------------
    1            99        2013-01-01 2:23:33
    2            101       2013-01-01 2:23:55
    3            104       2013-01-01 2:24:41
    4            101       2013-01-01 2:24:43
    5            233       2013-01-01 2:25:01

We need to get a distinct count of users for each day and a count of "active users", defined as users that have used the application in the last 45 days. Here is what we have come up with, but I feel like there has to be a better way:

    select trunc(a.connect_ts)
         , count(distinct a.user_id) daily_users
         , count(distinct b.user_id) active_users
    …
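One way to complete a query of that shape (a sketch only, not the asker's final solution: the sessions table name and the Oracle-style date arithmetic are assumptions based on the use of trunc()) is to self-join each day back onto the preceding 45 days:

    -- Sketch: daily distinct users plus a rolling 45-day "active users" count.
    -- sessions(session_id, user_id, connect_ts) is assumed. The self-join can be
    -- heavy on large tables, but it mirrors the approach the question describes.
    SELECT TRUNC(a.connect_ts)       AS day,
           COUNT(DISTINCT a.user_id) AS daily_users,
           COUNT(DISTINCT b.user_id) AS active_users
    FROM   sessions a
    JOIN   sessions b
           ON  b.connect_ts >= TRUNC(a.connect_ts) - 44   -- 45-day window ending on "day"
           AND b.connect_ts <  TRUNC(a.connect_ts) + 1
    GROUP  BY TRUNC(a.connect_ts)
    ORDER  BY 1;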

Create array in SELECT

好久不见. posted on 2019-12-06 01:31:14
Question: I'm using PostgreSQL 9.1 and I have this data structure:

    A   B
    -------
    1   a
    1   a
    1   b
    1   c
    1   c
    1   c
    1   d
    2   e
    2   e

I need a query that produces this result:

    1   4   {{c,3},{a,2},{b,1},{d,1}}
    2   1   {{e,2}}

That is, for A=1, the partial counts per value (3 rows with the c value, 2 rows with the a value, .....). The result should contain: the distinct values of column "A"; the count related to that "A" value; and an array with all the elements related to the "A" value together with each element's own count. The sort needed for the array is …
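One way to get close to that output (a sketch only; the table name t is an assumption, and aggregating arrays with array_agg needs PostgreSQL 9.5 or later, so on 9.1 a custom aggregate or text concatenation would be required instead):

    -- Sketch: count rows per (A, B), then fold the per-B counts into an array
    -- ordered by count descending. t(a, b) is an assumed table name.
    SELECT a,
           SUM(cnt)::int AS total,
           array_agg(ARRAY[b::text, cnt::text] ORDER BY cnt DESC, b) AS detail
    FROM  (SELECT a, b, COUNT(*) AS cnt
           FROM   t
           GROUP  BY a, b) s
    GROUP  BY a
    ORDER  BY a;

For a = 1 this yields a two-dimensional text array of the form {{c,3},{a,2},{b,1},{d,1}}.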

How to count 2 different data in one query

耗尽温柔 posted on 2019-12-06 01:11:37
Question: I need to calculate the number of occurrences of some data in two columns in one query. The DB is SQL Server 2005. For example, I have this table:

    Person: Id, Name, Age

And I need to get both of these results in one query:

    1. Count of persons that have the name 'John'
    2. Count of 'John's with age greater than 30

I can do that with subqueries in this way (it is only an example):

    SELECT (SELECT COUNT(Id) FROM Persons WHERE Name = 'John'),
           (SELECT COUNT(Id) FROM Persons WHERE Name = 'John' AND Age > 30)
    FROM Persons

But …
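A common single-pass alternative is conditional aggregation (a sketch, assuming the Persons table from the example; not necessarily the thread's accepted answer):

    -- Sketch: one scan of Persons, counting each condition with CASE.
    -- COUNT ignores the NULL produced when a CASE has no matching branch.
    SELECT COUNT(CASE WHEN Name = 'John' THEN 1 END)              AS johns,
           COUNT(CASE WHEN Name = 'John' AND Age > 30 THEN 1 END) AS johns_over_30
    FROM   Persons;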

multiply(num) aggregate function in postgresql

末鹿安然 posted on 2019-12-05 22:13:14
This could be incredibly simple, but the documentation is quiet on it. Is there a way to aggregate a column via the multiplication operator in PostgreSQL? I know I can do count(column) or sum(column), but is there a multiply(column) or product(column) function that I can use? If not, any ideas how to achieve it? I'm using Postgres 9.1. Regards, Hassan

Sure, just define an aggregate over the base multiplication function. E.g. for bigint:

    CREATE AGGREGATE mul(bigint) (
        SFUNC = int8mul,
        STYPE = bigint
    );

Example:

    regress=> SELECT mul(x) FROM generate_series(1,5) x;
     mul
    -----
     120
    (1 row)

See CREATE …
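The same idea extends to other types by picking the matching built-in operator function; a hedged variant for numeric input (not from the original answer) might look like this:

    -- Sketch: a product aggregate for numeric, using the built-in numeric_mul
    -- function (the function behind the * operator for numeric) as the
    -- state-transition function.
    CREATE AGGREGATE mul(numeric) (
        SFUNC = numeric_mul,
        STYPE = numeric
    );

    -- Usage:
    SELECT mul(x::numeric) FROM generate_series(1, 5) AS x;  -- 120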

PostgreSQL equivalent for SQL Server GROUP BY WITH ROLLUP

风格不统一 posted on 2019-12-05 19:24:28
Question: I have a SQL Server query that uses the ROLLUP clause while grouping, and I want an equivalent query in Postgres. The query in SQL Server is:

    SELECT (CASE WHEN acnt_dba_name Is Null THEN 'Total' ELSE acnt_dba_name END) as account,
           (CASE WHEN evt_name Is Null THEN '' ELSE evt_name END) as event,
           COUNT(CASE reg_is_complete WHEN true THEN 1 ELSE Null END) as regsComplete,
           COUNT(CASE WHEN reg_frn_pro_id > 0 AND reg_is_complete = false THEN 1 ELSE Null END) as regsInComplete,
           COUNT(CASE WHEN reg_frn…
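For the grouping part specifically (a sketch, not the asker's full query; the table name registrations is an assumption): PostgreSQL 9.5 and later support ROLLUP natively, while older versions typically emulate it with a UNION ALL of the grouped query and the totals query.

    -- Sketch: native ROLLUP in PostgreSQL 9.5+. The NULLs produced by the
    -- rollup's subtotal rows are relabeled the same way the SQL Server query does.
    SELECT COALESCE(acnt_dba_name, 'Total')            AS account,
           COALESCE(evt_name, '')                      AS event,
           COUNT(CASE WHEN reg_is_complete THEN 1 END) AS regsComplete
    FROM   registrations
    GROUP  BY ROLLUP (acnt_dba_name, evt_name);

If the underlying columns can themselves be NULL, the GROUPING() function distinguishes real NULLs from the rollup's subtotal rows more reliably than COALESCE.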

Missing 'Median' Aggregate Function in Django?

守給你的承諾、 posted on 2019-12-05 17:30:07
Question: The development version of Django has aggregate functions like Avg, Count, Max, Min, StdDev, Sum, and Variance (link text). Is there a reason Median is missing from the list? Implementing one seems like it would be easy. Am I missing something? How much are the aggregate functions doing behind the scenes?

Answer 1: Because median isn't a SQL aggregate. See, for example, the list of PostgreSQL aggregate functions and the list of MySQL aggregate functions.

Answer 2: Here's your missing function. Pass it …
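As a side note (not part of the original answers): newer PostgreSQL versions (9.4+) do expose a median through an ordered-set aggregate, which illustrates why it doesn't fit the plain aggregate-function mold. The table and column names below are placeholders.

    -- Sketch: a median via the ordered-set aggregate percentile_cont.
    -- "products" and "price" are hypothetical names.
    SELECT percentile_cont(0.5) WITHIN GROUP (ORDER BY price) AS median_price
    FROM   products;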

What is causing a scope parameter error in my SSRS chart?

邮差的信 posted on 2019-12-05 17:14:36
Why am I getting this error in my chart? (Chart Image) I am using these expressions in the chart:

    Series:   =Sum(Fields!Mins_Att.Value)/Sum(Fields!Mins_Poss.Value)
    Series 1: =Sum(Fields!Mins_Att.Value, "Chart2_CategoryGroup2")/Sum(Fields!Mins_Poss.Value, "Chart2_CategoryGroup2")

and I am getting this error:

    The Y expression for the Chart has a scope parameter that is not valid for an
    aggregate function. The scope parameter must be set to a string constant that
    is equal to the name of a group, data region or name of dataset.

The scope of "Chart2_CategoryGroup2" doesn't exist in the report.

Cumulative sum over days

别说谁变了你拦得住时间么 posted on 2019-12-05 16:58:16
I have a MySQL table like the following:

    date         count
    2010-01-01   5
    2010-01-02   6
    2010-01-03   7

How can I accumulate the sum of each day onto the next one? So the result would look like:

    date         acum per day
    2010-01-01   5
    2010-01-02   11
    2010-01-03   18

I think I need some kind of for(each date)... but I have no clue.

This is the final query I used, following the answer from Eric (thanks):

    SELECT t1.dia, sum(t2.operacions), sum(t2.amount)
    FROM (SELECT count(*) operations, sum(amount), date(b.timestamp) dia
          FROM transactions b GROUP BY date(b.timestamp)) t1
    INNER JOIN (SELECT count(*) operations, sum(amount), date(b.timestamp) dia …
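The general pattern behind such a query (a sketch only; the table name daily and its columns are assumptions) is a self-join in which each row is joined to every row up to and including its own date:

    -- Sketch: classic running total via self-join (works on old MySQL versions).
    -- On MySQL 8+ a window function is simpler: SUM(`count`) OVER (ORDER BY `date`).
    SELECT t1.`date`,
           SUM(t2.`count`) AS acum_per_day
    FROM   daily t1
    JOIN   daily t2 ON t2.`date` <= t1.`date`
    GROUP  BY t1.`date`
    ORDER  BY t1.`date`;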

MySql Query: include days that have COUNT(id) == 0 but only in the last 30 days

荒凉一梦 posted on 2019-12-05 15:59:11
I am running a query to get the number of builds per day from our database for the last 30 days, but it has become necessary to also mark the days where there were no builds. In my WHERE clause I use submittime to determine whether there were builds; how could I modify this to include days that have COUNT(id) == 0, but only in the last 30 days?

Original query:

    SELECT COUNT(id) AS 'Past-Month-Builds',
           CONCAT(MONTH(submittime), '-', DAY(submittime)) as 'Month-Day'
    FROM builds
    WHERE DATE(submittime) >= DATE_SUB(CURDATE(), INTERVAL 30 day)
    GROUP BY MONTH(submittime), DAY(submittime);

What I've tried: …
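The usual way around this (a sketch, not necessarily the thread's accepted answer; the calendar table is an assumption you would have to create and populate with one row per date) is to LEFT JOIN a table of dates against builds, so empty days survive the join with a zero count:

    -- Sketch: a calendar table covering the last 30 days, left-joined to builds.
    -- COUNT(b.id) counts only matched rows, so days with no builds show 0.
    SELECT c.cal_date        AS `Month-Day`,
           COUNT(b.id)       AS `Past-Month-Builds`
    FROM   calendar c
    LEFT   JOIN builds b ON DATE(b.submittime) = c.cal_date
    WHERE  c.cal_date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
      AND  c.cal_date <= CURDATE()
    GROUP  BY c.cal_date
    ORDER  BY c.cal_date;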

Can not ORDER BY an AVG value with certain GROUP BY criteria in MySQL

浪子不回头ぞ posted on 2019-12-05 13:18:49
I have a table data_summaries. It has columns such as item_id INT(11), user_grouping TEXT and value DECIMAL(10,2). If I try to make a query that groups the results by user_grouping and orders them by the AVG of value, that fails:

    SELECT user_grouping, AVG(value) AS avg_value, SUM(value) AS sum_value
    FROM data_summaries
    GROUP BY user_grouping
    ORDER BY avg_value

    +---------------+-----------+-----------+
    | user_grouping | avg_value | sum_value |
    +---------------+-----------+-----------+
    | London        | 50.609733 | 18978.65  |
    | Paris         | 50.791733 | 19046.90  |
    | New York      | 51.500400 |  2575.02  |
    | …
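The excerpt is cut off before the actual failure and its resolution, so the following is only a generic workaround sketch for ordering by an aggregate in MySQL, not the thread's solution: compute the aggregates in a derived table first, then sort the already-materialized values in the outer query.

    -- Sketch: aggregate in a derived table, then ORDER BY the computed column.
    SELECT t.user_grouping, t.avg_value, t.sum_value
    FROM  (SELECT user_grouping,
                  AVG(`value`) AS avg_value,
                  SUM(`value`) AS sum_value
           FROM   data_summaries
           GROUP  BY user_grouping) AS t
    ORDER  BY t.avg_value;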