DAU and MAU (daily active users and monthly active users) are an established way of measuring user engagement.
How can I get these numbers using SQL and Google BigQuery?
In order to analyze trends without waiting for a "full month", there is a need to look at each day together with its preceding 30 days. I am afraid that the suggested solution (by Felipe Hoffa) changes the question, not just the data-retrieval query.
You can find my take on the issue below. I am not sure what it does under the hood in terms of performance, and it is not very fast (much slower than Felipe's...), but it covers the business need as I understand it. Still, if you could offer a solution that optimizes this approach, that would be great.
Please note: no joins or sub-aggregates are used, just SPLIT, GROUP BY, and date manipulation.
SELECT
*,
DAU/WAU AS DAU_WAU,
DAU/MAU AS DAU_MAU,
FROM (
SELECT
COALESCE(DAUDate,WAUDate,MAUDate) AS ReportDate,
subreddit,
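-- Count distinct authors per report date and subreddit, restricted to the rows whose window column is populated (same day for DAU, trailing 7 days for WAU, trailing 30 days for MAU)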
EXACT_COUNT_DISTINCT(IF(DAUDate IS NOT NULL,author,NULL)) AS DAU,
EXACT_COUNT_DISTINCT(IF(WAUDate IS NOT NULL,author,NULL)) AS WAU,
EXACT_COUNT_DISTINCT(IF(MAUDate IS NOT NULL,author,NULL)) AS MAU,
FROM (
SELECT
DDate,
subreddit,
author,
Ind,
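-- A comment made on DDate is pushed forward to report date DDate+Ind: it backs DAU only when Ind=0, WAU while Ind<7, and MAU while Ind<30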
DATE(IF(Ind=0,DDate,NULL)) AS DAUDate,
DATE(IF(Ind<7,DATE_ADD(DDate,Ind,"Day"),NULL)) AS WAUDate,
DATE(IF(Ind<30,DATE_ADD(DDate,Ind,"Day"),NULL)) AS MAUDate
FROM (
SELECT
DATE(SEC_TO_TIMESTAMP(created_utc)) AS DDate,
subreddit,
author,
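-- SPLIT yields a repeated field, so each comment row fans out into 31 rows with Ind = 0..30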
INTEGER(SPLIT("0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30",",")) AS Ind
FROM
[fh-bigquery:reddit_comments.2015_09],
[fh-bigquery:reddit_comments.2015_08] ))
WHERE
COALESCE(DAUDate,WAUDate,MAUDate) BETWEEN "2015-09-01" AND "2015-09-30"
GROUP BY
ReportDate,
subreddit
HAVING
MAU > 50000)
ORDER BY
2,
1 DESC
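For anyone on Standard SQL: the same fan-out can be sketched with UNNEST(GENERATE_ARRAY(...)) instead of SPLIT. This is only a rough, untested sketch against the same fh-bigquery tables; the September date range and the 50000-author cutoff are my assumptions about the intended filters, and I have not compared its cost to the legacy version.

    -- Sketch only: Standard SQL version of the rolling-window fan-out above.
    WITH comments AS (
      SELECT
        DATE(TIMESTAMP_SECONDS(created_utc)) AS ddate,
        subreddit,
        author
      FROM `fh-bigquery.reddit_comments.2015_09`
      UNION ALL
      SELECT
        DATE(TIMESTAMP_SECONDS(created_utc)),
        subreddit,
        author
      FROM `fh-bigquery.reddit_comments.2015_08`
    ),
    fanned AS (
      -- Project each comment onto every report date whose trailing 30-day window contains it.
      SELECT
        DATE_ADD(ddate, INTERVAL ind DAY) AS report_date,
        subreddit,
        author,
        ind
      FROM comments, UNNEST(GENERATE_ARRAY(0, 29)) AS ind
    )
    SELECT
      report_date,
      subreddit,
      COUNT(DISTINCT IF(ind = 0, author, NULL)) AS DAU,
      COUNT(DISTINCT IF(ind < 7, author, NULL)) AS WAU,
      COUNT(DISTINCT author) AS MAU,
      SAFE_DIVIDE(COUNT(DISTINCT IF(ind = 0, author, NULL)),
                  COUNT(DISTINCT IF(ind < 7, author, NULL))) AS DAU_WAU,
      SAFE_DIVIDE(COUNT(DISTINCT IF(ind = 0, author, NULL)),
                  COUNT(DISTINCT author)) AS DAU_MAU
    FROM fanned
    WHERE report_date BETWEEN DATE "2015-09-01" AND DATE "2015-09-30"  -- assumed report month
    GROUP BY report_date, subreddit
    HAVING MAU > 50000  -- assumed cutoff for large subreddits
    ORDER BY subreddit, report_date DESC

Here GENERATE_ARRAY(0, 29) plays the role of the SPLIT list, and SAFE_DIVIDE avoids division-by-zero on report dates with no same-day authors.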