How to calculate DAU/MAU with BigQuery (engagement)

后悔当初 2021-01-16 06:15

DAU and MAU (daily active users and monthly active users) are an established way of measuring user engagement.

How can I get these numbers using SQL and Google BigQuery?

2 Answers
  •  不要未来只要你来
    2021-01-16 06:44

    2019 standard SQL update:

    • https://stackoverflow.com/a/49866033/132438

    (to understand the utility of DAU/MAU see articles like http://blog.compariscope.wefi.com/mobile-app-usage-dau-mau)
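
    The queries below are written in BigQuery legacy SQL. If you are on standard SQL, the mechanical translation is small; as a rough sketch (my own mapping, not taken from the linked answer), here are the main substitutions plus a minimal DAU-only query:

    #standardSQL
    -- legacy [fh-bigquery:reddit_comments.2015_09]  ->  `fh-bigquery.reddit_comments.2015_09`
    -- legacy SEC_TO_TIMESTAMP(x)                    ->  TIMESTAMP_SECONDS(x)
    -- legacy EXACT_COUNT_DISTINCT(x)                ->  COUNT(DISTINCT x)
    -- legacy FIRST(x)                               ->  ANY_VALUE(x)
    SELECT DATE(TIMESTAMP_SECONDS(created_utc)) day, COUNT(DISTINCT author) dau
    FROM `fh-bigquery.reddit_comments.2015_09`
    WHERE subreddit='AskReddit'
    GROUP BY day
    ORDER BY day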

    Let's play with the reddit comments data stored in BigQuery. We want to find the DAU/MAU ratio for the 'AskReddit' subreddit during September, on a daily rolling basis:

    SELECT day, dau, mau, INTEGER(100*dau/mau) daumau
    FROM (
      -- daily distinct commenters (DAU), joined to the 30-day MAU ending that day
      SELECT day, EXACT_COUNT_DISTINCT(author) dau, FIRST(mau) mau
      FROM (
        SELECT DATE(SEC_TO_TIMESTAMP(created_utc)) day, author
        FROM [fh-bigquery:reddit_comments.2015_09]
        WHERE subreddit='AskReddit') a
      JOIN (
        -- distinct commenters over the 30 days ending on each candidate stop day;
        -- note: in legacy SQL the comma between tables means UNION ALL, pulling in August
        SELECT stopday, EXACT_COUNT_DISTINCT(author) mau
        FROM (SELECT created_utc, subreddit, author FROM [fh-bigquery:reddit_comments.2015_09], [fh-bigquery:reddit_comments.2015_08]) a
        CROSS JOIN (
          -- every day in September acts as a candidate stop day
          SELECT DATE(SEC_TO_TIMESTAMP(created_utc)) stopday
          FROM [fh-bigquery:reddit_comments.2015_09]
          GROUP BY 1
        ) b
        WHERE subreddit='AskReddit'
        AND SEC_TO_TIMESTAMP(created_utc) BETWEEN DATE_ADD(stopday, -30, 'day') AND TIMESTAMP(stopday)
        GROUP BY 1
      ) b
      ON a.day=b.stopday
      GROUP BY 1
    )
    ORDER BY 1
    

    This query gets the DAU for each day in September, and also looks into August data to get the MAU for each 30-day period ending on each of those days. That takes a lot of processing (30x), and we can get almost equivalent results if we calculate just one MAU for the whole of September and use that value as the denominator:

    SELECT day, dau, mau, INTEGER(100*dau/mau) daumau
    FROM (
      SELECT DATE(SEC_TO_TIMESTAMP(created_utc)) day, EXACT_COUNT_DISTINCT(author) dau, FIRST(mau) mau
      FROM [fh-bigquery:reddit_comments.2015_09] a
      CROSS JOIN (
        SELECT EXACT_COUNT_DISTINCT(author) mau
        FROM [fh-bigquery:reddit_comments.2015_09]
        WHERE subreddit='AskReddit'
      ) b
      WHERE subreddit='AskReddit'
      GROUP BY 1
    )
    ORDER BY 1
    

    That's a much simpler query that produces almost equivalent results, much faster.
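
    If you'd rather run this simpler fixed-MAU query in standard SQL, a rough equivalent might look like this (same public dataset as above; my own translation, not the linked one):

    #standardSQL
    SELECT day, dau, mau, CAST(100*dau/mau AS INT64) daumau
    FROM (
      SELECT DATE(TIMESTAMP_SECONDS(created_utc)) day,
             COUNT(DISTINCT author) dau,
             ANY_VALUE(mau) mau  -- mau is constant: one row comes from the CROSS JOIN
      FROM `fh-bigquery.reddit_comments.2015_09` a
      CROSS JOIN (
        SELECT COUNT(DISTINCT author) mau
        FROM `fh-bigquery.reddit_comments.2015_09`
        WHERE subreddit='AskReddit'
      ) b
      WHERE a.subreddit='AskReddit'
      GROUP BY day
    )
    ORDER BY day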

    Now to get an average value for this subreddit for the month:

    SELECT ROUND(100*AVG(dau/mau), 2) daumau
    FROM (
      SELECT DATE(SEC_TO_TIMESTAMP(created_utc)) day, EXACT_COUNT_DISTINCT(author) dau, FIRST(mau) mau
      FROM [fh-bigquery:reddit_comments.2015_09] a
      CROSS JOIN (
        SELECT EXACT_COUNT_DISTINCT(author) mau
        FROM [fh-bigquery:reddit_comments.2015_09]
        WHERE subreddit='AskReddit'
      ) b
      WHERE subreddit='AskReddit'
      GROUP BY 1
    )
    

    This tells us that 'AskReddit' had an average DAU/MAU engagement of 8.95% during September.

    Last stop: how to compare engagement across various subreddits:

    SELECT ROUND(100*AVG(dau)/MAX(mau), 2) avg_daumau, MAX(mau) mau, subreddit
    FROM (
      SELECT a.subreddit, DATE(SEC_TO_TIMESTAMP(created_utc)) day,
             EXACT_COUNT_DISTINCT(author) dau, FIRST(mau) mau
      FROM [fh-bigquery:reddit_comments.2015_09] a
      JOIN (
        SELECT EXACT_COUNT_DISTINCT(author) mau, subreddit
        FROM [fh-bigquery:reddit_comments.2015_09]
        GROUP BY 2
      ) b
      ON a.subreddit=b.subreddit
      WHERE mau>50000
      GROUP BY 1, 2
    )
    GROUP BY subreddit
    ORDER BY 1
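
    And, under the same assumptions as the sketch above, a rough standard SQL equivalent of this comparison query:

    #standardSQL
    SELECT subreddit, ROUND(100*AVG(dau)/MAX(mau), 2) avg_daumau, MAX(mau) mau
    FROM (
      SELECT subreddit, DATE(TIMESTAMP_SECONDS(created_utc)) day,
             COUNT(DISTINCT author) dau, ANY_VALUE(mau) mau
      FROM `fh-bigquery.reddit_comments.2015_09` a
      JOIN (
        SELECT subreddit, COUNT(DISTINCT author) mau
        FROM `fh-bigquery.reddit_comments.2015_09`
        GROUP BY subreddit
      ) b USING (subreddit)
      WHERE mau > 50000  -- only subreddits with large monthly audiences
      GROUP BY subreddit, day
    )
    GROUP BY subreddit
    ORDER BY avg_daumau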
    
