gaps-and-islands

Finding Gaps in Timestamps for Multiple Users in PostgreSQL

好久不见. Submitted on 2021-01-28 11:48:59
Question: I am working with a dataset containing Check-In and Check-Out times for multiple office rooms over the last 5 years. One of the projects I was asked to work on was calculating the amount of time each room is busy and vacant over various time ranges (daily, weekly, monthly, etc.), assuming normal operational hours (8am to 5pm). A sample of the dataset for two days looks like this:

room_id       start_dt             end_dt
Room: Room 3  2019-05-04 09:00:00  2019-05-04 11:30:00
Room: Room 3  2019-05-04 11:30:00  2019
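This is the canonical gaps-and-islands case: each room's bookings are the islands, and the vacant periods between them are the gaps. A minimal PostgreSQL sketch, assuming a hypothetical table room_log(room_id, start_dt, end_dt) and ignoring the 8am-to-5pm clamping for brevity:

-- Pair each check-out with the next check-in for the same room on the
-- same day; any positive difference is a vacant gap.
SELECT room_id,
       end_dt              AS vacant_from,
       next_start          AS vacant_until,
       next_start - end_dt AS vacant_for
FROM (
    SELECT room_id, start_dt, end_dt,
           LEAD(start_dt) OVER (PARTITION BY room_id, start_dt::date
                                ORDER BY start_dt) AS next_start
    FROM room_log
) AS t
WHERE next_start > end_dt;

Clamping to operational hours would additionally mean bounding each gap with greatest(end_dt, 08:00) and least(next_start, 17:00) of its day, plus synthetic gaps before the first and after the last booking of each day.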

Partial sum between different records using SQL 2008

早过忘川. Submitted on 2021-01-28 11:15:36
Question: I'm trying to solve this issue in SQL 2008. I have a table like this:

DECLARE @table TABLE (
    TimeStamp DATETIME,
    val INT,
    typerow VARCHAR(3)
);
INSERT INTO @table(TimeStamp, val, typerow) VALUES
('2018-06-03 13:30:00.000', 6, 'out'),
('2018-06-03 14:10:00.000', 8, 'out'),
('2018-06-03 14:30:00.000', 3, 'in'),
('2018-06-03 15:00:00.000', 9, 'out'),
('2018-06-03 15:30:00.000', 4, 'out'),
('2018-06-03 16:00:00.000', 2, 'out'),
('2018-06-03 17:05:00.000', 8, 'in'),
('2018-06-03 17:30:00.000', 0,
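The excerpt cuts off before the expected output, but problems of this shape are usually solved by turning each run of rows into an island keyed by a running count of the 'in' rows. SQL Server 2008 has no LAG and no ordered windowed SUM, so a correlated subquery does the counting; a sketch, assuming the goal is one summed 'out' total per group:

-- grp = how many 'in' rows occur at or before this row; every row
-- between two 'in' markers therefore shares the same group number.
SELECT grp,
       SUM(CASE WHEN typerow = 'out' THEN val ELSE 0 END) AS partial_sum
FROM (
    SELECT t.TimeStamp, t.val, t.typerow,
           (SELECT COUNT(*)
            FROM @table x
            WHERE x.typerow = 'in'
              AND x.TimeStamp <= t.TimeStamp) AS grp
    FROM @table t
) AS s
GROUP BY grp;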

MySQL Group from quarters to periods

半腔热情. Submitted on 2021-01-28 06:02:22
Question: I have a table like this:

Person (smallint(5))   act_time (datetime)
1                      2020-05-29 07:00:00
1                      2020-05-29 07:15:00
1                      2020-05-29 07:30:00
2                      2020-05-29 07:15:00
2                      2020-05-29 07:30:00
1                      2020-05-29 10:30:00
1                      2020-05-29 10:45:00

The table above is an example with 2 different persons, and there is a row for each quarter of an hour they are at work... What is the best way in MySQL to "convert" this table to another table with a column for "person", a column for "start" and one for "stop"? So the result is
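Collapsing the quarter rows into start/stop periods is a textbook islands problem. A sketch for MySQL 8+ (window functions are required), assuming a hypothetical table work_quarters(person, act_time) and assuming each row marks the start of a 15-minute quarter, so the stop time is the last quarter plus 15 minutes:

-- Subtracting 15 minutes per row number turns every consecutive run of
-- quarters into a constant 'island' value that can be grouped on.
SELECT person,
       MIN(act_time)                      AS `start`,
       MAX(act_time) + INTERVAL 15 MINUTE AS `stop`
FROM (
    SELECT person, act_time,
           TIMESTAMPADD(MINUTE,
                        -15 * ROW_NUMBER() OVER (PARTITION BY person
                                                 ORDER BY act_time),
                        act_time) AS island
    FROM work_quarters
) AS q
GROUP BY person, island
ORDER BY person, `start`;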

Tracking a continuous instance of absence SQL

懵懂的女人. Submitted on 2021-01-07 02:41:15
Question: I have a table called sickness which is a record of when an employee is off work sick. It looks like this:

Date_Sick   Employee_Number
----------  ---------------
2020-06-08  001
2020-06-10  001
2020-06-11  001
2020-06-12  001
2020-06-08  002
2020-06-09  002

What I'm trying to do is add a new column with a unique ID to identify a unique instance of absence. A unique instance of absence is one that runs over consecutive weekdays with no breaks. Hence my output table should look like this: Date_Sick
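The standard trick here is to subtract a row number from each date: within a consecutive run the result is constant, so it can serve as an island anchor. A SQL Server-flavored sketch (the dialect isn't stated in the excerpt), using the sickness table named above; note that honoring "consecutive weekdays" strictly would also need weekend-skipping logic, which is omitted here:

-- anchor is identical for every row of a calendar-consecutive run, so a
-- dense rank over (employee, anchor) yields one ID per absence instance.
SELECT Date_Sick, Employee_Number,
       DENSE_RANK() OVER (ORDER BY Employee_Number, anchor) AS Absence_ID
FROM (
    SELECT Date_Sick, Employee_Number,
           DATEADD(DAY,
                   -ROW_NUMBER() OVER (PARTITION BY Employee_Number
                                       ORDER BY Date_Sick),
                   Date_Sick) AS anchor
    FROM sickness
) AS s
ORDER BY Employee_Number, Date_Sick;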

How to make LAG() ignore NULLS in SQL Server?

醉酒当歌. Submitted on 2021-01-04 02:54:46
Question: Does anyone know how to replace the nulls in a column with a string until it hits a new string, at which point that string replaces all null values below it? I have a column that looks like this:

Original column (PAST_DUE_COL):
91 or more days past due
NULL
NULL
61-90 days past due
NULL
NULL
31-60 days past due
NULL
0-30 days past due
NULL
NULL
NULL

Expected result column (PAST_DUE_COL):
91 or more days past due
91 or more days past due
91 or more days past due
61-90 days past due
61-90 days past due
61-90 days
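Where LAG(...) IGNORE NULLS is unavailable, a running COUNT of the non-null values gives the same forward-fill effect: COUNT skips NULLs, so the counter advances only at a non-null row, and every NULL row below it falls into that row's group. A sketch, assuming a hypothetical table t with an ordering column id (some column must define row order for this to be deterministic):

-- grp changes only where PAST_DUE_COL is non-null; MAX over the group
-- then copies that value onto the NULL rows beneath it.
SELECT id, PAST_DUE_COL,
       MAX(PAST_DUE_COL) OVER (PARTITION BY grp) AS PAST_DUE_FILLED
FROM (
    SELECT id, PAST_DUE_COL,
           COUNT(PAST_DUE_COL) OVER (ORDER BY id) AS grp
    FROM t
) AS s
ORDER BY id;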

SQL Server : find recent consecutive records that are greater than 5

核能气质少年. Submitted on 2020-12-13 04:52:27
Question: I need to write a query that shows the result broken down by FormID for rows that have a value greater than 5, based on the most recent LogDate. If, going back from the most recent LogDate, there was a value less than 5, the query should display only the values greater than 5 from that point on, since a value under 5 is a 'reset', if you will. I am essentially looking for the recent consecutive LogDate records that are greater than 5. Say we have the following record set:

FormID  Value  LogDate
-----------------------
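One way to read the requirement: for each FormID, keep only the rows that are newer than the most recent 'reset' row (a value of 5 or less). A T-SQL sketch, assuming a hypothetical table logs(FormID, Value, LogDate); the sentinel date covers forms that have never reset:

-- Rows survive only if they are above 5 AND logged after the latest
-- sub-5 row for the same form (or there is no such row at all).
SELECT t.FormID, t.Value, t.LogDate
FROM logs AS t
WHERE t.Value > 5
  AND t.LogDate > COALESCE(
        (SELECT MAX(x.LogDate)
         FROM logs AS x
         WHERE x.FormID = t.FormID
           AND x.Value <= 5),
        '1900-01-01');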