time-series

R growth rate calculation week over week on daily timeseries data

两盒软妹~` submitted on 2020-04-30 10:56:38
Question: I'm trying to calculate w/w growth rates entirely in R. I could use Excel, or preprocess with Ruby, but that's not the point. data.frame example:

       date        gpv    type
    1  2013-04-01  12900  back office
    2  2013-04-02  16232  back office
    3  2013-04-03  10035  back office

I want to do this factored by 'type', and I need to wrap up the Date-typed column into weeks and then calculate the week-over-week growth. I think I need to use ddply to group by week, with a custom function that determines if a date is in a…
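Not an answer from the thread, but a minimal pandas sketch of the same two steps the question describes: bucket dates into weeks, then take the lagged percentage change per group. Column names come from the excerpt; the data values and the Monday-to-Sunday (W-SUN) week convention are illustrative assumptions.

```python
import pandas as pd

# Illustrative data in the shape shown in the question (date, gpv, type).
df = pd.DataFrame({
    "date": pd.to_datetime(["2013-04-01", "2013-04-02", "2013-04-03",
                            "2013-04-08", "2013-04-09", "2013-04-10"]),
    "gpv":  [12900, 16232, 10035, 14000, 17000, 11000],
    "type": ["back office"] * 6,
})

# Bucket each date into its Monday-to-Sunday week, sum gpv per (type, week),
# then take the percentage change within each type: week-over-week growth.
df["week"] = df["date"].dt.to_period("W-SUN")
weekly = df.groupby(["type", "week"])["gpv"].sum()
wow = weekly.groupby(level="type").pct_change()
print(wow)
```

In R the analogous steps would be cutting the dates into weeks and computing the lagged ratio within each type; the sketch above only shows the arithmetic, not the ddply mechanics the question asks about.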

Creating a TimeseriesGenerator with multiple inputs

落爺英雄遲暮 submitted on 2020-04-18 05:48:05
Question: I'm trying to train an LSTM model on daily fundamental and price data from ~4000 stocks. Due to memory limits, I cannot hold everything in memory after converting the data to sequences for the model, which leads me to use a generator such as the TimeseriesGenerator from Keras / TensorFlow. The problem is that if I use the generator on all of my data stacked together, it creates sequences of mixed stocks; see the example below with a sequence length of 5, where Sequence 3 would include the last 4…
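Not from the question, but one way to sidestep the mixing problem is to generate windows per stock instead of over the stacked array. A small NumPy sketch (the `per_stock_windows` helper and the tickers are hypothetical; Keras' TimeseriesGenerator itself is not used here):

```python
import numpy as np

def per_stock_windows(panel, seq_len):
    """Yield (window, target) pairs per stock so no window spans two stocks.

    panel: dict mapping a (hypothetical) ticker to an array of shape
    (timesteps, features). Sliding a single generator over the stacked
    stocks would instead produce windows that cross stock boundaries.
    """
    for ticker, arr in panel.items():
        for i in range(len(arr) - seq_len):
            yield arr[i:i + seq_len], arr[i + seq_len]

panel = {
    "AAA": np.arange(14, dtype=float).reshape(7, 2),  # 7 days, 2 features
    "BBB": np.arange(12, dtype=float).reshape(6, 2),  # 6 days, 2 features
}
pairs = list(per_stock_windows(panel, seq_len=5))
print(len(pairs))  # (7-5) windows from AAA + (6-5) from BBB = 3
```

The same idea works with Keras: build one TimeseriesGenerator per stock and chain them, so memory holds only one stock's sequences at a time.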

Time series prediction using 1D Conv

非 Y 不嫁゛ submitted on 2020-04-18 05:42:36
Question: I am trying to implement a time series for video prediction using 1D convolution in Keras. I have extracted feature vectors from a pre-trained CNN model; each feature vector has size 2048, so if a video contains 100 frames the input size is (100, 2048). The problem is that each video has a different number of frames, and I am unable to handle the variable input size in Keras. My code looks like this:

    #video 1 has 100 frames (100,2048)
    #video 2 has 52 frames (52,2048)
    #video 3…
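Not part of the question, but the usual workaround for variable-length inputs is to zero-pad every video to a common length before batching (a masking layer or per-video batches of size 1 are alternatives). A NumPy sketch with a hypothetical `pad_videos` helper:

```python
import numpy as np

def pad_videos(videos, feat_dim=2048):
    """Zero-pad variable-length (frames, feat_dim) arrays to one common
    length so they can be stacked into a single batch for a Conv1D model."""
    max_len = max(len(v) for v in videos)
    batch = np.zeros((len(videos), max_len, feat_dim), dtype=np.float32)
    for i, v in enumerate(videos):
        batch[i, :len(v)] = v  # real frames first, zeros after
    return batch

# Frame counts taken from the question's comments; contents are dummy ones.
videos = [np.ones((100, 2048)), np.ones((52, 2048)), np.ones((76, 2048))]
batch = pad_videos(videos)
print(batch.shape)  # (3, 100, 2048)
```

A Conv1D layer can then be declared with input shape (None, 2048) so the padded length does not have to be fixed at build time.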

creating list of ts objects based on list of dataframes in R

穿精又带淫゛_ submitted on 2020-04-17 22:49:35
Question: I had previously posted a similar question, and this is an extension to it. The complexity of my data set has increased: it is now a list of data frames, where each data frame has the columns KEY, CAL, and OAS. KEY is a unique value that changes from one data frame to another; CAL is the timescale I have to use to make a ts object, expressed in week.year; and OAS is the number of units recorded for each week. The original data set has some 250 data frames, for…
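Not from the thread, but the R pattern of building one ts object per data frame maps directly onto a dict comprehension in pandas. A sketch with made-up KEY/CAL/OAS values (the CAL labels are illustrative, not real week.year codes):

```python
import pandas as pd

# Hypothetical stand-ins for the list of ~250 data frames with KEY/CAL/OAS.
frames = [
    pd.DataFrame({"KEY": "A", "CAL": ["2019.01", "2019.02", "2019.03"],
                  "OAS": [10, 12, 9]}),
    pd.DataFrame({"KEY": "B", "CAL": ["2019.01", "2019.02"],
                  "OAS": [5, 7]}),
]

# One weekly series per frame, keyed by its unique KEY and indexed by CAL:
# the pandas analogue of lapply-ing ts() over a list of data frames in R.
series_by_key = {f["KEY"].iloc[0]: f.set_index("CAL")["OAS"] for f in frames}
print(series_by_key["A"].loc["2019.02"])  # 12
```

In R the equivalent shape is lapply over the list, constructing ts(df$OAS, ...) with start/frequency derived from each frame's CAL column.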

Fuzzy Join Using Time and Geo-coordinates in R

依然范特西╮ submitted on 2020-04-16 03:22:51
Question: There are two data frames with disparate information. The only columns they have in common are datetime and lat/long fields. Can one create a third data frame using R or an R package (or possibly Python/pandas) that takes a subset of rows from both data frames by similar date and lat/long fields? The joins should be fuzzy, not exact: plus/minus an hour and a tenth of a degree. Input example:

    df_1
    Datetime             Latitude    Longitude
    2018-10-01 08:27:10  34.8014080  103.8499800
    2018-09-30 04:55:51  43.3367432  44…
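Not an answer from the thread, but a small pandas sketch of one fuzzy-join strategy: cross-join, then keep pairs within ±1 hour and ±0.1 degree. The df_2 rows and the second longitude (truncated in the excerpt) are made up, and `how="cross"` requires pandas 1.2+; for large frames a cross join is expensive and an interval-based join would scale better.

```python
import pandas as pd

df_1 = pd.DataFrame({
    "Datetime": pd.to_datetime(["2018-10-01 08:27:10", "2018-09-30 04:55:51"]),
    "Latitude": [34.8014080, 43.3367432],
    "Longitude": [103.8499800, 44.0],  # second value invented: truncated in the excerpt
})
df_2 = pd.DataFrame({
    "Datetime": pd.to_datetime(["2018-10-01 08:50:00"]),
    "Latitude": [34.75],
    "Longitude": [103.91],
})

# Cross join, then filter to pairs within the fuzzy tolerances.
pairs = df_1.merge(df_2, how="cross", suffixes=("_1", "_2"))
fuzzy = pairs[
    (pairs["Datetime_1"].sub(pairs["Datetime_2"]).abs() <= pd.Timedelta(hours=1))
    & (pairs["Latitude_1"].sub(pairs["Latitude_2"]).abs() <= 0.1)
    & (pairs["Longitude_1"].sub(pairs["Longitude_2"]).abs() <= 0.1)
]
print(len(fuzzy))  # 1: only the first df_1 row matches
```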

How to smooth timeseries with yearly data with lowess in python

不羁岁月 submitted on 2020-04-12 07:08:15
Question: I have some data that were recorded yearly, as follows:

    mydata = [0.6619346141815186, 0.7170140147209167, 0.692265510559082,
              0.6394098401069641, 0.6030995845794678, 0.6500746607780457,
              0.6013327240943909, 0.6273292303085327, 0.5865356922149658,
              0.6477396488189697, 0.5827181339263916, 0.6496025323867798,
              0.6589270234107971, 0.5498126149177551, 0.48638370633125305,
              0.5367399454116821, 0.517595648765564, 0.5171639919281006,
              0.47503289580345154, 0.6081966757774353, 0.5808742046356201, 0…
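Not from the question, but for context, a minimal self-contained LOWESS-style smoother: a tricube-weighted local linear fit at each point, without the robustifying iterations of `statsmodels.nonparametric.smoothers_lowess.lowess`, which is what one would normally reach for here. The data are the first ten values from the question.

```python
import numpy as np

def lowess_smooth(y, frac=0.5):
    """LOWESS sketch: at each point, fit a line to the nearest frac*n
    neighbours with tricube weights and evaluate it at that point."""
    y = np.asarray(y, dtype=float)
    x = np.arange(len(y), dtype=float)
    k = max(2, int(np.ceil(frac * len(y))))
    out = np.empty_like(y)
    for i, xi in enumerate(x):
        d = np.abs(x - xi)
        idx = np.argsort(d)[:k]                          # k nearest points
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3      # tricube weights
        b = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))  # weighted line fit
        out[i] = np.polyval(b, xi)
    return out

mydata = [0.6619346141815186, 0.7170140147209167, 0.692265510559082,
          0.6394098401069641, 0.6030995845794678, 0.6500746607780457,
          0.6013327240943909, 0.6273292303085327, 0.5865356922149658,
          0.6477396488189697]
smooth = lowess_smooth(mydata, frac=0.5)
```

With yearly data, `frac` plays the same role as statsmodels' `frac` argument: the fraction of the series used in each local fit, i.e. the smoothing bandwidth.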

Why do I get an error message pointing to Inf values when trying to plot counts over time in R?

孤人 submitted on 2020-04-07 06:50:08
Question: I am using the code given in this answer to generate this plot:

    library(rvest)
    cachedir <- "cache"
    if (!dir.exists(cachedir)) dir.create(cachedir)
    URL <- "https://github.com/CSSEGISandData/COVID-19/tree/master/csse_covid_19_data/csse_covid_19_daily_reports"
    html <- read_html(URL)
    csvlinks <- html_nodes(html, "td span") %>%
      html_nodes("a") %>%
      html_attr("href") %>%
      grep("csv$", ., value = TRUE) %>%
      paste0("https://raw.githubusercontent.com", .) %>%
      gsub("/blob", "", .)
    csvfiles <- file.path…
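Not the R answer itself, but the shape of the usual fix, sketched in NumPy: an error pointing to Inf values typically means some transformed values are non-finite (e.g. the log of a zero count), and the cure is to drop or replace them before plotting, the analogue of subsetting with is.finite() in R. The counts below are made up.

```python
import numpy as np

# Daily case counts; zeros become -inf under a log transform, which is
# exactly the kind of non-finite value a plotting call then rejects.
counts = np.array([0, 1, 3, 0, 10, 25])
logged = np.log10(counts, where=counts > 0,
                  out=np.full(counts.shape, -np.inf))

# Keep only finite points before handing them to the plotting layer.
mask = np.isfinite(logged)
x = np.arange(len(counts))[mask]
y = logged[mask]
print(x, y)
```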

R: ts object shows weekly seasonality, but not xts (with same data and frequency parameter)

。_饼干妹妹 submitted on 2020-03-25 13:47:09
Question: I have a dataframe which captures daily data:

    $ dt: Date, format: "2019-01-01" "2019-01-02" "2019-01-03" "2019-01-04"
    $ new_user_growth: num NA -0.0254 -0.0469 -0.1257 0.3125

I converted the dataframe above to ts with:

    ts_h7_2019 <- ts(data = df$new_user_growth, frequency = 7)

I set frequency to 7 because I want to focus on weekly seasonality. When I decompose the data using mstl (an automatic STL algorithm), it shows a Seasonal7 component. So far so good. But then I found that working with xts is easier, so I…
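Not from the question, but to illustrate what frequency = 7 buys you, a NumPy sketch of a crude frequency-7 decomposition on synthetic data: detrend with a centred 7-day moving average, then average the residuals by weekday. The weekly pattern is made up, and mstl/STL in R do this far more carefully.

```python
import numpy as np

# Synthetic daily series: a fixed weekly pattern plus a linear trend,
# standing in for df$new_user_growth.
weekly_pattern = np.array([0.3, -0.1, -0.2, -0.1, 0.0, 0.2, -0.1])
y = np.tile(weekly_pattern, 8) + np.linspace(0, 1, 56)

# Centred 7-day moving average as the trend estimate, in the spirit of
# ts(..., frequency = 7) followed by a classical decomposition.
trend = np.convolve(y, np.ones(7) / 7, mode="same")
detrended = (y - trend)[7:-7]          # drop moving-average edge effects
seasonal = detrended.reshape(-1, 7).mean(axis=0)
print(np.round(seasonal, 2))           # recovers the weekly pattern
```

The key point for the xts comparison: the weekly seasonality lives in the frequency attribute the decomposition is told to use, not in the data container, so an xts object indexed by date needs that period supplied explicitly.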