xts

Creating regular 15-minute time-series from irregular time-series

Submitted by 夙愿已清 on 2019-11-27 03:36:34
I have an irregular time series (with DateTime and RainfallValue) in a csv file C:\SampleData.csv:

    DateTime,RainInches
    1/6/2000 11:59,0
    1/6/2000 23:59,0.01
    1/7/2000 11:59,0
    1/13/2000 23:59,0
    1/14/2000 0:00,0
    1/14/2000 23:59,0
    4/14/2000 3:07,0.01
    4/14/2000 3:12,0.03
    4/14/2000 3:19,0.01
    12/31/2001 22:44,0
    12/31/2001 22:59,0.07
    12/31/2001 23:14,0
    12/31/2001 23:29,0
    12/31/2001 23:44,0.01
    12/31/2001 23:59,0.01

Note: the irregular time steps could be 1 min, 15 min, 1 hour, etc. Also, there could be multiple observations within a desired 15-minute interval. I am trying to create a regular 15-minute time ...
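One possible approach, sketched minimally: read the file, build an xts object, sum whatever observations fall in each 15-minute bucket, then merge onto a fully regular 15-minute index. The file path and column names come from the question; the object names (rain15, reg_index, regular) are illustrative.

    library(xts)

    raw <- read.csv("C:/SampleData.csv", stringsAsFactors = FALSE)
    x   <- xts(raw$RainInches,
               order.by = as.POSIXct(raw$DateTime, format = "%m/%d/%Y %H:%M"))

    # sum multiple observations that land in the same calendar-aligned
    # 15-minute bucket, then snap the resulting index up to the bucket end
    ep     <- endpoints(x, on = "minutes", k = 15)
    rain15 <- align.time(period.apply(x, ep, sum), 15 * 60)

    # merge onto a fully regular 15-minute grid; empty intervals become NA
    reg_index <- seq(start(rain15), end(rain15), by = "15 min")
    regular   <- merge(rain15, xts(, reg_index))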

Convert data frame with date column to timeseries

Submitted by 非 Y 不嫁゛ on 2019-11-27 03:10:56
I've got a data frame with the following data:

    > PRICE
          DATE  CLOSE
    1 20070103 54.700
    2 20070104 54.770
    3 20070105 55.120
    4 20070108 54.870
    5 20070109 54.860
    6 20070110 54.270
    7 20070111 54.770
    8 20070112 55.360
    9 20070115 55.760
    ...

As you can see, my DATE column represents a date (yyyyMMdd) and my CLOSE column represents prices. I now have to calculate CalmarRatio, from the PerformanceAnalytics package. I'm new to R, so I can't understand everything, but from what I have googled so far I see that the R parameter to that function needs to be a time-series-like object. Is there any way I ...
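A minimal sketch of the conversion, assuming the data frame is called PRICE as shown; since CalmarRatio expects a return series rather than prices, the sketch also derives simple returns first (that extra step is an assumption about the intended use):

    library(xts)
    library(PerformanceAnalytics)

    # xts of closing prices, indexed by the parsed yyyyMMdd DATE column
    close_xts <- xts(PRICE$CLOSE,
                     order.by = as.Date(as.character(PRICE$DATE), format = "%Y%m%d"))

    # CalmarRatio works on returns, so convert prices to returns first
    rets <- Return.calculate(close_xts, method = "discrete")
    CalmarRatio(na.omit(rets))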

R xts: generating 1 minute time series from second events

Submitted by 故事扮演 on 2019-11-27 03:06:31
I have an xts sequence of stock trade events that I want to process to generate a 1-minute OHLC time series. For instance, this set of trades:

    Timestamp     Price  Size
    9:30:00.123   12.32  200
    9:30:00.532   12.21  100
    9:30:32.352   12.22  500
    9:30:45.342   12.35  200

should result in the 9:30:00 record:

    Timestamp  Open   High   Low    Close
    9:30:00    12.32  12.35  12.21  12.35

The way I approached this is to split the original trade series by the minute:

    myminseries = do.call(rbind, lapply(split(mytrades, "minutes"), ...
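Rather than splitting by hand, xts can aggregate a price series into per-period OHLC bars directly. A minimal sketch, assuming mytrades is an xts object with a Price column as in the example (the output column renaming is cosmetic):

    library(xts)

    # collapse the trade prices into 1-minute Open/High/Low/Close bars;
    # indexAt = "startof" stamps each bar at the start of its minute
    ohlc_1min <- to.period(mytrades$Price, period = "minutes", k = 1,
                           indexAt = "startof")
    colnames(ohlc_1min) <- c("Open", "High", "Low", "Close")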

Return data subset time frames within another timeframes?

Submitted by 我是研究僧i on 2019-11-27 02:38:19
There are very nifty ways of subsetting xts objects. For example, one can get all the data for all years, months, and days, but strictly between 9:30 AM and 4 PM, by doing:

    my_xts["T09:30/T16:00"]

Or you can get all the observations between two dates by doing:

    my_xts["2012-01-01/2012-03-31"]

Or all the dates before/after a certain date by doing:

    my_xts["/2011"]  # from the start of the data until the end of 2011
    my_xts["2011/"]  # from 2011 until the end of the data

How can I get all the data for only certain months across all years, or only certain days across all months and years? Do any other subsetting tricks ...
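One way to get calendar-based slices that the ISO-8601 range strings do not cover is the .index*() family of accessors. A minimal sketch (the particular months, days, and weekday chosen are just examples; note that .indexmon() is zero-based):

    library(xts)

    # all observations from January or June of any year (months run 0-11)
    jan_jun <- my_xts[.indexmon(my_xts) %in% c(0, 5)]

    # all observations on the 1st or 15th of any month, in any year
    first_fifteenth <- my_xts[.indexmday(my_xts) %in% c(1, 15)]

    # all observations on Mondays, across all weeks (0 = Sunday)
    mondays <- my_xts[.indexwday(my_xts) == 1]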

Access zoo or xts index

Submitted by 不羁岁月 on 2019-11-27 02:34:15
I am using zoo objects, but my question also applies to xts objects. It looks to me like a zoo object is a one-column vector with an index. In my case the index is the vector of dates and the one-column vector is my data. All is good, except that I would like to access the dates (from the index). For example, I have the following result:

    ObjZoo <- structure(c(10, 20), .Dim = c(2L, 1L), index = c(14788, 14789),
                        class = "zoo", .Dimnames = list(NULL, "Data"))
    unclass(ObjZoo)
    #      Data
    # [1,]   10
    # [2,]   20
    # attr(," ...
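A minimal sketch of the accessors, using the ObjZoo object above; index() (and its alias time()) returns the index, coredata() returns the data, and because 14788/14789 look like day counts since 1970-01-01, as.Date() turns them into readable dates (that interpretation of the numeric index is an assumption):

    library(zoo)

    index(ObjZoo)       # the raw index values: 14788 14789
    time(ObjZoo)        # same thing; time() is an alias for index()
    coredata(ObjZoo)    # the data without the index

    # if the numeric index is days since the epoch, convert it to Date
    as.Date(index(ObjZoo), origin = "1970-01-01")
    # "2010-06-28" "2010-06-29"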

Rolling Sum by Another Variable in R

Submitted by 南楼画角 on 2019-11-27 02:20:15
I want to get the rolling 7-day sum by ID. Suppose my data looks like this:

    data <- as.data.frame(matrix(NA, 42, 3))
    data$V1 <- seq(as.Date("2014-05-01"), as.Date("2014-09-01"), by = 3)
    data$V2 <- rep(1:6, 7)
    data$V3 <- rep(c(1, 2), 21)
    colnames(data) <- c("Date", "USD", "ID")

             Date USD ID
    1  2014-05-01   1  1
    2  2014-05-04   2  2
    3  2014-05-07   3  1
    4  2014-05-10   4  2
    5  2014-05-13   5  1
    6  2014-05-16   6  2
    7  2014-05-19   1  1
    8  2014-05-22   2  2
    9  2014-05-25   3  1
    10 2014-05-28   4  2

How can I add a new column that will contain the rolling ...
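A minimal base-R sketch that adds such a column, assuming "rolling 7-day sum" means the total USD for the same ID over the current date and the six preceding days (that window definition is an assumption; the column name roll7 is illustrative):

    data$roll7 <- sapply(seq_len(nrow(data)), function(i) {
      in_window <- data$ID == data$ID[i] &
                   data$Date <= data$Date[i] &
                   data$Date >  data$Date[i] - 7
      sum(data$USD[in_window])
    })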

Which is the best method to apply a script repetitively to n .csv files in R?

Submitted by 梦想与她 on 2019-11-27 01:52:59
My situation: I have a number of csv files that all share the same suffix before the .csv extension, but the first two characters of the file name differ (i.e. AA01.csv, AB01.csv, AC01.csv, etc.). I have an R script which I would like to run on each file. This script essentially extracts the data from the .csv and assigns it to vectors / converts it into time series objects (for example, an AA01 xts time series object, an AB01 xts object).

What I would like to achieve:

- Embed the script within a larger loop (or as appropriate) to sequentially run over each file and apply the script (see the sketch below).
- Remove the intermediate objects ...
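A minimal sketch of the looping pattern, assuming the per-file work is wrapped in a function; process_file, the "data" directory, and the file pattern are all illustrative, and the body of process_file merely stands in for the existing script:

    library(xts)

    process_file <- function(path) {
      raw <- read.csv(path, stringsAsFactors = FALSE)
      # ... the existing per-file script goes here; as a placeholder,
      # build an xts object from the first (date) column and the rest
      xts(raw[, -1], order.by = as.Date(raw[, 1]))
    }

    files  <- list.files("data", pattern = "01\\.csv$", full.names = TRUE)
    series <- lapply(files, process_file)
    names(series) <- sub("\\.csv$", "", basename(files))

    # series$AA01, series$AB01, ...; intermediates created inside
    # process_file are discarded automatically when the function returns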

Rolling window over irregular time series

Submitted by 不羁的心 on 2019-11-27 00:58:07
I have an irregular time series of events (posts) stored as an xts object, and I want to calculate the number of events that occur over a rolling weekly window (or biweekly, or 3-day, etc.). The data looks like this:

                        postid
    2010-08-04 22:28:07    867
    2010-08-04 23:31:12    891
    2010-08-04 23:58:05    901
    2010-08-05 08:35:50    991
    2010-08-05 13:28:02   1085
    2010-08-05 14:14:47   1114
    2010-08-05 14:21:46   1117
    2010-08-05 15:46:24   1151
    2010-08-05 16:25:29   1174
    2010-08-05 23:19:29   1268
    2010-08-06 12:15:42   1384
    2010-08-06 15:22 ...
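A minimal brute-force sketch of a trailing count at each observation, assuming the xts object is called posts and that "weekly" means the seven days up to and including each timestamp (both the name and the window definition are assumptions):

    library(xts)

    window_secs <- 7 * 24 * 60 * 60   # one week; change for biweekly, 3-day, etc.
    tms <- index(posts)

    # for every event, count how many events fall in (t - window, t]
    posts$n_last_week <- vapply(seq_along(tms), function(i) {
      sum(tms > tms[i] - window_secs & tms <= tms[i])
    }, numeric(1))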

Regular analysis over irregular time series

Submitted by 纵然是瞬间 on 2019-11-26 23:03:59
I have an irregular time series (xts in R) that I want to apply some time-windowing to. For example, given a time series like the following, I want to compute things like how many observations there are in each discrete 3-hour window, starting from 2009-09-22 00:00:00:

    library(lubridate)
    s <- xts(c("OK", "Fail", "Service", "OK", "Service", "OK"),
             ymd_hms(c("2009-09-22 07:43:30", "2009-10-01 03:50:30",
                       "2009-10-01 08:45:00", "2009-10-01 09:48:15",
                       "2009-11-11 10:30:30", "2009-11-11 11:12:45 ...
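A minimal sketch of counting observations per discrete 3-hour bucket, assuming the series s above; it builds a regular 3-hour grid anchored at 2009-09-22 00:00:00 (the end point just needs to cover the data) and tabulates the index against it:

    library(xts)

    grid <- seq(as.POSIXct("2009-09-22 00:00:00", tz = "UTC"),
                as.POSIXct("2009-11-12 00:00:00", tz = "UTC"),
                by = "3 hours")

    bucket <- cut(index(s), breaks = grid)
    counts <- table(bucket)    # observations per 3-hour window, zeros included
    counts[counts > 0]         # only the windows that contain observations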

Forecasting time series data

Submitted by 拥有回忆 on 2019-11-26 21:48:26
I've done some research and I am stuck on finding the solution. I have time series data in a very basic data frame; let's call it x:

    Date        Used
    11/1/2011   587
    11/2/2011   578
    11/3/2011   600
    11/4/2011   599
    11/5/2011   678
    11/6/2011   555
    11/7/2011   650
    11/8/2011   700
    11/9/2011   600
    11/10/2011  550
    11/11/2011  600
    11/12/2011  610
    11/13/2011  590
    11/14/2011  595
    11/15/2011  601
    11/16/2011  700
    11/17/2011  650
    11/18/2011  620
    11/19/2011  645
    11/20/2011  650
    11/21/2011  639
    11/22/2011  620
    11/23/2011  600
    11/24/2011  550
    11/25/2011  600
    11/26/2011  610
    11/27/2011  590
    11/28/2011  595
    11/29/2011  601
    11/30/2011  700
    12/1/2011   650
    12 ...
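One common route, sketched minimally and under the assumption that the goal is to forecast future daily usage from the x data frame above: convert Used to a time series and fit an automatically selected ARIMA model with the forecast package (the weekly seasonal period and the 14-day horizon are guesses):

    library(forecast)

    x$Date <- as.Date(x$Date, format = "%m/%d/%Y")
    usage  <- ts(x$Used, frequency = 7)   # treat the week as the seasonal period

    fit <- auto.arima(usage)
    fc  <- forecast(fit, h = 14)          # forecast the next 14 days
    plot(fc)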