Converting a list of tokens to n-grams
I have a list of documents that have already been tokenized:

    dat <- list(
      c("texaco", "canada", "lowered", "contract", "price", "pay", "crude",
        "oil", "canadian", "cts", "barrel", "effective", "decrease", "brings",
        "companys", "posted", "price", "benchmark", "grade", "edmonton",
        "swann", "hills", "light", "sweet", "canadian", "dlrs", "bbl",
        "texaco", "canada", "changed", "crude", "oil", "postings", "feb",
        "reuter"),
      c("argentine", "crude", "oil", "production", "pct", "january", "mln",
        "barrels", "mln", "barrels", "january", "yacimientos", "petroliferos",
        "fiscales", "january", "natural", "gas",
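For reference, a minimal base-R sketch of the conversion the title describes: it slides a window of length n over each token vector and pastes the words in each window together. The helper name make_ngrams and the "_" separator are my own choices, not something given in the post.

    # Build n-grams from a single character vector of tokens.
    # Returns character(0) when the document has fewer than n tokens.
    make_ngrams <- function(tokens, n = 2) {
      if (length(tokens) < n) return(character(0))
      vapply(seq_len(length(tokens) - n + 1),
             function(i) paste(tokens[i:(i + n - 1)], collapse = "_"),
             character(1))
    }

    # Apply to every document in the list; n = 2 gives bigrams.
    ngrams <- lapply(dat, make_ngrams, n = 2)
    head(ngrams[[1]])
    # "texaco_canada" "canada_lowered" "lowered_contract" ...

If the quanteda package is already in use, tokens_ngrams(as.tokens(dat), n = 2) should produce an equivalent result without the hand-rolled helper.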