Question:
I have run the program below with a symbol_list of a few hundred symbols, and at some point it fails with "not enough memory", even though I reuse the same variables. Why?
base-url: http://www.google.com/finance/historical
download-directory: "askpoweruser/stock-download/google/files/"
column-header: "Time;Open;High;Low;Close;Volume"
#debug: true
symbol_list: parse/all {GOOG AAPL MSFT INDEXDJX:.DJI} " "
ans: ask {symbols by default "GOOG AAPL MSFT INDEXDJX:.DJI": }
if (ans <> "") [symbol_list: parse/all ans " "]
;do code-block/2
foreach symbol symbol_list [
    url0: rejoin [base-url "?q=" symbol]
    dir: make-dir/deep to-rebol-file download-directory
    either none? filename: find symbol ":" [
        filename: symbol
        url: rejoin [url0 "&output=csv"]
        either not error? try [content: read url] [
            out-string: copy rejoin [column-header newline]
            quotes: parse/all content ",^/"
            reversed-quotes: reverse quotes
            foreach [v c l h o d] reversed-quotes [
                either not (error? try [d: to-date d]) [
                    d: rejoin [d/year "-" d/month "-" d/day]
                    append out-string rejoin [d ";" o ";" h ";" l ";" c ";" v newline]
                ][
                    ;print [d "is not a date"]
                    ;input
                ]
            ]
            filename: rejoin [filename "_" "1440"]
            write to-rebol-file rejoin [dir filename ".csv"] out-string
            print filename
        ][
            print ["Error for symbol" symbol]
        ]
    ][
        filename: replace/all replace/all filename ":" "" "." ""
        out: copy []
        for i 0 1 1 [
            p: i
            url: rejoin [url0 "&start=" (p * 200) "&num=" ((p + 1) * 200)]
            content: read url
            rule: [
                to "<table" thru "<table" to ">" thru ">"
                to "<table" thru "<table" to ">" thru ">"
                to "<table" thru "<table" to ">" thru ">"
                copy quotes to </table> to end
            ]
            parse content rule
            parse quotes [
                some [
                    to "<td" thru "<td" to ">" thru ">"
                    [copy x to "<" | copy x to end]
                    (append out replace/all x "^/" "")
                ]
                to end
            ]
            if #debug [
                write/lines to-rebol-file rejoin [dir filename "_" p ".html"] quotes
            ]
        ]
        if #debug [
            write to-rebol-file rejoin [dir filename "_temp" ".txt"] mold out
        ]
        out-string: copy rejoin [column-header newline]
        out: reverse out
        foreach [v c l h o d] out [
            d: parse/all d " ,"
            d: to-date rejoin [d/4 "-" d/1 "-" d/2]
            d: rejoin [d/year "-" d/month "-" d/day]
            append out-string replace/all rejoin [d ";" o ";" h ";" l ";" c ";" v newline] "," ""
        ]
        filename: rejoin [filename "_" "1440"]
        write to-rebol-file rejoin [dir filename ".csv"] out-string
        print filename
    ]
]
To get the list of symbols you can use this (REBOL crashed above before reaching letter H):
alphabet: [A B C D E F G H I J K L M N O P Q R S T U V W X Y Z]
symbol-list: copy []
rule: [
    to <table class="quotes">
    some [to {<A href="/stockquote} to ">" thru ">" copy symbol to "<" (append symbol-list symbol)]
    to </table>
]
foreach letter alphabet [
    content: read to-url rejoin ["http://www.eoddata.com/stocklist/NYSE/" letter ".htm"]
    parse content rule
    probe symbol-list
    write/append %askpoweruser/stock-download/symbol-list-nyse.txt mold symbol-list
]
Answer 1:
You can put a STATS call within one of your loops to try to figure out if and where a memory leak is occurring.
Out-of-memory errors usually occur in one of these situations or something similar:
- appending, inside a loop, to a series that is not cleared or copied at the start of a function
- a complete tree of data is not reset to none (at each leaf and branch) in a situation where some (even a single?) sub-elements are referenced outside the tree, so the whole data block ends up stuck in RAM, unable to free itself
- printing a very large string or nested tree of large objects (for example, a VID face contains a reference to the complete stylesheet, so printing the window of a big app usually fails)
- some stack overflows (endless recursions or loops) are sometimes incorrectly reported as memory errors
- allocation of a single item grows exponentially... like image! allocation when multiplying each pass by 10 on both axes, effectively increasing two orders of magnitude each time, which usually fails at sizes in the n*10k range
- the largest item in the GC sometimes never deallocates, due to the suboptimal R2 GC (large images may show this symptom)
- recursive parse rules are creating data and a single rule is infinite; this happens very rapidly with rules like [rule | none], where none is effectively equivalent to forever
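As a minimal sketch of the STATS suggestion above (assuming REBOL/2, where STATS with no refinement reports the interpreter's memory usage in bytes), you could sample memory once per symbol to see whether usage grows monotonically; the loop body placeholder stands in for the question's download/parse code:

```rebol
; Hypothetical instrumentation sketch, not part of the original program.
last-mem: stats
foreach symbol symbol_list [
    ; ... existing download/parse code for this symbol ...
    mem: stats
    print ["after" symbol "memory:" mem "delta:" mem - last-mem]
    last-mem: mem
    recycle  ; ask the garbage collector to run between symbols
]
```

If the delta keeps climbing even after RECYCLE, something is still holding references to the per-symbol data (see the list of causes above).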
Answer 2:
It works without problems for me, but some notes:
Instead of:
read to-url rejoin ["http://www.eoddata.com/stocklist/NYSE/" letter]
you can use just:
read join http://www.eoddata.com/stocklist/NYSE/ letter
Instead of:
either not cond [1][2]
you should use:
either cond [2][1]
Instead of:
download-directory: "askpoweruser/stock-download/google/files/"
dir: make-dir/deep to-rebol-file download-directory
write to-rebol-file rejoin [dir filename "_temp" ".txt"] mold out
use:
download-directory: %askpoweruser/stock-download/google/files/
dir: make-dir/deep download-directory
write rejoin [dir filename %_temp.txt] mold out
And what the hell does this mean?!:
for i 0 1 1 [
p: i
url: rejoin [url0 "&start=" (p * 200) "&num=" ((p + 1) * 200)]
..
Source: https://stackoverflow.com/questions/4297653/rebol-problem-with-not-enough-memory