How to run Ta-Lib on multiple columns of a Pandas dataframe?


Question


I have a DataFrame with the prices of several securities as columns, and I can't find a way to run TA-Lib on it in one shot because it expects a numpy.ndarray.

How can I run TA-Lib over multiple securities and get a data frame in return?

import talib as ta
import pandas as pd
d = {'security1': [1,2,8,9,8,5], 'security2': [3,8,5,4,3,5]}
df = pd.DataFrame(data=d)
df
Out[518]: 
   security1  security2
0          1          3
1          2          8
2          8          5
3          9          4
4          8          3
5          5          5

ta.EMA(df, 2)
TypeError: Argument 'real' has incorrect type (expected numpy.ndarray, got DataFrame)

ta.EMA(df['security1'], 2)
Out[520]: 
0         NaN
1    1.500000
2    5.833333
3    7.944444
4    7.981481
5    5.993827
dtype: float64

type(df['security1'])
Out[524]: pandas.core.series.Series

When I convert the data frame to a numpy.ndarray, it still throws an exception:

ta.EMA(df.values, 2)
Out[528]: Exception: input array type is not double

Thank you.


Answer 1:


TA-Lib is expecting floating-point data, whereas yours is integral.

As such, when constructing your dataframe you need to coerce the input data by specifying dtype=numpy.float64:

import pandas
import numpy
import talib

d = {'security1': [1,2,8,9,8,5], 'security2': [3,8,5,4,3,5]}
df = pandas.DataFrame(data=d, dtype=numpy.float64)         # note numpy.float64 here
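If the DataFrame has already been built with integer columns, casting it afterwards should work just as well (a minimal sketch, not part of the original answer):

# cast existing integer columns to the float64 dtype TA-Lib requires
df = df.astype(numpy.float64)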

TA-Lib expects 1D arrays, which means it can operate on pandas.Series but not pandas.DataFrame.

You can, however, use pandas.DataFrame.apply to apply a function to each column of your DataFrame:

df.apply(lambda c: talib.EMA(c, 2))

   security1  security2
0        NaN        NaN
1   1.500000   5.500000
2   5.833333   5.166667
3   7.944444   4.388889
4   7.981481   3.462963
5   5.993827   4.487654
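If you prefer not to use apply, an equivalent approach (a sketch under the same float64 assumption) is to run the indicator on each column's underlying ndarray and reassemble a DataFrame:

# compute EMA per column from the float64 arrays and rebuild a DataFrame
result = pandas.DataFrame(
    {col: talib.EMA(df[col].values, 2) for col in df.columns},
    index=df.index,
)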


Source: https://stackoverflow.com/questions/51712269/how-to-run-ta-lib-on-multiple-columns-of-a-pandas-dataframe
