How to L2 normalize an array with Swift

妖精的绣舞 submitted on 2019-12-11 15:37:40

Question


I am trying to normalize the input of my CoreML model as shown below. It does something to the array, but the result is quite different from what SKLearn produces (I feed the same input and inspect the output in both environments), so apparently I am doing something wrong.

My model is trained with Keras and SKLearn, and the input must get the same normalization I applied with SKLearn's Normalizer, which uses the L2 norm by default. What I am doing below is apparently not equivalent to sklearn. Any ideas?

    // Inside my normalize(vec:) helper:
    vDSP_normalizeD(vec, 1, &normalizedVec, 1, &mean, &std, vDSP_Length(count))

    // At the call site:
    let (normalizedXVec, _, _) = normalize(vec: doubleArray)

Then I convert normalizedXVec to an MLMultiArray and use it as the input to my predictor.
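For reference, the conversion looks roughly like this (a minimal sketch; it assumes a 1-D double input and the variable names are only illustrative):

    import CoreML

    // In a throwing context (MLMultiArray's initializer can throw):
    // copy the normalized values into a 1-D MLMultiArray of doubles.
    let multiArray = try MLMultiArray(shape: [NSNumber(value: normalizedXVec.count)],
                                      dataType: .double)
    for (i, value) in normalizedXVec.enumerated() {
        multiArray[i] = NSNumber(value: value)
    }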

Note: I also tried to convert the normalizer from sklearn using coremltools, but I got errors as seen here:


Answer 1:


vDSP_normalizeD uses the mean and standard deviation. That is not the same as L2.

L2 normalization first computes the L2 norm of the vector, which is sqrt(v[0]*v[0] + v[1]*v[1] + ... + v[n]*v[n]), and then divides each element of the vector by that number.
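A minimal sketch of doing this with Accelerate, which should match sklearn's default Normalizer (L2 norm) for a single sample; the function name l2Normalize is illustrative, not part of any library:

    import Accelerate

    func l2Normalize(_ vec: [Double]) -> [Double] {
        // Sum of squares of all elements
        var sumOfSquares = 0.0
        vDSP_svesqD(vec, 1, &sumOfSquares, vDSP_Length(vec.count))

        // The L2 norm is the square root of that sum
        var norm = sumOfSquares.squareRoot()
        guard norm > 0 else { return vec }   // avoid dividing by zero

        // Divide every element by the norm
        var result = [Double](repeating: 0, count: vec.count)
        vDSP_vsdivD(vec, 1, &norm, &result, 1, vDSP_Length(vec.count))
        return result
    }

    // Example: [3, 4] becomes [0.6, 0.8]
    let normalized = l2Normalize([3.0, 4.0])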



Source: https://stackoverflow.com/questions/54608649/how-to-l2-normalize-an-array-with-swift
