Parameter Tuning for Perceptron Learning Algorithm

醉话见心 2021-02-01 09:35

I'm having trouble figuring out how to tune the parameters of my perceptron algorithm so that it performs reasonably well on unseen data.

I've implemented …

3 Answers
  •  青春惊慌失措
    2021-02-01 10:35

    The learning rate depends on the typical scale of your data; there is no general rule of thumb. Feature scaling is a method used to standardize the range of the independent variables (features). In data processing it is also known as data normalization and is generally performed during the preprocessing step.

    Normalizing the data to zero mean and unit variance, to the range 0–1, or to any other standard form can help in selecting a value for the learning rate. As doug mentioned, a learning rate between 0.05 and 0.2 generally works well.

    This will also help the algorithm converge faster.
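    As a minimal sketch of this idea (not the asker's actual code), here is one way to standardize features to zero mean and unit variance and then train a basic perceptron with a learning rate in the suggested 0.05–0.2 range. The toy dataset and function names are illustrative assumptions:

    ```python
    import numpy as np

    def standardize(X):
        # Zero-mean, unit-variance scaling, computed per feature.
        mu = X.mean(axis=0)
        sigma = X.std(axis=0)
        sigma[sigma == 0] = 1.0            # guard against constant features
        return (X - mu) / sigma

    def train_perceptron(X, y, lr=0.1, epochs=50):
        # y is expected in {-1, +1}; the bias is folded in as an extra column of ones.
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            for xi, yi in zip(Xb, y):
                if yi * np.dot(w, xi) <= 0:  # misclassified -> perceptron update
                    w += lr * yi * xi
        return w

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy linearly separable data whose two features live on very different scales.
        X = np.vstack([rng.normal([2, 2000], [1, 300], (50, 2)),
                       rng.normal([-2, -2000], [1, 300], (50, 2))])
        y = np.array([1] * 50 + [-1] * 50)

        w = train_perceptron(standardize(X), y, lr=0.1)
        print("learned weights (incl. bias):", w)
    ```

    Without the standardize step, the second feature would dominate the dot product and the same learning rate would behave very differently; after scaling, one value in the 0.05–0.2 range tends to work across both features.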

    Source: Juszczak, P.; D. M. J. Tax; R. P. W. Duin (2002). "Feature scaling in support vector data descriptions". Proc. 8th Annu. Conf. Adv. School Comput. Imaging: 95–102.
