At the moment, the most common implementation of single-artificial-neuron learning is the support vector machine (SVM) by Vladimir Vapnik. The SVM algorithm finds a minimum of the structural risk. To handle nonlinearity of the input vectors, SVM uses kernel tricks, but they must be applied carefully: an incorrectly chosen kernel trick can cause overfitting.
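To illustrate the overfitting risk mentioned above, here is a minimal sketch (not from the original article) using scikit-learn as a stand-in SVM implementation and a synthetic dataset. An RBF kernel with an extreme width parameter memorizes the noisy training data while generalizing poorly:

```python
# Sketch: a poorly chosen kernel setting (here, an RBF kernel with an
# extreme gamma) makes an SVM overfit. scikit-learn and the synthetic
# "moons" dataset are illustrative assumptions, not the article's setup.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for gamma in (1.0, 1000.0):  # moderate vs. extreme kernel width
    clf = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)
    print(f"gamma={gamma}: "
          f"train accuracy={clf.score(X_tr, y_tr):.2f}, "
          f"test accuracy={clf.score(X_te, y_te):.2f}")
```

With the moderate gamma the train and test accuracies stay close; with the extreme gamma the training accuracy approaches 1.0 while the test accuracy drops, which is exactly the overfitting the article warns about.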
But SVM already has a worthy competitor: the vector machine by Reshetov, VMR. Like SVM, VMR can train a single artificial neuron, and it also uses kernel tricks to obtain nonlinearity of the input vectors.
The difference is that the VMR algorithm searches for the minimax of the structural risk. Another characteristic difference is a higher generalization ability compared to SVM. Therefore, VMR can use various kernel tricks during neuron training without the risk of overfitting; redundant kernel tricks in VMR can be pruned automatically.
None of the binary classification algorithms in Weka was able to beat VMR in generalization ability. Testing was conducted on samples taken from the UCI repository: http://archive.ics.uci.edu/ml/datasets.html
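The kind of comparison described above can be sketched as cross-validated accuracy of several standard binary classifiers on a UCI dataset. This is an illustrative assumption, not the article's actual experiment: scikit-learn stands in for Weka, the Wisconsin breast-cancer data (a UCI dataset bundled with scikit-learn) stands in for the samples used, and VMR itself is omitted because libvmr is a Java library:

```python
# Sketch: cross-validated comparison of binary classifiers on a UCI
# dataset, in the spirit of a Weka experiment. scikit-learn and the
# bundled breast-cancer data are stand-ins; VMR is not included here.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
classifiers = [
    ("SVM (RBF kernel)", SVC()),
    ("Decision tree", DecisionTreeClassifier(random_state=0)),
    ("Naive Bayes", GaussianNB()),
]
for name, clf in classifiers:
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

A VMR result would be obtained the same way, by wrapping libvmr's trained model in the evaluation loop and comparing its mean accuracy against the baselines.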
Is it possible that VMR will finally replace SVM?
If you are not yet using VMR, download the libvmr library HERE to open up broad opportunities for machine learning.