KNN - study notes


K-Nearest Neighbors (KNN) algorithm
Algorithm principle: after samples are mapped into an n-dimensional feature space, samples with higher similarity lie closer to each other than samples with lower similarity.
Classification: the predicted class is the one that appears most often among the K nearest neighbors (majority vote).
Regression: the prediction is the mean of the target values of the K nearest neighbors.
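A minimal sketch of both cases, using scikit-learn's KNeighborsClassifier and KNeighborsRegressor; the library choice, the toy data, and n_neighbors=3 are illustrative assumptions rather than part of the original notes.

```python
# Minimal sketch of KNN classification and regression with scikit-learn.
# Toy data and n_neighbors=3 are illustrative assumptions.
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]

# Classification: predicted class = majority vote of the K nearest neighbors
y_class = [0, 0, 0, 1, 1, 1]
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y_class)
print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))  # -> [0 1]

# Regression: prediction = mean of the K nearest neighbors' target values
y_reg = [1.0, 1.2, 0.9, 10.0, 10.5, 9.8]
reg = KNeighborsRegressor(n_neighbors=3)
reg.fit(X, y_reg)
print(reg.predict([[0.5, 0.5]]))  # roughly the mean of the three nearest targets
```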
Hyperparameters: parameters specified manually before the model is trained, which strongly influence the algorithm. They differ from the model's internal parameters, which are determined by training. Grid search with cross-validation can be used to find the optimal hyperparameters.
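For example, a sketch of grid search with cross-validation to choose K; the iris dataset, the K range, and cv=5 are illustrative assumptions.

```python
# Sketch: pick the hyperparameter K by grid search with cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
param_grid = {"n_neighbors": list(range(1, 31))}  # candidate K values (assumed range)
search = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)  # best K found by cross-validation
print(search.best_score_)   # mean cross-validated accuracy for that K
```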
K value: the choice of K directly affects the model's results. If K is too small, the model is more sensitive and tends to overfit; if K is too large, the model is less sensitive and tends to underfit (illustrated in the sketch after the two definitions below).
Overfitting: the model is too complex; it fits the accidental errors (noise) of the training samples and learns many useless features, so it performs very well on the training set but very poorly on the validation set.
Underfitting: the model is too simple; too few features are extracted from the training samples, so even the performance on the training set is poor.
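A sketch of how K relates to over- and underfitting, comparing training and validation accuracy across several values of K; the dataset, split, and chosen K values are illustrative assumptions.

```python
# Sketch: small K -> high training accuracy, lower validation accuracy (overfitting);
# very large K -> both training and validation accuracy drop (underfitting).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

for k in (1, 5, 15, 50):
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    print(k, clf.score(X_train, y_train), clf.score(X_val, y_val))
```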