CodePudding user response:
I don't do image work myself, but from an ordinary programmer's point of view:
1. Binarize the image first.
2. Angle normalization: the binarized points show some distribution trend, so uniformly adjust them to a single convention such as "lighter part on top, denser part at the bottom" (statistics + affine transformation).
3. Size normalization.
4. Feature similarity decision: once angle and size are standardized, you only need to compare features, and the usual algorithm is cosine similarity (see the sketch after this post).
PS: you'd really better hand this to someone who specializes in image processing; as application developers, this is about as far as we can take it.
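A minimal sketch of the four steps above, assuming a grayscale uint8 glyph image and using OpenCV/NumPy; the PCA-based angle estimate and all function names are illustrative choices, not something prescribed in the post.

```python
import cv2
import numpy as np

def binarize(img):
    # Step 1: Otsu binarization (foreground becomes white).
    _, bw = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    return bw

def normalize_angle(bw):
    # Step 2: estimate the dominant direction of the foreground points
    # (here via PCA on their coordinates) and rotate it to vertical.
    ys, xs = np.nonzero(bw)
    pts = np.column_stack([xs, ys]).astype(np.float64)
    mean = pts.mean(axis=0)
    cov = np.cov((pts - mean).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]                 # dominant axis
    angle = np.degrees(np.arctan2(major[1], major[0])) - 90.0
    h, w = bw.shape
    M = cv2.getRotationMatrix2D((float(mean[0]), float(mean[1])), angle, 1.0)
    return cv2.warpAffine(bw, M, (w, h))                   # affine transformation

def normalize_size(bw, size=(32, 32)):
    # Step 3: crop to the foreground bounding box, then resize to a fixed size.
    ys, xs = np.nonzero(bw)
    crop = bw[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    return cv2.resize(crop, size, interpolation=cv2.INTER_AREA)

def cosine_similarity(a, b):
    # Step 4: compare the flattened, normalized images as feature vectors.
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Usage:
# score = cosine_similarity(
#     normalize_size(normalize_angle(binarize(img1))),
#     normalize_size(normalize_angle(binarize(img2))))
```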
CodePudding user response:
https://blog.csdn.net/jiangxinyu/article/details/7968139
CodePudding user response:
Regarding point normalization ("the binarized points show some distribution trend, so uniformly adjust them ... via statistics + affine transformation"): thinking it over, you could use a simple linear model here. Fit a linear regression line to the points directly, then either apply the affine transformation according to that regression line before extracting the feature vector, or use the regression line's direction vector itself as the feature (see the sketch below).
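A NumPy-only sketch of that idea, with an illustrative function name: fit a least-squares line through the binarized foreground pixels, then either use its angle to drive the affine (rotation) step, or return its unit direction vector to use directly as part of the feature.

```python
import numpy as np

def regression_line_feature(bw):
    # Least-squares fit y = a*x + b through the foreground pixels.
    # (Near-vertical strokes are better served by the PCA variant above.)
    ys, xs = np.nonzero(bw)
    a, b = np.polyfit(xs.astype(np.float64), ys.astype(np.float64), deg=1)
    # Unit direction vector of the fitted line: usable directly as a feature,
    # or converted to an angle that parameterizes the affine transformation.
    direction = np.array([1.0, a]) / np.hypot(1.0, a)
    angle_deg = np.degrees(np.arctan2(a, 1.0))
    return direction, angle_deg
```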
CodePudding user response:
Just feed it to YOLOv3.
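If you do hand the problem to YOLOv3, a minimal detection sketch with OpenCV's DNN module could look like the following; the yolov3.cfg / yolov3.weights / coco.names paths are placeholders for the official Darknet release files or your own trained model.

```python
import cv2
import numpy as np

# Placeholder paths: use the official Darknet files or your own trained model.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
classes = open("coco.names").read().strip().split("\n")

def detect(image_bgr, conf_threshold=0.5, nms_threshold=0.4):
    h, w = image_bgr.shape[:2]
    blob = cv2.dnn.blobFromImage(image_bgr, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, confidences, class_ids = [], [], []
    for output in outputs:
        for det in output:            # det = [cx, cy, bw, bh, objectness, class scores...]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if confidence > conf_threshold:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                confidences.append(confidence)
                class_ids.append(class_id)

    keep = cv2.dnn.NMSBoxes(boxes, confidences, conf_threshold, nms_threshold)
    return [(classes[class_ids[i]], confidences[i], boxes[i])
            for i in np.array(keep).flatten()]
```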