Happy New Year, and thank you for your continued support this year. I tried out SVM (Support Vector Machine), so I'll write down how I did it. Like Naive Bayes, it can be used to classify text. Here I used svm_light, which can be installed like this:
cd /usr/local/src
mkdir svm_light
cd svm_light
wget http://download.joachims.org/svm_light/current/svm_light.tar.gz
gunzip -c svm_light.tar.gz | tar xvf -
make all
After installation, add /usr/local/src/svm_light to your PATH.
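For example, with bash (assuming the binaries were built in that directory):
export PATH=$PATH:/usr/local/src/svm_light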
First, prepare the labeled training texts, where 1 marks the positive class and -1 the negative class:
1 i am happy
1 happy happy new year
-1 bad year
-1 i am bad
Next, create a correspondence table that assigns a number to each word (a script sketch for building this automatically follows the table).
1:i
2:am
3:happy
4:new
5:year
6:bad
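Building this table by hand doesn't scale, so here is a minimal awk sketch, assuming the labeled texts above are saved in a file I'll call labeled.txt (my own name); it numbers each word in order of first appearance, skipping the label in the first field:
awk '{ for (i = 2; i <= NF; i++)
         if (!($i in id)) { id[$i] = ++n; print n ":" $i } }' labeled.txt > vocab.txt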
Then create the training data based on this correspondence table and save it as "example_file". Each line is the label followed by word-number:count pairs (a conversion sketch follows the example):
1 1:1 2:1 3:1
1 3:2 4:1 5:1
-1 5:1 6:1
-1 1:1 2:1 6:1
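This conversion can also be scripted. A rough sketch, again assuming labeled.txt and the vocab.txt produced above (both names are my own), which counts each word per line and prints the label followed by index:count pairs in increasing index order, as svm_light expects:
awk 'NR == FNR { split($0, a, ":"); id[a[2]] = a[1]; n = FNR; next }  # load the word -> number table
     { split("", cnt)                                                 # reset counts for this line
       for (i = 2; i <= NF; i++) cnt[id[$i]]++
       line = $1
       for (j = 1; j <= n; j++) if (j in cnt) line = line " " j ":" cnt[j]
       print line }' vocab.txt labeled.txt > example_file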
Create a model file ("model_file") from the training data with the svm_learn command:
svm_learn example_file model_file
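To classify new text, convert it to the same index:count format using the word table above, and run svm_light's companion command svm_classify; test_file and predictions here are placeholder names of my own:
svm_classify test_file model_file predictions
Each line of predictions should contain one real-valued score per test example; a positive value means class 1 and a negative value means class -1.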