Key concepts and related documents
Shannon entropy, cross entropy, KL divergence
- https://ratsgo.github.io/statistics/2017/09/22/information/
- http://norman3.github.io/prml/docs/chapter01/6.html
- http://dongwonshin.net/kullback-leibler-divergence-%EC%84%A4%EB%AA%85/
- http://solarisailab.com/archives/2237
- https://ratsgo.github.io/deep%20learning/2017/09/24/loss/
- https://taeoh-kim.github.io/blog/cross-entropy%EC%9D%98-%EC%A0%95%ED%99%95%ED%95%9C-%ED%99%95%EB%A5%A0%EC%A0%81-%EC%9D%98%EB%AF%B8/
- http://www.aistudy.com/control/information_theory.htm
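The three quantities above are closely related; a minimal sketch in plain Python (the example distributions are made up for illustration):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log2(q_i): expected code length when the
    true distribution is p but we encode as if it were q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = H(p, q) - H(p): always >= 0, and 0 iff p == q."""
    return cross_entropy(p, q) - entropy(p)

p, q = [0.5, 0.5], [0.9, 0.1]
print(entropy(p))           # a fair coin carries exactly 1 bit
print(kl_divergence(p, q))  # positive, since q differs from p
```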
Logit function
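The logit is the log-odds transform, and its inverse is the sigmoid; a quick sketch:

```python
import math

def logit(p):
    """Log-odds: maps a probability in (0, 1) to the whole real line."""
    return math.log(p / (1 - p))

def sigmoid(x):
    """Inverse of the logit: maps the real line back to (0, 1)."""
    return 1 / (1 + math.exp(-x))

print(logit(0.5))           # 0.0: even odds
print(sigmoid(logit(0.8)))  # round-trips back to 0.8
```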
Decision Tree + ID3 algorithm
- http://jihoonlee.tistory.com/16
- http://seamless.tistory.com/20
- https://ratsgo.github.io/machine%20learning/2017/03/26/tree/
- Random forest, rotation forest: https://ratsgo.github.io/machine%20learning/2017/03/17/treeensemble/
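ID3 chooses the attribute whose split yields the largest information gain (entropy reduction); a minimal sketch on made-up data:

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3 criterion: entropy(parent) minus the weighted entropy
    of the children after splitting on attribute `attr`."""
    n = len(labels)
    children = 0.0
    for value in set(row[attr] for row in rows):
        subset = [y for row, y in zip(rows, labels) if row[attr] == value]
        children += len(subset) / n * label_entropy(subset)
    return label_entropy(labels) - children

# Toy data (hypothetical): does "outlook" predict the label perfectly?
rows = [{"outlook": "sunny"}, {"outlook": "sunny"},
        {"outlook": "rain"}, {"outlook": "rain"}]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, labels, "outlook"))  # 1.0: a perfect split
```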
Bernoulli probability distribution
Maximum likelihood estimation (Maximum Likelihood Estimator, MLE)
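These two topics combine neatly: for a Bernoulli sample the MLE of p is just the sample mean. A sketch that checks this numerically on made-up data:

```python
import math

def log_likelihood(p, xs):
    """Bernoulli log-likelihood: sum of x*log(p) + (1-x)*log(1-p)."""
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

xs = [1, 0, 1, 1, 0, 1, 1, 0]  # made-up sample: 5 successes in 8 trials
mle = sum(xs) / len(xs)        # closed form: the sample mean, 5/8

# The closed-form answer beats (or ties) every candidate on a fine grid:
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=lambda p: log_likelihood(p, xs))
print(mle, best)  # the grid maximizer lands next to 0.625
```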
SGD
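A minimal SGD sketch: fitting y = w*x + b by per-sample gradient steps on the squared error (the learning rate, data, and epoch count are arbitrary choices for illustration):

```python
import random

def sgd_linear_regression(data, lr=0.05, epochs=200, seed=0):
    """Stochastic gradient descent: for each sample, step along the
    negative gradient of the per-sample loss (w*x + b - y)^2."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            err = w * x + b - y
            w -= lr * 2 * err * x  # d/dw of err^2
            b -= lr * 2 * err      # d/db of err^2
    return w, b

# Noise-free data from y = 2x + 1; SGD should recover w ~ 2, b ~ 1.
data = [(x / 10, 2 * (x / 10) + 1) for x in range(-10, 11)]
w, b = sgd_linear_regression(data)
print(w, b)
```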
Understanding boosting techniques
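As a concrete illustration of boosting, a minimal AdaBoost sketch with decision stumps on made-up 1-D data (all data and constants are hypothetical):

```python
import math

def stump_predict(x, threshold, polarity):
    """A decision stump: +polarity above the threshold, -polarity below."""
    return polarity if x >= threshold else -polarity

def adaboost(xs, ys, rounds=5):
    """AdaBoost sketch: each round fits the stump with the lowest
    weighted error, then up-weights the samples it misclassified."""
    n = len(xs)
    w = [1 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for t in sorted(set(xs)):
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, t, pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)                     # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)   # weight of this stump
        ensemble.append((alpha, t, pol))
        # Misclassified points get exp(+alpha), correct ones exp(-alpha).
        w = [wi * math.exp(-alpha * y * stump_predict(x, t, pol))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    s = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if s >= 0 else -1

xs = [1, 2, 3, 4, 5, 6]
ys = [1, 1, -1, -1, 1, 1]  # no single stump can fit this pattern
model = adaboost(xs, ys)
print([predict(model, x) for x in xs])
```

No single stump separates + + - - + +, but the weighted vote of a few stumps does, which is the point of boosting.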
Mean Squared Error, Bias, and Variance
- https://en.wikipedia.org/wiki/Mean_squared_error
- https://ko.wikipedia.org/wiki/편향-분산 트레이드오프
- http://scott.fortmann-roe.com/docs/BiasVariance.html
- The explanation of bias and variance in boostring기법이해.pdf appears to have this essay as its original source.
- Prediction vs Estimation
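The decomposition MSE = bias^2 + variance + irreducible noise can be checked by simulation. A sketch with a deliberately underfitting model (the target function, noise level, and test point are made up):

```python
import random

def true_f(x):
    return x * x  # the unknown target function in this toy setup

def fit_constant(xs, ys):
    """A very rigid model: always predict the training mean (high bias)."""
    mean = sum(ys) / len(ys)
    return lambda x: mean

def simulate(fit, n_train=20, trials=2000, noise=0.5, x0=0.75, seed=0):
    """Monte Carlo estimate of bias^2 and variance of `fit` at test
    point x0; with the irreducible noise^2 they sum to the expected MSE."""
    rng = random.Random(seed)
    preds = []
    for _ in range(trials):
        xs = [rng.uniform(-1, 1) for _ in range(n_train)]
        ys = [true_f(x) + rng.gauss(0, noise) for x in xs]
        preds.append(fit(xs, ys)(x0))
    mean_pred = sum(preds) / trials
    bias_sq = (mean_pred - true_f(x0)) ** 2
    variance = sum((p - mean_pred) ** 2 for p in preds) / trials
    return bias_sq, variance

bias_sq, variance = simulate(fit_constant)
print(bias_sq, variance)  # bias dominates: the constant model underfits
```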
Posterior Probability
- https://en.wikipedia.org/wiki/Posterior_probability
- High-school conditional probability formula collection: https://www.mathfactory.net/10328
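Bayes' rule in one function, with the classic diagnostic-test example (the prevalence and test rates are made-up illustration values):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|~D)P(~D)]."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# A 99%-sensitive test with a 5% false-positive rate, 1% prevalence:
# a positive result still leaves only about a 1-in-6 chance of disease.
print(posterior(0.01, 0.99, 0.05))
```

This is why the prior matters: with a rare condition, most positives are false positives.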
SVM(Support Vector Machine)
- https://ratsgo.github.io/machine%20learning/2017/05/23/SVM/
- https://wikidocs.net/5719
- http://yamalab.tistory.com/40
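A minimal primal linear-SVM sketch: subgradient descent on the L2-regularized hinge loss (a simplified Pegasos-style variant with a fixed learning rate; the data and hyperparameters are made up):

```python
import random

def train_linear_svm(data, lam=0.01, lr=0.1, epochs=500, seed=0):
    """Minimize lam*||w||^2 + mean of hinge losses max(0, 1 - y*(w.x + b))
    by per-sample subgradient steps."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            w = [wi - lr * lam * wi for wi in w]  # shrink: regularizer
            if margin < 1:                        # inside margin: push out
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Toy linearly separable data (hypothetical).
data = [([2.0, 2.0], 1), ([1.5, 2.5], 1), ([3.0, 3.0], 1),
        ([0.0, 0.0], -1), ([-1.0, 0.5], -1), ([0.5, -0.5], -1)]
w, b = train_linear_svm(data[:])
print([classify(w, b, x) for x, _ in data])
```

The hinge loss only penalizes points inside the margin, so the solution is driven by the boundary points, i.e. the support vectors.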
FFM prerequisite papers
- Rendle, S. (2010). Factorization Machines: https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf
- Factorization Machines with libFM: https://www.csie.ntu.edu.tw/~b97053/paper/Factorization%20Machines%20with%20libFM.pdf
  - Explains in more detail how FM is implemented.
- Jahrer, M., Töscher, A., Lee, J.-Y., Deng, J., Zhang, H., and Spoelstra, J. (2012). Ensemble of collaborative filtering and feature engineered models for click through rate prediction: https://pdfs.semanticscholar.org/eeb9/34178ea9320c77852eb89633e14277da41d8.pdf
- Training and Testing Low-degree Polynomial Data Mappings via Linear SVM : http://www.jmlr.org/papers/volume11/chang10a/chang10a.pdf
- Explanation of the loss function for logistic regression: https://stats.stackexchange.com/questions/250937/which-loss-function-is-correct-for-logistic-regression#answer-279698
  - Explains how the formula differs when y ∈ {0, 1} versus y ∈ {-1, 1}.
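A quick numeric check of that equivalence: the cross-entropy form for t ∈ {0, 1} and the log(1 + exp(-y*z)) form for y ∈ {-1, +1} are the same loss under t=1 ↔ y=+1, t=0 ↔ y=-1 (toy values of the score z):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def loss_01(t, z):
    """Cross-entropy form for labels t in {0, 1}."""
    return -(t * math.log(sigmoid(z)) + (1 - t) * math.log(1 - sigmoid(z)))

def loss_pm1(y, z):
    """Equivalent form for labels y in {-1, +1}: log(1 + exp(-y*z))."""
    return math.log(1 + math.exp(-y * z))

for z in (-2.0, 0.0, 1.5):
    print(loss_01(1, z) - loss_pm1(+1, z))  # ~0: the two forms agree
    print(loss_01(0, z) - loss_pm1(-1, z))  # ~0
```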
- http://ailab.criteo.com/ctr-prediction-linear-model-field-aware-factorization-machines/
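The degree-2 FM model from Rendle (2010) scores y = w0 + Σ wᵢxᵢ + Σ_{i<j} ⟨vᵢ, vⱼ⟩ xᵢxⱼ. A sketch of the prediction step using the O(k·n) reformulation from the paper, checked against the naive O(n²) double loop (all parameter values below are made up):

```python
def fm_predict(x, w0, w, V):
    """Factorization Machine prediction using the identity
    sum_{i<j} <v_i, v_j> x_i x_j
      = 0.5 * sum_f [(sum_i V[i][f]*x_i)^2 - sum_i (V[i][f]*x_i)^2]."""
    n, k = len(x), len(V[0])
    linear = w0 + sum(wi * xi for wi, xi in zip(w, x))
    pairwise = 0.0
    for f in range(k):
        s = sum(V[i][f] * x[i] for i in range(n))
        s_sq = sum((V[i][f] * x[i]) ** 2 for i in range(n))
        pairwise += 0.5 * (s * s - s_sq)
    return linear + pairwise

# Tiny check against the naive double loop (hypothetical parameters).
x = [1.0, 0.0, 2.0]
w0, w = 0.1, [0.2, -0.1, 0.3]
V = [[0.1, 0.2], [0.4, -0.3], [-0.2, 0.5]]  # one k=2 latent vector per feature
naive = w0 + sum(w[i] * x[i] for i in range(3)) + sum(
    sum(V[i][f] * V[j][f] for f in range(2)) * x[i] * x[j]
    for i in range(3) for j in range(i + 1, 3))
print(fm_predict(x, w0, w, V), naive)  # identical up to float rounding
```

The reformulation is what makes FM training linear in the number of features; FFM extends this by keeping a separate latent vector per (feature, field) pair.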
Word2Vec
- https://ratsgo.github.io/natural%20language%20processing/2017/03/08/word2vec/
* Can be applied by substituting TF-IDF weights for one-hot encoding
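On the note above: a minimal TF-IDF sketch (this uses the plain unsmoothed idf = log(N/df) variant; real libraries differ slightly in the exact formula):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Per-document term weights: tf = count / doc length,
    idf = log(N / document frequency)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        length = len(doc)
        weights.append({t: (c / length) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["cat", "sat", "mat"], ["cat", "cat", "dog"], ["dog", "runs"]]
w = tf_idf(docs)
print(w[0]["sat"] > w[0]["cat"])  # "sat" is rarer, so it weighs more
```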
Doc2Vec
- https://yujuwon.tistory.com/entry/Doc2Vec
- http://www.engear.net/wp/tag/doc2vec/
General machine learning material (parts 1-8)
Regression analysis lecture notes (Prof. Sehyug Kwon, Department of Statistics, Hannam University)
- http://wolfpack.hnu.ac.kr/lecture/Regression/
ALS (matrix factorization) algorithm
- https://www.slideshare.net/madvirus/als-ws?from_action=save
* als-141117230305-conversion-gate01.pdf
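A rank-1 ALS sketch: approximate R ≈ u·vᵀ by alternately solving the least-squares problem for u with v fixed, then for v with u fixed. (Real ALS recommenders use rank k, regularization, and only the observed entries; the matrix below is made up and exactly rank 1, so ALS recovers it.)

```python
def als_rank1(R, iters=50):
    """Alternate closed-form least-squares updates:
    u_i = (sum_j R_ij v_j) / (sum_j v_j^2), and symmetrically for v."""
    m, n = len(R), len(R[0])
    u = [1.0] * m
    v = [1.0] * n
    for _ in range(iters):
        vv = sum(vj * vj for vj in v)
        u = [sum(R[i][j] * v[j] for j in range(n)) / vv for i in range(m)]
        uu = sum(ui * ui for ui in u)
        v = [sum(R[i][j] * u[i] for i in range(m)) / uu for j in range(n)]
    return u, v

# R = [1, 2, 3]^T [1, 2] is exactly rank 1, so u v^T reproduces it.
R = [[1, 2], [2, 4], [3, 6]]
u, v = als_rank1(R)
approx = [[u[i] * v[j] for j in range(2)] for i in range(3)]
print(approx)
```

Each sub-problem is an ordinary least-squares fit, which is why ALS can use closed-form updates instead of gradient steps.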
Machine learning video lectures
- https://seslab.kaist.ac.kr/xe2/page_GBex27
- https://www.analyticsvidhya.com/blog/2018/01/factorization-machines/
- matrix factorization
- factorization machines over polynomial and linear models
- http://research.criteo.com/ctr-prediction-linear-model-field-aware-factorization-machines/
  - Article by Yu-Chin Juan, a member of the 3 Idiots team
  - Gives a clear, friendly explanation of the differences between Poly2, FM, and FFM.
- http://tech.adroll.com/blog/data-science/2015/08/25/factorization-machines.html
  - Article from the research lab of the advertising company AdRoll
  - Gives a friendly explanation of Factorization Machines. Notes that the FM objective is non-convex, which limits which optimization methods can be applied: SGD works, but methods that rely on convexity appear hard to apply.
- https://www.csie.ntu.edu.tw/~r01922136/kaggle-2014-criteo.pdf
- 3 Idiots' Approach for Display Advertising Challenge
- http://techblog.youdao.com/wp-content/uploads/2015/03/Avazu-CTR-Prediction.pdf
- Avazu Click-Through Rate Prediction
- by Xiaocong Zhou and Peng Yan, March 9, 2015