
v-Soft Margin Logistic Regression Classifier

HUANG Chengquan, WANG Shitong, JIANG Yizhang, DONG Aimei

Citation: HUANG Chengquan, WANG Shitong, JIANG Yizhang, DONG Aimei. v-Soft Margin Logistic Regression Classifier[J]. Journal of Electronics & Information Technology, 2016, 38(4): 985-992. doi: 10.11999/JEIT150769

doi: 10.11999/JEIT150769 cstr: 32379.14.JEIT150769

Funds: 

The National Natural Science Foundation of China (61272210, 61202311), The Natural Science Foundation of Jiangsu Province (BK2012552), The Science and Technology Foundation of Guizhou Province ([2013]2136)

  • Abstract: Coordinate Descent (CD) methods are effective for large-scale data classification problems, offering a simple procedure and a fast convergence rate. To improve the generalization performance of the Logistic Regression Classifier (LRC), and inspired by the v-soft margin Support Vector Machine, this paper proposes a v-Soft Margin Logistic Regression Classifier (v-SMLRC). It is proven that the dual of v-SMLRC is an equality-constrained dual coordinate descent (CDdual) problem, and on this basis a v-SMLRC-CDdual method suited to large-scale data is developed. The proposed v-SMLRC-CDdual both maximizes the margin between classes and effectively improves the generalization performance of LRC. Experiments on large-scale text datasets show that the classification performance of v-SMLRC-CDdual is better than or comparable to that of related methods.
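The abstract does not spell out the optimization itself, so the sketch below illustrates only the generic CDdual idea it builds on: plain dual coordinate descent for standard L2-regularized logistic regression, where each dual variable is updated in turn by a one-variable Newton step while the primal weight vector is maintained incrementally. This is a minimal sketch, not the paper's v-SMLRC dual (its v-parameterized margin and equality constraint are not reproduced here); the function name, the C/2 initialization, and the single clipped Newton step per coordinate are assumptions made purely for illustration.

    # Minimal sketch, assuming labels y_i in {-1, +1} and the standard L2-regularized
    # logistic regression dual: min_a 0.5*a'Qa + sum_i [a_i*log(a_i) + (C-a_i)*log(C-a_i)],
    # 0 < a_i < C, with Q_ij = y_i*y_j*(x_i . x_j) and primal weights w = sum_i a_i*y_i*x_i.
    # This is generic CDdual for logistic regression, NOT the paper's v-SMLRC formulation.
    import numpy as np

    def lr_dual_cd(X, y, C=1.0, max_epochs=50, eps=1e-12, tol=1e-6):
        n, d = X.shape
        alpha = np.full(n, C / 2.0)            # start strictly inside (0, C)
        w = X.T @ (alpha * y)                  # maintain w = sum_i alpha_i * y_i * x_i
        qii = np.einsum('ij,ij->i', X, X)      # diagonal of Q: x_i . x_i
        for _ in range(max_epochs):
            largest = 0.0
            for i in np.random.permutation(n):
                a = alpha[i]
                grad = y[i] * (X[i] @ w) + np.log(a / (C - a))   # dD/da_i
                hess = qii[i] + C / (a * (C - a))                # d2D/da_i^2 > 0
                z = a - grad / hess                              # one Newton step
                z = min(max(z, eps), C - eps)                    # stay strictly inside (0, C)
                step = z - a
                if step != 0.0:
                    alpha[i] = z
                    w += step * y[i] * X[i]    # keep w consistent with alpha
                    largest = max(largest, abs(step))
            if largest < tol:                  # crude stopping rule
                break
        return w

    # usage sketch: w = lr_dual_cd(X_train, y_train, C=1.0)
    #               y_pred = np.sign(X_test @ w)

A production CDdual solver would use a safeguarded Newton inner loop and a proper duality-gap stopping criterion; the point of the sketch is only the coordinate-by-coordinate dual update with an O(d) incremental update of w, which is what makes this family of methods attractive for large-scale data.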
Metrics
  • Article views: 1848
  • Full-text HTML views: 120
  • PDF downloads: 442
  • Citations: 0
Publication history
  • Received: 2015-06-29
  • Revised: 2015-12-08
  • Published: 2016-04-19
