Solving large-scale support vector ordinal regression with asynchronous parallel coordinate descent algorithms

Authors:

Highlights:

• We highlight a special SVOR formulation whose thresholds are described implicitly, so that its dual formulation is well suited to state-of-the-art asynchronous parallel coordinate descent algorithms such as AsyGCD.

• We propose two novel asynchronous parallel coordinate descent algorithms, called AsyACGD and AsyORGCD. AsyACGD is an accelerated extension of AsyGCD that uses an active-set strategy. AsyORGCD is designed specifically for SVOR: it keeps the thresholds ordered during training, so it obtains good performance in less time.
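The paper's asynchronous algorithms are not reproduced here; as a rough illustration of the dual coordinate descent family they build on, the following is a minimal *synchronous* sketch for a plain linear SVM dual (not the SVOR dual, and not AsyGCD/AsyACGD/AsyORGCD themselves). All function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def dual_cd_svm(X, y, C=1.0, epochs=20, seed=0):
    """Synchronous dual coordinate descent for a linear SVM with hinge
    loss: min_alpha 0.5*alpha^T Q alpha - sum(alpha), 0 <= alpha_i <= C,
    with Q_ij = y_i y_j x_i^T x_j.  This serial loop is the ancestor of
    asynchronous variants, where many threads would update coordinates
    concurrently on shared memory."""
    n, d = X.shape
    alpha = np.zeros(n)           # dual variables
    w = np.zeros(d)               # primal weights, kept as w = sum_j alpha_j y_j x_j
    q = (X ** 2).sum(axis=1)      # Q_ii = y_i^2 * ||x_i||^2 = ||x_i||^2
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = y[i] * (X[i] @ w) - 1.0                 # coordinate gradient (Q alpha)_i - 1
            new = min(max(alpha[i] - g / q[i], 0.0), C)  # one-variable projected Newton step
            delta = new - alpha[i]
            if delta != 0.0:
                w += delta * y[i] * X[i]                # keep w consistent with alpha
                alpha[i] = new
    return w, alpha
```

In the asynchronous setting, the per-coordinate update above is what each worker thread would execute on possibly stale copies of `w`; greedy variants such as AsyGCD pick the coordinate with the largest projected-gradient violation instead of sweeping a random permutation.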


Keywords: Asynchronous parallel, Coordinate descent, Support vector, Ordinal regression

Article history: Received 8 December 2019; Revised 31 July 2020; Accepted 11 August 2020; Available online 14 August 2020; Version of Record 16 August 2020.

DOI: https://doi.org/10.1016/j.patcog.2020.107592