Abstract
One central problem in science and engineering is predicting unseen outcomes from relevant knowledge gained from data, where generalization accuracy is key. In the context of classification, we argue that higher generalization accuracy is achievable via ψ-learning, when a certain class of non-convex rather than convex cost functions is employed. To attain this higher generalization accuracy, we propose two computational strategies based on a global optimization technique, difference-of-convex (DC) programming, which relies on a decomposition of the cost function into a difference of two convex functions. The first strategy solves a sequence of quadratic programs. The second strategy combines this with the method of Branch-and-Bound; it is more computationally intensive but is capable of producing global optima. Numerical experiments suggest that the algorithms realize the desired generalization ability of ψ-learning.
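The core idea behind the DC strategy described above can be illustrated with a toy sketch: decompose a non-convex objective as f = g − h with g, h convex, then repeatedly linearize h at the current iterate and solve the resulting convex subproblem. The function, closed-form subproblem, and helper name below are illustrative choices, not the paper's ψ-loss or its actual quadratic programs.

```python
import numpy as np

# Toy DC program: minimize f(x) = g(x) - h(x), with
#   g(x) = x**4    (convex)
#   h(x) = 2*x**2  (convex)
# so f(x) = x**4 - 2*x**2 is non-convex, with minimizers at x = +/-1.
# The DC algorithm linearizes h at the current iterate x_k and solves
# the convex subproblem
#   x_{k+1} = argmin_x  g(x) - h'(x_k) * x,
# which for g(x) = x**4 has the closed form 4*x**3 = h'(x_k).

def dc_iterate(x0, iters=50):
    """Run the DC (convex-concave) iteration from starting point x0."""
    x = x0
    for _ in range(iters):
        slope = 4.0 * x  # h'(x) = 4x, the linearization of h at x_k
        # Solve 4*x**3 = slope for the next iterate (convex subproblem).
        x = np.sign(slope) * (abs(slope) / 4.0) ** (1.0 / 3.0)
    return x

print(dc_iterate(0.5))   # converges toward the minimizer x = 1
print(dc_iterate(-0.5))  # converges toward the minimizer x = -1
```

Each subproblem here is solved in closed form; in the paper's setting the analogous subproblems are quadratic programs, solved sequentially, with Branch-and-Bound layered on top when a certified global optimum is required.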
Original language | English (US) |
---|---|
Pages | 1-11 |
Number of pages | 11 |
State | Published - 2005 |
Event | 5th SIAM International Conference on Data Mining, SDM 2005 - Newport Beach, CA, United States (Apr 21 2005 → Apr 23 2005) |
Other
Other | 5th SIAM International Conference on Data Mining, SDM 2005 |
---|---|
Country/Territory | United States |
City | Newport Beach, CA |
Period | 4/21/05 → 4/23/05 |
Keywords
- DC programming
- Global optimization
- Sequential quadratic programming
- Support vectors