  * Consider the following statement: The set of all key support vectors is unique. Prove this, or show a counter-example.
  * In class we argued that the fraction of examples that are support vectors provides a bound on the leave-one-out error. Using the definition of key support vectors, prove that a tighter bound on the leave-one-out cross-validation error can be obtained (a numerical sanity check of the looser in-class bound follows the inequality below):
$$
E_{cv} \leq \frac{\textrm{number of key support vectors}}{N},
$$
where $N$ is the number of training examples.
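As a sanity check, the looser in-class bound (fraction of support vectors) can be compared numerically against the exact leave-one-out error. The sketch below is illustrative only: the toy dataset from ''make_blobs'', the linear kernel, and the large $C$ (used to approximate the hard-margin setting) are all assumptions, and since identifying //key// support vectors depends on the definition given in class, only the fraction-of-support-vectors bound is checked here.

<code python>
# Illustrative sketch (assumed setup, not part of the assignment):
# compare the exact leave-one-out error against the looser bound
# (#support vectors / N) from class.
from sklearn.datasets import make_blobs
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

# Toy separable data; a large C approximates the hard-margin SVM.
X, y = make_blobs(n_samples=40, centers=2, cluster_std=0.8, random_state=0)
N, C = len(y), 1e6

# Train on all N points and count the support vectors.
clf = SVC(kernel="linear", C=C).fit(X, y)
n_sv = len(clf.support_)

# Exact leave-one-out cross-validation error: retrain N times,
# each time holding out a single example.
loo_errors = 0
for tr, te in LeaveOneOut().split(X):
    m = SVC(kernel="linear", C=C).fit(X[tr], y[tr])
    loo_errors += int(m.predict(X[te])[0] != y[te][0])

print(f"E_cv = {loo_errors / N:.3f}")
print(f"#support vectors / N = {n_sv / N:.3f}  (looser in-class bound)")
</code>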
===== Part 3: Soft-margin SVM for separable data =====
Suppose you are given a linearly separable dataset, and you are training the soft-margin SVM, which uses slack variables, with the soft-margin constant $C$ set to some positive value.
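Before the statement, a small empirical sketch may help build intuition about how the slack variables $\xi_i = \max(0, 1 - y_i f(x_i))$ behave on separable data as $C$ varies. The toy dataset, the linear kernel, and the grid of $C$ values below are illustrative assumptions using scikit-learn, not part of the assignment.

<code python>
# Illustrative sketch (assumed setup): train a soft-margin SVM on a
# linearly separable dataset for several positive C values and inspect
# the slack variables xi_i = max(0, 1 - y_i * f(x_i)).
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Well-separated blobs => a linearly separable dataset.
X, y01 = make_blobs(n_samples=40, centers=2, cluster_std=0.5, random_state=1)
y = 2 * y01 - 1  # relabel to {-1, +1}

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    f = clf.decision_function(X)           # f(x) = <w, x> + b
    slack = np.maximum(0.0, 1.0 - y * f)   # slack variables at the optimum
    print(f"C={C:>6}: #SVs={len(clf.support_)}, "
          f"total slack={slack.sum():.3f}, max slack={slack.max():.3f}")
</code>

On separable data one would expect the slacks to vanish once $C$ is large enough, while for very small $C$ the optimizer may prefer a wider margin at the cost of nonzero slack; this is the trade-off the statement below concerns.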
Consider the following statement: