assignments:assignment4 [2016/10/06 15:09] asa [Part 4: Using SVMs]
assignments:assignment4 [2016/10/11 18:16] (current) asa [Part 2: leave-one-out error for linearly separable data]
In this question we will explore the leave-one-out error for a hard-margin SVM for a linearly separable dataset.
First, we define a set of //key support vectors// as a subset of the support vectors such that removal of any one vector from the set changes the maximum margin hyperplane.
  * Consider the following statement: The set of all key support vectors is unique. Prove this, or show a counter-example.
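The definition above can be probed numerically before attempting a proof. The sketch below (an illustration only; the dataset and library choice are not part of the assignment) uses scikit-learn's `SVC` with a very large `C` to approximate a hard-margin SVM on a small made-up separable dataset, then refits after deleting each support vector in turn to see which removals actually move the hyperplane.

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable dataset (illustrative; not from the assignment).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

def fit_hyperplane(X, y):
    """Fit a linear SVM; a very large C approximates the hard margin."""
    clf = SVC(kernel="linear", C=1e6).fit(X, y)
    return clf.coef_[0], clf.intercept_[0], set(clf.support_)

w, b, support = fit_hyperplane(X, y)

# A support vector is a candidate "key" vector if refitting without it
# moves the maximum-margin hyperplane.
changed = {}
for i in support:
    mask = np.arange(len(X)) != i
    w_i, b_i, _ = fit_hyperplane(X[mask], y[mask])
    changed[i] = not (np.allclose(w, w_i, atol=1e-3)
                      and np.isclose(b, b_i, atol=1e-3))

print("support vectors:", sorted(support))
print("hyperplane moved after removal:", sorted(i for i in support if changed[i]))
```

This is only an empirical check under the stated approximation; it can suggest a counter-example or build intuition, but the question still asks for a proof or an explicit counter-example.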