assignments:assignment4 [CS545 fall 2016]


  
  * Consider the following statement: The set of all key support vectors is unique. Prove this, or show a counter-example.
  * In class we argued that the fraction of examples that are support vectors provides a bound on the leave-one-out error. Using the definition of key support vectors, prove that a tighter bound on the leave-one-out cross-validation error can be obtained:
$$
E_{cv} \leq \frac{\textrm{number of key support vectors}}{N},
$$
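For comparison, the bound argued in class can be written as:

$$
E_{cv} \leq \frac{\textrm{number of support vectors}}{N}.
$$

Assuming (per the in-class definition) that every key support vector is itself a support vector, the numerator in the requested bound can only be smaller, which is why the key-support-vector bound is at least as tight.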
  
Suppose you are given a linearly separable dataset, and you are training the soft-margin SVM, which uses slack variables with the soft-margin constant $C$ set to some positive value.
Consider the following statement:
  
Since increasing the $\xi_i$ can only increase the objective of the primal problem (which
we are trying to minimize), at the solution to the primal problem, all the
training examples will have $\xi_i$ equal to zero.
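Before writing a proof or counter-example, it can help to probe the statement numerically. The setup below is purely illustrative and is our own assumption, not part of the assignment: a hypothetical two-point dataset at $x = \pm 1$, with the bias fixed to zero by symmetry, so the soft-margin primal objective collapses to a one-dimensional function of $w$ that a simple grid scan can minimize.

```python
# Illustrative sketch only: a hypothetical two-point dataset x = -1 (y = -1)
# and x = +1 (y = +1), with decision function f(x) = w*x (bias b = 0 by symmetry).
# Both margin constraints y_i * w * x_i >= 1 - xi_i reduce to w >= 1 - xi,
# so each point needs slack xi = max(0, 1 - w), and the soft-margin primal
# objective 0.5*||w||^2 + C * sum(xi_i) becomes a 1-D function of w.

def primal_objective(w, C):
    xi = max(0.0, 1.0 - w)          # slack required by each of the two points
    return 0.5 * w * w + C * 2.0 * xi

C = 0.1                              # a small soft-margin constant
candidates = [i / 1000.0 for i in range(2001)]   # grid scan over w in [0, 2]
w_best = min(candidates, key=lambda w: primal_objective(w, C))
xi_best = max(0.0, 1.0 - w_best)

print(w_best, xi_best)               # prints: 0.2 0.8
```

Even though this toy dataset is separable (taking $w = 1$ with all $\xi_i = 0$ is feasible), the grid minimizer has $\xi_i > 0$: when $C$ is small, shrinking $w$ saves more on $\frac{1}{2}\|w\|^2$ than the slack penalty costs. This is only a numerical probe, not the argument the assignment asks for.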
assignments/assignment4.txt · Last modified: 2016/10/11 18:16 by asa