code:ridge_regression [CS545 fall 2016]


Ridge Regression
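
Ridge regression minimizes the regularized squared error $\|Xw - y\|^2 + \lambda\|w\|^2$; setting the gradient to zero yields the normal equations

$$ (X^\top X + \lambda I)\, w = X^\top y, $$

which the code below solves directly with `np.linalg.solve` rather than forming an explicit matrix inverse.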

ridge_regression.py
import numpy as np
from PyML.classifiers.baseClassifiers import Classifier
from PyML.evaluators import resultsObjects
 
"""
An implementation of ridge regression.
This is a simpler version than the one in PyML (see classifiers/ridgeRegression).
It works with the PyVectorDataSet container
"""
 
class RidgeRegression (Classifier) :
 
    """
    An implementation of ridge regression
 
    :Keywords:
      - `ridge` -- the ridge parameter [default: 10.0]
      - `kernel` -- a kernel object [default: Linear]
      - `regression` -- whether to use the object for regression [default: False]
        in its default (False), it is used as a classifier
      - `fit_bias` -- whether to incorporate a bias term [default: True]
 
    """
 
    attributes = {'ridge': 10.0,
                  'regression' : False,
                  'fit_bias' : True}
 
    def __init__(self, arg=None, **args) :
 
        Classifier.__init__(self, arg, **args)
        if self.regression :
            self.resultsObject = resultsObjects.RegressionResults
            self.classify = self.decisionFunc
 
 
    def train(self, data, **args) :
 
        Classifier.train(self, data, **args)
 
        if not self.regression and data.labels.numClasses != 2 :
            raise ValueError("not a binary classification problem")
 
        if self.fit_bias :
            data.addFeature('bias', [1.0 for i in range(len(data))])
 
        self.w = np.zeros(data.numFeatures)
        self.bias = 0.0
 
        Y = np.array(data.labels.Y)
        if not self.regression :
            Y = Y * 2 - 1
        self.w = np.linalg.solve(data.X.T.dot(data.X) + self.ridge * np.eye(data.numFeatures), data.X.T.dot(Y))
        # there are alternative ways of computing the weight vector which are not
        # as computationally efficient or numerically stable:
        #self.w = np.dot(np.linalg.inv(data.X.T.dot(data.X) + self.ridge * np.eye(data.numFeatures)), data.X.T.dot(Y))
        # the pseudo-inverse gives the unregularized least-squares solution:
        #self.w = np.dot(np.linalg.pinv(data.X), Y)
        if self.fit_bias :
            data.eliminateFeatures([data.numFeatures - 1])
            self.bias = self.w[-1]
            self.w = self.w[:-1]
 
        # this should be the last command in the train function
        self.log.trainingTime = self.getTrainingTime()
 
 
    def decisionFunc(self, data, i) :
 
        return np.dot(self.w, data.X[i]) + self.bias
 
    def classify(self, data, i) :
 
        score = self.decisionFunc(data, i)
        classification = 1 if score > 0 else 0
        return (classification, score)
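
The closed-form solution used in `train` can be checked without PyML. The sketch below (plain NumPy; the function name and synthetic data are illustrative, not part of the PyML API) fits ridge regression with a bias column exactly as the class does, and recovers the generating coefficients on noisy synthetic data:

```python
import numpy as np

def ridge_fit(X, y, ridge=10.0, fit_bias=True):
    """Solve (X^T X + ridge*I) w = X^T y; optionally absorb a bias column."""
    if fit_bias:
        # append a constant feature, mirroring data.addFeature('bias', ...)
        X = np.hstack([X, np.ones((X.shape[0], 1))])
    d = X.shape[1]
    w = np.linalg.solve(X.T.dot(X) + ridge * np.eye(d), X.T.dot(y))
    if fit_bias:
        return w[:-1], w[-1]  # split off the bias term, as in train()
    return w, 0.0

# synthetic regression data: y = 2*x0 - x1 + 3, plus a little noise
rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = X.dot([2.0, -1.0]) + 3.0 + 0.01 * rng.randn(200)
w, b = ridge_fit(X, y, ridge=0.1)
```

With 200 samples and a small ridge, `w` comes out close to `[2, -1]` and `b` close to `3`; increasing `ridge` shrinks the weights toward zero.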
code/ridge_regression.1379536698.txt.gz · Last modified: 2016/08/09 10:25 (external edit)