Main.Projects History
Changed line 29 from:
Here are the links to the datasets you will use in your experiments:
to:
Here are the links to the datasets you will use in your feature selection experiments:
Added lines 28-33:
Here are the links to the datasets you will use in your experiments:
* [[http://archive.ics.uci.edu/ml/datasets/Gisette | Gisette]]
* [[http://archive.ics.uci.edu/ml/datasets/Dexter | Dexter]]
* [[http://archive.ics.uci.edu/ml/datasets/Arcene | Arcene]]
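These UCI challenge datasets ship as plain whitespace-separated text: one example per line in a `.data` file, with the +1/-1 class labels in a parallel `.labels` file. A minimal Python sketch of reading that format (the inline stand-in strings and the `gisette_train` filenames are illustrative only, not taken from the actual downloads):

```python
import numpy as np
from io import StringIO

# Gisette/Dexter/Arcene-style files: whitespace-separated feature values,
# one example per line; labels (+1/-1) live in a parallel .labels file.
# These StringIO objects stand in for e.g. open("gisette_train.data")
# and open("gisette_train.labels") after downloading from UCI.
data_file = StringIO("0 5 0 12\n3 0 0 7\n")
labels_file = StringIO("1\n-1\n")

X = np.loadtxt(data_file)    # shape (n_examples, n_features)
y = np.loadtxt(labels_file)  # shape (n_examples,)

print(X.shape, y.tolist())
```

Note that Dexter is sparse (`feature:value` pairs), so it needs its own parser rather than `loadtxt`; the dense format above applies to Gisette and Arcene.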
Changed lines 17-18 from:
* Hui Zou and Trevor Hastie. [[http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/full | Regularization and Variable Selection via the Elastic Net]]. JRSSB (2005) 67(2) 301-320. An R package elasticnet is available from CRAN.
to:
* Hui Zou and Trevor Hastie. [[http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/full | Regularization and Variable Selection via the Elastic Net]]. JRSSB (2005) 67(2) 301-320. An R package elasticnet is available from CRAN. (Majdi)
Changed line 23 from:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like SVMs are used in the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method. (Majdi)
to:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like SVMs are used in the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method.
Changed line 27 from:
* Christoph H. Lampert. [[http://books.nips.cc/papers/files/nips24/NIPS2011_0207.pdf | Maximum Margin Multi-Label Structured Prediction]]. Neural Information Processing Systems (NIPS), 2011.
to:
* Wei Bi and James Kwok. [[http://www.icml-2011.org/papers/10_icmlpaper.pdf | Multi-Label Classification on Tree- and DAG-Structured Hierarchies]]. International Conference on Machine Learning (ICML-11), 2011. (Indika)
Changed line 23 from:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like SVMs are used in the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method.
to:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like SVMs are used in the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method. (Majdi)
Changed line 11 from:
* Yeung K, Bumgarner R. [[http://genomebiology.com/2003/4/12/r83 | Multiclass classification of microarray data with repeated measurements: application to cancer]]. Genome Biology. 4:R83, 2003.
to:
* Yeung K, Bumgarner R. [[http://genomebiology.com/2003/4/12/r83 | Multiclass classification of microarray data with repeated measurements: application to cancer]]. Genome Biology. 4:R83, 2003. (Simon)
Changed line 13 from:
* Y. Sun, S. Todorovic, and S. Goodison. [[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.163.2272|Local Learning Based Feature Selection for High Dimensional Data Analysis]]. IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), vol. 32, no. 9, pp. 1610-1626, 2010.
to:
* Y. Sun, S. Todorovic, and S. Goodison. [[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.163.2272|Local Learning Based Feature Selection for High Dimensional Data Analysis]]. IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), vol. 32, no. 9, pp. 1610-1626, 2010. (Nand)
Changed line 19 from:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information Processing Systems (NIPS) 16, 2004. 1-norm classifiers create very sparse representations, so are useful for feature selection.
to:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information Processing Systems (NIPS) 16, 2004. 1-norm classifiers create very sparse representations, so are useful for feature selection. (Prathamesh)
Changed line 21 from:
* Feiping Nie, Heng Huang, Cai Xiao, Chris Ding. [[http://books.nips.cc/papers/files/nips23/NIPS2010_1149.pdf | Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization]]. NIPS 2010.
to:
* Feiping Nie, Heng Huang, Cai Xiao, Chris Ding. [[http://books.nips.cc/papers/files/nips23/NIPS2010_1149.pdf | Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization]]. NIPS 2010. (Rehab)
Changed line 9 from:
* Chris Ding and Hanchuan Peng. [[ http://ranger.uta.edu/~chqding/papers/mRMR_JBCB.pdf | Minimum redundancy feature selection from microarray gene expression data]]. Journal of Bioinformatics and Computational Biology, Vol. 3, No. 2, pp.185-205, 2005.
to:
* Chris Ding and Hanchuan Peng. [[ http://ranger.uta.edu/~chqding/papers/mRMR_JBCB.pdf | Minimum redundancy feature selection from microarray gene expression data]]. Journal of Bioinformatics and Computational Biology, Vol. 3, No. 2, pp.185-205, 2005. (Joel)
Changed line 23 from:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method.
to:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like SVMs are used in the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method.
Changed line 23 from:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like the RFE method.
to:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like the [[http://www.springerlink.com/content/w68424066825vr3l/|RFE]] method.
Changed line 19 from:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information Processing Systems (NIPS) 16, 2004.
to:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information Processing Systems (NIPS) 16, 2004. 1-norm classifiers create very sparse representations, so are useful for feature selection.
Changed line 27 from:
* Christoph H. Lampert. [[http://books.nips.cc/papers/files/nips24/NIPS2011_0207.pdf | Maximum Margin Multi-Label Structured Prediction. Neural Information Processing Systems(NIPS), 2011.
to:
* Christoph H. Lampert. [[http://books.nips.cc/papers/files/nips24/NIPS2011_0207.pdf | Maximum Margin Multi-Label Structured Prediction]]. Neural Information Processing Systems (NIPS), 2011.
Added lines 25-27:
!! Prediction of protein function
* Christoph H. Lampert. [[http://books.nips.cc/papers/files/nips24/NIPS2011_0207.pdf | Maximum Margin Multi-Label Structured Prediction. Neural Information Processing Systems(NIPS), 2011.
Changed lines 23-24 from:
* A. Rakotomamonjy. [[http://asi.insa-rouen.fr/enseignants/~arakotom/publi/rakotomamonjyROCAI2004.pdf | Optimizing AUC with SVMs]]. Proceedings of European Conference on Artificial Intelligence Workshop on ROC Curve and AI, Valencia, 2004. [[ http://asi.insa-rouen.fr/enseignants/~arakotom/toolbox/index.html | code]]. Can be used for feature selection like the RFE method.
to:
* T. Joachims. [[http://www.cs.cornell.edu/People/tj/publications/joachims_05a.pdf | A Support Vector Method for Multivariate Performance Measures]]. Proceedings of the International Conference on Machine Learning (ICML), 2005. [[http://svmlight.joachims.org/svm_perf.html | code]]. Can be used for feature selection like the RFE method.
Changed line 23 from:
* A. Rakotomamonjy. [[http://asi.insa-rouen.fr/enseignants/~arakotom/publi/rakotomamonjyROCAI2004.pdf | Optimizing AUC with SVMs]]. Proceedings of European Conference on Artificial Intelligence Workshop on ROC Curve and AI, Valencia, 2004.
to:
* A. Rakotomamonjy. [[http://asi.insa-rouen.fr/enseignants/~arakotom/publi/rakotomamonjyROCAI2004.pdf | Optimizing AUC with SVMs]]. Proceedings of European Conference on Artificial Intelligence Workshop on ROC Curve and AI, Valencia, 2004. [[ http://asi.insa-rouen.fr/enseignants/~arakotom/toolbox/index.html | code]]. Can be used for feature selection like the RFE method.
Added line 23:
* A. Rakotomamonjy. [[http://asi.insa-rouen.fr/enseignants/~arakotom/publi/rakotomamonjyROCAI2004.pdf | Optimizing AUC with SVMs]]. Proceedings of European Conference on Artificial Intelligence Workshop on ROC Curve and AI, Valencia, 2004.
Changed lines 13-14 from:
* Y. Sun, S. Todorovic, and S. Goodison. [[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.163.2272
| Local Learning Based Feature Selection for High Dimensional Data Analysis]]. IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), vol. 32, no. 9, pp. 1610-1626, 2010.
to:
* Y. Sun, S. Todorovic, and S. Goodison. [[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.163.2272|Local Learning Based Feature Selection for High Dimensional Data Analysis]]. IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), vol. 32, no. 9, pp. 1610-1626, 2010.
Added lines 12-14:
* Y. Sun, S. Todorovic, and S. Goodison. [[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.163.2272
| Local Learning Based Feature Selection for High Dimensional Data Analysis]]. IEEE Trans. on Pattern Analysis and Machine Intelligence (TPAMI), vol. 32, no. 9, pp. 1610-1626, 2010.
Changed lines 11-14 from:
* Yeung K, Bumgarner R. [[http://genomebiology.com/2003/4/12/r83
| Multiclass classification of microarray data
with repeated measurements: application to cancer]]. Genome
Biol. 4:R83, 2003.
to:
* Yeung K, Bumgarner R. [[http://genomebiology.com/2003/4/12/r83 | Multiclass classification of microarray data with repeated measurements: application to cancer]]. Genome Biology. 4:R83, 2003.
Added lines 11-14:
* Yeung K, Bumgarner R. [[http://genomebiology.com/2003/4/12/r83
| Multiclass classification of microarray data
with repeated measurements: application to cancer]]. Genome
Biol. 4:R83, 2003.
Changed lines 16-17 from:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information
Processing Systems (NIPS) 16, 2004.
to:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information Processing Systems (NIPS) 16, 2004.
Changed lines 16-17 from:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm
support vector machines]]. In: Neural Information
to:
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm support vector machines]]. In: Neural Information
Changed lines 14-15 from:
* Hui Zou and Trevor Hastie. [[http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/full | Regularization and Variable Selection via
the Elastic Net]]. JRSSB (2005) 67(2) 301-320. An R package elasticnet is available from CRAN.
to:
* Hui Zou and Trevor Hastie. [[http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/full | Regularization and Variable Selection via the Elastic Net]]. JRSSB (2005) 67(2) 301-320. An R package elasticnet is available from CRAN.
Changed line 14 from:
* Hui Zou and Trevor Hastie. [[http://www.stanford.edu/~hastie/Papers/B67.2%20(2005)%20301-320%20Zou%20&%20Hastie.pdf | Regularization and Variable Selection via
to:
* Hui Zou and Trevor Hastie. [[http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/full | Regularization and Variable Selection via
Changed line 21 from:
* Feiping Nie, Heng Huang, Cai Xiao, Chris Ding. [[http://books.nips.cc/papers/files/nips23/NIPS2010_1149.pdf | Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization]]. NIPS 2010.
to:
* Feiping Nie, Heng Huang, Cai Xiao, Chris Ding. [[http://books.nips.cc/papers/files/nips23/NIPS2010_1149.pdf | Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization]]. NIPS 2010.
Changed lines 9-10 from:
* Chris Ding and Hanchuan Peng. [[ http://ranger.uta.edu/~chqding/papers/mRMR_JBCB.pdf | Minimum redundancy feature selection from microarray gene expression data]].
Journal of Bioinformatics and Computational Biology, Vol. 3, No. 2, pp.185-205, 2005.
to:
* Chris Ding and Hanchuan Peng. [[ http://ranger.uta.edu/~chqding/papers/mRMR_JBCB.pdf | Minimum redundancy feature selection from microarray gene expression data]]. Journal of Bioinformatics and Computational Biology, Vol. 3, No. 2, pp.185-205, 2005.
Changed lines 3-23 from:
to:
There are two topics available for projects: feature selection and prediction of protein function. Your first step is to choose a paper whose method you will use/implement.
!!Feature selection
!!!Filter methods:
* Chris Ding and Hanchuan Peng. [[ http://ranger.uta.edu/~chqding/papers/mRMR_JBCB.pdf | Minimum redundancy feature selection from microarray gene expression data]].
Journal of Bioinformatics and Computational Biology, Vol. 3, No. 2, pp.185-205, 2005.
!!!Embedded feature selection methods:
* Hui Zou and Trevor Hastie. [[http://www.stanford.edu/~hastie/Papers/B67.2%20(2005)%20301-320%20Zou%20&%20Hastie.pdf | Regularization and Variable Selection via
the Elastic Net]]. JRSSB (2005) 67(2) 301-320. An R package elasticnet is available from CRAN.
* Ji Zhu, Saharon Rosset, Trevor Hastie and Rob Tibshirani. [[http://www.stat.lsa.umich.edu/~jizhu/pubs/Zhu-NIPS04.pdf | 1-norm
support vector machines]]. In: Neural Information
Processing Systems (NIPS) 16, 2004.
* Feiping Nie, Heng Huang, Cai Xiao, Chris Ding. [[http://books.nips.cc/papers/files/nips23/NIPS2010_1149.pdf | Efficient and Robust Feature Selection via Joint L2,1-Norms Minimization]]. NIPS 2010.
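The filter method listed first above (Ding and Peng's mRMR) greedily picks features that are maximally relevant to the class while minimally redundant with the features already chosen. A rough NumPy sketch of the idea on synthetic discrete data, as a starting point for the projects; this illustrates the difference criterion on toy data and is not the authors' code, and the function names are made up:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(x; y) for discrete 1-D arrays (in nats)."""
    n = len(x)
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    px = {v: np.mean(x == v) for v in set(x.tolist())}
    py = {v: np.mean(y == v) for v in set(y.tolist())}
    return sum(c / n * np.log((c / n) / (px[xi] * py[yi]))
               for (xi, yi), c in joint.items())

def mrmr_select(X, y, k):
    """Greedy mRMR (difference form): pick the feature maximizing
    relevance I(f; class) minus mean redundancy with selected features."""
    relevance = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        scores = {
            j: relevance[j]
               - np.mean([mutual_information(X[:, j], X[:, s])
                          for s in selected])
            for j in range(X.shape[1]) if j not in selected
        }
        selected.append(max(scores, key=scores.get))
    return selected

# Toy data: feature 0 equals the class label, feature 1 is a redundant
# copy of it, features 2-4 are coin-flip noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = np.column_stack([y, y] + [rng.integers(0, 2, 200) for _ in range(3)])
selected = mrmr_select(X, y, 2)
print(selected)  # feature 0 is chosen first: it is maximally relevant
```

The redundancy penalty is what separates mRMR from a plain relevance ranking: here feature 1 carries as much information about the class as feature 0, but adds nothing once feature 0 is in, so the penalty cancels its relevance. On the real Gisette/Dexter/Arcene data you would discretize continuous features first, as the paper does for the microarray data.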
