include/shark/Algorithms/Trainers/FisherLDA.h
//===========================================================================
/*!
 *
 *
 * \brief FisherLDA
 *
 *
 *
 * \author O. Krause
 * \date 2010
 *
 *
 * \par Copyright 1995-2017 Shark Development Team
 *
 * <BR><HR>
 * This file is part of Shark.
 * <http://shark-ml.org/>
 *
 * Shark is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Lesser General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * Shark is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License
 * along with Shark. If not, see <http://www.gnu.org/licenses/>.
 *
 */
//===========================================================================
#ifndef SHARK_ALGORITHMS_TRAINERS_FISHERLDA_H
#define SHARK_ALGORITHMS_TRAINERS_FISHERLDA_H

#include <shark/Core/DLLSupport.h>
#include <shark/Models/LinearModel.h>
#include <shark/Algorithms/Trainers/AbstractTrainer.h>

namespace shark {
/*!
 * \brief Fisher's Linear Discriminant Analysis for data compression
 *
 * Similar to PCA, \em Fisher's \em Linear \em Discriminant \em Analysis is a
 * method for reducing the data's dimensionality. In contrast to PCA it also
 * uses class information.
 *
 * Consider the data's covariance matrix \f$ S \f$ and a unit vector \f$ u \f$
 * which defines a one-dimensional subspace of the data. PCA would then
 * maximize the objective \f$ J(u) = u^T S u \f$, namely the data's variance in
 * the subspace. Fisher-LDA, however, maximizes
 * \f[
 *   J(u) = ( u^T S_W u )^{-1} ( u^T S_B u ),
 * \f]
 * where \f$ S_B \f$ is the covariance matrix of the class means and \f$ S_W \f$
 * is the average covariance matrix of all classes (in both cases, each class's
 * influence is weighted by its size). As a result, Fisher-LDA finds a subspace
 * in which the class means are spread far apart while (on average) the
 * variance of each class stays small. This leads to good lower-dimensional
 * representations of the data in cases where the classes are linearly
 * separable.
 *
 * If a subspace with more than one dimension is requested, the above step is
 * executed repeatedly, each time finding the next optimal subspace dimension
 * orthogonal to the ones found before.
 *
 * \b Note: the maximum dimensionality of the subspace is \#NumOfClasses-1.
 *
 * The number of subspace dimensions can be chosen by calling
 * setSubspaceDimensions or via the constructor. Optionally, whitening can be
 * applied. For more detailed information about Fisher-LDA, see \e Bishop,
 * \e Pattern \e Recognition \e and \e Machine \e Learning.
 */
class FisherLDA : public AbstractTrainer<LinearModel<>, unsigned int>
{
public:
	/// Constructor
	SHARK_EXPORT_SYMBOL FisherLDA(bool whitening = false, std::size_t subspaceDimension = 0);

	/// \brief From INameable: return the class name.
	std::string name() const
	{ return "Fisher-LDA"; }

	void setSubspaceDimensions(std::size_t dimensions){
		m_subspaceDimensions = dimensions;
	}

	std::size_t subspaceDimensions() const{
		return m_subspaceDimensions;
	}

	/// check whether whitening mode is on
	bool whitening() const{
		return m_whitening;
	}

	/// if active, the model whitens the inputs
	void setWhitening(bool newWhitening){
		m_whitening = newWhitening;
	}

	/// Compute the FisherLDA solution for a multi-class problem.
	SHARK_EXPORT_SYMBOL void train(LinearModel<>& model, LabeledData<RealVector, unsigned int> const& dataset);

protected:
	SHARK_EXPORT_SYMBOL void meanAndScatter(LabeledData<RealVector, unsigned int> const& dataset, RealVector& mean, RealMatrix& scatter);

	bool m_whitening;
	std::size_t m_subspaceDimensions;
};


}
#endif
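For orientation, the following is a minimal usage sketch, not part of the header above: it builds a small illustrative three-class dataset with createLabeledDataFromRange, trains a LinearModel<> with FisherLDA, and applies the resulting projection to compress the inputs to two dimensions. The toy data and its numeric values are purely hypothetical; in practice you would load your own LabeledData<RealVector, unsigned int>.

#include <shark/Algorithms/Trainers/FisherLDA.h>
#include <shark/Models/LinearModel.h>
#include <shark/Data/Dataset.h>
#include <vector>

int main(){
	using namespace shark;

	// Illustrative toy data: three classes in 3 dimensions, 10 points each.
	std::vector<RealVector> inputs;
	std::vector<unsigned int> labels;
	for(unsigned int c = 0; c != 3; ++c){
		for(std::size_t i = 0; i != 10; ++i){
			RealVector x(3);
			x(0) = 5.0 * c + 0.1 * i;   // strongly class-dependent direction
			x(1) = 2.0 * c;             // another class-dependent direction
			x(2) = 0.01 * i;            // direction carrying little class information
			inputs.push_back(x);
			labels.push_back(c);
		}
	}
	LabeledData<RealVector, unsigned int> data = createLabeledDataFromRange(inputs, labels);

	LinearModel<> projection;       // receives the learned linear map
	FisherLDA trainer(false, 2);    // no whitening, 2-dimensional subspace (at most #classes - 1)
	trainer.train(projection, data);

	// Apply the trained model to project the inputs onto the Fisher-LDA subspace.
	Data<RealVector> compressed = projection(data.inputs());
}

The constructor arguments correspond to the setters documented above: the whitening flag can also be toggled later via setWhitening, and the subspace dimensionality via setSubspaceDimensions before calling train.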