shark::KernelTargetAlignment< InputType, LabelType > Class Template Reference

Kernel Target Alignment - a measure of alignment of a kernel Gram matrix with labels. More...

#include <shark/ObjectiveFunctions/KernelTargetAlignment.h>

+ Inheritance diagram for shark::KernelTargetAlignment< InputType, LabelType >:

Public Member Functions

 KernelTargetAlignment (LabeledData< InputType, LabelType > const &dataset, AbstractKernelFunction< InputType > *kernel, bool centering=true)
 Construction of the Kernel Target Alignment (KTA) from a kernel object. More...
 
std::string name () const
 From INameable: return the class name. More...
 
SearchPointType proposeStartingPoint () const
 Return the current kernel parameters as a starting point for an optimization run. More...
 
std::size_t numberOfVariables () const
 Accesses the number of variables. More...
 
double eval (RealVector const &input) const
 Evaluate the (centered, negative) Kernel Target Alignment (KTA). More...
 
ResultType evalDerivative (const SearchPointType &input, FirstOrderDerivative &derivative) const
 Compute the derivative of the KTA as a function of the kernel parameters. More...
 
- Public Member Functions inherited from shark::AbstractObjectiveFunction< RealVector, double >
const Features & features () const
 
virtual void updateFeatures ()
 
bool hasValue () const
 returns whether this function can calculate its function value More...
 
bool hasFirstDerivative () const
 returns whether this function can calculate the first derivative More...
 
bool hasSecondDerivative () const
 returns whether this function can calculate the second derivative More...
 
bool canProposeStartingPoint () const
 returns whether this function can propose a starting point. More...
 
bool isConstrained () const
 returns whether this function is constrained. More...
 
bool hasConstraintHandler () const
 returns whether this function has a constraint handler. More...
 
bool canProvideClosestFeasible () const
 Returns whether this function can calculate the closest feasible point to an infeasible point. More...
 
bool isThreadSafe () const
 Returns true when the function can be used in parallel threads. More...
 
bool isNoisy () const
 Returns true when the function is noisy. More...
 
 AbstractObjectiveFunction ()
 Default ctor. More...
 
virtual ~AbstractObjectiveFunction ()
 Virtual destructor. More...
 
virtual void init ()
 
void setRng (random::rng_type *rng)
 Sets the Rng used by the objective function. More...
 
virtual bool hasScalableDimensionality () const
 
virtual void setNumberOfVariables (std::size_t numberOfVariables)
 Adjusts the number of variables if the function is scalable. More...
 
virtual std::size_t numberOfObjectives () const
 
virtual bool hasScalableObjectives () const
 
virtual void setNumberOfObjectives (std::size_t numberOfObjectives)
 Adjusts the number of objectives if the function is scalable. More...
 
std::size_t evaluationCounter () const
 Accesses the evaluation counter of the function. More...
 
AbstractConstraintHandler< SearchPointType > const & getConstraintHandler () const
 Returns the constraint handler of the function if it has one. More...
 
virtual bool isFeasible (const SearchPointType &input) const
 Tests whether a point in SearchSpace is feasible, e.g., whether the constraints are fulfilled. More...
 
virtual void closestFeasible (SearchPointType &input) const
 If supported, the supplied point is repaired such that it satisfies all of the function's constraints. More...
 
ResultType operator() (SearchPointType const &input) const
 Evaluates the function. Useful together with STL-Algorithms like std::transform. More...
 
virtual ResultType evalDerivative (SearchPointType const &input, SecondOrderDerivative &derivative) const
 Evaluates the objective function and calculates its gradient. More...
 
- Public Member Functions inherited from shark::INameable
virtual ~INameable ()
 

Additional Inherited Members

- Public Types inherited from shark::AbstractObjectiveFunction< RealVector, double >
enum  Feature
 List of features that are supported by an implementation. More...
 
typedef RealVector SearchPointType
 
typedef double ResultType
 
typedef boost::mpl::if_< std::is_arithmetic< double >, SearchPointType, RealMatrix >::type FirstOrderDerivative
 
typedef TypedFlags< Feature > Features
 This statement declares the member m_features. See Core/Flags.h for details. More...
 
typedef TypedFeatureNotAvailableException< Feature > FeatureNotAvailableException
 
- Protected Member Functions inherited from shark::AbstractObjectiveFunction< RealVector, double >
void announceConstraintHandler (AbstractConstraintHandler< SearchPointType > const *handler)
 helper function which is called to announce the presence of a constraint handler. More...
 
- Protected Attributes inherited from shark::AbstractObjectiveFunction< RealVector, double >
Features m_features
 
std::size_t m_evaluationCounter
 Evaluation counter, default value: 0. More...
 
AbstractConstraintHandler< SearchPointType > const * m_constraintHandler
 
random::rng_type * mep_rng
 

Detailed Description

template<class InputType = RealVector, class LabelType = unsigned int>
class shark::KernelTargetAlignment< InputType, LabelType >

Kernel Target Alignment - a measure of alignment of a kernel Gram matrix with labels.

The Kernel Target Alignment (KTA) was originally proposed in the paper:
On Kernel-Target Alignment. N. Cristianini, J. Shawe-Taylor, A. Elisseeff, J. Kandola. Innovations in Machine Learning, 2006.
Here we provide a version with centering of the features as proposed in the paper:
Two-Stage Learning Kernel Algorithms. C. Cortes, M. Mohri, A. Rostamizadeh. ICML 2010.
The kernel target alignment is defined as

\[ \hat A = \frac{\langle K, y y^T \rangle}{\sqrt{\langle K, K \rangle \cdot \langle y y^T, y y^T \rangle}} \]

where K is the kernel Gram matrix of the data and y is the vector of +1/-1 valued labels. The outer product \( y y^T \) corresponds to an "ideal" Gram matrix corresponding to a kernel that maps the two classes each to a single point, thus minimizing within-class distance for fixed inter-class distance. The inner products denote the Frobenius product of matrices: http://en.wikipedia.org/wiki/Matrix_multiplication#Frobenius_product
In kernel-based learning, the kernel Gram matrix K is of the form

\[ K_{i,j} = k(x_i, x_j) = \langle \phi(x_i), \phi(x_j) \rangle \]

for a Mercer kernel function k and inputs \( x_i, x_j \). In this version of the KTA we use centered feature vectors. Let

\[ \psi(x_i) = \phi(x_i) - \frac{1}{\ell} \sum_{j=1}^{\ell} \phi(x_j) \]

denote the centered feature vectors, then the centered Gram matrix \( K^c \) is given by

\[ K^c_{i,j} = \langle \psi(x_i), \psi(x_j) \rangle = K_{i,j} - \frac{1}{\ell} \sum_{n=1}^{\ell} \left( K_{i,n} + K_{j,n} \right) + \frac{1}{\ell^2} \sum_{m,n=1}^{\ell} K_{m,n} \]

The alignment measure computed by this class is exactly the same formula for \( \hat A \), but with \( K^c \) plugged in in place of \( K \).
KTA measures the normalized Frobenius inner product between the kernel Gram matrix and this ideal matrix; the interpretation is that it measures how well a given kernel fits a classification problem. The measure is invariant under rescaling of the kernel. Since objective functions in Shark are minimized by convention, the negative alignment \( - \hat A \) is implemented. The measure is extended to multi-class problems by using prototype vectors instead of scalar labels.
The following properties of KTA are important from a model selection point of view: it is relatively fast and easy to compute, it is differentiable w.r.t. the kernel function, and it is independent of the actual classifier.
The following notation is used in several of the methods of the class. \( K^c \) denotes the centered Gram matrix, y is the vector of labels, Y is the outer product of this vector with itself, k is the row (or column) wise average of the uncentered Gram matrix K, my is the label average, and u is the vector of all ones, and \( \ell \) is the number of data points, and thus the size of the Gram matrix.

Definition at line 103 of file KernelTargetAlignment.h.

Constructor & Destructor Documentation

◆ KernelTargetAlignment()

Member Function Documentation

◆ eval()

template<class InputType = RealVector, class LabelType = unsigned int>
double shark::KernelTargetAlignment< InputType, LabelType >::eval ( RealVector const &  input) const
inline, virtual

Evaluate the (centered, negative) Kernel Target Alignment (KTA).

See the class description for more details on this computation.

Reimplemented from shark::AbstractObjectiveFunction< RealVector, double >.

Definition at line 148 of file KernelTargetAlignment.h.

References shark::IParameterizable< VectorType >::setParameterVector().

◆ evalDerivative()

template<class InputType = RealVector, class LabelType = unsigned int>
ResultType shark::KernelTargetAlignment< InputType, LabelType >::evalDerivative ( const SearchPointType & input, FirstOrderDerivative & derivative ) const
inline, virtual

Compute the derivative of the KTA as a function of the kernel parameters.

It holds:

\[ \langle K^c, K^c \rangle = \langle K,K \rangle - 2 \ell \langle k,k \rangle + mk^2 \ell^2 \\ (\langle K^c, K^c \rangle)' = 2 \langle K,K' \rangle - 4 \ell \Big\langle k, \frac{1}{\ell} \sum_j K'_{ij} \Big\rangle + 2 \ell^2 mk \sum_{ij} \frac{1}{\ell^2} K'_{ij} \\ = 2 \langle K,K' \rangle - 4 \Big\langle k, \sum_j K'_{ij} \Big\rangle + 2 mk \sum_{ij} K'_{ij} \\ = 2 \langle K,K' \rangle - 4 \langle k u^T, K' \rangle + 2 mk \langle u u^T, K' \rangle \\ = 2 \langle K - 2 k u^T + mk \, u u^T, K' \rangle \\ \langle Y, K^c \rangle = \langle Y, K \rangle - 2 \ell \langle y, k \rangle + \ell^2 \, my \, mk \\ (\langle Y, K^c \rangle)' = \langle Y - 2 y u^T + my \, u u^T, K' \rangle \]

The derivative is then computed from these values in a second sweep over the data:

\[ \hat A' = \frac{1}{\langle K^c,K^c \rangle^{3/2}} \left( \langle K^c,K^c \rangle (\langle Y,K^c \rangle)' - \frac{1}{2} \langle Y, K^c \rangle (\langle K^c, K^c \rangle)' \right) \\ = \frac{1}{\langle K^c,K^c \rangle^{3/2}} \Big\langle \langle K^c,K^c \rangle (Y - 2 y u^T + my \, u u^T) - \langle Y, K^c \rangle (K - 2 k u^T + mk \, u u^T), K' \Big\rangle \\ = \frac{1}{\langle K^c,K^c \rangle^{3/2}} \langle W, K' \rangle \]

Reordering results in

\[ W= \langle K^c,K^c \rangle Y - \langle Y, K^c \rangle K \\ - 2 (\langle K^c,K^c \rangle y - \langle Y, K^c \rangle k) u^T \\ + (\langle K^c,K^c \rangle my - \langle Y, K^c \rangle mk) u u^T \]

where \( K' \) is the derivative of K with respect to the kernel parameters.

Reimplemented from shark::AbstractObjectiveFunction< RealVector, double >.

Definition at line 174 of file KernelTargetAlignment.h.

References shark::LabeledData< InputT, LabelT >::batch(), shark::batchSize(), shark::AbstractKernelFunction< InputTypeT >::createState(), shark::AbstractKernelFunction< InputTypeT >::eval(), shark::LabeledData< InputT, LabelT >::numberOfBatches(), shark::IParameterizable< VectorType >::numberOfParameters(), shark::IParameterizable< VectorType >::setParameterVector(), SHARK_PARALLEL_FOR, and shark::AbstractKernelFunction< InputTypeT >::weightedParameterDerivative().

◆ name()

template<class InputType = RealVector, class LabelType = unsigned int>
std::string shark::KernelTargetAlignment< InputType, LabelType >::name ( ) const
inline, virtual

From INameable: return the class name.

Reimplemented from shark::INameable.

Definition at line 132 of file KernelTargetAlignment.h.

◆ numberOfVariables()

template<class InputType = RealVector, class LabelType = unsigned int>
std::size_t shark::KernelTargetAlignment< InputType, LabelType >::numberOfVariables ( ) const
inline, virtual

Accesses the number of variables.

◆ proposeStartingPoint()

template<class InputType = RealVector, class LabelType = unsigned int>
SearchPointType shark::KernelTargetAlignment< InputType, LabelType >::proposeStartingPoint ( ) const
inline, virtual

Return the current kernel parameters as a starting point for an optimization run.

Reimplemented from shark::AbstractObjectiveFunction< RealVector, double >.

Definition at line 136 of file KernelTargetAlignment.h.


The documentation for this class was generated from the following file: shark/ObjectiveFunctions/KernelTargetAlignment.h