shark Namespace Reference

AbstractMultiObjectiveOptimizer. More...

Namespaces

 blas
 
 random
 
 statistics
 
 tags
 Tags are empty types which can be used as a function argument.
 

Classes

class  AbsoluteLoss
 absolute loss More...
 
class  AbstractBudgetMaintenanceStrategy
 This is the abstract interface for any budget maintenance strategy. More...
 
class  AbstractClustering
 Base class for clustering. More...
 
class  AbstractConstraintHandler
 Implements the base class for constraint handling. More...
 
class  AbstractCost
 Cost function interface. More...
 
class  AbstractKernelFunction
 Base class of all Kernel functions. More...
 
class  AbstractLinearSvmTrainer
 Super class of all linear SVM trainers. More...
 
class  AbstractLineSearchOptimizer
 Base class for line search methods. More...
 
class  AbstractLoss
 Loss function interface. More...
 
class  AbstractMetric
 
class  AbstractModel
 Base class for all Models. More...
 
class  AbstractMultiObjectiveOptimizer
 base class for abstract multi-objective optimizers for arbitrary search spaces. More...
 
class  AbstractNearestNeighbors
 Interface for Nearest Neighbor queries. More...
 
class  AbstractObjectiveFunction
 Super class of all objective functions for optimization and learning. More...
 
class  AbstractOptimizer
 An optimizer that optimizes general objective functions. More...
 
class  AbstractSingleObjectiveOptimizer
 Base class for all single objective optimizers. More...
 
class  AbstractStoppingCriterion
 Base class for stopping criteria of optimization algorithms. More...
 
class  AbstractSvmTrainer
 Super class of all kernelized (non-linear) SVM trainers. More...
 
class  AbstractTrainer
 Superclass of supervised learning algorithms. More...
 
class  AbstractUnsupervisedTrainer
 Superclass of unsupervised learning algorithms. More...
 
class  AbstractWeightedTrainer
 Superclass of weighted supervised learning algorithms. More...
 
class  AbstractWeightedUnsupervisedTrainer
 Superclass of weighted unsupervised learning algorithms. More...
 
struct  Ackley
 Multi-modal, non-convex Ackley benchmark function. More...
 
class  Adam
 Adaptive Moment Estimation Algorithm (ADAM) More...
 
struct  AdditiveEpsilonIndicator
 Implements the Additive approximation properties of sets. More...
 
class  ARDKernelUnconstrained
 Automatic relevance detection kernel for unconstrained parameter optimization. More...
 
class  BarsAndStripes
 Generates the Bars-And-Stripes problem. In this problem, a 4x4 image has either rows or columns of the same value. More...
 
class  BaseDCNonDominatedSort
 Divide-and-conquer algorithm for non-dominated sorting. More...
 
struct  Batch
 class which helps using different batch types More...
 
struct  Batch< blas::vector< T > >
 specialization for vectors which should be matrices in batch mode! More...
 
struct  Batch< detail::MatrixRowReference< M > >
 
struct  Batch< shark::blas::compressed_vector< T > >
 specialization for ublas compressed vectors which are compressed matrices in batch mode! More...
 
struct  Batch< WeightedDataPair< DataType, WeightType > >
 
struct  BatchTraits
 
struct  BatchTraits< blas::compressed_matrix< T > >
 
struct  BatchTraits< blas::dense_matrix_adaptor< T, blas::row_major > >
 
struct  BatchTraits< blas::matrix< T > >
 
struct  BatchTraits< WeightedDataBatch< DataType, WeightType > >
 
class  BFGS
 Broyden-Fletcher-Goldfarb-Shanno algorithm for unconstrained optimization. More...
 
class  BiasSolver
 
class  BiasSolverSimplex
 
class  BinaryLayer
 Layer of binary units taking values in {0,1}. More...
 
class  BinaryTree
 Super class of binary space-partitioning trees. More...
 
class  BipolarLayer
 Layer of bipolar units taking values in {-1,1}. More...
 
struct  BitflipMutator
 Bitflip mutation operator. More...
 
class  BlockMatrix2x2
 SVM regression matrix. More...
 
struct  BoxBasedShrinkingStrategy
 Takes a box-constrained QP-type problem and implements shrinking on it. More...
 
class  BoxConstrainedProblem
 Quadratic program with box constraints. More...
 
class  BoxConstrainedShrinkingProblem
 
class  BoxConstraintHandler
 
class  BoxedSVMProblem
 Boxed problem for alpha in [lower,upper]^n and equality constraints. More...
 
class  CachedMatrix
 Efficient quadratic matrix cache. More...
 
struct  CanBeCalled
 detects whether Functor(Argument) can be called. More...
 
struct  CanBeCalled< R(*)(T), Argument >
 
struct  CanBeCalled< R(T), Argument >
 
class  CARTree
 Classification and Regression Tree. More...
 
class  Centroids
 Clusters defined by centroids. More...
 
class  CG
 Conjugate-gradient method for unconstrained optimization. More...
 
class  Chessboard
 "chess board" problem for binary classification More...
 
struct  Cigar
 Convex quadratic benchmark function with single dominant axis. More...
 
class  CigarDiscus
 Convex quadratic benchmark function. More...
 
struct  CIGTAB1
 Multi-objective optimization benchmark function CIGTAB 1. More...
 
struct  CIGTAB2
 Multi-objective optimization benchmark function CIGTAB 2. More...
 
class  CircleInSquare
 
class  Classifier
 Conversion of real-valued or vector valued outputs to class labels. More...
 
class  ClusteringModel
 Abstract model with associated clustering object. More...
 
class  CMA
 Implements the CMA-ES. More...
 
struct  CMAChromosome
 Models a CMAChromosome of the elitist (MO-)CMA-ES that encodes strategy parameters. More...
 
class  CMACMap
 The CMACMap class represents a linear combination of piecewise constant functions. More...
 
class  CMAIndividual
 
class  CMSA
 Implements the CMSA. More...
 
class  CombinedObjectiveFunction
 Linear combination of objective functions. More...
 
class  ConcatenatedModel
 ConcatenatedModel concatenates two models such that the output of the first model is input to the second. More...
 
struct  ConstProxyReference
 sets the type of ProxyReference More...
 
struct  ConstrainedSphere
 Constrained Sphere function. More...
 
class  ContrastiveDivergence
 Implements k-step Contrastive Divergence described by Hinton et al. (2006). More...
 
class  Conv2DModel
 Convolutional Model for 2D image data. More...
 
class  CrossEntropy
 Error measure for classification tasks that can be used as the objective function for training. More...
 
class  CrossEntropyMethod
 Implements the Cross Entropy Method. More...
 
class  CrossValidationError
 Cross-validation error for selection of hyper-parameters. More...
 
struct  CrowdingDistance
 Implements the Crowding Distance of a Pareto front. More...
 
class  CSvmDerivative
 This class provides two main member functions for computing the derivative of a C-SVM hypothesis w.r.t. its hyperparameters. The constructor takes a pointer to a KernelClassifier and an SvmTrainer, under the assumption that the former was trained by the latter. It heavily accesses their members to calculate the derivative of the alpha and offset values w.r.t. the SVM hyperparameters, that is, the regularization parameter C and the kernel parameters. This is done in the member function prepareCSvmParameterDerivative called by the constructor. After this initial, heavier computation step, modelCSvmParameterDerivative can be called on an input sample to the SVM model, and the method will yield the derivative of the hypothesis w.r.t. the SVM hyperparameters. More...
 
class  CSVMProblem
 Problem formulation for binary C-SVM problems. More...
 
class  CSvmTrainer
 Training of C-SVMs for binary classification. More...
 
class  CVFolds
 
class  Data
 Data container. More...
 
class  DataDistribution
 A DataDistribution defines an unsupervised learning problem. More...
 
class  DataView
 Constant time Element-Lookup for Datasets. More...
 
class  DiagonalWithCircle
 
class  DifferenceKernelMatrix
 SVM ranking matrix. More...
 
struct  DiffPowers
 
class  DiscreteKernel
 Kernel on a finite, discrete space. More...
 
class  DiscreteLoss
 flexible loss for classification More...
 
struct  Discus
 Convex quadratic benchmark function. More...
 
class  DistantModes
 Creates a set of patterns (each later representing a mode) which are then randomly perturbed to create the data set. The dataset was introduced in Desjardins et al. (2010) (Parallel Tempering for training restricted Boltzmann machines, AISTATS 2010). More...
 
class  DoublePole
 
class  DropoutLayer
 
struct  DTLZ1
 Implements the benchmark function DTLZ1. More...
 
struct  DTLZ2
 Implements the benchmark function DTLZ2. More...
 
struct  DTLZ3
 Implements the benchmark function DTLZ3. More...
 
struct  DTLZ4
 Implements the benchmark function DTLZ4. More...
 
struct  DTLZ5
 Implements the benchmark function DTLZ5. More...
 
struct  DTLZ6
 Implements the benchmark function DTLZ6. More...
 
struct  DTLZ7
 Implements the benchmark function DTLZ7. More...
 
class  ElitistCMA
 Implements the elitist CMA-ES. More...
 
struct  ElitistSelection
 Survival selection to find the next parent set. More...
 
struct  ELLI1
 Multi-objective optimization benchmark function ELLI1. More...
 
struct  ELLI2
 Multi-objective optimization benchmark function ELLI2. More...
 
struct  Ellipsoid
 Convex quadratic benchmark function. More...
 
struct  EmptyState
 Default State of an Object which does not need a State. More...
 
struct  Energy
 The Energy function determining the Gibbs distribution of an RBM. More...
 
class  EnergyStoringTemperedMarkovChain
 Implements parallel tempering but also stores additional statistics on the energy differences. More...
 
class  EpsilonHingeLoss
 Hinge-loss for large margin regression. More...
 
class  EpsilonSvmTrainer
 Training of Epsilon-SVMs for regression. More...
 
struct  EPTournamentSelection
 Survival and mating selection to find the next parent set. More...
 
class  ErrorFunction
 Objective function for supervised learning. More...
 
class  EvaluationArchive
 Objective function wrapper storing all function evaluations. More...
 
class  ExactGradient
 
class  ExampleModifiedKernelMatrix
 
class  Exception
 Top-level exception class of the shark library. More...
 
struct  FastSigmoidNeuron
 Fast sigmoidal function, which does not need to compute an exponential function. More...
 
class  FisherLDA
 Fisher's Linear Discriminant Analysis for data compression. More...
 
struct  Fonseca
 Bi-objective real-valued benchmark function proposed by Fonseca and Flemming. More...
 
class  GaussianKernelMatrix
 Efficient special case if the kernel is Gaussian and the inputs are sparse vectors. More...
 
class  GaussianLayer
 A layer of Gaussian neurons. More...
 
class  GaussianRbfKernel
 Gaussian radial basis function kernel. More...
 
class  GaussianTaskKernel
 Special "Gaussian-like" kernel function on tasks. More...
 
class  GeneralizationLoss
 The generalization loss calculates the relative increase of the validation error compared to the minimum training error. More...
 
class  GeneralizationQuotient
 Stopping criterion monitoring the quotient of generalization loss and training progress. More...
 
class  GeneralQuadraticProblem
 Quadratic problem with only box constraints. Let K be the kernel matrix; then the problem has the form. More...
 
class  GibbsOperator
 Implements Block Gibbs Sampling related transition operators for various temperatures. More...
 
class  GridSearch
 Optimize by trying out a grid of configurations. More...
 
struct  GSP
 Real-valued benchmark function with two objectives. More...
 
class  HardClusteringModel
 Model for "hard" clustering. More...
 
class  HierarchicalClustering
 Clusters defined by a binary space partitioning tree. More...
 
struct  Himmelblau
 Multi-modal two-dimensional continuous Himmelblau benchmark function. More...
 
class  HingeLoss
 Hinge-loss for large margin classification. More...
 
class  HMGSelectionCriterion
 
class  HuberLoss
 Huber-loss for robust regression. More...
 
struct  HypervolumeApproximator
 Implements an FPRAS for approximating the volume of a set of high-dimensional objects. The algorithm is described in. More...
 
struct  HypervolumeCalculator
 Frontend for hypervolume calculation algorithms in m dimensions. More...
 
struct  HypervolumeCalculator2D
 Implementation of the exact hypervolume calculation in 2 dimensions. More...
 
struct  HypervolumeCalculator3D
 Implementation of the exact hypervolume calculation in 3 dimensions. More...
 
struct  HypervolumeCalculatorMDHOY
 Implementation of the exact hypervolume calculation in m dimensions. More...
 
struct  HypervolumeCalculatorMDWFG
 Implementation of the exact hypervolume calculation in m dimensions. More...
 
struct  HypervolumeContribution
 Frontend for hypervolume contribution algorithms in m dimensions. More...
 
struct  HypervolumeContribution2D
 Finds the smallest/largest Contributors given 2D points. More...
 
struct  HypervolumeContribution3D
 Finds the hypervolume contribution for points in 3D. More...
 
struct  HypervolumeContributionApproximator
 Approximately determines the point of a set contributing the least hypervolume. More...
 
struct  HypervolumeContributionMD
 Finds the hypervolume contribution for points in MD. More...
 
struct  HypervolumeIndicator
 Calculates the hypervolume covered by a front of non-dominated points. More...
 
struct  HypervolumeSubsetSelection2D
 Implementation of the exact hypervolume subset selection algorithm in 2 dimensions. More...
 
struct  IHR1
 Multi-objective optimization benchmark function IHR1. More...
 
struct  IHR2
 Multi-objective optimization benchmark function IHR 2. More...
 
struct  IHR3
 Multi-objective optimization benchmark function IHR3. More...
 
struct  IHR4
 Multi-objective optimization benchmark function IHR 4. More...
 
struct  IHR6
 Multi-objective optimization benchmark function IHR 6. More...
 
class  ImagePatches
 Given a set of images, draws a set of image patches of a given size. More...
 
class  INameable
 This class is an interface for all objects which can have a name. More...
 
class  IndexedIterator
 creates an Indexed Iterator, an Iterator which also carries index information using index() More...
 
class  IndexingIterator
 
class  IndicatorBasedMOCMA
 Implements the generational MO-CMA-ES. More...
 
class  IndicatorBasedRealCodedNSGAII
 Implements the NSGA-II. More...
 
struct  IndicatorBasedSelection
 Implements the well-known indicator-based selection strategy. More...
 
class  IndicatorBasedSteadyStateMOCMA
 Implements the \((\mu+1)\)-MO-CMA-ES. More...
 
class  Individual
 Individual is a simple templated class modelling an individual that acts as a candidate solution in an evolutionary algorithm. More...
 
class  IParameterizable
 Top level interface for everything that holds parameters. More...
 
class  IRpropMinus
 This class offers methods for the usage of the improved Resilient-Backpropagation-algorithm without weight-backtracking. More...
 
class  IRpropPlus
 This class offers methods for the usage of the improved Resilient-Backpropagation-algorithm with weight-backtracking. More...
 
class  IRpropPlusFull
 
class  ISerializable
 Abstracts serializing functionality. More...
 
class  IterativeNNQuery
 Iterative nearest neighbors query. More...
 
class  JaakkolaHeuristic
 Jaakkola's heuristic and related quantities for Gaussian kernel selection. More...
 
class  KDTree
 KD-tree, a binary space-partitioning tree. More...
 
class  KernelBudgetedSGDTrainer
 Budgeted stochastic gradient descent training for kernel-based models. More...
 
struct  KernelClassifier
 Linear classifier in a kernel feature space. More...
 
class  KernelExpansion
 Linear model in a kernel feature space. More...
 
class  KernelMatrix
 Kernel Gram matrix. More...
 
class  KernelMeanClassifier
 Kernelized mean-classifier. More...
 
class  KernelSGDTrainer
 Generic stochastic gradient descent training for kernel-based models. More...
 
class  KernelTargetAlignment
 Kernel Target Alignment - a measure of alignment of a kernel Gram matrix with labels. More...
 
struct  KeyValuePair
 Represents a Key-Value-Pair, similar to std::pair, which is strictly ordered by its key. More...
 
class  KHCTree
 KHC-tree, a binary space-partitioning tree. More...
 
class  LabeledData
 Data set for supervised learning. More...
 
class  LabeledDataDistribution
 A LabeledDataDistribution defines a supervised learning problem. More...
 
class  LabelOrder
 This will normalize the labels of a given dataset to 0..N-1. More...
 
class  LassoRegression
 LASSO Regression. More...
 
class  LBFGS
 Limited-Memory Broyden-Fletcher-Goldfarb-Shanno algorithm. More...
 
class  LCTree
 LC-tree, a binary space-partitioning tree. More...
 
class  LDA
 Linear Discriminant Analysis (LDA) More...
 
struct  LibSVMSelectionCriterion
 Computes the maximum gain solution. More...
 
class  LinearClassifier
 Basic linear classifier. More...
 
class  LinearCSvmTrainer
 
class  LinearKernel
 Linear Kernel, parameter free. More...
 
class  LinearModel
 Linear Prediction with optional activation function. More...
 
struct  LinearNeuron
 Linear activation Neuron. More...
 
struct  LinearRankingSelection
 Implements a fitness-proportional selection scheme for mating selection that scales the fitness values linearly before carrying out the actual selection. More...
 
class  LinearRegression
 Linear Regression. More...
 
class  LinearSAGTrainer
 Stochastic Average Gradient Method for training of linear models. More...
 
class  LineSearch
 Wrapper for the linesearch class of functions in the linear algebra library. More...
 
class  LMCMA
 Implements a Limited-Memory-CMA. More...
 
struct  LogisticNeuron
 Neuron which computes the logistic (sigmoid) function with range [0,1]. More...
 
class  LogisticRegression
 Trainer for Logistic regression. More...
 
class  LooError
 Leave-one-out error objective function. More...
 
class  LooErrorCSvm
 Leave-one-out error, specifically optimized for C-SVMs. More...
 
class  LRUCache
 Implements an LRU-Caching Strategy for arbitrary Cache-Lines. More...
 
struct  LZ1
 Multi-objective optimization benchmark function LZ1. More...
 
struct  LZ2
 Multi-objective optimization benchmark function LZ2. More...
 
struct  LZ3
 Multi-objective optimization benchmark function LZ3. More...
 
struct  LZ4
 Multi-objective optimization benchmark function LZ4. More...
 
struct  LZ5
 Multi-objective optimization benchmark function LZ5. More...
 
struct  LZ6
 Multi-objective optimization benchmark function LZ6. More...
 
struct  LZ7
 Multi-objective optimization benchmark function LZ7. More...
 
struct  LZ8
 Multi-objective optimization benchmark function LZ8. More...
 
struct  LZ9
 
class  MarkovChain
 A single Markov chain. More...
 
class  MarkovPole
 
struct  MaximumGainCriterion
 Working set selection by maximization of the dual objective gain. More...
 
struct  MaximumGradientCriterion
 Working set selection by maximization of the projected gradient. More...
 
class  MaxIterations
 This stopping criterion stops after a fixed number of iterations. More...
 
class  McPegasos
 Pegasos solver for linear multi-class support vector machines. More...
 
class  MeanModel
 Calculates the weighted mean of a set of models. More...
 
class  MergeBudgetMaintenanceStrategy
 Budget maintenance strategy that merges two vectors. More...
 
class  MergeBudgetMaintenanceStrategy< RealVector >
 Budget maintenance strategy merging vectors. More...
 
class  MissingFeaturesKernelExpansion
 Kernel expansion with missing features support. More...
 
class  MissingFeatureSvmTrainer
 Trainer for binary SVMs natively supporting missing features. More...
 
class  MklKernel
 Weighted sum of kernel functions. More...
 
class  MNIST
 Reads in the famous MNIST data in possibly binarized form. The MNIST database itself is not included in Shark; this class just helps loading it. More...
 
class  ModelKernel
 Kernel function that uses a Model as transformation function for another kernel. More...
 
class  ModifiedKernelMatrix
 Modified Kernel Gram matrix. More...
 
class  MOEAD
 Implements the MOEA/D algorithm. More...
 
class  MonomialKernel
 Monomial kernel. Calculates \( \left\langle x_1, x_2 \right\rangle^m \), where the exponent m is given by the parameter m_exponent. More...
 
class  MultiChainApproximator
 Approximates the gradient by taking samples from an ensemble of Markov chains running in parallel. More...
 
class  MultiNomialDistribution
 Implements a multinomial distribution. More...
 
class  MultiObjectiveBenchmark
 Creates a multi-objective Benchmark from a set of given single objective functions. More...
 
class  MultiTaskKernel
 Special kernel function for multi-task and transfer learning. More...
 
struct  MultiTaskSample
 Aggregation of input data and task index. More...
 
class  MultiVariateNormalDistribution
 Implements a multi-variate normal distribution with zero mean. More...
 
class  MultiVariateNormalDistributionCholesky
 Multivariate normal distribution with zero mean using a Cholesky decomposition. More...
 
struct  MVPSelectionCriterion
 Computes the most violating pair of the problem. More...
 
class  NearestNeighborModel
 NearestNeighbor model for classification and regression. More...
 
class  NearestNeighborModel< InputType, unsigned int >
 
class  NegativeAUC
 Negative area under the curve. More...
 
class  NegativeGaussianProcessEvidence
 Evidence for model selection of a regularization network/Gaussian process. More...
 
class  NegativeLogLikelihood
 Computes the negative log likelihood of a dataset under a model. More...
 
class  NegativeWilcoxonMannWhitneyStatistic
 Negative Wilcoxon-Mann-Whitney statistic. More...
 
class  NestedGridSearch
 Nested grid search. More...
 
class  NeuronLayer
 
class  NonMarkovPole
 Objective function for single and double non-Markov poles. More...
 
class  NormalDistributedPoints
 Generates a set of normally distributed points. More...
 
class  NormalizeComponentsUnitInterval
 Train a model to normalize the components of a dataset to fit into the unit interval. More...
 
class  NormalizeComponentsUnitVariance
 Train a linear model to normalize the components of a dataset to unit variance, and optionally to zero mean. More...
 
class  NormalizeComponentsWhitening
 Train a linear model to whiten the data. More...
 
class  NormalizeComponentsZCA
 Train a linear model to whiten the data. More...
 
class  NormalizedKernel
 Normalized version of a kernel function. More...
 
class  NormalizeKernelUnitVariance
 Determine the scaling factor of a ScaledKernel so that it has unit variance in feature space on a given dataset. More...
 
class  Normalizer
 "Diagonal" linear model for data normalization. More...
 
struct  NormalizerNeuron
 
struct  NSGA3Indicator
 
class  OneClassSvmTrainer
 Training of one-class SVMs. More...
 
class  OneNormRegularizer
 One-norm of the input as an objective function. More...
 
struct  OnePointCrossover
 Implements one-point crossover. More...
 
class  OneVersusOneClassifier
 One-versus-one Classifier. More...
 
class  OptimizationTrainer
 Wrapper for training schemes based on (iterative) optimization. More...
 
class  PamiToy
 
struct  PartiallyMappedCrossover
 Implements partially mapped crossover. More...
 
class  PartlyPrecomputedMatrix
 Partly Precomputed version of a matrix for quadratic programming. More...
 
class  PCA
 Principal Component Analysis. More...
 
class  Pegasos
 Pegasos solver for linear (binary) support vector machines. More...
 
struct  PenalizingEvaluator
 Penalizing evaluator for scalar objective functions. More...
 
class  Perceptron
 Perceptron online learning algorithm. More...
 
class  PointSearch
 Optimize by trying out predefined configurations. More...
 
class  PointSetKernel
 Kernel function defined on sets of points. More...
 
class  PolynomialKernel
 Polynomial kernel. More...
 
struct  PolynomialMutator
 Polynomial mutation operator. More...
 
class  PopulationBasedStepSizeAdaptation
 Step size adaptation based on the success of the new population compared to the old. More...
 
class  PrecomputedMatrix
 Precomputed version of a matrix for quadratic programming. More...
 
class  ProductKernel
 Product of kernel functions. More...
 
class  ProjectBudgetMaintenanceStrategy
 Budget maintenance strategy that projects a vector. More...
 
class  ProjectBudgetMaintenanceStrategy< RealVector >
 Budget maintenance strategy that projects a vector. More...
 
class  ProxyIterator
 Creates an iterator which reinterprets an object as a range. More...
 
class  QpBoxLinear
 Quadratic program solver for box-constrained problems with linear kernel. More...
 
class  QpConfig
 Super class of all support vector machine trainers. More...
 
class  QpMcBoxDecomp
 
class  QpMcLinear
 Generic solver skeleton for linear multi-class SVM problems. More...
 
class  QpMcLinearADM
 Solver for the multi-class SVM with absolute margin and discriminative maximum loss. More...
 
class  QpMcLinearATM
 Solver for the multi-class SVM with absolute margin and total maximum loss. More...
 
class  QpMcLinearATS
 Solver for the multi-class SVM with absolute margin and total sum loss. More...
 
class  QpMcLinearCS
 Solver for the multi-class SVM by Crammer & Singer. More...
 
class  QpMcLinearLLW
 Solver for the multi-class SVM by Lee, Lin & Wahba. More...
 
class  QpMcLinearMMR
 Solver for the multi-class maximum margin regression SVM. More...
 
class  QpMcLinearReinforced
 Solver for the "reinforced" multi-class SVM. More...
 
class  QpMcLinearWW
 Solver for the multi-class SVM by Weston & Watkins. More...
 
class  QpMcSimplexDecomp
 
struct  QpSolutionProperties
 properties of the solution of a quadratic program More...
 
class  QpSolver
 Quadratic program solver. More...
 
class  QpSparseArray
 specialized container class for multi-class SVM problems More...
 
struct  QpStoppingCondition
 stopping conditions for quadratic programming More...
 
class  RadiusMarginQuotient
 radius margin quotient for binary SVMs More...
 
class  RankingSvmTrainer
 Training of an SVM for ranking. More...
 
class  RBFLayer
 Implements a layer of radial basis functions in a neural network. More...
 
class  RBM
 Stub for the RBM class. At the moment it is just a holder of the parameter set and the Energy. More...
 
class  RealCodedNSGAIII
 Implements the NSGA-III. More...
 
struct  RealSpace
 The RealSpace can't be enumerated. Infinite values are just too much. More...
 
struct  RectifierNeuron
 Rectifier Neuron f(x) = max(0,x) More...
 
struct  ReferenceVectorAdaptation
 Reference vector adaptation for the RVEA algorithm. More...
 
struct  ReferenceVectorGuidedSelection
 Implements the reference vector selection for the RVEA algorithm. More...
 
class  RegularizationNetworkTrainer
 Training of a regularization network. More...
 
class  RegularizedKernelMatrix
 Kernel Gram matrix with modified diagonal. More...
 
class  RemoveBudgetMaintenanceStrategy
 Budget maintenance strategy that removes a vector. More...
 
struct  ResultSet
 
class  RFClassifier
 Random Forest Classifier. More...
 
class  RFTrainer
 Random Forest. More...
 
class  RFTrainer< RealVector >
 
class  RFTrainer< unsigned int >
 
class  ROC
 ROC-Curve - false negatives over false positives. More...
 
struct  Rosenbrock
 Generalized Rosenbrock benchmark function. More...
 
struct  RotatedObjectiveFunction
 Rotates an objective function using a randomly initialized rotation. More...
 
struct  RouletteWheelSelection
 Fitness-proportional selection operator. More...
 
class  RpropMinus
 This class offers methods for the usage of the Resilient-Backpropagation-algorithm without weight-backtracking. More...
 
class  RpropPlus
 This class offers methods for the usage of the Resilient-Backpropagation-algorithm with weight-backtracking. More...
 
class  RVEA
 Implements the RVEA algorithm. More...
 
class  ScaledKernel
 Scaled version of a kernel function. More...
 
struct  Schwefel
 Convex benchmark function. More...
 
class  ScopedHandle
 
class  Shape
 Represents the Shape of an input or output. More...
 
class  Shark
 Allows for querying compile settings at runtime. Provides the current command line arguments to the rest of the library. More...
 
class  Shifter
 Shifter problem. More...
 
class  SimpleNearestNeighbors
 Brute force optimized nearest neighbor implementation. More...
 
class  SimplexDownhill
 Simplex Downhill Method. More...
 
struct  SimulatedBinaryCrossover
 Simulated binary crossover operator. More...
 
class  SingleChainApproximator
 Approximates the gradient by taking samples from a single Markov chain. More...
 
struct  SingleObjectiveResultSet
 Result set for single objective algorithm. More...
 
class  SinglePole
 
class  SMSEMOA
 Implements the SMS-EMOA. More...
 
class  SoftClusteringModel
 Model for "soft" clustering. More...
 
struct  SoftmaxNeuron
 
struct  Sphere
 Convex quadratic benchmark function. More...
 
class  SquaredEpsilonHingeLoss
 Hinge-loss for large margin regression using the squared two-norm. More...
 
class  SquaredHingeCSvmTrainer
 
class  SquaredHingeLinearCSvmTrainer
 
class  SquaredHingeLoss
 Squared Hinge-loss for large margin classification. More...
 
class  SquaredLoss
 squared loss for regression and classification More...
 
class  SquaredLoss< OutputType, unsigned int >
 
class  SquaredLoss< Sequence, Sequence >
 
struct  State
 Represents the State of an Object. More...
 
class  SteepestDescent
 Standard steepest descent. More...
 
class  SubrangeKernel
 Weighted sum of kernel functions. More...
 
class  SvmLogisticInterpretation
 Maximum-likelihood model selection score for binary support vector machines. More...
 
class  SvmProblem
 
class  SvmShrinkingProblem
 
struct  TanhNeuron
 Neuron which computes the hyperbolic tangent with range [-1,1]. More...
 
class  TemperedMarkovChain
 
class  Timer
 Timer abstraction with microsecond resolution. More...
 
struct  TournamentSelection
 Tournament selection operator. More...
 
class  TrainingError
 This stopping criterion tracks the improvement of the training error over an interval of iterations. More...
 
class  TrainingProgress
 This stopping criterion tracks the improvement of the training error over an interval of iterations. More...
 
struct  TransformedData
 
class  TreeConstruction
 Stopping criteria for tree construction. More...
 
class  TreeNearestNeighbors
 Nearest Neighbors implementation using binary trees. More...
 
class  TruncatedExponentialLayer
 A layer of truncated exponential neurons. More...
 
class  TrustRegionNewton
 Simple Trust-Region method based on the full Hessian matrix. More...
 
class  TukeyBiweightLoss
 Tukey's Biweight-loss for robust regression. More...
 
class  TwoNormRegularizer
 Two-norm of the input as an objective function. More...
 
class  TwoPointStepSizeAdaptation
 Step size adaptation based on the success of the new population compared to the old. More...
 
struct  TwoStateSpace
 The TwoStateSpace is a discrete Space with only two values, for example {0,1} or {-1,1}. More...
 
class  TypedFeatureNotAvailableException
 Exception indicating the attempt to use a feature which is not supported. More...
 
class  TypedFlags
 Flexible and extensible mechanisms for holding flags. More...
 
class  UniformCrossover
 Uniform crossover of arbitrary individuals. More...
 
struct  UniformRankingSelection
 Selects individuals from the range of parent and offspring individuals. More...
 
class  UnlabeledData
 Data set for unsupervised learning. More...
 
struct  ValidatedSingleObjectiveResultSet
 Result set for validated points. More...
 
class  ValidatedStoppingCriterion
 Given the current Result set of the optimizer, calculates the validation error using a validation function and hands the results over to the underlying stopping criterion. More...
 
class  VariationalAutoencoderError
 Computes the variational autoencoder error function. More...
 
class  VDCMA
 
class  Wave
 Noisy sinc function: y = sin(x) / x + noise. More...
 
struct  WeightedDataBatch
 
struct  WeightedDataPair
 Pairs a data element with a weight. More...
 
class  WeightedLabeledData
 Weighted data set for supervised learning. More...
 
class  WeightedSumKernel
 Weighted sum of kernel functions. More...
 
class  WeightedUnlabeledData
 Weighted data set for unsupervised learning. More...
 
struct  WS2MaximumGradientCriterion
 Working set selection by maximization of the projected gradient. More...
 
struct  ZDT1
 Multi-objective optimization benchmark function ZDT1. More...
 
struct  ZDT2
 Multi-objective optimization benchmark function ZDT2. More...
 
struct  ZDT3
 Multi-objective optimization benchmark function ZDT3. More...
 
struct  ZDT4
 Multi-objective optimization benchmark function ZDT4. More...
 
struct  ZDT6
 Multi-objective optimization benchmark function ZDT6. More...
 
class  ZeroOneLoss
 0-1-loss for classification. More...
 
class  ZeroOneLoss< unsigned int, RealVector >
 0-1-loss for classification. More...
 

Typedefs

typedef IndicatorBasedMOCMA< HypervolumeIndicator > MOCMA
 
typedef IndicatorBasedMOCMA< AdditiveEpsilonIndicator > EpsilonMOCMA
 
typedef std::pair< double, RealVector > Preference
 A preferred region in a lattice-sampled unit sphere. More...
 
typedef IndicatorBasedRealCodedNSGAII< HypervolumeIndicator > RealCodedNSGAII
 
typedef IndicatorBasedRealCodedNSGAII< AdditiveEpsilonIndicator > EpsRealCodedNSGAII
 
typedef IndicatorBasedRealCodedNSGAII< CrowdingDistance > CrowdingRealCodedNSGAII
 
typedef IndicatorBasedSteadyStateMOCMA< HypervolumeIndicator > SteadyStateMOCMA
 
typedef IndicatorBasedSteadyStateMOCMA< AdditiveEpsilonIndicator > EpsilonSteadyStateMOCMA
 
typedef IRpropPlus Rprop99
 
typedef IRpropMinus Rprop99d
 
typedef RpropPlus Rprop93
 
typedef RpropMinus Rprop94
 
typedef boost::archive::polymorphic_iarchive InArchive
 Type of an archive to read from. More...
 
typedef boost::archive::polymorphic_text_iarchive TextInArchive
 
typedef boost::archive::polymorphic_oarchive OutArchive
 Type of an archive to write to. More...
 
typedef boost::archive::polymorphic_text_oarchive TextOutArchive
 
typedef std::pair< std::vector< std::size_t >, std::vector< std::size_t > > RecreationIndices
 auxiliary typedef for createCVSameSizeBalanced and createCVFullyIndexed, stores location index in the first and partition index in the second More...
 
typedef LabeledData< RealVector, unsigned int > ClassificationDataset
 specialized template for classification with unsigned int labels More...
 
typedef LabeledData< RealVector, RealVector > RegressionDataset
 specialized template for regression with RealVector labels More...
 
typedef LabeledData< CompressedRealVector, unsigned int > CompressedClassificationDataset
 specialized template for classification with unsigned int labels and sparse data More...
 
typedef blas::permutation_matrix PermutationMatrix
 
typedef std::deque< RealVector > Sequence
 Type of Data sequences. More...
 
typedef ARDKernelUnconstrained<> DenseARDKernel
 
typedef ARDKernelUnconstrained< CompressedRealVector > CompressedARDKernel
 
typedef GaussianRbfKernel<> DenseRbfKernel
 
typedef GaussianRbfKernel< CompressedRealVector > CompressedRbfKernel
 
typedef LinearKernel<> DenseLinearKernel
 
typedef LinearKernel< CompressedRealVector > CompressedLinearKernel
 
typedef ModelKernel< RealVector > DenseModelKernel
 
typedef ModelKernel< CompressedRealVector > CompressedModelKernel
 
typedef MonomialKernel<> DenseMonomialKernel
 
typedef MonomialKernel< CompressedRealVector > CompressedMonomialKernel
 
typedef NormalizedKernel<> DenseNormalizedKernel
 
typedef NormalizedKernel< CompressedRealVector > CompressedNormalizedKernel
 
typedef PolynomialKernel<> DensePolynomialKernel
 
typedef PolynomialKernel< CompressedRealVector > CompressedPolynomialKernel
 
typedef ScaledKernel<> DenseScaledKernel
 
typedef ScaledKernel< CompressedRealVector > CompressedScaledKernel
 
typedef SubrangeKernel< RealVector > DenseSubrangeKernel
 
typedef SubrangeKernel< CompressedRealVector > CompressesSubrangeKernel
 
typedef WeightedSumKernel<> DenseWeightedSumKernel
 
typedef WeightedSumKernel< CompressedRealVector > CompressedWeightedSumKernel
 
typedef AbstractObjectiveFunction< RealVector, double > SingleObjectiveFunction
 
typedef AbstractObjectiveFunction< RealVector, RealVector > MultiObjectiveFunction
 
typedef RBM< BinaryLayer, BinaryLayer, random::rng_type > BinaryRBM
 
typedef GibbsOperator< BinaryRBM > BinaryGibbsOperator
 
typedef MarkovChain< BinaryGibbsOperator > BinaryGibbsChain
 
typedef TemperedMarkovChain< BinaryGibbsOperator > BinaryPTChain
 
typedef MultiChainApproximator< BinaryGibbsChain > BinaryPCD
 
typedef ContrastiveDivergence< BinaryGibbsOperator > BinaryCD
 
typedef SingleChainApproximator< BinaryPTChain > BinaryParallelTempering
 
typedef RBM< BipolarLayer, BipolarLayer, random::rng_type > BipolarRBM
 
typedef GibbsOperator< BipolarRBM > BipolarGibbsOperator
 
typedef MarkovChain< BipolarGibbsOperator > BipolarGibbsChain
 
typedef TemperedMarkovChain< BipolarGibbsOperator > BipolarPTChain
 
typedef MultiChainApproximator< BipolarGibbsChain > BipolarPCD
 
typedef ContrastiveDivergence< BipolarGibbsOperator > BipolarCD
 
typedef SingleChainApproximator< BipolarPTChain > BipolarParallelTempering
 
typedef RBM< GaussianLayer, BinaryLayer, random::rng_type > GaussianBinaryRBM
 
typedef GibbsOperator< GaussianBinaryRBM > GaussianBinaryGibbsOperator
 
typedef MarkovChain< GaussianBinaryGibbsOperator > GaussianBinaryGibbsChain
 
typedef TemperedMarkovChain< GaussianBinaryGibbsOperator > GaussianBinaryPTChain
 
typedef MultiChainApproximator< GaussianBinaryGibbsChain > GaussianBinaryPCD
 
typedef ContrastiveDivergence< GaussianBinaryGibbsOperator > GaussianBinaryCD
 
typedef SingleChainApproximator< GaussianBinaryPTChain > GaussianBinaryParallelTempering
 
typedef TwoStateSpace< 0, 1 > BinarySpace
 
typedef TwoStateSpace<-1, 1 > SymmetricBinarySpace
 
typedef RBM< TruncExpBinaryEnergy, random::rng_type > TruncExpBinaryRBM
 
typedef GibbsOperator< TruncExpBinaryRBM > TruncExpBinaryGibbsOperator
 
typedef MarkovChain< TruncExpBinaryGibbsOperator > TruncExpBinaryGibbsChain
 
typedef TemperedMarkovChain< TruncExpBinaryGibbsOperator > TruncExpBinaryPTChain
 
typedef MultiChainApproximator< TruncExpBinaryGibbsChain > TruncExpBinaryPCD
 
typedef ContrastiveDivergence< TruncExpBinaryGibbsOperator > TruncExpBinaryCD
 
typedef SingleChainApproximator< TruncExpBinaryPTChain > TruncExpBinaryParallelTempering
 
typedef std::vector< shark::RealVector > FrontType
 

Enumerations

enum  DominanceRelation { INCOMPARABLE = 0, LHS_DOMINATES_RHS = 1, RHS_DOMINATES_LHS = 2, EQUIVALENT = 3 }
 Result of comparing two objective vectors w.r.t. Pareto dominance. More...
 
enum  AlphaStatus { AlphaFree = 0, AlphaLowerBound = 1, AlphaUpperBound = 2, AlphaDeactivated = 3 }
 
enum  QpStopType { QpNone = 0, QpAccuracyReached = 1, QpMaxIterationsReached = 4, QpTimeout = 8 }
 
enum  McSvm {
  McSvm::WW, McSvm::CS, McSvm::LLW, McSvm::ATM,
  McSvm::ATS, McSvm::ADM, McSvm::OVA, McSvm::MMR,
  McSvm::ReinforcedSvm
}
 
enum  BuildType { RELEASE_BUILD_TYPE, DEBUG_BUILD_TYPE }
 Models the build type. More...
 
enum  Convolution { Convolution::Valid, Convolution::ZeroPad }
 
enum  PartitionEstimationAlgorithm {
  AIS, AISMean, TwoSidedAISMean, AcceptanceRatio,
  AcceptanceRatioMean
}
 

Functions

SHARK_EXPORT_SYMBOL KernelExpansion< RealVector > approximateKernelExpansion (random::rng_type &rng, KernelExpansion< RealVector > const &model, std::size_t k, double precision=1.e-8)
 Approximates a kernel expansion by a smaller one using an optimized basis. More...
 
template<class IndividualRange >
auto penalizedFitness (IndividualRange &range) -> decltype(boost::adaptors::transform(range, detail::IndividualPenalizedFitnessFunctor()))
 
template<class IndividualRange >
auto unpenalizedFitness (IndividualRange &range) -> decltype(boost::adaptors::transform(range, detail::IndividualUnpenalizedFitnessFunctor()))
 
template<class IndividualRange >
auto ranks (IndividualRange &range) -> decltype(boost::adaptors::transform(range, detail::IndividualRankFunctor()))
 
template<class IndividualRange >
auto searchPoint (IndividualRange &range) -> decltype(boost::adaptors::transform(range, detail::IndividualSearchPointFunctor()))
 
template<class PointRange , class RankRange >
void dcNonDominatedSort (PointRange const &points, RankRange &ranks)
 
template<class PointRange , class RankRange >
void fastNonDominatedSort (PointRange const &points, RankRange &ranks)
 Implements the well-known non-dominated sorting algorithm. More...
 
template<class PointRange , class RankRange >
void nonDominatedSort (PointRange const &points, RankRange &ranks)
 Frontend for non-dominated sorting algorithms. More...
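
A minimal sketch of calling the nonDominatedSort() frontend on a vector of objective vectors. The include path is an assumption, and the exact rank convention (lowest rank = best front) should be checked against the detailed documentation.

#include <shark/Algorithms/DirectSearch/Operators/Domination/NonDominatedSort.h> // path assumed
#include <shark/LinAlg/Base.h>
#include <vector>

int main(){
    std::vector<shark::RealVector> points(4, shark::RealVector(2));
    points[0](0) = 1; points[0](1) = 4;
    points[1](0) = 2; points[1](1) = 3;
    points[2](0) = 3; points[2](1) = 2;
    points[3](0) = 4; points[3](1) = 4;   // dominated by points[2]

    std::vector<unsigned int> ranks(points.size());
    shark::nonDominatedSort(points, ranks); // ranks[i] holds the front index of points[i]
}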
 
template<class PointRange , class RankRange >
void nonDominatedSort (PointRange const &points, RankRange const &ranks)
 
template<class VectorTypeA , class VectorTypeB >
DominanceRelation dominance (VectorTypeA const &lhs, VectorTypeB const &rhs)
 Pareto dominance relation for two objective vectors. More...
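
A minimal usage sketch for dominance() together with the DominanceRelation enumeration listed above under Enumerations; the include path is an assumption and may differ between Shark versions.

#include <shark/Algorithms/DirectSearch/Operators/Domination/ParetoDominance.h> // path assumed
#include <shark/LinAlg/Base.h>

int main(){
    // two objective vectors; a is better in both objectives
    shark::RealVector a(2), b(2);
    a(0) = 1.0; a(1) = 2.0;
    b(0) = 2.0; b(1) = 3.0;

    shark::DominanceRelation rel = shark::dominance(a, b);
    return rel == shark::LHS_DOMINATES_RHS ? 0 : 1;
}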
 
template<typename Matrix , typename randomType = shark::random::rng_type>
Matrix sampleLatticeUniformly (randomType &rng, Matrix const &matrix, std::size_t const n, bool const keep_corners=true)
 
RealMatrix preferenceAdjustedUnitVectors (std::size_t const n, std::size_t const sum, std::vector< Preference > const &preferences)
 Return a set of evenly spaced n-dimensional points on the unit sphere clustered around the specified preference points. More...
 
RealMatrix preferenceAdjustedWeightVectors (std::size_t const n, std::size_t const sum, std::vector< Preference > const &preferences)
 Return a set of of evenly spaced n-dimensional points on the "unit simplex" clustered around the specified preference points. More...
 
std::size_t computeOptimalLatticeTicks (std::size_t const n, std::size_t const target_count)
 Computes the number of Ticks for a grid of a certain size. More...
 
RealMatrix weightLattice (std::size_t const n, std::size_t const sum)
 Returns a set of evenly spaced n-dimensional points on the "unit simplex". More...
 
RealMatrix unitVectorsOnLattice (std::size_t const n, std::size_t const sum)
 Return a set of evenly spaced n-dimensional points on the unit sphere. More...
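
A short sketch combining computeOptimalLatticeTicks() and weightLattice() from the lattice helpers above to generate MOEA/D-style weight vectors; the include path is an assumption.

#include <shark/Algorithms/DirectSearch/Operators/Lattice.h> // path assumed
#include <shark/LinAlg/Base.h>
#include <iostream>

int main(){
    std::size_t numObjectives = 3;
    // choose the number of ticks so that roughly 100 lattice points are generated
    std::size_t ticks = shark::computeOptimalLatticeTicks(numObjectives, 100);
    shark::RealMatrix weights = shark::weightLattice(numObjectives, ticks);
    std::cout << weights.size1() << " weight vectors of dimension " << weights.size2() << std::endl;
}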
 
template<typename Matrix >
UIntMatrix computeClosestNeighbourIndicesOnLattice (Matrix const &m, std::size_t const n)
 
double tchebycheffScalarizer (RealVector const &fitness, RealVector const &weights, RealVector const &optimalPointFitness)
 
SHARK_EXPORT_SYMBOL std::size_t kMeans (Data< RealVector > const &data, std::size_t k, Centroids &centroids, std::size_t maxIterations=0)
 The k-means clustering algorithm. More...
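
A minimal k-means sketch. The include paths, the RealVector(size, value) constructor and the use of a default-constructed Centroids object (initialized by kMeans itself) are assumptions; adjust for your Shark version.

#include <shark/Algorithms/KMeans.h>            // path assumed
#include <shark/Models/Clustering/Centroids.h>  // path assumed
#include <shark/Data/Dataset.h>
#include <vector>

int main(){
    // toy data: 100 two-dimensional points (all zero here, fill with real data)
    std::vector<shark::RealVector> points(100, shark::RealVector(2, 0.0));
    shark::Data<shark::RealVector> data = shark::createDataFromRange(points);

    shark::Centroids centroids;
    std::size_t iterations = shark::kMeans(data, 3, centroids, 100); // k = 3, at most 100 iterations
    (void)iterations;
}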
 
SHARK_EXPORT_SYMBOL std::size_t kMeans (Data< RealVector > const &data, RBFLayer &model, std::size_t maxIterations=0)
 The k-means clustering algorithm for initializing an RBF Layer. More...
 
template<class InputType >
KernelExpansion< InputType > kMeans (Data< InputType > const &dataset, std::size_t k, AbstractKernelFunction< InputType > &kernel, std::size_t maxIterations=0)
 The kernel k-means clustering algorithm. More...
 
template<class T >
T maxExpInput ()
 Maximum allowed input value for exp. More...
 
template<class T >
T minExpInput ()
 Minimum value for exp(x) allowed so that it is not 0. More...
 
template<class T >
boost::enable_if< std::is_arithmetic< T >, T >::type sqr (const T &x)
 Calculates x^2. More...
 
template<class T >
T cube (const T &x)
 Calculates x^3. More...
 
template<class T >
boost::enable_if< std::is_arithmetic< T >, T >::type sigmoid (T x)
 Logistic function/sigmoid function. More...
 
template<class T >
T safeExp (T x)
 Thresholded exp function, over- and underflow safe. More...
 
template<class T >
T safeLog (T x)
 Thresholded log function, over- and underflow safe. More...
 
template<class T >
boost::enable_if< std::is_arithmetic< T >, T >::type softPlus (T x)
 Numerically stable version of the function log(1+exp(x)). More...
 
double softPlus (double x)
 Numerically stable version of the function log(1+exp(x)), calculated with float precision to save some time. More...
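
A small sketch exercising the scalar helpers above; the shark/Core/Math.h include path is an assumption.

#include <shark/Core/Math.h> // path assumed
#include <iostream>

int main(){
    double x = 3.0;
    std::cout << shark::sqr(x) << "\n";           // x^2
    std::cout << shark::cube(x) << "\n";          // x^3
    std::cout << shark::sigmoid(x) << "\n";       // logistic function, range (0,1)
    std::cout << shark::safeExp(1000.0) << "\n";  // thresholded, does not overflow
    std::cout << shark::safeLog(0.0) << "\n";     // thresholded, does not return -inf
    std::cout << shark::softPlus(x) << "\n";      // numerically stable log(1+exp(x))
}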
 
template<class T >
T copySign (T x, T y)
 
template<typename T , typename U >
ResultSet< T, U > makeResultSet (T const &t, U const &u)
 Generates a typed solution given the search point and the corresponding objective function value. More...
 
template<class SearchPoint , class Result >
std::ostream & operator<< (std::ostream &out, ResultSet< SearchPoint, Result > const &solution)
 
template<class SearchPoint >
std::ostream & operator<< (std::ostream &out, ValidatedSingleObjectiveResultSet< SearchPoint > const &solution)
 
bool operator== (Shape const &shape1, Shape const &shape2)
 
bool operator!= (Shape const &shape1, Shape const &shape2)
 
template<class E , class T >
std::basic_ostream< E, T > & operator<< (std::basic_ostream< E, T > &os, Shape const &shape)
 
template<class Iterator , class Rng >
void shuffle (Iterator begin, Iterator end, Rng &rng)
 Shuffles the range [begin, end) using the supplied random number generator. More...
 
template<class RandomAccessIterator , class Rng >
void partial_shuffle (RandomAccessIterator begin, RandomAccessIterator middle, RandomAccessIterator end, Rng &rng)
 random_shuffle algorithm which stops after acquiring the random subsequence for [begin,middle) More...
 
template<class RandomAccessIterator >
void partial_shuffle (RandomAccessIterator begin, RandomAccessIterator middle, RandomAccessIterator end)
 random_shuffle algorithm which stops after acquiring the random subsequence for [begin,middle) More...
 
template<class Range >
boost::range_iterator< Range >::type median_element (Range &range)
 Returns the iterator to the median element. After this call, the range is partially ordered. More...
 
template<class Range >
boost::range_iterator< Range >::type median_element (Range const &rangeAdaptor)
 
template<class Range >
boost::range_iterator< Range >::type partitionEqually (Range &range)
 Partitions a range in two parts as equal in size as possible. More...
 
template<class Range >
boost::range_iterator< Range >::type partitionEqually (Range const &rangeAdaptor)
 Partitions a range in two parts as equal in size as possible and returns its result. More...
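
A sketch of the range utilities above applied to a plain std::vector; the include path is an assumption.

#include <shark/Core/utility/functional.h> // path assumed
#include <vector>

int main(){
    std::vector<double> v = {5, 1, 4, 2, 3, 9, 7};

    // draw a random subsequence of length 3 into the front of the range
    shark::partial_shuffle(v.begin(), v.begin() + 3, v.end());

    // partially orders the range and returns an iterator to the median element
    auto median = shark::median_element(v);

    // splits the range into two parts of (nearly) equal size
    auto split = shark::partitionEqually(v);
    (void)median; (void)split;
}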
 
template<class K , class V >
void swap (KeyValuePair< K, V > &pair1, KeyValuePair< K, V > &pair2)
 Swaps the contents of two instances of KeyValuePair. More...
 
template<class Key , class Value >
KeyValuePair< Key, Value > makeKeyValuePair (Key const &key, Value const &value)
 Creates a KeyValuePair. More...
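
A short sketch of makeKeyValuePair(); since KeyValuePair is strictly ordered by its key, the pairs can be sorted directly. The include path is an assumption.

#include <shark/Core/utility/KeyValuePair.h> // path assumed
#include <algorithm>
#include <vector>

int main(){
    std::vector<shark::KeyValuePair<double, std::size_t> > scores;
    scores.push_back(shark::makeKeyValuePair(0.7, std::size_t(2)));
    scores.push_back(shark::makeKeyValuePair(0.1, std::size_t(5)));
    std::sort(scores.begin(), scores.end()); // ordered by key, i.e. by score
}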
 
template<class T , class Range >
Batch< T >::type createBatch (Range const &range)
 creates a batch from a range of inputs More...
 
template<class Range >
Batch< typename Range::value_type >::type createBatch (Range const &range)
 creates a batch from a range of inputs More...
 
template<class T , class Iterator >
Batch< T >::type createBatch (Iterator const &begin, Iterator const &end)
 
template<class BatchT >
auto getBatchElement (BatchT &batch, std::size_t i) -> decltype(BatchTraits< BatchT >::type::get(std::declval< BatchT &>(), i))
 
template<class BatchT >
auto getBatchElement (BatchT const &batch, std::size_t i) -> decltype(BatchTraits< BatchT >::type::get(std::declval< BatchT const &>(), i))
 
template<class BatchT >
std::size_t batchSize (BatchT const &batch)
 
template<class BatchT >
auto batchBegin (BatchT &batch) -> decltype(BatchTraits< BatchT >::type::begin(batch))
 
template<class BatchT >
auto batchBegin (BatchT const &batch) -> decltype(BatchTraits< BatchT >::type::begin(batch))
 
template<class BatchT >
auto batchEnd (BatchT &batch) -> decltype(BatchTraits< BatchT >::type::end(batch))
 
template<class BatchT >
auto batchEnd (BatchT const &batch) -> decltype(BatchTraits< BatchT >::type::end(batch))
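
A sketch of the batch helpers above: a batch of RealVectors is stored as a matrix with one row per element (see Batch< blas::vector< T > > in the class list). Include paths are assumptions.

#include <shark/Data/BatchInterface.h> // path assumed
#include <shark/LinAlg/Base.h>
#include <vector>

int main(){
    std::vector<shark::RealVector> points(10, shark::RealVector(3, 0.0));

    // pack the elements into one batch (a RealMatrix with 10 rows)
    shark::Batch<shark::RealVector>::type batch = shark::createBatch<shark::RealVector>(points);

    std::size_t n = shark::batchSize(batch);         // 10
    auto first   = shark::getBatchElement(batch, 0); // proxy to the first row
    (void)n; (void)first;
}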
 
template<class S >
S & fusionize (detail::FusionFacade< S > &facade)
 
template<class S >
S const & fusionize (detail::FusionFacade< S > const &facade)
 
template<class S >
boost::disable_if< detail::isFusionFacade< S >, S & >::type fusionize (S &facade)
 
template<class S >
boost::disable_if< detail::isFusionFacade< S >, S const &>::type fusionize (S const &facade)
 
template<class Range >
Data< typename Range::value_type > createDataFromRange (Range const &inputs, std::size_t maximumBatchSize=0)
 creates a data object from a range of elements More...
 
template<class Range >
UnlabeledData< typename boost::range_value< Range >::type > createUnlabeledDataFromRange (Range const &inputs, std::size_t maximumBatchSize=0)
 creates a data object from a range of elements More...
 
template<class Range1 , class Range2 >
LabeledData< typename boost::range_value< Range1 >::type, typename boost::range_value< Range2 >::type > createLabeledDataFromRange (Range1 const &inputs, Range2 const &labels, std::size_t maximumBatchSize=0)
 creates a labeled data object from two ranges, representing inputs and labels More...
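
A minimal sketch of building Data and LabeledData containers from std::vector ranges; ClassificationDataset is the typedef listed above under Typedefs.

#include <shark/Data/Dataset.h>
#include <vector>

int main(){
    std::vector<shark::RealVector> inputs(6, shark::RealVector(2, 0.0));
    std::vector<unsigned int> labels = {0, 1, 0, 1, 1, 0};

    shark::Data<shark::RealVector> unlabeled = shark::createDataFromRange(inputs);
    shark::ClassificationDataset dataset = shark::createLabeledDataFromRange(inputs, labels);
    (void)unlabeled; (void)dataset;
}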
 
template<class T , class U >
std::ostream & operator<< (std::ostream &stream, const LabeledData< T, U > &d)
 Output stream of elements for labeled data. More...
 
unsigned int numberOfClasses (Data< unsigned int > const &labels)
 Return the number of classes of a set of class labels with unsigned int label encoding. More...
 
std::vector< std::size_t > classSizes (Data< unsigned int > const &labels)
 Returns the number of members of each class in the dataset. More...
 
template<class InputType >
std::size_t dataDimension (Data< InputType > const &dataset)
 Return the dimensionality of a dataset. More...
 
template<class InputType , class LabelType >
std::size_t inputDimension (LabeledData< InputType, LabelType > const &dataset)
 Return the input dimensionality of a labeled dataset. More...
 
template<class InputType , class LabelType >
std::size_t labelDimension (LabeledData< InputType, LabelType > const &dataset)
 Return the label/output dimensionality of a labeled dataset. More...
 
template<class InputType >
std::size_t numberOfClasses (LabeledData< InputType, unsigned int > const &dataset)
 Return the number of classes (highest label value +1) of a classification dataset with unsigned int label encoding. More...
 
template<class InputType , class LabelType >
std::vector< std::size_t > classSizes (LabeledData< InputType, LabelType > const &dataset)
 Returns the number of members of each class in the dataset. More...
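
A sketch using the query helpers above on a classification dataset; only functions listed in this section are used.

#include <shark/Data/Dataset.h>
#include <iostream>

void summarize(shark::ClassificationDataset const& dataset){
    std::cout << "input dimension: " << shark::inputDimension(dataset) << "\n";
    std::cout << "classes: " << shark::numberOfClasses(dataset) << "\n";
    for(std::size_t size: shark::classSizes(dataset))
        std::cout << "class size: " << size << "\n";
}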
 
template<class T , class Functor >
boost::lazy_disable_if< CanBeCalled< Functor, typename Data< T >::batch_type >, TransformedData< Functor, T >>::type transform (Data< T > const &data, Functor f)
 Transforms a dataset using a Functor f and returns the transformed result. More...
 
template<class T , class Functor >
boost::lazy_enable_if< CanBeCalled< Functor, typename Data< T >::batch_type >, TransformedData< Functor, T >>::type transform (Data< T > const &data, Functor const &f)
 Transforms a dataset using a Functor f and returns the transformed result. More...
 
template<class I , class L , class Functor >
LabeledData< typename detail::TransformedDataElement< Functor, I >::type, L > transformInputs (LabeledData< I, L > const &data, Functor const &f)
 Transforms the inputs of a dataset and return the transformed result. More...
 
template<class I , class L , class Functor >
LabeledData< I, typename detail::TransformedDataElement< Functor, L >::type > transformLabels (LabeledData< I, L > const &data, Functor const &f)
 Transforms the labels of a dataset and returns the transformed result. More...
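
A sketch of transform() with an element-wise functor. The result_type typedef is an assumption about what the dispatch machinery (CanBeCalled / TransformedData) expects; a functor operating on whole batches can be used instead.

#include <shark/Data/Dataset.h>

// element-wise functor: doubles every input vector
struct Scale{
    typedef shark::RealVector result_type; // assumed to be needed for element-wise dispatch
    shark::RealVector operator()(shark::RealVector const& x) const{
        return 2.0 * x;
    }
};

void rescale(shark::Data<shark::RealVector>& data){
    data = shark::transform(data, Scale());
}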
 
template<class T , class FeatureSet >
Data< blas::vector< T > > selectFeatures (Data< blas::vector< T > > const &data, FeatureSet const &features)
 Creates a copy of a dataset selecting only a certain set of features. More...
 
template<class T , class FeatureSet >
LabeledData< RealVector, T > selectInputFeatures (LabeledData< RealVector, T > const &data, FeatureSet const &features)
 
template<class DatasetT >
DatasetT splitAtElement (DatasetT &data, std::size_t elementIndex)
 Removes the last part of a given dataset and returns a new split containing the removed elements. More...
 
template<class I >
void repartitionByClass (LabeledData< I, unsigned int > &data, std::size_t batchSize=LabeledData< I, unsigned int >::DefaultBatchSize)
 reorders the dataset such that points are grouped by labels More...
 
template<class I >
LabeledData< I, unsigned int > binarySubProblem (LabeledData< I, unsigned int >const &data, unsigned int zeroClass, unsigned int oneClass)
 
template<class I >
LabeledData< I, unsigned int > oneVersusRestProblem (LabeledData< I, unsigned int >const &data, unsigned int oneClass)
 Construct a binary (two-class) one-versus-rest problem from a multi-class problem. More...
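
A sketch combining splitAtElement(), repartitionByClass() and binarySubProblem() on a classification dataset; numberOfElements() is assumed to be the usual element-count member of LabeledData.

#include <shark/Data/Dataset.h>

void prepare(shark::ClassificationDataset& data){
    // hold out the last 100 elements as a separate split
    shark::ClassificationDataset test =
        shark::splitAtElement(data, data.numberOfElements() - 100);

    // group the remaining elements by label, then extract classes 0 and 1 as a binary problem
    shark::repartitionByClass(data);
    shark::ClassificationDataset binary = shark::binarySubProblem(data, 0, 1);
    (void)test; (void)binary;
}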
 
template<typename RowType >
RowType getColumn (Data< RowType > const &data, std::size_t columnID)
 
template<typename RowType >
void setColumn (Data< RowType > &data, std::size_t columnID, RowType newColumn)
 
template<class DatasetType , class IndexRange >
DataView< DatasetType > subset (DataView< DatasetType > const &view, IndexRange const &indizes)
 creates a subset of a DataView with elements indexed by indices More...
 
template<class DatasetType >
DataView< DatasetType > randomSubset (DataView< DatasetType > const &view, std::size_t size)
 creates a random subset of a DataView with given size More...
 
template<class DatasetType , class IndexRange >
DataView< DatasetType >::batch_type subBatch (DataView< DatasetType > const &view, IndexRange const &indizes)
 Creates a batch given a set of indices. More...
 
template<class DatasetType >
DataView< DatasetType >::batch_type randomSubBatch (DataView< DatasetType > const &view, std::size_t size)
 Creates a random batch of a given size. More...
 
template<class DatasetType >
DataView< DatasetType > toView (DatasetType &set)
 Creates a View from a dataset. More...
 
template<class T >
DataView< T >::dataset_type toDataset (DataView< T > const &view, std::size_t batchSize=DataView< T >::dataset_type::DefaultBatchSize)
 Creates a new dataset from a View. More...
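
A sketch of the DataView helpers above; operator[] on the view is assumed to return the indexed element.

#include <shark/Data/DataView.h>
#include <shark/Data/Dataset.h>

void sample(shark::ClassificationDataset& data){
    // constant-time element access on top of the batch-oriented container
    shark::DataView<shark::ClassificationDataset> view = shark::toView(data);

    auto element = view[0];                        // first (input, label) element, assumed accessor
    auto sub     = shark::randomSubset(view, 50);  // view onto 50 randomly chosen elements
    shark::ClassificationDataset smaller = shark::toDataset(sub);
    (void)element; (void)smaller;
}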
 
template<class DatasetType >
std::size_t numberOfClasses (DataView< DatasetType > const &view)
 
template<class DatasetType >
std::size_t inputDimension (DataView< DatasetType > const &view)
 Return the input dimensionality of the labeled dataset represented by the view. More...
 
template<class DatasetType >
std::size_t labelDimension (DataView< DatasetType > const &view)
 Return the label dimensionality of the labeled dataset represented by the view. More...
 
template<class DatasetType >
std::size_t dataDimension (DataView< DatasetType > const &view)
 Return the dimensionality of the dataset represented by the view. More...
 
template<typename VectorType >
void importHDF5 (Data< VectorType > &data, const std::string &fileName, const std::string &datasetName)
 Import data from a HDF5 file. More...
 
template<typename VectorType , typename LabelType >
void importHDF5 (LabeledData< VectorType, LabelType > &labeledData, const std::string &fileName, const std::string &data, const std::string &label)
 Import data to a LabeledData object from a HDF5 file. More...
 
template<typename VectorType >
void importHDF5 (Data< VectorType > &data, const std::string &fileName, const std::vector< std::string > &cscDatasetName)
 Import data from HDF5 dataset of compressed sparse column format. More...
 
template<typename VectorType , typename LabelType >
void importHDF5 (LabeledData< VectorType, LabelType > &labeledData, const std::string &fileName, const std::vector< std::string > &cscDatasetName, const std::string &label)
 Import data from HDF5 dataset of compressed sparse column format. More...
 
template<class T >
void importPGM (std::string const &fileName, T &data, std::size_t &sx, std::size_t &sy)
 Import a PGM image from file. More...
 
template<class T >
void exportPGM (std::string const &fileName, T const &data, std::size_t sx, std::size_t sy, bool normalize=false)
 Export a PGM image to file. More...
 
void exportFiltersToPGMGrid (std::string const &basename, RealMatrix const &filters, std::size_t width, std::size_t height)
 Exports a set of filters as a grid image. More...
 
void exportFiltersToPGMGrid (std::string const &basename, Data< RealVector > const &filters, std::size_t width, std::size_t height)
 Exports a set of filters as a grid image. More...
 
template<class T >
void importPGMSet (std::string const &p, Data< T > &set)
 Import PGM images scanning a directory recursively. More...
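
A round-trip sketch for the PGM helpers; the file names are placeholders and the include path is an assumption.

#include <shark/Data/Pgm.h> // path assumed
#include <shark/LinAlg/Base.h>

int main(){
    shark::RealVector image;
    std::size_t width = 0, height = 0;
    shark::importPGM("input.pgm", image, width, height);   // pixels end up in a flat vector

    // write the image back out, normalizing the values to the full gray range
    shark::exportPGM("output.pgm", image, width, height, true);
}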
 
template<class Vec1T , class Vec2T , class Vec3T , class Device >
void meanvar (Data< Vec1T > const &data, blas::vector_container< Vec2T, Device > &mean, blas::vector_container< Vec3T, Device > &variance)
 Calculates the mean and variance values of the input data. More...
 
template<class Vec1T , class Vec2T , class MatT , class Device >
void meanvar (Data< Vec1T > const &data, blas::vector_container< Vec2T, Device > &mean, blas::matrix_container< MatT, Device > &variance)
 Calculates the mean, variance and covariance values of the input data. More...
 
template<class VectorType >
VectorType mean (Data< VectorType > const &data)
 Calculates the mean vector of the input vectors. More...
 
template<class VectorType >
VectorType mean (UnlabeledData< VectorType > const &data)
 
template<class VectorType >
VectorType variance (Data< VectorType > const &data)
 Calculates the variance vector of the input vectors. More...
 
template<class VectorType >
blas::matrix< typename VectorType::value_type > covariance (Data< VectorType > const &data)
 Calculates the covariance matrix of the data vectors. More...
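
A minimal sketch, assuming a Data< RealVector > named data is available:

    // Per-dimension statistics of the data vectors.
    RealVector m = mean(data);
    RealVector v = variance(data);
    RealMatrix c = covariance(data);

    // Mean and variance computed together in a single pass over the data.
    RealVector mean2, var2;
    meanvar(data, mean2, var2);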
 
template<class D1 , class W1 , class D2 , class W2 >
void swap (WeightedDataPair< D1, W1 > &&p1, WeightedDataPair< D2, W2 > &&p2)
 
template<class D1 , class W1 , class D2 , class W2 >
void swap (WeightedDataBatch< D1, W1 > &p1, WeightedDataBatch< D2, W2 > &p2)
 
template<class T >
std::ostream & operator<< (std::ostream &stream, const WeightedUnlabeledData< T > &d)
 Outstream of elements for weighted unlabeled data. More...
 
template<class DataRange , class WeightRange >
boost::disable_if< boost::is_arithmetic< WeightRange >, WeightedUnlabeledData< typename boost::range_value< DataRange >::type > >::type createUnlabeledDataFromRange (DataRange const &data, WeightRange const &weights, std::size_t batchSize=0)
 Creates a weighted unlabeled data object from two ranges, representing data and weights. More...
 
template<class T , class U >
std::ostream & operator<< (std::ostream &stream, const WeightedLabeledData< T, U > &d)
 Outstream of elements for weighted labeled data. More...
 
std::size_t numberOfClasses (WeightedUnlabeledData< unsigned int > const &labels)
 
std::vector< std::size_t > classSizes (WeightedUnlabeledData< unsigned int > const &labels)
 Returns the number of members of each class in the dataset. More...
 
template<class InputType >
std::size_t dataDimension (WeightedUnlabeledData< InputType > const &dataset)
 Return the dimensionality of points of a weighted dataset. More...
 
template<class InputType , class LabelType >
std::size_t inputDimension (WeightedLabeledData< InputType, LabelType > const &dataset)
 Return the input dimensionality of a weighted labeled dataset. More...
 
template<class InputType , class LabelType >
std::size_t labelDimension (WeightedLabeledData< InputType, LabelType > const &dataset)
 Return the label/output dimensionality of a labeled dataset. More...
 
template<class InputType >
std::size_t numberOfClasses (WeightedLabeledData< InputType, unsigned int > const &dataset)
 Return the number of classes (highest label value +1) of a classification dataset with unsigned int label encoding. More...
 
template<class InputType , class LabelType >
std::vector< std::size_t > classSizes (WeightedLabeledData< InputType, LabelType > const &dataset)
 Returns the number of members of each class in the dataset. More...
 
template<class InputType >
double sumOfWeights (WeightedUnlabeledData< InputType > const &dataset)
 Returns the total sum of weights. More...
 
template<class InputType , class LabelType >
double sumOfWeights (WeightedLabeledData< InputType, LabelType > const &dataset)
 Returns the total sum of weights. More...
 
template<class InputType >
RealVector classWeight (WeightedLabeledData< InputType, unsigned int > const &dataset)
 Computes the cumulative weight of every class. More...
 
template<class InputRange , class LabelRange , class WeightRange >
boost::disable_if< boost::is_arithmetic< WeightRange >, WeightedLabeledData< typename boost::range_value< InputRange >::type, typename boost::range_value< LabelRange >::type >>::type createLabeledDataFromRange (InputRange const &inputs, LabelRange const &labels, WeightRange const &weights, std::size_t batchSize=0)
 Creates a weighted labeled data object from ranges representing inputs, labels, and weights. More...
 
template<class InputType , class LabelType >
WeightedLabeledData< InputType, LabelType > bootstrap (LabeledData< InputType, LabelType > const &dataset, std::size_t bootStrapSize=0)
 Creates a bootstrap partition of a labeled dataset and returns it using weighting. More...
 
template<class InputType >
WeightedUnlabeledData< InputType > bootstrap (UnlabeledData< InputType > const &dataset, std::size_t bootStrapSize=0)
 Creates a bootstrap partition of an unlabeled dataset and returns it using weighting. More...
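
A minimal sketch, assuming a LabeledData< RealVector, unsigned int > named data:

    // Draw a bootstrap sample; elements are weighted by how often they were drawn.
    WeightedLabeledData<RealVector, unsigned int> sample = bootstrap(data);
    double total = sumOfWeights(sample);  // total weight of the bootstrap sample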
 
 SHARK_VECTOR_MATRIX_TYPEDEFS (long double, BigReal)
 
 SHARK_VECTOR_MATRIX_TYPEDEFS (bool, Bool)
 
template<class VectorType >
ConcatenatedModel< VectorType > operator>> (AbstractModel< VectorType, VectorType, VectorType > &firstModel, AbstractModel< VectorType, VectorType, VectorType > &secondModel)
 Connects two AbstractModels so that the output of the first model is the input of the second. More...
 
template<class VectorType >
ConcatenatedModel< VectorType > operator>> (ConcatenatedModel< VectorType > const &firstModel, AbstractModel< VectorType, VectorType, VectorType > &secondModel)
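
A minimal sketch using two LinearModel layers as toy stages; the constructor arguments below are purely illustrative:

    // Two affine stages with signature (inputs, outputs, bias).
    LinearModel<RealVector> layer1(10, 5, true);
    LinearModel<RealVector> layer2(5, 2, true);

    // Feed the output of layer1 into layer2.
    ConcatenatedModel<RealVector> network = layer1 >> layer2;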
 
template<typename InputType , typename InputTypeT1 , typename InputTypeT2 >
double evalSkipMissingFeatures (const AbstractKernelFunction< InputType > &kernelFunction, const InputTypeT1 &inputA, const InputTypeT2 &inputB)
 
template<typename InputType , typename InputTypeT1 , typename InputTypeT2 , typename InputTypeT3 >
double evalSkipMissingFeatures (const AbstractKernelFunction< InputType > &kernelFunction, const InputTypeT1 &inputA, const InputTypeT2 &inputB, InputTypeT3 const &missingness)
 
template<class InputType , class M , class Device >
void calculateRegularizedKernelMatrix (AbstractKernelFunction< InputType >const &kernel, Data< InputType > const &dataset, blas::matrix_expression< M, Device > &matrix, double regularizer=0)
 Calculates the regularized kernel gram matrix of the points stored inside a dataset. More...
 
template<class InputType , class M , class Device >
void calculateMixedKernelMatrix (AbstractKernelFunction< InputType >const &kernel, Data< InputType > const &dataset1, Data< InputType > const &dataset2, blas::matrix_expression< M, Device > &matrix)
 Calculates the kernel gram matrix between two data sets. More...
 
template<class InputType >
RealMatrix calculateRegularizedKernelMatrix (AbstractKernelFunction< InputType >const &kernel, Data< InputType > const &dataset, double regularizer=0)
 Calculates the regularized kernel gram matrix of the points stored inside a dataset. More...
 
template<class InputType >
RealMatrix calculateMixedKernelMatrix (AbstractKernelFunction< InputType >const &kernel, Data< InputType > const &dataset1, Data< InputType > const &dataset2)
 Calculates the kernel gram matrix between two data sets. More...
 
template<class InputType , class WeightMatrix >
RealVector calculateKernelMatrixParameterDerivative (AbstractKernelFunction< InputType > const &kernel, Data< InputType > const &dataset, WeightMatrix const &weights)
 Efficiently calculates the weighted derivative of a Kernel Gram Matrix w.r.t the Kernel Parameters. More...
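
A minimal sketch, assuming a Data< RealVector > named data and a Gaussian RBF kernel; the bandwidth and regularizer below are arbitrary:

    GaussianRbfKernel<> kernel(0.5);

    // Gram matrix of the dataset with a small ridge added to the diagonal.
    RealMatrix K = calculateRegularizedKernelMatrix(kernel, data, 1e-4);

    // Rectangular kernel matrix between two datasets (here the same one twice).
    RealMatrix K12 = calculateMixedKernelMatrix(kernel, data, data);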
 
template<class RBMType >
double logPartitionFunction (RBMType const &rbm, double beta=1.0)
 Calculates the value of the partition function Z. More...
 
template<class RBMType >
double negativeLogLikelihoodFromLogPartition (RBMType const &rbm, UnlabeledData< RealVector > const &inputs, double logPartition, double beta=1.0)
 Estimates the negative log-likelihood of a set of input vectors under the model's distribution using the partition function. More...
 
template<class RBMType >
double negativeLogLikelihood (RBMType const &rbm, UnlabeledData< RealVector > const &inputs, double beta=1.0)
 Estimates the negative log-likelihood of a set of input vectors under the model's distribution. More...
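
A minimal sketch, assuming a small, already trained BinaryRBM named rbm and an UnlabeledData< RealVector > named inputs; exact evaluation of the partition function is only feasible for small layers:

    // Exact log partition function at inverse temperature beta = 1.
    double logZ = logPartitionFunction(rbm);

    // Negative log-likelihood of the inputs, reusing the partition function.
    double nll = negativeLogLikelihoodFromLogPartition(rbm, inputs, logZ);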
 
double estimateLogFreeEnergyFromEnergySamples (RealMatrix const &energyDiffUp, RealMatrix const &energyDiffDown, PartitionEstimationAlgorithm algorithm=AIS)
 
template<class RBMType >
double estimateLogFreeEnergy (RBMType &rbm, Data< RealVector > const &initDataset, RealVector const &beta, std::size_t samples, PartitionEstimationAlgorithm algorithm=AcceptanceRatioMean, float burnInPercentage=0.1)
 
template<class RBMType >
double annealedImportanceSampling (RBMType &rbm, RealVector const &beta, std::size_t samples)
 
template<class RBMType >
double estimateLogFreeEnergy (RBMType &rbm, Data< RealVector > const &initDataset, std::size_t chains, std::size_t samples, PartitionEstimationAlgorithm algorithm=AIS, float burnInPercentage=0.1)
 
template<class RBMType >
double annealedImportanceSampling (RBMType &rbm, std::size_t chains, std::size_t samples)
 
template<typename Stream >
FrontType read_front (Stream &in, std::size_t noObjectives, const std::string &separator=" ", std::size_t headerLines=0)
 
template<class I , class L >
CVFolds< LabeledData< I, L > > createCVIID (LabeledData< I, L > &set, std::size_t numberOfPartitions, std::size_t batchSize=Data< I >::DefaultBatchSize)
 Create a partition for cross validation. More...
 
template<class I , class L >
CVFolds< LabeledData< I, L > > createCVSameSize (LabeledData< I, L > &set, std::size_t numberOfPartitions, std::size_t batchSize=LabeledData< I, L >::DefaultBatchSize)
 Create a partition for cross validation. More...
 
template<class I >
CVFolds< LabeledData< I, unsigned int > > createCVSameSizeBalanced (LabeledData< I, unsigned int > &set, std::size_t numberOfPartitions, std::size_t batchSize=Data< I >::DefaultBatchSize, RecreationIndices *cv_indices=NULL)
 Create a partition for cross validation. More...
 
template<class I , class L >
CVFolds< LabeledData< I, L > > createCVBatch (LabeledData< I, L > const &set, std::size_t numberOfPartitions)
 Create a partition for cross validation without changing the dataset. More...
 
template<class I , class L >
CVFolds< LabeledData< I, L > > createCVIndexed (LabeledData< I, L > &set, std::size_t numberOfPartitions, std::vector< std::size_t > indices, std::size_t batchSize=Data< I >::DefaultBatchSize)
 Create a partition for cross validation from indices. More...
 
template<class I , class L >
CVFolds< LabeledData< I, L > > createCVFullyIndexed (LabeledData< I, L > &set, std::size_t numberOfPartitions, RecreationIndices indices, std::size_t batchSize=Data< I >::DefaultBatchSize)
 Create a partition for cross validation from indices for both ordering and partitioning. More...
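
A minimal sketch, assuming a LabeledData< RealVector, unsigned int > named data; the fold accessors (size, training, validation) are used as provided by the CVFolds class:

    // Split the dataset into 5 folds, treating the elements as i.i.d.
    CVFolds<LabeledData<RealVector, unsigned int> > folds = createCVIID(data, 5);

    for(std::size_t fold = 0; fold != folds.size(); ++fold){
        LabeledData<RealVector, unsigned int> training = folds.training(fold);
        LabeledData<RealVector, unsigned int> validation = folds.validation(fold);
        // train on 'training', evaluate on 'validation' ...
    }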
 
template<class T >
std::ostream & operator<< (std::ostream &stream, const Data< T > &d)
 Outstream of elements. More...
 
SHARK_EXPORT_SYMBOL std::pair< std::string, std::string > splitUrl (std::string const &url)
 Split a URL into its domain and resource parts. More...
 
SHARK_EXPORT_SYMBOL std::string download (std::string const &url, unsigned short port=80)
 Download a document with the HTTP protocol. More...
 
template<class InputType , class LabelType >
void downloadSparseData (LabeledData< InputType, LabelType > &dataset, std::string const &url, unsigned short port=80, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Download and import a sparse data (libSVM) file. More...
 
template<class InputType , class LabelType >
void downloadFromMLData (LabeledData< InputType, LabelType > &dataset, std::string const &name, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Download a data set from mldata.org. More...
 
template<class InputType >
void downloadCsvData (LabeledData< InputType, unsigned int > &dataset, std::string const &url, LabelPosition lp, char separator=',', char comment='#', unsigned short port=80, std::size_t maximumBatchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Download and import a dense data (CSV) file for classification. More...
 
template<class InputType >
void downloadCsvData (LabeledData< InputType, RealVector > &dataset, std::string const &url, LabelPosition lp, std::size_t numberOfOutputs=1, char separator=',', char comment='#', unsigned short port=80, std::size_t maximumBatchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Download and import a dense data (CSV) file for regression. More...
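
A minimal sketch; the URLs below are placeholders:

    // Download a sparse (libSVM) formatted classification dataset over HTTP.
    LabeledData<RealVector, unsigned int> sparse;
    downloadSparseData(sparse, "example.org/datasets/train.libsvm");

    // Download a CSV classification dataset whose label is in the first column.
    LabeledData<RealVector, unsigned int> csv;
    downloadCsvData(csv, "example.org/datasets/train.csv", FIRST_COLUMN);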
 
void import_libsvm (LabeledData< RealVector, unsigned int > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import data from a LIBSVM file. More...
 
void import_libsvm (LabeledData< CompressedRealVector, unsigned int > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import data from a LIBSVM file. More...
 
void import_libsvm (LabeledData< RealVector, unsigned int > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import data from a LIBSVM file. More...
 
void import_libsvm (LabeledData< CompressedRealVector, unsigned int > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import data from a LIBSVM file. More...
 
template<typename InputType >
void export_libsvm (LabeledData< InputType, unsigned int > &dataset, const std::string &fn, bool dense=false, bool oneMinusOne=true, bool sortLabels=false, bool append=false)
 Export data to LIBSVM format. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< RealVector, unsigned int > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import classification data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< RealVector, RealVector > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< CompressedRealVector, unsigned int > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import classification data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< CompressedRealVector, RealVector > &dataset, std::istream &stream, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< RealVector, unsigned int > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import classification data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< RealVector, RealVector > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< CompressedRealVector, unsigned int > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import classification data from a sparse data (libSVM) file. More...
 
SHARK_EXPORT_SYMBOL void importSparseData (LabeledData< CompressedRealVector, RealVector > &dataset, std::string fn, unsigned int highestIndex=0, std::size_t batchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a sparse data (libSVM) file. More...
 
template<typename InputType >
void exportSparseData (LabeledData< InputType, unsigned int > const &dataset, std::ostream &stream, bool oneMinusOne=true, bool sortLabels=false)
 Export classification data to sparse data (libSVM) format. More...
 
template<typename InputType >
void exportSparseData (LabeledData< InputType, unsigned int > const &dataset, const std::string &fn, bool oneMinusOne=true, bool sortLabels=false, bool append=false)
 Export classification data to sparse data (libSVM) format. More...
 
template<typename InputType >
void exportSparseData (LabeledData< InputType, RealVector > const &dataset, std::ostream &stream)
 Export regression data to sparse data (libSVM) format. More...
 
template<typename InputType >
void exportSparseData (LabeledData< InputType, RealVector > const &dataset, const std::string &fn, bool append=false)
 Export regression data to sparse data (libSVM) format. More...
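
A minimal sketch of a round trip through the sparse (libSVM) format; the file names are placeholders:

    LabeledData<RealVector, unsigned int> data;
    importSparseData(data, "train.libsvm");
    exportSparseData(data, "copy.libsvm");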
 
template<class InputType , class OutputType >
void initRandomNormal (AbstractModel< InputType, OutputType > &model, double s)
 Initialize model parameters with normally distributed random values. More...
 
template<class InputType , class OutputType >
void initRandomUniform (AbstractModel< InputType, OutputType > &model, double lower, double upper)
 Initialize model parameters uniformly at random. More...
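
A minimal sketch using a LinearModel as an example model; the layer sizes below are purely illustrative:

    // Linear model with 5 inputs, 2 outputs and a bias term.
    LinearModel<RealVector> model(5, 2, true);

    // Either draw the parameters uniformly from [-0.1, 0.1] ...
    initRandomUniform(model, -0.1, 0.1);
    // ... or from a zero-mean normal distribution with spread s = 0.01.
    initRandomNormal(model, 0.01);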
 

Variables

static const double SQRT_2_PI = boost::math::constants::root_two_pi<double>()
 Constant for sqrt( 2 * pi ). More...
 
enum  LabelPosition { FIRST_COLUMN, LAST_COLUMN }
 Position of the label in a CSV file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< FloatVector > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< RealVector >::DefaultBatchSize)
 Import unlabeled vectors from a read-in character-separated value file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< RealVector > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< RealVector >::DefaultBatchSize)
 Import unlabeled vectors from a read-in character-separated value file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< unsigned int > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< unsigned int >::DefaultBatchSize)
 Import "csv" from string consisting only of a single unsigned int per row. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< int > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< int >::DefaultBatchSize)
 Import "csv" from string consisting only of a single int per row. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< float > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< double >::DefaultBatchSize)
 Import "csv" from string consisting only of a single double per row. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (Data< double > &data, std::string const &contents, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< double >::DefaultBatchSize)
 Import "csv" from string consisting only of a single double per row. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (LabeledData< RealVector, unsigned int > &dataset, std::string const &contents, LabelPosition lp, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import labeled data from a character-separated value file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (LabeledData< FloatVector, unsigned int > &dataset, std::string const &contents, LabelPosition lp, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import labeled data from a character-separated value file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (LabeledData< RealVector, RealVector > &dataset, std::string const &contents, LabelPosition lp, std::size_t numberOfOutputs=1, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a read-in character-separated value file. More...
 
SHARK_EXPORT_SYMBOL void csvStringToData (LabeledData< FloatVector, FloatVector > &dataset, std::string const &contents, LabelPosition lp, std::size_t numberOfOutputs=1, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import regression data from a read-in character-separated value file. More...
 
template<class T >
void importCSV (Data< T > &data, std::string fn, char separator=',', char comment='#', std::size_t maximumBatchSize=Data< T >::DefaultBatchSize, std::size_t titleLines=0)
 Import a Dataset from a csv file. More...
 
template<class T >
void importCSV (LabeledData< blas::vector< T >, unsigned int > &data, std::string fn, LabelPosition lp, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, unsigned int >::DefaultBatchSize)
 Import a labeled Dataset from a csv file. More...
 
template<class T >
void importCSV (LabeledData< blas::vector< T >, blas::vector< T > > &data, std::string fn, LabelPosition lp, std::size_t numberOfOutputs=1, char separator=',', char comment='#', std::size_t maximumBatchSize=LabeledData< RealVector, RealVector >::DefaultBatchSize)
 Import a labeled Dataset from a csv file. More...
 
template<typename Type >
void exportCSV (Data< Type > const &set, std::string fn, char separator=',', bool sci=true, unsigned int width=0)
 Format unlabeled data into a character-separated value file. More...
 
template<typename InputType , typename LabelType >
void exportCSV (LabeledData< InputType, LabelType > const &dataset, std::string fn, LabelPosition lp, char separator=',', bool sci=true, unsigned int width=0)
 Format labeled data into a character-separated value file. More...
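
A minimal sketch of a CSV round trip for classification data; the file names are placeholders and the label is assumed to be stored in the last column:

    LabeledData<RealVector, unsigned int> data;
    importCSV(data, "data.csv", LAST_COLUMN);
    exportCSV(data, "copy.csv", LAST_COLUMN);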
 
enum  KernelMatrixNormalizationType {
  NONE, MULTIPLICATIVE_TRACE_ONE, MULTIPLICATIVE_TRACE_N, MULTIPLICATIVE_VARIANCE_ONE,
  CENTER_ONLY, CENTER_AND_MULTIPLICATIVE_TRACE_ONE
}
 
template<typename InputType , typename LabelType >
void exportKernelMatrix (LabeledData< InputType, LabelType > const &dataset, AbstractKernelFunction< InputType > &kernel, std::ostream &out, KernelMatrixNormalizationType normalizer=NONE, bool scientific=false, unsigned int fieldwidth=0)
 Write a kernel Gram matrix to stream. More...
 
template<typename InputType , typename LabelType >
void exportKernelMatrix (LabeledData< InputType, LabelType > const &dataset, AbstractKernelFunction< InputType > &kernel, std::string fn, KernelMatrixNormalizationType normalizer=NONE, bool sci=false, unsigned int width=0)
 Write a kernel Gram matrix to file. More...
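
A minimal sketch, assuming a LabeledData< RealVector, unsigned int > named data and a Gaussian RBF kernel; the bandwidth and file name are illustrative:

    GaussianRbfKernel<> kernel(1.0);

    // Write the Gram matrix, normalized to unit trace, to a file.
    exportKernelMatrix(data, kernel, "gram.txt", MULTIPLICATIVE_TRACE_ONE);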
 
template<typename InputType , typename LabelType >
void export_kernel_matrix (LabeledData< InputType, LabelType > const &dataset, AbstractKernelFunction< InputType > &kernel, std::ostream &out, KernelMatrixNormalizationType normalizer=NONE, bool scientific=false, unsigned int fieldwidth=0)
 
template<typename InputType , typename LabelType >
void export_kernel_matrix (LabeledData< InputType, LabelType > const &dataset, AbstractKernelFunction< InputType > &kernel, std::string fn, KernelMatrixNormalizationType normalizer=NONE, bool sci=false, unsigned int width=0)
 

Detailed Description

AbstractMultiObjectiveOptimizer.

Implements Block Gibbs Sampling.

Implements the Shifter benchmark problem.

Loads the MNIST benchmark problem.

Implements the DistantModes/ArtificialModes benchmark problem.

Implements the Bars & Stripes benchmark problem.

Implements the bipolar (-1,1) state neuron layer.

Typedefs for the Bipolar RBM.

Typedefs for the Binary-Binary RBM.

Calculate statistics given a range of values.

ROC.

Implements a multi-variate normal distribution with zero mean.

Implements a multinomial distribution.

Variational-autoencoder error function.

Maximum-likelihood model selection for binary support vector machines.

Regularizer.

Radius Margin Quotient for SVM model selection.

Negative Log Likelihood error function.

Evidence for model selection of a regularization network/Gaussian process.

Functions for measuring the area under the (ROC) curve.

Error measure for classification tasks, typically used for evaluation of results.

Implements Tukey's Biweight-loss function for robust regression.

Implements the Squared Error Loss function for regression.

Implements the Squared Hinge Loss function for maximum margin classification.

Implements the squared Hinge Loss function for maximum margin regression.

Implements the Huber loss function for robust regression.

Implements the Hinge Loss function for maximum margin classification.

Implements the Hinge Loss function for maximum margin regression.

Flexible error measure for classification tasks.

Error measure for classification tasks that can be used as the objective function for training.

super class of all loss functions

implements the absolute loss, which is the distance between labels and predictions

Leave-one-out error for C-SVMs.

Leave-one-out error.

Kernel Target Alignment - a measure of alignment of a kernel Gram matrix with labels.

Archive of evaluated points as an objective function wrapper.

error function for supervised learning

cross-validation error for selection of hyper-parameters

CombinedObjectiveFunction.

Multi-objective optimization benchmark function ZDT6.

Multi-objective optimization benchmark function ZDT4.

Multi-objective optimization benchmark function ZDT3.

Multi-objective optimization benchmark function ZDT2.

Multi-objective optimization benchmark function ZDT1.

Convex benchmark function.

Implements a wrapper over an objective function which just rotates its inputs.

Generalized Rosenbrock benchmark function.

Pole balancing simulation for double pole.

Pole balancing simulation for double poles.

Objective function for single and double poles with partial state information (non-Markovian task)

Objective function for single and double poles with full state information (Markovian task)

Multi-objective optimization benchmark function LZ9.

Multi-objective optimization benchmark function LZ8.

Multi-objective optimization benchmark function LZ7.

Multi-objective optimization benchmark function LZ6.

Multi-objective optimization benchmark function LZ5.

Multi-objective optimization benchmark function LZ4.

Multi-objective optimization benchmark function LZ3.

Multi-objective optimization benchmark function LZ2.

Multi-objective optimization benchmark function LZ1.

Multi-objective optimization benchmark function IHR 6.

Multi-objective optimization benchmark function IHR 4.

Multi-objective optimization benchmark function IHR 3.

Multi-objective optimization benchmark function IHR 2.

Multi-objective optimization benchmark function IHR 1.

Two-dimensional, real-valued Himmelblau function.

GSP benchmark function for multiobjective optimization.

Bi-objective real-valued benchmark function proposed by Fonseca and Fleming.

Multi-objective optimization benchmark function ELLI 2.

Multi-objective optimization benchmark function ELLI 1.

Objective function DTLZ7.

Objective function DTLZ6.

Objective function DTLZ5.

Objective function DTLZ4.

Objective function DTLZ3.

Objective function DTLZ2.

Objective function DTLZ1.

Multi-objective optimization benchmark function CIGTAB 2.

Multi-objective optimization benchmark function CIGTAB 1.

Convex quadratic benchmark function.

Convex quadratic benchmark function with single dominant axis.

AbstractObjectiveFunction.

cost function for quantitative judgement of deviations of predictions from target values

Base class for constraints.

Random Forest Classifier.

Tree for nearest neighbor search in data with low embedding dimension.

Tree for nearest neighbor search in kernel-induced feature spaces.

Tree for nearest neighbor search in low dimensions.

Cart Classifier.

Binary space-partitioning tree of data points.

Implements a radial basis function layer.

One-versus-one Classifier.

Model for scaling and translation of data vectors.

Nearest neighbor model for classification and regression.

Implements the Mean Model that can be used for ensemble classifiers.

Implements a Model using a linear function.

Weighted sum of base kernels.

Variant of WeightedSumKernel which works on subranges of Vector inputs.

A kernel function that wraps a member kernel and multiplies it by a scalar.

Product of kernel functions.

Polynomial kernel.

Applies a kernel to two point sets and computes the average response.

Normalization of a kernel function.

Special kernel classes for multi-task and transfer learning.

monomial (polynomial) kernel

Weighted sum of base kernels, each acting on a subset of features only.

A kernel expansion with support of missing features.

linear kernel (standard inner product)

Collection of functions dealing with typical tasks of kernels.

Affine linear kernel function expansion.

Radial Gaussian kernel.

Do special kernel evaluation by skipping missing features.

Kernel on a finite, discrete space.

Derivative of a C-SVM hypothesis w.r.t. its hyperparameters.

Gaussian automatic relevance detection (ARD) kernel.

abstract super class of all metrics

abstract super class of all kernel functions

Implements a model applying a convolution to an image.

concatenation of two models, with type erasure

Model for "soft" clustering.

Hierarchical Clustering.

Model for "hard" clustering.

Super class for clustering models.

Clusters defined by centroids.

Super class for clustering definitions.

Model for conversion of real valued output to class labels.

base class for all models, as well as a specialized differentiable model

Some operations for creating rotation matrices.

Kernel Gram matrix with modified diagonal.

Precomputed version of a matrix for quadratic programming.

Partly Precomputed version of a matrix for quadratic programming.

Modified Kernel Gram matrix.

Cache implementing a Least-Recently-Used strategy.

Kernel Gram matrix.

Efficient special case if the kernel is gaussian and the inputs are sparse vectors.

Kernel matrix which supports kernel evaluations on data with missing features.

Kernel matrix for SVM ranking.

Efficient quadratic matrix cache.

Kernel matrix for SVM regression.

Entry point for all Basic Linear Algebra (BLAS) in Shark.

Weighted data sets for (un-)supervised learning.

some functions for vector valued statistics like mean, variance and covariance

Support for importing and exporting data from and to sparse data (libSVM) formatted data files.

Importing and exporting PGM images.

Deprecated import_libsvm and export_libsvm functions.

This will relabel a given dataset to have labels 0..N-1 (and vice versa)

Support for importing data from an HDF5 file.

export precomputed kernel matrices (using libsvm format)

Support for downloading data sets from online sources.

Fast lookup for elements in constant datasets.

Data for (un-)supervised learning.

Learning problems given by analytic distributions.

Tools for cross-validation.

Support for importing and exporting data from and to character separated value (CSV) files.

Defines a batch adaptor for structures.

Defines the Batch Interface for a type, e.g., for every type a container with optimal structure.

A scoped_ptr like container for C type handles.

Provides a pair of Key and Value, as well as functions working with them.

Small Iterator collection.

Small General algorithm collection.

Template class checking whether for a functor F and Argument U, F(U) can be called.

Traits which allow to define ProxyReferences for types.

Timer abstraction with microsecond resolution.

Class which externalizes the state of an Object.

Class Describing the Shape of an Input.

Result sets for algorithms.

Shark Random number generation.

Very basic math abstraction layer.

ISerializable interface.

IParameterizable interface.

INameable interface.

Flexible and extensible mechanisms for holding flags.

Exception.

Random Forest Trainer.

Trainer for a Regularization Network or a Gaussian Process.

Support Vector Machine Trainer for the ranking-SVM.

Perceptron.

Principal Component Analysis.

Model training by means of a general purpose optimization procedure.

Trainer for One-Class Support Vector Machines.

Determine the scaling factor of a ScaledKernel so that it has unit variance in feature space on a given dataset.

Data normalization to zero mean, unit variance and zero covariance while keeping the original coordinate system.

Data normalization to zero mean, unit variance and zero covariance.

Data normalization to zero mean and unit variance.

Data normalization to the unit interval.

Trainer for binary SVMs natively supporting missing features.

Logistic Regression.

Generic Stochastic Average Gradient Descent training for linear models.

Linear Regression.

LDA.

LASSO Regression.

Generic stochastic gradient descent training for kernel-based models.

KernelMeanClassifier.

FisherLDA.

Trainer for the Epsilon-Support Vector Machine for Regression.

Remove budget maintenance strategy.

Project budget maintenance strategy.

Merge budget maintenance strategy.

Budgeted stochastic gradient descent training for kernel-based models.

Abstract Budget maintenance strategy.

Abstract Trainer Interface for trainers that support weighting.

Abstract Trainer Interface.

Abstract Support Vector Machine Trainer, general and linear case.

Stopping Criterion which evaluates the validation error and hands the result over to another stopping criterion.

Stopping Criterion which stops, when the training error seems to converge.

Stopping Criterion which stops, when the training error seems to converge.

Stopping Criterion which stops after a fixed number of iterations.

Stopping criterion monitoring the quotient of generalization loss and training progress.

Stopping Criterion which stops, when the generalization of the solution gets worse.

Defines a base class for stopping criteria of optimization algorithms.

Quadratic programming for Support Vector Machines.

Special container for certain coefficients describing multi-class SVMs.

General and specialized quadratic program classes and a generic solver.

Quadratic programming problem for multi-class SVMs.

Quadratic programming solvers for linear multi-class SVM training without bias.

Quadratic programming problem for multi-class SVMs.

Quadratic programming solver for linear SVM training without bias.

Quadratic program definitions.

Shrinking strategy based on box constraints.

Pegasos solvers for linear SVMs.

Efficient Nearest neighbor queries.

Efficient brute force implementation of nearest neighbors.

Interface for nearest Neighbor queries.

Jaakkola's heuristic and related quantities for Gaussian kernel selection.

Trust-Region Newton-Step Method.

SteepestDescent.

implements different versions of Resilient Backpropagation of error.

LineSearch.

LBFGS.

CG.

BFGS.

Adam.

Base class for Line Search Optimizer.

Implements the VD-CMA-ES Algorithm.

SteadyStateMOCMA.h.

Implements the SMS-EMOA.

Nelder-Mead Simplex Downhill Method.

Implements the RVEA algorithm.

RealCodedNSGAIII.h.

IndicatorBasedRealCodedNSGAII.h.

Implements a two point step size adaptation rule based on a line-search type of approach.

Roulette-Wheel-Selection using uniform selection probability assignment.

Implements tournament selection.

Implements fitness proportional selection.

Implements the reference vector selection for RVEA.

Roulette-Wheel-Selection based on fitness-rank-based selection probability assignment.

Indicator-based selection strategy for multi-objective selection.

EP-Tournament selection operator.

Elitist Selection Operator suitable for (mu,lambda) and (mu+lambda) selection.

Implements the Tchebycheff scalarizer.

Reference vector adaptation for the RVEA algorithm.

Uniform crossover of arbitrary individuals.

Simulated binary crossover operator.

Implements one-point crossover operator.

Implements the step size adaptation based on the success of the new population compared to the old.

Polynomial mutation operator.

Bit flip mutation operator.

Various functions for generating n-dimensional grids (simplex-lattices).

Algorithm selecting front based on their crowding distance.

Calculates the hypervolume covered by a front of non-dominated points.

Algorithm selecting points based on their crowding distance.

Calculates the additive approximation quality of a Pareto-front approximation.

Approximately determines the individual contributing the least hypervolume.

Implements the frontend for the HypervolumeContribution algorithms, including the approximations.

Implementation of the exact hypervolume calculation in m dimensions.

Implementation of the exact hypervolume calculation in 3 dimensions.

Implementation of the exact hypervolume calculation in 2 dimensions.

Implements the frontend for the HypervolumeCalculator algorithms, including the approximations.

Determine the volume of the union of objects by an FPRAS.

PenalizingEvaluator.

Implementation of the Pareto-Dominance relation.

Swapper method for non-dominated sorting.

Implements the fast non-dominated sort algorithm.

Implements a divide-and-conquer non-dominated sorting algorithm.

Implements the MOEA/D algorithm.

Implements the generational Multi-objective Covariance Matrix Adaptation ES.

GridSearch.h.

Implements the most recent version of the elitist CMA-ES.

Implements the Cross Entropy Algorithm.

Implements the CMSA.

Implements the most recent version of the non-elitist CMA-ES.

TypedIndividual.

CMAChromosome of the CMA-ES.

The k-means clustering algorithm.

AbstractSingleObjectiveOptimizer.

AbstractOptimizer.

Author
T.Voss, T. Glasmachers, O.Krause
Date
2010-2011
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Glasmachers
Date
2011
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Hansen, N. The CMA Evolution Startegy: A Tutorial, June 28, 2011 and the eqation numbers refer to this publication (retrieved April 2014).

Author
Thomas Voss and Christian Igel
Date
April 2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

The algorithm is described in

H. G. Beyer, B. Sendhoff (2008). Covariance Matrix Adaptation Revisited: The CMSA Evolution Strategy In Proceedings of the Tenth International Conference on Parallel Problem Solving from Nature (PPSN X), pp. 123-132, LNCS, Springer-Verlag

Copyright (c) 1998-2008:
Institut für Neuroinformatik
Author
-
Date
-
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Christophe Thiery, Bruno Scherrer. Improvements on Learning Tetris with Cross Entropy. International Computer Games Association Journal, ICGA, 2009, 32. <inria-00418930>

Author
Jens Holm, Mathias Petræus and Mark Wulff
Date
January 2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

The algorithm is based on

C. Igel, T. Suttorp, and N. Hansen. A Computational Efficient Covariance Matrix Update and a (1+1)-CMA for Evolution Strategies. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2006), pp. 453-460, ACM Press, 2006

D. V. Arnold and N. Hansen: Active covariance matrix adaptation for the (1+1)-CMA-ES. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2010): pp 385-392, ACM Press 2010

Author
O. Krause T.Voss
Date
2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
O. Krause
Date
2010
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss, T. Glasmachers, O.Krause
Date
2010-2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
Thomas Voss and Christian Igel
Date
April 2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss
Date
2010
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
Bjoern Bugge Grathwohl
Date
February 2017
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Glasmachers
Date
2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Voss, O. Krause
Date
2015
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Glasmachers (based on old version by T. Voß)
Date
2011-2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Voss, O.Krause
Date
2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss
Date
2010-2011
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
O.Krause
Date
2014-2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
O.Krause, T. Glasmachers
Date
2014-2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T. Glasmachers
Date
2016-2017
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss, O.Krause, T. Glasmachers
Date
2014-2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
O.Krause
Date
2017
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss, O.Krause
Date
2010-2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
O.Krause
Date
2016
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
T.Voss, O.Krause
Date
2010-2014
Copyright 1995-2017 Shark Development Team



This file is part of Shark. http://shark-ml.org/

Shark is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

Shark is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License along with Shark. If not, see http://www.gnu.org/licenses/.

Author
Bjørn Bugge Grathwohl
Date
February 2017
Copyright 1995-2017 Shark Development Team




Author
O.Krause
Date
2014
Copyright 1995-2017 Shark Development Team




Author
T.Voss O.Krause
Date
2010-2011
Copyright 1995-2017 Shark Development Team




Author
T. Voss
Date
2010-2011
Copyright 1995-2017 Shark Development Team




Author
Bjoern Bugge Grathwohl
Date
March 2017
Copyright 1995-2017 Shark Development Team




K. Miettinen, "Nonlinear Multiobjective Optimization", International Series in Operations Research and Management Science (12) DOI: 10.1007/978-1-4615-5563-6

Author
Bjørn Bugge Grathwohl
Date
February 2017
Copyright 1995-2017 Shark Development Team




The algorithm is described in: James E. Baker. Adaptive Selection Methods for Genetic Algorithms. In John J. Grefenstette (ed.): Proceedings of the 1st International Conference on Genetic Algorithms (ICGA), pp. 101-111, Lawrence Erlbaum Associates, 1985

Author
T.Voss
Date
2010-2011
Copyright 1995-2017 Shark Development Team




See http://en.wikipedia.org/wiki/Fitness_proportionate_selection

Author
T. Voss
Copyright (c) 1998-2008:
Institut für Neuroinformatik
Date
-
Copyright 1995-2017 Shark Development Team




See http://en.wikipedia.org/wiki/Tournament_selection

Author
-
Date
-
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2015
Copyright 1995-2017 Shark Development Team




See Nicola Beume, Boris Naujoks, and Michael Emmerich. SMS-EMOA: Multiobjective selection based on dominated hypervolume. European Journal of Operational Research, 181(3):1653-1669, 2007.

Author
T.Voss
Date
2010
Copyright 1995-2017 Shark Development Team




Author
Oswin Krause
Date
April 2014
Copyright 1995-2017 Shark Development Team



Implements the VD-CMA-ES Algorithm.

The VD-CMA-ES implements a restricted form of the CMA-ES in which the covariance matrix is constrained to the form D + vv^T, where D is a diagonal matrix and v a single vector. This restriction makes the variant suitable for large-scale optimisation.

For further details, see the paper: Akimoto, Y., A. Auger, and N. Hansen (2014). Comparison-Based Natural Gradient Optimization in High Dimension. In Genetic and Evolutionary Computation Conference (GECCO 2014), Proceedings, ACM.

The implementation differs from the paper in order to stay closer to the reference implementation and to achieve better numerical accuracy.
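To illustrate why this restriction helps in high dimensions (an illustrative sketch only, with hypothetical names, not part of the Shark API): the restricted covariance can be applied to a vector in O(n) time and memory, whereas a general covariance matrix needs O(n^2).

// Illustrative sketch: applying the restricted covariance C = D + v v^T to a
// vector x in O(n). D is stored by its diagonal; all names are hypothetical.
#include <cstddef>
#include <numeric>
#include <vector>

std::vector<double> applyRestrictedCovariance(
	std::vector<double> const& diagD, // diagonal entries of D
	std::vector<double> const& v,     // the single direction vector
	std::vector<double> const& x)
{
	double vTx = std::inner_product(v.begin(), v.end(), x.begin(), 0.0); // v^T x
	std::vector<double> result(x.size());
	for (std::size_t i = 0; i != x.size(); ++i)
		result[i] = diagD[i] * x[i] + v[i] * vTx; // (D + v v^T) x
	return result;
}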

Author
O. Krause
Date
2013
Copyright 1995-2017 Shark Development Team




Author
O. Krause
Date
2017
Copyright 1995-2017 Shark Development Team




The Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm is a quasi-Newton method for unconstrained real-valued optimization.
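For reference, a standard textbook formulation of the quasi-Newton update behind BFGS (the implementation in this file may differ in details such as the line search): the approximation H_k of the inverse Hessian is updated as

\[
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k},
\]

with \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\); the search direction is \(d_k = -H_k \nabla f(x_k)\), scaled by a step length obtained from a line search.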

Author
O. Krause
Date
2010
Copyright 1995-2017 Shark Development Team




Conjugate-gradient method for unconstrained optimization.

Author
O. Krause
Date
2010
Copyright 1995-2017 Shark Development Team




The limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm is a quasi-Newton method for unconstrained real-valued optimization. See http://en.wikipedia.org/wiki/LBFGS for details.

Author
S. Dahlgaard, O.Krause
Date
2013
Copyright 1995-2017 Shark Development Team




Author
O. Krause, S. Dahlgaard
Date
2010-2017
Copyright 1995-2017 Shark Development Team




Author
Oswin Krause
Date
2010
Copyright 1995-2017 Shark Development Team




Author
O. Krause
Date
2015
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers, O. Krause, C. Igel
Date
2010
Copyright 1995-2017 Shark Development Team




Author
O.Krause
Date
2012-2014
Copyright 1995-2017 Shark Development Team




Author
O.Krause
Date
2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers, O.Krause
Date
2017
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers, O.Krause
Date
2013
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
-
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2007-2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers, O.Krause
Date
2007-2016
Copyright 1995-2017 Shark Development Team




This file provides a number of classes representing huge dense matrices, all related to kernel Gram matrices of possibly large datasets. These classes share a common interface for (a) providing a matrix entry, (b) swapping two variable indices, and (c) returning the matrix size.
This interface is required by the template class CachedMatrix, which provides a cache mechanism for restricted matrix rows, as it is used by various quadratic program solvers within the library. The PrecomputedMatrix class provides a sometimes faster but more memory-intensive alternative to CachedMatrix.
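A minimal sketch of the shared interface described above; the method names are illustrative assumptions rather than the definitive API of the classes in this file:

// Minimal sketch of the common matrix interface (a)-(c) described above.
// Method names are illustrative assumptions, not the library's exact API.
#include <cstddef>
#include <utility>
#include <vector>

class ExampleGramMatrix
{
public:
	explicit ExampleGramMatrix(std::vector<std::vector<double>> k): m_k(std::move(k)){}

	// (a) provide a single matrix entry
	double entry(std::size_t i, std::size_t j) const { return m_k[i][j]; }

	// (b) swap two variable indices (rows and columns i and j)
	void flipColumnsAndRows(std::size_t i, std::size_t j){
		std::swap(m_k[i], m_k[j]);
		for (auto& row : m_k) std::swap(row[i], row[j]);
	}

	// (c) return the matrix size
	std::size_t size() const { return m_k.size(); }

private:
	std::vector<std::vector<double>> m_k;
};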
Author
T. Glasmachers
Date
2007-2012
Copyright 1995-2017 Shark Development Team




This file provides: 1) the QpConfig class, which can configure and provide information about an SVM training procedure; 2) a super-class for general SVM trainers, namely the AbstractSvmTrainer; and 3) a streamlined variant thereof for purely linear SVMs, namely the AbstractLinearSvmTrainer. In general, the SvmTrainers hold as parameters all hyperparameters of the underlying SVM, which includes the kernel parameters for non-linear SVMs.
Author
T. Glasmachers
Date
-
Copyright 1995-2017 Shark Development Team




Author
O. Krause, T.Glasmachers
Date
2010-2011
Copyright 1995-2017 Shark Development Team




Author
O. Krause
Date
2014
Copyright 1995-2017 Shark Development Team




This holds the interface for any budget maintenance strategy.
Author
Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




This is an implementation of the BSGD algorithm developed by Wang, Crammer and Vucetic: Breaking the curse of kernelization: Budgeted stochastic gradient descent for large-scale SVM training, JMLR 2012. Essentially this is Pegasos, i.e. something similar to a perceptron. The main difference is that the sparsity of the weight vector is restricted to a (currently predefined) budget size. Whenever this budget is exhausted, a decision has to be made about how to add a new vector to the model without exceeding the budget. Several methods have been proposed for this; the main insight of Wang et al. is to merge two budget vectors (i.e. two vectors in the model). The first one is selected by the norm of its alpha coefficients; the second one can then be found by solving a small optimization problem, yielding a roughly optimal pair. Merging this pair frees space in the budget for a new vector. Such strategies are called budget maintenance strategies.
This implementation owes much to the reference implementation in the BudgetedSVM software.
Author
T. Glasmachers, Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




This is a budget maintenance strategy that makes room for a new vector by merging a pair of budget vectors. The pair to merge is found by first searching for the budget vector with the smallest alpha coefficients (measured in the 2-norm), and then finding the second one by computing a degradation measure for each candidate partner. The search is therefore linear in the size of the budget.
The method is an implementation of the merge strategy given in Wang, Crammer, Vucetic: "Breaking the Curse of Kernelization: Budgeted Stochastic Gradient Descent for Large-Scale SVM Training" and owes very much to the implementation in BudgetedSVM.
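A sketch of the pair selection just described (names are hypothetical and the degradation measure is left abstract; the actual strategy additionally computes the merged vector):

// Sketch of selecting the merge pair: first the budget vector with the
// smallest alpha coefficients (2-norm), then the partner minimizing a
// degradation measure. Names and the degradation function are illustrative.
#include <cmath>
#include <cstddef>
#include <limits>
#include <utility>
#include <vector>

double alphaNorm2(std::vector<double> const& alpha){
	double s = 0.0;
	for (double a : alpha) s += a * a;
	return std::sqrt(s);
}

// returns the indices (first, second) of the pair to merge; assumes at least two vectors
std::pair<std::size_t, std::size_t> selectMergePair(
	std::vector<std::vector<double>> const& alphas,                 // alpha coefficients per budget vector
	double (*degradation)(std::size_t first, std::size_t second))   // problem-specific degradation measure
{
	std::size_t first = 0;
	for (std::size_t i = 1; i != alphas.size(); ++i)
		if (alphaNorm2(alphas[i]) < alphaNorm2(alphas[first])) first = i;

	std::size_t second = (first == 0) ? 1 : 0;
	double best = std::numeric_limits<double>::max();
	for (std::size_t j = 0; j != alphas.size(); ++j){               // linear in the budget size
		if (j == first) continue;
		double d = degradation(first, j);
		if (d < best){ best = d; second = j; }
	}
	return {first, second};
}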
Author
Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




This is a budget maintenance strategy that simply projects one of the budget vectors onto the others. To save time, the smallest vector (measured in the 2-norm of its alpha coefficients) is selected for projection.
Author
Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




This is a budget maintenance strategy that simply removes one of the budget vectors. Depending on the flavor, this can be, e.g., a random one or the smallest one (w.r.t. the 2-norm of its alpha coefficients).
Author
Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers, C. Igel
Date
2010, 2011
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2013
Copyright 1995-2017 Shark Development Team




This implementation is based on a class removed from the LinAlg package, written by M. Kreutz in 1998.
Author
T. Glasmachers
Date
2007-2011
Copyright 1995-2017 Shark Development Team




Author
O. Krause
Date
2016
Copyright 1995-2017 Shark Development Team




Author
B. Li
Date
2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2010, 2013
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers,O.Krause
Date
2016
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2010
Copyright 1995-2017 Shark Development Team




Author
M. Tuma
Date
2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2011-2012
Copyright 1995-2017 Shark Development Team




Author
K. N. Hansen, J. Kremer
Date
2011-2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2010-2011
Copyright 1995-2017 Shark Development Team




This file serves as a minimal abstraction layer. Inclusion of this file makes some frequently used functions, constants, and header file inclusions OS-, compiler-, and version-independent.
Author
-
Date
-
Copyright 1995-2017 Shark Development Team




Author
O. Krause
Date
2012
Copyright 1995-2017 Shark Development Team




Author
T. Voss, M. Tuma
Date
2010
Copyright 1995-2017 Shark Development Team




A ProxyReference can be used in the context of abstract functions to bind several related types of arguments to a single proxy type. The main use is with uBLAS expression templates, so that vectors, matrix rows and subvectors can be treated as a single argument type.

Author
O.Krause
Date
2012
Copyright 1995-2017 Shark Development Team




The implementation is based on http://www.boost.org/doc/libs/1_52_0/doc/html/proto/appendices.html
Author
O. Krause
Date
2013
Copyright 1995-2017 Shark Development Team




Author
Oswin Krause
Date
2012
Copyright 1995-2017 Shark Development Team




This class provides RAII handle management for Shark.
Author
B. Li
Date
2012
Copyright 1995-2017 Shark Development Team




The most important application of the methods provided in this file is the import of data from CSV files into Shark data containers.
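A hedged usage sketch (the importCSV overloads and the LabelPosition argument used below are assumptions to be checked against this header):

// Hedged usage sketch: the exact importCSV overloads, parameter names and
// defaults should be verified against Csv.h; this only illustrates the idea.
#include <shark/Data/Csv.h>
#include <shark/Data/Dataset.h>

void loadData()
{
	// unlabeled data, one point per row
	shark::Data<shark::RealVector> inputs;
	shark::importCSV(inputs, "inputs.csv");

	// labeled classification data, assuming the label is stored in the first column
	shark::ClassificationDataset dataset;
	shark::importCSV(dataset, "data.csv", shark::FIRST_COLUMN);
}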
Author
T. Voss, M. Tuma
Date
2010
Copyright 1995-2017 Shark Development Team




Author
O.Krause
Date
2010-2012
Copyright 1995-2017 Shark Development Team




Author
T. Glasmachers
Date
2006-2013
Copyright 1995-2017 Shark Development Team




This file provides containers for data used by the models, loss functions, and learning algorithms (trainers). The reason for dedicated containers of this type is that data often need to be split into subsets, such as training and test data, or folds in cross-validation. The containers in this file provide memory efficient mechanisms for managing and providing such subsets.
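A hedged sketch of the subset mechanisms mentioned above (the helper names splitAtElement and createCVSameSize and their signatures are assumptions to verify against the data tools headers):

// Hedged sketch: splitting data into training and test sets and creating
// cross-validation folds. Helper names/signatures are assumptions to verify.
#include <cstddef>
#include <shark/Data/Dataset.h>
#include <shark/Data/CVDatasetTools.h>

void makeSubsets(shark::ClassificationDataset& data)
{
	// keep roughly the first 70% for training; the returned remainder becomes the test set
	std::size_t numTrain = (7 * data.numberOfElements()) / 10;
	shark::ClassificationDataset test = shark::splitAtElement(data, numTrain);

	// five cross-validation folds of roughly equal size over the training part
	shark::CVFolds<shark::ClassificationDataset> folds = shark::createCVSameSize(data, 5);
	(void) test;
	(void) folds;
}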
Author
O. Krause, T. Glasmachers
Date
2010-2014
Copyright 1995-2017 Shark Development Team




The methods in this file allow downloading data sets from the mldata.org repository and other sources.
Author
T. Glasmachers
Date
2016
Copyright 1995-2017 Shark Development Team




The most important application of the methods provided in this file is the import of data from HDF5 files into Shark data containers.
Author
B. Li
Date
2012
Copyright 1995-2017 Shark Development Team




Author
Aydin Demircioglu
Date
2014
Copyright 1995-2017 Shark Development Team




Deprecated:
This file is provided for backwards compatibility. It is deprecated; use SparseData.h for new projects.
Author
T. Glasmachers
Date
2014
Copyright 1995-2017 Shark Development Team




Author
C. Igel
Date
2011
Copyright 1995-2017 Shark Development Team




The most important application of the methods provided in this file is the import of data from LIBSVM files into Shark data containers.
Author
M. Tuma, T. Glasmachers, C. Igel
Date
2010-2016
Copyright 1995-2017 Shark Development Team




Author
O.Krause, C. Igel
Date
2010-2013
Copyright 1995-2017 Shark Development Team




This file provides containers for data used by the models, loss functions, and learning algorithms (trainers). The reason for dedicated containers of this type is that data often need to be split into subsets, such as training and test data, or folds in cross-validation. The containers in this file provide memory-efficient mechanisms for managing and providing such subsets. The speciality of these containers is that they are weighted.
Author
O. Krause
Date
2014
Copyright 1995-2017 Shark Development Team




Author
O.Krause, T.Glasmachers, T. Voss
Date
2010-2011
Copyright 1995-2017 Shark Development Team



Shark linear algebra definitions

This file provides all basic definitions for linear algebra. It defines objects and views for vectors and matrices over several base types, as well as many useful functions.
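
A short, hedged sketch of these definitions in use; prod, inner_prod and norm_2 are assumed to be available in the usual uBLAS-style form:

#include <shark/LinAlg/Base.h>
#include <iostream>

using namespace shark;

int main(){
    RealVector x(3);                        // dense double-precision vector
    x(0) = 1.0; x(1) = 2.0; x(2) = 3.0;

    RealMatrix A(3, 3);                     // dense double-precision matrix
    A.clear();                              // set all entries to zero
    A(0, 0) = 1.0; A(1, 1) = 2.0; A(2, 2) = 3.0;

    RealVector y = prod(A, x);              // matrix-vector product
    std::cout << "x.y = " << inner_prod(x, y)
              << ", |x| = " << norm_2(x) << std::endl;
}
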
This class provides two main member functions for computing the derivative of a C-SVM hypothesis w.r.t. its hyperparameters. First, the derivative is prepared in general; then it can be computed comparatively cheaply for any input sample. The class needs to be supplied with pointers to a KernelExpansion and a CSvmTrainer.

Affine linear kernel expansions resulting from support vector machine (SVM) training and other kernel methods.
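
For context, a hedged sketch of how such an expansion typically arises from C-SVM training; the toy data, the parameter values, and the KernelClassifier model type are illustrative assumptions:

#include <shark/Algorithms/Trainers/CSvmTrainer.h>
#include <shark/Models/Kernels/GaussianRbfKernel.h>
#include <shark/Data/Dataset.h>
#include <vector>

using namespace shark;

int main(){
    // Tiny made-up two-class problem; substitute real data in practice.
    std::vector<RealVector> inputs(4, RealVector(2));
    std::vector<unsigned int> labels(4);
    for(std::size_t i = 0; i != 4; ++i){
        inputs[i](0) = double(i);
        inputs[i](1) = (i < 2) ? 0.0 : 1.0;
        labels[i]    = (i < 2) ? 0 : 1;
    }
    ClassificationDataset data = createLabeledDataFromRange(inputs, labels);

    GaussianRbfKernel<RealVector> kernel(0.5);            // kernel bandwidth gamma = 0.5
    CSvmTrainer<RealVector> trainer(&kernel, 1.0, true);  // regularization C = 1.0, with offset term
    KernelClassifier<RealVector> svm;
    trainer.train(svm, data);

    // The decision function of the trained classifier is an affine linear kernel expansion.
    KernelExpansion<RealVector> const& expansion = svm.decisionFunction();
    RealVector alpha = expansion.parameterVector();       // expansion coefficients (and offset)
    (void)alpha;
}
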
This benchmark function is described in

Christian Igel, Nikolaus Hansen, and Stefan Roth. Covariance Matrix Adaptation for Multi-objective Optimization. Evolutionary Computation 15(1), pp. 1-28, 2007

Multi-modal benchmark function.

This benchmark function is described in

H. Li and Q. Zhang. Multiobjective Optimization Problems with Complicated Pareto Sets, MOEA/D and NSGA-II. IEEE Transactions on Evolutionary Computation, 13(2):284-302, April 2009.

Class for balancing one or two poles on a cart using a fitness function that decreases the longer the pole(s) balance(s). Based on code written by Verena Heidrich-Meisner for the paper

V. Heidrich-Meisner and C. Igel. Neuroevolution strategies for episodic reinforcement learning. Journal of Algorithms, 64(4):152–168, 2009.

Class for simulating two poles balancing on a cart. Based on code written by Verena Heidrich-Meisner for the paper

V. Heidrich-Meisner and C. Igel. Neuroevolution strategies for episodic reinforcement learning. Journal of Algorithms, 64(4):152–168, 2009.

which was in turn based on code available at http://webdocs.cs.ualberta.ca/~sutton/book/code/pole.c as of 2015/4/19, written by Rich Sutton and Chuck Anderson and later modified. Faustino Gomez wrote the physics code using the differential equations from Alexis Weiland's paper and added the Runge-Kutta solver.

Class for simulating a single pole balancing on a cart. Based on code written by Verena Heidrich-Meisner for the paper

V. Heidrich-Meisner and C. Igel. Neuroevolution strategies for episodic reinforcement learning. Journal of Algorithms, 64(4):152–168, 2009.

which was in turn based on code available at http://webdocs.cs.ualberta.ca/~sutton/book/code/pole.c as of 2015/4/19, written by Rich Sutton and Chuck Anderson and later modified. Faustino Gomez wrote the physics code using the differential equations from Alexis Weiland's paper and added the Runge-Kutta solver.

This non-convex benchmark function for real-valued optimization is a generalization from two to multiple dimensions of a classic function first proposed in:

H. H. Rosenbrock. An automatic method for finding the greatest or least value of a function. The Computer Journal 3: 175-184, 1960
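
A hedged sketch of minimizing this benchmark with a CMA-ES, following the generic init/step loop of Shark's single-objective optimizers; the Rosenbrock and CMA class names, the header paths, and the stopping threshold are assumptions:

#include <shark/Algorithms/DirectSearch/CMA.h>
#include <shark/ObjectiveFunctions/Benchmarks/Rosenbrock.h>
#include <iostream>

using namespace shark;

int main(){
    Rosenbrock objective(10);        // 10-dimensional generalized Rosenbrock function
    objective.init();

    CMA cma;
    cma.init(objective);

    // Iterate until the best function value found so far is small enough.
    while(cma.solution().value > 1e-10){
        cma.step(objective);
    }
    std::cout << "best value: " << cma.solution().value
              << " after " << objective.evaluationCounter() << " evaluations" << std::endl;
}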

This benchmark function is described in

Eckart Zitzler, Kalyanmoy Deb, and Lothar Thiele. Comparison of Multiobjective Evolutionary Algorithms: Empirical Results. Evolutionary Computation 8(2):173-195, 2000
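
A hedged sketch of approximating the Pareto front of such a function with the MOCMA optimizer documented further down this page; the ZDT1 class name, its header path, the default problem dimension, and the iteration budget are assumptions:

#include <shark/Algorithms/DirectSearch/MOCMA.h>
#include <shark/ObjectiveFunctions/Benchmarks/ZDT1.h>   // assumed location of the ZDT benchmarks
#include <iostream>

using namespace shark;

int main(){
    ZDT1 objective;        // use the default number of decision variables
    objective.init();

    MOCMA mocma;
    mocma.init(objective);
    for(std::size_t iteration = 0; iteration != 1000; ++iteration){
        mocma.step(objective);
    }

    // Print the non-dominated front found so far: two objective values per solution.
    for(auto const& solution : mocma.solution()){
        std::cout << solution.value(0) << " " << solution.value(1) << "\n";
    }
}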


Typedef Documentation

◆ BinaryCD

◆ BinaryGibbsChain

◆ BinaryGibbsOperator

Definition at line 48 of file BinaryRBM.h.

◆ BinaryParallelTempering

◆ BinaryPCD

◆ BinaryPTChain

◆ BinaryRBM

typedef RBM<BinaryLayer,BinaryLayer, random::rng_type> shark::BinaryRBM

Definition at line 47 of file BinaryRBM.h.

◆ BinarySpace

Definition at line 82 of file TwoStateSpace.h.

◆ BipolarCD

◆ BipolarGibbsChain

◆ BipolarGibbsOperator

Definition at line 48 of file BipolarRBM.h.

◆ BipolarParallelTempering

◆ BipolarPCD

◆ BipolarPTChain

◆ BipolarRBM

typedef RBM<BipolarLayer,BipolarLayer, random::rng_type> shark::BipolarRBM

Definition at line 47 of file BipolarRBM.h.

◆ ClassificationDataset

typedef LabeledData<RealVector, unsigned int> shark::ClassificationDataset

specialized template for classification with unsigned int labels

Definition at line 745 of file Dataset.h.
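
A small, hedged sketch of building such a dataset in memory with createLabeledDataFromRange (listed in the function cross-references below); the toy values are made up:

#include <shark/Data/Dataset.h>
#include <vector>
#include <iostream>

using namespace shark;

int main(){
    std::vector<RealVector> inputs(3, RealVector(2));
    inputs[0](0) = 0.0; inputs[0](1) = 0.0;
    inputs[1](0) = 1.0; inputs[1](1) = 0.0;
    inputs[2](0) = 0.0; inputs[2](1) = 1.0;
    std::vector<unsigned int> labels = {0, 1, 1};

    ClassificationDataset data = createLabeledDataFromRange(inputs, labels);
    std::cout << data.numberOfElements() << " labeled elements" << std::endl;
}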

◆ CompressedARDKernel

typedef ARDKernelUnconstrained<CompressedRealVector> shark::CompressedARDKernel

Definition at line 263 of file ArdKernel.h.

◆ CompressedClassificationDataset

typedef LabeledData<CompressedRealVector, unsigned int> shark::CompressedClassificationDataset

specialized template for classification with unsigned int labels and sparse data

Definition at line 751 of file Dataset.h.

◆ CompressedLinearKernel

typedef LinearKernel<CompressedRealVector> shark::CompressedLinearKernel

Definition at line 133 of file LinearKernel.h.

◆ CompressedModelKernel

typedef ModelKernel<CompressedRealVector> shark::CompressedModelKernel

Definition at line 275 of file ModelKernel.h.

◆ CompressedMonomialKernel

typedef MonomialKernel<CompressedRealVector> shark::CompressedMonomialKernel

Definition at line 193 of file MonomialKernel.h.

◆ CompressedNormalizedKernel

typedef NormalizedKernel<CompressedRealVector> shark::CompressedNormalizedKernel

Definition at line 277 of file NormalizedKernel.h.

◆ CompressedPolynomialKernel

typedef PolynomialKernel<CompressedRealVector> shark::CompressedPolynomialKernel

Definition at line 303 of file PolynomialKernel.h.

◆ CompressedRbfKernel

typedef GaussianRbfKernel<CompressedRealVector> shark::CompressedRbfKernel

Definition at line 279 of file GaussianRbfKernel.h.

◆ CompressedScaledKernel

typedef ScaledKernel<CompressedRealVector> shark::CompressedScaledKernel

Definition at line 160 of file ScaledKernel.h.

◆ CompressedWeightedSumKernel

Definition at line 399 of file WeightedSumKernel.h.

◆ CompressesSubrangeKernel

typedef SubrangeKernel<CompressedRealVector> shark::CompressesSubrangeKernel

Definition at line 208 of file SubrangeKernel.h.

◆ CrowdingRealCodedNSGAII

◆ DenseARDKernel

Definition at line 262 of file ArdKernel.h.

◆ DenseLinearKernel

Definition at line 132 of file LinearKernel.h.

◆ DenseModelKernel

typedef ModelKernel<RealVector> shark::DenseModelKernel

Definition at line 274 of file ModelKernel.h.

◆ DenseMonomialKernel

Definition at line 192 of file MonomialKernel.h.

◆ DenseNormalizedKernel

◆ DensePolynomialKernel

◆ DenseRbfKernel

Definition at line 278 of file GaussianRbfKernel.h.

◆ DenseScaledKernel

Definition at line 159 of file ScaledKernel.h.

◆ DenseSubrangeKernel

Definition at line 207 of file SubrangeKernel.h.

◆ DenseWeightedSumKernel

◆ EpsilonMOCMA

◆ EpsilonSteadyStateMOCMA

◆ EpsRealCodedNSGAII

◆ FrontType

typedef std::vector< shark::RealVector > shark::FrontType

Definition at line 14 of file AdditiveEpsilonIndicatorMain.cpp.

◆ GaussianBinaryCD

◆ GaussianBinaryGibbsChain

◆ GaussianBinaryGibbsOperator

◆ GaussianBinaryParallelTempering

◆ GaussianBinaryPCD

◆ GaussianBinaryPTChain

◆ GaussianBinaryRBM

Definition at line 48 of file GaussianBinaryRBM.h.

◆ InArchive

typedef boost::archive::polymorphic_iarchive shark::InArchive

Type of an archive to read from.

Definition at line 74 of file ISerializable.h.

◆ MOCMA

Definition at line 264 of file MOCMA.h.

◆ MultiObjectiveFunction

typedef AbstractObjectiveFunction< RealVector, RealVector > shark::MultiObjectiveFunction

Definition at line 318 of file AbstractObjectiveFunction.h.

◆ OutArchive

typedef boost::archive::polymorphic_oarchive shark::OutArchive

Type of an archive to write to.

Definition at line 80 of file ISerializable.h.
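
A hedged sketch of saving and reloading a model through these archive types; LinearModel and its constructor arguments are assumptions chosen only to have something to serialize, and the concrete text archives are the TextInArchive/TextOutArchive typedefs documented below:

#include <shark/Core/ISerializable.h>
#include <shark/Models/LinearModel.h>
#include <fstream>

using namespace shark;

int main(){
    LinearModel<RealVector> model(3, 1);    // small model with 3 inputs and 1 output

    {   // write through the polymorphic output archive interface (OutArchive)
        std::ofstream ofs("model.txt");
        TextOutArchive oa(ofs);
        model.write(oa);
    }
    {   // read back through the polymorphic input archive interface (InArchive)
        std::ifstream ifs("model.txt");
        TextInArchive ia(ifs);
        model.read(ia);
    }
}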

◆ PermutationMatrix

typedef blas::permutation_matrix shark::PermutationMatrix

Definition at line 74 of file Base.h.

◆ Preference

typedef std::pair<double, RealVector> shark::Preference

A preferred region in a lattice-sampled unit sphere.

A preferred region is a pair with a radius and a vector. The intersection of the vector and the unit sphere denotes the center of the preferred region, and the radius denotes the radius of a sphere constructed such that the center point (intersection of vector and unit sphere) is on its periphery. Points are then sampled on this smaller sphere and projected up on the unit sphere, thus restricting the covered segment/area/volume/hypervolume of the unit sphere. See Figure 1 in "Evolutionary Many-objective Optimization of Hybrid Electric Vehicle Control: From General Optimization to Preference Articulation"

Definition at line 136 of file Lattice.h.

◆ RealCodedNSGAII

◆ RecreationIndices

typedef std::pair< std::vector<std::size_t> , std::vector<std::size_t> > shark::RecreationIndices

auxiliary typedef for createCVSameSizeBalanced and createCVFullyIndexed; stores the location index in the first vector and the partition index in the second

Definition at line 127 of file CVDatasetTools.h.
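
For context, a hedged sketch of the cross-validation helpers this typedef belongs to; the CVFolds type and its training()/validation() accessors are assumed to follow the usual Shark pattern:

#include <shark/Data/CVDatasetTools.h>
#include <shark/Data/Dataset.h>

using namespace shark;

void crossValidate(ClassificationDataset& data){
    // Split into 5 folds with approximately balanced class proportions.
    CVFolds<ClassificationDataset> folds = createCVSameSizeBalanced(data, 5);

    for(std::size_t i = 0; i != folds.size(); ++i){
        ClassificationDataset training   = folds.training(i);    // everything except fold i
        ClassificationDataset validation = folds.validation(i);  // fold i only
        // ... train on `training`, evaluate on `validation` ...
        (void)training; (void)validation;
    }
}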

◆ RegressionDataset

typedef LabeledData<RealVector, RealVector> shark::RegressionDataset

specialized template for regression with RealVector labels

Definition at line 748 of file Dataset.h.

◆ Rprop93

Used to connect the class names with the year of publication of the paper in which the algorithm was introduced.

Definition at line 539 of file Rprop.h.

◆ Rprop94

Used to connect the class names with the year of publication of the paper in which the algorithm was introduced.

Definition at line 543 of file Rprop.h.

◆ Rprop99

Used to connect the class names with the year of publication of the paper in which the algorithm was introduced.

Definition at line 531 of file Rprop.h.

◆ Rprop99d

Used to connect the class names with the year of publication of the paper in which the algorithm was introduced.

Definition at line 535 of file Rprop.h.

◆ Sequence

typedef std::deque<RealVector> shark::Sequence

Type of Data sequences.

Definition at line 77 of file Base.h.

◆ SingleObjectiveFunction

Definition at line 317 of file AbstractObjectiveFunction.h.

◆ SteadyStateMOCMA

◆ SymmetricBinarySpace

Definition at line 83 of file TwoStateSpace.h.

◆ TextInArchive

typedef boost::archive::polymorphic_text_iarchive shark::TextInArchive

Definition at line 75 of file ISerializable.h.

◆ TextOutArchive

typedef boost::archive::polymorphic_text_oarchive shark::TextOutArchive

Definition at line 81 of file ISerializable.h.

◆ TruncExpBinaryCD

◆ TruncExpBinaryGibbsChain

◆ TruncExpBinaryGibbsOperator

◆ TruncExpBinaryParallelTempering

◆ TruncExpBinaryPCD

◆ TruncExpBinaryPTChain

◆ TruncExpBinaryRBM

typedef RBM<TruncExpBinaryEnergy, random::rng_type> shark::TruncExpBinaryRBM

Definition at line 47 of file TruncExpBinaryRBM.h.

Enumeration Type Documentation

◆ AlphaStatus

Enumerator
AlphaFree 
AlphaLowerBound 
AlphaUpperBound 
AlphaDeactivated 

Definition at line 391 of file QpSolver.h.

◆ BuildType

Models the build type.

Enumerator
RELEASE_BUILD_TYPE 

A release build.

DEBUG_BUILD_TYPE 

A debug build.

Definition at line 58 of file Shark.h.

◆ Convolution

enum class shark::Convolution
Enumerator
Valid 
ZeroPad 

Definition at line 42 of file ConvolutionalModel.h.

◆ DominanceRelation

Result of comparing two objective vectors w.r.t. Pareto dominance.

Enumerator
INCOMPARABLE 
LHS_DOMINATES_RHS 
RHS_DOMINATES_LHS 
EQUIVALENT 

Definition at line 42 of file ParetoDominance.h.

◆ McSvm

enum class shark::McSvm
Enumerator
WW 
CS 
LLW 
ATM 
ATS 
ADM 
OVA 
MMR 
ReinforcedSvm 

Definition at line 27 of file CSvmTrainer.h.

◆ PartitionEstimationAlgorithm

Enumerator
AIS 
AISMean 
TwoSidedAISMean 
AcceptanceRatio 
AcceptanceRatioMean 

Definition at line 111 of file analytics.h.

◆ QpStopType

Reason for the quadratic programming solver to stop the iterative optimization process.

Enumerator
QpNone 
QpAccuracyReached 
QpMaxIterationsReached 
QpTimeout 

Definition at line 63 of file QuadraticProgram.h.

Function Documentation

◆ annealedImportanceSampling() [1/2]

template<class RBMType >
double shark::annealedImportanceSampling(RBMType & rbm, RealVector const & beta, std::size_t samples)

Definition at line 171 of file analytics.h.

Referenced by annealedImportanceSampling().

◆ annealedImportanceSampling() [2/2]

template<class RBMType >
double shark::annealedImportanceSampling(RBMType & rbm, std::size_t chains, std::size_t samples)

Definition at line 196 of file analytics.h.

References annealedImportanceSampling(), and beta.
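
A hedged sketch of calling the second overload on a small BinaryRBM; the RBM setup calls, the header paths, and the interpretation of the return value as a (log-)partition estimate are assumptions:

#include <shark/Unsupervised/RBM/BinaryRBM.h>
#include <shark/Unsupervised/RBM/analytics.h>
#include <iostream>

using namespace shark;

int main(){
    // Small RBM; in practice the parameters would come from training.
    BinaryRBM rbm(random::globalRng);
    rbm.setStructure(8, 4);                       // 8 visible units, 4 hidden units
    RealVector params(rbm.numberOfParameters());
    params.clear();
    rbm.setParameterVector(params);

    // Second overload: number of tempering chains and number of AIS samples.
    double estimate = annealedImportanceSampling(rbm, 10, 100);
    std::cout << estimate << std::endl;
}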

◆ approximateKernelExpansion()

SHARK_EXPORT_SYMBOL KernelExpansion<RealVector> shark::approximateKernelExpansion(random::rng_type & rng, KernelExpansion<RealVector> const & model, std::size_t k, double precision = 1.e-8)

Approximates a kernel expansion by a smaller one using an optimized basis.

Often, a kernel expansion can be represented much more compactly when the points defining the basis of the kernel expansion are not fixed. The resulting kernel expansion can be evaluated much more quickly than the original when k is small compared to the number of nonzero elements in the weight vector of the supplied kernel expansion.

Given a kernel expansion with weight matrix alpha and basis B of size m, the function finds a new weight matrix beta and basis Z with k vectors so that the difference between the resulting decision vectors is small in the RKHS defined by the supplied kernel.

The algorithm proceeds by first performing a kMeans clustering as a good initialization. This initial guess is then optimized by finding the closest weight vector to the original vector representable by the basis. Using this estimate, the basis can then be optimized.

The supplied kernel must be differentiable w.r.t. its input parameters, which excludes all kernels not defined on RealVector.

The algorithm is O(k^3 + km) per iteration.

Parameters
rng: the Rng used for the kMeans clustering
model: the kernel expansion to approximate
k: the number of basis vectors to be used by the approximation
precision: target precision of the gradient to be reached during optimization
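
A hedged sketch of calling this function on an already trained kernel expansion; the choice of k = 20 is arbitrary, the header paths are assumptions, and the header that declares approximateKernelExpansion is not named on this page, so its include is only indicated by a comment:

#include <shark/Models/Kernels/KernelExpansion.h>
#include <shark/Core/Random.h>
// ... plus the header declaring approximateKernelExpansion (path not shown on this page)

using namespace shark;

KernelExpansion<RealVector> compress(KernelExpansion<RealVector> const& trained){
    // Approximate the trained expansion by one with only 20 optimized basis vectors.
    return approximateKernelExpansion(random::globalRng, trained, 20, 1.e-8);
}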

◆ batchBegin() [1/2]

template<class BatchT >
auto shark::batchBegin ( BatchT &  batch) -> decltype(BatchTraits<BatchT>::type::begin(batch))

Definition at line 420 of file BatchInterface.h.

Referenced by transform().

◆ batchBegin() [2/2]

template<class BatchT >
auto shark::batchBegin ( BatchT const &  batch) -> decltype(BatchTraits<BatchT>::type::begin(batch))

Definition at line 425 of file BatchInterface.h.

◆ batchEnd() [1/2]

template<class BatchT >
auto shark::batchEnd ( BatchT &  batch) -> decltype(BatchTraits<BatchT>::type::end(batch))

◆ batchEnd() [2/2]

template<class BatchT >
auto shark::batchEnd ( BatchT const &  batch) -> decltype(BatchTraits<BatchT>::type::end(batch))

Definition at line 435 of file BatchInterface.h.

◆ batchSize()

template<class BatchT >
std::size_t shark::batchSize ( BatchT const &  batch)

Definition at line 415 of file BatchInterface.h.

Referenced by calculateKernelMatrixParameterDerivative(), calculateMixedKernelMatrix(), shark::KHCTree< Container, CuttingAccuracy >::calculateNormal(), calculateRegularizedKernelMatrix(), createCVFullyIndexed(), createCVIID(), createCVIndexed(), createCVSameSize(), createCVSameSizeBalanced(), createLabeledDataFromRange(), shark::GibbsOperator< RBMType >::createSample(), createUnlabeledDataFromRange(), shark::DataView< shark::Data< LabelType > const >::DataView(), downloadFromMLData(), downloadSparseData(), shark::Energy< RBM >::energy(), shark::Energy< RBM >::energyFromHiddenInput(), shark::Energy< RBM >::energyFromVisibleInput(), shark::GaussianLayer::energyTerm(), shark::MissingFeaturesKernelExpansion< InputType >::eval(), shark::PointSetKernel< InputType >::eval(), shark::Classifier< KernelExpansion< InputType > >::eval(), shark::DiscreteKernel::eval(), shark::NormalizedKernel< InputType >::eval(), shark::ProductKernel< MultiTaskSample< InputTypeT > >::eval(), shark::OneVersusOneClassifier< InputType, VectorType >::eval(), shark::WeightedSumKernel< InputType >::eval(), shark::KernelExpansion< InputType >::eval(), shark::KernelTargetAlignment< InputType, LabelType >::evalDerivative(), shark::TruncatedExponentialLayer::expectedPhiValue(), exportSparseData(), shark::AbstractKernelFunction< MultiTaskSample< InputTypeT > >::featureDistanceSqr(), shark::ExactGradient< RBMType >::getLogPartition(), shark::SimpleNearestNeighbors< InputType, LabelType >::getNeighbors(), shark::TreeNearestNeighbors< InputType, LabelType >::getNeighbors(), shark::HierarchicalClustering< InputT >::hardMembership(), shark::AbstractClustering< RealVector >::hardMembership(), import_libsvm(), shark::Energy< RBM >::logUnnormalizedProbabilityHidden(), shark::Energy< RBM >::logUnnormalizedProbabilityVisible(), main(), repartitionByClass(), shark::MultiChainApproximator< MarkovChainType >::setBatchSize(), splitAtElement(), toDataset(), shark::ScaledKernel< InputType >::weightedInputDerivative(), shark::NormalizedKernel< InputType >::weightedInputDerivative(), shark::WeightedSumKernel< InputType >::weightedInputDerivative(), shark::PointSetKernel< InputType >::weightedParameterDerivative(), and shark::NormalizedKernel< InputType >::weightedParameterDerivative().

◆ bootstrap() [1/2]

template<class InputType , class LabelType >
WeightedLabeledData< InputType, LabelType> shark::bootstrap ( LabeledData< InputType, LabelType > const &  dataset,
std::size_t  bootStrapSize = 0 
)

Creates a bootstrap partition of a labeled dataset and returns it using weighting.

Bootstrapping resamples the dataset by drawing a set of points with replacement. Thus the sampled set will contain some points multiple times and some points not at all. Bootstrapping is useful to obtain unbiased measurements of the mean and variance of an estimator.

Optionally the size of the bootstrap (that is, the number of sampled points) can be set. By default it is 0, which indicates that it is the same size as the original dataset.

Definition at line 801 of file WeightedDataset.h.

References shark::random::discrete(), shark::random::globalRng, shark::LabeledData< InputT, LabelT >::inputShape(), shark::WeightedLabeledData< InputT, LabelT >::inputShape(), shark::LabeledData< InputT, LabelT >::labelShape(), shark::WeightedLabeledData< InputT, LabelT >::labelShape(), and shark::LabeledData< InputT, LabelT >::numberOfElements().
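
As an illustration (not part of the reference; the dataset type and names are assumptions), a single bagging-style resampling step might look like this:

    #include <shark/Data/Dataset.h>
    #include <shark/Data/WeightedDataset.h>

    using namespace shark;

    void resampleOnce(LabeledData<RealVector, unsigned int> const& training){
        // draw as many points as the original dataset contains, with replacement;
        // points drawn multiple times receive proportionally larger weights
        WeightedLabeledData<RealVector, unsigned int> sample = bootstrap(training);
        // a weighted trainer can now be run on `sample`
        double totalWeight = sumOfWeights(sample);
        (void) totalWeight;
    }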

◆ bootstrap() [2/2]

template<class InputType >
WeightedUnlabeledData<InputType> shark::bootstrap ( UnlabeledData< InputType > const &  dataset,
std::size_t  bootStrapSize = 0 
)

Creates a bootstrap partition of an unlabeled dataset and returns it using weighting.

Bootstrapping resamples the dataset by drawing a set of points with replacement. Thus the sampled set will contain some points multiple times and some points not at all. Bootstrapping is useful to obtain unbiased measurements of the mean and variance of an estimator.

Optionally the size of the bootstrap (that is, the number of sampled points) can be set. By default it is 0, which indicates that it is the same size as the original dataset.

Definition at line 829 of file WeightedDataset.h.

References shark::random::discrete(), shark::random::globalRng, shark::Data< Type >::numberOfElements(), shark::Data< Type >::shape(), and shark::WeightedUnlabeledData< DataT >::shape().

◆ calculateKernelMatrixParameterDerivative()

template<class InputType , class WeightMatrix >
RealVector shark::calculateKernelMatrixParameterDerivative ( AbstractKernelFunction< InputType > const &  kernel,
Data< InputType > const &  dataset,
WeightMatrix const &  weights 
)

Efficiently calculates the weighted derivative of a Kernel Gram Matrix w.r.t the Kernel Parameters.

The formula is \( \sum_i \sum_j w_{ij} k(x_i,x_j)\), where w_ij are the weights of the gradient, x_i, x_j are the data points defining the Gram matrix, and k is the kernel; the function returns the derivative of this sum w.r.t. the kernel parameters. For efficiency it is assumed that w_ij = w_ji. This method is only useful when the whole kernel Gram matrix needs to be computed to obtain the weights w_ij and computing only smaller blocks is not sufficient.

Parameters
kernel: the kernel for which to calculate the kernel Gram matrix
dataset: the set of points used in the Gram matrix
weights: the weights of the derivative; they must be symmetric!
Returns
the weighted derivative w.r.t. the parameters.

Definition at line 181 of file KernelHelpers.h.

References shark::Data< Type >::batch(), batchSize(), shark::AbstractKernelFunction< InputTypeT >::createState(), shark::AbstractKernelFunction< InputTypeT >::eval(), shark::Data< Type >::numberOfBatches(), shark::IParameterizable< VectorType >::numberOfParameters(), and shark::AbstractKernelFunction< InputTypeT >::weightedParameterDerivative().

Referenced by shark::RadiusMarginQuotient< InputType, CacheType >::evalDerivative().

◆ calculateMixedKernelMatrix() [1/2]

template<class InputType , class M , class Device >
void shark::calculateMixedKernelMatrix ( AbstractKernelFunction< InputType >const &  kernel,
Data< InputType > const &  dataset1,
Data< InputType > const &  dataset2,
blas::matrix_expression< M, Device > &  matrix 
)

Calculates the kernel gram matrix between two data sets.

Parameters
kernel: the kernel for which to calculate the kernel gram matrix
dataset1: the set of points corresponding to rows of the Gram matrix
dataset2: the set of points corresponding to columns of the Gram matrix
matrix: the target kernel matrix

Definition at line 95 of file KernelHelpers.h.

References shark::Data< Type >::batch(), batchSize(), shark::Data< Type >::numberOfBatches(), shark::Data< Type >::numberOfElements(), SHARK_PARALLEL_FOR, and SIZE_CHECK.

Referenced by calculateMixedKernelMatrix().

◆ calculateMixedKernelMatrix() [2/2]

template<class InputType >
RealMatrix shark::calculateMixedKernelMatrix ( AbstractKernelFunction< InputType >const &  kernel,
Data< InputType > const &  dataset1,
Data< InputType > const &  dataset2 
)

Calculates the kernel gram matrix between two data sets.

Parameters
kernel: the kernel for which to calculate the kernel gram matrix
dataset1: the set of points corresponding to rows of the Gram matrix
dataset2: the set of points corresponding to columns of the Gram matrix
Returns
the kernel Gram matrix between the two datasets

Definition at line 159 of file KernelHelpers.h.

References calculateMixedKernelMatrix().

◆ calculateRegularizedKernelMatrix() [1/2]

template<class InputType , class M , class Device >
void shark::calculateRegularizedKernelMatrix ( AbstractKernelFunction< InputType >const &  kernel,
Data< InputType > const &  dataset,
blas::matrix_expression< M, Device > &  matrix,
double  regularizer = 0 
)

Calculates the regularized kernel gram matrix of the points stored inside a dataset.

Regularization is applied by adding the regularizer on the diagonal.

Parameters
kernel: the kernel for which to calculate the kernel gram matrix
dataset: the set of points used in the gram matrix
matrix: the target kernel matrix
regularizer: the regularizer added to the diagonal, which is always >= 0 (default is 0)

Definition at line 52 of file KernelHelpers.h.

References shark::Data< Type >::batch(), batchSize(), shark::Data< Type >::numberOfBatches(), shark::Data< Type >::numberOfElements(), SHARK_PARALLEL_FOR, SHARK_RUNTIME_CHECK, and SIZE_CHECK.

Referenced by calculateRegularizedKernelMatrix(), shark::KernelMatrix< InputType, CacheType >::matrix(), and shark::RegularizationNetworkTrainer< InputType >::train().

◆ calculateRegularizedKernelMatrix() [2/2]

template<class InputType >
RealMatrix shark::calculateRegularizedKernelMatrix ( AbstractKernelFunction< InputType >const &  kernel,
Data< InputType > const &  dataset,
double  regularizer = 0 
)

Calculates the regularized kernel gram matrix of the points stored inside a dataset.

Regularization is applied by adding the regularizer on the diagonal.

Parameters
kernel: the kernel for which to calculate the kernel gram matrix
dataset: the set of points used in the gram matrix
regularizer: the regularizer added to the diagonal, which is always >= 0 (default is 0)
Returns
the kernel gram matrix

Definition at line 141 of file KernelHelpers.h.

References calculateRegularizedKernelMatrix(), and SHARK_RUNTIME_CHECK.
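
A minimal sketch of how the convenience overload might be used; the kernel choice, its parameters and the include paths are assumptions, not part of the reference.

    #include <shark/Data/Dataset.h>
    #include <shark/Models/Kernels/GaussianRbfKernel.h>
    #include <shark/Models/Kernels/KernelHelpers.h>

    using namespace shark;

    // Dense Gram matrix of `points` under a Gaussian kernel, with
    // `lambda` added on the diagonal (K + lambda * I).
    RealMatrix regularizedGram(Data<RealVector> const& points, double gamma, double lambda){
        GaussianRbfKernel<RealVector> kernel(gamma);
        return calculateRegularizedKernelMatrix(kernel, points, lambda);
    }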

◆ classSizes() [1/2]

std::vector<std::size_t> shark::classSizes ( WeightedUnlabeledData< unsigned int > const &  labels)
inline

Returns the number of members of each class in the dataset.

Definition at line 701 of file WeightedDataset.h.

References classSizes().

◆ classSizes() [2/2]

template<class InputType , class LabelType >
std::vector<std::size_t> shark::classSizes ( WeightedLabeledData< InputType, LabelType > const &  dataset)
inline

Returns the number of members of each class in the dataset.

Definition at line 730 of file WeightedDataset.h.

References classSizes(), and shark::WeightedLabeledData< InputT, LabelT >::labels().

◆ classWeight()

template<class InputType >
RealVector shark::classWeight ( WeightedLabeledData< InputType, unsigned int > const &  dataset)

Computes the cumulative weight of every class.

Definition at line 755 of file WeightedDataset.h.

References numberOfClasses().

Referenced by shark::KernelMeanClassifier< InputType >::train().
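
For illustration (not part of the reference; the names are invented), the class balance of a weighted classification dataset could be inspected like this:

    #include <shark/Data/WeightedDataset.h>
    #include <vector>

    using namespace shark;

    void inspectBalance(WeightedLabeledData<RealVector, unsigned int> const& data){
        std::vector<std::size_t> counts = classSizes(data);  // number of members per class
        RealVector weights = classWeight(data);              // cumulative weight per class
        // weights(c) / sumOfWeights(data) gives the weighted fraction of class c
        (void) counts; (void) weights;
    }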

◆ computeClosestNeighbourIndicesOnLattice()

template<typename Matrix >
UIntMatrix shark::computeClosestNeighbourIndicesOnLattice ( Matrix const &  m,
std::size_t const  n 
)

Definition at line 175 of file Lattice.h.

◆ computeOptimalLatticeTicks()

std::size_t shark::computeOptimalLatticeTicks ( std::size_t const  n,
std::size_t const  target_count 
)

Computes the number of Ticks for a grid of a certain size.

Computes the least number of ticks in each dimension required for an n-dimensional simplex grid to contain at least as many points as a target number of points. For example, the points of a two-dimensional grid (a line) with size n are the points (0,n-1), (1,n-2), ..., (n-1,0).

Referenced by shark::NSGA3Indicator::init().

◆ covariance()

template<class VectorType >
blas::matrix<typename VectorType::value_type> shark::covariance ( Data< VectorType > const &  data)

Calculates the covariance matrix of the data vectors.

Referenced by createData(), mean(), shark::NormalDistributedPoints::NormalDistributedPoints(), and shark::NormalizeComponentsZCA::train().

◆ createBatch() [1/3]

◆ createBatch() [2/3]

template<class Range >
Batch<typename Range::value_type>::type shark::createBatch ( Range const &  range)

creates a batch from a range of inputs

Definition at line 395 of file BatchInterface.h.

◆ createBatch() [3/3]

template<class T , class Iterator >
Batch<T>::type shark::createBatch ( Iterator const &  begin,
Iterator const &  end 
)

Definition at line 400 of file BatchInterface.h.

◆ createLabeledDataFromRange()

template<class InputRange , class LabelRange , class WeightRange >
boost::disable_if< boost::is_arithmetic<WeightRange>, WeightedLabeledData< typename boost::range_value<InputRange>::type, typename boost::range_value<LabelRange>::type >>::type shark::createLabeledDataFromRange ( InputRange const &  inputs,
LabelRange const &  labels,
WeightRange const &  weights,
std::size_t  batchSize = 0 
)

creates a weighted labeled data object from three ranges, representing inputs, labels and weights

Definition at line 773 of file WeightedDataset.h.

References batchSize(), createDataFromRange(), createLabeledDataFromRange(), and SHARK_RUNTIME_CHECK.
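
A short sketch of the intended call (illustrative only; the container types and names are assumptions):

    #include <shark/Data/WeightedDataset.h>
    #include <vector>

    using namespace shark;

    WeightedLabeledData<RealVector, unsigned int> makeWeighted(
        std::vector<RealVector> const& inputs,
        std::vector<unsigned int> const& labels,
        std::vector<double> const& weights   // one weight per point; same length as inputs
    ){
        // all three ranges must have the same length
        return createLabeledDataFromRange(inputs, labels, weights);
    }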

◆ createUnlabeledDataFromRange()

template<class DataRange , class WeightRange >
boost::disable_if< boost::is_arithmetic<WeightRange>, WeightedUnlabeledData< typename boost::range_value<DataRange>::type > >::type shark::createUnlabeledDataFromRange ( DataRange const &  data,
WeightRange const &  weights,
std::size_t  batchSize = 0 
)

creates a weighted unlabeled data object from two ranges, representing data and weights

Definition at line 544 of file WeightedDataset.h.

References batchSize(), createDataFromRange(), createUnlabeledDataFromRange(), and SHARK_RUNTIME_CHECK.

◆ dataDimension() [1/2]

template<class DatasetType >
std::size_t shark::dataDimension ( DataView< DatasetType > const &  view)

Return the dimensionality of the dataset represented by the view.

Definition at line 344 of file DataView.h.

References dataDimension(), and shark::DataView< DatasetType >::dataset().

◆ dataDimension() [2/2]

template<class InputType >
std::size_t shark::dataDimension ( WeightedUnlabeledData< InputType > const &  dataset)

Return the dimensionality of points of a weighted dataset.

Definition at line 707 of file WeightedDataset.h.

References dataDimension().

◆ dcNonDominatedSort()

template<class PointRange , class RankRange >
void shark::dcNonDominatedSort ( PointRange const &  points,
RankRange &  ranks 
)

Definition at line 466 of file DCNonDominatedSort.h.

Referenced by nonDominatedSort().

◆ dominance()

template<class VectorTypeA , class VectorTypeB >
DominanceRelation shark::dominance ( VectorTypeA const &  lhs,
VectorTypeB const &  rhs 
)
inline

Pareto dominance relation for two objective vectors.

Definition at line 52 of file ParetoDominance.h.

References EQUIVALENT, INCOMPARABLE, LHS_DOMINATES_RHS, RHS_DOMINATES_LHS, and SHARK_ASSERT.

Referenced by fastNonDominatedSort(), and shark::BaseDCNonDominatedSort::operator()().
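
As a small illustration (not from the reference; the include path is indicative and may differ between Shark versions):

    #include <shark/Algorithms/DirectSearch/Operators/Domination/ParetoDominance.h>
    #include <shark/LinAlg/Base.h>

    using namespace shark;

    // true iff `a` Pareto-dominates `b`, i.e. a is no worse in every
    // objective and strictly better in at least one
    bool strictlyBetter(RealVector const& a, RealVector const& b){
        return dominance(a, b) == LHS_DOMINATES_RHS;
    }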

◆ estimateLogFreeEnergy() [1/2]

template<class RBMType >
double shark::estimateLogFreeEnergy ( RBMType &  rbm,
Data< RealVector > const &  initDataset,
RealVector const &  beta,
std::size_t  samples,
PartitionEstimationAlgorithm  algorithm = AcceptanceRatioMean,
float  burnInPercentage = 0.1 
)

Definition at line 154 of file analytics.h.

References estimateLogFreeEnergyFromEnergySamples().

Referenced by estimateLogFreeEnergy().

◆ estimateLogFreeEnergy() [2/2]

template<class RBMType >
double shark::estimateLogFreeEnergy ( RBMType &  rbm,
Data< RealVector > const &  initDataset,
std::size_t  chains,
std::size_t  samples,
PartitionEstimationAlgorithm  algorithm = AIS,
float  burnInPercentage = 0.1 
)

Definition at line 182 of file analytics.h.

References beta, and estimateLogFreeEnergy().

◆ estimateLogFreeEnergyFromEnergySamples()

double shark::estimateLogFreeEnergyFromEnergySamples ( RealMatrix const &  energyDiffUp,
RealMatrix const &  energyDiffDown,
PartitionEstimationAlgorithm  algorithm = AIS 
)
inline

Definition at line 119 of file analytics.h.

References AIS.

Referenced by estimateLogFreeEnergy().

◆ evalSkipMissingFeatures() [1/2]

template<typename InputType , typename InputTypeT1 , typename InputTypeT2 >
double shark::evalSkipMissingFeatures ( const AbstractKernelFunction< InputType > &  kernelFunction,
const InputTypeT1 &  inputA,
const InputTypeT2 &  inputB 
)

Does a kernel function evaluation with missing features in the inputs.

Parameters
kernelFunction: the kernel function used to do the evaluation
inputA: an input
inputB: another input

The kernel k(x,y) is evaluated taking missing features into account. For this it is checked whether a feature of x or y is NaN; in that case the corresponding features in inputA and inputB are not considered.

Definition at line 58 of file EvalSkipMissingFeatures.h.

References shark::AbstractKernelFunction< InputTypeT >::eval(), SHARK_RUNTIME_CHECK, SIZE_CHECK, and shark::AbstractKernelFunction< InputTypeT >::supportsVariableInputSize().

Referenced by shark::ExampleModifiedKernelMatrix< InputType, CacheType >::entry().

◆ evalSkipMissingFeatures() [2/2]

template<typename InputType , typename InputTypeT1 , typename InputTypeT2 , typename InputTypeT3 >
double shark::evalSkipMissingFeatures ( const AbstractKernelFunction< InputType > &  kernelFunction,
const InputTypeT1 &  inputA,
const InputTypeT2 &  inputB,
InputTypeT3 const &  missingness 
)

Does a kernel function evaluation with missing features in the inputs.

Parameters
kernelFunction: the kernel function used to do the evaluation
inputA: an input
inputB: another input
missingness: used to decide which features in the inputs to take into consideration for the purpose of evaluation. If a feature is NaN, then the corresponding features in inputA and inputB are not considered.

Definition at line 104 of file EvalSkipMissingFeatures.h.

References shark::AbstractKernelFunction< InputTypeT >::eval(), SHARK_RUNTIME_CHECK, SIZE_CHECK, and shark::AbstractKernelFunction< InputTypeT >::supportsVariableInputSize().

◆ exportFiltersToPGMGrid() [1/2]

void shark::exportFiltersToPGMGrid ( std::string const &  basename,
RealMatrix const &  filters,
std::size_t  width,
std::size_t  height 
)
inline

Exports a set of filters as a grid image.

It is assumed that each filter forms a row of the filter matrix. Moreover, the size of the filter images has to be given, and width*height = W.size2() must hold. The filters are printed on a single image as a grid. The grid will be close to square, and the images are separated by a black line one pixel wide. The output will be normalized so that all images are on the same scale.

Parameters
basename: file to write to; ".pgm" is appended to the filename
filters: matrix storing the filters row by row
width: width of the filter images
height: height of the filter images

Definition at line 146 of file Pgm.h.

References SIZE_CHECK.

Referenced by main().

◆ exportFiltersToPGMGrid() [2/2]

void shark::exportFiltersToPGMGrid ( std::string const &  basename,
Data< RealVector > const &  filters,
std::size_t  width,
std::size_t  height 
)
inline

Exports a set of filters as a grid image.

It is assumed that each filter forms a row of the filter matrix. Moreover, the size of the filter images has to be given, and width*height = W.size2() must hold. The filters are printed on a single image as a grid. The grid will be close to square, and the images are separated by a black line one pixel wide. The output will be normalized so that all images are on the same scale.

Parameters
basename: file to write to; ".pgm" is appended to the filename
filters: Data object storing the filters
width: width of the filter images
height: height of the filter images

Definition at line 183 of file Pgm.h.

References dataDimension(), shark::Data< Type >::numberOfElements(), and SIZE_CHECK.

◆ exportPGM()

template<class T >
void shark::exportPGM ( std::string const &  fileName,
T const &  data,
std::size_t  sx,
std::size_t  sy,
bool  normalize = false 
)

Export a PGM image to file.

Parameters
fileName: File to write to
data: Linear object storing image
sx: Width of image
sy: Height of image
normalize: Adjust values to [0,255], default false

Definition at line 118 of file Pgm.h.

References SIZE_CHECK.

Referenced by main().
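
For illustration (not from the reference; the file name and image dimensions are made up):

    #include <shark/Data/Pgm.h>
    #include <shark/LinAlg/Base.h>

    using namespace shark;

    // Writes a single vector as a grayscale PGM image.
    // Assumes patch.size() == 16*16; values are rescaled to [0,255].
    void savePatch(RealVector const& patch){
        exportPGM("patch.pgm", patch, 16, 16, true);
    }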

◆ fastNonDominatedSort()

template<class PointRange , class RankRange >
void shark::fastNonDominatedSort ( PointRange const &  points,
RankRange &  ranks 
)

Implements the well-known non-dominated sorting algorithm.

Assembles subsets/fronts of mutually non-dominating individuals. Afterwards every individual is assigned a rank via ranks[i] = frontNumber. The front of non-dominated points has the value 1.

The algorithm is described in Deb et al., "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II", IEEE Transactions on Evolutionary Computation, 2002.

Parameters
points: [in] population to subdivide into fronts of non-dominated individuals.
ranks: [out] set of integers storing the rank of the i-th point; must have the same size as points.

Definition at line 49 of file FastNonDominatedSort.h.

References dominance(), LHS_DOMINATES_RHS, RHS_DOMINATES_LHS, and SIZE_CHECK.

Referenced by nonDominatedSort().

◆ fusionize() [1/4]

template<class S >
S& shark::fusionize ( detail::FusionFacade< S > &  facade)

Definition at line 147 of file BatchInterfaceAdaptStruct.h.

References S.

◆ fusionize() [2/4]

template<class S >
S const& shark::fusionize ( detail::FusionFacade< S > const &  facade)

Definition at line 151 of file BatchInterfaceAdaptStruct.h.

References S.

◆ fusionize() [3/4]

template<class S >
boost::disable_if<detail::isFusionFacade<S>,S&>::type shark::fusionize ( S facade)

Definition at line 157 of file BatchInterfaceAdaptStruct.h.

References S.

◆ fusionize() [4/4]

template<class S >
boost::disable_if<detail::isFusionFacade<S>,S const& >::type shark::fusionize ( S const &  facade)

Definition at line 162 of file BatchInterfaceAdaptStruct.h.

◆ getBatchElement() [1/2]

◆ getBatchElement() [2/2]

template<class BatchT >
auto shark::getBatchElement ( BatchT const &  batch,
std::size_t  i 
) -> decltype(BatchTraits<BatchT>::type::get(std::declval<BatchT const&>(),i))

Definition at line 410 of file BatchInterface.h.

◆ importHDF5() [1/4]

template<typename VectorType >
void shark::importHDF5 ( Data< VectorType > &  data,
const std::string &  fileName,
const std::string &  datasetName 
)

Import data from an HDF5 file.

Parameters
data: Container storing the loaded data
fileName: The name of the HDF5 file to be read from
datasetName: the HDF5 dataset name to access in the HDF5 file
Template Parameters
VectorType: Type of object stored in the Shark data container

Definition at line 268 of file HDF5.h.

◆ importHDF5() [2/4]

template<typename VectorType , typename LabelType >
void shark::importHDF5 ( LabeledData< VectorType, LabelType > &  labeledData,
const std::string &  fileName,
const std::string &  data,
const std::string &  label 
)

Import data to a LabeledData object from an HDF5 file.

Parameters
labeledData: Container storing the loaded data
fileName: The name of the HDF5 file to be read from
data: the HDF5 dataset name for data
label: the HDF5 dataset name for label
Template Parameters
VectorType: Type of object stored in the Shark data container
LabelType: Type of label

Definition at line 294 of file HDF5.h.

◆ importHDF5() [3/4]

template<typename VectorType >
void shark::importHDF5 ( Data< VectorType > &  data,
const std::string &  fileName,
const std::vector< std::string > &  cscDatasetName 
)

Import data from an HDF5 dataset in compressed sparse column format.

Parameters
data: Container storing the loaded data
fileName: The name of the HDF5 file to be read from
cscDatasetName: the CSC dataset names used to construct a matrix
Template Parameters
VectorType: Type of object stored in the Shark data container

Definition at line 317 of file HDF5.h.

◆ importHDF5() [4/4]

template<typename VectorType , typename LabelType >
void shark::importHDF5 ( LabeledData< VectorType, LabelType > &  labeledData,
const std::string &  fileName,
const std::vector< std::string > &  cscDatasetName,
const std::string &  label 
)

Import data from an HDF5 dataset in compressed sparse column format.

Parameters
labeledData: Container storing the loaded data
fileName: The name of the HDF5 file to be read from
cscDatasetName: the CSC dataset names used to construct a matrix
label: the HDF5 dataset name for label
Template Parameters
VectorType: Type of object stored in the Shark data container
LabelType: Type of label

Definition at line 343 of file HDF5.h.

◆ importPGM()

template<class T >
void shark::importPGM ( std::string const &  fileName,
T &  data,
std::size_t &  sx,
std::size_t &  sy 
)

Import a PGM image from file.

Parameters
fileName: The file to read from
data: Linear object for storing the image
sx: Width of the imported image
sy: Height of the imported image

Definition at line 103 of file Pgm.h.

Referenced by importPGMSet().

◆ importPGMSet()

template<class T >
void shark::importPGMSet ( std::string const &  p,
Data< T > &  set 
)

Import PGM images scanning a directory recursively.

All images are required to have the same size. The shape of the images is stored in set.shape().

Parameters
p: Directory to scan
set: Set storing the images

Definition at line 222 of file Pgm.h.

References createDataFromRange(), importPGM(), and SHARKEXCEPTION.

Referenced by main().

◆ inputDimension() [1/2]

template<class DatasetType >
std::size_t shark::inputDimension ( DataView< DatasetType > const &  view)

Return the input dimensionality of the labeled dataset represented by the view.

Definition at line 333 of file DataView.h.

References shark::DataView< DatasetType >::dataset(), and inputDimension().

◆ inputDimension() [2/2]

template<class InputType , class LabelType >
std::size_t shark::inputDimension ( WeightedLabeledData< InputType, LabelType > const &  dataset)

Return the input dimensionality of a weighted labeled dataset.

Definition at line 713 of file WeightedDataset.h.

References dataDimension(), and shark::WeightedLabeledData< InputT, LabelT >::inputs().

◆ kMeans() [1/3]

SHARK_EXPORT_SYMBOL std::size_t shark::kMeans ( Data< RealVector > const &  data,
std::size_t  k,
Centroids centroids,
std::size_t  maxIterations = 0 
)

The k-means clustering algorithm.

The k-means algorithm takes vector-valued data \( \{x_1, \dots, x_n\} \subset \mathbb R^d \) and splits it into k clusters, based on centroids \( \{c_1, \dots, c_k\} \). The result is stored in a Centroids object that can be used to construct clustering models.
This implementation starts the search with the given centroids, in case the provided centroids object (third parameter) contains a set of k centroids. Otherwise the search starts from the first k data points.
Note that the data set needs to include at least k data points for k-means to work. This is because the current implementation does not allow for empty clusters.
Parameters
data: vector-valued data to be clustered
k: number of clusters
centroids: centroids input/output
maxIterations: maximum number of k-means iterations; 0: unlimited
Returns
number of k-means iterations

Referenced by main().
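
A brief usage sketch of the typical workflow (illustrative only; the choice of k = 5 and the follow-up clustering model are assumptions, not part of this reference):

    #include <shark/Algorithms/KMeans.h>
    #include <shark/Models/Clustering/Centroids.h>
    #include <shark/Models/Clustering/HardClusteringModel.h>

    using namespace shark;

    void clusterExample(Data<RealVector> const& data){
        Centroids centroids;  // empty: the search starts from the first k data points
        std::size_t iterations = kMeans(data, 5, centroids);
        // assign every point to its closest centroid
        HardClusteringModel<RealVector> model(&centroids);
        Data<unsigned int> assignment = model(data);
        (void) iterations; (void) assignment;
    }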

◆ kMeans() [2/3]

SHARK_EXPORT_SYMBOL std::size_t shark::kMeans ( Data< RealVector > const &  data,
RBFLayer model,
std::size_t  maxIterations = 0 
)

The k-means clustering algorithm for initializing an RBF Layer.

The k-means algorithm takes vector-valued data \( \{x_1, \dots, x_n\} \subset \mathbb R^d \) and splits it into k clusters, based on centroids \( \{c_1, \dots, c_k\} \). The result is stored in an RBFLayer object that can be used to construct clustering models.
This is just an alternative frontend to the version using Centroids. It creates a Centroids object with as many clusters as the RBFLayer has outputs and copies the result into the model.
Note that the data set needs to include at least k data points for k-means to work. This is because the current implementation does not allow for empty clusters.
Parameters
data: vector-valued data to be clustered
model: RBFLayer input/output
maxIterations: maximum number of k-means iterations; 0: unlimited
Returns
number of k-means iterations

◆ kMeans() [3/3]

template<class InputType >
KernelExpansion<InputType> shark::kMeans ( Data< InputType > const &  dataset,
std::size_t  k,
AbstractKernelFunction< InputType > &  kernel,
std::size_t  maxIterations = 0 
)

The kernel k-means clustering algorithm.

The kernel k-means algorithm takes data \( \{x_1, \dots, x_n\} \) and splits it into k clusters, based on centroids \( \{c_1, \dots, c_k\} \). The centroids are elements of the reproducing kernel Hilbert space (RKHS) induced by the kernel function. They are functions, represented as the components of a KernelExpansion object. I.e., given a data point x, the kernel expansion returns a k-dimensional vector f(x), which is the evaluation of the centroid functions on x. The value of the centroid function represents the inner product of the centroid with the kernel-induced feature vector of x (embedding of x into the RKHS). The distance of x from the centroid \( c_i \) is computed as the kernel-induced distance \( \sqrt{ kernel(x, x) + kernel(c_i, c_i) - 2 kernel(x, c_i) } \). For the Gaussian kernel (and other normalized kernels) it simplifies to \( \sqrt{ 2 - 2 kernel(x, c_i) } \). Hence, larger function values indicate smaller distance to the centroid.
Note that the data set needs to include at least k data points for k-means to work. This is because the current implementation does not allow for empty clusters.
Parameters
dataset: vector-valued data to be clustered
k: number of clusters
kernel: kernel function object
maxIterations: maximum number of k-means iterations; 0: unlimited
Returns
centroids (represented as functions, see description)

Definition at line 141 of file KMeans.h.
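
A sketch of the kernelized variant (illustrative; the kernel bandwidth and the number of clusters are arbitrary assumptions):

    #include <shark/Algorithms/KMeans.h>
    #include <shark/Models/Kernels/GaussianRbfKernel.h>
    #include <shark/Models/Kernels/KernelExpansion.h>

    using namespace shark;

    void kernelClusterExample(Data<RealVector> const& data){
        GaussianRbfKernel<RealVector> kernel(0.5);  // kernel bandwidth gamma
        // the centroids live in the RKHS and are returned as a KernelExpansion
        KernelExpansion<RealVector> centroids = kMeans(data, 3, kernel);
        // per point, the expansion yields a 3-dimensional vector; the largest
        // component marks the closest centroid (see the description above)
        Data<RealVector> f = centroids(data);
        (void) f;
    }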

◆ labelDimension() [1/2]

template<class DatasetType >
std::size_t shark::labelDimension ( DataView< DatasetType > const &  view)

Return the label dimensionality of the labeled dataset represented by the view.

Definition at line 338 of file DataView.h.

References shark::DataView< DatasetType >::dataset(), and labelDimension().

◆ labelDimension() [2/2]

template<class InputType , class LabelType >
std::size_t shark::labelDimension ( WeightedLabeledData< InputType, LabelType > const &  dataset)

Return the label/output dimensionality of a labeled dataset.

Definition at line 719 of file WeightedDataset.h.

References dataDimension(), and shark::WeightedLabeledData< InputT, LabelT >::labels().

◆ logPartitionFunction()

template<class RBMType >
double shark::logPartitionFunction ( RBMType const &  rbm,
double  beta = 1.0 
)

Calculates the value of the partition function \( Z \).

Only useful for small input and theoretical analysis.

Parameters
rbm: the RBM for which to calculate the function
beta: the inverse temperature of the RBM. default is 1
Returns
the value of the partition function \( Z \cdot e^{-\mathrm{constant}} \)

Definition at line 48 of file analytics.h.

References beta.

Referenced by negativeLogLikelihood().

◆ makeKeyValuePair()

template<class Key , class Value >
KeyValuePair<Key,Value> shark::makeKeyValuePair ( Key const &  key,
Value const &  value 
)

◆ makeResultSet()

template<typename T , typename U >
ResultSet<T,U> shark::makeResultSet ( T const &  t,
U const &  u 
)

Generates a typed solution given the search point and the corresponding objective function value.

Parameters
t: [in] The search point.
u: [in] The objective function value.
Returns
A ResultSet containing the supplied search point and objective function value.

Definition at line 71 of file ResultSets.h.

◆ mean() [1/2]

◆ mean() [2/2]

template<class VectorType >
VectorType shark::mean ( UnlabeledData< VectorType > const &  data)

Definition at line 69 of file Statistics.h.

References covariance(), mean(), and variance().

◆ meanvar() [1/2]

template<class Vec1T , class Vec2T , class Vec3T , class Device >
void shark::meanvar ( Data< Vec1T > const &  data,
blas::vector_container< Vec2T, Device > &  mean,
blas::vector_container< Vec3T, Device > &  variance 
)

Calculates the mean and variance values of the input data.

Referenced by main(), shark::NormalizeComponentsZCA::train(), and shark::NormalizeComponentsUnitVariance< DataType >::train().
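
A small sketch of the per-component variant (illustrative; names are invented):

    #include <shark/Data/Dataset.h>
    #include <shark/Data/Statistics.h>

    using namespace shark;

    void summarize(Data<RealVector> const& data){
        RealVector mean, variance;
        meanvar(data, mean, variance);  // per-dimension mean and variance
        // mean(i) and variance(i) now hold the statistics of the i-th input dimension
    }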

◆ meanvar() [2/2]

template<class Vec1T , class Vec2T , class MatT , class Device >
void shark::meanvar ( Data< Vec1T > const &  data,
blas::vector_container< Vec2T, Device > &  mean,
blas::matrix_container< MatT, Device > &  variance 
)

Calculates the mean, variance and covariance values of the input data.

◆ median_element() [1/2]

template<class Range >
boost::range_iterator<Range>::type shark::median_element ( Range &  range)

Returns the iterator to the median element. After this call, the range is partially ordered.

After the call, all elements left of the median element are guaranteed to be <= median and all elements on the right are >= median.

Definition at line 80 of file functional.h.

Referenced by median_element(), and partitionEqually().

◆ median_element() [2/2]

template<class Range >
boost::range_iterator<Range>::type shark::median_element ( Range const &  rangeAdaptor)

Definition at line 91 of file functional.h.

References median_element().

◆ negativeLogLikelihood()

template<class RBMType >
double shark::negativeLogLikelihood ( RBMType const &  rbm,
UnlabeledData< RealVector > const &  inputs,
double  beta = 1.0 
)

Estimates the negative log-likelihood of a set of input vectors under the model's distribution.

Only useful for small input and theoretical analysis.

Parameters
rbm: the Restricted Boltzmann machine for which the negative log likelihood of the data is to be calculated
inputs: the input vectors
beta: the inverse temperature of the RBM. default is 1
Returns
the negative log-likelihood

Definition at line 102 of file analytics.h.

References beta, logPartitionFunction(), and negativeLogLikelihoodFromLogPartition().

Referenced by shark::ExactGradient< RBMType >::eval(), and main().

◆ negativeLogLikelihoodFromLogPartition()

template<class RBMType >
double shark::negativeLogLikelihoodFromLogPartition ( RBMType const &  rbm,
UnlabeledData< RealVector > const &  inputs,
double  logPartition,
double  beta = 1.0 
)

Estimates the negative log-likelihood of a set of input vectors under the model's distribution using the partition function.

Only useful for small input and theoretical analysis.

Parameters
rbm: the Restricted Boltzmann machine for which the negative log likelihood of the data is to be calculated
inputs: the input vectors
logPartition: the logarithmic value of the partition function of the RBM.
beta: the inverse temperature of the RBM. default is 1
Returns
the negative log-likelihood

Definition at line 79 of file analytics.h.

References shark::Data< Type >::batches().

Referenced by negativeLogLikelihood().

◆ nonDominatedSort() [1/2]

template<class PointRange , class RankRange >
void shark::nonDominatedSort ( PointRange const &  points,
RankRange &  ranks 
)

Frontend for non-dominated sorting algorithms.

Assembles subsets/fronts of mutually non-dominated individuals. Afterwards every individual is assigned a rank via ranks[i] = frontIndex. The front of non-dominated points has the value 1.

Depending on the dimensionality m and the number of points n, either the fastNonDominatedSort algorithm with complexity O(n^2 m) or the dcNonDominatedSort algorithm with complexity O(n log(n)^m) is called.

Definition at line 47 of file NonDominatedSort.h.

References dcNonDominatedSort(), fastNonDominatedSort(), and SIZE_CHECK.

Referenced by nonDominatedSort(), and shark::IndicatorBasedSelection< NSGA3Indicator >::operator()().
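
For illustration (not from the reference; the container and rank types are assumptions and the include path is indicative):

    #include <shark/Algorithms/DirectSearch/Operators/Domination/NonDominatedSort.h>
    #include <shark/LinAlg/Base.h>
    #include <vector>

    using namespace shark;

    std::vector<unsigned int> frontRanks(std::vector<RealVector> const& points){
        std::vector<unsigned int> ranks(points.size());
        // rank 1 = non-dominated front, rank 2 = dominated only by front 1, and so on
        nonDominatedSort(points, ranks);
        return ranks;
    }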

◆ nonDominatedSort() [2/2]

template<class PointRange , class RankRange >
void shark::nonDominatedSort ( PointRange const &  points,
RankRange const &  ranks 
)

Definition at line 68 of file NonDominatedSort.h.

References nonDominatedSort(), and ranks().

◆ numberOfClasses() [1/3]

template<class DatasetType >
std::size_t shark::numberOfClasses ( DataView< DatasetType > const &  view)

Return the number of classes (size of the label vector) of a classification dataset with RealVector label encoding.

Definition at line 327 of file DataView.h.

References shark::DataView< DatasetType >::dataset(), and numberOfClasses().

◆ numberOfClasses() [2/3]

std::size_t shark::numberOfClasses ( WeightedUnlabeledData< unsigned int > const &  labels)
inline

Definition at line 696 of file WeightedDataset.h.

References numberOfClasses().

◆ numberOfClasses() [3/3]

template<class InputType >
std::size_t shark::numberOfClasses ( WeightedLabeledData< InputType, unsigned int > const &  dataset)

Return the number of classes (highest label value +1) of a classification dataset with unsigned int label encoding.

Definition at line 724 of file WeightedDataset.h.

References shark::WeightedLabeledData< InputT, LabelT >::labels(), and numberOfClasses().

◆ operator!=()

bool shark::operator!= ( Shape const &  shape1,
Shape const &  shape2 
)
inline

Definition at line 113 of file Shape.h.

◆ operator<<() [1/5]

template<class SearchPoint , class Result >
std::ostream& shark::operator<< ( std::ostream &  out,
ResultSet< SearchPoint, Result > const &  solution 
)

Definition at line 75 of file ResultSets.h.

References shark::ResultSet< SearchPointT, ResultT >::value.

◆ operator<<() [2/5]

template<class E , class T >
std::basic_ostream<E, T>& shark::operator<< ( std::basic_ostream< E, T > &  os,
Shape const &  shape 
)

Definition at line 118 of file Shape.h.

◆ operator<<() [3/5]

template<class SearchPoint >
std::ostream& shark::operator<< ( std::ostream &  out,
ValidatedSingleObjectiveResultSet< SearchPoint > const &  solution 
)

Definition at line 131 of file ResultSets.h.

◆ operator<<() [4/5]

template<class T >
std::ostream& shark::operator<< ( std::ostream &  stream,
const WeightedUnlabeledData< T > &  d 
)

Output stream operator for elements of weighted data.

Definition at line 531 of file WeightedDataset.h.

◆ operator<<() [5/5]

template<class T , class U >
std::ostream& shark::operator<< ( std::ostream &  stream,
const WeightedLabeledData< T, U > &  d 
)

Output stream operator for elements of weighted labeled data.

Definition at line 688 of file WeightedDataset.h.

◆ operator==()

bool shark::operator== ( Shape const &  shape1,
Shape const &  shape2 
)
inline

Definition at line 102 of file Shape.h.

References shark::Shape::size().

◆ operator>>() [1/2]

template<class VectorType >
ConcatenatedModel<VectorType> shark::operator>> ( AbstractModel< VectorType, VectorType, VectorType > &  firstModel,
AbstractModel< VectorType, VectorType, VectorType > &  secondModel 
)

Connects two AbstractModels so that the output of the first model is the input of the second.

Definition at line 308 of file ConcatenatedModel.h.

References shark::ConcatenatedModel< VectorType >::add().
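
A minimal sketch of chaining two models (illustrative; the use of LinearModel and the layer sizes are assumptions, not part of this reference):

    #include <shark/Models/ConcatenatedModel.h>
    #include <shark/Models/LinearModel.h>

    using namespace shark;

    void buildPipeline(){
        LinearModel<RealVector> first(10, 32);   // maps 10 inputs to 32 outputs
        LinearModel<RealVector> second(32, 2);   // maps those 32 values to 2 outputs
        // the output of `first` becomes the input of `second`
        ConcatenatedModel<RealVector> pipeline = first >> second;
        (void) pipeline;
    }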

◆ operator>>() [2/2]

template<class VectorType >
ConcatenatedModel<VectorType> shark::operator>> ( ConcatenatedModel< VectorType > const &  firstModel,
AbstractModel< VectorType, VectorType, VectorType > &  secondModel 
)

◆ partial_shuffle() [1/2]

template<class RandomAccessIterator , class Rng >
void shark::partial_shuffle ( RandomAccessIterator  begin,
RandomAccessIterator  middle,
RandomAccessIterator  end,
Rng &  rng 
)

random_shuffle algorithm which stops after acquiring the random subsequence for [begin,middle)

Definition at line 55 of file functional.h.

References shuffle().

Referenced by partial_shuffle(), randomSubBatch(), and randomSubset().

◆ partial_shuffle() [2/2]

template<class RandomAccessIterator >
void shark::partial_shuffle ( RandomAccessIterator  begin,
RandomAccessIterator  middle,
RandomAccessIterator  end 
)

random_shuffle algorithm which stops after acquiring the random subsequence for [begin,middle)

Definition at line 71 of file functional.h.

References shark::random::globalRng, and partial_shuffle().

◆ partitionEqually() [1/2]

template<class Range >
boost::range_iterator<Range>::type shark::partitionEqually ( Range &  range)

Partitions a range in two parts as equal in size as possible.

The algorithm partitions the range and returns the split point. The elements in the range are ordered such that all elements in [begin,splitpoint) < [splitpoint,end). This partition is done such that the ranges are as equally sized as possible. It is guaranteed that the left range is not empty. However, if the range consists only of equal elements, the return value will be the end iterator, indicating that no split is possible. The whole algorithm runs in linear time by iterating twice over the sequence.

Definition at line 105 of file functional.h.

References median_element().

Referenced by partitionEqually(), and shark::BinaryTree< VectorType >::splitList().
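
For illustration (names are invented; the include path is indicative and may differ between Shark versions):

    #include <shark/Core/utility/functional.h>
    #include <vector>

    void splitExample(std::vector<double>& values){
        // after the call, every element in [begin, split) is smaller than every
        // element in [split, end); the two parts are as equal in size as possible
        auto split = shark::partitionEqually(values);
        if(split == values.end()){
            // all elements are equal: no split point exists
        }
    }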

◆ partitionEqually() [2/2]

template<class Range >
boost::range_iterator<Range>::type shark::partitionEqually ( Range const &  rangeAdaptor)

Partitions a range in two parts as equal in size as possible and returns the result.

This is the version for adapted ranges.

Definition at line 133 of file functional.h.

References partitionEqually().

◆ penalizedFitness()

template<class IndividualRange >
auto shark::penalizedFitness ( IndividualRange &  range) -> decltype( boost::adaptors::transform(range,detail::IndividualPenalizedFitnessFunctor()) )

◆ preferenceAdjustedUnitVectors()

RealMatrix shark::preferenceAdjustedUnitVectors ( std::size_t const  n,
std::size_t const  sum,
std::vector< Preference > const &  preferences 
)

Return a set of evenly spaced n-dimensional points on the unit sphere clustered around the specified preference points.

Referenced by shark::NSGA3Indicator::init().

◆ preferenceAdjustedWeightVectors()

RealMatrix shark::preferenceAdjustedWeightVectors ( std::size_t const  n,
std::size_t const  sum,
std::vector< Preference > const &  preferences 
)

Return a set of evenly spaced n-dimensional points on the "unit simplex" clustered around the specified preference points.

◆ randomSubBatch()

template<class DatasetType >
DataView<DatasetType>::batch_type shark::randomSubBatch ( DataView< DatasetType > const &  view,
std::size_t  size 
)

Creates a random batch of a given size.

Parameters
view: the view from which the batch is to be created
size: the size of the batch

Definition at line 285 of file DataView.h.

References partial_shuffle(), shark::DataView< DatasetType >::size(), and subBatch().

◆ randomSubset()

template<class DatasetType >
DataView<DatasetType> shark::randomSubset ( DataView< DatasetType > const &  view,
std::size_t  size 
)

creates a random subset of a DataView with given size

Parameters
view: the view for which the subset is to be created
size: the size of the subset

Definition at line 257 of file DataView.h.

References partial_shuffle(), shark::DataView< DatasetType >::size(), and subset().

Referenced by main().

◆ ranks()

template<class IndividualRange >
auto shark::ranks ( IndividualRange &  range) -> decltype( boost::adaptors::transform(range,detail::IndividualRankFunctor()) )

◆ read_front()

template<typename Stream >
FrontType shark::read_front ( Stream &  in,
std::size_t  noObjectives,
const std::string &  separator = " ",
std::size_t  headerLines = 0 
)

Definition at line 17 of file AdditiveEpsilonIndicatorMain.cpp.

Referenced by main().

◆ sampleLatticeUniformly()

template<typename Matrix , typename randomType = shark::random::rng_type>
Matrix shark::sampleLatticeUniformly ( randomType &  rng,
Matrix const &  matrix,
std::size_t const  n,
bool const  keep_corners = true 
)

Definition at line 90 of file Lattice.h.

Referenced by shark::NSGA3Indicator::init().

◆ searchPoint()

template<class IndividualRange >
auto shark::searchPoint ( IndividualRange &  range) -> decltype( boost::adaptors::transform(range,detail::IndividualSearchPointFunctor()) )

Definition at line 250 of file Individual.h.

References transform().

Referenced by shark::LineSearch::init(), main(), shark::VDCMA::step(), and shark::LMCMA::step().

◆ SHARK_VECTOR_MATRIX_TYPEDEFS() [1/2]

shark::SHARK_VECTOR_MATRIX_TYPEDEFS ( long  double,
BigReal   
)

◆ SHARK_VECTOR_MATRIX_TYPEDEFS() [2/2]

shark::SHARK_VECTOR_MATRIX_TYPEDEFS ( bool  ,
Bool   
)

◆ shuffle()

template<class Iterator , class Rng >
void shark::shuffle ( Iterator  begin,
Iterator  end,
Rng &  rng 
)

random_shuffle implementation which shuffles the elements of the range [begin,end) using the supplied random number generator

Definition at line 44 of file functional.h.

References shark::random::discrete(), and swap().

Referenced by createCVBatch(), shark::ContrastiveDivergence< Operator >::evalDerivative(), partial_shuffle(), shark::UnlabeledData< InputType >::shuffle(), and shark::LabeledData< InputType, LabelType >::shuffle().

◆ subBatch()

template<class DatasetType , class IndexRange >
DataView<DatasetType>::batch_type shark::subBatch ( DataView< DatasetType > const &  view,
IndexRange const &  indizes 
)

Creates a batch given a set of indices.

Parameters
view: the view from which the batch is to be created
indizes: the set of indices defining the batch

Definition at line 269 of file DataView.h.

References createBatch(), and subset().

Referenced by shark::RFClassifier< LabelType >::computeFeatureImportances(), createCVFullyIndexed(), createCVIndexed(), and randomSubBatch().

◆ subset()

template<class DatasetType , class IndexRange >
DataView<DatasetType> shark::subset ( DataView< DatasetType > const &  view,
IndexRange const &  indizes 
)

creates a subset of a DataView with elements indexed by indices

Parameters
view: the view for which the subset is to be created
indizes: the indices of the elements to be stored in the view

Definition at line 247 of file DataView.h.

Referenced by shark::Data< LabelT >::indexedSubset(), main(), randomSubset(), and subBatch().

◆ sumOfWeights() [1/2]

template<class InputType >
double shark::sumOfWeights ( WeightedUnlabeledData< InputType > const &  dataset)

Returns the total sum of weights.

Definition at line 736 of file WeightedDataset.h.

◆ sumOfWeights() [2/2]

template<class InputType , class LabelType >
double shark::sumOfWeights ( WeightedLabeledData< InputType, LabelType > const &  dataset)

Returns the total sum of weights.

Definition at line 745 of file WeightedDataset.h.

◆ swap() [1/3]

template<class D1 , class W1 , class D2 , class W2 >
void shark::swap ( WeightedDataPair< D1, W1 > &&  p1,
WeightedDataPair< D2, W2 > &&  p2 
)

Definition at line 84 of file WeightedDataset.h.

References swap().

◆ swap() [2/3]

template<class K , class V >
void shark::swap ( KeyValuePair< K, V > &  pair1,
KeyValuePair< K, V > &  pair2 
)

Swaps the contents of two instances of KeyValuePair.

Definition at line 88 of file KeyValuePair.h.

References shark::KeyValuePair< Key, Value >::key, and shark::KeyValuePair< Key, Value >::value.

Referenced by createCVFullyIndexed(), createCVIndexed(), shark::QpMcBoxDecomp< Matrix >::deactivateExample(), shark::QpMcSimplexDecomp< Matrix >::deactivateExample(), shark::QpMcBoxDecomp< Matrix >::deactivateVariable(), shark::QpMcSimplexDecomp< Matrix >::deactivateVariable(), shark::ConcatenatedModel< VectorType >::eval(), shark::MultiChainApproximator< MarkovChainType >::evalDerivative(), shark::ExampleModifiedKernelMatrix< InputType, CacheType >::flipColumnsAndRows(), shark::BlockMatrix2x2< Matrix >::flipColumnsAndRows(), shark::RegularizedKernelMatrix< InputType, CacheType >::flipColumnsAndRows(), shark::GaussianKernelMatrix< T, CacheType >::flipColumnsAndRows(), shark::KernelMatrix< InputType, CacheType >::flipColumnsAndRows(), shark::ModifiedKernelMatrix< InputType, CacheType >::flipColumnsAndRows(), shark::DifferenceKernelMatrix< InputType, CacheType >::flipColumnsAndRows(), shark::CachedMatrix< Matrix >::flipColumnsAndRows(), shark::GeneralQuadraticProblem< MatrixT >::flipCoordinates(), shark::BoxedSVMProblem< MatrixT >::flipCoordinates(), shark::BoxBasedShrinkingStrategy< BoxConstrainedProblem< Problem > >::flipCoordinates(), shark::BoxConstrainedProblem< Problem >::flipCoordinates(), shark::CSVMProblem< MatrixT >::flipCoordinates(), shark::SvmProblem< Problem >::flipCoordinates(), main(), shark::WS2MaximumGradientCriterion::operator()(), shark::SimulatedBinaryCrossover< RealVector >::operator()(), shark::HMGSelectionCriterion::operator()(), repartitionByClass(), shark::MultiChainApproximator< MarkovChainType >::setData(), shark::BoxBasedShrinkingStrategy< BoxConstrainedProblem< Problem > >::shrink(), shuffle(), swap(), shark::LRUCache< QpFloatType >::swapLineIndices(), shark::ConcatenatedModel< VectorType >::weightedDerivatives(), shark::ConcatenatedModel< VectorType >::weightedInputDerivative(), and shark::ConcatenatedModel< VectorType >::weightedParameterDerivative().

◆ swap() [3/3]

template<class D1 , class W1 , class D2 , class W2 >
void shark::swap ( WeightedDataBatch< D1, W1 > &  p1,
WeightedDataBatch< D2, W2 > &  p2 
)

◆ tchebycheffScalarizer()

double shark::tchebycheffScalarizer ( RealVector const &  fitness,
RealVector const &  weights,
RealVector const &  optimalPointFitness 
)

Definition at line 41 of file Tchebycheff.h.

References w.

◆ toDataset()

template<class T >
DataView<T>::dataset_type shark::toDataset ( DataView< T > const &  view,
std::size_t  batchSize = DataView<T>::dataset_type::DefaultBatchSize 
)

Creates a new dataset from a View.

When the elements of a View need to be processed repeatedly, it is often better to convert back to the packed format of a Dataset, since the faster batch processing can then be used.

Parameters
view: the view from which to create the new dataset
batchSize: the size of the batches in the dataset

Definition at line 314 of file DataView.h.

References batchSize(), shark::DataView< DatasetType >::begin(), shark::DataView< DatasetType >::dataset(), shark::DataView< DatasetType >::end(), and shark::DataView< DatasetType >::size().

Referenced by main().
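
A short round-trip sketch combining toView(), randomSubset() and toDataset() (illustrative; the dataset type and subset size are assumptions, and the dataset is assumed to hold at least 100 elements):

    #include <shark/Data/Dataset.h>
    #include <shark/Data/DataView.h>

    using namespace shark;

    void subsampleExample(LabeledData<RealVector, unsigned int>& data){
        auto view = toView(data);               // cheap element-wise access
        auto sample = randomSubset(view, 100);  // 100 randomly chosen elements
        // repack into batches for fast batch-wise processing
        LabeledData<RealVector, unsigned int> smallSet = toDataset(sample);
        (void) smallSet;
    }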

◆ toView()

template<class DatasetType >
DataView<DatasetType> shark::toView ( DatasetType &  set)

Creates a View from a dataset.

This is just a helper function to omit the actual type of the view.

Parameters
set: the dataset from which to create the view

Definition at line 301 of file DataView.h.

◆ unitVectorsOnLattice()

RealMatrix shark::unitVectorsOnLattice ( std::size_t const  n,
std::size_t const  sum 
)

Return a set of evenly spaced n-dimensional points on the unit sphere.

Referenced by shark::NSGA3Indicator::init().

◆ unpenalizedFitness()

template<class IndividualRange >
auto shark::unpenalizedFitness ( IndividualRange &  range) -> decltype( boost::adaptors::transform(range,detail::IndividualUnpenalizedFitnessFunctor()) )

◆ variance()

template<class VectorType >
VectorType shark::variance ( Data< VectorType > const &  data)

◆ weightLattice()

RealMatrix shark::weightLattice ( std::size_t const  n,
std::size_t const  sum 
)

Returns a set of evenly spaced n-dimensional points on the "unit simplex".