Date of Award

5-2019

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Mathematical Sciences

First Advisor

Adrian M. Peter

Second Advisor

Georgios Anagnostopoulos

Third Advisor

Munevver Subasi

Fourth Advisor

Jewgeni Dshalalow

Abstract

The modus operandi for machine learning is to map data to numerical summaries, filter them, and/or subject them to global signature extraction, with the objective of building robust feature vectors that uniquely characterize each observation, and then to proceed with training algorithms that seek to optimally partition the feature space S ⊂ R^n into labeled regions. This holds true even when the original data are functional in nature, i.e., curves or surfaces that inherently vary over a continuum such as time or space; such functional data are routinely reduced to summary statistics, locally sensitive features, and global signatures. This dissertation directly addresses representational issues of functional data for supervised learning. We propose novel frameworks for the joint representation and discrimination of functional data: rather than stripping the data of their functional attributes to build feature vectors, we use basis representations of the data to enhance the inherent discriminative information. We first propose the Classification by Discriminative Interpolation (CDI) framework, wherein functional data in the same class are adaptively reconstructed to be more similar to each other while nearest-neighbor functional data in other classes are simultaneously repelled. We then extend the CDI framework to further leverage the functional characteristics of the data by applying CDI to different feature-function combinations, which we term Classification by Discriminative Interpolation with Features (CDIF). Akin to other recent nearest-neighbor metric learning paradigms, such as stochastic k-neighborhood selection and large margin nearest neighbors, both CDI and CDIF use class-specific representations that gerrymander similar functional data in an appropriate parameter space. In our third proposed supervised learning approach, termed Classification by Discriminative Reconstruction (CDR), time series in the same class are adaptively reconstructed to be more similar to each other, as in CDI, but the optimal Support Vector Machine (SVM) hyperplane separating the basis expansions of the data is sought simultaneously. The CDR methodology explores wavelet and radial basis function representations, which are further pruned through sparsity-inducing norm penalties. Experimental validation on several time series datasets from the UCR Time Series Classification repository establishes the proposed discriminative interpolation frameworks as competitive with, or better than, recent state-of-the-art techniques that continue to rely on the standard feature vector representation.
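To make the attract/repel idea behind CDI concrete, the following minimal sketch (not taken from the dissertation; the names rbf_design, fit_coefficients, and cdi_step, the toy data, and the step sizes are all hypothetical) represents toy time series with a radial basis function expansion and applies one update to the coefficient vectors that pulls same-class representations toward each other while pushing each one away from its nearest other-class neighbor, loosely mirroring the discriminative interpolation described above.

# Illustrative sketch only, assuming a Gaussian RBF basis and a simple
# attract/repel update in coefficient (parameter) space; it is not the
# dissertation's implementation.
import numpy as np

def rbf_design(t, centers, width):
    """Design matrix of Gaussian RBFs evaluated on the time grid t."""
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def fit_coefficients(Y, Phi, ridge=1e-3):
    """Ridge least-squares basis coefficients, one row per series in Y."""
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ Y.T).T

def cdi_step(C, labels, pull=0.1, push=0.05):
    """One attract/repel update on the per-series coefficient vectors."""
    C_new = C.copy()
    for i in range(len(C)):
        same = labels == labels[i]
        same[i] = False
        diff = labels != labels[i]
        if same.any():  # pull toward the same-class mean representation
            C_new[i] += pull * (C[same].mean(axis=0) - C[i])
        if diff.any():  # push away from the nearest other-class neighbor
            d = np.linalg.norm(C[diff] - C[i], axis=1)
            C_new[i] -= push * (C[diff][d.argmin()] - C[i])
    return C_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 128)
    Phi = rbf_design(t, centers=np.linspace(0, 1, 12), width=0.08)
    # Toy two-class data: noisy sine curves vs. noisy cosine curves.
    Y = np.vstack(
        [np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(128) for _ in range(10)]
        + [np.cos(2 * np.pi * t) + 0.2 * rng.standard_normal(128) for _ in range(10)]
    )
    labels = np.array([0] * 10 + [1] * 10)
    C = fit_coefficients(Y, Phi)
    C = cdi_step(C, labels)
    recon = C @ Phi.T  # discriminatively adjusted reconstructions
    print(recon.shape)

In the methods summarized above, updates of this kind are iterated and, in CDR, coupled with an SVM margin objective on the basis coefficients; the sketch shows only a single step on synthetic data.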

Included in

Mathematics Commons
