
Course 1

Dimension reduction techniques: Algorithms and applications

Speaker: Yousef Saad

Abstract
A common tool that is exploited in solving data mining and machine learning problems is that of "dimension reduction". Dimension reduction is based on the precept that the observed data often lies in a noisy version of a low-dimensional subspace and so it is critical to work in this subspace not only to reduce computational cost but also to improve accuracy. The talk will start with an overview of the key concepts and then illustrate dimension reduction methods with applications such as information retrieval, face recognition and matrix completion for recommender systems. One of the main difficulties in many of the methods based on dimension reduction is to find the inherent approximate rank of the data at hand. We will show how a few simple random sampling methods for computing spectral densities and counting eigenvalues can be used for this purpose. Finally, if time permits, we will report on our first experiments in "materials informatics", a methodology which blends data mining and materials science.
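As a small illustration of the eigenvalue-counting idea mentioned in the abstract, the sketch below (our own minimal example, not the speaker's code) estimates how many eigenvalues of A = XᵀX exceed a cut-off τ, without any eigendecomposition, by combining Hutchinson's stochastic trace estimator with a Jackson-damped Chebyshev approximation of the step function. The synthetic data, the threshold τ, the polynomial degree and the number of probe vectors are all illustrative choices.

```python
# Minimal sketch: estimate the approximate rank of a data matrix X by counting
# the eigenvalues of A = X^T X above a threshold tau (illustrative parameters).
import numpy as np

def estimate_eig_count(A, a, b, lmin, lmax, degree=150, n_probes=30, seed=None):
    """Estimate the number of eigenvalues of the symmetric matrix A in [a, b].

    lmin and lmax must bracket the spectrum of A (for A = X^T X one may take
    lmin = 0 and lmax = ||A||_1, a cheap upper bound on the largest eigenvalue).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]

    # Affinely map the spectrum of A into [-1, 1].
    center, half = (lmax + lmin) / 2.0, (lmax - lmin) / 2.0
    at = np.clip((a - center) / half, -1.0, 1.0)
    bt = np.clip((b - center) / half, -1.0, 1.0)

    # Chebyshev coefficients of the indicator function of [at, bt] on [-1, 1] ...
    k = np.arange(1, degree + 1)
    coeff = np.empty(degree + 1)
    coeff[0] = (np.arccos(at) - np.arccos(bt)) / np.pi
    coeff[1:] = (2.0 / (k * np.pi)) * (np.sin(k * np.arccos(at))
                                       - np.sin(k * np.arccos(bt)))

    # ... with Jackson damping to suppress Gibbs oscillations near the jumps.
    j = np.arange(degree + 1)
    alpha = np.pi / (degree + 2)
    coeff *= ((degree + 2 - j) * np.cos(j * alpha)
              + np.sin(j * alpha) / np.tan(alpha)) / (degree + 2)

    def apply_scaled(V):                  # (A - center*I) V / half
        return (A @ V - center * V) / half

    # Hutchinson probes: count = trace(p(A)) ~ mean over probes of v^T p(A) v.
    V = rng.choice([-1.0, 1.0], size=(n, n_probes))     # Rademacher vectors
    T_prev, T_cur = V, apply_scaled(V)                   # T_0(A~)V, T_1(A~)V
    acc = coeff[0] * T_prev + coeff[1] * T_cur
    for m in range(2, degree + 1):                       # three-term recurrence
        T_prev, T_cur = T_cur, 2.0 * apply_scaled(T_cur) - T_prev
        acc += coeff[m] * T_cur
    return float(np.sum(V * acc) / n_probes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "noisy low-rank" data: a rank-10 signal buried in small noise.
    U, _ = np.linalg.qr(rng.standard_normal((500, 10)))
    W, _ = np.linalg.qr(rng.standard_normal((200, 10)))
    X = U @ np.diag(np.linspace(1.0, 5.0, 10)) @ W.T
    X = X + 1e-3 * rng.standard_normal(X.shape)
    A = X.T @ X
    lmax = np.linalg.norm(A, 1)           # cheap upper bound on lambda_max(A)
    tau = 0.1                             # illustrative cut-off above the noise floor
    est = estimate_eig_count(A, tau, lmax, 0.0, lmax, seed=1)
    exact = int(np.sum(np.linalg.eigvalsh(A) > tau))
    print(f"estimated count: {est:.1f}   exact count: {exact}")
```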


Course 2

RandNLA: Randomization in Numerical Linear Algebra

Speaker: Petros Drineas

Abstract
The introduction of randomization in the design and analysis of algorithms for matrix computations (such as matrix multiplication, least-squares regression, the Singular Value Decomposition (SVD), etc.) over the past 15 years has provided a new paradigm and a complementary perspective to traditional numerical linear algebra approaches. These novel approaches were motivated by technological developments in many areas of scientific research that permit the automatic generation of large data sets, which are often modeled as matrices.

In this talk, we will outline how such approaches can be used to approximately solve least-squares and ridge-regression problems, or to approximate the Singular Value Decomposition (SVD) of matrices. Applications of the proposed algorithms to data analysis tasks (with a particular focus on population genetics) will also be discussed.
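To give a flavour of this style of algorithm, the sketch below (a minimal illustration in the RandNLA spirit, not the course material itself) computes an approximate SVD in two stages: a Gaussian sketch of the range of A, optionally refined by a few power iterations, followed by an exact SVD of the small projected matrix. The target rank k, the oversampling p and the number of power iterations q are illustrative parameters.

```python
# Minimal sketch of a two-stage randomized SVD (illustrative parameters).
import numpy as np

def randomized_svd(A, k, p=10, q=2, seed=None):
    """Approximate rank-k SVD of A (m x n): returns U (m x k), s (k,), Vt (k x n)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Stage 1: orthonormal basis Q whose range approximately captures range(A).
    Omega = rng.standard_normal((n, k + p))      # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)
    for _ in range(q):                           # power iterations sharpen the basis
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # Stage 2: SVD of the small matrix B = Q^T A, then lift back to the big space.
    B = Q.T @ A                                  # (k+p) x n
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((2000, 50)) @ rng.standard_normal((50, 400))  # rank 50
    U, s, Vt = randomized_svd(A, k=50, seed=1)
    err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
    print(f"relative approximation error: {err:.2e}")
```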


Course 3

Linearizations of polynomial and rational matrices

Speaker: Paul van Dooren

Abstract
We show that the problem of linearizations of polynomial and rational matrices is closely related to the polynomial matrix quadruples introduced by Rosenbrock in the seventies to represent rational transfer functions of dynamical systems. We also recall the concepts of irreducible and strongly irreducible quadruples which were introduced in the eighties, and show how they relate to the linearizations that are more common in the numerical linear algebra community. We then show that the family of strong linearizations of matrix polynomials, called "block Kronecker pencils", as well as their extension to rational eigenvalue problems, nicely fit in that general framework. The novelty of these block Kronecker pencils is that they can be proven to be backward stable in a structured sense, for the polynomial matrix case as well as for the rational matrix case.

This is based on joint work with F. Dopico (UC3M), P. Lawrence (KULeuven), J. Perez (UMontana) and M.C. Quintana (UC3M).
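For readers less familiar with linearizations, the sketch below is a small illustration of the basic idea using the classical first companion form (not the block Kronecker pencils discussed in the course): a quadratic matrix polynomial eigenvalue problem P(λ)x = 0 is rewritten as a generalized linear eigenproblem for a pencil λX + Y, solved with standard dense tools, and the residuals are checked. The dimension n and the random coefficient matrices are illustrative.

```python
# Minimal sketch: first companion linearization of a quadratic matrix polynomial
#   P(lam) = lam^2*A2 + lam*A1 + A0,
# linearized as (lam*X + Y) z = 0 with
#   X = [[A2, 0], [0, I]],  Y = [[A1, A0], [-I, 0]],  z = [lam*x; x].
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 4
A0, A1, A2 = (rng.standard_normal((n, n)) for _ in range(3))

I, Z = np.eye(n), np.zeros((n, n))
X = np.block([[A2, Z], [Z, I]])
Y = np.block([[A1, A0], [-I, Z]])

# (lam*X + Y) z = 0  <=>  (-Y) z = lam * X z, a generalized eigenproblem.
lam, V = eig(-Y, X)

# Recover x from the lower block of z and check the residual of P(lam) x = 0.
for l, z in zip(lam, V.T):
    x = z[n:]
    res = np.linalg.norm((l**2 * A2 + l * A1 + A0) @ x) / np.linalg.norm(x)
    print(f"lambda = {l:.3f}, residual = {res:.2e}")
```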