Identifying underlying geometry to denoise and analyze (high-dimensional) data

Speaker
Shira Faigenbaum-Golovin (Duke University)
Date
03/12/2023 - 15:00 - 16:00
Bar-Ilan University - Department of Mathematics (mathoffice@math.biu.ac.il)
Place
Zoom: https://biu-ac-il.zoom.us/j/751076379
Abstract

In many applications involving large volumes of data (whether low- or high-dimensional), identifying and exploiting the underlying geometry is an essential ingredient; this has also been the case in many of the research projects on which I have worked. In my talk, I will concentrate mainly on two projects.

In the first project, optimization techniques are used to determine an optimal low-dimensional smooth manifold (without imposing its dimension a priori) that approximates a large family of noisy (possibly high-dimensional) data points. Theoretical analysis of the resulting algorithm shows that the proposed optimization procedure converges to a quasi-uniform reconstruction of the manifold within bounded time. This nonparametric approach can then be extended to various approximation tasks in high dimensions, such as function approximation or recovering missing information in the data. We illustrate the results with toy data and real applications.
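The speaker's algorithm is not spelled out in the abstract; as a rough illustration of the general idea (denoising a point cloud toward an underlying low-dimensional manifold), the toy sketch below iteratively projects each noisy sample onto the best-fit local affine subspace of its nearest neighbors via local PCA. All names and parameters here are illustrative assumptions, not the method presented in the talk.

```python
import numpy as np

def denoise_toward_manifold(points, k=12, d=1, n_iters=3):
    """Toy stand-in, NOT the speaker's algorithm: repeatedly project each
    point onto the best-fit d-dimensional affine subspace (local PCA) of
    its k nearest neighbors, nudging the cloud toward an underlying
    low-dimensional manifold."""
    pts = points.copy()
    for _ in range(n_iters):
        new_pts = np.empty_like(pts)
        for i, p in enumerate(pts):
            dists = np.linalg.norm(pts - p, axis=1)
            nbrs = pts[np.argsort(dists)[:k]]       # k nearest neighbors
            mu = nbrs.mean(axis=0)
            # principal directions of the local neighborhood
            _, _, vt = np.linalg.svd(nbrs - mu, full_matrices=False)
            basis = vt[:d]                          # top-d singular vectors
            new_pts[i] = mu + (p - mu) @ basis.T @ basis
        pts = new_pts
    return pts

# toy data: noisy samples of the unit circle (a 1-D manifold in R^2)
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 300)
clean = np.c_[np.cos(t), np.sin(t)]
noisy = clean + 0.05 * rng.normal(size=clean.shape)
denoised = denoise_toward_manifold(noisy, k=12, d=1, n_iters=3)
```

On this example the average deviation of the denoised points from the unit circle is noticeably smaller than that of the noisy input; unlike this sketch, the approach described in the talk does not fix the manifold dimension `d` a priori.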

Despite years of extensive research, there remain classes of functions for which no parametric approximation is known. In the second project, I will introduce a new reflecto-multiscale function, which is a generalization of the well-known refinable functions. I will rigorously show that the limit function of the reflecto-multiscale refinement process is Hölder continuous, and that its highest-order well-defined derivatives are Hölder continuous as well.
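For readers unfamiliar with refinement processes: a classical example of the setting the reflecto-multiscale construction generalizes is a subdivision scheme, where repeatedly applying a fixed local refinement rule to coarse data converges to a limit function of a prescribed smoothness class. The sketch below uses Chaikin's corner-cutting rule (whose limit is the quadratic B-spline, a standard refinable function); it is only a textbook illustration, not the speaker's new scheme.

```python
import numpy as np

def chaikin_refine(values, n_levels=5):
    """Classical corner-cutting subdivision (Chaikin's rule): each edge
    (v[i], v[i+1]) is replaced by the two points at parameters 1/4 and 3/4
    along it. Iterating converges to the quadratic B-spline limit curve."""
    v = np.asarray(values, dtype=float)
    for _ in range(n_levels):
        a = 0.75 * v[:-1] + 0.25 * v[1:]   # point 1/4 along each edge
        b = 0.25 * v[:-1] + 0.75 * v[1:]   # point 3/4 along each edge
        v = np.empty(2 * len(a))
        v[0::2], v[1::2] = a, b            # interleave the new points
    return v

coarse = np.array([0.0, 1.0, 0.0, -1.0, 0.0])
fine = chaikin_refine(coarse, n_levels=5)
```

Since every new point is a convex combination of old ones, the refined data stays in the convex hull of the coarse data, and successive differences shrink geometrically, which is the mechanism behind smoothness (e.g. Hölder-continuity) statements for the limit of such refinement processes.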

Last updated: 28/11/2023