“There are some things which cannot be learned quickly, and time, which is all we have, must be paid heavily for their acquiring. They are the very simplest things, and because it takes a man’s life to know them the little new that each man gets from life is very costly and the only heritage he has to leave.” - Ernest Hemingway
News #
I will be posting updates here: good news, bad news, and everything in between.
Recent Advances in Research on Difference-of-Convex (DC) Programming
Second-order Stochastic Optimization methods for Machine Learning
Analysis of the Hessian #

1. Empirical Analysis of the Hessian of Over-Parametrized Neural Networks #

Year: 2017
Authors: Levent Sagun, Utku Evci, V. Ugur Guney, Yann Dauphin, Leon Bottou
ArXiv ID: arXiv:1706.04454
URL: https://arxiv.org/abs/1706.04454

Abstract: We study the properties of common loss surfaces through their Hessian matrix. In particular, in the context of deep learning, we empirically show that the spectrum of the Hessian is composed of two parts: (1) the bulk centered near zero, (2) and outliers away from the bulk. We present numerical evidence and mathematical justifications to the following conjectures laid out by Sagun et al. (2016): Fixing data, increasing the number of parameters merely scales the bulk of the spectrum; fixing the dimension and changing the data (for instance adding more clusters or making the data less separable) only affects the outliers. We believe that our observations have striking implications for non-convex optimization in high dimensions. First, the flatness of such landscapes (which can be measured by the singularity of the Hessian) implies that classical notions of basins of attraction may be quite misleading. And that the discussion of wide/narrow basins may be in need of a new perspective around over-parametrization and redundancy that are able to create large connected components at the bottom of the landscape. Second, the dependence of small number of large eigenvalues to the data distribution can be linked to the spectrum of the covariance matrix of gradients of model outputs. With this in mind, we may reevaluate the connections within the data-architecture-algorithm framework of a model, hoping that it would shed light into the geometry of high-dimensional and non-convex spaces in modern applications. In particular, we present a case that links the two observations: small and large batch gradient descent appear to converge to different basins of attraction but we show that they are in fact connected through their flat region and so belong to the same basin.
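The bulk-plus-outliers picture of the spectrum is easy to probe numerically on a toy problem. The sketch below (my own illustrative setup, not the paper's experiments: network size, data, and the finite-difference Hessian are all assumptions) builds a tiny over-parametrized one-hidden-layer network with more parameters than training points and computes the eigenvalues of the Hessian of its loss:

```python
import numpy as np

# Illustrative toy setup: 5 data points, 24 parameters, so the Hessian of the
# mean-squared loss is heavily rank-deficient at a minimum.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))           # 5 data points, 2 features
y = rng.normal(size=5)

def loss(theta):
    W = theta[:16].reshape(2, 8)      # input-to-hidden weights (2x8)
    v = theta[16:]                    # hidden-to-output weights (8,)
    h = np.tanh(X @ W)                # hidden activations
    return 0.5 * np.mean((h @ v - y) ** 2)

def hessian(f, theta, eps=1e-4):
    """Central-difference Hessian of a scalar function f at theta."""
    n = theta.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            t = theta.copy()
            t[i] += eps; t[j] += eps; fpp = f(t)
            t[j] -= 2 * eps; fpm = f(t)
            t[i] -= 2 * eps; fmm = f(t)
            t[j] += 2 * eps; fmp = f(t)
            H[i, j] = H[j, i] = (fpp - fpm - fmp + fmm) / (4 * eps ** 2)
    return H

theta = rng.normal(scale=0.5, size=24)
eigs = np.linalg.eigvalsh(hessian(loss, theta))   # ascending order
print("smallest eigenvalues:", eigs[:3])
print("largest eigenvalues: ", eigs[-3:])
```

Plotting the full spectrum (or a histogram of `eigs`) at a trained minimum is how one would visualize the bulk near zero and the few data-dependent outliers described in the abstract.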
Some popular partial differential equations (PDEs)
Single PDEs #

Linear equations #

Laplace’s equation
$$ \begin{equation} \Delta u = \sum_{i=1}^{n} u_{x_i x_i} = 0. \end{equation} $$

Helmholtz’s (or eigenvalue) equation
$$ \begin{equation} -\Delta u = \lambda u. \end{equation} $$

Linear transport equation
$$ \begin{equation} u_t + \sum_{i=1}^{n} b^i u_{x_i} = 0. \end{equation} $$

Liouville’s equation
$$ \begin{equation} u_t + \sum_{i=1}^{n} (b^i u)_{x_i} = 0. \end{equation} $$

Heat (or diffusion) equation
$$ \begin{equation} u_t - \Delta u = 0. \end{equation} $$
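As a quick numerical sanity check on the heat equation above (an illustrative example of my own choosing, not from the notes): $u(x, t) = e^{-t}\sin x$ satisfies $u_t - u_{xx} = 0$ in one dimension, since $u_t = -u$ and $u_{xx} = -u$. Central finite differences confirm this:

```python
import numpy as np

# Candidate solution of the 1-D heat equation u_t - u_xx = 0.
def u(x, t):
    return np.exp(-t) * np.sin(x)

x = np.linspace(0.1, 3.0, 50)
t = 0.7
h = 1e-4

# Central differences: first derivative in t, second derivative in x.
u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h ** 2

residual = np.max(np.abs(u_t - u_xx))
print("max |u_t - u_xx| =", residual)   # ~0 up to truncation/rounding error
```

The same pattern (plug a candidate solution into the finite-difference stencil and check the residual) works for the other equations in the list.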
Study Mathematics at HCMUS
1. Applied Mathematics #

MNC - Research Methodologies
MTT001 - Advanced Functional Analysis
MTT006 - Advanced Linear Algebra
MTT011 - Numerical Analysis
MTT012 - Stochastic Process
MTT081 - Optimization Algorithms
MTT106 - Non-linear Programming
MTT107 - Set-valued Analysis
MTT083 - Convex Analysis
MTT130 - Numerical Programming for Applied Problems
MTT131 - Seminar in Applied Mathematics
MTT139 - Mathematical Models in Economics
MTT147 - Statistical Modelling
MTT099 - Differential Equations
MTT097 - Partial Differential Equations
MTH10403 - Functional Analysis
MTT090 - Complex Analysis
MTT149 - Convex Analysis and Optimization

2. Mathematical Analysis #

MTT001 - Advanced Functional Analysis
MTT006 - Advanced Linear Algebra
MTT099 - Differential Equations
MTT097 - Partial Differential Equations
MTT090 - Complex Analysis
MTT149 - Convex Analysis and Optimization
Explainable Reinforcement Learning (XRL)
In progress…