abstract:
In tensor product approximation, the Hierarchical Tucker tensor format
(Hackbusch) and Tensor Trains (TT) (Tyrtyshnikov) have been introduced
recently, during the period of the DFG SPP priority program 1324. These
formats offer stable and robust approximation of high-dimensional problems
at low computational cost. The talk reports on joint work with
Prof. Hackbusch and his group at MPI Leipzig. The ranks required for an
approximation up to a given error depend on bilinear approximation rates
and corresponding trace class norms.
For numerical computations, the computation of an approximate solution
can be cast into an optimization framework constrained by the restriction
to tensors of prescribed multi-linear ranks r, i.e. low-rank tensors.
Besides the Dirac-Frenkel variational principle, which exploits the
differential geometric structure of the hierarchical tensor formats,
thresholding techniques based on a hierarchical SVD (HSVD) can be applied
to ensure convergence. Besides the quasi-best approximation by hard
thresholding, we discuss iterative soft thresholding techniques, developed
jointly with M. Bachmayr (IGPM, RWTH Aachen). The soft thresholding
iteration applies convex optimization techniques to tensor product
approximation.
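The contrast between hard and soft thresholding can be illustrated in the simplest (matrix) case, where the hierarchical SVD reduces to an ordinary SVD. The sketch below is only illustrative and not the method from the talk: the function names are hypothetical, NumPy is assumed, and the shrinkage parameter `mu` is an arbitrary choice. Hard thresholding keeps the leading singular values (a quasi-best low-rank approximation); soft thresholding shrinks all singular values, which is the proximal map of the nuclear norm used in convex optimization.

```python
import numpy as np

def hard_threshold(A, rank):
    # Quasi-best low-rank approximation: keep only the `rank` largest
    # singular values and discard the rest.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

def soft_threshold(A, mu):
    # Shrink every singular value by mu and clip at zero: the proximal
    # operator of the nuclear norm, the basis of soft thresholding iterations.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_shrunk = np.maximum(s - mu, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# A rank-2 matrix perturbed by small noise (synthetic example data).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6))
A_noisy = A + 0.01 * rng.standard_normal((6, 6))

A_hard = hard_threshold(A_noisy, 2)   # exactly rank 2
A_soft = soft_threshold(A_noisy, 0.05)
print(np.linalg.matrix_rank(A_hard))  # 2
```

In the hierarchical setting, the same truncation is applied to the matricizations at every node of the dimension tree; soft thresholding has the advantage of being non-expansive, which underlies the convergence analysis of the iterative scheme.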