Matching Items (3)


Optimal sampling for linear function approximation and high-order finite difference methods over complex regions

Description


I focus on algorithms that generate good sampling points for function approximation. In 1D, it is well known that polynomial interpolation at equispaced points is unstable, whereas interpolation at Chebyshev nodes is both stable and highly accurate. In higher-dimensional complex regions, optimal sampling points are not known explicitly. This work presents robust algorithms that find good sampling points in complex regions for polynomial interpolation, least-squares, and radial basis function (RBF) methods. The quality of these nodes is measured by the Lebesgue constant. I will also consider optimal sampling for constrained optimization problems arising in the solution of PDEs, where boundary conditions must be imposed. Furthermore, I extend the scope of the problem to finding near-optimal sampling points for high-order finite difference methods, which can be implemented using either piecewise polynomials or RBFs.
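To illustrate the 1D phenomenon the abstract refers to, here is a minimal sketch (not code from this work; the node count and grid resolution are arbitrary choices) that estimates the Lebesgue constant for equispaced versus Chebyshev nodes by brute-force evaluation of the Lagrange basis on a fine grid:

```python
import math

def lebesgue_constant(nodes, m=2000):
    """Estimate the Lebesgue constant max_x sum_j |l_j(x)| on [-1, 1]
    by sampling the Lebesgue function on a uniform grid of m+1 points."""
    best = 0.0
    for i in range(m + 1):
        x = -1.0 + 2.0 * i / m
        s = 0.0
        for j, xj in enumerate(nodes):
            # Lagrange basis polynomial l_j evaluated at x
            lj = 1.0
            for k, xk in enumerate(nodes):
                if k != j:
                    lj *= (x - xk) / (xj - xk)
            s += abs(lj)
        best = max(best, s)
    return best

n = 12
equi = [-1.0 + 2.0 * j / (n - 1) for j in range(n)]                   # equispaced
cheb = [math.cos((2 * j + 1) * math.pi / (2 * n)) for j in range(n)]  # Chebyshev

print("equispaced:", lebesgue_constant(equi))  # grows exponentially with n
print("Chebyshev: ", lebesgue_constant(cheb))  # grows only logarithmically in n
```

Already at 12 nodes the equispaced Lebesgue constant is an order of magnitude larger than the Chebyshev one, which is the instability the abstract describes; in complex higher-dimensional regions no such explicit node family is available, motivating the algorithmic search.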

Date Created
  • 2019


Parallel optimization of polynomials for large-scale problems in stability and control

Description


In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid intractability, we establish a trade-off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems, in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs), whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grow exponentially as the sequence progresses, meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. Existing optimization algorithms and software are designed only for desktop computers or small clusters, machines that do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This is why we seek parallel algorithms for setting up and solving large SDPs on large clusters and/or supercomputers.

We propose parallel algorithms for stability analysis of two classes of systems: 1) linear systems with a large number of uncertain parameters; 2) nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory, and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to efficiently utilize hundreds and potentially thousands of processors, and to analyze systems with state spaces of dimension 100 and higher. Furthermore, we extend our algorithms to analyze robust stability over more complicated geometries such as hypercubes and arbitrary convex polytopes. Our algorithms can be readily extended to address a wide variety of problems in control, such as H-infinity synthesis for systems with parametric uncertainty and computing control Lyapunov functions.
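Polya's theorem, which the abstract applies to parameter-dependent Lyapunov inequalities, states that if a homogeneous polynomial p is strictly positive on the standard simplex, then for some exponent d all coefficients of (x1 + ... + xn)^d * p are strictly positive. A minimal two-variable sketch (an illustration of the theorem, not of the thesis' algorithm) finds that exponent by repeated multiplication:

```python
def mul_by_sum(coeffs):
    """Multiply a homogeneous bivariate polynomial by (x + y).

    coeffs[i] is the coefficient of x^(d-i) * y^i, where d = len(coeffs) - 1,
    so multiplying by (x + y) is the Pascal-style update new[i] = old[i] + old[i-1].
    """
    return [(coeffs[i] if i < len(coeffs) else 0) +
            (coeffs[i - 1] if i >= 1 else 0)
            for i in range(len(coeffs) + 1)]

def polya_exponent(coeffs, max_d=100):
    """Smallest d such that (x + y)^d * p has all strictly positive
    coefficients, or None if no d <= max_d works."""
    cur = list(coeffs)
    for d in range(max_d + 1):
        if all(c > 0 for c in cur):
            return d
        cur = mul_by_sum(cur)
    return None

# p(x, y) = x^2 - x*y + y^2 is positive on the simplex {x, y >= 0, x + y = 1}
# but has a negative coefficient; multiplying by (x + y)^3 clears it.
print(polya_exponent([1, -1, 1]))
```

Each multiplication enlarges the coefficient vector, which mirrors the abstract's point that sharper certificates in the sequence come at exponentially growing computational cost.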

Date Created
  • 2016

On K-derived quartics and invariants of local fields

Description


This dissertation will cover two topics. For the first, let $K$ be a number field. A $K$-derived polynomial $f(x) \in K[x]$ is a polynomial that factors into linear factors over $K$, as do all of its derivatives. Such a polynomial is said to be {\it proper} if its roots are distinct. An unresolved question in the literature is whether or not there exists a proper $\Q$-derived polynomial of degree 4. Some examples are known of proper $K$-derived quartics for a quadratic number field $K$, although other than $\Q(\sqrt{3})$, these fields have quite large discriminant. (The second known field is $\Q(\sqrt{3441})$.) I will describe a search for quadratic fields $K$ over which there exist proper $K$-derived quartics. The search finds examples for $K=\Q(\sqrt{D})$ with $D=...,-95,-41,-19,21,31,89,...$.
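The definition is easy to test in the rational case. The sketch below (an illustration of the definition only; degree 3, not the open degree-4 case, and not code from this dissertation) verifies that $f(x) = x(x-5)(x-8)$ is a proper $\Q$-derived cubic, using exact rational arithmetic and the rational root theorem:

```python
from fractions import Fraction as F
from math import lcm

def poly_from_roots(roots):
    """Ascending coefficients of the monic polynomial with the given roots."""
    c = [F(1)]
    for r in roots:                       # multiply by (x - r)
        new = [F(0)] * (len(c) + 1)
        for i, a in enumerate(c):
            new[i + 1] += a
            new[i] -= r * a
        c = new
    return c

def derivative(c):
    return [F(i) * a for i, a in enumerate(c)][1:]

def eval_poly(c, x):
    v = F(0)
    for a in reversed(c):                 # Horner evaluation
        v = v * x + a
    return v

def deflate(c, r):
    """Divide c (ascending coefficients) by (x - r); assumes r is a root."""
    n = len(c) - 1
    q = [F(0)] * n
    q[n - 1] = c[n]
    for i in range(n - 1, 0, -1):
        q[i - 1] = c[i] + r * q[i]
    return q

def divisors(n):
    n = abs(n)
    return [d for d in range(1, n + 1) if n % d == 0]

def splits_over_Q(c):
    """Rational roots (with multiplicity) if c factors into linear
    factors over Q, else None."""
    c, roots = list(c), []
    while len(c) > 1:
        L = lcm(*[a.denominator for a in c])
        ints = [int(a * L) for a in c]    # clear denominators
        if ints[0] == 0:
            root = F(0)
        else:
            # rational root theorem: any rational root is +-p/q, p | a_0, q | a_n
            cands = (F(s * p, q) for p in divisors(ints[0])
                     for q in divisors(ints[-1]) for s in (1, -1))
            root = next((r for r in cands if eval_poly(c, r) == 0), None)
        if root is None:
            return None
        roots.append(root)
        c = deflate(c, root)
    return roots

# f = x(x-5)(x-8): f' = 3x^2 - 26x + 40 has roots 2 and 20/3,
# f'' = 6x - 26 has root 13/3, and the roots of f are distinct,
# so f is a proper Q-derived cubic.
f = poly_from_roots([F(0), F(5), F(8)])
g = f
while len(g) > 2:
    assert splits_over_Q(g) is not None
    g = derivative(g)
print("roots of f:", splits_over_Q(f))
```

For quartics over a quadratic field $K$ the same idea applies, but root-finding must be done over $K$ rather than $\Q$, which is where the search described above becomes nontrivial.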

For the second topic, by Krasner's lemma there exist only finitely many degree $n$ extensions of $\Q_p$. Jones and Roberts have developed a database recording invariants of $p$-adic extensions of low degree $n$. I will contribute data to this database by computing the Galois slope content, inertia subgroup, and Galois mean slope for a variety of wildly ramified extensions of composite degree, using the idea of \emph{global splitting models}.

Date Created
  • 2019