{"id":1852,"date":"2024-07-02T16:04:48","date_gmt":"2024-07-02T16:04:48","guid":{"rendered":"https:\/\/www.ethanepperly.com\/?p=1852"},"modified":"2024-07-02T16:04:49","modified_gmt":"2024-07-02T16:04:49","slug":"neat-randomized-algorithms-randdiag-for-rapidly-diagonalizing-normal-matrices","status":"publish","type":"post","link":"https:\/\/www.ethanepperly.com\/index.php\/2024\/07\/02\/neat-randomized-algorithms-randdiag-for-rapidly-diagonalizing-normal-matrices\/","title":{"rendered":"Neat Randomized Algorithms: RandDiag for Rapidly Diagonalizing Normal Matrices"},"content":{"rendered":"\n<p>Consider two complex-valued square matrices <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-1e6bf85b9d278d45d234be841b3abc81_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;&#92;&#105;&#110;&#92;&#99;&#111;&#109;&#112;&#108;&#101;&#120;&#94;&#123;&#110;&#92;&#116;&#105;&#109;&#101;&#115;&#32;&#110;&#125;\" title=\"Rendered by QuickLaTeX.com\" height=\"14\" width=\"75\" style=\"vertical-align: -1px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-d5730e7a1eb9217a59e31f7a99567323_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#66;&#92;&#105;&#110;&#92;&#99;&#111;&#109;&#112;&#108;&#101;&#120;&#94;&#123;&#110;&#92;&#116;&#105;&#109;&#101;&#115;&#32;&#110;&#125;\" title=\"Rendered by QuickLaTeX.com\" height=\"14\" width=\"76\" style=\"vertical-align: -1px;\"\/>. 
The first matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-11f4f587954b361e7d78940f65b8d70d_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;\" title=\"Rendered by QuickLaTeX.com\" height=\"13\" width=\"13\" style=\"vertical-align: 0px;\"\/> is <a href=\"https:\/\/en.wikipedia.org\/wiki\/Hermitian_matrix\">Hermitian<\/a>, being equal <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-7dcda31471de5042134166a7e527e581_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;&#32;&#61;&#32;&#65;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"13\" width=\"56\" style=\"vertical-align: 0px;\"\/> to its <a href=\"https:\/\/en.wikipedia.org\/wiki\/Conjugate_transpose\">conjugate transpose<\/a> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-2e6f04c5e73fd969ae577284f13bbce3_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"13\" width=\"19\" style=\"vertical-align: 0px;\"\/>. The other matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-06d2c1a5a171b6d7d9c5df87d123c5a4_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#66;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> is non-Hermitian, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c868d40d1912919612facd9096592a90_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#66;&#32;&#92;&#110;&#101;&#32;&#66;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"17\" width=\"58\" style=\"vertical-align: -4px;\"\/>. 
Let&#8217;s see how long it takes to compute their eigenvalue decompositions in MATLAB:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&gt;&gt; A = randn(1e3) + 1i*randn(1e3); A = (A+A')\/2;\n&gt;&gt; tic; &#91;V_A,D_A] = eig(A); toc <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% Hermitian matrix<\/mark>\nElapsed time is 0.415145 seconds.\n&gt;&gt; B = randn(1e3) + 1i*randn(1e3);\n&gt;&gt; tic; &#91;V_B,D_B] = eig(B); toc <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% non-Hermitian matrix<\/mark>\nElapsed time is 1.668246 seconds.<\/code><\/pre>\n\n\n\n<p>We see that it takes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e6b0829ff5d83d0971e9b30599e42b34_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#52;&#92;&#116;&#105;&#109;&#101;&#115;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"21\" style=\"vertical-align: 0px;\"\/> longer to compute the eigenvalues of the non-Hermitian matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-06d2c1a5a171b6d7d9c5df87d123c5a4_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#66;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> as compared to the Hermitian matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-11f4f587954b361e7d78940f65b8d70d_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;\" title=\"Rendered by QuickLaTeX.com\" height=\"13\" width=\"13\" style=\"vertical-align: 0px;\"\/>. 
Moreover, the matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-be1f4828d763950cb9b33235e60b7fab_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#86;&#95;&#65;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"20\" style=\"vertical-align: -3px;\"\/> of eigenvectors for a Hermitian matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-0bf9405dd0b6026e7a5057e309e55c69_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;&#32;&#61;&#32;&#86;&#95;&#65;&#68;&#95;&#65;&#86;&#95;&#65;&#94;&#123;&#45;&#49;&#125;\" title=\"Rendered by QuickLaTeX.com\" height=\"22\" width=\"115\" style=\"vertical-align: -6px;\"\/> is a <a href=\"https:\/\/en.wikipedia.org\/wiki\/Unitary_matrix\">unitary matrix<\/a>, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f4056cd9e4628d2ef6c0f7f474cc86_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#86;&#95;&#65;&#94;&#42;&#86;&#95;&#65;&#32;&#61;&#32;&#86;&#95;&#65;&#86;&#95;&#65;&#94;&#42;&#32;&#61;&#32;&#73;\" title=\"Rendered by QuickLaTeX.com\" height=\"17\" width=\"143\" style=\"vertical-align: -5px;\"\/>.<\/p>\n\n\n\n<p>There is another class of matrices with nice eigenvalue decompositions: <a href=\"https:\/\/en.wikipedia.org\/wiki\/Normal_matrix\">normal matrices<\/a>. 
A square, complex-valued matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> is <em><a href=\"https:\/\/en.wikipedia.org\/wiki\/Normal_matrix\">normal<\/a><\/em> if <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-01bcb1dde11a93f9dee93451ec0412fd_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;&#94;&#42;&#67;&#32;&#61;&#32;&#67;&#67;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"93\" style=\"vertical-align: 0px;\"\/>. The matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5e89cb8a3a767ebf96d84dc8f7eeb45_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#86;&#95;&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"21\" style=\"vertical-align: -3px;\"\/> of eigenvectors for a normal matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-682f30ab6d39f87b690969dd8280fd71_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;&#32;&#61;&#32;&#86;&#95;&#67;&#32;&#68;&#95;&#67;&#32;&#86;&#95;&#67;&#94;&#123;&#45;&#49;&#125;\" title=\"Rendered by QuickLaTeX.com\" height=\"22\" width=\"117\" style=\"vertical-align: -6px;\"\/> is also unitary, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-71b9aa61bed4129cd06f498af5ad9f52_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#86;&#95;&#67;&#94;&#42;&#86;&#95;&#67;&#32;&#61;&#32;&#86;&#95;&#67;&#86;&#95;&#67;&#94;&#42;&#32;&#61;&#32;&#73;\" title=\"Rendered 
by QuickLaTeX.com\" height=\"17\" width=\"144\" style=\"vertical-align: -5px;\"\/>. An important class of normal matrices are unitary matrices themselves. A unitary matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c7480669f4fca8a251671d27137d0b09_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#85;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"13\" style=\"vertical-align: 0px;\"\/> is always normal since it satisfies <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-ee3c51290a65a47cdd1249e35c8aa1d4_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#85;&#94;&#42;&#85;&#32;&#61;&#32;&#85;&#85;&#94;&#42;&#32;&#61;&#32;&#73;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"127\" style=\"vertical-align: 0px;\"\/>.<\/p>\n\n\n\n<p>Let&#8217;s see how long it takes MATLAB to compute the eigenvalue decomposition of a unitary (and thus normal) matrix:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&gt;&gt; U = V_A;                     <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% unitary, and thus normal, matrix<\/mark>\n&gt;&gt; tic; &#91;V_U,D_U] = eig(U); toc <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% normal matrix<\/mark>\nElapsed time is 2.201017 seconds.<\/code><\/pre>\n\n\n\n<p>Even longer than it took to compute an eigenvalue decomposition of the non-normal matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-06d2c1a5a171b6d7d9c5df87d123c5a4_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#66;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/>! 
Can we make the normal eigenvalue decomposition closer to the speed of the Hermitian eigenvalue decomposition?<\/p>\n\n\n\n<p>Here is the start of an idea. Every square matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> has a <em>Cartesian decomposition<\/em>:<p class=\"ql-center-displayed-equation\" style=\"line-height: 37px;\"><span class=\"ql-right-eqno\"> &nbsp; <\/span><span class=\"ql-left-eqno\"> &nbsp; <\/span><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-5efdc333ce38cd9aaa9c5c10ad40b44a_l3.png\" height=\"37\" width=\"345\" class=\"ql-img-displayed-equation quicklatex-auto-format\" alt=\"&#92;&#91;&#67;&#32;&#61;&#32;&#72;&#32;&#43;&#32;&#105;&#83;&#44;&#32;&#92;&#113;&#117;&#97;&#100;&#32;&#72;&#32;&#61;&#32;&#92;&#102;&#114;&#97;&#99;&#123;&#67;&#43;&#67;&#94;&#42;&#125;&#123;&#50;&#125;&#44;&#32;&#92;&#113;&#117;&#97;&#100;&#32;&#83;&#32;&#61;&#32;&#92;&#102;&#114;&#97;&#99;&#123;&#67;&#45;&#67;&#94;&#42;&#125;&#123;&#50;&#105;&#125;&#46;&#92;&#93;\" title=\"Rendered by QuickLaTeX.com\"\/><\/p>We have written <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> as a combination of its <em>Hermitian part<\/em> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" 
title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-4015d3bcae440238eb2e7a73e66bae43_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#105;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"6\" style=\"vertical-align: 0px;\"\/> times its <em>skew-Hermitian part<\/em> <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/>. Both <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/> are Hermitian matrices. 
The Cartesian decomposition of a square matrix is analogous to the decomposition of a complex number into its real and imaginary parts.<\/p>\n\n\n\n<p>For a normal matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/>, the Hermitian and skew-Hermitian parts commute, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-0d759339528a2c391ef783d89dcada2a_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;&#83;&#32;&#61;&#32;&#83;&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"80\" style=\"vertical-align: 0px;\"\/>. We know from matrix theory that commuting Hermitian matrices are <a href=\"https:\/\/en.wikipedia.org\/wiki\/Diagonalizable_matrix#Simultaneous_diagonalization\">simultaneously diagonalizable<\/a>, i.e., there exists <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> such that <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-72be5c6e86c3b38fc1de6d4730a38734_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;&#32;&#61;&#32;&#81;&#68;&#95;&#72;&#81;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"102\" style=\"vertical-align: -4px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-88647ba824a9f1cf8bebfba24cc6ae2d_l3.png\" 
class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;&#32;&#61;&#32;&#81;&#68;&#95;&#83;&#81;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"94\" style=\"vertical-align: -4px;\"\/> for diagonal matrices <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-36d402fd8b348b5925fc3c52ea6d0cd3_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#68;&#95;&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"27\" style=\"vertical-align: -3px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-6bdadb18efb68959744a82f361010ce5_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#68;&#95;&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"24\" style=\"vertical-align: -3px;\"\/>. Thus, given access to such <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/>, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> has eigenvalue decomposition <p class=\"ql-center-displayed-equation\" style=\"line-height: 19px;\"><span class=\"ql-right-eqno\"> &nbsp; <\/span><span class=\"ql-left-eqno\"> &nbsp; <\/span><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-d84fd3e79327e2e4082ba99967de3c9e_l3.png\" height=\"19\" width=\"171\" 
class=\"ql-img-displayed-equation quicklatex-auto-format\" alt=\"&#92;&#91;&#67;&#32;&#61;&#32;&#81;&#40;&#68;&#95;&#72;&#43;&#105;&#68;&#95;&#83;&#41;&#81;&#94;&#42;&#46;&#92;&#93;\" title=\"Rendered by QuickLaTeX.com\"\/><\/p><\/p>\n\n\n\n<p>Here&#8217;s a first attempt to turn this insight into an algorithm. First, compute the Hermitian part <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> of <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/>, diagonalize <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-72be5c6e86c3b38fc1de6d4730a38734_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;&#32;&#61;&#32;&#81;&#68;&#95;&#72;&#81;&#94;&#42;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"102\" style=\"vertical-align: -4px;\"\/>, and then see if <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by 
QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/>. Let&#8217;s test this out on a <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-17551d41f05fc09b4c2f2e5b57bfe27a_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#50;&#92;&#116;&#105;&#109;&#101;&#115;&#32;&#50;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"39\" style=\"vertical-align: 0px;\"\/> example:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&gt;&gt; C = orth(randn(2) + 1i*randn(2)); <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% unitary matrix<\/mark>\n&gt;&gt; H = (C+C')\/2;                     <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% Hermitian part<\/mark>\n&gt;&gt; &#91;Q,~] = eig(H);\n&gt;&gt; Q'*C*Q                            <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% check to see if diagonal<\/mark>\nans =\n  -0.9933 + 0.1152i  -0.0000 + 0.0000i\n   0.0000 + 0.0000i  -0.3175 - 0.9483i<\/code><\/pre>\n\n\n\n<p>Yay! We&#8217;ve succeeded at diagonalizing the matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/> using only a Hermitian eigenvalue decomposition. But we should be careful about declaring victory too early. 
Here&#8217;s a bad example:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>>> C = &#91;1 1i;1i 1]; <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% normal matrix<\/mark>\n>> H = (C+C')\/2;\n>> &#91;Q,~] = eig(H);\n>> Q'*C*Q           <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-cyan-blue-color\">% oh no! not diagonal<\/mark>\nans =\n   1.0000 + 0.0000i   0.0000 + 1.0000i\n   0.0000 + 1.0000i   1.0000 + 0.0000i<\/code><\/pre>\n\n\n\n<p>What&#8217;s going on here? The issue is that the Hermitian part <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c71ee44d8937fc0663231823c72cf6fb_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;&#32;&#61;&#32;&#73;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"49\" style=\"vertical-align: 0px;\"\/> for this matrix has a repeated eigenvalue. Thus, <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> has multiple different valid matrices of eigenvectors. 
(In this specific case, <em>every<\/em> unitary matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/>.) By looking at <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> alone, we don&#8217;t know which matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> to pick so that it also diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/>.<\/p>\n\n\n\n<p>He and Kressner developed a beautifully simple <em>randomized<\/em> algorithm called <em><a href=\"https:\/\/arxiv.org\/abs\/2405.18399\">RandDiag<\/a><\/em> to circumvent this failure mode. 
The idea is straightforward:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Form a random linear combination <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-3f4c07e68bef11c0b798123fb07082f7_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#77;&#32;&#61;&#32;&#92;&#103;&#97;&#109;&#109;&#97;&#95;&#49;&#32;&#72;&#32;&#43;&#32;&#92;&#103;&#97;&#109;&#109;&#97;&#95;&#50;&#32;&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"126\" style=\"vertical-align: -4px;\"\/> of the Hermitian and skew-Hermitian parts of <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-11f4f587954b361e7d78940f65b8d70d_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;\" title=\"Rendered by QuickLaTeX.com\" height=\"13\" width=\"13\" style=\"vertical-align: 0px;\"\/>, with <a href=\"https:\/\/en.wikipedia.org\/wiki\/Normal_distribution#Standard_normal_distribution\">standard normal<\/a> random coefficients <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-a4ad499fbb7740ff6f65d68f94c76e00_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#92;&#103;&#97;&#109;&#109;&#97;&#95;&#49;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"15\" style=\"vertical-align: -4px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-5912e4f3ae72c3b25a27c50a3f152fba_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#92;&#103;&#97;&#109;&#109;&#97;&#95;&#50;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: -4px;\"\/>.<\/li>\n\n\n\n<li>Compute <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" 
class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> that diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-a038471f70ce2efa1e2bb1ab05e0a7ce_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#77;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"19\" style=\"vertical-align: 0px;\"\/>.<\/li>\n<\/ol>\n\n\n\n<p>That&#8217;s it!<\/p>\n\n\n\n<p>To get a sense of why He and Kressner&#8217;s algorithm works, suppose that <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> has some repeated eigenvalues and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/> has all distinct eigenvalues. 
Given this setup, it seems likely that a random linear combination of <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> will also have all distinct eigenvalues. (It would take very special circumstances for a random linear combination to yield two eigenvalues that are <em>exactly<\/em> the same!) Indeed, this intuition is a fact: <a href=\"https:\/\/arxiv.org\/html\/2405.18399v1#S2\">With 100% probability,<\/a> a matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> diagonalizing a Gaussian random linear combination of simultaneously diagonalizable matrices <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" 
title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/> also diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c5f986724262f699e22a9dfc5cb767a6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#72;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"16\" style=\"vertical-align: 0px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-b89471a11dd7fd85184a38ddb3ea9145_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"12\" style=\"vertical-align: 0px;\"\/> individually.<\/p>\n\n\n\n<p>MATLAB code for RandDiag is as follows:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>function Q = rand_diag(C)\n   H = (C+C')\/2; S = (C-C')\/2i;\n   M = randn*H + randn*S;\n   &#91;Q,~] = eig(M);\nend<\/code><\/pre>\n\n\n\n<p>When applied to our hard <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-17551d41f05fc09b4c2f2e5b57bfe27a_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#50;&#92;&#116;&#105;&#109;&#101;&#115;&#32;&#50;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"39\" style=\"vertical-align: 0px;\"\/> example from before, RandDiag succeeds at giving a matrix that diagonalizes <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-52134c3741ef3371f17ceb962d0792f6_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#67;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"14\" style=\"vertical-align: 0px;\"\/>:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&gt;&gt; Q = rand_diag(C);\n&gt;&gt; Q'*C*Q\nans =\n   1.0000 - 1.0000i  -0.0000 + 0.0000i\n  -0.0000 - 0.0000i   1.0000 + 
1.0000i<\/code><\/pre>\n\n\n\n<p>For computing the matrix of eigenvectors for a <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-6dbf2012bdd1602aa8d50e2e4e28a775_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#49;&#48;&#48;&#48;&#92;&#116;&#105;&#109;&#101;&#115;&#32;&#49;&#48;&#48;&#48;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"92\" style=\"vertical-align: 0px;\"\/> unitary matrix, RandDiag takes 0.4 seconds, just as fast as the Hermitian eigendecomposition did.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&gt;&gt; tic; V_U = rand_diag(U); toc\nElapsed time is 0.437309 seconds.<\/code><\/pre>\n\n\n\n<p>He and Kressner&#8217;s algorithm is delightful. Ultimately, it uses randomness in only a small way. For most coefficients <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-15f4c6435f8d502cfb1f8408f4e13a26_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#97;&#95;&#49;&#44;&#97;&#95;&#50;&#32;&#92;&#105;&#110;&#32;&#92;&#114;&#101;&#97;&#108;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"77\" style=\"vertical-align: -4px;\"\/>, a matrix <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-e7f252699e1616398a7ce5179d3e425e_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#81;\" title=\"Rendered by QuickLaTeX.com\" height=\"16\" width=\"14\" style=\"vertical-align: -4px;\"\/> diagonalizing <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-c2934d901d420dcca69e9c83a7fb5887_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#97;&#95;&#49;&#32;&#72;&#32;&#43;&#32;&#97;&#95;&#50;&#32;&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"84\" style=\"vertical-align: -3px;\"\/> will 
also diagonalize <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-f3d47ad14ac54c42c42c0c126be39ab7_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#65;&#32;&#61;&#32;&#72;&#43;&#105;&#83;\" title=\"Rendered by QuickLaTeX.com\" height=\"15\" width=\"93\" style=\"vertical-align: -2px;\"\/>. But, for any specific choice of <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-00dec6e4d6ff46b3a2cdb4199b70a6cd_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#97;&#95;&#49;&#44;&#97;&#95;&#50;\" title=\"Rendered by QuickLaTeX.com\" height=\"12\" width=\"41\" style=\"vertical-align: -4px;\"\/>, there is a possibility of failure. To avoid this possibility, we can just pick <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-0cdef85fb3c2fd7b3336c4765e8c6778_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#97;&#95;&#49;\" title=\"Rendered by QuickLaTeX.com\" height=\"11\" width=\"15\" style=\"vertical-align: -3px;\"\/> and <img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.ethanepperly.com\/wp-content\/ql-cache\/quicklatex.com-7643f8caa17ba003302c47c512667175_l3.png\" class=\"ql-img-inline-formula quicklatex-auto-format\" alt=\"&#97;&#95;&#50;\" title=\"Rendered by QuickLaTeX.com\" height=\"11\" width=\"16\" style=\"vertical-align: -3px;\"\/> at random. 
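<\/p>\n\n\n\n<p>To make the failure mode concrete, here is a short Python\/NumPy sketch (the post&#8217;s code is MATLAB; this translation, including the example matrices H and S and the helper offdiag_norm, is my own illustration rather than code from the paper). With the degenerate choice a_1 = 1, a_2 = 0 and H equal to the identity, <em>every<\/em> orthogonal matrix is a valid eigenvector matrix for a_1 H + a_2 S, so the computed eigenvectors need not diagonalize C; Gaussian random coefficients avoid this failure with 100% probability.<\/p>\n\n\n\n

```python
import numpy as np

rng = np.random.default_rng(0)

# A normal matrix C = H + 1j*S whose Hermitian part is the identity.
# H = I has a repeated eigenvalue, so its eigenvectors are far from unique.
S = np.array([[0.0, 1.0], [1.0, 0.0]])  # "imaginary part" matrix, real symmetric
H = np.eye(2)
C = H + 1j * S

def offdiag_norm(Q, C):
    """Frobenius norm of the off-diagonal part of Q* C Q."""
    D = Q.conj().T @ C @ Q
    return np.linalg.norm(D - np.diag(np.diag(D)))

# Degenerate coefficients a1 = 1, a2 = 0: the identity is a perfectly
# valid eigenvector matrix for 1*H + 0*S = I, yet it fails for C.
Q_bad = np.eye(2)
print(offdiag_norm(Q_bad, C))   # about 1.41: C is not diagonalized

# RandDiag's choice: Gaussian random coefficients succeed with probability 1.
M = rng.standard_normal() * H + rng.standard_normal() * S
_, Q_rand = np.linalg.eigh(M)
print(offdiag_norm(Q_rand, C))  # about 0 (up to roundoff): C is diagonalized
```

\n\n\n\n<p>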
It&#8217;s really as simple as that.<\/p>\n\n\n\n<p><strong>References:<\/strong> RandDiag was proposed in <em><a href=\"https:\/\/arxiv.org\/abs\/2405.18399\">A simple, randomized algorithm for diagonalizing normal matrices<\/a><\/em> by He and Kressner (2024), building on their earlier work in <em><a href=\"https:\/\/arxiv.org\/abs\/2212.07248\">Randomized Joint Diagonalization of Symmetric Matrices<\/a><\/em> (2022), which considers the more general problem of using random linear combinations to (approximately) simultaneously diagonalize (nearly) commuting matrices. RandDiag is an example of a linear algebraic algorithm that uses randomness to put the input into &#8220;general position&#8221;; see <em><a href=\"https:\/\/arxiv.org\/abs\/2402.17873\">Randomized matrix computations: Themes and variations<\/a><\/em> by Kireeva and Tropp (2024) for a discussion of this and other ways of using randomness to design matrix algorithms.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Consider two complex-valued square matrices and . The first matrix is Hermitian, being equal to its conjugate transpose . The other matrix is non-Hermitian, . 
Let&#8217;s see how long it takes to compute their eigenvalue decompositions in MATLAB: We see that it takes longer to compute the eigenvalues of the non-Hermitian matrix as compared to<a class=\"more-link\" href=\"https:\/\/www.ethanepperly.com\/index.php\/2024\/07\/02\/neat-randomized-algorithms-randdiag-for-rapidly-diagonalizing-normal-matrices\/\">Read more<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6,15],"tags":[],"class_list":["post-1852","post","type-post","status-publish","format-standard","hentry","category-expository","category-neat-randomized-algorithms"],"_links":{"self":[{"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/posts\/1852","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/comments?post=1852"}],"version-history":[{"count":7,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/posts\/1852\/revisions"}],"predecessor-version":[{"id":1859,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/posts\/1852\/revisions\/1859"}],"wp:attachment":[{"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/media?parent=1852"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/categories?post=1852"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.ethanepperly.com\/index.php\/wp-json\/wp\/v2\/tags?post=1852"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}