Theorem 1
Each submatrix \(\mathbf{A}_{N}^{v,v}\), \(\mathbf{A}_{N}^{v,w}\), and \(\mathbf{A}_{N}^{w,w}\) has BTTB structure. More precisely, each matrix can be expressed as an \((N_{y}-1)\)-by-\((N_{y}-1)\) block banded Toeplitz matrix with block bandwidth \(2L+1\):
$$\mathbf{A}_{N}^{I}=\begin{pmatrix}\mathbf{T}_{0}^{I}&\cdots&\mathbf{T}_{L}^{I}&\mathbf{0}&\cdots&\mathbf{0}\\\vdots&\ddots&&\ddots&\ddots&\vdots\\\mathbf{T}_{-L}^{I}&&\ddots&&\ddots&\mathbf{0}\\\mathbf{0}&\ddots&&\ddots&&\mathbf{T}_{L}^{I}\\\vdots&\ddots&\ddots&&\ddots&\vdots\\\mathbf{0}&\cdots&\mathbf{0}&\mathbf{T}_{-L}^{I}&\cdots&\mathbf{T}_{0}^{I}\end{pmatrix}$$
(15)
where the superscript \(I=(v,v),(v,w)\), or \((w,w)\). Moreover, each matrix block \(\mathbf{T}_{j}^{I}\), with \(-L\leq j\leq L\), is an \((N_{x}-1)\)-by-\((N_{x}-1)\) banded Toeplitz matrix with bandwidth \(2K+1\):
$$\mathbf{T}_{j}^{I}=\begin{pmatrix}t_{0,j}^{I}&\cdots&t_{K,j}^{I}&0&\cdots&0\\\vdots&\ddots&&\ddots&\ddots&\vdots\\t_{-K,j}^{I}&&\ddots&&\ddots&0\\0&\ddots&&\ddots&&t_{K,j}^{I}\\\vdots&\ddots&\ddots&&\ddots&\vdots\\0&\cdots&0&t_{-K,j}^{I}&\cdots&t_{0,j}^{I}\end{pmatrix}$$
(16)
Proof
We only study \(\mathbf{A}_{N}^{v,v}\); the analysis of \(\mathbf{A}_{N}^{v,w}\) and \(\mathbf{A}_{N}^{w,w}\) is similar. By expanding the matrix \(\mathbf{A}_{N}^{v,v}\), we readily find that \(\mathbf{A}_{N}^{v,v}\) can be written in the following form:
$$\mathbf{A}_{N}^{v,v}=\begin{pmatrix}{\mathbf{B}}_{1,1}&{\mathbf{B}}_{1,2}&\cdots&{\mathbf{B}}_{1,N_{y}-1}\\{\mathbf{B}}_{2,1}&{\mathbf{B}}_{2,2}&\cdots&{\mathbf{B}}_{2,N_{y}-1}\\\vdots&\vdots&\ddots&\vdots\\{\mathbf{B}}_{N_{y}-1,1}&{\mathbf{B}}_{N_{y}-1,2}&\cdots&{\mathbf{B}}_{N_{y}-1,N_{y}-1}\end{pmatrix}$$
(17)
Here, each block matrix \({\mathbf{B}}_{j,j'}\) of order \(N_{x}-1\) represents the interaction between row \(j\) and row \(j'\) in the discrete system, with \(1\leq j\leq N_{y}-1\) and \(1\leq j'\leq N_{y}-1\). From (11) and (13), we find that the entries \(A^{v,v}_{m,n}\neq 0\) if and only if
$$\operatorname{supp}(\phi_{i',j'})\cap B_{\delta}(x_{i},y_{j})\neq\emptyset$$
(18)
Hence, all matrix blocks \({\mathbf{B}}_{j,j'}\) with \(|j-j'|>L\) in (17) vanish. Then \(\mathbf{A}_{N}^{v,v}\) is a \((2L+1)\) block banded matrix and can be expressed in the following form:
$$\mathbf{A}_{N}^{v,v}=\begin{pmatrix}{\mathbf{B}}_{1,1}&\cdots&{\mathbf{B}}_{1,L+1}&\mathbf{0}&\cdots&\mathbf{0}\\\vdots&\ddots&&\ddots&\ddots&\vdots\\{\mathbf{B}}_{L+1,1}&&\ddots&&\ddots&\mathbf{0}\\\mathbf{0}&\ddots&&\ddots&&{\mathbf{B}}_{N_{y}-L-1,N_{y}-1}\\\vdots&\ddots&\ddots&&\ddots&\vdots\\\mathbf{0}&\cdots&\mathbf{0}&{\mathbf{B}}_{N_{y}-1,N_{y}-L-1}&\cdots&{\mathbf{B}}_{N_{y}-1,N_{y}-1}\end{pmatrix}$$
(19)
Furthermore, within each matrix block \({\mathbf{B}}_{j,j'}\) we have \(A^{v,v}_{m,n}=0\) for \(|i-i'|>K\); that is, \({\mathbf{B}}_{j,j'}\) is a \((2K+1)\) banded matrix expressed as
$$\mathbf{B}_{j,j'}=\begin{pmatrix}b_{j,j'}^{1,1}&\cdots&b_{j,j'}^{1,K+1}&0&\cdots&0\\\vdots&\ddots&&\ddots&\ddots&\vdots\\b_{j,j'}^{K+1,1}&&\ddots&&\ddots&0\\0&\ddots&&\ddots&&b_{j,j'}^{N_{x}-K-1,N_{x}-1}\\\vdots&\ddots&\ddots&&\ddots&\vdots\\0&\cdots&0&b_{j,j'}^{N_{x}-1,N_{x}-K-1}&\cdots&b_{j,j'}^{N_{x}-1,N_{x}-1}\end{pmatrix}$$
(20)
By introducing the following translation:
$$\xi_{1}=x'-x_{i},\qquad\xi_{2}=y'-y_{j}$$
(21)
the entry \(A_{m,n}^{v,v}\) given in (11) can be simplified to
$$A_{m,n}^{v,v}=\int_{B_{\delta}(0,0)}\frac{\xi_{1}^{2}\sigma(\Vert(\xi_{1},\xi_{2})\Vert)}{\xi_{1}^{2}+\xi_{2}^{2}}\biggl(\delta_{m,n}-\psi\biggl(\frac{\xi_{1}-x_{i'-i}}{h_{x}}\biggr)\psi\biggl(\frac{\xi_{2}-y_{j'-j}}{h_{y}}\biggr)\biggr)\,d\xi_{1}\,d\xi_{2}$$
(22)
Let \(j_{1}'-j_{1}=j_{2}'-j_{2}=l\), \(-L\leq l\leq L\), and let
$$\begin{aligned}&m_{1}=(j_{1}-1)(N_{x}-1)+i,\quad 1\leq i\leq N_{x}-1, 1\leq j_{1}\leq N_{y}-1,\\&n_{1}=\bigl(j'_{1}-1\bigr)(N_{x}-1)+i',\quad 1\leq i'\leq N_{x}-1, 1\leq j'_{1}\leq N_{y}-1,\\&m_{2}=(j_{2}-1)(N_{x}-1)+i,\quad 1\leq i\leq N_{x}-1, 1\leq j_{2}\leq N_{y}-1,\\&n_{2}=\bigl(j'_{2}-1\bigr)(N_{x}-1)+i',\quad 1\leq i'\leq N_{x}-1, 1\leq j'_{2}\leq N_{y}-1.\end{aligned}$$
(23)
Then we observe that, since \(1\leq i,i'\leq N_{x}-1\),
$$\begin{aligned}b_{j_{1},j'_{1}}^{i,i'}&=A_{m_{1},n_{1}}^{v,v}\\&=\int_{B_{\delta}(0,0)}\frac{\xi_{1}^{2}\sigma(\Vert(\xi_{1},\xi_{2})\Vert)}{\xi_{1}^{2}+\xi_{2}^{2}}\biggl(\delta_{m_{1},n_{1}}-\psi\biggl(\frac{\xi_{1}-x_{i'-i}}{h_{x}}\biggr)\psi\biggl(\frac{\xi_{2}-y_{j'_{1}-j_{1}}}{h_{y}}\biggr)\biggr)\,d\xi_{1}\,d\xi_{2}\\&=\int_{B_{\delta}(0,0)}\frac{\xi_{1}^{2}\sigma(\Vert(\xi_{1},\xi_{2})\Vert)}{\xi_{1}^{2}+\xi_{2}^{2}}\biggl(\delta_{m_{2},n_{2}}-\psi\biggl(\frac{\xi_{1}-x_{i'-i}}{h_{x}}\biggr)\psi\biggl(\frac{\xi_{2}-y_{j'_{2}-j_{2}}}{h_{y}}\biggr)\biggr)\,d\xi_{1}\,d\xi_{2}\\&=A_{m_{2},n_{2}}^{v,v}=b_{j_{2},j'_{2}}^{i,i'}.\end{aligned}$$
(24)
From (24), we have shown that \(\mathbf{B}_{j_{1},j'_{1}}=\mathbf{B}_{j_{2},j'_{2}}\) whenever the block matrices \(\mathbf{B}_{j_{1},j'_{1}}\) and \(\mathbf{B}_{j_{2},j'_{2}}\) lie on the same block diagonal.
Let \(i_{3}'-i_{3}=i_{4}'-i_{4}=k\), \(-K\leq k\leq K\), and let
$$\begin{aligned}&m_{3}=(j-1)(N_{x}-1)+i_{3},\quad 1\leq i_{3}\leq N_{x}-1, 1\leq j\leq N_{y}-1,\\&n_{3}=\bigl(j'-1\bigr)(N_{x}-1)+i'_{3},\quad 1\leq i'_{3}\leq N_{x}-1, 1\leq j'\leq N_{y}-1,\\&m_{4}=(j-1)(N_{x}-1)+i_{4},\quad 1\leq i_{4}\leq N_{x}-1, 1\leq j\leq N_{y}-1,\\&n_{4}=\bigl(j'-1\bigr)(N_{x}-1)+i'_{4},\quad 1\leq i'_{4}\leq N_{x}-1, 1\leq j'\leq N_{y}-1.\end{aligned}$$
(25)
Then we observe that
$$\begin{aligned}b_{j,j'}^{i_{3},i'_{3}}&=A_{m_{3},n_{3}}^{v,v}\\&=\int_{B_{\delta}(0,0)}\frac{\xi_{1}^{2}\sigma(\Vert(\xi_{1},\xi_{2})\Vert)}{\xi_{1}^{2}+\xi_{2}^{2}}\biggl(\delta_{m_{3},n_{3}}-\psi\biggl(\frac{\xi_{1}-x_{i'_{3}-i_{3}}}{h_{x}}\biggr)\psi\biggl(\frac{\xi_{2}-y_{j'-j}}{h_{y}}\biggr)\biggr)\,d\xi_{1}\,d\xi_{2}\\&=\int_{B_{\delta}(0,0)}\frac{\xi_{1}^{2}\sigma(\Vert(\xi_{1},\xi_{2})\Vert)}{\xi_{1}^{2}+\xi_{2}^{2}}\biggl(\delta_{m_{4},n_{4}}-\psi\biggl(\frac{\xi_{1}-x_{i'_{4}-i_{4}}}{h_{x}}\biggr)\psi\biggl(\frac{\xi_{2}-y_{j'-j}}{h_{y}}\biggr)\biggr)\,d\xi_{1}\,d\xi_{2}\\&=A_{m_{4},n_{4}}^{v,v}=b_{j,j'}^{i_{4},i'_{4}}.\end{aligned}$$
(26)
From (26), we conclude that each block matrix \(\mathbf{B}_{j,j'}\) is a banded Toeplitz matrix. Combining (24) and (26), we complete the proof. □
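The banded BTTB structure established in Theorem 1 is easy to check numerically. The following sketch (our own illustration, not code from the paper) assembles the dense matrix of (15)-(16) from a \((2K+1)\)-by-\((2L+1)\) generator array, using 0-based indices:

```python
import numpy as np

def bttb_from_generator(G, Nx, Ny, K, L):
    """Assemble the dense BTTB matrix of (15)-(16).

    G[K + k, L + l] holds the generating entry t_{k,l}; the result is an
    (Ny-1)x(Ny-1) block banded Toeplitz matrix (block bandwidth 2L+1)
    whose blocks are (Nx-1)x(Nx-1) banded Toeplitz (bandwidth 2K+1).
    Dense assembly is for illustration only; it is never needed in practice.
    """
    nx, ny = Nx - 1, Ny - 1
    A = np.zeros((nx * ny, nx * ny))
    for j in range(ny):              # block row (0-based)
        for jp in range(ny):         # block column
            l = jp - j
            if abs(l) > L:           # blocks outside the band vanish
                continue
            for i in range(nx):
                for ip in range(nx):
                    k = ip - i
                    if abs(k) <= K:  # entries outside the band vanish
                        A[j * nx + i, jp * nx + ip] = G[K + k, L + l]
    return A
```

Any two entries with the same offsets \((i'-i, j'-j)\) coincide, which is exactly the Toeplitz property proved via (24) and (26).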
Corollary 1
The stiffness matrix \(\mathbf{A}_{2N}\) can be stored in \(O(N)\) memory.
Proof
From the structure of \(\mathbf{A}_{2N}\), it suffices to show that the block matrix \(\mathbf{A}_{N}^{v,v}\) can be stored in \(O(N)\) memory. From Theorem 1, the matrix \(\mathbf{A}_{N}^{v,v}\) can be stored by keeping only the following \((2K+1)\)-by-\((2L+1)\) entries:
$$\mathbf{G}=\begin{pmatrix}t_{-K,-L}^{v,v}&\cdots&t_{-K,0}^{v,v}&\cdots&t_{-K,L}^{v,v}\\\vdots&\ddots&\vdots&\ddots&\vdots\\t_{0,-L}^{v,v}&\cdots&t_{0,0}^{v,v}&\cdots&t_{0,L}^{v,v}\\\vdots&\ddots&\vdots&\ddots&\vdots\\t_{K,-L}^{v,v}&\cdots&t_{K,0}^{v,v}&\cdots&t_{K,L}^{v,v}\end{pmatrix}$$
(27)
Hence, \(\mathbf{A}_{2N}\) can be stored in \(O(4(2K+1)(2L+1))=O(N)\) memory. □
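As an illustration of Corollary 1, any entry of \(\mathbf{A}_{N}^{v,v}\) can be recovered in \(O(1)\) time from the stored array \(\mathbf{G}\) of (27) alone. The helper below is a hypothetical sketch using the 0-based analogue of the indexing in (23):

```python
def bttb_entry(G, m, n, nx, K, L):
    """Return entry (m, n) of the BTTB matrix stored only via its
    (2K+1)-by-(2L+1) generator G, where G[K + k, L + l] = t_{k,l}.
    Indices are 0-based: m = j*nx + i with nx = N_x - 1, so only the
    generator array -- not the full matrix -- needs to be kept in memory."""
    i, j = m % nx, m // nx           # recover (i, j) from the row index
    ip, jp = n % nx, n // nx         # recover (i', j') from the column index
    k, l = ip - i, jp - j            # Toeplitz offsets
    if abs(k) > K or abs(l) > L:     # outside the band: entry is zero
        return 0.0
    return G[K + k, L + l]
```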
Corollary 2
For any vector \(\mathbf{u}\in\mathbb{R}^{2N}\), the matrix-vector multiplication \(\mathbf{A}_{2N}\mathbf{u}\) can be performed in \(O(N\log N)\) operations.
Proof
We split the vector \(\mathbf{u}\) into two halves. That is,
$$\mathbf{u}=\begin{pmatrix}\mathbf{u}_{1}\\\mathbf{u}_{2}\end{pmatrix}$$
(28)
where \(\mathbf{u}_{1},\mathbf{u}_{2}\in\mathbb{R}^{N}\). Hence
$$\mathbf{A}_{2N}\mathbf{u}=\begin{pmatrix}\mathbf{A}_{N}^{v,v}&\mathbf{A}_{N}^{v,w}\\\mathbf{A}_{N}^{v,w}&\mathbf{A}_{N}^{w,w}\end{pmatrix}\begin{pmatrix}\mathbf{u}_{1}\\\mathbf{u}_{2}\end{pmatrix}=\begin{pmatrix}\mathbf{A}_{N}^{v,v}\mathbf{u}_{1}+\mathbf{A}_{N}^{v,w}\mathbf{u}_{2}\\\mathbf{A}_{N}^{v,w}\mathbf{u}_{1}+\mathbf{A}_{N}^{w,w}\mathbf{u}_{2}\end{pmatrix}$$
(29)
Due to the BTTB structure of the matrices \(\mathbf{A}_{N}^{v,v}\), \(\mathbf{A}_{N}^{v,w}\), and \(\mathbf{A}_{N}^{w,w}\), the matrix-vector multiplications \(\mathbf{A}_{N}^{v,v}\mathbf{u}_{1}\), \(\mathbf{A}_{N}^{v,w}\mathbf{u}_{2}\), \(\mathbf{A}_{N}^{v,w}\mathbf{u}_{1}\), and \(\mathbf{A}_{N}^{w,w}\mathbf{u}_{2}\) can each be computed in \(O(N\log N)\) operations [19]. Hence, the total matrix-vector multiplication \(\mathbf{A}_{2N}\mathbf{u}\) can be computed in \(O(4N\log N)=O(N\log N)\) operations. □
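The \(O(N\log N)\) products behind Corollary 2 are typically realized by embedding each BTTB matrix into a block circulant matrix with circulant blocks (BCCB), which 2-D FFTs diagonalize; see [19]. Below is a minimal sketch under our own naming (not the authors' code), taking the generator array \(\mathbf{G}\) of (27) as input:

```python
import numpy as np

def bttb_matvec(G, u, nx, ny, K, L):
    """Multiply the BTTB matrix generated by G with u in O(N log N) time.

    G[K + k, L + l] = t_{k,l}; u has length nx*ny, ordered block-wise as
    in (17) (0-based m = j*nx + i).  The BTTB matrix is embedded into a
    BCCB matrix of twice the size in each direction, so the product
    becomes a 2-D circular convolution computed with FFTs.
    """
    mx, my = 2 * nx, 2 * ny
    # First column of the BCCB embedding, stored as a 2-D array:
    # entry t_{k,l} sits at row (-l) mod my, column (-k) mod mx.
    C = np.zeros((my, mx))
    for l in range(-L, L + 1):
        for k in range(-K, K + 1):
            C[(-l) % my, (-k) % mx] = G[K + k, L + l]
    V = np.zeros((my, mx))
    V[:ny, :nx] = u.reshape(ny, nx)          # zero-padded input vector
    # Circular convolution via 2-D FFTs, then crop back to the BTTB part.
    W = np.fft.ifft2(np.fft.fft2(C) * np.fft.fft2(V)).real
    return W[:ny, :nx].ravel()
```

Each of the four block products in (29) is one such call, so the total cost remains four FFT-based multiplications on vectors of length \(N\).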