Sparse Signal Recovery Based on Latent Variable Bayesian Models
DOI: 10.11999/JEIT140169  CSTR: 32379.14.JEIT140169
Funding:
Supported by the National Natural Science Foundation of China (61379104)
Abstract: From a Bayesian perspective, this paper reveals the intrinsic connections among a class of commonly used sparse recovery algorithms, including Sparse Bayesian Learning (SBL) with latent variable models, regularized FOCUSS (R-FOCUSS), and Log-Sum minimization. The analysis shows that, as an instance of latent variable Bayesian models, SBL operates in the latent variable space via Type-II Maximum Likelihood (Type-II ML) estimation; it can be viewed as a more general and flexible approach, and it offers an avenue for improvement when seeking sparse solutions to underdetermined, ill-posed inverse problems. Simulation results confirm the superior performance of SBL compared with state-of-the-art sparse methods based on Type-I Maximum Likelihood (Type-I ML).
Keywords:
- Signal processing
- Latent variable Bayesian models
- Type-II maximum likelihood
- Sparse Bayesian learning
- Iterative reweighted least squares
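To make the Type-II ML idea from the abstract concrete, the following is a minimal sketch of EM-based sparse Bayesian learning for the model y = Φx + n with independent Gaussian priors x_i ~ N(0, γ_i). It is illustrative only: the function name `sbl_em`, the fixed noise variance `lam`, the iteration count, and the synthetic-data usage are assumptions for this sketch, not the authors' implementation.

```python
# Minimal SBL sketch (Type-II ML via EM), assuming y = Phi @ x + noise
# with known noise variance `lam`. Hyperparameters gamma_i (prior
# variances of x_i) are learned; small gamma_i drive entries of x to zero.
import numpy as np

def sbl_em(Phi, y, lam=1e-3, n_iter=200, prune=1e-8):
    """Recover a sparse x from y = Phi @ x + n by EM-based SBL."""
    M, N = Phi.shape
    gamma = np.ones(N)                       # prior variances of the weights
    for _ in range(n_iter):
        Gamma = np.diag(gamma)
        # Marginal covariance of y under the current hyperparameters
        Sigma_y = lam * np.eye(M) + Phi @ Gamma @ Phi.T
        K = Gamma @ Phi.T @ np.linalg.inv(Sigma_y)
        mu = K @ y                           # posterior mean of x
        Sigma = Gamma - K @ Phi @ Gamma      # posterior covariance of x
        # Standard EM update of the hyperparameters
        gamma = mu**2 + np.diag(Sigma)
        gamma[gamma < prune] = prune         # numerical floor (pruned weights)
    return mu

# Usage sketch on synthetic data: a 20 x 80 underdetermined system, 4 nonzeros.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 80))
x_true = np.zeros(80)
x_true[rng.choice(80, 4, replace=False)] = rng.standard_normal(4)
y = Phi @ x_true + 0.01 * rng.standard_normal(20)
x_hat = sbl_em(Phi, y, lam=1e-4)
wrong = np.setdiff1d(np.argsort(-np.abs(x_hat))[:4], np.flatnonzero(x_true))
print("indices wrongly selected:", wrong)
```

The EM update `gamma = mu**2 + diag(Sigma)` could equally be replaced by MacKay-style fixed-point updates or by re-estimating the noise variance at each iteration; the sketch keeps the simplest variant to show how Type-II ML operates in the latent (hyperparameter) space rather than directly penalizing x as Type-I ML methods such as R-FOCUSS or Log-Sum do.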