Decentralized Jointly Sparse Optimization by Reweighted Lq Minimization


  • Decentralized Jointly Sparse Optimization by Reweighted Lq Minimization

    Qing Ling, Department of Automation, University of Science and Technology of China

    Joint work with Zaiwen Wen (SJTU) and Wotao Yin (RICE)

    2012/09/05

  • A brief introduction to my research interest

    Optimization and control in networked multi-agent systems

    Autonomous agents:
    - collect data
    - process data
    - communicate

    Problem: how to efficiently accomplish in-network optimization and control tasks through the collaboration of agents?

  • Large-scale wireless sensor networks: decentralized signal processing, node localization, sensor selection

    - How to fuse big sensory data? e.g., structural health monitoring
    - How to localize blind nodes with anchor nodes?
    - How to assign sensors to targets?

    Difficulty in data transmission -> decentralized optimization without any fusion center

  • Computer/server networks with big data: collaborative data mining

    New challenges in the big data era:
    - big data is stored in distributed computers/servers
    - data transmission is prohibited due to bandwidth/privacy/...
    - computers/servers collaborate to do data mining -> distributed/decentralized optimization

  • Wireless sensor and actuator networks: with application in large-scale greenhouse control

    Decentralized control system design:
    - wireless sensing: temperature, humidity
    - wireless actuating: circulating fan, wet curtain

    Disadvantages of traditional centralized control:
    - communication burden in collecting distributed sensory data
    - lack of robustness due to packet loss, time delay, ...

  • Recent works

    Wireless sensor networks:
    - decentralized signal processing with application in SHM
    - decentralized node localization using SDP and SOCP
    - decentralized sensor node selection for target tracking

    Collaborative data mining:
    - decentralized approaches to jointly sparse signal recovery
    - decentralized approaches to matrix completion

    Wireless sensor and actuator networks:
    - modeling, hardware design, controller design, prototype

    Theoretical issues:
    - convergence and convergence rate analysis

  • Decentralized Jointly Sparse Optimization by Reweighted Lq Minimization

    Qing Ling, Department of Automation, University of Science and Technology of China

    Joint work with Zaiwen Wen (SJTU) and Wotao Yin (RICE)

    2012/09/05

  • Outline

    Background: decentralized jointly sparse optimization with applications

    Roadmap: nonconvex versus convex, difficulty in decentralized computing

    Algorithm development: successive linearization, inexact average consensus

    Simulation and conclusion

  • Background (I): jointly sparse optimization

    Structured signals
    - A sparse signal: only a few elements are nonzero
    - Jointly sparse signals: sparse, with the same nonzero supports

    Jointly sparse optimization: to recover X from linear measurements
    (figure: jointly sparse X with a few nonzero rows, measurement matrix, measurement noise)
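
    To make the measurement model concrete, a small synthetic-data sketch (the Gaussian matrices, Gaussian noise, and default dimensions are assumptions, loosely matching the simulation settings later in the talk):

```python
import numpy as np

def generate_jointly_sparse_data(L=50, N=20, K=2, M=10, noise_std=0.01, seed=0):
    """Generate jointly sparse signals x(1),...,x(L) (shared support of size K)
    and linear measurements y(i) = A(i) x(i) + e(i) for each agent i."""
    rng = np.random.default_rng(seed)
    support = rng.choice(N, size=K, replace=False)        # common nonzero support
    X = np.zeros((N, L))
    X[support, :] = rng.standard_normal((K, L))           # different values, same support
    A = [rng.standard_normal((M, N)) for _ in range(L)]   # per-agent measurement matrices
    y = [A[i] @ X[:, i] + noise_std * rng.standard_normal(M) for i in range(L)]
    return A, y, X
```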

  • Background (II): decentralized jointly sparse optimization

    Decentralized computing in a network
    - Distributed data in distributed agents & no fusion center
    - Consideration of privacy, difficulty in data collection, etc.

    Goal: agent i has y(i) and A(i), and recovers x(i) through collaboration -> decentralized jointly sparse optimization

  • Background (III): applications

    Cooperative spectrum sensing [1][2]
    - Cognitive radios sense jointly sparse spectra {x(i)}
    - Measure from the time domain [1] or through a frequency-selective filter [2]
    - Decentralized recovery from {y(i) = A(i)x(i)}

    Decentralized event detection [3]
    - Sensors {i} sense a few targets represented by jointly sparse {x(i)}
    - Decentralized recovery from {y(i) = A(i)x(i)}

    Collaborative data mining, distributed human action recognition, etc.

    [1] F. Zeng, C. Li, and Z. Tian, "Distributed compressive spectrum sensing in cooperative multi-hop wideband cognitive networks," IEEE Journal of Selected Topics in Signal Processing, vol. 5, pp. 37-48, 2011
    [2] J. Meng, W. Yin, H. Li, E. Hossain, and Z. Han, "Collaborative spectrum sensing from sparse observations in cognitive radio networks," IEEE Journal on Selected Areas in Communications, vol. 29, pp. 327-337, 2011
    [3] N. Nguyen, N. Nasrabadi, and T. Tran, "Robust multi-sensor classification via joint sparse representation," submitted to Journal of Advances in Information Fusion

  • Roadmap (I): nonconvex versus convex

    Convex model: group lasso, i.e., L21-norm minimization (with a regularization parameter balancing data fidelity and joint sparsity)

    Nonconvex versus convex
    - Convex: with global convergence guarantee
    - Nonconvex: often with better recovery performance

    Look back on nonconvex models to recover a single sparse signal
    - Reweighted L1/L2 norm minimization [4][5]
    - Reweighted algorithms for jointly sparse optimization?

    [4] E. Candes, M. Wakin, and S. Boyd, "Enhancing sparsity by reweighted L1 minimization," Journal of Fourier Analysis and Applications, vol. 14, pp. 877-905, 2008
    [5] R. Chartrand and W. Yin, "Iteratively reweighted algorithms for compressive sensing," in Proceedings of ICASSP, 2008
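
    For reference, the convex L21 model and the reweighted-L1 scheme of [4] can be written as follows (the data-fidelity form and the parameters λ, ε are generic choices, not necessarily the slide's):

```latex
% Convex joint-sparsity model (group lasso / L21 norm); X_n is the n-th row of X = [x(1),...,x(L)]
\min_{X}\;\frac{1}{2}\sum_{i=1}^{L}\big\|A(i)x(i)-y(i)\big\|_2^2
\;+\;\lambda\sum_{n=1}^{N}\|X_n\|_2

% Reweighted L1 for a single sparse signal [4]: alternate between
x^{t+1}=\arg\min_{x}\;\sum_{n=1}^{N}w_n^{t}\,|x_n|\;\;\text{s.t.}\;\;Ax=y,
\qquad
w_n^{t+1}=\frac{1}{|x_n^{t+1}|+\varepsilon}
```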

  • Roadmap (II): difficulty in decentralized computing

    A popular decentralized computing technique: consensus
    - The common optimization variable is replaced by a local copy in each agent i, paired with the objective function of agent i
    - Neighboring copies are constrained to be equal
    - Obviously, the two problems are equivalent for a connected network
    - Efficient algorithms (ADM, SGD, etc.) exist for the consensus form if it is convex [6]

    Nothing for consensus in jointly sparse optimization!
    - Signals are different; common supports bring nonconvexity

    [6] D. Bertsekas and J. Tsitsiklis, Parallel and Distributed Computation: Numerical Methods, Second Edition, Athena Scientific, 1997
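
    A sketch of the consensus reformulation described above, with f_i denoting agent i's local objective and E the edge set of the network:

```latex
% Original problem with a common variable x:
\min_{x}\;\sum_{i=1}^{L} f_i(x)

% Consensus reformulation with local copies x(i), equivalent on a connected network:
\min_{x(1),\dots,x(L)}\;\sum_{i=1}^{L} f_i\big(x(i)\big)
\quad\text{s.t.}\quad x(i)=x(j),\;\;\forall (i,j)\in\mathcal{E}
```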

  • Roadmap (III): solution overview

    Nonconvex model + convex decentralized computing subproblem
    - Nonconvex model -> successive linearization -> reweighted Lq
    - Natural decentralized computing, one nontrivial subproblem
    - Inexactly solving the subproblem still leads to good recovery

  • Algorithm (I): successive linearization

    Nonconvex model (q = 1 or 2), with a regularization parameter and a smoothing parameter

    Successive linearization of the joint sparsity term at iteration t
    - Actually a majorization-minimization approach
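
    The equations from this slide are not preserved in the transcript. As one plausible instance of such a model, with regularization parameter λ and smoothing parameter ε (both assumed), a log-sum joint-sparsity penalty and its linearization at iteration t could read:

```latex
% Nonconvex model; X_n collects the n-th entries of x(1),...,x(L)
\min_{X}\;\frac{1}{2}\sum_{i=1}^{L}\|A(i)x(i)-y(i)\|_2^2
\;+\;\lambda\sum_{n=1}^{N}\log\!\big(\|X_n\|_q^q+\varepsilon\big)

% Linearizing the concave log(.) at X^t majorizes it and gives a weighted (reweighted) Lq problem:
\min_{X}\;\frac{1}{2}\sum_{i=1}^{L}\|A(i)x(i)-y(i)\|_2^2
\;+\;\lambda\sum_{n=1}^{N}u_n^{t}\,\|X_n\|_q^q,
\qquad u_n^{t}=\frac{1}{\|X_n^{t}\|_q^q+\varepsilon}
```

    Since ||X_n||_q^q = sum_i |x_n(i)|^q, the linearized penalty separates across agents, which is what makes the x-update naturally decentralized.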

  • Algorithm (II): reweighted algorithm

    Centralized reweighted Lq minimization algorithm
    - Updating the weight vector u = [u1; u2; ...; uN]
    - Updating the signals

    From a decentralized implementation perspective
    - Natural decentralized computing in the x-update
    - One subproblem needs a decentralized solution in the u-update
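
    A minimal centralized sketch of this alternation, assuming q = 2 and the separable linearized penalty from the sketch above, so the x-update reduces to per-agent weighted ridge regressions (parameter values are illustrative, not the talk's):

```python
import numpy as np

def reweighted_lq(A, y, lam=0.1, eps=1e-2, outer=20):
    """Centralized reweighted Lq sketch (q = 2) for jointly sparse recovery.

    A: list of L measurement matrices (M x N); y: list of L measurement vectors (length M).
    Returns X of shape (N, L); column i is the estimate of x(i)."""
    L, N = len(A), A[0].shape[1]
    X = np.zeros((N, L))
    for t in range(outer):
        # u-update: reweighting from the current cross-agent row energies ||X_n||_2^2
        u = 1.0 / (np.sum(X ** 2, axis=1) + eps)              # shape (N,)
        # x-update: with the penalty lam * sum_n u_n * sum_i x_n(i)^2, the problem
        # separates over agents into weighted ridge regressions with closed-form solutions
        for i in range(L):
            X[:, i] = np.linalg.solve(A[i].T @ A[i] + 2.0 * lam * np.diag(u), A[i].T @ y[i])
    return X
```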

  • Algorithm (III): average consensus

    Check the u-update: an average consensus problem
    Rewrite it in more familiar forms
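
    The rewritten forms are not preserved in the transcript; with the weights defined as in the sketches above (an assumption), the averaging structure of the u-update can be seen as:

```latex
u_n=\frac{1}{\|X_n\|_q^q+\varepsilon},
\qquad
\|X_n\|_q^q=\sum_{i=1}^{L}|x_n(i)|^q
= L\cdot\underbrace{\frac{1}{L}\sum_{i=1}^{L}|x_n(i)|^q}_{\text{average over the agents}}
```

    Computing u_n therefore amounts to averaging the local quantities |x_n(i)|^q across the network, i.e., an average consensus problem.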

  • Algorithm (IV): inexact average consensus

    Solve the average consensus problem with ADM (time t, slot s/S)
    - Updating the weight vectors (local copies)
    - Updating the Lagrange multipliers (c is a positive constant)

    Exact average consensus versus inexact average consensus
    - Exact average consensus: exact implementation of reweighted Lq
    - Introducing inner loops: cost of coordination & communication
    - Inexact average consensus: one iteration in the inner loop
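
    The update formulas on this slide are not preserved. As an illustration of ADM-based average consensus in this setting, here is a standard decentralized-ADMM averaging sketch (the quadratic local objective, the multiplier bookkeeping, and the constant c are assumptions, not necessarily the talk's exact updates):

```python
import numpy as np

def admm_average_consensus(r, neighbors, c=1.0, S=1):
    """Decentralized averaging of local vectors r[i] via consensus ADMM; S inner slots.

    Solves: min sum_i 0.5*||z_i - r_i||^2  s.t.  z_i = z_j for all neighboring pairs (i, j)."""
    L = len(r)
    z = [ri.copy() for ri in r]                 # local copies
    alpha = [np.zeros_like(ri) for ri in r]     # aggregated Lagrange multipliers
    deg = [len(neighbors[i]) for i in range(L)]
    for _ in range(S):
        # primal update: combine the agent's own value with neighbors' current copies
        z_new = [
            (r[i] - alpha[i] + c * sum(z[i] + z[j] for j in neighbors[i])) / (1.0 + 2.0 * c * deg[i])
            for i in range(L)
        ]
        # dual update: penalize remaining disagreement with neighbors
        alpha = [
            alpha[i] + c * sum(z_new[i] - z_new[j] for j in neighbors[i])
            for i in range(L)
        ]
        z = z_new
    return z   # each z[i] approaches the network average of {r[i]}
```

    With S large each z[i] converges to the exact average (exact consensus); the inexact variant in the talk corresponds to running a single inner slot (S = 1) per outer reweighting iteration.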

  • Algorithm (V): decentralized reweighted Lq

    Algorithm outline
    - Updating the weight vectors (local copies)
    - Updating the Lagrange multipliers (c is a positive constant)
    - Updating the signals
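
    Putting the pieces together, a rough per-agent sketch consistent with this outline, reusing the assumed updates from the previous sketches (the talk's exact update formulas are not preserved):

```python
import numpy as np

def decentralized_reweighted_lq(A, y, neighbors, lam=0.1, eps=1e-2, c=1.0, outer=50):
    """Per-agent sketch of decentralized reweighted Lq (q = 2) with inexact consensus.

    Agent i keeps (A[i], y[i]), its estimate x[i], a local copy w[i] of the averaged
    row energies, and multipliers alpha[i]; only neighbor communication is used."""
    L, N = len(A), A[0].shape[1]
    x = [np.zeros(N) for _ in range(L)]
    w = [np.zeros(N) for _ in range(L)]          # local copies of the averaged x_n(i)^2
    alpha = [np.zeros(N) for _ in range(L)]      # Lagrange multipliers of the consensus constraints
    deg = [len(neighbors[i]) for i in range(L)]
    for t in range(outer):
        # (1) weight-copy update: one inexact ADM consensus slot on the local squared entries
        r = [xi ** 2 for xi in x]
        w_new = [(r[i] - alpha[i] + c * sum(w[i] + w[j] for j in neighbors[i])) / (1.0 + 2.0 * c * deg[i])
                 for i in range(L)]
        # (2) multiplier update: penalize remaining disagreement between neighbors
        alpha = [alpha[i] + c * sum(w_new[i] - w_new[j] for j in neighbors[i]) for i in range(L)]
        w = w_new
        # (3) signal update: local weighted ridge regression using the local weight estimate
        for i in range(L):
            u_i = 1.0 / (L * np.maximum(w[i], 0.0) + eps)        # ||X_n||_2^2 ~ L * average
            x[i] = np.linalg.solve(A[i].T @ A[i] + 2.0 * lam * np.diag(u_i), A[i].T @ y[i])
    return x
```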

  • Simulation (I): simulation settings

    Network settings
    - L = 50 agents, randomly deployed in a 100 × 100 area
    - Communication range = 30, bidirectionally connected

    Measurement settings
    - Signal dimension N = 20, signal sparsity K = 2
    - Measurement dimension M = 10
    - Random measurement matrices and random measurement noise

    Parameter settings
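
    A sketch of how a network matching these settings could be generated (the retry-until-connected loop and the uniform deployment details are assumptions):

```python
import numpy as np

def random_geometric_network(L=50, area=100.0, comm_range=30.0, seed=0):
    """Deploy L agents uniformly in an area x area square; connect pairs within comm_range
    (bidirectional links). Returns positions and the neighbor lists used by the algorithm."""
    rng = np.random.default_rng(seed)
    while True:
        pos = rng.uniform(0.0, area, size=(L, 2))
        dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
        adj = (dist <= comm_range) & ~np.eye(L, dtype=bool)
        # connectivity check via graph traversal from agent 0
        visited, stack = {0}, [0]
        while stack:
            i = stack.pop()
            for j in np.flatnonzero(adj[i]):
                if j not in visited:
                    visited.add(j)
                    stack.append(j)
        if len(visited) == L:
            neighbors = [list(np.flatnonzero(adj[i])) for i in range(L)]
            return pos, neighbors
```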

  • Simulation (II): recovery performance

  • Simulation (III): convergence rate

  • Conclusion

    Decentralized jointly sparse optimization problem
    - Jointly sparse signal recovery in a distributed network

    Reweighted Lq minimization algorithms
    - Feature #1: nonconvex model
  • Thanks for your attention!