Emerging applications such as wearable electronics and the Internet of Things require wireless transmission of large datasets and therefore demand energy-efficient operation. Exactly digitizing and transmitting these data is energy costly. Most decision-making applications involving such data are statistical, and it is well known that for any statistical decision-making problem the posterior distribution is a sufficient statistic. For the aforementioned applications, it is therefore desirable to compute the posterior distribution locally and transmit only that information. However, computing the posterior, especially in high dimensions, is traditionally hard. Here we show that for a large class of problems we can compute the posterior distribution exactly using distributed optimization algorithms that can be stochastic and implemented in nonlinear circuits. Specifically, we consider applications such as imaging and spectroscopy in which the latent signal can be modeled as having a sparse “Laplacian” prior with respect to an appropriate basis, and the measurement model is linear and Gaussian. For this class of problems, we show that even in very high dimensions we can perform exact Bayesian inference simply by drawing independent samples and solving distributed convex optimization problems. We accomplish this by formulating Bayesian inference as the problem of finding a nonlinear map that exactly transforms samples from one distribution into samples from another. We show that this problem can be cast as a KL-divergence minimization and, moreover, that the resulting problem is convex. This convex problem can be solved exactly with low-energy analog circuits that obviate the need for a clock, enabling new performance-complexity tradeoffs for statistical decision-making in these applications.
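The map-finding step can be sketched in one dimension. With a Laplace prior and a single Gaussian measurement (both values below are hypothetical), we draw independent standard-normal reference samples and minimize a sample-average KL objective over a monotone affine map T(x) = a + b·x; this objective is convex in (a, b) because the target density is log-concave. The affine parameterization is only a minimal illustration, exact for Gaussian targets; the approach described in the text uses richer nonlinear maps.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical 1D setup: Laplace prior p(theta) ~ exp(-|theta|) and one
# Gaussian measurement y = theta + noise with known sigma.
y_obs, sigma = 1.5, 0.5

def neg_log_posterior(theta):
    # -log p(theta | y) up to an additive constant
    return np.abs(theta) + (y_obs - theta) ** 2 / (2 * sigma ** 2)

# Independent samples from the standard-normal reference distribution
x = rng.standard_normal(2000)

def kl_objective(params):
    """Sample-average KL divergence (up to a constant) between the
    pushforward of the reference through T(x) = a + b*x and the posterior:
    E[-log pi(T(x))] - log T'(x). Convex in (a, b) for log-concave pi."""
    a, b = params
    return np.mean(neg_log_posterior(a + b * x)) - np.log(b)

res = minimize(kl_objective, x0=[0.0, 1.0], method="L-BFGS-B",
               bounds=[(None, None), (1e-6, None)])
a, b = res.x
# Pushing the reference samples through the fitted map yields
# (approximate, since the map is affine) posterior samples.
posterior_samples = a + b * x
```

Because the objective is an expectation over fixed reference samples, the same structure extends to distributed settings: each worker averages the objective over its own batch of samples.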

Exact Bayesian inference was implemented in a distributed manner on the Parallella platform. The Parallella architecture features a 16-core Epiphany co-processor alongside a dual-core ARM host, making it an ideal candidate for implementing a scalable, distributed algorithm.
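As a software analogue of such a distributed solver, consensus ADMM for the Laplacian-prior (lasso) problem splits the measurement rows across workers, much as work might be split across the co-processor cores. The NumPy sketch below is a hypothetical illustration of this standard decomposition, not the authors' implementation; all names and parameters are assumptions.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of the l1 norm
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def distributed_lasso_admm(A_blocks, y_blocks, lam=0.1, rho=1.0, iters=200):
    """Consensus ADMM for min sum_i 0.5*||A_i x - y_i||^2 + lam*||x||_1.
    Each 'core' i holds (A_i, y_i) and solves a local least-squares
    subproblem; a soft-threshold consensus step couples the workers."""
    n = A_blocks[0].shape[1]
    N = len(A_blocks)
    x = [np.zeros(n) for _ in range(N)]   # local primal variables
    u = [np.zeros(n) for _ in range(N)]   # scaled dual variables
    z = np.zeros(n)                       # consensus variable
    # Pre-factor each local system (A_i^T A_i + rho*I) once
    invs = [np.linalg.inv(A.T @ A + rho * np.eye(n)) for A in A_blocks]
    for _ in range(iters):
        for i in range(N):  # local updates (parallel across cores)
            q = A_blocks[i].T @ y_blocks[i] + rho * (z - u[i])
            x[i] = invs[i] @ q
        # Consensus update: average, then soft-threshold
        z = soft_threshold(np.mean(x, axis=0) + np.mean(u, axis=0),
                           lam / (rho * N))
        for i in range(N):  # dual updates
            u[i] += x[i] - z
    return z

# Hypothetical demo: recover a sparse signal from rows split across 4 workers
rng = np.random.default_rng(1)
x_true = np.zeros(8)
x_true[1], x_true[5] = 1.0, -0.7
A_blocks = [rng.standard_normal((15, 8)) for _ in range(4)]
y_blocks = [A @ x_true + 0.01 * rng.standard_normal(15) for A in A_blocks]
z_hat = distributed_lasso_admm(A_blocks, y_blocks, lam=0.05, iters=300)
```

Only the consensus variable and dual averages need to be exchanged between workers each iteration, which is what makes this decomposition attractive on a multi-core platform with limited inter-core bandwidth.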