MARC View
LDR00000nam u2200205 4500
001000000435058
00520200227114250
008200131s2019 ||||||||||||||||| ||eng d
020 ▼a 9781687970718
035 ▼a (MiAaPQ)AAI27602802
035 ▼a (MiAaPQ)OhioLINKosu1555152640361367
040 ▼a MiAaPQ ▼c MiAaPQ ▼d 247004
0820 ▼a 001
1001 ▼a Byrne, Evan.
24510 ▼a Inference in Generalized Linear Models with Applications.
260 ▼a [S.l.]: ▼b The Ohio State University, ▼c 2019.
260 1 ▼a Ann Arbor: ▼b ProQuest Dissertations & Theses, ▼c 2019.
300 ▼a 163 p.
500 ▼a Source: Dissertations Abstracts International, Volume: 81-06, Section: B.
500 ▼a Advisor: Schniter, Philip.
5021 ▼a Thesis (Ph.D.)--The Ohio State University, 2019.
506 ▼a This item must not be sold to any third party vendors.
520 ▼a In this dissertation, we first consider two problems involving the generalized linear model: sparse multinomial logistic regression (SMLR) and sketched clustering, which in the context of machine learning are forms of supervised and unsupervised learning, respectively. Conventional approaches to these problems fit the parameters of the model to the data by minimizing a regularized loss function between the model and data with an iterative gradient-based algorithm, which may suffer from issues such as slow convergence or convergence to a sub-optimal solution. Slow convergence is particularly detrimental on modern datasets, which may contain upwards of millions of sample points. We take an alternate inference approach based on approximate message passing rather than optimization. In particular, we apply the hybrid generalized approximate message passing (HyGAMP) algorithm to both of these problems in order to learn the underlying parameters of interest. The HyGAMP algorithm approximates the sum-product or min-sum loopy belief propagation algorithms, which in turn approximate minimum mean squared error (MMSE) or maximum a posteriori (MAP) estimation, respectively, of the unknown parameters of interest. We apply a simplified form of HyGAMP (SHyGAMP) to SMLR, where we show through numerical experiments that our approach meets or exceeds the performance of state-of-the-art SMLR algorithms with respect to classification accuracy and algorithm training time. We then apply the MMSE-SHyGAMP algorithm to the sketched clustering problem, where we also show through numerical experiments that our approach exceeds the performance of other state-of-the-art sketched clustering algorithms with respect to clustering accuracy and computational efficiency, as well as the widely used K-means++ algorithm in some regimes. Finally, we study the problem of adaptive detection from quantized measurements.
We focus on the case of strong but low-rank interference, which is motivated by military wireless communications applications in which the receiver experiences strong jamming from a small number of sources in a time-invariant channel. In this scenario, the receiver requires many antennas to effectively null out the interference, but at the cost of increased hardware complexity and total volume of data to be processed. Using highly quantized measurements is one method of reducing the amount of data to be processed, but it is unknown how this affects detection performance. We first investigate the effect of quantized measurements on existing unquantized detection algorithms. We observe that unquantized detection algorithms applied to quantized measurements lack the ability to null arbitrarily large interference, despite being able to do so when applied to unquantized measurements. We then derive a generalized likelihood ratio test for the quantized measurement model, which gives rise to a generalized bilinear model. Via simulation, we empirically observe that the quantized algorithm offers only a fraction of a decibel of improvement in equivalent SNR relative to unquantized algorithms. We then evaluate alternative techniques to address the performance loss due to quantized measurements, including a novel analog pre-whitening technique using digitally controlled phase shifters. In simulation, we observe that the new technique shows up to 8 dB improvement in equivalent SNR.
590 ▼a School code: 0168.
650 4 ▼a Electrical engineering.
650 4 ▼a Computer engineering.
650 4 ▼a Artificial intelligence.
690 ▼a 0544
690 ▼a 0464
690 ▼a 0800
71020 ▼a The Ohio State University. ▼b Electrical and Computer Engineering.
7730 ▼t Dissertations Abstracts International ▼g 81-06B.
773 ▼t Dissertations Abstracts International
790 ▼a 0168
791 ▼a Ph.D.
792 ▼a 2019
793 ▼a English
85640 ▼u http://www.riss.kr/pdu/ddodLink.do?id=T15494547 ▼n KERIS ▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS).
980 ▼a 202002 ▼f 2020
990 ▼a ***1008102
991 ▼a E-BOOK