MARC View
LDR 05937cam a2200673Mi 4500
001 000000412177
005 20190131143006
006 m d
007 cr |n|---|||||
008 180303s2018 enk o 000 0 eng d
019 ▼a 1027194881 ▼a 1027356415 ▼a 1027556192 ▼a 1027713799
020 ▼a 9781788474559 ▼q (electronic bk.)
020 ▼a 1788474554 ▼q (electronic bk.)
020 ▼z 9781788478403
020 ▼z 1788478401
020 ▼a 1788478401
020 ▼a 9781788478403
024 3 ▼a 9781788478403
035 ▼a 1717558 ▼b (N$T)
035 ▼a (OCoLC)1027155886 ▼z (OCoLC)1027194881 ▼z (OCoLC)1027356415 ▼z (OCoLC)1027556192 ▼z (OCoLC)1027713799
037 ▼a B08604 ▼b 01201872
037 ▼a 361CBCC8-C94D-472D-AC6F-4B0C12C84CBC ▼b OverDrive, Inc. ▼n http://www.overdrive.com
040 ▼a EBLCP ▼b eng ▼e pn ▼c EBLCP ▼d MERUC ▼d CHVBK ▼d OCLCO ▼d IDB ▼d OCLCF ▼d OCLCQ ▼d YDX ▼d VT2 ▼d TEFOD ▼d OCLCQ ▼d N$T ▼d C6I ▼d 247004
050 4 ▼a QA276.45.R3 ▼b .L589 2018
072 7 ▼a MAT ▼x 003000 ▼2 bisacsh
072 7 ▼a MAT ▼x 029000 ▼2 bisacsh
082 04 ▼a 519.502855133 ▼2 23
100 1 ▼a Liu, Yuxi (Hayden).
245 10 ▼a R Deep Learning Projects : ▼b Master the techniques to design and develop neural network models in R.
260 ▼a Birmingham : ▼b Packt Publishing, ▼c 2018.
300 ▼a 1 online resource (253 pages).
336 ▼a text ▼b txt ▼2 rdacontent
337 ▼a computer ▼b c ▼2 rdamedia
338 ▼a online resource ▼b cr ▼2 rdacarrier
500 ▼a Exploratory data analysis.
505 0 ▼a Cover; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Handwritten Digit Recognition Using Convolutional Neural Networks; What is deep learning and why do we need it?; What makes deep learning special?; What are the applications of deep learning?; Handwritten digit recognition using CNNs; Get started with exploring MNIST; First attempt – logistic regression; Going from logistic regression to single-layer neural networks; Adding more hidden layers to the networks; Extracting richer representation with CNNs; Summary.
505 8 ▼a Chapter 2: Traffic Sign Recognition for Intelligent Vehicles; How is deep learning applied in self-driving cars?; How does deep learning become a state-of-the-art solution?; Traffic sign recognition using CNN; Getting started with exploring GTSRB; First solution – convolutional neural networks using MXNet; Trying something new – CNNs using Keras with TensorFlow; Reducing overfitting with dropout; Dealing with a small training set – data augmentation; Reviewing methods to prevent overfitting in CNNs; Summary; Chapter 3: Fraud Detection with Autoencoders; Getting ready.
505 8 ▼a Installing Keras and TensorFlow for R; Installing H2O; Our first examples; A simple 2D example; Autoencoders and MNIST; Outlier detection in MNIST; Credit card fraud detection with autoencoders; Exploratory data analysis; The autoencoder approach – Keras; Fraud detection with H2O; Exercises; Variational Autoencoders; Image reconstruction using VAEs; Outlier detection in MNIST; Text fraud detection; From unstructured text data to a matrix; From text to matrix representation – the Enron dataset; Autoencoder on the matrix representation; Exercises; Summary.
505 8 ▼a Chapter 4: Text Generation Using Recurrent Neural Networks; What is so exciting about recurrent neural networks?; But what is a recurrent neural network, really?; LSTM and GRU networks; LSTM; GRU; RNNs from scratch in R; Classes in R with R6; Perceptron as an R6 class; Logistic regression; Multi-layer perceptron; Implementing a RNN; Implementation as an R6 class; Implementation without R6; RNN without derivatives – the cross-entropy method; RNN using Keras; A simple benchmark implementation; Generating new text from old; Exercises; Summary; Chapter 5: Sentiment Analysis with Word Embeddings.
505 8 ▼a Warm-up – data exploration; Working with tidy text; The more, the merrier – calculating n-grams instead of single words; Bag of words benchmark; Preparing the data; Implementing a benchmark – logistic regression; Exercises; Word embeddings; word2vec; GloVe; Sentiment analysis from movie reviews; Data preprocessing; From words to vectors; Sentiment extraction; The importance of data cleansing; Vector embeddings and neural networks; Bi-directional LSTM networks; Other LSTM architectures; Exercises; Mining sentiment from Twitter; Connecting to the Twitter API; Building our model.
520 ▼a R is a popular programming language used by statisticians and mathematicians for statistical analysis, and it is increasingly used for deep learning. This book demonstrates end-to-end implementations of five real-world projects on popular topics in deep learning such as handwritten digit recognition, traffic sign recognition, fraud detection, text ...
588 0 ▼a Print version record.
590 ▼a Added to collection customer.56279.3 - Master record variable field(s) change: 072
650 0 ▼a R.
650 0 ▼a Artificial intelligence.
650 0 ▼a Neural networks.
650 7 ▼a Artificial intelligence. ▼2 fast ▼0 (OCoLC)fst00817247
650 7 ▼a MATHEMATICS / Applied ▼2 bisacsh
650 7 ▼a MATHEMATICS / Probability & Statistics / General ▼2 bisacsh
655 4 ▼a Electronic books.
700 1 ▼a Maldonado, Pablo,
776 08 ▼i Print version: ▼a Liu, Yuxi (Hayden). ▼t R Deep Learning Projects : Master the techniques to design and develop neural network models in R. ▼d Birmingham : Packt Publishing, ©2018
856 40 ▼3 EBSCOhost ▼u http://libproxy.dhu.ac.kr/_Lib_Proxy_Url/http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=1717558
938 ▼a EBL - Ebook Library ▼b EBLB ▼n EBL5309083
938 ▼a YBP Library Services ▼b YANK ▼n 15185820
938 ▼a EBSCOhost ▼b EBSC ▼n 1717558
990 ▼a ***1012033
994 ▼a 92 ▼b N$T