MARC View
LDR00000cam u2200205Ii 4500
001000000430419
00520200122125827
007cr |n|---|||||
008180908s2018 enk o 001 0 eng d
019 ▼a 1051075116 ▼a 1079363066
020 ▼a 9781788997805
020 ▼a 1788997808
020 ▼z 178899289X
020 ▼z 9781788992893
035 ▼a 1879523 ▼b (N$T)
035 ▼a (OCoLC)1051140715 ▼z (OCoLC)1051075116 ▼z (OCoLC)1079363066
037 ▼a 1BBFB26C-0C51-4300-AD4D-8E976B58BE8E ▼b OverDrive, Inc. ▼n http://www.overdrive.com
040 ▼a EBLCP ▼b eng ▼e pn ▼c EBLCP ▼d YDX ▼d TEFOD ▼d MERUC ▼d IDB ▼d OCLCQ ▼d CHVBK ▼d N$T ▼d 247004
050 4 ▼a QA276.45.R3 ▼b .H636 2018
072 7 ▼a MAT ▼x 003000 ▼2 bisacsh
072 7 ▼a MAT ▼x 029000 ▼2 bisacsh
08204 ▼a 519.502855133 ▼2 23
1001 ▼a Hodnett, Mark, ▼e author.
24510 ▼a R deep learning essentials ▼h [electronic resource] : ▼b a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / ▼c Mark Hodnett, Joshua F. Wiley.
250 ▼a Second edition.
260 ▼a Birmingham : ▼b Packt Publishing Ltd, ▼c 2018.
300 ▼a 1 online resource.
336 ▼a text ▼b txt ▼2 rdacontent
337 ▼a computer ▼b c ▼2 rdamedia
338 ▼a online resource ▼b cr ▼2 rdacarrier
500 ▼a Includes index.
5050 ▼a Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R.
5058 ▼a Building neural network models; Generating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter.
5058 ▼a Using regularization to overcome overfitting; L1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep learning model; Use case -- using MXNet for classification and regression; Data download and exploration.
5058 ▼a Preparing the data for our models; The binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models.
5058 ▼a Evaluation metrics and evaluating performance; Types of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case -- using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning; Document classification.
520 ▼a This book demonstrates how to use deep learning in R for machine learning, image classification, and natural language processing. It covers topics such as convolutional networks, recurrent neural networks, transfer learning, and deep learning in the cloud. By the end of this book, you will be able to apply deep learning to real-world projects.
5880 ▼a Print version record.
590 ▼a Master record variable field(s) change: 050, 072, 650
650 0 ▼a R (Computer program language)
650 0 ▼a Artificial intelligence.
650 0 ▼a Machine learning.
650 0 ▼a Neural networks (Computer science)
650 7 ▼a MATHEMATICS / Applied. ▼2 bisacsh
650 7 ▼a MATHEMATICS / Probability & Statistics / General. ▼2 bisacsh
655 4 ▼a Electronic books.
7001 ▼a Wiley, Joshua F., ▼e author.
77608 ▼i Print version: ▼a Hodnett, Mark. ▼t R Deep Learning Essentials : A Step-By-step Guide to Building Deep Learning Models Using TensorFlow, Keras, and MXNet, 2nd Edition. ▼d Birmingham : Packt Publishing Ltd, ©2018 ▼z 9781788992893
85640 ▼3 EBSCOhost ▼u http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=1879523
938 ▼a EBL - Ebook Library ▼b EBLB ▼n EBL5501083
938 ▼a YBP Library Services ▼b YANK ▼n 15687049
938 ▼a EBSCOhost ▼b EBSC ▼n 1879523
990 ▼a ***1008102
991 ▼a E-BOOK
994 ▼a 92 ▼b N$T