Batch_Normalization: Normalization vs mini-batch statistics
Uploader
I read deep learning papers; feel free to ask questions and offer corrections in the bullet comments (danmaku)
Batch_Normalization: Normalization vs mini-batch statistics (full)
Batch_Normalization: Introduction
Batch_Normalization: Abstract
(Chinese) Breaking it down: initialization, covariance shift, batch normalization
My first introductory statistics course: Learning to Love Statistics by Notre Dame University and edX
Batch_Normalization: Convolutional Network
Batch_Normalization: Toward Reducing Internal Covariate Shift
Inception: GoogLeNet
ResNet: abstract and introduction
Hung-yi Lee's deep learning notes on batch normalization: key points and questions
Batch_Normalization: larger learning rates are fine
Paper breakdown: Generalization in deep learning (collection)
Paper breakdown: FaceNet (collection; first read-through of the paper complete)
Bayesian Statistics by Duke University and Coursera
EN 掰开揉碎:initialization, batch normalization, covariance shift
Berkeley statistics course: BerkeleyX Statistics 2.1X
ResNet related works
The most intuitive and systematic statistics and probability primer of 2018: Crash Course Statistics (from lesson 11 onward)
Batch Normalization at test time
Inferential statistics by University of Amsterdam and Coursera
Inferential Statistics by Duke University with Coursera
YOLO: abstract
University of Amsterdam's introductory statistics course: Basic Statistics by University of Amsterdam and Coursera
Illustrated foundations of probabilistic machine learning: All of Statistics, A Concise Course in Statistical Inference
Harvard's most popular statistics course: Harvard Statistics 110 introduction
Learning statistics and probability through data science applications (R programming): Intro to Data Science
Paper breakdown: On the Origin of Deep Learning (updated through section 5)
Inception: architecture details
Dropout: Introduction
Trying to rebuild some basic concepts of economics with agent-based modeling (ABM)
Illustrated notes on Andrew Ng's machine learning course
(Chinese) Paper breakdown: notes on Hung-yi Lee's explanation of Hinton's capsules
ResNet 3.4 implementation
Inception: Abstract and Introduction
Dropout: abstract
Geoffrey Hinton - On the nature of (artificial) intelligence (talk, broken down)
Dropout: how to understand weights * (1 - dropout_rate)
Foundations of probabilistic machine learning: illustrated notes on MIT's probability course
Inception: related works
Dropout: motivation 2
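One title in the list above refers to the Dropout test-time rule weights * (1 - dropout_rate). A minimal NumPy sketch of why that scaling works (layer size, dropout rate, and variable names here are illustrative, not from the videos): since each unit survives training with probability 1 - dropout_rate, multiplying the weights by that factor at test time makes the deterministic output equal the expected training-time output.

```python
import numpy as np

rng = np.random.default_rng(0)
dropout_rate = 0.5
weights = rng.standard_normal((4, 3))  # a small dense layer (illustrative shape)
x = rng.standard_normal(4)             # one input example

# Training-time forward pass: drop each input unit with
# probability dropout_rate; surviving units pass through unchanged.
mask = rng.random(4) >= dropout_rate
train_out = (x * mask) @ weights

# Test-time forward pass: keep every unit, but scale the weights
# by (1 - dropout_rate), so the deterministic pre-activation equals
# the expectation of the masked one (E[mask_i] = 1 - dropout_rate).
test_out = x @ (weights * (1 - dropout_rate))
```

Averaging many random training-time passes converges to test_out, which is the property the scaling is designed to preserve.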