INTRODUCTION TO AI
2020 Fall Semester
PROJECT
Due Date: 12 December (Saturday) before midnight to YSCEC
INSTRUCTIONS:
1. This paper consists of 5 pages with 2 Questions only.
2. You may use a deep learning library/toolbox to solve the problem. However, you must state the
related information, such as its name and version, in the introduction.
3. Submit a report containing the following items:
Example:
Introduction: A brief introduction of the problem.
Problems and Solution
Discussion
Conclusion
Remark: Write the introduction and conclusion only once in the report.
4. Use English in the report.
5. Attach your code as an appendix. You may show code snippets (not the whole code) to aid your
explanation.
Question 1: AlexNet with CIFAR-10
In this question, we will adopt the AlexNet discussed in the lecture and use the CIFAR-10 dataset
(https://www.cs.toronto.edu/~kriz/cifar.html). You can find a tutorial on how to train a
CNN on CIFAR-10 for your specific deep learning library [1]-[4]. Most of the major
deep learning libraries (TensorFlow, PyTorch, MATLAB) come with a CIFAR-10 tutorial.
You should have good knowledge of the AlexNet architecture so that you can modify the
hyperparameters in the network according to the CIFAR-10 characteristics.
Train a standard AlexNet on CIFAR-10 via the CE loss. Note the following:
Modify the output (softmax) layer to 10 nodes, corresponding to the 10 classes in CIFAR-10.
Use the original CIFAR-10 images (32 x 32 x 3) as input to AlexNet. With this input size, you will
find that the feature map vanishes at the last conv layer, so modify the stride, padding, etc.
If you face insufficient-memory problems, modify the filter sizes, channel numbers, or number of
neurons in the FC layers.
What you cannot change is the architecture (5 conv layers, 3 max-pooling layers, and 2 FC
layers).
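The constraints above can be sketched in PyTorch (one of the allowed libraries) as follows. The specific strides, paddings, channel counts, and FC widths here are illustrative assumptions for 32 x 32 inputs, not the required answer:

```python
import torch
import torch.nn as nn

class CifarAlexNet(nn.Module):
    """AlexNet-style network adapted so the feature map does not vanish
    on 32x32x3 CIFAR-10 inputs (smaller kernels/strides than the original)."""

    def __init__(self, num_classes=10):
        super().__init__()
        # 5 conv layers and 3 max-pooling layers, as the question requires.
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1),   # 32x32
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                        # -> 16x16
            nn.Conv2d(64, 192, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                        # -> 8x8
            nn.Conv2d(192, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                        # -> 4x4
        )
        # 2 FC layers followed by the 10-node output (softmax) layer.
        self.classifier = nn.Sequential(
            nn.Dropout(0.5),
            nn.Linear(256 * 4 * 4, 1024),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(1024, 512),
            nn.ReLU(inplace=True),
            nn.Linear(512, num_classes),
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)
```

Shrinking the filter sizes or FC widths further, as suggested above, is the usual fix if this still exceeds your GPU memory.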
(a) Describe the training setup, e.g. data pre-processing/augmentation, initialization, and hyperparameters
such as the learning rate schedule, momentum, dropout rate, batch size, BN, etc. Present them
systematically in table form.
(b) Fill in the table below to show the weights and neurons of each layer in your modified
AlexNet.
(c) Plot the following:
Training and test loss (not classification accuracy) vs. epoch.
Classification accuracy on the training and test sets vs. epoch.
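To produce these curves, per-epoch loss and accuracy must be recorded on both sets. A minimal helper (assuming PyTorch; `loader` is any iterable of `(inputs, labels)` batches, and the function name is a hypothetical choice):

```python
import torch

@torch.no_grad()
def evaluate(model, loader, loss_fn):
    """Return (mean loss, accuracy in %) of `model` over `loader`."""
    model.eval()
    total_loss, correct, count = 0.0, 0, 0
    for x, y in loader:
        logits = model(x)
        total_loss += loss_fn(logits, y).item() * y.size(0)  # sum over batch
        correct += (logits.argmax(dim=1) == y).sum().item()
        count += y.size(0)
    return total_loss / count, 100.0 * correct / count
```

Calling this on the training and test loaders at the end of every epoch, and appending the results to four lists, gives exactly the data needed for the plots in (c).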
(d) Fill in the table below with the final accuracies that you obtained.
Training Accuracy (%) Testing Accuracy (%)
Remark:
1. The testing accuracy should be at least 75%. With proper tuning of the network
hyperparameters, data augmentation, BN, etc., you can achieve more than 85%.
2. The difference between training accuracy and testing accuracy should be no more than 10%.
3. The generalization techniques discussed in the lecture would be helpful. The score for this part
is proportional to the accuracy that you can obtain. Note that the accuracy values should be consistent
with the graphs shown in (c). If no graph is shown, the answer in (d) is discounted.
(e) Discuss the issues that you experienced and the efforts that you took to overcome them.
These could be non-convergent training and/or poor testing accuracy, etc.
Remark: If your machine is slow in solving this question due to a weak CPU or GPU, consider
Google Colab if you are using Python. See https://towardsdatascience.com/getting-started-with-google-colab-f2fff97f594c
Question 2: Transfer Learning
Now we want to see whether the AlexNet features developed in Question 1 can transfer to another
domain. Here we use the imdb_wiki face dataset (a subset) consisting of 100 subjects with 30
samples per subject as the target dataset. Note that this is the case where the source and target data are
dissimilar and the target set is small, relative to CIFAR-10.
(a) First, based on the model trained on CIFAR-10, re-train the last (softmax)
layer of the prebuilt network from Question 1 with the face training samples (20 samples per subject).
This is equivalent to freezing the rest of the layers in the network and replacing the output with a new
(linear) classifier for faces. Note that the number of output nodes should change from 10 to 100.
This serves as the baseline model. Evaluate the baseline model on the face test set, which is
composed of 10 samples per subject.
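A sketch of this freeze-and-replace step (assuming PyTorch, and assuming the network keeps its FC stack in a `classifier` `nn.Sequential`, as torchvision's AlexNet does; the helper name is hypothetical):

```python
import torch.nn as nn

def make_baseline(pretrained_model, num_subjects=100):
    """Freeze a CIFAR-10-trained network and swap in a new 100-way
    linear classifier for the face dataset (the baseline model)."""
    # Freeze every parameter of the pre-trained network.
    for p in pretrained_model.parameters():
        p.requires_grad = False
    # Replace the final (softmax) layer: 10 CIFAR-10 classes -> 100 subjects.
    # The new layer's parameters are trainable by default.
    in_features = pretrained_model.classifier[-1].in_features
    pretrained_model.classifier[-1] = nn.Linear(in_features, num_subjects)
    return pretrained_model
```

Only the new linear layer receives gradients, so training this model on the 20-samples-per-subject face set is exactly the "linear classifier on frozen features" baseline described above.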
(b) Then, try to fine-tune/remove/retrain chosen convolution and/or FC layers. Create three models
with different trials. Fill in the table below, and evaluate each model on the face test set.
Trial             | Finetune                                         | Learning Rate (if applicable; specify in this form: x% of the original lr)
Model 1 (Example) | Finetune Conv 4 and 5, freeze the rest (Example) | 10% of original lr (Example)
Model 2 (Example) | Remove Conv 5, FC 1 and FC 2 (Example)           | Not applicable (Example)
Model 1           |                                                  |
Model 2           |                                                  |
Model 3           |                                                  |
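One way to implement "x% of the original lr" on the chosen layers (a PyTorch sketch; the helper name and the freeze-everything-else policy are assumptions) is to collect only the tuned layers' parameters into the optimizer with a scaled learning rate:

```python
import torch
import torch.nn as nn

def finetune_optimizer(model, layers_to_tune, base_lr=0.01, fraction=0.1):
    """Freeze the whole model, then unfreeze `layers_to_tune` and return an
    SGD optimizer that trains them at `fraction` of the original lr."""
    for p in model.parameters():
        p.requires_grad = False           # freeze everything first
    params = []
    for layer in layers_to_tune:          # e.g. the Conv 4 and Conv 5 modules
        for p in layer.parameters():
            p.requires_grad = True        # unfreeze only the chosen layers
            params.append(p)
    # Tuned layers train at e.g. 10% of the original learning rate.
    return torch.optim.SGD(params, lr=base_lr * fraction, momentum=0.9)
```

Removing layers instead (as in the second example row) means rebuilding the `nn.Sequential` without them, in which case the learning-rate column is "Not applicable".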
For each model,
(i) Show both training and test loss vs. epoch graphs.
(ii) Show both classification accuracy on the training and test sets vs. epoch graphs.
Fill in the table:
Training Accuracy (%) Testing Accuracy (%)
Baseline Model
Model 1
Model 2
Model 3
(c) Discuss your observations in terms of the generalization performance of transfer learning.
Remark: Almost every deep learning library comes with a tutorial on CNN fine-tuning, e.g.
TensorFlow [5], MATLAB [6], Keras [7].
References:
[1] https://www.mathworks.com/help/nnet/examples/train-residual-network-on-cifar-10.html
[2] https://www.tensorflow.org/tutorials/deep_cnn
[3] https://blog.plon.io/tutorials/cifar-10-classification-using-keras-tutorial/
[4] http://caffe.berkeleyvision.org/gathered/examples/cifar10.html
[5] https://kratzert.github.io/2017/02/24/finetuning-alexnet-with-tensorflow.html
[6] https://www.mathworks.com/help/nnet/examples/transfer-learning-using-alexnet.html
[7] https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html