Due on: Friday, June 25th at 23:59 PST
Where: Brightspace (https://bright.uvic.ca/d2l/home/136102)
Instructions:
• You must complete this assignment entirely on your own. In other words, you should come up with the solution yourself, write the code yourself, conduct the experiments yourself, analyze the results yourself, and finally, write it all up solely by yourself. The university policies on academic dishonesty (a.k.a. cheating) will be taken very seriously.
• This does not mean that you need to go to a cave and self-isolate while preparing the assignment. You are allowed to have high-level discussions with your classmates about the course material. You are also more than welcome to use Piazza or come to office hours and ask questions. If in doubt, ask! We are here to help.
• If you are still stuck, you can use books and published online material (i.e., material that has a fixed URL). However, you must explicitly credit all sources. You are also not allowed to copy-paste online materials. Woe to you if we catch you copy-pasting from uncredited sources!
– Why "if stuck"? Assignments are designed to develop your practical ML skills and make you strong. If you do the assignments well, the project will feel like a piece of cake. So, give it your best. But, on the other hand, do not waste a whole week on a single question: if you are stuck on a question for a few days, ask us for help!
• If you cannot make it by the deadline, you can use a maximum of two grace days per assignment. They are not free, though: each grace day comes with a 25% mark penalty (so submitting on Monday evening would reduce your score by 25%; submitting on Tuesday would reduce it by 50% in total). No other accommodations will be provided unless explicitly approved by the instructor at least 7 days before the deadline.
• These assignments are supposed to be really hard! Start early! You will need at least two weeks to complete them!
– If you do not feel challenged enough, please let me know, and I'll think of something.
• Remember: you will need to gather at least one-third of all points during the assignments to pass the course. If you don't, you will get an F!
• Make sure to follow the technical requirements outlined below. TAs have full authority to take 50% off your grade if you disregard any of them.
• Be sure that your answers are clear and easy for the TAs to understand. They can penalize you if your solutions lack clarity or are convoluted (in a non-algebraic way), even if they are nominally correct.
• We will try to grade your assignments within seven (7) days of the initial submission deadline.
• If you think there is a problem with your grade, you have one week to raise concerns after the grades go public. The grading TAs will hold office hours during those seven days to address any such problems. After that, your grade is set in stone.
Technical matters:
• You must type up your analysis and solutions electronically and submit them as a self-contained Jupyter notebook. Jupyter notebooks can contain code, its output, and images. They can also be used to typeset math and proofs in LaTeX mode.
– You must use LaTeX mode to type formulas. Typing a^2=sqrt(3)+b1 instead of $a^2 = \sqrt{3} + b_1$ is a pretty good way to lose 50% of your grade for no good reason.
• Each problem should be submitted as a separate file.
• Each file should be named SurnameInitial_N.ipynb, where N is the two-digit, zero-padded problem number. Correct: SmithJ_05.ipynb. Incorrect: JohnSmith_V12345_Problem_1.ipynb, prob1.pdf, etc.
• Zip all ipynb files and submit them as assignment1.zip to Brightspace. Do not submit RAR, TAR, 7zip, SHAR, or whatnot; just use good ol' ZIP. Do not include other files.
• The first cell of each Jupyter notebook must start with your name and V number. See the attached notebook for the details.
• Your notebook should be organized sequentially according to the problem statement. Use sections (with the appropriate numbers and labels) within the notebook. Figures and relevant code should be placed in the proper locations in the document.
• Notebook code must be runnable! Ideally, all answers will be the output of a code cell.
• You must use Python 3 to complete the assignments. Feel free to use NumPy and pandas as you see fit. Use SciPy, scikit-learn, and other non-standard libraries only when explicitly allowed to do so.
• Your first executable cell should set the random seed to 1337 to ensure the reproducibility of your results. For NumPy/SciPy and pandas, use numpy.random.seed(1337); otherwise, use random.seed(1337).
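As a sketch, that first cell could look like the following (seeding both the standard library's RNG and NumPy's global RNG, which NumPy-backed libraries such as pandas draw from):

```python
# First executable cell: fix all random seeds for reproducibility.
import random

import numpy as np

random.seed(1337)     # Python's built-in RNG (random.random, random.shuffle, ...)
np.random.seed(1337)  # NumPy's global RNG, also used by pandas sampling helpers
```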
• Document your code! Use either Markdown cells or Python comments to let us know what you have done!
• Finally, be concise! We do not appreciate long essays that amount to basically nothing.
This assignment consists of 4 problems. Some are intended only for graduate students (those
taking CSC 503), and are labelled as such. Some contain bonus sections: you can use bonus points
to improve your overall homework score. Bonus points cannot be transferred to other assignments or
the final project. Any graduate-level problem counts as a bonus problem for undergraduate students.
Some problems are purposefully open-ended. Whatever you think the correct answer is, make
sure to support it with code and data.
Problem 1. The American Handwriting [40 points]
The U.S. National Institute of Standards and Technology collected digital images of digits written by high school students and U.S. Census Bureau employees over the years. These images serve as the basis of the extremely popular MNIST dataset, which is commonly used to benchmark machine learning classifiers.
Wouldn't it be a good idea to play with that dataset? To do so, install Keras and load the MNIST dataset as follows:
from keras.datasets import mnist
(train_X, train_y), (test_X, test_y) = mnist.load_data()
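The loaded arrays are uint8 images of shape (N, 28, 28) with pixel values in [0, 255]; for the dense networks below they typically need to be flattened to 784-vectors and rescaled. A minimal sketch, using a dummy array in place of train_X:

```python
import numpy as np

# Stand-in for train_X: mnist.load_data() returns uint8 arrays of shape (N, 28, 28).
images = np.random.randint(0, 256, size=(100, 28, 28), dtype=np.uint8)

# Flatten each image to a 784-vector and rescale pixel values to [0, 1].
flat = images.reshape(len(images), -1).astype(np.float32) / 255.0
print(flat.shape)  # (100, 784)
```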
Now, let's see how we can use neural networks to classify these images.
1. [Derivative; 5 points] Calculate the gradient of the softmax function:

   f(x)_i = e^{x_i} / ∑_j e^{x_j}
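When you later implement this function in code (for the networks below), note that naively exponentiating large inputs overflows; the standard fix is to subtract the maximum first. A sketch, not part of the required derivation:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    shifted = x - np.max(x, axis=-1, keepdims=True)  # shifting changes nothing mathematically
    e = np.exp(shifted)                              # but keeps exp() from overflowing
    return e / e.sum(axis=-1, keepdims=True)

print(softmax(np.array([1.0, 2.0, 3.0])))  # probabilities summing to 1
```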
2. [Simple; 10 points] Implement a simple one-hidden-layer neural network from scratch (using only NumPy). The implementation should include forward propagation (i.e., prediction) and backpropagation-powered gradient descent for training the network. Feel free to select the number of nodes in the hidden layer yourself (it must, however, be greater than 10; the recommended value is 128). Each hidden node should use the sigmoid activation function. The output layer should use the softmax activation function to produce the final 10 values (probabilities of each digit). You can use either classical or stochastic gradient descent. The learning rate and the number of iterations are also up to you; you can experiment with {0.001, 0.01, 0.1} to get a sense of the best learning rate.
Plot the network's accuracy (or error) on the test data as the number of iterations increases. Does it keep rising (falling) with the number of gradient descent iterations?
3. [Keras; 5 points] Now use Keras to model and train the exact same network. Is it faster or slower? Better?
4. [ReLU; 5 points] This time, use the ReLU activation functions instead of the sigmoids. What
happens?
5. [Dropout; 5 points] Now add two hidden layers. You should end up with a three-layer deep neural network. Use Keras to model the network and add dropout to each layer. Furthermore, use L2 regularization for the training objective. Use cross-validation to select the best regularization and dropout hyperparameters. How much improvement did you get?
6. [Convolution; 5 points] Use Keras to model and train a simple convolutional network with
one convolutional layer, one pooling layer, and one fully connected layer. You are free to pick
any hyperparameters you want: play with the data and provide some justification behind your
hyperparameter selection. How much improvement did you get, if any?
7. [Mugshots; 5 points] Display nine images of your choice that were consistently misclassified by all of the above models (if there are no such images, pick at least those that were misclassified most of the time). You can plot an image via matplotlib.pyplot.imshow(image, cmap='gray'). Would you be able to classify those images yourself or not? Why?
It goes without saying: for each model, provide the corresponding training and test errors with
the metrics of your choice.
Bonus [5 points]: Use the leaky ReLU activation function for the artisanal neural network you made from scratch ([Simple]). Any improvements?
Problem 2. Support machines [20 points]
We have seen in class how to use support vector machines to perform binary classification. Let's now see how they work in practice by playing with sklearn's SVM implementation.
1. [Hard; 5 points] Select all images corresponding to ones and sevens from the MNIST dataset, and train a hard-margin SVM classifier that classifies whether an image depicts a 1 or a 7. How good is it? Can it even be done?
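Selecting the two digits is plain boolean indexing; a sketch with dummy arrays standing in for train_X and train_y:

```python
import numpy as np

# Stand-ins for train_X / train_y as returned by mnist.load_data().
X = np.random.rand(1000, 28, 28)
y = np.random.randint(0, 10, size=1000)

mask = (y == 1) | (y == 7)              # keep only digits 1 and 7
X17 = X[mask].reshape(mask.sum(), -1)   # flatten each image to 784 features for sklearn
y17 = y[mask]
print(X17.shape, sorted(set(y17.tolist())))
```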
2. [Soft; 5 points] Now train a soft-margin SVM classifier for the same problem. Use cross-validation to select the best value of C. Did you achieve better results?
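A hedged sketch of cross-validating C with scikit-learn's GridSearchCV and SVC; the toy two-feature data below merely stands in for the flattened digit images, and the grid values are illustrative, not prescribed:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Toy, linearly separable binary data standing in for the flattened 1-vs-7 images.
rng = np.random.default_rng(1337)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Cross-validate over a small, log-spaced grid of C values.
grid = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10]}, cv=3)
grid.fit(X, y)
print(grid.best_params_["C"], round(grid.best_score_, 3))
```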
3. [Kernel; 5 points] Try the following kernels with the best soft-margin model and see which
one is the best:
(a) polynomial kernel of various degrees;
(b) Gaussian kernel with various values of γ (also known as the radial basis function kernel); and
(c) linear kernel.
So far, we've only used SVMs for binary classification. But did you know that they can be used for multiclass classification as well? One common way to do it is the all-vs-all (AVA) classification trick: train n(n − 1) binary classifiers f_{i,j} that distinguish between the pairs of classes i and j (if you have, of course, n different classes). Then, you can classify an example x as:
   f(x) = arg max_i ∑_j f_{i,j}(x).
4. [AVA; 5 points] Use the best-performing SVM model thus far to implement multiclass AVA classification for all digits at once. Once done, plot the confusion matrix (heatmap) of all n^2 classifiers.
Again, for each model, make sure to provide the training and test errors in the metrics of your
choice.
Problem 3. Pigs, begone! [20 points]
Let's get rid of those annoying spam texts! First, get the SMS dataset from https://archive.ics.uci.edu/ml/machine-learning-databases/00228/. This dataset consists of short SMS messages and their class (spam or ham, where ham means not spam).
Represent each message as a bag of words, and use these bags of words to train your Naive Bayes classifier to predict whether a message is spam or not.
Use a 70/30% split for the training/test data. How good is your simple classifier?
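The pipeline can be sketched as follows; the toy messages stand in for the real SMS corpus, and MultinomialNB is one common Naive Bayes variant for word counts (a sketch under those assumptions, not the required solution):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB

# Toy corpus; the real data comes from the UCI SMS Spam Collection.
texts = ["win a free prize now", "free cash win now", "are we meeting today",
         "see you at lunch", "claim your free prize", "lunch today then meeting"]
labels = [1, 1, 0, 0, 1, 0]  # 1 = spam, 0 = ham

# Bag-of-words features, then a 70/30 train/test split.
X = CountVectorizer().fit_transform(texts)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=1337)

clf = MultinomialNB().fit(X_tr, y_tr)
print(clf.score(X_te, y_te))  # accuracy on the held-out 30%
```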
Problem 4. Uber-SVM [20 points; only for CSC 503]
Sometimes it is worth using different slack variables for different classes in the soft-margin SVM formulation. For example, we would use this trick if the dataset is not balanced, or if it is more important to correctly classify one class than the other. More formally, for a binary classification problem between two classes + and −, we would like to optimize the following function:
   min_{w,b}  (1/2) ‖w‖² + C_+ ∑_{i: y_i is +} ξ_i + C_− ∑_{i: y_i is −} ξ_i

such that y_i(wᵀx_i + b) ≥ 1 − ξ_i and ξ_i ≥ 0 for any i. Your job is to derive the Lagrangian dual formulation of this problem.
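A standard starting point (a sketch of the first step only; the multiplier names α_i and μ_i are not given in the problem statement): introduce α_i ≥ 0 for the margin constraints and μ_i ≥ 0 for the ξ_i ≥ 0 constraints, giving the Lagrangian

```latex
L(w, b, \xi, \alpha, \mu)
  = \frac{1}{2}\lVert w \rVert^2
  + C_+ \sum_{i:\, y_i \text{ is } +} \xi_i
  + C_- \sum_{i:\, y_i \text{ is } -} \xi_i
  - \sum_i \alpha_i \bigl( y_i (w^\top x_i + b) - 1 + \xi_i \bigr)
  - \sum_i \mu_i \xi_i .
```

Setting the partial derivatives with respect to w, b, and each ξ_i to zero and substituting back yields the dual; note that the ξ_i stationarity condition differs between + and − examples, which is where C_+ and C_− enter the dual constraints.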