Partition of data based on percentages (for cross-validation)
Hi all,
I have a matrix A made of x rows and y columns.
I would like to take 80% of my matrix A based on the number of rows, and do so 5 times, so as to equally partition my data set. So A1 would be the first 80% of the rows (and all columns), etc.
I had a look at this article https://uk.mathworks.com/help/nnet/ug/divide-data-for-optimal-neural-network-training.html but I am not sure any of these functions does what I want.
Please could anyone help me?
Thanks a lot
0 Comments
Answers (4)
Walter Roberson
2017-7-1
If you are using a neural network, then you would configure net.divideFcn to dividerand() (the default) and set net.divideParam to the percentages you want.
Otherwise, use
nrow = size(A,1);                     % number of rows in A
ntrain = floor(nrow * 80/100);        % 80% of the rows
train_ind = randperm(nrow, ntrain);   % random selection of ntrain row indices
train_rows = A(train_ind, :);         % the 80% training subset (all columns)
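To get the five equally sized splits the question asks for (each training set 80% of the rows), the same randperm idea can be extended to disjoint folds. This is only a sketch, with A standing for the poster's matrix:

```matlab
% Sketch: 5 disjoint folds, so each training set is 80% of the rows.
k = 5;
nrow  = size(A, 1);
perm  = randperm(nrow);                   % shuffle the row indices once
edges = round(linspace(0, nrow, k+1));    % fold boundaries (handles nrow not divisible by 5)
for f = 1:k
    test_ind  = perm(edges(f)+1 : edges(f+1));   % this fold's ~20% test rows
    train_ind = setdiff(perm, test_ind);         % the remaining ~80% training rows
    train_rows = A(train_ind, :);
    test_rows  = A(test_ind, :);
    % ... train and evaluate on train_rows / test_rows here ...
end
```

Every row lands in exactly one test fold, so the five training sets together cover the data evenly.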
10 Comments
Walter Roberson
2019-5-3
For Holdout, M is not the percentage to hold out: it is the fraction.
If you want the train index and test logical vectors you can get those directly from crossvalind:
[trainIdx, testIdx] = crossvalind('Holdout', FeatureLabSHUFFLE, M);
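For completeness, a sketch of using those logical vectors to split a data set; X and labels are illustrative names, not from the thread:

```matlab
% Sketch: Holdout returns logical vectors, which index the data directly.
M = 0.2;                                        % fraction held out for testing
[trainIdx, testIdx] = crossvalind('Holdout', labels, M);
Xtrain = X(trainIdx, :);   ytrain = labels(trainIdx);
Xtest  = X(testIdx, :);    ytest  = labels(testIdx);
```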
MA-Winlab
2019-5-3
Edited: MA-Winlab
2019-5-3
Yes @Walter Roberson, thank you for the note. M should be within the range [0, 1].
One more question:
With Holdout, we do not have a loop like the one with k-fold, so how do we do cross-validation?
I mean, with Holdout, we do not repeat the partitioning, training, and testing k times as with k-fold.
I tried it in a loop with k = 5 and kept checking cp.CorrectRate and cp.LastCorrectRate. They were the same, i.e. cp.CorrectRate is not changing.
Please correct me if I am mistaken.
I am saying this because I read this:
Using this method within a loop is similar to using K-fold cross-validation one time outside the loop, except that nondisjointed subsets are assigned to each evaluation.
This note is from here.
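A repeated-holdout loop along those lines might look like the sketch below. The names are illustrative (fitcknn is just a stand-in classifier); the point is that crossvalind draws a fresh random split on every pass, so the per-iteration accuracy should vary between iterations:

```matlab
% Sketch: repeated holdout, drawing a new random 80/20 split each iteration.
% Assumes X is the feature matrix and labels a numeric class-label vector.
k   = 5;
acc = zeros(k, 1);
for i = 1:k
    [trainIdx, testIdx] = crossvalind('Holdout', labels, 0.2);  % new split each time
    model  = fitcknn(X(trainIdx,:), labels(trainIdx));          % any classifier works here
    pred   = predict(model, X(testIdx,:));
    acc(i) = mean(pred == labels(testIdx));                     % this split's accuracy
end
meanAcc = mean(acc);    % average over the k random splits
```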
Greg Heath
2017-7-2
Edited: Greg Heath
2017-7-25
1. NOTE: Contrary to most statistical regression subroutines, MATLAB Neural Network subroutines operate on COLUMN VECTORS!
2. For N O-dimensional "O"utput target vectors corresponding to N I-dimensional "I"nput vectors:
[ I N ] = size(input)
[ O N ] = size(target)
3. Correspondingly, the data in the MATLAB NN database is stored columnwise. See the results of the keyboard commands
help nndatabase
doc nndatabase
4. The MATLAB NN Toolbox DEFAULT data division ratio is 0.7/0.15/0.15 with
Ntst = floor(0.15*N)
Nval = Ntst
Ntrn = N - Nval - Ntst
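As a quick worked example of that default split, assuming N = 100 samples:

```matlab
% Worked example of the default 0.7/0.15/0.15 division for N = 100:
N    = 100;
Ntst = floor(0.15 * N);      % 15 test samples
Nval = Ntst;                 % 15 validation samples
Ntrn = N - Nval - Ntst;      % 70 training samples
```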
5. Instead of TRYING to evenly divide the data for m-fold cross-validation, it is far easier to just use Ntrials designs with RANDOM data division AND RANDOM initial weights.
6. I gave up the nitpicking index considerations of worrying about the number of times each data point was in each of the trn/val/tst subsets. If you have concerns, just increase Ntrials!
7. Somewhere in several of my NEWSGROUP and/or ANSWERS posts, I did use nitpicking XVAL index considerations. Good luck if you want to find some. I would first search using XVAL.
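The Ntrials approach described above might be sketched as follows; x and t are assumed to be the column-wise input and target matrices of note 2, and fitnet(10) is only an illustrative network size:

```matlab
% Sketch: Ntrials designs with random data division and random initial
% weights, keeping the net with the best validation performance.
Ntrials  = 10;
bestperf = Inf;
for trial = 1:Ntrials
    net = fitnet(10);                 % fresh random initial weights each trial
    net.divideFcn = 'dividerand';     % random 0.7/0.15/0.15 data division
    [net, tr] = train(net, x, t);
    if tr.best_vperf < bestperf       % keep the net with lowest validation error
        bestperf = tr.best_vperf;
        bestnet  = net;
    end
end
```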
Hope this helps.
Thank you for formally accepting my answer
Greg
0 Comments
Lulu Dulac
2017-7-2
4 Comments
Greg Heath
2017-7-2
Edited: Greg Heath
2017-7-2
This should help:
SEARCH TERM         HITS (NEWSGROUP)   HITS (ANSWERS)
CROSSVAL GREG              12                14
CROSSVAL                   49               114
CROSSVALIND GREG            7                12
CROSSVALIND                46                77
CVPARTITION GREG           11                14
CVPARTITION                40               106
Greg
Walter Roberson
2017-7-3
"I don't have NN toolbox"
Then that was the operative limitation, not the fact that you are not using R2017a.
In my Answer I posted code for row-wise random division without any toolboxes. I did use a syntax of randperm that did not become available until R2011a.
ranjana roy chowdhury
2019-7-14
I have a dataset of 339 × 5825. I want to initialize 4% of the dataset values with 0, excluding the entries that have -1 in them. Please help me.
2 Comments
ranjana roy chowdhury
2019-7-15
The dataset is the WS-Dream dataset, 339 × 5825. The entries have values between 0 and 0.1; a few entries are -1. I want to make 96% of this dataset 0, excluding the entries having -1 in the dataset.
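One way to do what is described, following the 96% figure in this clarification, is the sketch below; D stands for the 339 × 5825 matrix, which is not shown in the thread:

```matlab
% Sketch: zero out a random 96% of the eligible entries (those not equal
% to -1), leaving the -1 entries untouched.
eligible = find(D ~= -1);                         % linear indices of non -1 entries
nzero    = round(0.96 * numel(eligible));         % how many entries to zero out
pick     = eligible(randperm(numel(eligible), nzero));  % random subset of them
D(pick)  = 0;
```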