Feeds
Answered
What formula is used to calculate perplexity in fitlda?
If you are asking about the 2nd output from the logp method, document log-probabilities are estimated using the Mean-Field Appro...
5 years ago | 1
| Accepted
Answered
Matlab: Error using classreg.learning.FitTemplate/fit with hyperparameter optimization of SVM
You are passing ClassNames to fitcecoc - are your ClassNames a subset of all class names you have in yTrain? Train one ECOC m...
6 years ago | 0
| Accepted
Answered
How are the automatic values of hyper-parameters in Matlab Regression Learner determined?
If you type edit classreg.learning.svmutils.optimalKernelScale in your MATLAB session and hit Return, the editor will brin...
6 years ago | 0
Answered
Perform Naive-Bayes classification(fitcnb) with non-zero off-diagonal covariance matrix
To estimate covariance per class, use |fitcdiscr| with discriminant type 'quadratic'.
7 years ago | 0
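A minimal sketch of the fitcdiscr suggestion above, using Fisher's iris data as a stand-in for the asker's dataset:

    % Quadratic discriminant analysis estimates a full covariance matrix per class,
    % unlike fitcnb, which assumes independent (diagonal-covariance) features.
    load fisheriris
    mdl = fitcdiscr(meas, species, 'DiscrimType', 'quadratic');
    labels = predict(mdl, meas);   % class predictions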
Answered
Lack of fit with fitrlinear on multivariate data (version 2016a and later)
Your test set has floor(74*0.05)=3 observations. You can't measure error of any model on such a tiny test set.
7 years ago | 0
Answered
To calculate mahalanobis distance when the number of observations are less than the dimension
For classification, use regularized discriminant or pseudo discriminant. Both options are supported in |fitcdiscr|. Regularizati...
7 years ago | 0
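A minimal sketch of the two fitcdiscr options mentioned above, assuming a predictor matrix X with fewer observations than columns and labels Y:

    % Pseudo discriminant: uses the pseudo-inverse of the pooled covariance matrix.
    mdlPseudo = fitcdiscr(X, Y, 'DiscrimType', 'pseudoLinear');
    % Regularized discriminant: Gamma shrinks the covariance toward a diagonal
    % matrix (Gamma = 0 is unregularized, Gamma = 1 is fully diagonal).
    mdlReg = fitcdiscr(X, Y, 'DiscrimType', 'linear', 'Gamma', 0.5);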
Answered
TreeBagger - Random forest
Take one observation and compute prediction for that observation. Then replace some predictor in that observation with NaN and r...
7 years ago | 0
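A minimal sketch of the procedure described above, assuming b is a trained TreeBagger model and X the predictor matrix; the predictor index j is a hypothetical choice:

    xobs = X(1, :);              % one observation
    yhat = predict(b, xobs);     % baseline prediction
    j = 3;                       % hypothetical predictor to knock out
    xmod = xobs;
    xmod(j) = NaN;               % remove that predictor's value
    yhatMod = predict(b, xmod);  % compare with yhat to gauge the predictor's influence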
Answered
kmeans appear to miss obvious clusters
Do this (assuming there are no NaNs in X): [cidx3,cmeans2] = kmeans(zscore(X),3,'dist','cosine','display','iter'); Did ...
7 years ago | 1
| Accepted
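The call from the answer written out in full, with a quick cluster-quality check added (assuming X holds the raw data with no NaNs):

    Xz = zscore(X);                           % standardize each column
    [cidx3, cmeans3] = kmeans(Xz, 3, 'Distance', 'cosine', 'Display', 'iter');
    silhouette(Xz, cidx3, 'cosine');          % inspect cluster quality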
Answered
Help assess my Random Forest work and work on feature selection
First, you would increase your chance of getting a useful reply if you simplified the problem. Your code and your question are r...
7 years ago | 1
Answered
Understanding the equations behind the 'logistic' learner when using fitclinear
<https://www.mathworks.com/help/stats/classificationlinear.predict.html#bu4z0pc-6 predict help>: If the linear classification...
7 years ago | 0
| Accepted
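For reference, with the 'logistic' learner the raw linear score f(x) = x*Beta + Bias is passed through the logistic (sigmoid) function to give a posterior probability. A minimal check against a trained ClassificationLinear model mdl and a row vector x might look like:

    [~, score] = predict(mdl, x);    % posterior probabilities, one column per class
    f = x * mdl.Beta + mdl.Bias;     % raw linear score
    posterior = 1 ./ (1 + exp(-f));  % should match the positive-class column of score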
Answered
Nonlinear regression with categorical predictor?
Unless I misunderstood your dot notation, the problem is ill-defined. It has an infinite number of solutions. Rewrite it in this...
7 years ago | 2
Answered
What does it mean for a tree in a TreeBagger ensemble to have >80% error? What is the best way to reduce error?
It's hard to identify the source of discrepancy without understanding what the package at that link does and how you used it. Ho...
7 years ago | 0
| Accepted
Answered
Help with downloading Matlab
Aleksey, hardly anyone on this forum speaks Russian. For installation questions, it is best to contact technical support in the coun...
7 years ago | 1
Answered
How to use svm in Matlab for my binary feature vector.
You most certainly do not need as many samples as you have features. Statements like "you need at least 6 times the number of ca...
7 years ago | 0
Answered
Why is SVM performance with small random datasets so high?
Let me make sure I got your procedure right. You apply M models to a dataset and measure their accuracies by cross-validation. E...
7 years ago | 0
| Accepted
Answered
Classifier cross validation on grouped observations with different class ratio's
I haven't understood what you mean by "performing cross validation on 'grouped' observations. Where the patient ID would corresp...
8 years ago | 0
Answered
Why is SVM performance with small random datasets so high?
You have 12 observations. For each observation, the probability of correct classification is 0.5. What is the probability of cla...
8 years ago | 0
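The point above is a binomial calculation. For example (with a hypothetical threshold of 9 correct out of 12), the chance of doing that well purely by guessing is already about 7%:

    % P(at least 9 of 12 correct) when each guess is right with probability 0.5
    p = 1 - binocdf(8, 12, 0.5)   % approximately 0.073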
Answered
Feature selection to perform classification using Multinomial Logistic Regression
You describe a procedure for selecting a set of features at fixed hyperparameter values. You do not say what you do, if anything...
8 years ago | 0
Answered
Feature selection to perform classification using Multinomial Logistic Regression
You should not use a linear model for feature selection and a nonlinear model for classification on the selected features. If yo...
8 years ago | 0
| Accepted
Answered
How to solve the “out of memory” problem in Logistic Regression achieved by “glmfit”
Logistic regression on tall (out-of-memory) arrays is supported in 16b through the |fitglm| function.
8 years ago | 1
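A minimal sketch of logistic regression on a tall array, assuming the data sits in CSV files matched by a hypothetical pattern and contains predictors x1, x2 and a 0/1 response y:

    ds = datastore('mydata*.csv');     % hypothetical file pattern
    tt = tall(ds);                     % out-of-memory tall table
    mdl = fitglm(tt, 'y ~ x1 + x2', 'Distribution', 'binomial');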
Answered
fitcsvm with identical variables gives different result on different machines
My guess is that gradients for two or more observations become equal within floating-point accuracy during optimization. The sol...
8 years ago | 1
| Accepted
Answered
How to get the mean of ROC curves using Matlab?
Use |perfcurve|. Take a look at this piece of <http://www.mathworks.com/help/stats/perfcurve.html#bupt4p4-3 documentation>. Pass...
8 years ago | 0
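A minimal sketch of averaging ROC curves over cross-validation folds with perfcurve, assuming labels and scores are cell arrays with one cell per fold and posClass is the positive-class label:

    % With cell-array inputs, perfcurve cross-validates and returns X and Y as
    % three-column matrices: pointwise mean plus lower and upper confidence bounds.
    [X, Y, T, AUC] = perfcurve(labels, scores, posClass);
    plot(X(:,1), Y(:,1));
    xlabel('False positive rate'); ylabel('True positive rate');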
Answered
statistical significance at the 95% confidence level
The reference is T. Dietterich. Approximate statistical tests for comparing supervised classification learning algorithms....
8 years ago | 2
| Accepted
Answered
How can I change the properties of a classification model template?
modelTemplate.ModelParams.BoxConstraint = 100; This is undocumented and can change in a future release.
8 years ago | 2
| Accepted
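For context, the documented route is to set the constraint when the template is created; the assignment above is only needed to modify an already existing template object. A sketch with hypothetical data X, Y:

    t = templateSVM('BoxConstraint', 100);   % documented way to set the constraint
    mdl = fitcecoc(X, Y, 'Learners', t);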
Answered
definition of score when using "predict" on trained adaBoostM1
Going to MATLAB online doc, typing 'AdaBoost' in the search box and then selecting the 3rd match brings me to this page: http...
8 years ago | 0
Answered
Matlab SVM linear binary classification failure
This is a consequence of the data being poorly scaled. Do std(m3) and observe that the standard deviations of the two predictors...
8 years ago | 0
| Accepted
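A minimal sketch of the diagnosis and the fix, assuming m3 is the predictor matrix and g holds the class labels:

    std(m3)                                     % compare the scales of the predictors
    mdl = fitcsvm(m3, g, 'Standardize', true);  % let fitcsvm rescale each predictor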
Answered
perfcurve and ROC curve
Q must be classification scores. What you put in Q sounds more like what perfcurve should return as output. Take a classifier fr...
8 years ago | 0
Answered
One standard error rule, classification in Matlab
The purpose is to simplify the tree without losing too much accuracy. One standard error is a heuristic rule. If one number is w...
8 years ago | 0
| Accepted
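A minimal sketch of applying the one-standard-error rule to choose a pruning level, assuming tree is a ClassificationTree grown by fitctree on hypothetical data X, Y:

    tree = fitctree(X, Y);
    % 'TreeSize','se' picks the smallest subtree within one standard error of the minimum
    [E, SE, Nleaf, bestLevel] = cvloss(tree, 'Subtrees', 'all', 'TreeSize', 'se');
    prunedTree = prune(tree, 'Level', bestLevel);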
Answered
How can I make a decision stump using a decision tree ?
Use fitctree(X,Y,'minparent',size(X,1),'prune','off','mergeleaves','off')
9 years ago | 1
| Accepted
Answered
setting OOBPredictorImportance to 'on' or true generates error
You are likely using an older release of MATLAB. Refer to the doc for that release or type 'help TreeBagger.TreeBagger' to see p...
9 years ago | 1