Displaying 20 results from an estimated 33 matches for "hyperplan".
2005 May 12
2
SVM linear kernel and SV
...ted from the cats data (from R). My dataset is a completely
separable case with a binary response variable (Sex, with 2 levels: F and
M), two explanatory variables (Bwt and Hwt), and balanced classes.
I've used svm() with a linear kernel and I'd like to plot the linear
hyperplane and the support vectors. I use plot.svm() and expected to
find the support vectors of each class aligned (because the hyperplane
is linear), but that wasn't the case. Could you explain why?
In addition, when I change the option 'scale' (from TRUE to FALSE) the
res...
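A minimal sketch of the setup described, assuming the cats data from MASS (the post's own data source); note that with a soft-margin C-classifier, any point on or inside the margin becomes a support vector, so the SVs need not lie aligned on the two margin lines even for a linear kernel:

library(e1071)
library(MASS)                                   # provides the cats data
m <- svm(Sex ~ Bwt + Hwt, data = cats, kernel = "linear", scale = FALSE)
plot(m, cats)                                   # e1071's built-in SVM plot
# recover the separating line w1*Bwt + w2*Hwt = rho by hand
w <- t(m$coefs) %*% m$SV                        # primal weight vector
plot(cats$Bwt, cats$Hwt, col = cats$Sex, pch = 19)
abline(a = m$rho / w[2], b = -w[1] / w[2])      # the fitted hyperplane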
2009 Jan 10
1
how to get the signed distance in SVM?
Dear all,
In the svm() function in the package e1071, is there any way to get the
signed distance of a sample point to the separating hyperplane in the
feature space? Namely, if the separating hyperplane is given by f(x) =
h(x)^T * w - rho, is there any way to get f(x)?
Also, in the returned values of the function svm(), what does "$coefs"
mean? It is said to be the "corresponding coefficients times the
training labels"...
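For reference, e1071's predict() can return exactly these values; a minimal sketch (the two-class iris subset is just an illustration):

library(e1071)
d <- droplevels(iris[1:100, ])                  # two-class toy data
m <- svm(Species ~ ., data = d, kernel = "linear", scale = FALSE)
p <- predict(m, d, decision.values = TRUE)
f <- attr(p, "decision.values")                 # the f(x) values, one column per binary problem
head(f)
# for a linear kernel, dividing by ||w|| turns f(x) into a geometric signed distance
w <- t(m$coefs) %*% m$SV
head(f / sqrt(sum(w^2)))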
2020 Oct 27
3
R for-loop to add layer to lattice plot
Hello,
I am using e1071 to run a support vector machine. I would like to plot
the data with lattice and specifically show the hyperplanes created by
the system.
I can store the hyperplane as a contour in an object, and I can plot
one object at a time. Since there will be thousands of elements to
plot, I can't manually add them one by one to the plot, so I tried to
loop over them, but only the last is added.
Here is the working...
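The usual cause of "only the last is added" is that lattice draws everything through a panel function, so layers added in a top-level loop are lost on each redraw. Drawing all stored objects inside one panel function avoids the loop problem entirely; a sketch, where trainset, z, and the list contours are hypothetical stand-ins for the poster's objects:

library(lattice)
# contours: a list of data frames, each holding the x/y coordinates of one hyperplane
xyplot(y ~ x, data = trainset, groups = z,
       panel = function(...) {
         panel.xyplot(...)
         for (ct in contours)                   # every stored object drawn in one pass
           panel.lines(ct$x, ct$y, col = "grey40")
       })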
2004 Dec 17
3
How to interpret and modify "plot.svm"?
...ult, crosses indicate support vectors. But why are there
two colors of crosses? What do they represent?
2. I want to draw a grayscale plot and replace the different
colored crosses and circles with differently shaped points. Could anyone
give me a hint?
3. Is it possible to draw a "hyperplane" on the plot?
4. What algorithm is used to plot the contour region?
Thank you very much,
Frank
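On questions 1 and 2: the symbols are coloured by class (via symbolPalette), which is why the support-vector crosses come in two colours in a two-class problem. A sketch of a grayscale version, assuming plot.svm()'s documented arguments and that extra arguments reach the underlying filled.contour() call:

library(e1071)
d <- iris[, c("Petal.Length", "Petal.Width", "Species")]
m <- svm(Species ~ ., data = d)
plot(m, d,
     symbolPalette = gray.colors(3),            # one gray level per class
     svSymbol = 17, dataSymbol = 1,             # pch codes: SVs as triangles, rest as circles
     color.palette = gray.colors)               # grayscale decision regions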
2020 Oct 28
0
R for-loop to add layer to lattice plot
On Tue, Oct 27, 2020 at 6:04 PM Luigi Marongiu <marongiu.luigi at gmail.com> wrote:
>
> Hello,
> I am using e1071 to run support vector machine. I would like to plot
> the data with lattice and specifically show the hyperplanes created by
> the system.
> I can store the hyperplane as a contour in an object, and I can plot
> one object at a time. Since there will be thousands of elements to
> plot, I can't manually add them one by one to the plot, so I tried to
> loop into them, but only the last is ad...
2003 Sep 14
3
Re: Logistic Regression
...erwise it is easy to achieve via IRLS
using ridge regressions. Then even though the data are separated, the penalized log-likelihood
has a unique maximum. One intriguing feature is that as the penalty parameter goes to zero,
the solution converges to the SVM solution, i.e. the optimal separating hyperplane;
see http://www-stat.stanford.edu/~hastie/Papers/margmax1.ps
--------------------------------------------------------------------
Trevor Hastie hastie@stanford.edu
Professor, Department of Statistics, Stanford University
Phone: (650) 725-2231 (Statistics)...
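A rough numerical check of that margin-maximizing limit, using glmnet for the ridge-penalized logistic fits and e1071 for the max-margin reference (both package choices are mine, not from the post; directions agree only up to sign):

library(glmnet); library(e1071)
set.seed(1)
x <- rbind(matrix(rnorm(100), 50), matrix(rnorm(100, 4), 50))   # separable 2-D data
y <- rep(0:1, each = 50)
fit <- glmnet(x, y, family = "binomial", alpha = 0,             # ridge penalty
              lambda = 10^seq(0, -4, by = -1))
b <- as.matrix(coef(fit))[-1, ]                                 # slopes, one column per lambda
dirs <- apply(b, 2, function(v) v / sqrt(sum(v^2)))             # normalized directions
round(t(dirs), 3)                               # rows: ridge direction as lambda shrinks
sv <- svm(x, factor(y), kernel = "linear", cost = 1e5, scale = FALSE)
w <- t(sv$coefs) %*% sv$SV
round(w / sqrt(sum(w^2)), 3)                    # max-margin direction (up to sign)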
2006 Jan 04
2
Looking for packages to do Feature Selection and Classification
...I wonder if there are packages that can do feature selection and
classification at the same time. For instance, I am using an SVM to classify my
samples, but it is easy to overfit when using all of the features. Thus,
it is necessary to select "good" features to build an optimal hyperplane
(?). Here is a simple example: suppose I have 100 "useful" features and 100
"useless" features (i.e. noise features); I want the SVM to give me the
same results when 1) using only the 100 useful features or 2) using all 200
features.
Any suggestions, or could you point me to a reference?
Than...
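A toy version of exactly that experiment (my own simulation, not from the thread), comparing holdout accuracy of a linear SVM with and without the noise block:

library(e1071)
set.seed(1)
n <- 200
y <- factor(rep(c("A", "B"), each = n / 2))
useful <- matrix(rnorm(n * 100, mean = ifelse(y == "A", 0, 0.25)), n, 100)
noise  <- matrix(rnorm(n * 100), n, 100)        # pure noise features
acc <- function(x) {
  i <- sample(n, n / 2)                         # random train/test split
  m <- svm(x[i, ], y[i], kernel = "linear")
  mean(predict(m, x[-i, ]) == y[-i])
}
acc(useful)                                     # 100 informative features only
acc(cbind(useful, noise))                       # informative + noise features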
2004 May 11
1
Re: Problems with Kmeans...
...I run this a second time it will be ok.
It seems that kmeans has trouble finding good starting points, because these initial points are chosen at random.
With kmeans(data, centers = c(...)) the problem can be solved.
Generally, the starting points can be chosen equidistant on a hyperplane of the data, which is also a simple way to get the initial points
(www.fuzzyclustering.de, fc package by Höppner, manual).
Thank you for your comment,
Matthias
-----Original Message-----
From: Unung Istopo Hartanto [mailto:unung at enciety.com]
Sent: Tuesday, 11 May 2004 16:...
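For the random-start problem, base R's kmeans() already offers both remedies: nstart reruns the algorithm from many random initializations and keeps the best solution, or a matrix of explicit starting centers can be supplied. A minimal sketch on simulated data:

set.seed(42)
x <- rbind(matrix(rnorm(100, 0), ncol = 2),
           matrix(rnorm(100, 4), ncol = 2))
km1 <- kmeans(x, centers = 2, nstart = 25)      # 25 restarts, lowest within-SS kept
init <- x[c(1, 100), ]                          # explicit centers, one row per cluster
km2 <- kmeans(x, centers = init)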
2018 Apr 20
1
Further questions
...")
abline(SimpleLinearReg1, col="red")
res <- signif(residuals(SimpleLinearReg1), 2)
pre <- predict(SimpleLinearReg1)
segments(Age, BloodPressure, Age, pre, col="red")
library(calibrate)
textxy(Age, BloodPressure, res, cex=0.7)
2) I also need your help plotting a hyperplane for a multiple regression, and here I am really stuck...
inc <- c(25000, 28000, 17500, 30000, 25000, 32000, 30000, 29000, 26000, 40000)
age <- c(60, 36, 21, 35, 33, 43, 29, 45, 41, 48)
educ <- c(12, 12, 13, 16, 16, 12, 13, 15, 15, 20)
MultLinearReg1 <- lm(inc ~ age + educ)
summary(MultLi...
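One base-R way to draw the fitted regression plane, reusing the objects above: evaluate predict() on a grid of the two predictors and pass the result to persp() (grid size and viewing angles are arbitrary choices):

ag <- seq(min(age), max(age), length.out = 25)
ed <- seq(min(educ), max(educ), length.out = 25)
z  <- outer(ag, ed,
            function(a, e) predict(MultLinearReg1, data.frame(age = a, educ = e)))
persp(ag, ed, z, xlab = "age", ylab = "educ", zlab = "inc",
      theta = 30, phi = 20, ticktype = "detailed")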
2012 Jul 25
3
Plotting LDA results
...h datapoint has 19 dimensions. Training with lda(...) works fine, and I'm
getting 19 LD coefficients. So far so good. Now I want to visualize the
result, and here is where my simple knowledge ends. What I want to do
is make a scatterplot of two dimensions and then plot the projected
hyperplane in order to see the dividing line. The scatterplot I'm doing with
plot(class1[1:100,1], class1[1:100,2])
points(class2[1:100,1], class2[1:100,2])
but now I face the problem of determining the intercept for abline(...)
(the slope should be -LD[1]/LD[2], I think...).
What I really don...
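For two classes with equal priors, the LDA boundary is the line through the midpoint of the class means, normal to the scaling vector, which supplies both slope and intercept for abline(). A self-contained 2-D sketch (simulated data, since the original 19-dimensional set isn't shown):

library(MASS)
set.seed(1)
d <- data.frame(rbind(matrix(rnorm(200), 100, 2),
                      matrix(rnorm(200, mean = 2), 100, 2)),
                grp = factor(rep(1:2, each = 100)))
fit <- lda(grp ~ ., data = d)
w <- fit$scaling[, 1]                           # LD1 coefficients
m <- colMeans(fit$means)                        # midpoint of the class means
plot(d$X1, d$X2, col = d$grp)
# boundary: w[1]*x + w[2]*y = sum(w * m)
abline(a = sum(w * m) / w[2], b = -w[1] / w[2])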
2002 Aug 20
0
Re: SVM questions
>
> So I guess from your prev. email the svmModel$coefs correspond to the
> "Alpha".
yes (times the sign of y!).
>
> Why do I see three columns in the coefs? (Is this the number of classes - 1
> = number of hyperplanes?)
yes, but in a packed format which is not trivial.
I attach some explanation I sent to R-help some time ago (the guy wanted
to write his own predict method).
> Is there a way to determine from the SVs the key components, i.e. which of
> the 2000 genes in this case helped in...
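For the two-class case the packing is trivial and the primal weight vector falls out directly; a sketch in which a two-class iris subset stands in for the 2000-gene data (ranking |w_j| is one common way to see which variables drive a linear SVM):

library(e1071)
d <- droplevels(iris[1:100, ])                  # two-class toy stand-in
m <- svm(Species ~ ., data = d, kernel = "linear", scale = FALSE)
w <- drop(t(m$coefs) %*% m$SV)                  # coefs = alpha_i * y_i, so this is w
sort(abs(w), decreasing = TRUE)                 # larger |w_j| = more influential variable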
2020 Oct 23
2
How to shade area between lines in ggplot2
...= trainset, aes(x=x, y=y, color=z)) +
geom_point() + scale_color_manual(values = c("red", "blue"))
# show support vectors
df_sv = trainset[svm_model$index, ]
p = p + geom_point(data = df_sv, aes(x=x, y=y),
color="purple", size=4, alpha=0.5)
# show hyperplane (decision boundaries are offset by 1/w[2])
w = t(svm_model$coefs) %*% svm_model$SV # %*% = matrix multiplication
slope_1 = -w[1]/w[2]
intercept_1 = svm_model$rho / w[2]
p = p + geom_abline(slope = slope_1, intercept = intercept_1, col =
"royalblue4")
p = p + geom_ribbon(aes(ymin=in...
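One way to shade the band between the margin lines, reusing the objects above (the margin intercepts sit at (rho -/+ 1)/w[2], matching the comment in the code):

xr <- range(trainset$x)
band <- data.frame(x = xr,
                   ymin = slope_1 * xr + (svm_model$rho - 1) / w[2],
                   ymax = slope_1 * xr + (svm_model$rho + 1) / w[2])
p + geom_ribbon(data = band, aes(x = x, ymin = ymin, ymax = ymax),
                inherit.aes = FALSE, fill = "grey70", alpha = 0.3)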
2020 Oct 23
2
How to shade area between lines in ggplot2
...> > geom_point() + scale_color_manual(values = c("red", "blue"))
> > # show support vectors
> > df_sv = trainset[svm_model$index, ]
> > p = p + geom_point(data = df_sv, aes(x=x, y=y),
> >                    color="purple", size=4, alpha=0.5)
> > # show hyperplane (decision boundaries are offset by 1/w[2])
> > w = t(svm_model$coefs) %*% svm_model$SV # %*% = matrix multiplication
> > slope_1 = -w[1]/w[2]
> > intercept_1 = svm_model$rho / w[2]
> > p = p + geom_abline(slope = slope_1, intercept = intercept_1, col =...
2020 Oct 23
0
How to shade area between lines in ggplot2
..., y=y, color=z)) +
> geom_point() + scale_color_manual(values = c("red", "blue"))
> # show support vectors
> df_sv = trainset[svm_model$index, ]
> p = p + geom_point(data = df_sv, aes(x=x, y=y),
>                    color="purple", size=4, alpha=0.5)
> # show hyperplane (decision boundaries are offset by 1/w[2])
> w = t(svm_model$coefs) %*% svm_model$SV # %*% = matrix multiplication
> slope_1 = -w[1]/w[2]
> intercept_1 = svm_model$rho / w[2]
> p = p + geom_abline(slope = slope_1, intercept = intercept_1, col =
> "royalblue4")...
2020 Oct 26
0
How to shade area between lines in ggplot2
...e_color_manual(values = c("red", "blue"))
> > > # show support vectors
> > > df_sv = trainset[svm_model$index, ]
> > > p = p + geom_point(data = df_sv, aes(x=x, y=y),
> > >                    color="purple", size=4, alpha=0.5)
> > > # show hyperplane (decision boundaries are offset by 1/w[2])
> > > w = t(svm_model$coefs) %*% svm_model$SV # %*% = matrix multiplication
> > > slope_1 = -w[1]/w[2]
> > > intercept_1 = svm_model$rho / w[2]
> > > p = p + geom_abline(slope = slope_1, intercept = intercept_1, col =...
2004 Dec 01
2
2.0.1 compilation problem on Fedora Core 2
I have a compilation problem on FC2, a 2x Xeon box.
The following output from the end of the compilation illustrates the problem:
[very large snipping sound ...]
* DONE (cluster)
begin installing recommended package foreign
make[2]: *** [foreign.ts] Error 1
make[2]: Leaving directory
`/usr/src/redhat/SOURCES/R-2.0.1/src/library/Recommended'
make[1]: *** [recommended-packages] Error 2
make[1]:
2007 Jul 13
0
convhulln {geometry} output from .call
...8
qhull warning: joggle ('QJ') always produces simplicial output.
Triangulated output ('Qt') does nothing.
Convex hull of 10 points in 3-d:
Number of vertices: 7
Number of facets: 10
Statistics for: | qhull s Qt Tcv QJ
Number of points processed: 7
Number of hyperplanes created: 16
Number of distance tests for qhull: 39
CPU seconds to compute hull (after input): 0
Input joggled by: 6.9e-11
Output completed. Verifying that all points are below 6.9e-15 of
all facets. Will make 100 distance computations.
>
--
Daniel E. Bunker
TraitNet Associate...
2010 Jun 11
1
Decision values from KSVM
...svm.
In general, when an SVM is used for classification, the label of an
unknown test case is decided by the "sign" of its resulting value as
calculated from the SVM {-,+}.
I want the actual values as a proxy for the "strength" of the decision
(further from the hyperplane indicates more confidence).
Any suggestions?
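kernlab's predict method exposes these directly via type = "decision"; a minimal sketch on the spam data shipped with kernlab (my example, not the poster's):

library(kernlab)
data(spam)
fit <- ksvm(type ~ ., data = spam[1:300, ], kernel = "vanilladot")
predict(fit, spam[301:310, ], type = "decision")   # signed decision values, not labels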
2010 Feb 23
0
BUG with LSSVM in R:
...ello,
I have noticed a bug with the LSSVM implementation in R. It could be a bug in
LSSVM itself that causes this problem.
I thought I should post this message to see if anyone else is familiar with
this problem and can explain why the result differs for odd and even numbers
of cases.
Once the hyperplane is found using LSSVM, the prediction results vary depending
on whether you predict an odd or even number of samples. Why? Here is an example
with the iris data in R: keep reducing the prediction cases one by one, and you
will see the discrepancy I am talking about. In my own data, this discrepancy between odd
and even number o...
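A reconstruction of the kind of check described (my sketch, not the original code): fit once, then predict on shrinking subsets and watch whether the first case's label changes:

library(kernlab)
fit <- lssvm(Species ~ ., data = iris)
for (n in c(10, 9, 8, 7)) {
  p <- predict(fit, iris[1:n, ])                # predict on the first n cases
  cat(n, "cases -> first prediction:", as.character(p[1]), "\n")
}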
2012 Jul 26
0
lda, collinear variables and CV
...pparently lda from the MASS package can be used in situations with
collinear variables. It only produces a warning then, but at least it
defines a classification rule and produces results.
However, I can't find on the help page how exactly it does this. I have a
suspicion (it may look at the hyperplane containing the class means,
using some kind of default/trivial within-group covariance matrix), but I'd
like to know the details if possible.
I find it particularly puzzling that it produces different
results depending on whether I choose CV=TRUE or run a manual LOO cross-validation.
Constructing an examp...
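A sketch of the comparison being made, on well-conditioned data where the two procedures should agree (with collinear columns, lda's internal CV=TRUE updating may diverge from a literal refit, which appears to be the crux):

library(MASS)
cv <- lda(Species ~ ., data = iris, CV = TRUE)  # built-in leave-one-out predictions
manual <- sapply(seq_len(nrow(iris)), function(i) {
  f <- lda(Species ~ ., data = iris[-i, ])      # literal refit without case i
  as.character(predict(f, iris[i, , drop = FALSE])$class)
})
mean(manual == as.character(cv$class))          # proportion of agreement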