search for: hyperplanes

Displaying 20 results from an estimated 33 matches for "hyperplanes".

2005 May 12
2
SVM linear kernel and SV
Dear all, I'm a trainee statistician in a company and we'd like to understand the svm mechanism, starting with simple examples. I use the e1071 package and I have several questions. I'm working with data extracted from the cats data (from R). My dataset corresponds to a completely separable case with a binary response variable (Sex, with 2 levels: F and M) and two explanatory variables (Bwt
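A minimal sketch of the kind of fit the poster describes, assuming the MASS cats data and e1071's linear kernel (the poster's extracted, fully separable subset is not shown, so the full cats data stand in):
```
library(e1071)
data(cats, package = "MASS")   # Sex (F/M), Bwt, Hwt
fit <- svm(Sex ~ Bwt + Hwt, data = cats, kernel = "linear", scale = FALSE)
fit$index   # which training rows are support vectors
fit$SV      # the support vectors themselves
fit$rho     # offset rho of the decision function w'x - rho
```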
2009 Jan 10
1
how to get the signed distance in SVM?
Dear all, In the svm() function in the package e1071, is there any way to get the signed distance of a sample point to the separating hyperplane in the feature space? Namely, if the separating hyperplane is given by f(x) = h(x)^T * w - rho, is there any way to get f(x)? Also, in the returned values of the function svm(), what does "$coefs" mean? It is said to be the
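A minimal sketch of one way to get these values from e1071 (not taken from the thread): predict() can return the decision values f(x) directly via its decision.values argument; iris stands in for the poster's data.
```
library(e1071)
dat <- iris[1:100, ]
dat$Species <- factor(dat$Species)            # keep only the two classes present
fit  <- svm(Species ~ ., data = dat, kernel = "linear")
pred <- predict(fit, dat[, -5], decision.values = TRUE)
f    <- attr(pred, "decision.values")         # signed values of f(x) = h(x)'w - rho
head(f)
```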
2020 Oct 27
3
R for-loop to add layer to lattice plot
Hello, I am using e1071 to run a support vector machine. I would like to plot the data with lattice and specifically show the hyperplanes created by the system. I can store the hyperplane as a contour in an object, and I can plot one object at a time. Since there will be thousands of elements to plot, I can't manually add them one by one to the plot, so I tried to loop over them, but only the last is added. Here is the working ex...
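A hedged sketch of one way around the loop problem (the thread's data and stored contours are not shown, so stand-ins are used): draw every stored line inside a single lattice panel function, so each iteration is drawn rather than only the last.
```
library(lattice)
set.seed(1)
pts <- data.frame(x = runif(50), y = runif(50))
## stand-in for the stored hyperplane/contour objects:
lines_list <- lapply(1:5, function(i) data.frame(x = c(0, 1), y = runif(2)))

xyplot(y ~ x, data = pts,
       panel = function(...) {
         panel.xyplot(...)
         for (ln in lines_list)        # every stored line is drawn, not just the last
           panel.lines(ln$x, ln$y, col = "grey40")
       })
```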
2004 Dec 17
3
How to interpret and modify "plot.svm"?
Dear R people, I am trying to plot the results from running svm in library(e1071). I use plot.svm. After searching through the help archives and FAQ, I still have several questions: 1. By default, crosses indicate support vectors. But why are there two colors of crosses? What do they represent? 2. I want to draw a white-gray colored plot and modify the different colored crosses or circles by
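On question 1, the crosses are the support vectors and their two colours correspond to the two classes; the colours are taken from plot.svm's symbolPalette argument. A hedged sketch of adjusting the symbols (argument names as in the e1071 help page; the cats data set is only a stand-in, and the filled background colours come from the underlying contour plot, so a fully grey-scale plot may need further tweaking):
```
library(e1071)
data(cats, package = "MASS")
fit <- svm(Sex ~ Bwt + Hwt, data = cats)
plot(fit, cats,
     symbolPalette = c("grey30", "grey70"),   # point/cross colour per class
     svSymbol   = "x",                        # symbol used for support vectors
     dataSymbol = "o")                        # symbol used for the remaining points
```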
2020 Oct 28
0
R for-loop to add layer to lattice plot
On Tue, Oct 27, 2020 at 6:04 PM Luigi Marongiu <marongiu.luigi at gmail.com> wrote: > > Hello, > I am using e1071 to run support vector machine. I would like to plot > the data with lattice and specifically show the hyperplanes created by > the system. > I can store the hyperplane as a contour in an object, and I can plot > one object at a time. Since there will be thousands of elements to > plot, I can't manually add them one by one to the plot, so I tried to > loop into them, but only the last is adde...
2003 Sep 14
3
Re: Logistic Regression
Christoph Lehman had problems with separated data in two-class logistic regression. One useful little trick is to penalize the logistic regression using a quadratic penalty on the coefficients. I am sure there are functions in the R contributed libraries to do this; otherwise it is easy to achieve via IRLS using ridge regressions. Then even though the data are separated, the penalized
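A minimal sketch of that suggestion using glmnet's ridge (alpha = 0) logistic regression; glmnet is just one of several contributed packages that implement a quadratic penalty and is not named in the post:
```
library(glmnet)
set.seed(1)
x <- matrix(rnorm(40 * 2), ncol = 2)
y <- as.numeric(x[, 1] > 0)                  # perfectly separated classes
fit <- glmnet(x, y, family = "binomial", alpha = 0, lambda = 0.1)  # ridge penalty
coef(fit)                                    # coefficients stay finite despite separation
```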
2006 Jan 04
2
Looking for packages to do Feature Selection and Classification
Hi All, Sorry if this is a repost (a quick browse didn't give me the answer). I wonder if there are packages that can do feature selection and classification at the same time. For instance, I am using SVM to classify my samples, but it is easy to overfit when using all of the features. Thus, it is necessary to select "good" features to build an optimal hyperplane (?).
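One package that wraps selection and classification together is caret, via recursive feature elimination. A hedged sketch with the built-in random-forest helper functions (an SVM could be plugged in through caretFuncs instead; the data here are made up and the randomForest package is needed):
```
library(caret)
set.seed(1)
x <- data.frame(matrix(rnorm(100 * 20), ncol = 20))
y <- factor(ifelse(x[, 1] + x[, 2] > 0, "A", "B"))
ctrl <- rfeControl(functions = rfFuncs, method = "cv", number = 5)
fit  <- rfe(x, y, sizes = c(2, 5, 10), rfeControl = ctrl)
predictors(fit)     # the subset of features chosen by cross-validated RFE
```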
2004 May 11
1
Re: Problems with Kmeans...
Sorry, to answer your question I tried: data(faithful); kmeans(faithful[c(1:20),1],10) Error: empty cluster: try a better set of initial centers. But when I run this a second time it is OK. It seems that kmeans has problems initializing good starting points because of the random choice of these initial points. With kmeans(data, centers = c(...)) the problem can be solved.
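A minimal sketch of the workaround described, i.e. supplying well-spread initial centers yourself instead of relying on the random draw (the quantile choice below is only one possibility):
```
data(faithful)
x <- faithful[1:20, 1]
## explicit, well-spread centers typically avoid the unlucky random initialisation:
ctrs <- quantile(x, probs = seq(0.05, 0.95, length.out = 10))
fit  <- kmeans(x, centers = ctrs)
fit$size   # cluster sizes
```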
2018 Apr 20
1
Further questions
Hi R folks, In my previous post I forgot to mention that I was new to R. I was really grateful for your quick help. I have two further questions: 1) In the graph of a regression line I would like to show one specific residual, yi obs - yi pred (let's take the person whose residual is 76). How do I add a bracket to this vertical distance and name it? I'm getting stuck after the
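On question 1, a minimal base-graphics sketch (the thread's data are not shown, so mtcars stands in; a vertical segment plus a label marks the chosen residual, which is the usual substitute for a bracket):
```
fit <- lm(mpg ~ wt, data = mtcars)
i   <- which.max(abs(resid(fit)))            # the observation to highlight
plot(mpg ~ wt, data = mtcars)
abline(fit)
segments(mtcars$wt[i], fitted(fit)[i], mtcars$wt[i], mtcars$mpg[i], col = "red")
text(mtcars$wt[i], (fitted(fit)[i] + mtcars$mpg[i]) / 2,
     labels = expression(y[i]^obs - y[i]^pred), pos = 4)
```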
2012 Jul 25
3
Plotting LDA results
Dear Users! I think I still have some problems in understanding LDA and the methods of plotting the results. The case is the following: I have a dataset containing two classes where each data point has 19 dimensions. Training with lda(...) works fine, and I get 19 LD coefficients. So far so good. Now I want to visualize the result, and here is where my simple knowledge ends. What
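With two classes lda() produces a single discriminant, so the usual picture is simply the data projected onto LD1, split by class. A minimal sketch with made-up 19-dimensional data:
```
library(MASS)
set.seed(1)
x   <- rbind(matrix(rnorm(50 * 19), ncol = 19),
             matrix(rnorm(50 * 19, mean = 1), ncol = 19))
grp <- factor(rep(c("A", "B"), each = 50))
fit <- lda(x, grouping = grp)
ld1 <- predict(fit, x)$x[, 1]   # scores on the single discriminant axis
plot(ld1, col = grp, pch = 19, ylab = "LD1")
plot(fit)                       # or the stacked class histograms from plot.lda
```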
2002 Aug 20
0
Re: SVM questions
> > So I guess from your prev. email the svmModel$coefs correspond to the > "Alpha". Yes (times the sign of y!). > > Why do I see three columns in the coefs? (Is this the number of classes - 1 > = Number of hyperplanes) Yes, but in a packed format which is not trivial. I attach some explanation I sent to R-help some time ago (the guy wanted to write his own predict method). > Is there a way to determine from the SV's key components of which in this > case would be which of the 2000 genes helped in th...
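On the last question, for a binary problem with a linear kernel the weight vector can be reconstructed from coefs and the support vectors, and its largest entries point to the most influential inputs (the genes, in the poster's case). A hedged sketch with stand-in data:
```
library(e1071)
set.seed(1)
x <- matrix(rnorm(60 * 10), ncol = 10)          # stand-in for the expression matrix
colnames(x) <- paste0("gene", 1:10)
y <- factor(ifelse(x[, 3] - x[, 7] > 0, "A", "B"))
fit <- svm(x, y, kernel = "linear", scale = FALSE)
w <- t(fit$coefs) %*% fit$SV                    # coefs = alpha_i * sign(y_i)
sort(abs(w[1, ]), decreasing = TRUE)            # inputs ranked by |weight|
```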
2020 Oct 23
2
How to shade area between lines in ggplot2
Also, from this site: https://plotly.com/ggplot2/geom_ribbon/ I gather that the answer is geom_ribbon, but I am still missing something ``` #! plot p = ggplot(data = trainset, aes(x=x, y=y, color=z)) + geom_point() + scale_color_manual(values = c("red", "blue")) # show support vectors df_sv = trainset[svm_model$index, ] p = p + geom_point(data = df_sv, aes(x=x, y=y),
2020 Oct 23
2
How to shade area between lines in ggplot2
Thank you, but this splits the area into two and distorts the shape of the plot (compared to ``` p + geom_abline(slope = slope_1, intercept = intercept_1 - 1/w[2], linetype = "dashed", col = "royalblue") + geom_abline(slope = slope_1, intercept = intercept_1 + 1/w[2], linetype = "dashed", col = "royalblue") ``` Why there
2020 Oct 23
0
How to shade area between lines in ggplot2
Hi What about something like p+geom_ribbon(aes(ymin = slope_1*x + intercept_1 - 1/w[2], ymax = slope_1*x + intercept_1 + 1/w[2], fill = "grey70", alpha=0.1)) Cheers Petr > -----Original Message----- > From: Luigi Marongiu <marongiu.luigi at gmail.com> > Sent: Friday, October 23, 2020 11:11 AM > To: PIKAL Petr <petr.pikal at precheza.cz> > Cc: r-help
2020 Oct 26
0
How to shade area between lines in ggplot2
Hi, Put fill outside aes: p+geom_ribbon(aes(ymin = slope_1*x + intercept_1 - 1/w[2], ymax = slope_1*x + intercept_1 + 1/w[2]), fill = "blue", alpha=0.1) The "hole" is because you have two levels of data (red and blue). To get rid of this you should put new data in the ribbon call. Something like newdat <- trainset newdat$z <- factor(0) p+geom_ribbon(data=newdat, aes(ymin =
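Putting the thread's pieces together, a hedged sketch (slope_1, intercept_1 and w would come from the fitted SVM; stand-in values and data are used here):
```
library(ggplot2)
set.seed(1)
trainset <- data.frame(x = runif(40), z = factor(rep(c("red", "blue"), 20)))
trainset$y <- trainset$x + ifelse(trainset$z == "red", 0.3, -0.3) + rnorm(40, 0, 0.05)
slope_1 <- 1; intercept_1 <- 0; w <- c(1, 2)     # stand-ins for the SVM quantities

newdat   <- trainset
newdat$z <- factor(0)                            # one level, so the ribbon is not split
ggplot(trainset, aes(x = x, y = y, color = z)) +
  geom_point() +
  geom_ribbon(data = newdat,
              aes(ymin = slope_1 * x + intercept_1 - 1 / w[2],
                  ymax = slope_1 * x + intercept_1 + 1 / w[2]),
              fill = "blue", alpha = 0.1, colour = NA) +  # fill outside aes(), as advised
  geom_abline(slope = slope_1, intercept = intercept_1 - 1 / w[2], linetype = "dashed") +
  geom_abline(slope = slope_1, intercept = intercept_1 + 1 / w[2], linetype = "dashed")
```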
2004 Dec 01
2
2.0.1 compilation problem on Fedora Core 2
I have a compilation problem on FC2, 2xXeon box. The following output from the end of the compilation illustrates the problem: [very large snipping sound ...] * DONE (cluster) begin installing recommended package foreign make[2]: *** [foreign.ts] Error 1 make[2]: Leaving directory `/usr/src/redhat/SOURCES/R-2.0.1/src/library/Recommended' make[1]: *** [recommended-packages] Error 2 make[1]:
2007 Jul 13
0
convhulln {geometry} output from .call
...8 qhull warning: joggle ('QJ') always produces simplicial output. Triangulated output ('Qt') does nothing. Convex hull of 10 points in 3-d: Number of vertices: 7 Number of facets: 10 Statistics for: | qhull s Qt Tcv QJ Number of points processed: 7 Number of hyperplanes created: 16 Number of distance tests for qhull: 39 CPU seconds to compute hull (after input): 0 Input joggled by: 6.9e-11 Output completed. Verifying that all points are below 6.9e-15 of all facets. Will make 100 distance computations. > -- Daniel E. Bunker TraitNet Associate Di...
2010 Jun 11
1
Decision values from KSVM
Hi, I'm working on a project using the kernlab library. For one phase, I want the "decision values" from the SVM prediction, not the class label. The e1071 library has this function, but I can't find the equivalent in ksvm. In general, when an SVM is used for classification, the label of an unknown test-case is decided by the "sign" of its resulting value as
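A minimal sketch of the kernlab equivalent, using predict.ksvm's type argument (iris stands in for the project's data):
```
library(kernlab)
dat <- iris[1:100, ]
dat$Species <- factor(dat$Species)                 # binary problem
fit <- ksvm(Species ~ ., data = dat, kernel = "vanilladot")
head(predict(fit, dat[, -5], type = "decision"))   # decision values, not class labels
```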
2010 Feb 23
0
BUG with LSSVM in R:
Hello, I have noticed a bug with the LSSVM implementation in R. It could be a bug with LSSVM itself that causes this problem. I thought I should post this message to see if anyone else is familiar with this problem and can explain why the result differs for odd and even numbers of cases. Once the hyperplane is found using LSSVM, the prediction results vary when you predict an odd or even number of
2012 Jul 26
0
lda, collinear variables and CV
Dear R-help list, apparently lda from the MASS package can be used in situations with collinear variables. It then only produces a warning, but at least it defines a classification rule and produces results. However, I can't find on the help page how exactly it does this. I have a suspicion (it may look at the hyperplane containing the class means, using some kind of default/trivial
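The behaviour is easy to reproduce; a minimal sketch with an exactly collinear column (made-up data):
```
library(MASS)
set.seed(1)
dat <- data.frame(a = rnorm(40), b = rnorm(40))
dat$c <- dat$a + dat$b                     # exactly collinear with a and b
grp <- factor(rep(c("A", "B"), 20))
fit <- lda(dat, grouping = grp)            # warns that the variables are collinear...
predict(fit, dat)$class                    # ...but still returns a classification rule
```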