similar to: Problems with hclust and/or cutree.

Displaying 20 results from an estimated 7000 matches similar to: "Problems with hclust and/or cutree."

2011 Sep 12
1
hclust and cutree: identifying branches as classes
Good afternoon, After cutting a hierarchical tree using cutree(), how can we check the correspondence between classes and branches? This is what we do:
srndpchc <- hclust(dist(srndpc$x[1:1000,1:3]), method="ward") # creation of the hierarchical tree
plclust(srndpchc, hmin=20000) # visualisation
srndpchc20000 = cutree(srndpchc, h=20000) # returns 4 classes
table(srndpchc20000)
srndclass20000 =
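A minimal sketch of one way to line the cutree() classes up with the branches as plotted, reusing the poster's srndpc object (note that in current R the old "ward" method is spelled "ward.D" and plclust() has been replaced by plot()):
hc <- hclust(dist(srndpc$x[1:1000, 1:3]), method = "ward.D")
cl <- cutree(hc, h = 20000)   # classes are numbered by order of first appearance in the data
table(cl)                     # size of each class
plot(hc)
rect.hclust(hc, h = 20000)    # one rectangle per branch, drawn left to right
unique(cl[hc$order])          # cutree labels in left-to-right dendrogram order, one per rectangle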
2012 Mar 29
2
hclust and plot functions work, cutree does not
Hi, I have the distance matrix computed and I feed it to the hclust function. The plot function produces a dense dendrogram as well. But the cutree function does not produce the desired list. Here is the code:
x = data.frame(similarity_matrix)
colnames(x) = c(source_tags_vec)
rownames(x) = c(source_tags_vec)
clust_tree = hclust(as.dist(x), method="complete")
plot(clust_tree)
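One possible mismatch is that cutree() returns a named integer vector rather than a list; a minimal sketch of turning that vector into one character vector of labels per cluster (k = 5 is an arbitrary placeholder):
clust_tree <- hclust(as.dist(x), method = "complete")
groups <- cutree(clust_tree, k = 5)   # or h = <cut height>; one integer per row of x
split(names(groups), groups)          # list with the member labels of each cluster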
2011 Sep 16
1
cutree() and rect.hclust(): different labelling of classes
I've found that while cutree() and rect.hclust() produce the same classes for a given height in the dendrogram, the actual labelling of the classes differs. For example, both produce the same 4 classes, but class 1 according to cutree() is class 4 according to rect.hclust(). Would it be possible for future versions to use the same labelling? rect.hclust() is useful to display the classes
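One way to reconcile the two numberings, sketched under the assumption that hc is the hclust object and the tree is cut into 4 classes: rect.hclust() draws its rectangles left to right, while cutree() numbers classes by order of first appearance in the data.
cl <- cutree(hc, k = 4)
plot(hc)
rect.hclust(hc, k = 4)
left_to_right <- unique(cl[hc$order])   # cutree label of the 1st, 2nd, ... rectangle
relabelled <- match(cl, left_to_right)  # class ids renumbered in the plotted (rect.hclust) order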
2009 Sep 21
0
Help needed to clarify hclust and cutree algorithms
Dear R Helpers, I have carefully read the documentation and all postings on the hclust and cutree functions; however, some aspects of the tree ordering and cluster assignment performed by these functions remain unclear to me, so I would very much appreciate your help in making sure I get them right. Here is an example, with values chosen to illustrate the problems. I have a set of five profiles
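For orientation, the components that encode the ordering and the assignment rule can be inspected directly; a small self-contained illustration on a built-in data set (not the poster's five profiles):
hc <- hclust(dist(USArrests), method = "average")
hc$merge    # step i joins the two entries in row i; negative = single observation, positive = earlier step
hc$height   # the merge heights, in non-decreasing order
hc$order    # permutation of observations giving a crossing-free plot
cutree(hc, k = 5)   # cluster ids are assigned by order of first appearance in the data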
2011 Sep 13
2
help with hclust and cutree
Hello, I would like to cut an hclust tree into several groups at a specific similarity. I assume this can be achieved by giving that similarity as the "h" argument, e.g.:
clust <- hclust(dist, "average")
cut <- cutree(clust, h=0.65)
Now, I would like to draw rectangles around the branches of the dendrogram highlighting the corresponding clusters, as is done by
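The truncated sentence presumably heads toward rect.hclust(); a minimal sketch, assuming clust was built from a dissimilarity d (a similarity would need converting to a dissimilarity first):
clust <- hclust(d, method = "average")
cut <- cutree(clust, h = 0.65)
plot(clust)
rect.hclust(clust, h = 0.65)                 # rectangles around the branches below the cut
# or, equivalently, ask for as many rectangles as cutree() found groups:
rect.hclust(clust, k = length(unique(cut)))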
2013 Aug 22
1
Interpreting the result of 'cutree' from hclust/heatmap.2
I have the following code that performs hierarchical clustering and plots it as a heatmap.
library(gplots)
set.seed(538)
# generate data
y <- matrix(rnorm(50), 10, 5, dimnames=list(paste("g", 1:10, sep=""), paste("t", 1:5, sep="")))
# the actual data is much larger than the above
# perform hierarchical clustering and plot heatmap
test <- heatmap.2(y)
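heatmap.2() clusters rows with hclust(dist(...)) by default and returns the row dendrogram it drew, so the same tree can be cut afterwards; a sketch continuing the code above (k = 3 is arbitrary):
row_hc <- as.hclust(test$rowDendrogram)   # the dendrogram heatmap.2 actually plotted
cutree(row_hc, k = 3)                     # cluster id for each row, in the original row order of y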
2007 Mar 02
0
Dice dissimilarity output and 'phylo' function in R
Dear All, I am having some problems with the 'phylo' and dissimilarity functions in R. I converted an output from 'hclust' into an object of class 'phylo' so as to be able to use the 'consensus' function on it. Each time I run the consensus code, my computer hangs. When I try to see what the contents of the object converted to class 'phylo' are, I get the message
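Assuming the 'phylo' class in question is the one from the ape package, a hedged sketch of the conversion-plus-consensus step, with d1 and d2 standing in for the poster's Dice dissimilarities:
library(ape)
tr1 <- as.phylo(hclust(d1, method = "average"))
tr2 <- as.phylo(hclust(d2, method = "average"))
cons <- consensus(tr1, tr2, p = 0.5)   # majority-rule consensus; the trees must share tip labels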
2009 Mar 02
0
Distance between clusters
Dear friends, let me reformulate the question; I don't think I formulated it properly. I have some data on a number of sites. I can define a dissimilarity between each pair of sites. Using this dissimilarity, I have clustered the sites with the hclust algorithm, method ward. I then obtain 48 clusters by cutting the tree using cutree with k=48. I would now like to estimate the distance between
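One way to put a number on "distance between clusters" is to average the original site dissimilarities between every pair of clusters; a sketch assuming d is the dissimilarity as a 'dist' object:
hc <- hclust(d, method = "ward.D")
cl <- cutree(hc, k = 48)
m  <- as.matrix(d)
# mean dissimilarity between cluster i and cluster j (the diagonal holds the within-cluster means):
between <- outer(1:48, 1:48, Vectorize(function(i, j) mean(m[cl == i, cl == j])))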
2003 Dec 11
1
cutree with agnes
Hi, this is more a (presumed) bug report than a question, because I can solve my own statistical problem by working with hclust instead of agnes. I have done a complete-linkage clustering on a dist object dm with 30 objects using agnes (R 1.8.0 on RedHat), and I want to obtain the partition that results from a cut at height=0.4. I run
> cl1a <- agnes(dm, method="complete")
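A workaround that keeps agnes in the picture: convert the agnes tree to an hclust object before cutting, since cutree() is written against hclust's representation.
library(cluster)
cl1a <- agnes(dm, method = "complete")
cutree(as.hclust(cl1a), h = 0.4)   # the partition from cutting the agnes tree at height 0.4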
2015 Jun 06
2
Request: making cutree S3 in R?
Hello all, A question/suggestion: I was wondering if there is a chance of changing stats::cutree to be an S3 generic with a cutree.hclust method? For example:
cutree <- function(tree, k = NULL, h = NULL, ...) {
  UseMethod("cutree")
}
cutree.hclust <- stats::cutree # This will obviously need the actual content of stats::cutree
This would make it nicer for people like me to add new methods to
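A sketch of the kind of dispatch being proposed; the method names below are illustrative only and not part of stats:
cutree <- function(tree, k = NULL, h = NULL, ...) UseMethod("cutree")
cutree.hclust  <- function(tree, k = NULL, h = NULL, ...) stats::cutree(tree, k = k, h = h)
cutree.default <- function(tree, k = NULL, h = NULL, ...) stats::cutree(tree, k = k, h = h)
# a package could then provide, say, cutree.dendrogram without having to mask stats::cutree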
2012 Oct 11
2
extracting groups from hclust() for a very large matrix
Hello, I'm having trouble figuring out how to see the resulting groups (clusters) from my hclust() output. I have a very large matrix of 4371 plots and 29 species, so simply looking at the graph is impossible. There must be a way to 'print' the results to a table that shows which plots were in what group, correct? I've attached the matrix I'm working with (the whole thing
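A minimal sketch of writing the memberships out to a table, assuming d is a dissimilarity built from the 4371 x 29 plot-by-species matrix and k = 20 is a placeholder choice:
hc <- hclust(d, method = "average")
groups <- cutree(hc, k = 20)                           # one group id per plot
out <- data.frame(plot = names(groups), group = groups)
write.csv(out, "plot_groups.csv", row.names = FALSE)   # a plain two-column table: plot, group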
2012 Aug 12
0
Different cluster orderings from cutree() and cut.dendrogram()
Hi! I just discovered that cutree() and cut.dendrogram() do not assign the same cluster numberings when called on the same tree. More specifically, cutree() assigns cluster numbers by order of appearance in the data, while cut.dendrogram() sorts clusters by height (see example below). I guess this is for historical reasons? I'm hit by this difference when I want to get a vector of cluster
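A small illustration of mapping one numbering onto the other, on a built-in data set with an arbitrary cut height of 100:
hc  <- hclust(dist(USArrests))
cl  <- cutree(hc, h = 100)                    # numbered by order of appearance in the data
low <- cut(as.dendrogram(hc), h = 100)$lower  # the subtrees, in cut.dendrogram()'s own order
sapply(low, function(subtree) unique(cl[labels(subtree)]))   # which cutree label each subtree carries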
2014 Jul 25
0
clustering with hclust
Hi everybody, I have a problem with a cluster analysis. I am trying to use hclust with method="ward". The Ward method works with SQUARED Euclidean distances, and hclust demands "a dissimilarity structure as produced by dist"; yet dist does not seem to produce a table of squared Euclidean distances when starting from cosines. In fact, computing manually the squared Euclidean distances from cosines
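Two ways of getting a genuine Ward clustering out of hclust, sketched with d standing in for the poster's dist object; both give the same tree topology and the same cutree() partitions, only the reported merge heights are on different scales:
d <- dist(x)                            # x: the coordinates derived from the cosines (assumption)
hc1 <- hclust(d^2, method = "ward.D")   # "ward.D" expects squared dissimilarities
hc2 <- hclust(d,   method = "ward.D2")  # R >= 3.1.0; squares the distances internally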
2005 Sep 15
2
about cutree
Hi Everyone, I'm trying to use cutree to get the clusters after hclust. What I used is:
mycluster <- cutree(cnclust, h=0.5)
Now, my problem is: how can I get the actual clusters? Thanks! Best, Baoqiang Cao
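cutree() already returns the cluster assignment, one integer per observation; a minimal sketch of pulling the members of each cluster out of that vector:
mycluster <- cutree(cnclust, h = 0.5)   # named integer vector: one cluster id per leaf
table(mycluster)                        # cluster sizes
split(names(mycluster), mycluster)      # the members of each cluster, as a list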
2010 Sep 22
0
How to Ignore NaN values in Rows when using hclust function in making Heatmap??
I am making heatmaps for a dataset (a ~300 x 600 matrix) with the following R script (I am not familiar with R and this is the first time I am using it).
library("gplots")
library("Cairo")
mydata <- read.csv(file="data.csv", header=TRUE, sep=",")
rownames(mydata) = mydata$Name
mydata <- mydata[,2:297]
mydatamatrix <- data.matrix(mydata)
mydatascale
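dist() already skips missing pairs when computing distances, so one workable pattern is to build the row and column trees yourself and hand them to heatmap.2(); a hedged sketch continuing the script above (rows that are entirely NaN would still need to be dropped first):
hr <- hclust(dist(mydatamatrix), method = "complete")      # row tree, NaNs handled pairwise by dist()
hc <- hclust(dist(t(mydatamatrix)), method = "complete")   # column tree
heatmap.2(mydatamatrix, Rowv = as.dendrogram(hr), Colv = as.dendrogram(hc),
          na.color = "grey", trace = "none")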
2001 Aug 22
1
cutree (PR#1067)
Full_Name: Anja von Heydebreck
Version: 1.3.0
OS: Alpha Unix
Submission from: (NULL) (141.14.19.61)
Hi, I repeatedly obtained meaningless results from the function 'cutree' in the 'mva' package when the argument 'h' was greater than or equal to the maximum height occurring:
> library('mva')
> y
     [,1] [,2] [,3] [,4]
[1,]    0    1   -1    1
[2,]    0   -1
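For reference, in current versions of R this boundary case is handled: cutting at or above the top of the tree simply returns a single cluster containing every observation.
hc <- hclust(dist(y))            # y as in the report
cutree(hc, h = max(hc$height))   # every observation ends up in cluster 1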
2007 Nov 27
2
exporting clustering results to table
Hello list, the following approach did not work:
clustersA <- pam(distances, nkA, diss=TRUE); gc();
filenameclu = paste("filenameclu", ".txt");
write.table(clustersA, file=filenameclu, sep=",");
although it worked with clustersA <- hclust(distances, method="ward") and a consecutive kclassA <- cutree(clustersA, k=nkA); filename =
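The likely culprit is that pam() returns a whole clustering object rather than a plain vector; the per-observation assignment lives in its $clustering component, which write.table() can serialise directly. A sketch reusing the poster's names:
library(cluster)
clustersA <- pam(distances, nkA, diss = TRUE)
write.table(clustersA$clustering, file = "filenameclu.txt", sep = ",")   # one row per object: id, cluster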
2002 Jan 22
1
cutree using a vector for h giving meaningless results
I am trying to use the routine cutree to cut a tree (created by hclust) into several groups by specifying the height of the cut:
a <- 1:10
cutree(tree, h=a*100)
The matrix with group memberships that is returned is fine for most of the heights, but in some cases (for example h=800 and h=900) the results don't make sense (group membership = 0 or 58965231; it looks like the range of data allowed by the data
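With a vector of heights, cutree() returns a matrix of memberships with one column per height; a small self-contained check on a built-in data set (in current R, heights at or above the top of the tree just yield a single all-inclusive cluster rather than garbage values):
hc   <- hclust(dist(USArrests))      # generic stand-in for the poster's tree
memb <- cutree(hc, h = (1:10) * 100)
dim(memb)                            # one row per observation, one column per requested height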