search for: probeids

Displaying 18 results from an estimated 18 matches for "probeids".

2010 Jul 06
2
Could not find createData function
Hi, I am using the "*Maanova* package" to do ANOVA. I have created a *datafile* with probeID as the first column, which is a tab-delimited text file, and also created a *designfile*. I have created a *readma object* named abf1. From that readma object, I have to create a data object using the *createData* function and a model object using the *makemodel* function,
2006 Jun 30
1
lme and SAS Proc mixed
...the same results as when using the following SAS code: proc mixed; class refseqid probeid probeno end; model expression=end logpgc / ddfm=satterth; random probeno probeid / subject=refseqid type=cs; lsmeans end / diff cl; run; There are 3 genes (refseqid), which form the large grouping factor, with 2 probeids nested within each refseqid and 16 probenos nested within each probeid. I have specified in the SAS Proc Mixed procedure that the variance-covariance structure is to be compound symmetric. Therefore, the variance-covariance matrix is a block-diagonal matrix of the form, V_1 0 0 0 V_...
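A minimal sketch of how the nested design described above might be expressed with nlme's lme(); the data frame name dat is an assumption, and nested random intercepts only approximate PROC MIXED's type=cs structure (they induce compound symmetry within each refseqid/probeid block), so this is not a guaranteed one-to-one translation:

library(nlme)
# Sketch only: assumes a data frame 'dat' with columns expression, end, logpgc,
# refseqid, probeid and probeno, mirroring the CLASS variables in the SAS code.
fit <- lme(expression ~ end + logpgc,
           random = ~ 1 | refseqid/probeid,
           data = dat)
summary(fit)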
2012 Nov 02
1
Bioconductor, merging annotation with list of probeids
Hi all, I'm very new to R so please forgive my poor language! I've been trying to map the corresponding annotation onto my list of probeids, but without success. I get this error: symbols <- mget(probes,mouse4302SYMBOL,ifnotfound=NA) Error in .checkKeysAreWellFormed(keys) : keys must be supplied in a character vector with no NAs. Thanks for your help! Brawni
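That error usually means the keys are a factor or contain NAs rather than being a plain character vector. A minimal sketch of the usual fix; the object name probes and the mouse4302SYMBOL map are taken from the snippet, the mouse4302.db package is an assumption about where that map lives:

library(mouse4302.db)   # assumed source of the mouse4302SYMBOL map used above
# mget() wants a character vector with no NAs, so coerce and drop missing IDs
probes <- as.character(probes)
probes <- probes[!is.na(probes)]
symbols <- mget(probes, mouse4302SYMBOL, ifnotfound = NA)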
2008 Mar 02
2
Variance Calculation in R
Hello, thanks everyone for helping me with the previous queries.
Step 1: here is the original data (short sample):
ProbeID    Sample_1_D   Sample_1_C   Sample_2_D   Sample_2_C
224588_at  2.425509867  11.34031409  11.46868531  11.75741478
Step 2: I calculate the variance of the samples using this R code:
x <- 1:20000
y <- 2:141
data.matrix <- data.matrix(data[,y])  # create data.matrix
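A minimal sketch of the per-probe variance step being described, assuming (as in the poster's code) that data holds the ProbeID in column 1 and the 140 expression values in columns 2:141:

expr <- data.matrix(data[, 2:141])   # expression values only, as a numeric matrix
probe_var <- apply(expr, 1, var)     # one variance per probe (per row)
head(probe_var)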
2006 Jun 30
0
SAS Proc Mixed and lme
...the same results as when using the following SAS code: proc mixed; class refseqid probeid probeno end; model expression=end logpgc / ddfm=satterth; random probeno probeid / subject=refseqid type=cs; lsmeans end / diff cl; run; There are 3 genes (refseqid), which form the large grouping factor, with 2 probeids nested within each refseqid and 16 probenos nested within each probeid. I have specified in the SAS Proc Mixed procedure that the variance-covariance structure is to be compound symmetric. Therefore, the variance-covariance matrix is a block-diagonal matrix of the form, V_1 0 0 0 V_...
2011 Jun 30
4
aggregating data
Hi, I am interested in using the cast function in R to perform some aggregation. I did once manage to get it working, but have now forgotten how I did this. So here is my dilemma. I have several thousand probes (about 180,000) corresponding to the genes; what I'd like to do is obtain a frequency count of the occurrences of each probe for each gene. The data would look
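A hedged sketch of one way to get the per-gene probe counts described above; the data frame df and its column names gene and probe are assumptions, not from the original post:

# Plain base R: a contingency table of probe occurrences per gene
counts <- as.data.frame(table(gene = df$gene, probe = df$probe))

# Or with reshape2's dcast(), in the spirit of the cast() call the poster remembers
library(reshape2)
wide <- dcast(df, gene ~ probe, fun.aggregate = length, value.var = "probe")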
2008 Mar 03
3
R data Export to Excel
Here is my R code:
x <- 1:20000
y <- 2:141
data.matrix <- data.matrix(data[,y])              # create data.matrix
variableprobe <- apply(data.matrix[x,], 1, var)
variableprobe                                     # output variance across probesets
hist(variableprobe)                               # display histogram of variableprobe
write.table(cbind(data[1], Variance=apply(data[,y],1,var)), file='c://variance.csv')  # export as a .csv file
Output in Excel all in 1
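One wrinkle in the code above: write.table() defaults to a space separator, so the ".csv" file will not open cleanly in Excel. A small sketch of the usual fix (path adapted from the post):

# write.csv() fixes the separator and quoting so Excel reads the file directly
write.csv(cbind(data[1], Variance = apply(data[, y], 1, var)),
          file = "c:/variance.csv", row.names = FALSE)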
2011 Mar 09
2
collapse a data column into a row
I have a file with data in columnar format like below:
probeID
rc_AI104113_at
rc_AI178259_f_at
rc_AI179134_i_at
rc_AI179134_f_at
rc_AI104113_at
rc_AA819429_f_at
How can I rewrite it in the format below?
'rc_AI104113_at', 'rc_AI178259_f_at', 'rc_AI179134_i_at', 'rc_AI179134_f_at', 'rc_AI104113_at', 'rc_AA819429_f_at'
Is there any function to do
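A minimal sketch of collapsing such a column into one quoted, comma-separated row; the file name is hypothetical and the header line is assumed to be "probeID" as shown above:

ids <- readLines("probes.txt")     # hypothetical file containing the column above
ids <- ids[ids != "probeID"]       # drop the header line
one_row <- paste0("'", ids, "'", collapse = ", ")
cat(one_row, "\n")
# 'rc_AI104113_at', 'rc_AI178259_f_at', 'rc_AI179134_i_at', ...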
2008 Mar 04
1
Export csv data
Hi everyone, thanks for all the help with the previous queries. Here is what I want to do: I have 20000 probesets --> calculate the variance across all the probesets --> filter out the probesets whose variance is low, so now I end up with only 10000. The 10000 are fine, but when I export to Excel, the probeID is missing. Here are my code and examples. #########calculate the variance across the
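A hedged sketch of filtering while carrying the probeID column along so it is not lost on export; the variance cutoff, column layout and object names are assumptions, not from the original post:

probe_var <- apply(data[, 2:141], 1, var)       # variance per probeset
keep <- probe_var > quantile(probe_var, 0.5)    # hypothetical cutoff: keep the top half
write.csv(data.frame(ProbeID = data[keep, 1],
                     data[keep, 2:141],
                     Variance = probe_var[keep]),
          file = "filtered_variance.csv", row.names = FALSE)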
2010 Jul 06
0
Error in createData function
Hi, I am using the "*Maanova* package" to do ANOVA. I have created a *datafile* with probeID as the first column, which is a tab-delimited text file, and also created a *designfile*. I have created a *readma object* named abf1. From that readma object, I have to create a data object using the *createData* function and a model object using the *makemodel* function,
2010 Jul 13
0
object of class madata
Hi, I am using the maanova package for doing ANOVA, but I am getting an error like the one below; please help me with this.
> TGR=read.madata("rmaexpr.dat",designfile="design.dat")
Reading one color array. Otherwise change arrayType='twoColor' then read the data again
Warning messages:
1: In read.madata("rmaexpr.dat", designfile = "design.dat") : Assume that
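If the arrays really are two-colour, the warning itself names the argument to change; a sketch based only on the warning text above (for one-colour data the default is already what you want):

# arrayType is the argument the warning message points at
TGR <- read.madata("rmaexpr.dat", designfile = "design.dat", arrayType = "twoColor")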
2009 Jun 30
1
beadarray package
Dear R users, I am using the beadarray package. I am trying to upload raw bead-level data using these commands: ######################################################## library(beadarray) datadir <- ("C:/Computer_programs/R/beadarray/cecilia") targets = read.table("targets.txt", sep = "\t", header = TRUE, as.is = TRUE) BLData = readIllumina(arrayNames =NULL,
2010 Apr 26
1
Error in pf(q, df1, df2, lower.tail, log.p) : Non-numeric argument to mathematical function
Input file:
snpid      indid        genotype  gvariable  probeid        gene   geneexpression
rs1040480  CHB_NA18524  C/T       2          GI_19743926-I  PTPRT  5.850586
rs1040480  CHB_NA18526  C/C       1          GI_19743926-I  PTPRT  6.028641
rs1040480  CHB_NA18529  C/C       3          GI_19743926-I  PTPRT  5.944392
rs1040481  CHB_NA18532  C/C       1          GI_19743926-I  PTPRT  5.938578
rs1040481  CHB_NA18537  C/C       2          GI_19743926-I  PTPRT  5.874439
rs1040481  CHB_NA18540  C/C       3          GI_19743926-I
2004 Nov 12
2
Boot from CD -> system + data on USB storage
Hi, I am looking for a solution to boot MY system on any PC. To store most of the system and all of my data I want to use USB storage (in my case an external USB hard disk, 2.0 capable). Since booting off a USB device is not a universal thing, I would prefer to have a boot disk with a minimal system - just enough to load most (all?) of the system from the attached USB device. Is this an
2020 Nov 19
0
[RFC] Control Flow Sensitive AutoFDO (FS-AFDO)
Hi Rong, this is a very interesting proposal. We've also observed profile quality degradation from CFG-destructive passes like loop rotation, and I can see how this proposal would help improve the quality of the profile that drives later optimization passes in the pipeline. I have a few questions. * How does this affect today's AutoFDO? Specifically, can users upgrade the compiler with FS-AutoFDO
2020 Nov 17
3
[RFC] Control Flow Sensitive AutoFDO (FS-AFDO)
Hi all, here I include an RFC for control flow sensitive AutoFDO (FS-AFDO). This is joint work with David Li. Questions and feedback are welcome. Thanks, Rong ============= [RFC] Control Flow Sensitive AutoFDO (FS-AFDO) 1. Motivation An AFDO profile is derived from PMU samples collected while running an earlier build of the binary. PMU samples are indexed by IP addresses. An offline tool uses the debug
2011 Jun 30
0
help with interpreting what nnet() output gives:
Greetings list, I am new to programming in R and am using the nnet() function for a project on neural networks. Firstly, I wish to ask if there is any PDF explaining the algorithm nnet uses, which could tell me what the objects of the nnet class, like 'conn', 'nconn', 'nsunits', 'n' and 'nunits', mean and how the weights are calculated. The package PDF has little or no
2010 Jun 14
1
unqiue problem
Hello everybody, I have a matrix of 2 columns and over 27k rows. Some of the rows are duplicated, so I tried to remove them with the command unique():
> Workbook5 <- read.delim(file = "Workbook5.txt")
> dim(Workbook5)
[1] 27748 2
> Workbook5 <- unique(Workbook5)
> dim(Workbook5)
[1] 20101 2
It removed a lot of lines, but unfortunately not all of them. I wanted
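unique() only drops rows that match character for character, so stray whitespace or case differences keep apparent duplicates alive. A minimal sketch of normalising the columns before deduplicating again; the assumption that the columns are text is mine, not from the post:

# Normalise text columns, then deduplicate again
Workbook5[] <- lapply(Workbook5, function(col) trimws(as.character(col)))
Workbook5 <- unique(Workbook5)
sum(duplicated(Workbook5))   # 0 if everything left is genuinely distinct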