Displaying 20 results from an estimated 200 matches similar to: "Fwd: Cannot allocate a new database connection error"
2009 Feb 18
0
Segmentation Fault occurred while connecting to the database
Hi All,
Can anyone help me, please? I don't know much about segmentation faults. I
understand what one is, but I don't know why my script is throwing this error.
This is my main function:
main <- function()
{
  dbName <- "xyz_database"
  hostName <- "xyz.com"
  con <- myDbconnect(dbName, hostName)
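A minimal sketch, not from the original thread, of the usual DBI/RMySQL connection pattern; myDbconnect is the poster's own wrapper (not shown) and the database name and host are placeholders. Disconnecting explicitly matters because RMySQL only allows a limited number of simultaneously open connections, which is the usual cause of the related "cannot allocate a new connection" error.

library(DBI)
library(RMySQL)

# Hypothetical wrapper standing in for the poster's myDbconnect()
myDbconnect <- function(dbName, hostName) {
  dbConnect(MySQL(), dbname = dbName, host = hostName)
}

con <- myDbconnect("xyz_database", "xyz.com")
res <- dbGetQuery(con, "SELECT 1")   # simple round trip to verify the link
dbDisconnect(con)                    # free the connection when done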
2009 Feb 23
0
Segmentation Fault still exists
Hi All,
Sorry to bother everyone again. Of course Prof Ripley, Yihui and Uwe had
already replied to my email, but the segmentation fault was not solved. I
agree with Prof Ripley that my R installation and all other configurations are
very old, but what I don't understand is that I was able to run the analysis
until a few days ago, so why does this error appear suddenly?
The configuration I use:
R version
>
2009 May 22
2
Query regarding na.omit function
Hi friends,
I have a query regarding the na.omit function. Please, can someone help me?
I have a function:
xyz_function <- function(arguments)
{
  # some code
  return(list(matrix = dataset))
}
xyz_function_returnvalue <- xyz_function(passed arguments)
Case I:
xyz_function_returnvalue_deletingNArows <- na.omit(xyz_function_returnvalue)
Case II:
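A minimal sketch, with xyz_function and its data made up, of the point that usually trips people up here: na.omit() drops incomplete rows when it is applied to the matrix or data frame itself, so extract the list component before calling it rather than passing the whole list.

xyz_function <- function() {
  dataset <- data.frame(a = c(1, NA, 3), b = c(4, 5, NA))
  list(matrix = dataset)                 # the return value wraps the data
}
ret   <- xyz_function()
clean <- na.omit(ret$matrix)             # remove rows that contain any NA
clean                                    # only the complete rows remain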
2009 Feb 18
2
Added system Info: Segmentation Fault occurred while connecting to the database
Hi All,
I wanted to add some more information regarding my problem.
Configuration of the OS and R:
Linux 2.6.18-6-686
> R.Version()
$platform
[1] "i486-pc-linux-gnu"
$arch
[1] "i486"
$os
[1] "linux-gnu"
$system
[1] "i486, linux-gnu"
$status
[1] "Patched"
$major
[1] "2"
$minor
[1] "4.0"
$year
[1] "2006"
2009 Jan 06
1
Warning message: In pt(q, df, lower.tail, log.p) : NaNs produced
Hi friends,
Any idea why I get this warning, and why all the computed p-values are NaN?
I have shown below what I did in the Windows R console:
> df
  c1 c2
1  1 50
2 NA NA
3  4 NA
4  7  6
5 NA  7
6 10 10
> r <- cor(x = df, y = NULL, use = "complete.obs", method = "pearson")
> r
           c1         c2
c1  1.0000000 -0.9148074
c2 -0.9148074  1.0000000
> cor.p.values<-
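A minimal sketch, not from the original post, of turning a correlation into a p-value by hand. NaNs from pt() typically appear when the degrees of freedom end up non-positive or the t statistic is non-finite, for instance when n counts every row of the data frame instead of only the complete cases that cor(use = "complete.obs") actually used.

dat <- data.frame(c1 = c(1, NA, 4, 7, NA, 10),
                  c2 = c(50, NA, NA, 6, 7, 10))
n <- sum(complete.cases(dat))                  # 3 complete rows, not nrow(dat)
r <- cor(dat, use = "complete.obs")[1, 2]
t <- r * sqrt(n - 2) / sqrt(1 - r^2)           # t statistic for the correlation
p <- 2 * pt(abs(t), df = n - 2, lower.tail = FALSE)
p
cor.test(dat$c1, dat$c2)                       # does the same bookkeeping for you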
2009 Aug 20
2
Insert rows in between dataframes
Hi all,
Can anyone suggest how to insert rows in between the rows of a data frame and
also keep the ordering of the row numbers correct?
              Estimate   Std. Error      t value   Pr(>|t|)
recmeanC2 9.275880e-17 6.322780e-17 1.467057e+00 0.14349903
recmeanC3 1.283534e-17 2.080644e-17 6.168929e-01 0.53781390
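A minimal sketch, with made-up coefficient names and values, of one way to splice a new row between existing rows of a data frame while keeping the rows in order.

coefs <- data.frame(Estimate = c(9.275880e-17, 1.283534e-17),
                    row.names = c("recmeanC2", "recmeanC3"))
new_row <- data.frame(Estimate = 0, row.names = "recmeanC2a")
pos <- 1                                        # insert after the first row
out <- rbind(coefs[seq_len(pos), , drop = FALSE],
             new_row,
             coefs[-seq_len(pos), , drop = FALSE])
out                                             # recmeanC2, recmeanC2a, recmeanC3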
2009 Jun 10
2
How to get the unique pairs of a set of pairs data frame?
Hi friends,
Please, can anyone help me with an easier way of doing the work described
below? Suppose I have a dataset like this:
i1 i2 i3 i4 i5
1 7 13 1 2
2 8 14 2 2
3 9 15 3 3
4 10 16 4 4
5 11 17 5 5
6 12 18 6 7
i1, i2, i3, i4, i5 are my items. I am able to find all possible pairs, i.e.
(say this data frame is "item_pairs"):
i1,i2
i1,i3
i1,i4
i1,i5
i2,i1
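A minimal sketch, with made-up item names, of collapsing ordered pairs such as (i1,i2) and (i2,i1) into a single unordered pair by sorting within each pair and dropping duplicates; combn() builds the same set directly.

items <- c("i1", "i2", "i3", "i4", "i5")
item_pairs <- expand.grid(a = items, b = items, stringsAsFactors = FALSE)
item_pairs <- item_pairs[item_pairs$a != item_pairs$b, ]        # drop self-pairs
key <- apply(item_pairs, 1, function(p) paste(sort(p), collapse = ","))
unique_pairs <- item_pairs[!duplicated(key), ]
unique_pairs                               # 10 rows, one per unordered pair
t(combn(items, 2))                         # the same set, built directly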
2009 Jun 25
2
Error: system is computationally singular: reciprocal condition number
I get this error while computing a partial correlation:
Error in solve.default(Szz) :
  system is computationally singular: reciprocal condition number = 4.90109e-18
Why does it happen? Can anyone give me some idea of how to get rid of it?
This is the function I use for calculating the partial correlation:
pcor.mat <- function(x, y, z, method = "p", na.rm = TRUE) {
  x <- c(x)
  y <- c(y)
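A minimal sketch, with a made-up Szz, of why solve() fails on a nearly singular covariance matrix and of two common workarounds: drop the collinear column, or use a generalized inverse such as MASS::ginv().

library(MASS)
set.seed(1)
z <- matrix(rnorm(100 * 3), ncol = 3)
z <- cbind(z, 2 * z[, 1])          # add a perfectly collinear column
Szz <- cov(z)
rcond(Szz)                         # tiny reciprocal condition number
# solve(Szz) would now fail as "computationally singular"
Szz_inv <- ginv(Szz)               # pseudoinverse works despite the singularity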
2009 Jul 22
1
How to dynamically generate lm() function arguments?
Hi All,
How do you dynamically generate the arguments for the lm() function when the
items vary for each database? Say in my case, for a particular database, I
have items from i1 to i15. In the code below there is a line like this:
item_cat_fit <- lm(as.numeric(item_item_table$i1) ~
    as.numeric(item_item_table$i2) + as.numeric(item_category_table$i3))
# this gives proper results, I am
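A minimal sketch, assuming the predictors live in a single made-up data frame, of building the lm() formula at run time from whichever item columns exist, using reformulate().

set.seed(1)
item_item_table <- data.frame(i1 = rnorm(30), i2 = rnorm(30), i3 = rnorm(30))
predictors <- setdiff(names(item_item_table), "i1")   # whatever items this database has
fml <- reformulate(predictors, response = "i1")       # i1 ~ i2 + i3
item_cat_fit <- lm(fml, data = item_item_table)
summary(item_cat_fit)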
2009 Feb 13
1
Write and Load functions from an external file
Hi All,
I would be grateful if anyone could answer my queries.
I need to share code. For example, if I am working in C/C++, I would put
some function declarations in .h files that you would include. In PHP, I
would create files with the common functions in them and then "include()"
them. So far, I haven't been able to figure out what the standard practice
is in R.
The two options
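A minimal sketch, with an illustrative file name, of the usual R idiom: keep shared functions in a plain .R file and pull them into the session with source(); for anything larger, the same code is normally wrapped into a package.

## contents of common_functions.R (illustrative):
##   clean_data <- function(df) na.omit(df)
##   my_mean    <- function(x)  mean(x, na.rm = TRUE)

source("common_functions.R")      # defines the shared functions in this session
clean_data(data.frame(a = c(1, NA, 3)))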
2009 Aug 28
1
How to generate mean anova value row in anova table, instead of individual value for each predictor
Hi All ,
Can anybody tell me if there is any way to get summarized anova values? Now I
will explain what I mean when I say "summarized". Below you can see the anova
table of recmeanC1 against all the rest, i.e. the predictors from recmeanC2 to
i15, as shown in the table.
            Df  Sum Squares  Mean Square  F value  Significance [Pr(>F)]
recmeanC2    1       89.272       89.272
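A minimal sketch, on made-up data, of one way to get a single summarized F value for a whole block of predictors rather than one table line per predictor: compare nested models with anova().

set.seed(1)
d <- data.frame(recmeanC1 = rnorm(50), recmeanC2 = rnorm(50),
                recmeanC3 = rnorm(50), i15 = rnorm(50))
fit_null <- lm(recmeanC1 ~ 1, data = d)                       # intercept only
fit_full <- lm(recmeanC1 ~ recmeanC2 + recmeanC3 + i15, data = d)
anova(fit_null, fit_full)          # one F test for all the predictors jointly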
2009 Sep 30
1
How to calculate KMO?
Hi All,
How do I calculate the KMO measure for a dataset?
Dataset:
   m1 m2 m3 m4 m5 m6 m7 m8
1   2 20 20  2  1  4 14 12
2   9 16  3  5  2  5  5 15
3  18 18 18 13 17  9  2  4
4   7  7  2 12  2 11 11 11
5   7  8  5 19  5  2 20 18
6   7  4  7  4  7  9  3  3
7   5  5  5 12  5 13 13 12
8   6  6  4  3  5 17 17 16
9  12 12  4  2  4  4 14 14
10  5 14
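A minimal sketch, on made-up data of the same shape, assuming the psych package is available: its KMO() function computes the Kaiser-Meyer-Olkin measure of sampling adequacy directly from a data frame or correlation matrix.

library(psych)
set.seed(1)
dataset <- as.data.frame(matrix(sample(1:20, 10 * 8, replace = TRUE), ncol = 8,
                                dimnames = list(NULL, paste0("m", 1:8))))
KMO(dataset)          # overall MSA plus one MSA value per item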
2013 Jan 16
1
Problems regarding the package "BRugs"
Respected Sir,
With reference to my mail to you dated 8th January 2013, and your reply dated
9th January 2013, I am sending this mail to you. I had a problem running a
program with the latest version of the "BRugs" package in R 2.15.1 and 2.15.2.
I want to mention here that this program runs well for others, who are running
it using the earlier version
2009 Feb 18
1
Possible Cause of Segmentation Fault
Hi All,
If you have already finished reading my previous emails regarding the
segmentation fault, please have a look at this. I think it may help you
diagnose the reason for the segmentation fault and help me, because I don't
understand much.
Rather than running the script with the command source("new_regression.R"),
what I did was simply type the commands in
2009 Jan 02
1
Calculating significance value
Hi friends,
If someone could find some time to go through my problem, I would be really
grateful. I have a dataset (dataset1) as shown below:
  recmeanC1 recmeanC2 recmeanC3 recmeanC4 i1 i2 i3 i4 i5 i6 i7 i8 i9 i10 i11
1        NA         1      1.00  1.800000 NA  1 NA  1  1 NA  2  2  2  NA   2
2         2         2      1.00
2009 Jun 28
1
ERROR: system is computationally singular: reciprocal condition number = 4.90109e-18
Hi All,
This is my R version information:
> version
_
platform i486-pc-linux-gnu
arch i486
os linux-gnu
system i486, linux-gnu
status
major 2
minor 7.1
year 2008
month 06
day 23
svn rev 45970
language R
version.string R version 2.7.1 (2008-06-23)
While calculating partial
2009 Jun 06
1
How to make a dynamically created string work inside if() as a condition
Hi,
How do I make an if() condition work when the condition inside if() is created
dynamically and is therefore a string? If I type the condition in by hand, the
if() works fine, but when it is created dynamically it is a string, and
passing it to if() throws an error saying the argument is not logical. I even
tried changing the mode of the string to logical, but it doesn't work.
Say my dynamically
2009 Apr 29
2
if condition doesn't evaluate to True/False
Hi friends,
Please help me with this bug.
Bug in my code:
In the variable sub_grp_whr_cls_data[sbgrp_no, 1] I store the where clause;
every subgroup has a where condition linked with it.
Database1
The where clause was not found for a particular subgroup, so the value of
sub_grp_whr_cls_data[sbgrp_no, 1] was NULL.
So the condition (sub_grp_whr_cls_data[sbgrp_no, 1] == "NULL" ||
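A minimal sketch, with a made-up where_clause variable, of why comparing against NULL or NA with == does not give if() the single TRUE/FALSE it needs, and of explicit tests that do.

where_clause <- NULL                       # e.g. nothing stored for this subgroup
if (is.null(where_clause) ||               # catches a real NULL
    length(where_clause) == 0 ||           # catches zero-length results
    is.na(where_clause) ||                 # catches a missing value
    where_clause == "NULL") {              # catches the literal string "NULL"
  message("no where clause for this subgroup")
}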
2009 Aug 24
1
natural sorting a data frame /vector by row
How do I natural-sort a vector or data frame by row, in ascending order?
V1 V2 V3 V4
i1 5.000000e-01 1.036197e-17 4.825338e+16 0.00000000
i10 4.001692e-18 1.365740e-17 2.930053e-01 0.76973827
i12 -1.052843e-17 1.324484e-17 -7.949081e-01 0.42735000
i13 2.571236e-17 1.357336e-17 1.894325e+00 0.05922715
i2
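A minimal sketch, on an abbreviated made-up version of the table above, of natural ordering of row names like i1, i2, i10 by their numeric suffix rather than alphabetically; gtools::mixedsort() gives the same ordering if that package is available.

m <- data.frame(V1 = c(5.0e-01, 4.0e-18, -1.1e-17, 2.6e-17),
                row.names = c("i1", "i10", "i12", "i2"))
ord <- order(as.numeric(sub("^i", "", rownames(m))))
m[ord, , drop = FALSE]             # rows now come out as i1, i2, i10, i12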
2009 Jul 20
1
Regression function lm() not giving proper results
Hi,
Can anyone help me please with this problem?
CASE I
all_raw_data_NAomitted is my data frame. It has columns named i1, i2, i3, i4,
..., up to i15. It actually has 291 rows; I couldn't show them all here.
The data frame looks like this:
i1 i2 i3 i4 i5 i6 i7 i8 i9 i10 i11 i12 i13 i14 i15
2 2 2 2 2 2 2 2 2 2 2 1 2 2 3 2
3 2 2 2 2 3 2 2 3 3