search for: safeco

Displaying 9 results from an estimated 9 matches for "safeco".

2007 Feb 06
1
glm gamma scale parameter
...elp & search the archives but I'm a bit confused trying to reconcile the terminology I'm used to w/R terminology as we're transitioning to R, so if I missed an obvious way to do this, or stated this question in a way that's incomprehensible, my apologies. Jill Willie Open Seas Safeco Insurance jilwil at safeco.com
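The terminology question above comes up often: what other packages call a "scale" parameter frequently corresponds to R's dispersion estimate (or its reciprocal), depending on parameterization. A minimal sketch of extracting it from a Gamma GLM, using made-up data since the poster's is not shown:

```r
# Hypothetical data standing in for the poster's; shape = 2 means the
# true dispersion (1/shape) is 0.5.
set.seed(1)
x <- runif(100, 1, 10)
y <- rgamma(100, shape = 2, rate = 2 / x)   # mean of y is x

fit <- glm(y ~ x, family = Gamma(link = "log"))

# summary.glm() reports the estimated dispersion for Gamma models;
# compare this against the "scale" reported by the other system,
# checking whether that system uses dispersion or its reciprocal.
summary(fit)$dispersion
```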
2007 Jan 26
0
FW: reducing RODBC odbcQuery memory use?
New to R, sorry if one or either of these is an inappropriate list for a question like this below; please let me know if this is a general help question. Jill Willie Open Seas Safeco Insurance jilwil at safeco.com -----Original Message----- From: WILLIE, JILL Sent: Thursday, January 25, 2007 2:27 PM To: r-help at stat.math.ethz.ch Subject: reducing RODBC odbcQuery memory use? Basic Questions: 1. Can I avoid having RODBC use so much memory (35 times the data size or more)...
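One standard way to cap RODBC's memory use is to run the query once and fetch it in chunks rather than materializing everything through a single `sqlQuery()` call. A sketch only: the DSN, query, and file name are placeholders, and this is untested against a live database.

```r
# library(RODBC)   # required at run time; not loaded here

fetch_in_chunks <- function(dsn, query, file, chunk = 10000L) {
  ch <- odbcConnect(dsn)
  on.exit(odbcClose(ch))
  odbcQuery(ch, query)                     # run the query, fetch nothing yet
  parts <- list()
  repeat {
    d <- sqlGetResults(ch, max = chunk)    # pull at most `chunk` rows
    if (!is.data.frame(d) || nrow(d) == 0L) break
    parts[[length(parts) + 1L]] <- d
    if (nrow(d) < chunk) break
  }
  x <- do.call(rbind, parts)
  save(x, file = file)                     # write the .rda directly
  invisible(x)
}
```

This keeps only one chunk plus the accumulated result in memory at a time, instead of RODBC's full intermediate representation.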
2007 Jan 21
1
Can we do GLM on 2GB data set with R?
...eem to find a way to tell what the boundaries are & roughly gauge the needed memory...other than trial & error. I've started by testing the data.frame & run out of memory on my PC. I'm new to R so please be forgiving if this is a poorly-worded question. Jill Willie Open Seas Safeco Insurance jilwil at safeco.com 206-545-5673
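A rough way to gauge the boundaries before trial and error: a numeric (double) column costs 8 bytes per element, so the base size of a data frame can be computed up front, and `object.size()` confirms it; model fitting then needs a multiple of that for the model matrix and working copies. A small sketch with stand-in data:

```r
# 10 numeric columns x 100,000 rows = 1e6 doubles = ~8 MB of data
# before any fitting; glm() will need several times this at peak.
n  <- 1e5
df <- data.frame(matrix(rnorm(n * 10), ncol = 10))

print(object.size(df), units = "Mb")   # R counts 1 Mb as 2^20 bytes
```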
2007 Jan 25
1
Size of data vs. needed memory...rule of thumb?
...r at least I'm trying to be...at this point I'm mostly a new R-help-reader); I'd appreciate being pointed in the right direction if this isn't the right help list to send this question to...or if this question is poorly worded (I did read the posting guide). Jill Willie Open Seas Safeco Insurance jilwil@safeco.com [[alternative HTML version deleted]]
2008 Mar 03
0
reducing RODBC odbcQuery memory use?
...y (35 times the data size or more) making a data.frame & then .rda file via. sqlQuery/save? 2. If not, is there some more appropriate way from w/in R to pull large data sets (2-5GB) into .rda files from sql? [R] reducing RODBC odbcQuery memory use? From: WILLIE, JILL <JILWIL_at_SAFECO.com> Date: Thu 25 Jan 2007 - 22:27:02 GMT Basic Questions: Can I avoid having RODBC use so much memory (35 times the data size or more) making a data.frame & then .rda file via. sqlQuery/save? If not, is there some more appropriate way from w/in R to pull large da...
2011 Jun 09
2
Problem with a if statement inside a function
..."OldAmerican", "Old-American", "Old American", "Pemco", "Progressive", "Regence Group", "Reliance", "Response", "Safe", "Safe Auto", "SafeAuto", "Safe-Auto", "Safeco", "SafeCo", "Safeway", "Santa Fe", "Santa-Fe", "SantaFe", "Sentry", "Shelter", "Standard", "State Farm", "StateFarm", "State-Farm", "Titan", "Travele...
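A frequent cause of `if` problems with a name vector like the one above (assumed here, since the full post is truncated): `if (name == carriers)` compares against the whole vector, historically using only the first element with a warning and, in current R, raising an error. Membership tests want `%in%`:

```r
# Abbreviated carrier list taken from the post's vector.
carriers <- c("Pemco", "Progressive", "Safeco", "SafeCo", "Safeway")

classify <- function(name) {
  # %in% returns a single TRUE/FALSE, which is what if() needs
  if (name %in% carriers) "known carrier" else "other"
}

classify("Safeco")   # "known carrier"
classify("Acme")     # "other"
```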
2004 Aug 16
1
turning off automatic coercion from list to matrix
Hello, I am having trouble understanding how R is coercing between matrices and lists in the following example. I have an aggregate behavior I like: aggregate(a[,"num"],by=list(product=a[,"product"],region=a[,"region"]), sum) Now in reality I have more columns than just product and region, and need to pick different combinations. So I want to abstract this into a
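The `by=` abstraction the poster is after can be done by subsetting the data frame with a character vector of key names: `d[keys]` stays a data frame (a named list of columns), which is exactly what `aggregate()` accepts. A sketch, with `a` and its columns as stand-ins for the poster's data:

```r
a <- data.frame(product = c("p1", "p1", "p2"),
                region  = c("r1", "r2", "r1"),
                num     = c(10, 20, 30))

agg_by <- function(d, value, keys) {
  # d[keys] is a data frame, i.e. a named list of the key columns
  aggregate(d[[value]], by = d[keys], FUN = sum)
}

agg_by(a, "num", c("product", "region"))   # any key combination works
```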
2004 Jun 02
1
data filtering
I would like to know if there is a way to do the following command in one step, primarily for speed on large data (5 million elements), and secondarily for readability. mean(delta[(intersect(which(x[['class']]==0),which(delta<1)))]) Do I really have to rely on an intersect operator? Isn't that O(n log n)? Can't I just filter in one step? As an R newbie, I would have guessed
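Yes: the intersect/which chain collapses to a single vectorized logical index, one O(n) pass with no sorting. A sketch with tiny stand-in data shaped like the poster's `x` and `delta`:

```r
x     <- data.frame(class = c(0, 1, 0, 0))
delta <- c(0.5, 0.2, 2.0, 0.8)

# & is elementwise, so this selects rows meeting both conditions at once
mean(delta[x[["class"]] == 0 & delta < 1])   # same result as the intersect version
```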
2004 Jul 02
1
reading large data
Hello, I have trouble using read.table for flat files of larger than about 300MB on windows 2000. Any ideas of how to file a bug report? Is it a known issue? I have three cuts of data, a 1%, 10% and 100% sample in flat text files. The 100% sample is about 350MB. When I read the 1% and 10% files, besides being slow, everything works. RAM footprint appears to increase approximately 2x of text
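Much of `read.table`'s memory spike comes from type-guessing and comment scanning; pre-declaring `colClasses` and `nrows` avoids intermediate copies and is the usual mitigation for files this size. A runnable sketch using a tiny temporary file in place of the poster's 350MB sample:

```r
tf <- tempfile()
writeLines(c("id\tval\tname", "1\t2.5\ta", "2\t3.5\tb"), tf)

d <- read.table(tf, header = TRUE, sep = "\t",
                colClasses = c("integer", "numeric", "character"),
                nrows = 2,           # a slight overestimate is fine
                comment.char = "")   # disable comment scanning for speed
d
```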