Displaying 20 results from an estimated 10000 matches similar to: "R on os390"
2000 Aug 17
2
R on os390
G'day R friends,
I didn't get any replies on the main list so I thought I'd try with the
experts.
I was wondering if anyone has ported R to os390. If so, are the vsize and
nsize limits the same as on other platforms?
I could really annoy those SAS guys then.
thanks,
John Strumila
john.strumila@team.telstra.com
2000 Mar 03
1
tapply, sorting and the heap
howdy gurus,
I'm new and green and I was hoping for a tiny bit of your expertise.
I'm running out of virtual memory (heap?) when summing using tapply. I've
already used --vsize=90M on my HP-UX machine (details below).
Can I pre-sort or something to prevent my error?
thanks,
John Strumila
john.strumila at corpmail.telstra.com.au
> gc()["Vcells","total"]
[1]
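(A minimal sketch of one way to lower the peak memory of a grouped sum, assuming the data are a numeric vector x with a grouping vector g; both names and the placeholder data are hypothetical. rowsum() does the per-group summing in compiled code and may need less working memory than tapply() for a plain sum.)
# hypothetical example data: x = values, g = grouping variable
x <- runif(1e6)
g <- sample(letters, 1e6, replace = TRUE)
# tapply builds intermediate per-group structures before summing
s1 <- tapply(x, g, sum)
# rowsum() sums rows of a matrix by group and may use less working memory
s2 <- rowsum(as.matrix(x), g)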
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
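(As a side note, a long-running job can watch its own footprint from inside R before it reaches the point of hanging the machine; a minimal sketch, with a purely illustrative 2 GB threshold.)
usage <- gc()                     # reports used and peak Ncells / Vcells
vcells_mb <- usage["Vcells", 2]   # column 2 of gc() output: Vcells used, in MB
if (vcells_mb > 2048)
  stop("vector heap above 2 GB; aborting this job before it swamps the server")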
2000 Feb 24
1
queueing problems
howdy R friends,
I'm new but I used to play with S+ a long time ago. Can someone please help
me with how to approach this?
I have some response time data I want to 'correlate' with other data. I
believe queueing is involved so I need to prove somehow (F test?) that
response ~ exponential(...)
How do I go about this? I can't find exponential in 'nlm' or other
functions.
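(A minimal sketch of one way to check an exponential fit in R, assuming the response times sit in a numeric vector rt, a hypothetical name with placeholder data: estimate the rate by maximum likelihood and use a Kolmogorov-Smirnov goodness-of-fit check rather than an F test.)
library(MASS)
rt <- rexp(500, rate = 2)          # placeholder data; use the real response times
fit <- fitdistr(rt, "exponential") # ML estimate of the rate
fit
# goodness of fit: the p-value is only approximate because the rate
# was estimated from the same data
ks.test(rt, "pexp", rate = fit$estimate)
# quick visual check against the fitted exponential
qqplot(qexp(ppoints(length(rt)), rate = fit$estimate), rt,
       xlab = "theoretical quantiles", ylab = "observed response times")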
1999 Apr 27
2
Memory management
Dear all,
I don't get it:
First of all, the help doesn't say what the memory limits of
R are. Say, what's the max heap size, for instance?
Secondly, I invoke R with the following commands each time:
rgui --vsize 30M --nsize 1000K
rgui --vsize 30M --nsize 2000K
rgui --vsize 30M --nsize 3000K
rgui --vsize 30M --nsize 4000K
I try to open a matrix 8000x8000 by issuing
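(Back-of-the-envelope arithmetic for why --vsize 30M cannot hold that matrix: an 8000 x 8000 matrix of doubles needs 8000 * 8000 * 8 bytes of vector heap before any copies are made.)
8000 * 8000 * 8            # 512,000,000 bytes
8000 * 8000 * 8 / 2^20     # ~488 MB, far above --vsize 30M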
2003 Sep 19
2
Latest Samba Distro for Os390/Unix?
Hi, I would be interested to know if there is a release of Samba for Os390/Unix
later than 1.9.
(I have tried downloading the 'samba-latest.tar.gz' file from the mirror
site, but I cannot unzip/untar it, due to not having the 'gzip' program!)
Therefore I am not sure how compatible it may or may not be with my
Os390/USS platform.
Thanks for any help received in advance.
John.
2010 Jan 18
0
R jobs keep hanging linux server despite mem.limits modifcations
My group is working with datasets between 100 MB and 1 GB in size, using
multiple logins. From the documentation, it appears that vsize is limited
to 2^30-1, which tends to prove too restrictive for our use. When we drop
that restriction (set vsize = NA) we end up hanging the server, which
requires a restart. Is there any way to increase the memory limits on R
while keeping our jobs from
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone,
I have a problem that some people may have already encountered, but I have not
found the solution yet.
As I use R to simulate several arrays of data, my workspace is now 35 MB and
I cannot launch R with it.
An "xdr real data read error occured" and R tells me to delete .RData or
increase memory. I WON'T delete this file and changing the max-nsize to 40600k
did not
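(A sketch of one way to get at the file without deleting it, assuming an R of that era where the heap is still controlled by command-line flags: start R without the automatic restore, raise the limits, then load the workspace by hand. Sizes here are only illustrative.)
R --no-restore --max-vsize=200M --max-nsize=4000k
# then, at the R prompt, restore the saved workspace explicitly
> load(".RData")
> ls()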
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I have read all the help files and FAQs on the
subject (there aren't more than about a dozen) but either did not find
solutions or found that they did not work.
Here is the issue. I am trying to run a spatial regression on a
medium-sized dataset. Some of the functions in the spdep package I use
require me to allocate a vector of 1.1 GB (mine is not a spatial SIG
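(The usual Windows-side checks, sketched on the assumption that this is a 32-bit R on Windows of that era, which the post does not confirm: memory.limit() reports and raises the cap in MB, but a single contiguous 1.1 GB vector often fails on a 32-bit build anyway, so a 64-bit R is frequently the real fix.)
memory.limit()               # Windows-only: current cap in MB
memory.limit(size = 3000)    # illustrative value for a machine with enough RAM
# even with a generous cap, one contiguous 1.1 GB block can fail on 32-bit
# Windows because of address-space fragmentation; a 64-bit build avoids that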
2009 Feb 01
0
setting a large value of --max-vsize
Hello,
I'm using a 64bit Linux with 16GB of RAM. I'd like to limit the memory
that the R process can use, so I'm trying to use the --max-vsize switch.
However, it seems that I can't enforce a limit above 2 GB.
shlomo at hippo:~$ uname -a
Linux hippo 2.6.24-16-generic #1 SMP Thu Apr 10 12:47:45 UTC 2008
x86_64 GNU/Linux
This WORKS:
--------------------
shlomo at hippo:~$ R
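(For what it's worth, a small check of which ceiling the running process actually ended up with; mem.limits() still existed in R of that vintage and was removed later.)
> mem.limits()   # reports the nsize / vsize ceilings in force (NA = unlimited)
> gc()           # current usage and gc trigger for both heaps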
2003 May 14
1
SFTP on OS390
Hi
Does anyone have any idea how to implement sftp on os390?
Well, using OpenSSH, I've already installed SSL and SSH successfully.
I can use PuTTY and scp, but sftp is still a problem.
Does anyone have a guideline?
thanks
2005 Dec 14
0
Centos for OS390 Hercules problem
Hi!
I used 2 volumes to install CentOS for OS390 (addresses 120 and 121). The
installation process is OK, but when Linux says that I can reboot the system
without problems, I don't know the IPL address to boot from, or the LOADPARM.
Can anyone help me, please?
Thanks
2005 Dec 20
1
Problems in batch mode
Dear R-users,
I am trying to run some simulations in batch mode. In an older version
of the program, I used
rterm --vsize=100M --nsize=5000K --restore --save < input_file > output_file;
however, in the new version, R 2.2.0, the parameters vsize and nsize are
ignored.
I can use the command memory.limit to increase memory, but I am not sure if
this corresponds to vsize and nsize.
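(Assuming this is Rterm on Windows, which the 'rterm' name suggests, a sketch of that era's equivalents: the overall cap is set with --max-mem-size on the command line, or with memory.limit() at the top of the script, rather than with vsize/nsize. The 1024M figure is illustrative.)
rterm --max-mem-size=1024M --restore --save < input_file > output_file
# or, inside the script (size in MB):
memory.limit(size = 1024)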
2006 Mar 03
0
Memory problem
Hi list,
I am analysing a large dataset using random coefficient (using nlme) and
fixed effects (using the lm function) models. I have a problem with my R
version 2.2.1 due to memory allocation difficulties. When I try to expand the
memory I get the following error message.
> R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500
--max-nsize=10000000
Error: target of assignment expands
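(The leading '>' suggests the command was typed at the R prompt, where R tries to parse the flags as an assignment and fails with the "target of assignment expands ..." error; a sketch of the intended split, with the flags given to the OS shell and only R code typed into R. The values are the poster's own.)
# at the OS shell prompt, before R starts:
R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500 --max-nsize=10000000
# once R is running, memory checks are ordinary R calls:
> gc()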
2000 Jan 23
1
size limits
Hi,
I have a few questions about how to handle large data sets in R.
What is the size of the largest matrix that R can comfortably deal with?
Is this size limit imposed by R's software, or is it a question
of the machine that one runs on?
How does one go about choosing reasonable values of vsize
and nsize?
I have a data set with about 1,000,000 rows, and 30
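(Rough sizing arithmetic, assuming the 30 columns are stored as doubles: the data alone need about 230 MB of vector heap, so vsize has to cover that plus working copies; nsize, the count of cons cells, matters far less for one big matrix.)
1e6 * 30 * 8            # 240,000,000 bytes for a 1,000,000 x 30 double matrix
1e6 * 30 * 8 / 2^20     # ~229 MB of vector heap, before any copies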
2009 May 07
1
increasing memory for R bg job
Hi,
Is the following the command used to increase memory when a background R job is run, or is there some other command?
R --min-vsize=vl --max-vsize=vu --min-nsize=nl --max-nsize=nu --max-ppsize=N
source:
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html
Thx
Carol
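(As a sketch only, here is what a concrete background invocation might look like, with illustrative values in place of the vl/vu/nl/nu/N placeholders from that manual page.)
R CMD BATCH --min-vsize=10M --max-vsize=4G \
            --min-nsize=500k --max-nsize=10M \
            --max-ppsize=100000 myscript.R myscript.Rout &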
2012 Sep 21
1
Defunct of --max-vsize and mem.limits
R-devel,
I am migrating from R 2.13.2 to R 2.15.1 and just realized that the R command-line options --max-nsize and --max-vsize are no longer supported, along with mem.limits() now being defunct. To me, that function and those options, together with the other two, --min-nsize and --min-vsize, are useful in allowing some explicit control of R memory usage. One benefit is that setting a maximum boundary could
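(One OS-level alternative sometimes used once those options were gone, sketched here for a Linux/bash setup, which the post does not confirm: impose the ceiling on the process from the shell rather than from R. The size and script name are hypothetical.)
ulimit -v 8000000         # cap the address space at ~8 GB (value in kilobytes)
R --no-save < bigjob.R    # bigjob.R is a hypothetical script name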
2008 Feb 12
2
Cox model
Hello R-community,
I have been struggling for a week now with the implementation of a Cox
model in R. I have 80 cancer patients, so 80 time measurements and 80
relapse/no-relapse indicators (the censoring variable: 1 if relapsed over the
examined period, 0 if not). My microarray data contain around 18000 genes.
So I have the expressions of 18000 genes in each of the 80 tumors (matrix
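(For illustration only, a minimal sketch of the usual gene-at-a-time approach with the survival package, assuming a follow-up time vector, a 0/1 relapse status vector, and an expression matrix with the 18000 genes in rows and the 80 tumors in columns; all names and placeholder data are hypothetical.)
library(survival)
time   <- rexp(80)                                   # placeholder follow-up times
status <- rbinom(80, 1, 0.4)                         # placeholder: 1 = relapsed, 0 = censored
expr   <- matrix(rnorm(18000 * 80), nrow = 18000)    # placeholder expression matrix
# one univariate Cox model per gene, keeping the Wald p-value for each
pvals <- apply(expr, 1, function(g)
  summary(coxph(Surv(time, status) ~ g))$coefficients[, "Pr(>|z|)"])
# genes can then be ranked, with a multiple-testing correction
head(sort(p.adjust(pvals, method = "BH")))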
2005 Jun 29
3
Memory Management under Linux: Problems to allocate large amounts of data
Dear Group
I'm still trying to bring a lot of data into R (see older postings). After solving some troubles with the database, I do most of the work in MySQL. But it would still be nice to work on some of the data using R. For that I can use a dedicated server with Gentoo Linux as the OS, hosting only R. This server is a nice machine with two CPUs and 4 GB RAM, which should do the job:
Dual Intel XEON 3.06 GHz
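(Since most of the heavy lifting stays in MySQL, a sketch of pulling only aggregated or chunked results into R with DBI/RMySQL; the connection details, table, and column names are placeholders.)
library(DBI)
library(RMySQL)
con <- dbConnect(MySQL(), dbname = "mydb", host = "localhost",
                 user = "ruser", password = "...")   # placeholder credentials
# let MySQL do the aggregation and fetch only the summary
agg <- dbGetQuery(con, "SELECT grp, COUNT(*) AS n, AVG(val) AS mean_val
                        FROM bigtable GROUP BY grp")
# or stream a large result in chunks instead of all at once
res <- dbSendQuery(con, "SELECT * FROM bigtable")
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 50000)   # process 50,000 rows at a time
  # ... work on 'chunk' ...
}
dbClearResult(res)
dbDisconnect(con)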
2000 Jul 20
1
bad R bug
Hi,
I am not on this mailing list, but here is a terrible bug that has
stopped me in my tracks. I am unable to remove observations from a data
matrix.
temp is the original matrix. Notice that there are 288 entries with a
104 in the first column. I attempt to remove these entries, but R does
not do it.
brad
ACTUAL COMMANDS:
> dim(temp)
[1] 30528 11
> table(temp[,1])
1 3 4
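(For reference, a minimal sketch of the usual way to drop those rows, assuming temp is a numeric matrix as shown: build a logical index on column 1, reassign, then check the counts again.)
keep  <- temp[, 1] != 104           # TRUE for rows to retain
temp2 <- temp[keep, , drop = FALSE]
dim(temp2)                          # should show 30528 - 288 = 30240 rows
table(temp2[, 1])                   # 104 should no longer appear
# a common pitfall: if column 1 is a factor or character (e.g. after
# read.table), compare against "104" rather than the number 104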