Displaying 14 results from an estimated 14 matches similar to: "memory cleaning"
2012 Aug 08
0
Testing for a second order factor using SEM package
Hi!
The following model specification works when testing for first-order
factors, but when I attempt to test for a second-order factor by adding the
last 4 lines in the model, I get the error message below:
model.cfa.ru <- specifyModel()
sRU1 <- sRU, NA, 1
sRU2 <- sRU, lam12
sRU3 <- sRU, lam13
sRU4 <- sRU, lam14
sRU5 <- sRU, lam15
sRU6 <- sRU, lam16
sRU <-> sRU, mak1
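The error message itself is cut off above, so nothing can be said about it directly. As a point of comparison, below is a minimal second-order CFA sketch in the sem package's specifyModel() path syntax, using one conventional identification (one loading per factor fixed to 1, variances left free). All factor, item, and parameter names here are invented for illustration, not taken from the truncated model.

```r
# Hypothetical second-order CFA sketch for the sem package.
# x1..x4 are observed items, F1/F2 first-order factors, G the
# second-order factor; every name below is invented.
library(sem)

model.txt <- "
x1 <- F1, NA, 1
x2 <- F1, lam2
x3 <- F2, NA, 1
x4 <- F2, lam4
F1 <- G, NA, 1
F2 <- G, gam2
x1 <-> x1, e1
x2 <-> x2, e2
x3 <-> x3, e3
x4 <-> x4, e4
F1 <-> F1, d1
F2 <-> F2, d2
G <-> G, phi
"
model.cfa <- specifyModel(text = model.txt)
```

Each latent variable gets its scale fixed either by a unit loading (as here, `NA, 1`) or by fixing its variance; the first-order factors also need disturbance variances (d1, d2), which is a frequent omission when extending a first-order model.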
2005 Oct 18
1
Memory problems with large dataset in rpart
Dear helpers,
I am a Dutch student at Erasmus University. For my Bachelor thesis I
have written a script in R that implements boosting with classification and
regression trees. This script uses the predefined function
rpart. My input file consists of about 4000 vectors each having 2210
dimensions. In the third iteration R complains of a lack of memory,
although in each iteration
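The script is cut off above, but a common way to keep memory flat across a loop of rpart fits is to disable cross-validation, keep only the parts of each fit that later iterations need, and free the rest explicitly. A sketch on small simulated stand-in data (the real input is 4000 rows by 2210 columns; all names here are invented):

```r
library(rpart)  # rpart ships with R as a recommended package

# Small simulated stand-in for the 4000 x 2210 input described above.
set.seed(1)
d <- as.data.frame(matrix(rnorm(400 * 20), nrow = 400))
d$y <- factor(sample(c("a", "b"), 400, replace = TRUE))
w <- rep(1 / 400, 400)

frames <- vector("list", 3)
for (i in 1:3) {
  fit <- rpart(y ~ ., data = d, weights = w, method = "class",
               control = rpart.control(xval = 0))  # skip cross-validation
  pred <- predict(fit, newdata = d, type = "class")
  # keep only what later iterations need; drop the full fit object
  frames[[i]] <- fit$frame
  rm(fit, pred)
  gc()   # explicitly return freed memory between iterations
}
```

Cross-validation (`xval`, 10 by default) refits the tree repeatedly and is often the hidden memory cost in loops like this.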
2001 Dec 07
2
Memory problem
Dear all,
I have written a little R program to convert images. See below. Within the
loop over j (the filenames) memory consumption grows constantly. rm( ... )
inside the loop did not help. Memory does not grow if I remove the writeBin
statements between the two #-------- marks. But obviously this is not the
solution I want...
Thanks for any advice.
Manfred Baumstark
P.S. As I'm new to R:
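The program itself is not shown, but memory that grows only when writeBin is present often points at connections opened inside the loop and never closed. A sketch of the loop shape with an explicit close() and gc(), using invented file names and stand-in data:

```r
# Hypothetical sketch; the image data and file names are invented
# stand-ins for the conversion loop described above.
td <- tempdir()
for (j in 1:3) {
  img <- as.integer(sample(0:255, 1024, replace = TRUE))  # stand-in pixels
  con <- file(file.path(td, sprintf("out%02d.bin", j)), "wb")
  writeBin(img, con)
  close(con)   # an unclosed connection keeps its buffer alive
  rm(img)
  gc()         # give memory back between files
}
```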
2019 Nov 30
1
Re: [PATCH nbdkit 1/3] filters: stats: Show size in GiB, rate in MiB/s
On Sat, Nov 30, 2019 at 02:17:05AM +0200, Nir Soffer wrote:
> I find bytes and bits-per-second unhelpful and hard to parse. Using GiB
> for sizes works for common disk images, and MiB/s works for common
> storage throughput.
>
> Here is an example run with this change:
>
> $ ./nbdkit --foreground \
> --unix /tmp/nbd.sock \
> --exportname '' \
>
2004 Sep 21
1
files being blanked by writing
I'm running Samba 2.2.8a on SuSE Linux 8.1 with kernel 2.4.19. I have
a share defined by this on a web server to allow members of the jamigos
group to edit web pages.
[users]
comment = User Web Pages
path = /home
valid users = @jamigos
read only = No
create mask = 0664
force create mode = 0664
directory mask = 0775
2005 Sep 15
4
Error in vector("double", length) : vector size specified is too large....VLDs
I have what R seems to consider a very large dataset: a 12 MB text file of
lat, long, and height values, 130,000 rows to be exact.
Here's what I get:
Thomas Colson
North Carolina State University
Department of Forestry and Environmental Resources
(919) 673 8023
tom_colson at ncsu.edu
Calendar:
www4.ncsu.edu/~tpcolson
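The actual call that failed is cut off above. The error text suggests one oversized vector allocation; reading the file with explicit column types avoids read.table's type guessing and the intermediate vectors it creates. A sketch with a two-row stand-in for the 130,000-row file (file name and layout assumed from the description):

```r
# Tiny stand-in for the lat,long,height file described above.
f <- tempfile(fileext = ".txt")
writeLines(c("35.78,-78.67,120.5",
             "35.79,-78.66,121.0"), f)

# colClasses skips read.table's type guessing, which is where much of
# the transient memory goes on large files.
pts <- read.table(f, sep = ",",
                  col.names = c("lat", "long", "height"),
                  colClasses = rep("numeric", 3))
```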
2004 Feb 10
0
Evaluating R. I need to open "a dataset".
Hello,
Our Statistics Group is evaluating the use of R to compute some
indices.
We have some SAS datasets and we would like to evaluate performance in
computing the mean, percentiles, and the Gini index for a population and
for a survey sample.
I need to open "a dataset". Currently I understand that I have to
follow a code sequence like this:
alfa <- {a
2019 Nov 30
0
[PATCH nbdkit v2 1/3] filters: stats: Add size in GiB, show rate in MiB/s
I find bytes and bits-per-second unhelpful and hard to parse.
Also add the size in GiB, and show the rate in MiB per second. This works
well for common disk images and storage.
Here is an example run with this change:
$ ./nbdkit --foreground \
--unix /tmp/nbd.sock \
--exportname '' \
--filter stats \
file file=/var/tmp/dst.img \
statsfile=/dev/stderr \
--run 'qemu-img
2004 Feb 11
0
The use of R for the elaboration of some index
Hello,
Our Statistics Group is evaluating the use of R to compute some
indices.
We have some SAS datasets (120 Mb) and we would like to evaluate
performance in computing the mean, percentiles, and the Gini index for a
population and for a survey sample.
I need to open "a dataset". Currently I understand that I have to
follow a code sequence like this:
alfa <- {a
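The code sequence is cut off at `alfa <- {a`, so it is left as-is. Separately, once a dataset is loaded (SAS transport files can be read with foreign::read.xport, which is an assumption here, not something stated in the thread), the statistics mentioned are short base-R expressions. One standard form of the Gini index, sketched on stand-in data:

```r
# Stand-in numeric sample; in practice this would be a column of the
# imported SAS dataset.
x <- c(2, 4, 4, 6, 10, 14)

m   <- mean(x)
p90 <- quantile(x, 0.90)

# Gini index via the mean absolute difference:
# G = sum_{i,j} |x_i - x_j| / (2 * n^2 * mean(x))
gini <- function(x) {
  n <- length(x)
  sum(abs(outer(x, x, "-"))) / (2 * n^2 * mean(x))
}
g <- gini(x)   # 1/3 for this sample
```

The outer() form is O(n^2) in memory, which is fine for survey-sized samples but would itself need a streaming variant on very large populations.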
2003 Aug 11
1
Memory-problem?
Hi,
I have a big problem with my R-script. It seems to be a memory problem, but I'm not sure.
My script:
test.window <- function(stat, some arguments){
several ifs and ifs in ifs (if(){...if(){...}})
}
...
for (ii in 1 : length(data)){ ## data is a vector of length 2500
stat <- test.window( some arguments )
## there are 15 arguments including a "big" list
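The function body is only sketched in the post, so the version below is a stand-in; the pattern it illustrates, preallocating the result vector instead of growing an object inside the loop, is a common fix for creeping memory use in loops of this shape:

```r
# Stand-in for the poster's test.window(); the real function takes
# about 15 arguments including a large list.
test.window <- function(stat, x) {
  if (x > 0) {
    if (x > 1) stat + x else stat - x
  } else {
    stat
  }
}

data <- rnorm(2500)
stat <- 0
results <- numeric(length(data))  # preallocate rather than c()-append
for (ii in seq_along(data)) {
  stat <- test.window(stat, data[ii])
  results[ii] <- stat
  if (ii %% 500 == 0) gc()        # occasional explicit collection
}
```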
2019 Nov 30
0
[PATCH nbdkit 1/3] filters: stats: Show size in GiB, rate in MiB/s
I find bytes and bits-per-second unhelpful and hard to parse. Using GiB
for sizes works for common disk images, and MiB/s works for common
storage throughput.
Here is an example run with this change:
$ ./nbdkit --foreground \
--unix /tmp/nbd.sock \
--exportname '' \
--filter stats \
file file=/var/tmp/dst.img \
statsfile=/dev/stderr \
--run 'qemu-img convert
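The formatting the patch describes is a simple power-of-two scaling; a sketch of the arithmetic (not the patch's actual C code):

```r
# Bytes to GiB, and bytes-over-seconds to MiB/s, as in the patch's
# proposed output format (sketch only).
to_gib  <- function(bytes) bytes / 2^30
to_mibs <- function(bytes, seconds) (bytes / 2^20) / seconds

sprintf("%.3f GiB", to_gib(6 * 2^30))         # "6.000 GiB"
sprintf("%.3f MiB/s", to_mibs(6 * 2^30, 30))  # "204.800 MiB/s"
```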
2019 Nov 30
5
[PATCH nbdkit 0/3] filters: stats: More useful, more friendly
- Use more friendly output with GiB and MiB/s.
- Measure time per operation, providing finer-grained stats
- Add missing stats for flush
I hope that these changes will help to understand and improve virt-v2v
performance.
Nir Soffer (3):
filters: stats: Show size in GiB, rate in MiB/s
filters: stats: Measure time per operation
filters: stats: Add flush stats
filters/stats/stats.c | 117
2013 Feb 28
1
DNS IPv6 Question
Hi there,
Domaincontroller name: risky.home.schinz.de
dig @risky -t ANY risky.home.schinz.de.
returns:
;; ANSWER SECTION:
risky.home.schinz.de. 900 IN A 10.0.180.254
risky.home.schinz.de. 900 IN AAAA ::1
I have a strange behavior in my network. It's only concerning DNS, but
maybe someone can help me.
If I do an ssh on risky from another machine (exact command:
2019 Nov 30
4
[PATCH nbdkit v2 0/3] filters: stats: More useful, more friendly
- Use more friendly output with GiB and MiB/s
- Measure time per operation, providing finer-grained stats
- Add total stats for understanding system throughput
- Add missing stats for flush
I hope that these changes will help to understand and improve virt-v2v
performance.
Changes since v1:
- Keep bytes values
- Increase precision to 0.001 GiB and 0.001 MiB/s
- Add total stats
- Show time before