Displaying 6 results from an estimated 6 matches for "virtualr".

2015 Nov 18 · 4 · Linux ate my RAM...
Hello everyone, Excuse the title. I'm trying to do something very specific that goes against some common assumptions. I am aware of how Linux uses available memory to cache. This, in almost all cases, is desirable. I've spent years explaining to users how to properly read the free output. I'm now trying to increase VM density on host systems (by host, I mean the physical system, not
2008 Jun 15 · 1 · randomForest, 'No forest component...' error while calling Predict()
Dear R-users, While making a prediction using the randomForest function (package randomForest) I'm getting the following error message: "Error in predict.randomForest(model, newdata = CV) : No forest component in the object" Here's my complete code. For reproducing this task, please find my 2 data sets attached (http://www.nabble.com/file/p17855119/data.rar).
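The usual cause of this error is that the forest was not stored in the fitted object: keep.forest defaults to FALSE whenever xtest is supplied to randomForest(), so predict() later finds nothing to predict with. A minimal sketch with made-up data (train, CV, and the column names here are hypothetical stand-ins for the poster's attached sets):

```r
library(randomForest)

# Hypothetical stand-ins for the poster's data sets.
train <- data.frame(y  = factor(rep(c("a", "b"), 50)),
                    x1 = rnorm(100), x2 = rnorm(100))
CV    <- data.frame(x1 = rnorm(10),  x2 = rnorm(10))

# If xtest/ytest are passed to randomForest(), keep.forest defaults to
# FALSE and the returned object has no forest component, which triggers
# the error on predict(). Keeping the forest explicitly avoids it:
model <- randomForest(y ~ x1 + x2, data = train, keep.forest = TRUE)

predict(model, newdata = CV)
```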
2015 Nov 18 · 0 · Linux ate my RAM...
...spective, all Linux memory is > being "used". Nope. VMware's memory ballooning feature purposely keeps some of the guest's RAM locked away from the kernel. This is where RAM comes from when another guest needs more physical RAM than it currently has access to: https://blogs.vmware.com/virtualreality/2008/10/memory-overcomm.html There are downsides. One is that pages locked up by the balloon driver aren't being used by Linux's buffer cache. But on the other hand, the hypervisor itself fulfills some of that role, which is why rebooting a VM guest is typically much faster than rebooting...
2008 Apr 26 · 2 · Calling a stored model within the predict() function
Hi all, First of all, I'm a novice R user (less than a week), so perhaps my code isn't very efficient. Using the MBoost package I created a model using the following command and saved it to a file for later use: model <- gamboost(fpfm, data = SampleClusterData, baselearner = "bbs") # Creating a model save(model, file = "model.RData") # Saving a model After this, during a
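A common pitfall when reusing a model saved this way: load() restores the object into the workspace under its original name as a side effect, it does not return the object. A minimal sketch of the reloading step (NewClusterData is a hypothetical data frame with the same predictors as the training data):

```r
library(mboost)

# In a fresh session: this restores an object named "model" into the
# workspace. Note that `m <- load("model.RData")` would assign only the
# character string "model", not the fitted object itself.
load("model.RData")

preds <- predict(model, newdata = NewClusterData)
```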
2009 Mar 08 · 1 · Prevent saving the workspace while running a script in batch mode
Dear R-user, I'm running a certain R script in DOS batch mode. Is there a way to prevent R from saving the workspace once this script is finished? I'm asking this because the resulting .RData file is >70 MB in size. I don't need this file, since my script already writes the required output, and saving it makes the whole process very slow (>15 minutes each time). FYI, my script:
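Assuming the script is launched via R CMD BATCH, the --no-save flag stops R from writing .RData on exit (and --no-restore skips loading one at startup); alternatively, Rscript never saves the workspace. The script name below is a placeholder:

```
R CMD BATCH --no-save --no-restore myscript.R
```

or equivalently:

```
Rscript myscript.R
```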
2008 Apr 24 · 0 · R crashes while running a positive checked script (PR#11264)
Full_Name: Bas Zimmerman Version: 2.7.0 (2008-04-22) OS: Windows 2000 Pro SP 4 Eng Submission from: (NULL) (62.51.53.106) Running the following line of the R-code SurvivalEnsembles.R, part of the MBoost package, results in a program crash: 'AMLrf <- cforest(I(log(time)) ~ ., data = AMLlearn, control = ctrl, weights = AMLw)' This package received an OK-check, see