My bet would be a .RData file. Saving one extracts data from shared objects in
memory, but loading it does not restore that data into shared objects, so there
is a risk of the memory requirements exploding.
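If so, you can check from a Terminal R session (which you say starts fine). A
rough sketch, assuming the GUI's startup directory is your home folder; the
~/.RData path below is only a guess, so adjust it to wherever the GUI actually
starts:

    # the home-directory location of the saved workspace is an assumption
    old <- path.expand("~/.RData")
    file.exists(old)
    file.size(old)   # a very large value here would fit the symptoms

    # if it is there and huge, move it aside rather than deleting it outright
    file.rename(old, path.expand("~/RData.suspect"))

Starting command-line R with --no-restore-data (or --vanilla) is another way to
confirm that a saved workspace, rather than the bigwig file itself, is what
gets reloaded.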
If this is actually a problem specific to the Mac, you might want to ask on
R-sig-mac.
On October 25, 2022 6:33:10 AM PDT, ken eagle <eaglek2011 at gmail.com> wrote:
>I thought I was loading a ~300M binary (bigwig) file into another
>application, but the window changed to the R GUI without my realizing it
>and R tried to load the file. I'm on a Mac M1-based laptop with system
>12.5.1 and 8G of RAM, running R 4.2.1 (Intel version). According to
>Activity Monitor, R grabs ~45G (!) of memory before the system warns me that
>it is out of application memory and I have to force quit R. I have since
>tried locating/moving/editing .Rprofile, .Rapp.history, or .RData files
>with no impact. Starting R from a Terminal session works fine but has no
>impact on the problem. I have tried restarting R, restarting the laptop,
>and re-installing R, all with no change; on restarting R, it just starts
>reloading the file it can't handle (Ctrl-C and ESC don't do anything). I
>have changed the directory name of the offending file, and the filename
>itself, without changing what happens upon starting R. I've also created a
>new .Rprofile that successfully executes but does not prevent the load
>problem unless the .Rprofile includes a quit() command.
>
>Any suggestions?
>
>Ken
>
--
Sent from my phone. Please excuse my brevity.