That sounds like the operating system is terminating R because it is exhausting
system memory. You have not supplied the information requested in the Posting
Guide, so you may not get very specific responses on how to solve this. I would
suggest running partial data sets of increasing size to identify the largest
you can successfully analyze. You may be able to use a more memory-efficient
algorithm, or you may have to run your analysis on a machine with more memory
or with reconfigured process resource limits.
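A minimal sketch of that subset-size strategy, assuming the words are in a
character vector `x` (the vector below is synthetic stand-in data, not the
poster's actual corpus):

```r
## Hypothetical sketch: time table() on progressively larger subsets of a
## word vector to find the largest size this machine handles comfortably.
## `x` is synthetic stand-in data with roughly the poster's dimensions.
set.seed(1)
x <- sample(sprintf("w%05d", 1:33000), 9e5, replace = TRUE)

for (n in c(1e4, 1e5, 5e5, length(x))) {
  elapsed <- system.time(freq <- table(x[seq_len(n)]))["elapsed"]
  cat(sprintf("%7d words: %.2f s, %d unique\n", n, elapsed, length(freq)))
}
```

If a particular size crashes, that is the point at which to switch machines or
algorithms rather than guessing.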
---------------------------------------------------------------------------
Jeff Newmiller <jdnewmil at dcn.davis.ca.us>
Research Engineer (Solar/Batteries/Software/Embedded Controllers)
---------------------------------------------------------------------------
Sent from my phone. Please excuse my brevity.
On February 14, 2014 3:05:47 AM PST, vikrant <vikrant.shimpi at tcs.com> wrote:
>Hi,
>
>I am using R on a very large dataset that contains a lot of text. I
>prepared an all-words vector using the strsplit function. Now I want to
>compute the frequency of each unique word in that vector. To do so, I
>tried two approaches:
>
>1) as.data.frame(table(x))
>2) sapply(split(x, x), length)
>
>x contains approximately 9 lakh (900,000) words, and the number of
>unique words is around 33k.
>
>When I run either of these commands, the R window closes automatically.
>
>Please let me know how to resolve this.
>
>
>
>
>
>______________________________________________
>R-help at r-project.org mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide
>http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.