Displaying 5 results from an estimated 5 matches for "bigtabl".
2017 Jul 03 (2 replies): R memory limits on table(x, y) (and bigtabulate)
...tps://stat.ethz.ch/R-manual/R-devel/library/base/html/Memory-limits.html>
and <http://www.win-vector.com/blog/2015/06/r-in-a-64-bit-world/>, but
I just want to make sure I understood that right);
- I thought I could handle this with the bigtabulate package, but whenever I run

    xy.tab <- bigtable(data.frame(x, y), ccols = 1:2)

R crashes as follows:

    terminate called after throwing an instance of 'std::bad_alloc'
      what():  std::bad_alloc
    Aborted

Any idea what I am doing wrong with bigtabulate? Thanks for your
consideration.
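For context, table(x, y) allocates one cell for every combination of levels, so two high-cardinality vectors can blow past R's memory limits even when only a small fraction of the combinations actually occur. A minimal base-R sketch of one workaround (not bigtabulate; the data and the "\r" separator are my assumptions) that counts only the observed pairs by collapsing each pair into a single key:

```r
## Count only the observed (x, y) pairs instead of allocating the full
## length(unique(x)) * length(unique(y)) contingency table.
x <- sample(1:1000, 1e5, replace = TRUE)
y <- sample(1:1000, 1e5, replace = TRUE)

## One key per pair; "\r" is just a separator assumed absent from the values.
pair.counts <- table(paste(x, y, sep = "\r"))

## pair.counts has at most length(x) entries, never nlevels(x) * nlevels(y),
## so its size tracks the data rather than the cross of the level sets.
length(pair.counts) <= length(x)
```

The dense table's size grows with the product of the level counts; this grows only with the number of distinct pairs seen.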
2006 May 19 (4 replies): Fast update of a lot of records in a database?
We have a PostgreSQL table with about 400,000 records in it. Using
either RODBC or RdbiPgSQL, what is the fastest way to update one (or a
few) column(s) in a large collection of records? Currently we're
sending SQL like

    BEGIN
    UPDATE table SET col1=value WHERE id=id
    (repeated thousands of times for different ids)
    COMMIT

and this takes hours to complete. Surely there must be a quicker
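The usual fix for this pattern is to replace thousands of single-row statements with one set-based statement: bulk-load the (id, value) pairs into a temporary table, then issue a single joined UPDATE. A sketch using the modern DBI interface (the connection `con`, the table name `mytable`, and the `updates` data frame are all assumptions, not from the original post; `temporary = TRUE` is supported by backends such as RPostgres):

```r
library(DBI)

## `updates` holds the new values, one row per record to change (assumed).
updates <- data.frame(id = c(1L, 2L, 3L), col1 = c("a", "b", "c"))

## One bulk load into a session-local temporary table...
dbWriteTable(con, "tmp_updates", updates, temporary = TRUE)

## ...then a single joined UPDATE touches all matching rows at once.
dbExecute(con, "
  UPDATE mytable AS t
     SET col1 = u.col1
    FROM tmp_updates AS u
   WHERE t.id = u.id")
```

PostgreSQL then parses and plans one statement instead of thousands, which is where most of the hours were going.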
2012 Feb 29 (0 replies): Question about tables in bigtabulate
...ou would get if you could run table() on the big.matrix;
## that's emulated in this example by coercing the big.matrix to a matrix.
## In the real application that is not possible, because of RAM limits.
P <- table(as.matrix(test))
## The package bigtabulate has a version of table() called bigtable().
## You can run table() on an individual column.
## I want to run it on all the columns: basically, combine the results of
## running it on individual columns.
## If you try to specify multiple columns you get a contingency table, and
## if you use too many columns you will hang your system hard... so...
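Combining the per-column counts the poster describes can be done without ever forming a contingency table: tabulate one column at a time and merge the named count vectors. A base-R sketch (the ordinary matrix `m` stands in for the big.matrix, which would be indexed the same way, one column at a time):

```r
## Merge two named count vectors, summing counts for names they share.
combine.counts <- function(a, b) {
  keys <- union(names(a), names(b))
  out <- setNames(numeric(length(keys)), keys)
  out[names(a)] <- out[names(a)] + a
  out[names(b)] <- out[names(b)] + b
  out
}

m <- matrix(sample(1:4, 30, replace = TRUE), ncol = 3)

## One table() per column, folded together: only a single column is in
## play at a time, so the whole matrix never has to fit in RAM at once.
acc <- numeric(0)
for (j in seq_len(ncol(m))) acc <- combine.counts(acc, table(m[, j]))

## acc now holds the same counts as table(as.matrix(m)), the result the
## poster wanted, without the coercion.
```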
2017 Jul 03 (0 replies): R memory limits on table(x, y) (and bigtabulate)
...ual/R-devel/library/base/html/Memory-limits.html>
> and <http://www.win-vector.com/blog/2015/06/r-in-a-64-bit-world/>, but
> I just want to make sure I understood that right);
> - I thought I could handle this with the package bigtabulate, but whenever I run
>
> xy.tab <- bigtable(data.frame(x, y), ccols=1:2)
>
> R crashes as follows:
>
> terminate called after throwing an instance of 'std::bad_alloc'
> what(): std::bad_alloc
> Aborted
>
> Any idea on what I am doing wrong with bigtabulate? Thanks for your
> consideration
>
2017 Jan 21 (1 reply): How to handle INT8 data
To summarise this thread, there are basically three ways of handling int64 in R:
* coerce to character
* coerce to double
* store in double
There is no ideal solution, and each has pros and cons that I've
attempted to summarise below.
## Coerce to character
This is the easiest approach if the data is used as identifiers. It
will have some performance drawbacks when loading and will
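The trade-off behind the double-based options is precision: a double has a 53-bit significand, so not every int64 value above 2^53 can be represented exactly. A quick base-R illustration (the specific values are mine, not from the thread):

```r
## Doubles represent every integer exactly only up to 2^53.
big <- 2^53
(big + 1) == big        ## TRUE: 2^53 + 1 rounds back down to 2^53

## Below that threshold, coercion between int64 and double is lossless.
small <- 2^52
(small + 1) == small    ## FALSE: still exactly representable

## Coercing to character keeps full precision for identifier-style use,
## at the cost of losing numeric operations.
sprintf("%.0f", big)
```

This is why character coercion is the safe default for identifiers, while the double routes only work when values stay comfortably under 2^53.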