2005 Feb 04
3
Handling large data sets via scan()
I'm trying to read in datasets with roughly 150,000 rows and 600
features. I wrote a function using scan() to read it in (I have a 4GB
linux machine) and it works like a charm. Unfortunately, converting the
scanned list into a data frame using as.data.frame() causes the memory
usage to explode (it can go from 300MB for the scanned list to 1.4GB for
a data.frame of 30000 rows) and it fails
2004 Aug 06
0
bug report w.r.t. streaming of metadata in icecast
On Tue, 7 Aug 2001, Richard Fromm wrote:
> to clarify, the behavior that i saw was that most of the time the title
> streaming appeared in the client, but sometimes it did not.
i think i've nailed this down. the problem is if the icecast server
starts in the middle of a song already being sourced to it. (which you
can get by starting icecast, start ices, then shutdown and restart
2004 Aug 06
0
bug report w.r.t. streaming of metadata in icecast
On Fri, 10 Aug 2001, Brendan Cully wrote:
> The server returns an icy-metaint header which tells where the info
> will be inserted. If, say, it's 4096 then you'll find the metadata
> after every 4096 bytes of MP3 data.
that's what i originally thought was supposed to happen, which is why i
sent the original mail with a bug report, saying i was expecting periodic
insertion of
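The interleaving Brendan describes can be sketched as a small parser. This is a minimal illustration of the ICY metadata convention (after every icy-metaint bytes of audio the server sends one length byte, followed by length × 16 bytes of zero-padded metadata, with length 0 meaning no update), not code taken from icecast itself:

```python
import io

def read_icy_stream(stream, metaint):
    """Yield (audio_chunk, metadata) pairs from an ICY stream.

    After every `metaint` bytes of MP3 data comes one length byte L,
    then L * 16 bytes of metadata (zero-padded; L == 0 means no
    title update in this block).
    """
    while True:
        audio = stream.read(metaint)
        if len(audio) < metaint:
            break  # end of stream
        length_byte = stream.read(1)
        if not length_byte:
            break
        meta_len = length_byte[0] * 16
        metadata = stream.read(meta_len).rstrip(b"\x00")
        yield audio, metadata

# Fake stream with icy-metaint = 8: audio, a 16-byte metadata block,
# more audio, then a zero length byte (no metadata update).
raw = (b"AAAAAAAA" + bytes([1]) + b"StreamTitle='x';"
       + b"BBBBBBBB" + bytes([0]))
chunks = list(read_icy_stream(io.BytesIO(raw), 8))
```

Under this reading, a client that never sees periodic metadata is either not requesting it (the icy-metadata: 1 request header) or the server is failing to insert the blocks at the advertised interval.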
2005 Feb 24
1
Do environments make copies?
I am using environments to avoid making copies (by keeping references).
But it seems like there is a hidden copy going on somewhere - for
example in the code fragment below, I am creating a reference to "y"
(of size 500MB) and storing the reference in object "data". But when I
save "data" and then restore it in another R session, gc() claims it is
using twice the
2005 Feb 19
2
Memory Fragmentation in R
I have a data set of roughly 700MB which during processing grows up to
2G (I'm using a 4GB linux box). After the work is done I clean up (rm())
and the state is returned to 700MB. Yet I find I cannot run the same
routine again as it claims to not be able to allocate memory even though
gcinfo() claims there is 1.1G left.
At the start of the second time
===============================
2004 Aug 06
2
bug report w.r.t. streaming of metadata in icecast
i've been trying to get title streaming of metadata to work with icecast
1.3.10. i've found what i believe to be a bug -- is this the right place to
file a bug report?
it appears that this information should be periodically inserted into the data
stream. the behavior that i was seeing was that the information was appearing
a maximum of one time and sometimes zero times.
i believe there
2004 Aug 06
2
bug report w.r.t. streaming of metadata in icecast
On Tue, 7 Aug 2001, Brendan Cully wrote:
> On Tuesday, 07 August 2001 at 12:56, Richard Fromm wrote:
> > i believe there is a bug in the following line in write_chunk_with_metadata()
> > in source.c. here is the original:
> >
> > if (source->info.udpseqnr == clicon->food.client->udpseqnr) {
> >
> > and here is the change:
> >
> >