Displaying 20 results from an estimated 8000 matches similar to: "Reading and coalescing many datafiles."
2012 Dec 10
26
[PATCH 00/11] Add virtual EPT support to Xen.
From: Zhang Xiantao <xiantao.zhang@intel.com>
With virtual EPT support, the L1 hypervisor can use EPT hardware
for the L2 guest's memory virtualization. In this way, the L2 guest's
performance can be improved sharply. According to our testing,
some benchmarks can show > 5x performance gain.
Signed-off-by: Zhang Xiantao <xiantao.zhang@intel.com>
Zhang Xiantao (11):
2003 Jun 04
2
rsync for migrating oracle datafiles
Hi - a question for all ye rsync gurus out there...
I have a need to migrate some fairly large Oracle datafiles from a UFS filesystem to VxFS (VERITAS), however I am not being allowed nearly enough outage time to perform a standard file copy migration. The datafiles (of which there are about 4) are about 50GB each in size and sit on separate UFS filesystems.
I am considering instigating a local
2012 Oct 26
1
Parsing very large xml datafiles with SAX (XML package): What data structure should I favor?
Hello again,
I have another question related to parsing a very large xml file with SAX:
what kind of data structure should I favor? Unlike the DOM functions, which
can return lists of relevant nodes and let me use various versions of
'apply', SAX parsing returns one thing at a time.
I first tried the simple solution of appending to lists as
I get the data. But I
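A minimal sketch of the usual fix (my own illustration, not from the thread; the handler name "text" and the file "big.xml" are placeholders): accumulate values into a pre-allocated buffer held in a closure rather than growing a list element by element, which avoids the quadratic cost of repeated appends.

library(XML)

makeHandlers <- function(chunk = 10000L) {
  values <- character(chunk)     # pre-allocated buffer, grown a chunk at a time
  n <- 0L
  text <- function(content, ...) {
    n <<- n + 1L
    if (n > length(values)) length(values) <<- length(values) + chunk
    values[n] <<- content
  }
  list(text = text, getValues = function() values[seq_len(n)])
}

h <- makeHandlers()
## xmlEventParse("big.xml", handlers = h)
## result <- h$getValues()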
2012 Oct 26
1
Parsing very large xml datafiles with SAX: How to profile <anonymous> functions?
Hello everyone,
I'm trying to parse a very large XML file using SAX with the XML package
(i.e., mainly the xmlEventParse function). This function takes as an
argument a list of other functions (handlers) that will be called to handle
particular xml nodes.
When I use Rprof(), all the handler functions are lumped together under
the <anonymous> label, and I get something like this:
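One common workaround (a sketch under my own assumptions, not the poster's code): define each handler as a named top-level function instead of an anonymous function built inside the handlers list, so Rprof()/summaryRprof() attribute time to handleStart, handleText, etc. rather than to <anonymous>.

library(XML)

handleStart <- function(name, attrs, ...) {
  ## per-node bookkeeping goes here
}
handleText <- function(content, ...) {
  ## per-text-node work goes here
}

Rprof("sax.out")
## xmlEventParse("big.xml",
##               handlers = list(startElement = handleStart, text = handleText))
Rprof(NULL)
summaryRprof("sax.out")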
2004 Oct 04
3
Working with large datafiles
Hi,
I have been enjoying R for some time now, but was wondering about working
with larger data files. When I try to load in big files with more than
20,000 records, the program seems unable to store all the records. Is
there some way that I can increase the number of records I can work with?
Ideally I would like to work with census data which can hold a million
records.
Greg
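For what it's worth, a minimal sketch of the usual first step (my own illustration; "census.csv" and the column types are placeholders): declaring colClasses and an upper bound on nrows lets read.table allocate once instead of repeatedly guessing types and re-growing, which cuts both memory use and load time.

cc  <- c("integer", "character", "numeric", "factor")
dat <- read.table("census.csv", sep = ",", header = TRUE,
                  colClasses = cc, nrows = 1000000,
                  comment.char = "")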
2006 Mar 21
3
Rsync 4TB datafiles...?
I need to rsync 4 TB of datafiles to a remote server and clone them into a new Oracle
database. I have about 40 drives that contain this 4 TB of data. I would like
to do the rsync at a directory level by using the --files-from=FILE option. But
the problem is: what will happen if the network connection fails? Will the whole
rsync fail?
rsync -a srchost:/ / --files-from=dbf-list
and dbf-list would contain this:
2009 Mar 10
2
How to color certain area under curve
For a given random variable rv, for instance, rv = rnorm(1000),
I plot its density curve and calculate some quantiles:
plot(density(rv))
P10P50P90 = quantile(rv, probs = c(10,50,90)/100)
I would like to color the area between P10 and P90 and under the curve
and mark the P50 on the curve.
> rv = rnorm(1000)
> plot(density(rv))
> P10P50P90 = quantile(rv, probs = c(10,50,90)/100)
Could
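A minimal sketch of one way to do the shading (not from the thread): clip the density curve between P10 and P90, fill that region with polygon(), then mark P50 with a vertical line.

set.seed(1)
rv <- rnorm(1000)
d  <- density(rv)
q  <- quantile(rv, probs = c(0.10, 0.50, 0.90))

plot(d, main = "Density with the 10%-90% region shaded")
keep <- d$x >= q[1] & d$x <= q[3]
polygon(c(q[1], d$x[keep], q[3]),
        c(0,    d$y[keep], 0),
        col = "lightblue", border = NA)
lines(d)                    # redraw the curve on top of the shading
abline(v = q[2], lty = 2)   # mark P50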
2017 Feb 24
2
Looking for Speech Recognition (ASR) suggestions
Hello Luca,
Thank you for your response. I?m familiar with speech recognition and TTS, but new to MRCP.
Yes, the 100k options are used for names in a directory listing.
In the pre-MRCP support, Nuance ASR used API events/methods for the application to tell ASR when the prompt was playing and when it stopped. If ASR detected speech, it would signal an event so we would stop playing the prompt.
2009 Sep 10
2
ASR & ACD
Is there any program Asterisk users use to calculate ASR and ACD?
Thanks for any comments.
2017 Feb 22
2
Looking for Speech Recognition (ASR) suggestions
Is it correct that UniMRCP is the best approach for ASR/TTS with Asterisk?
Could anyone provide pros/cons for the various ASR options for Asterisk?
We need the ability for very large grammars (over 100,000 options). Because of this, my initial thought is Nuance or Lumenvox. Does this sound correct?
Have a great day!
Dan
2010 Jul 15
1
How do I combine lists of data.frames into a single data frame?
The data.frame is constructed by one of the following functions:
funweek <- function(df)
  if (length(df$elapsed_time) > 5) {
    rv = fitdist(df$elapsed_time, "exp")
    rv$year = df$sale_year[1]
    rv$sample = df$sale_week[1]
    rv$granularity = "week"
    rv
  }
funmonth <- function(df)
  if (length(df$elapsed_time) > 5) {
    rv =
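A minimal sketch of the usual answer to the subject line (my own illustration; resultList is an assumed list of the objects returned by funweek()/funmonth()): convert each element to a one-row data.frame and stack them with do.call(rbind, ...).

rows <- lapply(resultList, function(rv) {
  data.frame(rate        = rv$estimate[["rate"]],   # fitdist() estimate for "exp"
             year        = rv$year,
             sample      = rv$sample,
             granularity = rv$granularity,
             stringsAsFactors = FALSE)
})
combined <- do.call(rbind, rows)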
2017 Oct 22
3
ASR Suggestions for small dictionary (<1000 entries) lookup in France/French
Hello,
I'm in the early stages of designing an Emergency calling service IVR
application.
The IVR application asks one or two simple questions like "Which is the
postal code of the area you are currently calling from?" or "Is the correct
?". The expected values are a 5-digit number like
"twenty-five-thousand-two-hundred-twelve" or
2007 May 19
2
RV permissions -- can't park in my lot!
I'm trying to set up RV and cannot get it started. I keep getting:
Permission Denied : /var/log/rv.log
I've tried running it as root to test and also using www-data and chgrp of
rv.log to www-data.
Here's the output:
nohup su -c "/usr/bin/ruby rv_harness.rb 3301 127.0.0.1 < /dev/null 2>&1 >
/dev/null" www-data < /dev/null 2>&1 >>
2012 Mar 26
1
assigning vector or matrix sparsely (for use with mclapply)
Dear R wizards---
I have a wrapper on mclapply() that makes it a little easier for me to
do multiprocessing. (Posting this may make life easier for other
googlers.) I pass a data frame, a vector that tells me what rows
should be recomputed, and the function; and I get back a vector or
matrix of answers.
d <- data.frame( id=1:6, val=11:16 )
loc <- c(TRUE,TRUE,FALSE,TRUE,FALSE,TRUE)
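A minimal sketch of the idea behind such a wrapper (my own illustration, not the poster's code): run the function only on the rows flagged in loc, in parallel, then scatter the answers back into a vector aligned with the data frame.

library(parallel)

d   <- data.frame(id = 1:6, val = 11:16)
loc <- c(TRUE, TRUE, FALSE, TRUE, FALSE, TRUE)
f   <- function(row) row$val^2                     # placeholder per-row computation

idx <- which(loc)
ans <- mclapply(idx, function(i) f(d[i, , drop = FALSE]), mc.cores = 2)

out <- rep(NA_real_, nrow(d))
out[idx] <- unlist(ans)                            # sparse assignment back into place
out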
2011 Aug 11
2
[Hivex] [PATCH] Correct 32-bit to 64-bit call
---
generator/generator.ml | 2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diff --git a/generator/generator.ml b/generator/generator.ml
index 31478cd..de911f1 100755
--- a/generator/generator.ml
+++ b/generator/generator.ml
@@ -1771,7 +1771,7 @@ static void raise_closed (const char *) Noreturn;
pr " rv = copy_type_value (r, len, t);\n";
pr "
2012 Sep 14
2
Opus for ASR
Hello,
All of the Opus quality studies that I've seen focused on human-perceived quality. I'm interested to know of any experience with machine-"perceived" quality, particularly related to speech recognition or biometrics.
I'm also interested in folks' thoughts on optimizing Opus for ASR. For example, removing certain classes of comfort noise, filtering non-speech bands,
2012 Jul 19
1
Switching log(J) to log(J+1) to avoid log(0) in HAR-RVJ model
I am working with xts dependent data, and my code is as follows (the problem
is explained throughout):
dat <- getdat("prices")
dat <- read.zoo(dat, sep = "", format = "%d/%m/%Y %H:%M",
                tz = "", FUN = NULL, regular = TRUE,
                header = TRUE, index.column = 1,
                colClasses = c("character", "numeric"))
dat <- as.xts(dat)
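The transform itself is a one-liner; a minimal sketch (the variables RV, RVlag and J below are placeholders, not the poster's columns): use log(J + 1), or equivalently log1p(J), so days with a zero jump component do not produce -Inf.

J    <- c(0, 0.002, 0, 0.015, 0.001)
logJ <- log1p(J)      # identical to log(J + 1), more accurate for small J
## a HAR-RV-J style regression term would then use logJ, e.g.
## fit <- lm(RV ~ RVlag + logJ, data = dat)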
2009 Mar 06
4
rosh patch
Hey Guys,
Below is a patch for com32/rosh/rosh.c from tonight's syslinux git.
The patch does the following:
1) changes the rosh_issp to use an if instead of a case for this simple
test.
2) changes the rosh_dir_arg function to move the readdir() to inside the
while test.
This will let me go through my APUE book.
Let me know if you have any questions.
Keith
--- rosh.orig 2009-03-05
2019 May 17
1
Re: [nbdkit PATCH v2 08/24] ocaml: Implement .cache script callback
On Wed, May 15, 2019 at 10:57:58PM -0500, Eric Blake wrote:
> +static int
> +can_cache_wrapper (void *h)
> +{
> + CAMLparam0 ();
> + CAMLlocal1 (rv);
> +
> + caml_leave_blocking_section ();
> +
> + rv = caml_callback_exn (can_cache_fn, *(value *) h);
> + if (Is_exception_result (rv)) {
> + nbdkit_error ("%s", caml_format_exception
2005 Jun 03
2
Dirty Rotten Hack. (reversing tickmarks on axes?)
I feel dirty.
I have some graphs I'm building to communicate chargeback rates and service
usage for our backup system here at the University of Florida. These come
down to daily data points on a graph of number-of-bytes transferred and
stored.
Since we charge back on the same basis (price per MB this, price per KB that),
the same chart with a different scale can be used to communicate bytes
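A minimal sketch of one way to reuse the same chart with a second scale (my own illustration; the byte counts and the price per MB are placeholders): plot bytes against the left axis and label a right-hand axis with the corresponding charge.

bytes <- c(1.2e9, 2.5e9, 1.8e9, 3.1e9)   # daily bytes transferred (made-up data)
rate  <- 0.10 / 1e6                      # assumed price per MB, in dollars

par(mar = c(5, 4, 4, 4))                 # leave room for the right-hand axis
plot(bytes, type = "b", xlab = "Day", ylab = "Bytes transferred")
ticks <- pretty(bytes)
axis(4, at = ticks, labels = round(ticks * rate, 2))
mtext("Charge (USD)", side = 4, line = 3)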