Displaying 20 results from an estimated 2139 matches for "largest".
2019 Jan 23
2
[PATCH nbdkit] tests: Add generic requires function.
I only did a handful of the tests to demonstrate the point. If the
patch approach is accepted then I'll do the remainder of the tests in v2.
Rich.
2019 Jan 23
0
[PATCH nbdkit] tests: Add generic ‘requires’ function for testing test prerequisites.
---
tests/functions.sh.in | 17 ++++++++++++++++-
tests/test-memory-largest.sh | 8 ++------
tests/test-partition1.sh | 22 ++++------------------
tests/test-partition2.sh | 6 +-----
tests/test-pattern-largest.sh | 8 ++------
5 files changed, 25 insertions(+), 36 deletions(-)
diff --git a/tests/functions.sh.in b/tests/functions.sh.in
index 35647f7..97afbbf...
2018 Aug 02
3
tdbtool repack fails
...tdbtool repack) my "dc=domain,dc=com.ldb" file:
tdb> info
Size of file/data: 3388084000/1050098055
Header offset/logical size: 0/3388084000
Number of records: 669737
Incompatible hash: no
Active/supported feature flags: 0x00000000/0x00000001
Robust mutexes locking: no
Smallest/average/largest keys: 12/57/242
Smallest/average/largest data: 72/1510/1235987
Smallest/average/largest padding: 5/412/252927
Number of dead records: 2
Smallest/average/largest dead records: 399912712/661866360/923820008
Number of free records: 38999
Smallest/average/largest free records: 12/18485/715170720
Number...
2018 Sep 11
0
[PATCH nbdkit 4/4] tests: Add a helper function which waits for nbdkit to start up.
...data-base64.sh | 16 ++-------
tests/test-data-raw.sh | 16 ++-------
tests/test-fua.sh | 45 ++++++++++----------------
tests/test-ip.sh | 17 ++--------
tests/test-log.sh | 16 ++-------
tests/test-memory-largest-for-qemu.sh | 19 ++---------
tests/test-memory-largest.sh | 20 ++----------
tests/test-nozero.sh | 41 ++++++++---------------
tests/test-offset2.sh | 16 ++-------
tests/test-parallel-nbd.sh | 15 ++-------
tests/test-pattern-largest-for-...
2010 Jul 24
4
Trouble retrieving the second largest value from each row of a data.frame
I have a data frame with a couple million lines and want to retrieve the largest and second largest values in each row, along with the label of the column these values are in. For example
row 1
strongest=-11072
secondstrongest=-11707
strongestantenna=value120
secondstrongantenna=value60
Below is the code I am using and a truncated data.frame. Retrieving the largest value wa...
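One possible base-R approach, sketched only (the data frame df and its antenna columns below are made-up stand-ins for the poster's data, not taken from it): rank each row's columns with order() and use matrix indexing to pull out the top two values and their column names.
df <- data.frame(value120 = c(-11072, -11900),
                 value60  = c(-11707, -11350),
                 value0   = c(-12500, -11800))
m   <- as.matrix(df)                              # work on a numeric matrix
ord <- t(apply(m, 1, order, decreasing = TRUE))   # per-row column ranks
res <- data.frame(
  strongest           = m[cbind(seq_len(nrow(m)), ord[, 1])],
  secondstrongest     = m[cbind(seq_len(nrow(m)), ord[, 2])],
  strongestantenna    = colnames(m)[ord[, 1]],
  secondstrongantenna = colnames(m)[ord[, 2]]
)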
2018 Sep 13
0
[PATCH v2 nbdkit 4/5] tests: Use a generic cleanup mechanism instead of explicit trap.
...-----
tests/test-data-7E.sh | 9 ++-------
tests/test-data-base64.sh | 9 ++-------
tests/test-data-raw.sh | 9 ++-------
tests/test-fua.sh | 9 ++-------
tests/test-log.sh | 9 ++-------
tests/test-memory-largest-for-qemu.sh | 9 ++-------
tests/test-memory-largest.sh | 9 ++-------
tests/test-nozero.sh | 9 ++-------
tests/test-offset2.sh | 9 ++-------
tests/test-parallel-file.sh | 4 +++-
tests/test-parallel-nbd.sh | 4 +++-
tests...
2018 Sep 13
0
[PATCH v2 nbdkit 5/5] tests: Add a helper function which waits for nbdkit to start up.
...h | 27 +------------
tests/test-data-raw.sh | 27 +------------
tests/test-fua.sh | 52 +++++++-------------------
tests/test-ip.sh | 23 ++----------
tests/test-log.sh | 23 ++----------
tests/test-memory-largest-for-qemu.sh | 30 ++-------------
tests/test-memory-largest.sh | 31 ++-------------
tests/test-nozero.sh | 48 ++++--------------------
tests/test-offset2.sh | 27 +------------
tests/test-parallel-nbd.sh | 19 ++--------
tests/test-pattern...
2002 Jun 26
4
Largest file system being synced
I'm interested in rsync's capability to replicate large file systems. Could
some of you who use it share how large your mirrors are? What would
you say is the largest site being mirrored using rsync?
Thanks!
JP
2018 Sep 13
8
[PATCH v2 nbdkit 0/5] tests: Move common functions into tests/functions.sh
v1 was here:
https://www.redhat.com/archives/libguestfs/2018-September/msg00057.html
v2:
- Fix tab vs spaces in configure.ac.
- To generate list of plugins, use printf instead of xargs.
- Use 'source ./functions.sh' instead of 'source functions'.
- functions.sh: Consistent quoting in foreach_plugin function.
- functions.sh: Change the contract of start_nbdkit so it
2019 Jan 23
0
[PATCH v2 nbdkit] tests: Add generic ‘requires’ function for test prerequisites.
...---
tests/test-error0.sh | 6 +-----
tests/test-error10.sh | 6 +-----
tests/test-error100.sh | 6 +-----
tests/test-full.sh | 8 ++------
tests/test-ip.sh | 21 ++++++---------------
tests/test-memory-largest-for-qemu.sh | 8 ++------
tests/test-memory-largest.sh | 8 ++------
tests/test-offset2.sh | 8 ++------
tests/test-parallel-file.sh | 10 ++--------
tests/test-parallel-nbd.sh | 10 ++--------
tests/test-partition1.sh | 22 ++++---...
2004 Sep 21
2
Ever see a stata import problem like this?
      ...entiles        Smallest
 1%      197432           19721
 5%      199649           19722
10%     1974116           19723       Obs              40933
25%     1983475           19724       Sum of Wgt.      40933
50%     1996808                       Mean           9963040
                        Largest       Std. Dev.      9006352
75%    1.99e+07        2.00e+07
90%    2.00e+07        2.00e+07       Variance      8.11e+13
95%    2.00e+07        2.00e+07       Skewness        .18931
99%    2.00e+07        2.00e+07       Kurtosis      1.045409
GSS YEAR FOR THIS RESPONDENT
----...
2018 Sep 11
7
[PATCH nbdkit 0/4] tests: Move common functions into tests/functions.sh
...ons.sh.
Patch 1: Preparation for patch 3.
Patch 2: Fix a long-standing bug in how man page links are generated.
Patch 3: Common code for iterating a test function over every plugin.
Patch 4: Common code for starting nbdkit in a test and waiting for the
PID file to appear. This is the largest and most complex of
the patches but is basically repetitive.
Rich.
2019 Jan 23
2
[PATCH v2 nbdkit] tests: Add generic requires.
v1 was here:
https://www.redhat.com/archives/libguestfs/2019-January/thread.html#00198
For v2 I changed most existing prerequisite tests to use the new
mechanism.
I only changed simple tests. There are a few more complex tests that
don't fit the “requires model” and those are not changed.
I normalized qemu-io/qemu-img testing to always use the --version
flag, where previously we used a mix
2024 Nov 21
1
tdb_expand overflow detected
...ng from. If it fails or
>> shows a cache full of nonsense, well that is also interesting.
>
> That is 161 lines of expired stuff.
Yeah, I'm not sure how that adds to 4 billion.
tdbtool /var/db/samba4/gencache.tdb
tdb> info
will show lines describing the "smallest/average/largest" of various things.
Douglas
2012 Apr 27
2
find the eigenvector corresponding to the largest eigenvalue
Hi,
If I use the eigen() function to find the eigenvalues of a matrix, how can I find the eigenvector corresponding to the largest eigenvalue?
Thanks!
[[alternative HTML version deleted]]
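A minimal sketch, assuming a real symmetric matrix (the example matrix A below is made up): eigen() returns the eigenvalues alongside a matrix whose columns are the eigenvectors, so the vector paired with the largest eigenvalue can be picked out with which.max().
set.seed(1)
A <- crossprod(matrix(rnorm(9), 3, 3))     # example symmetric matrix
e <- eigen(A)
v_max <- e$vectors[, which.max(e$values)]  # eigenvector for the largest eigenvalue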
2011 Apr 26
7
Second largest element from each matrix row
Hi,
I need to extract the second largest element from each row of a
matrix. Below is my solution, but I suspect there is a more efficient
way to accomplish the same thing. Is there?
set.seed(1)
a <- matrix(rnorm(9), 3 ,3)
sec.large <- as.vector(apply(a, 1, order, decreasing=T)[2,])
ans <- sapply(1:length(sec.large), function(i...
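Two possible alternatives, sketched with the same example matrix and making no claim that either is faster for every matrix size: take the second entry of each row sorted in decreasing order, or drop each row's maximum and take the maximum of what remains.
second  <- apply(a, 1, function(x) sort(x, decreasing = TRUE)[2])   # full sort per row
second2 <- apply(a, 1, function(x) max(x[-which.max(x)]))           # avoids a full sort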
2010 Jun 12
1
Fast way to compute largest eigenvector
Hello all,
I was wondering if there is a function in R that only computes the eigenvector
corresponding to the largest/smallest eigenvalue of an arbitrary real matrix.
Thanks
Minh
--
Living on Earth may be expensive, but it includes an annual free trip
around the Sun.
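If only the dominant eigenvector is needed, one option is plain power iteration. The sketch below is illustrative only: it assumes a real matrix with a single strictly dominant eigenvalue, and the function name power_iterate and the example matrix A are invented for the sketch.
power_iterate <- function(A, tol = 1e-10, max_iter = 1000) {
  v <- rnorm(ncol(A))
  v <- v / sqrt(sum(v^2))                  # random unit starting vector
  for (i in seq_len(max_iter)) {
    w <- as.vector(A %*% v)
    w <- w / sqrt(sum(w^2))                # renormalise at each step
    if (sum(abs(w - v)) < tol) break       # stop once the direction settles
    v <- w
  }
  v
}
set.seed(1)
A <- crossprod(matrix(rnorm(100), 10, 10))   # example symmetric matrix
v_dominant <- power_iterate(A)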
2001 Nov 27
1
largest file size?
What is the largest file size
limit in ext3?
Thank you
2006 Jul 26
2
largest acceptable lookup table in a package
Hi
One of my packages needs a look-up table of pre-calculated
numbers in the data directory.
I would like to have the matrix as large as possible.
What is the largest size matrix that would be an acceptable datafile in
an R package?
[
The table is a square, upper triangular matrix
consisting of logs of Stirling numbers calculated by Maple.
As discussed on the List a few days ago (thanks again David!)
Stirling numbers are computationally challenging;
one ne...
2010 Jun 24
5
Best way to compute a sum
> a <- 0 ; for(i in (1:200000000)) a <- a + 1/i
> b <- 0 ; for(i in (200000000:1)) b <- b + 1/i
> c <- sum(1/(1:200000000))
> d <- sum(1/(200000000:1))
> order(c(a,b,c,d))
[1] 1 2 4 3
> b<c
[1] TRUE
> c==d
[1] FALSE
I'd expected b to be the largest, since we sum up the smallest
numbers first. Instead, c is the largest, which is sum() applied
to the vector ordered with largest numbers first.
Can anyone shed some light on this?
What is the best way in R to compute a sum while avoiding
cancellation effects?
By the way, sum() in the above e...
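One standard way to reduce rounding error in a long running sum is compensated (Kahan) summation. The plain-R sketch below is illustrative only; the function name kahan_sum is made up, and an explicit R loop like this is far slower than the built-in sum(), which on many platforms already accumulates in extended precision.
kahan_sum <- function(x) {
  s <- 0; comp <- 0                  # running sum and compensation term
  for (xi in x) {
    y <- xi - comp
    t <- s + y
    comp <- (t - s) - y              # low-order bits lost when forming t
    s <- t
  }
  s
}
kahan_sum(1 / (1:1e6))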