Displaying 20 results from an estimated 5000 matches similar to: "package for saving large datasets in ASCII"
2011 Mar 13
2
Problems getting html files out of R CMD check
Hi,
I'm trying to R CMD check a package, however I have hit a snag. There seems
to be a problem with the creation of the html/ files (the only file that's
constructed here is the 00Index.html). I've tested each of the .Rd files
independently with R CMD Rdconv, they all happily create html files without
complaint.
R CMD check <package> gives no warnings. I'm therefore
2018 Nov 27
2
[2.3.4] Segmentation faults
It's still missing the core dump (or bt full from it)
Aki
On 27.11.2018 8.39, Joan Moreau wrote:
>
> Thank you Aki
>
> here the requested data (below)
>
> Please note as well that we have numerous subfolders (>50) and pretty
> big mailbox sizes (>20G)
>
> Bug appears mostly in auth process and index-worker
>
>
> dovecot -n :
>
> # 2.4.devel
2018 Nov 25
3
[2.3.4] Segmentation faults
> On 25 November 2018 at 06:29 Joan Moreau <jom@grosjo.net> wrote:
2018 Nov 28
2
[2.3.4] Segmentation faults
See https://dovecot.org/bugreport.html#coredumps
Without a backtrace it's not really possible to figure out where it's crashing.
> On 28 Nov 2018, at 13.20, Joan Moreau <jom at grosjo.net> wrote:
>
> Where to get that ?
>
>
> On 2018-11-27 08:50, Aki Tuomi wrote:
>
>> It's still missing core dump (or bt full from it)
>>
>> Aki
2008 May 24
1
R-Excel Macro Problem
I'm trying to write R functions into VBA code. I've done this many other
times in other documents and everything has run great. But today I keep
receiving an error message "Run-time error '1004': Application-defined or
object-defined error."
Has anyone else encountered this same error message?
I do not receive this error in the document when running regular VBA code.
2002 Aug 28
4
Huge data frames?
A friend of mine recently mentioned that he had painlessly imported a
data file with 8 columns and 500,000 rows into matlab. When I tried
the same thing in R (both Unix and Windows variants) I had little
success. The Windows version hung for a very long time, until I
eventually more or less ran out of virtual memory; I tried to set the
proper memory allocations for the Unix version, but it never
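For a flat numeric file of that shape, much of read.table's cost is type guessing and incremental reallocation; pre-declaring the column types and row count usually helps. A minimal sketch, with file name, sizes, and contents invented here purely for illustration:

```r
# Generate a stand-in 8-column numeric file (sizes made up for the sketch):
f <- tempfile()
write.table(matrix(rnorm(800), ncol = 8), f,
            row.names = FALSE, col.names = FALSE)

# Declaring colClasses and nrows up front lets read.table skip type
# guessing and pre-allocate its result, which is where most of the time
# and memory on big flat files tends to go:
d <- read.table(f, colClasses = rep("numeric", 8),
                nrows = 100, comment.char = "")
```

The same `colClasses`/`nrows` hints scale to the 500,000-row case, since they remove the need to re-scan and re-type the columns.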
2018 Nov 29
3
[2.3.4] Segmentation faults
Finally managed to locate the dump; here is the output:
# gdb /usr/libexec/dovecot/auth
/var/lib/systemd/coredump/core.auth.0.3a33f56105e043de802a7dfcee265a07.28130.1543516118000000
GNU gdb (GDB) 8.2
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later
<http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
2018 Nov 24
2
v2.3.4 released
On Fri, 23 Nov 2018 10:45:56 -0500, Brad Smith stated:
>On 11/23/2018 9:31 AM, The Doctor wrote:
>
>> On Fri, Nov 23, 2018 at 04:06:53PM +0300, Odhiambo Washington wrote:
>>> On Fri, 23 Nov 2018 at 15:29, Timo Sirainen <tss at iki.fi> wrote:
>>>
>>>> https://dovecot.org/releases/2.3/dovecot-2.3.4.tar.gz
>>>>
2018 Nov 25
0
[2.3.4] Segmentation faults
Hi
These are the lines I have in my dmesg (see below).
In the dovecot log, I see:
Nov 25 04:26:47 auth-worker: Error: double free or corruption (fasttop)
What to do about it?
Using the latest 2.3.4 version.
Thank you
--------
[132932.169265] Code: 00 00 00 00 00 00 00 00 00 00 80 00 00 00 00 00 00
00 21 00 00 00 00 00 00 00 02 00 00 00 00 00 00 00 20 aa 36 f5 7a 7f 00
00 <40> 24 3d
2018 Nov 27
0
[2.3.4] Segmentation faults
Thank you Aki
Here is the requested data (below).
Please note as well that we have numerous subfolders (>50) and pretty big
mailbox sizes (>20G)
Bug appears mostly in auth process and index-worker
dovecot -n :
# 2.4.devel (de42b54aa): /etc/dovecot/dovecot.conf
# Pigeonhole version 0.6.devel (65909cfa)
# OS: Linux 4.19.4-arch1-1-ARCH x86_64 ext4
# Hostname: gjserver
base_dir =
2018 Nov 28
0
[2.3.4] Segmentation faults
Where to get that ?
On 2018-11-27 08:50, Aki Tuomi wrote:
> It's still missing core dump (or bt full from it)
>
> Aki
>
> On 27.11.2018 8.39, Joan Moreau wrote:
>
> Thank you Aki
>
> here the requested data (below)
>
> Please note as well that we have numerous subfolders (>50) and pretty big mailbox sizes (>20G)
>
> Bug appears mostly in
2018 Nov 29
0
[2.3.4] Segmentation faults
Can't find any "core" files (updatedb; locate "core"). Core dumps are
usually in /var/lib/systemd/coredump for other programs, but there is
nothing for dovecot.
Looks like the issue is in 'auth' and 'indexer-worker'. Where can the
coredump files be?
On 2018-11-28 18:13, Timo Sirainen wrote:
> See https://dovecot.org/bugreport.html#coredumps
>
> Without a
2003 Mar 12
6
Simple question about export
Hi,
Sorry for asking this stupid question, but I did not
find the answer in the documentation.
I managed to read my SPSS .sav file into R, no
problem. Next I would like to write this data to a
file in ASCII format. I tried to use write.table and I
got no error messages, but no file either. What is the
right way to do it?
At least write.table("c:\foo\data.dat") does not
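For what it's worth, write.table's first argument is the data object, not the file name, so a call given only a path writes nothing to disk. A minimal sketch of the intended usage (the data frame and output path here are invented for illustration):

```r
# Hypothetical data frame standing in for the imported SPSS data:
d <- data.frame(x = 1:3, y = c("a", "b", "c"))

# write.table() takes the object first; the destination goes in `file=`.
# In Windows paths, use forward slashes ("c:/foo/data.dat") or doubled
# backslashes, since "\f" inside "c:\foo" is an escape sequence.
out <- file.path(tempdir(), "data.dat")
write.table(d, file = out, sep = "\t", quote = FALSE, row.names = FALSE)
```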
2002 Jan 25
2
problem with read.table -- example
Hi,
I have not gotten much response to my question about read.table from a
couple of days ago. As I said, the problem is that read.delim() does not
want to read more than 51 lines of data; with a longer file it reads the
first column as row names, although I have not noticed any special binary
symbols around line 51 either. This problem seems to happen with this
particular file, I could easily read in a
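The first-column-as-row-names behaviour described here is read.table's documented response to a header line with one field fewer than the data lines; a stray quote character around the offending line can likewise swallow rows. A small self-contained sketch of the header case (file contents invented here):

```r
# A file whose header line has one field fewer than its data lines:
f <- tempfile()
writeLines(c("value", "r1\t10", "r2\t20"), f)

# read.delim() then takes the first column as row names ...
d1 <- read.delim(f)

# ... while row.names = NULL forces plain row numbering and keeps the
# first column as an ordinary column:
d2 <- read.delim(f, row.names = NULL)
```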
2001 Sep 20
3
indexing an array
Dear everybody,
I have a following problem. I have a 3D array
lambda <- array( dim=c(N,M,M-1))
where I have to extract the elements as follows:
lambda[ 1, state[1], 1]
lambda[ 1, state[1], 2]
...
lambda[ 1, state[1], M-1]
lambda[ 2, state[2], 1]
...
lambda[ 2, state[2], M-1]
...
lambda[ N, state[N], M-1]
i.e. the result should be a 2D array, where the second index follows the
first one
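The extraction above can be done in one shot with R's matrix indexing, where an n x 3 index matrix selects one element per row of the matrix. A sketch with small made-up dimensions and a made-up state vector:

```r
# Sketch: extract lambda[i, state[i], k] for all i and k at once.
N <- 3; M <- 4
lambda <- array(seq_len(N * M * (M - 1)), dim = c(N, M, M - 1))
state  <- c(2, 1, 3)

# One row of idx per desired element: (i, state[i], k), k varying fastest.
idx <- cbind(rep(seq_len(N),     each  = M - 1),  # first index i
             rep(state,          each  = M - 1),  # second index state[i]
             rep(seq_len(M - 1), times = N))      # third index k

# lambda[idx] returns a vector in the order above; reshape to N x (M-1):
res <- matrix(lambda[idx], nrow = N, ncol = M - 1, byrow = TRUE)
```

`res[i, k]` then equals `lambda[i, state[i], k]`, with no explicit loop.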
2007 Oct 17
1
R CMD build and et_EE.UTF-8 locale -> invalid files (PR#10351)
Full_Name: Ott Toomet
Version: 2.6.0, 2.5.x
OS: debian etch, lenny
Submission from: (NULL) (80.235.63.243)
When building a package with 'R CMD build name_of_directory' using the 'et_EE.UTF-8'
locale, I get the following:
siim at tancredi:~/tyyq/econ/micEcon$ R CMD build trunk
* checking for file 'trunk/DESCRIPTION' ... OK
* preparing 'trunk':
* checking
2018 Nov 30
0
[2.3.4] Segmentation faults
Another (very, very long) example:
# gdb /usr/libexec/dovecot/indexer-worker
core.indexer-worker.0.3a33f56105e043de802a7dfcee265a07.21017.1543533424000000
GNU gdb (GDB) 8.2
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later
<http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to
2001 Oct 11
2
large dataframes to ascii
Hi R-users,
I want to convert a large dataset (from stata format) to an ascii table.
The resulting table should be a human-readable table (like CSV, or
tab-separated file, or something similar). R reads the Stata file quite
easily (with some problems which were discussed here earlier), but so far I
have not found a suitable way to write it in ASCII format.
Sure, there exists write.table, which
2001 Dec 27
1
write.table and large datasets
Hi,
I'll continue the discussion about write.table() and problems with large
datasets.
The databases I have to work with are quite huge; 7500 obs x 1200 vars was
one of the smallest of them. I usually write a Perl script to preprocess
them line by line and extract only the variables which I need later. This
results in quite a manageable size, but I have to have the dataset in ASCII
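One way to keep memory bounded when exporting a huge data frame to ASCII is to emit it in row blocks, which plain write.table supports via its `append` argument. A sketch under that assumption (the function name `write_chunked` and chunk size are invented here):

```r
# Sketch of a chunked writer: append each block of rows so the full
# character matrix that write.table builds internally stays small.
write_chunked <- function(d, file, chunk = 1000) {
  starts <- seq(1, nrow(d), by = chunk)
  for (s in starts) {
    rows <- s:min(s + chunk - 1, nrow(d))
    write.table(d[rows, , drop = FALSE], file = file, sep = "\t",
                quote = FALSE, row.names = FALSE,
                col.names = (s == 1),  # header only on the first chunk
                append    = (s != 1))
  }
}
```

The output is a single tab-separated file identical to what one write.table call would produce, but only `chunk` rows are formatted at a time.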
2014 Jul 25
0
Tinc + Tomato (firmware)
I've been running Tinc on my routers for several years. I thought I'd do
an integration of tinc with gui in Tomato firmware because I find it useful.
It's been working well for me, but I'm sure there's a bug or
two, or something I've overlooked. Let me know of anything and I'll
correct it in a future release.
I created a tutorial for Tomato users here.