Displaying 20 results from an estimated 1000 matches similar to: "ocfs hung"
2006 Nov 16
1
Regarding debugocfs
Hi experts,
My customer ran debugocfs to check the file_size and extent info,
but values such as file_size, alloc_size, and next_free_ext were all 0.
(/dev/sdi1 contains datafiles and arc files)
# debugocfs -a 0 /dev/sdi1
debugocfs 1.0.10-PROD1 Fri Mar 5 14:35:29 PST 2004
(build fcb0206676afe0fcac47a99c90de0e7b)
file_extent_0:
file_number = 128
disk_offset = 1433600
curr_master = 0
file_lock =
2004 Sep 01
2
ocfs doesn't free space?
An ocfs volume was nearly full (only 800MB free). I deleted some
datafiles to free space:
$ df -h .
Filesystem Size Used Avail Use% Mounted on
/dev/sdp1 10G 5.3G 4.8G 53% /db/DPS
so there are more than 4GB available.
$ sqlplus /nolog
SQL*Plus: Release 9.2.0.4.0 - Production on Wed Sep 1 12:57:48 2004
Copyright (c) 1982, 2002, Oracle Corporation. All rights
2004 Mar 10
9
Lock contention issue with ocfs
I am still having this weird problem with nodes hanging while I'm
running OCFS. I'm using OCFS 1.0.9-12 and RHAS 2.1
I've been working on tracking it down and here's what I've got so far:
1. I create a file from node 0. This succeeds; I can /bin/cat the
file, append, edit, or whatever.
2. From node 1, I do an operation that accesses the DirNode (e.g.
/bin/ls)
3. Node 0
2004 Dec 01
2
cp --o_direct
Another question.
When my database is running, I do
[oracle@LNCSTRTLDB03 LPTE3]$ cp --o_direct xdb01.dbf /tmp
cp: cannot open `xdb01.dbf' for reading: Permission denied
[oracle@LNCSTRTLDB03 LPTE3]$
When the database is shut down it works.
Is this normal for ocfs? Because with any other filesystem I can just
copy a file at any time. (It's only a test, I know I can't copy
datafiles and have
2004 Mar 30
1
RHEL 3 and OCFS 1.0.9-12 / 1.0.11-1
Is the following statement still valid for either OCFS 1.0.9-12 or OCFS
1.0.11-1?
The following is from one of the questions put forward by Derek Suzuki on
Ocfs-users
"A couple more minor questions about OCFS and RHEL3"
> Next, I saw a Metalink thread which suggests that async I/O is not
> supported on OCFS with RHAS 2.1. It doesn't say anything about RHEL3.
> We've
2004 Jun 04
1
RHEL 3 -- OCFS 1.0.9-12 and 1.0.12
Running the database in async I/O mode on RHEL 3 carries a potential risk of redo log failure due to short I/Os.
A note from
http://oss.oracle.com/projects/ocfs/dist/files/RedHat/RHEL3/i386/README.txt
says that the above mentioned problem is fixed in OCFS 1.0.9-12
"RELEASE 1.0.9-12
Fixes a potential corruption with large, aligned, direct I/Os, for
example Oracle redo logs or direct path SQL
2004 Apr 22
1
A couple more minor questions about OCFS and RHEL 3
Sort of a followup...
We've been running OCFS in sync mode for a little over a month now,
and it has worked reasonably well. Performance is still a bit spotty, but
we're told that the next kernel update for RHEL3 should improve the
situation. We might eventually move to Polyserve's cluster filesystem for
its multipathing capability and potentially better performance, but at least
we
2004 Oct 01
3
Reading multiple files into R
I want to read data from a number of files into R.
Reading individual files one by one requires writing an enormous amount of
code that will look something like the following.
****************
maptools:::dbf.read("wb-01vc.dbf")->dist1
maptools:::dbf.read("wb-02vc.dbf")->dist2
maptools:::dbf.read("wb-03vc.dbf")->dist3
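A minimal sketch of the loop the poster wants to avoid writing by hand, assuming the files follow the wb-NNvc.dbf naming shown above (the 1:3 range and the dist names are only illustrative):
library(maptools)
files <- sprintf("wb-%02dvc.dbf", 1:3)            # "wb-01vc.dbf", "wb-02vc.dbf", ...
dists <- lapply(files, maptools:::dbf.read)       # read each file with the call from the excerpt
names(dists) <- paste0("dist", seq_along(files))  # dists$dist1, dists$dist2, ...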
2007 Jun 22
2
One file open or locked way too many times. How to fix?
A Windows 2000 Server is a member server of the domain. The domain
server is CentOS 4.5 with all updates and Samba 3.0.24 built using the
packaging/RHEL/makerpms.sh script. The W2k server is opening this file
on the samba server.
This problem started several versions of CentOS and Samba ago, and I did
the upgrades thinking it would fix it. It did not.
It seems to be "locking" or
2005 Aug 19
1
Summary: Unexpected result of read.dbf
Hi there,
This is a summary of, and a patch for, a bug in read.dbf demonstrated in
Message-Id: <20050818150446.697835cb.stanimura-ngs at umin.ac.jp>.
After consulting Rjpwiki, an online community of R users in Japan, the
cause was found and a patch was proposed.
Overflow occurs when read.dbf reads a dbf file that has a field of
long signed integers. For example,
$
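For context (my illustration, not from the original post): R's integer type is 32-bit signed, which is exactly where this kind of overflow comes from.
.Machine$integer.max     # 2147483647, the largest value a 32-bit signed integer can hold
as.integer(2^31)         # NA, with a warning that the value is out of integer range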
2011 May 16
1
reading multiple .dbf files
Hello..
I am currently working on running random forest to make predictions. For that I
have a bunch of .dbf files from shapefiles. Earlier I was running random forest
on those dbf files individually, but now I have >1,000 such files and processing
them one by one is not an option.
I started by trying to read multiple dbf files as:
>setwd (..)
>a=list.files()
> for (x in
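A hedged sketch of one way the truncated loop could be finished, assuming the .dbf files all share the same columns and using foreign::read.dbf (my choice of reader, not necessarily the poster's):
library(foreign)
setwd("/path/to/dbf/files")                 # placeholder directory
a <- list.files(pattern = "\\.dbf$")        # only the .dbf files
tables <- lapply(a, read.dbf)               # read each file into a data frame
combined <- do.call(rbind, tables)          # stack them for the random forest run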
2002 May 10
2
RODBC for importing dbf
Hi
I know that it is very easy to import data from a dbf file into R, by
saving the data as csv first, for instance.
However, I have several hundred files to process. So, I thought of
using RODBC to read the dbf files and save them as data frames. However, I
cannot even get started (this is my first time using such a package):
> library(RODBC)
> bdades <- odbcConnect("prova.DBF")
Warning
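The warning is expected here: odbcConnect() takes a registered DSN, not a file name. A simpler route that skips ODBC entirely (my suggestion, not from the thread) is foreign::read.dbf:
library(foreign)
bdades <- read.dbf("prova.DBF")   # reads the dBase file straight into a data frame
str(bdades)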
2006 Apr 01
4
-newbie | RODBC import query
Greetings -
After 20+ years of using SAS, for a variety of reasons, I'm using [R]
for a bunch of things - while I'm getting a pretty good handle on
[R] for script programming and statistical analysis, I'm struggling
with 'pulling data into [R]'. For reasons beyond my control, a number
of the files I get sent to 'work with' are in Dbase format (*.dbf).
For
2005 Apr 17
2
Quorum error
We had a problem starting Oracle after expanding an EMC MetaLUN. We get the
following errors:
>WARNING: OemInit2: Opened file(/oradata/dbf/quorum.dbf 8), tid =
main:1024 file = oem.c, line = 491 {Sun Apr 17 10:33:41 2005 }
>ERROR: ReadOthersDskInfo(): ReadFile(/oradata/dbf/quorum.dbf)
failed(5) - (0) bytes read, tid = main:1024 file = oem.c, line = 1396
{Sun Apr 17 10:33:41 2005 }
2012 Aug 29
1
Destination file is larger than source file
We are using the standard -av switch, and both filesystems are the same
type (UFS).
/opt/rsync/bin/rsync -av -e "ssh -l root" --delete
--exclude-from=/var/scripts/exclude
--password-file=/var/scripts/transfer.passwd <username>@<source
host>::<source dir>/ /<destination dir>
Source system
<source host>:<source dir># du -sh *
1K nohup.out
20G
2012 Aug 13
4
write.dbf error: invalid subscript type 'list'
Dear all,
I am basically a GIS user and am new to R.
I am trying to write a data frame to a dbf file.
n.simulations <- 999
binomial <- kulldorff(geo, cases, population, NULL, pop.upper.bound,
n.simulations, alpha.level, plot)
cluster <- binomial$most.likely.cluster$location.IDs.included
df <- data.frame(ID=seq(1,n.simulations,by=1),
simloglkhd=binomial$simulated.log.lkhd)
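The excerpt is cut off before the failing write.dbf call, so this is only a guess at a working shape: foreign::write.dbf wants a data frame whose columns are atomic vectors, so list-valued pieces of the kulldorff result need flattening first (the output file name is a placeholder):
library(foreign)
df <- data.frame(ID         = seq_len(n.simulations),
                 simloglkhd = unlist(binomial$simulated.log.lkhd))
write.dbf(df, "simulations.dbf")   # placeholder file name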
2012 Oct 02
2
Efficient Way to gather data from various files
Hello,
Sorry if this process is too simple for this list. I know I can do it, but
I have always read online that when using R one should try to avoid
loops and use vectors. I am wondering if there is a more "R friendly"
way to do this than using for loops.
I have a dataset that has a list of "ID"s. Let's call this dataset "Master"
Each of these
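A sketch with made-up file names of the usual "R friendly" replacement for an explicit for loop: lapply over the IDs, then bind the pieces once at the end.
master <- read.csv("master.csv")                    # hypothetical file holding the IDs
pieces <- lapply(master$ID, function(id)
    read.csv(sprintf("data_%s.csv", id)))           # hypothetical per-ID file naming
result <- do.call(rbind, pieces)                    # one combined data frame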
2006 Nov 03
2
Installing a package - and making it work?
Dear list
- Probably a beginner question, so bear with my frustration...
I tried to install the 'shapefiles' package into R 2.4.0, but it seems that the install had little effect...
> install.packages(c("shapefiles"))
--- Please select a CRAN mirror for use in this session ---
trying URL
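For what it's worth, installing alone has no visible effect in the session; the usual follow-up (assuming the download above completes) is to attach the package:
install.packages("shapefiles")   # fetch and install from the chosen CRAN mirror
library(shapefiles)              # the package must be loaded before its functions are available
sessionInfo()                    # lists "shapefiles" among the attached packages once loaded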
2006 Mar 16
2
Samba and Foxpro for Unix
Hi!
I have a unique situation.
An SCO Unixware 7.1.3 box with Samba 2.2.7a (yes, I know; it sucks!)
MS Foxpro 2.6 for UNIX runs on the machine (it has been running pretty well
for several years), but due to new demands (like Crystal Reports, Visual
view, etc.) the directories with DBF files were shared using Samba. At the
beginning, when everything was only about reading, there were no