Hi,
All the pools seem healthy and zfs file systems are all fine according to
"zpool status -x" but during the boot we get the following error:
SUNW-MSG-ID: ZFS-8000-CS, TYPE: Fault, VER: 1, SEVERITY: Major
EVENT-TIME: Fri Oct 26 18:25:33 EDT 2007
PLATFORM: SUNW,Sun-Fire-T200, CSN: -, HOSTNAME: myserver
SOURCE: zfs-diagnosis, REV: 1.0
EVENT-ID: 8168b743-76b2-ccc0-9816-e7387c93267b
DESC: A ZFS pool failed to open. Refer to http://sun.com/msg/ZFS-8000-CS for more information.
AUTO-RESPONSE: No automated response will occur.
IMPACT: The pool data is unavailable
REC-ACTION: Run 'zpool status -x' and either attach the missing device or restore from backup.

SUNW-MSG-ID: ZFS-8000-CS, TYPE: Fault, VER: 1, SEVERITY: Major
EVENT-TIME: Fri Oct 26 18:25:34 EDT 2007
PLATFORM: SUNW,Sun-Fire-T200, CSN: -, HOSTNAME: myserver
SOURCE: zfs-diagnosis, REV: 1.0
EVENT-ID: 5c6e92ad-560b-ee04-b324-8f32ccccf869
DESC: A ZFS pool failed to open. Refer to http://sun.com/msg/ZFS-8000-CS for more information.
AUTO-RESPONSE: No automated response will occur.
IMPACT: The pool data is unavailable
REC-ACTION: Run 'zpool status -x' and either attach the missing device or restore from backup.

fmadm faulty also returns this:

-------- ----------------------------------------------------------------------
degraded zfs://pool=pe09_01
         8f5e62aa-c0af-4536-cef3-8e9d9169ea92
-------- ----------------------------------------------------------------------
degraded zfs://pool=re09_01
         eca6f995-12ba-ce3b-9a5b-d33f1b6580ac
-------- ----------------------------------------------------------------------
Is my pool bad, or is this just a bug? How can I make it go away?
Thanks,
This message posted from opensolaris.org