hmmmm
2010-May-24 14:15 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
Hi!
I had 6 disks in a raidz1 pool and replaced the 1TB drives with 2TB drives.
I have installed the older 1TB drives in another system and would like to
import the old pool, to access some files I accidentally deleted from the new pool.
The first system (with the 2TB drives) is an OpenSolaris system and the other
is running EON Solaris (based on snv_130).
I think the problem is that the drives get different device IDs in the EON
system, and that I didn't export the pool when I replaced the 1TB drives.
Only one drive shows up as ONLINE. Is this because it is the only one
connected "in the right order"? I don't remember the order in which the drives
were connected to the controller in the OpenSolaris system.
What can I do to import this pool?
HELP!
eon:1:~#uname -a
SunOS eon 5.11 snv_130 i86pc i386 i86pc
eon:2:~#format
Searching for disks...done
AVAILABLE DISK SELECTIONS:
0. c1d0 <SAMSUNG-S13PJDWS25695-0001-931.51GB>
   /pci@0,0/pci-ide@d/ide@0/cmdk@0,0
1. c2d0 <SAMSUNG-S13PJDWS25725-0001-931.51GB>
   /pci@0,0/pci-ide@d/ide@1/cmdk@0,0
2. c3d0 <SAMSUNG-S13PJDWS25695-0001-931.51GB>
   /pci@0,0/pci-ide@d,1/ide@0/cmdk@0,0
3. c4d0 <SAMSUNG-S13PJDWS25695-0001-931.51GB>
   /pci@0,0/pci-ide@d,1/ide@1/cmdk@0,0
4. c5d0 <SAMSUNG-S13PJ1KQ40672-0001-931.51GB>
   /pci@0,0/pci-ide@d,2/ide@0/cmdk@0,0
5. c6d0 <SAMSUNG-S13PJ1KQ40672-0001-931.51GB>
   /pci@0,0/pci-ide@d,2/ide@1/cmdk@0,0
Specify disk (enter its number):
eon:3:~#zpool import
pool: videodrome
id: 5063071388564101079
state: UNAVAIL
status: The pool was last accessed by another system.
action: The pool cannot be imported due to damaged devices or data.
see: http://www.sun.com/msg/ZFS-8000-EY
config:
        videodrome   UNAVAIL  insufficient replicas
          raidz1-0   UNAVAIL  insufficient replicas
            c1t0d0   UNAVAIL  cannot open
            c1t1d0   UNAVAIL  cannot open
            c10t0d0  UNAVAIL  cannot open
            c0t1d0   UNAVAIL  cannot open
            c11t0d0  UNAVAIL  cannot open
            c1d0     ONLINE
--
This message posted from opensolaris.org
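[Editor's note: when device names differ between hosts, one commonly suggested workaround is to gather the current device nodes into a single directory and point `zpool import -d` at it, so ZFS scans those nodes instead of the stale paths in the labels. A guarded sketch, using the device names from the `format` output above; the directory name is an arbitrary choice:]

```shell
# Collect the EON host's device nodes in one directory and let
# 'zpool import' scan only there. Guarded so the sketch is harmless
# on hosts without the ZFS tools.
DEVDIR=/tmp/videodrome-devs
mkdir -p "$DEVDIR"
for d in c1d0s0 c2d0s0 c3d0s0 c4d0s0 c5d0s0 c6d0s0; do
    # link whatever node exists; silently skip on non-Solaris hosts
    [ -e "/dev/rdsk/$d" ] && ln -sf "/dev/rdsk/$d" "$DEVDIR/$d"
done
if command -v zpool >/dev/null 2>&1; then
    zpool import -d "$DEVDIR" || true
else
    echo "zpool not available on this host"
fi
```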
Mark J Musante
2010-May-24 14:31 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
On Mon, 24 May 2010, hmmmm wrote:

> i had 6 disks in a raidz1 pool that i replaced from 1TB drives to 2TB
> drives. i have installed the older 1TB drives in another system and
> would like to import the old pool to access some files i accidentally
> deleted from the new pool.

Did you use the 'zpool replace' command to do the replace? If so, once the
replace completes, the ZFS label on the original disk is overwritten to make
it available for new pools.

Regards,
markm
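[Editor's note: a hypothetical illustration of the replacement sequence markm describes; the device names `c1t0d0` and `c7t0d0` are examples, not taken from this pool:]

```shell
# Resilver a pool member onto a new disk; once the resilver completes,
# the old disk is detached and its ZFS label is released for reuse.
# Guarded so the sketch runs harmlessly where zpool is absent.
if command -v zpool >/dev/null 2>&1; then
    zpool replace videodrome c1t0d0 c7t0d0   # start resilver onto new disk
    zpool status -v videodrome               # watch until resilver finishes
    shown="live"
else
    echo "zpool not available here; the sequence is shown in the comments"
    shown="fallback"
fi
```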
hmmmm
2010-May-24 14:41 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
Yes, I used 'zpool replace'. Why is one drive recognized? Shouldn't the labels be wiped on all of them? Am I screwed?
hmmmm
2010-May-24 16:30 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
But... wait... that can't be. I disconnected the 1TB drives and plugged in the 2TB drives before running the replace command. No information could have been written to the 1TB drives at all, since they were physically offline.
Mark J Musante
2010-May-24 16:59 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
On Mon, 24 May 2010, hmmmm wrote:

> but...wait......that cant be..... i disconnected the 1TB drives and
> plugged in the 2TB's before doing replace command. no information could
> be written to the 1TBs at all since it is physically offline.

Do the labels still exist? What does 'zdb -l /dev/rdsk/<disk>' show?

Regards,
markm
hmmmm
2010-May-25 14:27 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
eon:1:~#zdb -l /dev/rdsk/c1d0
--------------------------------------------
LABEL 0
--------------------------------------------
failed to unpack label 0
--------------------------------------------
LABEL 1
--------------------------------------------
failed to unpack label 1
--------------------------------------------
LABEL 2
--------------------------------------------
failed to unpack label 2
--------------------------------------------
LABEL 3
--------------------------------------------
failed to unpack label 3

Same for the other five drives in the pool... what now?
eXeC001er
2010-May-25 14:38 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
Try 'zdb -l /dev/rdsk/c1d0s0' (slice 0 rather than the whole disk).
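[Editor's note: extending that suggestion to all drives, a small guarded loop can check slice 0 of every disk from the `format` listing for surviving labels; the `grep` fields are just a readability filter:]

```shell
# Check slice 0 of each drive for ZFS labels; guarded so the sketch
# is harmless on hosts without zdb.
checked=0
for d in c1d0 c2d0 c3d0 c4d0 c5d0 c6d0; do
    echo "== $d =="
    if command -v zdb >/dev/null 2>&1; then
        zdb -l "/dev/rdsk/${d}s0" | grep -E 'name:|pool_guid:|path:' || true
    fi
    checked=$((checked + 1))
done
echo "checked $checked devices"
```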
hmmmm
2010-May-25 19:50 UTC
[zfs-discuss] cannot import pool from another system, device-ids different! please help!
eon:6:~#zdb -l /dev/rdsk/c1d0s0
--------------------------------------------
LABEL 0
--------------------------------------------
    version: 22
    name: 'videodrome'
    state: 0
    txg: 55561
    pool_guid: 5063071388564101079
    hostid: 919514
    hostname: 'Videodrome'
    top_guid: 15080595385902860350
    guid: 12602499757569516679
    vdev_children: 1
    vdev_tree:
        type: 'raidz'
        id: 0
        guid: 15080595385902860350
        nparity: 1
        metaslab_array: 23
        metaslab_shift: 35
        ashift: 9
        asize: 6001149345792
        is_log: 0
        children[0]:
            type: 'disk'
            id: 0
            guid: 5800353223031346021
            path: '/dev/dsk/c1t0d0s0'
            devid: 'id1,sd@AWDC_WD20EADS-00S2B0=_____WD-WCAVY1123096/a'
            phys_path: '/pci@0,0/pci1043,8239@5/disk@0,0:a'
            whole_disk: 1
            DTL: 30
        children[1]:
            type: 'disk'
            id: 1
            guid: 11924500712739180074
            path: '/dev/dsk/c1t1d0s0'
            devid: 'id1,sd@AWDC_WD20EADS-00S2B0=_____WD-WCAVY1089951/a'
            phys_path: '/pci@0,0/pci1043,8239@5/disk@1,0:a'
            whole_disk: 1
            DTL: 31
        children[2]:
            type: 'disk'
            id: 2
            guid: 6297108650128259181
            path: '/dev/dsk/c10t0d0s0'
            devid: 'id1,sd@AWDC_WD20EADS-00S2B0=_____WD-WCAVY1089667/a'
            phys_path: '/pci@0,0/pci1043,8239@5,1/disk@0,0:a'
            whole_disk: 1
            DTL: 32
        children[3]:
            type: 'disk'
            id: 3
            guid: 828343558065682349
            path: '/dev/dsk/c0t1d0s0'
            devid: 'id1,sd@AWDC_WD20EADS-00S2B0=_____WD-WCAVY1098856/a'
            phys_path: '/pci@0,0/pci1043,8239@5,1/disk@1,0:a'
            whole_disk: 1
            DTL: 33
        children[4]:
            type: 'disk'
            id: 4
            guid: 16604516587932073210
            path: '/dev/dsk/c11t0d0s0'
            devid: 'id1,sd@AWDC_WD20EADS-00S2B0=_____WD-WCAVY1117911/a'
            phys_path: '/pci@0,0/pci1043,8239@5,2/disk@0,0:a'
            whole_disk: 1
            DTL: 34
        children[5]:
            type: 'disk'
            id: 5
            guid: 12602499757569516679
            path: '/dev/dsk/c11t1d0s0'
            devid: 'id1,sd@ASAMSUNG_HD103UJ=S13PJDWS256953/a'
            phys_path: '/pci@0,0/pci1043,8239@5,2/disk@1,0:a'
            whole_disk: 1
            DTL: 57
--------------------------------------------
LABEL 1 / LABEL 2 / LABEL 3
--------------------------------------------
[identical to LABEL 0 above; repeated output trimmed]
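[Editor's note: the labels survive on slice 0, so a plausible next step, an assumption rather than something confirmed in the thread, is to force the import: the "last accessed by another system" status (message ZFS-8000-EY) is a hostid mismatch, which `zpool import -f` overrides. Importing by pool GUID, taken from the `zpool import` output above, avoids any ambiguity over the pool name. Guarded sketch:]

```shell
# Force-import the pool by its GUID; -f overrides the hostid mismatch
# reported as ZFS-8000-EY. Guarded so the sketch is harmless on hosts
# without zpool.
POOL_GUID=5063071388564101079
echo "would run: zpool import -f $POOL_GUID"
if command -v zpool >/dev/null 2>&1; then
    zpool import -f "$POOL_GUID" || true
fi
```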