Is my RAID array OK or not?


I’ve just set up my SUSE 11.1 system with software RAID 1 on all partitions. Now I wanted to set up RAID monitoring, but mdadm gave me this suspicious-looking message:

susebox:/ mdadm --monitor /dev/md1
Mar 10 11:43:34: DegradedArray on /dev/md1 unknown device

I got the same message for all arrays. Does this mean that all arrays are degraded?

mdadm --detail gives me the following:

susebox:/ mdadm --detail /dev/md1
        Version : 1.00
  Creation Time : Sun Feb 22 14:13:26 2009
     Raid Level : raid1
     Array Size : 2096468 (2047.68 MiB 2146.78 MB)
  Used Dev Size : 2096468 (2047.68 MiB 2146.78 MB)
   Raid Devices : 2
  Total Devices : 1
    Persistence : Superblock is persistent

  Intent Bitmap : Internal

    Update Time : Tue Mar 10 11:51:08 2009
          State : active, degraded

 Active Devices : 1
Working Devices : 1
 Failed Devices : 0
  Spare Devices : 0

           Name : linux:1
           UUID : aff88832:e4618dcb:f6ea008f:376d32c7
         Events : 958

Number   Major   Minor   RaidDevice State
   0       8        2        0      active sync   /dev/sda2
   1       0        0        1      removed

/proc/mdstat shows the following:

Personalities : [raid1] [raid0] [raid6] [raid5] [raid4]
md3 : active raid1 sda4[0]
      953594164 blocks super 1.0 [2/1] [U_]
      bitmap: 455/455 pages [1820KB], 1024KB chunk

md0 : active raid1 sda1[0]
      96376 blocks super 1.0 [2/1] [U_]
      bitmap: 4/6 pages [16KB], 8KB chunk

md1 : active raid1 sda2[0]
      2096468 blocks super 1.0 [2/1] [U_]
      bitmap: 7/8 pages [28KB], 128KB chunk

md2 : active raid1 sda3[0]
      20964752 blocks super 1.0 [2/1] [U_]
      bitmap: 160/160 pages [640KB], 64KB chunk

So, is my RAID array degraded or not? If yes, what do I need to do? If not, how can I properly set up monitoring? I’ve looked at a number of different tutorials, but I’m still confused.

Thanks in advance for your help!

It looks like you don’t have RAID1 set up at all. For RAID1 you have to devote two partitions to each logical RAID1 device; that is, you lose half the disk space to get mirroring. You can’t just declare a single partition to be RAID1: you have to say that these two partitions together form the logical RAID1 device, so that if one fails, your data is still intact. Naturally it makes sense to put those two partitions on different disks; otherwise, if the one disk fails, you lose everything.


Thanks for your reply.

While I might appear computer-illiterate from my original post, I am fully aware of what a RAID 1 array is, that it means losing half of my disk space, and that I need to dedicate two partitions to each RAID device. I am not at all new to the concept of running RAID arrays; I have successfully worked with them before on Windows systems. It’s just setting up and managing Linux software RAID that I am totally new to.

I set up the RAID array during installation, as described in this tutorial:
How to install openSUSE on software RAID - openSUSE

There are two identical hard disks in my system (sda and sdb). Of course I partitioned both of them identically. According to the setup I created at installation, both sda and sdb should be used by the four RAID devices.

Unfortunately, the system did not boot from disk after installation (see software raid 1 does not boot without Opensuse DVD - openSUSE Forums for details), so I am left with no other option but to boot the system from the SUSE Linux DVD. I’m not sure whether this might be related to my current problem.

Well, there is no sign of the second partition in any of your RAID1 arrays. I don’t know how that happened; I didn’t even think the installer would let you set up a RAID1 array halfway. Somehow you have to reattach the second partitions.
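You can see it directly in /proc/mdstat, too: [2/1] means two configured members with only one active, and the _ in [U_] is the missing member. A quick check, run here against a sample copied from your listing so it doesn’t need the live arrays:

```shell
# Any "_" inside the status brackets marks a missing array member.
sample='md1 : active raid1 sda2[0]
      2096468 blocks super 1.0 [2/1] [U_]'
if echo "$sample" | grep -q '\[U*_U*\]'; then
  echo "degraded"   # → degraded
fi
```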

Booting off disks in RAID1 is a separate issue: it is possible to have RAID1 and yet only be able to boot off the first disk, due to the GRUB setup not having been done on both disks. The installer is supposed to take care of that, but sometimes it fails. It is always possible to fix that up using grub from the CLI.

Actually, on further thought, I think it has everything to do with booting off the DVD. The DVD doesn’t have the scripts to form the arrays. What you have to do is manually reassemble the arrays using mdadm (instructions for detecting and reassembling arrays are in the man page), then install the boot loader properly on the disk so that it boots off the disk. Then the arrays will be assembled at boot time.
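For your layout, each mdN pairs with partition N+1 on both disks. Here is a sketch that just prints the assemble commands rather than running them (the pairing is an assumption read off the /proc/mdstat listing above; double-check against your own output):

```shell
# Print (not run) the assemble command for each of the four arrays;
# mdN pairs with /dev/sda(N+1) and /dev/sdb(N+1) per the listings above.
for n in 0 1 2 3; do
  part=$((n + 1))
  echo "mdadm --assemble /dev/md$n /dev/sda$part /dev/sdb$part"
done
# → mdadm --assemble /dev/md0 /dev/sda1 /dev/sdb1
#   ... and so on through md3
```

In practice, `mdadm --assemble --scan` will usually discover and assemble everything from the superblocks on its own.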

I have just re-added the partitions on /dev/sdb like this:

linuxbox:/ mdadm --add /dev/md0 /dev/sdb1
mdadm: re-added /dev/sdb1

Now I am waiting for them to rebuild (it takes a while, since it’s still a Pentium III machine…). I’ll try rebooting afterwards to see whether sdb gets dropped from the arrays again, and will update this post with the results.
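Rebuild progress shows up in /proc/mdstat. A typical recovery line looks like the sample below (the shape is standard mdstat output, not copied from my machine), and the percentage can be pulled out with grep:

```shell
# Sample recovery line (typical shape, not real output from this box):
line='      [=>...................]  recovery =  8.5% (180224/2096468) finish=12.3min speed=2560K/sec'
echo "$line" | grep -o 'recovery = *[0-9.]*%'   # → recovery =  8.5%
```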

I think you have to reinstall the boot loader, otherwise the same thing will happen again: it will not boot off the disk and will not assemble the arrays at boot time.

I’d like to, but I was unable to do this even after posting in the Suse Forums:

software raid 1 does not boot without Opensuse DVD - openSUSE Forums

It seems nobody could tell me how to install the boot loader or what might cause my current problem booting the system properly without the installation DVD. If you know any solution, I’d be more than happy to give it a try, but right now I’m stuck.

Do it from the CLI:

# grub
grub> device (hd0) /dev/sda
grub> root (hd0,0)
grub> setup (hd0)

This assumes that /boot is on /dev/sda1; change the second 0 in root (hd0,0) if not. At the setup step, you should get some messages about finding the various stages of GRUB.

Then repeat with sdb, but sticking with hd0, because when the machine boots off sdb, that disk is effectively hd0 to GRUB.
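Spelled out, the second-disk pass can also be done non-interactively using legacy GRUB’s batch mode; this is just a sketch of the same three commands, not something I can test here:

```shell
# Same setup for the second disk; note hd0 again, because when the
# BIOS falls back to sdb, that disk is hd0 as far as GRUB is concerned.
grub --batch <<'EOF'
device (hd0) /dev/sdb
root (hd0,0)
setup (hd0)
EOF
```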

If you do not get the messages at the setup step, then your BIOS is unable to boot off the disk for one reason or another.

Thanks a lot for the info on how to set up grub!

I’ve installed grub on both disks and am now waiting for the last partition to finish rebuilding before I try rebooting the system.

I just tried rebooting, and the system still does not boot off the hard disk. The BIOS complains about no active partition being on the hard disk, and the error message appears twice, so it does seem to try booting from both hard disks.

I’ve tried activating the boot partitions on both disks via fdisk, but the error message still did not go away.

On the bright side, I can still boot via the installation DVD, and the RAID arrays stay in sync after doing so, so the booting problem is merely cosmetic.

Thanks very much for your help!

As I recall, you are using an add-on card for the SATA drives, right? It may be that the BIOS is not able to boot off this interface, so you may have to continue using the workaround of booting off the DVD.