Make the install DVD
First, download the OpenSuse ISO image from the official site.
Make sure you download the correct version. If you have a DVD writer, you can use Windows or a disc-burning program to create a bootable OpenSuse DVD. If you do not, you have to create a bootable USB stick.
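On Linux, the USB stick can be written with `dd`. This is a minimal sketch, not a tested recipe: the ISO filename and the device name `/dev/sdX` are placeholders you must replace after checking with `lsblk`, and the block-device guard keeps the destructive command from running against a device that does not exist.

```shell
# Placeholders: replace with your real ISO name and USB device (see `lsblk`).
ISO=openSUSE-DVD-x86_64.iso
DEV=/dev/sdX

# Write the image only if the target really is a block device,
# because dd overwrites everything on it.
if [ -b "$DEV" ]; then
    sudo dd if="$ISO" of="$DEV" bs=4M oflag=sync status=progress
fi
```

Double-check the device name before running this: pointing `dd` at the wrong disk destroys its contents.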
Verify the BIOS setup
You must change the BIOS setup to use the DVD drive as the primary boot device and one of the hard disks as the second boot option. Alternatively, you can press a shortcut key to open the BIOS boot menu; this can be F9, F10, or another key, depending on your motherboard's BIOS.
For UEFI Setup
If your motherboard has UEFI support, you can try a UEFI setup. For this you must boot from the DVD-ROM in UEFI mode. The setup will then be able to use a GPT disk partition table.
For UEFI you need to create a small /boot partition of 512 MB and a small /boot/efi partition of 100 MB. The /boot/efi partition must be a FAT32 partition on a GPT disk, or you will not be able to boot the OS in EFI mode.
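As an illustration, that layout could be written as an `sfdisk` script (a sketch only: the sizes follow the text, and the type GUIDs are the standard ones for the EFI System Partition and a Linux filesystem):

```
label: gpt
# /boot/efi: EFI System Partition, must be formatted FAT32
size=100MiB, type=C12A7328-F81F-11D2-BA4B-00A0C93EC93B
# /boot
size=512MiB, type=0FC63DAF-8483-4772-8E79-3D69D8477DE4
```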
In this article I will cover only a legacy BIOS mode install, not a UEFI install.
Configure DISK using LVM
In the next video I show my attempt to install OpenSuse Linux using LVM (Logical Volume Manager). This install failed because of partitions left on the disk from a previous installation.
Configure DISK using RAID
In my next attempt I cleaned the disks first, using a live DVD with the GParted program. Then I created the RAID array and formatted the partitions using the OpenSuse live DVD. In the next video you can see a successful install…
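From the command line, the same cleanup and array creation could be sketched like this. The four member disks `/dev/sdb`…`/dev/sde` are hypothetical names (verify with `lsblk`), and the guard keeps the sketch from running on a machine that does not have them; in the video the equivalent steps are done through GParted and the installer.

```shell
# Hypothetical member disks; check yours with `lsblk` before adapting this.
RAID10_DISKS="/dev/sdb /dev/sdc /dev/sdd /dev/sde"

if [ -b /dev/sdb ]; then
    # Wipe leftover partition signatures (old partitions caused my failed LVM install)
    for d in $RAID10_DISKS; do sudo wipefs -a "$d"; done

    # Build the 4-disk RAID 10 array and put XFS on it
    sudo mdadm --create /dev/md0 --level=10 --raid-devices=4 $RAID10_DISKS
    sudo mkfs.xfs /dev/md0

    cat /proc/mdstat   # the initial resync runs in the background
fi
```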
In this video I explain how to use RAID to place several Linux mount points on different RAID arrays. I used this configuration:
- Compact Flash 4 GB for /boot partition
- RAID 10 array for /opt
- RAID 1 array for /var
- RAID 1 array for /home
- Single 500 GB disk as the primary disk
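Mapped to devices, the layout above could look like this in /etc/fstab. The md numbers and the Compact Flash device `/dev/sdf1` are hypothetical, and real entries would normally use UUIDs:

```
# 4 GB Compact Flash card for /boot
/dev/sdf1  /boot  ext2  defaults  0  2
# RAID 10 array
/dev/md0   /opt   xfs   defaults  0  2
# RAID 1 arrays
/dev/md1   /var   xfs   defaults  0  2
/dev/md2   /home  xfs   defaults  0  2
```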
The primary disk partitions:
- swap (16 GB)
- /tmp (16 GB)
- / (20 GB)
- /home (412 GB)
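The primary-disk split could be expressed as an `sfdisk` script. This is a sketch: `/dev/sda` is assumed, with a legacy DOS label since this is a BIOS install:

```
label: dos
# swap
size=16GiB, type=82
# /tmp
size=16GiB, type=83
# /
size=20GiB, type=83
# /home takes the rest of the disk (about 412 GB)
type=83
```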
I decided that the primary disk does not need to be RAID. This is because the primary disk performs well and this particular PC will be used for development. If the primary disk fails, I will have to reinstall Linux.
The primary disk is used only to load programs and to keep the current configuration files. Some of the disk load is distributed to the other disks, so the primary disk will not be heavily used and should have a longer life.
RAID 10 Benchmark Test
After the install I ran some benchmark tests. I did this benchmark twice, and I have 2 YouTube videos. The first time, I ran the benchmark immediately after formatting the RAID array with XFS. The array was still syncing in the background, so the test showed poor performance.
Theoretically, RAID 10 read performance is 4 times that of one disk, and write performance is 2 times that of one disk. RAID 10 capacity is 2 times that of one disk (with 4 disks). In practice, software RAID 10 does not give the expected performance for all samples; it depends on the sample size.
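The theoretical scaling can be spelled out with shell arithmetic, using the single-disk figures measured later in this article (92 MB/s read, 84 MB/s write) as inputs:

```shell
n=4               # disks in the array: two mirrored pairs, striped
single_read=92    # MB/s for one disk
single_write=84   # MB/s for one disk

# Reads can be served by all 4 disks; each write must go to both disks
# of a mirror pair, so only 2 of the 4 disks add write throughput.
# Mirroring also halves the usable capacity.
echo "theoretical read:  $(( single_read * n )) MB/s"
echo "theoretical write: $(( single_write * n / 2 )) MB/s"
echo "usable capacity:   $(( n / 2 )) disks' worth"
```

Comparing these theoretical values with the measured 143 MB/s read and 73 MB/s write below shows how far software RAID 10 falls short at this sample size.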
RAID 10 performance for a sample size of 10 MB:
- Read: 143 MB/s
- Write: 73 MB/s
- Access time: 15 msec
All 4 disks in this RAID are the same model, Scorpion WD500BPKX:
- Read 92 MB/s
- Write 84 MB/s
- Access time: 16.34 msec
These tests show that a RAID 10 array is not a good choice for a desktop PC that has only 2 or 3 disks, for example. We can use a RAID 10 array in servers to increase capacity and redundancy, but we need 4, 6, or 8 disks.