
9/30/11

VMware ESXi Lab Improvements: Separate Storage Network with Jumbo Frames Enabled

I recently decided to upgrade my vSphere 5 lab and move my storage off to a separate network. The following diagram describes my new topology.


Before doing this I had everything on the same network. By isolating all of the storage traffic I was able to enable Jumbo Frames and bring this environment a bit closer to VMware's best practices for NFS storage.

To create the new network, I first changed the Iomega IX4-200D's network settings:


I put each Gigabit network card on a separate network: the 192.x.x.x network is for managing the ESXi environment and the 10.x.x.x network is the storage network. Next I enabled Jumbo Frames on the NIC and set the size to 9000.

Following this change I created a new VMkernel port on each ESXi host using the vSphere Client, and I added 2 physical network cards to each one.
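
For anyone who prefers the ESXi shell over the vSphere Client, the same setup can roughly be done from the command line. This is only a sketch: the vSwitch name, port group name, vmnic numbers, and IP address below are examples based on my topology and will differ in other environments.

```
# Create a vSwitch dedicated to storage and link the two physical NICs to it
esxcfg-vswitch -a vSwitch1
esxcfg-vswitch -L vmnic2 vSwitch1
esxcfg-vswitch -L vmnic3 vSwitch1

# Add a port group for the storage VMkernel traffic
esxcfg-vswitch -A "Storage" vSwitch1

# Create the VMkernel interface on the 10.x.x.x storage network
# (the MTU is raised in the next step)
esxcfg-vmknic -a -i 10.0.0.11 -n 255.255.255.0 "Storage"
```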



I made sure to enable Jumbo Frames on both the new vSwitch that is created for the VMkernel and the VMkernel interface itself; I set them both to 9000 to match the Iomega settings.
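
The same MTU settings can also be applied and verified from the ESXi shell. Again this is just a sketch; the vSwitch name, vmk interface, and NAS IP address are assumptions from my lab, so adjust them to your own environment.

```
# Raise the MTU on the storage vSwitch and on the VMkernel interface
esxcli network vswitch standard set --vswitch-name=vSwitch1 --mtu=9000
esxcli network ip interface set --interface-name=vmk1 --mtu=9000

# Verify jumbo frames end to end against the NAS:
# 8972 bytes of payload + 28 bytes of IP/ICMP headers = a 9000 byte frame
vmkping -s 8972 10.0.0.1
```

If the vmkping with the large payload fails while a normal vmkping works, something in the path (the vSwitch, the physical switch, or the NAS) is still running at the default 1500 MTU.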


I enabled NIC Teaming on the two network cards.
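
For reference, the uplink side of the teaming can also be set from the command line. This is a hedged sketch: the vmnic numbers and vSwitch name are assumptions, and the load-balancing policy you choose (port ID vs. IP hash) depends on whether the physical switch is configured with a matching static EtherChannel.

```
# Both physical NICs are already linked to the storage vSwitch (see above);
# make them both active uplinks for the team
esxcli network vswitch standard policy failover set \
    --vswitch-name=vSwitch1 --active-uplinks=vmnic2,vmnic3
```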


Note: Although NIC Teaming will not increase speed for a single NFS datastore (each datastore is a single NFS session that gets pinned to one physical NIC), it will help when there are multiple NFS datastores, which is why VMware recommends using multiple NFS datastores.


Once the network and NAS settings are complete, the NFS datastores can be created normally.
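
They can be added through the vSphere Client, or from the ESXi shell with esxcli. The NAS IP, export path, and datastore name below are only placeholders based on my setup.

```
# Mount an NFS export from the Iomega as a datastore
# (host, share path, and volume name are examples only)
esxcli storage nfs add --host=10.0.0.1 --share=/nfs/VMware1 --volume-name=VMware1

# Confirm the datastore is mounted
esxcli storage nfs list
```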









9/8/11

Iomega IX4-200D Thoughts

I've owned the Iomega IX4-200D for almost 2 years now; since this is a core piece of my home infrastructure, I felt some feedback would be appropriate.

The NAS is not designed for speed; in most reviews I've read it's usually ranked near the bottom of the pack. That said, I still think it's a great device with a ton of functionality. The unit ships with Seagate Barracuda hard drives that spin at 5900 RPM, the low-power edition of the product line. I use mine (I actually own 2) for a VMware ESXi 5.0 environment that I run; my environment consists of two 3.2 GHz quad-core Phenom II white boxes, each with 8 GB of RAM and 5 Intel Pro 1000 NICs. The NAS comes with 2 Gigabit NICs that can be teamed for performance or used individually for redundancy, which I found to be pretty rare among home NAS devices on the market.

The Iomega is certified by VMware to run vSphere and supports both iSCSI and NFS. I purchased the unit with four 500 GB Seagate Barracudas and quickly ordered four 2 TB Barracudas to replace the original drives. Switching out the drives is actually not supported by Iomega and will void the warranty; however, I kept the 500 GB drives so if I ever have to ship it back, I can put the original drives back in.

The operating system of the NAS is actually striped across all disks in the device, so you must be careful to switch only one drive at a time and wait until the array is completely rebuilt before swapping the next one. I have had a drive fail once, and it was very easy to install a new drive and let it rebuild the array. On an 8 TB RAID 5 array that is about half full it takes about 24 hours to rebuild, but the nice thing is that the array is still accessible during the reconstruction.

That's all I can think of right now; I'll keep adding to this post as I have time...