what are people installing VMware on, typically?

xSauronx

Lifer
Jul 14, 2000
19,582
4
81
We are using VMware more and more with our clients, and while I run it from a flash drive at home (white box, el cheapo setup), the tech manager usually has me setting up a RAID1 array for the host OS. Even if I use some of that as a datastore for a few ISOs, there's a lot of unused space there.

I seem to remember from a class last year that some vendors offered VMware preinstalled on a CF card or something, but what is the best practice, or common practice, for situations where you have just a single server and several drives? I hate the idea of giving up a lot of storage, but I do like having the RAID redundancy.
 

Paperlantern

Platinum Member
Apr 26, 2003
2,239
6
81
Not really much reason to RAID the host drives, honestly (unless you need major uptime). I run it off a PNY 1GB thumb stick; if it fails, just throw another thumb drive in its place, slap ESXi back on it, boot it up, mount the VMs, and away you go. Definitely RAID the datastores in some fashion (depending on available drive bays in the host) on a business production server; that's absolutely the best practice there.

At home I don't, because ESXi doesn't recognize the onboard controller in my ProLiant DL320 G5, but I take OVF templates of any machine I get running fairly often, so if I do lose a datastore I can be back up pretty quickly. In a production business environment, though, I would never leave it on single drives, no way.

The OS disk on the host, though, depends on the uptime needed. If you need the five nines... yeah, RAID the OS disk too, so that if you have a failure you can fix it with no downtime.
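The OVF-template habit described above can be scripted with VMware's ovftool. A rough sketch, where the host name, VM name, and paths are all placeholders and the exact locator syntax may vary by ovftool version:

```shell
# Export a VM from the host to a local OVF template
# (host, credentials, VM name, and paths are placeholders)
ovftool vi://root@esxi01.example.com/fileserver /backups/fileserver.ovf

# After rebuilding a lost datastore, deploy the template back to the host
ovftool /backups/fileserver.ovf vi://root@esxi01.example.com
```

Run on a schedule, this gives you restorable copies of each VM without needing RAID on the home box.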
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
One thing to remember is that once the memory stick is used to boot, it goes dormant and ESXi runs from a RAM disk (upgrades aside). You can use the vSphere CLI to back up the host config for a quick restore, or use any sector-by-sector copier to snap a ".bin" image and push it back to another stick as needed.
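The config backup mentioned here can be done with the vSphere CLI's vicfg-cfgbackup command. A hedged sketch, where the host name and file paths are placeholders:

```shell
# Save the host configuration to a local bundle
# (host and username are placeholders; you will be prompted for a password)
vicfg-cfgbackup --server esxi01.example.com --username root -s /backups/esxi01-cfg.tgz

# After reinstalling ESXi on a fresh stick, restore the saved bundle
# (the restore reboots the host when it completes)
vicfg-cfgbackup --server esxi01.example.com --username root -l /backups/esxi01-cfg.tgz
```

The backup file is small, so keeping one per host alongside the install media makes a stick failure a minutes-long fix.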
 

yinan

Golden Member
Jan 12, 2007
1,801
2
71
If you are using ESXi correctly in a production environment, you will not use the local storage for anything but the hypervisor itself, so it really doesn't matter what you install it on. Install it on the cheapest thing you are allowed to.

The VMs should ALWAYS live on remote RAIDed storage.
 

xSauronx

Lifer
Jul 14, 2000
19,582
4
81
If you are using ESXi correctly in a production environment, you will not use the local storage for anything but the hypervisor itself, so it really doesn't matter what you install it on. Install it on the cheapest thing you are allowed to.

The VMs should ALWAYS live on remote RAIDed storage.

Well, the VMs do. For instance, we set up a server a couple of weeks ago that had 8x 300GB drives, so I was told to use two in RAID1 for the ESXi install and six in RAID5 for VM storage.

I personally run ESXi 5 from a flash drive at home, but if VMware dies at home it's not a critical issue for me. It would be for customers, so having redundancy is good, but I'm trying to get the tech manager to look into something other than giving up two entire drives for the install, because it wastes an insane amount of space.
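The waste in that layout is easy to put in numbers. A quick back-of-envelope sketch using the 8x 300GB example above (raw capacities, ignoring formatting overhead):

```python
drive_gb = 300  # 8x 300GB drives in the example server

# RAID1 mirror of two drives for the ESXi install: usable space = one drive
raid1_usable = drive_gb

# RAID5 across the remaining six drives: usable space = (n - 1) drives
raid5_usable = (6 - 1) * drive_gb

# Share of the box's raw capacity handed to the hypervisor mirror
fraction_on_install = (2 * drive_gb) / (8 * drive_gb)

print(raid1_usable, raid5_usable, fraction_on_install)  # 300 1500 0.25
```

A quarter of the chassis's raw capacity ends up mirroring an install that is around a gigabyte in size.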
 

yinan

Golden Member
Jan 12, 2007
1,801
2
71
Well, the VMs do. For instance, we set up a server a couple of weeks ago that had 8x 300GB drives, so I was told to use two in RAID1 for the ESXi install and six in RAID5 for VM storage.

I personally run ESXi 5 from a flash drive at home, but if VMware dies at home it's not a critical issue for me. It would be for customers, so having redundancy is good, but I'm trying to get the tech manager to look into something other than giving up two entire drives for the install, because it wastes an insane amount of space.

You are still using the LOCAL drives for VMs, which should NEVER be done in a production environment. It means you do not get to take advantage of HA or vMotion. Whether the local drives go unused really doesn't matter.
 

stash

Diamond Member
Jun 22, 2000
5,468
0
0
You are still using the LOCAL drives for VMs, which should NEVER be done in a production environment. It means you do not get to take advantage of HA or vMotion. Whether the local drives go unused really doesn't matter.

You can do shared-nothing vMotion in 5.1, so this is no longer strictly true.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
Well, the VMs do. For instance, we set up a server a couple of weeks ago that had 8x 300GB drives, so I was told to use two in RAID1 for the ESXi install and six in RAID5 for VM storage.

I personally run ESXi 5 from a flash drive at home, but if VMware dies at home it's not a critical issue for me. It would be for customers, so having redundancy is good, but I'm trying to get the tech manager to look into something other than giving up two entire drives for the install, because it wastes an insane amount of space.

So buy another $5 memory stick and tape it to the server? The stick is read-only aside from Update Manager pushing an ESXi upgrade to it. Back up the host config via vCenter periodically and you can deploy a replacement host and push the config in about ten minutes. There really is zero reason to RAID1 the ESXi install (to disk, at least).
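The sector-by-sector route mentioned earlier in the thread is just as simple with dd. A hedged sketch; the device name is a placeholder, and dd overwrites without asking, so double-check it before running:

```shell
# Image the boot stick to a file (assumes the stick appears as /dev/sdb)
dd if=/dev/sdb of=/backups/esxi-stick.bin bs=4M status=progress

# Push the image onto a replacement stick of the same or larger size
dd if=/backups/esxi-stick.bin of=/dev/sdb bs=4M status=progress
sync
```

Since the stick is only read at boot, the image stays valid until the next ESXi upgrade touches it.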