No 10GbE NICs showing in VMware 6.x with X10SDV-7TP4F

When I started building my VMware ESXi server, I did not have a switch that could handle 10GbE SFP+.  Now that I have a Dell X1052, I figured I would cable up a 10GbE DAC and get moving.  Much to my surprise, I got a link light on the switch, but not on the motherboard.


Digging into the VMware side, I noticed that the 10GbE NICs were not available, only the 1GbE NICs.
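The same check can be made from the ESXi shell (assuming SSH is enabled on the host); listing the physical NICs showed only the 1GbE vmnics:

esxcli network nic list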


A quick Google search brought me to a great site for VMware knowledge, tinkertry.com.  It appears that the drivers for the 10GbE NICs are not loaded.  So, following the directions here, we open an SSH console and enter the following command.

esxcli software vib install -v https://cdn.tinkertry.com/files/net-ixgbe_4.5.1-1OEM.600.0.0.2494585.vib --no-sig-check

We then reboot the host so the new VIB is loaded.
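Once the host is back up, a quick sanity check (my addition, not part of the TinkerTry directions) confirms the VIB installed and the NICs now enumerate:

esxcli software vib list | grep ixgbe
esxcli network nic list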


Lo and behold, after the reboot we see the two 10GbE NICs.



Software RAID in Windows and NVMe U.2 Drive Benchmark

I have recently acquired a couple of Intel 750 NVMe U.2 drives to play around with.  In order to utilize these drives you need to source a SuperMicro AOC-SLG3-2E4 NVMe PCIe card or a similar variant.  A good write-up on the available HBAs comes from our good friends at ServeTheHome.

In my VMware server, I have passed through the two NVMe drives into a Windows Server 2016 VM.  From there we launch Disk Management, where we see the two drives.
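As a quick sanity check that the passthrough worked, the drives can also be listed from an elevated command prompt (wmic is still present in Server 2016):

wmic diskdrive get Model,SerialNumber,Size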


For our testing we want raw speed first, so we right-click on a drive and select “New Striped Volume”.  If we wanted redundancy, we would choose a mirrored volume.  If we just wanted the combined storage of the two drives to appear to the OS as one drive, we would choose a spanned volume.


From there we make sure both of our drives are selected.


We assign the drive a letter, tell the OS to format the drive, give it a name if we wish, and click Finish.

We will get a warning, as the OS will convert the disks from basic to dynamic.

One note: if you receive an error message stating that there is not enough space on the drive, make sure both disks use the same partition style and that the same amount of space is available on each drive.  In my case one drive was GPT and the other was MBR, resulting in a slight mismatch of usable space.  Once both drives were set to GPT, the available space matched and the operation could continue.
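If you want to check the partition style ahead of time, diskpart will show it.  A minimal sketch, assuming the mismatched drive is disk 2 and holds nothing you need (clean wipes it):

diskpart
rem disks showing * in the Gpt column are already GPT
list disk
select disk 2
rem convert gpt requires an empty disk, so clean it first
clean
convert gpt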

The drives will format and, once complete, the new volume will appear.
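For anyone who would rather script the whole thing than click through the wizard, the same striped volume can be built with diskpart.  A minimal sketch, assuming the two NVMe drives enumerate as disks 1 and 2 and hold no data:

diskpart
rem striped volumes require dynamic disks
select disk 1
convert dynamic
select disk 2
convert dynamic
rem create the stripe across both disks; focus moves to the new volume
create volume stripe disk=1,2
format fs=ntfs quick label=Stripe
assign letter=S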


Once formatted, we can run some benchmarks against the hardware.


In RAID 0 this is essentially what we expect to see: double the reads and double the writes.
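For reference, Intel rates the 400GB 750 at roughly 2,200 MB/s sequential read and 900 MB/s sequential write, so with ideal scaling a two-drive stripe should land somewhere near 4,400 MB/s reads and 1,800 MB/s writes.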

The ATTO results are also as expected.  There are some ludicrous speeds here, but you need an application that can actually demand and take advantage of them.


Building a Home Lab – Back to the Future!

In my current role, I don’t do much systems engineering work, and frankly, not doing it is boring.

I’ve decided that I need to keep my skills up and I’m working on rebuilding in my home lab what we had with the old team.

Compiling a list, this is what I need to build and what I hope to blog about.  Simple how-to guides are prevalent across the web, so I’ll focus on the pain points I encounter.

  1.  System Center Operations Manager 2016
  2.  System Center Virtual Machine Manager 2016 (substituting VMware ESXi and vSphere)
  3.  Windows Server Update Services
  4.  Windows Deployment Services
  5.  Active Directory
  6.  Windows DNS
  7.  Puppet
  8.  Foreman (though not technically in our old environment, I would like to use it for *nix deployments)
  9.  SQL Clustering and Replication
  10.  Virtual Desktop Infrastructure

My Active Directory, Windows DNS, Foreman, and Puppet installations are already complete.  My vSphere and ESXi systems are complete as well.

My entire infrastructure is running as VMs on an ESXi server built from the following hardware.

Hardware

SUPERMICRO MBD-X10SRL-F Server Motherboard LGA 2011 R3

Intel Xeon E5-2609 V4 1.7 GHz 20MB L3 Cache LGA 2011 85W BX80660E52609V4 Server Processor

8x Samsung 16GB 288-pin DDR4-2133 (PC4-17000) ECC Registered server memory, model M393A2G40DB0-CPB

SUPERMICRO SSD-DM064-PHI SATA DOM (SuperDOM) Solutions

Intel RS3DC080 PCI-Express 3.0 x8 Low Profile Ready SATA / SAS Controller Card

Intel 750 Series AIC 400GB PCI-Express 3.0 x4 MLC Internal Solid State Drive (SSD) SSDPEDMW400G4X1

4x HGST/Hitachi Ultrastar SSD400M 400GB 6Gb/s SAS 2.5″ SSD HUSML4040ASS600 (SSD DataStore)

4x Samsung SM843T 480GB SATA3 2.5″ enterprise SSD (SSD DataStore 2)

Intel Ethernet Converged Network Adapter X540-T2

Intel Ethernet Server Adapter I350-F4

MikroTik CRS125-24G-1S-IN Cloud Router Gigabit Switch, 24x 10/100/1000 Mbit/s Gigabit Ethernet with Auto-MDI/X, fully manageable Layer 3, RouterOS v6, Level 5 license.

ASUS XG-U2008 Unmanaged 2-port 10G and 8-port Gigabit Switch

Notes

  • Both SSD datastores are running in RAID 5
  • All parts except the Intel 750 SSD, motherboard, and CPU were purchased on eBay.  I saved hundreds by doing that.  DDR4 ECC is dirt cheap on eBay right now.
  • I will run out of storage space and memory long before I run out of CPU
    • To up the memory I’ll need to purchase 32GB DIMMs, as I am maxed out with 16GB DIMMs
  • The MikroTik switch handles 1G traffic
  • The ASUS switch handles 10G traffic between my NAS and my VMware server
    • Currently the 10G NIC is directly connected to one of my VMs, which allows me to transfer data over the 10G NICs
  • Looking at the storage breakdown, NappItDataStore is my NAS, and it holds ISO files for OS installs and other installers.  All told, VM storage is only 2.8TB.  I would love to consolidate down to 4x 1.2TB NVMe SSDs and currently have my eye on this.  At $500 a drive, it’s too expensive; hopefully these drives will start to come down on the used market.  At that point I could eliminate the Intel RAID card and go with all PCIe SSDs.



OpenIndiana Install – Take Two

I decided to retry the OpenIndiana install, and this time I re-downloaded the ISO.  Surprisingly, the ISO booted quickly and without issue.

The first issue I ran into was that the OpenIndiana text installer only saw the 6TB drives I had installed.  In order to get the installer to see the SATA DOM, I needed to disconnect the HBA cables and then re-run the installer.  The installer then saw the 64GB SATA DOM and the 200GB Intel SSD.
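A handy way to see exactly which disks the OS recognizes (the same set the installer will offer) is the stock Solaris format utility; piping echo into it prints the disk list and exits without changing anything:

echo | format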


After configuring the network and setting the usernames and passwords, the installer did its thing.


After a quick install, the system was rebooted and the HBA drives reconnected.

As for configuration, I followed the directions from one of my favorite server/hardware sites, ServeTheHome:

  • Simply launch a terminal in OpenIndiana, then type su
  • Next, we launch the installer command: wget -O - http://www.napp-it.org/nappit | perl
  • Now you can sit back and watch the magic.
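  • When the script finishes, the napp-it web UI should come up on its default port, 81.  Browse to http://<your-server-ip>:81, substituting your server’s address.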

The install went without issue, but OpenIndiana does not currently have drivers for the Intel X557 NICs that are built into the SuperMicro X10SDV-4C-TLN4F motherboard.  Testing on 1G links was similar to what we saw in FreeNAS.  Currently, FreeNAS is the only system I know of that supports the X557 NICs.
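You can confirm which NICs OpenIndiana has bound a driver to with the illumos dladm utility; any NIC missing from this list (the X557s, in my case) has no working driver:

dladm show-phys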


Okay, what happened with the NAS server?

I was asked this question earlier today, and the plain answer is “stuff”.

I decided that I needed more expandability in the server, and since I was using a Fractal Node 804 as my gaming case, I swapped the board and gaming components into the Fractal Node 304, freeing up the 804 for the NAS.  That allows me up to eight 3.5in HDDs and a lot more real estate.

A couple of notes on the Fractal Node 304 case:

  1.  Cable management is a pain.  See the bundle of cables by the back of the GPU?  That’s what I mean.  There is only room there because I don’t have the HDD sliders in.
  2.  The PSU design is smart, but it is a problem if you need to access it.  Case in point: I didn’t flip the PSU switch before powering on and had to remove the cover to flip it.

So what am I running in my gaming box and what games do I play?

It’s a rather simple build running Windows 10 to play No Man’s Sky.