In an earlier installment, I pointed out that the popular branded solutions are surprisingly expensive for what reviews indicate is rather low-performing hardware. So for my comparison I’ll use the Synology DiskStation DS1513+, which reportedly has good performance, more than a single-link gigabit Ethernet connection can handle.
It has quite a few things in common with the home-made solution: multiple Ethernet ports that can be used for redundancy or increased throughput, the ability to host related servers as “apps”, and an initial setup that is not friendly enough for novices to manage on their own.
At the time of this build, the Synology DiskStation could be found for $830. It contains a dual-core Atom D2700 running at 2.13 GHz and 2 GB of DDR3 RAM.
Now, there are two ways to approach this. Clearly a competent file server can run on low-end x86_64 processors with a small (by today’s desktop standards) amount of RAM. The original FreeNAS was commonly used with hand-me-down hardware.
But times have changed. The new FreeNAS, a rewrite by iXsystems, was designed with more modern concerns in mind: RAM is much cheaper now, and the system can be more capable and easier to develop if it doesn’t have to cope with low-RAM installations. In addition, ZFS’s protection against mysterious data corruption relies on the RAM itself not suffering mysterious corruption, so it should be paired with ECC RAM. Then come dire warnings that Windows file sharing (CIFS) is single-threaded and thus needs a fast CPU (as opposed to multiple slower cores), and that features such as encryption demand ever more CPU performance. Oh, and the Realtek NICs used on many consumer motherboards don’t work well with FreeNAS; it wants an Intel NIC.
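If you’re not sure what NIC a candidate board actually has, FreeBSD (the OS underneath FreeNAS) will tell you directly. Here is a minimal sketch from a root console; the interface names are illustrative: Intel gigabit NICs typically attach as em0 or igb0, Realtek as re0.

```sh
# List PCI devices with vendor and device strings; Ethernet
# controllers are the entries whose subclass is "ethernet".
pciconf -lv | grep -B4 -i 'subclass.*ethernet'

# Or just list the network interfaces by driver name:
# em(4)/igb(4) are Intel drivers, re(4) is Realtek.
ifconfig -l
```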
In short, I’m looking at a server-grade system, not a typical desktop or “gamer” enthusiast system. What you don’t need is fancy overclocking support, sound, lots of slots, multi-video-card support, and so on, so a low-end server board actually costs about the same as a “fancy” desktop motherboard.
In particular, the Supermicro brand comes highly recommended. I could have gotten an X9-series server motherboard and put a Xeon E3 v2 CPU on it. But why stop there? I spent more to go with the newer X10-series board and a Xeon E3 v3 “Haswell” CPU. The X10SL7-F in fact contains an 8-channel SAS controller as well as the usual 6 SATA channels, sprouting a whopping 14 SATA connectors on the motherboard. It also features IPMI 2.0 on its own dedicated network port, which is a wonderful feature, and I’ll have more to say about it later.
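As an aside on those 14 ports: once FreeNAS is running, drives on the onboard SAS controller and drives on the chipset SATA ports show up side by side. A quick sketch from the console (the device names are illustrative):

```sh
# List every drive the system sees, with its controller path.
# Drives on the SAS controller typically attach as da0, da1, ...;
# drives on the chipset SATA ports attach as ada0, ada1, ...
camcontrol devlist
```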
So without further ado, here is the breakdown of my build:
Parts List
| Item Description | Price |
| --- | --- |
| ICY DOCK MB153SP-B 3-in-2 SATA Internal Backplane RAID Cage Module | $63.99 |
| Intel Xeon E3-1245V3 Haswell 3.4 GHz LGA 1150 84W Quad-Core Server Processor | $289.99 |
| SUPERMICRO MBD-X10SL7-F-O uATX Server Motherboard | $239.99 |
| SeaSonic SSR-360GP 360W ATX12V v2.31 80 PLUS GOLD Certified Active PFC Power Supply (Haswell ready) | $59.99 |
| Fractal Design Define R4 Black Pearl w/ USB 3.0 ATX Mid Tower Silent PC Computer Case | $99.99 |
| ZALMAN CNPS5X Performa 92mm FSB (Fluid Shield Bearing) CPU Cooler | $19.99 |
| 2 × 8GB PC3-12800 DDR3-1600MHz ECC Unbuffered CL11 HYNIX Memory | $178.92 |
| Total without drives | $952.86 |
| WD Red WD40EFRX 4TB IntelliPower 64MB Cache SATA 6.0Gb/s 3.5″ NAS Internal Hard Drive (bulk) | 3 × $189.99 = $569.97 |
| Seagate ST4000DM000 Desktop 4TB 64MB Cache | 2 × $155.49 = $310.98 |
| Total for build | $1833.81 |
The raw power seriously outclasses the DiskStation, and at $952.86 before drives it is only about $120 more than the diskless DiskStation. With the X9/v2 option, it would actually have been less.

Above is the result, Oort, with the side panel off. You can see the stack of 8 drive trays and the large heat sink over the CPU.

Here is a front view. The grille along the edges allows air intake from the front. The blank front face is imposing and mysterious… I wonder if I can get some artwork over it?

And finally, with the front panel open. There is foam sound-dampening on all the case surfaces, including the inside of this door. The ICY DOCK hot-swap bays are now accessible. I plan to use these for backing up and for mounting off-site volumes while they are resident. The main drives require side access, which is simply a matter of removing two thumbscrews.
Now back to the details. The X10 (rather than X9) series mainboard allows the use of the newer Haswell processors, which run cooler and save power. The onboard SAS controller saves what would be a hundred-dollar PCIe card, and is much easier to use as well, since it provides common SATA-compatible connectors. And finally, this motherboard has the wonderful IPMI 2.0 with full KVM-over-LAN.
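To give a flavor of what IPMI offers, here is a hedged sketch using the open-source ipmitool utility from another machine on the LAN. The address and credentials are placeholders; Supermicro boards ship with a default ADMIN account that should be changed immediately.

```sh
# Query the chassis power state over the dedicated IPMI port.
ipmitool -I lanplus -H 192.168.1.50 -U ADMIN -P 'changeme' chassis status

# Read the board's sensors: temperatures, fan speeds, voltages.
ipmitool -I lanplus -H 192.168.1.50 -U ADMIN -P 'changeme' sdr list

# Power-cycle a hung machine without touching the power button.
ipmitool -I lanplus -H 192.168.1.50 -U ADMIN -P 'changeme' chassis power cycle
```

The KVM-over-LAN part (graphical console, remote keyboard and mouse, even virtual media) runs through the web interface the BMC serves on that same dedicated port.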
For the CPU, I looked at the chart in Wikipedia, along with the prices and availability at NewEgg. I chose the lowest (cheapest) Xeon E3 that had onboard graphics and hyperthreading. Why do I need onboard graphics if the system doesn’t have a monitor? I believe the monitor-over-LAN feature still requires an actual VGA device; it doesn’t emulate one, but just captures the output. There is a more primitive remote-management feature that allows for a TTY-style console (also over LAN), but I don’t think that helps with the initial BIOS screens. Also, with the standard built-in GPU I can use it for computation other than drawing graphics. Maybe it will accelerate other software I run on the box at some point.
I’m keeping the box in a closet which, besides building up heat from the machines, gets afternoon sun on the outside wall, so it is warm in the summer. My experience with the stock cooler that comes with a CPU is that it’s loud or even inadequate. Looking through NewEgg, I searched for this style of cooler with low noise and a good price. I normally like this style in part because it takes a standard square fan, which can be upgraded and replaced, but Zalman is known for quiet fans too. I mounted it, not with the thermal grease that it came with, but with Phobya HeGrease, carefully applied and spread.
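With the box living in a warm closet, it’s worth keeping an eye on temperatures once the system is up. A minimal sketch, assuming the FreeBSD coretemp(4) driver and a local IPMI interface are available:

```sh
# Per-core CPU temperatures via the coretemp(4) driver
# (load it first if necessary: kldload coretemp).
sysctl dev.cpu | grep temperature

# Board-level temperatures as reported by the BMC's sensors
# (local access requires the ipmi(4) driver).
ipmitool sdr type Temperature
```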
The RAM was not available at NewEgg. Apparently ECC-but-unbuffered memory is uncommon; buffering (registering) is used to facilitate having many more memory sticks on a board, which is not the case with this server-grade but desktop-sized board. I found it at a specialty RAM site, www.memoryamerica.com, which has a wide selection. To be on the safe side, I checked the brands that Supermicro had tested on this board and took the cheaper of the two. 16 GiB uses two of the four memory slots, so it can be doubled in the future.
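Once the system is running, you can verify that ECC is actually recognized and enabled, rather than the DIMMs merely being present. A sketch assuming the dmidecode utility is installed (it is in FreeBSD ports as sysutils/dmidecode):

```sh
# Show the memory controller's error-correction capability.
# "Single-bit ECC" (or similar) means ECC is active;
# "None" means the DIMMs are being treated as non-ECC.
dmidecode -t memory | grep -i 'error correction'

# Per-DIMM details: sizes, speeds, part numbers.
dmidecode -t memory | grep -E 'Size|Speed|Part Number'
```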
I use Seasonic power supplies, and that’s another story. This time I looked for “Haswell support”, which enables a new improved stand-by mode: Haswell’s deep sleep states draw so little power that older supplies can fall out of regulation.
Now for the case: some mentions on the FreeNAS web forum led me to Fractal Design. I followed up by reading reviews and the manufacturer’s web site. There are a couple of models so similar that I wonder what the difference is! Since there is no direct explanation, it takes reading the specs very carefully and comparing the dimensions to spot the real differences. This R4 features an internal stack of 8 HDD trays (with anti-vibration mounting) plus two half-height 5¼″ external bays. Counting the 3 hot-swap drives those bays will hold (more on that below) plus two SSDs stuck elsewhere, that is 13 drives total, which is nicely close to the motherboard’s support of 14.
I chose an option with two external bays so I could fit a 3-disk hot-swap backplane. Here I went with the name-brand ICY DOCK and a with-tray design, because I had trouble with a trayless dock on Mercury. As it turns out, loading a drive into the front bay still requires two mounting screws per tray, which is not very handy.
Worse, the “2 half-height bays” spec is a little exaggerated. It’s more like 1.95 half-height bays, as a large bulge protrudes into the area where the bottom bay should be. I had to remove the bottom piece of sheet metal from the ICY DOCK in order to squeeze it in; this also got rid of the normal mounting ears. I’ll make a bracket some day (a perfect job for a 3D printer), but it fits tightly and is not heavily used, so I left it without screws for the time being.
Other than that, assembly was easy and straightforward. Testing proved interesting and adventuresome, and I’ll tell you about that later.