smaka510 Posted October 11, 2017

Hello, I decided to convert my Windows 10 x64 file server into a NAS w/ Plex, but I'm afraid I dove in head first without knowing everything. I have a few questions. FYI - I got the RAM and SSDs from a bunch of decommissioned servers/laptops at work. I only use HGST enterprise drives (3TB and 4TB). Here is my hardware:

Case - $50 (craigslist) - Supermicro SC846E1-R900B 24-bay 4U case w/ dual 900W PSUs and SAS1 backplane (backplane is not used)
SAS2 backplane - $250 (ebay) - BPN-SAS2-846EL1 w/ frame
Motherboard/Xeons - $300 (ebay) - Supermicro X9DRi-F w/ 2x E5-2630 v1
Memory - $0 - 128GB DDR3 ECC
NIC - $50 (ebay) - Myricom 10G-PCIE2-8B2-2S dual-port SFP+ 10Gb
Transceiver - $13 (ebay) - Finisar FTLX8571D3BCL 10Gb multimode SFP+
HDDs - $60 (ebay) - 8x 3TB HGST Ultrastar HUS724030ALE641 enterprise drives (manufacturer refurbished)
SSDs - $0 - Samsung 850 EVO/Pro
HBA - $55 (ebay) - Dell H310 SAS HBA w/ LSI 9211-8i (pre-flashed to P20 IT mode)
------------------------------
Total w/o HDDs = $718

Question 1: Backplane w/ HBA
The BPN-SAS2-846EL1 appears to have 2x SAS connectors, but the "EL1" version does not support multi-linking, so what is the other port for? Is it for failover, just in case one connector somehow breaks? Or can it be used to give more SAS lanes to the setup (from 4x to 8x)? From what I can see, ALL pre-configured systems come with 2x SFF-8087 cables. WHY do they do this if only one is required and the other does not increase bandwidth? It is confusing and misleading.

Question 2: relates to question 1
I was under the impression that if I ever needed more storage, I could just build another server and then run a cable from the 2nd port of the HBA up to the backplane of the new server. Am I getting this wrong?
ken-ji Posted October 12, 2017

Answer 1: From the documentation, it is only for failover.

Answer 2: That would work, though your terms are all wrong. You would build/get another enclosure and wire it up to the 2nd port of the HBA. But your HBA is an 8i (internal ports only), so you would either run an SFF-8087 cable across chassis, or do the conversion with 2x 8087-to-8088 adapters and an SFF-8088 cable. If you used a similar/same case as the enclosure, you'd need to make sure the chassis powers up properly even without a functioning motherboard installed.
smaka510 (author) Posted October 17, 2017

I was able to get everything put together. The motherboard came with 16GB of RAM (4x 4GB sticks), which I ended up swapping out for the 128GB. After doing some digging, I'm finding that most people do not need that much. I was previously a FreeNAS user, and they suggest 1GB of RAM for every 1TB of storage.

My daughter ended up putting the whole thing together.
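For reference, that FreeNAS guideline is a ZFS rule of thumb (it comes from the ARC cache), not an unRAID requirement. A rough sketch of how it works out for this build's drive count:

```shell
# Sketch of the FreeNAS/ZFS "1GB of RAM per 1TB of storage" rule of thumb.
# Note: this guideline exists because of ZFS's ARC cache; unRAID itself
# does not use ZFS for the array and needs far less RAM.
drives=8          # 8x HGST drives in this build
tb_per_drive=3    # 3TB each
echo "$((drives * tb_per_drive)) GB RAM by the 1GB-per-TB rule"
```

By that math the old rule would only call for 24GB anyway, so 128GB is well past it.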
smaka510 (author) Posted October 17, 2017

Question 3: How much memory to use?
Is 128GB too much memory for unRAID w/ Plex (7+ transcodes)? I have been reading threads where people are getting away with using far less.

Question 4: Loud!
Has anyone swapped out the enterprise PSUs for a standard ATX PSU? I'm also planning to swap the fans and heatsinks in an attempt to bring the noise level down. Has anyone ever done this? Are there any threads that cover it? I am planning to keep this powered on 24/7 - is there any harm in doing that?
tdallen Posted October 17, 2017

Hi - Anything more than 16GB is getting excessive unless you have a specific reason. It's possible to use lots of memory on Dockers and/or VMs, but unRAID itself is very efficient and can run with less than 4GB of RAM. FYI, Plex doesn't consume much RAM unless you set it up to write transcoded streams to a RAM drive (and then you could use all that RAM). Plex uses lots of CPU for transcoding, but not a lot of RAM.
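If you did want to put the extra RAM to work that way, a minimal sketch of the RAM-drive approach looks like this (the mount point and the 8g size here are assumptions, not a recommendation; the transcoder temp directory itself is set in Plex under Settings > Transcoder, and with the Docker container you'd map this path in as a volume):

```shell
# Sketch: back Plex's transcode scratch space with RAM instead of disk.
# Requires root. The size= option caps how much RAM tmpfs can consume.
mkdir -p /mnt/ram-transcode
mount -t tmpfs -o size=8g tmpfs /mnt/ram-transcode

# Then map /mnt/ram-transcode into the Plex container (e.g. as /transcode)
# and point Settings > Transcoder at that path in the Plex web UI.
```

The trade-off: transcode segments never hit your array or cache drive, but a burst of simultaneous transcodes can fill the tmpfs, so size it with your 7+ stream target in mind.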
CHBMB Posted October 17, 2017

That's a great pic of your daughter helping you. I can only hope when my little girl is older we can share some PC building experiences together!
FreeMan Posted October 17, 2017

I concur - great pic of your daughter! Reminds me of when my youngest was that young and I had him building machines...
smaka510 (author) Posted October 22, 2017

I ended up getting a PSU from EVGA. It's so much quieter - so quiet that I may let this beast live in the house. It supports 2 CPUs and is modular. I will lose the redundancy, but that is not as big a deal as the noise.

I am also working on a custom fan wall. Nothing fancy, just a piece of sheet metal with holes cut out for 3x 140mm Antec fans. I'm also planning to drill holes to mount 80mm desktop fans in place of the current exhaust fans. I did a mock-up, and with everything running it's completely silent.

I came upon another case as well: a Supermicro 2U 12-bay. Also, I'm working on a deal to get 7x 800GB SAS SSDs (building a SAN with this one). More on that when I get all the parts.
smaka510 (author) Posted October 23, 2017

Question #5: Gaming video card HDMI/DP over Ethernet?
I would like to be able to access my VMs from my living room and office, but this server is going to be racked in my garage. Has anyone tried using the "Blackbird" 1080p or 4K extender kits? They will pass the video and USB signals over Cat6e. I've been searching and cannot find the info I need on this.