kevschu

Members
  • Posts: 12


  1. I am having a similar issue. I had a lockup problem when I first switched over to Ryzen that was related to the C-State config, and the system ran fine for several months after making changes in the BIOS. A month or so ago it started locking up a couple of times a week. I've made sure the BIOS is up to date, but I haven't double-checked memory speeds yet (a quick way to verify them from the console is sketched after this list). I have my syslog server set up, and I will make a separate post instead of hijacking this one. Just thought I would chime in that you aren't alone in this.
  2. Well, I replaced the Asus Prime B450M-A/CSM motherboard with an ASRock X570M Pro and things have been running solid for the last 3 days.
  3. I looked through the thread. I've ensured that "Global C-State" is disabled and that the PSU idle setting is set to Typical. I've also added "rcu_nocbs=0-3" to the syslinux configuration (a sketch of that change follows this list). The only change this time is the rcu_nocbs setting, as the two power settings were already configured before this last lockup. Attached are the syslog and new diags. This portion of the syslog seems important. schumachertower-diagnostics-20200312-2016.zip syslog-192.168.0.3.log
  4. Alright, so that didn't take long; it became unresponsive again. I had to hard power it down again, but I think adjusting the "shutdown timeout" in Disk Settings helped, because the disk that usually shows up as disabled wasn't disabled when it came back online this time. I do have the syslog on the syslog server, though. Is there a way for me to anonymize it, or will running the diagnostics collector grab it from that location and anonymize it for me?
  5. Yeah, doing that it comes up fine. It is only after an ungraceful reboot, when Unraid becomes unresponsive, that the drive becomes disabled. But it is always that one drive.
  6. Hmmm, I would need clarification on "reboot on purpose". If I: 1. rebuild the drive, 2. start using it again, 3. gracefully reboot using the Unraid UI, then 4. the system comes back up and the drive is still fine. But if: 1. Unraid becomes unavailable, 2. I power the device off using the physical power button, 3. I power the device back on, then 4. the system comes back up and the drive is disabled.
  7. I have rebuilt the drive in the past, and I will end up rebuilding it again. I would like to figure out why this scenario keeps happening: the system becomes unavailable, and that drive is in a disabled state when it recovers.
  8. I would suspect my single stick of memory is fine then; it is registering at 2400 MHz in the BIOS.
  9. Thank you for the reply. I didn't know the 3200G was considered a 2nd-gen chip. Going through the link that was shared, I will work on the following to see if it brings stability and resolves my issue: 1. set up the syslog server, just to make sure the diagnostic data is saved somewhere safe across reboots; 2. check the "Power Supply Idle Control" setting in the BIOS, as I don't believe it is set to the suggested "Typical Current Idle"; 3. swap the memory out, since I am using DDR4-3200 and the CPU only supports up to 2933. Maybe swap it for 2400 or 2666; 2933 seems niche based on the options I see available on, say, Newegg. Am I missing anything else from that article? There were never any errors listed on the drive that keeps showing up as disabled, and I have copied the data off that drive onto other drives, so I am not worried about cloning it.
  10. I have been using Unraid for a few years now and was on 6.8.0 without any issues, using an older Intel board and an i7-3990. Recently I replaced that board and CPU with an Asus B450 board and a Ryzen 3 3200G. Since then, once a week or so, sometimes longer, the system becomes unavailable on the network. It cannot be pinged, and my router shows it as offline (the system is turned on, and there weren't any power outages or brownouts). If I turn the system off by pressing the power button and letting Unraid do its thing to shut down, once the system comes back up I always end up with the same data drive (the same drive each time) in a disabled status. schumachertower-diagnostics-20200310-2115.zip
  11. I've read that Unraid has not supported USB drive enclosures in the past; is this still the case? I saw a post that indicated support was added in 6.2. I have a Sans Digital TRU5T+BN currently hooked up to my desktop; it is configured with RAID 5 and connected over USB 3. The problem with this is that I cannot see the status of individual drives through the USB 3 + RAID 5 configuration. I would like to purchase an Intel NUC, have it host the drive enclosure in a JBOD configuration, and use Unraid to take care of the RAID configuration so that I can get the status of individual drives. Is this possible?
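
For reference, here is a minimal sketch of the syslinux change described in post 3. Only the rcu_nocbs=0-3 parameter comes from the post itself; the label name and the rest of the append line mirror a stock Unraid boot entry and are assumptions that may differ on your flash drive (the file normally lives at /boot/syslinux/syslinux.cfg).

```
# /boot/syslinux/syslinux.cfg (excerpt) -- a sketch, not a full file.
label Unraid OS
  menu default
  kernel /bzimage
  # rcu_nocbs=0-3 offloads RCU callbacks from CPUs 0-3 (the four
  # threads of a 3200G); it was one of the commonly suggested
  # workarounds for early Ryzen idle lockups. Everything after it
  # here mirrors the stock boot entry.
  append rcu_nocbs=0-3 initrd=/bzroot
```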
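
And for the memory-speed check mentioned in post 1: the BIOS reading can be cross-checked from the Unraid console with dmidecode, a generic Linux tool rather than anything Unraid-specific. The exact output wording varies with the dmidecode version and the board.

```
# Run as root from the console or an SSH session.
# "Speed" is the module's rated speed; "Configured Memory Speed"
# (older versions print "Configured Clock Speed") is what the board
# is actually running the DIMM at.
dmidecode --type memory | grep -i speed
```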