That’s what I was wondering as well.
If so, what’s the “correct” location to store stuff like documents, downloads, configurations, etc.?
This is exactly what it looks like.
I had this exact situation happen to the fascia boards on my previous house. Carpenter bees bored into the wood and were living in it. Then a woodpecker came along and got them.
The damage in your picture looks exactly how my fascia boards looked after the woodpecker got his meal. You can also see the tunnels that go into the wood. I never even knew the bees were in the fascia, but somehow the woodpecker did…
Thank you for responding and providing the link and info. The top comment in that reddit post has the same link I posted above.
For the `zpool import // find the ID of the NVME pool` step: how did you find the ID of the NVMe pool? I think this is part of the problem I have, where I see multiple partitions and I’m not entirely sure which is the “boot” partition I should be pointing to. I think in your case you’re pointing to the “data” partition, but this might help me eliminate one of my options.
I’m also not sure how the RAID 1 plays into things, since it seems like both physical drives have the same partitions. Can I just point to one of the “boot” partitions on one of the drives and count on it finding its partner when it starts booting?
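For reference, this is roughly how I understand the pool-ID part, in case I have it wrong (a sketch only; “rpool” is just the Proxmox default pool name and the device paths are assumptions):

```
# From a live/rescue environment, list pools that are visible but not imported.
# Each pool prints its name, a numeric "id:" line, its state, and the member disks.
zpool import

# A mirrored Proxmox root pool shows both NVMe partitions under a single
# "mirror-0" vdev, so importing by name (or by that numeric id) pulls in
# both halves at once:
zpool import -f -R /mnt rpool      # or: zpool import -f -R /mnt <numeric-id>

# Confirm which partitions are the mirror members:
zpool status rpool
```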
Thank you for the details and link.
I looked around a little and it seems like there are settings to help avoid this problem. Knowing about it now means I can catch it early, unlike some of the people I’ve found who didn’t notice until it was already pretty bad…
I’ll keep this in mind if I can ever get this to work.
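For my own notes (and anyone else who lands here), these are the kinds of settings I’ve seen mentioned for cutting down the constant writes on a single-node box. Treat it as a sketch; the service names assume a stock Proxmox VE install, so double-check before disabling anything:

```
# The HA and replication services write state frequently and are only
# needed in a cluster, so on a single node they can usually be stopped.
systemctl disable --now pve-ha-lrm pve-ha-crm pvesr.timer

# Keep the systemd journal in RAM instead of flushing it to disk.
mkdir -p /etc/systemd/journald.conf.d
cat > /etc/systemd/journald.conf.d/volatile.conf <<'EOF'
[Journal]
Storage=volatile
EOF
systemctl restart systemd-journald
```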
What were the issues you had with Clover in particular? I’d be interested to hear since I’m trying to head down that path myself.
For your “remap”, can you explain what you did or share an example? I think this might give me the knowledge I’m lacking, since part of my problem is not understanding which partition/PARTUUID is the Proxmox boot and what I should point Clover at.
Prior to doing the Proxmox install, and prior to the PCIe bifurcation, I was still unable to see the drives directly in iDRAC/BIOS. From what I’ve read online, Dell does this for “reasons”, and they happen to sell an add-on card to let you directly access NVMe from PCIe.
While I’m not ruling out the ZFS mirror issue, I don’t think it’s the cause of my problem, considering both Clover and the Proxmox installer’s debug mode can see the drives/partitions. I just don’t understand partition/device/boot structures and processes well enough to make sense of what I’m seeing in the blkid/pre-boot results.
Trying to find information about it online just gets me bad guides about making partitions. The Linux docs for blkid and fdisk also don’t seem to explain the output, just the arguments for the commands.
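For my own reference, this is the mental model I’ve pieced together so far. It’s a sketch that assumes the default Proxmox ZFS-on-root layout, so the partition numbers could differ on another install:

```
# Show the fields that actually matter for booting.
lsblk -o NAME,SIZE,FSTYPE,PARTTYPE,PARTUUID /dev/nvme0n1 /dev/nvme1n1

# On a default Proxmox ZFS install, each mirror member typically carries:
#   p1  ~1M       BIOS boot partition (only used for legacy GRUB)
#   p2  ~512M-1G  vfat -> the EFI System Partition (ESP); its PARTUUID is
#                         what a bootloader entry points at
#   p3  rest      zfs_member -> the rpool data itself, not directly bootable
blkid /dev/nvme0n1p2   # should report TYPE="vfat" plus its UUID/PARTUUID
```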
The server has 24x 2.5" bays. I have an old SSD drive that I figure I could use as a last resort to be the Proxmox boot drive and then just use the NVMe’s as storage.
I was just hoping to have the Proxmox install/configuration on the NVMe RAID 1 for some minor safety in case a drive dies. From what I’ve read, this should be possible; I’m just lacking the knowledge to know what I’ve done wrong. (Mostly my lack of understanding of the blkid results.)
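On the redundancy point, my understanding is that Proxmox keeps a synced copy of the boot files on an ESP on each mirror member, which is what should make booting from either drive possible (a sketch, assuming a recent install that uses proxmox-boot-tool):

```
# List the ESPs Proxmox is keeping in sync and which loader they carry
# (GRUB or systemd-boot); both NVMe members should show up here.
proxmox-boot-tool status

# After kernel updates, or if one ESP falls out of date, re-sync them all.
proxmox-boot-tool refresh
```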
The original plan is to use an SD card with Clover in read-only mode to bootload Proxmox running on the NVMe drives (read-only to prevent frying the SD card). This server has a built-in SD card slot Dell calls “vFlash” that you can actually remotely partition and configure. That’s where I was going to put the final configuration of Clover.
How much/how often is Proxmox writing logs? It’s concerning that you say you had this fry some NVMes, since that’s what I’m trying to do here. Is this a setting you can adjust?
I don’t think the bifurcation is causing me issues. Before I enabled it, I wasn’t able to see the drives from iDRAC/BIOS. From what I’ve been able to research, this is expected, and Dell sells the “solution” to booting directly from them. (An add-in card that’s pretty pricey…)
I do have an old SATA SSD that I’m considering slotting into one of the bays and using as the boot drive. But I see that as a “last resort” option; I was hoping to have a bit of redundancy with the Proxmox install/configuration itself.
I feel that there’s a solution with the current setup and I just lack the knowledge to fix it. Everything I’ve been able to find suggests my current setup should work; I’m just being hindered by not understanding the partition/device/boot structure.
From what I understand, and from what I saw during the Proxmox installation, if I can get past whatever part of the POST/boot process is preventing the drives from being seen directly, I can use Clover to bootload from there. I’ve been able to boot into Clover just fine, and it was able to “see” the drives and partitions. I just don’t know which one should hold the Proxmox boot, or whether I’ve configured the Clover config correctly.
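For the “which partition” part, the check I’m planning to do is simply to mount the candidate vfat partition from a rescue shell and list the EFI binaries on it, so Clover has a concrete path to chainload (a sketch; the device name is an assumption):

```
# Mount the partition that blkid reports as TYPE="vfat" (the ESP).
mount /dev/nvme0n1p2 /mnt

# List every EFI loader on it; whatever shows up here is what a bootloader
# entry (Clover included) would need to chainload, e.g. the standard
# fallback EFI/BOOT/BOOTX64.EFI or a systemd-boot/GRUB binary.
find /mnt/EFI -iname '*.efi'

umount /mnt
```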
From what I’ve read online, Dell does something similar. There’s some sort of card/add-on that can enable directly seeing and booting from PCIe but they are costly.
This server has the internal USB and a built-in SD slot accessible from the rear. (There’s also a dual-card option like you mention, for redundancy.)
My plan was to get Clover working with USB, then use the vFlash SD slot to hold the Clover bootloader in read-only mode. This would hopefully prevent the SD card from dying quickly.
Thank you for posting this with the explanations and great visuals! I am wanting to upgrade to a setup almost identical to this and you’ve basically given me the bill of materials and task list.
Anything you wish you had done differently or suggest changing/upgrading before I think about putting something similar together?
Any way you could update/create your own drawing of what you mean? (Bad Paint drawings are acceptable!)
I ask because I’m curious whether I’m subject to the same problem. I’m not the most networking-savvy, so I need the extra help/explanation, and maybe the drawing will help others too.
I’ll throw my code into the ring as well. I posted it over in the Python community and have been using it myself.
It’s not the most user-friendly yet, though. I’m still working on improving it as I get time, and I’m open to suggestions/requests.
https://rclone.org/protondrive/
Sometimes you don’t even need the open-sourcing side of things. I use rclone for OneDrive and I doubt Microsoft open sources it.
That being said, the rclone Proton Drive docs mention it’s a beta, and it’s mostly the API they need to get working properly.
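For anyone who just wants to try it, the workflow is the same as any other rclone backend. A sketch only; the remote name “proton:” and the paths are placeholders:

```
# Walk through the interactive setup and pick the "protondrive" backend.
rclone config

# Sanity check: list the top-level folders in the remote.
rclone lsd proton:

# Copy a local folder up (dry-run first, then for real).
rclone copy ~/Documents proton:Documents --dry-run
rclone copy ~/Documents proton:Documents --progress
```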