Adding a second NVMe SSD to workstation

I am working on my HP Z640 and wish to add an additional NVMe SSD.

At present I have 1 x Kingston 2TB NV1 M.2 2280 NVMe SSD installed in a
StarTech.com M.2 PCIe SSD Adapter - x4 PCIe 3.0 NVMe - M.2 Adapter.

I have just acquired a PCIe adaptor, a StarTech.com 2x M.2 SATA SSD Controller Card - PCIe, which can accept two SSDs.

Can I plug and play with these, i.e. swap the PCIe adaptors and put my existing SSD in the new adaptor?

If yes, can I then add a second SSD and expand capacity to use all the SSD storage, or do I have to start over with a fresh installation?

Just planning now as I am still saving for the second SSD.

@Budgie2 You can add a second adapter in a spare slot. Note that M.2 SATA SSDs and M.2 NVMe SSDs are not the same… or you can get a two-port NVMe adaptor (tight for heatsink space) that needs bifurcation and set the slot to x4x4 in the BIOS, assuming an x8 slot on the motherboard.

Um! I have a PCIe adaptor (a SATA SSD controller card) which has a single NVMe SSD installed.

I have acquired a variant PCIe adaptor of the same make which has sockets for two cards.
I am tight for slots so want to swap PCIe adaptors. Sadly the adaptor I have acquired is not the correct one. I will return it.

Sorry I asked!

@Budgie2 No need to be sorry, this is the one I have;

GLOTRENDS PA21 Dual M.2 NVMe to PCIe 4.0 X8 Adapter Without PCIe Bifurcation Function, Support 22110/2280/2260/2242/2230 Size (PCIe Bifurcation Motherboard is Required)

Then there is this one (with built in Bifurcation);

GLOTRENDS PA20 Dual M.2 NVMe to PCIe 3.0 X4 Adapter with PCIe Bifurcation Function, Support 22110/2280/2260/2242/2230 Size

This one looks good (bit more space between devices);

10Gtek 2-Port M.2 NVMe Adapter M-Key, PCIe X8 Gen3. Requires Motherboard BIOS Support for Bifurcation

Hi Malcolm,
Very many thanks for searching for me.
I have returned the unwanted card and will get refund. The correct one from StarTech, which does not need bifurcation, is available but out of my budget at present. (Business still very slow!).

My needs are not pressing but meanwhile it would be good to know exactly what is meant by bifurcation in this case?

Also I am considering a single 4TB drive as an alternative to a new card plus an additional 2TB drive. The only trouble with this that I can see is that I would have to rebuild the machine. Any views on the best choice?

Finally I need to understand what is available in my hardware BIOS. I am presently working on Z640 which I hope can cope with bifurcation if required but…

Since I am using btrfs on the whole of my 2TB NVMe, and this holds my entire system including /home, I have not created a separate ext4 partition for /home as I have in the past. All I want to do now is increase my storage capacity and optimise my hardware performance. What difference would bifurcation make to me, I ask?

PCIe is all new to me, so PCIe versions, slot sizes and lane counts are all a blur. Is there any good article I can read?

Many thanks for your help once more.
Alastair

@Budgie2 Bifurcation is “splitting”, so an x8 slot can become an x4+x4 slot. If you pop into the system BIOS (F10), then Advanced and Slot Settings, you will see a bifurcation setting for each slot. It will be on Auto; you can set it to what is needed for a particular slot and card…

Have a read here for PCIe https://en.wikipedia.org/wiki/PCI_Express
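If you want to see from Linux what link each NVMe adapter actually negotiated (and so whether bifurcation or the slot is limiting you), a small sketch using standard `lspci` from pciutils can help. This assumes the NVMe drive is already installed and visible to the OS:

```shell
# For each NVMe controller, show the PCIe link it supports (LnkCap)
# versus what it actually negotiated (LnkSta).
# e.g. "LnkSta: Speed 8GT/s, Width x4" means PCIe 3.0 with four lanes.
for dev in $(lspci -D | awk '/Non-Volatile memory controller/ {print $1}'); do
  echo "== $dev =="
  sudo lspci -s "$dev" -vv | grep -E 'LnkCap:|LnkSta:'
done
```

If LnkSta shows a smaller width than LnkCap (say x2 instead of x4), the slot or bifurcation setting is the bottleneck rather than the drive.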

Hi Malcolm,
Many thanks. Will read up and save my pennies meanwhile.
Regards,

@Budgie2 Not interested in using the hardware RAID instead, with some SSDs?

He can use btrfs

@karlggest For the operating system? I would only use it for data, on say xfs or ext4…

Hi Malcolm,
Hardware RAID? Do I have it on my Z640?

I do not have many hardware bays: I think one slot is taken by a card-reader device for camera cards and the second by the existing small-capacity SSD which has the Windows and boot partitions. I would need some minor hardware bits for more SSDs. The cables are there but not the hardware slots. Also the costs would be significant.

BTW I have the whole of my TW OS on btrfs. An act of faith on my part.

Hi!

I don’t have much experience with hardware RAID (I have only used it once). In btrfs, as far as I know, if you have two disks in a btrfs volume you can remove one without first removing it from the volume, and it still works. Well, if you have a home subvolume, btrfs should not include it in the snapshot, so I don’t think you have a problem here. Maybe you would prefer a separate /home (e.g. for a clean reinstall).

It’s true that Baloo from Plasma 5 may not work well (I think the issue is fixed in Plasma 6), but otherwise it works fine.

Well, seems it works.

At least, RAID 1 seems to work: Solved: HP Z640 Raid 1 Setup - HP Support Community - 5566808

Well, if you want /home on a separate device, or maybe using xfs or another filesystem, you can do that at this moment.

If you want to use the two devices in a single volume:

In this example, the system is installed on a 2TB disk at /dev/nvme0n1 and a new 4TB NVMe device has been plugged in as /dev/nvme1n1. Take care: the volume runs at the speed of the slower device if they are different.

To add the device:

sudo btrfs device add -f /dev/nvme1n1 /

To balance:

sudo btrfs filesystem balance /
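After the add and balance above, it is worth confirming that btrfs really sees both devices and how the data is spread. A hedged sketch with standard btrfs-progs commands (run against the same `/` mount as in the example):

```shell
# List the devices that make up the filesystem mounted at /:
sudo btrfs filesystem show /

# Show how data and metadata chunks are allocated across both devices:
sudo btrfs filesystem usage /

# Optional: if you wanted redundancy rather than just extra capacity,
# you could convert to RAID1 profiles instead of a plain balance
# (this mirrors everything, so usable space is that of the smaller disk):
# sudo btrfs balance start -dconvert=raid1 -mconvert=raid1 /
```

By default the added device simply extends the single volume (the "single" data profile), which matches the capacity-expansion goal in this thread.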

@Budgie2 Yes, at the bottom of the board there are four (4) SATA connectors, then another two, connecting the CD/DVD device and one SATA. It’s all controlled in the BIOS, for JBOD or RAID. I have a 4-port 2.5″ open caddy that fits in a 5.25″ drive bay at the top of my 440.

I would worry that, after making the change, grub wouldn’t recognize the boot volume. Will the /dev/nvme0n1p1 (or whatever) address change by doing this? Will grub handle this kind of swap without any fuss? I don’t know; I am curious if someone can enlighten me.

UEFI would detect the ESP partition with grub in it and start the whole boot chain. It shouldn’t matter whether the partition is first/middle/last when using EFI and a GPT partition table, unlike MBR and legacy BIOS.
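To reassure yourself before swapping adaptors, you can inspect the firmware boot entries and locate the ESP from a running system. A minimal sketch using the standard `efibootmgr` and `lsblk` tools (assumes the machine boots in UEFI mode):

```shell
# List the firmware's boot entries and which disk/partition GUID each
# one points at (entries reference partitions, not /dev names):
sudo efibootmgr -v

# Identify the ESP: it is the partition whose PARTTYPE is the
# EFI System Partition GUID c12a7328-f81f-11d2-ba4b-00a2c93ec93b,
# usually mounted at /boot/efi:
lsblk -o NAME,SIZE,FSTYPE,PARTTYPE,MOUNTPOINTS
```

Because the NVRAM entries reference partition GUIDs rather than /dev/nvme* paths, moving the drive to a different adaptor or slot should not break the boot chain even if the kernel's device numbering changes.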

P.S. Perhaps it would be better if you created a new thread.
