
CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 19, 2022 20:56 UTC (Thu) by Cyberax (✭ supporter ✭, #52523)
Parent article: CXL 2: Pooling, sharing, and I/O-memory resources

Is CXL the new NVRAM? A new technology that is supposed to revolutionize everything, requires a ton of kernel-level rework, and then just disappears into nothingness?



CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 20, 2022 3:57 UTC (Fri) by xanni (subscriber, #361) [Link] (1 response)

Unlikely, because for example AMD Zen 4 CPUs will incorporate CXL and they're likely to be widely used.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 20, 2022 12:00 UTC (Fri) by simcop2387 (subscriber, #101710) [Link]

I'd expect this to be segmented off into the Epyc line only, though, at least for a generation or two, simply because I don't expect anything close to the consumer level to support CXL until hardware manufacturers start demanding it.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 20, 2022 10:27 UTC (Fri) by atnot (subscriber, #124910) [Link]

One has to distinguish here between CXL, CXL.mem, and what zealous vendors promise CXL.mem can do. I have my doubts about the latter, but there are definitely compelling use cases for CXL.mem, and CXL itself is kind of too useful and too big to fail at this point.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 1:45 UTC (Sat) by willy (subscriber, #9762) [Link] (7 responses)

If by "nothingness", you mean "has shipped two generations of product".

https://www.intel.com/content/www/us/en/products/details/...

And, yes, the kernel needed (and still needs more) modification to support it well.

It's funny you bring it up though, since one of the use cases is to put PMem on CXL.
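
A rough sketch of what that looks like from userspace, assuming the memory is exposed as a device-DAX node (the /dev/dax0.0 path and the 2MiB mapping size below are made-up examples, not anything specific to a real system):

/* Hypothetical example: map a device-DAX region (PMem, or CXL-attached
 * memory exposed the same way) straight into the process and use it
 * with ordinary loads and stores. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    const char *path = "/dev/dax0.0";  /* assumed device-DAX node */
    size_t len = 2UL << 20;            /* 2MiB, matching a typical device alignment */

    int fd = open(path, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* Device DAX bypasses the page cache: the mapping is the memory. */
    void *p = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }

    memcpy(p, "hello, pmem", 12);      /* plain stores, no read()/write() */

    /* A real persistent-memory program would flush CPU caches here
     * (e.g. with libpmem's pmem_persist()) before trusting the data to
     * survive power loss. */
    munmap(p, len);
    close(fd);
    return 0;
}

The application side is just loads and stores either way; the kernel work is about discovering, labelling, and hot-plugging the regions that end up behind a mapping like this.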

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 1:50 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (5 responses)

Yep. It totally exists: https://www.insight.com/en_US/shop/product/P23535-B21/HEW...

You can buy a 256GB module for an affordable $3,386.99; a 512GB module is on discount right now at $9,768.99.

It's fair to say that Optane memory is vaporware compared to the initial vision of systems with tens of terabytes of persistent RAM.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 3:38 UTC (Sat) by willy (subscriber, #9762) [Link] (4 responses)

As opposed to a 32GB DIMM for $560?

https://www.insight.com/en_US/shop/product/P00924-B21/HEW...

It's not "vaporware" just because you don't want to pay for it. And you can buy an Exadata X9M with 18TB of PMem per rack (admittedly that's spread over three servers each with 6TB). I did think we'd have more capacity by now (about double what we have). But an undershoot on capacity is hardly the same thing as "doesn't exist". It was always going to be a high-end option.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 5:17 UTC (Sat) by repnop (guest, #71931) [Link]

Additionally, it's entirely myopic to see CXL as simply a persistent memory technology.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 6:53 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (2 responses)

You can buy them much cheaper: https://memory.net/product/m393aag40m32-cae-samsung-1x-12...

But yep, this means that persistent RAM is pretty much a non-entity now. It needs special chipsets, it's expensive, and it's not available through most cloud computing providers.

CXL will likely be similar once people find out that it's nowhere close in speed to normal RAM.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 11:58 UTC (Sat) by willy (subscriber, #9762) [Link] (1 response)

I was comparing apples-to-apples. PMem from this vendor vs DRAM from this vendor. PMem is cheaper per bit than DRAM, just as promised. It's more expensive per bit than NAND, also as promised. It's available in large capacities, so the price of a DIMM is high.
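
To put numbers on it, from the two listings above: $3,386.99 / 256GB works out to roughly $13.2/GB for the PMem module, while $560 / 32GB is about $17.5/GB for the DRAM DIMM.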

If you were expecting it in your laptop by now, then you weren't paying attention. You also don't have 100Gbps networking in your laptop, but it very much exists.

I think I'm done here. You said something stupid and hyperbolic; now you're determined to Be Right. It doesn't really matter what the facts are.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 18:56 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

> PMem is cheaper per bit than DRAM

Except that it's not. At best it's about the same, and machines with tens of terabytes of NVRAM will stay in the realm of exoticware for at least another decade. That's a far cry from the hype in which NVRAM was going to be ubiquitous.

> You also don't have 100Gbps networking in your laptop, but it very much exists.

I have 40Gbps networking at home, and it's not even that expensive. It's not in my laptop (for some reason adapters max out at 10Gbps), but I certainly can use it otherwise.

CXL 2: Pooling, sharing, and I/O-memory resources

Posted May 21, 2022 8:10 UTC (Sat) by zdzichu (subscriber, #17118) [Link]

It always amuses me that hardware companies develop and ship products before getting kernel support ready. Who is going to use their products?
Or HW vendors designing CPU features that conflict with the way Linux operates (like Intel's memory tags). It's like they sit in a cave, completely ignoring the software that is supposed to run on their silicon.

