A real-world optical chip that you can actually buy is exciting. Still, it seems far from a consumer-grade optical CPU. It’s more like a microcontroller that you stick at the end of your 10 Gbit fiber-optic cable to receive processed optical data.
Memory is going to be a big problem, because any AI workload requires a ton of it. Replacing even a modest 16 GB DRAM chip with an all-optical equivalent means essentially building 16 GB of L1-cache-class memory, which would be like stacking roughly 100 server CPUs together and using them only for their cache. And if you fall back to conventional DRAM instead, you have to introduce an optical-to-electrical converter, which becomes the speed bottleneck of the system, and probably an expensive one.
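To make the "roughly 100 server CPUs" figure concrete, here's the back-of-envelope arithmetic as a quick Python sketch. The ~160 MB of on-die cache per server CPU is my own assumption (a plausible combined L2+L3 figure for a large server part), not a number from any specific datasheet:

```python
# Back-of-envelope only; cache size per CPU is an assumed figure.
DRAM_TARGET_BYTES = 16 * 2**30        # 16 GiB of DRAM to replace
CACHE_PER_SERVER_CPU = 160 * 2**20    # ~160 MiB on-die cache, assumed

cpus_needed = DRAM_TARGET_BYTES / CACHE_PER_SERVER_CPU
print(f"~{cpus_needed:.0f} server CPUs' worth of cache")  # -> ~102
```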