Paris-based startup LightOn, maker of photonic co-processors for large-scale AI, today unveils PAGnol, a collection of large French language models geared towards free-form text generation, the largest of which has 1.5 billion parameters. PAG stands for pré-apprentissage génératif (generative pre-training).
As compact replacements for fully-fledged optical circuits, and as an alternative to integrated photonic circuits, this novel range of devices is a compelling contender for scalable photonic linear optical quantum computing. Starting with the interfacing of several qubits in the first generation, we plan to increase the circuit size in future iterations, aiming to rapidly demonstrate a quantum advantage. In this context, the key features of these devices are low losses, all-to-all circuit connectivity, and easy reconfigurability at any circuit size.
This year, thanks to the awesome Machine Learning team at LightOn, we have two papers accepted at NeurIPS, the flagship AI conference, and five papers at its “Beyond Backpropagation” satellite workshop, which will take place on Saturday. This is significant on many levels, not least because these papers were nurtured and spearheaded by two Ph.D. students, Ruben Ohana and Julien Launay, who are pursuing their theses as LightOn engineers.