paper
arXiv cs.AI
November 18th, 2025 at 5:00 AM

Hybrid Quantum Transformer for Language Generation

arXiv:2511.10653v1 (cross-list)

Abstract: Although quantum computing is increasingly explored as a replacement for classical computation, most existing quantum or hybrid models remain confined to simple tasks, and none has yet been successfully applied to large-scale natural language generation. In this work, we present HyQuT, the first hybrid quantum-classical large language model (LLM) for natural language generation, capable of coherent and context-aware dialogue. The proposed architecture integrates variational quantum circuits (VQCs) into the Transformer framework at both the 8M- and 150M-parameter scales. Experimental results show that a minimal quantum resource budget (10 qubits with 80 quantum gates) can replace about 10% of the classical parameters in the 150M-parameter model while achieving comparable convergence stability and generation quality. This study provides an early demonstration of the feasibility of integrating quantum computing into large-scale generative language models.
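To make the idea concrete, below is a minimal sketch of one way a VQC could be embedded in a Transformer block, assuming PennyLane with a PyTorch interface. The paper's exact circuit layout, gate count per layer, and insertion point are not given in the abstract, so the choice here (replacing the feed-forward sublayer, angle encoding, a two-layer entangler ansatz, and the down/up projections) is hypothetical; only the 10-qubit register size comes from the abstract.

```python
# Hypothetical hybrid Transformer block in the spirit of HyQuT.
# Assumptions: PennyLane + PyTorch; the VQC stands in for the
# feed-forward sublayer. Circuit details are illustrative only.
import torch
import torch.nn as nn
import pennylane as qml

N_QUBITS = 10   # qubit count reported in the abstract
N_LAYERS = 2    # hypothetical ansatz depth

dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev, interface="torch")
def vqc(inputs, weights):
    # Angle-encode a 10-dim feature vector onto the qubit register.
    qml.AngleEmbedding(inputs, wires=range(N_QUBITS))
    # Trainable rotation + CNOT-ring entangling layers.
    qml.BasicEntanglerLayers(weights, wires=range(N_QUBITS))
    # One Pauli-Z expectation value per qubit as the layer output.
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

class HybridBlock(nn.Module):
    """Transformer block whose feed-forward sublayer is a VQC (hypothetical)."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        # Project down to the qubit register width and back up.
        self.down = nn.Linear(d_model, N_QUBITS)
        self.up = nn.Linear(N_QUBITS, d_model)
        weight_shapes = {"weights": (N_LAYERS, N_QUBITS)}
        self.vqc = qml.qnn.TorchLayer(vqc, weight_shapes)

    def forward(self, x):
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + a)
        b, t, d = x.shape
        # tanh keeps encoding angles bounded; flatten tokens into a batch.
        q = self.vqc(torch.tanh(self.down(x)).reshape(b * t, N_QUBITS))
        q = self.up(q.reshape(b, t, N_QUBITS).float())
        return self.norm2(x + q)
```

On a statevector simulator this runs end to end (e.g. `HybridBlock(128, 4)(torch.randn(2, 16, 128))`), though far more slowly than a classical block; the down/up projections are what let a 10-qubit circuit sit inside a model whose hidden width is much larger, which is presumably how a handful of qubits can stand in for a meaningful fraction of the classical parameters.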

#ai
#llm
#research

Score: 2.80

Engagement proxy: 0

Canonical link: https://arxiv.org/abs/2511.10653