NEural contextualiZed representation for CHinese lAnguage understanding. NEZHA is a Chinese pretrained language model based on BERT, with improvements including functional relative positional encoding, whole word masking, and mixed-precision training. It achieves state-of-the-art results on several Chinese NLU tasks.
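Of the listed improvements, functional relative positional encoding means attention uses fixed sinusoidal functions of the relative distance between tokens rather than learned absolute position embeddings. A minimal sketch of how such an encoding table could be built, assuming the standard sinusoid formulation; the function name and shapes here are illustrative, not the model's actual API:

```python
import math

def relative_position_encoding(length, depth):
    """Build a table where table[i][j] is a depth-dim vector
    encoding the relative distance j - i with fixed sinusoids
    (sin on even dims, cos on odd dims)."""
    table = [[[0.0] * depth for _ in range(length)] for _ in range(length)]
    for i in range(length):
        for j in range(length):
            d = j - i  # relative distance, may be negative
            for k in range(0, depth, 2):
                angle = d / (10000 ** (k / depth))
                table[i][j][k] = math.sin(angle)
                if k + 1 < depth:
                    table[i][j][k + 1] = math.cos(angle)
    return table
```

Because the encodings are deterministic functions of distance, they add no trainable parameters and extrapolate to sequence lengths unseen during pretraining.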


NEZHA

model

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

paper

arXiv: 1909.00204

nlp · pretrained-model · open-source