2020¶
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
- Billion-scale Commodity Embedding for E-commerce Recommendation in Alibaba
- Deep Neural Networks for YouTube Recommendations
- Unified Language Model Pre-training for Natural Language Understanding and Generation