- GPT-2 and BERT Pretrained Weights (PyTorch)
You can find weights for BERT and GPT-2 models (PyTorch), ready to use with HuggingFace’s Transformers or in your own models:
https://www.kaggle.com/abhishek/bert-pytorch
https://www.kaggle.com/abhishek/gpt2-pytorch
- nlpaug: a library for NLP data augmentation
- One of the best collections of papers about BERT
https://github.com/thunlp/PLMpapers
- TensorFlow 2.0 Question Answering
- Permalink: http://sunyancn.github.io/post/24791.html
- Copyright notice: Unless otherwise stated, all posts on this blog are licensed under the BY-NC-SA license. Please cite the source when reposting!