
TensorFlow 2.0 Question Answering

  1. GPT-2 and BERT Pretrained Weights (PyTorch)
    You can find weights for BERT and GPT-2 models (PyTorch), ready to be used with HuggingFace’s Transformers or your own models (a minimal loading sketch follows after this list):
    https://www.kaggle.com/abhishek/bert-pytorch
    https://www.kaggle.com/abhishek/gpt2-pytorch
  2. nlpaug, an NLP data augmentation library (a short usage sketch follows after this list)
    https://github.com/makcedward/nlpaug
  3. One of the best collections of papers about BERT
    https://github.com/thunlp/PLMpapers
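
  For item 1, here is a minimal loading sketch using HuggingFace’s Transformers (assuming a recent 4.x release); the checkpoint path below is an assumption and should point at wherever the Kaggle dataset is unpacked, or you can pass "bert-base-uncased" instead to pull the weights from the hub:

    import torch
    from transformers import BertModel, BertTokenizer

    # Assumed path: adjust to the actual folder layout of the Kaggle dataset,
    # or replace with "bert-base-uncased" to download from the hub.
    MODEL_DIR = "/kaggle/input/bert-pytorch/bert-base-uncased"

    tokenizer = BertTokenizer.from_pretrained(MODEL_DIR)
    model = BertModel.from_pretrained(MODEL_DIR)
    model.eval()

    # Encode a sample sentence and run a forward pass.
    inputs = tokenizer("TensorFlow 2.0 Question Answering", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for bert-base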
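
  For item 2, a short usage sketch of nlpaug's WordNet-based synonym augmenter (class names follow nlpaug's documented API; this particular augmenter also needs the nltk corpora installed):

    import nlpaug.augmenter.word as naw

    # Synonym replacement backed by WordNet.
    # Requires: pip install nlpaug nltk, plus nltk's 'wordnet' and
    # 'averaged_perceptron_tagger' data downloaded beforehand.
    aug = naw.SynonymAug(aug_src="wordnet")

    text = "The quick brown fox jumps over the lazy dog."
    augmented = aug.augment(text)  # returns the augmented text (a list in newer versions)
    print(augmented)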