Natural language understanding comprises a wide range of diverse tasks such as textual entailment, question answering, semantic similarity assessment, and document classification. ("Improving Language Understanding by Generative Pre-Training", OpenAI, 2018.)

[Figure: train a deep (12-layer) Transformer language model, then fine-tune it on a classification task.]
Status: Archive (code is provided as-is, no updates expected)
Code and model for the paper 'Improving Language Understanding by Generative Pre-Training'
Currently this code implements the ROCStories Cloze Test result reported in the paper by running:
```
python train.py --dataset rocstories --desc rocstories --submit --analysis --data_dir [path to data here]
```
Note: The code is currently non-deterministic due to various GPU ops. The median accuracy of 10 runs with this codebase (using default hyperparameters) is 85.8% - slightly lower than the reported single run of 86.5% from the paper.
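If you want to summarize your own repeated runs the same way, collect the final accuracy from each run and take the median. A minimal sketch, assuming you have saved one accuracy per line to a `runs.txt` file (both the file name and its format are placeholders, not something the repo produces):

```python
# Summarize repeated runs the way the note above does: median accuracy.
# "runs.txt" is a hypothetical file with one accuracy per line, e.g. "0.858".
from statistics import median

with open("runs.txt") as f:
    accuracies = [float(line) for line in f if line.strip()]

print(f"median accuracy over {len(accuracies)} runs: {median(accuracies):.1%}")
```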
The ROCStories dataset can be downloaded from the associated website.
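Once downloaded, the Story Cloze Test splits are plain CSV files. A minimal sketch of reading one example, assuming the Spring 2016 column names (`InputSentence1`-`InputSentence4`, `RandomFifthSentenceQuiz1`/`2`, `AnswerRightEnding`); the file name below is a placeholder for whichever split you downloaded:

```python
# Read the first Story Cloze example: a four-sentence context, two candidate
# endings, and a 1-or-2 label marking the right ending.
import csv

with open("cloze_test_val.csv", newline="") as f:
    row = next(csv.DictReader(f))

context = " ".join(row[f"InputSentence{i}"] for i in range(1, 5))
endings = [row["RandomFifthSentenceQuiz1"], row["RandomFifthSentenceQuiz2"]]
label = int(row["AnswerRightEnding"]) - 1  # CSV labels are 1/2; index as 0/1
print(context, "->", endings[label])
```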