BERT pre-training from scratch with TensorFlow 2.x

Issue

I previously used the run_pretraining.py script (https://github.com/google-research/bert/blob/master/run_pretraining.py) with TensorFlow 1.15.5, running on a Google Cloud TPU.
Is there a Python script for BERT pre-training from scratch on a TPU using TensorFlow 2.x?

Solution

Yes, you can use the NLP library from the TensorFlow 2 Model Garden.
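The library lives in the tensorflow/models repository under official/nlp. A minimal setup sketch, assuming a fresh Python environment (tf-models-official is the pinned PyPI package; installing from source is the alternative if you want to run the repo scripts directly):

pip install tf-models-official

# or, to run the scripts from a source checkout:
git clone https://github.com/tensorflow/models.git
pip install -r models/official/requirements.txt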

The instructions for creating training data and running pre-training are in the Model Garden docs:
nlp/docs/train.md#pre-train-a-bert-from-scratch.
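For reference, creating the masked-LM training data follows the same pattern as the original BERT repo's create_pretraining_data.py. A sketch of the invocation (the file paths and vocab file are placeholders; check the linked train.md for the exact flags in your Model Garden version):

python3 official/nlp/data/create_pretraining_data.py \
  --input_file=./corpus.txt \
  --output_file=./pretrain.tfrecord \
  --vocab_file=./vocab.txt \
  --do_lower_case=True \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --masked_lm_prob=0.15 \
  --dupe_factor=5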

You can also follow the BERT Fine-Tuning with Cloud TPU tutorial, with some changes so that it runs the pre-training script instead of fine-tuning.
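In TF2, pre-training is launched through the Model Garden's common trainer rather than run_pretraining.py. A hedged sketch of a TPU run (the experiment name, TPU address, bucket paths, and override keys here are illustrative assumptions; the train.md doc above has the authoritative flags):

python3 official/nlp/train.py \
  --experiment=bert/pretraining \
  --mode=train \
  --model_dir=gs://your-bucket/bert-pretrain \
  --tpu=your-tpu-name \
  --params_override="task.train_data.input_path=gs://your-bucket/pretrain.tfrecord*"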

Answered By – Gagik

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
