BERT pre-training from scratch with TensorFlow 2.x


I previously used a Python script with TensorFlow 1.15.5, running on a Google Cloud TPU.
Is there a Python script for BERT pre-training from scratch on a TPU using TensorFlow 2.x?


Yes, you can use the NLP library from the TF2 Model Garden.

The instructions for creating training data and running pretraining are in the Model Garden's NLP documentation.
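To illustrate what the training-data step does, here is a minimal, self-contained sketch of BERT-style masked-LM example creation (the 15% selection with the 80/10/10 mask/random/keep split described in the BERT paper). This is not the Model Garden's own script; the function name `mask_tokens` and the `[MASK]` string are illustrative assumptions.

```python
import random

MASK_TOKEN = "[MASK]"  # assumption: WordPiece-style special token


def mask_tokens(tokens, vocab, mask_prob=0.15, rng=None):
    """Sketch of BERT masked-LM data creation.

    Each token is selected with probability `mask_prob`; a selected
    token is replaced by [MASK] 80% of the time, by a random vocab
    token 10% of the time, and left unchanged 10% of the time.
    Returns the masked sequence and (position, original_token) labels.
    """
    rng = rng or random.Random(0)
    masked = list(tokens)
    labels = []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels.append((i, tok))  # the model must predict `tok` at position i
            r = rng.random()
            if r < 0.8:
                masked[i] = MASK_TOKEN          # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)   # 10%: replace with a random token
            # else 10%: keep the original token
    return masked, labels
```

The real pipeline additionally packs sentence pairs, caps predictions per sequence, and serializes to TFRecord, but the masking logic above is the core of it.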

You can also follow the BERT Fine-Tuning with Cloud TPU tutorial, with some changes, to run the pretraining script instead of fine-tuning.
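As a rough sketch of the two steps, the invocations look something like the following. Script paths and flag names are based on the Model Garden's legacy BERT scripts (ported from the original BERT repo) and may differ in your release, so treat every path and flag here as an assumption to check against the repo you clone.

```shell
# Step 1: build TFRecord pretraining data from raw text
# (paths, vocab file, and flag names are assumptions -- verify against your checkout)
python official/nlp/data/create_pretraining_data.py \
  --input_file=./corpus.txt \
  --output_file=gs://my-bucket/pretrain.tfrecord \
  --vocab_file=./vocab.txt \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --masked_lm_prob=0.15

# Step 2: run pretraining on a Cloud TPU
# (the TPU name and bucket are placeholders)
python official/nlp/bert/run_pretraining.py \
  --input_files=gs://my-bucket/pretrain.tfrecord \
  --model_dir=gs://my-bucket/model \
  --bert_config_file=./bert_config.json \
  --distribution_strategy=tpu \
  --tpu=my-tpu-name
```

The same flow applies when adapting the fine-tuning tutorial: keep its TPU and storage-bucket setup, and swap the fine-tuning command for the pretraining one.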

Answered By – Gagik

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
