Google Just Open-Sourced a Machine Learning Technique


Google has open-sourced a machine learning technique called Bidirectional Encoder Representations from Transformers (BERT), which is used for natural language processing (NLP).

With this release, anyone in the world can train their own state-of-the-art question answering system (or a variety of other models) in about 30 minutes on a single Cloud TPU, or in a few hours using a single GPU.
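To give a sense of what a BERT-based question answering system looks like in practice, here is a minimal sketch. It uses the community Hugging Face transformers package and an already fine-tuned SQuAD checkpoint, so it skips the training step entirely; note that Google's original release instead ships its own TensorFlow fine-tuning scripts in the google-research/bert repository.

    # Minimal sketch: extractive question answering with a BERT model
    # already fine-tuned on SQuAD. Assumes the community "transformers"
    # package (pip install transformers), not Google's original
    # TensorFlow release.
    from transformers import pipeline

    # Load a BERT checkpoint fine-tuned for question answering.
    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )

    context = (
        "BERT (Bidirectional Encoder Representations from Transformers) "
        "is a language representation model released by Google in 2018. "
        "It can be fine-tuned for tasks such as question answering."
    )

    result = qa(question="Who released BERT?", context=context)
    print(result["answer"], result["score"])

The pipeline returns the span of the context most likely to answer the question, along with a confidence score.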

For more technical details, Google released a paper titled "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding."

Check It Out: Google Just Open-Sourced a Machine Learning Technique
