License: Apache-2.0
Development language: Python
Operating system: Cross-platform
Software type: Open source software
Open source organization: None
Region: Unknown
Submitted by: h4cd
Intended audience: Unknown
Recorded: 2019-10-09

Software Introduction

Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) is a natural language processing framework for TensorFlow 2.0 and PyTorch. It provides state-of-the-art general-purpose architectures for natural language understanding (NLU) and natural language generation (NLG), including BERT, GPT-2, RoBERTa, XLM, DistilBERT and XLNet, with more than 32 pretrained models in over 100 languages, and deep interoperability between TensorFlow 2.0 and PyTorch.
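As a minimal sketch of what loading one of these pretrained models looks like, assuming the `transformers` package and PyTorch are installed (the `bert-base-uncased` checkpoint name is one of the library's published pretrained models):

```python
from transformers import AutoModel, AutoTokenizer

# Download the pretrained tokenizer and model weights (cached locally after first use).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through BERT to get hidden states.
inputs = tokenizer("Transformers supports TensorFlow 2.0 and PyTorch.", return_tensors="pt")
outputs = model(**inputs)

# The first output is the sequence of hidden states, one 768-dim vector per token.
hidden_states = outputs[0]
```

The `Auto*` classes pick the right architecture (BERT, GPT-2, RoBERTa, etc.) from the checkpoint name, so the same three loading lines work across all the supported models.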

Features:

  • As easy to use as pytorch-transformers
  • As powerful and concise as Keras
  • High performance on NLU and NLG tasks
  • Low barrier to entry for educators and practitioners
  • Low compute cost: researchers can share trained models instead of retraining from scratch. Eight built-in architectures with more than 30 pretrained models, some covering over 100 languages
  • State-of-the-art models can be used in as few as three lines of code
  • Deep interoperability between TensorFlow 2.0 and PyTorch models lets you move a single model between the two frameworks at will
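The "three lines of code" claim maps onto the library's `pipeline` API (available in current releases of `transformers`; the sentiment-analysis task shown here downloads a default pretrained model on first use):

```python
from transformers import pipeline

# One line to build a ready-to-use sentiment classifier from a pretrained model.
classifier = pipeline("sentiment-analysis")

# One line to run inference; returns a list of {"label": ..., "score": ...} dicts.
result = classifier("Transformers makes state-of-the-art NLP easy to use.")
```

The same pattern works for other tasks such as `"question-answering"` and `"text-generation"`, with the pipeline handling tokenization, model inference, and post-processing internally.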
