This library is based on the Transformers library by Hugging Face. Simple Transformers lets you quickly train and evaluate Transformer models. Only 3 lines of code are needed to initialize a model, train it, and evaluate it. It supports Sequence Classification, Token Classification (NER), Question Answering, Language Model Fine-Tuning, Language Model Training, Language Generation, T5 Models, Seq2Seq Tasks, Multi-Modal Classification, and Conversational AI.

To use W&B for visualizing model training, set a project name for W&B in the wandb_project attribute of the args dictionary. This logs all hyperparameter values, training losses, and evaluation metrics to the given project.
Additional arguments for wandb.init() can be passed as wandb_kwargs.
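As a minimal sketch of the workflow above: the three lines (initialize, train, evaluate) with W&B logging enabled through the args dictionary. The project name, run name, and DataFrame variables here are hypothetical; this assumes simpletransformers is installed and that train_df and eval_df are pandas DataFrames in the format the classification task expects.

```python
# Hypothetical W&B settings passed via the args dictionary.
model_args = {
    "wandb_project": "simpletransformers-demo",  # project name (assumption: any W&B project name)
    "wandb_kwargs": {"name": "bert-base-run"},   # extra arguments forwarded to wandb.init()
}

def train_and_eval(train_df, eval_df):
    # Import inside the function so the sketch only requires the
    # library when it is actually run.
    from simpletransformers.classification import ClassificationModel

    # The three lines: initialize a model, train it, evaluate it.
    model = ClassificationModel("bert", "bert-base-cased", args=model_args)
    model.train_model(train_df)
    return model.eval_model(eval_df)
```

Hyperparameters, training losses, and evaluation metrics are then logged to the named W&B project automatically.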
Structure
The library is designed to have a separate class for every NLP task. The classes that provide similar functionality are grouped together.

- `simpletransformers.classification` - Includes all Classification models.
  - `ClassificationModel`
  - `MultiLabelClassificationModel`
- `simpletransformers.ner` - Includes all Named Entity Recognition models.
  - `NERModel`
- `simpletransformers.question_answering` - Includes all Question Answering models.
  - `QuestionAnsweringModel`
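The one-class-per-task layout above can be sketched as a small dispatcher. The `build_model` helper and its `task` keys are hypothetical (not part of the library); the module paths and class names are the ones listed above.

```python
def build_model(task, model_type, model_name, **kwargs):
    """Hypothetical helper: map a task name to its Simple Transformers class.

    The imports mirror the module structure described above; they are done
    lazily so this sketch can be defined without the library installed.
    """
    if task == "classification":
        from simpletransformers.classification import ClassificationModel as Model
    elif task == "ner":
        from simpletransformers.ner import NERModel as Model
    elif task == "question_answering":
        from simpletransformers.question_answering import QuestionAnsweringModel as Model
    else:
        raise ValueError(f"unknown task: {task}")
    # All task classes share the (model_type, model_name) constructor pattern.
    return Model(model_type, model_name, **kwargs)
```

Because every task class follows the same `(model_type, model_name)` constructor pattern, switching tasks mostly means switching which class you import.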