adapter-transformers-3.1.0



Description

A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models

Field  Value
Operating system  -
File name  adapter-transformers-3.1.0
Package name  adapter-transformers
Version  3.1.0
Maintainer  []
Maintainer email  []
Author  Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, Hannah Sterz, based on work by the HuggingFace team and community
Author email  pfeiffer@ukp.tu-darmstadt.de
Homepage  https://github.com/adapter-hub/adapter-transformers
PyPI URL  https://pypi.org/project/adapter-transformers/
License  Apache

<!---
Copyright 2020 The AdapterHub Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

<p align="center">
<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapter-transformers/master/adapter_docs/logo.png" />
</p>
<h1 align="center">
<span>adapter-transformers</span>
</h1>
<h3 align="center">
A friendly fork of HuggingFace's <i>Transformers</i>, adding Adapters to PyTorch language models
</h3>

![Tests](https://github.com/Adapter-Hub/adapter-transformers/workflows/Tests/badge.svg)
[![GitHub](https://img.shields.io/github/license/adapter-hub/adapter-transformers.svg?color=blue)](https://github.com/adapter-hub/adapter-transformers/blob/master/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/adapter-transformers)](https://pypi.org/project/adapter-transformers/)

`adapter-transformers` is an extension of [HuggingFace's Transformers](https://github.com/huggingface/transformers) library, integrating adapters into state-of-the-art language models by incorporating **[AdapterHub](https://adapterhub.ml)**, a central repository for pre-trained adapter modules.

_💡 Important: This library can be used as a drop-in replacement for HuggingFace Transformers and regularly synchronizes new upstream changes. Thus, most files in this repository are direct copies from the HuggingFace Transformers source, modified only with changes required for the adapter implementations._

## Installation

`adapter-transformers` currently supports **Python 3.7+** and **PyTorch 1.3.1+**. After [installing PyTorch](https://pytorch.org/get-started/locally/), you can install `adapter-transformers` from PyPI ...

```
pip install -U adapter-transformers
```

... or from source by cloning the repository:

```
git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .
```

## Getting Started

HuggingFace's great documentation on getting started with _Transformers_ can be found [here](https://huggingface.co/transformers/index.html). `adapter-transformers` is fully compatible with _Transformers_.

To get started with adapters, refer to these locations:

- **[Colab notebook tutorials](https://github.com/Adapter-Hub/adapter-transformers/tree/master/notebooks)**, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
- **https://docs.adapterhub.ml**, our documentation on training and using adapters with _adapter-transformers_
- **https://adapterhub.ml** to explore available pre-trained adapter modules and share your own adapters
- **[Examples folder](https://github.com/Adapter-Hub/adapter-transformers/tree/master/examples/pytorch)** of this repository containing HuggingFace's example training scripts, many adapted for training adapters

## Implemented Methods

Currently, adapter-transformers integrates all architectures and methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | [Houlsby et al. (2019)](https://arxiv.org/pdf/1902.00751.pdf)<br> [Bapna and Firat (2019)](https://arxiv.org/pdf/1909.08478.pdf) | [Quickstart](https://docs.adapterhub.ml/quickstart.html), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/01_Adapter_Training.ipynb) |
| AdapterFusion | [Pfeiffer et al. (2021)](https://aclanthology.org/2021.eacl-main.39.pdf) | [Docs: Training](https://docs.adapterhub.ml/training.html#train-adapterfusion), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/03_Adapter_Fusion.ipynb) |
| MAD-X,<br> Invertible adapters | [Pfeiffer et al. (2020)](https://aclanthology.org/2020.emnlp-main.617/) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/04_Cross_Lingual_Transfer.ipynb) |
| AdapterDrop | [Rücklé et al. (2021)](https://arxiv.org/pdf/2010.11918.pdf) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/05_Adapter_Drop_Training.ipynb) |
| MAD-X 2.0,<br> Embedding training | [Pfeiffer et al. (2021)](https://arxiv.org/pdf/2012.15562.pdf) | [Docs: Embeddings](https://docs.adapterhub.ml/embeddings.html), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/08_NER_Wikiann.ipynb) |
| Prefix Tuning | [Li and Liang (2021)](https://arxiv.org/pdf/2101.00190.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#prefix-tuning) |
| Parallel adapters,<br> Mix-and-Match adapters | [He et al. (2021)](https://arxiv.org/pdf/2110.04366.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#mix-and-match-adapters) |
| Compacter | [Mahabadi et al. (2021)](https://arxiv.org/pdf/2106.04647.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#compacter) |
| LoRA | [Hu et al. (2021)](https://arxiv.org/pdf/2106.09685.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#lora) |
| (IA)^3 | [Liu et al. (2022)](https://arxiv.org/pdf/2205.05638.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#ia-3) |
| UniPELT | [Mao et al. (2022)](https://arxiv.org/pdf/2110.07577.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#unipelt) |

## Supported Models

We currently support the PyTorch versions of all models listed on the **[Model Overview](https://docs.adapterhub.ml/model_overview.html) page** in our documentation.

## Citation

If you use this library for your work, please consider citing our paper [AdapterHub: A Framework for Adapting Transformers](https://arxiv.org/abs/2007.07779):

```
@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and R{\"u}ckl{\'e}, Andreas and Poth, Clifton and Kamath, Aishwarya and Vuli{\'c}, Ivan and Ruder, Sebastian and Cho, Kyunghyun and Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}
```
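
To complement the Getting Started pointers above, here is a minimal sketch of the basic adapter workflow. It assumes the `AutoAdapterModel` API described in the AdapterHub documentation (`add_adapter`, `add_classification_head`, `train_adapter`, `set_active_adapters`); the adapter/head name `"sst-2-demo"` and the base checkpoint are illustrative placeholders, not part of this package's metadata.

```python
# Minimal adapter workflow sketch (API names per https://docs.adapterhub.ml;
# "sst-2-demo" and the base checkpoint are placeholders).
import torch
from transformers import AutoAdapterModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Add a new bottleneck adapter and a matching classification head.
model.add_adapter("sst-2-demo", config="pfeiffer")
model.add_classification_head("sst-2-demo", num_labels=2)

# Freeze the base model so only the adapter (and head) weights are trainable,
# then activate the adapter for forward passes.
model.train_adapter("sst-2-demo")
model.set_active_adapters("sst-2-demo")

inputs = tokenizer("Adapters are lightweight to train.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2]) from the untrained demo head
```

The same model object can hold several adapters; only the ones passed to `set_active_adapters` take part in the forward pass.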
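
Several of the methods in the table above (LoRA, prefix tuning, Compacter, and so on) are selected purely through the configuration object passed to `add_adapter`. A hedged sketch, assuming the configuration classes exposed under `transformers.adapters` in this release match the AdapterHub docs; the adapter names and hyperparameter values are placeholders:

```python
# Sketch: picking an adapter method via its config class (class and parameter
# names assumed from the AdapterHub docs; names/hyperparameters are placeholders).
from transformers import AutoAdapterModel
from transformers.adapters import LoRAConfig, PrefixTuningConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# LoRA (Hu et al., 2021): low-rank weight updates inside attention layers.
model.add_adapter("lora-demo", config=LoRAConfig(r=8, alpha=16))

# Prefix tuning (Li and Liang, 2021): trainable prefix vectors in each layer.
model.add_adapter("prefix-demo", config=PrefixTuningConfig(prefix_length=30))

# Train one method at a time; train_adapter freezes everything else.
model.train_adapter("lora-demo")
```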


Requirements

Name  Version
filelock  -
huggingface-hub  <1.0,>=0.1.0
numpy  >=1.17
packaging  >=20.0
pyyaml  >=5.1
regex  !=2019.12.17
requests  -
tokenizers  !=0.11.3,<0.13,>=0.11.1
tqdm  >=4.27
importlib-metadata  -
accelerate  >=0.10.0
tensorflow  >=2.3
onnxconverter-common  -
tf2onnx  -
tensorflow-text  -
torch  <1.12,>=1.0
sentencepiece  !=0.1.92,>=0.1.91
protobuf  <=3.20.1
torchaudio  -
librosa  -
pyctcdecode  >=0.3.0
phonemizer  -
resampy  <0.3.1
Pillow  -
optuna  -
ray[tune]  -
sigopt  -
timm  -
codecarbon  ==1.2.0
deepspeed  >=0.6.5
pytest  -
pytest-xdist  -
timeout-decorator  -
parameterized  -
psutil  -
datasets  -
dill  <0.3.5
pytest-timeout  -
black  ==22.3
sacrebleu  <2.0.0,>=1.4.12
rouge-score  -
nltk  -
GitPython  <3.1.19
hf-doc-builder  >=0.3.0
sacremoses  -
rjieba  -
cookiecutter  ==1.7.3
isort  >=5.5.4
flake8  >=3.8.3
fugashi  >=1.0
ipadic  <2.0,>=1.0.0
unidic-lite  >=1.0.7
unidic  >=1.0.2
docutils  ==0.16.0
myst-parser  -
sphinx  ==3.2.1
sphinx-markdown-tables  -
sphinx-rtd-theme  ==0.4.3
sphinx-copybutton  -
sphinxext-opengraph  ==0.4.1
sphinx-intl  -
sphinx-multiversion  -
scikit-learn  -
onnxruntime  >=1.4.0
onnxruntime-tools  >=1.4.2
fairscale  >0.3
ftfy  -
sagemaker  >=2.31.0
pydantic  -
uvicorn  -
fastapi  -
starlette  -
tensorflow-cpu  >=2.3


Required Python

Name  Version
Python  >=3.7.0


Installation


Installing the adapter-transformers-3.1.0 whl package:

    pip install adapter-transformers-3.1.0.whl


Installing the adapter-transformers-3.1.0 tar.gz package:

    pip install adapter-transformers-3.1.0.tar.gz
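
Either command installs the package under the `transformers` import name, since adapter-transformers is a drop-in replacement for HuggingFace Transformers. A quick hedged sanity check after installation (the `__version__` attribute on the `adapters` subpackage is an assumption about this release, hence the `getattr` fallback):

    # Post-install sanity check: the adapter extension ships as the
    # "transformers" package with an extra "adapters" subpackage.
    import transformers
    import transformers.adapters as adapters

    print("transformers version:", transformers.__version__)
    print("adapters loaded from:", adapters.__file__)
    print("adapter-transformers version:", getattr(adapters, "__version__", "unknown"))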