

finetuning-scheduler-2.0.2



Description

A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.
| Field | Value |
| --- | --- |
| Operating system | - |
| File name | finetuning-scheduler-2.0.2 |
| Name | finetuning-scheduler |
| Version | 2.0.2 |
| Maintainer | [] |
| Maintainer email | [] |
| Author | Dan Dale |
| Author email | danny.dale@gmail.com |
| Home page | https://github.com/speediedan/finetuning-scheduler |
| Project URL | https://pypi.org/project/finetuning-scheduler/ |
| License | Apache-2.0 |
<div align="center"> <img src="https://github.com/speediedan/finetuning-scheduler/raw/v2.0.2/docs/source/_static/images/logos/logo_fts.png" width="401px"> **A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.** ______________________________________________________________________ <p align="center"> <a href="https://finetuning-scheduler.readthedocs.io/en/stable/">Docs</a> • <a href="#Setup">Setup</a> • <a href="#examples">Examples</a> • <a href="#community">Community</a> </p> [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/finetuning-scheduler)](https://pypi.org/project/finetuning-scheduler/) [![PyPI Status](https://badge.fury.io/py/finetuning-scheduler.svg)](https://badge.fury.io/py/finetuning-scheduler)\ [![codecov](https://codecov.io/gh/speediedan/finetuning-scheduler/release/2.0.2/graph/badge.svg?flag=gpu)](https://codecov.io/gh/speediedan/finetuning-scheduler) [![ReadTheDocs](https://readthedocs.org/projects/finetuning-scheduler/badge/?version=latest)](https://finetuning-scheduler.readthedocs.io/en/stable/) [![DOI](https://zenodo.org/badge/455666112.svg)](https://zenodo.org/badge/latestdoi/455666112) [![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/speediedan/finetuning-scheduler/blob/master/LICENSE) </div> ______________________________________________________________________ <img width="300px" src="https://github.com/speediedan/finetuning-scheduler/raw/v2.0.2/docs/source/_static/images/fts/fts_explicit_loss_anim.gif" alt="FinetuningScheduler explicit loss animation" align="right"/> [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) is simple to use yet powerful, offering a number of features that facilitate model research and exploration: - easy specification of flexible fine-tuning schedules with explicit or regex-based parameter selection - implicit schedules for initial/naive model exploration - explicit schedules for performance tuning, fine-grained behavioral experimentation and computational efficiency - automatic restoration of best per-phase checkpoints driven by iterative application of early-stopping criteria to each fine-tuning phase - composition of early-stopping and manually-set epoch-driven fine-tuning phase transitions ______________________________________________________________________ ## Setup ### Step 0: Install from PyPI ```bash pip install finetuning-scheduler ``` <!-- --> ### Step 1: Import the FinetuningScheduler callback and start fine-tuning! ```python import lightning as L from finetuning_scheduler import FinetuningScheduler trainer = L.Trainer(callbacks=[FinetuningScheduler()]) ``` Get started by following [the Fine-Tuning Scheduler introduction](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) which includes a [CLI-based example](https://finetuning-scheduler.readthedocs.io/en/stable/index.html#example-scheduled-fine-tuning-for-superglue) or by following the [notebook-based](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html) Fine-Tuning Scheduler tutorial. 
______________________________________________________________________

### Installation Using the Standalone `pytorch-lightning` Package

*applicable to versions >= `2.0.0`*

Now that the core Lightning package is `lightning` rather than `pytorch-lightning`, Fine-Tuning Scheduler (FTS) by default depends upon the `lightning` package rather than the standalone `pytorch-lightning`. If you would like to continue to use FTS with the standalone `pytorch-lightning` package instead, you can still do so as follows.

Install a given FTS release (for example v2.0.0) using standalone `pytorch-lightning`:

```bash
export FTS_VERSION=2.0.0
export PACKAGE_NAME=pytorch
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
pip install finetuning-scheduler-${FTS_VERSION}.tar.gz
```

______________________________________________________________________

## Examples

### Scheduled Fine-Tuning For SuperGLUE

- [Notebook-based Tutorial](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html)
- [CLI-based Tutorial](https://finetuning-scheduler.readthedocs.io/en/stable/#example-scheduled-fine-tuning-for-superglue)
- [FSDP Scheduled Fine-Tuning](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/fsdp_scheduled_fine_tuning.html)
- [LR Scheduler Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/lr_scheduler_reinitialization.html) (advanced)
- [Optimizer Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/optimizer_reinitialization.html) (advanced)

______________________________________________________________________

## Continuous Integration

Fine-Tuning Scheduler is rigorously tested across multiple CPUs, GPUs and against major Python and PyTorch versions. Each Fine-Tuning Scheduler minor release (major.minor.patch) is paired with a Lightning minor release (e.g. Fine-Tuning Scheduler 2.0 depends upon Lightning 2.0).

To ensure maximum stability, the latest Lightning patch release fully tested with Fine-Tuning Scheduler is set as a maximum dependency in Fine-Tuning Scheduler's requirements.txt (e.g. \<= 1.7.1). If you'd like to test a specific Lightning patch version greater than the one currently pinned in Fine-Tuning Scheduler's requirements.txt, it will likely work, but you should install Fine-Tuning Scheduler from source and update the requirements.txt as desired.
<details>
  <summary>Current build statuses for Fine-Tuning Scheduler</summary>

| System / (PyTorch/Python ver) | 1.11/3.8 | 2.0.0/3.8, 2.0.0/3.10 |
| :---------------------------: | :------: | :-------------------: |
| Linux \[GPUs\*\*\] | - | [![Build Status](https://dev.azure.com//speediedan/finetuning-scheduler/_apis/build/status/Multi-GPU%20&%20Example%20Tests?branchName=refs%2Ftags%2F2.0.2)](https://dev.azure.com/speediedan/finetuning-scheduler/_build/latest?definitionId=2&branchName=refs%2Ftags%2F2.0.2) |
| Linux (Ubuntu 20.04) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |
| OSX (11) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |
| Windows (2022) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [![Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml/badge.svg?tag=2.0.2)](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |

- \*\* tests run on two RTX 2070s

</details>

## Community

Fine-Tuning Scheduler is developed and maintained by the community in close communication with the [Lightning team](https://pytorch-lightning.readthedocs.io/en/stable/governance.html). Thanks to everyone in the community for their tireless effort building and improving the immensely useful core Lightning project.

PR's welcome! Please see the [contributing guidelines](https://finetuning-scheduler.readthedocs.io/en/stable/generated/CONTRIBUTING.html) (which are essentially the same as Lightning's).

______________________________________________________________________

## Citing Fine-Tuning Scheduler

Please cite:

```tex
@misc{Dan_Dale_2022_6463952,
  author    = {Dan Dale},
  title     = {{Fine-Tuning Scheduler}},
  month     = Feb,
  year      = 2022,
  doi       = {10.5281/zenodo.6463952},
  publisher = {Zenodo},
  url       = {https://zenodo.org/record/6463952}
}
```

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!


Requirements

| Name | Version |
| --- | --- |
| lightning | >=2.0.0, <2.0.2 |
| torch | >=1.11.0 |
| rich | >=10.14.0, !=10.15.0.a |
| jsonargparse[signatures] | >=4.15.2, !=4.18.0, !=4.19.0, !=4.20.0 |
| omegaconf | >=2.1.0 |
| hydra-core | >=1.1.0 |
| coverage | >=6.4 |
| codecov | >=2.1 |
| pytest | >=6.0 |
| pytest-rerunfailures | >=10.2 |
| twine | ==3.2 |
| mypy | >=0.920 |
| flake8 | >=3.9.2 |
| pre-commit | >=1.0 |
| protobuf | <=3.20.1 |
| ipython[notebook] | - |
| jupytext | >=1.10 |
| nbval | >=0.9.6 |
| datasets | - |
| evaluate | - |
| transformers | >=4.18.0 |
| scikit-learn | - |
| sentencepiece | - |
| tensorboardX | >=2.2 |


Required language

| Name | Version |
| --- | --- |
| Python | >=3.8 |


How to install


Install the finetuning-scheduler-2.0.2 whl package:

    pip install finetuning-scheduler-2.0.2.whl


Install the finetuning-scheduler-2.0.2 tar.gz package:

    pip install finetuning-scheduler-2.0.2.tar.gz
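
After installation, a quick way to confirm the package is available in your environment is to query its distribution metadata. This is a minimal sketch; it only checks the installed version and says nothing about GPU, PyTorch, or Lightning compatibility.

```python
# Minimal post-install check: read the installed version via importlib.metadata,
# which is in the standard library for Python >= 3.8 (the version this package requires).
from importlib.metadata import version

print(version("finetuning-scheduler"))  # expected to print 2.0.2 for this release
```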