diff --git a/README.md b/README.md
index 1b320ef50b..612f864e1c 100644
--- a/README.md
+++ b/README.md
@@ -14,39 +14,7 @@
 See the License for the specific language governing permissions and
 limitations under the License.
 -->
-## `adapters` Library
-
-This branch contains the development version of `adapters`, the next-generation library for parameter-efficient and modular transfer learning.
-
-> **Note**: For the stable version of `adapter-transformers`, please switch to the [master branch of the repo](https://github.com/adapter-hub/adapter-transformers).
-
-### Changes compared to `adapter-transformers`
-
-- `adapters` is a standalone package, using `transformers` as an external dependency but not patching it directly
-- All adapter-related classes now are imported via `adapters` namespace, e.g.:
-  ```python
-  from adapters import BertAdapterModel
-  # ...
-  ```
-- Built-in HF model classes can be adapted for usage with adapters via a wrapper method, e.g.:
-  ```python
-  import adapters
-  from transformers import BertModel
-
-  model = BertModel.from_pretrained("bert-base-uncased")
-  adapters.init(model)
-  ```
-
-Features not (yet) working:
-
-- Loading model + adapter checkpoints using HF classes
-- Using Transformers pipelines with adapters
-- Using HF language modeling classes with invertible adapters
-
-## Documentation
-To read the documentation of _Adapters_, follow the steps in [docs/README.md](docs/README.md). Currently, the documentation is **not** yet available from https://docs.adapterhub.ml/.
-
----
+> **Note**: This repository holds the codebase of the _Adapters_ library, which has replaced `adapter-transformers`. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.

diff --git a/docs/_static/custom.css b/docs/_static/custom.css
index 680ab48e88..e4ba5749a4 100644
--- a/docs/_static/custom.css
+++ b/docs/_static/custom.css
@@ -31,3 +31,8 @@ a {
 .wy-side-scroll {
     padding-bottom: 1em;
 }
+
+/* override table no-wrap */
+.wy-table-responsive table td, .wy-table-responsive table th {
+    white-space: normal;
+}
diff --git a/docs/conf.py b/docs/conf.py
index 83e1403c6c..5f73a52d99 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -19,9 +19,9 @@
 
 # -- Project information -----------------------------------------------------
 
-project = "adapters"
-copyright = "2020-2022, Adapter-Hub Team"
-author = "Adapter-Hub Team"
+project = "AdapterHub"
+copyright = "2020-2023, AdapterHub Team"
+author = "AdapterHub Team"
 
 docs_versions = [
     "adapters1.1.1",
diff --git a/docs/index.rst b/docs/index.rst
index 323913cf7b..fdddf228ec 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -6,19 +6,29 @@
 AdapterHub Documentation
 ================================================
 
+.. note::
+    This documentation is based on the new *Adapters* library.
+
+    The documentation based on the legacy *adapter-transformers* library can be found at: `https://docs-legacy.adapterhub.ml <https://docs-legacy.adapterhub.ml>`_.
+
 *AdapterHub* is a framework simplifying the integration, training and usage of adapters and other efficient fine-tuning methods for Transformer-based language models.
 For a full list of currently implemented methods, see the `table in our repository `_.
 
 The framework consists of two main components:
 
-- ``adapters``, an extension of Hugging Face's `Transformers `_ library that adds adapter components to transformer models
+.. list-table::
+    :widths: 50 50
+    :header-rows: 1
 
-- `The Hub `_, a central repository collecting pre-trained adapter modules
+    * - `Adapters `_
+      - `AdapterHub.ml `_
+    * - an add-on to Hugging Face's `Transformers `_ library that adds adapters into transformer models
+      - a central collection of pre-trained adapter modules
 
 Currently, we support the PyTorch versions of all models as listed on the `Model Overview `_ page.
 
 .. toctree::
-    :maxdepth: 2
+    :maxdepth: 1
     :caption: Getting Started
 
     installation
@@ -79,7 +89,7 @@ Currently, we support the PyTorch versions of all models as listed on the `Model
     classes/models/xmod
 
 .. toctree::
-    :maxdepth: 2
+    :maxdepth: 1
     :caption: Adapter-Related Classes
 
     classes/adapter_config