Prepare readme and docs for release
calpt committed Nov 14, 2023
1 parent 6ae327a commit 6ec40c8
Showing 4 changed files with 23 additions and 40 deletions.
34 changes: 1 addition & 33 deletions README.md
@@ -14,39 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
-->

## `adapters` Library

This branch contains the development version of `adapters`, the next-generation library for parameter-efficient and modular transfer learning.

> **Note**: For the stable version of `adapter-transformers`, please switch to the [master branch of the repo](https://github.com/adapter-hub/adapter-transformers).

### Changes compared to `adapter-transformers`

- `adapters` is a standalone package, using `transformers` as an external dependency but not patching it directly
- All adapter-related classes are now imported via the `adapters` namespace, e.g.:
```python
from adapters import BertAdapterModel
# ...
```
- Built-in HF model classes can be adapted for use with adapters via a wrapper method, e.g.:
```python
import adapters
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")
adapters.init(model)
```
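  After wrapping, the model supports the usual adapter methods. A minimal sketch, assuming the standard `adapters` API (`add_adapter`, `train_adapter`); the adapter name `example_adapter` is illustrative, and running this downloads weights from the Hugging Face Hub:
  ```python
  # Sketch: wrap a plain HF model, then add and train an adapter.
  # Assumes `adapters` and `transformers` are installed; the adapter
  # name "example_adapter" is purely illustrative.
  import adapters
  from transformers import BertModel

  model = BertModel.from_pretrained("bert-base-uncased")
  adapters.init(model)  # add adapter support to the HF model in place

  model.add_adapter("example_adapter")    # add a new bottleneck adapter
  model.train_adapter("example_adapter")  # freeze base weights, train only the adapter
  ```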

Features not (yet) working:

- Loading model + adapter checkpoints using HF classes
- Using Transformers pipelines with adapters
- Using HF language modeling classes with invertible adapters

## Documentation
To read the documentation of _Adapters_, follow the steps in [docs/README.md](docs/README.md). Currently, the documentation is **not** yet available from https://docs.adapterhub.ml/.

---
> **Note**: This repository holds the codebase of the _Adapters_ library, which has replaced `adapter-transformers`. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.
<p align="center">
<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapter-transformers/master/adapter_docs/logo.png" />
5 changes: 5 additions & 0 deletions docs/_static/custom.css
@@ -31,3 +31,8 @@ a {
.wy-side-scroll {
padding-bottom: 1em;
}

/* override table no-wrap */
.wy-table-responsive table td, .wy-table-responsive table th {
white-space: normal;
}
6 changes: 3 additions & 3 deletions docs/conf.py
@@ -19,9 +19,9 @@

# -- Project information -----------------------------------------------------

-project = "adapters"
-copyright = "2020-2022, Adapter-Hub Team"
-author = "Adapter-Hub Team"
+project = "AdapterHub"
+copyright = "2020-2023, AdapterHub Team"
+author = "AdapterHub Team"

docs_versions = [
"adapters1.1.1",
18 changes: 14 additions & 4 deletions docs/index.rst
@@ -6,19 +6,29 @@
AdapterHub Documentation
================================================

.. note::
   This documentation is based on the new *Adapters* library.

   The documentation based on the legacy *adapter-transformers* library can be found at: `https://docs-legacy.adapterhub.ml <https://docs-legacy.adapterhub.ml>`_.

*AdapterHub* is a framework simplifying the integration, training and usage of adapters and other efficient fine-tuning methods for Transformer-based language models.
For a full list of currently implemented methods, see the `table in our repository <https://github.com/adapter-hub/adapters#implemented-methods>`_.

The framework consists of two main components:

-- ``adapters``, an extension of Hugging Face's `Transformers <https://huggingface.co/transformers/>`_ library that adds adapter components to transformer models
+.. list-table::
+   :widths: 50 50
+   :header-rows: 1
+
-- `The Hub <https://adapterhub.ml>`_, a central repository collecting pre-trained adapter modules
+   * - `Adapters <https://github.com/adapter-hub/adapters>`_
+     - `AdapterHub.ml <https://adapterhub.ml/explore>`_
+   * - an add-on to Hugging Face's `Transformers <https://huggingface.co/transformers/>`_ library that adds adapters into transformer models
+     - a central collection of pre-trained adapter modules

Currently, we support the PyTorch versions of all models as listed on the `Model Overview <model_overview.html>`_ page.

.. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
   :caption: Getting Started

   installation
@@ -79,7 +89,7 @@ Currently, we support the PyTorch versions of all models as listed on the `Model
classes/models/xmod

.. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
   :caption: Adapter-Related Classes

   classes/adapter_config
