Setting Electricity mix and providers from config file (#108)
* feat: added config and tests. Todo: update docs

* feat: added docs and updated conf handling

* chore: better with precommit

* feat: fix mypy

* feat: fix tests

* feat: fix mypy

* feat: update pre-commit

* feat: fix tests
NP4567-dev authored Feb 9, 2025
1 parent 0fb1919 commit de34f2e
Showing 11 changed files with 261 additions and 50 deletions.
8 changes: 4 additions & 4 deletions .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -1,12 +1,12 @@
repos:
- repo: https://github.com/charliermarsh/ruff-pre-commit
# Ruff version.
-rev: 'v0.0.254'
+rev: 'v0.9.5'
hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
- repo: https://github.com/pre-commit/pre-commit-hooks
-rev: v4.3.0
+rev: v5.0.0
hooks:
- id: check-merge-conflict
- id: mixed-line-ending
@@ -16,10 +16,10 @@ repos:
# - id: bandit
# exclude: tests/
- repo: https://github.com/Lucas-C/pre-commit-hooks-safety
-rev: v1.3.1
+rev: v1.4.0
hooks:
- id: python-safety-dependencies-check
- repo: https://github.com/pre-commit/mirrors-mypy
-rev: 'v1.13.0'
+rev: 'v1.15.0'
hooks:
- id: mypy
96 changes: 79 additions & 17 deletions docs/index.md
@@ -63,26 +63,88 @@ For detailed instructions on each provider, refer to the complete list of [suppo

Below is a simple example demonstrating how to use the GPT-3.5-Turbo model from OpenAI with EcoLogits to track environmental impacts.

```python
from ecologits import EcoLogits
from openai import OpenAI

# Initialize EcoLogits
EcoLogits.init()

client = OpenAI(api_key="<OPENAI_API_KEY>")

response = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "user", "content": "Tell me a funny joke!"}
=== "Default init with python"
```python
from ecologits import EcoLogits
from openai import OpenAI

# Initialize EcoLogits
EcoLogits.init()

client = OpenAI(api_key="<OPENAI_API_KEY>")

response = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "user", "content": "Tell me a funny joke!"}
]
)

# Get estimated environmental impacts of the inference
print(f"Energy consumption: {response.impacts.energy.value} kWh")
print(f"GHG emissions: {response.impacts.gwp.value} kgCO2eq")
```

=== "Parametrized init"

```python
from ecologits import EcoLogits
from openai import OpenAI

# Initialize EcoLogits
EcoLogits.init(providers=["openai", "mistral"], electricity_mix_zone="WOR")

client = OpenAI(api_key="<OPENAI_API_KEY>")

response = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "user", "content": "Tell me a funny joke!"}
]
)

# Get estimated environmental impacts of the inference
print(f"Energy consumption: {response.impacts.energy.value} kWh")
print(f"GHG emissions: {response.impacts.gwp.value} kgCO2eq")
```

You can also provide the EcoLogits configuration through a TOML file.

=== "main.py"
```python
from ecologits import EcoLogits
from openai import OpenAI

# Initialize EcoLogits
EcoLogits.init(config_path="pyproject.toml")

client = OpenAI(api_key="<OPENAI_API_KEY>")

response = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "user", "content": "Tell me a funny joke!"}
]
)

# Get estimated environmental impacts of the inference
print(f"Energy consumption: {response.impacts.energy.value} kWh")
print(f"GHG emissions: {response.impacts.gwp.value} kgCO2eq")
```

=== "pyproject.toml"
```toml
[ecologits]
region="FRA"
providers=[
"openai"
]
```

!!! info "Internal prioritizations"
    - If no init parameters are provided, EcoLogits will check for a `pyproject.toml` file with an `ecologits` config and rely on it to initialize.
    - If both a TOML file and init parameters are provided, the parameters will prevail.

Environmental impacts are quantified based on four criteria and across two phases:

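The empty-init fallback described in the prioritization notes above can be sketched as a small standalone function. `pick_config_path` is a hypothetical helper for illustration only; the real logic lives inside `EcoLogits.init`:

```python
import os

def pick_config_path(config_path=None, providers=None, electricity_mix_zone=None):
    # With no explicit arguments at all, fall back to a pyproject.toml
    # in the current working directory, if one exists.
    if (config_path is None and providers is None
            and electricity_mix_zone is None
            and os.path.isfile("pyproject.toml")):
        return "pyproject.toml"
    return config_path

# Passing any explicit argument disables the fallback.
print(pick_config_path(providers=["openai"]))  # None
```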
2 changes: 1 addition & 1 deletion docs/scripts/gen_references.py
@@ -26,5 +26,5 @@

mkdocs_gen_files.set_edit_path(full_doc_path, Path("../") / path)

-with mkdocs_gen_files.open("reference/SUMMARY.md", "w") as nav_file: #
+with mkdocs_gen_files.open("reference/SUMMARY.md", "w") as nav_file:
nav_file.writelines(nav.build_literate_nav())
28 changes: 27 additions & 1 deletion docs/tutorial/index.md
@@ -51,9 +51,10 @@ To use EcoLogits in your projects, you will need to initialize the client tracer
from ecologits import EcoLogits

# Default initialization method
-EcoLogits.init()
+EcoLogits.init() # (1)!
```

1. If you have a `pyproject.toml` containing an EcoLogits configuration, an empty `init()` will now load it automatically.

### Configure providers

@@ -90,3 +91,28 @@ from ecologits import EcoLogits
# Select the electricity mix of France
EcoLogits.init(electricity_mix_zone="FRA")
```

### Configure both through a TOML configuration file

You can also set the providers and the electricity mix zone through a TOML file (for example your project's `pyproject.toml`).

```python title="Load an EcoLogits configuration file"
from ecologits import EcoLogits

# Initialize EcoLogits with a config file
EcoLogits.init(config_path="pyproject.toml")

# Or with a dedicated configuration file
EcoLogits.init(config_path="ecologits.toml")
```

To do so, just provide the path to your file (or provide nothing if it is a `pyproject.toml` located at the root of your project).
The expected formatting of the EcoLogits configuration is as follows.

```toml title="pyproject.toml"
[ecologits]
region="FRA"
providers=[
"openai",
"mistral"
]
```
4 changes: 2 additions & 2 deletions ecologits/__init__.py
@@ -2,6 +2,6 @@

__version__ = "0.5.2"
__all__ = [
-"__version__",
-"EcoLogits"
+"EcoLogits",
+"__version__"
]
63 changes: 51 additions & 12 deletions ecologits/_ecologits.py
@@ -1,8 +1,9 @@
import importlib.metadata
import importlib.util
import os
from dataclasses import dataclass, field
from typing import Optional, Union

import toml # type: ignore [import]
from packaging.version import Version

from ecologits.exceptions import EcoLogitsError
@@ -83,6 +84,11 @@ def init_litellm_instrumentor() -> None:
"litellm": init_litellm_instrumentor
}

@dataclass
class _Config:
electricity_mix_zone: str = field(default="WOR")
providers: list[str] = field(default_factory=list)


class EcoLogits:
"""
@@ -113,17 +119,22 @@ class EcoLogits:
```
"""
-@dataclass
-class _Config:
-electricity_mix_zone: str = field(default="WOR")
-providers: list[str] = field(default_factory=list)
+config= _Config()

@staticmethod
def _read_ecologits_config(config_path: str)-> dict[str, str]|None:

config = _Config()
with open(config_path) as config_file:
config = toml.load(config_file).get("ecologits", None)
if config is None:
logger.warning("Provided file did not contain the ecologits key. Falling back on default configuration")
return config

@staticmethod
def init(
-providers: Optional[Union[str, list[str]]] = None,
-electricity_mix_zone: str = "WOR",
+config_path: str| None = None,
+providers: str | list[str]|None = None,
+electricity_mix_zone: str|None = None,
) -> None:
"""
Initialization static method. Will attempt to initialize all providers by default.
@@ -132,15 +143,43 @@ def init(
providers: list of providers to initialize (all providers by default).
electricity_mix_zone: ISO 3166-1 alpha-3 code of the electricity mix zone (WOR by default).
"""
default_providers = list(set(_INSTRUMENTS.keys()))
default_electricity_mix_zone = "WOR"

if config_path is not None and (providers is not None or electricity_mix_zone is not None):
logger.warning("Both config path and init arguments provided, init arguments will be prioritized")

if (config_path is None
and providers is None
and electricity_mix_zone is None
and os.path.isfile("pyproject.toml")):

config_path = "pyproject.toml"

if config_path:
try:
user_config: dict[str, str]|None = EcoLogits._read_ecologits_config(config_path)
logger.info("Ecologits configuration found in file and loaded")
except FileNotFoundError:
logger.warning("Provided file does not exist, will fall back on default values")
user_config = None

if user_config is not None:
providers = user_config.get("providers", default_providers) if providers is None else providers
electricity_mix_zone = (user_config.get("electricity_mix_zone", electricity_mix_zone)
if electricity_mix_zone is None
else electricity_mix_zone)

if isinstance(providers, str):
providers = [providers]
-if providers is None:
-providers = list(_INSTRUMENTS.keys())
+elif providers is None:
+providers = default_providers
+if electricity_mix_zone is None:
+electricity_mix_zone = default_electricity_mix_zone

init_instruments(providers)

-EcoLogits.config.electricity_mix_zone = electricity_mix_zone
-EcoLogits.config.providers += providers
+EcoLogits.config=_Config(electricity_mix_zone=electricity_mix_zone, providers=providers)
+EcoLogits.config.providers = list(set(EcoLogits.config.providers))


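The precedence rules in the `init` hunk above (explicit arguments beat the config file, which beats the defaults) can be sketched in isolation. `resolve_settings` and its hard-coded default provider list are illustrative stand-ins, not part of the EcoLogits API; the real code derives the defaults from `_INSTRUMENTS`:

```python
def resolve_settings(file_config=None, providers=None, electricity_mix_zone=None):
    # Illustrative defaults; not the real instrument registry.
    default_providers = ["openai", "mistral"]
    default_zone = "WOR"

    # The config file only fills arguments the caller left unset.
    file_config = file_config or {}
    if providers is None:
        providers = file_config.get("providers")
    if electricity_mix_zone is None:
        electricity_mix_zone = file_config.get("electricity_mix_zone")

    # Normalize and fall back to defaults, as in the hunk above.
    if isinstance(providers, str):
        providers = [providers]
    if providers is None:
        providers = default_providers
    if electricity_mix_zone is None:
        electricity_mix_zone = default_zone
    return providers, electricity_mix_zone

# The explicit argument wins over the file; the file fills the other gap.
print(resolve_settings({"electricity_mix_zone": "FRA"}, providers="openai"))
# (['openai'], 'FRA')
```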
4 changes: 1 addition & 3 deletions ecologits/tracers/huggingface_tracer.py
@@ -69,9 +69,7 @@ def huggingface_chat_wrapper_stream(
) -> Iterable[ChatCompletionStreamOutput]:
timer_start = time.perf_counter()
stream = wrapped(*args, **kwargs)
-token_count = 0
-for chunk in stream:
-token_count += 1
+for token_count, chunk in enumerate(stream, start=1):
request_latency = time.perf_counter() - timer_start
impacts = llm_impacts(
provider=PROVIDER,
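The tracer change above replaces a hand-maintained counter with `enumerate(stream, start=1)`; both count streamed chunks identically, as this standalone comparison shows:

```python
chunks = ["Hel", "lo", "!"]

# Old pattern: a counter incremented by hand on each chunk.
token_count = 0
manual = []
for chunk in chunks:
    token_count += 1
    manual.append((token_count, chunk))

# New pattern from the diff: enumerate with start=1.
enumerated = list(enumerate(chunks, start=1))

print(manual == enumerated)  # True
print(enumerated)            # [(1, 'Hel'), (2, 'lo'), (3, '!')]
```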
17 changes: 7 additions & 10 deletions pyproject.toml
@@ -52,7 +52,6 @@ huggingface-hub = ["huggingface-hub (>=0.28.1,<0.29.0)", "tiktoken (>=0.8.0,<0.9
google-generativeai = ["google-generativeai (>=0.8.4,<0.9.0)"]
litellm = ["litellm (>=1.60.6,<2.0.0)", "rapidfuzz (>=3.12.1,<4.0.0)"]


[tool.poetry]
requires-poetry = ">=2.0"

@@ -104,7 +103,7 @@ ignore_errors = true


[tool.ruff]
-select = [
+lint.select = [
"A",
"ANN",
"ARG",
@@ -136,10 +135,8 @@ select = [
"YTT"
]

-ignore = [
+lint.ignore = [
"A003",
-"ANN101",
-"ANN102",
"ANN401",
"N805",
"N818",
@@ -151,7 +148,7 @@ ignore = [
"TRY003"
]

-fixable = [
+lint.fixable = [
"A",
"ANN",
"ARG",
@@ -181,7 +178,7 @@ fixable = [
"W",
"YTT"
]
-unfixable = []
+lint.unfixable = []

exclude = [
".bzr",
@@ -206,12 +203,12 @@ exclude = [

line-length = 120

-dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
+lint.dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"

target-version = "py39"

-[tool.ruff.mccabe]
+[tool.ruff.lint.mccabe]
max-complexity = 10

-[tool.ruff.pydocstyle]
+[tool.ruff.lint.pydocstyle]
convention = "google"
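The pyproject.toml edits above track Ruff's move of lint settings into the `lint` namespace. The dotted keys used in the diff are equivalent to explicit tables; an abbreviated sketch of the resulting layout (rule lists shortened for illustration, not the project's full configuration):

```toml
[tool.ruff]
line-length = 120
target-version = "py39"

[tool.ruff.lint]
select = ["A", "ANN", "ARG"]   # shortened; the project enables many more rules
ignore = ["A003", "ANN401"]

[tool.ruff.lint.mccabe]
max-complexity = 10

[tool.ruff.lint.pydocstyle]
convention = "google"
```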
Empty file added tests/config/__init__.py
Empty file.
