Merge branch 'main' into dpo
ZePan110 authored Nov 8, 2024
2 parents 058c9d1 + 786cabe commit c44a54f
Showing 42 changed files with 1,343 additions and 1,535 deletions.
4 changes: 4 additions & 0 deletions .github/workflows/docker/compose/embeddings-compose-cd.yaml
@@ -22,3 +22,7 @@ services:
     build:
       dockerfile: comps/embeddings/predictionguard/Dockerfile
     image: ${REGISTRY:-opea}/embedding-predictionguard:${TAG:-latest}
+  embedding-reranking-local:
+    build:
+      dockerfile: comps/embeddings/tei/langchain/Dockerfile.dynamic_batching
+    image: ${REGISTRY:-opea}/embedding-reranking-local:${TAG:-latest}
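The `image:` fields above rely on Compose-style `${VAR:-default}` substitution: the environment value is used when set and non-empty, otherwise the default (`opea` registry, `latest` tag) applies. A minimal Python sketch of that resolution rule, for illustration only (not part of the CD workflow):

```python
import re

def resolve(template: str, env: dict) -> str:
    """Resolve Compose-style ${VAR:-default} references against an env mapping."""
    def sub(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        value = env.get(name, "")
        # ":-" falls back when the variable is unset OR empty
        return value if value else default
    return re.sub(r"\$\{(\w+):-([^}]*)\}", sub, template)

image = "${REGISTRY:-opea}/embedding-reranking-local:${TAG:-latest}"
print(resolve(image, {}))                                    # opea/embedding-reranking-local:latest
print(resolve(image, {"REGISTRY": "myregistry", "TAG": "v1.1"}))
```

So a CI run that exports `REGISTRY` and `TAG` retags the build, while a bare local build falls back to `opea/...:latest`.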
4 changes: 0 additions & 4 deletions .github/workflows/docker/compose/llms-compose-cd.yaml
@@ -23,10 +23,6 @@ services:
     build:
       dockerfile: comps/llms/text-generation/vllm/llama_index/Dockerfile
     image: ${REGISTRY:-opea}/llm-vllm-llamaindex:${TAG:-latest}
-  llm-vllm-llamaindex-hpu:
-    build:
-      dockerfile: comps/llms/text-generation/vllm/llama_index/dependency/Dockerfile.intel_hpu
-    image: ${REGISTRY:-opea}/llm-vllm-llamaindex-hpu:${TAG:-latest}
   llm-predictionguard:
     build:
       dockerfile: comps/llms/text-generation/predictionguard/Dockerfile
4 changes: 0 additions & 4 deletions .github/workflows/docker/compose/llms-compose.yaml
@@ -24,10 +24,6 @@ services:
     build:
       dockerfile: comps/llms/text-generation/vllm/langchain/Dockerfile
     image: ${REGISTRY:-opea}/llm-vllm:${TAG:-latest}
-  llm-vllm-hpu:
-    build:
-      dockerfile: comps/llms/text-generation/vllm/langchain/dependency/Dockerfile.intel_hpu
-    image: ${REGISTRY:-opea}/llm-vllm-hpu:${TAG:-latest}
   llm-vllm-ray:
     build:
       dockerfile: comps/llms/text-generation/vllm/ray/Dockerfile
3 changes: 1 addition & 2 deletions .github/workflows/scripts/freeze_images.sh
@@ -5,8 +5,7 @@
 
 declare -A dict
 dict["langchain/langchain"]="docker://docker.io/langchain/langchain"
-# dict["vault.habana.ai/gaudi-docker/1.16.1/ubuntu22.04/habanalabs/pytorch-installer-2.2.2"]="docker://vault.habana.ai/gaudi-docker/1.16.1/ubuntu22.04/habanalabs/pytorch-installer-2.2.2"
-dict["opea/habanalabs:1.16.1-pytorch-installer-2.2.2"]="docker://docker.io/opea/habanalabs:1.16.1-pytorch-installer-2.2.2"
+dict["vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0"]="docker://vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0"
 
 function get_latest_version() {
     repo_image=$1
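The new dict entry pins the Gaudi base image by its full upstream reference, which encodes both the Habana release (1.18.0) and the PyTorch build (2.4.0) in the path. As a hypothetical illustration (not part of the script), those two versions can be pulled out of such a reference with a regex:

```python
import re

def gaudi_versions(image_ref: str):
    """Extract (habana_release, pytorch_version) from a Gaudi image reference, or None."""
    m = re.search(r"gaudi-docker/([\d.]+)/.*pytorch-installer-([\d.]+)", image_ref)
    return m.groups() if m else None

ref = "vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0"
print(gaudi_versions(ref))  # ('1.18.0', '2.4.0')
```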
2 changes: 1 addition & 1 deletion comps/animation/wav2lip/dependency/Dockerfile.intel_hpu
@@ -1,6 +1,6 @@
 # Use a base image
 # FROM python:3.11-slim
-FROM vault.habana.ai/gaudi-docker/1.16.2/ubuntu22.04/habanalabs/pytorch-installer-2.2.2:latest AS hpu
+FROM vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0 AS hpu
 
 # Set environment variables
 ENV LANG=en_US.UTF-8
3 changes: 1 addition & 2 deletions comps/asr/whisper/dependency/Dockerfile.intel_hpu
@@ -2,8 +2,7 @@
 # SPDX-License-Identifier: Apache-2.0
 
 # HABANA environment
-# FROM vault.habana.ai/gaudi-docker/1.16.1/ubuntu22.04/habanalabs/pytorch-installer-2.2.2:latest as hpu
-FROM opea/habanalabs:1.16.1-pytorch-installer-2.2.2 as hpu
+FROM vault.habana.ai/gaudi-docker/1.18.0/ubuntu22.04/habanalabs/pytorch-installer-2.4.0 AS hpu
 
 RUN useradd -m -s /bin/bash user && \
     mkdir -p /home/user && \
16 changes: 7 additions & 9 deletions comps/cores/mega/cli.py
@@ -4,16 +4,17 @@
 import argparse
 
 from .exporter import convert_to_docker_compose
+from .manifests_exporter import convert_to_manifests
 
 
-def export_kubernetes_manifests(mega_yaml, output_dir, device="cpu"):
-    print(f"Generating Kubernetes manifests from {mega_yaml} to {output_dir}")
-    # Add your logic to convert the YAML to Kubernetes manifest here
+def export_kubernetes_manifests(mega_yaml, output_file):
+    print(f"Generating Kubernetes manifests from {mega_yaml} to {output_file}")
+    convert_to_manifests(mega_yaml, output_file)
 
 
-def export_docker_compose(mega_yaml, output_file, device="cpu"):
+def export_docker_compose(mega_yaml, output_file):
     print(f"Generating Docker Compose file from {mega_yaml} to {output_file}")
-    convert_to_docker_compose(mega_yaml, output_file, device)
+    convert_to_docker_compose(mega_yaml, output_file)
 
 
 def opea_execute():
@@ -30,9 +31,6 @@ def opea_execute():
     compose_parser = export_subparsers.add_parser("docker-compose", help="Export to Docker Compose")
     compose_parser.add_argument("mega_yaml", help="Path to the mega YAML file")
     compose_parser.add_argument("output_file", help="Path to the Docker Compose file")
-    compose_parser.add_argument(
-        "--device", choices=["cpu", "gaudi", "xpu", "gpu"], default="cpu", help="Device type to use (default: cpu)"
-    )
 
     # Export to Kubernetes
     kube_parser = export_subparsers.add_parser("kubernetes", help="Export to Kubernetes")
@@ -48,7 +46,7 @@ def opea_execute():
     # Execute appropriate command
     if args.command == "export":
         if args.export_command == "docker-compose":
-            export_docker_compose(args.mega_yaml, args.output_file, args.device)
+            export_docker_compose(args.mega_yaml, args.output_file)
         elif args.export_command == "kubernetes":
             export_kubernetes_manifests(args.mega_yaml, args.output_dir, args.device)
         else:
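After this change the `--device` flag is gone from the `docker-compose` subcommand, and `export_docker_compose` takes only a source mega YAML and an output path. A stripped-down sketch of the resulting `argparse` surface (converter bodies omitted, since they live in `exporter`/`manifests_exporter`; this is an illustration, not the full CLI):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Mirror the commit's subcommand layout for the docker-compose path."""
    parser = argparse.ArgumentParser(prog="opea")
    subparsers = parser.add_subparsers(dest="command")

    export = subparsers.add_parser("export", help="Export a mega YAML")
    export_sub = export.add_subparsers(dest="export_command")

    # docker-compose subcommand: positional args only, no --device flag anymore
    compose = export_sub.add_parser("docker-compose", help="Export to Docker Compose")
    compose.add_argument("mega_yaml", help="Path to the mega YAML file")
    compose.add_argument("output_file", help="Path to the Docker Compose file")

    return parser

args = build_parser().parse_args(["export", "docker-compose", "mega.yaml", "compose.yaml"])
print(args.command, args.export_command, args.mega_yaml, args.output_file)
```

Because the `--device` argument was deleted rather than ignored, the parsed namespace no longer carries a `device` attribute at all, which is why the call site had to drop its third argument in the same commit.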