Krishna/jsonutilreadme fix #655

Merged
merged 9 commits on Dec 3, 2024
4 changes: 3 additions & 1 deletion .github/workflows/python-package.yml
@@ -1,6 +1,8 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

+ # [ ] TODO [pep 458](https://blog.pypi.org/posts/2024-11-14-pypi-now-supports-digital-attestations/)

name: Python package

on:
@@ -16,7 +18,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.9", "3.10", "3.11", "3.12"]
python-version: ["3.10", "3.11", "3.12"]

steps:
- uses: actions/checkout@v4
3 changes: 2 additions & 1 deletion .gitignore
@@ -8,7 +8,8 @@ audio/
video/
artifacts_three
dataframe/

+ .ruff_cache
+ .pytest_cache
static/generated
runs
Financial-Analysis-Agent_state.json
2 changes: 1 addition & 1 deletion README.md
@@ -462,7 +462,7 @@ from pydantic import BaseModel, Field
from transformers import AutoModelForCausalLM, AutoTokenizer

from swarms import ToolAgent
- from swarms.utils.json_utils import base_model_to_json
+ from swarms.tools.json_utils import base_model_to_json

# Load the pre-trained model and tokenizer
model = AutoModelForCausalLM.from_pretrained(
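
For context, the corrected import path is the one exercised by the README's ToolAgent example. A minimal sketch of that usage, assuming base_model_to_json converts a Pydantic model class into the JSON schema string that ToolAgent consumes; the Person model here is illustrative only:

```python
from pydantic import BaseModel, Field

from swarms.tools.json_utils import base_model_to_json  # corrected import path


# Illustrative schema; any Pydantic model works the same way.
class Person(BaseModel):
    name: str = Field(..., title="Name of the person")
    age: int = Field(..., title="Age of the person")


# Assumption: base_model_to_json returns the model's JSON schema as a string,
# which can then be passed to ToolAgent as its schema argument.
json_schema = base_model_to_json(Person)
print(json_schema)
```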
2 changes: 1 addition & 1 deletion docs/swarms/install/install.md
@@ -127,7 +127,7 @@ Before you begin, ensure you have the following installed:
poetry install --extras "desktop"
```

=== "Using Docker"
=== "Using Docker COMING SOON [DOES NOT WORK YET]"

Docker is an excellent option for creating isolated and reproducible environments, suitable for both development and production.

9 changes: 7 additions & 2 deletions swarms/structs/agent.py
@@ -1621,11 +1621,16 @@ def get_docs_from_doc_folders(self):
files = os.listdir(self.docs_folder)

# Extract the text from the files
+ # Process each file and combine their contents
+ all_text = ""
for file in files:
-     text = data_to_text(file)
+     file_path = os.path.join(self.docs_folder, file)
+     text = data_to_text(file_path)
+     all_text += f"\nContent from {file}:\n{text}\n"

+ # Add the combined content to memory
return self.short_memory.add(
-     role=self.user_name, content=text
+     role=self.user_name, content=all_text
)
except Exception as error:
logger.error(
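
For clarity, a standalone sketch of the corrected flow: the pre-fix code handed data_to_text a bare filename and stored only the last file's text, while the fix builds the full path and accumulates every file's contents before writing to short-term memory. Path.read_text is a hypothetical stand-in for data_to_text here:

```python
import os
from pathlib import Path


def collect_docs_text(docs_folder: str) -> str:
    """Combine the contents of every file in docs_folder into one string."""
    all_text = ""
    for file in os.listdir(docs_folder):
        # Join the folder with the filename; the pre-fix code passed only `file`.
        file_path = os.path.join(docs_folder, file)
        text = Path(file_path).read_text()  # stand-in for data_to_text(file_path)
        # Accumulate every file; the pre-fix code kept only the last file's text.
        all_text += f"\nContent from {file}:\n{text}\n"
    return all_text
```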
2 changes: 0 additions & 2 deletions swarms/structs/auto_swarm_builder.py
@@ -1,5 +1,3 @@
- from loguru import logger

import os
from typing import List
