Converting Models is High Memory Consuming... #11
Comments
@ClaudeCoulombe Unfortunately, the expected RAM usage in this case is quite large, probably larger than the size of the model itself. Someone suggested a more memory-efficient way of converting the models (if I recall correctly, using the HDF format), but I can't find that suggestion at the moment.
Greetings @lopuhin, thanks for your quick answer. I've tried on a server with a huge amount of RAM (64 GB); the code was able to fill that up and still raise a MemoryError. The culprit seems to be the […] function. You probably understand the problem better than I do. Maybe a generator with […] would help? The traceback is below:

Traceback (most recent call last): […]
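The generator idea above can be sketched as follows. This is a hypothetical illustration, not code from python-adagram: it assumes the model's vectors can be produced one row at a time (e.g. by a streaming JSON parser), and writes them into a disk-backed NumPy memmap so that only a single row ever needs to sit in RAM, rather than materialising the whole 23 GB matrix at once. The function name `rows_to_memmap` and the file paths are made up for the example.

```python
import os
import tempfile

import numpy as np


def rows_to_memmap(rows, n_rows, n_cols, path):
    """Stream rows into a disk-backed float32 matrix.

    `rows` is any iterable yielding one row at a time, so peak RAM
    usage stays at roughly one row instead of the full matrix.
    """
    out = np.memmap(path, dtype=np.float32, mode="w+", shape=(n_rows, n_cols))
    for i, row in enumerate(rows):
        out[i] = row
    out.flush()  # make sure the data reaches the backing file
    return out


# Tiny demonstration with synthetic data standing in for the vm.json rows:
demo_path = os.path.join(tempfile.mkdtemp(), "vectors.dat")
demo = rows_to_memmap(([i, i + 1] for i in range(3)), 3, 2, demo_path)
```

The resulting memmap can then be sliced and saved like an ordinary array, without the conversion step ever holding the full matrix in memory.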
Any updates regarding this issue? I also expected a .pkl extension, but I get ".joblib". Any ideas how I can get a file.pkl for the model? Thanks in advance.
@ahmelshi this is likely a different issue. The .joblib model should work fine.
Thank you for your fast reply. Upon executing "adagram/load_julia.py ...", I get about 9 .joblib files. I was wondering if there is a way to combine them into a single file.pkl? Any ideas?
I see: you could load them in Python with joblib, and then save with pickle or with different joblib options (there should be an option that produces a single file).
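A minimal sketch of that suggestion, with a small stand-in object in place of the real model (with the actual files you would call `joblib.load('model.joblib')`, which picks up the sidecar *.npy parts automatically). The single-file behaviour comes from joblib's `compress` option; older joblib versions otherwise split large NumPy arrays into separate files.

```python
import os
import tempfile

import joblib  # third-party: pip install joblib

tmp = tempfile.mkdtemp()
single_path = os.path.join(tmp, "model_single.joblib")

# Stand-in for joblib.load('model.joblib') on the real multi-file dump.
model = {"In": [[0.1, 0.2], [0.3, 0.4]], "counts": [5, 7]}

# compress > 0 forces joblib to write one self-contained file
# instead of a main file plus *.npy sidecar files.
joblib.dump(model, single_path, compress=3)
reloaded = joblib.load(single_path)
```

After this, `model_single.joblib` is a single file that can be shipped around and loaded on its own.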
Greetings,
I'm trying to convert models built with AdaGram.jl (Julia) to JSON, and then into a Python model, as explained in the README.rst.
I've used a pretty big model, huang_super_300D_0.2_min20_hs_t1e-17.model, which has a file size of over 3.3 GB. The conversion to JSON gives two files: a 4.7 MB id2word.json and a 23 GB vm.json.
To convert the model, my command is:
sudo nohup python3 ./adagram/load_julia.py ./ model.joblib &
How much RAM should it take to convert the JSON to Python model?
Also, I did not understand why the Python model gets a .joblib extension; I expected a .pkl extension?