I am using the ITI method on a Qwen model, and it worked in the edit_weight step.
However, when I load the previously saved model, the following warning occurs:
Some weights of the model checkpoint at {path_prefix}/validation/results_dump/edited_models_dump//Qwen2.5-14B-Instruct-GPTQ-Int4_seed_42_top_24_heads_alpha_15 were not used when initializing Qwen2ForCausalLM: ['model.layers.17.self_attn.o_proj.bias', 'model.layers.21.self_attn.o_proj.bias', 'model.layers.23.self_attn.o_proj.bias'
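For what it's worth, this is the kind of check I have in mind for confirming that the edited bias tensors were actually written to disk (a hypothetical snippet, assuming the checkpoint was saved as a single model.safetensors file; a sharded save would need the index file instead):

```python
# Hypothetical check: list the o_proj.bias tensors stored in the saved
# checkpoint, to confirm edit_weight really wrote them to disk.
# Assumes a single-file safetensors save; adjust for sharded checkpoints.
from safetensors import safe_open

ckpt = "{path_prefix}/validation/results_dump/edited_models_dump//Qwen2.5-14B-Instruct-GPTQ-Int4_seed_42_top_24_heads_alpha_15/model.safetensors"

with safe_open(ckpt, framework="pt") as f:
    bias_keys = [k for k in f.keys() if k.endswith("self_attn.o_proj.bias")]

print(bias_keys)  # the warning above suggests layers 17, 21, 23 should appear here
```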
The outputs of the original model and the intervened model are identical.
I am using AutoModelForCausalLM to load the model. I have been troubled by this error for several days; thank you for any help.
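For reference, the load step looks roughly like this (the path is the same placeholder as in the warning above; device_map and torch_dtype are just what I happen to pass, not requirements):

```python
# Rough sketch of how I load the previously saved (edited) model.
from transformers import AutoModelForCausalLM, AutoTokenizer

edited_path = "{path_prefix}/validation/results_dump/edited_models_dump//Qwen2.5-14B-Instruct-GPTQ-Int4_seed_42_top_24_heads_alpha_15"

tokenizer = AutoTokenizer.from_pretrained(edited_path)
model = AutoModelForCausalLM.from_pretrained(
    edited_path,
    device_map="auto",   # place the GPTQ-quantized weights on GPU
    torch_dtype="auto",  # keep whatever dtype was saved
)
# Loading prints the warning above, and the edited o_proj.bias tensors
# appear to be dropped, so the model behaves like the original.
```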