Hi, I discovered that AI Horde does not respect the model you set: even when you have chosen one, the current code uses a random one.
In `KoboldAI/modeling/inference_models/horde/class.py`, inside `def _raw_generate(...)`, we can find this code:

```python
cluster_metadata = {
    "prompt": decoded_prompt,
    "params": reqdata,
    "models": [x for x in utils.koboldai_vars.cluster_requested_models if x],
    "trusted_workers": False,
}
```
The variable `utils.koboldai_vars.cluster_requested_models` is never set, so I changed the code to this:
```python
matched_models = [x for x in self.models if x.get('name') == self.model_name]
cluster_metadata = {
    "prompt": decoded_prompt,
    "params": reqdata,
    "models": self.model,
    "trusted_workers": False,
}
```
Now it uses the model I chose, but I think it should also find all models that are similar under different names and build an array of them, so the request can use more servers.
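The "find all similar models" idea could be sketched like this. This is only a sketch, not KoboldAI code: the `matching_models` helper, the shape of `available_models`, and the substring match as the notion of "similar" are all assumptions.

```python
def matching_models(available_models, base_name):
    """Collect all model names that contain the chosen base name.

    Hypothetical helper: `available_models` is assumed to be a list of
    dicts like [{"name": "..."}], similar to what a Horde models
    endpoint might return; matching by substring is just one possible
    way to treat differently named variants as "the same" model.
    """
    base = base_name.lower()
    return [m["name"] for m in available_models if base in m["name"].lower()]

# Example: several workers host variants of the same base model.
models = [
    {"name": "PygmalionAI/pygmalion-6b"},
    {"name": "pygmalion-6b-dev"},
    {"name": "KoboldAI/OPT-13B-Nerys"},
]
print(matching_models(models, "pygmalion-6b"))
```

Passing the resulting array as the `models` field would let any worker serving a matching variant pick up the job, instead of only the one exact name.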
Hi henk717, thanks for your tips.
I'm not interested in that; I'm working on a small side project in Python: a lightweight proxy so KoboldAI and other clients can connect to LM Studio or anything else with an OpenAI-compatible API.
I noticed a bug in my code: it did not respect the "select all" value, but now it does:
```python
selected_model = self.model
if self.model == ["all"]:
    selected_model = []
```
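That "select all" handling can be sketched as a small standalone function. This assumes (as the snippet above suggests) that an empty `models` list means "any available model" on the receiving side; the `normalize_selection` name is hypothetical.

```python
def normalize_selection(model):
    """Map the UI's special ["all"] value to an empty model list.

    Assumption: an empty "models" list is treated as "any available
    model" by the backend, so selecting "all" should send [] rather
    than the literal string "all".
    """
    if model == ["all"]:
        return []
    return model

print(normalize_selection(["all"]))       # []
print(normalize_selection(["my-model"]))  # ['my-model']
```

Comparing against the whole list `["all"]` (rather than checking membership) keeps a real model that happens to be named "all" from being swallowed alongside other selections.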