Evaluation Error in IDOL #48
Comments
My server has 20 CPUs and 2 GPUs.
I still couldn't resolve the error "RuntimeError: indices should be either on cpu or on the same device as the indexed tensor (cpu)".
You can add the following two lines of code below the error location. I successfully ran inference on a single GPU.
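(The exact two lines aren't reproduced in this thread. A minimal sketch of such a fix, assuming the only problem is that the index tensor lives on a different device than the mask tensor, and using the variable names from the traceback in the issue body, would be to move the indices before the failing indexing in projects/IDOL/idol/idol.py:

    # Hypothetical sketch; the actual lines posted by the commenter may differ.
    # Move the index tensor onto the same device as output_mask before indexing.
    indices = indices.to(output_mask.device)
    det_masks = output_mask[indices]
)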
Thank you for your suggestion, I think it works!
Hi,
When I try to train an IDOL model over an SSH connection to a server, I get the following error in the evaluation stage:
-- Process 1 terminated with the following error:
Traceback (most recent call last):
  File "/home/aylinaydin/anaconda3/envs/project/lib/python3.7/site-packages/torch/multiprocessing/spawn.py", line 69, in _wrap
    fn(i, *args)
  File "/home/aylinaydin/Project/VNext/detectron2/engine/launch.py", line 126, in _distributed_worker
    main_func(*args)
  File "/home/aylinaydin/Project/VNext/projects/IDOL/train_net.py", line 161, in main
    res = Trainer.test(cfg, model)
  File "/home/aylinaydin/Project/VNext/detectron2/engine/defaults.py", line 617, in test
    results_i = inference_on_dataset(model, data_loader, evaluator)
  File "/home/aylinaydin/Project/VNext/detectron2/evaluation/evaluator.py", line 158, in inference_on_dataset
    outputs = model(inputs)
  File "/home/aylinaydin/anaconda3/envs/project/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/aylinaydin/Project/VNext/projects/IDOL/idol/idol.py", line 284, in forward
    0])  # (height, width) is resized size, images.image_sizes[0] is original size
  File "/home/aylinaydin/Project/VNext/projects/IDOL/idol/idol.py", line 357, in inference
    det_masks = output_mask[indices]
RuntimeError: indices should be either on cpu or on the same device as the indexed tensor (cpu)
I tried to change the device of each tensor, but I couldn't fix it. Can you help me?