System Info
- `transformers` version: 4.48.0
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA RTX A6000

Who can help?
No response

Information
- The official example scripts
- My own modified scripts

Tasks
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)

Reproduction
Hello,

I am fine-tuning a `BertForSequenceClassification` model, after which I would like to test it using `pipeline`s. However, since I have multiple GPUs, I wrap the model in `torch.nn.DataParallel` and then try to use the wrapped model for inference via a pipeline. This worked when I simply had the `BertForSequenceClassification` instance, but now, with the `DataParallel` wrapper around it, I get an error.
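The pattern described above can be sketched as follows. This is a minimal, hypothetical reconstruction, not the reporter's actual code: `TinyClassifier` is an illustrative stand-in for the fine-tuned `BertForSequenceClassification`, chosen so the snippet runs without downloading model weights. The key observation is that `nn.DataParallel` returns a generic `nn.Module` wrapper rather than a Hugging Face `PreTrainedModel`, while the original model remains reachable through the wrapper's `.module` attribute:

```python
import torch
import torch.nn as nn

# Illustrative stand-in for a fine-tuned BertForSequenceClassification
# (hypothetical name; avoids a model download for this sketch).
class TinyClassifier(nn.Module):
    def __init__(self, hidden=8, num_labels=2):
        super().__init__()
        self.head = nn.Linear(hidden, num_labels)

    def forward(self, x):
        return self.head(x)

model = TinyClassifier()

# Wrap for multi-GPU use, as in the report. On a CPU-only machine,
# DataParallel simply forwards calls to the underlying module.
parallel_model = nn.DataParallel(model)
logits = parallel_model(torch.randn(4, 8))
print(logits.shape)  # torch.Size([4, 2])

# DataParallel is a plain nn.Module wrapper, not a PreTrainedModel, so
# APIs that expect the Hugging Face model itself will not accept it.
# The wrapped model is always recoverable via the .module attribute:
unwrapped = parallel_model.module
assert unwrapped is model
```

Assuming the error comes from passing the wrapper where a `PreTrainedModel` is expected, one common workaround is to pass the unwrapped model (`parallel_model.module`) to `pipeline(...)` for inference.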