
Add Progress Bar #80

Closed
work4cs opened this issue Jan 21, 2025 · 4 comments · Fixed by #83

Comments

@work4cs

work4cs commented Jan 21, 2025

Could we add a progress bar (such as tqdm) when we pass a list of images to TreeOfLifeClassifier or CustomLabelsClassifier?
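A hypothetical sketch of the requested behavior, using only the standard library (tqdm would give a nicer bar, but the idea is the same; `iter_with_progress` is an invented helper, not part of pybioclip):

```python
def iter_with_progress(items, label="images"):
    """Yield items while printing a simple inline progress counter."""
    total = len(items)
    for count, item in enumerate(items, start=1):
        print(f"\r{label}: {count}/{total}", end="", flush=True)
        yield item
    print()  # finish the progress line with a newline
```

With tqdm installed, the equivalent would simply be `for image in tqdm(image_paths): ...`.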

@johnbradley
Collaborator

Thanks for the issue @work4cs.

When you pass a list of images to the predict() function, these images are processed as a group. If you are using a "cuda" or "mps" device, the images will be loaded onto the GPU and processed in a single encode_image operation. Typically this is faster than processing the images one at a time.

@work4cs Which device are you using: "cpu", "cuda", or "mps" (the default is "cpu")? How many images are you passing to the predict() function?
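The grouped processing described above can be illustrated with a toy sketch (this is not pybioclip's actual internals; `encode_one` and `encode_group` are invented names):

```python
# Toy illustration: a grouped call returns the same features as a
# per-image loop, but does it in one operation, which is what a single
# GPU encode_image call over the whole batch amortizes.
def encode_one(image_path):
    return len(image_path)  # stand-in for extracting one feature vector

def encode_group(image_paths):
    # one call over the whole list, analogous to one batched encode op
    return [len(p) for p in image_paths]

paths = ["a.jpg", "bb.jpg", "ccc.jpg"]
assert [encode_one(p) for p in paths] == encode_group(paths)
```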

@work4cs
Author

work4cs commented Jan 22, 2025


Oh, it seems like we are dealing with a single batch now. If a batch of images can fit on the GPU or CPU, then we don't need to monitor time and progress since it can finish in the blink of an eye. Would you be willing to extend it to multiple batches, with a batch_size parameter passed to BaseClassifier, for the case when there are too many images in the list? Thank you!
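A minimal sketch of the proposed batching, assuming a batch_size parameter (`batched` is a hypothetical helper, not an existing pybioclip function): split the full image list into fixed-size chunks that each fit in memory.

```python
def batched(items, batch_size):
    """Yield successive chunks of at most batch_size items."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]
```

Each chunk could then be passed to the existing grouped predict() call, with progress reported between chunks.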

@johnbradley
Collaborator

I think a batch_size parameter might be a little more convenient on the predict() function because the constructor for the classifiers takes a little while to load the model and supporting files.

@work4cs
Author

work4cs commented Jan 23, 2025

Sounds good!

johnbradley added a commit that referenced this issue Jan 30, 2025
Adds --batch-size for predict command and classifier predict()
methods.

Fixes #80
johnbradley added a commit that referenced this issue Jan 31, 2025
Adds --batch-size for predict command.
Adds batch_size to classifier predict() function.
Default batch size is 10.
Updates early notebooks to process images in a batch.

Fixes #80
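A hypothetical end-to-end sketch of what the fix describes (`predict_in_batches` and `predict_batch` are invented stand-ins, not pybioclip's API): process the image list in chunks of batch_size (default 10, per the commit message above) and report progress after each chunk.

```python
def predict_in_batches(image_paths, predict_batch, batch_size=10):
    """Run predict_batch over fixed-size chunks, printing progress."""
    results = []
    total = len(image_paths)
    for start in range(0, total, batch_size):
        batch = image_paths[start:start + batch_size]
        results.extend(predict_batch(batch))  # one grouped call per chunk
        print(f"\rprocessed {min(start + batch_size, total)}/{total}",
              end="", flush=True)
    print()
    return results
```

In the real library, predict_batch would correspond to the classifier's grouped prediction step, so large image lists no longer have to fit on the device at once.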