Speed decreases significantly for real time inference #1

Open
Saqueeb opened this issue Jan 17, 2022 · 0 comments

Saqueeb commented Jan 17, 2022

Hi there,
Thank you for this amazing repo. I converted my custom object detection model using this repo, and while testing I found that the inference speed dropped from 25-35 FPS to 1-2.5 FPS, which is really poor. I need to convert the darknet model to TF.js, so I tried converting it to a TensorFlow model first. Do you have any idea how I can convert the weights to a TensorFlow/TF.js model while keeping the speed and accuracy?
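For reference, here is a minimal benchmarking sketch to check whether the slowdown already appears at the TensorFlow stage, before the TF.js conversion. It assumes the converted model was exported as a TensorFlow SavedModel; the directory name and input shape below are placeholders for illustration, not values from this repo.

```python
import time

import numpy as np
import tensorflow as tf

# Placeholder path and input size -- adjust to your own export
# (these are assumptions for illustration, not values from this repo).
SAVED_MODEL_DIR = "converted_savedmodel"
INPUT_SHAPE = (1, 416, 416, 3)  # common darknet/YOLO input resolution

model = tf.saved_model.load(SAVED_MODEL_DIR)
infer = model.signatures["serving_default"]

# Look up the signature's input tensor name so the call works
# regardless of how the converter named it.
input_name = list(infer.structured_input_signature[1].keys())[0]
dummy = tf.constant(np.random.rand(*INPUT_SHAPE).astype(np.float32))

# Warm-up runs so one-time graph setup is not counted in the timing.
for _ in range(5):
    infer(**{input_name: dummy})

runs = 50
start = time.perf_counter()
for _ in range(runs):
    infer(**{input_name: dummy})
elapsed = time.perf_counter() - start
print(f"avg latency: {elapsed / runs * 1000:.1f} ms (~{runs / elapsed:.1f} FPS)")
```

If the SavedModel itself only reaches a few FPS, the loss happens on the TensorFlow side (for example, running on CPU instead of GPU); if it is fast, the slowdown is introduced during or after the TF.js conversion.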
