Fast converge to oversized models #1571
HighExecutor asked this question in Q&A (Unanswered)
Replies: 1 comment
This feature is not working that well for AutoKeras. The …
Good day!
My training data has the following shape: 7373 images of size 120x120.
I'm trying to restrict the model size with the max_model_size parameter, for example:

However, the search process converges very quickly (in 2-5 trials) to models that have more parameters than the limit, with this log:
I experimented with larger limits, like 20KK (20 million), and every time I get this error. This happens with both the 'greedy' and 'bayesian' tuners.
When I remove this restriction, the framework starts fitting very large models that drastically slow down training or simply run out of memory. The tests were run on a local machine (32 GB RAM, NVIDIA RTX 2080) and on a DGX cluster.
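For a sense of scale, some illustrative arithmetic (not AutoKeras's actual candidate architectures): even a single moderately wide dense layer on a flattened 120x120 RGB input already exceeds the 20-million-parameter budget mentioned above, so candidates can blow past the limit very easily.

```python
# Parameter count of one fully connected layer on flattened 120x120 RGB input.
# Purely illustrative; AutoKeras candidates are typically convolutional, but
# any dense head on a large flattened feature map grows the same way.
height, width, channels = 120, 120, 3
flattened = height * width * channels   # 43,200 input features
units = 512                             # hypothetical layer width

params = flattened * units + units      # weights + biases
print(params)  # 22118912 -- already over a 20M max_model_size
```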
The question is: how can I restrict the size of the model? Is the max_model_size limit used by the tuner to restrict the search space?
The idea: it would be great if the tuner could use this limit and avoid proposing oversized models.
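The behavior being requested could be sketched in framework-agnostic form as follows. None of these names are AutoKeras APIs; this is a hypothetical illustration of a search loop that checks a candidate's parameter count against the budget before training it, and skips oversized candidates instead of erroring out.

```python
# Hypothetical sketch: reject oversized candidates inside the search loop
# rather than failing the whole search. "sample_candidate" stands in for a
# tuner proposing an architecture; only the parameter count matters here.
import random

MAX_MODEL_SIZE = 20_000_000  # the 20KK (20 million) budget from the question

def sample_candidate(rng):
    """Stand-in for the tuner: a fake 'model' described by its param count."""
    return {"num_params": rng.randrange(1_000_000, 60_000_000)}

def search(max_trials, max_model_size, seed=0):
    """Keep only candidates whose parameter count fits the budget."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(max_trials):
        candidate = sample_candidate(rng)
        if candidate["num_params"] > max_model_size:
            continue  # skip oversized model, keep searching
        accepted.append(candidate)
    return accepted

models = search(max_trials=50, max_model_size=MAX_MODEL_SIZE)
assert all(m["num_params"] <= MAX_MODEL_SIZE for m in models)
print(len(models), "of 50 trials fit the budget")
```

With a fixed seed the loop is deterministic, so the filter's effect is easy to verify; a real tuner would additionally feed the rejection back into its search strategy so it stops proposing similar oversized architectures.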