Add basic first version of random kahypar+#35
chrisstaudt wants to merge 8 commits into jcmgray:main from
Conversation
Hello @CSNWEB! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2024-07-11 12:28:54 UTC
Thanks for this! Yes, random sampling of
The basic version I added in this PR (exponential sampling for the temperature, custom uniform sampling), which comes from your original hyperoptimizer if I remember correctly, was very effective in my tests. It nearly always found a better path after x seconds than optuna or nevergrad. It definitely seems sufficient for finding reasonably good greedy paths.
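To make the sampling scheme concrete, here is a minimal sketch of what "exponential sampling for the temperature, uniform sampling for costmod" could look like. The function name and the parameter ranges are illustrative assumptions, not the actual bounds used in this PR:

```python
import math
import random

def sample_params(rng):
    # Hypothetical sketch: temperature is drawn on a log scale
    # (i.e. exponentially distributed across orders of magnitude),
    # while costmod is drawn uniformly. The bounds below are
    # assumptions for illustration only.
    temperature = math.exp(rng.uniform(math.log(0.01), math.log(1.0)))
    costmod = rng.uniform(0.1, 4.0)
    return temperature, costmod

rng = random.Random(42)
samples = [sample_params(rng) for _ in range(8)]
```

The log-scale draw for temperature means small and large temperatures are explored roughly equally often, which is the usual motivation for exponential rather than plain uniform sampling of a scale-like parameter.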
I reduced the code size quite a bit by removing all the contraction-tree-related code and merging the paths directly, which also removes the duplicated tree traversal. Apart from the hacked-in costmod and temperature tuning, I am happy with the code now.
Great, could you maybe add a basic test? E.g. just to the matrix in. I'll take a look and add the costmod and temperature sampling soon.
OK, I've added
Awesome, thanks. I will try to add some tests towards the end of the week. |
Just a short heads-up that I haven't forgotten this, but due to some personal circumstances my time right now is quite limited, so I might not be able to look at this again until the end of the month, or even the start of July when the lecture period is over.
I finally have some time again and have added the method to the test matrix. I think it would probably make sense to add something to the docs as well, maybe under drivers (https://cotengra.readthedocs.io/en/latest/advanced.html#drivers)?
Based on https://github.com/ti2-group/hybrid_contraction_tree_optimizer/
Unfortunately I just realized that the RandomOptimizer does not optimize costmod and temperature. Not sure how I missed that while going over the code; without it, the results are much worse, though. I think ideally these parameters should be randomly chosen within the optimize function for each trial. For now I have just hacked it on top of the optimize functions, which will probably lose some performance in the accelerated Rust case.
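The per-trial sampling suggested above can be sketched as follows. This is a hypothetical illustration, not the PR's actual implementation: `score_fn` stands in for a single randomized-greedy path construction returning its cost, and the sampling ranges are assumptions:

```python
import math
import random

def run_trials(score_fn, n_trials=32, seed=0):
    # Draw a fresh (temperature, costmod) pair *inside* each trial,
    # rather than fixing them once outside the optimize loop, so each
    # trial explores a different point in parameter space.
    rng = random.Random(seed)
    best_cost, best_params = float("inf"), None
    for _ in range(n_trials):
        # Illustrative ranges: log-scale temperature, uniform costmod.
        temperature = math.exp(rng.uniform(math.log(0.01), math.log(1.0)))
        costmod = rng.uniform(0.1, 4.0)
        cost = score_fn(temperature, costmod)
        if cost < best_cost:
            best_cost, best_params = cost, (temperature, costmod)
    return best_cost, best_params
```

Sampling inside the loop keeps the whole trial self-contained, which is also what would let an accelerated (e.g. Rust) backend run trials in a tight loop without round-tripping parameters through an outer tuner.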