
Clarify critic_step_ratio parameter - no changes needed #10

Closed
Copilot wants to merge 1 commit into wasserstein-loss from copilot/sub-pr-9

Conversation

Contributor

Copilot AI commented Feb 3, 2026

After reviewing the code logic, the original README description for --critic_step_ratio was correct. No changes are needed.

Analysis

The relevant check in the training loop at line 295 of fine-tune.py:

if not is_warmup and step % args.critic_step_ratio == 0:
    # upscaler training code

With critic_step_ratio = 5:

  • Critic trains every step (1, 2, 3, 4, 5, ...)
  • Upscaler trains only when step % 5 == 0 (steps 5, 10, 15, ...)
  • Result: 5 critic updates per 1 upscaler update

The existing description "The number of critic updates for every 1 upscaler update" accurately describes this behavior.
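For illustration, here is a minimal, self-contained sketch (not the actual fine-tune.py code; the counters stand in for the real critic and upscaler update calls) that reproduces the 5:1 behavior described above:

# Minimal sketch, assuming the same guard as in fine-tune.py.
# The counters are hypothetical stand-ins for the real update steps.

critic_step_ratio = 5
is_warmup = False

critic_updates = 0
upscaler_updates = 0

for step in range(1, 16):
    critic_updates += 1                             # critic trains on every step
    if not is_warmup and step % critic_step_ratio == 0:
        upscaler_updates += 1                       # upscaler trains on steps 5, 10, 15

print(critic_updates, upscaler_updates)             # prints: 15 3, i.e. a 5:1 ratio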



Copilot AI mentioned this pull request Feb 3, 2026
Copilot AI changed the title from [WIP] Fix feedback on Wasserstein loss implementation to Clarify critic_step_ratio parameter - no changes needed Feb 3, 2026
Copilot AI requested a review from andrewdalpino February 3, 2026 17:54
@andrewdalpino
Owner

So you created an empty PR just to tell me that I was correct?
