refactor: get_working_proxy #216

Merged

LY-Xiang merged 1 commit into LL1.2.4 from refactor/getworkingproxy on Aug 13, 2025
Conversation

@LY-Xiang (Collaborator) commented Aug 13, 2025


Summary by Sourcery

Refactor helper functions for checking proxy connectivity with improved type hints, streamlined logic, and safer executor shutdown.

Enhancements:

  • Add type annotations and unify exception handling in can_connect
  • Simplify can_connect to use response.ok
  • Change check_proxy to return a tuple of URL and connectivity result against the ZIP archive
  • Refactor get_working_proxy to submit futures inline, properly shut down the executor, and return an empty string when no proxy works
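As a rough illustration of the first two bullets, a minimal can_connect might look like the sketch below. This is inferred from the summary, not the exact code in install_windows.py:

```python
import requests

def can_connect(url: str, timeout: float = 5) -> bool:
    """Return True if a HEAD request to url succeeds with a non-error status."""
    try:
        # response.ok is True for any status code below 400
        return requests.head(url, timeout=timeout).ok
    except requests.RequestException:
        # a single unified except clause covers timeouts, DNS failures, etc.
        return False
```

Using `response.ok` replaces an explicit `200 <= status_code < 400` range check, and catching the base `requests.RequestException` folds the previously separate exception cases into one branch.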

@sourcery-ai (Contributor)

sourcery-ai bot commented Aug 13, 2025


Reviewer's Guide

The PR refactors the proxy connectivity flow by introducing type annotations, streamlining connectivity checks via requests.head(...).ok, unifying check_proxy to return a (url, success) tuple, and simplifying the get_working_proxy concurrency logic with proper executor shutdown and an empty-string fallback.

Sequence diagram for the refactored proxy connectivity check

sequenceDiagram
    participant get_working_proxy
    participant ThreadPoolExecutor
    participant check_proxy
    participant can_connect
    participant requests
    get_working_proxy->>ThreadPoolExecutor: Submit check_proxy for each proxy
    ThreadPoolExecutor->>check_proxy: Execute check_proxy(proxy)
    check_proxy->>can_connect: Call can_connect(url)
    can_connect->>requests: requests.head(url, timeout)
    requests-->>can_connect: Response (ok or exception)
    can_connect-->>check_proxy: Return True/False
    check_proxy-->>ThreadPoolExecutor: Return (url, result)
    ThreadPoolExecutor-->>get_working_proxy: Return first successful proxy or ""
Loading

File-Level Changes

All changes are in install_windows.py.

Refactor can_connect signature and implementation
  • Add type hints for url, timeout, and the return type
  • Replace the status_code range check with response.ok
  • Simplify the exception catch to requests.RequestException

Redesign check_proxy to use can_connect and return a tuple
  • Change the signature to (url: str, timeout: float) -> tuple[str, bool]
  • Delegate the URL health check to can_connect with a specific GitHub archive URL
  • Always return (url, result) instead of None

Simplify get_working_proxy concurrency and cleanup
  • Annotate the return type as str and return an empty string if no proxy is found
  • Use a list comprehension to submit futures and drop the manual future-to-proxy map
  • Add executor.shutdown with cancel_futures=True in a finally block
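The get_working_proxy changes described above can be sketched as follows. Here check_proxy is a stub standing in for the real network check (which HEADs a GitHub archive URL through each proxy), so only the submission, as_completed, and shutdown pattern is illustrated:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def check_proxy(url: str, timeout: float = 5) -> tuple[str, bool]:
    # stub: the real version calls can_connect on an archive URL via the proxy
    return url, url.startswith("https://good")

def get_working_proxy(proxies: list[str]) -> str:
    executor = ThreadPoolExecutor(max_workers=len(proxies))
    try:
        # submit all checks inline and take the first proxy that reports success
        for future in as_completed(
            [executor.submit(check_proxy, proxy) for proxy in proxies]
        ):
            proxy, ok = future.result()
            if ok:
                return proxy
        return ""  # no proxy worked
    finally:
        # on early return, pending checks are cancelled rather than awaited
        executor.shutdown(wait=False, cancel_futures=True)
```

Note that cancel_futures=True does matter on the early-return path: when a working proxy is found mid-loop, the remaining futures are still pending and get cancelled instead of blocking shutdown.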

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

@sourcery-ai sourcery-ai bot left a comment (Contributor)


Hey @LY-Xiang - I've reviewed your changes and they look great!

Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments

### Comment 1
<location> `install_windows.py:472` </location>
<code_context>
-            if result is not None:
-                return result
-    return None
+    executor = ThreadPoolExecutor()
+    try:
+        for future in as_completed(
+            [executor.submit(check_proxy, proxy) for proxy in proxies]
+        ):
+            proxy, result = future.result()
+            if result:
+                return proxy
+        return ""
+    finally:
+        executor.shutdown(wait=False, cancel_futures=True)


</code_context>

<issue_to_address>
ThreadPoolExecutor is not limited by the number of proxies, which may impact resource usage.

Explicitly setting max_workers to len(proxies) may provide more predictable resource usage and parallelism, especially when the number of proxies varies.
</issue_to_address>

<suggested_fix>
<<<<<<< SEARCH
    executor = ThreadPoolExecutor()
=======
    executor = ThreadPoolExecutor(max_workers=len(proxies))
>>>>>>> REPLACE

</suggested_fix>

### Comment 2
<location> `install_windows.py:482` </location>
<code_context>
+                return proxy
+        return ""
+    finally:
+        executor.shutdown(wait=False, cancel_futures=True)


</code_context>

<issue_to_address>
Using cancel_futures=True may not be necessary if all futures have completed.

Consider removing cancel_futures=True unless you anticipate incomplete futures, as as_completed ensures all are finished.
</issue_to_address>

### Comment 3
<location> `install_windows.py:463` </location>
<code_context>
-    except requests.exceptions.RequestException:
-        pass
-    return None
+def check_proxy(url: str, timeout: float = 5) -> tuple[str, bool]:
+    return url, can_connect(
+        f"{url}/https://github.com/Mzdyl/LiteLoaderQQNT_Install/archive/refs/heads/main.zip",
</code_context>

<issue_to_address>
Consider refactoring check_proxy to return only a bool and using ThreadPoolExecutor with a context manager and executor.map in get_working_proxy for a more concise and maintainable implementation.

You can simplify both functions by

 1. Making `check_proxy` return only a `bool`  
 2. Using a `with ThreadPoolExecutor(...)` context manager + `executor.map`  
 3. Picking the first working proxy with a single `next(...)`

This restores the concise flow and avoids manual shutdown or tuple‐unpacking in the caller:

```python
def check_proxy(proxy: str, timeout: float = 5) -> bool:
    test_url = f"{proxy}/https://github.com/Mzdyl/LiteLoaderQQNT_Install/archive/refs/heads/main.zip"
    return can_connect(test_url, timeout)
```

```python
from concurrent.futures import ThreadPoolExecutor

def get_working_proxy() -> str:
    proxies = get_github_proxy_urls()
    # use len(proxies) to maximize concurrency, and auto‐shutdown
    with ThreadPoolExecutor(max_workers=len(proxies)) as executor:
        # map each proxy to (proxy, ok)
        results = executor.map(lambda p: (p, check_proxy(p)), proxies)
        # return the first proxy where ok is True, or empty string
        return next((p for p, ok in results if ok), "")
```

This keeps the same behavior, restores the context manager, and removes boilerplate unpacking and manual executor shutdown.
</issue_to_address>


LY-Xiang merged commit 749b4b3 into LL1.2.4 on Aug 13, 2025, with 2 of 6 checks passed.
LY-Xiang deleted the refactor/getworkingproxy branch on August 13, 2025 at 10:30.