Using FetchContent to fetch libdawn.dylib from github when library not found #44
Conversation
This looks good overall, thanks! Tagging @MichealReed as well. A question: I'd expect a build to go down one of two mutually exclusive paths: fetching a prebuilt libdawn, or building Dawn from source.
Mixing the two could raise issues, since webgpu.h is generated by the build itself. Perhaps there's a flag/option that toggles between the two, but currently it seems to do both.
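If mutual exclusivity is wanted, one option is a cache toggle; a minimal sketch, assuming a hypothetical `GPU_FETCH_PREBUILT_DAWN` option name and a `LIBDAWN_URL` placeholder (neither is part of this PR):

```cmake
# Hypothetical toggle between the two paths; the option name is illustrative.
option(GPU_FETCH_PREBUILT_DAWN "Download a prebuilt libdawn instead of building Dawn" ON)

if(GPU_FETCH_PREBUILT_DAWN)
  # Path 1: fetch the prebuilt shared library; use the pinned webgpu.h from the repo.
  include(FetchContent)
  FetchContent_Declare(libdawn URL "${LIBDAWN_URL}")
  FetchContent_MakeAvailable(libdawn)
else()
  # Path 2: build Dawn from source; this build generates webgpu.h itself.
  add_subdirectory(third_party/dawn)
endif()
```

Keeping the paths behind a single option avoids mixing a generated webgpu.h with the pinned one.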
It seems like a good approach to fall back to downloading if the library is not found. @austinvhuang, how often do you think people will actually want to change and build their own dawnlib? We may be able to get by with downloads being the default, or even the only supported path, for CMake?
cmake/gpu.cmake
```cmake
endif()
FetchContent_Declare(
  libdawn
  URL https://github.com/austinvhuang/dawn-artifacts/releases/download/prerelease/libdawn.${libdawn_ext}
```
I'm not sure if they're hosted yet, but we should use the libdawn_${ARCH}_${BUILD_TYPE} convention unless @austinvhuang wants to migrate away from that.
For Windows, yes, but the Windows libraries are hosted elsewhere. Under this URL it's just libdawn.{so/dylib}. Maybe I can add a Windows variant that encodes the architecture and build type.
I think we were going to structure all the libs with this sort of convention; I could be mistaken, and we need Austin to clarify. Other OSes will have debug/release/architecture differences too, so I think we wanted to capture all of them with a generalized approach.
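As a sketch of that generalized approach, the artifact name could be derived from the platform at configure time (variable names here are illustrative, and the `libdawn_${ARCH}_${BUILD_TYPE}` convention is assumed as described above):

```cmake
# Sketch: derive the artifact file name following libdawn_${ARCH}_${BUILD_TYPE}.
if(CMAKE_SYSTEM_NAME MATCHES "Darwin")
  set(libdawn_ext "dylib")
elseif(WIN32)
  set(libdawn_ext "dll")
else()
  set(libdawn_ext "so")
endif()
string(TOLOWER "${CMAKE_SYSTEM_PROCESSOR}" libdawn_arch)  # e.g. x86_64, arm64
string(TOLOWER "${CMAKE_BUILD_TYPE}" libdawn_build_type)  # e.g. debug, release
set(libdawn_name "libdawn_${libdawn_arch}_${libdawn_build_type}.${libdawn_ext}")
```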
cmake/gpu.cmake
```cmake
message("libdawn not found, try downloading from the release")
if(${CMAKE_SYSTEM_NAME} MATCHES "Darwin")
  set(libdawn_ext "dylib")
elseif(UNIX)
```
This will definitely fail on Windows until we get artifacts hosted for Windows and update this to accommodate the naming convention. The biggest challenge is debug/release builds with pre-downloaded artifacts. @austinvhuang, weren't you thinking about using those Google-built libs instead of hosting our own?
Inside this else branch the system cannot be Windows; I can add a download step for Windows (i.e. at line 53).
Maybe we should follow the @jgh- approach here: https://github.com/jgh-/webgpu-workbench/blob/main/cmake/dawn.cmake
True... this will build the dawn lib according to the OS. Let me try it in my fork to see if it works.
From [this](#44 (comment)):
Well, I'm trying to use a similar structure in the script, but it just gets stuck in one place and stops responding.
@austinvhuang I investigated downloading the artifacts from the dawn CI. The API requires a token for this (https://docs.github.com/en/rest/actions/artifacts?apiVersion=2022-11-28#download-an-artifact); we could likely set that up via CI, though: https://github.com/marketplace/actions/download-workflow-artifact

This script hits a Forbidden response due to the missing token:

```python
import requests
import zipfile
import os

REPO_OWNER = 'google'
REPO_NAME = 'dawn'
WORKFLOW_FILE_NAME = 'ci.yml'  # The specific workflow file name

def get_latest_successful_workflow_run():
    url = f'https://api.github.com/repos/{REPO_OWNER}/{REPO_NAME}/actions/workflows/{WORKFLOW_FILE_NAME}/runs?status=success'
    response = requests.get(url)
    response.raise_for_status()
    runs = response.json()['workflow_runs']
    if not runs:
        raise Exception('No successful workflow runs found')
    return runs[0]['id']  # Get the latest successful run ID

def list_artifacts(run_id):
    url = f'https://api.github.com/repos/{REPO_OWNER}/{REPO_NAME}/actions/runs/{run_id}/artifacts'
    response = requests.get(url)
    response.raise_for_status()
    return response.json()['artifacts']

def download_artifact(artifact):
    download_url = artifact['archive_download_url']
    response = requests.get(download_url, stream=True)
    response.raise_for_status()
    with open(f"{artifact['name']}.zip", 'wb') as f:
        for chunk in response.iter_content(chunk_size=8192):
            f.write(chunk)
    with zipfile.ZipFile(f"{artifact['name']}.zip", 'r') as zip_ref:
        zip_ref.extractall(artifact['name'])
    os.remove(f"{artifact['name']}.zip")

def main():
    print("Fetching the latest successful workflow run ID...")
    latest_run_id = get_latest_successful_workflow_run()
    print(f"Listing artifacts for run ID: {latest_run_id}...")
    artifacts = list_artifacts(latest_run_id)
    for artifact in artifacts:
        print(f"Downloading and extracting {artifact['name']}...")
        download_artifact(artifact)
        print(f"Extracted {artifact['name']}")

if __name__ == "__main__":
    main()
```
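For comparison, the same authenticated download could be done from CMake itself once an artifact URL is known; a sketch, where `DAWN_ARTIFACT_URL` is a placeholder and the token is read from the `GITHUB_TOKEN` environment variable:

```cmake
# Sketch: authenticated artifact download; DAWN_ARTIFACT_URL is a placeholder,
# not a real endpoint from this PR.
file(DOWNLOAD
  "${DAWN_ARTIFACT_URL}"
  "${CMAKE_BINARY_DIR}/dawn_artifact.zip"
  HTTPHEADER "Authorization: Bearer $ENV{GITHUB_TOKEN}"
  HTTPHEADER "Accept: application/vnd.github+json"
  STATUS download_status)
list(GET download_status 0 status_code)
if(NOT status_code EQUAL 0)
  message(WARNING "Artifact download failed: ${download_status}")
endif()
```

This would still need the token provisioned in CI, so it doesn't remove the auth requirement, only the Python dependency.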
Yeah, I think the repository is just huge; mine downloaded over 10 GB. Building on each run might not be viable, and we'll likely run into cache issues. It may be better to initialize the dawn repository as a submodule rather than relying on cmake. We will probably have to find a way to automate the artifact downloads instead.
You have to make sure you aren't downloading the submodules. I believe the dawn repository itself is reasonably sized, but once you start cloning the submodules it gets huge. The dependency Python script it runs is sufficient to get everything it needs to build.
Yeah, after disabling submodule downloads it is much faster, but I'm hitting this error:
probably because dawn was previously built by webgpu and some variables haven't been cleaned up yet.
Might be better to initialize a submodule under third_party/local for dawn and then approach it like:

```cmake
# Set the DAWN_EXT_SOURCE_DIR to the local Dawn directory
set(DAWN_EXT_SOURCE_DIR "${TARGET_DIR}/third_party/local/dawn" CACHE STRING "Path to the local Dawn source directory")

execute_process(
  COMMAND python3 ${DAWN_EXT_SOURCE_DIR}/tools/fetch_dawn_dependencies.py
  WORKING_DIRECTORY ${DAWN_EXT_SOURCE_DIR}
  RESULT_VARIABLE FETCH_RESULT
)

set(DESIRED_CMAKE_ARGS
  -DDAWN_DISABLE_LOGGING=ON
  -DCMAKE_POSITION_INDEPENDENT_CODE=ON
)

include(ExternalProject)
ExternalProject_Add(
  dawn_project
  PREFIX dawn
  SOURCE_DIR ${DAWN_EXT_SOURCE_DIR}
  CMAKE_ARGS ${DESIRED_CMAKE_ARGS}
  BUILD_COMMAND ${CMAKE_COMMAND} --build <BINARY_DIR>
  INSTALL_DIR ${INSTALL_DIR}
  STEP_TARGETS install
)

add_custom_target(install_dawn
  DEPENDS dawn_project-install
)
```
Got this working fairly well here; feel free to take any of it as inspiration for this refactor 👍 https://github.com/MichealReed/minigpu/tree/master/minigpu_ffi/src
Opened google/dawn#20 for this; the dawn builds will error unless you add googletest to the dependencies list during init.
In this version the project is installed to third_party/local/dawn/install. Maybe the next step is to move the libs and include files to their designated places.
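Moving them could be a post-install copy step; a sketch that assumes the `install_dawn` target shown earlier in the thread and purely illustrative destination directories:

```cmake
# Sketch: copy installed Dawn outputs into the project's expected layout.
# Destination paths are illustrative, not the repo's actual layout.
set(DAWN_INSTALL_DIR "${CMAKE_SOURCE_DIR}/third_party/local/dawn/install")
add_custom_command(TARGET install_dawn POST_BUILD
  COMMAND ${CMAKE_COMMAND} -E copy_directory
          "${DAWN_INSTALL_DIR}/lib" "${CMAKE_SOURCE_DIR}/third_party/lib"
  COMMAND ${CMAKE_COMMAND} -E copy_directory
          "${DAWN_INSTALL_DIR}/include" "${CMAKE_SOURCE_DIR}/third_party/headers"
)
```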
Catching up on this thread. Thanks for investigating. I love the idea of getting artifacts from Dawn directly. Last time I tried it, I was able to download them manually; however, macOS security restrictions meant I could not use the mac dylib artifacts after downloading them because they're not signed (IIRC I couldn't even get to a dialog to override the restriction). This doesn't have to be solved in this PR, though. If there's a workaround for that plus API keys, I'd be open to making it work, though there should probably still be a manual human approval step for pointing at upstream versions, since breaking API changes still happen in dawn. BTW, besides the issues, if you want to bring things up with the dawn team, they seem to frequent the Matrix chat: https://app.element.io/#/room/#webgpu-dawn:matrix.org
Downloads as the default common case make sense to me. Building from dawn would be a nice-to-have (but not must-have) for power users; for example, @junjihashimoto is investigating subgroup features that landed in the dawn repo at head just a few days ago. Another reason it could come up is for environments we don't have the bandwidth to provide prebuilt binaries for. Ideally we could put that behind a flag (and/or as a fallback if downloads fail), but if it adds too much complexity we can consider leaving it out.
Seems reasonable for now. In general, keeping cmake builds/locations coherent is good, but mixing locations with the simplified make builds could be tricky, since the generated include header is part of the repo and pinned to the prebuilt binary artifact version (so it's fine, maybe preferable, to keep those locations separate unless we have a strategy to make them play nicely together).
Okay, going ahead and merging for now. Happy to explore further refinements (e.g. dawn artifacts) in separate PRs or in the Discord chat. Thanks!
This enables the project to be fetched with CMake FetchContent and to be built directly with CMake without calling the Python script.
Solution to this issue.