impl(rest): map http code 504 to kUnavailable #16077
+14 −3 · Merged
Google Cloud Build / m32-pr (cloud-cpp-testing-resources): succeeded Apr 10, 2026 in 16m 28s
Summary
Build Information
| Field | Value |
|---|---|
| Trigger | m32-pr |
| Build | 4e9a30f9-4ad7-471b-a063-9361dfc6197c |
| Start | 2026-04-10T15:20:34-07:00 |
| Duration | 15m23.998s |
| Status | SUCCESS |
Steps
| Step | Status | Duration |
|---|---|---|
| kaniko-build | SUCCESS | 3m25.67s |
| download-runner-image | SUCCESS | 50.565s |
| build.sh | SUCCESS | 10m51.128s |
| remove-image | SUCCESS | 3.29s |
| cancel-in-progress-builds-for-PR | SUCCESS | 56.629s |
Details
starting build "4e9a30f9-4ad7-471b-a063-9361dfc6197c"
FETCHSOURCE
From https://github.com/googleapis/google-cloud-cpp
* branch 39cd67c69b5f29196c08c77ffbd1f81425b2b589 -> FETCH_HEAD
Updating files: 100% (22038/22038), done.
HEAD is now at 39cd67c6 impl(rest): map http code 504 to kUnavailable
GitCommit: 39cd67c69b5f29196c08c77ffbd1f81425b2b589
BUILD
Starting Step #4 - "cancel-in-progress-builds-for-PR"
Starting Step #0 - "kaniko-build"
Step #4 - "cancel-in-progress-builds-for-PR": Pulling image: gcr.io/google.com/cloudsdktool/cloud-sdk
Step #0 - "kaniko-build": Pulling image: gcr.io/kaniko-project/executor:v1.24.0-debug
Step #4 - "cancel-in-progress-builds-for-PR": Using default tag: latest
Step #4 - "cancel-in-progress-builds-for-PR": latest: Pulling from google.com/cloudsdktool/cloud-sdk
Step #0 - "kaniko-build": v1.24.0-debug: Pulling from kaniko-project/executor
Step #0 - "kaniko-build": Digest: sha256:2562c4fe551399514277ffff7dcca9a3b1628c4ea38cb017d7286dc6ea52f4cd
Step #0 - "kaniko-build": Status: Downloaded newer image for gcr.io/kaniko-project/executor:v1.24.0-debug
Step #0 - "kaniko-build": gcr.io/kaniko-project/executor:v1.24.0-debug
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Using dockerignore file: /workspace/ci/.dockerignore"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Retrieving image fedora:40 from registry index.docker.io"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Built cross stage deps: map[]"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Retrieving image manifest fedora:40"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Returning cached image manifest"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Executing 0 build triggers"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Building stage 'fedora:40' [idx: '0', base-idx: '-1']"
Step #0 - "kaniko-build": time="2026-04-10T22:20:52Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:afc0f5f5bbcaaaf7d1a92a51ce849027e05067aa3f7f585a771b9b92c066e42c..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y cmake curl diffutils findutils gcc-c++ git make ninja-build patch tar unzip wget which zip"
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:2bb3f636ffa4335715c5829fa45084e6538bb330a4a74ac18c90741560db4517..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y protobuf-compiler grpc-cpp"
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:59809c2bf009e5cab3bcd1d4d94153065a4d624f96b542c7c922dc0c78229678..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y c-ares-devel.i686 glibc-devel.i686 gmock-devel.i686 google-benchmark-devel.i686 grpc-devel.i686 gtest-devel.i686 libcurl-devel.i686 libstdc++-devel.i686 openssl-devel.i686 protobuf-devel.i686 re2-devel.i686 zlib-ng-compat-devel.i686"
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:6cf80c4d3157a33193f638728f84ee29f54ecf3bdb4e47cc7fec26c345e2a220..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y python3-devel"
Step #0 - "kaniko-build": time="2026-04-10T22:20:53Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:75e7494dd10906caac9c0a2482c4f58bbe57eae856504bb2f8a7640b01bcc839..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Using caching version of cmd: RUN pip3 install --upgrade pip"
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:3586f05e5e8a4bf2d44cfc2e3884532ccae60e8450cc85ea1cc70160e8ad28fc..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Using caching version of cmd: RUN pip3 install setuptools wheel"
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:58cf47c0eac17f1f89d5554a0aa72c5138bf387f966204f2cb15a1fb6611499a..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y java-latest-openjdk-devel"
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:41ad1604974c6781938026bb85a384796743ef13c740d786c5092bb958529e22..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Using caching version of cmd: RUN echo 'root:cloudcxx' | chpasswd"
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:ab14624c65d112dc5734be7526ea246228cf8a2acf4b53f5974ed32122715bbb..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://distfiles.ariadne.space/pkgconf/pkgconf-2.2.0.tar.gz | tar -xzf - --strip-components=1 && ./configure --prefix=/usr --with-system-libdir=/lib64:/usr/lib64 --with-system-includedir=/usr/include && make -j ${NCPU:-4} && make install && ldconfig && cd /var/tmp && rm -fr build"
Step #0 - "kaniko-build": time="2026-04-10T22:20:54Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:7e23d26ed6e59ac644818046e9f3f7c8d1e44cecedbf3383905ff014af2fcc9f..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/nlohmann/json/archive/v3.11.3.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=yes -DBUILD_TESTING=OFF -DJSON_BuildTests=OFF -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:f1823f8312906cdfd90a077211017a134badaabaab23c429729bd7b20755011e..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/open-telemetry/opentelemetry-cpp/archive/v1.24.0.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_CXX_STANDARD=17 -DCMAKE_CXX_COMPILER=g++ -DCMAKE_CXX_FLAGS=-m32 -DCMAKE_FIND_ROOT_PATH=/usr/ -DCMAKE_FIND_ROOT_PATH_MODE_PROGRAM=NEVER -DCMAKE_FIND_ROOT_PATH_MODE_LIBRARY=ONLY -DCMAKE_FIND_ROOT_PATH_MODE_INCLUDE=ONLY -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE -DBUILD_SHARED_LIBS=ON -DWITH_EXAMPLES=OFF -DWITH_STL=CXX17 -DBUILD_TESTING=OFF -DOPENTELEMETRY_INSTALL=ON -DOPENTELEMETRY_ABI_VERSION_NO=2 -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:2660b9977453aee214daa36b856872d12ab7a2d60361c380bd52261cca8c27f5..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Using caching version of cmd: RUN curl -fsSL https://github.com/mozilla/sccache/releases/download/v0.10.0/sccache-v0.10.0-x86_64-unknown-linux-musl.tar.gz | tar -zxf - --strip-components=1 && mkdir -p /usr/local/bin && mv sccache /usr/local/bin/sccache && chmod +x /usr/local/bin/sccache"
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:79ee650d218f19110f90cb9de88fd5e4ee74f2a7a91181931a62110d0df85cef..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Using caching version of cmd: RUN dnf makecache && dnf install -y python3.10"
Step #0 - "kaniko-build": time="2026-04-10T22:20:55Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:0c7e2486758324fb2e8f2d1e95549f0cd9fb29b9c6cffb1a93f5fd98392e1c0b..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:56Z" level=info msg="Using caching version of cmd: RUN /var/tmp/ci/install-cloud-sdk.sh"
Step #0 - "kaniko-build": time="2026-04-10T22:20:56Z" level=info msg="Checking for cached layer us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32/cache:165d50d7b7d965b88cdb67065f72257581b2af3d1aa3a0965e36dc791b0fe235..."
Step #0 - "kaniko-build": time="2026-04-10T22:20:56Z" level=info msg="Using caching version of cmd: RUN ldconfig /usr/local/lib*"
Step #0 - "kaniko-build": time="2026-04-10T22:20:56Z" level=info msg="Unpacking rootfs as cmd COPY . /var/tmp/ci requires it."
Step #0 - "kaniko-build": time="2026-04-10T22:20:59Z" level=info msg="ARG NCPU=4"
Step #0 - "kaniko-build": time="2026-04-10T22:20:59Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-04-10T22:20:59Z" level=info msg="RUN dnf makecache && dnf install -y cmake curl diffutils findutils gcc-c++ git make ninja-build patch tar unzip wget which zip"
Step #0 - "kaniko-build": time="2026-04-10T22:20:59Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:21:28Z" level=info msg="RUN dnf makecache && dnf install -y protobuf-compiler grpc-cpp"
Step #0 - "kaniko-build": time="2026-04-10T22:21:28Z" level=info msg="Found cached layer, extracting to filesystem"
Step #4 - "cancel-in-progress-builds-for-PR": Digest: sha256:3dfb91948331de0a0ff66f883c546b6f57ca92042794b7b61f0bdc536236c048
Step #4 - "cancel-in-progress-builds-for-PR": Status: Downloaded newer image for gcr.io/google.com/cloudsdktool/cloud-sdk:latest
Step #4 - "cancel-in-progress-builds-for-PR": gcr.io/google.com/cloudsdktool/cloud-sdk:latest
Step #0 - "kaniko-build": time="2026-04-10T22:21:32Z" level=info msg="RUN dnf makecache && dnf install -y c-ares-devel.i686 glibc-devel.i686 gmock-devel.i686 google-benchmark-devel.i686 grpc-devel.i686 gtest-devel.i686 libcurl-devel.i686 libstdc++-devel.i686 openssl-devel.i686 protobuf-devel.i686 re2-devel.i686 zlib-ng-compat-devel.i686"
Step #0 - "kaniko-build": time="2026-04-10T22:21:32Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:21:41Z" level=info msg="RUN dnf makecache && dnf install -y python3-devel"
Step #0 - "kaniko-build": time="2026-04-10T22:21:41Z" level=info msg="Found cached layer, extracting to filesystem"
Step #4 - "cancel-in-progress-builds-for-PR": + test -z 16077
Step #4 - "cancel-in-progress-builds-for-PR": ++ gcloud builds describe --region us-east1 --format 'value(create_time)' 4e9a30f9-4ad7-471b-a063-9361dfc6197c
Step #4 - "cancel-in-progress-builds-for-PR": + ctime=2026-04-10T22:19:40.690913Z
Step #4 - "cancel-in-progress-builds-for-PR": + query=tags=pr
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND tags=16077'
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND substitutions.COMMIT_SHA != 39cd67c69b5f29196c08c77ffbd1f81425b2b589'
Step #4 - "cancel-in-progress-builds-for-PR": + query+=' AND create_time < 2026-04-10T22:19:40.690913Z'
Step #4 - "cancel-in-progress-builds-for-PR": + gcloud builds list --region us-east1 --ongoing '--format=value(id)' --filter 'tags=pr AND tags=16077 AND substitutions.COMMIT_SHA != 39cd67c69b5f29196c08c77ffbd1f81425b2b589 AND create_time < 2026-04-10T22:19:40.690913Z'
Step #4 - "cancel-in-progress-builds-for-PR": + xargs -r -t gcloud builds cancel --region us-east1
Step #4 - "cancel-in-progress-builds-for-PR": WARNING: The following filter keys were not present in any resource : create_time, substitutions.COMMIT_SHA, tags
Finished Step #4 - "cancel-in-progress-builds-for-PR"
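The cancel step traced above deduplicates CI work: it cancels any still-running build for the same PR that was started from an older commit. Its filter construction can be sketched as follows, with the PR number, commit SHA, and create time hard-coded from this build's trace (the real step derives them from the build environment):

```shell
#!/bin/sh
# Sketch of the filter built by "cancel-in-progress-builds-for-PR",
# using values taken from this build's own trace.
set -eu

pr_number="16077"
commit_sha="39cd67c69b5f29196c08c77ffbd1f81425b2b589"
ctime="2026-04-10T22:19:40.690913Z"

# Select only *other* in-progress builds for the same PR: same tags,
# a different commit, created before this build.
query="tags=pr"
query="${query} AND tags=${pr_number}"
query="${query} AND substitutions.COMMIT_SHA != ${commit_sha}"
query="${query} AND create_time < ${ctime}"
echo "${query}"

# The step then feeds the matching build IDs into cancellation:
#   gcloud builds list --region us-east1 --ongoing --format='value(id)' \
#       --filter "${query}" | xargs -r -t gcloud builds cancel --region us-east1
```

The `xargs -r` flag keeps the cancel command from running at all when the list is empty, which is why this build (with no stale siblings) finishes the step with only a filter warning.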
Step #0 - "kaniko-build": time="2026-04-10T22:21:47Z" level=info msg="RUN pip3 install --upgrade pip"
Step #0 - "kaniko-build": time="2026-04-10T22:21:47Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:21:48Z" level=info msg="RUN pip3 install setuptools wheel"
Step #0 - "kaniko-build": time="2026-04-10T22:21:48Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:21:49Z" level=info msg="RUN dnf makecache && dnf install -y java-latest-openjdk-devel"
Step #0 - "kaniko-build": time="2026-04-10T22:21:49Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="RUN echo 'root:cloudcxx' | chpasswd"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="WORKDIR /var/tmp/build/pkgconf"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Changed working directory to /var/tmp/build/pkgconf"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Creating directory /var/tmp/build/pkgconf with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="RUN curl -fsSL https://distfiles.ariadne.space/pkgconf/pkgconf-2.2.0.tar.gz | tar -xzf - --strip-components=1 && ./configure --prefix=/usr --with-system-libdir=/lib64:/usr/lib64 --with-system-includedir=/usr/include && make -j ${NCPU:-4} && make install && ldconfig && cd /var/tmp && rm -fr build"
Step #0 - "kaniko-build": time="2026-04-10T22:22:35Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="ENV PKG_CONFIG_PATH=/usr/local/share/pkgconfig:/usr/lib/pkgconfig"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="WORKDIR /var/tmp/build/json"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="Changed working directory to /var/tmp/build/json"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="Creating directory /var/tmp/build/json with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="RUN curl -fsSL https://github.com/nlohmann/json/archive/v3.11.3.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=yes -DBUILD_TESTING=OFF -DJSON_BuildTests=OFF -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-04-10T22:22:36Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="WORKDIR /var/tmp/build/opentelemetry"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="Changed working directory to /var/tmp/build/opentelemetry"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="Creating directory /var/tmp/build/opentelemetry with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="RUN curl -fsSL https://github.com/open-telemetry/opentelemetry-cpp/archive/v1.24.0.tar.gz | tar -xzf - --strip-components=1 && cmake -DCMAKE_CXX_STANDARD=17 -DCMAKE_CXX_COMPILER=g++ -DCMAKE_CXX_FLAGS=-m32 -DCMAKE_FIND_ROOT_PATH=/usr/ -DCMAKE_FIND_ROOT_PATH_MODE_PROGRAM=NEVER -DCMAKE_FIND_ROOT_PATH_MODE_LIBRARY=ONLY -DCMAKE_FIND_ROOT_PATH_MODE_INCLUDE=ONLY -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE -DBUILD_SHARED_LIBS=ON -DWITH_EXAMPLES=OFF -DWITH_STL=CXX17 -DBUILD_TESTING=OFF -DOPENTELEMETRY_INSTALL=ON -DOPENTELEMETRY_ABI_VERSION_NO=2 -S . -B cmake-out && cmake --build cmake-out --target install -- -j ${NCPU:-4} && ldconfig"
Step #0 - "kaniko-build": time="2026-04-10T22:22:37Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="WORKDIR /var/tmp/sccache"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="Changed working directory to /var/tmp/sccache"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="Creating directory /var/tmp/sccache with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="RUN curl -fsSL https://github.com/mozilla/sccache/releases/download/v0.10.0/sccache-v0.10.0-x86_64-unknown-linux-musl.tar.gz | tar -zxf - --strip-components=1 && mkdir -p /usr/local/bin && mv sccache /usr/local/bin/sccache && chmod +x /usr/local/bin/sccache"
Step #0 - "kaniko-build": time="2026-04-10T22:22:38Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="COPY . /var/tmp/ci"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="WORKDIR /var/tmp/downloads"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Cmd: workdir"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Changed working directory to /var/tmp/downloads"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Creating directory /var/tmp/downloads with uid -1 and gid -1"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Taking snapshot of files..."
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="RUN dnf makecache && dnf install -y python3.10"
Step #0 - "kaniko-build": time="2026-04-10T22:22:40Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:22:45Z" level=info msg="ENV CLOUDSDK_PYTHON=python3.10"
Step #0 - "kaniko-build": time="2026-04-10T22:22:45Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-04-10T22:22:45Z" level=info msg="RUN /var/tmp/ci/install-cloud-sdk.sh"
Step #0 - "kaniko-build": time="2026-04-10T22:22:45Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="ENV CLOUD_SDK_LOCATION=/usr/local/google-cloud-sdk"
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="ENV PATH=${CLOUD_SDK_LOCATION}/bin:${PATH}"
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="No files changed in this command, skipping snapshotting."
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="RUN ldconfig /usr/local/lib*"
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="Found cached layer, extracting to filesystem"
Step #0 - "kaniko-build": time="2026-04-10T22:24:08Z" level=info msg="Pushing image to us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:4e9a30f9-4ad7-471b-a063-9361dfc6197c"
Step #0 - "kaniko-build": time="2026-04-10T22:24:09Z" level=info msg="Pushed us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32@sha256:475abb0ce415205b348b3bde53447195201252864ce257e135528836a3ff0f36"
Finished Step #0 - "kaniko-build"
Starting Step #1 - "download-runner-image"
Step #1 - "download-runner-image": Pulling image: us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:4e9a30f9-4ad7-471b-a063-9361dfc6197c
Step #1 - "download-runner-image": 4e9a30f9-4ad7-471b-a063-9361dfc6197c: Pulling from cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32
...
[Logs truncated due to log size limitations. For full logs, see https://storage.cloud.google.com/cloud-cpp-community-publiclogs/logs/google-cloud-cpp/16077/39cd67c69b5f29196c08c77ffbd1f81425b2b589/fedora-m32-m32-__default__/log-4e9a30f9-4ad7-471b-a063-9361dfc6197c.txt.]
...
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.141208343Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.141227120Z [DEBUG] <4125135744> Read(25)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.141318568Z [DEBUG] <4125135744> Read(25)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.151284190Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:76)
Step #2 - "build.sh": 2026-04-10T22:35:41.152039048Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:60)
Step #2 - "build.sh": 2026-04-10T22:35:41.152304586Z [INFO] <4076858176> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-04-10T22:35:41.152608230Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.154200153Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.154220406Z [DEBUG] <4125135744> Read(26)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.154579118Z [DEBUG] <4125135744> Read(26)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data1"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-04-10T22:35:41.154644172Z [DEBUG] <4125135744> Read(26)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.154721084Z [DEBUG] <4125135744> Read(26)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-3"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "data2"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-04-10T22:35:41.154749438Z [DEBUG] <4125135744> Read(26)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.154792238Z [DEBUG] <4125135744> Read(26)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.155866699Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.155987848Z [DEBUG] <4125135744> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.157852003Z [DEBUG] <4125135744> DropRowRange() >> status=OK (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": 2026-04-10T22:35:41.166067033Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:76)
Step #2 - "build.sh": 2026-04-10T22:35:41.166830189Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:60)
Step #2 - "build.sh": 2026-04-10T22:35:41.166991210Z [INFO] <4076858176> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-04-10T22:35:41.167255978Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.168440425Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.168458196Z [DEBUG] <4125135744> Read(27)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.168637979Z [DEBUG] <4125135744> Read(27)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.169284943Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.169406080Z [DEBUG] <4125135744> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.170733523Z [DEBUG] <4125135744> DropRowRange() >> status=OK (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": 2026-04-10T22:35:41.171424145Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5kpj5ln1t95xpdnlc4ox1zaolqxgvngde8"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.172398909Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.172413566Z [DEBUG] <4125135744> Read(28)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.172715364Z [DEBUG] <4125135744> Read(28)() >> NOT_FOUND: table "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5kpj5ln1t95xpdnlc4ox1zaolqxgvngde8" not found (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.178967062Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:76)
Step #2 - "build.sh": 2026-04-10T22:35:41.179474687Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:60)
Step #2 - "build.sh": 2026-04-10T22:35:41.179631529Z [INFO] <4076858176> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-04-10T22:35:41.179870597Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.181011833Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.181028074Z [DEBUG] <4125135744> Read(29)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.181294931Z [DEBUG] <4125135744> Read(29)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.182015214Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.182110807Z [DEBUG] <4125135744> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.183664502Z [DEBUG] <4125135744> DropRowRange() >> status=OK (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": 2026-04-10T22:35:41.184551102Z [DEBUG] <4125135744> MutateRow() << google.bigtable.v2.MutateRowRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": mutations {
Step #2 - "build.sh": set_cell {
Step #2 - "build.sh": family_name: "family4"
Step #2 - "build.sh": column_qualifier: "c1"
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "V1000"
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": mutations {
Step #2 - "build.sh": set_cell {
Step #2 - "build.sh": family_name: "family4"
Step #2 - "build.sh": column_qualifier: "c2"
Step #2 - "build.sh": timestamp_micros: 2000
Step #2 - "build.sh": value: "V2000"
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": mutations {
Step #2 - "build.sh": set_cell {
Step #2 - "build.sh": family_name: "family4"
Step #2 - "build.sh": column_qualifier: "c3"
Step #2 - "build.sh": timestamp_micros: 3000
Step #2 - "build.sh": value: "V3000"
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.187464950Z [DEBUG] <4125135744> MutateRow() >> response=google.bigtable.v2.MutateRowResponse {
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.h:76)
Step #2 - "build.sh": 2026-04-10T22:35:41.187910626Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.189099369Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.189116532Z [DEBUG] <4125135744> Read(30)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.189452569Z [DEBUG] <4125135744> Read(30)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "V1000"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 2000
Step #2 - "build.sh": value: "V2000"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c3"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 3000
Step #2 - "build.sh": value: "V3000"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-04-10T22:35:41.189529413Z [DEBUG] <4125135744> Read(30)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.189570486Z [DEBUG] <4125135744> Read(30)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.195963424Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/internal/bigtable_stub_factory.cc:76)
Step #2 - "build.sh": 2026-04-10T22:35:41.196508083Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/monitoring/v3/internal/metric_stub_factory.cc:60)
Step #2 - "build.sh": 2026-04-10T22:35:41.196662962Z [INFO] <4076858176> Cloud Monitoring Export skipped. No data. (/workspace/google/cloud/opentelemetry/internal/monitoring_exporter.cc:132)
Step #2 - "build.sh": 2026-04-10T22:35:41.196904562Z [DEBUG] <4125135744> ReadRows() << google.bigtable.v2.ReadRowsRequest {
Step #2 - "build.sh": table_name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": rows {
Step #2 - "build.sh": row_ranges {
Step #2 - "build.sh": }
Step #2 - "build.sh": }
Step #2 - "build.sh": filter {
Step #2 - "build.sh": pass_all_filter: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.198274271Z [DEBUG] <4125135744> ReadRows() >> not null (/workspace/google/cloud/internal/log_wrapper.cc:53)
Step #2 - "build.sh": 2026-04-10T22:35:41.198293222Z [DEBUG] <4125135744> Read(31)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.198739147Z [DEBUG] <4125135744> Read(31)() >> google.bigtable.v2.ReadRowsResponse {
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c1"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 1000
Step #2 - "build.sh": value: "V1000"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c2"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 2000
Step #2 - "build.sh": value: "V2000"
Step #2 - "build.sh": }
Step #2 - "build.sh": chunks {
Step #2 - "build.sh": row_key: "row-key-1"
Step #2 - "build.sh": family_name {
Step #2 - "build.sh": value: "family4"
Step #2 - "build.sh": }
Step #2 - "build.sh": qualifier {
Step #2 - "build.sh": value: "c3"
Step #2 - "build.sh": }
Step #2 - "build.sh": timestamp_micros: 3000
Step #2 - "build.sh": value: "V3000"
Step #2 - "build.sh": commit_row: true
Step #2 - "build.sh": }
Step #2 - "build.sh": } (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:64)
Step #2 - "build.sh": 2026-04-10T22:35:41.198867167Z [DEBUG] <4125135744> Read(31)() << (void) (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:59)
Step #2 - "build.sh": 2026-04-10T22:35:41.198941188Z [DEBUG] <4125135744> Read(31)() >> OK (/workspace/google/cloud/internal/streaming_read_rpc_logging.h:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.199905128Z [INFO] <4125135744> Enabled logging for gRPC calls (/workspace/google/cloud/bigtable/admin/internal/bigtable_table_admin_stub_factory.cc:62)
Step #2 - "build.sh": 2026-04-10T22:35:41.200053279Z [DEBUG] <4125135744> DropRowRange() << google.bigtable.admin.v2.DropRowRangeRequest {
Step #2 - "build.sh": name: "projects/cloud-cpp-testing-resources/instances/test-instance/tables/tbl-2026-04-10-5bz1vd0tp4vihc48g1ddrh4azv3bqo0c4r"
Step #2 - "build.sh": delete_all_data_from_table: true
Step #2 - "build.sh": } (/workspace/google/cloud/internal/log_wrapper.cc:24)
Step #2 - "build.sh": 2026-04-10T22:35:41.202279430Z [DEBUG] <4125135744> DropRowRange() >> status=UNAVAILABLE: failed to connect to all addresses; last error: UNKNOWN: Invalid argument (/workspace/google/cloud/internal/log_wrapper.cc:29)
Step #2 - "build.sh": [ FAILED ] DataIntegrationTest.TableBulkApply (19 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadRowNotExistTest
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadRowNotExistTest (21 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.QueryWithNulls
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:1047: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.QueryWithNulls (10 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadRowsPartialRows
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadRowsPartialRows (19 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.ClientQueryColumnFamilyWithHistory
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:669: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamilyWithHistory (11 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableApplyWithLogging
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableApplyWithLogging (36 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadRowsAllRows
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadRowsAllRows (26 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadMultipleCellsBigValue
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadMultipleCellsBigValue (1482 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableApply
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableApply (217 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.MultiColumnQuery
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:980: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.MultiColumnQuery (10 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.SingleColumnQueryWithHistory
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:876: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQueryWithHistory (10 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadModifyWriteRowMultipleTest
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadModifyWriteRowMultipleTest (14 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableCellValueInt64Test
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableCellValueInt64Test (14 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadModifyWriteAppendValueTest
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableReadModifyWriteAppendValueTest (14 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.ClientQueryColumnFamily
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:603: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamily (11 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableCheckAndMutateRowPass
Step #2 - "build.sh": [ OK ] DataIntegrationTest.TableCheckAndMutateRowPass (17 ms)
Step #2 - "build.sh": [ RUN ] DataIntegrationTest.TableReadRowsReverseScan
Step #2 - "build.sh": /workspace/google/cloud/bigtable/tests/data_integration_test.cc:278: Skipped
Step #2 - "build.sh":
Step #2 - "build.sh":
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.TableReadRowsReverseScan (10 ms)
Step #2 - "build.sh": [----------] 25 tests from DataIntegrationTest (2700 ms total)
Step #2 - "build.sh":
Step #2 - "build.sh": [----------] Global test environment tear-down
Step #2 - "build.sh": [==========] 25 tests from 1 test suite ran. (2720 ms total)
Step #2 - "build.sh": [ PASSED ] 17 tests.
Step #2 - "build.sh": [ SKIPPED ] 7 tests, listed below:
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQuery
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.QueryWithNulls
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamilyWithHistory
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.MultiColumnQuery
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.SingleColumnQueryWithHistory
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.ClientQueryColumnFamily
Step #2 - "build.sh": [ SKIPPED ] DataIntegrationTest.TableReadRowsReverseScan
Step #2 - "build.sh": [ FAILED ] 1 test, listed below:
Step #2 - "build.sh": [ FAILED ] DataIntegrationTest.TableBulkApply
Step #2 - "build.sh":
Step #2 - "build.sh": 1 FAILED TEST
Step #2 - "build.sh":
Step #2 - "build.sh": Start 685: bigtable_data_integration_test
Step #2 - "build.sh": 10/13 Test #686: bigtable_filters_integration_test ................ Passed 3.50 sec
Step #2 - "build.sh": 11/13 Test #685: bigtable_data_integration_test ................... Passed 2.54 sec
Step #2 - "build.sh": 12/13 Test #683: bigtable_scan_throughput_benchmark ............... Passed 7.37 sec
Step #2 - "build.sh": 13/13 Test #682: bigtable_scan_async_throughput_benchmark ......... Passed 10.55 sec
Step #2 - "build.sh":
Step #2 - "build.sh": 100% tests passed, 0 tests failed out of 13
Step #2 - "build.sh":
Step #2 - "build.sh": Label Time Summary:
Step #2 - "build.sh": integration-test = 31.07 sec*proc (13 tests)
Step #2 - "build.sh": integration-test-emulator = 31.07 sec*proc (13 tests)
Step #2 - "build.sh":
Step #2 - "build.sh": Total Test time (real) = 10.60 sec
Step #2 - "build.sh": 2026-04-10T22:35:50Z (+646s): Killing Bigtable Emulators...
Step #2 - "build.sh": EMULATOR_PID=250564 .+.
Step #2 - "build.sh": INSTANCE_ADMIN_EMULATOR_PID=250565 .+.
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": cat: /h/emulator.log: No such file or directory
Step #2 - "build.sh": 1 Cloud Bigtable emulator running on 127.0.0.1:8480
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": cat: /h/emulator.log: No such file or directory
Step #2 - "build.sh": 1 Cloud Bigtable emulator running on 127.0.0.1:8480
Step #2 - "build.sh": ================ emulator.log ================
Step #2 - "build.sh": ================ instance-admin-emulator.log ================
Step #2 - "build.sh": cat: /h/instance-admin-emulator.log: No such file or directory
Step #2 - "build.sh": 1 env -C /workspace /workspace/cmake-out/google/cloud/bigtable/tests/instance_admin_emulator 8490
Step #2 - "build.sh": 2 Cloud Bigtable emulator running on localhost:8490
Step #2 - "build.sh": 3 ListInstances() request=parent: "projects/emulated"
Step #2 - "build.sh": 4
Step #2 - "build.sh": 5 CreateInstance() request=parent: "projects/cloud-cpp-testing-resources"
Step #2 - "build.sh": 6 instance_id: "it-2026-04-10-etync9s2tvad"
Step #2 - "build.sh": 7 instance {
Step #2 - "build.sh": 8 display_name: "IT it-2026-04-10-etync9s2tvad"
Step #2 - "build.sh": 9 type: PRODUCTION
Step #2 - "build.sh": 10 }
Step #2 - "build.sh": ================ instance-admin-emulator.log ================
Step #2 - "build.sh": cat: /h/instance-admin-emulator.log: No such file or directory
Step #2 - "build.sh": 230 CreateInstance() request=parent: "projects/cloud-cpp-testing-resources"
Step #2 - "build.sh": 231 instance_id: "it-2026-04-10-kvt5v60c2yl6"
Step #2 - "build.sh": 232 instance {
Step #2 - "build.sh": 233 display_name: "IT it-2026-04-10-kvt5v60c2yl6"
Step #2 - "build.sh": 234 type: PRODUCTION
Step #2 - "build.sh": 235 }
Step #2 - "build.sh": 236 clusters {
Step #2 - "build.sh": 237 key: "it-2026-04-10-kvt5v60c2yl6-c1"
Step #2 - "build.sh": 238 value {
Step #2 - "build.sh": 239 ================ instance-admin-emulator.log ================
Step #2 - "build.sh":
Step #2 - "build.sh": 2026-04-10T22:35:50Z (+646s)
Step #2 - "build.sh": ------------------------------------------------------
Step #2 - "build.sh": | Running REST integration tests (with emulator) |
Step #2 - "build.sh": ------------------------------------------------------
Step #2 - "build.sh": 2026-04-10T22:35:51Z (+647s): Launching Cloud Storage emulator on port 0
Step #2 - "build.sh": Successfully connected to emulator [251362]
Step #2 - "build.sh": 2026-04-10T22:35:52Z (+648s): Successfully connected to gRPC server at port 43019
Step #2 - "build.sh": Test project /workspace/cmake-out
Step #2 - "build.sh": Start 173: common_internal_curl_rest_client_integration_test
Step #2 - "build.sh": 1/1 Test #173: common_internal_curl_rest_client_integration_test ... Passed 0.78 sec
Step #2 - "build.sh":
Step #2 - "build.sh": 100% tests passed, 0 tests failed out of 1
Step #2 - "build.sh":
Step #2 - "build.sh": Label Time Summary:
Step #2 - "build.sh": integration-test = 0.78 sec*proc (1 test)
Step #2 - "build.sh": integration-test-emulator = 0.78 sec*proc (1 test)
Step #2 - "build.sh":
Step #2 - "build.sh": Total Test time (real) = 0.83 sec
Step #2 - "build.sh": Killing emulator server [251362] ... done.
Step #2 - "build.sh": 2026-04-10T22:35:54Z (+650s): ===> sccache stats
Step #2 - "build.sh": Compile requests 6071
Step #2 - "build.sh": Compile requests executed 6071
Step #2 - "build.sh": Cache hits 5730
Step #2 - "build.sh": Cache hits (C/C++) 5730
Step #2 - "build.sh": Cache misses 341
Step #2 - "build.sh": Cache misses (C/C++) 341
Step #2 - "build.sh": Cache hits rate 94.38 %
Step #2 - "build.sh": Cache hits rate (C/C++) 94.38 %
Step #2 - "build.sh": Cache timeouts 0
Step #2 - "build.sh": Cache read errors 0
Step #2 - "build.sh": Forced recaches 0
Step #2 - "build.sh": Cache write errors 5
Step #2 - "build.sh": Cache errors 0
Step #2 - "build.sh": Compilations 341
Step #2 - "build.sh": Compilation failures 0
Step #2 - "build.sh": Non-cacheable compilations 0
Step #2 - "build.sh": Non-cacheable calls 0
Step #2 - "build.sh": Non-compilation calls 0
Step #2 - "build.sh": Unsupported compiler calls 0
Step #2 - "build.sh": Average cache write 0.150 s
Step #2 - "build.sh": Average compiler 16.223 s
Step #2 - "build.sh": Average cache read hit 0.083 s
Step #2 - "build.sh": Failed distributed compilations 0
Step #2 - "build.sh": Cache location gcs, name: cloud-cpp-testing-resources_cloudbuild, prefix: /sccache/fedora-m32-m32/
Step #2 - "build.sh": Version (client) 0.10.0
Step #2 - "build.sh": ==> 🕑 m32 completed in 649.290 seconds
Finished Step #2 - "build.sh"
Starting Step #3 - "remove-image"
Step #3 - "remove-image": Already have image (with digest): gcr.io/google.com/cloudsdktool/cloud-sdk
Step #3 - "remove-image": Digests:
Step #3 - "remove-image": - us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32@sha256:475abb0ce415205b348b3bde53447195201252864ce257e135528836a3ff0f36
Step #3 - "remove-image":
Step #3 - "remove-image": Tags:
Step #3 - "remove-image": - us-east1-docker.pkg.dev/cloud-cpp-testing-resources/gcb/cloudbuild/fedora-m32:4e9a30f9-4ad7-471b-a063-9361dfc6197c
Step #3 - "remove-image": Delete request issued.
Step #3 - "remove-image": Waiting for operation [projects/cloud-cpp-testing-resources/locations/us-east1/operations/f26ed696-629b-47c2-a5a2-c1a4445bdf14] to complete...
Step #3 - "remove-image": .....done.
Finished Step #3 - "remove-image"
PUSH
DONE