CI: Failed pulls from docker.io causing jobs to fail #31797

issue ismaelsadeeq opened this issue on February 4, 2025
  1. ismaelsadeeq commented at 9:07 pm on February 4, 2025: member
    [15:50:52.950] STEP 1/7: FROM docker.io/debian:bookworm
    [15:50:52.956] Trying to pull docker.io/library/debian:bookworm...
    [15:51:23.200] time="2025-02-04T15:51:23-05:00" level=warning msg="Failed, retrying in 2s ... (1/3). Error: initializing source docker://debian:bookworm: reading manifest bookworm in docker.io/library/debian: toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"
    [15:51:55.409] time="2025-02-04T15:51:55-05:00" level=warning msg="Failed, retrying in 2s ... (2/3). Error: initializing source docker://debian:bookworm: reading manifest bookworm in docker.io/library/debian: toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"
    [15:52:57.833] time="2025-02-04T15:52:57-05:00" level=warning msg="Failed, retrying in 2s ... (3/3). Error: copying system image from manifest list: determining manifest MIME type for docker://debian:bookworm: reading manifest sha256:f6008b2d1c096556ec301a0c76fe0b40657594ffcfa43aaf63ca9d40767f9a63 in docker.io/library/debian: toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit"
    [15:53:30.043] Error: creating build container: initializing source docker://debian:bookworm: reading manifest bookworm in docker.io/library/debian: toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit
    [15:53:30.051]
    [15:53:30.051] Exit status: 125
    

    https://cirrus-ci.com/task/6014627928080384 https://github.com/bitcoin/bitcoin/pull/31384/checks?check_run_id=36674073808
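
    The error message itself points at authenticating as the way to raise the limit. A minimal sketch of what that could look like on a CI host (DOCKERHUB_USER and DOCKERHUB_TOKEN are hypothetical credential variables; authenticated pulls count against a per-account limit instead of the anonymous per-IP one):

      # Hypothetical: log the CI host's podman in to Docker Hub so that
      # subsequent pulls use the higher, authenticated rate limit.
      echo "$DOCKERHUB_TOKEN" | podman login docker.io \
        --username "$DOCKERHUB_USER" --password-stdin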

  2. maflcko commented at 9:28 pm on February 4, 2025: member

    This should only happen when a new CI image build is triggered without the images being cached from previous builds, so it should be rare. When it does happen, it should clear itself after an hour or two.

    An alternative would be to use (or run) a mirror that has a higher rate limit, but I am not sure if this is worth it.

    I’ve re-run the CI tasks for now.

  3. maflcko added the label CI failed on Feb 4, 2025
  4. Sjors commented at 1:12 pm on February 5, 2025: member
    @maflcko can you re-run #30975 as well?
  5. maflcko commented at 1:15 pm on February 5, 2025: member
    I rebooted the CI machines yesterday and today, which caused the issues here. I’ll keep an eye on the tasks and re-run any that fail due to dockerhub limits for the next few hours. This should fix it once and for all (for now 😅 ).
  6. l0rinc commented at 6:38 pm on February 5, 2025: contributor
  7. maflcko commented at 8:12 pm on February 5, 2025: member
    Reminds me that the CI script will silently continue after an error, see https://cirrus-ci.com/task/4909301485010944?logs=ci#L194. It would be good to stop using bash for scripts, or force pipefail (etc) to be set in all of them.
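
    As an illustration of forcing that (a sketch only, not what the scripts currently set), bash's strict-mode flags make a script abort on the first failing command, unset variable, or broken pipeline:

      # Stop on any command failure, treat unset variables as errors,
      # and let a failure anywhere in a pipeline fail the whole pipeline.
      set -o errexit -o nounset -o pipefail
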
  8. maflcko commented at 1:06 pm on February 10, 2025: member
    If someone sets up a new CI machine, one could consider the local build cache from 6835e9686c41b31d14ef25dc8df9409a99f079a8 as a fix, but I haven’t tried whether that works with podman.
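
    (A minimal sketch of leaning on the local image store, not necessarily what the referenced commit does: only reach out to docker.io when the base image is missing locally.)

      # Hypothetical guard: skip the network pull if podman already has
      # the base image in its local storage.
      if ! podman image exists docker.io/library/debian:bookworm; then
        podman pull docker.io/library/debian:bookworm
      fi
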
  9. maflcko commented at 11:56 am on February 13, 2025: member

    This still happens: https://cirrus-ci.com/task/4901485584056320

    Though it seems odd, because that run should not have pulled more than two images. Getting rate-limited on that suggests the new rate limits are stricter than what Docker documents?

    If that is true, maybe the easiest fix would be to run (or pick) a caching pass-through mirror instead?
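
    For podman, such a pass-through mirror could be wired up with a registries.conf drop-in; the sketch below uses registry-mirror.example.org as a placeholder host, not an existing service. Pulls for docker.io images are then attempted against the mirror first, falling back to docker.io itself.

      # Hypothetical drop-in config for a caching pull-through mirror.
      {
        echo '[[registry]]'
        echo 'prefix = "docker.io"'
        echo 'location = "docker.io"'
        echo ''
        echo '[[registry.mirror]]'
        echo 'location = "registry-mirror.example.org"'
      } | sudo tee /etc/containers/registries.conf.d/mirror.conf >/dev/null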

  10. l0rinc commented at 1:48 pm on February 19, 2025: contributor
  11. maflcko commented at 2:04 pm on February 19, 2025: member

    24.04 is already cached, so it is unclear why noble (the same Ubuntu image under a different tag) gets rate-limited. I guess that’d be a reason to use the same tag name consistently.

    Absent that, a real fix would still be good.

  12. fanquake closed this on Feb 20, 2025

  13. fanquake referenced this in commit 46a9c73083 on Feb 20, 2025
