llama-cpp-rocm-nightly (0.5.6-gfx1151-b8753-gemma4fix2-slim)
Published 2026-05-02 18:39:26 +00:00 by dan
Installation
docker pull gitea.coffee-anon.com/dan/llama-cpp-rocm-nightly:0.5.6-gfx1151-b8753-gemma4fix2-slim

Digest: sha256:87c1c0eff00f5e9ed43c3257964ad171537e70e329b47bbb2f4d40afb47e4c52
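The image's entrypoint is llama-server listening on port 8080, so a run command only needs to expose the GPU and a model file. A minimal sketch, assuming a host with an AMD GPU and the usual ROCm device passthrough; the container name, volume mount, model path, and `-m`/`-ngl` values are placeholders to adapt:

```shell
# Pass /dev/kfd and /dev/dri through so the ROCm runtime inside the
# container can reach the GPU; mount a host directory holding GGUF models.
docker run -d --name llama-server \
  --device=/dev/kfd --device=/dev/dri \
  --security-opt seccomp=unconfined \
  -v /path/to/models:/models \
  -p 8080:8080 \
  gitea.coffee-anon.com/dan/llama-cpp-rocm-nightly:0.5.6-gfx1151-b8753-gemma4fix2-slim \
  --host 0.0.0.0 --port 8080 -m /models/model.gguf -ngl 99
```

Everything after the image name is appended to the `llama-server` entrypoint, replacing the default `--host 0.0.0.0 --port 8080` CMD baked into the image.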
Image Layers
| ARG RELEASE |
| ARG LAUNCHPAD_BUILD_ARCH |
| LABEL org.opencontainers.image.version=24.04 |
| ADD file:8ce1caf246e7c778bca84c516d02fd4e83766bb2c530a0fffa8a351b560a2728 in / |
| CMD ["/bin/bash"] |
| ARG DEBIAN_FRONTEND=noninteractive |
| RUN |1 DEBIAN_FRONTEND=noninteractive /bin/sh -c apt-get update && apt-get install -y --no-install-recommends ca-certificates curl libcurl4 libgomp1 libnuma1 libelf1 libdrm2 libdrm-amdgpu1 libatomic1 pciutils libtinfo6 libncurses6 zlib1g libzstd1 && rm -rf /var/lib/apt/lists/* # buildkit |
| COPY /opt/rocm-runtime/ /opt/rocm/ # buildkit |
| COPY /usr/local/ /usr/local/ # buildkit |
| RUN |1 DEBIAN_FRONTEND=noninteractive /bin/sh -c mkdir -p /opt/rocm/amdgcn && ln -sf /opt/rocm/lib/llvm/amdgcn/bitcode /opt/rocm/amdgcn/bitcode # buildkit |
| RUN |1 DEBIAN_FRONTEND=noninteractive /bin/sh -c printf '%s\n' /usr/local/lib /opt/rocm/lib /opt/rocm/lib/rocm_sysdeps/lib > /etc/ld.so.conf.d/llama-rocm.conf && ldconfig # buildkit |
| RUN |1 DEBIAN_FRONTEND=noninteractive /bin/sh -c echo "=== ldd llama-server (must show no 'not found' lines) ===" && ldd /usr/local/bin/llama-server | tee /tmp/ldd-server.txt && ! grep -q "not found" /tmp/ldd-server.txt && echo "=== llama-server --version (no GPU required for this) ===" && env -u LD_LIBRARY_PATH /usr/local/bin/llama-server --version && echo "=== final image ROCm footprint ===" && du -sh /opt/rocm # buildkit |
| ENV ROCM_PATH=/opt/rocm HIP_PATH=/opt/rocm HIP_PLATFORM=amd HIP_DEVICE_LIB_PATH=/opt/rocm/amdgcn/bitcode PATH=/opt/rocm/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HSA_OVERRIDE_GFX_VERSION=11.5.1 HIP_VISIBLE_DEVICES=0 GPU_DEVICE_ORDINAL=0 ROCBLAS_USE_HIPBLASLT=1 HSA_ENABLE_SDMA=0 OCL_SET_SVM_SIZE=262144 |
| WORKDIR /usr/local/bin |
| HEALTHCHECK --interval=30s --timeout=10s --start-period=1m --retries=3 CMD curl -f http://localhost:8080/health || exit 1 |
| EXPOSE 8080/tcp |
| ENTRYPOINT ["/usr/local/bin/llama-server"] |
| CMD ["--host" "0.0.0.0" "--port" "8080"] |
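Once the container is up, llama-server answers the same `/health` endpoint the HEALTHCHECK polls, and serves an OpenAI-compatible HTTP API. A quick smoke test against a container published on localhost:8080 (the prompt and token limit are arbitrary; this needs a running container with a loaded model):

```shell
# Liveness: returns {"status":"ok"} once the model has finished loading.
curl -s http://localhost:8080/health

# OpenAI-style chat completion against the loaded model.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello"}],"max_tokens":128}'
```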
Labels
| Key | Value |
|---|---|
| org.opencontainers.image.version | 24.04 |
Details
| Published | 2026-05-02 18:39:26 +00:00 |
| Type | Container (OCI / Docker) |
| Platform | linux/amd64 |
| Size | 195 MiB |
Versions (2)
| Version | Published |
|---|---|
| 0.5.6-gfx1151-b8753-gemma4fix2-slim | 2026-05-02 |
| 0.5.6-gfx1151-b8753-slim | 2026-05-02 |