How to build a runtime using buildpack


Introduction

The goal of this project is to test and validate different approaches to building a runtime image using:

- the Java Buildpacks client library
- the Quarkus Buildpacks integration
- the Pack client
- Tekton
- Shipwright

Project Testing Matrix

The following table details the versions of the tools and images that this project currently verifies in its GitHub e2e workflows and examples:

| Tool/Image | Version | Tested | Note |
|---|---|---|---|
| Lifecycle | 0.20.9 | Yes | - |
| Platform | 0.13.0 | Yes | - |
| Pack client | v0.37.0 | Yes | - |
| Java Buildpacks library | 0.0.14 | Yes | Supports platform 0.4 to 0.13! |
| Quarkus Buildpacks | 3.22 | Yes | - |
| Paketo Build base ubi8 image | 0.0.114 | Yes | Base image to be used to build |
| Paketo Run base ubi8 image | 0.0.114 | Yes | Image to be used to run an application |
| Paketo UBI8 builder | 0.1.42 | Yes | Includes the Node.js and Quarkus Java buildpacks |
| Paketo Node.js Extension for ubi | 1.3.1 | Yes | - |
| Paketo Java Extension for ubi | 0.2.0 | Yes | - |

0. Prerequisites

ℹ️ For local tests, we suggest creating a Kubernetes kind cluster and a container registry using the idpbuilder tool:

idpbuilder create --dev-password --name buildpack

The registry we will access uses HTTPS with a self-signed certificate, so it must be declared as an insecure registry (when you use Podman) to avoid a TLS error caused by the certificate being signed by an unknown authority:

echo 'printf "\n[[registry]]\nlocation = \"gitea.cnoe.localtest.me:8443\"\ninsecure = true\n" >> /etc/containers/registries.conf' | podman machine ssh --username root --
podman machine stop; podman machine start
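
Optionally, check that the registry is reachable and that the dev credentials work; a minimal sketch, assuming the idpbuilder --dev-password defaults (giteaAdmin/developer):

# Optional sanity check against the Gitea registry
podman login gitea.cnoe.localtest.me:8443 -u giteaAdmin -p developer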

1. Java Buildpacks Client

Create a Java Maven project, using the following sample as a reference: .github/samples/build-me

Review the DSL of the BuildMe class to configure the client in order to build a Java project:

...
        int exitCode = BuildConfig.builder()
            // Builder image used to perform the build
            .withBuilderImage(new ImageReference("paketobuildpacks/builder-ubi8-base:latest"))
            // Target reference of the application image to produce
            .withOutputImage(new ImageReference(APPLICATION_IMAGE_REF))
            .withNewPlatformConfig()
              .withEnvironment(envMap)       // BP_* build-time environment variables
            .endPlatformConfig()
            .withNewDockerConfig()
              .withAuthConfigs(authInfo)     // registry credentials
              .withUseDaemon(false)          // do not use the local container daemon; publish the image to the registry
            .endDockerConfig()
            .withNewLogConfig()
              .withLogger(new SystemLogger())
              .withLogLevel("debug")
            .and()
            // Directory containing the Java project to build
            .addNewFileContentApplication(new File(PROJECT_PATH))
            .build()
            .getExitCode();

and set the mandatory environment variables.

export PROJECT_PATH=<JAVA_PROJECT_PATH>
# <IMAGE_REF> can be <IMAGE_NAME> without registry or a full registry reference with host, port(optional), path & tag
export IMAGE_REF=<IMAGE_REF>
ℹ️ Review the documentation of the java-buildpack-client project for more information about how to customize the Buildpacks builder!
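
With the variables exported, the sample can be compiled and run like any Maven application; a minimal sketch, assuming the exec-maven-plugin is available and that the sample's main class is dev.snowdrop.BuildMe (check .github/samples/build-me for the actual coordinates):

export PROJECT_PATH=$HOME/code/quarkus-hello
export IMAGE_REF=gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello:1.0

cd .github/samples/build-me
# Hypothetical main class name; adjust it to the class defined in the sample
mvn compile exec:java -Dexec.mainClass=dev.snowdrop.BuildMe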

2. Quarkus Buildpacks

First, create a Quarkus Hello example using the following maven command executed in a terminal.

mvn io.quarkus.platform:quarkus-maven-plugin:3.21.3:create \
  -DprojectGroupId=dev.snowdrop \
  -DprojectArtifactId=quarkus-hello \
  -DprojectVersion=1.0 \
  -DplatformVersion=3.22.1 \
  -Dextensions='resteasy,kubernetes,buildpack'

Test the project locally:

cd quarkus-hello
mvn compile quarkus:dev

In a separate terminal, curl the HTTP endpoint

curl http://localhost:8080/hello

To build the container image, run the build with the ubi8 builder image paketobuildpacks/builder-ubi8-base:0.1.42 and pass BP_* env variable(s) to properly configure the Quarkus Buildpacks build process:

mvn clean package \
 -Dquarkus.container-image.image=gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello:1.0 \
 -Dquarkus.container-image.build=true \
 -Dquarkus.container-image.push=true \
 -Dquarkus.buildpack.jvm-builder-image=paketobuildpacks/builder-ubi8-base:0.1.42 \
 -Dquarkus.buildpack.builder-env.BP_JVM_VERSION=21 \
 -Dquarkus.buildpack.registry-user."gitea.cnoe.localtest.me:8443"=giteaAdmin \
 -Dquarkus.buildpack.registry-password."gitea.cnoe.localtest.me:8443"=developer
ℹ️ To get debug messages and configure the slf4j logger, add the following Quarkus properties:

 ...
 -Dquarkus.buildpack.log-level=debug \
 -Dorg.jboss.logging.provider=slf4j \
 -Dorg.slf4j.simpleLogger.log.io.quarkus.container.image.buildpack.deployment=DEBUG \
 -Dorg.slf4j.simpleLogger.log.dev.snowdrop.buildpack.docker=DEBUG

Next, start the container and curl the endpoint:

podman run -i --rm -p 8080:8080 gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello:1.0
curl http://localhost:8080/hello

3. Pack client

To validate this scenario on top of the existing quarkus-hello project, we will use the pack client.
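
The commands below reference REGISTRY_HOST; if it is not already set, export it first. The value below matches the Gitea registry created during the prerequisites:

export REGISTRY_HOST=gitea.cnoe.localtest.me:8443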

podman rmi $REGISTRY_HOST/giteaadmin/quarkus-hello:1.0
pack build $REGISTRY_HOST/giteaadmin/quarkus-hello:1.0 \
     --builder paketobuildpacks/builder-ubi8-base:latest \
     --volume $HOME/.m2:/home/cnb/.m2:rw \
     -e BP_JVM_VERSION=21

Next, start the container and curl the endpoint:

podman run -i --rm -p 8080:8080 gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello:1.0
curl http://localhost:8080/hello

4. Tekton

To use Tekton, you need a k8s cluster (>= 1.28) and a container registry. To install Tekton and its dashboard, we will rely on the IDP cluster created with the idpbuilder tool:

idpbuilder create --dev-password --name buildpack \
  -p https://github.com/ch007m/my-idp-packages//tekton

When the platform is ready, you should be able to access the Tekton UI at the following address: https://tekton-ui.cnoe.localtest.me:8443/. You can verify that Tekton has been properly installed using the Argo CD console: https://argocd.cnoe.localtest.me:8443/

Now deploy the different resources needed to build an application:

kubectl delete -f https://raw.githubusercontent.com/tektoncd/catalog/refs/heads/main/task/buildpacks-phases/0.3/buildpacks-phases.yaml

kubectl apply -f https://raw.githubusercontent.com/tektoncd/catalog/refs/heads/main/task/buildpacks-phases/0.3/buildpacks-phases.yaml

Create a dockercfg secret using the Gitea registry credentials and link it to the ServiceAccount that Tekton will use:

kubectl apply -f k8s/tekton/secret-dockercfg.yml
kubectl apply -f k8s/tekton/sa-with-reg-creds.yml
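
If you prefer to create the secret imperatively instead of applying the manifest, a minimal sketch using the idpbuilder dev credentials (giteaAdmin/developer); the secret name is assumed to match what sa-with-reg-creds.yml references:

kubectl create secret docker-registry dockercfg \
  --docker-server=gitea.cnoe.localtest.me:8443 \
  --docker-username=giteaAdmin \
  --docker-password=developer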

Create a PVC

kubectl apply -f k8s/tekton/ws-pvc.yml

Next, set the following variables:

IMAGE_NAME=<CONTAINER_REGISTRY>/<ORG>/quarkus-hello
CNB_BUILDER_IMAGE=<PAKETO_BUILDER_IMAGE_OR_YOUR_OWN_BUILDER_IMAGE>

It is time to create a PipelineRun to build the Quarkus application:

IMAGE_NAME=my-gitea-http.gitea.svc.cluster.local:3000/giteaadmin/quarkus-hello

CNB_BUILDER_IMAGE=paketobuildpacks/builder-ubi8-base:0.1.1
CNB_INSECURE_REGISTRIES=my-gitea-http.gitea.svc.cluster.local:3000
CNB_LOG_LEVEL=debug

echo "apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: buildpacks
spec:
  workspaces:
    - name: source-ws
  tasks:
    - name: fetch-repository
      taskRef:
        resolver: http
        params:
        - name: url
          value: https://raw.githubusercontent.com/tektoncd/catalog/refs/heads/main/task/git-clone/0.10/git-clone.yaml
      workspaces:
        - name: output
          workspace: source-ws
      params:
        - name: url
          value: https://github.com/quarkusio/quarkus-quickstarts.git
        - name: deleteExisting
          value: true

    - name: buildpacks-phases
      taskRef:
        name: buildpacks-phases
      runAfter:
        - fetch-repository
      workspaces:
        - name: source
          workspace: source-ws
      params:
        - name: APP_IMAGE
          value: $IMAGE_NAME
        - name: SOURCE_SUBPATH
          value: getting-started
        - name: CNB_BUILDER_IMAGE
          value: $CNB_BUILDER_IMAGE
        - name: CNB_INSECURE_REGISTRIES
          value: $CNB_INSECURE_REGISTRIES
        - name: CNB_LOG_LEVEL
          value: $CNB_LOG_LEVEL
        - name: CNB_ENV_VARS
          value:
            - BP_JVM_VERSION=21
---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: buildpacks
spec:
  taskRunTemplate:
    serviceAccountName: sa-with-creds
  pipelineRef:
    name: buildpacks
  workspaces:
  - name: source-ws
    subPath: source
    persistentVolumeClaim:
      claimName: ws-pvc" | kubectl apply -f -

Follow the execution of the pipeline using the dashboard: https://tekton-ui.cnoe.localtest.me:8443/#/namespaces/default/pipelineruns or using the client: tkn pipelinerun logs -f
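
You can also poll the run status directly with kubectl; a minimal sketch, assuming the PipelineRun was created in the default namespace:

kubectl get pipelinerun buildpacks
kubectl wait --for=condition=Succeeded pipelinerun/buildpacks --timeout=15m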

When the PipelineRun finishes and no error has been reported, launch the container:

podman run -i --rm -p 8080:8080 gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello

5. Shipwright

See the project documentation for more information: https://github.com/shipwright-io/build

To use Shipwright, you need a k8s cluster, a container registry, and Tekton (>= v0.62) installed.

Next, deploy the 0.15.x release of Shipwright:

kubectl create -f https://github.com/shipwright-io/build/releases/download/v0.15.6/release.yaml

Apply the following hack to create a self-signed certificate on the cluster, otherwise the Shipwright webhook will fail to start:

curl --silent --location https://raw.githubusercontent.com/shipwright-io/build/v0.15.6/hack/setup-webhook-cert.sh | bash

Next, install the Buildpacks BuildStrategy using the following command:

kubectl delete -f k8s/shipwright/clusterbuildstrategy.yml
kubectl apply -f k8s/shipwright/clusterbuildstrategy.yml

Create a Build CR using the Quarkus Getting Started repository as the source:

kubectl delete -f k8s/shipwright/build.yml
kubectl apply -f k8s/shipwright/build.yml

To check the Build resource you just created, execute the following command:

kubectl get build
NAME                      REGISTERED   REASON      BUILDSTRATEGYKIND      BUILDSTRATEGYNAME   CREATIONTIME
buildpack-quarkus-build   True         Succeeded   ClusterBuildStrategy   buildpacks          43s

Create a ConfigMap containing the self-signed certificate of the registry:

kubectl get secret -n default idpbuilder-cert -ojson | jq -r '.data."ca.crt"' | base64 -d > ca.cert

kubectl delete configmap certificate-registry
kubectl create configmap certificate-registry \
  --from-file=ca.cert
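
Optionally, you can sanity-check the extracted certificate:

# Print the subject and expiry date of the extracted CA certificate
openssl x509 -in ca.cert -noout -subject -enddate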

Create a secret containing the registry credentials, as it is used by the Shipwright ServiceAccount:

kubectl delete secret dockercfg
kubectl apply -f k8s/shipwright/secret-dockercfg.yml

To trigger a BuildRun, do the following:

kubectl delete buildrun -lbuild.shipwright.io/name=buildpack-quarkus-build
kubectl delete -f k8s/shipwright/pvc.yml
kubectl delete -f k8s/shipwright/sa.yml

kubectl create -f k8s/shipwright/sa.yml
kubectl create -f k8s/shipwright/pvc.yml
kubectl create -f k8s/shipwright/buildrun.yml
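
To follow the build while it runs, you can tail the logs of the underlying Tekton TaskRun; a minimal sketch using the tkn client (this assumes the most recent TaskRun is the one created by this BuildRun):

tkn taskrun logs --last -f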

Wait until your BuildRun is completed, and then you can view it as follows:

kubectl get buildrun -lbuild.shipwright.io/name=buildpack-quarkus-build
NAME                               SUCCEEDED   REASON      STARTTIME   COMPLETIONTIME
buildpack-quarkus-buildrun-fbs84   True        Succeeded   103s        25s

When the task finishes and no error is reported, launch the container:

podman run -i --rm -p 8080:8080 gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello

Enjoy!

Troubleshooting

The instructions described hereafter should help resolve issues when the lifecycle accesses the Gitea registry from inside a pod.

export BUILDER=paketobuildpacks/builder-ubi8-base:0.1.1

echo "Test 1 using patched lifecycle, insecure registry defined & CNB_REGISTRY_AUTH - OK"
podman run -it \
  -e CNB_PLATFORM_API=0.13 \
  -e CNB_REGISTRY_AUTH='{"https://gitea.cnoe.localtest.me:8443": "Basic Z2l0ZWFBZG1pbjpkZXZlbG9wZXI="}' \
  --network=host \
  $BUILDER \
  /cnb/lifecycle/analyzer \
  -log-level=debug \
  -layers=/layers \
  -run-image=paketobuildpacks/run-ubi8-base:0.0.114 \
  -uid=1002 \
  -gid=1000 \
  -insecure-registry=gitea.cnoe.localtest.me:8443 \
  gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello
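
The second test mounts a standard Docker config file instead of passing CNB_REGISTRY_AUTH; a minimal sketch creating the auth.json used below, with the same giteaAdmin:developer credentials base64-encoded:

cat > auth.json <<'EOF'
{
  "auths": {
    "gitea.cnoe.localtest.me:8443": {
      "auth": "Z2l0ZWFBZG1pbjpkZXZlbG9wZXI="
    }
  }
}
EOF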

podman run -it \
  -e CNB_PLATFORM_API=0.13 \
  -v $(pwd)/auth.json:/home/cnb/.docker/config.json:ro \
  --network=host \
  $BUILDER \
  /cnb/lifecycle/analyzer \
  -log-level=debug \
  -layers=/layers \
  -run-image=paketobuildpacks/run-ubi8-base:0.0.114 \
  -uid=1002 \
  -gid=1000 \
  -insecure-registry=gitea.cnoe.localtest.me:8443 \
  gitea.cnoe.localtest.me:8443/giteaadmin/quarkus-hello

Using a pod

kubectl delete secret/dockercfg
kubectl create secret generic dockercfg \
  --from-file=.dockerconfigjson=$(pwd)/.tmp/dockercfg.json \
  --type=kubernetes.io/dockerconfigjson

kubectl delete -f $(pwd)/.tmp/task-pod-1.yaml; kubectl apply -f $(pwd)/.tmp/task-pod-1.yaml
