13 changed files with 173 additions and 0 deletions
@ -0,0 +1,60 @@
---
title: "Red Hat Open Demo: Build multi-architecture CI/CD pipelines to run your apps in the cloud and at the Edge!"
date: 2024-09-05T00:00:00+02:00
draft: false
resources:
- '*.jpeg'
- '*.png'
- '*.pdf'
# Featured images for Social Media promotion (sorted by priority)
images:
#- mission-impossible-crazy-train-cover.png
topics:
- Artificial Intelligence
- Edge Computing
---

On September 5, 2024, I presented a webinar named [Red Hat Open Demo: Build multi-architecture CI/CD pipelines to run your apps in the cloud and at the Edge!](https://events.redhat.com/profile/form/index.cfm?PKformID=0x11759490001), based on the technology research I have carried out over the past months.

<!--more-->

In the rapidly advancing field of edge computing, deploying applications across diverse hardware platforms, such as ARM and x86_64, has become essential.
Multi-architecture container images have emerged as a powerful solution: they package support for multiple processor architectures behind a single image reference and simplify deployment across platforms.

{{< attachedFigure src="slide-arm-devices.png" >}}

During this demonstration, I explored how these multi-architecture images work seamlessly across different CPU architectures: the container runtime automatically pulls the image variant matching the client's architecture from the registry.
Using tools like Podman, Buildah, and Tekton, I showcased how easy it is to build these images.
Additionally, I demonstrated the robust support for multi-architecture CI/CD pipelines offered by platforms like Red Hat OpenShift on AWS.
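
As a sketch of what Podman makes possible here (the image name and registry path are examples, not the ones used during the demo), a multi-architecture image can be built and pushed in a few commands:

```shell
# Build the image for both architectures and collect the variants in a
# manifest list (cross-building needs qemu-user-static on the build host).
podman build --platform linux/amd64,linux/arm64 \
  --manifest quay.io/example/hello-multiarch:latest .

# Verify that the manifest list contains one entry per architecture.
podman manifest inspect quay.io/example/hello-multiarch:latest

# Push the manifest list and all referenced images to the registry.
podman manifest push --all quay.io/example/hello-multiarch:latest \
  docker://quay.io/example/hello-multiarch:latest
```

When a client later pulls this tag, the registry serves the variant matching the client's CPU architecture.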

Here’s what was covered during the demo:

- **Overview of Use Cases**: I began with a comprehensive overview of scenarios where multi-architecture support is critical, especially in edge and hybrid cloud environments.

- **Running OpenShift on AWS in Multi-Architecture Mode**: Participants saw OpenShift running in a multi-architecture setup on AWS, showcasing its flexibility in supporting nodes with multiple CPU architectures.

- **Persistent Storage for CI/CD Pipelines (AWS EFS)**: I explored AWS Elastic File System (EFS) as a solution for persistent storage within CI/CD pipelines, used to share build artifacts across nodes.

- **Multi-Architecture Pipelines for Quarkus, NodeJS, and Buildah**: The demo included a hands-on look at creating pipelines that support multiple architectures, focusing on Quarkus, NodeJS, and a raw Containerfile built with Buildah.
This allowed participants to understand how to structure pipelines for a wide range of application types.

- **Tekton Task Binding to the Right Node**: I also demonstrated how Tekton tasks can be directed to specific nodes based on architecture, ensuring efficient execution across mixed environments.
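
To illustrate that last point, here is a minimal sketch of how a Tekton TaskRun can be pinned to nodes of a given architecture through its pod template (the Task name is hypothetical, not the one used in the demo):

```shell
# Pin a TaskRun to ARM64 nodes using the standard kubernetes.io/arch label.
cat <<'EOF' | oc apply -f -
apiVersion: tekton.dev/v1
kind: TaskRun
metadata:
  name: build-image-arm64
spec:
  taskRef:
    name: buildah               # hypothetical Task name
  podTemplate:
    nodeSelector:
      kubernetes.io/arch: arm64 # schedule only on ARM64 nodes
EOF
```

The same mechanism works at the PipelineRun level, so each build task of a multi-architecture pipeline can target the matching node pool.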

{{< attachedFigure src="slide-architecture.png" >}}

Key Highlights from the Demo:

1. **Live Demonstration of an OpenShift Cluster with Mixed Architecture Nodes**: The highlight was the live showcase of an OpenShift cluster featuring ARM and x86_64 nodes, which enabled participants to observe multi-architecture functionality in real time.

2. **Hands-on Session on Tekton Pipelines**: Attendees took part in a practical session focused on creating and managing Tekton pipelines for multi-architecture builds, covering both foundational setup and advanced configurations.

3. **Building and Pushing Multi-Architecture Container Images to quay.io**: I concluded with a real-time demonstration of building and pushing multi-architecture images to the quay.io registry, emphasizing the role of registry support in deploying cross-platform applications efficiently.

4. **Running the Same Container on Two Different CPU Architectures**: I ran a container image built with the multi-architecture pipeline on both my laptop (x86_64) and {{< internalLink path="/blog/homelab-server-2u-short-depth-front-io-ampere-altra-arm64-architecture/index.md" title="my Ampere Altra server" >}} (ARM64).
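
That fourth point can be reproduced on any machine with Podman (the image name below is an example; running the foreign variant requires qemu-user-static):

```shell
# Pull and run the variant matching the local CPU.
podman run --rm quay.io/example/hello-multiarch:latest uname -m

# Explicitly request each variant to check that both exist in the manifest list:
# the first prints "x86_64", the second "aarch64".
podman run --rm --arch amd64 quay.io/example/hello-multiarch:latest uname -m
podman run --rm --arch arm64 quay.io/example/hello-multiarch:latest uname -m
```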

This demo was designed for DevOps professionals, cloud architects, and developers looking to leverage OpenShift and AWS for multi-architecture container image creation.
The session provided them with both a high-level understanding and practical skills to implement and manage these capabilities in their environments.

If you have not been able to attend the live session, I invite you to [watch the replay](https://events.redhat.com/profile/form/index.cfm?PKformID=0x11759490001) and [download the slides](slides.pdf)!

If you are ready to dive deeper, have a look at the article I wrote on this subject: {{< internalLink path="/blog/build-multi-architecture-container-images-with-kubernetes-buildah-tekton-aws/index.md" >}}!
@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1deb6f7184f8f2b066150e3e609a2ddc39c8e5f82d9963d817c0f0977d339358
size 97998
@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fc1ccc1f4c7b84691e38d10667189103f6c45ad87db0555fb85a5e482fbc2f11
size 351091
@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9d6c17f81181fcd59b4fed914b22cb09cc885f180094ed988b0534e06bb79111
size 289639
@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c6ad704a80a950d120559abca92928be9a085522d5b01075bd4e86a14fa3990
size 1735185
@ -0,0 +1,92 @@
---
title: "Red Hat Open Demo: Mission impossible #1 - Stop the crazy Train with AI and Edge before it is too late"
date: 2024-11-05T00:00:00+02:00
draft: false
resources:
- '*.jpeg'
- '*.png'
- '*.mp4'
# Featured images for Social Media promotion (sorted by priority)
images:
- mission-impossible-crazy-train-cover.png
topics:
- Artificial Intelligence
- Edge Computing
---

On November 5, 2024, I presented a webinar named [Mission impossible #1: Stop the crazy Train with AI and Edge before it is too late](https://events.redhat.com/profile/form/index.cfm?PKformID=0x1270056abcd&extIdCarryOver=true&sc_cid=701f2000000txokAAA) with three colleagues: [Adrien](https://www.linkedin.com/in/adrien-legros-78674a133/), [Mourad](https://www.linkedin.com/in/mourad-ouachani-0734218/) and [Pauline](https://www.linkedin.com/in/trg-pauline/).

This webinar is the pinnacle of ten months of hard work.
In this article, I will give you an overview of the demo, and you will be able to watch the replay in case you missed the live event.

<!--more-->

## The "Mission impossible" demo

We designed this demo for the {{< internalLink path="/speaking/platform-day-2024/index.md" >}} event, based on the latest opus of the movie franchise, **Mission Impossible: Dead Reckoning**.
In this demo, **Ethan Hunt** needs help to stop the **Lego City #60337** train before it's too late!
Nothing less than the fate of humanity is at stake!

{{< attachedFigure src="mission-impossible-plot.png" >}}

The scenario requires **Ethan Hunt** to board the train, connect a **Nvidia Jetson Orin Nano** card to the train's computer network, and deploy an AI that will recognise the traffic signs and stop the train in time, before it derails!
A console provides a remote view of the train's video surveillance camera, with the results of the AI model's inference overlaid.

{{< attachedFigure src="mission-impossible-scenario.png" >}}

To run this demo, we equipped the **Lego** train with a **Nvidia Jetson Orin Nano** card, a webcam and a portable battery.
The Nvidia Jetson Orin card is a System on Chip (SoC): it includes all the hardware that **Ethan Hunt** needs for his mission: CPU, RAM, storage...
Plus a GPU to speed up the computations!
The Jetson receives the video stream from the onboard camera and transmits orders to the **Lego** Hub via the **Bluetooth Low Energy** protocol.
It is powered by a portable battery for the duration of the mission.

{{< attachedFigure src="rhel-booth-mission-impossible-demo.jpeg" >}}

We are in an Edge Computing context.
On the Jetson, we installed **Red Hat Device Edge**, a variant of Red Hat Enterprise Linux adapted to the constraints of **Edge Computing**.
On top of it, we installed **MicroShift**, Red Hat's Kubernetes distribution tailored for the Edge.
And using MicroShift, we deployed *over-the-air* our microservices, an **MQTT broker** and the artificial intelligence model.
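
As an illustration of this kind of deployment (the namespace, names, and manifest below are assumptions for the sketch, not the actual demo manifests), an MQTT broker can be deployed on MicroShift like any Kubernetes workload:

```shell
# Sketch: deploy an MQTT broker (Eclipse Mosquitto) on MicroShift.
cat <<'EOF' | kubectl apply -f -
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mqtt-broker
  namespace: crazy-train          # hypothetical namespace
spec:
  replicas: 1
  selector:
    matchLabels:
      app: mqtt-broker
  template:
    metadata:
      labels:
        app: mqtt-broker
    spec:
      containers:
      - name: mosquitto
        image: docker.io/library/eclipse-mosquitto:2
        ports:
        - containerPort: 1883     # default MQTT port
EOF
```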

For the duration of the mission, the Jetson is connected to an OpenShift cluster in the AWS cloud via a 5G connection.
In the AWS cloud, a RHEL 9 VM is available to build the **Red Hat Device Edge** images for the Jetson SoC.
The OpenShift cluster hosts the video surveillance application that broadcasts the video stream from the train's on-board camera.
The video stream is relayed from the Jetson through a **Kafka broker**!
On top of this, there are MLOps pipelines to train the AI model.
And finally, CI/CD pipelines to build the container images of our microservices for the x86 and ARM architectures.

{{< attachedFigure src="mission-impossible-hardware-architecture.png" >}}

To enable **Ethan Hunt** to carry out his mission successfully, we had to guarantee end-to-end data transmission.
To do this, we implemented five services that communicate via an asynchronous messaging system (**MQTT**).

The first service captures ten images per second.
Each image is resized to 600x400 pixels and encapsulated in an event with a unique identifier.
This event is transmitted to the AI model, which enriches it with the result of the prediction.
The enriched event is then transmitted to a transformation service, which extracts the train's action and forwards it to the train controller to slow down or stop the train.
At the same time, the transformation service sends the event to the streaming service (**Kafka**) deployed on the remote OpenShift cluster, which displays the images and the prediction in real time.
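
The event flow above can be sketched from the command line with the Mosquitto clients (the broker address, topic names, and payload layout are assumptions for illustration, not the demo's actual contract):

```shell
BROKER=broker.example.local   # hypothetical MQTT broker address

# Capture service: publish a frame event with a unique identifier
# (the real payload would carry the encoded 600x400 image).
mosquitto_pub -h "$BROKER" -t train/frames \
  -m "{\"id\": \"$(uuidgen)\", \"image\": \"<encoded jpeg>\"}"

# Transformation service: consume one enriched prediction event.
mosquitto_sub -h "$BROKER" -t train/predictions -C 1
```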

{{< attachedFigure src="mission-impossible-software-architecture.png" >}}

And finally, we had to build an artificial intelligence model.
To do this, we followed best practices for managing the model's lifecycle, known as **MLOps**:

- **Acquire the data**: We used an open source dataset containing data from an on-board camera mounted on a car, annotated with the signs encountered on its route.
The photos were taken on roads in the European Union and therefore show "standard" road signs (potentially slightly different from the **Lego** signs).
- **Develop an AI model**: We chose a learning algorithm and trained the model on an OpenShift cluster with GPUs to speed up the computation.
- **Deploy the model**: We deployed the model in an inference server for consumption via APIs.
The model then had to be integrated into the software architecture (via MQTT).
- **Measure performance and re-train**: By observing the model's behaviour, we were able to measure the quality of the predictions and noticed that not all **Lego** signs were well recognised.
We decided to re-train the model by fine-tuning it on an enriched dataset.

{{< attachedFigure src="mission-impossible-ai.png" >}}

## Watch the replay!

If you have not been able to attend the live session, I invite you to watch the replay!

{{< youtube 8BTLBF0eQqc >}}

If you’re ready to dive deeper, have questions, or just want to connect, I would love to hear from you.
Feel free to reach out directly on [LinkedIn](https://www.linkedin.com/in/nicolasmasse/), [X](https://x.com/nmasse_itix), or your favorite social platform to start a conversation.
You can also engage [with the Red Hat team behind this demo](https://github.com/Demo-AI-Edge-Crazy-Train) for more insights and guidance on how we’re innovating in open-source technology.
Let’s connect and build together!
@ -0,0 +1 @@
../../../english/speaking/red-hat-summit-connect-france-2024/mission-impossible-ai.png
@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9becff2ea070ebe8ffe0278a58aefc32433dcfac4556179c302fbb76056f4a41
size 507943
@ -0,0 +1 @@
../../../english/speaking/red-hat-summit-connect-france-2024/mission-impossible-hardware-architecture.png
@ -0,0 +1 @@
../../../english/speaking/red-hat-summit-connect-france-2024/mission-impossible-plot.png
@ -0,0 +1 @@
../../../english/speaking/red-hat-summit-connect-france-2024/mission-impossible-scenario.png
@ -0,0 +1 @@
../../../french/speaking/red-hat-summit-connect-france-2024/rhel-booth-mission-impossible-demo.jpeg