Tutorials#
Running default use case (anomalib UDF)#
In this tutorial, we will run a pipeline on a sample video, anomalib_pcb_test.avi. The video loops continuously, generating frames that are run through inference using the anomalib UDF (more about the udfloader element here). The frames at the end of the pipeline are collected by EVAM and published over secured gRPC communication. If you are familiar with GStreamer pipelines, here is the pipeline:
"multifilesrc loop=TRUE location=/home/pipeline-server/resources/videos/anomalib_pcb_test.avi name=source ! h264parse ! decodebin ! queue max-size-buffers=10 ! videoconvert ! video/x-raw,format=RGB ! udfloader name=udfloader ! appsink name=destination"
As soon as EVAM starts, the pipeline starts (since auto_start is set to "true") and begins publishing.
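For orientation, the sketch below shows roughly how these pieces sit together in configs/eii/default/config.json. Only the pipelines list and the name, pipeline, auto_start and udfs keys are referenced in this tutorial; the pipeline name and the UDF entry placeholder are illustrative, so treat the shipped config as the source of truth.
"pipelines": [
    {
        "name": "anomalib",
        "pipeline": "multifilesrc loop=TRUE location=/home/pipeline-server/resources/videos/anomalib_pcb_test.avi name=source ! h264parse ! decodebin ! queue max-size-buffers=10 ! videoconvert ! video/x-raw,format=RGB ! udfloader name=udfloader ! appsink name=destination",
        "auto_start": "true",
        "udfs": {
            "udfloader": [
                <anomalib UDF entry as shipped in the default config>
            ]
        }
    }
]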
Ensure you have completed the Getting Started steps here.
cd into the build directory.
cd [EIS_WORKDIR]/IEdgeInsights/build
Run the builder script in [EIS_WORKDIR]/IEdgeInsights/build to generate the updated consolidated eii_config.json and docker-compose.yml files. We will be using the use case video-streaming-evam-datastore.yml, which has EVAM, the clients (Visualizer and DataStore) and ConfigMgrAgent for provisioning.
python3 builder.py -f usecases/video-streaming-datastore.yml
Build the services. If the image is not available/built, this might take some time.
docker compose build
Run the application and wait till the services are up.
./run.sh
Once the services are up, the respective clients will be receiving the frames and metadata. For the Visualizer, you can check:
https://<HOST_IP>:5003/edge_video_analytics_results
Stop the services.
docker compose down -v
Using a different UDF (Pallet Defect Detection)#
In this tutorial, we will run a Geti-based pallet defect detection model on a sample warehouse video. We will begin by updating the default configuration at configs/eii/default/config.json. Replace the pipeline string with the following:
"multifilesrc loop=TRUE location=/home/pipeline-server/resources/videos/warehouse.avi name=source ! decodebin ! videoconvert ! video/x-raw,format=RGB ! udfloader name=udfloader ! appsink name=destination"
Also, the model (as a UDF) should be replaced with the following:
<snip>
"udfs": {
"udfloader": [
{
"name": "python.geti_udf.geti_udf",
"type": "python",
"device": "CPU",
"visualize": "true",
"deployment": "./resources/models/geti/pallet_defect_detection/deployment",
"metadata_converter": "null"
}
]
}
<snip>
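Putting the two changes together, the updated pipeline entry in config.json would look roughly like the sketch below. Only the pipeline string and the udfs block are prescribed by this tutorial; the entry name and any surrounding keys shown here are illustrative.
{
    "name": "pallet_defect_detection",
    "pipeline": "multifilesrc loop=TRUE location=/home/pipeline-server/resources/videos/warehouse.avi name=source ! decodebin ! videoconvert ! video/x-raw,format=RGB ! udfloader name=udfloader ! appsink name=destination",
    "auto_start": "true",
    "udfs": {
        "udfloader": [
            {
                "name": "python.geti_udf.geti_udf",
                "type": "python",
                "device": "CPU",
                "visualize": "true",
                "deployment": "./resources/models/geti/pallet_defect_detection/deployment",
                "metadata_converter": "null"
            }
        ]
    }
}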
cd into the build directory.
cd [EIS_WORKDIR]/IEdgeInsights/build
Run the builder script in [EIS_WORKDIR]/IEdgeInsights/build to generate the updated consolidated eii_config.json and docker-compose.yml files for the use case video-streaming-evam-datastore.yml, which has EVAM, the clients (Visualizer and DataStore) and ConfigMgrAgent.
python3 builder.py -f usecases/video-streaming-datastore.yml
Build the services. If the image is not available/built, this might take some time.
docker compose build
Run the application and wait till the services are up.
./run.sh
Once the services are up, the respective clients will be receiving the frames and metadata. For the Visualizer, you can check:
https://<HOST_IP>:5003/edge_video_analytics_results
Stop the services.
docker compose down -v
Running multiple pipelines#
In this tutorial, we will run multiple pipelines side by side in the same EVAM container. You can add as many pipelines as needed by adding their respective pipeline configurations to the "pipelines" list. The pipelines are distinguished by the names given in the "name" key of each pipeline config section (see the sketch below). The same set of clients can receive frames and metadata from all the pipelines mentioned in the configuration.
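As an illustration, a two-pipeline configuration might be structured as in the sketch below. The pipeline names are inferred from the topic names shown later in this tutorial, and the pipeline strings and UDF entries are elided; refer to the sample configuration mentioned below for the exact contents.
"pipelines": [
    {
        "name": "anomalib",
        "pipeline": "<anomalib pipeline string>",
        "udfs": { "udfloader": [ <anomalib UDF entry> ] }
    },
    {
        "name": "pallet_defect_detection",
        "pipeline": "<pallet defect detection pipeline string>",
        "udfs": { "udfloader": [ <Geti UDF entry> ] }
    }
]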
Often, users would like to know which pipeline a given piece of metadata belongs to. For this, set the environment variable "APPEND_PIPELINE_NAME_TO_PUBLISHER_TOPIC" to "true" ("false" by default); this appends the respective pipeline name to the topics published to the clients. More on this here.
A sample configuration has been provided at [EIS_WORKDIR]/IEdgeInsights/EdgeVideoAnalytics/configs/eii/sample_multi_pipeline/config.json. Copy its contents to [EIS_WORKDIR]/IEdgeInsights/EdgeVideoAnalytics/configs/eii/default/config.json. Also, set "APPEND_PIPELINE_NAME_TO_PUBLISHER_TOPIC" to "true" in the environment section of docker/docker-compose-eis.yml.
Run the builder script in IEdgeInsights/build to generate the updated consolidated eii_config.json and docker-compose.yml files. We will be using video-streaming-evam-datastore.yml, which has EVAM, the clients (Visualizer and DataStore) and ConfigMgrAgent for provisioning.
python3 builder.py -f usecases/video-streaming-datastore.yml
Build the services. If the image is not available/built, this might take some time.
docker compose build
Run the application
./run.sh
Wait until the services are up.
We can check the Visualizer service to verify that the frames are being sent correctly. Open the following URLs in a browser:
https://<HOST_IP>:5003/edge_video_analytics_results_anomalib
https://<HOST_IP>:5003/edge_video_analytics_results_pallet_defect_detection
Recall that we set APPEND_PIPELINE_NAME_TO_PUBLISHER_TOPIC to true; as a result, the frames and metadata are published on topics with the respective pipeline name appended, which is reflected in the Visualizer URLs above.
Stop the services.
docker compose down -v