Tutorials#

These tutorials show how to use the Data Store microservice.

Prerequisites#

Configuring Data Store#

The configuration for the Data Store service is stored in etcd. The configuration details are available in the docker-compose file, under AppName in the environment section of the app's service definition.

When AppName is DataStore, the following example shows how the app's config looks under the /DataStore/config key in etcd:

  {
    "datatypes": {
      "json": {
        "host": "ia_influxdb",
        "port": 8086,
        "dbname": "datain",
        "verifySsl": false,
        "ignoreKeys": [
          "defects"
        ],
        "tagKeys": [],
        "retention": "1h",
        "topics": [
          "*"
        ],
        "retentionPollInterval": "60s"
      },
      "blob": {
        "host": "ia_miniodb",
        "port": 9000,
        "dbname": "image-store-bucket",
        "retention": "1h",
        "topics": [
          "edge_video_analytics_results"
        ],
        "retentionPollInterval": "60s",
        "ignoreKeys": [
          "model-registry"
        ]
      }
    },
    "dbs": {
      "json": "influxdb",
      "blob": "miniodb"
    }
  }

The following are the details of the keys in the above config:

  • datatypes (Required)

    • The host is an optional parameter used to connect to the respective database server (local or remote). If it is not provided, the JSON datatype defaults to ia_influxdb and the Blob datatype defaults to ia_miniodb.

    • The port is an optional parameter used to connect to the respective database server's port (local or remote). If it is not provided, the JSON datatype defaults to 8086 (InfluxDB) and the Blob datatype defaults to 9000 (MinIO object storage).

    • The topics key determines which messages are processed by the corresponding DB microservice. Only messages whose topic is listed in topics are processed by the individual module. If topics contains *, all messages are processed.

    • The retention parameter is required. It specifies the retention policy to apply to the stored data, such as the images stored in MinIO object storage. For infinite retention, set it to "". Use a suitable duration string as described at https://golang.org/pkg/time/#ParseDuration.

    • The retentionPollInterval parameter is required. It sets the time interval at which stored data is checked for expiration; expired entries become candidates for deletion and are no longer retained. For infinite retention, this attribute is ignored. Use a suitable duration string as described at https://golang.org/pkg/time/#ParseDuration.

    • The ignoreKeys parameter is a list of strings. For JSON, the parser uses this list to decide whether to parse a nested JSON key or save it as a string in the database. For Blob, the retention policy is not applied to the buckets listed here.

  • dbs (Optional)

    • The json parameter is optional and selects the database for the JSON (metadata) datatype. The only available option is influxdb.

    • The blob parameter is optional and selects the database for the Blob datatype. The only available option is miniodb.

  • restserver

    • The Port key configures the port exposed for accessing the REST APIs.

      Note: The same port must be updated under the ports section in docker-compose.yml.

    • The cacertname key is the filename of the PEM certificate.

    • The caprivatekey key is the filename of the PEM key.

    To use different certificates for the REST server, change the left-hand side of the volume mount path in docker-compose.yml for the line item ./Certificates/DataStore_Server/:/run/secrets/RestAPI_Server:ro, and update cacertname and caprivatekey with the respective file names in config.json.

    Refer to REST API Interface for details of the REST APIs. Swagger (API documentation) is available at /docs. To access Swagger after starting Data Store, open a browser and enter https://<ip>:8080. If DEV_MODE is true, use http://<ip>:8080.

By default, both DBs are enabled. To disable either of them, remove the corresponding key and its value from the config.

For example, if you are not using MinIO object storage, you can disable it by modifying the config as follows:

     {
       "datatypes": {
         "json": {
           "host": "ia_influxdb",
           "port": 8086,
           "dbname": "datain",
           "verifySsl": false,
           "ignoreKeys": [
             "defects"
           ],
           "tagKeys": [],
           "retention": "1h",
           "topics": [
             "*"
           ],
           "retentionPollInterval": "60s"
         }
       }
     }

JSON Datatype (InfluxDB)#

For nested JSON data, Data Store by default flattens the nested JSON and pushes the flat data to InfluxDB. To exclude a particular nested key from flattening, list that key in the config.json file. Currently, the defects key is excluded from flattening. Every key to be ignored has to be a separate entry in the list.

For example,

  "ignoreKeys": [ "Key1", "Key2", "Key3" ]

By default, all keys in the data schema are pushed to InfluxDB as fields. If tags are present in the data schema, they can be listed in the config.json file; the data pushed to InfluxDB will then contain both fields and tags. At present, no tags are present in the data schema, so tagKeys is kept empty in the config file.

For Example,

  "tagKeys": [ "Tag1", "Tag2" ]
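The flattening behaviour described above can be sketched in a few lines. This is an illustrative re-implementation, not Data Store's actual code, and the dotted column-name convention is an assumption for the example: nested objects are flattened into single-level columns, while any key listed in ignoreKeys is kept intact and stored as a JSON string.

```python
import json

def flatten(data: dict, ignore_keys=(), parent: str = "") -> dict:
    """Flatten nested JSON into a single-level dict of column -> value.

    Keys named in ignore_keys are not descended into; their subtree is
    serialized to a JSON string and stored under the original key.
    """
    flat = {}
    for key, value in data.items():
        column = f"{parent}.{key}" if parent else key
        if key in ignore_keys:
            flat[column] = json.dumps(value)  # keep the subtree as a string
        elif isinstance(value, dict):
            flat.update(flatten(value, ignore_keys, column))
        else:
            flat[column] = value
    return flat

row = flatten(
    {"img_handle": "c229634589",
     "camera": {"id": 3, "pose": {"x": 1.0, "y": 2.0}},
     "defects": [{"type": "scratch"}]},
    ignore_keys=("defects",),
)
```

With defects in ignore_keys, the defect list survives as one string column instead of being exploded into many generated columns.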

Blob Datatype (MinIO Object Storage)#

The MinIO object storage primarily subscribes to the stream coming out of the EdgeVideoAnalyticsMicroservice app via gRPC and stores the frames in MinIO for historical analysis.

The high-level logical flow of MinIO object storage is as follows:

  1. The gRPC server receives a message from EdgeVideoAnalyticsMicroservice.

  2. The img_handle is extracted from the metadata and used as the key; the frame is stored as the value for that key in MinIO persistent storage.

Tutorial 1: Using Data Store Sample app to exercise gRPC Request-Response interfaces#

The Data Store Sample App is a web-based interactive application used to exercise all the functionality of the gRPC Request-Response server, with pre-loaded requests containing test data. It also provides a sample Python client for the request-response server.

For building the Data Store Sample App, run the below command:

  # Rename the local environment file
  mv local_env .env
  # Source the .env using the following command:
  set -a && source .env && set +a
  # Build the Data Store Sample App
  docker compose build

For running the Data Store Sample App, run the below command:

    # Source the .env using the following command:
    set -a && source .env && set +a
    # Launch the Data Store Sample App
    docker compose -f docker-compose.yml -f docker-compose-dev.override.yml up -d

Once the Sample App has been deployed, it is accessible at http://localhost:9001/

The Sample App consists of pre-loaded samples for metadata, blob, and custom requests:

  1. The pre-loaded requests are Write Meta, Write Blob, Write both, Read Meta, Read Blob, Read both, List Meta, List Blob, List both, Delete Meta, Delete Blob, Delete both, Clear Meta, Clear Blob, Clear both, Clear local, and Update Meta. Each pre-loaded request uses the topic test_topic (measurement/bucket name) filled with test data.

  2. A Custom Request, for which the user has to provide the request JSON.

Using the Sample App#

  1. On accessing the URL, the sample app opens a web-based interactive session that communicates with Data Store over the ZMQ/gRPC Request-Response server.

  2. To execute the pre-loaded samples, select one of Write Meta, Write Blob, Write both, Read Meta, Read Blob, Read both, List Meta, List Blob, List both, Delete Meta, Delete Blob, Delete both, Clear Meta, Clear Blob, Clear both, Clear local, or Update Meta using the radio buttons, and click the Submit button.

  3. To execute a Custom Request:

     a. Select the Custom Request radio button and click the Submit button.

     b. Prepare the request JSON based on the type of request; for the request skeleton, refer to gRPC Request-Response Endpoints.

     c. Enter the prepared request packet into the Request text box and click the Submit button.

  4. On submitting, a new web page opens, displaying the request packet sent and the response received from Data Store, with two buttons: Go Back and Exit.

  5. Clicking Go Back leads to the main screen, where you can select a pre-loaded sample or a Custom Request; clicking Exit terminates the current session. To restart the session, refresh the page in the browser.

Note: To go back to the main menu from the Custom Request, reload or refresh the web page in the browser.

gRPC interfaces#

gRPC Request-Response Endpoints#

  Path     Description

  read     Returns data from the database based on a query/blob request.

  write    Writes data to the database based on the request. Returns status code and err (if any).

  update   Updates data in the database based on the time field (applicable only to metadata). Returns status code and err (if any).

  list     Returns the list of measurement/table/bucket names in the respective database based on the request.

  clear    Clears all the data in local storage, or the data in a measurement/table/bucket in the respective database, based on the request.

  delete   Deletes all the data and the measurement/table/bucket in the respective database based on the request.

read#

Returns data from the database based on a query/blob request, or err if any.

Request#
{
  "command": "read",
  "payload": {
    "topic": "edge_video_analytics_results",
    "query": "select * from edge_video_analytics_results",
    "blob": {
      "key": "img_handle"
    }
  }
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

    • query: (Optional/String) The query for requesting data from the metadata database (can be empty for a blob-only request).

    • blob: (Optional/JSON Object) (Can be empty for a meta-only request)

      • key: (Required/String) The blob identifier (for a single read request), or the metadata key whose values will be used as blob identifiers (for a bulk read based on the metadata response).

NOTE: For a single read, at least one of query/blob should be filled.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "[{\"img_handle\":\"test_handle\",\"encoding_type\":\"jpeg\",\"encoding_level\":251,\"annotation\":\"test_data\",\"time\":\"1685090211099234630\"}]",
    "blobdata": ["ssadgfsbdsbasbdsadbgsadfds==/Qa"]
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status code of the request. Refer to Status Codes & Descriptions for details.

  • response: (JSON Object)

    • metadata: (Array of JSON objects as a string) Filled if metadata was requested. Empty string if metadata was not requested or there is no data for the given query.

    • blobdata: (Array of bytearray) Filled if blob data was requested. Contains bulk data only if both were requested. Empty array if blob data was not requested.

  • err: (String) Detailed error message. Empty string if no error.

Example Request: Read - Meta only (Bulk Response based on query)#
{
  "command": "read",
  "payload": {
    "topic": "edge_video_analytics_results",
    "query": "select * from edge_video_analytics_results"
  }
}
Example Request: Read - Blob only (Single Read Request)#
{
  "command": "read",
  "payload": {
    "topic": "edge_video_analytics_results",
     "blob": {
      "key": "blob_identifier"
    }
  }
}
Example Request: Read - Both (Bulk Response based on query, and blob for each row)#
{
  "command": "read",
  "payload": {
    "topic": "edge_video_analytics_results",
    "query": "select * from edge_video_analytics_results",
    "blob": {
      "key": "img_handle"
    }
  }
}
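Clients typically assemble these request packets programmatically. The helper below is a hypothetical sketch, not a Data Store client library: it builds a read request according to the field rules above (topic required; a single read needs at least one of query or blob) and serializes it to the JSON packet shape shown in the examples.

```python
import json

def build_read_request(topic: str, query: str = "", blob_key: str = "") -> str:
    """Assemble a Data Store 'read' request packet as a JSON string.

    Hypothetical helper for illustration; enforces the rules documented
    above: topic is required, and at least one of query/blob must be set.
    """
    if not topic:
        raise ValueError("topic is required")
    if not query and not blob_key:
        raise ValueError("a read needs a query, a blob key, or both")
    payload = {"topic": topic}
    if query:
        payload["query"] = query
    if blob_key:
        payload["blob"] = {"key": blob_key}
    return json.dumps({"command": "read", "payload": payload})

# Meta-only bulk read, matching the first example above.
packet = build_read_request(
    "edge_video_analytics_results",
    query="select * from edge_video_analytics_results",
)
```

The same pattern extends to the other commands by swapping the command name and payload fields.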
write#

Writes data to the database based on the request. Returns status code and err (if any).

Request#
{
  "command": "write",
  "payload": {
    "topic": "sample_topic",
    "metadata": {
      "img_handle": "c229634589",
      "encoding_type": "jpeg",
      "encoding_level": 25
    },
    "blob": {
      "key": "c229634589",
      "data": "bytearray of blob"
    }
  }
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

    • metadata: (Optional/JSON Object) Filled if metadata is to be inserted. Contains key-value pairs of the JSON metadata, with the key as the column name and the value as the column value. (Can be null for a blob-only request)

    • blob: (Optional/JSON Object) (Can be empty for a meta-only request)

      • key: (Required/String) The blob identifier to be stored in the blob DB.

      • data: (Required/Array of bytearray) The blob data for the corresponding blob identifier (key). Only the 0th position is considered.

NOTE: For a single write, at least one of metadata/blob should be filled. The data types of columns should be maintained for metadata; otherwise, new columns will be created.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of JSON Object as String) Will be empty string.

    • blobdata: (Array of bytearray) Will be empty array

  • err: (String) Error Message in Detail. Will be Empty string if no error

Example Request: write - Meta only (Single Row write Request)#
{
  "command": "write",
  "payload": {
    "topic": "sample_topic",
    "metadata": {
      "img_handle": "c229634589",
      "encoding_type": "jpeg",
      "encoding_level": 25
    }
  }
}
Example Request: write - Blob only (Single Blob write Request)#
{
  "command": "write",
  "payload": {
    "topic": "sample_topic",
    "blob": {
      "key": "sample_image",
      "data": "bytearray of blob"
    }
  }
}
Example Request: write - Both (Single Meta & Blob write Request)#
{
  "command": "write",
  "payload": {
    "topic": "sample_topic",
    "metadata": {
      "img_handle": "c229634589",
      "encoding_type": "jpeg",
      "encoding_level": 25
    },
    "blob": {
      "key": "sample_image",
      "data": "bytearray of blob"
    }
  }
}
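The examples above use "bytearray of blob" as a placeholder for the actual frame bytes. When raw bytes travel inside a JSON packet they must be text-encoded, and the read response earlier in this page shows a base64-looking string. Assuming base64 is the wire encoding (an assumption; confirm against the client library you use), a blob write payload for a real image could be prepared like this:

```python
import base64
import json

# Stand-in for real JPEG bytes (a frame from the video pipeline).
frame = b"\xff\xd8\xff\xe0" + b"\x00" * 16

# Encode the raw bytes as base64 text so they can be carried in JSON.
# NOTE: base64 here is an assumption for illustration; verify the actual
# wire encoding expected by the Data Store client you are using.
encoded = base64.b64encode(frame).decode("ascii")

packet = json.dumps({
    "command": "write",
    "payload": {
        "topic": "sample_topic",
        "blob": {"key": "sample_image", "data": [encoded]},  # only index 0 is used
    },
})

# The receiver recovers the original bytes with the inverse operation.
decoded = base64.b64decode(encoded)
```

The round trip (encode before write, decode after read) must be symmetric, or the stored frame will be corrupted.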
update#

Updates data in the database based on the time field (applicable only to metadata). Returns status code and err (if any).

Request#
{
  "command": "update",
  "payload": {
    "topic": "sample_topic",
    "metadata": [{
      "img_handle": "c229634589",
      "encoding_type": "jpeg",
      "encoding_level": 25,
      "time": "1685090211099234630"
    }]
  }
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

    • metadata: (Required/JSON Object) The metadata to be updated. Contains key-value pairs of the JSON metadata, with the key as the column name and the value as the column value.

NOTE: The data types of columns should be maintained for metadata; otherwise, new columns will be created. time should be in string format.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of JSON Object as String) Will be empty string.

    • blobdata: (Array of bytearray) Will be empty array

  • err: (String) Error Message in Detail. Will be Empty string if no error

Example Request: update - Meta only (Single Meta Update Request)#
{
  "command": "update",
  "payload": {
    "topic": "sample_topic",
    "metadata": [{
      "img_handle": "c229634589",
      "encoding_type": "jpeg",
      "encoding_level": 25,
      "time": "1685090211099234630"
    }]
  }
}
list#

The list API returns the list of measurement/table/bucket names in the respective database based on the request.

Request#
{
  "command": "list",
  "payload": "both"
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/String) Selects which database to list the topics of. Options are meta, blob, and both.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "[\"sample_topic1\",\"sample_topic2\"]",
    "blobdata": ["sample_topic1","sample_topic2"]
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status code of the request. Refer to Status Codes & Descriptions for details.

  • response: (JSON Object)

    • metadata: (Array of strings as a string) List of measurement/table names in the meta database.

    • blobdata: (Array of strings) List of bucket names in the blob database.

  • err: (String) Detailed error message. Empty string if no error.

Example Request: list - Meta only#
{
  "command": "list",
  "payload": "meta"
}
Example Request: list - Blob only#
{
  "command": "list",
  "payload": "blob"
}
Example Request: list - Both#
{
  "command": "list",
  "payload": "both"
}
clear#

The clear API clears all the data in local storage, or the data in a measurement/table/bucket in the respective database, based on the request.

Request#
{
  "command": "clear",
  "payload": {
    "dbType": "both",
    "topic": "edge_video_analytics_results"
  }
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

    • dbType: (Required/String) Selects which database to perform the clear operation on. Options are meta, blob, both, and local.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status code of the request. Refer to Status Codes & Descriptions for details.

  • response: (JSON Object)

    • metadata: (Array of JSON objects as a string) Filled if metadata was requested. Empty string if metadata was not requested or there is no data for the given query.

    • blobdata: (Array of bytearray) Filled if blob data was requested. Contains bulk data only if both were requested. Empty array if blob data was not requested.

  • err: (String) Detailed error message. Empty string if no error.

NOTE: Deleting the blob data might return an error while EVAM ingestion is active.

Example Request: clear - Meta only#
{
  "command": "clear",
  "payload": {
    "dbType": "meta",
    "topic": "edge_video_analytics_results"
  }
}
Example Request: clear - Blob only#
{
  "command": "clear",
  "payload": {
    "dbType": "blob",
    "topic": "edge_video_analytics_results"
  }
}
Example Request: clear - Both#
{
  "command": "clear",
  "payload": {
    "dbType": "both",
    "topic": "edge_video_analytics_results"
  }
}
Example Request: clear - local#
{
  "command": "clear",
  "payload": {
    "dbType": "local",
    "topic": "all"
  }
}
delete#

The delete API deletes all the data for the selected measurement/table/bucket in the respective database based on the requested time range. For meta/both, timestamps should be in nanoseconds; for blob, timestamps should be in seconds.

Request#
{
    "command": "delete",
    "payload": {
        "dbType": "both",
        "topic": "test_topic",
        "start": "1700560635226126767",
        "stop": "1700561235226127691",
        "blobIdentifier": "img_handle"
    }
}

Request Fields

  • command: (Required/String) The type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

    • dbType: (Required/String) Selects which database to perform the delete operation on. Options are meta, blob, and both.

    • start: The timestamp from which data should be deleted. In nanoseconds for meta/both; in seconds for blob.

    • stop: The timestamp up to which data should be deleted. In nanoseconds for meta/both; in seconds for blob.

    • blobIdentifier: (Optional/String) The blob identifier. Needs to be filled only for a both request.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status code of the request. Refer to Status Codes & Descriptions for details.

  • response: (JSON Object)

    • metadata: (Array of JSON objects as a string) Filled if metadata was requested. Empty string if metadata was not requested or there is no data for the given query.

    • blobdata: (Array of bytearray) Filled if blob data was requested. Contains bulk data only if both were requested. Empty array if blob data was not requested.

  • err: (String) Detailed error message. Empty string if no error.

NOTE: Deleting the blob data might return an error while EVAM ingestion is active.

Example Request: delete - Meta only#
{
    "command": "delete",
    "payload": {
        "dbType": "meta",
        "topic": "test_topic",
        "start": "1700560635226126767",
        "stop": "1700561235226127691",
        "BlobIdentifier": ""
    }
}
Example Request: delete - Blob only#
{
    "command": "delete",
    "payload": {
        "dbType": "blob",
        "topic": "test_topic",
        "start": "1700471030",
        "stop": "1700483045",
        "BlobIdentifier": ""
    }
}
Example Request: delete - Both#
{
    "command": "delete",
    "payload": {
        "dbType": "both",
        "topic": "test_topic",
        "start": "1700560635226126767",
        "stop": "1700561235226127691",
        "BlobIdentifier": "img_handle"
    }
}
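Note the unit mismatch in the examples above: meta/both requests carry nanosecond timestamps while blob requests carry seconds, and both are sent as strings. A small helper (illustrative, not part of Data Store) avoids mixing the units up:

```python
NS_PER_S = 1_000_000_000

def delete_timestamps(start_s: int, stop_s: int, db_type: str) -> tuple[str, str]:
    """Return (start, stop) strings in the unit the delete API expects.

    meta/both -> nanoseconds, blob -> seconds; values travel as strings.
    """
    if db_type in ("meta", "both"):
        return str(start_s * NS_PER_S), str(stop_s * NS_PER_S)
    if db_type == "blob":
        return str(start_s), str(stop_s)
    raise ValueError(f"unknown dbType: {db_type!r}")

# Epoch seconds from the blob example above, converted for a meta delete.
start, stop = delete_timestamps(1700471030, 1700483045, "meta")
```

Passing seconds where nanoseconds are expected would silently target a time range in 1970, deleting nothing.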

Status Codes#

  Status Code   Description

  0             Success

  1             Fail

  2             API Generic Failure

  3             DB Handler API Failure

  4             Factory Interface API Failure

  5             JSON Packing/Unpacking Failure

  6             Partial DB Execution Failed/Success
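The status codes above map naturally to an integer enum on the client side. The sketch below is illustrative; the enum and helper names are not part of Data Store:

```python
from enum import IntEnum

class StatusCode(IntEnum):
    """Status codes returned in the 'statuscode' response field (see table above)."""
    SUCCESS = 0
    FAIL = 1
    API_GENERIC_FAILURE = 2
    DB_HANDLER_API_FAILURE = 3
    FACTORY_INTERFACE_API_FAILURE = 4
    JSON_PACKING_UNPACKING_FAILURE = 5
    PARTIAL_DB_EXECUTION = 6

def is_ok(response: dict) -> bool:
    # Treat a partial success (code 6) as a failure so callers retry explicitly.
    return response.get("statuscode") == StatusCode.SUCCESS

ok = is_ok({"statuscode": 0, "response": {"metadata": "", "blobdata": []}, "err": ""})
```

Naming the codes keeps client-side error handling readable instead of scattering magic integers through the code.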

Summary#

In this tutorial, you learned how to query the databases over the gRPC interface.

Tutorial 2: Using Data Store Swagger documentation to exercise REST API interface#

The Swagger documentation for the REST APIs of Data Store is available at http://localhost:8080/docs. Refer to REST API Interface for more details.

REST API interface#

REST APIs Endpoints#

  Path      Description

  /read     Returns data from the database based on a query/blob request.

  /write    Writes data to the database based on the request. Returns status code and err (if any).

  /update   Updates data in the database based on the time field (applicable only to metadata). Returns status code and err (if any).

  /list     Returns the list of measurement/table/bucket names in the respective database based on the request.

  /clear    Clears all the data and the measurement/table/bucket in the respective database based on the request.

  /delete   Deletes all the data and the measurement/table/bucket in the respective database based on the requested time range. For meta/both, timestamps are in nanoseconds; for blob, in seconds.

read#

Returns data from the database based on a query/blob request, or err if any.

Request#
{
  "topic": "edge_video_analytics_results",
  "query": "select * from edge_video_analytics_results",
  "blob": {
    "key": "img_handle"
  }
}

Request Fields

  • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

  • query: (Optional/String) The query for requesting data from the metadata database (can be empty for a blob-only request).

  • blob: (Optional/JSON Object) (Can be empty for a meta-only request)

    • key: (Required/String) The blob identifier (for a single read request), or the metadata key whose values will be used as blob identifiers (for a bulk read based on the metadata response).

NOTE: For a single read, at least one of query/blob should be filled.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "[{\"img_handle\":\"test_handle\",\"encoding_type\":\"jpeg\",\"encoding_level\":251,\"annotation\":\"test_data\",\"time\":\"1685090211099234630\"}]",
    "blobdata": ["ssadgfsbdsbasbdsadbgsadfds==/Qa"]
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status code of the request. Refer to Status Codes & Descriptions for details.

  • response: (JSON Object)

    • metadata: (Array of JSON objects as a string) Filled if metadata was requested. Empty string if metadata was not requested or there is no data for the given query.

    • blobdata: (Array of bytearray) Filled if blob data was requested. Contains bulk data only if both were requested. Empty array if blob data was not requested.

  • err: (String) Detailed error message. Empty string if no error.

Example Request: Read - Meta only (Bulk Response based on query)#
{
  "topic": "edge_video_analytics_results",
  "query": "select * from edge_video_analytics_results"
}
Example Request: Read - Blob only (Single Read Request)#
{
  "topic": "edge_video_analytics_results",
  "blob": {
    "key": "blob_identifier"
  }
}
Example Request: Read - Both (Bulk Response based on query, and blob for each row)#
{
  "topic": "edge_video_analytics_results",
  "query": "select * from edge_video_analytics_results",
  "blob": {
    "key": "img_handle"
  }
}
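The REST payloads mirror the gRPC ones minus the command/payload envelope. The sketch below builds the /read body and shows, in comments, how it would be posted with the third-party requests library; the base URL, port, and TLS settings are assumptions that depend on your deployment and DEV_MODE.

```python
def read_body(topic: str, query: str = "", blob_key: str = "") -> dict:
    """Build the JSON body for POST /read (REST carries no command envelope)."""
    if not topic:
        raise ValueError("topic is required")
    body = {"topic": topic}
    if query:
        body["query"] = query
    if blob_key:
        body["blob"] = {"key": blob_key}
    return body

body = read_body("edge_video_analytics_results",
                 query="select * from edge_video_analytics_results")

# To send it (assuming DEV_MODE=true, so plain HTTP on port 8080):
#   import requests  # third-party: pip install requests
#   resp = requests.post("http://localhost:8080/read", json=body, timeout=10)
#   data = resp.json()  # {"statuscode": ..., "response": {...}, "err": ...}
```

For a TLS deployment, the URL becomes https://<ip>:8080 and the request must trust the server's PEM certificate configured under restserver.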
write#

Writes data to the database based on the request. Returns status code and err (if any).

Request#
{
  "topic": "sample_topic",
  "metadata": {
    "img_handle": "c229634589",
    "encoding_type": "jpeg",
    "encoding_level": 25
  },
  "blob": {
    "key": "sample_image",
    "data": ["bytearray of blob"]
  }
}

Request Fields

  • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

  • metadata: (Optional/JSON Object) Filled if metadata is to be inserted. Contains key-value pairs of the JSON metadata, with the key as the column name and the value as the column value. (Can be null for a blob-only request)

  • blob: (Optional/JSON Object) (Can be empty for a meta-only request)

    • key: (Required/String) The blob identifier to be stored in the blob DB.

    • data: (Required/Array of bytearray) The blob data for the corresponding blob identifier (key). Only the 0th position is considered.

NOTE: For a single write, at least one of metadata/blob should be filled. The data types of columns should be maintained for metadata; otherwise, new columns will be created.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of JSON Object as String) Will be empty string.

    • blobdata: (Array of bytearray) Will be empty array

  • err: (String) Error Message in Detail. Will be Empty string if no error

Example Request: write - Meta only (Single Row write Request)#
{
  "topic": "sample_topic",
  "metadata": {
    "img_handle": "c229634589",
    "encoding_type": "jpeg",
    "encoding_level": 25
  }
}
Example Request: write - Blob only (Single Blob write Request)#
{
  "topic": "sample_topic",
  "blob": {
    "key": "sample_image",
    "data": ["bytearray of blob"]
  }
}
Example Request: write - Both (Single Meta & Blob write Request)#
{
  "topic": "sample_topic",
  "metadata": {
    "img_handle": "c229634589",
    "encoding_type": "jpeg",
    "encoding_level": 25
  },
  "blob": {
    "key": "sample_image",
    "data": ["bytearray of blob"]
  }
}
update#

Updates data in the database based on the time field (applicable only to metadata). Returns status code and err (if any).

Request#
{
  "topic": "sample_topic",
  "metadata": [{
    "img_handle": "c229634589",
    "encoding_type": "jpeg",
    "encoding_level": 25,
    "time": "1685090211099234630"
  }]
}

Request Fields

  • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

  • metadata: (Required/JSON Object) The metadata to be updated. Contains key-value pairs of the JSON metadata, with the key as the column name and the value as the column value.

NOTE: The data types of columns should be maintained for metadata; otherwise, new columns will be created. time should be in string format.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of JSON Object as String) Will be empty string.

    • blobdata: (Array of bytearray) Will be empty array

  • err: (String) Error Message in Detail. Will be Empty string if no error

Example Request: update - Meta only (Single Meta Update Request)#
{
  "topic": "sample_topic",
  "metadata": {
    "img_handle": "c229634589",
    "encoding_type": "jpeg",
    "encoding_level": 25,
    "time": "1685090211099234630"
  }
}
list#

The list API returns the list of measurement/table/bucket names in the respective database based on the request.

Request#
"both"

Request Fields

  • payload: (Required/String) Selects which database to list the topics of. Options are meta, blob, and both.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "[\"sample_topic1\",\"sample_topic2\"]",
    "blobdata": ["sample_topic1","sample_topic2"]
  },
  "err": ""
}

Response Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of string as String) List of measurement/table name of meta database

    • blobdata: (Array of string) List of bucket name of blob database

  • err: (String) Error Message in Detail. Will be Empty string if no error

Example Request: list - Meta only#
 "meta"
Example Request: list - Blob only#
"blob"
Example Request: list - Both#
 "both"
clear#

The clear API clears all the data and the measurement/table/bucket in the respective database based on the request.

Request#
{
  "dbType": "both",
  "topic": "edge_video_analytics_results"
}

Request Fields

  • topic: (Required/String) The name of the measurement/table/bucket on which the operation is to be performed.

  • dbType: (Required/String) Selects which database to perform the clear operation on. Options are meta, blob, and both.

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Responses Fields

  • statuscode: (Integer) Status Code of Request. Refer Status Codes & Descriptions for details

  • response: (JSON Object)

    • metadata: (Array of JSON Object as String) Based on the request, if meta data is requested this will be filled. Will be Empty string if not requested for metadata/no data for given query

    • blobdata: (Array of bytearray) Based on the request, if meta data is requested this will be filled. Will be Bulk data only if requested for both. Will be Empty array if not requested for blob.

  • err: (String) Detailed error message. Will be an empty string if there is no error

NOTE: Deleting the blob data might return an error while EVAM ingestion is active.

Example Request: clear - Meta only#
{
  "dbType": "meta",
  "topic": "edge_video_analytics_results"
}
Example Request: clear - Blob only#
{
  "dbType": "blob",
  "topic": "edge_video_analytics_results"
}
Example Request: clear - Both#
{
  "dbType": "both",
  "topic": "edge_video_analytics_results"
}
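The three example payloads above differ only in the dbType value. A small client-side sketch that builds and validates a clear payload (build_clear_request is an illustrative helper, not part of the service):

```python
def build_clear_request(topic: str, db_type: str = "both") -> dict:
    """Build a clear payload; dbType must be meta, blob, or both."""
    if db_type not in ("meta", "blob", "both"):
        raise ValueError(f"unsupported dbType: {db_type}")
    if not topic:
        raise ValueError("topic is required")
    return {"dbType": db_type, "topic": topic}


print(build_clear_request("edge_video_analytics_results"))
# {'dbType': 'both', 'topic': 'edge_video_analytics_results'}
```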
delete#

The Delete API deletes all the data and measurements/tables/buckets in the respective database based on the requested time range. For meta/both data the timestamps should be in nanoseconds, and for blob data the timestamps should be in seconds

Request#
{
    "dbType": "both",
    "topic": "test_topic",
    "start": "1700560635226126767",
    "stop": "1700561235226127691",
    "blobIdentifier": "img_handle"
}

Request Fields

  • command: (Required/String) Type of command for the gRPC Req-Resp server. Supported commands: read, write, update, list, clear, delete.

  • payload: (Required/JSON Object)

    • topic: (Required/String) Name of the measurement/table/bucket on which the operation is to be performed

    • dbType: (Required/String) Selects which database to perform the delete operation on. Options are meta, blob, and both

    • start: Timestamp from which the data should be deleted. For meta/both it should be in nanoseconds; for blob, in seconds

    • stop: Timestamp up to which the data should be deleted. For meta/both it should be in nanoseconds; for blob, in seconds

    • BlobIdentifier: (Optional/String) Blob identifier. Needs to be filled only for a both request

Response#
{
  "statuscode": 0,
  "response": {
    "metadata": "",
    "blobdata": []
  },
  "err": ""
}

Responses Fields

  • statuscode: (Integer) Status code of the request. Refer to the Status Codes section for details

  • response: (JSON Object)

    • metadata: (Array of JSON objects as a string) Filled when metadata is requested. Will be an empty string if metadata was not requested or there is no data for the given query

    • blobdata: (Array of bytearrays) Filled when blob data is requested. Will be bulk data only if both is requested. Will be an empty array if blob data was not requested

  • err: (String) Detailed error message. Will be an empty string if there is no error

NOTE: Deleting the blob data might return an error while EVAM ingestion is active.

Example Request: delete - Meta only#
{
  "dbType": "meta",
  "topic": "test_topic",
  "start": "1700560635226126767",
  "stop": "1700561235226127691",
  "BlobIdentifier": ""
}
Example Request: delete - Blob only#
{
  "dbType": "blob",
  "topic": "test_topic",
  "start": "1700471030",
  "stop": "1700483045",
  "BlobIdentifier": ""
}
Example Request: delete - Both#
{
  "dbType": "both",
  "topic": "test_topic",
  "start": "1700560635226126767",
  "stop": "1700561235226127691",
  "BlobIdentifier": "img_handle"
}
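Because the expected timestamp unit depends on dbType (nanoseconds for meta/both, seconds for blob), it is easy to send the wrong unit. A minimal Python sketch that builds a delete payload from seconds-based timestamps (build_delete_request is an illustrative helper, not part of the service; the "BlobIdentifier" key casing follows the example requests above):

```python
def build_delete_request(topic: str, db_type: str,
                         start_s: float, stop_s: float,
                         blob_identifier: str = "") -> dict:
    """Build a delete payload with the timestamp unit the API expects:
    nanoseconds for meta/both, plain seconds for blob."""
    if db_type not in ("meta", "blob", "both"):
        raise ValueError(f"unsupported dbType: {db_type}")
    if db_type == "blob":
        start, stop = int(start_s), int(stop_s)
    else:
        # meta/both expect nanoseconds
        start, stop = int(start_s * 1e9), int(stop_s * 1e9)
    return {
        "dbType": db_type,
        "topic": topic,
        "start": str(start),  # the example requests carry timestamps as strings
        "stop": str(stop),
        "BlobIdentifier": blob_identifier,
    }


# Reusing the blob example's time range (seconds):
req = build_delete_request("test_topic", "blob", 1700471030, 1700483045)
print(req["start"], req["stop"])  # 1700471030 1700483045
```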

Status Codes#

Description                          | Status Code
------------------------------------ | -----------
Success                              | 0
Fail                                 | 1
API Generic Failure                  | 2
DB Handler API Failure               | 3
Factory Interface API Failure        | 4
JSON Packing/Unpacking Failure       | 5
Partial DB Execution Failed/Success  | 6
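On the client side, these codes can be turned into readable diagnostics. A small Python sketch, assuming responses shaped like the examples in this document:

```python
# Status codes as documented for the Data Store service.
STATUS_DESCRIPTIONS = {
    0: "Success",
    1: "Fail",
    2: "API Generic Failure",
    3: "DB Handler API Failure",
    4: "Factory Interface API Failure",
    5: "JSON Packing/Unpacking Failure",
    6: "Partial DB Execution Failed/Success",
}


def describe_status(resp: dict) -> str:
    """Render statuscode and err from a response as one readable line."""
    code = resp.get("statuscode", 1)
    desc = STATUS_DESCRIPTIONS.get(code, "Unknown status code")
    err = resp.get("err") or ""
    return f"{code}: {desc}" + (f" ({err})" if err else "")


print(describe_status({"statuscode": 0, "err": ""}))  # 0: Success
```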

Summary#

In this tutorial, you learned how to query the databases over the REST interface

Tutorial 3: Using Data Store to exercise data export to local/remote storage#

Step 1#

Update the required configuration as shown below in config.json#

For local backup of JSON and blob data, the following configuration is provided in config.json:

    "storage": {
      "enabled": true,
      "type": ["local"],
      "azureAccountName": "",
      "azureAccountKey": "",
      "azureContainerName": "",
      "autoSyncIntervalInSec": 3600,
      "blobIdentifier":"img_handle"
    }

For cloud backup of JSON and blob data, the following configuration is provided in config.json. Update your Azure credentials accordingly:

    "storage": {
      "enabled": true,
      "type": ["cloud"],
      "azureAccountName": "<your-azure-account-name>",
      "azureAccountKey": "<your-azure-account-key>",
      "azureContainerName": "<your-storage-container-name>",
      "autoSyncIntervalInSec": 3600,
      "blobIdentifier":"img_handle"
    }

For both local and cloud backup of JSON and blob data, the following configuration is provided in config.json. Update your Azure credentials accordingly:

    "storage": {
      "enabled": true,
      "type": ["local","cloud"],
      "azureAccountName": "<your-azure-account-name>",
      "azureAccountKey": "<your-azure-account-key>",
      "azureContainerName": "<your-storage-container-name>",
      "autoSyncIntervalInSec": 3600,
      "blobIdentifier":"img_handle"
    }
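Before deploying, it can help to sanity-check the storage block, since the Azure fields are only meaningful when cloud is in type. A minimal Python sketch (validate_storage_config is an illustrative helper, not part of the service):

```python
def validate_storage_config(cfg: dict) -> list:
    """Return a list of problems found in a 'storage' config block.

    Azure fields are required only when 'cloud' appears in type.
    """
    problems = []
    if not cfg.get("enabled", False):
        return problems  # nothing to check when the feature is off
    types = cfg.get("type", [])
    if not types or any(t not in ("local", "cloud") for t in types):
        problems.append("type must be a non-empty subset of ['local', 'cloud']")
    if "cloud" in types:
        for key in ("azureAccountName", "azureAccountKey", "azureContainerName"):
            if not cfg.get(key):
                problems.append(f"{key} is required for cloud backup")
    if cfg.get("autoSyncIntervalInSec", 0) <= 0:
        problems.append("autoSyncIntervalInSec must be a positive number of seconds")
    return problems


# A cloud config with the Azure fields left empty fails validation:
cfg = {"enabled": True, "type": ["cloud"], "azureAccountName": "",
       "azureAccountKey": "", "azureContainerName": "",
       "autoSyncIntervalInSec": 3600, "blobIdentifier": "img_handle"}
for problem in validate_storage_config(cfg):
    print(problem)
```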

NOTE:

  1. If only cloud is enabled in the type of the storage config, then local data is cleared after the data is pushed to the cloud.

  2. Enabling the storage feature with type local as above backs up both JSON and blob data, which leads to high disk space consumption on the host system, and the consumption is higher with multiple streams. For example, 1 hour of data for a single stream with an image resolution of 1920 x 1200 consumes 3.1 GB of disk space.

The following are the details of the keys in the above config:

  • The enabled flag is false by default. Set it to true if a backup of the data is needed.

  • The type denotes the type of storage needed, that is, local, cloud, or both.

  • The autoSyncIntervalInSec denotes the time interval in seconds indicating how often the database is queried to back up the data.

  • The blobIdentifier denotes the identifier used for blob data. For example, for the Edge Video Analytics Microservice’s data, the blobIdentifier is img_handle.

  • The azureAccountName denotes the Azure account name.

  • The azureAccountKey denotes your Azure account secret key.

  • The azureContainerName denotes the Azure storage container name.

NOTE

  1. The backed-up data is stored in the /opt/intel/eii/data/ds_backup/ folder, which consists of two subfolders, one for JSON data and one for blob data.

  2. The JSON data is stored in JSON format with a filename of the form <topic_name>_<timestamp>.json, for example edge_video_analytics_results_2024-01-23-14_44_58.json.

  3. For blob data, the file extension is taken from the ‘encoding_type’ field of the Edge Video Analytics Microservice’s data. The filename is of the form <topic_name>_<blobIdentifier>.<encoding_type>, and the blob data is stored in the folder blob/<timestamp>, for example blob/2024-02-01-16:25:32/edge_video_analytics_results_ff875f1df.jpeg.
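The naming rules above can be sketched as path builders. This is a hedged illustration: the json/ and blob/ subfolder names and the timestamp formats are inferred from the examples in this note, not taken from the service source.

```python
from datetime import datetime

BACKUP_ROOT = "/opt/intel/eii/data/ds_backup"


def json_backup_path(topic: str, ts: datetime) -> str:
    """JSON backups: <topic_name>_<timestamp>.json under the json subfolder."""
    stamp = ts.strftime("%Y-%m-%d-%H_%M_%S")
    return f"{BACKUP_ROOT}/json/{topic}_{stamp}.json"


def blob_backup_path(topic: str, blob_id: str, encoding: str,
                     ts: datetime) -> str:
    """Blob backups: <topic_name>_<blobIdentifier>.<encoding_type>
    under blob/<timestamp>."""
    stamp = ts.strftime("%Y-%m-%d-%H:%M:%S")
    return f"{BACKUP_ROOT}/blob/{stamp}/{topic}_{blob_id}.{encoding}"


# Reproducing the example filenames from the note:
print(json_backup_path("edge_video_analytics_results",
                       datetime(2024, 1, 23, 14, 44, 58)))
```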

Step 2#

Deploy the Data Store service by referring to the Get Started Guide

Summary#

In this tutorial, you learned how to export data to local/remote storage by editing config.json

Learn More#

  • Understand the components, services, architecture, and data flow, in the Overview.