idf_ci.idf_gitlab package

Submodules

exception idf_ci.idf_gitlab.api.ArtifactError

Bases: RuntimeError

Base exception for artifact-related errors.

class idf_ci.idf_gitlab.api.ArtifactManager

Bases: object

Tool interface for managing artifacts in GitLab pipelines.

This class provides a unified interface for downloading and uploading artifacts, supporting both GitLab’s built-in storage and S3 storage. It handles:

  1. GitLab API operations (pipeline, merge request queries)

  2. S3 storage operations (artifact upload/download)

  3. Fallback to GitLab storage when S3 is not configured

Variables:
  • envs – GitLab environment variables

  • settings – CI settings

download_artifacts(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None, presigned_json: str | None = None, pipeline_id: str | None = None) None

Download artifacts from a pipeline.

This method downloads artifacts from either GitLab’s built-in storage or S3 storage, depending on the configuration and artifact type.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to download (debug, flash, metrics)

  • folder – Download artifacts into this folder

  • presigned_json – Path to the presigned.json file. If provided, this file is used to download the artifacts; otherwise S3 credentials are used

  • pipeline_id – GitLab pipeline ID to download presigned.json from. Cannot be used together with presigned_json
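
The parameter list above implies a small decision: presigned_json and pipeline_id are mutually exclusive, and S3 credentials are the fallback when neither is given. A minimal sketch of that selection logic, assuming hypothetical names (this is not the actual idf_ci implementation):

```python
# Hypothetical sketch of how the download source could be chosen from the
# documented parameters; select_download_source is an illustrative name.
from typing import Optional


def select_download_source(
    presigned_json: Optional[str] = None,
    pipeline_id: Optional[str] = None,
) -> str:
    """Return which mechanism would serve the download."""
    if presigned_json and pipeline_id:
        # The docstring states these two options cannot be combined.
        raise ValueError('presigned_json and pipeline_id are mutually exclusive')
    if presigned_json:
        return 'presigned-json-file'
    if pipeline_id:
        return 'presigned-json-from-pipeline'
    # Otherwise fall back to direct S3 credentials (or GitLab storage).
    return 's3-credentials'
```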

generate_presigned_json(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None, expire_in_days: int = 4) Dict[str, str]

Generate presigned URLs for artifacts in S3 storage.

This method generates presigned URLs for artifacts that would be uploaded to S3 storage. The URLs can be used to download the artifacts directly from S3.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to generate URLs for (debug, flash, metrics)

  • folder – Base folder to generate relative paths from

  • expire_in_days – Expiration time in days for the presigned URLs (default: 4 days)

Returns:

Dictionary mapping relative paths to presigned URLs

Raises:

S3Error – If S3 is not configured
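
The return value is a dictionary mapping paths relative to folder to presigned URLs. A sketch of that shape, where the URL format and the fake_presign helper are stand-ins, not the real S3 signing call:

```python
# Illustrative sketch of the return shape of generate_presigned_json; the
# URL scheme below is hypothetical, not what S3/MinIO actually emits.
import os
from typing import Dict, List


def fake_presign(key: str, expire_in_days: int) -> str:
    # Stand-in for a real presigned-GET signing call.
    return f'https://s3.example.com/{key}?expires={expire_in_days}d'


def presigned_map(folder: str, keys: List[str], expire_in_days: int = 4) -> Dict[str, str]:
    # Keys are mapped by their path relative to the base folder.
    return {
        os.path.relpath(k, folder): fake_presign(k, expire_in_days)
        for k in keys
    }
```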

property gl
property project

Lazily initialize and cache the GitLab project.

property s3_client: Minio | None

Get or create the S3 client.

upload_artifacts(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None) None

Upload artifacts to S3 storage.

This method uploads artifacts to S3 storage only. GitLab’s built-in storage is not supported. The commit SHA is required to identify where to store the artifacts.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to upload (debug, flash, metrics)

  • folder – Upload artifacts found under this folder

Raises:

S3Error – If S3 is not configured
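
The docstring notes that the commit SHA is required to identify where artifacts are stored. A plausible sketch of why: objects are keyed under a commit-derived prefix so a later download for the same commit can locate them. The exact key scheme below is an assumption for illustration only:

```python
# Hypothetical sketch of a commit-SHA-prefixed object-key layout; the real
# idf_ci key scheme is not documented here and may differ.
import os
from typing import List


def object_keys(commit_sha: str, folder: str, files: List[str]) -> List[str]:
    if not commit_sha:
        # Without a commit SHA there is no storage prefix to upload under.
        raise ValueError('commit SHA is required to build the storage prefix')
    return [f'{commit_sha}/{os.path.relpath(f, folder)}' for f in files]
```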

class idf_ci.idf_gitlab.api.ArtifactParams(commit_sha: str | None = None, branch: str | None = None, folder: str | None = None)

Bases: object

Common parameters for artifacts operations.

The commit SHA can be determined in the following order of precedence:

  1. Explicitly provided commit_sha parameter

  2. PIPELINE_COMMIT_SHA environment variable

  3. Latest commit from branch (where branch is determined by branch parameter or current git branch)

branch: str | None = None
commit_sha: str | None = None
folder: str | None = None
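
The three-step precedence above can be sketched as a small resolution function; latest_commit_of stands in for a real git lookup and is hypothetical:

```python
# Sketch of the documented commit-SHA precedence for ArtifactParams.
import os
from typing import Callable, Optional


def resolve_commit_sha(
    commit_sha: Optional[str],
    branch: Optional[str],
    latest_commit_of: Callable[[Optional[str]], str],
) -> str:
    if commit_sha:                      # 1. explicit parameter wins
        return commit_sha
    env_sha = os.environ.get('PIPELINE_COMMIT_SHA')
    if env_sha:                         # 2. then the environment variable
        return env_sha
    return latest_commit_of(branch)     # 3. finally the tip of the branch
```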
exception idf_ci.idf_gitlab.api.PresignedUrlError

Bases: ArtifactError

Exception raised for presigned URL-related errors.

exception idf_ci.idf_gitlab.api.S3Error

Bases: ArtifactError

Exception raised for S3-related errors.

idf_ci.idf_gitlab.api.execute_concurrent_tasks(tasks: List[Callable[[...], Any]], max_workers: int | None = None, task_name: str = 'executing task') List[Any]

Execute tasks concurrently using ThreadPoolExecutor.

Parameters:
  • tasks – List of callable tasks to execute

  • max_workers – Maximum number of worker threads

  • task_name – Error message prefix for logging

Returns:

List of successful task results; order is not guaranteed
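
A self-contained sketch of what this pattern looks like with ThreadPoolExecutor, under the same contract (failed tasks are logged and skipped, result order is not guaranteed); run_concurrently is an illustrative re-implementation, not the library function:

```python
# Equivalent sketch of concurrent task execution with error logging.
import logging
from concurrent.futures import ThreadPoolExecutor, as_completed
from typing import Any, Callable, List, Optional


def run_concurrently(
    tasks: List[Callable[[], Any]],
    max_workers: Optional[int] = None,
    task_name: str = 'executing task',
) -> List[Any]:
    results: List[Any] = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(t) for t in tasks]
        for fut in as_completed(futures):  # completion order, not submit order
            try:
                results.append(fut.result())
            except Exception:
                # A failed task is logged with the task_name prefix and skipped.
                logging.exception('Error while %s', task_name)
    return results
```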

This module generates the child pipeline for build jobs.

idf_ci.idf_gitlab.pipeline.build_child_pipeline(*, paths: List[str] | None = None, modified_files: List[str] | None = None, compare_manifest_sha_filepath: str | None = None, yaml_output: str | None = None) None

Generate build child pipeline.

idf_ci.idf_gitlab.pipeline.dump_apps_to_txt(apps: List[App], output_file: str) None

Dump a list of apps to a text file, one app per line.
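
A minimal sketch of the "one app per line" behavior, assuming each app's string form is what gets written (dump_to_txt is an illustrative name):

```python
# Hypothetical sketch: write each app on its own line.
from typing import List


def dump_to_txt(apps: List[object], output_file: str) -> None:
    with open(output_file, 'w') as fw:
        for app in apps:
            fw.write(f'{app}\n')
```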

idf_ci.idf_gitlab.pipeline.test_child_pipeline(yaml_output: str, *, cases: GroupedPytestCases | None = None) None

Generate the child pipeline for test jobs.

Assumes the artifacts matching ci_build_artifacts_filepatterns have already been downloaded.

Note

parallel:matrix does not support arrays as values, so all jobs are generated here explicitly

Example output:

.default_test_settings:
    script:
        - pytest ${nodes}

esp32 - generic:
    extends:
        - .default_test_settings
    tags:
        - esp32
        - generic
    variables:
        nodes: "nodeid1 nodeid2"
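
The expansion behind the example output above can be sketched as follows: because parallel:matrix cannot take an array value, each (target, env) group becomes its own job with its node IDs joined into a single nodes variable. The data shapes below are assumptions, not the real GroupedPytestCases structure:

```python
# Hypothetical sketch of expanding grouped test cases into explicit jobs.
from typing import Dict, List, Tuple


def expand_jobs(groups: Dict[Tuple[str, str], List[str]]) -> Dict[str, dict]:
    jobs: Dict[str, dict] = {}
    for (target, env), nodeids in groups.items():
        jobs[f'{target} - {env}'] = {
            'extends': ['.default_test_settings'],
            'tags': [target, env],
            # All node IDs for the group go into one space-separated variable.
            'variables': {'nodes': ' '.join(nodeids)},
        }
    return jobs
```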
idf_ci.idf_gitlab.scripts.pipeline_variables() Dict[str, str]

Extract pipeline variables from GitLab MR predefined variables.

Possibly set the following variables:

  • IDF_CI_IS_DEBUG_PIPELINE

    Set to ‘1’ if the pipeline is a debug pipeline; it will fail at the last stage.

  • IDF_CI_SELECT_ALL_PYTEST_CASES

    Select all pytest cases to run

  • IDF_CI_SELECT_BY_FILTER_EXPR

    Build and test only the test cases that match the filter expression (pytest -k)

  • PIPELINE_COMMIT_SHA

    The real commit SHA, instead of the merge result commit SHA

  • INCLUDE_NIGHTLY_RUN

    Run all test cases with or without nightly_run marker

  • NIGHTLY_RUN

    Run only test cases with the nightly_run marker. By default, test cases with the nightly_run marker are skipped.
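
A hedged sketch of the kind of decision this function makes: inspect GitLab-predefined MR variables and emit idf-ci control variables. The label names checked below are assumptions for illustration, not the actual triggers:

```python
# Hypothetical sketch of deriving pipeline variables from MR labels.
import os
from typing import Dict


def sketch_pipeline_variables() -> Dict[str, str]:
    variables: Dict[str, str] = {}
    labels = os.environ.get('CI_MERGE_REQUEST_LABELS', '').split(',')
    if 'debug_pipeline' in labels:        # hypothetical label name
        variables['IDF_CI_IS_DEBUG_PIPELINE'] = '1'
    if 'include_nightly_run' in labels:   # hypothetical label name
        variables['INCLUDE_NIGHTLY_RUN'] = '1'
    return variables
```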

Module contents

class idf_ci.idf_gitlab.ArtifactManager

Bases: object

Tool interface for managing artifacts in GitLab pipelines.

This class provides a unified interface for downloading and uploading artifacts, supporting both GitLab’s built-in storage and S3 storage. It handles:

  1. GitLab API operations (pipeline, merge request queries)

  2. S3 storage operations (artifact upload/download)

  3. Fallback to GitLab storage when S3 is not configured

Variables:
  • envs – GitLab environment variables

  • settings – CI settings

download_artifacts(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None, presigned_json: str | None = None, pipeline_id: str | None = None) None

Download artifacts from a pipeline.

This method downloads artifacts from either GitLab’s built-in storage or S3 storage, depending on the configuration and artifact type.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to download (debug, flash, metrics)

  • folder – Download artifacts into this folder

  • presigned_json – Path to the presigned.json file. If provided, this file is used to download the artifacts; otherwise S3 credentials are used

  • pipeline_id – GitLab pipeline ID to download presigned.json from. Cannot be used together with presigned_json

generate_presigned_json(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None, expire_in_days: int = 4) Dict[str, str]

Generate presigned URLs for artifacts in S3 storage.

This method generates presigned URLs for artifacts that would be uploaded to S3 storage. The URLs can be used to download the artifacts directly from S3.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to generate URLs for (debug, flash, metrics)

  • folder – Base folder to generate relative paths from

  • expire_in_days – Expiration time in days for the presigned URLs (default: 4 days)

Returns:

Dictionary mapping relative paths to presigned URLs

Raises:

S3Error – If S3 is not configured

property gl
property project

Lazily initialize and cache the GitLab project.

property s3_client: Minio | None

Get or create the S3 client.

upload_artifacts(*, commit_sha: str | None = None, branch: str | None = None, artifact_type: str | None = None, folder: str | None = None) None

Upload artifacts to S3 storage.

This method uploads artifacts to S3 storage only. GitLab’s built-in storage is not supported. The commit SHA is required to identify where to store the artifacts.

Parameters:
  • commit_sha – Optional commit SHA. If not provided, falls back to 1) the PIPELINE_COMMIT_SHA environment variable, then 2) the latest commit on branch

  • branch – Optional Git branch. If not provided, the current branch is used

  • artifact_type – Type of artifacts to upload (debug, flash, metrics)

  • folder – Upload artifacts found under this folder

Raises:

S3Error – If S3 is not configured

class idf_ci.idf_gitlab.ArtifactParams(commit_sha: str | None = None, branch: str | None = None, folder: str | None = None)

Bases: object

Common parameters for artifacts operations.

The commit SHA can be determined in the following order of precedence:

  1. Explicitly provided commit_sha parameter

  2. PIPELINE_COMMIT_SHA environment variable

  3. Latest commit from branch (where branch is determined by branch parameter or current git branch)

branch: str | None = None
commit_sha: str | None = None
folder: str | None = None
idf_ci.idf_gitlab.build_child_pipeline(*, paths: List[str] | None = None, modified_files: List[str] | None = None, compare_manifest_sha_filepath: str | None = None, yaml_output: str | None = None) None

Generate build child pipeline.

idf_ci.idf_gitlab.pipeline_variables() Dict[str, str]

Extract pipeline variables from GitLab MR predefined variables.

Possibly set the following variables:

  • IDF_CI_IS_DEBUG_PIPELINE

    Set to ‘1’ if the pipeline is a debug pipeline; it will fail at the last stage.

  • IDF_CI_SELECT_ALL_PYTEST_CASES

    Select all pytest cases to run

  • IDF_CI_SELECT_BY_FILTER_EXPR

    Build and test only the test cases that match the filter expression (pytest -k)

  • PIPELINE_COMMIT_SHA

    The real commit SHA, instead of the merge result commit SHA

  • INCLUDE_NIGHTLY_RUN

    Run all test cases with or without nightly_run marker

  • NIGHTLY_RUN

    Run only test cases with the nightly_run marker. By default, test cases with the nightly_run marker are skipped.

idf_ci.idf_gitlab.test_child_pipeline(yaml_output: str, *, cases: GroupedPytestCases | None = None) None

Generate the child pipeline for test jobs.

Assumes the artifacts matching ci_build_artifacts_filepatterns have already been downloaded.

Note

parallel:matrix does not support arrays as values, so all jobs are generated here explicitly

Example output:

.default_test_settings:
    script:
        - pytest ${nodes}

esp32 - generic:
    extends:
        - .default_test_settings
    tags:
        - esp32
        - generic
    variables:
        nodes: "nodeid1 nodeid2"