The goal of this setup is to allow an Azure DevOps pipeline running in the Commercial cloud to move files (e.g., build artifacts, documentation, or deployment packages) into a Storage Account in GCC High. Because these are two different clouds, the connection must be established carefully to remain secure, compliant, and tenant-scoped.
Although this guide is written for Commercial → GCC High, the same approach can also be used for file transfers between Commercial environments or even across tenants within Commercial Azure. By relying on federated credentials instead of secrets, the process ensures secure, governed transfers that honor existing Azure AD (Entra ID) boundaries.
To achieve this, we use an Azure User Assigned Managed Identity (UAMI) in the target environment, link it with Workload Identity Federation from Azure DevOps (Commercial), and grant it the minimum necessary roles. This way, files can flow between environments without storing long-lived secrets or keys.
Below is a list of placeholders for all of the Azure resources and identifiers we will need during the setup process. It can be a helpful reference when working through the instructions below.
From the GCC High Azure Portal (https://portal.azure.us/):

- <<gcc-subscription-id>>
- <<gcc-subscription-name>>
- <<gcc-resource-group-name>>
- <<gcc-storage-account-name>>
- <<gcc-storage-container-name>>
- <<gcc-managed-identity-name>>
- <<gcc-managed-identity-client-id>>
- <<gcc-federated-credential-name>>
- <<gcc-tenant-id>>
From Azure DevOps / the Commercial Azure Portal (https://portal.azure.com/):

- <<commercial-issuer-url-from-ADO>>
- <<commercial-subject-id-from-ADO>>
- <<commercial-service-connection-name>>
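If you prefer the command line, the subscription and tenant values can be pulled with the Azure CLI. This is a minimal sketch, assuming you are signed in to the GCC High tenant (note the cloud switch for the US Government endpoints):

```bash
# Point the Azure CLI at the US Government cloud and sign in to the GCC High tenant
az cloud set --name AzureUSGovernment
az login

# Print the subscription ID, subscription name, and tenant ID for the placeholder list
az account show --query "{subscriptionId:id, subscriptionName:name, tenantId:tenantId}" -o table
```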
In the GCC High Azure Portal, create a new Storage Account:

- Resource group: <<gcc-resource-group-name>>
- Storage account name: <<gcc-storage-account-name>>

Note the subscription ID (<<gcc-subscription-id>>) and resource group name for later use.
Create a container (<<gcc-storage-container-name>>) within the storage account. A CLI sketch for both steps follows.
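The same resources can be created from the Azure CLI. This is a sketch rather than the exact portal steps; the location and SKU shown here are assumptions to adjust for your environment:

```bash
# Create the storage account in the GCC High subscription
az storage account create \
  --name <<gcc-storage-account-name>> \
  --resource-group <<gcc-resource-group-name>> \
  --location usgovvirginia \
  --sku Standard_LRS

# Create the blob container the pipeline will copy files into
az storage container create \
  --name <<gcc-storage-container-name>> \
  --account-name <<gcc-storage-account-name>> \
  --auth-mode login
```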
In the GCC High Azure Portal, within the same Resource Group (<<gcc-resource-group-name>>), create a User Assigned Managed Identity (UAMI):

- Name: <<gcc-managed-identity-name>>

Copy the Client ID (<<gcc-managed-identity-client-id>>) and Object ID; the CLI sketch below shows one way to retrieve them.
Grant the identity the minimum roles it needs:

At the Subscription level (<<gcc-subscription-id>>): assign the Reader role to <<gcc-managed-identity-name>>, so the service connection can resolve the subscription.
At the Storage Account level (<<gcc-storage-account-name>> in <<gcc-resource-group-name>>): assign the Storage Blob Data Contributor role to <<gcc-managed-identity-name>>, so the pipeline can write blobs.

A CLI sketch for both assignments is shown below.
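These role assignments can also be scripted. A sketch, assuming the identity's object (principal) ID captured earlier (the variable name here is illustrative):

```bash
# Assumed: the UAMI's object/principal ID from `az identity show`
UAMI_OBJECT_ID="<object-id-from-earlier>"

# Reader at the subscription scope, so the service connection can see the subscription
az role assignment create \
  --assignee-object-id "$UAMI_OBJECT_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Reader" \
  --scope "/subscriptions/<<gcc-subscription-id>>"

# Storage Blob Data Contributor on the storage account, so the pipeline can upload blobs
az role assignment create \
  --assignee-object-id "$UAMI_OBJECT_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<<gcc-subscription-id>>/resourceGroups/<<gcc-resource-group-name>>/providers/Microsoft.Storage/storageAccounts/<<gcc-storage-account-name>>"
```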
In Azure DevOps (Commercial), go to Project Settings → Service connections.
Create a new Azure Resource Manager service connection.
Configure the wizard as follows:

- Credential type: Workload Identity federation (manual)
- Tenant ID: <<gcc-tenant-id>> (from the GCC High tenant)

From the ADO wizard (Commercial), copy the:

- Issuer URL: <<commercial-issuer-url-from-ADO>>
- Subject identifier: <<commercial-subject-id-from-ADO>>
In the GCC High Azure Portal → Managed Identity (<<gcc-managed-identity-name>> in <<gcc-resource-group-name>>), add a Federated credential:

- Issuer: <<commercial-issuer-url-from-ADO>>
- Subject identifier: <<commercial-subject-id-from-ADO>>
- Name: <<gcc-federated-credential-name>>
- Audience: api://AzureADTokenExchange (the default)
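The same federated credential can be created from the CLI. A sketch:

```bash
# Link the Commercial ADO issuer/subject to the GCC High managed identity
az identity federated-credential create \
  --name <<gcc-federated-credential-name>> \
  --identity-name <<gcc-managed-identity-name>> \
  --resource-group <<gcc-resource-group-name>> \
  --issuer "<<commercial-issuer-url-from-ADO>>" \
  --subject "<<commercial-subject-id-from-ADO>>" \
  --audiences "api://AzureADTokenExchange"
```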
Back in ADO’s Service Connection Wizard, enter:

- Subscription ID: <<gcc-subscription-id>>
- Subscription Name: <<gcc-subscription-name>>
- Service Principal Id: <<gcc-managed-identity-client-id>>
Important: Uncheck Grant access permissions to all pipelines.
Click Verify and Save.
In ADO (Commercial), go to Project Settings → Service connections → <<commercial-service-connection-name>>.
Open Security.
Under User permissions, explicitly grant access to the project and/or the specific pipeline that will use this connection.
For the build identity (e.g., Project Name Build Service (Organization)), set the Use permission to Allow, then Save.
Alternative: If you try to run the pipeline without this step, ADO will pause with “Needs approval.” You can approve it from the run page, but configuring Security first is cleaner.
In ADO (Commercial), create a new repository with a single file, README.md.
Create a new Pipeline pointing to this repo.
Use the windows-latest image for the agent pool.
Add an Azure File Copy task to the pipeline with the following settings:

- Source Path: README.md
- Azure Subscription: <<commercial-service-connection-name>>
- Destination: Azure Blob Storage
- Storage Account: <<gcc-storage-account-name>> (in <<gcc-resource-group-name>>)
- Container: <<gcc-storage-container-name>>
Here is an example pipeline. Make sure to replace the placeholders in the AzureFileCopy@6 task based on the names you used earlier:
```yaml
# Example pipeline to copy the README.md file
trigger:
- master

pool:
  vmImage: windows-latest

steps:
- script: echo Lets copy a file!
  displayName: 'Run a one-line script'

- task: AzureFileCopy@6
  inputs:
    SourcePath: 'README.md'
    azureSubscription: '<<commercial-service-connection-name>>'
    Destination: 'AzureBlob'
    storage: '<<gcc-storage-account-name>>'
    ContainerName: '<<gcc-storage-container-name>>'
```
Run the pipeline and verify that README.md appears in the GCC High Storage Blob container. You can also check from the CLI, as sketched below.
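A quick way to confirm the copy, assuming you are signed in to the GCC High tenant with data-plane access to the account:

```bash
# List blobs in the destination container; README.md should appear after the run
az storage blob list \
  --account-name <<gcc-storage-account-name>> \
  --container-name <<gcc-storage-container-name>> \
  --auth-mode login \
  --query "[].name" -o table
```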
Original Post: http://www.richardawilson.com/2025/08/bridging-clouds-secure-pipelines-from.html