How to publish and download artifacts from multiple jobs into one location in a pipeline?

gomfy

Original question (check update in next section)

I would like to download files that are produced by multiple jobs into one folder on Azure Pipelines. Here is a sketch of what I'd like to accomplish:

jobs:
  - job: job1
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.1

  - job: job2
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.2

  - job: check_prev_jobs
    dependsOn: "all other jobs"
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          mkdir -p $(Pipeline.Workspace)/previous_artifacts
      - task: DownloadPipelineArtifact@2
        inputs:
          source: current
          path: $(Pipeline.Workspace)/previous_artifacts

That is, the directory $(Pipeline.Workspace)/previous_artifacts should contain only file.1 and file.2 directly, without job1 and job2 subdirectories holding file.1 and file.2 respectively.
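In valid YAML, `dependsOn` takes the job names explicitly; the last job above would look something like this (a sketch with only the "all other jobs" placeholder replaced):

```yaml
- job: check_prev_jobs
  dependsOn: [job1, job2]   # name every job whose artifacts are needed
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
    - bash: mkdir -p $(Pipeline.Workspace)/previous_artifacts
    - task: DownloadPipelineArtifact@2
      inputs:
        source: current      # download all artifacts published by this run
        path: $(Pipeline.Workspace)/previous_artifacts
```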

Thanks!

Update

Using @Yujun Ding-MSFT's answer, I created the following azure-pipelines.yml file:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job1\n" > $JOB_NAME.time
          printf "Hash from job1\n" > $JOB_NAME.hash
          printf "Raw from job1\n" > $JOB_NAME.raw
          printf "Nonsense from job1\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1
      
    - job: Job_2
      displayName: job2
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job2\n" > $JOB_NAME.time
          printf "Hash from job2\n" > $JOB_NAME.hash
          printf "Raw from job2\n" > $JOB_NAME.raw
          printf "Nonsense from job2\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2

- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
          DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          path: $(DIR)
          patterns: '**/*.time'
      - bash: |
          ls -lR $DIR
          cd $DIR
        displayName: Check dir content

However, as shown on the screenshot below, I still get each .time file in a separate job-related directory:

[screenshot: each .time file appears under a separate job1/ or job2/ directory]

Unfortunately, it seems that what I would like may not be possible with Pipeline Artifacts, as explained in this Microsoft discussion. That would be a bummer, given that Build Artifacts are deprecated at this point.
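That said, the flat layout can still be produced with a small post-download step that moves files up one level and removes the emptied per-artifact directories. A sketch (the mkdir/echo lines stand in for what DownloadPipelineArtifact@2 would have left behind; paths are assumptions):

```shell
# Stand-in for the layout DownloadPipelineArtifact@2 produces:
# one subdirectory per artifact, each holding that job's files.
DOWNLOAD_DIR="/tmp/previous_artifacts"
mkdir -p "$DOWNLOAD_DIR/job1" "$DOWNLOAD_DIR/job2"
echo "Hello from job1" > "$DOWNLOAD_DIR/job1/file.1"
echo "Hello from job2" > "$DOWNLOAD_DIR/job2/file.2"

# Flatten: move every nested file to the top level, then delete
# the now-empty per-artifact directories.
find "$DOWNLOAD_DIR" -mindepth 2 -type f -exec mv -t "$DOWNLOAD_DIR" {} +
find "$DOWNLOAD_DIR" -mindepth 1 -type d -empty -delete

ls "$DOWNLOAD_DIR"   # lists file.1 and file.2 at the top level
```

Note that flattening will clobber files if two artifacts contain the same filename, so the per-job names used above (file.1, file.2) matter.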

Yujun Ding-MSFT

In your current situation, we recommend adding the `artifactName` keyword to your PublishPipelineArtifact task. I modified your script and tested it on my side. Hope this helps:

trigger: none

# pool:
#   vmImage: ubuntu-latest

jobs:
- job: Job_1
  displayName: job 1
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
    persistCredentials: True
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: 'printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1'
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact
    inputs:
      path: $(Pipeline.Workspace)/file.1
      artifactName: job1
  
- job: Job_2
  displayName: job2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script copy
    inputs:
      targetType: inline
      script: 'printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2'
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact copy
    inputs:
      path: $(Pipeline.Workspace)/file.2
      artifactName: job2
      
- job: Job_3
  displayName: Agent job
  dependsOn:
  - Job_1
  - Job_2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: 'mkdir -p $(Pipeline.Workspace)/previous_artifacts'
  - task: DownloadPipelineArtifact@2
    displayName: Download Pipeline Artifact
    inputs:
      path: '$(Pipeline.Workspace)/previous_artifacts'

Attached is my test result: [screenshots omitted]
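With distinct artifact names and no `artifactName` on the download task, DownloadPipelineArtifact@2 fetches every artifact of the run into its own subdirectory, so the downloaded layout should look roughly like this (structure assumed from the task's behavior, not shown in the original screenshots):

```
previous_artifacts/
├── job1/
│   └── file.1
└── job2/
    └── file.2
```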

Update: Because the jobs run in different sessions, we cannot simply copy files between them or merge both jobs' artifacts with a single publish. I modified your YAML file; this might help:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job1\n" > $JOB_NAME.time
          printf "Hash from job1\n" > $JOB_NAME.hash
          printf "Raw from job1\n" > $JOB_NAME.raw
          printf "Nonsense from job1\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1

    - job: Job_2
      displayName: job2
      dependsOn: 
      - Job_1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job2\n" > $JOB_NAME.time
          printf "Hash from job2\n" > $JOB_NAME.hash
          printf "Raw from job2\n" > $JOB_NAME.raw
          printf "Nonsense from job2\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job1'
          targetPath: '$(DIR)'
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2
 
- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
          DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job2'
          itemPattern: '**/*.time'
          targetPath: '$(DIR)'
          
      - bash: |
          ls -lR $DIR
          cd $DIR
          cd $(System.ArtifactsDirectory)
        displayName: Check dir content

Attached is the build result: [screenshot omitted]
