Deploying nested artifacts to S3

alex

27 Aug, 2018 05:17 AM

Hey,

I have build scripts that output my artifacts to a path like builds/<version>/. Capturing these artifacts works fine with a wildcard path such as builds/*/*.nupkg, and deploying them to GitHub Releases also works perfectly, since the file structure for uploads there is flat.

However, I've run into an issue when also deploying these artifacts to S3:

# Deploy log example
Uploading artifact "builds/<version>/artifact.nupkg" to S3 bucket "bucket" as "builds/<version>/artifact.nupkg"...

I would like builds/*/artifact to be uploaded simply as artifact, i.e. at the root of the S3 bucket.

Is there a way to achieve this without having my build script place artifacts in the root directory?
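
For reference, a minimal appveyor.yml along these lines reproduces the behaviour (the bucket name, region and credential values below are placeholders, not my real settings):

# Simplified appveyor.yml
artifacts:
  - path: builds/*/*.nupkg

deploy:
  provider: S3
  access_key_id: <key>
  secret_access_key:
    secure: <encrypted_key>
  bucket: bucket
  region: us-east-1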

  1. Posted by Owen McDonnell (Support Staff) on 28 Aug, 2018 04:48 AM

    You could add an artifacts section to your appveyor.yml file (or do the equivalent in the UI, if that is what you use for configuration):

    artifacts:
    - path: <solution_name>/**/*.nupkg
    
    Then, having uploaded those artifacts in the build stage, you can access them from a before_deploy script via the $artifacts variable (described here) and package them like this:
    before_deploy:
      - ps: |
          # Zip every captured artifact into a single package at the project root
          foreach ($artifactName in $artifacts.keys) {
            7z a <your_package>.zip $artifacts[$artifactName].path
          }
    
    Whatever you choose to name the zip package, it will sit in the root of your project directory and can be deployed to the S3 bucket with the unzip option set to true, as sketched below.
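
    For example, a deploy section along these lines (the bucket, region and artifact name below are placeholders to adapt):

    deploy:
      provider: S3
      access_key_id: <key>
      secret_access_key:
        secure: <encrypted_key>
      bucket: bucket
      region: us-east-1
      artifact: <your_package>.zip
      unzip: true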
  2. Ilya Finkelshteyn closed this discussion on 28 Oct, 2018 09:00 PM.
