Hello,
I’m trying to set up Unity Cloud Build as my CI/CD pipeline.
I have successfully linked my external repository (GitLab) to Unity Cloud Build, triggering builds on commits to a specific branch.
After the build process is done, I would like to export the generated build to Amazon S3 storage. Can someone point out how I could do that?
I was thinking of using the post-export methods to upload to S3 via REST, or maybe setting up a webhook to do it, but I’m not sure how to go about either.
If someone can help me, thanks!
Any updates here?
I would like to know an answer about this too.
4 years later and still no answer for this?
This one might have something to work with:
“A Node App that downloads builds from Unity Cloud Build and uploads them to Steam”
https://github.com/alanlawrance/steam-deploy-public
Upping post
No news on this one? It would be nice to at least have the AWS CLI on the build workers, so we could upload to S3 from a post-build script.
Trying to hook our WebGL build up to auto-deploy to AWS S3.
In the Advanced Config there is an option to upload Addressables to a CDN, but no option to upload the WebGL build.
I’m going to try the following:
1. Post-build shell script: I couldn’t install the AWS CLI in Unity Cloud Build; is there another library? fastlane also doesn’t seem to be present, though I did use it for an Android build.
2. Pre/post-export method: I have the AWS SDK in the code; maybe I can use it during the build?
3. Webhook to either a server or a Lambda: I have a server, but this would be a bit of work.
4. CI/CD?
Fastlane should be around. It is installed on all of our machines.
How can I use fastlane to trigger an upload to S3? I can’t find a way to run a custom action for that.
See the fastlane docs here:
Well, given that Unity Technologies couldn’t care less about this, I’ll explain here how I’ve done it, providing a sample repo to use as a reference.
As suggested by Benjamin Gooding, I used fastlane to do the uploads. This is because, apparently, the workers don’t have the AWS CLI installed.
Process breakdown
- Add fastlane to your Unity project. It’s how we’ll do all the syncing with S3
- Create a script to trigger the lanes
- Modify the build configuration in Unity Cloud for:
3.1. Setting the former script as the post-build script in the Script hooks section
3.2. Adding the necessary environment variables
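The trigger script from step 2 could look roughly like this. This is a sketch under assumptions, not the actual script from the repo: it assumes fastlane runs through Bundler and reuses the lane names from the reference project.

```shell
#!/bin/sh
# Sketch of a post-build trigger script; the real one lives in the repo.
set -eu

# Fail fast when a required environment variable is missing.
require_env() {
  name="$1"
  eval "val=\${$name:-}"
  if [ -z "$val" ]; then
    echo "Missing required environment variable: $name" >&2
    return 1
  fi
}

run_upload() {
  for v in AWS_REGION AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY WEBGL_BUCKET UNITY_PLAYER_PATH; do
    require_env "$v"
  done
  # Empty the bucket first, then upload the fresh build.
  bundle exec fastlane empty_s3_bucket
  bundle exec fastlane upload_dir_to_s3
}

# The real script would end with: run_upload
```

Failing early on missing variables makes misconfigured Cloud Build settings show up in the build log instead of as a half-finished upload.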
Reference project
This is the reference project, where you can find the code to trigger the WebGL sync.
The important parts are:
- fastlane/Fastfile: this has the two lanes that make the job:
- upload_dir_to_s3 lane
- empty_s3_bucket lane
- Scripts/upload_webgl_build_to_s3.sh: the script that calls both lanes; this needs to be added in the build configuration as the post-build script
From the process breakdown, that reference project covers points 1 and 2. You then need to do the third step yourself: customize the build configuration to call the script and add the environment variables. This reference project needs four of them:
- AWS_REGION
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- WEBGL_BUCKET
The UNITY_PLAYER_PATH variable used in the script is set automatically on the Unity Cloud Build worker.
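For orientation, the two lanes in the fastlane/Fastfile config can be sketched roughly like this. This is a hedged sketch using Ruby’s official aws-sdk-s3 gem, not the exact code from the repo, and it omits the content-type/content-encoding handling described in the caveats below (it also ignores list_objects_v2 pagination, which caps at 1000 keys per call):

```ruby
# Sketch of fastlane/Fastfile. Assumes the aws-sdk-s3 gem is in the Gemfile
# and AWS credentials come from the environment variables listed above.
require "aws-sdk-s3"
require "pathname"

lane :empty_s3_bucket do
  s3 = Aws::S3::Client.new(region: ENV.fetch("AWS_REGION"))
  bucket = ENV.fetch("WEBGL_BUCKET")
  objects = s3.list_objects_v2(bucket: bucket).contents.map { |o| { key: o.key } }
  s3.delete_objects(bucket: bucket, delete: { objects: objects }) unless objects.empty?
end

lane :upload_dir_to_s3 do
  s3 = Aws::S3::Client.new(region: ENV.fetch("AWS_REGION"))
  bucket = ENV.fetch("WEBGL_BUCKET")
  root = Pathname.new(ENV.fetch("UNITY_PLAYER_PATH"))
  # No bulk upload in the SDK: walk the build directory and upload one by one.
  Dir.glob(root.join("**", "*")).each do |path|
    next unless File.file?(path)
    key = Pathname.new(path).relative_path_from(root).to_s
    File.open(path, "rb") do |body|
      s3.put_object(bucket: bucket, key: key, body: body)
    end
  end
end
```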
If you want to test this locally, you can trigger the script by executing:
projectRoot~> AWS_REGION=your-preferred-region \
AWS_ACCESS_KEY_ID=your-access-key-id \
AWS_SECRET_ACCESS_KEY=your-secret-access-key \
WEBGL_BUCKET=the-bucket-name \
UNITY_PLAYER_PATH=/home/jane-doe/projects/webgl-continuous-delivery/build \
Scripts/upload_webgl_build_to_s3.sh
Little caveats I had to work through along the way
- don’t use fastlane’s built-in s3 action, it’s deprecated
- don’t use the fastlane plugin aws_s3, it didn’t work for me
- do use Ruby’s official aws-sdk-s3
- you need to create a lane for emptying the S3 bucket
- you need another lane for copying the new build
- you must iterate through the files in the directory and upload them sequentially, since there’s no bulk operation for uploading a whole directory (an alternative is to upload the .zip file containing everything and then use a Lambda for decompression)
- Ruby’s AWS SDK doesn’t properly set the content-type or content-encoding metadata when uploading a file (AWS handles this correctly if you upload the directory through the web console). Getting this right is crucial for the app to run properly, so I used Marcel to infer the MIME type, examined the file extensions to determine whether each file is gzipped or Brotli-compressed, and then explicitly set content-type and content-encoding on the put_object operation
- the build artifacts are under the folder given by UNITY_PLAYER_PATH (an environment variable provided by Unity Cloud Build); everything inside it is exactly what should be copied to the bucket
- don’t forget to set the environment variables to provide AWS configuration
- it’s recommended to run a CloudFront invalidation to evict cached resources every time you deploy
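The content-type/content-encoding caveat can be illustrated with a small Ruby sketch. Note the assumptions: the real code uses the Marcel gem for MIME detection, while here a hand-rolled extension map stands in to keep the example dependency-free, and `s3_headers_for` is a hypothetical helper name:

```ruby
# Infer content_type / content_encoding for a Unity WebGL build file.
# Compressed artifacts end in .gz (gzip) or .br (brotli); the "real"
# extension sits underneath, e.g. Build/game.wasm.br -> wasm + br.
MIME_BY_EXT = {
  ".html" => "text/html",
  ".js"   => "application/javascript",
  ".json" => "application/json",
  ".wasm" => "application/wasm",
  ".data" => "application/octet-stream",
}.freeze

def s3_headers_for(filename)
  ext = File.extname(filename)
  encoding = { ".gz" => "gzip", ".br" => "br" }[ext]
  inner = encoding ? File.extname(File.basename(filename, ext)) : ext
  {
    content_type: MIME_BY_EXT.fetch(inner, "application/octet-stream"),
    content_encoding: encoding, # nil for uncompressed files
  }
end
```

These values would then be passed explicitly to the put_object call alongside bucket, key, and body.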