
AWS for WebGL Hosting

Discussion in 'Web' started by ironfiftynine, Apr 21, 2015.

  1. ironfiftynine

    ironfiftynine

    Joined:
    Jan 22, 2013
    Posts:
    71
    Good day. I tried hosting my Unity WebGL prototype game on some free hosting sites, but I've encountered enough problems to consider using a paid hosting service instead. With this, I'm looking at Amazon Web Services. Has anyone tried hosting a WebGL build on AWS? If so, how were the setup process and the performance of the build? Do you have any better recommendations than AWS? My apologies for asking several questions in one post; I'm trying to be as specific as possible. Thanks in advance. :)
     
  2. gfoot

    gfoot

    Joined:
    Jan 5, 2011
    Posts:
    550
    It might be worth looking at basic shared VPS hosting; it will probably work out cheaper than AWS. The cheapest ones won't handle a lot of traffic but should be OK for testing and development.

    Basically though, anything that lets you install your own server software should not present any problems. The places you'll have trouble with are the ones that run specific server software on shared hosting and don't give individual clients access to the config files.
     
    ironfiftynine likes this.
  3. Andrejus

    Andrejus

    Joined:
    Mar 2, 2015
    Posts:
    10
  4. liortal

    liortal

    Joined:
    Oct 17, 2012
    Posts:
    3,562
    Our game is hosted on Amazon S3 (part of AWS) without any issues.

    The only thing is that it doesn't work with the .htaccess file, so we had to rename a few files and set their content encoding to gzip.
     
  5. ironfiftynine

    ironfiftynine

    Joined:
    Jan 22, 2013
    Posts:
    71
    Thanks for your inputs. Just to add, some of the issues that I encountered are:
    • Upload timeouts in FTP for large files (I tried 70 MB, which I think is still fairly small). I used FileZilla and SmartFTP, but both of them resulted in an error, so when I tested the build by accessing the actual external URL of the HTML page, the JavaScript threw an error since the file was not completely transferred. I've already searched for guides and forum posts, and based on what I've read so far, it's either the server or the router that I'm using. I think it's the former...
    • When I uploaded the file successfully (at least that's what I think happened, since FireFTP returned a success message after upload with no errors), the JavaScript error still occurs. My hunch is that the server needs to be refreshed for the updated file contents. Is my hypothesis correct?
    Will I still encounter the same problems when I use AWS, a VPS, or another paid hosting service?

    @Andrejus I was actually using Hostinger on the Philippine domain, but I can't successfully upload the 70 MB file due to a timeout when I used the mentioned FTP clients. Is there a configuration that I need to adjust? I can't find such a setting in cPanel. Thanks.

    @liortal I see, so there will still be some tweaking to do. That's what I'm trying to avoid, given that I'm not that acquainted with server setups, but if the service is that reliable, then I'm willing to tinker. Thanks!
     
  6. sokki

    sokki

    Joined:
    Jan 31, 2015
    Posts:
    166
    I don't know what the solution for that is, but I know that some servers need to be reconfigured (probably by adding some HTML file or such) in order to accept and stream WebGL games. Good luck :)
     
  7. polytropoi

    polytropoi

    Joined:
    Aug 16, 2006
    Posts:
    681
    @liortal, can you elaborate on what changes you make to enable hosting on s3?
     
  8. liortal

    liortal

    Joined:
    Oct 17, 2012
    Posts:
    3,562
    The only thing we did was enable compression support. S3 does not use the .htaccess file, so we had to do manually what that file does:
    • Rename the Compressed folder -> Release
    • Rename all *.*gz files to drop the gz extension, e.g. blabla.datagz renamed to blabla.data
    • Mark all files as content-encoding: gzip after uploading to S3.
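    The rename steps above can be sketched as a small script (a sketch only, assuming the Compressed/Release folder layout named in the steps; run it on the build directory before uploading):

    ```python
    import os
    import shutil

    def prepare_build_for_s3(build_dir):
        """Rename Compressed -> Release and drop the trailing 'gz' from each
        compressed filename (e.g. blabla.datagz -> blabla.data), mimicking
        what the shipped .htaccess rules would do on an Apache host.
        Assumes build_dir contains a Compressed (or already-renamed Release) folder."""
        compressed = os.path.join(build_dir, "Compressed")
        release = os.path.join(build_dir, "Release")
        if os.path.isdir(compressed):
            if os.path.isdir(release):
                shutil.rmtree(release)  # discard the uncompressed build
            os.rename(compressed, release)
        renamed = []
        for name in sorted(os.listdir(release)):
            if name.endswith("gz"):
                # strip 'gz' (blabla.datagz) or '.gz' (blabla.data.gz)
                target = name[:-3] if name.endswith(".gz") else name[:-2]
                os.rename(os.path.join(release, name), os.path.join(release, target))
                renamed.append(target)
        return renamed
    ```

    The content-encoding step still has to happen on S3 itself, after uploading.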
     
    ironfiftynine likes this.
  9. polytropoi

    polytropoi

    Joined:
    Aug 16, 2006
    Posts:
    681
    That's very helpful - thank you!
     
    liortal likes this.
  10. sathya

    sathya

    Joined:
    Jul 30, 2012
    Posts:
    297
    @liortal
    Followed all the steps except the last one.
    Could not make out how to set the content encoding.
    Please refer to the attached image. I modified the metadata property and added content-encoding. Is this correct?
    I could not get the game running with this setting.
     

    Attached Files:

    • gzip.PNG (19 KB)
  11. liortal

    liortal

    Joined:
    Oct 17, 2012
    Posts:
    3,562
    Yes, that's how we do it (maybe we don't set the first property, though).
     
  12. airsickness

    airsickness

    Joined:
    Sep 3, 2014
    Posts:
    4
    As @liortal said, we renamed the files to drop the gz because during the async download process it's looking for the .mem file, and if it's memgz it bails out.
     
  13. Charles-Van-Norman

    Charles-Van-Norman

    Joined:
    Aug 10, 2010
    Posts:
    86
    Hello everyone -- I would like to help out this thread by sharing what worked for me to deploy WebGL through AWS S3. You cannot do this by simply copying folders over if you want to deliver /Compressed/ content, because .htaccess handles rerouting the requests for /Release/ to /Compressed/, and S3 does not allow .htaccess. Here is what I did that works.

    Deploy /compressed/ webgl build to S3
    1. Build WebGL project from Unity
    2. Delete /Release/ folder
    3. Remove gz from the end of each filename in /Compressed/ folder
    4. Rename /Compressed/ to /Release/ folder.
    5. Upload your entire /WebGLgame/ folder to S3. Make sure you set permission to publicly accessible.
    6. Modify the headers of each file in /Release/ by adding "Content-Encoding":"gzip" through the S3 console.

    voila, you should be able to navigate to s3bucket.aws.com/WebGLgame/index.html and play your game using the compressed files.
     
    Sharlei and theANMATOR2b like this.
  14. nbalexis1

    nbalexis1

    Joined:
    Jul 21, 2011
    Posts:
    89
    Hi! Is it possible to do this with WebGL in Unity 5.3.2?
    We tried:
    1. Build WebGL project from Unity
    2. Remove gz from the end of each filename in /Release/ folder
    3. Remove gz from the end of each url in index.html
    4. Remove .htaccess file
    5. Upload the entire folder to S3, make public.
    6. Modify the content encoding of the files (appname .js, .mem, .data), which are originally jsgz, memgz, datagz, to application/x-gzip.

    But we have a "SyntaxError: illegal character" error.

    We need help, thanks!
     
  15. alexsuvorov

    alexsuvorov

    Unity Technologies

    Joined:
    Nov 15, 2015
    Posts:
    327
    Hello nbalexis1.

    Did you try all those steps separately or in combination?
    5.3.x builds should normally work without any modifications (you might need to remove the .htaccess on some hostings though). It might however not work properly after some modifications to the server configuration have been made. Could you undo all the server configuration changes, upload the original build to the server and provide the link? (here or privately)
     
  16. nbalexis1

    nbalexis1

    Joined:
    Jul 21, 2011
    Posts:
    89

    We finally made it work!

    It turns out that we were confused about Content-Type and Content-Encoding.
     
  17. nbalexis1

    nbalexis1

    Joined:
    Jul 21, 2011
    Posts:
    89
    Hi! We made it work, though I have not yet checked without modifications. Thank you all the same. We just set Content-Type to octet-stream/javascript and set Content-Encoding to gzip.
     
  18. JayaramanC

    JayaramanC

    Joined:
    Nov 7, 2014
    Posts:
    1
    Hello nbalexis1,

    I have tried Content-Type --> "octet-stream/javascript" & Content-Encoding --> "gzip"
    Made the entire folder public.
    Still doesn't work.

    I use Unity 5.3.x to create the WebGL build
    Can you share the full setup details of WebGL on Amazon S3...
     
  19. Ben-Sampson

    Ben-Sampson

    Joined:
    Nov 5, 2014
    Posts:
    10
    CheeseGames.net is a free hosting site for WebGL Unity games.

    Give it a try.
     
  20. JanusAnderson

    JanusAnderson

    Joined:
    Apr 30, 2013
    Posts:
    27
    Just FYI, I followed nbalexis's instructions successfully using a build from Unity 5.3.4 with no issues.

    1. Build WebGL project from Unity
    2. Remove gz from the end of each filename in /Release/ folder

    3. Remove gz from the end of each url in index.html
    note: I did not have to do this step, my index.html did not specify any 'gz' files.

    4. Remove .htaccess file
    5. Upload the entire folder to S3, make public.

    6. Modify the content encoding of the files (appname .js, .mem, .data), which are originally jsgz, memgz, datagz, to application/x-gzip.
    (This means adding the header 'content-encoding: gzip' to the 3 files up on S3; I did this with the free CloudBerry Explorer application.)

    7. Everything works just fine! I don't see any more gzip decompression delays when loading the game off of S3.
     
    polytropoi likes this.
  21. wendigo

    wendigo

    Joined:
    Nov 21, 2013
    Posts:
    7
    This does not work for me.

    I am getting this message in the browser console:

    Invoking error handler due to
    Uncaught SyntaxError: Invalid or unexpected token


    My files all have Content-Encoding gzip and permissions set to everyone.
    index.html
    Release/UnityLoader.js
    Release/WebGL.asm.js
    Release/WebGL.data
    Release/WebGL.js
    Release/WebGL.mem

    Thanks in advance.
     
    Nolex likes this.
  22. Nolex

    Nolex

    Joined:
    Dec 10, 2010
    Posts:
    116
    I have the same problem. Unity 5.4.
     
  23. Umresh

    Umresh

    Joined:
    Oct 14, 2013
    Posts:
    56
    Hi, I'm trying to host on S3; I had to make the folder public, and gzip doesn't work. How do I make gzip work on S3, and can I run without making the files public?

    [EDIT : I'm using unity 2018.3]
     
  24. JJJohan

    JJJohan

    Joined:
    Mar 18, 2016
    Posts:
    214
    It doesn't make sense to not make the Unity-specific S3 files public as the only alternative access method would be via S3 directly. You can still have your S3 bucket's default visibility be private though.

    Regarding gzip compression, assuming by "doesn't work" you mean you're getting the JavaScript message about it not being decompressed automatically by the browser, you simply need to apply the Content-Encoding header with the value 'gzip'. You can do this directly via the AWS management console or an S3 browser like CloudBerry. Only apply this metadata to the unityweb files.
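    That header change can also be scripted instead of clicked through the console; a rough sketch with boto3 (assuming configured AWS credentials — `unityweb_keys` and `mark_gzip` are hypothetical helper names, not part of any library):

    ```python
    def unityweb_keys(keys):
        """Filter a key listing down to the .unityweb build files that need
        the Content-Encoding metadata, as described above."""
        return [k for k in keys if k.endswith(".unityweb")]

    def mark_gzip(bucket, keys):
        """Re-copy each build file onto itself with Content-Encoding: gzip.
        Requires boto3 and AWS credentials; the bucket name is a placeholder."""
        import boto3  # imported here so unityweb_keys works without boto3 installed
        s3 = boto3.client("s3")
        for key in unityweb_keys(keys):
            s3.copy_object(
                Bucket=bucket,
                Key=key,
                CopySource={"Bucket": bucket, "Key": key},
                ContentEncoding="gzip",
                MetadataDirective="REPLACE",  # rewrite metadata rather than copy it
            )
    ```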

    My project is hosted via S3, the only thing that's somewhat difficult to get working regarding storing Unity WebGL builds is streaming instantiation which is a 2019.1 preview feature.

    Edit: Added important note I previously left out, sorry!
     
    Last edited: Feb 5, 2019
    Umresh likes this.
  25. Umresh

    Umresh

    Joined:
    Oct 14, 2013
    Posts:
    56
    I added Content-Encoding (Metadata) with the value 'gzip' directly via the AWS management console to the folder in the bucket and made it public. When opening the index.html it says it cannot reach the page.
     
  26. JJJohan

    JJJohan

    Joined:
    Mar 18, 2016
    Posts:
    214
    Ah yes, it's 'Metadata' when done through the management console, that's correct. If you're getting an error saying it cannot find the page, that's a bit odd. Are you able to share the URL to the index page? If not, that's fine, but I am simply trying to diagnose the issue.

    My own application is behind a secure API so unfortunately I can't use it as an example directly, but just to rule out any misconceptions I can run you through an example.

    If your S3 bucket name is my-bucket and you've uploaded the index.html and the corresponding Build folder from the WebGL output directly to the root of your bucket, it should be accessible (regardless of AWS region) from

    http://my-bucket.s3.amazonaws.com/index.html

    Keep in mind S3 isn't really intended as a web server so it won't automatically add the 'index.html' on the end for you - this would be something the Cloudfront service handles though. You should at least be seeing something - if you're getting a 403 just trying to open the page, you should verify all the files are definitely given public ACLs (just read, not write). I can't really suggest anything more than that as it doesn't sound like a Unity WebGL related issue at the moment.
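    The URL shape described above can be put together with a tiny helper (a sketch; `s3_object_url` is a made-up name, and the bucket/key values are placeholders):

    ```python
    def s3_object_url(bucket, key, region=None, scheme="http"):
        """Build the public S3 object URL. Region-specific hosts also work,
        e.g. bucket.s3.eu-west-2.amazonaws.com as used later in this thread."""
        if region is None:
            host = f"{bucket}.s3.amazonaws.com"
        else:
            host = f"{bucket}.s3.{region}.amazonaws.com"
        return f"{scheme}://{host}/{key}"
    ```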
     
  27. Will_Croxford

    Will_Croxford

    Joined:
    Mar 7, 2018
    Posts:
    5
    Hi, I got the first part of my game hosted now on AWS, but it is not reading the JSON file in the StreamingAssets folder. This JSON file is read properly with the same build on localhost on my computer. The game is at: https://s3.eu-west-2.amazonaws.com/hellsbellsthegame/index.html (you can't go very far in it, as I have a general issue with it being too big; I need to bundle assets next). This is in an S3 bucket, which I understand means a static website, so only client-side JavaScript can run; does this affect the StreamingAssets folder?

    In the Access Control List tab for the bucket, the Group "Everyone" has Read bucket permissions showing as "Yes" and the other permissions as "-". In the Permissions tab for this JSON file in particular, the Everyone group has Read object as "Yes" and the other permissions as "-". No actual ACL scripts written (tried this at first, thought I had copied a standard public one, but it clearly still blocked access, so I gave up on that one).

    The JSON file itself can be viewed at (spoiler alert): https://s3.eu-west-2.amazonaws.com/hellsbellsthegame/StreamingAssets/DungeonQuestions.json and, as I said, this JSON file does load when the same build is run on localhost.

    In case it's relevant: I didn't do anything with the .htaccess file (mentioned above; I don't really know what this is), and just built the WebGL where you get 4 .unityweb, 1 JSON and 1 JavaScript file in the Build folder. In Unity Build Settings I configured: gzip compression, 256 MB build, full stack trace, Linker Target: WebAssembly, Name Files As Hashes: yes, Data caching: yes, Debug symbols: yes.

    I'm very new to AWS and this is still my first game in Unity, though it dragged on for ages with WebGL deployment problems. Thanks for any tips!
     
  28. JJJohan

    JJJohan

    Joined:
    Mar 18, 2016
    Posts:
    214
    Will_Croxford likes this.
  29. Will_Croxford

    Will_Croxford

    Joined:
    Mar 7, 2018
    Posts:
    5
    Yes, thanks so much triple J @JJJohan, exactly that. I just changed the file name from .json to .JSON and it works. I would have taken ages to work that out, since I assumed URLs were not case-sensitive; of course, loading the file name and assigning the URL are two different things entirely.
     
  30. TheRoccoB

    TheRoccoB

    Joined:
    Jun 29, 2017
    Posts:
    54
    AWS can be a little painful.

    Personally I would recommend Firebase hosting or GitHub Pages. Firebase is really just a helpful wrapper around Google Cloud, and they have great hosting tutorials online. It's got a generous free plan.

    However if you want to do AWS, I wrote some articles! These could be a little out of date, but hopefully still relevant
    https://hackernoon.com/how-i-built-and-deployed-a-webgl-game-to-a-new-website-in-35m-15b2e8339c31
    https://hackernoon.com/secure-flappy-bird-https-just-got-insanely-easy-on-aws-6fe1d41ed12f

    If you want to do github pages, check out this video I made
    Github also has a mechanism for connecting a domain to your pages.

    FYI, I run the Unity WebGL site SIMMER.io with Firebase hosting, after giving up on AWS because of its difficulty of use.

    Finally, to improve the look of your hosted game, consider using https://assetstore.unity.com/packages/tools/gui/responsive-webgl-template-117308

    Hope this helps!
     
  31. ARTfunny

    ARTfunny

    Joined:
    Mar 15, 2019
    Posts:
    3
    SIMMER.io is the way to go. I recently signed up, and it is a piece of cake to use, thank god!
    I had been struggling with this issue and gave up on getting my game onto my site until I found SIMMER.io.
    It is straightforward and easy to use; I can't recommend it enough.
    Create your WebGL build
    Drag it into the Simmer uploader
    Add some titles & tags
    Then copy the link into your HTML plugin (I use Wix)
    It just works. There are free and paid subscriptions to choose from.
    I have no connection to Simmer other than being a happy customer.
    Thanks Rocco, good job!
    https://simmer.io/
     
    GallopingGames likes this.
  32. Hypertectonic

    Hypertectonic

    Joined:
    Dec 16, 2016
    Posts:
    75
    So since late last year I've been using S3 to host several WebGL projects, after reading this thread and other online information. It's fairly straightforward once you learn the process. The biggest complication is having to manually update the metadata on the build files on every upload. So I finally got some free time to explore Amazon's Lambda to see if it could be automated, and after a little Python learning and googling, I have managed to solve the issue.

    So here's an updated guide (working with Unity 2020 and 2021) on how to set up an Amazon S3 bucket for hosting and, most importantly, automate the metadata changes with Lambda.

    Host a static website with Amazon S3
    With Amazon S3 you can create a static website to host your game. Hosting a static website on Amazon S3 delivers a highly performant and scalable website at a fraction of the cost of a traditional web server.

    To host a static website on Amazon S3, configure an Amazon S3 bucket for website hosting and upload your website content. This is already well explained in Amazon's implementation guide.
    1. Create a bucket for static website
    2. Configure permissions to public
    Preparing and Building your Unity WebGL Project
    1. In the Project Settings, enable gzip or Brotli compression with no decompression fallback to minimize the build size. Native browser decompression is faster than the JavaScript decompression fallback. With Amazon S3 we can configure the server to serve the files with the necessary HTTP headers for browsers to use native decompression.
    2. Build the WebGL project.
    3. Upload the entire build folder to your S3 bucket. You can place it anywhere you want. I have a folder with the live deployment, a bunch of folders with old archived versions, and an upcoming version in testing.
    Manual Metadata Config
    (This will be automated later, but it's here for completeness' sake.)

    In the S3 console, navigate to the Build subfolder and modify the metadata of the files. You need to set both the Content-Encoding and the Content-Type. They are both System-Defined metadata. This list shows how to set up gzip. If you are using Brotli encoding, the file extensions will end in .br, and you should set Content-Encoding to br. The Content-Type will be the same as for gzip:

    *.data.gz
    Content-Type = application/octet-stream
    Content-Encoding = gzip

    *.wasm.gz
    Content-Type = application/wasm
    Content-Encoding = gzip

    *.framework.js.gz
    Content-Type = application/javascript
    Content-Encoding = gzip

    *.symbols.json.gz
    Content-Type = application/octet-stream
    Content-Encoding = gzip

    This process has to be repeated every time you upload a build, even if you are just overwriting the previous build's files. That gets tedious real fast, which is why we will automate it later.
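    The list above can be captured as a lookup helper for scripting uploads (a sketch only; the suffixes and types are exactly the ones listed here, and `s3_metadata` is a hypothetical name):

    ```python
    # Content-Type lookup for Unity 2020/2021 build files, per the table above.
    SUFFIX_TO_TYPE = {
        ".data": "application/octet-stream",
        ".wasm": "application/wasm",
        ".framework.js": "application/javascript",
        ".symbols.json": "application/octet-stream",
    }

    def s3_metadata(filename):
        """Return (Content-Type, Content-Encoding) for a compressed Unity build
        file, or None if the file needs no special metadata."""
        for comp_ext, encoding in ((".gz", "gzip"), (".br", "br")):
            if filename.endswith(comp_ext):
                stem = filename[: -len(comp_ext)]
                for suffix, ctype in SUFFIX_TO_TYPE.items():
                    if stem.endswith(suffix):
                        return ctype, encoding
        return None
    ```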

    Testing
    Everything should be ready now. Navigate to the folder where your index.html is, click it and copy the Object URL. This is the actual url that you can open to view your project, or embed it in an iframe in another website. It'll look something like https://your-bucket-name.s3.us-east-2.amazonaws.com/your-directory-structure/index.html

    Automating the Metadata with Lambda
    Lambda is another Amazon service that lets you execute code without having to set up a server. We will use it to automatically detect when you upload the unity build files and change the metadata accordingly.

    PART 1 - Creating a Role
    1. Go to IAM (Identity and Access Management) dashboard in your Amazon account.
    2. Go to roles and click create new role.
    3. Choose AWS Service, and select Lambda use case.
    4. Next, set the permission policy by searching for AmazonS3FullAccess and enabling its checkbox.
    5. Go to the last step, skipping tags, and give your role a good name and description. You don't want to accidentally use this role for something else in the future. Maybe unity-lambda-s3-fullaccess or something like it.
    PART 2 - Creating the Lambda function
    1. Go to Lambda in your Amazon account and create a function.
    2. Choose Author from Scratch.
    3. Give it a name. YOU WILL NOT BE ABLE TO CHANGE THIS LATER. So make it meaningful, like configure-unity-metadata
    4. Under Runtime choose Python 3.8 (you could use something else, but this guide is python)
    5. Under Permissions, change the default execution role to the existing one we just created. Finish the creation process.
    6. Go down to the code editor, double click the file in the left panel. Replace the code with this code.
    7. Save (Ctrl+S) and Deploy.
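    The code linked in step 6 isn't reproduced in the thread, so as a hedged sketch only: a minimal handler doing this job could parse the S3 event records and re-copy each matching file onto itself with the right metadata (the suffix table follows the manual config section above; all names here are my own, not the linked script's):

    ```python
    import urllib.parse

    # Suffix -> (Content-Type, Content-Encoding), matching the manual config table.
    BUILD_METADATA = {
        ".data.gz": ("application/octet-stream", "gzip"),
        ".wasm.gz": ("application/wasm", "gzip"),
        ".framework.js.gz": ("application/javascript", "gzip"),
        ".data.br": ("application/octet-stream", "br"),
        ".wasm.br": ("application/wasm", "br"),
        ".framework.js.br": ("application/javascript", "br"),
    }

    def pick_metadata(key):
        for suffix, meta in BUILD_METADATA.items():
            if key.endswith(suffix):
                return meta
        return None

    def lambda_handler(event, context):
        import boto3  # bundled with the Lambda Python runtime
        s3 = boto3.client("s3")
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            meta = pick_metadata(key)
            if meta is None:
                continue  # not a compressed Unity build file
            content_type, encoding = meta
            # Copying an object onto itself with REPLACE rewrites its metadata.
            s3.copy_object(
                Bucket=bucket, Key=key,
                CopySource={"Bucket": bucket, "Key": key},
                ContentType=content_type, ContentEncoding=encoding,
                MetadataDirective="REPLACE",
            )
    ```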
    PART 3 - Setting up the Trigger Event(s)
    1. Go back to your S3 bucket and go to properties. Scroll down to Event Notifications, and create a new one.
    2. Give it a meaningful name like uploaded-unity-build
    3. Under Suffix, type .br or .gz (depending on your chosen compression). Note that you can only set one. If for some reason you will be uploading builds with different compression to the same bucket, you can create another event and send it to the same function.
    4. For Event types, choose POST, PUT and Multipart upload (no Copy). Our function actually uses copy internally, which could cause issues by invoking itself recursively if it were triggered on copy.
    5. Destination is Lambda function, and on the dropdown below, choose your function.
    6. Save your changes.
    If everything went right, each time you upload a new build to your bucket, the lambda function should automatically configure the necessary metadata. :)

    This is something I have only just begun to do, so the code or process might not be ideal. If anyone has issues or suggestions, it would be nice to know.
     
    Last edited: May 5, 2021
  33. GallopingGames

    GallopingGames

    Joined:
    Mar 13, 2013
    Posts:
    26
    Thanks so much to Hypertectonic for sharing this and helping me finally get my WebGL builds working without errors on AWS! Many posts and blogs and YouTube vids across the web had outdated or partial fixes, but this had everything up to date in one place. I appreciate the amount of time writing these clear instructions must have taken.

    For others, please be aware that AWS metadata editing seemed a little flaky and made me think this solution was not viable at first. It seems that if you made ANY mistake/typo in changing the metadata of a file the first time, subsequent re-edits didn't seem to hold. I had to upload and do every step right the first time to get it to work.

    Coming from a long career with Shockwave 3D games, I was stunned to learn that the vast majority of shared hosting companies are unwilling to run the 'mime' Apache module to get the .htaccess working. I actually had to cancel paid hosting and move to AWS, as a virtual server was eye-watering money and hassle just to test some web games. I wonder if there really is no way of creating Unity compressed WebGL content that works on vanilla shared servers?

    Yesterday I had twenty tabs of pain and heart ache from frustrated, confused people across the web unaware of why their shared hosting wasn't working for Unity WebGL builds.

    Thanks again! (I have yet to try the automation; I am new to AWS and Lambda Python.)

    PS. I wonder if Amazon could auto-detect the .data.gz style Unity extensions upon upload and correctly assign an HTTP type, without the whole metadata automation thing?


     
  34. sama-van

    sama-van

    Joined:
    Jun 2, 2009
    Posts:
    1,734

    I followed every step of the Manual Metadata Config.
    However, I am still having issues while running the build...
    Any idea, by chance?... :)
    AWS_MetaData.png

    webGLerror.png

    upload_2021-8-30_19-16-25.png
     
    Last edited: Aug 30, 2021
    Wekthor likes this.
  35. GallopingGames

    GallopingGames

    Joined:
    Mar 13, 2013
    Posts:
    26
     
    sama-van likes this.
  36. sama-van

    sama-van

    Joined:
    Jun 2, 2009
    Posts:
    1,734
    Come on, that thing actually FIXED the problem =_=#

    Loading like a charm for both the Brotli and gzip builds! o_o! Yeah!
     
    GallopingGames likes this.
  37. GallopingGames

    GallopingGames

    Joined:
    Mar 13, 2013
    Posts:
    26
    Awesome news!

    Just as a heads up, I read that Brotli only works with HTTPS (secure) traffic. If you receive mixed or HTTP traffic, gzip works on both but has slightly less compression. That was a real gotcha for me at one point when I set up a custom domain on Route 53 (Amazon's domain manager).
     
    sama-van likes this.
  38. Hypertectonic

    Hypertectonic

    Joined:
    Dec 16, 2016
    Posts:
    75
    Yeah like GallopingGames said, you need to force the browser to redownload the files instead of using the cached files to avoid issues and make sure you are actually running your latest upload.
     
  39. ManuBera

    ManuBera

    Joined:
    Aug 19, 2021
    Posts:
    70
    My build (2020.3.18f1) runs fine when played in a Windows or Mac browser, but it doesn't run on ANY mobile device. I even tried building with WebGL 1.0. On my Android devices I get a load of error messages before loading has finished:

    RuntimeError: memory access out of bounds at https://{cloudfront domain}/Build/WebGL.wasm.br:wasm-function[{four digits}]:{address}

    The project is still pretty small, so I'm a bit flabbergasted on why it is already complaining about memory...

    Any ideas?

    Edit: By accident I found out that it works fine with 2020.3.17f1 on all devices without error or anything. I haven't tried other versions yet...
     
    Last edited: Oct 20, 2021
  40. Jainnikita279

    Jainnikita279

    Joined:
    Jul 29, 2021
    Posts:
    2
    Hi, I tried hosting a Unity WebGL build on cPanel but am getting errors -
    Build.framework.js.br:1 Uncaught SyntaxError: Invalid or unexpected token
    Build.loader.js:1 Uncaught ReferenceError: unityFramework is not defined
    at HTMLScriptElement.r.onload (Build.loader.js:1:3167)
     


  41. McSwan

    McSwan

    Joined:
    Nov 21, 2013
    Posts:
    129
    I made a quick AWS batch file to copy web builds into the AWS cloud, based off Hypertectonic's suggestions. It can be helpful if you are restricted to AWS and only allowed to use the AWS client. Also, rather than drag-dropping files into AWS, I think it's a touch faster to run the batch file.

    You'll need to run aws configure first, with your secret keys.

    Just add the code below into a .bat file, change the folder of where your build is, and change the S3 location.
    It could probably be done more efficiently (maybe with sync), as it uploads multiple files twice with different encodings, but it works well enough for me.

    Code (Batch):

    @echo Started: %date% %time%
    rem uncomment if you want to delete the web page first
    :: aws s3 rm s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021/
    aws s3 cp C:\proj\builds\World2021 s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021 --recursive --acl public-read
    aws s3 cp C:\proj\builds\World2021 s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021 --exclude "*" --include "*.data.gz" --content-encoding gzip --content-type application/octet-stream --acl public-read --recursive
    aws s3 cp C:\proj\builds\World2021 s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021 --exclude "*" --include "*.wasm.gz" --content-encoding gzip --content-type application/wasm --acl public-read --recursive
    aws s3 cp C:\proj\builds\World2021 s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021 --exclude "*" --include "*.framework.js.gz" --content-encoding gzip --content-type application/javascript --acl public-read --recursive
    aws s3 cp C:\proj\builds\World2021 s3://roames-world/89A6CF1789244A139A6FE346677ED0C5/WebGL/Nightly/Unity2021 --exclude "*" --include "*.json.gz" --content-encoding gzip --content-type application/octet-stream --acl public-read --recursive
    @echo Completed: %date% %time%
     
    Last edited: Mar 11, 2022
    Hypertectonic likes this.
  42. sama-van

    sama-van

    Joined:
    Jun 2, 2009
    Posts:
    1,734
    Some update on this.
    (Also a self-memo in case I search for it in the next 10 years :D)

    If you are using GoDaddy or another provider for your website domain, you may not be able to use Brotli compression.

    However, you can cross servers with your Amazon AWS...
    ... but you may get a CORS error in the web console while loading the Build files from your other domain (usually press F12 to see the console errors in your web browser).

    To allow CORS, here is an easy way:

    1. From your AmazonS3 console, select your bucket.
    2. Go to Permissions tab.
    3. Scroll down to the Cross-origin resource sharing (CORS)
    4. Press edit and paste the following json :
    Code (JSON):

    [
        {
            "AllowedHeaders": [
                "*"
            ],
            "AllowedMethods": [
                "GET"
            ],
            "AllowedOrigins": [
                "http://example.com"
            ],
            "ExposeHeaders": []
        }
    ]
    Replace http://example.com with your domain.

    5. Press Save Changes.

    You shouldn't have CORS issues anymore from your other domain while loading the *.br data from there.
     
    Last edited: Jan 13, 2023
  43. sama-van

    sama-van

    Joined:
    Jun 2, 2009
    Posts:
    1,734
    Made some research on how to upload a WebGL build to AWS with a single click and set the whole Brotli configuration on the fly ;)

    Took me a while, but here we are!!

    Things you may need to know first before adventuring into the script below:

    1. How to get an Access Key ID & Secret Access Key:
    - https://aws.amazon.com/getting-started/hands-on/backup-to-s3-cli/

    2. Run Terminal from your Finder directory:
    - https://www.maketecheasier.com/launch-terminal-current-folder-mac/

    3. Copy/paste and run the following line in Terminal:
    Code (Shell):

    python3 WebGlToAWS.py
    Tested on macOS only.

    Save the following as WebGlToAWS.py.
    Edit the WebGL build location and the AWS fields.

    Code (Python):
    import boto3
    import logging
    import os
    import time
    from botocore.exceptions import ClientError

    # Location of the WebGL build
    localDir = '/Volumes/YourDirectory/'

    # AWS
    awsBucket = 'YourBucket'
    awsDirectory = 'YourDirectoryInYourBucket'
    awsServiceName = 's3'
    awsRegionName = 'us-east-1'  # depends on your setting
    awsAccessKeyId = 'YourAccessKeyId'
    awsSecretAccessKey = 'YourSecretAccessKey'


    def Batch():
        print("Started " + time.strftime("%Y-%m-%d %H:%M"))

        # Local files
        folders = ListDirectories(localDir)
        files = ListFiles(localDir)

        # AWS
        s3 = Login()
        AWSBucketListLog(s3)
        RemoveAWSDirectory(s3, awsBucket, awsDirectory)
        MakeAWSDirectories(s3, awsBucket, localDir, awsDirectory, folders)
        UploadFilesToAWS(s3, awsBucket, localDir, awsDirectory, files)

        print("Completed " + time.strftime("%Y-%m-%d %H:%M"))

    def ListDirectories(root):
        return [x[0] for x in os.walk(root)]

    def ListFiles(dir_path):
        res = []
        for path, directories, files in os.walk(dir_path):
            for file in files:
                res.append(os.path.join(path, file))
        return res

    def Login():
        return boto3.resource(
            service_name=awsServiceName,
            region_name=awsRegionName,
            aws_access_key_id=awsAccessKeyId,
            aws_secret_access_key=awsSecretAccessKey
        )

    def AWSBucketListLog(s3):
        logBucket = "Buckets available:"
        for bucket in s3.buckets.all():
            logBucket += "\n- " + bucket.name
        print(logBucket)

    def RemoveAWSDirectory(s3, bucket, dir):
        # Reuse the client of the logged-in resource so the explicit
        # credentials above are actually used.
        objects_to_delete = s3.meta.client.list_objects(Bucket=bucket, Prefix=dir)

        found = objects_to_delete.get('Contents', [])
        if len(found) == 0:
            print("> No " + dir + " found in " + bucket + ".")
            return

        delete_keys = {'Objects': [{'Key': obj['Key']} for obj in found]}

        s3.meta.client.delete_objects(Bucket=bucket, Delete=delete_keys)
        print("> " + dir + " found and removed from AWS.")

    def MakeAWSDirectories(s3, bucket, localDir, onlineDir, paths):
        client = s3.meta.client

        for i in range(0, len(paths)):
            path = paths[i]
            dest = onlineDir + path.replace(localDir, "") + "/"
            print("\nAdding Path: " + str(i + 1) + "/" + str(len(paths)) + ":"
                  + "\n\t- Source: " + path + ";"
                  + "\n\t- Dest: " + dest + ";")
            MakeAWSDirectory(client, bucket, dest)

    def MakeAWSDirectory(client, bucket, path):
        client.put_object(Bucket=bucket, Body='', Key=path)

    def UploadFilesToAWS(s3, bucket, localDir, onlineDir, files):
        for i in range(0, len(files)):
            file = files[i]
            if ".ds_store" in file.lower():
                continue
            dest = onlineDir + file.replace(localDir, "")
            print("\nUploading file: " + str(i + 1) + "/" + str(len(files)) + ":"
                  + "\n\t- Source: " + file + ";"
                  + "\n\t- Dest: " + dest + ";")
            UploadFileToAWS(s3, bucket, file, dest)

    def UploadFileToAWS(s3, bucket, source, dest):
        client = s3.meta.client

        metadata = {}

        # Make it public for online access
        metadata['ACL'] = "public-read"

        # File encoding
        if ".br" in dest:
            metadata['ContentEncoding'] = "br"
        elif ".gz" in dest:
            metadata['ContentEncoding'] = "gzip"

        # Generic file types
        if ".css" in dest:
            metadata['ContentType'] = "text/css"
        elif ".html" in dest:
            metadata['ContentType'] = "text/html"

        # Image file types
        elif ".gif" in dest:
            metadata['ContentType'] = "image/gif"
        elif ".ico" in dest:
            metadata['ContentType'] = "image/x-icon"
        elif ".png" in dest:
            metadata['ContentType'] = "image/png"

        # Unity build file types
        elif ".data." in dest:
            metadata['ContentType'] = "application/octet-stream"
        elif ".framework.js" in dest:
            metadata['ContentType'] = "application/javascript"
        elif ".loader.js" in dest:
            metadata['ContentType'] = "application/javascript"
        elif ".wasm" in dest:
            metadata['ContentType'] = "application/wasm"

        print("\t- metadata: " + str(metadata) + ";")

        try:
            client.upload_file(
                source,
                bucket,
                dest,
                ExtraArgs=metadata
            )
        except ClientError as e:
            logging.error(e)
            return False
        return True

    Batch()
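    The long if/elif chain that picks upload metadata can also be expressed as a lookup table, which is easier to extend. A small sketch; the file-name fragments are assumptions based on the default Unity WebGL build names (Build.data.br, Build.framework.js.br, Build.loader.js, Build.wasm.br):

    ```python
    # Map file-name fragments to ContentType; first match wins, so the
    # more specific build fragments come before the generic ones.
    CONTENT_TYPES = [
        (".data.", "application/octet-stream"),
        (".framework.js", "application/javascript"),
        (".loader.js", "application/javascript"),
        (".wasm", "application/wasm"),
        (".css", "text/css"),
        (".html", "text/html"),
        (".png", "image/png"),
    ]

    # Map trailing compression suffixes to ContentEncoding.
    ENCODINGS = [(".br", "br"), (".gz", "gzip")]

    def build_metadata(dest):
        """Return the ExtraArgs dict for a given destination key."""
        metadata = {"ACL": "public-read"}
        for suffix, encoding in ENCODINGS:
            if dest.endswith(suffix):
                metadata["ContentEncoding"] = encoding
                break
        for fragment, content_type in CONTENT_TYPES:
            if fragment in dest:
                metadata["ContentType"] = content_type
                break
        return metadata
    ```

    For example, `build_metadata("games/Build.wasm.br")` yields both `ContentEncoding: br` and `ContentType: application/wasm`, while an uncompressed `Build.loader.js` gets a content type and no encoding.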
     
    Last edited: Jan 16, 2023
    Hypertectonic likes this.
  44. Hypertectonic

    Hypertectonic

    Joined:
    Dec 16, 2016
    Posts:
    75
    I actually made an in-editor tool to build and upload automatically. You don't need to install Python or the AWS SDK or use a terminal; it simply lives inside your project as a plugin and you just press a button, which makes it far easier to share. It also lets you do some build pre- and post-processing, and can add customizable parameters.

    Unfortunately I haven't had time to clean it up in order to share it.
     
  45. sama-van

    sama-van

    Joined:
    Jun 2, 2009
    Posts:
    1,734
    Nice!
    So... no share for the community?..... :(

    Still, doing it in Python has some nice perks, such as uploading only a tiny part of the folder tree ;)
    and you can use it for pretty much anything outside of Unity!
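    Uploading only part of the tree is essentially a one-line filter over the collected file list. A hedged sketch (list_files mirrors the walk logic from the script above; the suffixes are just an example):

    ```python
    import os

    def list_files(root):
        """Recursively collect every file path under root."""
        result = []
        for path, _dirs, files in os.walk(root):
            for name in files:
                result.append(os.path.join(path, name))
        return result

    def filter_files(files, suffixes):
        """Keep only files ending with one of the given suffixes,
        e.g. to upload just the Brotli build files and the page."""
        return [f for f in files if f.endswith(tuple(suffixes))]
    ```

    Usage would look like `filter_files(list_files(localDir), (".br", ".html"))` before handing the list to the upload step.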
     
  46. Bald1nh0

    Bald1nh0

    Joined:
    Feb 14, 2023
    Posts:
    1
    Has anyone had experience getting WebGL builds to work with AWS GameLift?