Assembly Instructions

We briefly introduced Assembly Instructions while going over Transloadit's concepts and in My first App. Now, let's dig a little deeper ⛏ and take a look at this example:

{
  "steps": {
    "imported": {
      "robot": "/http/import",
      "url": "https://transloadit.com/assets/images/face.jpg"
    },
    "resized": {
      "robot": "/image/resize",
      "use": "imported",
      "width": 100,
      "height": 100,
      "resize_strategy": "fillcrop"
    },
    "exported": {
      "robot": "/s3/store",
      "use": ["imported", "resized"],
      "bucket": "YOUR_S3_BUCKET",
      "key": "YOUR_S3_KEY",
      "secret": "YOUR_S3_SECRET",
      "path": "/my_images/${file.id}/${file.url_name}"
    }
  }
}

As indicated, Assembly Instructions consist of Steps. If one or more Steps fail, the whole Assembly ends and reports an error in its Assembly Status JSON. The example shows three Steps: imported, resized and exported. You can name your Steps anything you like, and giving them good names makes them easier to reference and makes it clearer what's going on. One exception is 🤖/upload/handle: it can only be used in a single Step, and that Step must be called :original.

Please also note that you must not name a Step :original unless it uses 🤖/upload/handle, as that could lead to unexpected job spawning behavior. If you do, your Assembly will fail with a validation error.

Notice how the imported Step is used as the input to the resized Step, and how both the imported and the resized Steps are used as the input for the exported Step. This way, we import an image from a web server, resize it, and export both the imported image and the resized version to S3.

Not all Steps require inputs. Our imported Step, for instance, creates the first file itself by downloading it from a URL, which is why it omits use. Other examples of Robots that don't require input files are 🤖/html/convert, which can create the first file by taking a screenshot of a website, and 🤖/upload/handle, which takes its files from your app's visitors instead of from another Step.
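
To illustrate, here is a minimal sketch of Assembly Instructions whose first Step has no use parameter: 🤖/html/convert creates the first file by screenshotting a web page, and the result is then exported. The URL, the credentials value, and the export path are placeholders to replace with your own.

{
  "steps": {
    "screenshotted": {
      "robot": "/html/convert",
      "url": "https://example.com"
    },
    "exported": {
      "robot": "/s3/store",
      "use": "screenshotted",
      "credentials": "YOUR_S3_CREDENTIALS",
      "path": "/screenshots/${file.id}.${file.ext}"
    }
  }
}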

Step parameters

As you can see, each Step is defined as an object with a handful of properties, or parameters. Most of them are Robot parameters, controlling, for instance, the width of an image after a resize; those are all covered in the respective Robot docs. There are, however, also four parameters that instruct the Assembly engine itself, defining which Robots are invoked and how they are interconnected:

  • use

    String / Array of Strings / Object ⋅ required

    Specifies which Step(s) to use as input.

    • You can pick any names for Steps except ":original" (reserved for user uploads handled by Transloadit)

    • You can provide several Steps as input with arrays:

      "use": [
        ":original",
        "encoded",
        "resized"
      ]
      

    💡 That’s likely all you need to know about use, but you can also take a look at the Advanced use cases.

  • robot

    String ⋅ required

    Specifies which Robot should process files passed to this Step.

    There are 72 Robots, each with their own parameters, such as width to control how an image is resized. The full list of parameters per Robot can be found in the Robot docs.

  • result

    Boolean ⋅ default: Automatic

    Controls whether the results of this Step should be present in the Assembly Status JSON.

    If set to true, the result of this Step will be present. If files from that Step weren't exported to your storage, their location will be set to a temporary URL.

    By default, we set this to true for leaf Steps and false for any intermediate Step.

    Explicitly setting it to false can be a useful tool in keeping the Assembly Status JSON small; see the sketch after this list for an example of controlling result per Step.

    Setting result: true on storage Steps does not add those Steps to the Assembly JSON, but only changes the returned URL values for the results of any transcoding Steps passed into those storage Steps. If you pipe a transcoding Step into multiple storage Steps (for example /s3/store) with each having result: true, then multiple results for this transcoding Step will be added to the Assembly JSON, giving you a quick overview of all file URLs for the various S3 buckets (in this example).

  • force_accept

    Boolean ⋅ default: false

    Force a Robot to accept a file type it would have ignored.

    By default, Robots ignore files they are not familiar with. 🤖/video/encode, for example, will happily ignore images passed to it and refuse to emit them.

    With the force_accept parameter set to true, you can force Robots to accept all files thrown at them. This will typically lead to errors and should only be used for debugging or combating edge cases.
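
To make the result parameter more concrete, here is a sketch that varies the earlier resize example: the intermediate resized Step sets result to true so its files show up in the Assembly Status JSON (with temporary URLs if they aren't exported), while the imported Step explicitly opts out with result set to false. The credentials value is a placeholder.

{
  "steps": {
    "imported": {
      "robot": "/http/import",
      "url": "https://transloadit.com/assets/images/face.jpg",
      "result": false
    },
    "resized": {
      "robot": "/image/resize",
      "use": "imported",
      "width": 100,
      "height": 100,
      "result": true
    },
    "exported": {
      "robot": "/s3/store",
      "use": "resized",
      "credentials": "YOUR_S3_CREDENTIALS"
    }
  }
}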

Order of execution

In order to speed up Assemblies, Steps will be executed as soon as their input Steps emit files. In other words, many things are processed in parallel. For example, let's say you want to encode an uploaded video and would also like to extract thumbnails from it:

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "encoded": {
      "use": ":original",
      "robot": "/video/encode",
      "preset": "ipad-high"
    },
    "thumbed": {
      "use": ":original",
      "robot": "/video/thumbs",
      "count": 4
    },
    "exported": {
      "use": ["encoded", "thumbed"],
      "robot": "/s3/store",
      "credentials": "YOUR_S3_CREDENTIALS"
    }
  }
}

Both the encoded and the thumbed Steps will be executed in parallel as soon as the first file upload is complete. The exported Step is fired for each of the files coming from encoded and thumbed. It is likely that the thumbnails will hit your S3 bucket before the video that was optimized for iPad, even though the thumbnail Step was defined later. So, the order in which Steps are defined does not really matter. The use parameter defines the input for each Step, and this ultimately dictates how the Steps are chained.

Filtering to make Steps conditional

Using 🤖/file/filter, you can execute Steps based on a file's properties. This allows you to create Assembly Instructions that cater to both video and audio uploads, reject files that are too small, only apply an effect to images that have transparent areas, and so on. These and more use cases are covered in the Robot's docs.
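
As a rough sketch, the following Instructions only pass image uploads on to a resize Step. The condition array follows the 🤖/file/filter accepts format, but treat the exact condition syntax and the error_on_decline setting as something to verify against the Robot's docs:

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "filtered": {
      "robot": "/file/filter",
      "use": ":original",
      "accepts": [["${file.mime}", "regex", "image"]],
      "error_on_decline": false
    },
    "resized": {
      "robot": "/image/resize",
      "use": "filtered",
      "width": 100,
      "height": 100
    }
  }
}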