Action Modules
The various pipeline actions that Slingshot can perform.
add_to_shotgrid
Adds a file as a Shotgrid version, or imports a .csv/.xlsx of metadata into Shotgrid versions.
actions:
  - module: add_to_shotgrid
    description: null
    file_types: # (1)!
      shot: # (2)!
        sg_link_entity: Shot # (3)!
        sg_link_entity_code_is: ${name} # (4)!
        sg_link_filters: # (5)!
          - field: sg_episode
            is_entity: Episode
            where:
              - code
              - is
              - "'${episode}'"
        sg_link_status_field: sg_status_list # (6)!
        sg_link_status: rev # (7)!
        exempt_link_statuses: [omt, hld, fin] # (8)!
        sg_task_status: rev # (9)!
        exempt_task_statuses: [] # (10)!
        sg_version_status: rev # (11)!
        sg_new_task_step_id: 8 # (12)!
        sg_new_task_extra_data: null # (13)!
        set_path_to_movie: local # (14)!
        sg_link_to_movie_field: sg_link_to_movie # (15)!
        set_path_to_frames: null # (16)!
        sg_link_to_frames_field: null # (17)!
        filename_inclusions: [] # (18)!
        filename_exclusions: [] # (19)!
        is_image_sequence: false # (20)!
        sg_version_extra_data: null # (21)!
        upload_media: true # (22)!
    import_csv: # (23)!
      spreadsheet_file_types: # (24)!
        - csv
      create_versions_if_missing: true # (25)!
      start_row: 5 # (26)!
      column_field_map: # (27)!
        "1": code
        "2": description # (28)!
        "3": sg_submission_notes
        "4": sg_format
1. Which file types to add to Shotgrid, and specific configurations for each one.
2. Each entry must match a configured file type.
3. Which Shotgrid entity to link the newly created version to. Common options are Shot, Asset, Sequence, Episode, etc.
4. The name of the above entity (Shot, for example) to link the version to. You can build this using template values captured from the file_type regex and/or package_regex.
5. Optional: additional filters used to narrow down which entity to link to. For example, you might have multiple Shots named 100 but want the one that is itself linked to Episode 101. Most of the time this can be an empty list: []
6. The linked entity (e.g. Shot) field to set the status on. Usually sg_status_list, but some shows or entities might have custom status fields.
7. Optional: if set, the status in the above field will be changed to the configured value. Uses the status short code, e.g. rev.
8. Optional: a list of exempt or 'sticky' statuses. If the entity's current status is in this list, the status will not be changed. For example, you might not want a shot with status omit to be updated to rev, even if a new version is received for it for some reason.
9. Optional: if set, the version's task status will be set to the configured value. Uses the status short code, e.g. rev.
10. Optional: a list of exempt or 'sticky' task statuses. If the task's current status is in this list, the status will not be changed.
11. Optional: if set, the status of the Version will be changed to the configured value. Uses the status short code, e.g. rev.
12. Optional: if a new task is created during ingest, it will be set to the pipeline step with this id.
13. Optional: a dictionary of extra data to add when a new Task is created, for example to set a task type or description field (see the sketch after this list).
14. Optional: either local or remote. If set, sg_path_to_movie will be set to the file's stored local or remote path, respectively.
15. Optional: if set to a valid File/Link field, a file link to the local path will be created.
16. Optional: either local or remote. If set, sg_path_to_frames will be set to the file's stored local or remote path, respectively.
17. Optional: if set to a valid File/Link field, a file link to the local path will be created.
18. Optional: a list of strings that must be in the filename for the version to ingest. For example, you might only want to add versions for files that contain _vfx in the filename, even if they'd still match the same file type without it. File_types are commonly defined strictly enough that you don't need this setting.
19. Optional: a list of strings that must not be in the filename for the version to ingest. For example, you might not want to add versions for files that contain _avid in the filename, even if they'd still match the same file type without it. File_types are commonly defined strictly enough that you don't need this setting.
20. Must be set to true for file_types that consist of a sequence of images, like raw exr/dpx frames. Since the add_to_shotgrid action runs on every single frame in a delivered batch of .exrs, this setting handles the sequence appropriately instead of creating hundreds of versions.
21. A dictionary of extra data to add to newly created Versions. For example, you can manually set a version type, vendor, etc. here (see the sketch after this list).
22. If true, the file will be uploaded as media to the Shotgrid version.
23. Optional: settings for the csv importer.
24. A list of file_types to import as vendor submission spreadsheets.
25. If true, this will create Shotgrid versions if they don't exist; if false, it will only update existing ones. Because we cannot guarantee that the .csv will process after all the version .movs, this should almost always be true.
26. The first row of the spreadsheet to read data from. This should be either the header row (if Slingshot is reading the headers) or the first row of version data (if a hard-coded column_field_map is provided).
27. An optional mapping of column numbers to the Shotgrid fields to import to. If this is null, Slingshot will attempt to map columns to existing Shotgrid fields based on the column headers in the spreadsheet, either by field name (e.g. Description) or code (e.g. sg_description). If it's not null, Slingshot will import the specified columns to the corresponding Shotgrid fields. The only required field is code (the Version name).
28. For example: this will import the value of cells in the second column (starting at row 5) onto the Version's description field.
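A hedged sketch of what the two extra-data mappings (notes 13 and 21) might look like when filled in rather than left as null. The field codes below (description, sg_version_type, sg_vendor) are illustrative guesses, not fields confirmed by this page; use whatever field codes actually exist in your Shotgrid schema.

    sg_new_task_extra_data: # extra fields set on a newly created Task
      description: created by Slingshot during ingest # assumed Task field
    sg_version_extra_data: # extra fields set on the new Version
      sg_version_type: wip # assumed Version field codes
      sg_vendor: Example Vendor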
add_to_shotgrid_delivery
Finds or creates a Shotgrid Delivery matching the package_name of the file, and links the current file's Shotgrid Version (if applicable).
This module is deprecated
In recent updates, we have both this module, which creates/updates a delivery for each individual file processed, and the shotgrid_delivery finisher module, which creates a delivery once at the end of package processing.
Generally, using the finisher module makes more sense and should be sufficient.
This action module is still around for legacy support and weird edge cases.
If you configure this action module, you should also configure the finisher module using the same values. Only the finisher module populates the package summary in the Delivery contents field.
actions:
  - module: add_to_shotgrid_delivery
    description: null
    sg_delivery_entity: Delivery # (1)!
    sg_name_field: title # (2)!
    sg_versions_field: version_sg_deliveries_versions # (3)!
    sg_summary_field: sg_contents # (4)!
    extra_data: # (5)!
      sg_delivery_type: IN
1. The Shotgrid entity to use for deliveries. Almost always Delivery.
2. The field to use for the delivery name. The Shotgrid default for Deliveries is title.
3. The field to link the ingested version(s) to. The Shotgrid default for Deliveries is version_sg_deliveries_versions, but on some older sites it's sg_versions.
4. If set, a text summary of the package will be put onto this field. The Shotgrid default for Deliveries is sg_contents. See Delivery Summaries below.
5. A mapping of any additional fields: values to add to the Delivery. These can include template variables (see the sketch after this list).
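A hedged example of extra_data that mixes a fixed value with a template variable, as note 5 allows. The sg_vendor_code field name is an assumption, and ${vendor} assumes your file_type or package_regex captures a vendor value:

    extra_data:
      sg_delivery_type: IN
      sg_vendor_code: ${vendor} # assumed field; filled from captured template values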
Delivery Summaries
If the sg_summary_field value is set, Slingshot will set that field to "Package ingest in progress" when a Delivery is first created.
When the delivery finisher runs, that placeholder is replaced with a text summary of the files in the package.
add_to_shotgrid_playlist
add_to_shotgrid_file
cleanup
Deletes the source file or package folder, specifically to remove successfully processed files from a watchfolder. Should generally go last in the pipeline.
- If true, the item's source file will be deleted (e.g., the file in the watchfolder).
- If true, the file at the item's local_path will be deleted. This was originally used to delete files that were only downloaded temporarily, but since Slingshot v4 the TEMP path can be used, which cleans up automatically. This setting should probably be deprecated in the future.
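A minimal sketch of where cleanup typically sits in a pipeline, per the note above that it should generally go last. The other modules' settings are elided, and the descriptions are only illustrative:

    actions:
      - module: download
        description: copy shots out of the watchfolder
        # download settings as documented in the download section below
      - module: add_to_shotgrid
        description: null
        # file_types configuration as documented above
      - module: cleanup
        description: delete the successfully processed source file
        # cleanup options are described in the notes above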
download
Downloads the file from the input, or copies the file from the stored local_path.
actions:
  - module: download
    description: download shots to IN # (5)!
    operation: download # (1)!
    overwrite_existing_files: true # (2)!
    set_local_path: true # (3)!
    paths: # (4)!
      shot: /IN/${vendor|lower}/to_vfx/$package_name/$parent_folders
      ref: TEMP
1. download, copy, symlink, or hardlink. See operation below.
2. If true, existing files will be silently overwritten. If false, an IngestError will be raised and the item will fail.
3. Whether or not to store the destination path as the item's local_path. local_path is used for subsequent copy operations or as a template variable.
4. A mapping of file_types to destination paths. Destination paths need to be valid, absolute paths on the disk. The magic path TEMP can be used to download a file to a temp folder which will be automatically cleaned up after processing. The magic type ALL can be used to download ALL files. You can specify individual file_types or use ALL, but not both. The destination paths can include template variables.
5. Descriptions are optional, but useful for our own reference later.
operation
- download: downloads the file via the Input method
- copy: creates a copy of the file
- symlink: creates a symbolic link. Symbolic links don't survive file moves or renames, so this option should only be used if a client specifically requests it.
- hardlink: creates a hard link. This should be used over copy if the filesystem supports it, as it won't duplicate the physical storage space required.
Info
You must download a file at least once before you can copy or link it, so that it exists locally and its local_path is set.
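As a hedged sketch of the ordering the Info note describes, the first action below downloads the file (setting its local_path), and a second action then copies it from that local path into place; hardlink or symlink could be substituted for copy where appropriate. The destination path is reused from the example above and is illustrative only:

    actions:
      - module: download
        description: initial download # sets local_path for later operations
        operation: download
        overwrite_existing_files: true
        set_local_path: true
        paths:
          ALL: TEMP
      - module: download
        description: copy from the temp download into the project tree
        operation: copy
        overwrite_existing_files: true
        set_local_path: true
        paths:
          shot: /IN/${vendor|lower}/to_vfx/$package_name/$parent_folders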
remote_copy
test
wait_for_lucidlink_upload
This module waits until the remote LucidLink file upload is actually complete.
Why?
When a user uploads a file to LucidLink, the file shows up on all shared computers immediately with a "syncing" icon, but can't actually be read until the user's upload is complete, which may take some time.
Because of this, when using a watchfolder input on a shared LucidLink folder, files are added to Slingshot immediately, but we need to wait for the remote upload to complete before we can actually access and ingest them.
Fortunately, LucidLink provides a local API we can query to see if a file upload is complete or not. This module queries that in a loop and waits until it reports the upload is complete.
actions:
  - module: wait_for_lucidlink_upload
    description: null
    base_url: http://localhost:7778/v1 # (1)!
    filespace: filespace.domain # (2)!
    local_mount_point: /Volumes/lucidlink # (3)!
    sleep_time: 3 # (4)!
1. The LucidLink client API base url to use. The default is http://localhost:7778/v1
2. The LucidLink filespace to access, e.g. filespace.domain
3. The mount point of the LucidLink share, e.g. /Volumes/drive. This will be stripped off the item's local path to compute the relative LucidLink path for the API call.
4. The number of seconds to wait in between API polls.
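A hedged sketch of how this module might be ordered when watching a shared LucidLink folder: it is placed before any action that reads the file, so downstream actions only run once the upload is confirmed complete. The download action shown after it is a placeholder, not a required companion:

    actions:
      - module: wait_for_lucidlink_upload
        description: block until the vendor upload has fully synced
        base_url: http://localhost:7778/v1
        filespace: filespace.domain
        local_mount_point: /Volumes/lucidlink
        sleep_time: 3
      - module: download
        description: copy shots out of the LucidLink watchfolder
        operation: download
        overwrite_existing_files: true
        set_local_path: true
        paths:
          ALL: TEMP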