Purging External Assets

Prior to 7.6.2, Zoom only supported purging of content held directly within the Zoom repository database. Users had to manually remove unwanted external assets from tier-2 archive storage locations such as S3, as well as from their individual TPM paths or shared P-SAN paths. Starting with 7.6.2, Zoom can also clear high-res / mid-res / image-sequence assets from archive locations as well as from P-SAN / TPM paths.

Controls for External Purge

  1. Log in to the Web Management Console.
  2. Open the Server Control Panel and click Hierarchical Archive Settings.
  3. Enable the checkbox labeled Purge External Assets on Zoom Purge, and save.
Enabling External Asset Cleanup

Purging assets from external archive storage such as S3 may not be instantaneous. If you try to ingest another asset of the same name soon after a purge, the ingest may therefore fail with an error stating that an asset of that name already exists. If you see this error, wait and retry a little later.
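To illustrate the wait-and-retry advice, here is a minimal sketch in Python. It assumes a hypothetical ingest_asset callable standing in for however you trigger the ingest (it is not part of any Zoom API), and that a name conflict surfaces as an exception whose message mentions the existing asset.

```python
import time

def ingest_with_retry(ingest_asset, asset_path, attempts=5, wait_seconds=60):
    """Retry an ingest that fails because the external purge has not finished.

    `ingest_asset` is a hypothetical callable supplied by the caller; it should
    raise RuntimeError with a "same name already exists"-style message when the
    server rejects the ingest because of a not-yet-purged asset.
    """
    for attempt in range(1, attempts + 1):
        try:
            return ingest_asset(asset_path)
        except RuntimeError as err:
            # Give up if this is a different error or we are out of attempts.
            if "already" not in str(err).lower() or attempt == attempts:
                raise
            # The purge on S3 may still be in progress; wait and retry.
            time.sleep(wait_seconds)
```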

If you want to track the status of the purge operation, you can do so either from the Web Management Console’s page tracking hub job submissions, or from the hub dashboard on the client machine that triggered the purge operation (or the server hub dashboard, if it was triggered from a non-C3 hub).

Safety Backups on Purge

It is also possible to build a safety net against accidental purge operations. Admins can configure a backup S3 bucket in the hub configuration settings. If such a backup bucket is defined, the purge operation moves the purged data to the backup bucket, vacating the space in the Zoom archive S3 bucket as well as in the TPM paths. Because the asset is not entirely removed, admins get an opportunity to inspect and retrieve data from the backup bucket if required.
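Conceptually, a move between S3 buckets is a copy followed by a delete. The sketch below shows that pattern with boto3 purely as an illustration of what happens to the purged data; Zoom performs the equivalent move internally, and the bucket names here are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

def move_to_backup(key, archive_bucket="zoom-archive", backup_bucket="zoom-purge-backup"):
    """Copy one object to the backup bucket, then remove it from the archive.

    Bucket names are placeholders; Zoom carries out the equivalent step itself
    when a backup bucket is configured for the hub preset.
    """
    s3.copy_object(
        Bucket=backup_bucket,
        Key=key,
        CopySource={"Bucket": archive_bucket, "Key": key},
    )
    s3.delete_object(Bucket=archive_bucket, Key=key)
```

The backup bucket itself is configured from the Web Management Console: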

  1. Log in to the Web Management Console.
  2. Open Hub Settings and click Hub Configuration Panel.
  3. Select the hub preset to which you would like to add the purge backup option.
  4. Add the name of the S3 bucket that will be used as backup storage for the purged data, and save (a quick way to verify the bucket beforehand is sketched below).
Purge backup options
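Before adding the bucket name in step 4, it can be useful to confirm that the bucket exists and is reachable with the credentials the hub will use. Below is a minimal check with boto3, assuming the hypothetical bucket name zoom-purge-backup.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
backup_bucket = "zoom-purge-backup"  # hypothetical backup bucket name

try:
    # HeadBucket succeeds only if the bucket exists and the caller can access it.
    s3.head_bucket(Bucket=backup_bucket)
    print(f"{backup_bucket} exists and is accessible")
except ClientError as err:
    # A 404 means the bucket does not exist; a 403 means access is denied.
    print(f"Cannot use {backup_bucket}: {err.response['Error']['Code']}")
```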

It is recommended that admins configure an automatic deletion policy for the backup bucket from the AWS console so that assets moved into this bucket are periodically cleared and do not accumulate storage costs.
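An automatic deletion policy on S3 is set via a bucket lifecycle configuration. The sketch below applies an expiration rule with boto3; the bucket name and the 30-day window are example values, and the same rule can be created from the AWS console.

```python
import boto3

s3 = boto3.client("s3")

# Expire everything in the backup bucket after 30 days (example retention).
s3.put_bucket_lifecycle_configuration(
    Bucket="zoom-purge-backup",  # hypothetical backup bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-purged-assets",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply the rule to all objects
                "Expiration": {"Days": 30},
            }
        ]
    },
)
```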

If assets of the same name are added and purged repeatedly, the backup bucket retains only the most recently purged data; all older data is overwritten.

Executing Purge, Tracking Purge Status

Users mapped to roles with the ADMINISTER permission are able to purge assets from Zoom. Purge can be invoked from the right-click menu in the Asset Browser. Once a purge is triggered, the assets are removed from the Asset Browser view immediately, but you can still track the status of the purge operation from a couple of places.

  • From the Web Management Console -> Hub Settings -> Hub Jobs Panel. This page shows the distribution of jobs across the various hubs (both server-hubs and client-hubs) of your deployment. Double-click the relevant hub to open its list of pending jobs. Here you will see an entry for the purged asset ID, with its job type set to Pending Purge.
  • From Z-menu -> Hub (Running) -> Dashboard on the client machine from which you triggered the purge operation, you can see the status of the purge job.
Purge being executed by the client-hub
  • In addition, all users who had earlier restored the hi-res media onto their machines will see the asset get cleaned up from their machines. This is reflected in their client-hub dashboards.
Purged asset being cleaned up from another client machine