Wednesday, June 5, 2024

Sitecore Content Hub DAM: A Fun Dive into Automated File Handling with Sitecore Content Hub and Azure Functions!


Welcome back to another thrilling installment in our journey of Sitecore Content Hub DAM (Digital Asset Management) automation! If you joined us in our last episode, we took a delicious bite into how to move assets into the Sitecore Content Hub DAM using Azure storage containers. Now we're here to sprinkle some extra seasoning on that recipe by automating the addition of the correct headers and updating those all-important external component links. Picture it like adding the perfect garnish to an already mouth-watering dish!

Setting the Table: Recap from Our Last Meal

Previously, we explored the basics of uploading assets built using React Vite.js to an Azure storage container. The main course involved understanding how these assets could be moved seamlessly into the Sitecore Content Hub DAM, ensuring everything was neatly organized and easily accessible. But, as any great chef knows, it's not just about having the ingredients; it's about how you use them!

Today’s Special: Full Automation with Azure Functions

In this follow-up, we're diving deeper into the kitchen to see how automation can take our asset management to a Michelin-star level. Here’s the secret sauce: we’re going to automate the addition of the correct headers to our files and update the external component links in the Sitecore Content Hub DAM. Let’s break down this recipe step-by-step.

Ingredients

  • Azure Storage Account: Where your .js.gz files will be uploaded.
  • Azure Functions: Our automation tool to handle events and update headers and links.
  • Sitecore Content Hub DAM: The final destination for our beautifully prepared assets.
  • React Vite.js: Our asset generator.

The Automation Recipe

Step 1: Uploading the File

When you upload your .js.gz file (let's say it's named tasty-script.js.gz), it lands in the Azure storage container like a fresh ingredient in your pantry.

Step 2: Trigger the Azure Function

Our Azure Function, H_HandleBlobCreatedOrUpdated, is like a sous-chef that springs into action as soon as the file hits the container. Here’s what happens next:

  1. Event Detection: The function listens for the blob created event.
  2. Setting the Table: It sets the appropriate headers (Content-Type, Content-Encoding, and Cache-Control) to ensure our file is served just right.
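The header-setting step above can be sketched as a small helper that a blob-triggered Azure Function might call. This is a minimal illustration only, not the actual H_HandleBlobCreatedOrUpdated code; the suffix map, the helper name, and the one-year cache lifetime are all assumptions.

```python
# Illustrative header selection for uploaded assets. The suffix map and
# cache lifetime are assumptions, not the function's actual values.

HEADERS_BY_SUFFIX = {
    ".js.gz": {"Content-Type": "application/javascript", "Content-Encoding": "gzip"},
    ".css.gz": {"Content-Type": "text/css", "Content-Encoding": "gzip"},
}

def headers_for(blob_name: str, max_age: int = 31536000) -> dict:
    """Return the HTTP headers a blob should be served with."""
    for suffix, headers in HEADERS_BY_SUFFIX.items():
        if blob_name.endswith(suffix):
            # Each gzipped asset is immutable per version, so cache aggressively.
            return {**headers, "Cache-Control": f"public, max-age={max_age}"}
    # Fall back to no caching for unrecognized file types.
    return {"Cache-Control": "no-cache"}
```

Because versioned files never change once uploaded, a long `max-age` is safe here; the cache-busting version parameter (next step) takes care of rollouts.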

Step 3: Update External Component Links

Once our headers are perfectly set, it’s time to update the links. Think of this as plating your dish and adding that final drizzle of sauce.

  1. Generate a Unique Version Identifier: This ensures that every time a new file version is uploaded, it doesn’t get mixed up with the old ones. We use a random identifier like 3e21a59a to make sure each serving is unique and fresh.
  2. Update Links in Sitecore Content Hub: The function then goes on a hunt through the Sitecore Content Hub, finding any references to the old file and updating them with the new version. It’s like making sure every guest at the restaurant gets the latest and greatest dish.
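In code, the two steps above might look like the sketch below: generate a short random identifier, then rewrite each stored link with it as a cache-busting query parameter. The function names and the `v` parameter are illustrative assumptions, not the post's actual implementation.

```python
import uuid
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def new_version_id() -> str:
    """Generate a short random identifier such as '3e21a59a'."""
    return uuid.uuid4().hex[:8]

def with_version(url: str, version: str) -> str:
    """Return `url` with its `v` query parameter set (or replaced)."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["v"] = version  # cache-busting parameter; each upload gets a fresh value
    return urlunsplit(parts._replace(query=urlencode(query)))
```

Updating only the query parameter means the stored link's path never changes, so every reference in Content Hub can be patched with a simple string rewrite.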

Here’s a Slice of the Code

Let’s take a peek at our master recipe: github.com


The Magic Behind the Scenes

So, what's the big deal about adding headers and updating links? Let's put it in restaurant terms. Imagine you're running a bustling kitchen. Each new dish (file) needs to be prepared just right before it reaches the dining room (your users). The headers are like the finishing touches: the perfect garnish, the right temperature. Without them, the dish might not be as appealing or could even be spoiled by the time it reaches the table.

Now, updating the external component link is like updating your menu. Every time you make a slight improvement to a recipe, you want to ensure your menu reflects the latest version. Otherwise, your customers might order something that no longer exists or get a previous version that doesn't showcase your culinary advancements.

Benefits of Full Automation Flow

1. Consistency and Accuracy:

  • Automation ensures that every file uploaded gets the exact headers and metadata it needs. No more manual mistakes or missed steps!

2. Time Savings:

  • Automating these processes means your team spends less time on repetitive tasks and more time on what really matters – creating great content.

3. Improved Performance:

  • Properly set headers can improve your site's performance by enabling efficient caching and ensuring that files are served correctly.

4. Seamless Updates:

  • Automatically updating links in the Sitecore Content Hub ensures that users always access the latest version of your files. This reduces the risk of broken links or outdated content.

The Complete Journey

Here’s a quick recap of the automation journey:

  1. Upload File: A new .js.gz file is built using React Vite.js and uploaded to an Azure storage container.
  2. Trigger Azure Function: The Azure Function detects the upload, sets the appropriate headers, and updates the metadata.
  3. Update Sitecore Content Hub: The function then updates the external component link in the Sitecore Content Hub to ensure the latest version is always available.

Why This Matters

In the digital world, ensuring your assets are always up-to-date and served correctly is crucial. This automation flow not only simplifies the process but also guarantees consistency and reliability. It's like having a top-notch kitchen where every dish is prepared perfectly and always updated with the latest recipe tweaks, ensuring your diners get the best experience every time.

What's Next?

Feeling inspired to take your content management to the next level? Try implementing this automation flow in your own projects and see the difference it makes. Stay tuned for more exciting tips and tricks as we continue to explore the delicious world of Sitecore Content Hub DAM automation!

Join the Conversation

Have you implemented similar automation in your workflows? Share your experiences and insights in the comments below. Let’s learn and grow together as we cook up the best digital experiences!

Tuesday, June 4, 2024

Sitecore Content Hub DAM - Moving External Components to Azure: A Recipe for Simplified Deployments


Welcome to the "Restaurant of Mistaken Orders" blog, where we serve up tech tips with a side of humor! Today, we're diving into a deliciously simple way to manage your external React components by moving them to Azure Storage. This will make your deployments as smooth as a perfectly cooked soufflé. So, grab your chef's hat, and let's get cooking!

Why Move to Azure?

Currently, we store our external component (React) code as portal components in Sitecore Content Hub DAM. Every time we have a release, we need to upload new versions to the portal assets and generate new public links. This process is as tedious as peeling a hundred potatoes by hand. To make deployments easier, we're moving our external component files to Azure. By having one SAS profile for all files in the container, we only need to upload new files to Azure during the release. A script then triggers the change, updating the public link in Content Hub by changing only a caching parameter rather than the complete URL. Voilà, deployment made easy!

Step-by-Step Guide to Azure Storage

1. Uploading Your .js.gz and .css.gz Files

First, let's upload our gzipped JavaScript and CSS files to Azure Storage.


For JavaScript Files:

  1. Upload the .js.gz File:
    • Upload your gzipped JavaScript file to the content-hub-external-components container in Azure Storage.
  2. Set Properties:
    • Open the file in Azure Storage Explorer or the Azure Portal.
    • Set the Content-Type to application/javascript.
    • Set the Content-Encoding to gzip.

For CSS Files:

  1. Upload the .css.gz File:
    • Upload your gzipped CSS file to the content-hub-external-components container in Azure Storage.
  2. Set Properties:
    • Open the file in Azure Storage Explorer or the Azure Portal.
    • Set the Content-Type to text/css.
    • Set the Content-Encoding to gzip.

This ensures that browsers correctly interpret the files as gzipped and decompress them accordingly.
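If you'd rather script these property changes than click through the Portal for every release, the Blob service's Set Blob Properties REST operation accepts the same values as `x-ms-*` request headers. The sketch below only builds that header dictionary from the file name; authentication (for example, a SAS token on the request URL) is left out, and the helper name and extension map are our own, not part of any SDK.

```python
# Map a gzipped asset's extension to the x-ms-* headers used by the
# Azure Blob "Set Blob Properties" REST operation (comp=properties).
# Helper name and extension map are illustrative, not from the post.

CONTENT_TYPES = {
    ".js.gz": "application/javascript",
    ".css.gz": "text/css",
}

def blob_property_headers(blob_name: str) -> dict:
    """Build the request headers that set a blob's served Content-Type/Encoding."""
    for suffix, content_type in CONTENT_TYPES.items():
        if blob_name.endswith(suffix):
            return {
                "x-ms-blob-content-type": content_type,
                "x-ms-blob-content-encoding": "gzip",
            }
    raise ValueError(f"no property mapping for {blob_name}")
```

A release script can loop over the uploaded files, call this helper, and issue one Set Blob Properties request per blob instead of editing each file by hand.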

Creating a Stored Access Policy

Now, let's create a Stored Access Policy to simplify access management.

  1. Log in to the Azure Portal:
    • Navigate to your Storage Account.
  2. Locate the Container:
    • Find the content-hub-external-components container.
  3. Create the Access Policy:
    • In the container's menu, select "Access policy" or "Shared access signature".
    • Click on "Add policy" to create a new Stored Access Policy.
    • Name the policy (e.g., external-components-policy).
    • Set the permissions to "Read".
    • Set the start and expiration dates/times (e.g., 4 years from now).
    • Click "Save" to create the policy.

Generating the Container SAS URL

  1. Generate the SAS URL:
    • Right-click on the container and select "Get Shared Access Signature".
    • In the dialog, select the Stored Access Policy you created.
    • Ensure "Allowed resource types" is set to "Container".
    • Copy the generated "Blob SAS URL".

The SAS URL will look something like this:


https://<storage-account-name>.blob.core.windows.net/<container-name>?si=<policy-name>&spr=https&sv=<version>&sr=c&sig=<signature>

Accessing Files Using the Container SAS URL

To access a specific file within the container, append the file name to the URL path. For example, to access script.js.gz:


https://<storage-account-name>.blob.core.windows.net/<container-name>/script.js.gz?si=<policy-name>&spr=https&sv=<version>&sr=c&sig=<signature>


This URL allows you to download or otherwise access the file, subject to the permissions defined in the Stored Access Policy (read-only in our example).
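Composing that per-file URL in a script is a one-liner worth getting right: the blob name must go on the URL path while the SAS query string stays untouched. A minimal sketch, with an illustrative helper name:

```python
from urllib.parse import urlsplit, urlunsplit

def file_sas_url(container_sas_url: str, blob_name: str) -> str:
    """Append a blob name to a container-level SAS URL, preserving the SAS query."""
    parts = urlsplit(container_sas_url)
    path = parts.path.rstrip("/") + "/" + blob_name
    return urlunsplit(parts._replace(path=path))
```

Naively concatenating the blob name onto the end of the SAS URL would put it after the signature, so splitting the URL into parts first avoids a subtle breakage.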


Why This Approach Rocks

By using a Stored Access Policy at the container level, you don't need to generate and manage individual SAS URLs for each file. The container-level SAS URL, combined with the file name, provides access to any file within the container. This simplifies access management and reduces the overhead of generating and maintaining SAS URLs for individual files, especially when you have a large number of files.


Wrapping Up

And there you have it! By moving your external component files to Azure and using a Stored Access Policy, you can streamline your deployment process and make it as easy as pie. No more generating new public links for each release—just upload your files, and let the script handle the rest. Bon appétit!


Stay tuned for more tech recipes from the "Restaurant of Mistaken Orders" blog. Until next time, happy coding!

Automate the process
https://www.restaurantofmistakenorders.com/2024/06/sitecore-content-hub-dam-fun-dive-into.html