AEM Archive

Serverless continuous delivery for AEM on AWS

As mentioned in previous posts, I have been working on solutions that involve Adobe Experience Manager (AEM) and Amazon Web Services (AWS) to deliver digital experiences to consumers for the past few years. This was partly through custom implementations for our clients, and sometimes through Fluent, our turnkey digital marketing platform that combines AEM, AWS and Razorfish Managed Services and DevOps. I highlighted some of our ideas and best practices around managing a platform like that in a white paper created together with the AWS team.

One of the rules we apply in our work is striving for 100% automation in our deployment process. As I mention in the white paper, we leverage Jenkins on all our work to enable this. Combined with the fact that both AEM and AWS provide API access to various services and functionality, we have been very successful in automating this process.

That said, configuring Jenkins and its individual jobs is often a somewhat manual task. In addition, Jenkins (and the builds it executes) runs on its own EC2 instance(s), which requires monitoring and incurs an ongoing cost. Especially when deployments become less frequent, it would be ideal to use an on-demand, serverless deployment infrastructure.

AWS released CodePipeline some time ago, and combined with the newly announced AWS CodeBuild, we can now look at replacing Jenkins with this set of AWS services, eliminating the need for a continuously running build server that is a potential single point of failure. And with a unified architecture leveraging AWS tools, we have a solution that integrates well with the overall AWS ecosystem, providing additional benefits around artifact management and deployment.

I’ll describe how AWS CodePipeline can be used in the next section.

Architecture

To describe the solution, a simple pipeline is presented, patterned after the WordPress example in the AWS documentation. It omits some critical additional steps, such as performance, security, regression, visual and functional tests, which I will touch upon in a subsequent post.

We used the we-retail sample application, a reference AEM application for a retail site developed by the Adobe team.

AWS CodePipeline Flow

As shown in the diagram, CodePipeline connects to GitHub for the latest source code in the first step. It also retrieves configuration information and CloudFormation templates from AWS S3. If any updates are made to GitHub or to the S3 objects, CodePipeline will be triggered to rerun.

The next step is to compile the application. AWS CodeBuild is a perfect fit for this task, and a basic profile can be added to perform this compilation.
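CodeBuild reads its build instructions from a buildspec file in the repository. A minimal sketch for an AEM project built with Maven might look like the following; the Maven goal and artifact path are assumptions and would need to match your project's module layout:

```yaml
# buildspec.yml - hypothetical CodeBuild profile for compiling an AEM application
version: 0.2
phases:
  build:
    commands:
      # Standard Maven build; AEM projects typically package content as zips
      - mvn clean package
artifacts:
  files:
    # Collect the generated AEM content package(s) as the build output
    - '**/target/*.zip'
```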

If compilation succeeds, the next step is to create the staging infrastructure. We leverage our CloudFormation templates to instantiate this environment and deploy the application into it. A manual approval step is then inserted into the flow to ensure manual review and approval. Unfortunately, at this point it is not yet possible to include the CloudFormation output (the staging URL) in the approval message.

Once approval is given, the staging infrastructure can be removed and the application deployed to production. As production is an already existing environment, we deploy the application using the standard API AEM provides for this purpose. To do so, we created a Lambda function that mimics the set of curl commands used to upload a package and install it.
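A minimal sketch of such a Lambda function is shown below. It targets AEM's standard Package Manager HTTP API (the same endpoint the curl commands hit); the event shape, host, credentials and package path are illustrative assumptions, not part of the original setup:

```python
# Hypothetical Lambda deploy step: install an already-uploaded AEM package
# via the Package Manager HTTP API, mimicking the equivalent curl commands.
import base64
import urllib.request


def basic_auth_header(user, password):
    """Build the Authorization header that curl derives from -u user:password."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


def install_url(host, package_path):
    """Package Manager install endpoint for a package already in the repository."""
    return f"{host}/crx/packmgr/service/.json{package_path}?cmd=install"


def handler(event, context):
    """Lambda entry point. Expected (illustrative) event shape:
    {"host": "http://prod-publisher:4503", "user": "...", "password": "...",
     "package": "/etc/packages/my_packages/we.retail.zip"}
    """
    url = install_url(event["host"], event["package"])
    req = urllib.request.Request(
        url,
        method="POST",
        headers=basic_auth_header(event["user"], event["password"]),
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}
```

A companion function for uploading the package (curl's `-F file=@pkg.zip` against `/crx/packmgr/service.jsp`) would follow the same pattern with a multipart request body.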

Conclusion

With AWS CodePipeline and CodeBuild, we can now create a fully automated serverless deployment platform.

Jenkins is still a very powerful solution, and it has a large number of plugins that help automate various tasks. However, many of these can fairly easily be migrated to AWS CodePipeline. AWS CodePipeline already provides built-in support for many different services, including BlazeMeter and HPE StormRunner. It also supports Jenkins integration itself, and for any other services, AWS Lambda can fill the gap. For example, in our original standard AEM on AWS setups, a large number of tasks (deployment, content transfer, etc.) were performed by Jenkins invoking curl scripts, which can easily be implemented using AWS Lambda as shown in the example above.

Obviously, the example above is simplified. Additional steps for testing, content deployment or migration, and code review will need to be incorporated. I will describe some of this in a subsequent post. That said, creating this simple unified architecture is very powerful.

In addition, by leveraging the CodePipeline and CodeBuild architecture, the deployment process itself can be moved into a CloudFormation template, making it possible to instantiate a system with multiple environments and automated deployments with little more than the click of a button, getting us closer to 100% automation.
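As an illustration, a CloudFormation template can declare the pipeline itself as a resource. The fragment below is a hypothetical sketch (stage names, parameters and the referenced role and bucket are assumptions) showing the shape of such a template:

```yaml
# Hypothetical fragment: the deployment pipeline defined as infrastructure,
# so environments and their delivery process can be stood up together.
Resources:
  DeployPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineRole.Arn        # IAM role defined elsewhere in the template
      ArtifactStore:
        Type: S3
        Location: !Ref ArtifactBucket          # S3 bucket defined elsewhere in the template
      Stages:
        - Name: Source
          Actions:
            - Name: GitHubSource
              ActionTypeId:
                Category: Source
                Owner: ThirdParty
                Provider: GitHub
                Version: '1'
              OutputArtifacts:
                - Name: SourceOutput
              Configuration:
                Owner: !Ref GitHubOwner        # template parameters
                Repo: !Ref GitHubRepo
                Branch: master
                OAuthToken: !Ref GitHubToken
        # Build, staging, approval and production stages would follow the
        # same pattern, wiring in CodeBuild, CloudFormation and Lambda actions.
```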

written by: Martin Jacobs (GVP, Technology)

AEM on AWS whitepaper

For a number of years now, I have been working on solutions that involve Adobe Experience Manager (AEM) and Amazon Web Services (AWS) to deliver digital experiences to consumers. This has been partly through custom implementations for our clients, and sometimes through Fluent, our turnkey digital marketing platform that combines AEM, AWS and Razorfish Managed Services and DevOps.

In the past year, I had the pleasure of working with AWS on creating a whitepaper that outlines some of our ideas and best practices around deploying on the AWS infrastructure.

The whitepaper delves into a few areas:

Why Use AEM on AWS?

Cloud-based solutions offer numerous advantages over on-premise infrastructure, particularly when it comes to flexibility.

For example, variations in traffic volume are a major challenge for content providers. After all, visitor levels can spike for special events such as Black Friday shopping, the Super Bowl, or other one-time occasions. AWS’s cloud-based flexible capacity enables you to scale workloads up or down as needed.

Marketers and businesses utilize AEM as the foundation of their digital marketing platforms. Running it on AWS facilitates easy integration with third-party solutions and makes it a complete platform. Blogs, social media, and other auxiliary channels are simple to add and to integrate. In particular, using AEM in conjunction with AWS allows you to combine the APIs from each into powerful custom configurations uniquely suited to your business needs. As a result, the tools for new features such as mobile delivery, analytics, and managing big data are at your fingertips.

A Look Under the Hood – Architecture, Features, and Services

In most cases, the AEM architecture involves three sets of services:

  1. Author – Used for the creation, management, and layout of the AEM experience.

  2. Publisher – Delivers the content experience to the intended audience. It includes the ability to personalize content and messaging to target audiences.

  3. Dispatcher – This is a caching and/or load-balancing tool that helps you realize a fast and performant experience for the end-user.

The whitepaper details some common configuration options for each service, and how to address them with the different storage options AEM provides. It also outlines some of the security considerations when deploying AEM in a given configuration.

AWS Tools and Management

Not only does AWS allow you to build the right solution, it provides additional capabilities to make your environment more agile and secure.

Auditing and Security

The paper also highlights a couple of capabilities to make your platform more secure. For example, AWS has audit tools such as AWS Trusted Advisor. This tool automatically inspects your environment and makes recommendations that will help you cut costs, boost performance, improve reliability, and enhance security. Another recommended tool is Amazon Inspector, which scans for vulnerabilities and deviations from best practices.

Automation through APIs

AWS provides API access to all its services, and AEM does the same. This allows for a very clean organization of the integration and deployment process. Used in conjunction with an open-source automation server like Jenkins, you can initiate manual, scheduled, or triggered deployments.

You can fully automate the deployment process. However, depending on which data storage options are used, separate arrangements may need to be made for backing up data. Policies and procedures for dealing with data loss and recovery also need to be considered.

Additional AWS Services

There are numerous other services and capabilities you can leverage when you use AEM in conjunction with AWS.

One great service is Amazon CloudWatch, which allows you to monitor a variety of performance metrics from a single place.

In addition, the constant stream of new AWS offerings, such as Amazon’s Elastic File System (which allows you to configure file storage for your servers), provides new options for your platform.

Takeaway

Using Adobe’s Experience Manager in tandem with AWS provides a powerful platform for easily delivering highly immersive and engaging digital experiences. It is a solution that answers the dilemma that many marketers and content providers face – how to deliver an immersive, relevant, and seamless experience for end users from a unified platform, and the whitepaper aims to provide more insight into how to achieve this.

To learn more about running AEM on AWS, you can download and read the full whitepaper here.

written by: Martin Jacobs (GVP, Technology)