Technical Thursday – Jenkins pipeline and AMArETTo

Last week I showed you how to integrate Git and Jenkins. In that post I did not provide the script part for the Azure-related operations. Today I would like to show it.

In Step 4.4.5 we configured a file which is located in our Git repository (pipeline/Jenkinsfile). This file is the “link” which can call an upload-to-azure script. I know you are asking: how?

First, some good news: AMArETTo supports these operations from v0.0.2.9. AMArETTo is available on Git and on PyPI. 🙂

This is the best position for you to create a cool automation solution at your company.

And now let’s see how we can implement the Azure functionality in our Jenkins pipeline.

Step 1: Install AMArETTo to our Jenkins server.

  1. This step is quite easy because we merely have to follow the installation steps for AMArETTo.
    # Install from bash
    sudo pip install amaretto
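
Before moving on, you can verify the install with a quick sanity check. This is only a sketch (the `is_installed` helper is mine, not part of AMArETTo); run it with the same Python interpreter Jenkins uses:

```python
import importlib.util

def is_installed(package):
    """Return True if the named package can be imported on this node."""
    return importlib.util.find_spec(package) is not None

# After `sudo pip install amaretto` this should print True on the Jenkins server.
print(is_installed("amaretto"))
```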


Step 2: Create a Python script which calls AMArETTo

In this step we will create a small Python script which executes the upload function from AMArETTo.

  1. Create a file in the pipeline directory under your GitLab project’s root.
  2. Write a short script which gets some external parameters
    # import amaretto
    import amaretto
    from amaretto import amarettostorage
    # import some important packages
    import sys
    import json
    # Get arguments
    fileVersion = str(sys.argv[1])
    storageaccountName = str(sys.argv[2])
    sasToken = str(sys.argv[3])
    filePath = str(sys.argv[4])
    modificationLimitMin = str(sys.argv[5])
    print("--- Upload ---")
    try:
        # Upload the files and parse the returned JSON result
        uploadFiles = amaretto.amarettostorage.uploadAllFiles(fileVersion=fileVersion, storageaccountName=storageaccountName, sasToken=sasToken, filePath=filePath, modificationLimitMin=modificationLimitMin)
        result = json.loads(uploadFiles)
        print("--- Upload files' result: '{0}' with following message: {1}".format(result["status"], result["result"]))
    except Exception:
        print("--- Something went wrong during uploading files.")
    print("-----------------------------")
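
As a side note, the result handling above can be pulled into a small helper so it is testable outside Jenkins. This is only a sketch: `summarize_upload` is my name, not AMArETTo's, and the "status" and "result" fields are assumed from what the script prints:

```python
import json

def summarize_upload(raw):
    """Build a log line from the JSON string returned by the upload call.

    Assumes the result carries "status" and "result" fields, as the
    script above does.
    """
    try:
        result = json.loads(raw)
        return "--- Upload files' result: '{0}' with following message: {1}".format(result["status"], result["result"])
    except (ValueError, KeyError):
        return "--- Something went wrong during uploading files."

print(summarize_upload('{"status": "OK", "result": "2 files uploaded"}'))
print(summarize_upload("not valid json"))
```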


  3. Create the Jenkinsfile in the pipeline directory under your GitLab project’s root.
  4. Write a valid and lightweight Jenkinsfile which calls our Python script with the right parameters.
    pipeline {
        agent any
        environment {
            FILE_VERSION = ""
            AZURE_SA_NAME = "thisismystorage"
            AZURE_SA_SAS = "?sv=..."
            FILE_PATH = "./upload/"
            // Used by the upload script below; set it to your own limit
            MODIFICATION_LIMIT_IN_MINUTES = "60"
        }
        stages {
            stage('Build') {
                steps {
                    withCredentials([azureServicePrincipal('c66gbz87-aabb-4096-8192-55d554565fff')]) {
                        sh '''
                            # Login to Azure with ServicePrincipal
                            az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_CLIENT_SECRET --tenant $AZURE_TENANT_ID
                            # Set default subscription
                            az account set --subscription $AZURE_SUBSCRIPTION_ID
                            # Execute upload to Azure
                            python pipeline/ "$FILE_VERSION" "$AZURE_SA_NAME" "$AZURE_SA_SAS" "$FILE_PATH" "$MODIFICATION_LIMIT_IN_MINUTES"
                            # Logout from Azure
                            az logout --verbose
                        '''
                    }
                }
            }
        }
    }


Let me explain the Jenkinsfile. As you can see there is an unfamiliar part above the bash code: withCredentials(). This comes from Jenkins, and it contains the Azure Service Principal related data for our Storage Account (this was configured in Step 2 of last week’s post). When you use this credential you get well-configured variables which contain the related values, such as AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID and AZURE_SUBSCRIPTION_ID. These are enough to log in to Azure.
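
To see what withCredentials() gives you, here is a small sanity-check sketch (the `missing_credentials` helper is hypothetical, not from Jenkins or AMArETTo) that verifies the four variables are present before the script tries to log in:

```python
import os

# Variables that withCredentials([azureServicePrincipal(...)]) injects into the environment
REQUIRED = ["AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET", "AZURE_TENANT_ID", "AZURE_SUBSCRIPTION_ID"]

def missing_credentials(env=None):
    """Return the names of the Service Principal variables that are not set."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED if not env.get(name)]

# Example with a fake environment that lacks the client secret:
fake_env = {"AZURE_CLIENT_ID": "abc", "AZURE_TENANT_ID": "tenant", "AZURE_SUBSCRIPTION_ID": "sub"}
print(missing_credentials(fake_env))  # ['AZURE_CLIENT_SECRET']
```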

Step 3: Push files to Git

  1. Finally we have to push these files to our Git repository.
  2. Then press the Build Now button in Jenkins.
  3. And check the result 🙂


I hope that, together with the previous post, you can improve your own pipeline and provide a cool solution to your management. 😉

