Using Azure Artifacts for your own NPM modules

Henk Jan van Wijk

Currently I am working on a customer project in which we build a lot of Azure (serverless) functions, mostly in JavaScript. Some time ago we foresaw that we would need shared functionality that was to be used in multiple functions. So wouldn't it be a good idea to create a shared library to put this functionality in?

Because the functionality is to be used in our JavaScript functions, the most logical step is to create an NPM (Node Package Manager) module. But we do not want this module to be publicly available, so the public NPM registry cannot be used.

In comes Azure Artifacts, which is part of Azure DevOps. As we are already using Azure DevOps for our repositories and CI/CD pipelines, using Artifacts should be an easy step. And, spoiler alert, it is.

Given the name, you might think Azure Artifacts is all about build artifacts, like Artifactory, but its main purpose is a multi-language package registry with support for .NET (dotnet/NuGet), Java (Maven/Gradle), Python (pip/twine) and JavaScript (npm). You also have the option to publish universal packages, but I have not used that yet.

In this article I will show you how to create your own NPM module, publish it to Azure Artifacts and use it in your own Node.js code. The module will be a wrapper interface for working with Azure Blob storage.

Prerequisites

To follow this tutorial you need the following:

  • An Azure account. If you do not have access to one, you can create a free account at https://azure.microsoft.com/en-us/free/
  • An Azure DevOps account. If you do not have access to one, you can create a free account at https://dev.azure.com.
  • Visual Studio Code (vscode).
  • Node.js and NPM. We use Node.js v12 LTS, as this is the default version for JavaScript Azure Functions v3.
  • In vscode add the following extensions:
    • Azure Account
    • Azure Functions
    • Optionally, install ESLint to check your coding style

Setup

Log in on https://dev.azure.com.

Select the project in which you want to create your artifacts. If this is your first login, you will need to create a project first.

You should see the project's home page, with the different Azure DevOps modules in the left sidebar and Artifacts at the bottom.

Go to Artifacts.

Press Create Feed.

Give the feed a name, keep the checkbox "Include packages from common public sources" checked and press Create.

We will return shortly to this page, but first we need to create a folder on your local system in which to put the source code. Then start vscode and open this folder.

Start the integrated terminal (press ctrl + `).

And type:

npm init

Answer the questions, and choose a lower version number than the proposed 1.0.0.

Important to note: the version you enter here will be the version of the artifact in the Artifacts feed when you first publish it. You cannot overwrite a published version, so every time you publish an update you must increase the version. Therefore start with, for example, 0.1.0.
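For example, using the package name and version that appear later in this article, the relevant fields of the generated package.json would look like:

```json
{
  "name": "amis-storage-lib-node",
  "version": "0.1.0",
  "main": "index.js",
  "license": "MIT"
}
```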

Next: go back to the Azure DevOps page and press the ‘Connect to feed’ button.

Choose npm as we are creating an npm module.

Press the ‘Get the tools’ button.

Because you should already have Node.js and npm installed (otherwise you could not have run npm init), run the command mentioned in step 2:

npm install -g vsts-npm-auth --registry https://registry.npmjs.com --always-auth false

Note: if you get an error that you are not permitted to perform this operation (on Windows), open a separate PowerShell window as administrator and re-run the command.

After running the command, follow the steps mentioned under Project setup. On Windows, create a new file in vscode named .npmrc and set the content to whatever is mentioned in the first step, for example:

registry=https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feedname>/npm/registry/
always-auth=true

Save the file. You can now authenticate with Azure Artifacts by running the following command in the integrated terminal:

vsts-npm-auth -config .npmrc

You will need to log in with your Microsoft credentials again. Now you are ready to get started.

Create a new file, index.js, and paste in the following:

// Copyright (c) Henk Jan van Wijk. All rights reserved.
// Licensed under the MIT License.

const { BlobServiceClient, StorageSharedKeyCredential } = require('@azure/storage-blob')

/**
 * Utility class for working with Blob container in Azure storage
 *
 * @class BlobContainer
 */
class BlobContainer {
  /**
   * Creates an instance of BlobContainer.
   * Use either shared key credentials: accountName + accountKey or SAS (accountName + sasToken)
   * as authentication method.
   * @param {object} params - Input parameters
   * @param {string} params.accountName - Storage account name
   * @param {string} [params.accountKey] - Storage account key
   * @param {string} [params.sasToken] - Shared access signatures (SAS) token
   * @memberof BlobContainer
   */
  constructor ({ accountName, accountKey = null, sasToken = null }) {
    this.accountName = accountName
    this.accountKey = accountKey
    this.sasToken = sasToken
    try {
      if (this.accountName && this.accountKey) {
        this.sharedKeyCredential = new StorageSharedKeyCredential(this.accountName, this.accountKey)
        this.client = new BlobServiceClient(`https://${this.accountName}.blob.core.windows.net`,
          this.sharedKeyCredential
        )
      } else if (this.accountName && this.sasToken) {
        this.client = new BlobServiceClient(`https://${this.accountName}.blob.core.windows.net?${this.sasToken}`)
      } else {
        throw new StorageError(401, 'Missing authentication')
      }
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }

  /**
   * Create a new blob container
   *
   * @param {object} params - Input parameters
   * @param {string} params.containerName - Name of the container
   * @returns {string} requestId when successful
   * @memberof BlobContainer
   */
  async createContainer ({ containerName }) {
    try {
      const containerClient = this.client.getContainerClient(containerName)
      const createContainerResponse = await containerClient.create()
      console.log(`Create container ${containerName} successfully`, createContainerResponse.requestId)
      return createContainerResponse.requestId
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }

  /**
   * Create a new blob in a blob container
   *
   * @param {object} params - Input parameters
   * @param {string} params.containerName - Name of the container
   * @param {string} params.blobName - Name of the blob to create
   * @param {string|Buffer} params.content - Content of the blob
   * @returns {string} requestId when successful
   * @memberof BlobContainer
   */
  async createBlob ({ containerName, blobName, content }) {
    try {
      const containerClient = this.client.getContainerClient(containerName)
      const blockBlobClient = containerClient.getBlockBlobClient(blobName)
      const uploadBlobResponse = await blockBlobClient.upload(content, Buffer.byteLength(content))
      console.log(`Upload block blob ${blobName} successfully`, uploadBlobResponse.requestId)
      return uploadBlobResponse.requestId
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }

  /**
   * List all blobs in a blob container
   *
   * @param {object} params - Input parameters
   * @param {string} params.containerName - Name of the container
   * @returns {array} - list of blobs
   * @memberof BlobContainer
   */
  async listBlobs ({ containerName }) {
    const bloblist = []
    try {
      const containerClient = this.client.getContainerClient(containerName)
      for await (const blob of containerClient.listBlobsFlat()) {
        bloblist.push(blob)
      }
      return bloblist
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }

  /**
   * Get the content of a blob
   *
   * @param {object} params - Input parameters
   * @param {string} params.containerName - Name of the container
   * @param {string} params.blobName - Name of the blob to read
   * @returns {string} - Content of the blob
   * @memberof BlobContainer
   */
  async getBlobContent ({ containerName, blobName }) {
    // Get blob content from position 0 to the end
    // In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
    try {
      const containerClient = this.client.getContainerClient(containerName)
      const blockBlobClient = containerClient.getBlockBlobClient(blobName)
      const downloadBlockBlobResponse = await blockBlobClient.download(0)
      const content = await streamToString(downloadBlockBlobResponse.readableStreamBody)
      return content
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }

  /**
   * Delete a blob with all its snapshots
   *
   * @param {object} params - Input parameters
   * @param {string} params.containerName - Name of the container
   * @param {string} params.blobName - Name of the blob to delete
   * @returns {boolean} - Returns true if successful; otherwise an error is thrown
   * @memberof BlobContainer
   */
  async deleteBlob ({ containerName, blobName }) {
    try {
      const containerClient = this.client.getContainerClient(containerName)
      const blockBlobClient = containerClient.getBlockBlobClient(blobName)
      await blockBlobClient.delete()
      return true
    } catch (error) {
      throw new StorageError(error.statusCode || 500, error.message)
    }
  }
}

class StorageError extends Error {
  constructor (code, message) {
    super(message)
    this.code = code
  }
}

// A helper method used to read a Node.js readable stream into string
async function streamToString (readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = []
    readableStream.on('data', (data) => {
      chunks.push(data.toString())
    })
    readableStream.on('end', () => {
      resolve(chunks.join(''))
    })
    readableStream.on('error', reject)
  })
}

module.exports = {
  BlobContainer
}

This is a simplified version of a more complete file, which can be found on GitHub. The file follows the StandardJS coding style; on GitHub you can also find the ESLint configuration for it.

Add the dependencies by running in the integrated terminal:

npm install @azure/storage-blob --save

This adds a line to your package.json and installs all dependencies locally in node_modules.

It also creates a package-lock.json file, which you should always commit to your repository.

Now publish the library to Azure Artifacts by running:

npm publish

You should get an output like:

PS C:\Projects\amis\amis-storage-lib-node> npm publish
npm notice
npm notice package: amis-storage-lib-node@0.1.0
npm notice === Tarball Contents ===
npm notice 5.7kB index.js
npm notice 303B package.json
npm notice === Tarball Details ===
npm notice name: amis-storage-lib-node
npm notice version: 0.1.0
npm notice package size: 1.6 kB
npm notice unpacked size: 6.0 kB
npm notice shasum: b6be74b45648ea7fd4f14b3e9c641a1d99d4e713
npm notice integrity: sha512-Vw1rJrFD3uK8G[…]G5Ka0Ufw/fCPw==
npm notice total files: 2
npm notice
amis-storage-lib-node@0.1.0

Now the package will be visible in Azure Artifacts. Have a look.

You will see that there are a lot more packages in the feed than just your own library. That is because all dependent node packages are also pulled in, with the versions from your package-lock.json file.

When, after publishing, you want to release a new version, you can use the npm version commands. For example, npm version patch -m "Bump version to %s" raises the patch level (e.g. from 0.1.0 to 0.1.1) and performs a git commit with the given message. Alternatively, you can increase the version in package.json and commit the file yourself.
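To illustrate only the version arithmetic (npm's real implementation also handles prerelease tags and the git commit and tag), a patch bump does no more than this:

```javascript
// Toy sketch of what `npm version patch` does to the version string;
// not npm's actual implementation.
function bumpPatch (version) {
  const [major, minor, patch] = version.split('.').map(Number)
  return `${major}.${minor}.${patch + 1}`
}

console.log(bumpPatch('0.1.0')) // 0.1.1
```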

Now we can use this library in our code. You could create an Azure function to do this, but for the sake of this tutorial we will write a simple local test script.

Create a new folder on your system, e.g. test-storage-lib, and open this folder in vscode.

Run npm init again and just keep the default values, as we are only interested in creating a package.json file.

Next create the .npmrc file as we have done earlier.

Next install our package:

npm install amis-storage-lib-node --save

One note about the naming of the package: npm install searches first in your own Azure Artifacts feed and then in the public npm registry, so the name of your library should ideally be globally unique. One way to achieve this is to follow the naming pattern @name/package-name, for example @amis/storage-lib-node. This pattern is called a scoped package; @name effectively acts as a namespace. When you sign up on the public npm registry you get a scope assigned, to make clear that a package is made and owned by you. But you can also use a scope for your own packages in Azure Artifacts to mark them as yours.
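With a scoped name, npm needs to know which registry serves that scope. In the .npmrc this could look as follows (the @amis scope and the placeholder URL are illustrative):

```
@amis:registry=https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feedname>/npm/registry/
always-auth=true
```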

Next create a new file index.js with the following content:

const { BlobContainer } = require('amis-storage-lib-node')

require('dotenv').config()

const config = {
  storageAccountName: process.env.STORAGE_ACCOUNT_NAME,
  storageAccountKey: process.env.STORAGE_ACCOUNT_KEY
}

const containerName = `testcontainer${new Date().getTime()}`
const container = new BlobContainer({
  accountName: config.storageAccountName,
  accountKey: config.storageAccountKey
})

const run = async (container) => {
  const resp1 = await container.createContainer({
    containerName: containerName
  })
  console.log(`Created a new blob container ${containerName}. RequestId = ${resp1}`)

  const resp2 = await container.createBlob({
    containerName: containerName,
    blobName: 'testblob.txt',
    content: 'Lorem ipsum dolor sit amet, consectetur adipiscing elit. Morbi eleifend.'
  })
  console.log(`Created blob: ${resp2}`)

  const listOfBlobs = await container.listBlobs({
    containerName: containerName
  })
  for (const blob of listOfBlobs) {
    console.log(`${blob.name} created on ${blob.properties.createdOn}`)
  }
}

run(container)
  .then(() => {
    console.log('Done.')
  })
  .catch((error) => {
    console.error(error)
  })

Next create a file .env with the following content:

STORAGE_ACCOUNT_NAME=<name storage account>
STORAGE_ACCOUNT_KEY=<primary key storage account>

Fill in your storage account name and primary key, which can be found in the Azure Portal on the storage account page, under Access keys.

If you have not created a storage account yet, create one first:

  1. Press the Create a resource icon
  2. Search for storage account
  3. Press the Create button
  4. Select your subscription and a resource group
  5. Enter a storage account name. Note that the name must be between 3 and 24 characters long and may only contain lowercase letters and numbers.
  6. Select a location nearby, e.g. West Europe.
  7. Select replication: Locally-redundant storage (LRS)
  8. You can leave all other options to the default values
  9. Press Review + Create
  10. Press Create

After you have entered the account information in the .env file, you can test your script by running:

node .\index.js

You should see something like the following in your terminal window:

Create container testcontainer1590788180711 successfully 2098c0e5-901e-010f-0901-36c036000000
Created a new blob container testcontainer1590788180711. RequestId = 2098c0e5-901e-010f-0901-36c036000000
Upload block blob testblob.txt successfully 2098c101-901e-010f-2201-36c036000000
Created blob: 2098c101-901e-010f-2201-36c036000000
testblob.txt created on Fri May 29 2020 23:36:22 GMT+0200 (GMT+02:00)
Done.

Check in Azure Portal with your Storage account if the mentioned blob container is created with a blob inside.

To delete the blob or container, you need to do this manually in the Azure Portal, or get the full package code from the mentioned GitHub page, which includes these methods.

I hope this tutorial has shown you how to create your own Node.js NPM packages for common functionality in your projects, and that it is not too difficult to get started.

In a follow-up article I will explain how you can add an Azure DevOps pipeline that publishes the module code to Azure Artifacts instead of doing it by hand.
