Use Azure Table Storage


Last updated 3 years ago


This How to is based on the Medium article by Andrew Varnon.

The implementation is done using a Smart Collection and a CRUD service that wraps the Azure Table Storage API.

The Table Storage Definition

You can use the new Azure Data Explorer to create and populate a Table Storage in your Azure Storage account.

In our example, we use a Customers table with the following fields:

  • Id: PartitionKey + RowKey

  • Timestamp (updated at)

  • Email as String

  • FirstName as String

  • LastName as String

Install Azure data-tables package

npm install @azure/data-tables --save
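The service wrapper shown below reads the Azure connection string from the `AZURE_STORAGE_CONNECTION_STRING` environment variable. Copy the value from your Storage account's Access keys page into your project's `.env` file (the value below is a placeholder, not a real key):

```
# .env (placeholder value — copy yours from Storage account > Access keys)
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net
```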

Smart Collection definition

const { collection } = require('forest-express-sequelize');

collection('customers', {
  fields: [{
    field: 'id',
    type: 'String',
    get: (customer) => `${customer.partitionKey}|${customer.rowKey}`,
  }, {
    field: 'partitionKey',
    type: 'String',
  }, {
    field: 'rowKey',
    type: 'String',
  }, {
    field: 'timestamp',
    type: 'Date',
  }, {
    field: 'Email',
    type: 'String',
  }, {
    field: 'LastName',
    type: 'String',
  }, {
    field: 'FirstName',
    type: 'String',
  }],
});
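Azure Table Storage identifies an entity by the pair (partitionKey, rowKey), while Forest Admin expects a single record id; the `id` smart field above joins the two with a `|` separator, and the routes shown later split it back. The round trip can be sketched as a pair of small helpers (hypothetical, not part of the agent; note this assumes neither key contains a `|`):

```javascript
// Build the Forest Admin record id from the Azure Table Storage keys,
// mirroring the `get` function of the `id` smart field above.
// Assumes neither key contains the '|' separator.
function toRecordId(partitionKey, rowKey) {
  return `${partitionKey}|${rowKey}`;
}

// Split a record id back into its Azure Table Storage keys,
// as the routes do with recordId.split('|').
function fromRecordId(recordId) {
  const [partitionKey, rowKey] = recordId.split('|');
  return { partitionKey, rowKey };
}
```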

The Azure Data Tables Service Wrapper

const { TableClient } = require('@azure/data-tables');

const getClient = (tableName) => {
  const client = TableClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING,
    tableName,
  );
  return client;
};

const azureTableStorageService = {
  deleteEntityAsync: async (tableName, partitionKey, rowKey) => {
    const client = getClient(tableName);
    await client.deleteEntity(partitionKey, rowKey);
  },

  getEntityAsync: async (tableName, partitionKey, rowKey) => {
    const client = getClient(tableName);
    return client.getEntity(partitionKey, rowKey);
  },

  listEntitiesAsync: async (tableName, options) => {
    const client = getClient(tableName);
    const azureResponse = client.listEntities();

    const iterator = azureResponse.byPage({ maxPageSize: options.pageSize });

    // Skip the pages before the requested one (the API has no offset).
    for (let i = 1; i < options.pageNumber; i++) await iterator.next();

    const entities = await iterator.next();
    const records = entities.value.filter((entity) => entity.etag);

    // Load an extra page to know whether a "Next page" should be offered.
    const entitiesNextPage = await iterator.next();
    let nbNextPage = 0;
    if (entitiesNextPage && entitiesNextPage.value) {
      nbNextPage = entitiesNextPage.value.filter((entity) => entity.etag).length;
    }

    // Azure Data Tables does not provide a row count.
    // We just inform the user there is a new page with at least x items.
    const minimumRowEstimated = (options.pageNumber - 1) * options.pageSize + records.length + nbNextPage;

    return { records, count: minimumRowEstimated };
  },

  createEntityAsync: async (tableName, entity) => {
    const client = getClient(tableName);
    delete entity['__meta__'];
    await client.createEntity(entity);
    return client.getEntity(entity.partitionKey, entity.rowKey);
  },

  updateEntityAsync: async (tableName, entity) => {
    const client = getClient(tableName);
    await client.updateEntity(entity, "Replace");
    return client.getEntity(entity.partitionKey, entity.rowKey);
  },

};

module.exports = azureTableStorageService;
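Since Azure Data Tables returns no total row count, `listEntitiesAsync` reports a lower bound built from the pages it has actually seen. That arithmetic can be isolated in a small pure function (a hypothetical helper, shown only to make the estimate explicit):

```javascript
// Lower-bound row count: all the rows on the pages already skipped,
// plus the current page, plus the look-ahead page used to decide
// whether a next page exists. pageNumber is 1-based.
function estimateMinimumRowCount(pageNumber, pageSize, recordsOnPage, recordsOnNextPage) {
  return (pageNumber - 1) * pageSize + recordsOnPage + recordsOnNextPage;
}
```

For example, on page 2 with a page size of 15, a full current page and one record spotted on the next page gives an estimate of at least 31 rows, which is what the user sees as the collection count.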

Routes definition

const express = require('express');
const { PermissionMiddlewareCreator, RecordCreator, RecordUpdater, RecordSerializer } = require('forest-express');

const router = express.Router();

const COLLECTION_NAME = 'customers';
const permissionMiddlewareCreator = new PermissionMiddlewareCreator(COLLECTION_NAME);
const recordSerializer = new RecordSerializer({ name: COLLECTION_NAME });

const azureTableStorageService = require("../services/azure-table-storage-service");

// Get a list of Customers
router.get(`/${COLLECTION_NAME}`, permissionMiddlewareCreator.list(), async (request, response, next) => {
  const pageSize = parseInt(request.query.page.size, 10) || 15;
  const pageNumber = parseInt(request.query.page.number, 10) || 1;

  azureTableStorageService.listEntitiesAsync(COLLECTION_NAME, { pageSize, pageNumber })
  .then(async ({ records, count }) => {
    const recordsSerialized = await recordSerializer.serialize(records);
    response.send({ ...recordsSerialized, meta: { count } });
  })
  .catch((e) => {
    console.error(e);
    next(e);
  });
});

// Get a Customer
router.get(`/${COLLECTION_NAME}/:recordId`, permissionMiddlewareCreator.details(), async (request, response, next) => {
  const parts = request.params.recordId.split('|');
  azureTableStorageService.getEntityAsync(COLLECTION_NAME, parts[0], parts[1])
  .then((record) => recordSerializer.serialize(record))
  .then((recordSerialized) => response.send(recordSerialized))
  .catch((e) => {
    console.error(e);
    next(e);
  });
});

// Create a Customer
router.post(`/${COLLECTION_NAME}`, permissionMiddlewareCreator.create(), async (request, response, next) => {
  const recordCreator = new RecordCreator({ name: COLLECTION_NAME }, request.user, request.query);
  recordCreator.deserialize(request.body)
  .then((recordToCreate) => azureTableStorageService.createEntityAsync(COLLECTION_NAME, recordToCreate))
  .then((record) => recordSerializer.serialize(record))
  .then((recordSerialized) => response.send(recordSerialized))
  .catch((e) => {
    console.error(e);
    next(e);
  });
});

// Update a Customer
router.put(`/${COLLECTION_NAME}/:recordId`, permissionMiddlewareCreator.update(), async (request, response, next) => {
  const parts = request.params.recordId.split('|');
  const recordUpdater = new RecordUpdater({ name: COLLECTION_NAME }, request.user, request.query);
  recordUpdater.deserialize(request.body)
  .then((recordToUpdate) => {
    recordToUpdate.partitionKey = parts[0];
    recordToUpdate.rowKey = parts[1];
    return azureTableStorageService.updateEntityAsync(COLLECTION_NAME, recordToUpdate);
  })
  .then((record) => recordSerializer.serialize(record))
  .then((recordSerialized) => response.send(recordSerialized))
  .catch((e) => {
    console.error(e);
    next(e);
  });
});

// Delete a list of Customers
router.delete(`/${COLLECTION_NAME}`, permissionMiddlewareCreator.delete(), async (request, response, next) => {
  try {
    for (const key of request.body.data.attributes.ids) {
      const parts = key.split('|');
      await azureTableStorageService.deleteEntityAsync(COLLECTION_NAME, parts[0], parts[1]);
    }
    response.status(204).send();
  } catch (e) {
    console.error(e);
    next(e);
  }
});

module.exports = router;

Result

You can now list, view, create, update, and delete Customers records from your Forest Admin interface.
