Import data from a CSV file


Last updated 4 years ago


This example shows you how to create a Smart Action "Import data" to import data from a CSV file.

Forest Admin natively supports data creation but it’s sometimes more efficient to simply import it.

Requirements

  • An admin backend running on forest-express-sequelize/forest-express-mongoose

How it works

Directory: /models

This directory contains the products.js file where the model is declared.

/models/products.js

module.exports = (sequelize, DataTypes) => {
  const { Sequelize } = sequelize;
  const Products = sequelize.define('products', {
    price: {
      type: DataTypes.INTEGER,
    },
    label: {
      type: DataTypes.STRING,
    },
    picture: {
      type: DataTypes.STRING,
    },
    ...
  }, {
    tableName: 'products',
    underscored: true,
    schema: process.env.DATABASE_SCHEMA,
  });
​
  Products.associate = (models) => {
  };
​
  return Products;
};
/models/products.js
const mongoose = require('mongoose');

const schema = mongoose.Schema({
  'price': Number,
  'label': String,
  'picture': String,
  ...
}, {
  timestamps: false,
});

module.exports = mongoose.model('products', schema, 'products');

Directory: /forest

This directory contains the products.js file where the Smart Action Import data is declared.

/forest/products.js
const { collection } = require('forest-express-sequelize');
const models = require('../models');

collection('products', {
  actions: [{
    name: 'Import data',
    endpoint: '/forest/products/actions/import-data',
    type: 'global',
    fields: [{
      field: 'CSV file',
      description: 'A semicolon-separated values file storing tabular data (numbers and text) in plain text',
      type: 'File',
      isRequired: true
    }, {
      field: 'Type',
      description: 'Specify the product type to import',
      type: 'Enum',
      enums: ['phone', 'dress', 'toy'],
      isRequired: true
    }]
  }],

  // ...
});
/forest/products.js
const { collection } = require('forest-express-mongoose');
const models = require('../models');

collection('products', {
  actions: [{
    name: 'Import data',
    endpoint: '/forest/products/actions/import-data',
    type: 'global',
    fields: [{
      field: 'CSV file',
      description: 'A semicolon-separated values file storing tabular data (numbers and text) in plain text',
      type: 'File',
      isRequired: true
    }, {
      field: 'Type',
      description: 'Specify the product type to import',
      type: 'Enum',
      enums: ['phone', 'dress', 'toy'],
      isRequired: true
    }]
  }],

  // ...
});

Directory: /routes

This directory contains the products.js file where the implementation of the route is handled. The POST /forest/products/actions/import-data API call is triggered when you click on the Smart Action in the Forest UI.
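When the action is triggered, the form values arrive in the request body under `data.attributes.values`, keyed by the field names declared in /forest/products.js. A simplified sketch of the payload (other attributes omitted, base64 payload shortened for illustration):

```javascript
// Simplified shape of the request body Forest Admin sends when the Smart
// Action is triggered (other attributes omitted; the base64 payload is
// shortened for illustration).
const body = {
  data: {
    attributes: {
      values: {
        'CSV file': 'data:text/csv;base64,UmVkIGRyZXNz',
        'Type': 'dress',
      },
    },
  },
};

// The route handler reads the two form fields exactly like this:
const file = body.data.attributes.values['CSV file'];
const productType = body.data.attributes.values['Type'];
console.log(productType); // 'dress'
```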

You may find below the coding examples you need to make this Smart action work:

/routes/products.js
const P = require('bluebird');
const express = require('express');
const router = express.Router();
const faker = require('faker');
const parseDataUri = require('parse-data-uri');
const csv = require('csv');
const models = require('../models');

//...

router.post('/products/actions/import-data',
  (req, res) => {
    let parsed = parseDataUri(req.body.data.attributes.values['CSV file']);
    let productType = req.body.data.attributes.values['Type'];

    csv.parse(parsed.data, { delimiter: ';' }, function (err, rows) {
      if (err) {
        res.status(400).send({
          error: `Cannot import data: ${err.message}` });
      } else {
        return P
          .each(rows, (row) => {
            // Random price for the example purpose. In a real situation, the price 
            // should certainly be available in the CSV file.
            let price = faker.commerce.price(5, 1000) * 100;

            return models.products.create({
              label: row[0],
              price: price,
              picture: row[1]
            });
          })
          .then(() => {
            res.send({ success: 'Data successfully imported!' });
          });
      }
    });
  });
  
//...

module.exports = router;
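Note that bluebird's `P.each` inserts the rows sequentially: each `create` resolves before the next row is processed, which keeps insertion order stable. A dependency-free sketch of that behavior (the `eachSeries` helper below is hypothetical, not part of bluebird's API):

```javascript
// Runs the handler on one item at a time, like bluebird's P.each:
// the next item only starts once the previous promise has resolved.
async function eachSeries(items, handler) {
  const results = [];
  for (const item of items) {
    results.push(await handler(item));
  }
  return results;
}

// Example with a fake async create() that records the call order.
const order = [];
const fakeCreate = (row) =>
  new Promise((resolve) => setTimeout(() => { order.push(row); resolve(row); }, 0));

eachSeries(['a', 'b', 'c'], fakeCreate).then(() => console.log(order)); // [ 'a', 'b', 'c' ]
```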

Uploading large files

For large file uploads, you should add an option in your Express Server in your app.js file:

const bodyParser = require('body-parser');

app.use(bodyParser.urlencoded({ extended: true, limit: '50mb' }));
app.use(bodyParser.json({ limit: '50mb' }));

The CSV file passed into the body of the API call is serialized using a base64-encoded Data URI scheme.

To deserialize the base64-encoded CSV file, we use the parse-data-uri NPM package. We also use the csv NPM package to iterate over each line of the CSV file, and bluebird to chain the resulting promises.

You can find a sample CSV file we use here to feed our products table on the Live demo Github repository.
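What the two packages do can be sketched with Node's standard library alone. The sketch below assumes a simple file with no quoted fields or embedded delimiters (the csv package handles those edge cases):

```javascript
// Minimal sketch of what parse-data-uri and csv.parse do for simple files.
// Assumes no quoted fields or embedded semicolons in the data.
function decodeDataUri(uri) {
  // A data URI looks like: data:text/csv;base64,<payload>
  const base64 = uri.slice(uri.indexOf(',') + 1);
  return Buffer.from(base64, 'base64').toString('utf8');
}

function parseSemicolonCsv(text) {
  return text
    .split('\n')
    .filter((line) => line.length > 0)
    .map((line) => line.split(';'));
}

// Example: a two-row file encoded the way the browser sends it.
const csvText = 'Red dress;red.png\nToy car;car.png\n';
const uri = 'data:text/csv;base64,' + Buffer.from(csvText).toString('base64');

const rows = parseSemicolonCsv(decodeDataUri(uri));
console.log(rows); // [ [ 'Red dress', 'red.png' ], [ 'Toy car', 'car.png' ] ]
```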