Node.js import csv with blank fields

Tyler

I'm trying to import and parse a CSV file using the csv-parse package, but I'm having difficulty requiring the CSV file in the first place.

When I do input = require('../../path-to-my-csv-file')

I get an error due to consecutive commas because some fields are blank:

e","17110","CTSZ16","Slitzer™ 16pc Cutlery Set in Wood Block",,"Spice up
                                                                    ^
SyntaxError: Unexpected token ,

How do I import the CSV file into the node environment to begin with?

The package examples are here.

yvanscher

To solve your first problem, reading a CSV with empty entries:

Use the fast-csv node package. It will parse CSV with empty entries.
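
For example, here's a minimal sketch of that claim (assuming fast-csv v3+, which exposes parseString; the sample row is adapted from the question):

var csv = require('fast-csv') // require the fast-csv module

// parseString parses CSV text directly; consecutive commas come back as empty strings
csv.parseString('"17110","CTSZ16","Slitzer 16pc Cutlery Set",,"Spice up"')
.on('data', function(row){
  console.log(row) // [ '17110', 'CTSZ16', 'Slitzer 16pc Cutlery Set', '', 'Spice up' ]
})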

To answer your second question, how to import a CSV into Node:

You don't really "import" CSV files into Node. You should fs.open the file or use fs.createReadStream to read the CSV file at the appropriate location.
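
For instance, with the csv-parse package from your question, a minimal sketch could read the file through fs and then parse it (assuming csv-parse v5+, which ships a sync entry point; older versions export it from a different path):

var fs = require('fs') // require the fs (filesystem) module
var { parse } = require('csv-parse/sync') // sync API, csv-parse v5+

// read the raw file contents, then parse; blank fields become empty strings
var input = fs.readFileSync('../test.csv', 'utf8')
var records = parse(input) // an array of row arrays
console.log(records)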

Below is a script that uses fs.createReadStream to parse a CSV called 'test.csv' that is one directory up from the script that is running it.

The first section sets up our program and declares the objects we're going to use to store our parsed list.

var csv = require('fast-csv') // require the fast-csv module
var fs = require('fs') // require the fs (filesystem) module
var uniqueindex = 0 // just an index for our entries
var dataJSON = {} // our JSON object (make it an array if you wish)

This next section declares a stream that will intercept data as it's read from our CSV file and do stuff with it. In this case we're intercepting the data, storing it in a JSON object, and then saving that object once the stream is done. It's basically a filter that intercepts data and can do what it wants with it.

var csvStream = csv.parse() // - create a csv parser stream (fast-csv v3+; older versions were invoked as csv())
.on('data', function(data){ // - when we get a row, perform function(data)
  dataJSON[uniqueindex] = data // - store the row in our JSON object dataJSON
  uniqueindex++ // - the index of the row in our object
})
.on('end', function(){ // - when the data stream ends, perform function()
  console.log(dataJSON) // - log our whole object to the console
  fs.writeFile('../test.json', // - use the fs module to write a file
  JSON.stringify(dataJSON, null, 4), // - turn our JSON object into a string that can be written
  function(err){ // - function(err) only runs once the file is saved; err will be null if there is no error
    if(err) throw err // - if there's an error while saving the file, throw it
    console.log('data saved as JSON yay!')
  })
})

This section creates what is called a "readStream" from our CSV file. The path to the file is relative. A stream is just a way of reading a file. It's pretty powerful, though, because the data from a stream can be piped into another stream. So we'll create a stream that reads the data from our CSV file, and then we'll pipe it into our pre-defined parser stream / filter from section 2.

var stream = fs.createReadStream('../test.csv')
stream.pipe(csvStream)

This will create a file called 'test.json' one directory up from the place where our csv parsing script is. test.json will contain the parsed CSV list inside a JSON object. The order in which the code appears here is how it should appear in a script you make.
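
For illustration, if the CSV's rows looked like the fragment in the question, test.json might contain something like this (values beyond that fragment are hypothetical):

{
    "0": ["17110", "CTSZ16", "Slitzer™ 16pc Cutlery Set in Wood Block", "", "Spice up"],
    "1": ["17111", "CTSZ17", "Another product", "", "Another note"]
}

Each numeric key is one parsed row, stored as an array of field strings, with blank fields preserved as "".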
