I have a file like the one below:
key1 key2
data1 data2
data3 data4
...
Is there an easy way to read this file and convert it to an array of JSON objects? The final output I want is:
[{key1:data1, key2: data2},{key1: data3, key2: data4}, ...]
import fs from 'fs';
// read the file and split into rows and cells
const data = fs.readFileSync('./text.txt').toString()
  .split('\n')
  .map(row => row.match(/\w+/g))
  .filter(Boolean); // drop empty lines: match() returns null for them
// get the first row with cells as key names
const head = data.shift();
// map the rest of the rows into objects with keys from the first row
const result = data.map(row => Object.fromEntries(row.map((cell, idx) => [head[idx], cell])));
console.log(JSON.stringify(result));
The result:
/usr/local/bin/node ./read-csv.mjs
[{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]
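For reference, here is a self-contained version of the same approach that you can run as-is: it first writes a hypothetical sample text.txt matching the question, and it guards against a trailing newline in the file, where row.match() would return null for the empty last row.

```javascript
import fs from 'fs';

// Hypothetical sample input matching the question's format.
fs.writeFileSync('./text.txt', 'key1 key2\ndata1 data2\ndata3 data4\n');

const data = fs.readFileSync('./text.txt').toString()
  .split('\n')
  .map(row => row.match(/\w+/g))
  .filter(Boolean); // match() returns null for empty lines; drop those rows

// first row holds the key names, the rest are data rows
const head = data.shift();
const result = data.map(row =>
  Object.fromEntries(row.map((cell, idx) => [head[idx], cell])));

console.log(JSON.stringify(result));
// [{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]
```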
If you need a speed boost with a big file, use Array.prototype.reduce():
import fs from 'fs';
const data = fs.readFileSync('./text.txt').toString()
  .split('\n')
  .map(row => row.match(/\w+/g))
  .filter(Boolean); // drop empty lines
const head = data.shift();
const result = data.map(row => row.reduce((obj, cell, idx) => {
  obj[head[idx]] = cell;
  return obj;
}, {}));
console.log(JSON.stringify(result));
If you need the maximum speed, use for loops:
import fs from 'fs';
const data = fs.readFileSync('./text.txt').toString()
  .split('\n')
  .map(row => row.match(/\w+/g))
  .filter(Boolean); // drop empty lines
const head = data.shift();
const result = [];
for (let i = 0; i < data.length; i++) {
  const row = data[i];
  const item = {};
  for (let j = 0; j < row.length; j++) {
    item[head[j]] = row[j];
  }
  result.push(item);
}
console.log(JSON.stringify(result));
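If the file is too large to load into memory with readFileSync, you can stream it line by line with Node's readline module instead. This is a sketch (not benchmarked, and it writes a hypothetical sample text.txt so it runs as-is); it keeps memory usage flat regardless of file size.

```javascript
import fs from 'fs';
import readline from 'readline';

// Hypothetical sample input matching the question's format.
fs.writeFileSync('./text.txt', 'key1 key2\ndata1 data2\ndata3 data4\n');

// Stream the file one line at a time instead of reading it whole.
const rl = readline.createInterface({
  input: fs.createReadStream('./text.txt'),
  crlfDelay: Infinity, // treat \r\n as a single line break
});

let head = null;
const result = [];
for await (const line of rl) {
  const cells = line.match(/\w+/g);
  if (!cells) continue;                    // skip empty lines
  if (!head) { head = cells; continue; }   // first row holds the key names
  const item = {};
  for (let j = 0; j < cells.length; j++) {
    item[head[j]] = cells[j];
  }
  result.push(item);
}

console.log(JSON.stringify(result));
// [{"key1":"data1","key2":"data2"},{"key1":"data3","key2":"data4"}]
```

Top-level await is available here because the file is an ES module (.mjs).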