Convert CSV files to JSON with no dependencies. Supports Node.js (sync and async) and browser environments, with full RFC 4180 compliance.
Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:
- Synchronous API - Blocking operations for simple workflows
- Asynchronous API - Promise-based for modern async/await patterns
- Browser API - Client-side CSV parsing for web applications
✅ RFC 4180 Compliant - Proper handling of quoted fields, delimiters, newlines, and escape sequences
✅ Zero Dependencies - No external packages required
✅ Full TypeScript Support - Included type definitions for all APIs
✅ Flexible Configuration - Custom delimiters, encoding, trimming, and more
✅ Method Chaining - Fluent API for readable code
✅ Large File Support - Stream processing for memory-efficient handling
✅ Comprehensive Error Handling - Detailed, actionable error messages with solutions (see ERROR_HANDLING.md)
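The "Method Chaining" feature above works by having each configuration method return the converter object itself. A minimal, illustrative sketch of the pattern (not the library's actual source):

```javascript
// Sketch of a fluent, chainable configuration API: each setter stores
// an option and returns `this`, so calls can be strung together.
class CsvConverter {
  constructor() {
    this.options = { delimiter: ',', formatByType: false };
  }
  fieldDelimiter(char) {
    this.options.delimiter = char;
    return this; // returning `this` is what makes the chain work
  }
  formatValueByType() {
    this.options.formatByType = true;
    return this;
  }
}

const converter = new CsvConverter().fieldDelimiter(';').formatValueByType();
console.log(converter.options); // { delimiter: ';', formatByType: true }
```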
RFC 4180 is the IETF standard specification for CSV (Comma-Separated Values) files. This library is fully compliant with RFC 4180, ensuring proper handling of:
| Aspect | RFC 4180 Specification |
|---|---|
| Default Delimiter | Comma (,) |
| Record Delimiter | CRLF (\r\n) or LF (\n) |
| Quote Character | Double-quote (") |
| Quote Escaping | Double quotes ("") |
```csv
firstName,lastName,email
"Smith, John",Smith,john@example.com
Jane,Doe,jane@example.com
"Cooper, Andy",Cooper,andy@company.com
```

Note that the quoted fields containing commas are handled correctly. See RFC4180_MIGRATION_GUIDE.md for breaking changes and migration details.
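To make the quoting rules concrete, here is a minimal field splitter implementing the RFC 4180 state machine in plain JavaScript. This is an illustrative sketch only, not this library's internal implementation:

```javascript
// Split one CSV record into fields, honoring quoted fields and the
// "" escape sequence for a literal double-quote (RFC 4180).
function splitCsvLine(line, delimiter = ',') {
  const fields = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { // "" inside quotes -> literal quote
          current += '"';
          i++;
        } else {
          inQuotes = false; // closing quote
        }
      } else {
        current += ch; // delimiters and newlines are literal here
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === delimiter) {
      fields.push(current);
      current = '';
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}

console.log(splitCsvLine('"Smith, John",Smith,john@example.com'));
// → [ 'Smith, John', 'Smith', 'john@example.com' ]
```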
```bash
npm install convert-csv-to-json
```

Sync:

```javascript
const csvToJson = require('convert-csv-to-json');
const json = csvToJson.getJsonFromCsv('input.csv');
```

Async:

```javascript
const csvToJson = require('convert-csv-to-json');
const json = await csvToJson.getJsonFromCsvAsync('input.csv');
```

Browser:

```javascript
const convert = require('convert-csv-to-json');
const json = await convert.browser.parseFile(file);
```

| Implementation | Use Case | Learn More |
|---|---|---|
| Sync API | Simple, blocking operations | Read SYNC.md |
| Async API | Concurrent operations, large files | Read ASYNC.md |
| Browser API | Client-side file parsing | Read BROWSER.md |
```javascript
const json = csvToJson.csvStringToJson('name,age\nAlice,30');
```

```javascript
const json = csvToJson
  .fieldDelimiter(';')
  .getJsonFromCsv('input.csv');
```

```javascript
const json = csvToJson
  .formatValueByType()
  .getJsonFromCsv('input.csv');
// Converts "30" → 30, "true" → true, etc.
```

```javascript
const json = csvToJson
  .supportQuotedField(true)
  .getJsonFromCsv('input.csv');
```

```javascript
const files = ['file1.csv', 'file2.csv', 'file3.csv'];
const results = await Promise.all(
  files.map(f => csvToJson.getJsonFromCsvAsync(f))
);
```

All APIs (Sync, Async, and Browser) support the same configuration methods:
- `fieldDelimiter(char)` - Set the field delimiter (default: `,`)
- `formatValueByType()` - Auto-convert numbers and booleans
- `supportQuotedField(bool)` - Handle quoted fields with embedded delimiters
- `indexHeader(num)` - Specify the header row (default: `0`)
- `trimHeaderFieldWhiteSpace(bool)` - Remove spaces from headers
- `parseSubArray(delim, sep)` - Parse delimited arrays
- `mapRows(fn)` - Transform, filter, or enrich each row
- `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding
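As an illustration of the kind of coercion `formatValueByType()` performs (turning `"30"` into `30` and `"true"` into `true`), here is a minimal standalone sketch — not the library's implementation:

```javascript
// Coerce a raw CSV string value to a typed JavaScript value:
// booleans and numbers become typed, everything else stays a string.
function coerce(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  if (value.trim() !== '' && !Number.isNaN(Number(value))) return Number(value);
  return value;
}

console.log(coerce('30'));    // 30
console.log(coerce('true'));  // true
console.log(coerce('Alice')); // 'Alice'
```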
```javascript
// Semicolon-delimited
csvToJson.fieldDelimiter(';').getJsonFromCsv('data.csv');

// Tab-delimited
csvToJson.fieldDelimiter('\t').getJsonFromCsv('data.tsv');

// Pipe-delimited
csvToJson.fieldDelimiter('|').getJsonFromCsv('data.psv');
```

```javascript
// Input: name,age,active
//        John,30,true
csvToJson.formatValueByType().getJsonFromCsv('data.csv');
// Output: { name: 'John', age: 30, active: true }
```

```javascript
// Input: name,description
//        "Smith, John","He said ""Hello"""
csvToJson.supportQuotedField(true).getJsonFromCsv('data.csv');
// Output: { name: 'Smith, John', description: 'He said "Hello"' }
```

```javascript
// If headers are in row 2 (3rd line):
csvToJson.indexHeader(2).getJsonFromCsv('data.csv');
```

```javascript
// Input: " First Name ", " Last Name "
csvToJson.trimHeaderFieldWhiteSpace(true).getJsonFromCsv('data.csv');
// Output: { FirstName: 'John', LastName: 'Doe' }
```

```javascript
// Input: name,tags
//        John,*javascript,nodejs,typescript*
csvToJson.parseSubArray('*', ',').getJsonFromCsv('data.csv');
// Output: { name: 'John', tags: ['javascript', 'nodejs', 'typescript'] }
```

```javascript
// Filter out rows that don't match a condition
const result = csvToJson
  .fieldDelimiter(',')
  .mapRows((row) => {
    // Only keep rows where age >= 30
    if (parseInt(row.age) >= 30) {
      return row;
    }
    return null; // Filters out this row
  })
  .getJsonFromCsv('input.csv');
```

See mapRows Feature - Usage Guide.
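The `mapRows` semantics described above — the callback runs once per row, a returned object keeps the (possibly transformed) row, and `null` drops it — can be sketched in plain JavaScript. This is an illustration of the contract, not the library's code:

```javascript
// Apply fn to every row; rows mapped to null are dropped.
function mapRows(rows, fn) {
  return rows.map(fn).filter(row => row !== null);
}

const rows = [
  { name: 'Alice', age: '34' },
  { name: 'Bob', age: '25' },
];

// Keep only rows with age >= 30, converting age to a number on the way.
const adults30 = mapRows(rows, row =>
  parseInt(row.age, 10) >= 30 ? { ...row, age: parseInt(row.age, 10) } : null
);
// → [ { name: 'Alice', age: 34 } ]
```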
```javascript
// UTF-8 encoding
csvToJson.utf8Encoding().getJsonFromCsv('data.csv');

// Latin-1 encoding
csvToJson.latin1Encoding().getJsonFromCsv('data.csv');

// Custom encoding
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');
```

See SYNC.md, ASYNC.md, or BROWSER.md for complete configuration details.
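Why the encoding setting matters: the same bytes decode to different text under different encodings, so reading a file with the wrong one silently corrupts non-ASCII characters. A quick standalone Node.js illustration:

```javascript
// 'é' is two bytes in UTF-8; decoded as Latin-1 those bytes become 'Ã©'.
const bytes = Buffer.from('café', 'utf8');

console.log(bytes.toString('utf8'));   // 'café'
console.log(bytes.toString('latin1')); // 'cafÃ©' (mis-decoded)
```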
```javascript
const csvToJson = require('convert-csv-to-json');

async function processCSV() {
  const data = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .supportQuotedField(true)
    .getJsonFromCsvAsync('data.csv');
  console.log(`Parsed ${data.length} records`);
  return data;
}
```

- RFC 4180 Breaking Changes - See RFC4180_MIGRATION_GUIDE.md
- Sync to Async Migration - See MIGRATION.md
- Error Handling - See docs/ERROR_HANDLING.md for comprehensive error documentation
Install dependencies:

```bash
npm install
```

Run tests:

```bash
npm test
```

Debug tests:

```bash
npm run test-debug
```

See CI/CD GitHub Action.
When pushing to the master branch:
- Include `[MAJOR]` in the commit message for a major release (e.g., v1.0.0 → v2.0.0)
- Include `[PATCH]` in the commit message for a patch release (e.g., v1.0.0 → v1.0.1)
- A minor release is applied by default (e.g., v1.0.0 → v1.1.0)
CSVtoJSON is licensed under the MIT License.
Found a bug or need a feature? Open an issue on GitHub.
Follow me and consider starring the project to show your support ⭐
If you find this project helpful and would like to support its development:
BTC: 37vdjQhbaR7k7XzhMKWzMcnqUxfw1njBNk