Advanced usage of the Node.js fs module
The fs module not only provides basic file read and write operations, but also supports advanced features such as streaming reads and writes, file monitoring, and a Promise-based API. These capabilities better serve complex scenarios such as large-file processing and real-time monitoring.
1. Streaming reading and writing
Streams are the core concept for processing large files or continuous data. The fs module provides the fs.createReadStream and fs.createWriteStream methods for reading and writing files as streams.
1.1 Streaming file reading
const fs = require('fs');

// Create a readable stream ('example.txt' is a placeholder filename)
const readStream = fs.createReadStream('example.txt', 'utf8');

// Listen for data events
readStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

// Listen for the end event
readStream.on('end', () => {
  console.log('File reading completed');
});

// Listen for error events
readStream.on('error', (err) => {
  console.error('Error reading file:', err);
});
Explanation:
- fs.createReadStream(): creates a readable stream that reads the file contents chunk by chunk.
- data event: fired each time a chunk of data is read.
- end event: fired when the file has been read completely.
- error event: fired when an error occurs while reading.
1.2 Streaming file writing
const fs = require('fs');

// Create a writable stream ('output.txt' is a placeholder filename)
const writeStream = fs.createWriteStream('output.txt');

// Write data
writeStream.write('Hello, world!\n');
writeStream.write('This is a stream example.\n');

// End writing
writeStream.end();

// Listen for the finish event
writeStream.on('finish', () => {
  console.log('File writing completed');
});

// Listen for error events
writeStream.on('error', (err) => {
  console.error('Error writing file:', err);
});
Explanation:
- fs.createWriteStream(): creates a writable stream that writes the file contents chunk by chunk.
- write method: writes a chunk of data to the stream.
- end method: signals that no more data will be written.
- finish event: fired when all data has been flushed to the file.
- error event: fired when an error occurs while writing.
1.3 Pipeline Operation
A pipe is a convenient way to connect a readable stream to a writable stream, and is commonly used for copying files.
const fs = require('fs');

// Create readable and writable streams (placeholder filenames)
const readStream = fs.createReadStream('source.txt');
const writeStream = fs.createWriteStream('dest.txt');

// Copy the file by piping the read stream into the write stream
readStream.pipe(writeStream);

// Listen for the finish event
writeStream.on('finish', () => {
  console.log('File copied successfully');
});

// Listen for error events on both streams
readStream.on('error', (err) => {
  console.error('Error reading file:', err);
});
writeStream.on('error', (err) => {
  console.error('Error writing file:', err);
});
2. File monitoring
The fs module provides the fs.watch and fs.watchFile methods for monitoring changes to files or directories.
2.1 Using fs.watch
const fs = require('fs');

// Watch a file for changes ('example.txt' is a placeholder filename)
const watcher = fs.watch('example.txt', (eventType, filename) => {
  console.log(`Event type: ${eventType}`);
  if (filename) {
    console.log(`File changed: ${filename}`);
  }
});

// Close the watcher after 10 seconds
setTimeout(() => {
  watcher.close();
  console.log('Watcher closed');
}, 10000);
Explanation:
- fs.watch(): monitors changes to a file or directory.
- eventType: the type of event (e.g. change, rename).
- filename: the name of the file that changed (not guaranteed to be provided on every platform, hence the if check).
2.2 Using fs.watchFile
const fs = require('fs');

// Poll a file for changes ('example.txt' is a placeholder filename)
fs.watchFile('example.txt', { interval: 1000 }, (curr, prev) => {
  if (curr.mtimeMs !== prev.mtimeMs) {
    console.log('File modified');
  }
});

// Stop watching after 10 seconds
setTimeout(() => {
  fs.unwatchFile('example.txt');
  console.log('Stopped watching file');
}, 10000);
Explanation:
- fs.watchFile(): polls the file status at a regular interval.
- curr and prev: the current and previous fs.Stats objects.
- interval: the polling interval in milliseconds.
3. Promise API
The fs.promises API has been available since Node.js v10 and supports Promise-based file operations.
3.1 Reading a file with fs.promises.readFile
const fs = require('fs').promises;

async function readFile() {
  try {
    // 'example.txt' is a placeholder filename
    const data = await fs.readFile('example.txt', 'utf8');
    console.log('File content:', data);
  } catch (err) {
    console.error('Failed to read file:', err);
  }
}

readFile();
3.2 Writing to a file with fs.promises.writeFile
const fs = require('fs').promises;

async function writeFile() {
  try {
    // 'output.txt' is a placeholder filename
    await fs.writeFile('output.txt', 'Hello, world!', 'utf8');
    console.log('File written successfully');
  } catch (err) {
    console.error('Failed to write file:', err);
  }
}

writeFile();
4. Recursive directory operations
4.1 Recursively reading a directory
const fs = require('fs').promises;
const path = require('path');

async function readDirRecursive(dir) {
  const files = await fs.readdir(dir);
  for (const file of files) {
    const filePath = path.join(dir, file);
    const stats = await fs.stat(filePath);
    if (stats.isDirectory()) {
      await readDirRecursive(filePath); // Recursively read subdirectories
    } else {
      console.log('File:', filePath);
    }
  }
}

readDirRecursive('./').catch((err) => {
  console.error('Failed to read directory:', err);
});
4.2 Recursively deleting a directory
const fs = require('fs').promises;
const path = require('path');

async function deleteDirRecursive(dir) {
  const files = await fs.readdir(dir);
  for (const file of files) {
    const filePath = path.join(dir, file);
    const stats = await fs.stat(filePath);
    if (stats.isDirectory()) {
      await deleteDirRecursive(filePath); // Recursively delete subdirectories
    } else {
      await fs.unlink(filePath); // Delete the file
      console.log('Deleted file:', filePath);
    }
  }
  await fs.rmdir(dir); // Delete the now-empty directory
  console.log('Deleted directory:', dir);
}

deleteDirRecursive('./temp').catch((err) => {
  console.error('Failed to delete directory:', err);
});
5. Summary
- Streaming reads and writes: suited to large files; avoids loading the whole file into memory.
- File monitoring: watches files or directories for changes in real time.
- Promise API: simplifies asynchronous operations and avoids callback hell.
- Recursive operations: handle nested directory structures.
By mastering the advanced usage of the fs module, you can handle complex file-operation scenarios more effectively and improve the performance and maintainability of your code.
This is the end of this article on the advanced usage of the fs module. For more fs-related content, please search my previous articles or continue browsing the related articles below. I hope you will continue to support me!