Express.js Streams

Streams are a powerful feature in Node.js that let you handle data in chunks instead of loading it all into memory at once, making it efficient to read and write large data sets. This guide covers key concepts, examples, and best practices for using streams in Express.js applications.

Key Concepts of Streams

  • Readable Stream: A stream from which data can be read (e.g., file reading, HTTP requests).
  • Writable Stream: A stream to which data can be written (e.g., file writing, HTTP responses).
  • Duplex Stream: A stream that is both readable and writable (e.g., TCP sockets).
  • Transform Stream: A duplex stream that can modify or transform the data as it is read and written (e.g., zlib streams).
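
All four types share the same chunk-based interface. As a quick, framework-free illustration of what "data in chunks" means in practice, here is a minimal sketch that reads a file chunk by chunk. The small highWaterMark value is only there to make the chunking visible, and example.txt is a placeholder file.

Example: Reading in Chunks

// chunked-read.js
const fs = require('fs');

// A small highWaterMark forces many small chunks so the chunking is visible
const readableStream = fs.createReadStream('example.txt', { encoding: 'utf8', highWaterMark: 16 });

readableStream.on('data', (chunk) => {
    console.log(`Received a chunk of ${chunk.length} characters`);
});

readableStream.on('end', () => {
    console.log('Done reading');
});

readableStream.on('error', (err) => {
    console.error('Read failed:', err.message);
});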

Using Readable Streams

Readable streams let you read data in chunks. The example below streams a file directly to the HTTP response and reports an error status if the file cannot be read:

Example: Readable Stream

// readable-stream.js
const express = require('express');
const fs = require('fs');
const app = express();
const port = 3000;

app.get('/read-file', (req, res) => {
    const readableStream = fs.createReadStream('example.txt', 'utf8');
    // Send an error status instead of hanging if the file cannot be read
    readableStream.on('error', (err) => {
        res.status(err.code === 'ENOENT' ? 404 : 500).send('Unable to read file');
    });
    readableStream.pipe(res);
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});

Using Writable Streams

Writable streams let you write data in chunks. The example below uses the express.text() body parser and writes the plain-text request body to a file:

Example: Writable Stream

// writable-stream.js
const express = require('express');
const fs = require('fs');
const app = express();
const port = 3000;

app.use(express.text());

app.post('/write-file', (req, res) => {
    const writableStream = fs.createWriteStream('output.txt');
    // Report write failures instead of leaving the request hanging
    writableStream.on('error', () => {
        res.status(500).send('Unable to write file');
    });
    writableStream.write(req.body);
    writableStream.end(() => {
        res.send('File written successfully');
    });
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});

Using Duplex Streams

Duplex streams are both readable and writable. In the example below, the TCP socket is a duplex stream: the server writes a greeting to it and pipes it back into itself, echoing whatever the client sends (a client sketch follows the example):

Example: Duplex Stream

// duplex-stream.js
const net = require('net');
const port = 3000;

const server = net.createServer((socket) => {
    socket.write('Hello, client!\n');
    socket.pipe(socket); // echo everything the client sends back to the client
});

server.listen(port, () => {
    console.log(`TCP server running at port ${port}`);
});
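
To see the duplex behavior from the client side, you can connect with a small TCP client. This is a minimal sketch that assumes the echo server above is running on port 3000; the half-second delay before closing is arbitrary.

Example: Duplex Stream Client

// duplex-client.js
const net = require('net');

const socket = net.connect(3000, 'localhost', () => {
    socket.write('ping\n'); // the server echoes this back
});

socket.on('data', (chunk) => {
    process.stdout.write(`From server: ${chunk}`);
});

socket.on('error', (err) => {
    console.error('Connection failed:', err.message);
});

// Close the connection after the echo has had time to arrive
setTimeout(() => socket.end(), 500);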

Using Transform Streams

Transform streams can modify or transform the data as it is read and written. Here is an example of a transform stream that compresses data using zlib:

Example: Transform Stream

// transform-stream.js
const express = require('express');
const fs = require('fs');
const zlib = require('zlib');
const app = express();
const port = 3000;

app.get('/compress-file', (req, res) => {
    const readableStream = fs.createReadStream('example.txt');
    const gzip = zlib.createGzip();
    readableStream.on('error', () => {
        res.removeHeader('Content-Encoding');
        res.status(500).send('Unable to read file');
    });
    res.setHeader('Content-Encoding', 'gzip');
    readableStream.pipe(gzip).pipe(res);
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});
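
zlib provides ready-made transform streams, but you can also implement your own by supplying a transform function. The following is a sketch of a custom transform that upper-cases text as it flows through; the /shout-file route and example.txt file are illustrative placeholders, not part of any standard API.

Example: Custom Transform Stream

// custom-transform.js
const express = require('express');
const fs = require('fs');
const { Transform } = require('stream');
const app = express();
const port = 3000;

app.get('/shout-file', (req, res) => {
    // Create a fresh transform per request; a stream can only be piped once
    const upperCase = new Transform({
        transform(chunk, encoding, callback) {
            callback(null, chunk.toString().toUpperCase());
        }
    });
    fs.createReadStream('example.txt', 'utf8')
        .pipe(upperCase)
        .pipe(res);
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});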

Combining Streams

You can combine multiple streams into a pipeline of operations. Here is an example that reads a file, compresses it, and writes the result to another file. For brevity it omits error handling; the stream.pipeline sketch under Best Practices below shows a more robust version:

Example: Combining Streams

// combining-streams.js
const express = require('express');
const fs = require('fs');
const zlib = require('zlib');
const app = express();
const port = 3000;

app.get('/compress-file', (req, res) => {
    const readableStream = fs.createReadStream('example.txt', 'utf8');
    const writableStream = fs.createWriteStream('example.txt.gz');
    const gzip = zlib.createGzip();
    readableStream.pipe(gzip).pipe(writableStream);
    writableStream.on('finish', () => {
        res.send('File compressed successfully');
    });
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});

Best Practices for Using Streams

  • Use Streams for Large Data: Stream large payloads in chunks instead of buffering them fully in memory.
  • Handle Errors: Attach 'error' handlers to every stream (or use stream.pipeline, sketched below) so a failed read or write cannot crash the process or leave a response hanging.
  • Pipe Streams: Use the pipe method (or stream.pipeline) to connect streams into pipelines.
  • Close Streams: End or destroy streams when you are done with them so file descriptors and buffers are released.
  • Use Transform Streams: Use transform streams to modify data as it flows through a pipeline.
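
Node's built-in stream.pipeline helper applies several of these practices at once: it connects the streams, funnels errors from any stage into a single callback, and destroys all the streams on failure so nothing leaks. Here is a sketch of the file-compression route from the previous section rewritten with pipeline; the file names are the same placeholders used above.

Example: pipeline with Error Handling

// pipeline-compress.js
const express = require('express');
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');
const app = express();
const port = 3000;

app.get('/compress-file', (req, res) => {
    pipeline(
        fs.createReadStream('example.txt'),
        zlib.createGzip(),
        fs.createWriteStream('example.txt.gz'),
        (err) => {
            if (err) {
                console.error('Pipeline failed:', err.message);
                return res.status(500).send('Compression failed');
            }
            res.send('File compressed successfully');
        }
    );
});

app.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});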

Testing Streams

Test your stream-based code using frameworks like Mocha, Chai, and Supertest:

Example: Testing Streams

// Install Mocha, Chai, and Supertest
// (chai is pinned to v4 because v5 is ESM-only and cannot be require()d)
// npm install --save-dev mocha chai@4 supertest

// test/streams.test.js
const chai = require('chai');
const expect = chai.expect;
const request = require('supertest');
const express = require('express');
const fs = require('fs');
const app = express();

app.get('/read-file', (req, res) => {
    const readableStream = fs.createReadStream('example.txt', 'utf8');
    readableStream.on('error', () => res.status(404).send('File not found'));
    readableStream.pipe(res);
});

describe('GET /read-file', () => {
    // This test assumes example.txt exists in the project root
    it('should read the file content', (done) => {
        request(app)
            .get('/read-file')
            .expect(200)
            .end((err, res) => {
                if (err) return done(err);
                expect(res.text).to.be.a('string');
                done();
            });
    });

    it('should return 404 for a route that is not defined', (done) => {
        request(app)
            .get('/read-nonexistent-file')
            .expect(404, done);
    });
});

// Define test script in package.json
// "scripts": {
//   "test": "mocha"
// }

// Run tests with NPM
// npm run test

Key Points

  • Readable Stream: A stream from which data can be read.
  • Writable Stream: A stream to which data can be written.
  • Duplex Stream: A stream that is both readable and writable.
  • Transform Stream: A duplex stream that can modify or transform the data as it is read and written.
  • Follow best practices: stream large data, handle errors on every stream, connect streams with pipe or pipeline, close streams when done, and use transform streams to modify data in flight.

Conclusion

Streams let Node.js handle data chunk by chunk, so large data sets can be read and written without buffering them entirely in memory. By applying the key concepts, examples, and best practices covered in this guide, you can manage streams effectively in your Express.js applications. Happy coding!