- 1. Node.js TypeScript #1. Modules, process arguments, basics of the File System
- 2. Node.js TypeScript #2. The synchronous nature of the EventEmitter
- 3. Node.js TypeScript #3. Explaining the Buffer
- 4. Node.js TypeScript #4. Paused and flowing modes of a readable stream
- 5. Node.js TypeScript #5. Writable streams, pipes, and the process streams
- 6. Node.js TypeScript #6. Sending HTTP requests, understanding multipart/form-data
- 7. Node.js TypeScript #7. Creating a server and receiving requests
- 8. Node.js TypeScript #8. Implementing HTTPS with our own OpenSSL certificate
- 9. Node.js TypeScript #9. The Event Loop in Node.js
- 10. Node.js TypeScript #10. Is Node.js single-threaded? Creating child processes
- 11. Node.js TypeScript #11. Harnessing the power of many processes using a cluster
- 12. Node.js TypeScript #12. Introduction to Worker Threads with TypeScript
- 13. Node.js TypeScript #13. Sending data between Worker Threads
- 14. Node.js TypeScript #14. Measuring processes & worker threads with Performance Hooks
- 15. Node.js TypeScript #15. Benefits of the HTTP/2 protocol
HTTP is a protocol that allows you to fetch resources such as JSON data and HTML documents. The two sides of the connection, the client and the server, communicate by exchanging messages. The message sent by the client is a request; the message sent by the server is a response. When using Node.js, you can act as either one of them. In this article, we cover how to make requests.
This article presents the way to make HTTP requests in pure Node.js. Another viable solution is to use a library such as axios.
Node.js TypeScript: sending HTTP requests
To send a request, we need to use the http module. It contains the request function. Let’s try it out!
```typescript
import { request } from 'http';

const req = request(
  {
    host: 'jsonplaceholder.typicode.com',
    path: '/todos/1',
    method: 'GET',
  },
  response => {
    console.log(response.statusCode); // 200
  }
);

req.end();
```
The first argument of the request function is the options object. As you can see, the name of the host and the path are two distinct parameters.
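Because the host and the path are separate options, splitting a full URL by hand is easy to get wrong. A minimal sketch of deriving these options with the WHATWG URL class built into Node.js (toRequestOptions is a hypothetical helper name, not part of Node.js) could look like this:

```typescript
// Hypothetical helper: derive an options object matching the shape of
// http.RequestOptions from a full URL, using the built-in URL class.
function toRequestOptions(url: string, method = 'GET') {
  const { hostname, pathname, search } = new URL(url);
  return {
    host: hostname,
    path: pathname + search,
    method,
  };
}

console.log(toRequestOptions('http://jsonplaceholder.typicode.com/todos/1'));
// { host: 'jsonplaceholder.typicode.com', path: '/todos/1', method: 'GET' }
```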
The last argument of the request function is a callback. Its first argument is an instance of IncomingMessage that represents the response. It holds information about the response that we got, such as the status code.
A significant thing is that IncomingMessage is a readable stream. Since we’ve covered streams in one of the previous parts of the series, we know how to take advantage of them. Let’s redirect the response straight into a file.
```typescript
import { request } from 'http';
import { createWriteStream } from 'fs';

const fileStream = createWriteStream('./file.txt');

const req = request(
  {
    host: 'jsonplaceholder.typicode.com',
    path: '/todos/1',
    method: 'GET',
  },
  response => {
    response.pipe(fileStream);
  }
);

req.end();
```
And just like that, we have a file containing the body of the response to our request.
file.txt
```json
{
  "userId": 1,
  "id": 1,
  "title": "delectus aut autem",
  "completed": false
}
```
Another thing that we might want to do is to store the response body in a variable. Since the response is a readable stream, we need to collect all of its chunks.
```typescript
import { request } from 'http';

const req = request(
  {
    host: 'jsonplaceholder.typicode.com',
    path: '/todos/1',
    method: 'GET',
  },
  response => {
    const chunks: Buffer[] = [];
    response.on('data', (chunk) => {
      chunks.push(chunk);
    });
    response.on('end', () => {
      const result = Buffer.concat(chunks).toString();
      console.log(result);
    });
  }
);

req.end();
```
As you can see, a lot is going on here. To simplify this process, we can wrap this into a function returning a promise.
```typescript
import { request, RequestOptions } from 'http';

function performRequest(options: RequestOptions) {
  return new Promise((resolve, reject) => {
    request(
      options,
      function (response) {
        const { statusCode } = response;
        if (statusCode >= 300) {
          reject(
            new Error(response.statusMessage)
          );
          return;
        }
        const chunks: Buffer[] = [];
        response.on('data', (chunk) => {
          chunks.push(chunk);
        });
        response.on('end', () => {
          const result = Buffer.concat(chunks).toString();
          resolve(JSON.parse(result));
        });
      }
    )
      .end();
  });
}
```
```typescript
performRequest(
  {
    host: 'jsonplaceholder.typicode.com',
    path: '/todos/1',
    method: 'GET',
  },
)
  .then(response => {
    console.log(response);
  })
  .catch(error => {
    console.log(error);
  });
```
As you can see, I’ve also added some elementary error handling there.
The response object contains more useful data, such as the headers. We can go ahead and include them in the value that our function resolves with.
```typescript
import { request, RequestOptions, IncomingHttpHeaders } from 'http';

interface Response {
  data: object;
  headers: IncomingHttpHeaders;
}

function performRequest(options: RequestOptions) {
  return new Promise((resolve, reject) => {
    request(
      options,
      function (response) {
        const { statusCode, headers } = response;
        if (statusCode >= 300) {
          reject(
            new Error(response.statusMessage)
          );
          return;
        }
        const chunks: Buffer[] = [];
        response.on('data', (chunk) => {
          chunks.push(chunk);
        });
        response.on('end', () => {
          const data = Buffer.concat(chunks).toString();
          const result: Response = {
            data: JSON.parse(data),
            headers,
          };
          resolve(result);
        });
      }
    )
      .end();
  });
}
```
http.ClientRequest
The request function returns an instance of ClientRequest, which inherits from Stream. We can use it to send some data along with a POST request.
To test it, let’s use a REST API that we’ve developed in the first part of the TypeScript Express tutorial.
```typescript
import { request } from 'http';

const req = request(
  {
    host: 'localhost',
    port: '5000',
    path: '/posts',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
  },
  response => {
    console.log(response.statusCode); // 200
  }
);

req.write(JSON.stringify({
  author: 'Marcin',
  title: 'Lorem ipsum',
  content: 'Dolor sit amet',
}));

req.end();
```
In all of the above examples, we call the end function. We must always do it to signify the end of the request. It can, but does not have to, receive additional data that we want to send.
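Because ClientRequest is a writable stream, calling end(data) behaves like a write(data) followed by end(). A small self-contained sketch with a generic writable stream (not an actual request, so that it runs without a server) illustrates this:

```typescript
import { Writable } from 'stream';

const received: string[] = [];

// A minimal writable stream that records every chunk it receives
const sink = new Writable({
  write(chunk, _encoding, callback) {
    received.push(chunk.toString());
    callback();
  },
});

// end(data) is shorthand for write(data) followed by end()
sink.end('payload');

console.log(received); // [ 'payload' ]
```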
Uploading files with multipart/form-data
Another way to take advantage of the request being a stream is to upload files. To do that, we can use multipart/form-data.
FormData provides a way to construct key/value pairs that represent form fields and values. When we use the browser, we can easily create it with the FormData() constructor. Since Node.js does not provide it, we use an external package called form-data.
```shell
npm install form-data @types/form-data
```
Multipart originates from MIME (Multipurpose Internet Mail Extensions), a standard that extends the format of emails. Requests of that type combine one or more sets of data into a single body, separated by boundaries. Typically, when sending files, we use multipart/form-data, which is one of the subtypes of multipart and is widely supported on the web.
```typescript
import * as FormData from 'form-data';
import { request } from 'http';
import { createReadStream } from 'fs';

const readStream = createReadStream('./photo.jpg');

const form = new FormData();
form.append('photo', readStream);
form.append('firstName', 'Marcin');
form.append('lastName', 'Wanago');

const req = request(
  {
    host: 'localhost',
    port: '5000',
    path: '/upload',
    method: 'POST',
    headers: form.getHeaders(),
  },
  response => {
    console.log(response.statusCode); // 200
  }
);

form.pipe(req);
```
The form-data library creates readable streams that we send along with the request. An interesting part of the code above is form.getHeaders().
Boundary
When sending multipart/form-data we need to use appropriate headers. Let’s look into what the form-data library generates for us:
```typescript
import * as FormData from 'form-data';
import { createReadStream } from 'fs';

const readStream = createReadStream('./photo.jpg');

const form = new FormData();
form.append('photo', readStream);
form.append('firstName', 'Marcin');
form.append('lastName', 'Wanago');

console.log(form.getHeaders());
```
```
{ 'content-type': 'multipart/form-data; boundary=--------------------------898552055688392969814829' }
```
As you can see, it sets the type of content to multipart/form-data and includes a boundary: a random string that is different every time. It is passed inside the headers to define the string that divides the different parts of the form data.
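On the receiving end, a server reads this boundary back out of the content-type header before it can split the body. A sketch of such parsing (getBoundary is a hypothetical helper, not something Node.js or form-data provides) might look like this:

```typescript
// Hypothetical helper: pull the boundary string out of a content-type header
function getBoundary(contentType: string): string | null {
  const match = contentType.match(/boundary=(.+)$/);
  return match ? match[1] : null;
}

const header =
  'multipart/form-data; boundary=--------------------------898552055688392969814829';

console.log(getBoundary(header));
// --------------------------898552055688392969814829
```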
To fully understand it, let’s pipe our form into a file and read it.
```typescript
import * as FormData from 'form-data';
import { createReadStream, createWriteStream } from 'fs';

const readStream = createReadStream('./photo.jpg');
const writeStream = createWriteStream('./file.txt');

const form = new FormData();
form.append('photo', readStream);
form.append('firstName', 'Marcin');
form.append('lastName', 'Wanago');

console.log(form.getHeaders());

form.pipe(writeStream);
```
```
{ 'content-type': 'multipart/form-data; boundary=--------------------------966991448654339731356450' }
```
file.txt
```
----------------------------966991448654339731356450
Content-Disposition: form-data; name="photo"; filename="photo.jpg"
Content-Type: image/jpeg

���� JFIF �� ;CREATOR: gd-jpeg v1.0 (using IJG JPEG v90), quality = 82
(...)
----------------------------966991448654339731356450
Content-Disposition: form-data; name="firstName"

Marcin
----------------------------966991448654339731356450
Content-Disposition: form-data; name="lastName"

Wanago
----------------------------966991448654339731356450--
```
Every part of the form is divided using the generated boundary, with the closing boundary having two extra dashes at the end.
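The text fields above follow a simple pattern that we can reproduce by hand. Here is a sketch of assembling such a body for plain text fields (buildMultipartBody is a hypothetical helper; real code should let form-data do this, especially for file streams):

```typescript
// Hypothetical helper: join text fields with a boundary the way
// multipart/form-data lays them out on the wire.
function buildMultipartBody(
  fields: Record<string, string>,
  boundary: string,
): string {
  const parts = Object.entries(fields).map(
    ([name, value]) =>
      `--${boundary}\r\n` +
      `Content-Disposition: form-data; name="${name}"\r\n\r\n` +
      `${value}\r\n`,
  );
  // the closing boundary gets two extra dashes at the end
  return `${parts.join('')}--${boundary}--\r\n`;
}

const body = buildMultipartBody(
  { firstName: 'Marcin', lastName: 'Wanago' },
  'exampleBoundary',
);

console.log(body);
```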
Summary
In this article, we covered how to make HTTP requests in Node.js. To do this, we needed to use our knowledge of streams from the previous parts of this series. One of the features that we’ve implemented is uploading files. To achieve that, we’ve explained the multipart/form-data format. That knowledge can prove useful on the front-end as well. By doing all this, we’ve covered another big part of the Node.js environment.