
yield - An important keyword in JavaScript generator functions
Before we begin with the theory, let me give you a quick gist of the terminology used to explain the concept.
Generator function
A function whose execution can be paused and resumed, so it produces values iteratively rather than in one continuous run. Calling a generator function returns a new generator object, and each generator object can be iterated only once.
Generator functions are a powerful tool for asynchronous programming because they alleviate the problems associated with callbacks.
Syntax
function* name([param[, param[, ... param]]]) {
statements
}
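For instance, here is a minimal sketch (the range name and its bounds are just illustrative):
function* range(start, end) {
  // Produces the numbers from start up to (but not including) end.
  for (let i = start; i < end; i++) {
    yield i; // pause here and hand i to the caller
  }
}

// Calling range() returns a fresh generator object,
// which for...of drains exactly once.
for (const n of range(1, 4)) {
  console.log(n); // 1, 2, 3
}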
Producer-Consumer Pattern
The producer’s job is to generate the data, buffer it, and repeat. The consumer’s job is to consume the data one piece at a time.
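As a rough sketch of this pattern with a generator (produceChunks and the sample data are hypothetical):
// Producer: yields one piece of data at a time instead of building everything up front.
function* produceChunks(data, chunkSize) {
  for (let i = 0; i < data.length; i += chunkSize) {
    yield data.slice(i, i + chunkSize); // generate, hand over, pause, repeat
  }
}

// Consumer: pulls and processes one piece at a time.
for (const chunk of produceChunks('abcdefgh', 3)) {
  console.log(chunk); // 'abc', 'def', 'gh'
}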
yield
The yield keyword can only be used inside a generator function and is what makes the producer-consumer pattern work: it pauses and resumes the generator. It cannot be called from nested functions or callbacks; it is valid only directly within the containing generator function. Each call to the generator's next() method runs the body up to the next yield and returns an object with two properties: value and done.
Syntax
[rv] = yield [expression]
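A short sketch of how this plays out (echo is a made-up example): each call to next() returns the { value, done } pair, and the argument passed to next() becomes the value of the paused yield expression, i.e. rv above.
function* echo() {
  const rv = yield 'first'; // pauses; rv becomes the argument of the following next() call
  yield `got: ${rv}`;
}

const gen = echo();
console.log(gen.next());        // { value: 'first', done: false }
console.log(gen.next('hello')); // { value: 'got: hello', done: false }
console.log(gen.next());        // { value: undefined, done: true }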
yield*
The yield* expression is used to delegate to another iterable object or a generator.
Syntax
yield* expression
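For example, a generator can delegate part of its sequence to another generator (inner and outer are illustrative names):
function* inner() {
  yield 2;
  yield 3;
}

function* outer() {
  yield 1;
  yield* inner(); // delegation: values from inner() flow straight through to the caller
  yield 4;
}

console.log([...outer()]); // [1, 2, 3, 4]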
Now, let's take a scenario where you need to enumerate all the files on a hard disk and process them. There are two possible approaches:
- Using an Array
- Use a generator function - yield
Let’s learn the pros and cons of both approaches.
1. Using an Array
If the count runs into the millions, enumerating a million array items, then reading and processing them, wastes not only CPU time but also a great deal of memory. This approach is only recommended for smaller chunks of data.
This is how we would write the logic:
const fs = require('fs');
const path = require('path');

function readAllFiles(root) {
  const result = [];
  for (const item of fs.readdirSync(root)) {
    const fullPath = path.join(root, item);
    result.push(fullPath);
    // Recurse into subdirectories and collect their contents too.
    if (fs.statSync(fullPath).isDirectory()) {
      for (const child of readAllFiles(fullPath)) {
        result.push(child);
      }
    }
  }
  return result;
}

function processFiles(root) {
  const files = readAllFiles(root);
  for (const file of files) {
    console.log(file);
  }
}
For example, if each file entry is ~100 bytes, storing 1 million of them would need about 100 MB of memory, plus storage for bookkeeping (pointer references etc.). In total this would amount to approximately 200-500 MB.
As shown in the piece of code above, not only does this approach allocate too many arrays, but if the execution fails for any reason, essentially no work gets done: we might run into memory errors before a single file is processed. Reading everything time and again is also inefficient and results in cache busting.
- Cache busting – A process of uploading a new file and replacing a file that is already cached.
So, to conclude: enumerating a large array like this is bad practice and an expensive operation.
2. Use a generator function - yield
In such scenarios, a generator function comes in handy. The generator enumerates the files and yields them one at a time, so the consumer can process each item and then ask the generator for the next one.
function* readAllFiles(root) {
  for (const item of fs.readdirSync(root)) {
    const fullPath = path.join(root, item);
    yield fullPath; // hand one path to the consumer, then pause
    if (fs.statSync(fullPath).isDirectory()) {
      yield* readAllFiles(fullPath); // delegate to a nested generator for subdirectories
    }
  }
}

function processFiles(root) {
  for (const file of readAllFiles(root)) {
    console.log(file);
  }
}
As we can see in the piece of code above, no array is allocated. Even if, for any reason, only 50% of the items are produced, those items will already have been processed by processFiles(). The processFiles() function does not wait for the entire list to be read before it starts processing.
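To make that concrete, here is a small sketch (processSomeFiles is a hypothetical helper building on the readAllFiles generator above): the consumer can even stop early, and the generator simply never does the remaining work.
// Process only the first `limit` files and stop; directories beyond that are never read.
function processSomeFiles(root, limit = 10) {
  let count = 0;
  for (const file of readAllFiles(root)) {
    console.log(file);
    if (++count >= limit) break; // breaking out also closes the generator
  }
}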
Hope we learnt something new today. Keep Learning!