Concurrency vs Parallelism. What’s the difference?

Concurrency and Parallelism are not the same.

Think of it like this:

Concurrency is how you manage multiple things that need to run at the same time. Parallelism is running multiple things at the same time.

So, manage vs run.

With Concurrency, you may have two things that you need to get done but only one resource available to do them. Sort of like when one person is multitasking.

For example, you are cooking pasta for dinner!

You fill a pot with water and put it on the stove to heat up. While the water is heating, you don’t stand there waiting for it to boil; you move on to another task. You pick out the noodles you want to cook and begin preparing the sauce. Then the water boils, so you pause the sauce and add the noodles. Then it’s back to the sauce and other tasks while the noodles cook.

You’ve taken multiple things (cooking noodles, preparing the sauce) and broken them up into smaller bits, allowing you to do them concurrently instead of in sequence (finish the noodles and then finish the sauce), because doing them in sequence would take way too long, right?

On the other hand, with Parallelism, you may have two things to do and have two resources to do them. One resource per thing. Or two people in the kitchen: one person cooking the noodles and one person preparing the sauce. That’s parallelism.

Are they the same? No!

Manage multiple things at the same time (concurrency) vs have multiple things run at the same time (parallelism).
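
If you want to see the “two people in the kitchen” idea in code, here is a rough sketch using Node.js worker_threads (assuming the file is an ES module, like the examples below, and that you have at least two CPU cores). Each worker can run on its own core at the same time:

import { Worker, isMainThread, parentPort, workerData } from 'node:worker_threads';

if (isMainThread) {
  // Two workers: one "cooks the noodles", one "prepares the sauce", each on its own thread
  for (const task of ['cook the noodles', 'prepare the sauce']) {
    const worker = new Worker(new URL(import.meta.url), { workerData: task });
    worker.on('message', (msg) => console.log(msg));
  }
} else {
  // Each worker does its own (CPU-heavy) work independently of the other
  let total = 0;
  for (let i = 0; i < 1e8; i++) total += i; // stand-in for real work
  parentPort.postMessage(`Done: ${workerData}`);
}

With at least two cores, both workers genuinely run at the same time. That’s the two-cooks picture.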

Here is a 30-minute video by Rob Pike explaining the difference. He uses the Go programming language to illustrate his ideas, but the ideas are not language-dependent.

What is Concurrency? Give me a code example!

You may or may not follow the Concurrency examples that Rob provides in the video above.

To better understand concurrency with a code example, we can look at the idea of async/await.

Languages like JavaScript and C# (there are others) let you call a function and keep running other code while that function runs, in case it happens to run long. Then, once that long-running function finishes, the code that was waiting on its result picks up where it left off and your program continues on.

Here is a JavaScript example of Concurrency:

import fetch from 'node-fetch';

async function cook() {
  // Waiting on this slow web request stands in for waiting on the water to heat up
  await fetch("https://mikeshannon.com/heatTheWater");
  console.log("The water is now boiling");
}

cook();
console.log("Prepare the sauce");

In the above example, even though the console.log statement comes after the call to cook(), it will complete before the cook() function does. That’s because cook() is long running: it calls fetch() and waits for the website to return a response, and while it waits, control goes back to the calling code.

The “async” before the function keyword and the “await” before our fetch() call tell our program to keep running other things (in this case the console.log(“Prepare the sauce”)) while the fetch() is waiting on a response.

Running the code we get…

Prepare the sauce
The water is now boiling

We can take the example a little further and cook some turkey pasta!

In this example we set up what we think will be the long-running tasks as async/await functions: heating the water, cooking the noodles, and cooking the turkey. The other tasks of preparing the sauce and the strainer are quick and won’t take as long.

import fetch from 'node-fetch';

async function heatTheWater() {
    console.log("Starting to heat the water");
    await fetch("https://mikeshannon.com/heatTheWater");
    console.log("The water is boiling");
}

async function cookTheNoodles() {
    console.log("Adding noodles");
    await fetch("https://mikeshannon.com/cookTheNoodles");
    console.log("Noodles are done cooking");
}

async function cookTheTurkey() {
    console.log("Starting to cook the turkey");
    await fetch("https://mikeshannon.com/cookTheTurkey");
    console.log("Turkey is done cooking");
}

// Boil the water first, then add the noodles once it's boiling
heatTheWater().then(() => cookTheNoodles());
console.log("Prepare the sauce");
console.log("Prepare the strainer");
// The turkey doesn't depend on the water, so start it right away
cookTheTurkey();

And let’s run it and get the response…

Starting to heat the water
Prepare the sauce
Prepare the strainer       
Starting to cook the turkey
The water is boiling
Adding noodles
Turkey is done cooking
Noodles are done cooking

Instead of doing one activity after another in sequence (heating the water, waiting for it to boil, then cooking the noodles, then preparing the sauce, then the turkey), our program broke things up and multitasked, acting with concurrency to make the most of its time (using async/await in the code):

You can see from the response that the program started with heating the water, but water typically takes some time to start boiling. In the meantime it prepared the sauce and the strainer. Still waiting for the water to boil, it started cooking the turkey. Then the water started boiling, so it added the noodles right away. Then the turkey finished cooking. And finally the noodles finished cooking too.

Whew what a meal!

(Heads up: each run may produce a slightly different response depending on how long each fetch() request takes… but I think you get the idea.)
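
For contrast, here is roughly what the fully sequential version would look like, reusing the same functions from above (a sketch that assumes the file is an ES module, so top-level await works): every step waits for the previous one to finish before starting.

// Everything in sequence: each step blocks the next one from starting
await heatTheWater();
await cookTheNoodles();
console.log("Prepare the sauce");
console.log("Prepare the strainer");
await cookTheTurkey();

Run this way, the total time is the sum of every step, since the quick tasks no longer happen while the slow ones are waiting.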

Another example of a long-running request is a call to a database or any third-party service. In that case you may not want that waiting to block your other code from running, and leveraging concurrency helps you make the most of what you have.
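
For instance, here is a small sketch of that pattern (the /lookupUser endpoint and the handleRequest() function are made up for illustration): start the slow call, keep working, and only await the result when you actually need it.

import fetch from 'node-fetch';

async function handleRequest() {
  // Kick off the slow lookup (a stand-in for a database or third-party call)...
  const lookup = fetch("https://mikeshannon.com/lookupUser");
  // ...and keep doing useful work while it is in flight
  console.log("Do the work that doesn't need the lookup");
  // Only now pause until the lookup has finished
  const response = await lookup;
  console.log("Lookup finished with status", response.status);
}

handleRequest();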

Concurrency lets your code multitask, whether it has one CPU to run on or many.