Parallel is always better than Serial... right?
NO
Well, in most cases, parallel computing does beat serial computing in terms of speed and throughput. But sometimes we have to weigh other considerations too.
As a comparison, consider data transmission in computer hardware, where serial links are straightforward and faster than their parallel SCSI counterparts!
Some processes are inherently not parallelizable, due to the presence of data dependencies.
(Consider two bank-account withdrawals from different locations, which could leave a negative balance if performed simultaneously! Then again, such a pair of withdrawals, with its critical section managed by semaphores/mutexes, conceptually and momentarily reduces to serial execution...)
On a lighter note, sneezing while keeping your eyes open is not parallelizable, for example!
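Returning to the withdrawal scenario above, here is a minimal Python sketch (the `balance` variable and `withdraw` helper are purely illustrative) of how a mutex serializes the critical section so that the balance can never go negative:

```python
import threading

balance = 100                      # hypothetical shared account balance
balance_lock = threading.Lock()    # mutex guarding the critical section

def withdraw(amount):
    """Withdraw `amount` only if sufficient funds remain."""
    global balance
    with balance_lock:             # only one thread may check-and-update at a time
        if balance >= amount:
            balance -= amount
            return True
        return False               # the competing withdrawal is refused

# Two "locations" attempting to withdraw at the same time.
t1 = threading.Thread(target=withdraw, args=(80,))
t2 = threading.Thread(target=withdraw, args=(80,))
t1.start(); t2.start()
t1.join(); t2.join()
print(balance)                     # 20, never -60
```

Inside the `with balance_lock:` block the two threads effectively run one after the other, which is precisely the "momentarily reduces to serial execution" point made above.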
Before jumping into the concepts and principles of parallelizing a given task, let us go through an interesting set of (controversial) keywords:
- Serial
- Concurrent
- Parallel
- Simultaneous
- Asynchronous
The above words act as qualifiers for words like computing, execution, or processes. There are a few subtle details differentiating them.
Concurrency means that two or more tasks are in progress at a given point of time.
Parallelism means that two or more tasks are simultaneously happening at a given point of time (in distinct processing units).
Different processing units are necessary for parallelism.
Concurrency does not necessitate multiple processing units.
Parallelism implies Concurrency.
Concurrency is a necessary condition for Parallelism.
So, the takeaway is that if some tasks are executing concurrently, they might be executing in parallel as well. This is my perception and understanding of the parallel and concurrent worlds.
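The sketch below (a rough illustration, using a hypothetical CPU-bound `busy_work` helper) contrasts the two in Python: with threads, both tasks are concurrently in progress, but CPython's GIL keeps CPU-bound threads from running in parallel on separate cores; with processes, the tasks are both concurrent and truly parallel, given at least two cores.

```python
import time
from threading import Thread
from multiprocessing import Process

def busy_work(n=10_000_000):
    """CPU-bound stand-in for a task."""
    total = 0
    for i in range(n):
        total += i

def timed_run(worker_cls):
    """Run two workers of the given kind and return the wall-clock time."""
    workers = [worker_cls(target=busy_work) for _ in range(2)]
    start = time.perf_counter()
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    print("threads  :", timed_run(Thread))    # concurrent, but not parallel (GIL)
    print("processes:", timed_run(Process))   # concurrent and parallel (multi-core)
```

On a multi-core machine the process-based run typically takes roughly half the time of the thread-based run; on a single core, both would take about the same time. Concurrency alone does not promise a speed-up.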
Simultaneity is strikingly similar to parallelism; the former is predominantly used to describe the timing of events in day-to-day conversation.
Asynchronous processing just refers to the fact that some tasks are performed in the background without you having to explicitly wait for the results. You are informed of the results at a later point in time.
Mathematically speaking:
- Parallel lines are purely independent and never meet; they never interact or influence each other (at least according to the "parallel lines never meet" hypothesis)! Perfect parallelism is rare, as there is almost always some dependency in the design of algorithms/data structures.
[Figure: Perfect/embarrassing parallelism with two processes]
- Concurrent lines converge towards a point and thus "collide" at this point of intersection. This analogy can be used to reason about the interruptibility of concurrent processes/tasks.
[Figure: Three processes with a single point of communication and information exchange]
- A simultaneous set of equations is said to exist if it binds the dependent variable(s) with certain constraints at the same time. So, a solution to a simultaneous set of equations implies a point of communication between the processing units (hardware/software), while no solution to the set of equations implies perfect parallelism.
[Figure: Simultaneous equations in pleasing digital handwriting (not mine!)]
- Asynchronous processing decouples the calling and the called processes with respect to time, so that the invoker need not waste time waiting idly for the other task to finish. In programmer terms, an asynchronous call eliminates the need for a blocking wait, freeing up the calling thread and allowing efficient usage of resources (a small sketch follows this list).
[Figure: Asynchronous processing for computational efficiency]
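As a rough sketch of that last point (the `fetch_result` coroutine below is purely illustrative), Python's asyncio lets the caller launch background tasks, carry on with other work, and be informed of the results later, without a blocking wait:

```python
import asyncio

async def fetch_result(task_id, delay):
    """Stand-in for a slow background task (e.g. network I/O)."""
    await asyncio.sleep(delay)      # yields control instead of blocking the thread
    return f"task {task_id} finished after {delay}s"

async def main():
    # Kick off two background tasks without waiting for either one.
    t1 = asyncio.create_task(fetch_result(1, 2))
    t2 = asyncio.create_task(fetch_result(2, 1))
    print("caller keeps doing other work...")
    # The results arrive "at a later point in time".
    for result in await asyncio.gather(t1, t2):
        print(result)

asyncio.run(main())
```

The calling thread is never parked on a blocking wait; while the tasks sleep, the event loop is free to run anything else that is ready.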