Understanding Concurrency and Multithreading in iOS Development

by Maksim Niagolov, July 3rd, 2023

Imagine using your app, say a simple “Hello World” app, and every time you make a change, the application freezes and you cannot perform any other task for several seconds.


This is a scenario that would occur if it wasn’t for effective multithreading. Multithreading describes the ability of a processor to handle multiple tasks at the same time. This enables the user to listen to music in one program while writing code in another.


In this article, we will look at the basics of multithreading and concurrency to provide you with the technical knowledge needed to apply them to your iOS application.



Processors, programs, processes, and threads

To understand multithreading and concurrency, we must first become clear on the terminology used around these topics.


The CPU (central processing unit) is the main chip in a machine, responsible for carrying out tasks such as processing commands and reading and altering data.

It is often used interchangeably with the term processor, which may also describe a coprocessor used to supplement the functions of the CPU.


A multi-core processor is a processor that contains two or more processing units, called cores. Tasks can be executed on separate cores at the same time.


A program is an executable file. When it is run, it becomes one or more processes, which in turn are made up of threads.


A process is an instance of a program that is being executed. For example, if you click on the Safari icon, a process will be started which runs the Safari program. Processes are scheduled for execution by the CPU.


A thread is the smallest executable element of a process. A process may have one main thread and several background threads.



Types of threads

Main thread / UI thread

This thread runs when starting the application and listens for user interaction. Changes in the UI, such as the click of a button, are run on the UI thread. It is important not to add a lot of load to it as this may cause the app to freeze. All tasks which are run on the UI thread should respond quickly to ensure minimal waiting times for the user.


Background thread

Depending on the requirements of the program, a background thread may be downloading a large file or making a call to an external interface. As a task like this may take several seconds, it is important that the user is not blocked while waiting for it to complete. Once the task is finished, the UI can be updated with new data.
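
As a rough sketch of this pattern with Grand Central Dispatch, the snippet below starts a download and only touches the UI once it is back on the main thread. The URL and the updateUI function are hypothetical placeholders, not part of any real app:

```swift
import Foundation

// Hypothetical URL and update routine, used only for illustration.
let url = URL(string: "https://example.com/large-file.zip")!

func updateUI(with data: Data) { /* refresh views with the new data */ }

// URLSession performs the transfer on a background thread, so the main
// thread stays free to respond to user interaction while it runs.
URLSession.shared.dataTask(with: url) { data, _, error in
    guard let data = data, error == nil else { return }
    // The completion handler runs on a background queue;
    // hop back to the main thread before updating the UI.
    DispatchQueue.main.async {
        updateUI(with: data)
    }
}.resume()
```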



Concurrency, parallelism, and multithreading

Imagine having multiple conversations on your messenger app and switching between them but never actually chatting with two different people at the same time.


The ability of a program to handle many tasks at once and create the illusion that they are all being executed at the same time when they are just switching very quickly is called concurrency.

It is important not to confuse concurrency with parallelism. In the parallelism model, several tasks are running simultaneously.


Both models fall under the concept of multithreading which describes several threads working together to reach a common goal.


When introducing multithreading to a program, a combination of concurrency and parallelism may be used.
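
A small Grand Central Dispatch sketch makes the distinction concrete; the queue labels are arbitrary. A serial queue only ever interleaves its work, while a concurrent queue is allowed to run tasks in parallel when multiple cores are available:

```swift
import Foundation

let serial = DispatchQueue(label: "com.example.serial")            // arbitrary label
let concurrent = DispatchQueue(label: "com.example.concurrent",    // arbitrary label
                               attributes: .concurrent)

for i in 1...3 {
    // One task at a time, in submission order: concurrency by interleaving.
    serial.async { print("serial task \(i)") }

    // Tasks may overlap and finish in any order; on a multi-core device
    // they can run truly in parallel.
    concurrent.async { print("concurrent task \(i)") }
}
```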


An example of concurrency

Let us look at what happens in the background when you are running your app on one processor.

When a code file is saved, bytes are written to a physical disk. Such an I/O operation is expensive, so while it takes place, the CPU can work on something else.


The I/O thread is switched out for the UI thread, ensuring that when you click on your screen, the app is still responsive instead of being frozen because the I/O operation is not finished.

The I/O process gets scheduled by the CPU to be completed at a later time. The priority of each process waiting to be executed is determined by a scheduling algorithm.
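
In code, the same idea looks roughly like this: the disk write is dispatched to a background queue so the main thread keeps handling taps and scrolling. The file contents and destination are made up for the example:

```swift
import Foundation

let source = "print(\"Hello World\")"
let destination = FileManager.default.temporaryDirectory
    .appendingPathComponent("main.swift")   // hypothetical destination

// The expensive disk write runs on a background queue with utility priority...
DispatchQueue.global(qos: .utility).async {
    try? source.write(to: destination, atomically: true, encoding: .utf8)

    // ...and the main thread is only involved again once the work is done.
    DispatchQueue.main.async {
        print("Save finished")
    }
}
```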


Scheduling

Concurrency creates the illusion of multitasking even though the processor is executing only one thread at a time. Each process is allocated some time and then gets switched out. There are many different methods of scheduling which may be chosen by the developer. Some common examples are:


  • FIFO (First In, First Out): This scheduling method executes processes in the order in which they arrive in the queue. Once a process is scheduled, it runs until completion unless it is blocked.


  • Round Robin: A fixed time slice, also known as the quantum, is allocated to each of the processes which are executed in a cyclical manner. When one process has reached the end of its time slice, the next one in the queue is executed for the given quantum.


  • Shortest remaining time first: The next process in the queue is the one which is estimated to have the least amount of time remaining to be completed. Using this algorithm requires knowledge about how much time each process will take.


  • Priority-based scheduling: Each process is given a priority. The process with the highest priority is executed first, while processes with the same priority are executed based on the FIFO algorithm. The priority may be decided based on requirements regarding memory or time.


The scheduling algorithm used for a given program must be chosen carefully and depends on its specific requirements.
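
On iOS you rarely pick one of these algorithms by hand. What Grand Central Dispatch exposes instead are quality-of-service classes, which act as a rough, priority-based hint to the system scheduler; the snippet below is a minimal sketch of that idea:

```swift
import Foundation

// Work the user is actively waiting for: scheduled ahead of lower-priority tasks.
DispatchQueue.global(qos: .userInitiated).async {
    // e.g. parse the document the user just opened
}

// Maintenance work that can be deferred while higher-priority work is pending.
DispatchQueue.global(qos: .background).async {
    // e.g. prefetching, cleanup, analytics uploads
}
```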



Potential Problems

When multiple threads are running concurrently on shared memory, they must be controlled and used in the most advantageous way. Determining how to manage them efficiently can be difficult.


It is important not to keep adding more and more threads to a program. More threads can mean more problems, and the way they work together must be designed very carefully.

If threads are not managed correctly, there are several problems that can occur.

Deadlock

Deadlocks happen when two or more threads are unable to continue because each requires a resource that another thread holds: thread A holds a resource that thread B needs, while thread B holds a resource that thread A needs. In essence, each thread waits for the other to finish, halting any progress.
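
A minimal Swift sketch of this situation uses two locks acquired in opposite order; the sleep only makes the unlucky timing reliable for the example:

```swift
import Foundation

let lockA = NSLock()
let lockB = NSLock()

// Thread 1 takes lockA, then wants lockB.
DispatchQueue.global().async {
    lockA.lock()
    Thread.sleep(forTimeInterval: 0.1)  // give the other thread time to take lockB
    lockB.lock()                        // blocks forever: thread 2 holds lockB
    lockB.unlock()
    lockA.unlock()
}

// Thread 2 takes lockB, then wants lockA.
DispatchQueue.global().async {
    lockB.lock()
    Thread.sleep(forTimeInterval: 0.1)
    lockA.lock()                        // blocks forever: thread 1 holds lockA
    lockA.unlock()
    lockB.unlock()
}
```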

Race conditions

Race conditions occur when several threads access and modify a shared resource at the same time, “racing” to alter it. This leaves data inconsistent, as the program output changes depending on which thread wins the race.
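
The sketch below shows the problem and one common fix: the unsynchronized counter loses increments, while funnelling every mutation through a serial queue keeps the result consistent (the queue label is arbitrary):

```swift
import Foundation

var unsafeCounter = 0

// 1,000 increments run concurrently; the read-modify-write of `+= 1` is not
// atomic, so updates overwrite each other and the result is usually below 1000.
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    unsafeCounter += 1
}
print(unsafeCounter)   // varies from run to run

// Fix: serialize access to the shared counter through a serial queue.
var safeCounter = 0
let counterQueue = DispatchQueue(label: "com.example.counter")
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    counterQueue.sync { safeCounter += 1 }
}
print(safeCounter)     // always 1000
```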

Starvation

Starvation happens when a thread never gets any CPU time or access to shared resources as other processes are given a higher priority. The starved thread remains at the end of the scheduling queue and is never executed.

Livelock

A livelock occurs when two threads are executing actions in response to one another instead of continuing with their task. Think of it like two people trying to pass each other in a narrow hallway and both moving in the same direction, blocking each other. Consequently, they both move to the other side, blocking each other again and so on.


Multithreading on multiple cores

Since the introduction of machines with multiple cores, it has been possible to have a dedicated processor run each thread, enabling parallelism.


With a single core, the application would need to switch back and forth to create the illusion of multitasking. With multiple cores, the underlying hardware can be used to run each thread on a dedicated core.


This allows the application to take full advantage of the available processing power, making the program more efficient and minimizing the risk of problems associated with concurrency as not all threads are running on the same core.
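
Grand Central Dispatch offers concurrentPerform as a simple way to spread CPU-bound work across the available cores; the sketch below runs one independent chunk of work per core:

```swift
import Foundation

let cores = ProcessInfo.processInfo.activeProcessorCount
print("Running on \(cores) cores")

// Each iteration is independent, CPU-bound work, so GCD can hand every
// chunk to its own core and run them in parallel.
DispatchQueue.concurrentPerform(iterations: cores) { index in
    let sum = (1...2_000_000).reduce(0, +)
    print("chunk \(index) finished, sum = \(sum)")
}
```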


Conclusion

With users’ attention spans decreasing, an app that takes a long time to respond will lead many of them to delete it, impeding any success it might otherwise have had. As standards keep rising, an issue such as the UI freezing while a file downloads in the background is unacceptable.


Therefore, it is important to optimize performance using multithreading. Multithreading makes applications more efficient and user-friendly and ensures that no processing power is wasted.

Managing multiple threads correctly can be challenging since issues such as deadlocks can occur if processes are not scheduled appropriately.


If you want to dive deeper and apply these concepts to your iOS application, it is recommended to use interfaces provided by Apple which support efficient concurrency, notably Grand Central Dispatch and NSOperation. Learn more about these technologies by reading my articles.



