Also called: Atomic, Atomic operation. Related terms: Linearize, Linearizing, Linearizes, Linearized
See also: Atomicity (database systems)
In concurrent programming, an operation (or set of operations) is linearizable if it consists of an ordered list of invocation and response events that may be extended by adding response events such that:
- The extended list can be re-expressed as a sequential history (is serializable).
- That sequential history is a subset of the original unextended list.
Informally, this means that the unmodified list of events is linearizable if and only if its invocations were serializable, but some of the responses of the serial schedule have yet to return.
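To make the definition concrete, here is a minimal sketch in Java. The class name HistoryExample is hypothetical, and java.util.concurrent.atomic.AtomicInteger is used only as a stand-in for a linearizable register; that choice is an assumption, not part of the definition above. Thread A invokes a write while thread B concurrently invokes a read, and every result the program can print corresponds to some sequential history of the invocation and response events that respects their real-time order.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class HistoryExample {
    public static void main(String[] args) throws InterruptedException {
        // A linearizable register, initially 0 (AtomicInteger stands in for the register).
        AtomicInteger register = new AtomicInteger(0);

        // Thread A: invoke write(1) ... respond ok
        Thread a = new Thread(() -> register.set(1));

        // Thread B: invoke read() ... respond with the value read
        Thread b = new Thread(() -> {
            int v = register.get();
            System.out.println("B read " + v); // prints 0 or 1, never a partial value
        });

        a.start();
        b.start();
        a.join();
        b.join();
    }
}
```

Because the two operations overlap in real time, both 0 and 1 are linearizable results for B's read; a value reflecting neither the old nor the new state would not be.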
In a concurrent system, processes can access a shared object at the same time. Because multiple processes access a single object, a situation may arise in which one process reads the object while another process is changing its contents. Making a system linearizable is one solution to this problem. In a linearizable system, although operations on a shared object may overlap, each operation appears to take effect instantaneously at some point between its invocation and its response. Linearizability is a strong correctness condition that constrains what outputs are possible when an object is accessed by multiple processes concurrently. It is a safety property, ensuring that operations do not complete unexpectedly or unpredictably. If a system is linearizable, a programmer can reason about it as if operations were executed one at a time.
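As an illustrative sketch (assuming Java; the class name LinearizableCounter is hypothetical), the following contrasts a plain int counter, whose read-modify-write can interleave and lose updates, with java.util.concurrent.atomic.AtomicInteger, whose incrementAndGet() is linearizable and appears to take effect at a single instant.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class LinearizableCounter {
    // Non-linearizable: counter++ is a read-modify-write done in three separate steps,
    // so concurrent increments can interleave and overwrite each other.
    static int plainCounter = 0;

    // Linearizable: incrementAndGet() appears to take effect at a single instant,
    // so no increment is ever lost.
    static final AtomicInteger atomicCounter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                plainCounter++;
                atomicCounter.incrementAndGet();
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("plain counter:  " + plainCounter);        // often less than 200000
        System.out.println("atomic counter: " + atomicCounter.get()); // always 200000
    }
}
```

Running the two threads typically leaves the plain counter below 200000 because concurrent increments overwrite each other, while the atomic counter always reaches exactly 200000; this is the practical payoff of the linearizability guarantee described above.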