Data Access Concurrency
Definition: Data access concurrency refers to the situation where multiple users, processes, or threads attempt to read or modify the same data at the same time. Proper concurrency control is essential to ensure data consistency, integrity, and reliability.
Typical Problems
- Race conditions: two or more processes operate on shared data at the same time, and the result depends on the unpredictable order in which their operations interleave.
- Dirty reads: reading data written by a transaction that has not yet been committed and may still be rolled back.
- Lost updates: one process's change silently overwrites another process's change.
- Inconsistencies: after concurrent operations, the data no longer reflects the real state of the system.
Examples
In a banking database, if two people withdraw money from the same account at the same time and there is no concurrency control, more money may be withdrawn than the account actually holds, as the sketch below illustrates.
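A minimal Python sketch of this race, assuming the "account" is just a shared variable rather than a real database: both threads pass the balance check before either one subtracts, so the account goes negative. The sleep call only widens the race window so the effect is reproducible.

```python
import threading
import time

balance = 100  # shared account balance, no concurrency control

def withdraw(amount):
    global balance
    if balance >= amount:      # 1. check the balance
        time.sleep(0.01)       # widen the race window for the demo
        balance -= amount      # 2. withdraw -- the check may already be stale

threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(balance)  # typically -60: both withdrawals passed the now-stale check
```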
In a multithreaded program, if two threads increment the same global variable at the same time, some increments may be "lost"; the sketch below shows the lost updates and a lock-based fix.
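A minimal sketch of lost updates on a plain Python counter, together with the same loop protected by a threading.Lock. The time.sleep(0) calls are only there to encourage a thread switch between the read and the write so the effect shows up reliably; they are not part of the technique.

```python
import threading
import time

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        tmp = counter          # read the shared value
        time.sleep(0)          # let another thread run between read and write
        counter = tmp + 1      # write back -- any update made in between is lost

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:             # the whole read-modify-write is now exclusive
            tmp = counter
            time.sleep(0)
            counter = tmp + 1

def run(worker, n=5_000, num_threads=4):
    global counter
    counter = 0
    threads = [threading.Thread(target=worker, args=(n,)) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print(run(unsafe_increment))  # typically far less than 20000: increments were lost
print(run(safe_increment))    # always 20000
```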
Management Methods
- Locks and semaphores: ensure that only one process accesses the data at a time.
- ACID transactions: guarantee atomicity, consistency, isolation, and durability.
- Pessimistic concurrency control: locks the data as soon as a process starts working with it, so other processes must wait until the lock is released.
- Optimistic concurrency control: allows multiple processes to work with the data at the same time and checks for conflicts just before committing changes (a minimal sketch of this approach follows the list).
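A minimal in-memory sketch of optimistic concurrency control using a version number. The names (Record, try_update, update_with_retry) are illustrative and not taken from any library; databases typically implement the same idea with a version or timestamp column. Each writer reads the record without holding a lock, does its work, and the update is only applied if the version has not changed in the meantime; otherwise it retries.

```python
import threading
from dataclasses import dataclass

@dataclass
class Record:
    value: int
    version: int = 0

_commit_lock = threading.Lock()   # guards only the short snapshot and commit steps

def try_update(record, compute_new_value):
    """Return True if the update committed, False if another writer got there first."""
    with _commit_lock:                          # take a consistent snapshot
        read_value, read_version = record.value, record.version
    new_value = compute_new_value(read_value)   # possibly slow work, done without any lock
    with _commit_lock:                          # brief compare-and-commit step
        if record.version != read_version:
            return False                        # conflict: someone committed since our read
        record.value = new_value
        record.version += 1
        return True

def update_with_retry(record, compute_new_value):
    while not try_update(record, compute_new_value):
        pass                                    # conflict detected: re-read and try again

account = Record(value=100)
workers = [threading.Thread(target=update_with_retry, args=(account, lambda v: v - 10))
           for _ in range(5)]
for t in workers:
    t.start()
for t in workers:
    t.join()
print(account.value, account.version)  # 50 5 -- each withdrawal applied exactly once
```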
See also: ACID Transactions, Race conditions.