Big memory

Big memory is a software and hardware approach to storing, retrieving, and processing large data sets (terabytes and larger). The term is akin to big data, and in some instances big memory is a form of big data processing architecture implemented in main memory rather than on disk-based storage. Caches of various kinds are one common use of big memory.
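As a hedged illustration of the caching use case, here is a minimal sketch of an in-memory LRU (least recently used) cache in Python; the LRUCache class and its method names are hypothetical and are not taken from any particular big-memory product.

from collections import OrderedDict

class LRUCache:
    """A minimal in-memory cache (hypothetical example). Evicts the least
    recently used entry once it holds max_entries items; all data lives in RAM."""

    def __init__(self, max_entries: int = 1024):
        self.max_entries = max_entries
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        # Mark the entry as most recently used.
        self._data.move_to_end(key)
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.max_entries:
            # Evict the least recently used entry (the oldest one).
            self._data.popitem(last=False)

# Usage: repeated reads are served from RAM; eviction bounds memory use.
cache = LRUCache(max_entries=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")      # "a" becomes most recently used
cache.put("c", 3)   # evicts "b", the least recently used entry
assert cache.get("b") is None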
Computer main memory (RAM) works orders of magnitude faster than hard drives, and considerably faster than solid-state drives, largely because of lower access latency and higher raw data throughput from the tight coupling of the CPU and RAM chips (a wide memory bus, with the CPU and RAM typically installed on the same motherboard).
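A rough sense of this speed gap can be obtained with a sketch like the following, which times reading a payload back from a file versus copying it while it already resides in RAM. One caveat: the operating system's page cache may itself keep the file in memory, so the measured gap understates the difference against a cold disk.

import os
import tempfile
import time

SIZE = 256 * 1024 * 1024  # 256 MB payload
payload = os.urandom(SIZE)

# Disk: write the payload to a temporary file, then time reading it back.
# (The OS page cache may serve part of this read from RAM.)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

start = time.perf_counter()
with open(path, "rb") as f:
    disk_copy = f.read()
disk_s = time.perf_counter() - start

# RAM: time copying the same payload that already resides in memory.
start = time.perf_counter()
ram_copy = bytes(payload)
ram_s = time.perf_counter() - start

print(f"disk read: {disk_s:.3f}s, in-memory copy: {ram_s:.3f}s")
os.remove(path)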
Locality of reference is another characteristic that caches and fast access depend on: data that was accessed recently, or that sits near recently accessed data, can be served quickly from memory.
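The sketch below illustrates locality of reference by touching the same set of bytes in a large buffer twice, once in sequential order and once in shuffled order; sequential access walks consecutive cache lines, while the shuffled order defeats the CPU's caches and prefetcher. In Python, interpreter overhead narrows the gap compared with a lower-level language, but the effect is usually still visible on a buffer much larger than the CPU caches.

import random
import time

N = 64 * 1024 * 1024                 # 64 MB buffer, far larger than CPU caches
buf = bytearray(N)

seq_idx = list(range(0, N, 64))      # stride through consecutive cache lines
rand_idx = list(seq_idx)
random.shuffle(rand_idx)             # same indices, cache-hostile order

def checksum(indices):
    # Touch one byte per index; timing, not the sum, is what we care about.
    total = 0
    for i in indices:
        total += buf[i]
    return total

start = time.perf_counter(); checksum(seq_idx);  seq_s  = time.perf_counter() - start
start = time.perf_counter(); checksum(rand_idx); rand_s = time.perf_counter() - start
print(f"sequential: {seq_s:.3f}s, random: {rand_s:.3f}s")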
The price of computer memory chips has declined significantly since the late 2000s; as of 2015, it was affordable to equip a server with 256 gigabytes of RAM.
Currently, few vendors offer mature big-memory software solutions, while hardware options (e.g., inexpensive RAM modules) are plentiful. Terracotta has developed an "in-memory data management suite".

License information: CC BY-SA 3.0
Go to source: https://en.wikipedia.org/wiki/Big_memory
