Abstract:
A method and apparatus for implementing a buffer cache for a persistent file system in non-volatile memory are provided. A set of data is maintained in one or more extents in non-volatile random-access memory (NVRAM) of a computing device. At least one buffer header is allocated in dynamic random-access memory (DRAM) of the computing device. In response to a read request by a first process executing on the computing device to access one or more first data blocks in a first extent of the one or more extents, the first process is granted direct read access to the first extent in NVRAM. A reference to the first extent in NVRAM is stored in a first buffer header. The first buffer header is associated with the first process. The first process uses the first buffer header to directly access the one or more first data blocks in NVRAM.
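The buffer-header arrangement described above can be pictured with a short C sketch. This is only an illustration under assumptions of our own (a memory-mapped NVRAM extent, fixed-size blocks, and invented names such as buffer_header and grant_direct_read); it is not the claimed implementation.

/* Illustrative sketch: a DRAM buffer header that points directly into an
 * NVRAM-resident extent instead of holding a copy of the block in DRAM.
 * NVRAM is assumed to be memory-mapped into the process address space. */
#include <stddef.h>
#include <stdint.h>

#define BLOCK_SIZE 8192            /* assumed fixed data block size */

/* An extent living in memory-mapped NVRAM. */
typedef struct nvram_extent {
    uint8_t *base;                 /* mapped start address of the extent */
    size_t   num_blocks;           /* number of data blocks it contains  */
} nvram_extent;

/* A buffer header allocated in DRAM; rather than caching the block's
 * contents, it records where the block lives in NVRAM. */
typedef struct buffer_header {
    const nvram_extent *extent;    /* referenced NVRAM extent            */
    size_t              block_no;  /* block within that extent           */
    int                 owner_pid; /* process granted direct read access */
} buffer_header;

/* Grant a reader direct access: record the NVRAM location in the DRAM
 * header and return a pointer the reader can dereference in place. */
const uint8_t *grant_direct_read(buffer_header *hdr, const nvram_extent *ext,
                                 size_t block_no, int pid)
{
    if (block_no >= ext->num_blocks)
        return NULL;                          /* out-of-range block */
    hdr->extent    = ext;
    hdr->block_no  = block_no;
    hdr->owner_pid = pid;
    return ext->base + block_no * BLOCK_SIZE; /* read directly from NVRAM */
}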
Abstract:
Techniques are provided for managing cached data objects in a mixed workload environment. In an embodiment, a database system receives a request to access a target data object. The database system determines whether the request to access the target data object is associated with a first type of workload or a second type of workload. In response to determining that the request is associated with the first type of workload, the target data object replaces a least recently used data object in a cache. In response to determining that the request is associated with the second type of workload, the target data object is cached based on an associated access-level value.
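A minimal C sketch of the workload-dependent policy follows. The workload labels, the access-level admission rule, and all structure and function names are illustrative assumptions rather than the specific policy claimed.

/* Illustrative sketch of workload-aware caching: requests tagged with the
 * first workload type use plain LRU replacement, while requests tagged with
 * the second type are admitted based on an access-level value.
 * Assumes a non-empty, pre-filled fixed-size cache. */
#include <stddef.h>

typedef enum { WORKLOAD_FIRST, WORKLOAD_SECOND } workload_type;

typedef struct data_object {
    long id;
    int  access_level;             /* higher value = more cache-worthy  */
    long last_used;                /* logical clock for LRU bookkeeping */
} data_object;

typedef struct cache {
    data_object *slots;            /* fixed array of cached objects */
    size_t       capacity;         /* assumed > 0                   */
    long         clock;            /* monotonically increasing tick */
} cache;

size_t lru_victim(const cache *c)           /* least recently used slot */
{
    size_t victim = 0;
    for (size_t i = 1; i < c->capacity; i++)
        if (c->slots[i].last_used < c->slots[victim].last_used)
            victim = i;
    return victim;
}

size_t coldest_by_level(const cache *c)     /* lowest access-level slot */
{
    size_t victim = 0;
    for (size_t i = 1; i < c->capacity; i++)
        if (c->slots[i].access_level < c->slots[victim].access_level)
            victim = i;
    return victim;
}

/* Cache the target object according to the workload that requested it. */
void cache_on_access(cache *c, data_object target, workload_type w)
{
    target.last_used = ++c->clock;
    if (w == WORKLOAD_FIRST) {
        /* First workload type: target simply replaces the LRU object. */
        c->slots[lru_victim(c)] = target;
    } else {
        /* Second workload type: admit only if the target's access level
         * exceeds that of the coldest cached object. */
        size_t v = coldest_by_level(c);
        if (target.access_level > c->slots[v].access_level)
            c->slots[v] = target;
    }
}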
Abstract:
Techniques herein store database blocks (DBBs) in byte-addressable persistent memory (PMEM) and prevent tearing without deadlocking or waiting. In an embodiment, a computer hosts a DBMS. A reader process of the DBMS obtains, without locking and from metadata in PMEM, a first memory address for directly accessing a current version, which is a particular version, of a DBB in PMEM. Concurrently and without locking: a) the reader process reads the particular version of the DBB in PMEM, and b) a writer process of the DBMS replaces, in the metadata in PMEM, the first memory address with a second memory address for directly accessing a new version of the DBB in PMEM. In an embodiment, a computer performs without locking: a) storing, in PMEM, a DBB, b) copying into volatile memory, or reading, an image of the DBB, and c) detecting whether the image of the DBB is torn.
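The lock-free reader/writer interaction and the tear check can be sketched in C roughly as follows. The pointer-swap metadata and the head/tail version-stamp scheme are assumptions chosen to illustrate the idea, not the claimed mechanism.

/* Illustrative sketch of lock-free block versioning in persistent memory:
 * PMEM metadata holds the address of the current version of a database
 * block; readers load that address and read in place, while a writer
 * publishes a new version by swapping the address. Tearing of a copied
 * image is detected by comparing version stamps at both ends of the block. */
#include <stdatomic.h>
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define DBB_PAYLOAD 8176

typedef struct dbb_version {
    uint64_t version_head;          /* stamp written before the payload */
    uint8_t  payload[DBB_PAYLOAD];
    uint64_t version_tail;          /* same stamp written after payload */
} dbb_version;

typedef struct dbb_metadata {
    _Atomic(dbb_version *) current; /* address of current version in PMEM */
} dbb_metadata;

/* Reader: obtain the current version's address without locking. */
const dbb_version *reader_current(dbb_metadata *meta)
{
    return atomic_load(&meta->current);
}

/* Writer: publish a fully written new version by replacing the address. */
void writer_publish(dbb_metadata *meta, dbb_version *new_version)
{
    atomic_store(&meta->current, new_version);
}

/* Copy a block image into volatile memory and detect whether it is torn:
 * if the head and tail stamps of the copy disagree, a concurrent write
 * overlapped the copy. */
bool copy_is_torn(const dbb_version *src, dbb_version *dram_copy)
{
    memcpy(dram_copy, src, sizeof *dram_copy);
    return dram_copy->version_head != dram_copy->version_tail;
}

A reader that finds its copy torn simply retries the copy, which is why neither the reader nor the writer needs to wait or take a lock.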
Abstract:
Techniques are provided for using an intermediate cache between the shared cache of an application and the non-volatile storage of a storage system. The application may be any type of application that uses a storage system to persistently store data. The intermediate cache may be local to the machine upon which the application is executing, or may be implemented within the storage system. In one embodiment where the application is a database server, the database system includes both a DB server-side intermediate cache and a storage-side intermediate cache. The caching policies used to populate the intermediate cache are intelligent, taking into account factors that may include which object an item belongs to, the type of the item, a characteristic of the item, or the type of operation in which the item is involved.
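One way to picture the two-tier read path and an admission policy is the C sketch below. The helper functions for the shared cache, intermediate cache, and storage are assumed to exist elsewhere, and the policy inputs shown are merely examples of the factors listed above.

/* Illustrative sketch of the read path with an intermediate cache between
 * the application's shared cache and non-volatile storage, plus a policy
 * hook that decides admission from item attributes. All names are
 * hypothetical. */
#include <stdbool.h>

typedef struct item_key { long object_id; long block_no; } item_key;

typedef enum { OP_SINGLE_ROW, OP_LARGE_SCAN } op_type;

typedef struct item_attrs {
    int     item_type;             /* e.g. index block vs. table block */
    op_type operation_type;        /* operation that produced the read */
    bool    frequently_read;       /* one possible item characteristic */
} item_attrs;

/* Cache-tier and storage accessors, assumed to be implemented elsewhere. */
bool shared_cache_get(item_key k, void *buf);
bool intermediate_cache_get(item_key k, void *buf);
void intermediate_cache_put(item_key k, const void *buf);
void storage_read(item_key k, void *buf);

/* Example admission policy: skip the intermediate cache for items produced
 * by large scans, keep items that are characteristically reread. */
bool should_cache_intermediate(const item_attrs *a)
{
    return a->frequently_read && a->operation_type != OP_LARGE_SCAN;
}

/* Read path: shared cache first, then intermediate cache, then storage. */
void read_item(item_key k, const item_attrs *attrs, void *buf)
{
    if (shared_cache_get(k, buf))
        return;                          /* hit in the shared cache          */
    if (intermediate_cache_get(k, buf))
        return;                          /* hit in the intermediate cache    */
    storage_read(k, buf);                /* miss: go to non-volatile storage */
    if (should_cache_intermediate(attrs))
        intermediate_cache_put(k, buf);  /* populate per the caching policy  */
}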