Abstract:
The invention relates to a component such as a rotor or stator for an electrical machine. The component includes a plurality of axially adjacent stacks of laminations. At least one pair of adjacent stacks are spaced apart in the axial direction by spacer means such that a passageway or duct for cooling fluid, e.g. air, is formed therebetween. The spacer means comprises a porous structural mat of metal fibers. The cooling fluid may flow through the spaces or voids between the fibers.
Abstract:
A data prefetcher in a microprocessor having a cache memory receives memory accesses each to an address within a memory block. The access addresses are non-monotonically increasing or decreasing as a function of time. As the accesses are received, the prefetcher maintains a largest address and a smallest address of the accesses, counts of changes to the largest and smallest addresses, and a history of recently accessed cache lines implicated by the access addresses within the memory block. The prefetcher also determines a predominant access direction based on the counts and determines a predominant access pattern based on the history. The prefetcher also prefetches into the cache memory, in the predominant access direction according to the predominant access pattern, cache lines of the memory block which the history indicates have not been recently accessed.
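The following C sketch models how such per-block tracking might be expressed in software, assuming a 4 KB memory block of 64-byte cache lines; the names block_tracker, record_access, predominant_direction, prefetch_unaccessed, and issue_prefetch are illustrative assumptions rather than the patented design, and the predominant-pattern step is simplified to prefetching the lines the history does not mark as recently accessed.

#include <stdint.h>

#define BLOCK_LINES 64      /* assumption: 4 KB block of 64-byte cache lines */
#define LINE_SIZE   64ULL

typedef struct {
    uint64_t base;                 /* base address of the tracked memory block */
    uint64_t largest, smallest;    /* largest/smallest access addresses seen
                                      (initialize smallest to the block's top address) */
    int      up_count, down_count; /* counts of changes to the largest/smallest addresses */
    uint64_t history;              /* one bit per recently accessed cache line in the block */
} block_tracker;

/* Record one access: update the extreme addresses, their change counts,
   and mark the accessed cache line in the history. */
static void record_access(block_tracker *t, uint64_t addr)
{
    if (addr > t->largest)  { t->largest  = addr; t->up_count++;   }
    if (addr < t->smallest) { t->smallest = addr; t->down_count++; }
    t->history |= 1ULL << ((addr - t->base) / LINE_SIZE);
}

/* Predominant access direction: +1 (ascending) or -1 (descending),
   decided by which extreme address changed more often. */
static int predominant_direction(const block_tracker *t)
{
    return (t->up_count >= t->down_count) ? +1 : -1;
}

/* Walk the block in the predominant direction and prefetch the cache lines
   the history indicates have not been recently accessed; issue_prefetch()
   stands in for the hardware's prefetch request logic. */
static void prefetch_unaccessed(const block_tracker *t,
                                void (*issue_prefetch)(uint64_t line_addr))
{
    int dir   = predominant_direction(t);
    int start = (dir > 0) ? 0 : BLOCK_LINES - 1;
    for (int i = start; i >= 0 && i < BLOCK_LINES; i += dir)
        if (!(t->history & (1ULL << i)))
            issue_prefetch(t->base + (uint64_t)i * LINE_SIZE);
}

Keeping the history as a 64-bit mask, one bit per cache line in the block, is one simple way to realize the recently-accessed history the abstract describes.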
Abstract:
A repositionable self-adhesive protective paper configured to protect a valuable surface, such as a floor, wall, or countertop, during building processes is disclosed. The protective paper uses a protection material, such as sturdy paper, to shield the surface. The protective paper has a top surface and a bottom surface. The bottom surface is coated with an adhesive layer along at least two edges. The adhesive layer is configured to adhere easily and hold the protective paper in place. Further, one or more strips of tape secure the non-adhesive edges to the surface. After the building processes are complete, the adhesive layers are removed from the protected surface without residue or damage.
Abstract:
A microprocessor includes an instruction decoder for decoding a repeat prefetch indirect instruction that includes address operands used to calculate an address of a first entry in a prefetch table having a plurality of entries, each including a prefetch address. The repeat prefetch indirect instruction also includes a count specifying a number of cache lines to be prefetched. The memory address of each of the cache lines is specified by the prefetch address in one of the entries in the prefetch table. A count register, initially loaded with the count specified in the prefetch instruction, stores a remaining count of the cache lines to be prefetched. Control logic fetches the prefetch addresses of the cache lines from the table into the microprocessor and prefetches the cache lines from the system memory into a cache memory of the microprocessor using the count register and the prefetch addresses fetched from the table.
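As a rough software model of the behavior described above, the count register is loaded from the instruction's count, and control logic walks the prefetch table, fetching each entry's prefetch address and prefetching the corresponding cache line until the remaining count reaches zero. The function name rep_prefetch_indirect, the prefetch_entry layout, and sim_prefetch_line are illustrative assumptions, not taken from the patent.

#include <stddef.h>
#include <stdint.h>

/* One prefetch-table entry: the memory address of a cache line to prefetch. */
typedef struct {
    uint64_t prefetch_address;
} prefetch_entry;

/* Placeholder for the hardware action of prefetching one cache line
   from system memory into the cache memory. */
extern void sim_prefetch_line(uint64_t address);

/* The count register is loaded with the instruction's count; the table is then
   walked, each entry's prefetch address is fetched, and the corresponding cache
   line is prefetched until the remaining count is exhausted. */
void rep_prefetch_indirect(const prefetch_entry *table, uint64_t count)
{
    uint64_t count_register = count;   /* remaining cache lines to prefetch */
    for (size_t i = 0; count_register > 0; i++, count_register--)
        sim_prefetch_line(table[i].prefetch_address);
}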
Abstract:
An electrical extending outlet having a rigid body that is slidably extendable such that, in use, it extends a wall outlet above a restrictive surface. The rigid body has a male connector on its lower end that is rotatable to allow the rigid body to be positioned in a number of positions relative to the wall outlet. On the upper end of the rigid body is at least one female outlet into which a user can insert a male plug of an electrical device. The rigid body has a fastener to attach it to the wall.
Abstract:
A microprocessor includes a cache memory and a data prefetcher. The data prefetcher detects a pattern of memory accesses within a first memory block and prefetches into the cache memory cache lines from the first memory block based on the pattern. The data prefetcher also observes a new memory access request to a second memory block. The data prefetcher also determines that the first memory block is virtually adjacent to the second memory block and that the pattern, when continued from the first memory block to the second memory block, predicts an access to the cache line within the second memory block implicated by the new request. The data prefetcher also responsively prefetches into the cache memory cache lines from the second memory block based on the pattern.
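A minimal C sketch of carrying a detected pattern across virtually adjacent memory blocks follows; the block_pattern structure, the assumed 4 KB block and 64-byte line sizes, and the helper names virtually_adjacent, continue_pattern, and issue_prefetch are assumptions made for illustration rather than the patented design.

#include <stdint.h>
#include <stdbool.h>

#define BLOCK_SIZE 4096ULL    /* assumption: 4 KB memory blocks */
#define LINE_SIZE    64ULL    /* assumption: 64-byte cache lines */

typedef struct {
    uint64_t block_base;      /* base virtual address of the first memory block */
    uint64_t pattern;         /* bitmask of cache lines the detected pattern touches */
} block_pattern;

/* Two blocks are virtually adjacent if the second immediately follows or
   precedes the first in the virtual address space. */
static bool virtually_adjacent(uint64_t first_base, uint64_t second_base)
{
    return second_base == first_base + BLOCK_SIZE ||
           second_base == first_base - BLOCK_SIZE;
}

/* If continuing the old pattern into the new block would cover the newly
   requested cache line, prefetch the pattern's remaining lines in the new
   block; issue_prefetch() stands in for the hardware's prefetch request logic. */
static void continue_pattern(const block_pattern *p,
                             uint64_t new_access_addr,
                             void (*issue_prefetch)(uint64_t line_addr))
{
    uint64_t new_base = new_access_addr & ~(BLOCK_SIZE - 1);
    if (!virtually_adjacent(p->block_base, new_base))
        return;

    unsigned line = (unsigned)((new_access_addr & (BLOCK_SIZE - 1)) / LINE_SIZE);
    if (!(p->pattern & (1ULL << line)))      /* pattern does not predict this access */
        return;

    for (unsigned i = 0; i < BLOCK_SIZE / LINE_SIZE; i++)
        if ((p->pattern & (1ULL << i)) && i != line)
            issue_prefetch(new_base + i * LINE_SIZE);
}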