Abstract:
In a fuel injection system for injecting fuel into a combustion chamber of an internal combustion engine by means of a dual fluid injector, the injector includes a nozzle with a nozzle needle normally closing the nozzle and has, formed around the nozzle needle, an annular space which is in communication with a fluid source, and a pressure chamber in communication with the annular space and with the working space of a plunger disposed in the injector so as to be movable between upper and lower end positions but biased toward its upper end position. The plunger working space is in communication with a fuel source via a fuel supply line which includes a control valve. An electromagnetically controlled valve is provided for controlling the admission of a hydraulic operating fluid to the plunger for moving the plunger into the working space for the ejection of the fuel and the fluid from the pressure chamber and the annular space around the nozzle. The fuel supply line includes an electronically controllable valve adapted to close the fuel supply line for a controllable period of time while the plunger is returned to its upper end position, thereby creating a vacuum in the working space for drawing into the annular space around the nozzle an amount of the fluid which depends on the time for which the fuel supply line is closed by the control valve.
Abstract:
In a fuel injection system for an internal-combustion engine, a dual-fluid nozzle is supplied with fuel by a high-pressure pump and with water by a feed pump providing a substantially lower pressure than the high-pressure pump, and has disposed in a nozzle body a nozzle needle which is spring-biased into a closing position in which one end of the needle is seated on a valve seat. A fuel supply passage extends to an annular space around the nozzle needle, and a branch passage leads to a control chamber formed at the other end of the nozzle needle and includes a solenoid valve for controlling the application of pressurized fluid to, and the release thereof from, the other end of the nozzle needle. An additional fluid line extends from the feed pump to the annular space and includes a check valve permitting flow of additional fluid only toward the annular space, and a pressure relief line is in communication with the fuel supply passage via a control valve. A control device operates the solenoid valve for timed relief of fluid pressure from the control chamber to unseat the nozzle needle, and operates the control valve either to supply fuel under pressure to the annular space or to release pressure from the annular space so as to permit feeding of water into the annular space for subsequent injection, together with the fuel, from the dual-fluid nozzle.
Abstract:
In a fuel injection system for a Diesel engine, a high-pressure fuel pump supplies fuel under pressure to a common high-pressure fuel supply conduit from which the fuel is admitted to a number of fuel injectors. Each injector has a fuel injection control needle engaged by a spring so as to be normally seated on a valve seat, and a control needle actuator for lifting the control needle off the valve seat under the control of an electronic control unit. The needle includes a cylindrical needle body movable within a cylinder and has slot-shaped orifices formed in its outer surface which are fully covered when the control needle is seated but which are exposed to a degree controllable by the needle actuator for adjustment of the orifice sizes depending on engine operating parameters.
Abstract:
A process for reducing nitrogen oxides in the exhaust gas of a combustion device, particularly of an internal-combustion engine, includes melting solid pure urea to obtain a molten product, and adding the molten product to the exhaust gas as the reducing agent for reducing nitrogen oxides. A corresponding apparatus includes a device for the controlled liquefaction of solid pure urea and for the injection of the molten product into the exhaust gas.
Abstract:
A data recovery system and method are disclosed. Primary data is stored in a database in byte-addressable NVRAM, where the database includes one or more persistent tables of data in a byte-addressable, RAM format, and a persistent memory allocator that maps persistent memory pointers of the persistent memory to virtual memory pointers of a virtual memory associated with the database. Secondary data is stored in volatile DRAM. A failure recovery includes recovering the persistent memory allocator, mapping the persistent memory to the virtual memory to recover the primary data using its persistent memory pointers, translating the persistent memory pointers to virtual memory pointers, undoing changes to the primary data made by transactions of query execution that were unfinished at the time of the failure, and reconstructing the secondary data from the primary data.
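A minimal sketch of the recovery flow described above, assuming hypothetical helper interfaces (segments(), roots(), rollback(), rebuild_secondary); it is illustrative only and not the patented implementation.

```python
# Illustrative recovery sketch; interface names are hypothetical.

class PersistentAllocator:
    """Maps persistent-memory pointers to virtual-memory pointers."""
    def __init__(self):
        self.pm_to_vm = {}

    def map_segment(self, pm_ptr, vm_ptr):
        self.pm_to_vm[pm_ptr] = vm_ptr

    def translate(self, pm_ptr):
        # Translate a persistent pointer into the process's virtual address space.
        return self.pm_to_vm[pm_ptr]


def recover(nvram_tables, allocator, undo_log, rebuild_secondary):
    """Recovery steps: map NVRAM, translate pointers, undo unfinished
    transactions, then rebuild DRAM-resident secondary data."""
    # 1. Recover the persistent memory allocator and re-map persistent segments.
    for pm_ptr, vm_ptr in nvram_tables.segments():
        allocator.map_segment(pm_ptr, vm_ptr)

    # 2. Translate persistent pointers of the primary data to virtual pointers.
    primary = {name: allocator.translate(pm_ptr)
               for name, pm_ptr in nvram_tables.roots()}

    # 3. Undo changes made by transactions that were unfinished at failure time.
    for change in reversed(undo_log):
        change.rollback(primary)

    # 4. Reconstruct secondary data (e.g., indexes) in DRAM from the primary data.
    secondary = rebuild_secondary(primary)
    return primary, secondary
```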
Abstract:
A system is described for processing schema updates in a zero-downtime environment. A technique includes establishing an application session to access a database, receiving a schema update, converting the database to an updated database according to the schema update after establishing the application session, generating a temporary compensation view from the schema update, the temporary compensation view containing compensation logic to locate database objects belonging to the database, receiving a database transaction from the application session to access a database object in the database, and processing the compensation logic to locate the database object.
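A rough sketch, under assumed names, of how a temporary compensation view could let an existing application session still resolve an object after the schema update has moved it; a DB-API style cursor is assumed, and the schema and table names are hypothetical.

```python
# Illustrative only; not the patented compensation logic.

def compensation_view_sql(old_schema, new_schema, table):
    """Create a temporary view under the object's old name that forwards to the
    updated table, so transactions opened before the update keep resolving it."""
    return (
        f"CREATE VIEW {old_schema}.{table} AS "
        f"SELECT * FROM {new_schema}.{table}"
    )


def read_through_compensation(cursor, old_schema, new_schema, table):
    # Install the compensation view, then serve the session's read through it.
    cursor.execute(compensation_view_sql(old_schema, new_schema, table))
    cursor.execute(f"SELECT * FROM {old_schema}.{table}")
    return cursor.fetchall()
```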
Abstract:
Methods, systems, and computer-readable storage media for providing at least one parameter for use with a forecast model. Implementations include actions of receiving a first context vector, the first context vector including a plurality of context attributes that describe a first context, retrieving a first parameter vector from a repository based on the first context vector, the repository electronically storing a plurality of parameter vectors, each parameter vector being associated with a respective context and including one or more parameters, parameterizing the forecast model based on parameters provided in the first parameter vector to provide a parameterized forecast model, optimizing the parameterized forecast model to provide an optimized forecast model, and forecasting one or more values using the optimized forecast model.
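A minimal sketch of the context-to-parameter lookup and the parameterized forecast, assuming a simple double exponential smoothing model and illustrative context attributes; the subsequent optimization step (e.g., refining the parameters against recent history) is omitted.

```python
# Illustrative only; repository contents, context attributes, and the model
# choice (Holt's double exponential smoothing) are assumptions.

repository = {
    # context (product group, region) -> parameter vector (alpha, beta)
    ("beverages", "EU"): (0.4, 0.1),
    ("beverages", "US"): (0.6, 0.2),
}

def retrieve_parameters(context_vector):
    # Look up the parameter vector stored for this context.
    return repository[context_vector]

def forecast(history, alpha, beta, horizon=3):
    # Parameterized double exponential smoothing.
    level, trend = history[0], history[1] - history[0]
    for y in history[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

params = retrieve_parameters(("beverages", "EU"))
print(forecast([100, 104, 109, 115, 122], *params))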
Abstract:
One embodiment of the present invention provides a method for incrementally maintaining a Bernoulli sample S with sampling rate q over a multiset R in the presence of update, delete, and insert transactions. The method includes processing items inserted into R using Bernoulli sampling and augmenting S with tracking counters during this processing. Items deleted from R are processed by using the tracking counters and by removing newly deleted items from S using a calculated probability while maintaining a degree of uniformity in S.
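A sketch of sample maintenance with tracking counters. The abstract does not state the removal probability; the X/Y rule used below is an assumption consistent with published tracking-counter schemes for Bernoulli samples over multisets.

```python
# Illustrative only; the deletion probability is an assumed choice.
import random

class BernoulliSample:
    def __init__(self, q):
        self.q = q
        # item -> [X, Y]: X = copies of the item in the sample S,
        #                 Y = insertions into R seen since the item was first sampled.
        self.counters = {}

    def insert(self, item):
        if item in self.counters:
            entry = self.counters[item]
            entry[1] += 1                      # always advance the tracking counter
            if random.random() < self.q:
                entry[0] += 1                  # accept this copy into the sample
        elif random.random() < self.q:
            self.counters[item] = [1, 1]       # first accepted copy starts tracking

    def delete(self, item):
        entry = self.counters.get(item)
        if entry is None:
            return                             # untracked items need no adjustment
        x, y = entry
        if random.random() < x / y:            # assumed removal probability
            entry[0] -= 1
        entry[1] -= 1
        if entry[0] == 0:
            del self.counters[item]            # item has left the sample
```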
Abstract:
A system, computer-implemented method, and computer-readable storage medium for generating a block-based index are provided. A block index is generated, where the block index comprises a plurality of blocks and a block corresponds to a section of a graph column that stores a value. A block range vector is also generated for the index, where the block range vector includes range information for the block that corresponds to the section of the graph column, and where the block-based index facilitates a traversal of the graph column that searches for the value by constraining the traversal to the section of the graph column.
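A minimal sketch of the idea: per-block (min, max) range information lets a value search skip blocks whose range cannot contain the value. The block size and column contents are arbitrary examples, not taken from the patent.

```python
# Illustrative only; not the patented structure.

BLOCK_SIZE = 4

def build_block_index(column):
    """Split the column into fixed-size blocks and record each block's value range."""
    blocks, ranges = [], []
    for start in range(0, len(column), BLOCK_SIZE):
        block = column[start:start + BLOCK_SIZE]
        blocks.append(block)
        ranges.append((min(block), max(block)))   # block range vector entry
    return blocks, ranges

def find_value(column, value):
    """Constrain the traversal to blocks whose range can contain the value."""
    blocks, ranges = build_block_index(column)
    hits = []
    for i, (lo, hi) in enumerate(ranges):
        if lo <= value <= hi:                     # only scan qualifying blocks
            base = i * BLOCK_SIZE
            hits.extend(base + j for j, v in enumerate(blocks[i]) if v == value)
    return hits

# Example: positions of vertex id 7 in a small graph column.
print(find_value([1, 3, 5, 7, 2, 4, 6, 8, 7, 7, 9, 11], 7))
```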
Abstract:
Technologies are disclosed for generating query execution plans optimized for parallel execution for programs having both core database relational functions and user-defined functions. A variety of optimization strategies can be employed to improve performance in parallel execution scenarios. A flexible range of permitted partition arrangements can be specified as acceptable to parallelized instances of the user-defined function. The optimizer can leverage such information when constructing an optimized query execution plan. Partitioning arrangements or other properties can be leveraged to avoid additional or unnecessary processing.
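A minimal sketch, under assumed names, of how an optimizer might use a user-defined function's declared set of acceptable partitioning arrangements to decide whether a repartition step is needed before running parallel UDF instances; the data types and plan strings are illustrative.

```python
# Illustrative only; not the disclosed optimizer.
from dataclasses import dataclass

@dataclass(frozen=True)
class Partitioning:
    kind: str           # e.g. "hash", "range", "any"
    columns: tuple      # partitioning columns, empty for "any"

def satisfies(current, accepted):
    """True if the current partitioning matches one of the UDF's accepted arrangements."""
    return any(
        a.kind == "any" or (a.kind == current.kind and a.columns == current.columns)
        for a in accepted
    )

def plan_udf(current, accepted, udf_name):
    # Avoid an unnecessary repartition (exchange) step when the incoming
    # partitioning is already acceptable to the parallel UDF instances.
    if satisfies(current, accepted):
        return [f"parallel {udf_name} on existing partitions"]
    target = accepted[0]
    return [f"repartition by {target.kind}{target.columns}",
            f"parallel {udf_name} on new partitions"]

current = Partitioning("hash", ("customer_id",))
accepted = [Partitioning("hash", ("customer_id",)), Partitioning("range", ("date",))]
print(plan_udf(current, accepted, "score_customers"))
```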