Abstract:
Embodiments are provided for a compress-and-forward relaying scheme in joint multi-cell processing. A plurality of base stations receive similar combinations of user signals from a plurality of users, compress the signals using quantization, and relay the signals over respective backhaul links to a processor in the network, which decodes the signals. The processor determines suitable quantization noise levels for the backhaul links according to a weighted sum-rate maximization function for optimizing the quantization noise levels, subject to a sum capacity constraint on the backhaul links. The determined quantization noise levels are sent to the base stations, which then quantize the received combinations of user signals according to those levels and relay the quantized signals to the processor. The quantization uses either Wyner-Ziv coding or a single-user compression algorithm that ignores the statistical correlations between the user signals received at the base stations.
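As a rough illustration of the optimization the processor carries out, one possible formulation reads as follows (a sketch only: the quantization noise variance $q_l$ at base station $l$, the received-signal-plus-noise power $\sigma_l^2$, the per-user weights $\mu_k$, and the backhaul sum capacity $C$ are assumed notation, not taken from the embodiments):

\[
\max_{q_1,\dots,q_L \ge 0} \;\sum_{k} \mu_k \, R_k(q_1,\dots,q_L)
\qquad \text{subject to} \qquad
\sum_{l=1}^{L} \log_2\!\left(1 + \frac{\sigma_l^2}{q_l}\right) \le C ,
\]

where base station $l$ forwards a quantized signal modeled as $\hat{y}_l = y_l + e_l$ with $e_l \sim \mathcal{CN}(0, q_l)$, the constraint is the total backhaul rate required under single-user compression, and $R_k$ is the rate user $k$ achieves when the processor decodes from the quantized signals. Wyner-Ziv coding would loosen the constraint by exploiting the statistical correlation among the $y_l$ that single-user compression ignores.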
Abstract:
System and method embodiments are provided to optimize multiple-input multiple-output (MIMO) beamforming for the uplink and compression for fronthaul-link transmission in cloud radio access networks (C-RANs). In an embodiment, a cloud-computing based central processor (CP) obtains channel state information for a mobile device (MD) being served by a plurality of access points (APs) in a C-RAN, and generates a channel gain matrix in accordance with the channel state information. A weighted sum-rate maximization model is then established using the channel gain matrix, in accordance with power constraints on transmission from the MD to the APs and capacity constraints on the fronthaul links connecting the APs to the CP. The CP calculates a transmit beamforming vector for the MD and a quantization noise covariance matrix for the APs either jointly, by applying a weighted minimum-mean-square-error successive convex approximation algorithm, or separately, by applying an approximation algorithm, to solve the weighted sum-rate maximization model.
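To make the model concrete, one plausible form of the weighted sum-rate problem is sketched below (the MD index $k$, beamforming vectors $\mathbf{v}_k$, power budgets $P_k$, per-AP channel matrices $\mathbf{H}_l$, quantization noise covariances $\mathbf{Q}_l$, noise power $\sigma^2$, fronthaul capacities $C_l$, and weights $\mu_k$ are assumed notation, not taken from the embodiments):

\[
\max_{\{\mathbf{v}_k\},\,\{\mathbf{Q}_l \succeq 0\}} \;\sum_{k} \mu_k R_k
\quad \text{s.t.} \quad
\|\mathbf{v}_k\|^2 \le P_k \;\; \forall k,
\qquad
\log_2 \frac{\det\!\big(\mathbf{H}_l \mathbf{K}_x \mathbf{H}_l^{H} + \sigma^2 \mathbf{I} + \mathbf{Q}_l\big)}{\det(\mathbf{Q}_l)} \le C_l \;\; \forall l,
\]

where $\mathbf{K}_x = \sum_k \mathbf{v}_k \mathbf{v}_k^{H}$, each AP forwards a quantized signal $\hat{\mathbf{y}}_l = \mathbf{H}_l \sum_k \mathbf{v}_k s_k + \mathbf{n}_l + \mathbf{e}_l$ with $\mathbf{e}_l \sim \mathcal{CN}(\mathbf{0}, \mathbf{Q}_l)$, and $R_k$ is the rate the CP achieves for MD $k$ from the collected quantized signals. In the joint approach, a weighted-MMSE reformulation typically supplies a convex surrogate for each $R_k$ in the beamformers, successive convex approximation handles the non-convex fronthaul constraints, and the CP alternates these updates until the weighted sum rate converges.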