Designing for low power can involve counterintuitive insights. Very often you have to burn power by running as fast as possible so that a device can sleep for longer, rather than backing off on the gas in the hope that lower power for longer will translate into less energy overall. TD-LTE is the latest manifestation of this phenomenon, according to Nujira.
The pioneer of envelope tracking in mobile, a technique that uses voltage modulation to improve the efficiency of wideband power amplifiers, has some skin in this game: the technology lends itself to higher peak power levels in mobile handsets. But vice president of sales and marketing Jeremy Hendy claims research from Denmark helps back up the position.
The issue is more apparent in TD-LTE, the form of 4G communications favored by China and by Clearwire in the US, than in FD-LTE because of its time-sliced nature.
“The same bandwidth is shared between transmit and receive,” says Hendy. “What’s interesting is the way that the operators are deploying that. You can choose between different ratios of download and upload speed. China Mobile and Clearwire in the US are focusing on a ratio of four to one as it represents users’ average use of data.”
Although the timeslot for the uplink is restricted, the effective bandwidth need not suffer the same reduction relative to FD-LTE implementations, as there is the opportunity to boost transmit power temporarily when conditions allow. “That lends support for envelope tracking as you can spend more time in the high-power, high-bandwidth corner. And it’s actually the more energy-efficient way to transmit data. It costs you twice the peak power to get five times the bandwidth, so it works out 40 per cent more energy efficient as you don’t have to keep the modem working for so long.”
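Hendy's arithmetic can be checked with a rough back-of-the-envelope calculation. The figures (twice the power, five times the bandwidth) come from the quote; the simple energy model, energy = transmit power × airtime, is an assumption that ignores modem and baseband overheads:

```python
# Back-of-the-envelope check of the quoted trade-off: 2x peak power
# buys 5x bandwidth, so a fixed payload takes 1/5 of the airtime
# at twice the transmit power.
baseline_power = 1.0   # normalized transmit power
baseline_time = 1.0    # normalized airtime for a fixed payload

boosted_power = 2.0 * baseline_power   # "twice the peak power"
boosted_time = baseline_time / 5.0     # "five times the bandwidth"

baseline_energy = baseline_power * baseline_time   # 1.0
boosted_energy = boosted_power * boosted_time      # 0.4

print(boosted_energy / baseline_energy)  # 0.4
```

On this simple model the boosted transmission consumes 40 per cent of the baseline energy for the same payload, which is presumably the figure behind the quoted claim.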
Hendy claims a similar approach could work for FD-LTE implementations and help operators achieve better utilization of basestation capacity, although it will mean closer cooperation between operators and equipment makers. “Today people are backing off on the power when they are allowed to, but our view is that it decreases battery life,” says Hendy.
Hendy cites research performed a couple of years ago at Aalborg University in Denmark as backing for the idea that allowing higher peak power works better in a data-driven cellular environment.
Resource allocation in LTE is performed at the basestation by a scheduler that attempts to balance the needs of users across a cell. The operation of this scheduler is generally a closely guarded secret, as small software changes can lead to big gains in average capacity utilization under real-world conditions.
Even the operators generally don’t have a clear picture of why a basestation makes the decisions it does. However, by taking account of the handsets that an operator favors, which may have higher peak power budgets than average through the use of schemes such as envelope tracking, a basestation could not only improve energy efficiency for some of the users but allow more data to be transmitted overall. Handsets in good parts of the cell would be encouraged to increase their transmit levels so that they use fewer resource blocks overall, allowing more devices to access the remaining capacity.
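The resource-block argument can be illustrated with a toy calculation. All the numbers below are hypothetical: the point is only that a handset which can raise its transmit power reaches a higher-rate modulation and coding scheme, needs fewer resource blocks for the same payload, and so leaves more of the cell's fixed pool free for other devices:

```python
import math

# Toy model: a cell grants from a fixed pool of resource blocks per
# scheduling interval (hypothetical figures, not from any LTE profile).
TOTAL_BLOCKS = 50
PAYLOAD_BITS = 12000   # payload one handset wants to send this interval

bits_per_block_low_power = 600    # conservative scheme at backed-off power
bits_per_block_boosted = 1500     # higher-order scheme at raised power

blocks_low = math.ceil(PAYLOAD_BITS / bits_per_block_low_power)    # 20
blocks_boosted = math.ceil(PAYLOAD_BITS / bits_per_block_boosted)  # 8

print(TOTAL_BLOCKS - blocks_low)      # 30 blocks left for other users
print(TOTAL_BLOCKS - blocks_boosted)  # 42 blocks left for other users
```

Under these assumed rates, boosting power frees 12 extra blocks per interval for the rest of the cell, which is the capacity gain the article attributes to a power-aware scheduler.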
Hendy claims the current tendency to back off on power is largely the result of power amplifiers, run in average-power mode, that cannot sustain the peak transmit levels for which the LTE standard allows.