
Battery life analysis and maximization for wireless IoT sensor nodes and wearables

In addition to proper hardware design and power consumption testing, an accurate estimation of battery life is an important aspect of designing wireless Internet of Things (IoT) sensor nodes and wearables. For some medical or industrial devices, in particular, users’ lives can be at stake if the battery does not live up to expectations. Some of these devices do not have a low-battery indicator, so users depend heavily on the warranted battery life specification, making it crucial that the battery life claim is accurate.

The following discussion presents some tools IoT device and wearables developers can use to efficiently simulate, accurately predict, and maximize the device’s battery life. Typical challenges include:

  • How to measure the battery life to substantiate the battery life expectancy claim to customers?
  • What critical events contribute to the power consumption, and how frequently do these events happen? How to set a trigger for events of interest?
  • What design changes or tradeoffs to make to optimize battery life?

Battery simulation

How low can the battery voltage drop before an IoT device turns off? Gauging battery performance at different stages of battery discharge is difficult, as it requires instrumentation that can accurately simulate battery performance.

Keithley’s 2281S-20-6 Battery Simulator makes it easy to model any type of battery. (Image: Keithley)

To address this, Keithley’s 2281S-20-6 Battery Simulator makes it easy to model any type of battery. It allows designers to efficiently test prototype IoT devices in any battery state, with high repeatability to estimate battery life effectively. Combining a 2281S-20-6 Battery Simulator with a DMM7510 Graphical Sampling Multimeter will give designers a complete solution for assessing power consumption and battery life of IoT prototypes.
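
As a rough illustration of what a battery simulator emulates, the following Python sketch models a generic cell as an open-circuit voltage that depends on the state of charge plus a series internal resistance, then checks at which state of charge a transmit burst would pull the supply rail below the device’s brownout threshold. Every number in the sketch (voltage table, internal resistance, load current, brownout level) is an illustrative assumption, not data for any particular cell or instrument.

```python
# Minimal battery model of the kind a battery simulator emulates:
# terminal voltage = open-circuit voltage (a function of state of charge)
# minus the drop across the internal resistance under load.
# All numbers below are illustrative assumptions, not data for any real cell.

# Open-circuit voltage lookup for a generic Li-ion cell: (state_of_charge, volts)
OCV_TABLE = [(0.0, 3.0), (0.1, 3.55), (0.3, 3.65), (0.5, 3.75),
             (0.7, 3.85), (0.9, 4.05), (1.0, 4.20)]
R_INTERNAL_OHM = 0.15       # assumed internal resistance
BROWNOUT_V = 3.3            # assumed minimum supply voltage of the IoT device

def ocv(soc: float) -> float:
    """Linearly interpolate open-circuit voltage from the SoC table."""
    for (s0, v0), (s1, v1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if s0 <= soc <= s1:
            return v0 + (v1 - v0) * (soc - s0) / (s1 - s0)
    return OCV_TABLE[-1][1]

def terminal_voltage(soc: float, load_a: float) -> float:
    """Loaded terminal voltage at a given state of charge."""
    return ocv(soc) - load_a * R_INTERNAL_OHM

# Sweep the state of charge to find where a 100 mA transmit burst
# would pull the rail below the device's brownout threshold.
for soc in (0.5, 0.3, 0.1, 0.05):
    v = terminal_voltage(soc, load_a=0.100)
    flag = "(below brownout)" if v < BROWNOUT_V else ""
    print(f"SoC {soc:4.0%}: {v:.2f} V {flag}")
```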

Accel Instruments offers the TS250 and TS200 modulated power supplies, which can sink and source current the same way a real battery does. They feature a DC OFFSET knob that can adjust the voltage to emulate battery voltage changes. These units are especially useful for simulating a battery when testing charger circuits.

To estimate the battery life of a new IoT device, designers need to analyze the power consumption of the various subsystems such as the RF radio, display, beeper, vibrator, etc. The X8712A from Keysight Technologies is designed to enable designers to perform the needed measurements. It consists of the X8712A-DPA DC Power Analyzer, source measure units, electronic load modules, the X8712AD RF Event Detector, and the KS833A1B Event-Based Power Analysis software.

The X8712A from Keysight Technologies is designed to enable designers to perform the needed measurements to estimate the battery life of an IoT device. (Image: Keysight Technologies)

The X8712A helps designers determine the total power consumption of a device using the DC power analyzer’s Source Measure Unit (SMU) and electronic load modules, the RF event detector, and the KS833A1B Event-Based Power Analysis software. It captures RF and/or DC events from an IoT device, synchronously matches the events to the current consumption, and estimates the device’s battery life.
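
The bookkeeping behind event-based power analysis can be sketched in a few lines of Python: given a sampled current trace and a list of timestamped events, integrate the charge inside each event window and attribute the remainder to sleep. The sample rate, currents, and events below are synthetic stand-ins for what the instrument would actually capture.

```python
# Sketch of event-based power analysis: match timestamped events to segments of
# a current-vs-time capture and tally the charge each portion consumes.
import numpy as np

FS = 10_000                              # assumed sample rate, Sa/s
t = np.arange(0, 10.0, 1 / FS)           # 10 s capture window
current = np.full_like(t, 2e-6)          # 2 uA sleep floor (assumed)

# Synthesize two 20 ms, 15 mA transmit bursts at t = 2 s and t = 7 s
events = [(2.0, 0.020, "TX"), (7.0, 0.020, "TX")]
for start, dur, _ in events:
    current[(t >= start) & (t < start + dur)] = 15e-3

dt = 1 / FS
total_c = current.sum() * dt             # total charge in coulombs
event_c = sum(current[(t >= s) & (t < s + d)].sum() * dt for s, d, _ in events)

print(f"total charge: {total_c * 1e3:.3f} mC")
print(f"event charge: {event_c * 1e3:.3f} mC ({100 * event_c / total_c:.0f}%)")
print(f"sleep charge: {(total_c - event_c) * 1e3:.3f} mC")
```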

The selection of the RF communications protocol is important for maximizing the operating life and functionality of wireless IoT nodes and wearables. It is a complex process, and a comprehensive discussion is beyond the scope of this FAQ. Options include 2G/3G, Bluetooth LE, 802.15.4, LoRa, LTE, NB-IoT, Sigfox, Wi-Fi, WirelessHART, ZigBee, and Z-Wave, among others. The following two sections provide an overview of the complexities involved in selecting an RF protocol by presenting some of the considerations for using Bluetooth LE and LTE-M/NB-IoT.

Bluetooth LE power consumption

Power consumption in Bluetooth LE (BLE) SoCs is typically dominated by the processor and the BLE radio. The processor can draw several mA when running, and some BLE radios can reach a peak of 20mA when transmitting (or more if the output power is increased). Which part consumes more depends mainly on the application and duty cycle.

Bluetooth LE was designed to be in sleep mode most of the time to conserve power and maximize battery life in wireless IoT sensor nodes. (Image: Argenox)

Most BLE products are designed to stay asleep as much as possible to conserve power, waking up to process and send data. How much current they draw during sleep depends on the device, but a figure of around 1.5uA is typical of many SoCs. This figure accounts for running a Real-Time Clock (RTC), which wakes the system periodically to advertise or send data. When in sleep, the CPU is off, the BLE Radio isn’t transmitting, and most peripherals are stopped. If the device wakes up periodically, then the current peaks for a few milliseconds during transmission and reception. The system’s average current is then higher and depends on how often the system wakes up to send data.
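
A first-order battery-life estimate for such a duty-cycled BLE node follows directly from these figures. The sketch below uses illustrative assumptions for the sleep current, active current, wake duration, and coin-cell capacity; it is not a measurement of any specific SoC.

```python
# Rough average-current and battery-life estimate for a duty-cycled BLE node.
# All figures are illustrative assumptions, not measurements of any specific SoC.

SLEEP_A     = 1.5e-6     # sleep current with RTC running
ACTIVE_A    = 6e-3       # average current during a wake/advertise event
ACTIVE_S    = 0.005      # 5 ms awake per event
INTERVAL_S  = 1.0        # one event per second
BATTERY_MAH = 230.0      # CR2032-class coin cell (usable capacity assumed)

duty = ACTIVE_S / INTERVAL_S
avg_a = ACTIVE_A * duty + SLEEP_A * (1 - duty)
life_h = BATTERY_MAH / (avg_a * 1e3)          # mAh / mA = hours

print(f"average current: {avg_a * 1e6:.1f} uA")
print(f"battery life   : {life_h / 24:.0f} days (~{life_h / 8760:.1f} years)")
```

Stretching the interval between events or shortening the active window changes the average current, and hence the battery life, almost proportionally, which is why the duty cycle is usually the first optimization lever.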

Some designs employ a deeper sleep mode where no RTC runs, and the system is in a low power mode, but this prevents a smartphone or other node from connecting to a device. In this case, it’s possible to have a current draw of less than 200nA. Peak current consumption of BLE chipsets is affected by several factors:

  • The power output of the BLE device – the higher the output power, the more current is required. Many latest-generation devices draw 3mA to 6mA peak, while older devices can draw 3x or more.
  • Amount of data sent – larger advertising packets require the BLE radio to stay on the air longer, drawing more current during this time (see the sketch after this list).
  • Processing – the processor requires significant power while running the BLE stack as well as any other processing required.
  • Peripherals – sensors, displays, and so on.
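
To put the second point in perspective, the sketch below estimates how the advertising payload size stretches the radio’s on-air time and, with it, the charge drawn per advertising event. It assumes legacy advertising on the LE 1M PHY (1 µs per bit), roughly 16 bytes of frame overhead, an assumed 6mA radio current, and it ignores the receive windows that follow each advertisement; all of these are simplifying assumptions.

```python
# Back-of-the-envelope view of how advertising payload size stretches on-air
# time and the charge drawn per advertising event. Assumes legacy advertising
# on the LE 1M PHY (1 us per bit) and ~16 bytes of overhead (preamble, access
# address, PDU header, advertiser address, CRC); RX windows are ignored.

US_PER_BYTE   = 8          # 1 Mbit/s -> 8 us per byte
OVERHEAD_B    = 16         # assumed frame overhead in bytes
TX_CURRENT_A  = 6e-3       # assumed radio current while transmitting
CHANNELS      = 3          # advertising repeats on channels 37, 38, 39

def charge_per_adv_event(payload_bytes: int) -> float:
    on_air_s = CHANNELS * (OVERHEAD_B + payload_bytes) * US_PER_BYTE * 1e-6
    return TX_CURRENT_A * on_air_s          # coulombs per advertising event

for payload in (3, 15, 31):
    q = charge_per_adv_event(payload)
    print(f"{payload:2d}-byte payload: {q * 1e6:.2f} uC per advertising event")
```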

10-year battery life for LTE-M/NB-IoT

Power saving mode (PSM) is a power-saving feature designed for LTE-M/NB-IoT devices to help them conserve battery power. This feature is essentially a “sleep” mode and was first introduced in 3GPP Release 12. To update the network about its availability, the user equipment (UE) performs periodic tracking area updates (TAU) after a configurable TAU timer has expired. The UE then remains reachable for paging during the paging time window (PTW) of the idle state. Once the PTW expires, it enters deep sleep mode (PSM mode) and becomes dormant and unreachable until the next periodic TAU occurs.
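
The periodic TAU interval is negotiated as an 8-bit GPRS Timer 3 value (T3412 extended): three unit bits plus a five-bit multiplier. The decoder below follows the unit table as the author reads it from 3GPP TS 24.008 and should be checked against the specification before being relied upon.

```python
# Decode a T3412 extended (GPRS Timer 3) octet into a TAU period in seconds.
# Unit table per the author's reading of 3GPP TS 24.008 -- verify before use.

T3412_EXT_UNITS_S = {
    0b000: 10 * 60,      # 10 minutes
    0b001: 60 * 60,      # 1 hour
    0b010: 10 * 3600,    # 10 hours
    0b011: 2,            # 2 seconds
    0b100: 30,           # 30 seconds
    0b101: 60,           # 1 minute
    0b110: 320 * 3600,   # 320 hours
}                        # 0b111 means the timer is deactivated

def decode_t3412_ext(octet: int) -> float | None:
    """Return the TAU period in seconds, or None if the timer is deactivated."""
    unit, value = (octet >> 5) & 0x7, octet & 0x1F
    if unit == 0b111:
        return None
    return T3412_EXT_UNITS_S[unit] * value

# Example: unit = 1 hour (0b001), value = 24 -> TAU once per day
octet = (0b001 << 5) | 24
print(f"TAU period: {decode_t3412_ext(octet) / 3600:.0f} hours")
```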

NB-IoT PSM and its message flow: The UE can exit PSM either after the T3412 timer expires, i.e., renewed TAU, or the UE initiates a mobile originated (MO) service or detach. (Image: Rohde & Schwarz)

During PSM mode, the UE turns off its circuitry yet remains registered in the network: the UE closes the access stratum (AS) connection but keeps the non-access stratum (NAS) status. The advantage of this approach is that the UE can wake up from PSM immediately, without re-attaching or re-establishing the packet data network (PDN) connections. This avoids the extra power consumption of transmitting additional signaling messages for the higher-layer connection establishment procedure. PSM maximizes the downtime of the UE, which significantly reduces battery consumption.

The use of PSM is particularly interesting for use cases that involve infrequent mobile-terminated or mobile-originated events and can tolerate a certain latency, such as a water meter that sends its count once a month. With the PSM mechanism, the 10-year battery lifetime recommended for LTE-M and NB-IoT devices becomes possible.
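
A quick feasibility check shows how PSM makes such a figure plausible. The sketch below assumes a metering-class device that wakes once per day for about ten seconds of attach/TAU/report activity at 100mA, sleeps at 3uA in PSM otherwise, and runs from a roughly 1400mAh primary cell; self-discharge is ignored and every value is an assumption for illustration only.

```python
# Rough feasibility check of the 10-year target for a PSM-based metering device.
# All currents, durations, and the cell capacity are illustrative assumptions;
# battery self-discharge is ignored.

PSM_SLEEP_A     = 3e-6      # deep-sleep (PSM) current
ACTIVE_A        = 100e-3    # average current while attached and transmitting
ACTIVE_S        = 10.0      # seconds of activity per wake-up (sync, TAU, report)
WAKEUPS_PER_DAY = 1
BATTERY_MAH     = 1400.0    # assumed primary lithium cell

active_s_day = WAKEUPS_PER_DAY * ACTIVE_S
charge_c_day = ACTIVE_A * active_s_day + PSM_SLEEP_A * (86400 - active_s_day)
life_years = BATTERY_MAH / (charge_c_day / 3.6) / 365     # 1 mAh = 3.6 C

print(f"daily charge : {charge_c_day / 3.6:.2f} mAh")
print(f"battery life : {life_years:.1f} years")
```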

Duty cycle – how often does the device communicate?

Once a wireless technology is chosen, the next factor facing the designer is determining the required transmission strength, duration, and duty cycle between active and sleep states. Most modern wireless transceivers offer sleep modes to save power when not in use. The longer a device stays in a sleep state, the less power it uses, extending battery life. However, the sleep-state power numbers in a wireless transceiver’s electrical specifications should not be the sole input when examining a device’s wireless power budget. The transceiver’s wake-up time, any pre-processing performed before transmission, and the return to sleep also consume power and should be included when calculating the total wireless power budget.
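
One way to capture those overheads is to tally the charge per transaction phase by phase, as in the sketch below; the phase currents and durations are assumed, illustrative values rather than figures for any particular transceiver.

```python
# Per-transaction charge budget: each transmission also pays for wake-up,
# pre-processing, and the return to sleep, not just the radio on-time.
# All phase currents and durations are assumed, illustrative values.

PHASES = [
    # (phase,                    current A, duration s)
    ("wake-up/oscillator start",      2e-3, 0.003),
    ("pre-processing (MCU)",          4e-3, 0.010),
    ("transmit",                     15e-3, 0.002),
    ("receive ack",                  10e-3, 0.002),
    ("return to sleep",               1e-3, 0.001),
]

per_tx_c = sum(i * t for _, i, t in PHASES)
radio_c = sum(i * t for name, i, t in PHASES if name in ("transmit", "receive ack"))

print(f"charge per transaction : {per_tx_c * 1e6:.1f} uC")
print(f"radio portion only     : {radio_c * 1e6:.1f} uC "
      f"({100 * radio_c / per_tx_c:.0f}% of the transaction)")
```

With these assumed numbers, roughly half the transaction’s charge goes to overhead rather than the radio itself, which is why the wake-up and processing phases belong in the power budget.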

During a polling event, power consumption is often inflexible, and changing the duty cycle can be the most effective way to conserve energy. (Image: Silicon Laboratories)

The frequency, or duty cycle, of wireless transmissions directly affects the product’s battery life. The duty cycle is determined in part by the wireless standard’s requirements, the software algorithm, and how the device is normally used. For example, a door sensor’s open/close event will trigger a wireless data transmission; however, this sensor may also send and receive periodic wireless polling events to and from other mesh network nodes for status updates.
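
For a device like that door sensor, the open/close events are rare, so the polling interval usually dominates the duty cycle. The short sweep below, using assumed sleep, transmit, and capacity figures and ignoring self-discharge, shows how strongly the polling interval drives battery life.

```python
# Door-sensor style duty-cycle sweep: rare open/close events plus periodic mesh
# polling. All currents, durations, and the cell capacity are assumed values;
# self-discharge is ignored and the sleep floor is applied over the full day.

SLEEP_A, TX_A, TX_S = 2e-6, 10e-3, 0.015      # assumed sleep/TX figures
DOOR_EVENTS_PER_DAY = 20
BATTERY_MAH = 230.0                            # coin-cell class capacity

def life_days(poll_interval_s: float) -> float:
    polls_per_day = 86400 / poll_interval_s
    tx_per_day = polls_per_day + DOOR_EVENTS_PER_DAY
    charge_c = tx_per_day * TX_A * TX_S + SLEEP_A * 86400
    return BATTERY_MAH / (charge_c / 3.6)      # 1 mAh = 3.6 C

for interval in (5, 30, 120, 600):
    print(f"poll every {interval:4d} s -> ~{life_days(interval) / 365:.1f} years")
```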

Online energy consumption optimization

Wisebatt was created to allow engineers to assess and optimize the energy consumption of IoT devices. The company has built an online simulation platform that facilitates the development of IoT and consumer devices and can deliver simulation results in a few minutes. With it, designers can build a virtual prototype of a device, try several combinations, and select the best components for their products.

With data visualization, designers can see how their device’s components and operating states perform and which ones consume the most battery power. The Wisebatt software uses a combination of simulation and data collected from silicon vendors and authorized distributors to enable a feasibility study of energy consumption, bill of materials, component compatibility, configuration, and supply constraints. In addition to the core “Power Analysis” feature, the Wisebatt simulation tool offers complementary information that is essential during a new device’s development phase.

This FAQ series has provided a comprehensive overview of power systems for wireless IoT nodes and wearables. It started with a review of the components available when designing these power systems. Part two focused on measuring and validating the performance of those power systems. And part three looked at the battery chemistry and charging options available to designers. This final installment reviewed battery life analysis and maximization considerations for wireless IoT sensor nodes and wearables.

References:

Battery Size Matters, Silicon Laboratories
Power saving methods for LTE-M and NB-IoT devices, Rohde & Schwarz (white paper)
Powering wireless and Bluetooth LE products with batteries, Argenox
X8712A IoT Device Battery Life Optimization Solution, Keysight Technologies