You've found the perfect remote site to monitor — a pump station 40 km from the nearest town, a water tank on top of a hill, a flow meter at a pipeline junction. No AC power. No solar budget. You need IoT data from it.

The first instinct is: slap a power bank on it and see how long it lasts.

It won't last long. A typical IoT gateway running continuously draws 150–300mA. A 10,000mAh power bank gives you 33–66 hours. Three days, if you're lucky.

This guide is about doing it right — designing a battery-powered IoT node that lasts years, not days.


Why Most Battery IoT Deployments Fail Early

The failure isn't in the battery. It's in the duty cycle.

Most IoT devices are designed for mains power. When you run them on battery, they're awake 100% of the time — maintaining a WiFi connection, polling sensors every few seconds, keeping the MQTT connection alive. That's death for a battery.

The fix is deep sleep. But deep sleep has to be done right.


The Deep Sleep Architecture

The only way to run an IoT device for years on a battery is to keep it asleep for most of its life.

Here's what the duty cycle looks like on a well-designed gateway like the BusLog4G Bat IO:

State                       Current       Time per cycle
Deep sleep                  23.1µA        29+ minutes
WiFi connect + upload       ~150mA avg    ~22 seconds
4G LTE upload (fallback)    ~200mA avg    ~52 seconds

On a 30-minute upload interval over WiFi, the device is awake for 22 seconds out of 1800 seconds — about 1.2% of the time.

The other 98.8% of the time, it draws 23.1µA.

That's the key insight: sleep current dominates everything. If your sleep current is 100µA instead of 23µA, you've already burned 4× more power before a single upload happens.
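That claim is easy to check with a few lines of arithmetic. A quick sketch (values taken from the duty-cycle table above; Python used only as a calculator):

```python
# Average current of a sleep/wake duty cycle, using the figures above.
SLEEP_UA = 23.1      # deep-sleep current, µA
AWAKE_MA = 150.0     # average current during a WiFi upload, mA
AWAKE_S = 22.0       # awake time per cycle, s
CYCLE_S = 1800.0     # 30-minute upload interval, s

awake_frac = AWAKE_S / CYCLE_S
avg_ma = (SLEEP_UA / 1000.0) * (1.0 - awake_frac) + AWAKE_MA * awake_frac
print(f"awake {awake_frac:.1%} of the cycle, average draw {avg_ma:.2f} mA")
```

Swap `SLEEP_UA` for 100 and the sleep term of that average jumps from ~0.023 mA to ~0.099 mA — a 4× higher floor under every other optimization you make.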


Power Budget: The Real Numbers

Let's work through the math on a real deployment.

Device: BusLog4G Bat IO
Battery: LTC 3.6V D-cell — 19,000 mAh (lithium thionyl chloride, rated for -40°C to +85°C)
Usable capacity (85% derating): ~14,800 mAh

Scenario 1: Weekly 4G uploads, 100 pulses/day (flow meter, quiet site)

Component                   Calculation              Annual mAh
Deep sleep (year-round)     23.1µA × 8760h           202 mAh
4G uploads (52×/year)       200mA × 52s × 52         150 mAh
DI pulse wakes (100/day)    40mA × 150ms × 36,500    60 mAh
Total                                                ~412 mAh/year

Battery life: 14,800 / 412 ≈ 35 years. Practically, you'd estimate 12+ years once you account for battery self-discharge (~1% per year for LTC chemistry), the cell's rated shelf life, and temperature effects.

Scenario 2: 4G uploads twice/day, 1000 pulses/day (busier site)

Component                    Calculation               Annual mAh
Deep sleep                   23.1µA × 8760h            202 mAh
4G uploads (730×/year)       200mA × 52s × 730         2,100 mAh
DI pulse wakes (1000/day)    40mA × 150ms × 365,000    608 mAh
Total                                                  ~2,910 mAh/year

Battery life: 14,800 / 2,910 ≈ 5 years. Real-world estimate: 3.5–4 years with derating.
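Both scenarios fall out of the same three-term formula, so the budget is easy to script for your own site. A sketch (constants from the tables above; the function name is ours, not device firmware):

```python
HOURS_PER_YEAR = 8760

def annual_mah(sleep_ua, uploads_per_year, upload_ma, upload_s,
               pulses_per_day, pulse_ma=40.0, pulse_s=0.150):
    """Annual draw in mAh: sleep floor + radio uploads + DI pulse wakes."""
    sleep = sleep_ua / 1000.0 * HOURS_PER_YEAR
    uploads = upload_ma * upload_s * uploads_per_year / 3600.0
    pulses = pulse_ma * pulse_s * pulses_per_day * 365 / 3600.0
    return sleep + uploads + pulses

usable = 14_800  # mAh, after 85% derating
s1 = annual_mah(23.1, 52, 200, 52, 100)      # weekly 4G, 100 pulses/day
s2 = annual_mah(23.1, 730, 200, 52, 1000)    # twice-daily 4G, 1000 pulses/day
print(f"Scenario 1: {s1:.0f} mAh/yr -> {usable / s1:.0f} years raw")
print(f"Scenario 2: {s2:.0f} mAh/yr -> {usable / s2:.1f} years raw")
```

Plug in your own upload schedule and pulse rate before picking a battery; the "raw" figure still needs the self-discharge and temperature haircuts discussed in this article.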

The pattern is clear: upload frequency is the biggest lever. More uploads = more 4G radio time = faster battery drain.


The Three Design Decisions That Determine Battery Life

1. Upload Interval

This is your biggest knob. Every 4G upload costs ~2.9 mAh (200mA × 52s). WiFi is cheaper at ~0.9 mAh (150mA × 22s).

Upload Frequency    Annual 4G Cost    Annual WiFi Cost
Every 5 minutes     ~304,000 mAh      ~96,400 mAh
Every hour          ~25,300 mAh       ~8,000 mAh
Every day           ~1,050 mAh        ~335 mAh
Every week          ~150 mAh          ~48 mAh

At 5-minute 4G intervals, the annual cost exceeds the usable 14,800 mAh battery twenty times over — that schedule belongs on mains power.

Use the longest interval your application can tolerate. Flow totalizer at a remote pump? Daily upload is fine. You're counting pulses in sleep anyway — the count is safe in RTC memory and synced to flash on each timer wake.
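The annual cost of any schedule is just uploads-per-year times the per-upload figure. A quick loop over the intervals (upload counts assume a 365-day year):

```python
# Per-upload cost in mAh = current (mA) × duration (s) / 3600
COST_MAH = {"4G": 200 * 52 / 3600, "WiFi": 150 * 22 / 3600}
UPLOADS_PER_YEAR = {"every 5 min": 105_120, "hourly": 8_760,
                    "daily": 365, "weekly": 52}

for interval, n in UPLOADS_PER_YEAR.items():
    costs = "  ".join(f"{net}: {n * c:>9,.0f} mAh" for net, c in COST_MAH.items())
    print(f"{interval:>12}  {costs}")
```

The per-upload constants come straight from this section; only the upload count changes with the interval, which is why the interval is the biggest knob.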

2. Connectivity: WiFi vs 4G

WiFi costs about a third as much per upload as 4G LTE. If there's any WiFi coverage at your site, use it as the primary — the gateway falls back to 4G automatically when WiFi isn't available.

For truly remote sites: 4G only, optimize interval.

3. What Are You Actually Counting?

Not all remote site data needs real-time uploads. Ask:

  • Is this a totalizer (flow meter pulses, kWh)? → Long intervals are fine. Data accumulates in sleep.
  • Is this a threshold alarm (tank overflow, pump fault)? → You need near-real-time. Plan power accordingly.
  • Is this a status (valve open/closed)? → On-change upload via DI wake, daily heartbeat upload. Very low power.

Pulse Counting During Deep Sleep

This is the piece most battery IoT solutions get wrong.

A common approach: wake up every 5 minutes, read the pulse counter, upload. But that means 288 wake cycles per day — all that radio activity burns through your battery in months.

The right approach: count pulses in sleep, upload the total on a slow schedule.

On the BusLog4G Bat IO, each DI pulse triggers a 150ms micro-wake — no WiFi, no 4G, no Modbus. Just:

  1. Wake (~10ms boot)
  2. Increment the RTC memory counter
  3. Wait for the pin to go HIGH (max 2s)
  4. Back to sleep

Power cost: ~0.002 mAh per pulse. Even at 1000 pulses/day, that's 2 mAh/day from pulse counting — basically nothing.

The count survives sleep (RTC memory, +0.75µA) and is synced to flash on every timer wake, so a power loss won't reset your totalizer.

One important note on ESP32 GPIO: GPIO35 is an ADC1 input-only pin on the ESP32. During WiFi radio activity, ADC1 pins can generate phantom interrupt edges due to internal multiplexer glitches. The firmware validates each ISR by checking the actual pin state after debounce — fake pulses are filtered out. Set debounce to 100ms minimum for industrial sensors; 200–500ms for noisy relay contacts.
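The validate-after-debounce idea is simple enough to simulate host-side. A sketch — this is not the device firmware; the event timestamps and levels are made up, and level 0 means the pin really reads LOW when re-checked after the debounce window:

```python
DEBOUNCE_MS = 100  # minimum, per the recommendation above

def count_pulses(edges, debounce_ms=DEBOUNCE_MS):
    """Count falling-edge pulses, rejecting edges that arrive inside the
    debounce window or whose pin is not actually LOW when re-read
    (phantom edges read back HIGH)."""
    count, last_ms = 0, -debounce_ms
    for t_ms, level_after_debounce in edges:
        if t_ms - last_ms < debounce_ms:
            continue                      # still inside debounce window
        if level_after_debounce != 0:
            continue                      # pin reads HIGH -> phantom edge
        count += 1
        last_ms = t_ms
    return count

# Two real pulses, one contact bounce at 40ms, one phantom edge at 250ms
events = [(0, 0), (40, 0), (250, 1), (600, 0)]
print(count_pulses(events))  # -> 2
```

The same two checks — time since last accepted edge, then an actual pin-state read — are what let the totalizer survive both relay chatter and radio-induced glitches.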


Battery Chemistry Matters

Not all batteries are equal for this use case.

Chemistry                         Self-Discharge    Temp Range       Typical Capacity       For IoT?
Alkaline AA                       2–3% / year       0°C to 50°C      2,800 mAh              ❌ Poor
NiMH                              20–30% / year     -20°C to 50°C    2,500 mAh              ❌ Poor
Li-Ion (18650)                    1–2% / year       -20°C to 60°C    3,500 mAh              🟡 OK
LTC (Lithium Thionyl Chloride)    <1% / year        -55°C to 85°C    19,000 mAh (D-cell)    ✅ Best

For outdoor, remote industrial deployments: LTC D-cell is the only serious choice. The combination of high capacity, near-zero self-discharge, and extreme temperature range is why it's standard in utility metering and military equipment.

One quirk: LTC batteries have a passivation effect — a thin lithium chloride layer forms on the anode after storage, causing a brief voltage dip on first discharge. Quality devices handle this without issue.


Field Checklist: Before You Deploy

  • [ ] Calculate your power budget — sleep current × hours + (upload current × duration × frequency) + (pulse wake current × pulses/day)
  • [ ] Set the longest acceptable upload interval — match to your data freshness requirement
  • [ ] Use WiFi if available — 3× cheaper than 4G per upload
  • [ ] Disable Modbus polling if you only need DI/pulse counting (saves ~8 seconds per upload cycle)
  • [ ] Set debounce ≥100ms for industrial sensors on DI input
  • [ ] Verify NVS sync is enabled — pulse counts must survive power loss
  • [ ] Check battery voltage in every upload — monitor degradation over time via the BMS
  • [ ] Account for temperature — LTC capacity drops ~10–15% at -20°C vs 25°C; factor this in for cold climates
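The last two checklist items combine into one quick sanity check. A sketch with illustrative numbers — the derating, self-discharge, and cold-loss factors are the ones quoted in this article, with the 15% cold loss as the pessimistic end:

```python
NOMINAL_MAH = 19_000     # LTC D-cell
DERATE = 0.85            # usable-capacity derating from the budget section
COLD_LOSS = 0.15         # capacity loss at -20°C vs 25°C (pessimistic end)
SELF_DISCHARGE = 0.01    # ~1% per year for LTC chemistry

def usable_mah(years, cold=True):
    """Estimated usable capacity after `years` in storage/service."""
    cap = NOMINAL_MAH * DERATE * (1.0 - SELF_DISCHARGE) ** years
    return cap * (1.0 - COLD_LOSS) if cold else cap

annual_draw = 2_910      # Scenario 2 budget, mAh/yr
print(f"year 3, cold climate: {usable_mah(3):,.0f} mAh usable "
      f"vs {annual_draw * 3:,} mAh drawn")
```

If the cumulative draw crosses the shrinking usable capacity before your target service life, lengthen the upload interval or plan a battery swap into the maintenance schedule.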

Summary

Battery-powered IoT works — but only if you design for it. The principles are simple:

  1. Sleep is everything. A device drawing 23µA in sleep versus 100µA has 4× longer life before uploads even start.
  2. Minimize radio time. Every second of WiFi or 4G is the most expensive thing your battery pays for.
  3. Count pulses in sleep. Don't wake the radio to log a flow pulse. Count in RTC memory, upload the total.
  4. Match interval to application. A weekly upload on a remote totalizer is not a compromise — it's good engineering.
  5. Use the right battery chemistry. LTC D-cell for industrial outdoor deployments. Nothing else comes close.

Apply these, and 5–10 year battery life is not marketing. It's arithmetic.


Questions about battery-powered IoT deployments? Contact us — we'll help you spec the right device for your site.