[TECH] LTE-M, Cat-M1, maybe MQTT - How is data metered? TCP/IP overhead included?
Darron Black
2018-07-13 13:55:27 UTC
I asked this on the Internet of Things Stack Exchange site... but I
realized this list has a good shot at knowing the answer as well:

We're looking at adding an option to our product for Cat-M1 data
connectivity. The data plans look interesting, but it's very unclear how
the data is actually counted. For example... for these AT&T, Verizon,
and T-Mobile plans with 1MB of data, does that include the TCP/IP packet
overhead?

We have reasonably low data requirements (8 bytes/minute per sensor,
1-15 sensors per device... plus a 26 byte system data packet every 15
minutes). However, if TCP/IP overhead is included, then our 8
bytes/minute becomes a minimum of 88 bytes... possibly a LOT more if the
modem is just barely communicating and gets a large number of retries.
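
For concreteness, here's the arithmetic behind that 88-byte figure.
It's back-of-the-envelope only: minimum 20-byte IPv4 and 20-byte TCP
headers, one bare ACK per report, and no retries or link-layer
framing counted.

    # Minimum on-the-wire cost of one 8-byte report over TCP/IPv4.
    PAYLOAD = 8    # sensor reading, bytes
    IP_HDR = 20    # minimum IPv4 header
    TCP_HDR = 20   # minimum TCP header (no options)

    report = PAYLOAD + TCP_HDR + IP_HDR   # 48 bytes uplink
    ack = TCP_HDR + IP_HDR                # 40 bytes back from the server
    print(report + ack)                   # 88 bytes per report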

Does anyone know how they meter this stuff?

We could drop down to just a data packet every 30 minutes or so, but
we'd very much prefer per-minute data. If it includes TCP/IP overhead,
then with one update a minute just the packet overhead itself is going
to use a minimum of 3.5MB/month.
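
That works out as follows for a 30-day month (same minimum-header
assumptions as above, counting only the header overhead, not the
payload):

    # Header overhead alone, one report per minute, 30-day month.
    OVERHEAD_PER_REPORT = 80           # 40 bytes of headers up + 40-byte ACK down
    MINUTES_PER_MONTH = 60 * 24 * 30   # 43200
    print(OVERHEAD_PER_REPORT * MINUTES_PER_MONTH / 1e6)   # ~3.46 MB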

If TCP/IP overhead is included... is there some system or protocol
(M2X, MQTT, etc.) that would be an alternative to straight TCP/IP,
where the network providers would count the data differently somehow?
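
For what it's worth, MQTT itself rides on TCP, so as far as I can
tell it wouldn't dodge those headers; MQTT-SN or CoAP over UDP would
at least shrink the transport header to 8 bytes and avoid the
transport-level ACK traffic. The other option we've considered is
batching, which amortizes the headers even if we keep per-minute
sampling. A rough sketch, with a made-up packing format:

    import struct
    import time

    # Hypothetical format: batch 15 one-minute readings (8 bytes each)
    # into a single 120-byte payload sent every 15 minutes.
    readings = [struct.pack("!Q", int(time.time()) + i) for i in range(15)]
    payload = b"".join(readings)      # 120 bytes of sensor data

    headers = 40 + 40                 # TCP/IPv4 headers up + bare ACK down
    print(len(payload))               # 120
    print(headers / len(readings))    # ~5.3 bytes of overhead per reading

That cuts the header cost from 80 bytes per reading to about 5, but
at the price of 15 minutes of latency... which is why I'd much rather
find out how the metering actually works first.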


Darron
--
http://www.piclist.com/techref/piclist PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
Jason White
2018-07-13 20:54:10 UTC
I feel that this could vary from provider to provider and from plan
to plan. I suspect that only they could answer that question.

I would wager that the legal fine print leaves the precise definition
of "data" vague. But considering that cell providers make money
selling data, they would certainly have an incentive to include all of
the overhead - probably everything from the IP layer (IPv4 or IPv6)
up, maybe even the physical layer.
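
Just to put rough numbers on that assumption (metering from the IP
layer up, minimum IPv4 headers, no retries - a sketch, not anything
out of a carrier contract):

    # Billed bytes for one 8-byte report if metering starts at the IP layer.
    PAYLOAD, IP, TCP, UDP = 8, 20, 20, 8

    billed = {
        "TCP (report + bare ACK)": (PAYLOAD + TCP + IP) + (TCP + IP),  # 88
        "UDP (fire and forget)": PAYLOAD + UDP + IP,                   # 36
    }
    for transport, nbytes in billed.items():
        print(transport, nbytes)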

-Jason White
--
http://www.piclist.com/techref/piclist PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist