Thursday, February 26, 2015

AT&T, please fix tower 15932433


We spend a lot of time testing codec performance on real-life 4G networks. We've found several spots in cities that function well as worst-case scenarios, including the vicinity of the White House and Times Square. We've also found a good testing ground closer to home: the Boston Common. It's a wide-open spot in the center of the city where you can count on changing cell towers pretty regularly as you walk around.

We had some interesting results yesterday while testing our LiveShot video codec there, with three modems attached. (For you audio codec users, this capability is coming soon in firmware 4.0.) The Verizon and Sprint modems were champs, but depending on location our AT&T modem would behave miserably, to the point where its presence was actually more of a hindrance to the overall stream than a help.
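For the curious, here's a rough sketch of the kind of loss-aware weighting that keeps a bad link from dragging a bonded stream down. To be clear, this is not LiveShot's actual algorithm; the quarantine threshold and link names are invented for illustration.

```python
# Hypothetical sketch of loss-aware weighting across bonded modems.
# The 30% quarantine threshold and the link names are illustrative only.

def link_weights(loss_by_link):
    """Weight each link by its recent delivery rate; quarantine links
    whose loss is so high they'd drag the whole stream down."""
    QUARANTINE = 0.30  # assumed cutoff, not a real product setting
    weights = {name: 0.0 if loss > QUARANTINE else 1.0 - loss
               for name, loss in loss_by_link.items()}
    total = sum(weights.values())
    return {n: w / total for n, w in weights.items()} if total else weights

# One modem in the "zone of doom" gets quarantined entirely:
print(link_weights({"verizon": 0.01, "sprint": 0.02, "att": 0.45}))
# {'verizon': 0.502..., 'sprint': 0.497..., 'att': 0.0}
```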

That miserable behavior was actually good news, since we were trying to tweak our algorithms to minimize or quarantine the data from a bad network. But we took it a bit further and started reading the stats on the modem as it entered and exited the "zone of doom". Sure enough, a discrete tower switch occurred at the exact moment the channel fell apart.
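Here's a hypothetical sketch of what we were doing by hand: polling the modem for its serving cell ID and packet loss once a second, and flagging any handoff that lands on a lossy cell. The read_cell_id() and read_loss() callbacks stand in for whatever stats interface your modem exposes; they're not a real API.

```python
import time

def watch_for_bad_handoff(read_cell_id, read_loss, spike=0.20):
    """Poll once a second; report tower switches, and call out any
    handoff that lands us on a lossy cell. The 20% spike threshold
    is an arbitrary choice for this sketch."""
    last_cell = read_cell_id()
    while True:
        cell, loss = read_cell_id(), read_loss()
        if cell != last_cell:
            print(f"handoff {last_cell} -> {cell}, loss now {loss:.0%}")
            if loss > spike:
                print(f"cell {cell} looks like the zone of doom")
            last_cell = cell
        time.sleep(1.0)
```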

The graph above illustrates the percentage of packet loss over time in one-second intervals. When viewed on the product, this graph moves in real time from right (most recent) to left (60 seconds ago). It's pretty obvious when you're on the bad tower.
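The bookkeeping behind a graph like that is simple. Here's a minimal sketch, assuming sequence-numbered packets and a one-second tick; the class name and structure are ours for illustration, not the product's.

```python
from collections import deque

class LossWindow:
    """Per-second packet loss over a sliding 60-second window,
    oldest interval on the left, newest on the right."""
    def __init__(self, seconds=60):
        self.history = deque(maxlen=seconds)
        self.expected = 0
        self.received = 0

    def on_packet(self, arrived: bool):
        """Call for every expected sequence number: True if it showed up."""
        self.expected += 1
        self.received += arrived

    def tick(self):
        """Close out the current one-second interval."""
        if self.expected:
            self.history.append(1.0 - self.received / self.expected)
        else:
            self.history.append(0.0)
        self.expected = self.received = 0
```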

Why is this particular cell so bad? Is it located in a place where multipath is a problem on the Common? Is there a local transmitter with a spur in the 700 MHz region blanking the cell tower's receiver? Bribing an AT&T tech would probably be the only way to find out, and none were around.

In case you think this problem is specific to AT&T, we were at this exact location about 10 months ago, and AT&T was flawless throughout the area. But at that time, an identical situation occurred on Verizon, at the opposite end of the test area. Go figure.
