
Why two seemingly identical wireless network tests give such different results

Published Jul 27th, 2018 8:34AM EDT
Best wireless networks 2018: Tutela, Rootmetrics, Opensignal
Image: Cultura/REX/Shutterstock


Wireless carriers spend billions of dollars every year to make their networks the fastest and most consistent, so as you can imagine, crowning a winner is big business. A handful of third-party companies test the US networks every year, studying download and upload speeds as well as coverage and call quality to declare one network better than the rest.

In the past, the bulk of that testing was done manually: employees took phones from every major carrier, stuck them in a suitcase, and went from place to place across the US running head-to-head tests on the networks. That road testing, as it's called, still happens today, but it's expensive and time-consuming. A newer approach is to crowd-source the data by getting users to download an app onto their phones. The app lets users run speed tests, and the results can be aggregated into a database and sorted by location and wireless carrier to produce a ranking.
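For a sense of what that aggregation step looks like, here's a toy sketch in Python. The records are made-up illustrative numbers with generic carrier names, and a real crowd-sourced dataset would carry far more detail (location, device, signal conditions), but the group-and-rank idea is the same.

```python
from collections import defaultdict
from statistics import median

# Made-up illustrative records; a real crowd-sourced dataset holds millions of
# rows with location, device, and signal metadata as well.
records = [
    {"carrier": "Carrier A", "city": "Chicago", "download_mbps": 34.2},
    {"carrier": "Carrier B", "city": "Chicago", "download_mbps": 41.7},
    {"carrier": "Carrier C", "city": "Chicago", "download_mbps": 28.9},
    {"carrier": "Carrier B", "city": "Austin",  "download_mbps": 37.1},
    {"carrier": "Carrier A", "city": "Austin",  "download_mbps": 30.5},
]

# Group the raw results by carrier, then rank carriers by median download speed.
speeds_by_carrier = defaultdict(list)
for record in records:
    speeds_by_carrier[record["carrier"]].append(record["download_mbps"])

ranking = sorted(speeds_by_carrier.items(), key=lambda kv: median(kv[1]), reverse=True)
for rank, (carrier, speeds) in enumerate(ranking, start=1):
    print(f"{rank}. {carrier}: median {median(speeds):.1f} Mbps across {len(speeds)} tests")
```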

So far, so simple. But the crowd-sourced tests have often produced different results from the road testing, which calls the methodology into question. Carriers, of course, promote the results that favor their network: Verizon trumpets reports from Rootmetrics, which has scored Verizon best overall ten times in a row, while T-Mobile likes to talk about Opensignal and Ookla (maker of the popular Speedtest app), which say T-Mobile has the fastest network.

Then there’s Tutela. A relatively new entrant to the world of crowd-sourced testing, Tutela embeds its code in the background of popular apps, which results in a slightly different methodology from Opensignal’s or Ookla’s. Its network speed tests run for a much shorter time, downloading a file of around 2MB rather than tens or hundreds of megabytes, and they are triggered randomly rather than by a user choosing to run a speed test.
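As a concrete illustration, a small-file probe along those lines might look like the minimal Python sketch below, using only the standard library. The test-file URL and the roughly 2MB payload are assumptions for illustration, not Tutela's actual endpoints or code.

```python
import time
import urllib.request

# Hypothetical test-file URL; Tutela's real endpoints and payloads are not public here.
SMALL_FILE_URL = "https://example.com/testfiles/2MB.bin"

def small_file_probe(url: str = SMALL_FILE_URL) -> float:
    """Download one small file and return the average throughput in Mbps.

    Because the transfer is short (roughly 2MB), the result reflects everyday
    usage such as loading web pages or photos, not the network's peak speed.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        payload = response.read()              # the whole file, a few megabytes at most
    elapsed = time.monotonic() - start
    return len(payload) * 8 / elapsed / 1_000_000   # bits / seconds -> megabits per second

if __name__ == "__main__":
    print(f"Average throughput: {small_file_probe():.1f} Mbps")
```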

Although you wouldn’t think those methodological differences would produce significantly different results from other crowd-sourced testing, Tutela’s results stand out among the crowd-sourced reports.

There’s a host of reasons why the crowd-sourced tests could give varying results, but Tom Luke, Tutela’s VP of Sales and Partnerships, focused on one technical explanation in particular. His description is worth reading in full:

“When transferring data over a network (e.g. when downloading an email/uploading a photo), a protocol called TCP-ramp-up is used to control the speed of the transfer. This gradually increases the speed of the file transfer, until it reaches the maximum speed possible. If you plot this as a graph, it looks like a curve, gradually increasing (see below).

When transferring a file, the download speed accelerates for the first few seconds before reaching a peak speed (peak throughput). For a very big file (e.g. 500MB, which can take many seconds, minutes or even hours to transfer), the transfer speed will accelerate over time until it reaches its top speed (peak transfer rate).

This can be compared to an F1 car on a very, very long, straight race track: if you give it enough time, it will reach a fast speed. However, nobody could actually achieve these speeds on normal roads, in real-world conditions. This is why, if you perform a speed test and wait longer, you can see the speed increasing.

The download of smaller files (e.g. 1-3MB photos) will complete before the top speed is ever reached. This is a bit like driving an F1 car on a normal road, where it cannot reach its top speed. Most mobile users use their devices to view webpages, browse social media, check emails and download pictures. These are all small files, so the real-world performance differs significantly from peak-speed results.

It is very rare for a mobile user to download a huge file (e.g. 500MB); it would only realistically happen if you wanted to download an HD TV show from iTunes while on your 4G connection, which would use a lot of data.

The big numbers that show up in peak-speed test results look much bigger and more impressive than the results from a typical speed test (like those taken by Tutela); however, Tutela’s results closely match the speeds that users get when performing almost all typical mobile usage.”
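Luke's F1 analogy can be made concrete with a rough back-of-the-envelope model. The sketch below is not anyone's actual test code: it simply assumes the data sent per round trip doubles every round trip, as in TCP slow start, until a notional 100Mbps peak is reached, then reports the average throughput a test of a given file size would record.

```python
def average_throughput_mbps(file_size_mb: float,
                            peak_mbps: float = 100.0,        # assumed link peak
                            rtt_s: float = 0.05,             # assumed 50ms round trip
                            initial_window_kb: float = 15.0) -> float:
    """Average throughput a download of `file_size_mb` would report, under a
    simplified slow-start model: the data sent per round trip doubles each RTT
    until the link's peak rate is reached."""
    remaining_bits = file_size_mb * 8_000_000
    window_bits = initial_window_kb * 8_000                  # data sent in the first RTT
    peak_bits_per_rtt = peak_mbps * 1_000_000 * rtt_s
    elapsed = 0.0
    while remaining_bits > 0:
        capacity = min(window_bits, peak_bits_per_rtt)       # what this round trip can carry
        sent = min(capacity, remaining_bits)
        elapsed += rtt_s * (sent / capacity)                 # partial RTT for the last chunk
        remaining_bits -= sent
        window_bits = min(window_bits * 2, peak_bits_per_rtt)  # ramp up, capped at the peak
    return file_size_mb * 8 / elapsed                        # megabits / seconds = Mbps

for size_mb in (2, 20, 200, 500):
    print(f"{size_mb:>4}MB download averages {average_throughput_mbps(size_mb):6.1f} Mbps")
```

In this toy model, a 2MB transfer averages far below the 100Mbps peak because most of it happens during the ramp-up, while a 500MB transfer gets close to the peak; that is the gap Luke is describing.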

Opensignal CEO Brendan Gill didn’t disagree with the suggestion that the size of the file downloaded (and the overall duration of the test) can significantly change the results; in fact, he told BGR that Opensignal’s tests run for at least 10 seconds. The difference, he said, is that Opensignal is specifically trying to test the overall throughput of the cellular network. When running short tests, he added, a number of factors including TCP ramp-up, latency, throughput, and even the availability of DNS servers can play a role.

This isn’t to say that one set of results is “better” than the other; they simply test different things. Gill emphasized that Opensignal is most interested in measuring speed, so it removes the impact of the test’s length (by running longer tests) in order to focus on speed itself.
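A duration-based test in that spirit might look something like the sketch below, again in plain Python; the endpoint and the 10-second window are illustrative assumptions, not Opensignal's actual implementation.

```python
import time
import urllib.request

# Hypothetical large test file; real test servers and methodology differ.
LARGE_FILE_URL = "https://example.com/testfiles/500MB.bin"

def sustained_throughput_mbps(url: str = LARGE_FILE_URL,
                              window_s: float = 10.0,
                              chunk_size: int = 64 * 1024) -> float:
    """Stream data for a fixed window (here 10 seconds) and report the average
    throughput. Over a long enough window, the ramp-up seconds are a small
    fraction of the total, so the figure approaches the network's peak speed."""
    start = time.monotonic()
    total_bytes = 0
    with urllib.request.urlopen(url) as response:
        while time.monotonic() - start < window_s:
            chunk = response.read(chunk_size)
            if not chunk:                      # file ended before the window did
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return total_bytes * 8 / elapsed / 1_000_000   # megabits per second

if __name__ == "__main__":
    print(f"Sustained throughput: {sustained_throughput_mbps():.1f} Mbps")
```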

Tutela takes the opposite approach, running a test that it claims is representative of the average data packet. Obviously, guesstimating the average data packet introduces its own set of methodological flaws, but at the very least, it gives outside observers another set of data to examine.

Overall, there isn’t any way for us to determine which set of data is more accurate or representative. It’s easy to point out that Tutela’s results more closely mirror those of Rootmetrics and conclude that Opensignal and Ookla are “wrong,” but it’s also valid to point out that Opensignal and Ookla have been in the game longer and their results are at least consistent. Ultimately, having a variety of different methodologies and results doesn’t mean that they’re all worthless — it just means you need to know how to interpret them.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, working at Future Publishing, Gawker Media, and then BGR. He studied at McGill University in Quebec, Canada.