How Ars tests Wi-Fi gear (and you can, too), Ars Technica




The tools and methodology we test Wi-Fi with are open source — so you can use them, too.



Behold the glory: four refurbished Chromebooks, each with an additional Linksys WUSB6300 Wi-Fi adapter for out-of-band control and communications. (Photo: Jim Salter)

After our review of Google’s Nest Wi-Fi kit last fall, we received an unexpected request: Ars reader GerbilMagnus hopped into the comments and asked for an explainer about how we test Wi-Fi. Methodological minutiae hadn’t necessarily struck us as something of broad interest, but ask and you shall receive, dear readers. Today, we’re taking GerbilMagnus’ lead and taking readers behind the scenes of our Wi-Fi testing process; we’ll also toss in a little theory and practice along the way. If you want to try our methods at home, know up front that you don’t necessarily have to replicate our entire test setup to start seeing useful results for yourself. But if you want to put the latest and greatest mesh gear through the gauntlet, we’ll absolutely cover everything from top to bottom before we’re done.

Why we run complex tests

Most professional Wi-Fi tests are nothing more than simple Internet speed tests: set up a router or mesh kit, plop a laptop down a few feet away, and let ‘er rip. The idea here is that the highest top speed at close range will also translate into the best performance everywhere else.

Unfortunately, things don’t generally work that way. All you’ve really measured is how fast one single download from one single device at optimal range and with no obstructions can go — which is usually not the thing that’s frustrating real-world users. When you’re mad at your Wi-Fi, it’s rarely because a download isn’t going fast enough — far more frequently, the problem is it’s acting “flaky,” and clicking a link results in a blank browser screen for long enough that you wonder if you should hit refresh, or close the browser and try again, or what.

The driving force behind our Wi-Fi reviews is strictly gauging real-world user experience. And if you accept our premise (that slow response to clicks is the most frequent pain point) that means you don’t want to measure speed. You want to measure latency; specifically, application latency, not network latency. (Application latency means the amount of time it takes for your application to do what you’ve asked it to, and it is a function of both network latency and throughput.)

The most challenging workload we throw at our Wi-Fi is usually just what we said in the beginning: Web browsing. When you click a link to view “a webpage,” what you’re really downloading isn’t a single page; it’s typically a set of tens if not hundreds of individual resources, of varying sizes and usually spread over multiple domains. You need most, if not all, of those resources to finish downloading before your browser can render the page visually. If one or more of those elements takes a long time to download, or stalls out and does not download, you’re left staring at a blank or only partially rendered webpage, wondering what’s going on. So, in order to get a realistic idea of user experience, our test workload also needs to depend on multiple elements, with its application latency defined as how long it takes the last one of those elements to arrive.
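That "latency of the last element to arrive" measurement can be sketched in a few lines of Python. This is an illustrative mock-up, not our actual test harness: the URLs are placeholders and the per-element download times are simulated with `time.sleep` rather than real HTTP fetches.

```python
import concurrent.futures
import random
import time

def fetch(url: str) -> float:
    """Simulate downloading one page element; returns its elapsed seconds.
    (In a real test this would be an actual HTTP GET of the resource.)"""
    delay = random.uniform(0.01, 0.05)  # hypothetical per-element time
    time.sleep(delay)
    return delay

# A "page" made of ten elements, fetched in parallel like a browser would.
urls = [f"https://example.com/asset{i}" for i in range(10)]  # placeholders

start = time.monotonic()
with concurrent.futures.ThreadPoolExecutor(max_workers=len(urls)) as pool:
    times = list(pool.map(fetch, urls))
app_latency = time.monotonic() - start  # time until the LAST element arrived

print(f"slowest single element: {max(times) * 1000:.0f} ms")
print(f"application latency:    {app_latency * 1000:.0f} ms")
```

Note that the application latency is governed by the slowest element, not the average one; nine fast fetches don't help if the tenth drags.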

Streaming video is another essential component of the typical user’s Wi-Fi demands. Users want to stream Netflix, Hulu, YouTube, and more to set-top boxes like Roku or Amazon Fire Sticks, smart TVs, phones, tablets, and laptops. For the most part, the streaming workload isn’t difficult in and of itself; streaming services keep deep buffers to smooth out irregularities and compensate for periods of low throughput. However, streaming puts a constant load on the rest of the network, which quickly leads to that “stalled webpage” pain point we just talked about.

If you set up a simple, cheap Wi-Fi router in an empty house with a big yard and ping it from a laptop, you won’t see terrible latency times. If the laptop’s in a really good position, you might get pings as low as 2ms (milliseconds). From farther away in the house, on the other side of walls, you may start seeing latency creep up considerably. This is already enough to start making a gamer’s eyelid twitch, but in terms of browsing websites, it doesn’t sound so bad; after all, the median human reaction time is more than 250ms. The problem here is that a ping “as low as” 2ms very likely has one or two pings in a chain of ten that are considerably higher than that. For a laptop in ideal conditions, the worst of those pings may still be tolerable; for one on the other side of the house, it might be 419ms or worse.

The ping we care about here is the worst ping out of ten or more, not the lowest or even the median. Since fetching webpages means simultaneously asking for tens or hundreds of resources, and we block on the slowest one to return, it does not matter how quick nine of them are if the tenth is slow as molasses; that one bad ping is the one holding us up.
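The arithmetic behind "the worst ping is the one that matters" is worth spelling out. If each of N parallel elements independently has some probability p of a slow response, the whole page stalls whenever any one of them does, so the stall probability compounds quickly. The figures below (p = 5%, pages of 1, 10, and 100 elements) are illustrative assumptions, not Ars test measurements:

```python
# P(page slow) = 1 - P(every element fast) = 1 - (1 - p)**n
def page_slow_probability(p: float, n: int) -> float:
    """Chance that at least one of n independent elements is slow."""
    return 1 - (1 - p) ** n

for n in (1, 10, 100):
    # With p = 0.05: ~5% for 1 element, ~40% for 10, ~99% for 100
    print(f"{n:>3} elements: {page_slow_probability(0.05, n):.1%} chance of a stall")
```

Even a modest 5% chance of a laggy response per element turns into roughly a 40% chance that a ten-element page stalls, which is exactly why our tests score the worst fetch in a batch rather than the average.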
But so far, we’re still just talking about a single device talking to a Wi-Fi access point in an empty house, with no competition. And that’s not very realistic, either. In the real world, there are likely to be dozens of devices on the same network, and there may be dozens more within “earshot” on the same channel, in neighbors’ houses or apartments. And any time one of them speaks, all the rest of them have to shut up and wait their turn.

