[Server-devel] Testing and comparing wireless setups.

James Cameron quozl at laptop.org
Mon Jun 24 19:24:45 EDT 2013

[Warning: this author is on coffee for the first time in months]

On Mon, Jun 24, 2013 at 07:14:11AM -0500, David Farning wrote:
> As part of the XSCE, we would like to start offering wireless setup
> guidelines.
> Large deployments might be able to afford a site assessment. Small
> deployments are left scratching their heads and making decisions
> based on anecdotal references. This results in a bottleneck: smart
> and motivated people are spending time, early in a deployment's
> lifecycle, on wireless, which means other critical tasks are left
> undone.
> This topic is challenging because it is complex. Every workload is
> different and every electromagnetic environment is different.
> My idea is to start very simply by assigning a numerical score for
> various wireless devices based on simple criteria like throughput,
> connection count, overheating, and range. My _very_ naive experiences
> are that among consumer grade devices:
> 1. Some devices can handle more connections than others before they
> start dropping connections.
> 2. Some devices can handle higher throughput - e.g. several kids
> watching YouTube while I do a download.
> 3. Some devices overheat and reset more frequently than others.
> 4. Some devices have better range than others.

I think this is overly simplistic, yet I know a simple heuristic is
what is needed.  So I suggest coming at it from a different angle.

> Does this information seem valuable to deployments? Does the general
> approach seem sane?

The stack is deep, so deep that anecdote can be inaccurate and
misleading.  The phase space has a large number of dimensions.  It may
be better to accumulate test reports so that people can form their own
opinions.  The test report should include:

- the wireless access point manufacturer, model, serial number, and
  firmware version,

- the XSCE version,

- the XO OS version, model, and wireless card,

- the measured capability of the internet service, in terms of latency
  and bandwidth, measured with ping and wget,

- the number of access points deployed adjacent to the XSCE,

- the number of XO users active during the test,

- individual user XO performance observations, in terms of latency,
  bandwidth, and packet loss, such as ping, wget, and curl POST,
  rolled up into a total performance score in the range 1 to 10.
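The per-user measurements and the roll-up into a single score could be
collected by a short script.  The sketch below is illustrative only:
the schoolserver hostname, the measurement commands (commented out so
the example runs offline with sample values), and the scoring weights
are all assumptions, not a tested methodology.

```shell
#!/bin/sh
# Sketch: collect per-user measurements and roll them into a 1..10
# score.  Hostnames and thresholds are hypothetical.

SCHOOLSERVER=schoolserver.local   # assumed XSCE hostname

# On a live network, something like:
#   latency_ms=$(ping -c 5 -q "$SCHOOLSERVER" | awk -F/ 'END {print $5}')
#   wget -O /dev/null "http://$SCHOOLSERVER/test-file"
#   curl -s -o /dev/null -F file=@test-file "http://$SCHOOLSERVER/upload"
# Sample values are hard-coded so the roll-up can be shown deterministically:
latency_ms=42.0        # sample average round-trip time
bandwidth_kbps=900     # sample download rate
loss_pct=2             # sample packet loss

# Combine the three numbers into a 1..10 score; the penalties and
# reference values are arbitrary, chosen only for illustration.
score=$(awk -v l="$latency_ms" -v b="$bandwidth_kbps" -v p="$loss_pct" '
  BEGIN {
    s = 10
    if (l > 20)   s -= (l - 20) / 100     # penalise high latency
    if (b < 1000) s -= (1000 - b) / 250   # penalise low bandwidth
    s -= p / 2                            # penalise packet loss
    if (s < 1) s = 1
    printf "%.1f", s
  }')
echo "total performance score: $score"
```

A deployment would of course tune the weights to what matters locally;
the point is only that the roll-up is mechanical once the raw numbers
are in hand.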

Then, abstract from the collection of reports a list of access points,
user counts, and total performance score.  Link each line to the
actual report.
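One way that abstraction step might look, assuming each report
contributes a CSV line (the format and the device data here are
invented for illustration):

```shell
#!/bin/sh
# Sketch: summarise per-report CSV lines into the proposed table of
# access point, user count, and total performance score, sorted so
# the best-performing setups appear first.  The data is made up.

cat > /tmp/reports.csv <<'EOF'
report-001,TP-Link WR841N,25,6.5
report-002,Ubiquiti UniFi AP,40,8.0
report-003,D-Link DIR-615,15,4.5
EOF

# Columns: report id, access point, active users, score (1..10).
# Sort numerically on the score (field 4), highest first, and format
# each line; the report id stays attached so each row links back to
# the full report.
summary=$(sort -t, -k4 -rn /tmp/reports.csv |
  awk -F, '{printf "%-20s %3s users  score %s  (%s)\n", $2, $3, $4, $1}')
echo "$summary"
```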

This ensures that the claims the community makes about the access
points can be substantiated ... which benefits the community, the
deployers, and the manufacturers of the devices.

(With enough data, of the order of ten or so reports, the workload and
radiofrequency environment aspects can be reduced in importance.  I
think those aspects are better handled by site design guidelines based
on what we find is common to all working deployments.)

James Cameron
