Sample large datastore

Martin Langhoff martin.langhoff at gmail.com
Mon May 19 01:01:24 EDT 2008


Working on ds-backup, I am concerned about the performance of extracting
all of the datastore metadata. Both Tomeu and Ivan have warned me about
the performance and memory impact, so I want to reproduce the problem as
soon as possible.

If a full metadata dump can be acceptably fast, a lot of complications
and error conditions in the ds-backup code just disappear, thanks to
rsync. I think it should be fast: our storage space limits the amount
of data we are dealing with in the normal case, and the machine is fast
enough that we should be able to burn through a few thousand entries
in less than 2s. But I need a good test case.

(So far, with small datasets I haven't been able to push it over 1s,
and Python startup dominates the time anyway.)
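
Something along these lines is what I have in mind for the measurement.
It's a rough, untested sketch: it assumes find() on the
org.laptop.sugar.DataStore D-Bus interface takes a query dict and returns
(entries, count), which may differ between datastore versions, so check
the signature on your build first.

#!/usr/bin/env python
# Time a full metadata dump over D-Bus, measuring only the find() call so
# that Python/dbus-python startup stays out of the number.
# Assumption: find() takes a query dict and returns (entries, count).
import time
import dbus

DS_SERVICE = 'org.laptop.sugar.DataStore'
DS_PATH = '/org/laptop/sugar/DataStore'
DS_IFACE = 'org.laptop.sugar.DataStore'

bus = dbus.SessionBus()
ds = dbus.Interface(bus.get_object(DS_SERVICE, DS_PATH), DS_IFACE)

start = time.time()
# Empty query means "everything"; raise the timeout for big journals.
entries, count = ds.find({}, timeout=120)
elapsed = time.time() - start

print '%d entries dumped in %.2fs' % (count, elapsed)

For the end-to-end number including interpreter startup, wrapping the
same script in /usr/bin/time gives the other half of the picture.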

Has anyone got a good-sized datastore that I can grab? I am happy to
respect the privacy of any personal datastore sample... but you'll have
to cope with me browsing around a bit. :-) My test machines are on build
703 - I'm happy to up/down/side-grade in case that matters.
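
Failing that, a rough fallback is to fabricate entries myself. Here is a
quick sketch using the sugar.datastore.datastore helper that activities
use; it assumes metadata-only writes (no payload file) are accepted, and
the resulting metadata obviously won't look like a real journal, which is
why I'd rather have the real thing.

#!/usr/bin/env python
# Fallback test case: fabricate a few thousand dummy journal entries so
# the metadata dump has a realistic entry count.
# Assumption: metadata-only writes are accepted; if not, point
# entry.file_path at a small temporary file before writing.
from sugar.datastore import datastore

for i in xrange(3000):
    entry = datastore.create()
    entry.metadata['title'] = 'ds-backup test entry %d' % i
    entry.metadata['mime_type'] = 'text/plain'
    entry.metadata['tags'] = 'ds-backup-test'
    datastore.write(entry)
    entry.destroy()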

cheers,



m
-- 
 martin.langhoff at gmail.com
 martin at laptop.org -- School Server Architect
 - ask interesting questions
 - don't get distracted with shiny stuff - working code first
 - http://wiki.laptop.org/go/User:Martinlanghoff


