JFFS2 file sizes

Jim Gettys jg at laptop.org
Wed Jul 25 20:04:09 EDT 2007


JFFS2's compression is OK, but because it compresses in small blocks
rather than over a whole stream the way a gzipped archive does, it's
less efficient than gzip for large objects.
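
A sketch of that block-size effect (the 4 KiB chunk size is an
assumption standing in for JFFS2's per-node compression unit, not a
measured value):

```python
# Compare zlib applied per 4 KiB chunk -- roughly how a block-based
# filesystem compresses each data node -- against zlib over the whole
# stream, as gzip would see it.
import zlib

data = b"the quick brown fox jumps over the lazy dog\n" * 2000
CHUNK = 4096  # assumed per-node compression unit

chunked = sum(len(zlib.compress(data[i:i + CHUNK]))
              for i in range(0, len(data), CHUNK))
whole = len(zlib.compress(data))

print(chunked, whole)  # per-chunk total is larger: each chunk restarts
                       # the dictionary and pays its own framing cost
```
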

Dave Woodhouse may be able to give typical numbers (he wrote JFFS2, and
we're fortunate to have him working on OLPC).  And individually gzipped
small files may not do much better than JFFS2.

But you don't want to uncompress the data twice; that's a power loss.

And images typically don't compress significantly anyway; putting them
into a gzip archive is just a waste of joules, and in much web content
the images dominate.  The better approach there is to adapt the images
to exactly match our screen; often the image data is of much higher
quality than can actually be displayed.
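
A quick illustration, using random bytes as a stand-in for
already-compressed image data such as a JPEG payload:

```python
# Already-compressed data is close to random, so a second compression
# pass can only add framing overhead -- it never shrinks the payload.
import os
import zlib

image_like = os.urandom(64 * 1024)       # stand-in for JPEG data
recompressed = zlib.compress(image_like)

print(len(image_like), len(recompressed))  # the second pass grows it
```
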
                             - Jim


On Wed, 2007-07-25 at 14:58 -0500, Ian Bicking wrote:
> When considering what content can be shipped or stored on the laptop, 
> we're wondering what the real disk(/flash) usage is for files.  Since 
> JFFS2 is doing compression behind the scenes, it's not completely clear. 
>   Also, it would be nice if we could estimate how much disk something 
> will use without actually having to put the content on a laptop.
> 
> My guess is that there will be per-file compression using gzip/zlib.  So 
> we could estimate the size by doing something like:
> 
>    cp -r uncompressed-files compressed-files
>    gzip -r compressed-files/*
>    du -s --apparent-size uncompressed-files compressed-files
> 
> Should I use --apparent-size?  Otherwise it looks like du is taking into 
> account the actual disk usage, which on my ext3 system isn't going to be 
> representative.  Presumably there's some filesystem overhead on JFFS2, 
> but maybe less than on ext3.  Will du give accurate disk usage amounts 
> on the laptop (taking into account compression)?  If there was an 
> equation that would be handy, e.g.:
> 
>    alignment = 1024
>    directory_overhead = overhead = 256
>    directory_file_listing = 256
>    def file_size(filename):
>        if os.path.isdir(filename):
>            return (directory_overhead +
>                    directory_file_listing*len(os.listdir(filename)))
>        compressed_size = len(open(filename, 'rb').read().encode('zlib'))
>        # pad up to the next alignment boundary
>        compressed_size += (-compressed_size) % alignment
>        compressed_size += overhead
>        return compressed_size
> 
> 
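
For what it's worth, a self-contained Python 3 rendering of an
estimator along those lines might look like this; the overhead and
alignment constants are placeholders, not measured JFFS2 numbers:

```python
# Rough per-file size estimator in the spirit of the sketch quoted
# above.  All constants are guesses, not JFFS2 figures.
import os
import zlib

ALIGNMENT = 1024
FILE_OVERHEAD = 256
DIRECTORY_OVERHEAD = 256
DIRECTORY_FILE_LISTING = 256

def file_size(filename):
    if os.path.isdir(filename):
        # assume a fixed cost per directory plus one per entry
        return (DIRECTORY_OVERHEAD +
                DIRECTORY_FILE_LISTING * len(os.listdir(filename)))
    with open(filename, 'rb') as f:
        compressed = len(zlib.compress(f.read()))
    # round up to the next alignment boundary, then add per-file cost
    padded = (compressed + ALIGNMENT - 1) // ALIGNMENT * ALIGNMENT
    return padded + FILE_OVERHEAD
```
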
-- 
Jim Gettys
One Laptop Per Child



