[OLPC Security] "Correlating bitfrost and threats"

Jameson "Chema" Quinn jquinn at cs.oberlin.edu
Tue Jul 31 13:31:57 EDT 2007


Replying off-list not out of privacy, but to avoid spamming the list.
When we reach agreement, some record should be left on the list and/or wiki.

Issue 1: sometimes you don't want to encrypt the backup.

> > Great. The case I was asking about was slightly different. My laptop
> > is turned off, my buddy wants to see the shared file for the first
> > time, and it happens to be on the server. It would be nice if the
> > file was encrypted to a one-time key which is in turn encrypted to
> > both mine and theirs, so they can get it. Not a high priority.
>
> Ah, I see now.
>
> Generally speaking, the P_Ident service of Bitfrost says that, if it
> is possible to encrypt a file securely (you trust the public key(s)
> tied to the identity(ies) you wish to share with), you should encrypt.
> That is: if possible, encrypt the file. Now, if you are making
> something open to anyone, or including untrusted peers, you are either
> unable to encrypt, or we will be able to do some nice middle ground
> with a yellow-light-esque "we sort of trust this person" UI, if that
> makes sense. All that to say: yes, it will be a feature at some point.
> Not necessarily at first ship, if only because of deadlines.


We agree.
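
For concreteness, here is a minimal sketch of that one-time-key idea:
encrypt the file once with a fresh symmetric key, then wrap that key for
each recipient. Everything here (the RSA keys, the Python `cryptography`
package, the function names) is my illustrative assumption, not
Bitfrost's actual P_Ident machinery:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_recipients(plaintext, recipient_public_keys):
    """Encrypt once with a fresh AES key, then wrap that key per peer."""
    session_key = AESGCM.generate_key(bit_length=256)  # the one-time key
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    # Any single recipient can unwrap the session key and read the file.
    wrapped_keys = [pub.encrypt(session_key, oaep)
                    for pub in recipient_public_keys]
    return nonce, ciphertext, wrapped_keys

# Hypothetical usage: both my key and my buddy's unlock the server copy.
mine = rsa.generate_private_key(public_exponent=65537, key_size=2048)
buddy = rsa.generate_private_key(public_exponent=65537, key_size=2048)
nonce, ct, wrapped = encrypt_for_recipients(
    b"shared file", [mine.public_key(), buddy.public_key()])

The nice property is that the server only ever holds ciphertext, and
adding a recipient means wrapping one more 32-byte key, not
re-encrypting the whole file.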

1a. Share UI - already agreed.

> > > > And don't tell me there will be no such thing - how does a
> > > > student hand in a paper and then receive the comments back? Not
> > > > all collaboration is real-time.
> > >
> > > You are quite right, it is not. In the homework turn-in case I
> > > suspect the student would encrypt the file in question with the
> > > teacher's public key and offer it for download. How the UI
> > > presents this is another discussion (pressing a 'Turn In Homework'
> > > key of some sort?).
> >
> > Well, in the UI paradigm, the simplest would be: share it from my
> > journal, my teacher can see it now, and by default my teacher's copy
> > is shared with me in return, so when they grade it I see their
> > modifications as the latest version.
>
> Sure.


Issue 2: preventing invisible snooping.

> > Let me redo my scenario.
> >
> > At first boot, I create a key. I split it in two parts. I leave one
> > part on the school server, and distribute copies of the other half
> > to 3 (or so) random peers. By default, nobody keeps a record of who
> > has whose key. All of this is done by system software, which is
> > pretty darn secure - it is signed by OLPC and separate from all the
> > country-specific preloads, so I don't see an easy way to get a
> > keylogger in there. The school server does not tell me who to share
> > with; my laptop decides on its own. The school server itself does
> > have a password.
> >
> > Now, for me to do a recovery OR for someone to snoop my files, they
> > need help from my school authorities and at least one of my friends.
> > In order to have a good chance of getting the right friend, they
> > have to poll at least 1/3 of the peers (that is, 1/3 of those which
> > my computer saw in its first few days of life - it should offload
> > the first copy relatively quickly, within 2 hours say, but should be
> > in no great hurry to offload the other two). This is fundamentally
> > more secure than trusting just the authorities.
>
> This is a clearer scenario (either this or the original; I understand
> your intentions better now).
>
> With that in mind, sure, being able to make the weight of the abuse
> of power a more socially obvious problem is A Good Thing(TM). I can
> say that, given the hectic state of things, something like this
> wouldn't be able to make it into FRS, but I like your thought process
> here.
>
> > > This system requires that your adolescent peers are not
> > > susceptible to social engineering....
> >
> > No, it requires that at least SOME of my peers have at least SOME
> > chance of...
>
> Yes. But ...


We're splitting hairs here.
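
Still, to pin the mechanics down, here is a minimal sketch of the split
in my scenario. I'm using a plain XOR two-way split as a stand-in for
whatever the real primitive would be, and the peer names are
hypothetical:

import os

def split_key(key):
    """Split into a server share and a peer share; both are required."""
    server_share = os.urandom(len(key))  # acts as a one-time pad
    peer_share = bytes(a ^ b for a, b in zip(key, server_share))
    return server_share, peer_share

def recover_key(server_share, peer_share):
    return bytes(a ^ b for a, b in zip(server_share, peer_share))

key = os.urandom(32)                       # the laptop's backup key
server_share, peer_share = split_key(key)  # server gets one half...
# ...and the other half is replicated to ~3 randomly chosen peers:
holders = {name: peer_share for name in ("peer_a", "peer_b", "peer_c")}
assert recover_key(server_share, holders["peer_b"]) == key

Neither half alone reveals anything about the key; recovery (or
snooping) takes the server's share plus the cooperation of any one of
the three peers.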

> > The game is to make the community aware of what snooping is going
> > on, which makes it at least potentially subject to community
> > standards, WHATEVER they are. (The broadcast could just as well say
> > "The local holy man wants to see X's files" as "X lost their laptop
> > and wants to recover their files to a new one". If at least one of
> > the 3 peers says "yes, that's legitimate", the attempt is
> > successful. But if the community then tells the server administrator
> > "we'll have no more of that", in the long term, the community
> > standards of privacy are preserved.)

...

> > No, they are not asked to confirm that the snoop request is
> > legitimate, only that it's plausibly so. Much lower barrier. Again,
> > the game is to make snooping public, not to stop it.
>
> You're drawing a pretty fine line trying to separate the two. The
> only reason Jane (the person being asked for confirmation) would
> think it is not "plausible" that John is asking for a confirmation of
> himself (essentially) is if she has some reason to explicitly doubt
> that John is the one asking - by asking him outright, knowing that he
> is not at his computer, or similar absolute means. Otherwise, the
> request to confirm that she has John's key could come from any number
> of people, and "it is plausible that he might be doing this" is a
> vague guideline.


Yes. So she essentially always says "yes". But the next time she sees John,
she (or someone else who happened to notice the unobtrusive announcement)
says, "so how did you lose your laptop? Tell me the story" - and if he says
"huh?", people start to suspect things.

With a million laptops per country, there's a big difference between "they
can get your data whenever they want to, and you'll never know" and "they
can get your data 99.9% of whenever they want to, but even if they do,
there's a 5% chance you'll find out".
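
The shape of those numbers is easy to sanity-check. The per-peer
probabilities below are my illustrative inputs, nothing more:

def snoop_coverage(f, copies=3):
    """P(at least one of a victim's peer-shares sits on a coerced
    laptop) when a fraction f of all laptops has been coerced."""
    return 1 - (1 - f) ** copies

def exposure(p_flag, peers_asked=3):
    """P(at least one polled peer notices and flags the request)."""
    return 1 - (1 - p_flag) ** peers_asked

print(snoop_coverage(0.60))  # ~0.936: coercing 60% of laptops
print(snoop_coverage(0.90))  # ~0.999: coercing 90%
print(exposure(0.017))       # ~0.05: a ~1.7%-per-peer chance of a flag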

> > > > Obviously, a committed big-brother could still twist two kids'
> > > > arms and convince the rest to ignore the message. But that
> > > > doesn't scale well to stealing EVERYONE's data and mining it,
> > > > which is the real threat.
> > >
> > > But they wouldn't need to get everyone. The handful of kids they
> > > choose to strong-arm (or coerce, or 'borrow' the laptops of, or
> > > analyze while repairing, or, or, or...) would be - using your
> > > system - carrying 1/2 of another person's key. Several halves, in
> > > fact. Get a large enough sample, and you are getting the private
> > > keys of children that they didn't directly interact with. Would
> > > this be easy for a million kids at once? Probably not. But at that
> > > point, you are also assuming a great number of things about your
> > > government, like that they didn't install keyloggers on the XOs
> > > before shipping them out or during repair.
> >
> > Yes, true, a committed evildoer with the key-copies from 60% of the
> > laptops would be able to snoop on 1 - (1-0.60)^3 = 93.6% of the
> > files on average. With 90%, they'd get 99.9%. Still, that is a lot
> > of work just to get to EXACTLY where they would be from the start
> > in your scenario. Also, remember that if any kid puts a password on
> > their laptop, the stolen-during-repair scenario should be
> > preventable (assuming that there is a way to put it in an unlocked
> > "repair mode" which only protects the securest data, like keys).
> >
> > Also, Bitfrost prevents keyloggers.
>
> This should be clarified: Bitfrost *does not* stop keyloggers. And
> this is important. Bitfrost *explicitly* makes it possible (though it
> is an abuse of authority at the deepest level) for OLPC or the
> country to install software that can bypass security restrictions,
> keyloggers included. This can either work inside of Bitfrost or
> simply ignore it entirely. Both the country and OLPC are able to sign
> binary patches that are installed on the machine. Kernel modules,
> X11, Sugar, DBus, the firmware - you name it. Heck, they could even
> install a hardware keylogger.
>
> If they touch it, they own it. If they can install software on it,
> they own it. If they can convince you to install software on it, they
> own it. [1] (Ironic that it is a Microsoft site, I know)


OK, granted. But here I live in Guatemala - a pretty run-of-the-mill
third-world country. Not as well-run as Mexico (or another 20 like it), but
much better off than Haiti or most of Africa. Genocide in the living memory
of many people I know, a government you can't trust, the policeman is NOT
your friend - but no ongoing conflict. And I'm pretty sure that the
government couldn't get it together enough to install special snooping
software if they wanted to, and 100% sure they couldn't do it without
getting caught. And the more capacity a country has to create such
snoopers, the more savvy its citizens are about catching them. Even in,
say, Colombia, with plenty of motive and help from super CIA advisors, the
chances of getting caught would be more than half. And so fine: they'd
tough it out, say "the war on the narcos is more important than your
privacy", and people might accept that. But at least they'd have the
conversation.

That "10 laws of security" site you pointed me to is essentially saying that
one serious security flaw is enough to undermine all the rest. That's true
collectively, too. Technical security can be undermined by human insecurity
- and human security can be undermined by technical insecurity. You're
essentially saying that the human failures mean it's not worth trying
technically. I'm saying it's important to give security a chance - if you
store the whole key on the same disk as the encrypted file, you're selling
out the society without giving it a chance. Individual security will never
be 100%, but collectively speaking incremental improvements can make
night-and-day differences in "herd immunity".

> I think the realm we are getting into with this is missing the forest
> for the trees. We are trying to find a way to make it difficult
> for The Man to get away with snooping. Yes, you said you didn't mean
> to stop it, just to make it public, but the goal of making it public
> is that the community will exercise social pressure to stop the spying
> (or to actively engage in communication outside the band of
> compromise). The ends, I would argue, are the same. Either by
> technical or social means, our discussion has been trying to
> essentially stop a specific case of Bad Things from ever happening.
> This is entirely possible (ignoring the at gunpoint extreme cases),
> but not without making the user take on an additional responsibility.


A small one. Actually, three small ones. 1) The need to somehow get 1 out
of 3 specific peers to respond "yes" in order to restore your data after
your laptop is totalled - or else that one person's data is lost.
(Remember, most people today don't even have good backups, and their
laptops are far more breakable and stealable than these are.) 2) There
should be some nonzero chance that a given person would notice an
illegitimate request - say, 1 in 100 - and then some nonzero chance that
they'd raise a fuss about it - again, say 1%. 3) There should be some
nonzero chance that a given person would notice BIOS or OS tampering on
their laptop - say, 1 in 10,000.
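
Rough arithmetic on why those small chances still bite at scale, using
the illustrative numbers from point 2 above:

p_notice = 1 / 100    # a peer notices an illegitimate request (point 2)
p_fuss = 1 / 100      # ...and then actually raises a fuss about it
p_flag = 1 - (1 - p_notice * p_fuss) ** 3  # any of 3 peers, per victim

def mass_snoop_exposed(n_victims):
    """P(at least one public fuss) if every victim is snooped once."""
    return 1 - (1 - p_flag) ** n_victims

print(p_flag)                     # ~3e-4: one target is almost never caught
print(mass_snoop_exposed(100))    # ~0.03: a handful of targets, likely quiet
print(mass_snoop_exposed(10**6))  # ~1.0: mine everyone, exposure is certain

Targeted snooping stays cheap, but population-scale mining becomes
self-exposing - which is exactly the asymmetry I'm after.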

...

>
>
> As mentioned above you are trusting that the users probed for
> information are aware enough to know that they are being probed and
> not benignly misguided, outright tricked or otherwise worked around.
> This is part of the limitation of not relying on the XO owner for a
> token (password/passphrase, etc).


Most will not be aware. It only takes a few - and it only takes a
possibility to make the snooper think twice.

Issue 3: the second level of backups - regional, or peer-to-peer among the
school servers?

I think we're talking past each other. My point is:
A: a regional backup is only good for restoring the school server, because
you can't rely on the local internet connection at the moment the kid wants
their old data. There is no file that you want to throw away locally that's
still worth keeping regionally (after all, you have at least 3-5 gigs per
kid at the local level, which is plenty).
B: this process introduces a whole new level of snooping risk.

Thinking about it more, I actually agree - a regional/countrywide backup
center may be better than a peer-to-peer scheme, especially if they can
deliver a replacement server already cloned from the backup and ready to
boot. Yet it is really hard to introduce real security at this level. Even
if the server data is encrypted to a password, many passwords will be weak,
and many server admins will happily share them with the regional boss, so
that's next-to-useless. And if it IS strong, there are a million ways to
lose it or forget it, so it's more of a hassle.
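
"Encrypted to a password" would presumably mean deriving the backup key
with a slow KDF - a sketch below, assuming Python's stdlib scrypt. Note
that no KDF rescues a password the admin happily shares or that a
snooper can guess:

import hashlib, os

def backup_key_from_password(password, salt):
    # A memory-hard KDF slows brute force, but a weak or shared
    # password defeats it regardless of these parameters.
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

salt = os.urandom(16)
key = backup_key_from_password("regional-admin-passphrase", salt)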

This only makes it all the more urgent to do the security right from step
1. I understand my scheme's not perfect, but it is much better than
nothing. I think the extra hassles it introduces are minor, when you
consider that without it the kids have NO SUBSTANTIAL PROTECTION WHATSOEVER
against snooping. Of course, the government is always the main risk - but
even an ordinary hacker will find the keys if they're always under the same
pot next to every door.


> --
> Michael Burns * Intern
> One Laptop Per Child
>

Cheers,
Jameson Quinn

