- 1 Enhanced Anti-theft
- 2 Possible "virus"
- 3 Protecting privacy: threat model
- 4 Document bundles...
- 5 How is Bitfrost related to capabilities?
- 6 Error in the description
- 7 Sort of Unique
- 8 external drive access for activities
- 9 P_BIOS_COPY
- 10 Taste the rainbow
- 11 Metadata...
- 12 OLPC Bitfrost and Bitfrost
- 13 P_NETWORK clarification
Enhanced Anti-theft
I gather the primary anti-theft lease mechanism completely disables boot from the protected SPI ROM code so that a thief cannot recover from lease expiry by booting from a thumb drive etc.
This meets the most urgent and primary requirement of discouraging widespread organized theft for resale.
A less important problem is recovering laptops that have been stolen (or just "found") casually by people unaware that the laptop will stop functioning. Rather than enabling its return when they discover it is useless to them, they may destroy it, throw it away somewhere it cannot be found, or damage it in the hope of fixing it or finding saleable parts.
In some environments, given the scale of casual theft and/or the lack of a norm for handing in lost and found property (eg with corrupt police etc), such recoveries could turn out to be significant.
A problem with the lease mechanism is that it may encourage a thief, or a person who finds a lost laptop and plays with it for a while instead of returning it promptly, to destroy or secretly dispose of the laptop instead of allowing it to be found again and returned. During the period between stealing or finding a lost laptop and the lease expiry, a person who has started using it may have entered identifying data (including a photograph). Since they cannot delete this data once the laptop stops working and they find out about the lease mechanism, they could fear being caught (or unjustly accused) if the laptop is recovered, and take measures to protect themselves.
The proposed solution: instead of completely disabling any boot at all, set a flag that switches to booting only a specific environment that works in "kiosk mode", carefully limited and tested so that there is no possibility of the user gaining control.
The problem is that organized commercial theft for resale could disable this mode by modifying the unprotected "kiosk mode" boot files during the period before the lease expires (since they cannot be stored within the protected area of the SPI ROM). It would therefore be necessary to run cryptographic integrity checks (a signed file manifest with secure hashes) before loading them, failing safe to no boot at all if they have been modified.
Provided that mechanism is available, the normal kernel and verified safe applications could be used, but with a patched environment that disables the keyboard etc and only autoloads in "kiosk mode".
In this "kiosk mode" the laptop would do nothing useful for its possessor, but would periodically and automatically, at random times, turn itself on to inform them of the procedures for returning it and convincingly explain that there will be no repercussions from doing so. For example it could play audio, perhaps accompanied by comic-strip graphics or even animations that have been localized for this purpose.
To conserve battery the message could either be stopped or paused when any key is pressed, but would always repeat at power on and random intervals.
This would be most useful together with an official government policy designating specific collection points (most obviously Post Offices). A copy of the official directive (with official looking signatures and logos) and reference to the relevant regulations addressed to the Post Office staff could be displayed on the screen and read out in the audio.
This would direct the recipient that anybody delivering a laptop is to be assumed to have found it lost and is not to be asked any questions regardless of any suspicions, and that the laptops are to be despatched postage-free (or with postage guaranteed by the recipient) to a specified Education Ministry address so that they can be returned to the owner. It would also explain that all data on the laptop has already been (securely) wiped, and that this is part of a program intended to protect the privacy of the children who own the laptops as well as to encourage people who find them to return them promptly when lost.
It is of course still necessary to be able to update and recover the laptop with the normal Bitfrost secured procedures. Only after the flag has been set by lease expiry would the restriction that only permits recovery using the anti-theft keys kick in.
See Intel's Boot Integrity Services (Version 1.0) for an API that fully specifies complex PKI integrity checks for boot files, including provision for complex delegation of update authorizations between the manufacturer (OLPC), user IT organizations (national and regional education authorities) and others (eg for use with individual "developer keys" provided to end users, presumably including all children who retain their laptops when they leave school). It is designed for use with PXE (the Preboot Execution Environment), and presumably source code implementations are readily available, as PXE is used upstream with Linux on PCs with typical BIOSes supported by Red Hat.
Presumably the standard Bitfrost mechanisms for signed manifests would be similar but the document is useful for specifying a complex delegation model for updates etc, with management software available to handle the actual administration.
The same mechanism can of course be used to provide reassurance that the laptop is not "broken" when a lease expires due to a child moving from a school etc with instructions on how to recover.
In addition it could be used to track down lost or stolen laptops by making regular probes for a designated WiFi SSID and/or responding to such beacons. All XOs (and repeaters etc) would be configured to coordinate with any school server (anywhere) to triangulate missing laptops, using the mechanisms that are already deployed for 911 locality services and will no doubt eventually be implemented for pictorial displays of mesh neighbourhoods on XOs (but in this case not showing up in the normal display available to all, instead being silently notified to whoever is designated to attempt recovery).
Perhaps, once a laptop has been registered as missing and marked to be disabled when it "calls home" on connecting to the internet, this tracking could run for a period of several days (but less than a full lease expiry period). That might achieve a higher recovery rate than using the internet connection for immediate shutdown, as the casual thief/finder would continue operating the laptop while being tracked. The delay could also avoid wiping the child's data since the last backup, if it resulted in recovery before the full shutdown to kiosk mode occurs.
The space occupied by the multimedia warnings/recovery instructions could be treated as simply reserved space with provision in the UI for deleting them to recover from "out of space" conditions (eg a "use emergency reserve space" option in the notification of no more room). Backup and archiving procedures would automatically restore the files whenever they are noticed missing and there is sufficient room. Such a facility could be useful anyway (perhaps with a larger reserve).
Possible "virus"
The design goals of the laptop, including collaboration and modification of source code, make certain kinds of cross-computer self-replicating code possible. For instance, a program with network privs could watch for other users collaboratively using the "develop" activity and "suggest" the insertion of malicious code. Obviously, this is not a major threat, as such a virus would be dependent on user actions at the receiving end, but should be considered.
In order to stop this kind of activity, various finer-grained security privileges would work. For instance, the identity service in P_IDENT could include the registered name of the initiating application and its provenance (signed or unsigned) as a part of any signature. There could be separate network privileges for foreground and background/server use. And there could be a set of "remote privileges" for the kinds of communication an app could carry out with other bitfrost-aware clients - only with apps of the same name, for instance.
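The first suggestion - binding the initiating application's registered name and provenance into the signature - might look roughly like this, with an HMAC standing in for P_IDENT's real signature primitive and all field names invented for illustration:

```python
import hmac
import hashlib

def sign_with_provenance(user_key, app_name, provenance, message):
    """Sign a message so receivers can see which app produced it.

    provenance would be eg "signed" or "unsigned"; a Bitfrost-aware
    receiver could then refuse messages from unsigned apps, or accept
    them only from apps with the same registered name.
    """
    payload = b"\0".join([app_name.encode(), provenance.encode(), message])
    tag = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return {"app": app_name, "provenance": provenance,
            "message": message, "sig": tag}

def verify_envelope(user_key, envelope):
    """Recompute the tag; any change to app name, provenance, or body fails."""
    payload = b"\0".join([envelope["app"].encode(),
                          envelope["provenance"].encode(),
                          envelope["message"]])
    expected = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["sig"])
```

Because the app name and provenance are inside the signed payload, a malicious program cannot strip or forge them without invalidating the signature.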
On the other side, there are some use-cases which are not covered by this model as described.
- The develop activity itself needs some P_SOURCE access beyond the regular P_DOCUMENT.
- For both P_DOCUMENT and P_SOURCE, the application can take a document handler initially and not just from the "open" dialogue box.
- A P_DOCUMENT_RO browser should be, by default, able to cause a document to open, either with its default handler app or using the OS-provided "open with" dialogue.
- There should be a P_DOCUMENT_METADATA that allows a P_DOCUMENT_RO app to do tagging tasks. (Note that if tags are used by the journal for assigning sharing scope, this would be a security hole. I think that a browser app should have its writeable tagging namespace restricted, and that any attempt to use a member of such a namespace for sharing in the journal app should throw up a security confirmation dialog.)
- Privileges for interapplication communication on the same machine.
Homunq 10:48, 29 July 2007 (EDT)
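The tag-namespace restriction suggested above could be checked roughly like this (the "app:" prefix scheme and the privilege flag are assumptions for illustration, not part of the spec):

```python
# Hypothetical policy: a P_DOCUMENT_RO app may only write tags inside
# its own per-app namespace, and the Journal must confirm with the user
# before any tag from such a namespace is used for sharing scope.

RESTRICTED_PREFIX = "app:"  # assumed namespace scheme

def may_write_tag(app_name, tag, has_full_document_priv):
    """Apps with full P_DOCUMENT can tag freely; P_DOCUMENT_RO apps
    are confined to tags like "app:<app_name>:<anything>"."""
    if has_full_document_priv:
        return True
    return tag.startswith(RESTRICTED_PREFIX + app_name + ":")

def sharing_needs_confirmation(tag):
    """Any tag from a restricted namespace triggers a security dialog
    before the Journal will use it to assign sharing scope."""
    return tag.startswith(RESTRICTED_PREFIX)
```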
Protecting privacy: threat model
There is an entire (friendly) critique of this document at correlating Bitfrost and threats in which several issues are mentioned. I feel that the following issue is central enough that it should be here on the main talk page so that it can receive wider comment. The views I express here are my own, though they are inspired by the aforementioned document.
The Bitfrost spec is focused on privacy at the individual laptop level, and so almost completely ignores threats to privacy at the backup level. At this level, there are three threats: hackers, thieves, and the authorities themselves.
With regard to the third of these threats, the security should not aim to protect privacy in all cases - if Bitfrost blocks duly-executed subpoenas, governments will install a back door, and arguably they would be right to. Rather, a reasonable security model would attempt to ensure that any widespread snooping is publicly known. This would enable the community to enforce its own standards.
I am looking at 4 privacy threats:
1. A hacker able to compromise, possibly through social engineering, the local school server.
2. A thief who steals the local school server.
3. A government employee or other authority with access to the school or regional backup server and *without a socially-accepted reason to read someone's files.
4. A government which intends to engage in *widespread* snooping on its citizens *without their knowledge or public consent*.
I also consider the following not a threat, and in fact a necessary feature:
- 5. A school or government employee with a reason that is considered legitimate by community standards should be able to gain access to the files of *one student at a time*. For a non-controversial instance, in the case where the laptop (or the password, if it exists, or both - for instance in an accident which kills the child) is lost.
The following threat is outside the threat model. Note the crucial distinction from threat 4; this distinction simultaneously makes a technical defense impossible, and makes a social defense possible.
6. A government which intends to *openly* engage in widespread snooping on its citizens.
Existing bitfrost response
The spec as it stands states that documents would be encrypted when stored "on secondary backups". The implication is that they would be sent unencrypted to the primary local backup. This is the only real possibility given the current plan of backing up the users' keys themselves. Though it is not mentioned here, the logical conclusion (and other communications from OLPC members) is that there will be a secondary level of offsite backup, in case the local backup computer is destroyed. This secondary backup would also need the users' keys if it is intended to serve as the basis for a full replacement of a destroyed server.
This leaves the children's data vulnerable to all of the above-mentioned threats in two locations. The local servers would be more attractive for thieves and hackers, and the second-level backups more so for out-of-control authorities, but either way the situation is very insecure. (Lest a first-world reader imagine that the threat of physical theft targeted at information is minor - here in Guatemala, there is at least one social organization I know of that has had its computers stolen 3 times in the last year, and many others with 1 such break-in.)
At first boot, I create a key. I split it in 5 Shamir shares, of which any 3 will suffice to reconstruct it. I leave two shares on the school server, and distribute the other 3 to randomly-selected peers. By default, nobody keeps a record of who has whose key. All of this is done by system software, which is pretty darn secure - it is signed by OLPC and separate from all the country-specific preloads, so I don't see an easy way to get a keylogger in there. The school server does not tell me who to share with, my laptop decides on its own. (edited to add Shamir process Homunq 22:10, 2 August 2007 (EDT))
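The splitting step above can be sketched with textbook Shamir secret sharing over a prime field. This is illustrative only - a real implementation would use a vetted library and a field sized for real key material:

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; the field must be larger than the secret

def split_secret(secret, n=5, k=3):
    """Split `secret` (an int < P) into n shares, any k of which recover it.

    A random polynomial of degree k-1 is chosen with the secret as its
    constant term; share i is the point (i, poly(i)) mod P.
    """
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        shares.append((x, y))
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x=0 over GF(P); needs at least k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

With n=5 and k=3, leaving two shares on the school server and three with random peers means the server alone (two shares) learns nothing, while the server plus any one peer, or all three peers without the server, can reconstruct the key - exactly the recovery properties described above.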
All non-BLOB files are encrypted on my laptop before backing up on the server. BLOB's (largely multimedia files), if they're too computationally expensive to encrypt on the laptop, could be world-readable by default, and when a child turns off that sharing, they could be encrypted (on laptop and server).
Now, for me to do a recovery OR for someone to snoop my files, they need help from my school authorities and at least one of my friends (or, without the school, all 3 random peers). In order to have a good chance of getting the right friend, they have to poll at least 1/3 of the peers (that is, 1/3 of those which my computer saw in its first few days of life - it should offload the first copy relatively quickly, within 2 hours say, but should be in no great hurry to offload the other two). This is fundamentally more secure than trusting just the authorities.
There would be no need for OLPC to try to influence the circumstances for the children (peers) to cooperate with their half of the key. Their resistance and skepticism can be initially quite low. The aim is not to keep snooping from happening - certainly, you have to allow it in the case of a legitimate subpoena, for instance. The aim is to make the community aware of what snooping is going on, which makes it at least potentially subject to community standards, WHATEVER they are. (The broadcast could just as well say "The local holy man wants to see X's files" as "X lost their laptop and wants to recover their files to a new one". If at least one of the 3 peers says "yes, that's legitimate", the attempt is successful. But if the community then tells the server administrator "we'll have no more of that", in the long term, the community standards of privacy are preserved.)
A thief or hacker is pretty much locked out by this system. They could steal the school server, and probably crack any password it has, but all they get is a bunch of encrypted files (and maybe a few home movies). An out-of-control government would not be stopped by this system alone, but it would make it much harder to act in secrecy or on a large scale. Community knowledge doesn't necessarily give the power to stop excessive snooping from happening, but it's a necessary precondition.
Note that this does NOT protect against threat 6, as the government simply disables the system. However, as this is done on the laptops themselves, it will be caught by at least one of the million users. Therefore, threat 4 is still locked out.
Homunq 15:09, 31 July 2007 (EDT)
Document bundles...
The basic model of saved HTML files - a master document, with subdocuments in a specially-named folder stored at the same location - should be supported with P_DOCUMENT by default. This is useful in many cases, not just HTML. The exact requirement on the folder name - including the name of the master document and a specific suffix - should provide enough security against abusing this privilege. Homunq 09:49, 13 August 2007 (EDT)
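Under the assumed convention that the bundle folder is the master document's name plus a fixed suffix (the suffix here is made up, not the one the spec would mandate), the access check might look like:

```python
import os

BUNDLE_SUFFIX = "_files"  # assumed convention: "report.html" -> "report_files/"

def bundle_folder_for(master_path):
    """Return the one subfolder a P_DOCUMENT grant on `master_path` extends to."""
    base, _ext = os.path.splitext(os.path.basename(master_path))
    return os.path.join(os.path.dirname(master_path), base + BUNDLE_SUFFIX)

def access_allowed(granted_master, requested_path):
    """Allow the master document itself, or anything inside its bundle folder."""
    if os.path.abspath(requested_path) == os.path.abspath(granted_master):
        return True
    folder = os.path.abspath(bundle_folder_for(granted_master))
    return os.path.abspath(requested_path).startswith(folder + os.sep)
```

Because the folder name is derived strictly from the granted document's own name, an activity cannot use this rule to reach any other file the user did not open.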
How is Bitfrost related to capability models such as EROS/CAPROS?
It would be interesting to see some references to other papers.
Error in the description
Near the end, we find this statement
- This story is quite remarkable, as it amounts to a 13th century recognition of the idea that there's no such thing as a perfect security system.
This is wrong! What is remarkable is that modern humans believe there is such a thing as a perfect security system. Throughout history, mankind has always lived with imperfect security. Do you build a wall around the village or not? How high should the wall be? How thick? How much food should you store behind the wall? Do you protect the peasants (the source of next year's food supply) or not? If you are protecting a region, how many forts do you build? How far apart? How many soldiers per fort? How many patrols per day?
Historically, man has always applied a layered security model and cost-benefit analysis in designing each layer. Even security through obscurity has a long history - for example, the tomb of the first emperor of China with its thousands of clay soldiers.
Sort of Unique
"In terms of security, the OLPC XO laptops are a highly unique environment." "Highly" should be removed, unless "unique" is meant in the sense of peculiar or unusual. unique: "1. being the only one 2a. being without a like or equal 2b. distinctively characteristic : peculiar 3. unusual" [Merriam-Webster's Dictionary]. Modifiers may be appropriate when used in sense 2b or 3, but not in sense 1 or 2a. (It doesn't make sense to say "highly only one".)
external drive access for activities
Say I have an activity that essentially runs off of an external memory. It should be able to use a special "open" dialog box that just says "insert external drive". Sugar would then hand it a specific folder on that drive. This makes for a simpler UI for kids than always having to use a journal select.
If they inserted the drive first, it wouldn't work - they'd have to remove and reinsert it. This would prevent an activity from silently gaining access to a drive. Homunq 07:57, 21 December 2007 (EST)
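The insert-after-request rule could be sketched as follows (the class, method, and folder names are all hypothetical; Sugar's real mount handling is not specified here):

```python
import time

class ExternalDriveGate:
    """Hypothetical Sugar-side gate: an activity only gets a drive folder
    if the drive is inserted *after* the request, so a pre-inserted drive
    is never silently exposed to the activity."""

    def __init__(self):
        self.pending = None  # name of the activity with an open request

    def request_drive(self, activity_name):
        # called when the activity shows its "insert external drive" dialog
        self.pending = activity_name

    def on_drive_inserted(self, mount_point):
        """Called by the mount handler. Returns the folder to hand the
        activity, or None if no request was outstanding."""
        if self.pending is None:
            return None  # unsolicited insertion: drive stays inaccessible
        activity, self.pending = self.pending, None
        # hand the activity a folder of its own on the drive (layout assumed)
        return f"{mount_point}/activities/{activity}"
```

Each insertion satisfies at most one request, so an activity cannot queue up access to future insertions either.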
P_BIOS_COPY
Has P_BIOS_COPY been implemented in the Give One Get One laptops?
Taste the rainbow
Is it just me, or is it odd that the 'authoritative version of this document' in the metadata is not, in fact, this document? Would someone who understands the standard for this metadata fix it? CharlesMerriam 06:38, 20 March 2008 (EDT)
OLPC Bitfrost and Bitfrost
OLPC Bitfrost and Bitfrost appear to be closely-related versions of the same document. Am I missing something, or should a merge template be placed on OLPC Bitfrost proposing its merger into Bitfrost? Although OLPC Bitfrost is the longer article (perhaps too long), Bitfrost is more heavily linked and more recently modified (by Walter). I would suggest keeping Bitfrost as the primary page, merging OLPC Bitfrost content into Bitfrost, and hunting down the OLPC Bitfrost page refs and re-pointing them. Cjl 14:45, 7 April 2008 (EDT)
P_NETWORK clarification
I had the brilliant idea that lack of P_NETWORK does not mean complete isolation. Since Telepathy runs in a different process, on the other side of D-Bus, you could have Telepathy access without P_NETWORK. I proposed this in a message on <firstname.lastname@example.org>, and the eventual response from Ivan (the Bitfrost author) was "While not made explicit in the spec, this is indeed the design and the way it was discussed with the Collabora people."
Some implications of this:
- Telepathy should not allow privately sending messages to arbitrary IPs. Any outgoing traffic should either be publicly visible (practically speaking - that is, actually present in somebody's UI, not just snoopable by a packet sniffer) or, if private, should have the destination chosen by some trusted UI/negotiation which does not allow spyware to export its data.
- For truly high-security data, we may want to have a revocable P_TELEPATHY too.