System security on the One Laptop per Child's XO laptop
The Bitfrost security platform
=======================================================

:Author
    Ivan Krstić
    ivan AT laptop.org
    One Laptop Per Child
    http://laptop.org

:Acknowledgments
    Simson Garfinkel, a security consultant for OLPC, contributed to this
    document. This document also builds upon a growing body of work known as
    "HCI-SEC," the application of recent advances in the field of Human
    Computer Interaction to the important goals of computer security. More
    information about HCI-SEC can be found in the book "Security and
    Usability," by Lorrie Cranor and Simson Garfinkel (O'Reilly, 2005), and in
    Garfinkel's PhD thesis, "Design Principles and Patterns for Computer
    Systems that are Simultaneously Secure and Usable" (MIT, 2005).

    We also acknowledge a panel of reviewers who prefer to remain anonymous,
    and who provided insightful comments and feedback on previous drafts of
    this document.

:Metadata
    Revision: Draft-19 - release 1
    Timestamp: Wed Feb  7 00:50:57 UTC 2007
    Feedback URL: http://mailman.laptop.org/mailman/listinfo/security
    Authoritative version of this document: http://wiki.laptop.org/go/Bitfrost

    We welcome feedback on this document, preferably to the public OLPC
    security mailing list, for which you can sign up at the feedback URL given
    above. If you strongly prefer to keep your comments private, you may mail
    the author of the document at the provided e-mail address.

    This is NOT the final version of the specification. The contents of this
    document accurately reflect OLPC's thinking about security at the time of
    writing, but certain aspects of the security model may change before
    production. This document will be updated to reflect any such changes. The
    latest version of this document may be found at the authoritative version
    URL.




0. Introduction
===============

0.1. Foreword
-------------

In 1971, 35 years ago, AT&T programmers Ken Thompson and Dennis Ritchie
released the first version of UNIX. The operating system, which started in 1969
as an unpaid project called UNICS, got a name change and some official funding
from Bell Labs when the programmers offered to add text processing support. Many
of the big design ideas behind UNIX persist to this day: popular server
operating systems like Linux, FreeBSD, and a host of others all share much of
the basic UNIX design.

The 1971 version of UNIX supported the following security permissions on
user files:

    * non-owner can change file (write)
    * non-owner can read file
    * owner can change file (write)
    * owner can read file
    * file can be executed
    * file is set-uid

These permissions should look familiar, because they are very close to the same
security permissions a user can set for her files today, in her operating
system of choice. What's deeply troubling -- almost unbelievable -- about these
permissions is that they've remained virtually the _only_ real control
mechanism that a user has over her personal documents today: a user can choose
to protect her files from other people on the system, but has no control
whatsoever over what her own programs are able to do with her files.

In 1971, this might have been acceptable: it was 20 years before the advent of
the Web, and the threat model for most computer users was entirely different
from the one that applies today. But how, then, is it a surprise that we can't
stop viruses and malware now, when our defenses have remained largely unchanged
from thirty-five years ago?

The crux of the problem lies in the assumption that any program executing on
a system on the user's behalf should have the exact same abilities and
permissions as any other program executing on behalf of the same user. 1971 was
seven years before the first ever international packet-switched network came
into existence. And the first wide-area network using TCP/IP, the communication
suite used by the modern Internet, wasn't created until 1983, twelve years
after Thompson and Ritchie designed the file permissions we're discussing. The
bottom line is that in 1971, there was almost no conceivable way a program
could "come to exist" on a computer except if the account owner -- the user --
physically transported it to a machine (for instance, on punched tape), or
entered it there manually. And so the "all or nothing" security approach, where
executing programs have full control over their owner's account, made quite a
lot of sense: any code the user executed, she ipso facto trusted for all
practical purposes.

Fast forward to today, and the situation couldn't be more different: the
starkest contrast is perhaps the Web, where a user's web browser executes
untrusted scripting code on just about every web page she visits! Browsers are
growing increasingly complex sandboxing systems that try to restrict the
abilities of such web scripts, but even the latest browser versions are still
fixing bugs in their scripting engine implementations. And don't forget e-mail:
anyone can send a user an executable program, and for many years many users'
instinctive reaction was to open the attachment and run the program. Untrusted
code is everywhere, and the only defense seems to be tedious user training and
antivirus software -- the latter assuming it's fully updated, and assuming the
antivirus makers have had time to deconstruct each latest virus and construct a
defense for it.

Most technologies and approaches mentioned in the rest of this document do not
represent original research: they have been known in the security literature
for years, some of them have been deployed in the field, and others are being
tested in the lab. What makes the OLPC XO laptops radically different is that
they represent the first time that all these security measures have been
carefully put together on a system slated to be introduced to tens or hundreds
of millions of users. The laptops also possibly represent the first time that a
mainstream computing product has been willing to give up compatibility with
legacy programs in order to achieve strong security. As an example, you'll find
that talk of anti-virus and anti-spyware technology is conspicuously absent
from this document, because the Bitfrost security platform on the XO laptops
largely renders these issues moot.

We have set out to create a system that is both drastically more secure and
offers drastically more usable security than any mainstream system currently
on the market. One result of this dedication to usability is that only one
protection provided by the Bitfrost platform requires a user response, and even
then, it's a simple 'yes or no' question understandable even by young
children. The rest of the security is provided behind the scenes. But
pushing the envelope on both security and usability is a tall order, and as we
state in the concluding chapter of this document, we have neither tried to
create, nor do we believe we have created, a "perfectly secure" system. Notions
of perfect security are foolish, and we distance ourselves up front from any
such claims.



0.2. Security and OLPC
----------------------

In terms of security, the OLPC XO laptops are a unique environment. They
are slated to introduce computers to young children, many in environments that
have had no prior exposure to computing or the Internet.

What's more, OLPC is not targeting small-scale local deployments where it could
easily intervene in the case of security problems with the machines or their
usage; instead, once the machines are released in the wild, drastic changes in
the security model should be considered difficult or impossible.

Plenty of experience exists in locking down user machines, often in corporate
or academic settings. But OLPC has a final constraint that invalidates most of
the current common wisdom: OLPC is, by design, striving to be an eminently
malleable platform, allowing the children to modify, customize, or "hack"
their own machines any way they see fit.

As a result, no single security policy on the computer will satisfy our
requirements. Instead, we will ship and enable by default a stringent policy
that's appropriate even for the youngest user, and which delivers the strongest
available protections. However, we will provide a simple graphical interface
for interested users to disable any of these protections, allowing the user to
tailor the security level to match her interest in hacking her machine.

This approach allows us to be highly secure by default, and to protect even the
user who has no conception of digital security. At the same time, it avoids
getting in the way of any user who is becoming more sophisticated and
interested in increasing her abilities on the machine.

Finally, because we subscribe to constructionist learning theories, we want to
encourage all children to eventually progress to this level of a more
sophisticated user who takes greater liberties with her machine. However, as
long as there exists potential for disaster (i.e. rendering a machine fully
inoperable, or incurring total data loss), this potential serves as a strong
deterrent to this progression. Because of this, in addition to focusing on
security by default, we are explicitly focusing on providing mechanisms for
trivial and unintimidating disaster recovery, such as operating system recovery
from multiple sources and data backup to a central server.



0.3. About this document
------------------------

This document follows security throughout the life-cycle of the laptop itself,
starting from the moment a laptop is produced in the factory, to the moment it
first reaches a child, throughout the child's use of the laptop, and finally
stopping at the moment a child wishes to dispose of the laptop. All of this is
preceded by a short section on our goals and principles, which serves to
provide background for some of the decisions we made, and which might be
non-obvious if one thinks of security in the context of normal laptop and
desktop machines.

This document is complete with regard to the OLPC security model, but is
generally non-technical. A separate document is being prepared that complements
this one with fully technical descriptions and commentary.



0.4. Principles and goals
-------------------------

=== Principles ===

* Open design
  The laptop's security must not depend upon a secret design implemented in
  hardware or software.

* No lockdown
  Though in their default settings, the laptop's security systems may impose
  various prohibitions on the user's actions, there must exist a way for these
  security systems to be disabled. When that is the case, the machine will
  grant the user complete control.

* No reading required
  Security cannot depend upon the user's ability to read a message from the
  computer and act in an informed and sensible manner. While disabling a
  particular security mechanism _may_ require reading, a machine must be secure
  out of the factory if given to a user who cannot yet read.

* Unobtrusive security
  Whenever possible, the security on the machines must be behind the scenes,
  making its presence known only through subtle visual or audio cues, and never
  getting in the user's way. Whenever in conflict with slight user convenience,
  strong unobtrusive security is to take precedence, though utmost care must be
  taken to ensure such allowances do not seriously or conspicuously reduce the
  usability of the machines.

  As an example, if a program is found attempting to violate a security
  setting, the user will not be prompted to permit the action; the action will
  simply be denied. If the user wishes to grant permission for such an action,
  she can do so through the graphical security center interface.


=== Goals ===

* No user passwords
  With users as young as 5 years old, the security of the laptop cannot depend
  on the user's ability to remember a password. Users cannot be expected to
  choose passwords when they first receive computers.

* No unencrypted authentication
  Authentication of laptops or users will not depend upon identifiers that are
  sent unencrypted over the network. This means no cleartext passwords of any
  kind will be used in any OLPC protocol, and Ethernet MAC addresses will never
  be used for authentication.

* Out-of-the-box security
  The laptop should be both usable and secure out-of-the-box, without the need
  to download security updates, when at all possible.

* Limited institutional PKI
  The laptop will be supplied with public keys from OLPC and the country or
  regional authority (e.g. the ministry or department of education), but these
  keys will not be used to validate the identity of laptop users. The sole
  purpose of these keys will be to verify the integrity of bundled software and
  content. Users will be identified through an organically-grown PKI without a
  certified chain of trust -- in other words, our approach to PKI is KCM, or
  key continuity management.

* No permanent data loss
  Information on the laptop will be replicated to some centralized storage
  place so that the student can recover it in the event that the laptop is
  lost, stolen or destroyed.




1. Factory production
=====================

As part of factory production, certain manufacturing data is written to the
built-in SPI flash chip. The chip is rewritable, but barring hardware tampering,
only by a trusted process that will not damage or modify the manufacturing
information.

Manufacturing data includes two unique identifiers: SN, the serial number,
and U#, the randomly-generated UUID. Serial numbers are not assigned in
order; instead, they are chosen randomly from a pool of integers. The
manufacturing process maintains a mapping from each randomly assigned serial
number to the real, incremental serial number, which was set to 1 for the first
laptop produced. This mapping is confidential but not secret, and is kept by
OLPC.

The random mapping's sole purpose is to discourage attempts at using the serial
numbers of laptops delivered to different countries to analyze countries'
purchase volumes.

A laptop's UUID, U#, is a random 32-byte printable ASCII identifier.
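
To make the scheme concrete, the following is a minimal Python sketch of how
the two identifiers could be generated and tracked. All names are illustrative;
this is not OLPC's actual manufacturing tooling.

    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits  # printable ASCII subset

    def generate_u_number():
        # U#: a random 32-byte printable ASCII identifier.
        return ''.join(secrets.choice(ALPHABET) for _ in range(32))

    def assign_serial(pool, mapping, real_sn):
        # Draw a random SN from the remaining pool and record the
        # confidential mapping to the real, incremental serial number.
        sn = secrets.choice(sorted(pool))
        pool.remove(sn)
        mapping[sn] = real_sn  # kept by OLPC; confidential but not secret
        return sn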

In one of the factory diagnostics stages after each laptop's production, the
diagnostics tool will send the complete manufacturing information, including U#,
SN, and factory information, to an OLPC server. This information will be queued
at the factory in case of connectivity issues, and so won't be lost under any
foreseeable circumstances.

At the end of the production line, the laptop is in the 'deactivated' state.
This means it must undergo a cryptographic activation process when powered on,
before it can be used by an end user.




2. Delivery chain security
==========================

OLPC arranges only the shipment of laptops from their origin factory to each
purchasing country. Shipping and delivery within each country is organized fully
by the country.

Given OLPC production volumes, the delivery chain poses an attractive attack
vector for an enterprising thief. The activation requirement makes delivery
theft highly unappealing, since each stolen laptop would require hardware
intervention to disable the protection before resale. We give an overview of
the activation process below.




3. Arrival at school site and activation
========================================

Before a batch of laptops is shipped to each school, the country uses
OLPC-provided software to generate a batch of activation codes. This "activation
list" maps each (SN, UUID) tuple to a unique activation code for the referenced
laptop. Activation lists are generated on demand by the country for each laptop
batch, as the laptops are partitioned into batches destined for specific
schools. In other words, there is no master activation list.

The activation list for a laptop batch is loaded onto a USB drive, and delivered
to a project handler in the target school out of band from the actual laptop
shipment. The handler will commonly be a teacher or other school administrator.
The activation list sent to one school cannot be used to activate any other
laptop batch.

When the activation list USB drive is received, it is plugged into the
OLPC-provided school server, or another server running the requisite software
that is connected to a wireless access point. Whichever server takes on this
role will be called the 'activation server'. An activated XO laptop can be used
for this purpose, if necessary.

After receiving the matching laptop batch, the school's project handler will be
tasked with giving a laptop to each child at the school. When a child receives
a laptop, it is still disabled. The child must power on the laptop within
wireless range of the school's activation server. When this happens, the laptop
will securely communicate its (SN, UUID) tuple to the server, which will return
the activation code for the laptop in question, provided the tuple is found in
the activation list, or an error if it isn't.
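
The server-side lookup is essentially a table consultation, sketched below in
Python. The data layout and names are illustrative; the secure transport and
cryptographic details of the exchange are elided.

    # activation_list: dict mapping (SN, UUID) tuples to activation codes,
    # as loaded from this school's USB drive. Names are illustrative only.
    def lookup_activation_code(activation_list, sn, uuid):
        code = activation_list.get((sn, uuid))
        if code is None:
            # Tuple not in this batch: the laptop receives an error.
            raise KeyError('laptop not in this activation list')
        return code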

Given an invalid activation code or an error, the laptop will sleep for one
hour before retrying activation. If the activation code is valid, the laptop
becomes 'activated', and proceeds to boot to the first-boot screen. A textual
activation code can be entered into the machine manually, if the machine is not
activating automatically for any reason.




4. First boot
=============

On first boot, a program is run that asks the child for their name, takes
their picture, and in the background generates an ECC key pair. The key pair is
initially not protected by a passphrase, and is then used to sign the child's
name and picture. This information and the signature are the child's 'digital
identity'.
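
A sketch of this identity step using the Python 'cryptography' package follows.
The specification does not name a curve or hash; P-256 and SHA-256 below are
illustrative assumptions.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    def make_digital_identity(name, picture):
        # Generate the child's ECC key pair; no passphrase protects it yet.
        key = ec.generate_private_key(ec.SECP256R1())  # curve is an assumption
        # Sign the name and picture; data plus signature form the
        # child's 'digital identity'.
        signature = key.sign(name + picture, ec.ECDSA(hashes.SHA256()))
        return key, signature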

The laptop transmits the (SN, UUID, digital identity) tuple to the activation
server. The mapping between a laptop and the user's identity is maintained by
the country or regional authority for anti-theft purposes, but never reaches
OLPC.

After this, the laptop boots normally, with all security settings enabled.




5. Software installation
========================

There is a very important distinction between two broad classes of programs
that execute on a running system, and this distinction is not often mentioned
in security literature. There are programs that are purposely malicious,
which is to say that they were written with ill intent from the start, as
with viruses and worms, and there are programs which are circumstantially
malicious but otherwise benign, such as legitimate programs that have been
exploited by an attacker while they're running, and are now being instrumented
to execute code on behalf of the attacker via code injection or some other
method.

This difference is crucial and cannot be overstated, because it's a
reasonable assumption that most software running on a normal machine starts
benign. In fact, we observe that it is through exploitation of benign software
that most malicious software is first _introduced_ to many machines, so
protecting benign software becomes a doubly worthy goal.

The protection of benign software is a keystone of our security model. We
approach it with the following idea in mind: benign software will not lie about
its purpose during installation.

To provide an example, consider the Solitaire game shipped with most versions
of Microsoft Windows. This program needs:

    * no network access whatsoever
    * no ability to read the user's documents
    * no ability to utilize the built-in camera or microphone
    * no ability to look at, or modify, other programs

Yet if somehow compromised by an attacker, Solitaire is free to do whatever the
attacker wishes, including:

    * read, corrupt or delete the user's documents, spreadsheets, music,
      photos and any other files
    * eavesdrop on the user via the camera or microphone
    * replace the user's wallpaper
    * access the user's website passwords
    * infect other programs on the hard drive with a virus
    * download files to the user's machine
    * receive or send e-mail on behalf of the user
    * play loud or embarrassing sounds on the speakers

The critical observation here is not that Solitaire should never have the
ability to do any of the above (which it clearly shouldn't), but that its
creators _know_ it should never do any of the above. It follows that if the
system implemented a facility for Solitaire to indicate this at installation
time, Solitaire could irreversibly shed various privileges the moment it's
installed, which severely limits or simply destroys its usefulness to an
attacker were it taken over.

The OLPC XO laptops provide just such a facility. Program installation does
not occur through the simple execution of the installer, which is yet another
program, but through a system installation service which knows how to install
XO program bundles. During installation, the installer service will query
the bundle for the program's desired security permissions, and will notify
the system Security Service accordingly. After installation, the
per-program permission list is modifiable only by the user, through a
graphical interface.
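
As an illustration, an installer service along these lines might read a
declared-permissions manifest out of the bundle and hand it to the Security
Service. The manifest name and layout here are invented for the sketch; real
XO bundle formats are not specified in this document.

    import json
    from zipfile import ZipFile

    def install_bundle(bundle_path, security_service):
        # Read the permissions the bundle declares; the bundle's own
        # code is never executed during installation.
        with ZipFile(bundle_path) as bundle:
            manifest = json.loads(bundle.read('permissions.json'))
        requested = set(manifest.get('permissions', []))
        # Record the per-program permission list, which programs
        # themselves can never modify afterwards.
        security_service.register(manifest['name'], requested)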

A benign program such as Solitaire would simply not request any special
permissions during installation, and if taken over, would not be able to
perform anything particularly damaging, such as the actions from the above
list.

It must be noted here that this system _only_ protects benign software. The
problem still remains of intentionally malicious software, which might request
all available permissions during installation in order to abuse them
arbitrarily when run. We address this by making certain initially-requestable
permissions mutually exclusive, in effect making it difficult for malicious
software to request a set of permissions that easily allow malicious action.
Details on this mechanism are provided later in this document.

As a final note, programs cryptographically signed by OLPC or the
individual countries may bypass the permission request limits, and request
any permissions they wish at installation time.




6. Software execution: problem statement
========================================

The threat model that we are trying to address while the machine is running
normally is a difficult one: we wish to have the ability to execute generally
untrusted code, while severely limiting its ability to inflict harm on the
system.

Many computer devices that are seen or marketed more as embedded or managed
computers than personal laptops or desktops (one example is AMD's [[PIC
communicator -> http://www.amdboard.com/pic.html]]) purport to dodge the
issue of untrusted code entirely, while staving off viruses, malware and
spyware by only permitting execution of code cryptographically signed by the
vendor. In practice, this means the user is limited to executing a very
restricted set of vendor-provided programs, and cannot develop her own software
or use software from third-party developers. While this approach to security
certainly limits available attack vectors, it should be noted that it is
pointedly not a silver bullet. A computer that is not freely programmable
represents a tremendous decrease in utility from what most consumers have come
to expect from their computers -- but even if we ignore this and focus merely
on the technical qualifications of such a security system, we must stress that
almost always, cryptographic signatures for binaries are checked at load time,
not continually during execution. Thus exploits for vendor-provided binaries
are still able to execute and harm the system. Similarly, this system fails to
provide any protection against macro attacks.

As we mention in the introduction, this severely restricted execution model is
absolutely not an option for the XO laptops. What's more, we want to explicitly
encourage our users, the children, to engage in a scenario certain to give
nightmares to any security expert: easy code sharing between computers.

As part of our educational mission, we're making it very easy for children to
see the code of the programs they're running -- we even provide a View
Source key on the keyboard for this purpose -- and are making it similarly easy
for children to write their own code in Python, our programming language of
choice. Given our further emphasis on collaboration as a feature integrated
directly into the operating system, the scenario where a child develops some
software and wishes to share it with her friends becomes a natural one, and one
that needs to be well-supported.

Unfortunately, software received through a friend or acquaintance is completely
untrusted code, because there's no trust mapping between people and software:
trusting a friend isn't, and cannot be, the same as trusting code coming from
that friend. The friend's machine might be taken over, and may be attempting to
send malicious code to all her friends, or the friend might be trying to execute
a prank, or might have written -- either out of ignorance or malice --
software that is sometimes malicious.

It is against this background that we've constructed security protections for
software on the laptop. A one-sentence summary of the intent of our complete
software security model is that it "tries to prevent software from doing bad
things". The next chapter explains the five categories of 'bad things' that
malicious software might do, and the chapter after that our protections
themselves. Chapter 9 explains how each protection addresses the threat model.




7. Threat model: bad things that software can do
================================================

There are five broad categories of "bad things" that running software could do,
for the purposes of our discussion. In no particular order, software can attempt
to damage the machine, compromise the user's privacy, damage the user's
information, do "bad things" to people other than the machine's user, and
lastly, impersonate the user.



7.1. Damaging the machine
-------------------------

Software wishing to render a laptop inoperable has at least five attack
vectors. It may try to ruin the machine's BIOS, preventing it from booting. It
may attempt to run down the NAND chip used for primary storage, which -- being
a flash chip -- has a limited number of write/erase cycles before ceasing to
function properly and requiring replacement. Successful attacks on the BIOS or
NAND cause hard damage to the machine, meaning such laptops require trained
hardware intervention, including part replacement, to restore to operation. The
third vector, deleting or damaging the operating system, is an annoyance that
would require the machine to be re-imaged and reactivated to run.

Two other means of damaging the machine cause soft damage: they significantly
reduce its utility. These attacks are performance degradation and battery
drainage (with the side note that variants of the former can certainly cause
the latter).

When we say performance degradation, we are referring to the over-utilization of
any system resource such as RAM, the CPU or the networking chip, in a way that
makes the system too slow or unresponsive to use for other purposes. Battery
drainage might be a side effect of such malicious performance degradation
(e.g. because of bypassing normal power saving measures and over-utilization of
power-hungry hardware components), or it might be accomplished through some
other means. Once we obtain complete power measurements for our hardware
system, we will know whether side channels exist for consuming large
amounts of battery power without general performance degradation; this section
will be updated to reflect that information.



7.2. Compromising privacy
-------------------------

We see two primary means of software compromising user privacy: the
unauthorized sending of user-owned information such as documents and images
over the network, and eavesdropping on the user via the laptop's built-in
camera and microphone.



7.3. Damaging the user's data
-----------------------------

A malicious program can attempt to delete or corrupt the user's documents,
create large numbers of fake or garbage-filled documents to make it difficult
for the user to find her legitimate ones, or attack other system services that
deal with data, such as the search service. Indeed, attacking the global
indexing service might well become a new venue for spam that would then show
up every time the user searched for anything on her system. Other attack
vectors undoubtedly exist.



7.4. Doing bad things to other people
-------------------------------------

Software might be malicious in ways that do not directly or strongly affect the
machine's owner or operator. Examples include performing denial-of-service
attacks against the current wireless or wired network (a feat particularly easy
on IPv6 networks, which our laptops will operate on by default), becoming a
spam relay, or joining a floodnet or other botnet.



7.5. Impersonating the user
---------------------------

Malicious software might attempt to abuse the digital identity primitives on
the system, such as digital signing, to send messages appearing to come from
the user, or to abuse previously authenticated sessions that the user might
have created to privileged resources, such as the school server.




8. Protections
==============

Here, we explain the set of protections that make up the bulk of the Bitfrost
security platform, our name for the sum total of the laptop's security systems.
Each protection listed below is given a concise uppercase textual label
beginning with the letter P. This label is simply a convenience for easy
reference, and stands for both the policy and the mechanism of a given
protection system.

Almost all of the protections we discuss can be disabled by the user through a
graphical interface. While the laptop's protections are active, this interface
cannot be manipulated by the programs on the system through any means, be
it synthetic keyboard and mouse events or direct configuration file
modification.



8.1. P_BIOS_CORE: core BIOS protection
--------------------------------------

The BIOS on an XO laptop lives in a 1MB SPI flash chip, mentioned in Chapter 1.
This chip's purpose is to hold manufacturing information about the machine,
including its (SN, UUID) tuple, and the BIOS and firmware. Reflashing the
stored BIOS is strictly controlled, in such a way that only a BIOS image
cryptographically signed by OLPC can be flashed to the chip. The firmware will
not perform a BIOS reflash if the battery level is detected as low, to avoid
the machine powering off while the operation is in progress.

A child may request a so-called developer key from OLPC. This key, bound to the
child's laptop's (SN, UUID) tuple, allows the child to flash any BIOS she
wishes, to accommodate the use case of those children who progress to become
very advanced developers and wish to modify their own firmware.



8.2. P_BIOS_COPY: secondary BIOS protection
-------------------------------------------

The inclusion of this protection is uncertain, and depends on the final size of
the BIOS and firmware after all the desired functionality is included. The SPI
flash offers 1MB of storage space; if the BIOS and firmware can be made to fit
in less than 512KB, a second copy of the bundle will be stored in the SPI. This
secondary copy would be immutable (it cannot be reflashed) and used to boot the
machine in case the primary BIOS is unbootable. Various factors might
lead to such a state, primarily hard power loss during flashing, such as
through the removal of the battery from the machine, or simply a malfunctioning
SPI chip which does not reflash correctly. This section will be updated once it
becomes clear whether this protection can be included.



8.3. P_SF_CORE: core system file protection
-------------------------------------------

The core system file protection disallows modification of the stored system
image on a laptop's NAND flash, which OLPC laptops use as primary storage.
While engaged, this protection keeps any process on the machine from altering
in any way the system files shipped as part of the OLPC OS build.

This protection may not be disabled without a developer key, explained in
Section 8.1.



8.4. P_SF_RUN: running system file protection
---------------------------------------------

Whereas P_SF_CORE protects the *stored* system files, P_SF_RUN protects the
*running* system files from modification. As long as P_SF_RUN is engaged, at
every boot, the running system is loaded directly from the stored system files,
which are then marked read-only.

When P_SF_RUN is disengaged, the system file loading process at boot changes.
Instead of loading the stored files directly, a COW (copy on write) image is
constructed from them, and system files from _that_ image are initialized as the
running system. The COW image uses virtually no additional storage space on the
NAND flash until the user makes modifications to her running system files, which
causes the affected files to be copied before being changed. These modifications
persist between boots, but only apply to the COW copies: the underlying system
files remain untouched.

If P_SF_RUN is re-engaged after being disabled, the boot-time loading of system
files changes again; the system files are loaded into memory directly with no
intermediate COW image, and marked read-only.

P_SF_CORE and P_SF_RUN do not inter-depend. If P_SF_CORE is disengaged and the
stored system files are modified, but P_SF_RUN is engaged, then after a reboot
no modification of the running system will be permitted, despite the fact that
the underlying system files have changed from their original version in the
OLPC OS build.



8.5. P_NET: network policy protection
-------------------------------------

Each program's network utilization can be constrained in the following
ways:

    * Boolean network on/off restriction
    * token-bucketed bandwidth throttling with burst allowance
    * connection rate limiting
    * packet destination restrictions by host name, IP and port(s)
    * time-of-day restrictions on network use
    * data transfer limit by hour or day
    * server restriction (can bind and listen on a socket), Boolean and
      per-port

Reasonable default rate and transfer limits will be imposed on all non-signed
programs. If necessary, different policies can apply to mesh and access point
traffic. Additional restrictions might be added to this list as we complete
our evaluation of network policy requirements.
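
To make the token-bucketed throttling concrete, here is a minimal Python
sketch of the accounting: tokens accrue at a steady rate up to a burst
ceiling, and each packet (or byte) spends tokens. In practice the enforcement
would live in the kernel's networking path; only the bookkeeping is shown.

    import time

    class TokenBucket:
        def __init__(self, rate, burst):
            self.rate = rate             # tokens added per second
            self.burst = burst           # bucket capacity (burst allowance)
            self.tokens = burst
            self.last = time.monotonic()

        def allow(self, cost=1.0):
            # Refill in proportion to elapsed time, capped at burst size.
            now = time.monotonic()
            self.tokens = min(self.burst,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= cost:
                self.tokens -= cost
                return True
            return False                 # caller delays or drops the traffic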



8.6. P_NAND_RL: NAND write/erase protection
-------------------------------------------

A token-bucketed throttle with burst allowance will be in effect for the JFFS2
filesystem used on the NAND flash, which will simply start delaying
write/erase operations caused by a particular program after its bucket is
drained. We are considering having such delays follow an exponential backoff,
though no decision has yet been made, pending some field testing.

A kernel interface will expose the per-program bucket fill levels to
userspace, allowing the implementation of further userspace policies, such as
shutting down programs whose buckets remain drained for too long. These
policies will be maintained and enforced by the system Security Service, a
privileged userspace program.
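
A sketch of one such userspace policy follows, under the assumption of a
hypothetical kernel interface exposing one fill-level file per program; the
path, threshold and polling model are all invented for illustration.

    import os
    import time

    BUCKETS_DIR = '/sys/kernel/nand_buckets'   # hypothetical kernel interface
    MAX_DRAINED_SECS = 300                     # policy knob: five minutes

    def enforce(drained_since, shutdown_program):
        # Called periodically by the Security Service.
        now = time.monotonic()
        for program in os.listdir(BUCKETS_DIR):
            with open(os.path.join(BUCKETS_DIR, program)) as f:
                fill_level = float(f.read())
            if fill_level > 0:
                drained_since.pop(program, None)     # bucket recovered
            elif now - drained_since.setdefault(program, now) > MAX_DRAINED_SECS:
                shutdown_program(program)            # drained for too long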



8.7. P_NAND_QUOTA: NAND quota
-----------------------------

To prevent disk exhaustion attacks, programs are given a limited scratch
space in which they can store their configuration and temporary files, such as
various caches. Currently, that limit is 5MB. Additionally, limits will be
imposed on inodes and dirents within that scratch space, with values to be
determined.

This does not include space for user documents created or manipulated by the
program, which are stored through the file store. The file store is
explained in a later section.



8.8. P_MIC_CAM: microphone and camera protection
------------------------------------------------

At the first level, our built-in camera and microphone are protected by
hardware: an LED is present next to each, and is lit (in hardware, without
software control) when the respective component is engaged. This provides a
very simple and obvious indication of the two being used. The LEDs turning on
unexpectedly will immediately tip off the user to potential eavesdropping.

Secondly, the use of the camera and microphone requires a special permission,
requested at install-time as described in Chapter 5, for each program
wishing to do so. This permission does not, however, allow a program to
instantly turn on the camera and microphone. Instead, it merely lets the
program _ask_ the user to allow the camera or microphone (or both) to be
turned on.

This means that a benign program which is taken over but hasn't declared
itself as needing the camera or microphone cannot be used to turn on either
component, nor even to ask the user to do so!

Programs which have declared themselves as requiring those privileges (e.g.
a VOIP or videoconferencing app) can instruct the system to ask the user for
permission to enable the camera and microphone components, and if the request
is granted, the program is given a timed capability to manipulate the
components, e.g. for 30 minutes. After that, the user will be asked for
permission again.

As mentioned in Chapter 5, programs cryptographically signed by a
trusted authority will be exempt from having to ask permission to manipulate
the components, but because of the LEDs which indicate their status, the
potential for abuse is rather low.



8.9. P_CPU_RL: CPU rate limiting
--------------------------------

Foreground programs may use all of the machine's CPU power. Background
programs, however, may use no more than a fixed amount -- currently we're
looking to use 10% -- unless given a special permission by the user.

The Sugar UI environment on the XO laptops does not support overlapping
windows: only maximized application windows are supported. When we talk about
foreground and background execution, we are referring to programs that are, or
are not, currently displaying windows on the screen.
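
One plausible way to enforce such a cap on Linux is cgroup CPU bandwidth
control. The sketch below is a modern illustration of the idea, not the
mechanism the XO actually ships with, and the cgroup path is invented.

    CGROUP_BASE = '/sys/fs/cgroup/cpu/background'   # hypothetical layout

    def cap_background_cpu(program, percent=10):
        # Allow `percent` of one CPU per 100ms scheduling period.
        period_us = 100000
        quota_us = period_us * percent // 100
        base = f'{CGROUP_BASE}/{program}'
        with open(f'{base}/cpu.cfs_period_us', 'w') as f:
            f.write(str(period_us))
        with open(f'{base}/cpu.cfs_quota_us', 'w') as f:
            f.write(str(quota_us))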



8.10. P_RTC: real time clock protection
---------------------------------------

A time offset from the RTC is maintained for each running program, and the
program is allowed to change its offset arbitrarily. This fulfills the need
of certain programs to change the system time they use (we already have a
music program that must synchronize to within 10ms with any machines with
which it co-plays a tune) without impacting other programs on the system.
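
The mechanism reduces to a few lines, sketched below. For simplicity the
sketch keeps the offset in-process, whereas the real protection would maintain
one offset per program in the system.

    import time

    class ProgramClock:
        # Per-program view of the RTC: adjusting it never touches the
        # hardware clock or any other program's view of time.
        def __init__(self):
            self.offset = 0.0

        def set_time(self, desired_epoch):
            self.offset = desired_epoch - time.time()

        def now(self):
            return time.time() + self.offset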



8.11. P_DSP_BG: background sound permission
-------------------------------------------

This is a permission, requestable at install-time, which lets the program
play audio while it isn't in the foreground. Its purpose is to make benign
programs immune to being used to play annoying or embarrassing loud sounds
if taken over.



8.12. P_X: X window system protection
-------------------------------------

When manually assigned to a program by the user through a graphical
security interface, this permission lets a program send synthetic mouse and
keyboard X events to another program. Its purpose is to enable the use of
accessibility software such as an on-screen keyboard. The permission is NOT
requestable at install-time, and thus must be manually assigned by the user
through a graphical interface, unless the software wishing to use it is
cryptographically signed by a trusted authority.

Without this permission, programs cannot eavesdrop on or fake one another's
events, which disables key logging software and sophisticated synthetic event
manipulation attacks, where malicious software acts as a remote control for
some other running program.



8.13. P_IDENT: identity service
-------------------------------

The identity service is responsible for generating an ECC key pair at first
boot, keeping the key pair secure, and responding to requests to initiate
signed or encrypted sessions with other networked machines.

With the use of the identity service, all digital peer interactions or
communication (e-mails, instant messages, and so forth) can be
cryptographically signed to maintain integrity even as they're routed through
potentially malicious peers on the mesh, and may also be encrypted in countries
where this does not present a legal problem.



8.14. P_SANDBOX: program jails
------------------------------

A program on the XO starts in a fortified chroot, akin to a BSD jail,
where its visible filesystem root is only its own constrained scratch space. It
normally has no access to system paths such as /proc or /sys, cannot see other
programs on the system or their scratch spaces, and only the libraries it needs
are mapped into its scratch space. It cannot access user documents directly,
but only through the file store service, explained in the next section.

Every program scratch space has three writable directories, called 'tmp',
'conf', and 'data'. The program is free to use these for temporary,
configuration, and data (resource) files, respectively. The rest of the scratch
space is immutable; the program may not modify its binaries or core
resource files. This model ensures that a program may be restored to its
base installation state by emptying the contents of the three writable
directories, and that it can be completely uninstalled by removing its bundle
(scratch space) directory.
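
A minimal sketch of setting up such a jail follows, assuming a privileged
launcher process; the real implementation also maps in required libraries,
hides /proc and /sys, and drops privileges before handing control to the
program.

    import os

    WRITABLE_DIRS = ('tmp', 'conf', 'data')

    def enter_jail(scratch_space):
        # Create the three writable directories inside the scratch space.
        for name in WRITABLE_DIRS:
            os.makedirs(os.path.join(scratch_space, name), exist_ok=True)
        # Confine this process: the scratch space becomes the visible root.
        os.chroot(scratch_space)   # requires root privileges
        os.chdir('/')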



8.15. P_DOCUMENT: file store service
------------------------------------

Unlike with traditional machines, user documents on the XO laptop are not
stored directly on the filesystem. Instead, they are read and stored through
the file store service, which provides an object-oriented interface to user
documents. Similar in very broad terms to the Microsoft WinFS design, the file
store allows rich metadata association while maintaining traditional UNIX
read()/write() semantics for actual file content manipulation.

Programs on the XO may not use the open() call to arbitrarily open user
documents in the system, nor can they introspect the list of available
documents, e.g. through listing directory contents. Instead, when a program
wishes to open a user document, it asks the system to present the user with a
'file open' dialog. A copy-on-write version of the file that the user selects
is then mapped into the program's scratch space -- in effect, the file just
"appears", along with a message informing the program of the file's path within
the scratch space.
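
From the program's side, the exchange could look like the following
dbus-python sketch. This document only specifies that the file store is
reached over D-BUS; the bus name, object path, interface and method below are
all hypothetical.

    import dbus  # dbus-python

    def request_document():
        bus = dbus.SessionBus()
        store = bus.get_object('org.laptop.FileStore',    # hypothetical name
                               '/org/laptop/FileStore')
        iface = dbus.Interface(store, 'org.laptop.FileStore')
        # Blocks while the system shows the user a 'file open' dialog; on
        # assent, a COW copy appears in our scratch space at this path.
        return str(iface.OpenDocument())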

Unix supports the passing of file descriptors (fds) through Unix domain
sockets, so an alternative implementation of P_DOCUMENT would merely pass the
fd of the file in question to the calling program. We have elected not to
pursue this approach because communication with the file store service does not
take place directly over Unix domain sockets, but over the D-BUS IPC mechanism,
and because dealing with raw fds can be a hassle in higher-level languages.

Benign programs are not adversely impacted by the need to use the file store
for document access, because they generally do not care about rendering their
own file open dialogs (with the rare exception of programs which create
custom dialogs to e.g. offer built-in file previews; for the time being, we
are not going to support this use case).

Malicious programs, however, lose a tremendous amount of ability to violate
the user's privacy or damage her data, because all document access requires
explicit assent by the user.



8.16. P_DOCUMENT_RO
-------------------

Certain kinds of software, such as photo viewing programs, need access to
all documents of a certain kind (e.g. images) to fulfill their desired
function. This is in direct opposition to the P_DOCUMENT protection, which
requires user consent for each document being opened -- in this case, each
photo.

To resolve the quandary, we must ask ourselves: "from what are we trying to
protect the user?" The answer, here, is a malicious program which requests
permission to read all images, or all text files, or all e-mails, and then
sends those documents over the network to an attacker or posts them publicly,
seriously breaching the user's privacy.

We solve this by allowing programs to request read-only permissions for one
type of document (e.g. image, audio, text, e-mail) at installation time, but
making that permission (P_DOCUMENT_RO) mutually exclusive with asking for any
network access at all. A photo viewing program, in other words, normally
has no business connecting to the Internet.

As with other permissions, the user may assign the network permission to a
program which requested P_DOCUMENT_RO at install time, bypassing the mutual
exclusion.
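
An illustrative install-time check for this rule follows, using the protection
labels as stand-ins for whatever the actual permission names are; signed
software bypasses the limits, as noted in Chapter 5.

    # Pairs of permissions that unsigned software may not request together.
    MUTUALLY_EXCLUSIVE = [
        {'P_DOCUMENT_RO', 'NETWORK'},   # illustrative names
    ]

    def permissions_allowed(requested, signed_by_trusted_authority=False):
        if signed_by_trusted_authority:
            return True                 # signed bundles bypass the limits
        return not any(pair <= set(requested) for pair in MUTUALLY_EXCLUSIVE)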



8.17. P_DOCUMENT_RL: file store rate limiting
---------------------------------------------

The file store does not permit programs to store new files, or new versions
of old files, with a frequency higher than a certain preset, e.g. once every 30
seconds.



8.18. P_DOCUMENT_BACKUP: file store backup service
--------------------------------------------------

When in range of servers that advertise themselves as offering a backup
service, the laptop will automatically perform incremental backups of user
documents, which it can later retrieve. Because of the desire to avoid having to
ask children to generate a new digital identity if their laptop is ever lost,
stolen or broken, by default the child's ECC keypair is also backed up to the
server. Given that a child's private key normally has no password protection,
stealing the primary backup server (normally the school server) offers the
thief the ability to impersonate any child in the system.

For now, we deem this an acceptable risk. We should also mention that the
private key will only be backed up to the primary backup server -- usually in
the school -- and not to just any server that advertises itself as providing
backup service. Furthermore, for all non-primary backup servers, only encrypted
versions of the incremental backups will be stored.



8.19. P_THEFT: anti-theft protection
------------------------------------

The OLPC project has received very strong requests from certain countries
considering joining the program to provide a powerful anti-theft service that
would act as a deterrent against most thieves.

We provide such a service for interested countries to enable on the laptops. It
works by running, as a privileged process that cannot be disabled or
terminated even by the root user, an anti-theft daemon which detects Internet
access, and performs a call-home request -- no more than once a day -- to the
country's anti-theft servers. In so doing, it is able to securely use NTP to
set the machine RTC to the current time, and then obtain a cryptographic lease
to keep running for some amount of time, e.g. 21 days. The lease duration is
controlled by each country.

A stolen laptop will have its (SN, UUID) tuple reported to the country's OLPC
oversight body in charge of the anti-theft service. The laptop will be marked
stolen in the country's master database.

A thief might do several things with a laptop: use it to connect to the
Internet, remove it from any networks and attempt to use it as a standalone
machine, or take it apart for parts.

In the first case, the anti-theft daemon would learn that the laptop is stolen
as soon as it's connected to the Internet, and would perform a hard shutdown
and lock the machine such that it requires activation, described previously, to
function.

We do not expect the machines will be an appealing target for part resale. Save
for the custom display, all valuable parts of the XO laptops are soldered onto
the motherboard.

To address the case where a stolen machine is used as a personal computer but
not connected to the Internet, the anti-theft daemon will shut down and lock
the machine if its cryptographic lease ever expires. In other words, if the
country operates with 21-day leases, a normal, non-stolen laptop will get its
lease extended by 21 days each day it connects to the Internet. But if the
machine does not connect to the Internet for 21 days, it will shut down and
lock.
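
The daemon's core logic reduces to a daily renewal attempt plus a local expiry
check, sketched below. Lease signature verification, secure NTP and the actual
lock mechanism are elided, and all names are illustrative.

    import time

    LEASE_SECONDS = 21 * 24 * 3600        # country-configured duration

    def daily_check(lease_expiry, call_home, shutdown_and_lock):
        # Try to renew: the country's server returns a fresh expiry for a
        # laptop in good standing, or a 'stolen' verdict for one reported
        # stolen in the master database.
        response = call_home()            # performed at most once a day
        if response == 'stolen':
            shutdown_and_lock()
            return lease_expiry
        if response is not None:
            lease_expiry = response       # e.g. now + LEASE_SECONDS
        # Online or not: an expired lease always locks the machine.
        if time.time() > lease_expiry:
            shutdown_and_lock()
        return lease_expiry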

Since this might present a problem in some countries due to intermittent
Internet access, the leases can either be made to last rather long (they're
still an effective theft deterrent even with a 3-month duration), or they can
be manually extended by connecting a USB drive to the activation server. For
instance, a country may issue 3-week leases, but if a school has a satellite
dish failure, the country's OLPC oversight body may mail a USB drive to the
school handler, which, when connected to the school server, transparently
extends the lease of each referenced laptop for some period of time.

The anti-theft system cannot be bypassed as long as P_SF_CORE is enabled (and
disabling it requires a developer key). This, in effect, means that a child is
free to make any modification to her machine's userspace (by disabling P_SF_RUN
without a developer key), but cannot change the running kernel without
requesting the key. The key-issuing process incorporates a 14-day delay to
allow a slow theft report to percolate up through the system, and the key is
only issued if the machine has not been reported stolen by the end of that
period.



8.20. P_SERVER_AUTH: transparent strong authentication to trusted server
------------------------------------------------------------------------

When in wireless range of a trusted server (e.g. one provided by OLPC or the
country), the laptop can securely respond to an authentication challenge with
its (SN, UUID) tuple. In addition to serving as a means for the school to
exercise network access control -- we know about some schools, for instance,
that do not wish to provide Internet access to alumni, but only to current
students -- this authentication can unlock extra services like backup and
access to a decentralized digital identity system such as OpenID.

[[OpenID -> http://en.wikipedia.org/wiki/OpenID]] is particularly appealing
to OLPC, because it can be used to perpetuate passwordless access even on sites
that normally require authentication, as long as they support OpenID. The most
common mode of operation for current OpenID identity providers is to request
password authentication from the user. With an OpenID provider service running
on the school server (or other trusted servers), logins to OpenID-enabled sites
will simply succeed transparently, because the child's machine has been
authenticated in the background by P_SERVER_AUTH.



8.21. (For later implementation) P_PASSWORD: password protection
----------------------------------------------------------------

It is unclear whether this protection will make it into generation 1 of the XO
laptops. When implemented, however, it will allow the user to set a password to
be used for her digital identity, booting the machine, and accessing some of
her files.
1057 
1058 
1059 
1060 
1061 9. Addressing the threat model
1062 ==============================
1063 
1064 We look at the five categories of "bad things" software can do, as listed in
1065 Chapter 7, and explain how the protections listed in Chapter 8 help. The
1066 following sections are given in the same order as the software threat model
1067 entries in Chapter 7.
1068 
1069 
1070 
1071 9.1. Damaging the machine
1072 -------------------------
1073 
1074 P_BIOS_CORE ensures the BIOS can only be updated by BIOS images coming from
1075 trusted sources. A child with a developer key may flash whichever BIOS she
1076 pleases, though if we are able to implement P_BIOS_COPY, the machine will
1077 remain operational even if the child flashes a broken or garbage BIOS.
1078 Programs looking to damage the OS cannot do so because of P_SANDBOX and
1079 P_SF_RUN. Should a user with P_SF_RUN disabled be tricked into damaging her OS
1080 or do so accidentally, P_SF_CORE enables her to restore her OS to its initial
1081 (activated) state at boot time.
1082 
1083 Programs trying to trash the NAND by exhausting write/erase cycles are
1084 controlled through P_NAND_RL, and disk exhaustion attacks in the scratch space
1085 are curbed by P_NAND_QUOTA. Disk exhaustion attacks with user documents are
1086 made much more difficult by P_DOCUMENT_RL.
1087 
1088 CPU-hogging programs are reined in with P_CPU_RL. Network-hogging programs are
1089 controlled by policy via P_NET.
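
One common way to implement such rate limits -- and only an illustration of
the general technique, not the actual Bitfrost code -- is a token bucket,
which caps a program's sustained rate while still permitting short bursts:

    import time

    class TokenBucket:
        def __init__(self, rate_per_s, burst):
            self.rate, self.capacity = rate_per_s, burst
            self.tokens, self.stamp = burst, time.monotonic()

        def allow(self, amount):
            """Debit 'amount' (bytes written, packets sent, etc.); return
            False when the caller should be throttled."""
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.stamp) * self.rate)
            self.stamp = now
            if amount > self.tokens:
                return False
            self.tokens -= amount
            return True

    # e.g. cap an untrusted program's NAND writes at 64 KB/s sustained,
    # with bursts of up to 256 KB (numbers chosen for illustration):
    bucket = TokenBucket(rate_per_s=64 * 1024, burst=256 * 1024)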
1090 
1091 
1092 
1093 9.2. Compromising privacy
1094 -------------------------
1095 
1096 Arbitrary reading and/or sending of the user's documents over the network is
1097 curbed by P_DOCUMENT, while tagging documents with the program that created
1098 them addresses the scenario in which a malicious program attempts to spam
1099 the search service. Search results from a single program can simply be
1100 hidden permanently, or removed from the index completely.
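
A toy index illustrates the provenance tagging described above; the class
and method names are invented for the example:

    class DocumentIndex:
        def __init__(self):
            self.entries = []           # (document text, creating program)
            self.hidden = set()

        def add(self, text, program):
            self.entries.append((text, program))

        def hide_program(self, program):
            """Permanently suppress results from a spamming program."""
            self.hidden.add(program)

        def search(self, term):
            return [text for text, prog in self.entries
                    if term in text and prog not in self.hidden]

    index = DocumentIndex()
    index.add('homework notes', 'Write')
    index.add('BUY NOW!!! BUY NOW!!!', 'spamware')
    index.hide_program('spamware')
    assert index.search('BUY') == []    # the spam no longer surfaces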
1101 
1102 P_DOCUMENT_RO additionally protects the user from wide-scale privacy breaches
1103 by software that purports to be a "viewer" of some broad class of documents.
1104 
1105 P_MIC_CAM makes eavesdropping on the user difficult, and P_X makes it very hard
1106 to steal passwords or other sensitive information, or monitor text entry from
1107 other running programs.
1108 
1109 
1110 
1111 9.3. Damaging the user's data
1112 -----------------------------
1113 
1114 The file store does not permit programs to overwrite objects that aren't
1115 opaque binary blobs, such as e-mail and text. Instead, a new version is stored,
1116 and the file store exposes a list of the full version history. This affords a
1117 large class of documents protection against deletion or corruption at the hands
1118 of a malicious program -- which, of course, had to obtain the user's
1119 permission to look at the file in question in the first place, as explained in
1120 P_DOCUMENT.
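
The copy-on-write behavior is simple to model: a write appends a version
instead of replacing the old contents. The sketch below is illustrative, not
the file store's actual API.

    class VersionedObject:
        def __init__(self, initial):
            self._versions = [initial]

        def write(self, contents):
            # Even a malicious program can only ever append a new version.
            self._versions.append(contents)

        def read(self):
            return self._versions[-1]

        def history(self):
            """The full version list, so older contents stay recoverable."""
            return list(self._versions)

    note = VersionedObject('dear diary ...')
    note.write('')                                # attempted "deletion"
    assert note.history()[0] == 'dear diary ...'  # the original survives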
1121 
1122 For binary blobs -- videos, music, images -- a malicious program with which
1123 the user specifically opens a certain file does have the ability to corrupt or
1124 delete that file. However, we cannot protect the user from herself. We point
1125 out that such deletion is constrained to _only_ those files which the user
1126 explicitly opened. Furthermore, P_DOCUMENT_BACKUP allows a final way out even
1127 in such situations, assuming the machine has come across a backup server (OLPC
1128 school servers advertise themselves as such).
1129 
1130 
1131 
1132 9.4. Doing bad things to other people
1133 -------------------------------------
1134 
1135 XO laptops will be quite unattractive as spam relays or floodnet clients due to
1136 network rate and transfer limits imposed on all non-signed programs by
1137 P_NET. Despite the appeal of the XO deployment scale for spamming or flooding,
1138 we expect that a restriction to generally low-volume network usage for
1139 untrusted software -- coupled with the great difficulty in writing worms or
1140 self-propagating software for XO machines -- will drastically reduce this
1141 concern.
1142 
1143 
1144 
1145 9.5. Impersonating the user
1146 ---------------------------
1147 
1148 The design of the identity service, P_IDENT, does not allow programs ever
1149 to come into direct contact with the user's cryptographic key pair, nor to
1150 inject information into currently-open sessions which are using the identity
1151 service for signing or encryption.
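
In other words, the identity service exposes signing as an operation, never
the key as data. A minimal sketch of that boundary, with an HMAC standing in
for the real public-key operations:

    import hmac, hashlib, os

    class IdentityService:
        def __init__(self):
            self.__key = os.urandom(32)   # held inside the service only

        def sign(self, message):
            return hmac.new(self.__key, message, hashlib.sha256).digest()

    service = IdentityService()
    sig = service.sign(b'chat message')   # programs obtain signatures...
    # ...but the service offers no call that returns the key itself.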
1152 
1153 
1154 
1155 9.6. Miscellaneous
1156 ------------------
1157 
1158 In addition to the protections listed above which each address some part of the
1159 threat model, permissions P_RTC and P_THEFT combine to offer an anti-theft
1160 system that requires non-trivial sophistication (ability to tamper with
1161 on-board hardware) to defeat, and P_DSP_BG provides protection against certain
1162 types of annoying malware, such as the infamous 1989 Yankee Doodle virus.
1163 
1164 
1165 
1166 9.7. Missing from this list
1167 ---------------------------
1168 
1169 At least two problems, commonly associated with laptops and child computer
1170 users respectively, are not addressed by our threat model or protection
1171 systems: hard drive encryption and objectionable content filtering / parental
1172 controls.
1173 
1174 9.7.1. Filesystem encryption
1175 ----------------------------
1176 
1177 While the XO laptops have no hard drive to speak of, the data encryption
1178 question applies just as well to our flash primary storage. The answer consists
1179 of two parts: firstly, filesystem encryption is too slow given our hardware.
1180 The XO laptops can encrypt about 2-4 MB/s with the AES-128 algorithm in CBC
1181 mode, using 100% of the available CPU power. This is about a tenth of the
1182 throughput of the NAND flash chip. Moving to a faster algorithm such as RC4
1183 increases encryption throughput to about 15 MB/s with large blocks at 100% CPU
1184 utilization, which is still too slow for general use and provides questionable
1185 security. Secondly, because of the age of our users, we have
1186 explicitly designed the Bitfrost platform not to rely on the user setting
1187 passwords to control access to her computer. But without passwords, user data
1188 encryption would have to be keyed based on unique identifiers of the laptop
1189 itself, which lends no protection to the user's documents in case the laptop is
1190 stolen.
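
The throughput figures above can be reproduced with a simple probe; the
sketch below assumes the PyCryptodome package and should be read as a
measurement method, since the numbers it prints depend entirely on the
hardware it runs on.

    import os, time
    from Crypto.Cipher import AES     # provided by PyCryptodome

    key, iv = os.urandom(16), os.urandom(16)
    cipher = AES.new(key, AES.MODE_CBC, iv)
    block = os.urandom(1024 * 1024)   # 1 MB; already a multiple of 16 bytes

    start = time.monotonic()
    for _ in range(32):               # encrypt 32 MB in CBC mode
        cipher.encrypt(block)
    elapsed = time.monotonic() - start
    print('AES-128-CBC: %.1f MB/s' % (32 / elapsed))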
1191 
1192 Once the Bitfrost platform supports the P_PASSWORD protection, which might not
1193 be until the second generation of the XO laptops, we will provide support for
1194 the user to individually encrypt files if she has enabled the protection and
1195 set a password for herself.
1196 
1197 9.7.2. Objectionable content filtering
1198 --------------------------------------
1199 
1200 The Bitfrost platform governs system security on the XO laptops. Given that
1201 "objectionable content" lacks any kind of technical definition, and is instead
1202 a purely social construct, filtering such content lies wholly outside the
1203 scope of the security platform and this document.
1204 
1205 
1206 
1207 
1208 10. Laptop disposal and transfer security
1209 =========================================
1210 
1211 The target lifetime of an XO laptop is five years. After this time elapses, the
1212 laptop's owner might wish to dispose of the laptop. Similarly, for logistical
1213 reasons, a laptop may change hands, going from one owner to another.
1214 
1215 A laptop re-initialization program will be provided which securely erases the
1216 user's digital identity and all user documents from a laptop. When running in
1217 "disposal" mode, that program could also be made to permanently disable the
1218 laptop, but it is unclear whether such functionality is actually necessary, so
1219 there are no current plans for providing it.
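
The outline of such a re-initialization tool might look like the following
sketch, in which the paths are hypothetical; note that truly destroying data
on NAND flash also requires cooperation from the flash filesystem, since
wear leveling can leave stale copies of erased blocks behind.

    import os, shutil

    # Hypothetical locations of the user's documents and key pair:
    USER_DATA = ['/home/olpc/documents', '/home/olpc/.identity']

    def reinitialize():
        for path in USER_DATA:
            if os.path.isdir(path):
                shutil.rmtree(path)   # drop documents / the digital identity
            elif os.path.exists(path):
                os.remove(path)
        # A real tool would then restore a pristine, unactivated state.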
1220 
1221 
1222 
1223 
1224 11. Closing words
1225 =================
1226 
1227 In Norse mythology, Bifröst is the bridge which keeps mortals, inhabitants of
1228 the realm of Midgard, from venturing into Asgard, the realm of the gods. In
1229 effect, Bifröst is a powerful security system designed to keep out unwanted
1230 intruders.
1231 
1232 This is not why the OLPC security platform's name is a play on the name of the
1233 mythical bridge, however. What's particularly interesting about Bifröst is a
1234 story that the 13th century Icelandic historian and poet Snorri Sturluson
1235 tells in the first part of his poetics manual, the Prose Edda. Here is the
1236 relevant excerpt from the 1916 translation by Arthur Gilchrist Brodeur:
1237 
1238     Then said Gangleri: "What is the way to heaven from earth?"
1239 
1240     Then Hárr answered, and laughed aloud: "Now, that is not wisely asked; has
1241     it not been told thee, that the gods made a bridge from earth, to heaven,
1242     called Bifröst? Thou must have seen it; it may be that ye call it rainbow.
1243     It is of three colors, and very strong, and made with cunning and with more
1244     magic art than other works of craftsmanship. But strong as it is, yet must
1245     it be broken, when the sons of Múspell shall go forth harrying and ride it,
1246     and swim their horses over great rivers; thus they shall proceed."
1247 
1248     Then said Gangleri: "To my thinking the gods did not build the bridge
1249     honestly, seeing that it could be broken, and they able to make it as they
1250     would."
1251 
1252     Then Hárr replied: "The gods are not deserving of reproof because of this
1253     work of skill: a good bridge is Bifröst, but nothing in this world is of
1254     such nature that it may be relied on when the sons of Múspell go
1255     a-harrying."
1256 
1257 This story is quite remarkable, as it amounts to a 13th century recognition of
1258 the idea that there's no such thing as a perfect security system.
1259 
1260 To borrow Sturluson's terms, we believe we've imbued the OLPC security system
1261 with cunning and more magic art than other similar works of craftsmanship -- but
1262 not for a second do we believe we've designed something that cannot be broken
1263 when talented, determined and resourceful attackers go forth harrying. Indeed,
1264 this was not the goal. The goal was to significantly raise the bar from the
1265 current, deeply unsatisfactory, state of desktop security. We believe Bitfrost
1266 accomplishes this, though only once the laptops are deployed in the field will
1267 we be able to tell with some degree of certainty whether we have succeeded.
1268 
1269 If the subject matter interests you, please join the OLPC security mailing
1270 list, share your thoughts, and join the discussion.
1271 
1272 
1273 
1274 
1275 
1276 END