The FBI’s outrageous assault on the 1st Amendment and Apple

By Andrew Crocker | ( Electronic Frontier Foundation )

Last week, EFF filed a brief in support of Apple’s fight against the FBI, in which we argued that forcing Apple to write—and sign—a custom version of iOS would violate the First Amendment rights of Apple and its programmers. That’s because the right to free speech sharply limits the government’s ability to compel unwilling speakers to speak, and writing and signing computer code are forms of protected speech. So by forcing Apple to write and sign an update to undermine the security of iOS, the court is also compelling Apple to speak in violation of the First Amendment. Along with our brief, we published a “deep dive” into our legal arguments, which you should check out before reading further.

Our argument got some positive attention, but it’s also raised valid questions from folks who aren’t totally convinced. This (long) post attempts to clear up some of those questions.

A caveat: First Amendment doctrine has a lot of facets. Much as it would be nice to present a grand unified theory of free speech, that isn’t the function of a legal brief, or of this FAQ. We’ve made an argument that is firmly grounded in First Amendment case law and that fits the particulars of Apple’s case. Nevertheless, it’s important that our argument be consistent with well-accepted government practices. We think what the FBI wants Apple to do is unprecedented, and an Apple win here wouldn’t risk making every government regulation into a constitutional violation.

With that out of the way, here are some common questions we’ve heard:

Isn’t Apple’s signature more like an instruction to the iPhone and not speech at all?

In order for the court’s All Writs Act order to violate Apple’s First Amendment rights, it has to implicate First Amendment-protected speech. So is the code that Apple would be forced to write and sign really speech? To the extent that most users think about what applications are, they tend to think of them in functional terms: code causes the computer to do something.

But courts that have examined First Amendment protections for computer code have been clear that software has both functional and communicative elements. Just like musical scores, code communicates ideas, and it is interpretable by (some) people. To use the language of First Amendment law, code is “expressive.” And just like other forms of communication, code can be elegant or messy, terse or verbose, and so on.
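
To see the expressive side concretely, here is a small, illustrative comparison (ours, not drawn from the case record): two snippets that do exactly the same thing but say it very differently. Choosing one over the other is a stylistic, communicative decision, not merely a functional one.

    # Two ways to say "keep only the even numbers": functionally identical,
    # stylistically very different.
    def evens_terse(numbers):
        return [n for n in numbers if n % 2 == 0]

    def evens_verbose(numbers):
        """Walk the sequence one element at a time and keep the even ones."""
        result = []
        for number in numbers:
            remainder = number % 2
            if remainder == 0:
                result.append(number)
        return result

    assert evens_terse(range(10)) == evens_verbose(range(10))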

It’s true that when speech has both expressive and non-expressive elements, government regulations aimed at the non-expressive aspects are subject to less strict scrutiny by courts. Here, you could argue that the government is merely asking Apple to achieve a functional result by writing and signing code, but we think that ignores the inextricably expressive elements of compelling Apple’s signature.

Apple’s signature conveys its strong endorsement of the signed code, what the Supreme Court has called “an affirmation of a belief.” Apple says it believes in strong security for its devices, and it has designed them to run only signed iOS code as a means to ensure this security. (You can certainly quibble with this as a means of enforcing walled gardens, but it is a conscious choice.) When Apple signs code, it is conveying, among other things, that (1) the code originated with (or has been reviewed and approved by) Apple; (2) it is authentic and has not been modified by a malicious third party; and (3) it is safe to run on an Apple device.
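
As a rough sketch of the mechanics (our simplification, not Apple’s actual secure-boot design; the key names and update payload below are hypothetical), the vendor signs the exact bytes of an update with a private key only it holds, and a device carrying the matching public key refuses to run anything whose signature does not verify:

    # Simplified illustration of vendor code signing, using Ed25519 keys from the
    # Python "cryptography" library. Real secure-boot chains are far more elaborate.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Vendor side: the private key never leaves the vendor.
    vendor_private_key = Ed25519PrivateKey.generate()
    vendor_public_key = vendor_private_key.public_key()  # baked into every device

    firmware_update = b"...compiled OS image bytes..."    # hypothetical payload
    signature = vendor_private_key.sign(firmware_update)  # the vendor's attestation

    # Device side: install only code the vendor has vouched for.
    def device_will_install(image: bytes, sig: bytes) -> bool:
        try:
            vendor_public_key.verify(sig, image)
            return True   # signature checks out: these exact bytes were endorsed
        except InvalidSignature:
            return False  # modified or unsigned code is refused

    assert device_will_install(firmware_update, signature)
    assert not device_will_install(firmware_update + b" tampered", signature)

The signature is therefore inseparable from the message it carries: it tells the device, and anyone who inspects the update, that the vendor stands behind these particular bytes.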

Given this, forcing Apple to sign code it does not want to sign is clearly expressive, just as forcing parade organizers to include marchers they do not want injects an unwanted message into the parade. As we argue in the brief, the court’s order is “akin to the government dictating a letter endorsing its preferred position and forcing Apple to transcribe it and sign its unique and forgery-proof name at the bottom.”

Isn’t the “audience” for Apple’s speech just the phone? Or the government? And can’t Apple say it doesn’t “agree” with this speech?

Even if you accept that forcing Apple to write and sign code is compelled speech, you might think it’s a special case, since the only “audience” is the single iPhone 5c that the FBI wants to unlock. According to this argument, it might be different if Apple were forced to push weakened iOS updates to non-consenting users.

But that’s simply not how the First Amendment works. Forcing Apple to engage in speech that is only transmitted to the government, in private, is compelled speech nonetheless, just as the Supreme Court has held that it is unconstitutional to force benefit seekers to sign a loyalty oath that only the government sees. And it’s irrelevant that Apple can spend as much money as it wants telling the world that it doesn’t “agree” with the signed code—the Supreme Court has likewise made clear that compelled speech is unconstitutional even when speakers can use other channels to disavow that speech.

What about cigarette labeling and highway safety mandates?

Wouldn’t this argument invalidate a whole range of government regulations—like mandatory labeling for cigarettes or nutritional information on food packaging? What about highway safety rules—bumpers, wheels, and the thousands of other arguably expressive choices that go into making a car that’s allowed on US roads? Wouldn’t companies be allowed to disobey any time a court’s order required them to engage in some speech—such as when a CEO has to order employees to shut down a factory that is illegally polluting?

As we’ve said, we think this is an especially egregious case that rises above lots of other hypotheticals that might incidentally involve compelled speech.

First, compelled speech doctrine has an exception for “purely factual and noncontroversial information” in the context of commercial speech, used to prevent consumer deception. That’s the theory that has been used to require some cigarette labeling and other “purely factual” government-mandated labels. However, it should be emphasized that this is a narrow exception. (Even some cigarette labels—those that go beyond “purely factual, noncontroversial” information—have been struck down.)

Second, many safety regulations might indirectly require companies to make certain design decisions in order to comply. For example, since the mid-1980s, cars have been required to include a “Liddy Light,” a center high-mounted brake light to improve visibility when stopping.

These regulations are distinguishable from requiring Apple to write and sign code, which is a compelled affirmation of belief. Safety regulations are aimed at the non-expressive elements of car designs—the government is making a judgment about necessary safety specifications, not their aesthetic or expressive content. Thus they are arguably subject to a less stringent form of constitutional scrutiny. By contrast, compelled writing and signing of code would require Apple to falsely endorse the government’s chosen version of iOS, which Apple believes to be detrimental to its users’ security. This is somewhat like requiring a food manufacturer to include an ingredient the company feels is unsafe for consumption and then label the package with the company’s “seal of quality.” Apple’s signed code is inherently expressive; indeed the government wants to force Apple to sign the code precisely because of the message the signature conveys.

Even if a certain safety regulation is considered a speech compulsion, it would not be automatically unconstitutional. Even under so-called strict scrutiny, the government can compel speech if it can show that the compelled speech is narrowly tailored to advancing a highly important public interest that cannot be addressed in any other way. With safety regulations, the government may be able to demonstrate the necessity of specific designs, particularly since they’re part of comprehensive regulatory schemes. Similarly, forcing a CEO to order a factory to be shut down might compel some speech, but this rather incidental burden on the CEO’s First Amendment rights could well be justified in light of the government interest at stake. In Apple’s case, the government has not demonstrated any such necessity, nor that the All Writs Act order is narrowly tailored.

Of course, these aren’t the only questions in Apple’s fight, and free speech is just one of many reasons to oppose the government’s demands. But the First Amendment is an important protection for strong encryption, and we’ll rely on it as the new Crypto Wars roll on.

Via Electronic Frontier Foundation

——

Related video added by Juan Cole:

Team Coco: “Steve Wozniak On Apple’s Battle With The FBI – CONAN on TBS”

4 Responses

  1. All the sanctimonious preaching by the EFF and Apple is duplicitous nonsense given the FACT that using Apple devices gives PRIVATE ENTERPRISE access to a constant and unstoppable flow of granular information about YOU as long as that device is powered.

    Although Mr. Wozniak is a pretty good super-tech, a fair dancer and an Apple evangelist, he completely ignores one entire side of personal privacy destruction from using the iOS and OSX operating systems, as do most Apple acolytes.

    Now, if Apple and the EFF can “protect First Amendment Rights” from government encroachment with such passion, why can the constant flow of collectable private data from iOS devices not be met with the same ferocity?

    Unfortunately, Apple and the EFF will lose in court, as will Apple customers, due to a lopsided defense based on duplicity, given that PRIVATE SECTOR data collectors and data aggregators already take, without a warrant or your permission, everything they want from you.

  2. For all practical purposes, the USA government, as well as ALL governments, has already LOST this battle. So, no matter what sort of “con job” the USA government does on technologically ignorant judges, strong encryption, which requires many months to many years of super-computer power to crack, is simply a fact of life. Nothing the USA government can do will put that toothpaste back in the tube.

    The global open source community has invested lots of energy in developing encryption tools that will completely frustrate all governments, and because the basic programming (source code) is freely available to anyone who wants it, there can be no “back doors.” If governments try to add a “back door,” it is immediately discovered and removed (and the submitter is blacklisted).

    The simple reality is the “bad guys” already have full access to “unbreakable” encryption and will continue to do so no matter what USA courts decide.

    The down side of the USA government attack on Apple is that if the USA government “wins,” it will cripple USA companies in the global market, where every country is trying to make its local companies the “winners.”

    I have seen this first hand . . .

    Many years ago, I worked for a company that made its own operating system. Because of USA law at the time, we were prevented from shipping our operating system outside the USA unless we stripped out encryption. Our customers outside the USA were NOT happy about this and threatened to switch to similar products from other parts of the globe. Our solution was to remove our encryption module from the operating system and create an interface between our operating system and a separate encryption engine (all three of the “standard” operating systems now use this design).

    Although we never “officially” documented the interface between our operating system and our now separate encryption engine, somehow the information “leaked.” As a result, while our USA customers got the encryption engine for “free,” our non-USA customers still purchased our operating system and then purchased one of the several encryption engines that had been developed (without our help or “consent”) outside the reach of the USA government. For some reason, the best of these “independent” encryption engines plugged right into the same place as the encryption engine the USA government prevented us from shipping outside the USA. We were in strict compliance with USA law, but our customers still had very strong encryption.

    By giving the customers the ability to have encryption, we saved our global market. If we had prevented our non-USA customers from having encryption, like the USA government wanted, we would have lost most, if not all, of our non-USA sales.
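
    A minimal sketch of that kind of design, with hypothetical names (the interface and the Fernet-backed engine below are illustrative, not the actual product): the operating system talks only to a narrow engine interface, a stripped export build can ship a do-nothing engine, and an independently developed engine that implements the same interface plugs into exactly the same place.

        # Hypothetical sketch of an OS with a pluggable encryption engine.
        # Names and the Fernet-backed engine are illustrative only.
        from abc import ABC, abstractmethod
        from cryptography.fernet import Fernet

        class EncryptionEngine(ABC):
            """The narrow interface the operating system calls."""

            @abstractmethod
            def encrypt(self, plaintext: bytes) -> bytes: ...

            @abstractmethod
            def decrypt(self, ciphertext: bytes) -> bytes: ...

        class NullEngine(EncryptionEngine):
            """Export build: ships with no real encryption at all."""

            def encrypt(self, plaintext: bytes) -> bytes:
                return plaintext

            def decrypt(self, ciphertext: bytes) -> bytes:
                return ciphertext

        class ThirdPartyEngine(EncryptionEngine):
            """A drop-in engine someone else could supply behind the same interface."""

            def __init__(self) -> None:
                self._fernet = Fernet(Fernet.generate_key())

            def encrypt(self, plaintext: bytes) -> bytes:
                return self._fernet.encrypt(plaintext)

            def decrypt(self, ciphertext: bytes) -> bytes:
                return self._fernet.decrypt(ciphertext)

        def os_store_record(engine: EncryptionEngine, data: bytes) -> bytes:
            # The operating system never cares which engine is plugged in.
            return engine.encrypt(data)

        protected = os_store_record(ThirdPartyEngine(), b"customer data")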

    Note that there are several non-USA smart phone makers that are more than willing to provide highly secure smart phones . . .

    link to cryptophone.de

    While the USA can try to keep these phones from being sold in the USA, given the small size of these devices and the tremendous amount of trade and human traffic between the USA and other countries, most of the people who would want secure phones will have little trouble getting them. Note that although iPhones have limited sales channels outside the USA and Europe, iPhones are used all over the globe – they are simply purchased in other countries and smuggled.

    Another example is Xiaomi smart phones. Technically Xiaomi does NOT sell in the USA, but it is very easy to purchase a Xiaomi GSM phone from many sources and have it shipped to the USA and use it on either the AT&T or T-Mobile networks.

    BTW – there is already a very active market that provides additional security technology for both iOS and Android based phones, so even if the USA tries to cripple iOS and Android, the tools are readily available to stop the USA government cold.

    The bottom line is that USA government spies and law enforcement have LOST, and they may as well just accept REALITY instead of wasting lots of time and money. Communication devices are global, and the USA has no control over any devices that people use.

    • spyguy – Access and encryption are two distinctly separate issues.

      You make an excellent point regarding the effective use of encryption to protect intellectual property.

      However, the government will contend that email and chat traffic (data) via an FCC-regulated source is not intellectual property unless formally protected by DRM.

      Our highest government officials have had their personal communications scrutinized.

      There is also the issue of the over-integration of Apple devices and the continuous stream of personal information already granted to private enterprise.

      Everyone will LOSE this specific battle unless access and encryption are completely separated in the minds of investigators, the courts and Apple, especially.

      Access and encryption are as mutually exclusive as a coin toss.

      • Access restrictions and encryption use the same technology and, as I noted, this technology is readily available for all three major operating systems. As a result, even if the government “wins” its fight with Apple, it will be a Pyrrhic victory, where regular non-technical folks will have their devices and communications totally unprotected while the “bad guys” will still have all the tools they need.

        As I noted, the global governments LOST the encryption technology battle long ago and there is no going back. The USA can pass all the laws it wants, but they will be meaningless.

        The global open source community tends to get very angry when governments try to interfere with their activities and they take it as an intellectual challenge to make the lives of governments as miserable as possible.

        As John Oliver noted on Sunday (after I wrote the stuff above), there is a thriving global market for security technology for personal devices, and much of the most secure technology is NOT from the USA (people in other countries have had long experience with nasty governments).

        link to youtube.com

        As for the three operating system vendors having access to personal data, IF people don’t like that, then they can pick different products. Right now most people on earth appear to be less concerned about OS vendors having some of their data than they are with the government getting access.

        The bottom line is that, for most people, the situation will not change all that much, but the same will be true for the government. The whole reason people use encryption and secure access is to make getting the data as costly as possible (in both time and money). Given enough time, money and resources, any security technology can be cracked, BUT by the time the data is in “plain text” the value is usually close to zero.

        In fact, the ONLY value in the iPhone that Apple and the government are arguing about is the tools the government is trying to force Apple to build. The actual data on the phone is of no value whatsoever.

        I suspect that the government has already reverse engineered all three operating systems and developed tools, so in reality what they want from Apple is cover for what they have already done.

        If I could write a decompiler fresh out of college, the government can build a much better one. (I had to decompile a terminal driver that no one had the original source code for, so we could update it for newer devices.)
