Make options hard to find or understand

Place opt-outs out of the way and obfuscate.


  • How many End User License Agreements have you read? How many Privacy Statements? How many Terms of Use documents? One company, PC Pitstop, hid a customer offer in the End User License Agreement for one of its products. It took four months and 3,000 downloads before anyone noticed it and contacted the company to claim the reward.
  • Between 2007 and 2009, several U.S. Internet service providers (ISPs) installed ad-serving technology produced by a company called NebuAd that could inject advertisements into their customers’ web pages. Typically, this technology was introduced to users via a change in the privacy policy buried on the ISP’s corporate web pages and was opt-out (and therefore on by default). One ISP, Embarq, when questioned by the U.S. House Energy and Commerce Committee, said that it had a 0.06% opt-out rate.
  • The path of least resistance through Facebook’s privacy options is to never visit the privacy settings page, and instead just accept these defaults. But that leads to some interesting “openness” issues. The default overall privacy setting makes information available to everyone; ads default to social ads; third-party advertising is allowed; third-party websites get data about you to allow instant personalization; and your timeline is visible in public searches.


Place the options far away from people’s regularly trafficked desire lines.

How to use this pattern

  • Remove any talk of opt-out activities from actual transactional points. Instead, create a separate location (a “privacy center”) where you can obscure the true activities behind general statements.
  • If you are caught doing bad things with user data, apologize profusely and then add more check boxes, explanations, and options to your privacy center, so it’s even harder to divine the correct settings.
  • Obfuscate. Make it too effortful to understand the points in the privacy statement by hiding them in legalese and convoluted options. As long as you have sufficient user trust (or sufficient utility), users will assume that you can’t be doing anything too bad with their data.
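The steps above boil down to two structural choices: permissive defaults at every transactional point, and an opt-out buried deep in a separate settings hierarchy. A minimal sketch of that structure, using entirely hypothetical setting names (none of these correspond to any real product’s options):

```python
# Hypothetical sketch of the pattern: sharing defaults to "on",
# and the opt-out lives several menu levels deep in a "privacy center".
# All setting names here are illustrative, not any real product's API.

DEFAULT_SETTINGS = {
    "profile_visibility": "everyone",   # most permissive value by default
    "social_ads": True,                 # opt-out, so enabled by default
    "third_party_data_sharing": True,
    "public_search_indexing": True,
}

# The opt-out is not offered at the point of transaction; it is nested
# far off the user's regularly trafficked desire lines.
PRIVACY_CENTER = {
    "account": {
        "advanced": {
            "partners": {
                "personalization": {
                    "instant_personalization_opt_out": False,  # not opted out
                }
            }
        }
    }
}

def clicks_to_reach(tree, key, depth=0):
    """Count how many menu levels a user must traverse to find `key`."""
    if key in tree:
        return depth + 1
    for value in tree.values():
        if isinstance(value, dict):
            found = clicks_to_reach(value, key, depth + 1)
            if found is not None:
                return found
    return None

print(clicks_to_reach(PRIVACY_CENTER, "instant_personalization_opt_out"))  # → 5
```

The point of the sketch is the asymmetry: accepting the defaults costs zero clicks, while reaching the opt-out costs five, which is exactly the path-of-least-resistance effect described above.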