Facebook Ads’ Massive Design Bug


In mid-September, ProPublica published an article showing that Facebook’s advertising system let advertisers market to people who expressed interest in radical and racist topics:

“Want to market Nazi memorabilia, or recruit marchers for a far-right rally? Facebook’s self-service ad-buying platform had the right audience for you.

Until this week, when we asked Facebook about it, the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of “Jew hater,” “How to burn jews,” or, “History of ‘why jews ruin the world.’”

(Published September 14th, 2017)

Turns out it was technically possible to use Facebook ads to target users claiming to be anti-Semitic. This same system could perhaps be used by Russians to target American voters, but that’s a whole other thing. Facebook’s ad system lets advertisers type the categories of users they’d like to target into an input field, and an automated matching system then checks those user-supplied categories (full or partial matches) against ones Facebook already has.
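To make the mechanics concrete, here’s a rough sketch of how such a matcher might behave. The function name, the substring-matching rule, and the sample categories are my assumptions for illustration, not Facebook’s actual implementation:

```python
# Hypothetical sketch of a naive ad-category matcher -- NOT Facebook's
# real code. It illustrates how partial matching against user-supplied
# profile fields surfaces whatever phrases users typed, offensive or not.

def match_categories(query: str, known_categories: list) -> list:
    """Return every stored category that fully or partially matches the query."""
    q = query.strip().lower()
    return [c for c in known_categories if q in c.lower()]

# Categories here come straight from free-text profile fields, so the
# system "knows" whatever users chose to write about themselves.
categories_from_profiles = [
    "jazz enthusiasts",
    "javascript developers",
    "jew hater",  # user-entered hate speech (from the article) becomes a targetable category
]

print(match_categories("ja", categories_from_profiles))
# ['jazz enthusiasts', 'javascript developers']
print(match_categories("jew", categories_from_profiles))
# ['jew hater']  <- works exactly as programmed, which is the design bug
```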

Aside from the moral and ethical problems of targeting users and groups this way, from a software perspective we might classify this as a design bug. Unlike a coding bug, where the program behaves in a way the programmer didn’t intend, a design bug is when the program behaves exactly as the programmer intended but stakeholders don’t like it. In this example, Facebook’s ad system was behaving as it was programmed to, but in a way that was HIGHLY questionable to many stakeholders: advertisers, the people being advertised to, journalists, and now federal authorities.

Design Bugs

Design bugs are some of the most common types of bugs developers (and testers) find. They are often caught by questioning the system, approach, design, requirements, etc., in a holistic way. In cross-functional and agile teams I’ve found this is easier because you can take a step back and look at the bigger picture with many of the original decision makers. Going through the implications of those decisions as a mental exercise will often expose a bug or two that can then be addressed in the near future.

To help find design bugs through this kind of questioning, try to:

  • Find a set of scenarios or circumstances that showcase your concerns
  • Make those concerns as costly or annoying as possible
  • Find places outside of the program where users will be impacted

For those familiar with the RIMGEN bug reporting mnemonic, you might notice some similarities, although with a different emphasis.

We (all software developers) introduce design bugs all the time with the justification that it “works as designed”. Most times the design of a system isn’t so damaging. Sometimes it is incredibly so. This lesson seems worth keeping in mind.

Other Things:

  • You can see Facebook’s response here.
  • Benedict Evans suggested this was more of an exploit, and that programmers (I’d say all software developers) should consider what happens when users are evil. This is probably a valuable approach. I often take the approach of “anything you allow me to do, I will do”. In other words, give me enough rope to hang myself and I will.
  • Matt Heusser

    This seems a bit like a test exercise: “Write a program that allows you to search profiles by interests and claims, then filter out anything that might be offensive.” Without a priori knowledge of what could be offensive, such a program is conceptually impossible. And a priori is a fancy way of saying “telling the future.” Jerry Weinberg refers to a similar problem in his book “Becoming a Technical Leader,” if I recall correctly.

    • That’s the thing: we do have some prior knowledge as to what might be considered offensive. If a small team got together they could probably generate a good-sized list. For example, Nazis are universally bad (even if some groups are trying to bring them back), verbs associated with religious groups could very well be offensive (should you be allowed to target people based on religion?), etc.

      Twitter also seems like a good place to research offensive terms. Lol

  • Of course, this could simply be a matter of the difference between automated and manual testing. Manual testing will (depending on the deviousness of the testers) show up what might happen when users get evil; automated testing will merely demonstrate that the code works for users, whether they are evil or not.

    • Robert, thanks for commenting.

      Without knowing more about their test strategy it’s hard to say where the fault lies.

      However, it is possible to automate evil inputs. The challenge would be coming up with a list of offensive inputs that might prove problematic, and from there predicting future variations of those. I wonder if there’s a way to use machine learning to help guess this stuff, along with some human review?
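      For example, a minimal sketch of that kind of automated audit might look like the following. The denylist, the audit function, and the sample categories are all illustrative assumptions, not real data:

      ```python
      # Hypothetical sketch: probe targetable ad categories with a
      # hand-curated list of known-offensive terms. The denylist and
      # the category snapshot are assumptions for illustration.

      OFFENSIVE_DENYLIST = ["nazi", "jew hater", "burn jews"]

      def audit_categories(categories, denylist=OFFENSIVE_DENYLIST):
          """Return (term, category) pairs where a denylisted term
          appears inside a targetable category."""
          return [
              (term, category)
              for category in categories
              for term in denylist
              if term in category.lower()
          ]

      # Imagine running this in CI against a snapshot of live categories;
      # any hit fails the build and is escalated for human review.
      hits = audit_categories(["jazz enthusiasts", "Jew hater"])
      assert hits == [("jew hater", "Jew hater")]
      ```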

  • nilanjan

    The more important takeaway for me is that you shouldn’t classify bugs as design or coding. If you focus on discovering customer inconvenience/pain you will scramble to find the greatest sources of pain. This has all sorts of implications:
    – recognizing that we naturally focus on code and what we intend
    – we shouldn’t just focus *only* on what customers value or on waiting for customer feedback
    – we should up the ante for developers and testers to find potential pain

    Using phrases like ‘coding bug’ is what is called ‘taming a wicked problem’.

    #weinberg #kaner #satisfice #developsense #lets-test #ast #lessonslearned

    • Hi nilanjan, thanks for commenting.

      Why is your takeaway that you shouldn’t classify bugs? Classifying bugs as coding vs. design (although not always easy to determine) can help us focus on the right things when reporting them. Classifying bugs doesn’t mean we CAN’T focus on discovering customer pain. The two aren’t mutually exclusive.

      How is using a phrase such as “coding bug” or “design bug” (or even classifying things) a wicked problem? It isn’t impossible to do.

      • nilanjan

        The wicked problem is the abuse of Facebook to target users. It is a very difficult problem to anticipate.

        Companies like Facebook are very developer-centric. When you look at a difficult problem and then categorize it as a design problem, it can give the impression that it is not a developer problem. That is what it means to tame a wicked problem.

        A wicked problem cannot be tamed.

        To solve this problem, developers need to look beyond the code. Software is not code. Every problem is a user problem. For developer-centric companies, boxing problems into categories allows them to continue doing what they are doing.