My own reporting at Slate framed it that way, as did follow-up stories across the web.

The word algorithm has different meanings to different people. In the math world, it simply means a set procedure by which a given class of problems can be solved. But in the context of Facebook, Google, and other big tech companies, it has popularly come to refer to the complex processes by which software programs turn reams of data into some kind of abstract output: a suggestion, a recommendation, or a decision. In both of these cases, the problematic output was software-generated, though of course it stemmed from poor decisions made by the humans who coded the process. This interpretation does not absolve the humans; it just informs the discussion of what went wrong and how to fix it.

In the context of ads on Facebook, algorithm usually refers to the software that decides precisely which Facebook users within a target group to show a given ad, and exactly when to show it to them. A rudimentary auto-complete function aided in finding such matches.

Imagine your basic office vending machine. Now imagine that some troublemaker placed a grenade in slot E5, where the Snickers bars were supposed to go.

Facebook subsequently adjusted its ad-targeting tools in an attempt to prevent such discrimination. But the problem had very little to do with that sorting process.

And then there was the scandal that hit earlier this month, when Facebook disclosed to congressional investigators that it had found evidence of a Kremlin-linked organization buying a series of advertisements targeting U.S. users. Here, too, the problem was not the sorting process. Rather, it was the fact that Facebook allowed such ads to be placed at all. And how did that happen? As the Verge reported on Monday, the ads had slipped past the human reviewers who were paid to quickly evaluate different components of each advertisement for abuse, illegal activity, or violations of Facebook policy.

Some might object that this all boils down to semantic quibbling.
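To make the distinction concrete, the "rudimentary auto-complete function" mentioned above needs no machine-learning wizardry at all. The sketch below is purely illustrative (not Facebook's actual code), and the category strings are hypothetical: it simply returns stored categories that begin with whatever the advertiser has typed, which is how an auto-complete can surface any string that made it into the underlying data.

```python
# Illustrative sketch, not Facebook's actual implementation: a bare-bones
# auto-complete over a list of targeting categories. It does no judgment
# or filtering -- it surfaces whatever matches the typed prefix.

def autocomplete(prefix, categories, limit=5):
    """Return up to `limit` categories that start with `prefix` (case-insensitive)."""
    prefix = prefix.lower()
    return [c for c in categories if c.lower().startswith(prefix)][:limit]

# Hypothetical category strings, for illustration only.
categories = ["Jewelry making", "Jazz guitar", "Jeep owners"]
print(autocomplete("Je", categories))  # matches "Jewelry making" and "Jeep owners"
```

The point of the sketch is that such a function is content-neutral by design: if an offensive phrase exists in the category data, a prefix match will happily suggest it.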
But how we frame the problem does matter, because it carries implications for how we might try to solve it. Algorithm problems sound complicated, mysterious. They sound like problems that could be solved in one of two ways: either through some sort of technical wizardry or by pulling back from the project of automation altogether, on the grounds that some tasks are just too nuanced to be trusted to machines.

So how do you fix a vending machine to keep people from stocking it with grenades? One possibility would be to put a human in charge of inspecting everything that goes into it. But by automating the process, Facebook has sold targeted ads on a scale that would be unthinkable if human oversight were required. The other possibility would be to make the vending machine smarter: to build in some new features that either more tightly restrict the size and shape of its contents or somehow automatically detect attempts to load it with contraband. Sophisticated algorithms come with their own pitfalls, of course, as Facebook knows from its experience with the news feed.
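The "smarter vending machine" option can be sketched in miniature. The snippet below is a hypothetical toy, not any real moderation system: a naive keyword screen that rejects submissions containing blocklisted terms. Its very simplicity illustrates the pitfall the article gestures at, since anyone can rephrase around a fixed list.

```python
# Hypothetical toy example of automated content screening -- the crudest
# possible "detect contraband" feature. Real moderation is far harder.

BLOCKLIST = {"grenade", "contraband"}  # illustrative terms, not a real policy list

def screen_ad(text):
    """Return True if the ad text passes the naive keyword screen."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not (words & BLOCKLIST)

print(screen_ad("Buy our snack bars!"))      # passes the screen
print(screen_ad("Grenade sale, this week"))  # rejected by the screen
```

A filter like this catches only the exact words it was told to catch; a determined troublemaker just relabels the grenade, which is why "build a smarter machine" is easier said than done.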