“…the collective postmortem – on the left and right of politics – has focused on a concern with far greater long-term impact: the accidental or deliberate propagation of misinformation via social media.”

Facebook’s founder Mark Zuckerberg, having initially denied that fake news was problematic, now wants us to believe that “we’ve been working on this problem for a long time.” Really? So it is a problem after all?

Much has been written about the arrival of a “post-truth” era, in which facts become secondary to feeling, and expertise and vision give way to ersatz emotional connection. Nazi Germany shows that this is not new, but the internet-driven efficiency with which it can be manipulated is.

One of the main drivers of this process is a click-based revenue model, in which algorithms prioritise items in news feeds based on how likely individual users are to “engage with” (ie click on) them – and thus be exposed to more ads. Whether these items contain carefully researched or fabricated material is of no concern to the algorithm: in fact, false, sensationalist stories that bolster existing prejudices are more likely to draw clicks than sober analyses that challenge assumptions. With misinformation being incentivised in this way, who could be surprised when Buzzfeed found a group of young Macedonians copying the most outlandish fabrications to more than 140 specially created pro-Trump websites and sexing up the headlines to gain clicks and go viral on Facebook?
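To make the mechanics concrete, here is a deliberately simplified sketch – in Python, with invented story names, a made-up predicted_click_probability score and a hypothetical rank_feed function, none of which reflects Facebook’s actual system – of what a purely engagement-optimised ranker does: it sorts by expected clicks and nothing else, so accuracy never enters the calculation.

```python
# Illustrative only: a toy "engagement-first" feed ranker.
# The stories, scores and field names are invented for the example;
# this is not Facebook's algorithm, just the logic described above.

from dataclasses import dataclass


@dataclass
class Story:
    headline: str
    predicted_click_probability: float  # estimated from past user behaviour
    fact_checked: bool                  # recorded here, but never consulted below


def rank_feed(stories: list[Story]) -> list[Story]:
    """Order stories purely by expected engagement (clicks, hence ad impressions).
    Note that fact_checked plays no part in the ordering."""
    return sorted(stories, key=lambda s: s.predicted_click_probability, reverse=True)


feed = rank_feed([
    Story("Sober analysis of trade policy", 0.02, fact_checked=True),
    Story("Outrageous claim confirming what you already believe", 0.31, fact_checked=False),
])

for story in feed:
    print(f"{story.predicted_click_probability:.2f}  {story.headline}")
# The fabricated, sensational item sorts first, because clicks are the only signal.
```

The point is not the code but the omission: nothing in the ranking penalises fabrication, so fabrication that draws clicks wins.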

Among the most pernicious myths of our time is that the functioning of the web is neutral and immutable; that it has evolved of its own ethereal logic, like a galaxy, and can’t be changed or stopped.

This is important, because a recent study by the Pew Research Centre found a majority of American adults getting at least some of their news from social media – much of it via Facebook (which means Britain is sure to follow). Zuckerberg has been resistant to the notion that his company, social media, or the web in general are undermining democracy (“a pretty crazy idea”), even after dozens of his own staff formed a covert taskforce to address the problem post-election. It’s easy to see why he bridles: if he accepts the truth that his algorithms function no more objectively than a human editor, then he bears responsibility for their choices. And once he accepts that, he must concede the equally obvious truth that Facebook, whether it wants to be or not, is now a media organisation and must vouch for the information it disseminates.

The most interesting question about 2016 is not why the Brexit result and Trump happened, but whether historians will regard both as incidental; whether this will go down as the year democracy revealed itself unworkable in the age of the internet – in which reality, already engaged in a life-or-death struggle with inverted commas, finally gave way to “alt-reality”.