
The problem is not the algorithm. The problem is the masses that believe everything they watch without questioning it.

This is wrong. You're basically saying people should have been educated against an emerging technology, a process that can take a generation. You let the technologists off the hook on the basis that if they don't do it someone else will, and the negative outcomes become the fault of the uneducated for having failed to educate themselves.

The same argument was used to dismiss the way Facebook facilitated the genocide in Myanmar. For sure, social media companies are not the originators of human evil nor in any way the monopolists of it, but they are massively profiting from it despite themselves having the education and knowledge of how their tools are being abused, so that makes them culpable.



That’s a fair point. It does seem morally reprehensible to be aware of the fact that your platform has become a megaphone for genocide and to profit from it nonetheless. So I agree that on those grounds we should hold Facebook accountable.

Still, this is not a new problem. In 1939 an entire nation was susceptible enough to propaganda to accept global war and genocide, and several other nations were willing to turn a blind eye and ally themselves with that nation. Mass propaganda has functioned successfully for a long time; the only difference is that it used to be driven primarily by national interests, while now it's driven by global iconoclasts on platforms that have economic interests but are otherwise neutral. The reason populations in '39 were susceptible to harmful ideology is the same as the reason they are today—they were cultivated in environments that, either because of material limits or cultural emphasis, do not prioritize instilling in the wider population the ability to conduct critical analysis of argumentation.

It's important to recognize that while technology has certainly accelerated the spread of harmful thought and populist rhetoric, it has not enacted a fundamental change in the nature of that rhetoric or its function. Totalitarianism today looks much the same as it did a few years ago, and as it did even further back in history—only its vehicle has changed. In fact, one of the prime errors in the analysis of modern ills seems to lie in willful ignorance of history, which differs from our day in flavor but remains the same in essence.

Also, I'm not making the argument that we should have adapted to technologies we couldn't have foreseen—I'm saying the time to adapt is now, not a few years ago. I simply think that focusing too heavily on attacking the technologists (now) is the wrong angle and will fail to address the whole issue (though of course, holding them accountable is definitely a big part of the story), simply ensuring the problem lives on in another form.

The goal should be better education systems, particularly ones that focus on producing rational human subjects rather than economic cogs (what public education, for the most part, produces today)—cogs that, while plenty efficient (and pleasingly expendable) for corporations, have no aptitude for discerning judgement or humanistic reasoning to save them from political radicalization.


We're in general agreement, but the problem with the educational approach is the relatively long timeframe over which it takes place. Sticking with your WW2 example, some people certainly saw the risks by the time of the Nazi regime's ascent to power in 1933, but they lacked the time as well as the will to adapt quickly enough. We have less of an excuse: while a fully networked society is certainly novel, we have far more historical perspective than earlier generations from which to know and really understand the downside risks.



