[NetBehaviour] Free speech

Max Herman maxnmherman at hotmail.com
Sun Jan 10 20:54:43 CET 2021

Hi Ruth,

Great points.  The algorithms I think are not so much designed to do things, but selected based on their results.  What algorithms keep people looking, keep them "in session"?  There is this supposition that such algorithms are "finding the user what they want," but what if the user really wants to take a break and go for a walk but cannot because they are being manipulated by an algorithm?

Health and wellness are relevant here.  Peter Sterling makes a great argument that humans (and most life) evolved to forage and socialize based on small, unexpected dopamine rewards.  We walk through meadows with our friends until someone finds some berries, and calls out "Hey I found some berries" and we all settle in for a lovely berry lunch.  Next day, we repeat, or maybe look for apples or try to catch a fish.  We tell jokes and stories.

Yet with industrialization and now digital technology our dopamine "cycle" is all out of whack.  Lacking time in nature, with less social safety, and more environmental stress, we look to "large dose" sources of dopamine.  I.e., instead of a walk in the woods with a fine berry lunch we obtain large amounts of processed food.  But it goes for other behaviors as well as food, informing all of our behavior paths including visual stimulation, social, verbal, chemical, you name it.

Sterling is a neuroscientist trained in medicine, and argues that for health (at all levels) the balance needs to be restored.  In other words, he says that surgery and pharma are not the cure.  He calls for preventive wellness like diet and exercise, but also for cultural practices, which he says are essential.  He doesn't know what these are, though, and says that artists need to help.

As to big data, I think we all will need to come to terms with the deleterious health effects which disordered tech and media can cause.  One phrase I think of for this is, "social is the new smoking."  Can tech people design financially stable algorithms which respect the health of the user?  I would imagine yes.  They will do so faster once consumers catch on and demand it.

With great problems come great opportunities, I guess!

Very best,



"Predictive regulation and human design"

From: NetBehaviour <netbehaviour-bounces at lists.netbehaviour.org> on behalf of Ruth Catlow via NetBehaviour <netbehaviour at lists.netbehaviour.org>
Sent: Sunday, January 10, 2021 12:15 PM
To: NetBehaviour for networked distributed creativity <netbehaviour at lists.netbehaviour.org>
Cc: Ruth Catlow <ruthcatlow at gmail.com>
Subject: Re: [NetBehaviour] Free speech


In addition to the obvious dangers of building global communication systems for the profit of platform owners (whatever good design is, it must prioritise delivering profit to shareholders), the problem seems to be that networked algorithms have emergent properties.

I saw Tristan Harris, ex-Google designer and now heading up the anti-Google "designer for humanity" race (yes, I have reservations), showing research about how social media algorithms will always push people to look next at a more extreme version of the thing they just saw... which results, for example, in directing depressed teenagers from legitimate mental health support communities to suicide cults.

I am quite relieved that Twitter's terms of service mean that Donald Trump can be silenced. But it doesn't say much for the state of American democracy that its political institutions are unable to deal with such an obvious danger.


On Fri, Jan 8, 2021 at 8:09 PM Edward Picot via NetBehaviour <netbehaviour at lists.netbehaviour.org> wrote:
I'm genuinely conflicted about it.

It occurs to me to wonder how the algorithms work - if I look at a video about conspiracy theories on YouTube, for example, am I then presented with a lot more videos about conspiracy theories next time I visit? I think the answer to this is probably yes, because I looked at a video of Trump doing his YMCA dance (which apparently he does quite frequently at the end of his rallies), thinking about re-using it for satirical purposes, and now every time I go to YouTube it wants me to look at more videos of Trump dancing.

I think the algorithms are one of the most insidious and damaging aspects of Web 2 - instead of genuinely exploring the web and coming across new things, which I seem to remember we used to do in the early 2000s, we now find ourselves in a commercialised feedback-loop which presents us over and over again with amplified (and monetized) versions of whatever beliefs and ideas and interests we had in the first place. Perhaps there's some mileage in legislating against the algorithms.
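The feedback loop Edward describes can be illustrated with a toy model. This is purely a sketch of the rich-get-richer dynamic, not any platform's actual code; the topics, weights, and session counts are all invented for illustration.

```python
# Toy model of an engagement-driven recommendation feedback loop.
# One stray click skews the feed; the user's clicks on the skewed feed
# then skew it further. All names and numbers are illustrative.
from collections import Counter

TOPICS = ["gardening", "music", "trump_dancing", "conspiracy"]

def recommend(history, n=5):
    """Serve mostly the user's most-clicked topic, plus one runner-up."""
    counts = Counter(history)
    ranked = sorted(TOPICS, key=lambda t: counts[t], reverse=True)
    return [ranked[0]] * (n - 1) + [ranked[1]]

history = ["trump_dancing"]            # one stray click
for _ in range(10):                    # ten visits later...
    history.extend(recommend(history)) # the user watches what is served

top, count = Counter(history).most_common(1)[0]
print(top, count)  # prints: trump_dancing 41
```

A single initial click ends up accounting for four-fifths of the viewing history, because each session's output is fed back in as the next session's input. Real recommenders are vastly more complicated, but the amplification structure is the same.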


On 08/01/2021 19:16, Alan Sondheim via NetBehaviour wrote:
I think some safeguards need to be put into place; if you look at the propaganda-machine-work in Nazi Germany, it can do terrible harm. But in the U.S. under Reagan, the fairness doctrine was scrapped, which meant local news outlets of all sorts could be grabbed up by opinionated multinationals, and you get people like Rush Limbaugh spreading hatred unchallenged in rural areas - probably the biggest swath of territory in the country. That's where "these people" get their news, unchallenged. It's far-right-wing money. I also think hate speech might be covered more directly - one of the t-shirts at the riot said, in abbreviated form, 6 million is not enough. What do you do with that?

Best, Alan (mind you I've been censored on YouTube and elsewhere myself, I think unfairly, so you might make a counter-argument that it's all in the eye/ear of the beholder. It's an aporia.)

On Fri, Jan 8, 2021 at 2:07 PM Edward Picot via NetBehaviour <netbehaviour at lists.netbehaviour.org> wrote:
What do people think - have we reached the point at which social media
companies should be prosecuted for allowing hate-speech, incitements to
violence, demonstrable untruths and conspiracy theories to be uploaded
onto their sites?

Should they be regarded as publishers, and therefore legally responsible
for their content?

I'm genuinely torn, but I think maybe we've now reached that point. I'd
be very interested to hear what others think.


NetBehaviour mailing list
NetBehaviour at lists.netbehaviour.org

directory http://www.alansondheim.org tel 718-813-3285
email sondheim ut panix.com, sondheim ut gmail.com


Co-founder & Artistic director of Furtherfield & DECAL Decentralised Arts Lab
+44 (0) 77370 02879

*I will only agree to speak at events that are racially and gender balanced.

**sending thanks<https://www.ovoenergy.com/ovo-newsroom/press-releases/2019/november/think-before-you-thank-if-every-brit-sent-one-less-thank-you-email-a-day-we-would-save-16433-tonnes-of-carbon-a-year-the-same-as-81152-flights-to-madrid.html> in advance

Furtherfield disrupts and democratises art and technology through exhibitions, labs & debate, for deep exploration, open tools & free thinking.

DECAL Decentralised Arts Lab is an arts, blockchain & web 3.0 technologies research hub

for fairer, more dynamic & connected cultural ecologies & economies now.


Furtherfield is a Not-for-Profit Company Limited by Guarantee

Registered in England and Wales under the Company No.7005205.

Registered business address: Carbon Accountancy, 80-83 Long Lane, London, EC1A 9ET.
