Doctorow is not using ‘dark’ the way you suggest. It’s not Dark as in Dark Web, it’s Dark because they exist outside of the glare of big tech control.
His Dark corners are
privately hosted servers, groupchats, message-boards, etc. [outside the] nexus of control [of Facebook, Google et al.]
It has nothing to do with whether or not it’s publicly accessible, indexed by search engines, or whether the participants are engaged in ‘good’ or ‘bad’ things. Dark corners are self-governing communities that operate independently of the tech giants that aspire to control every aspect of the web.
Doctorow’s argument is that these dark corners are good, and in fact necessary, spaces. And this forum absolutely fits that description!
I get the distinct impression that he associates “dark” corners with criminal or unsavory activities, e.g.
The worst people on the internet have relocated to its so-called dark corners
fighting to abolish dark corners because only the worst people on the internet use them today
are two quotes from the “platforms burn” article. The article also sketches a vision of normalizing these “dark corners” instead of giving Facebook & friends more control over users.
I am not sure that this kind of argument would identify, e.g., this forum as a “dark corner”. We are not unsavory enough. The worst I have seen on this forum is false color transitions in sunsets.
I respect Doctorow (and I like many of his books), and I understand that he wants to engage the reader emotionally, but I think that choosing between “social media” and “dark corners” is a false dichotomy. The internet is full of great sites and forums where you can have productive and meaningful conversations with smart people. It takes minimal effort to find these.
Same darkness as maker spaces, community gardens, house-building collectives, and many more grassroots initiatives in the physical world. Visible, but too small and too individual to be of interest to the big players.
But there is always the trap of growing: little hippie organic food shop → small chain because it scales well → big chain → bought up by a giant. The small shops are in steep decline here in Germany; there are some local and national chains competing, but so far they have not been bought by the big firms, the way Whole Foods was by Amazon.
Still resisting, and “in the dark”, are the collective shops where you have to be a member and perhaps work some shifts to get good prices.
Rereading the article, I can see why you would think this. Having followed his work over many years, I’m quite confident this isn’t his position. I think some nuance was lost in the way he framed this particular essay.
A key section points to arguments better articulated elsewhere:
The same is broadly true of other disfavored groups, including those with out-of-mainstream political ideologies. Some of these groups hold progressive views, others are out-and-out Nazis, but all of them chafe at the platforms’ policies at the best of times, and are far more ready to jump ship when the platforms tighten the noose on all their users.
This is where “dark corners” come in.
In other words, he doesn’t want to normalize the bad actors you find in ‘dark corners’. Rather, he wants to normalize dark corners, because not everyone who goes to them is a bad actor, and some people have very good reasons to want those spaces.
His choice of words isn’t great though, I admit. The phrase ‘the worst people’ in particular, and without qualification (i.e., that not only ‘the worst people’ are in these places), is clumsy.
Now I’ve got my coat, I think there’s a point to be made about overreach in regulating spaces out of existence to crack down on criminal activity that is definitionally already illegal.
“Companies, from Big Tech down to smaller platforms and messaging apps, will need to comply … Wikipedia, the eighth-most-visited website in the UK, has said it won’t be able to comply with the rule because it violates the Wikimedia Foundation’s principles on collecting data about its users.”
“Think of the Children!” and “There are criminals hiding!” is nearly always an argument for curtailing the freedom of “normal” people.
The best way to fight a lot of crime would be to make all money transfers and bank statements public. That will never happen, even if it could save children. And BTW, it would: if one could follow the money to the child abuse sites, they would dry up in an instant.
Europe wants to do the same as a means of making E2E encryption illegal, or at least of adding backdoors to it (which makes it useless). They say the road to hell is paved with good intentions, but I doubt these lawmakers even had good intentions in the first place. Do they even have data to back up these measures?
In the end, it seems like every major nation or union is moving towards more surveillance and authoritarian measures by the day, and they show no intention of stopping. And to be fair, it’s not like we weren’t warned: Snowden, WikiLeaks, and plenty of other whistleblowers told us about all of this.
Edit: Not to speak of how that UK law is worded. It uses such general terms as “harm” or “misinformation” that whoever controls it will be able to dictate what counts as truth. How many times has something labelled “misinformation” turned out to be true afterwards?
There is a slim chance of that happening after a recent (March 2024) ECHR ruling that upholds E2E encryption as a basic human right.
I don’t know about intentions, but I seriously doubt that they understand the implications. Outlawing or crippling E2E encryption would be a cybersecurity nightmare. It is impossible to keep a backdoor secret for long, and pretty much all business would collapse when it becomes public knowledge.
Exactly. It would kill cyber-business: everything from using your credit card at a store, to buying things online, to doing your personal banking, and on and on. There are hundreds or thousands of ways it would kill the things we can semi-safely do today. LOL, it would also kill their dreams of eliminating cash.
This is great news, hopefully it continues this way.
IIRC, they only wanted to go after encrypted messengers where the users hold the keys and messages are decrypted only locally, like Signal or WhatsApp. Regular TLS and other encryption would remain as they are, since they already have access to all the data they want through the backdoors they have at all the companies, or simply by asking for it.
I need to share this for the smiley, always relevant
That is hilarious. The great thing with “AI” is that it holds up a mirror to our culture. That card is so close to the meaningless cards actually on sale. AI just makes small mistakes that reveal the absurdity of the real thing. How can you trust AI when you clearly see that all current models are just spouting cliché dung without (obviously) any reflection? Just variations and mashups of the average response to a prompt. That card actually conveys the same thing as a properly spelled card. There’s no value in the words being legible.
More crap: from stealing traffic by generating fake content using titles harvested from a competitor’s website, to generated ‘educational’ videos targeting toddlers.
As the parent of a 4-year-old, yes, I can confirm that YouTube Kids is flooded with AI-generated content (at least, I hope it is AI-generated, and no humans are that dumb). It comes up in the suggestions, so kids who have learned that they can click there will end up watching it.
Which is why my daughter only gets to watch carefully pre-selected stuff offline. Mostly cartoons from the 1980s, and of course stories on filmstrip, which are enjoying a revival in Hungary.
What I’d like to see (and what won’t happen): any AI-assisted or AI-generated content (whether an image, video, text, sound, or source code) mandatorily marked as such, with failure to do so punishable as a criminal offence. The next step would be the big platforms, such as YouTube, letting users filter all of it out, and automatically deleting anything AI-assisted/generated that targets children.
Yes, I know it’s pretty much impossible to enforce: how do I prove that some content is not purely a human creation? If that were possible, labelling by the one publishing it would not be needed.
Also, what would motivate YouTube etc. to actually remove such content, as long as there’s a market for it?