No Regret Software
Sometimes people talk about "dark patterns" -- user interfaces and system designs intended to trick people into acting against their own best interests. For example, it is easy to sign up for a service but very difficult to leave that same service. Unless you click on exactly the right pixel of your screen, you pop open a video, agree to cookies, or sell your firstborn. You might join a social media network to connect with friends and family, and then be subjected to infinite scrolling rage bait.
The people paid by companies to design this software have long, convoluted arguments about why this is okay, actually. Sometimes the argument takes the form of "We need ad revenue to survive, so unless we trick users we won't survive and the consumer will suffer." Sometimes the argument takes the form of "The user clicked 'I Agree' on some long, unreadable terms of service when they signed up, so they gave full informed consent to whatever bad thing we are doing." Sometimes the argument takes the form of "Our customers aren't stupid, and if they didn't want us doing this awful thing they would say so or leave." None of these arguments convinces me.
There is a fairly simple question that cuts through a lot of these arguments. Because the question is straightforward, it will never be adopted by any tech company ever. The question is: "Will the customer regret taking this action later?"
Yes, there are fuzzy edges here. I might badly want junk food now, and consent to clicking on your cookie banner so that I get cookies. But if I wake up tomorrow with a stomachache and data brokers with baseball bats outside my house, then I have taken an action I regret. Software works hard to appeal to our addictive impulses, but in most cases we regret giving in to those impulses later. Maybe there are situations that we deeply regret but which are still good for us, and maybe some tech companies have our best interests at heart when they put us through those regretful situations, but I think that is rare.
Another argument is that "there exists some number of people who will not regret this awful thing, and therefore it is justified." In that case I beg you to temporarily accept utilitarianism and determine whether more people regret this design decision or applaud it.
If I click "accept cookies", then ads will follow me around and data brokers with baseball bats will show up at my door. So maybe I don't want to accept those cookies, and maybe you, as the software developer, should not make that choice the default.
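What would the regret-minimizing version look like? Here is a minimal sketch in TypeScript, with a hypothetical loadAnalytics function standing in for whatever tracking script a site wants to load. The only design decision that matters is which way the default points: if the visitor never explicitly accepts, no tracking runs at all.

```typescript
// Minimal sketch of an opt-in (rather than opt-out) consent flow.
// loadAnalytics is a hypothetical stand-in for loading a tracking script.

type Consent = "accepted" | "declined";

function getStoredConsent(): Consent | null {
  const value = window.localStorage.getItem("tracking-consent");
  return value === "accepted" || value === "declined" ? value : null;
}

function recordConsent(choice: Consent): void {
  window.localStorage.setItem("tracking-consent", choice);
}

// The no-regret default: unless the visitor has explicitly said yes,
// behave exactly as if they said no. Closing the banner, ignoring it,
// or clicking the wrong pixel all leave tracking off.
function maybeLoadTracking(loadAnalytics: () => void): void {
  if (getStoredConsent() === "accepted") {
    loadAnalytics();
  }
}

// Wire the banner buttons to an explicit choice; dismissing the banner
// records nothing, which is the same as declining.
document.querySelector("#accept")?.addEventListener("click", () => recordConsent("accepted"));
document.querySelector("#decline")?.addEventListener("click", () => recordConsent("declined"));
```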
If I fall into a Mastodon doomscroll for several hours and feel bad about it later, then maybe you should not have made it so easy for me to doomscroll on Mastodon, even if it makes Eugene Mastodon rich. (More relevant to Mastodon: if I scroll down using the arrow keys and then try scrolling down with the mouse, maybe the screen should not jump way back up? That would make me regret Mastodon less.)
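I have no idea what Mastodon's web client actually does under the hood, but the regret-reducing behaviour is easy to sketch in TypeScript: when keyboard navigation moves focus through the feed, don't let the browser yank the viewport around, and only scroll if the focused post is genuinely off screen.

```typescript
// Sketch: move keyboard focus through a feed without fighting the mouse.
// Assumes each post in the timeline is a focusable HTMLElement.
function focusFeedItem(item: HTMLElement): void {
  // preventScroll stops the browser from jumping the page back to the
  // newly focused element.
  item.focus({ preventScroll: true });

  // Only scroll if the post is actually outside the viewport, and even
  // then only as far as necessary.
  const rect = item.getBoundingClientRect();
  const offScreen = rect.bottom < 0 || rect.top > window.innerHeight;
  if (offScreen) {
    item.scrollIntoView({ block: "nearest", behavior: "smooth" });
  }
}
```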
I feel that lots of individual programmers and designers have the best interests of their users at heart, and it hurts their souls to implement these dark patterns. But I also feel that those programmers and designers tolerate their bad feelings enough to enjoy their paycheques, so they are not free of blame here. Unfortunately, there is no moral purity in this capitalist hellscape, and what begins as one minor decision against user interests eventually snowballs into Facebook hell.
The real problem is that any tech company that takes the "no regrets" principle seriously will lose out to competitors that don't, because in the tech world scaling is everything. Maybe a local bike shop can get by on good service, but the VC-powered world of Silicon Valley works differently.
Personally I regret most aspects of the Internet. I especially regret having Internet access at home. I have many addictions and very little control over any of them, which makes access to the Internet (and computers in general) very bad for me. But even a degenerate like me can see that some software platforms are worse than others and lead to more regret (hello, LinkedIn).
Is it possible to follow this no-regret rule at all? I don't know. We can probably come close. Watcamp is an events listing website. Some users might regret interacting with it because of FOMO (and ironically I am one of them), but for the most part it provides a simple service with no further tracking. People who are interested in learning about local events can visit the site (or subscribe to a feed) and get information about those events; people who are not interested can step away with no regrets. Similarly, my intention with Waterloo Region Votes was to provide non-partisan information about municipal elections. Again, this might have caused regret because there is so much information about so many races. But the site does not harvest data so that political parties can target you (hello, Elections Ontario/Canada), and it does not encourage you to doomscroll through all the content to generate more clicks. On the other hand, neither of these websites is commercial, so it is easy for me to say they aspire to few consumer regrets.