Yes, “public” is quite nuanced, and always has been. May we rephrase its inverse as “our expectation of privacy”? Consider a social setting well pre-dating the internet: we used to have a reasonable expectation of privacy, but no guarantees.
At a party, meeting with friends, the anecdotes you share are likely to spread further through a friends-of-friends (gossip) network. There’s no controlling how that information spreads. You might make a moral/ethical appeal to friendship and say “This is personal, so don’t tell anyone else”. Still no guarantees, but you have stated your intent and expectation of privacy. And you may be able to take action when you observe a breach of trust (e.g. “unfriend” someone).
Here we find the equivalent of ODRL and Mastodon’s `discoverable` flag: statements of our expectations, and a moral appeal to honor them.
If you want to be more serious about it, e.g. in a business setting, you might ask people to sign a non-disclosure agreement before sharing information. A betrayal of trust now has legal ramifications, raising the barrier to commit one. Still, you may be betrayed in secret. There are never full guarantees.
Now, if you add internet technologies to this picture, the notion of “public” gets far more complex, and innocuous communications may suddenly have huge ramifications and impact (e.g. think of victims of sexting). What you say in any context may suddenly be plastered before a global audience, and remain out there for years for all to see.
Any expectation of privacy is truly out the door. Anything we say is transferred through a range of technology platforms that serve as our talking and hearing aids: tools with unknown side-effects, which continuously change in how they work. E.g. by setting Mastodon’s `discoverable` flag under the hood, we implicitly defer to their app platform to define what that means in practice. When using a service we agree to be bound by its privacy policy, and the intricate network of privacy policies it relates to through 3rd parties. Only lawyers might analyse the legalese and conclude what level of privacy we are entitled to by accepting such policies.
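For concreteness, this is roughly what that flag looks like in the actor document Mastodon publishes over ActivityPub — `discoverable` is a Mastodon extension under the `toot:` namespace, not part of core ActivityStreams (the account name and domain below are made up):

```json
{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    {
      "toot": "http://joinmastodon.org/ns#",
      "discoverable": "toot:discoverable"
    }
  ],
  "id": "https://example.social/users/alice",
  "type": "Person",
  "preferredUsername": "alice",
  "discoverable": false
}
```

Note that the flag is a single boolean: whatever nuance the user intended collapses into “may this profile be surfaced in discovery features, yes/no”, and each consuming app decides for itself what that implies.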
In real life, privacy is seriously eroded too. You sit on a terrace with your friends, while your mobile phones gather metadata about the meeting, and strangers take photographs with you in the frame, which are uploaded to FB and Instagram for facial recognition. Etcetera: surveillance capitalism is ubiquitous.
Observations:
- By communicating online we have deferred our privacy to online platforms, beyond our control.
- Hard guarantees don’t exist. If you want the highest level of control, don’t put your info online.
How to increase assurances and trust that information processing meets our expectation of privacy?
1. By being very explicit about what our expectations are.
2. By information platforms working to meet those expectations.
3. By gaining insight into the degree to which information platforms meet our expectations.
You might say that 1) constitutes our aspiration, and 2) boils down to enforcement, which leads to 3) providing informed consent.
ODRL and Mastodon’s `discoverable` flag are ways to express our aspirations w.r.t. privacy, whereby…
- ODRL offers a fine-grained, standardized means to do so,
- while `discoverable` is an ad-hoc, app-specific way.
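To illustrate the fine-grained end of that spectrum, here is a minimal sketch of a privacy expectation as an ODRL policy in JSON-LD. The target URL and policy `uid` are hypothetical; `read` and `distribute` are actions from the ODRL Common Vocabulary:

```json
{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "@type": "Set",
  "uid": "https://example.social/policy/42",
  "permission": [{
    "target": "https://example.social/users/alice/statuses/1",
    "action": "read"
  }],
  "prohibition": [{
    "target": "https://example.social/users/alice/statuses/1",
    "action": "distribute"
  }]
}
```

I.e. “anyone may read this post, but nobody may redistribute it”. Like the moral appeal at the party, the policy itself enforces nothing; it is a machine-readable statement of intent that platforms can choose to honor.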
With such mechanisms to express aspirations in place, the enforcement in point 2) boils down to…
- Implementing effective ways (functionality) to honor our aspiration.
- Technical verification by independent parties that this is done in a meaningful way.
The most tricky part is in providing informed consent. A non-technical audience should be able to…
- Choose information platforms with confidence.
- Learn about potential impact of their actions in an intuitive manner.
F-Droid is an app store that comes with a baked-in level of enforcement, giving a non-technical audience confidence that the apps they install do not contain adware and the like.
Individual FOSS project websites try to convey assurances about the level of privacy and trust that can be expected, and if that isn’t accurate the Free Software movement will be very vocal about it, hopefully in ways that educate the broader public.
For in-app activity, being properly informed about the impact of one’s actions is to a large extent a UX issue. Beyond that, a certain level of digital literacy may be expected from the user. And cultural norms (netiquette) may contribute a bit to the level of trust (the Fediverse currently has a higher level of netiquette than e.g. the birdsite).
Regardless of what we do, and of efforts to define our Right to Privacy in an online age, we don’t have hard guarantees. What’s public or not will remain nuanced, complex, and partly in the “eye of the beholder”.
As for “Where do ‘we’ discuss this?”, I guess that boils down to: everywhere.
In each context where it is discussed we can contribute from a different perspective relevant to that context.