I agree. I haven’t seen anyone suggesting this is not the case. However, it’s not clear to me that JSON-LD itself is the primary cause of these issues in a scenario where a developer has decided to do plain JSON processing with little or no extension to existing contexts (like a Mastodon interop scenario).
At the beginning of this thread, you wrote:
The information described by the bullet points is valuable for either plain JSON or JSON-LD processing. It doesn’t require the spec changes that you are suggesting. I understand some people think JSON-LD contexts provide semantic information, but they don’t. They also don’t provide much data schema shape and constraint information, especially in the AS2 case. In those senses, the list above is not an alternative to capabilities (not) provided by JSON-LD. I believe the community would benefit from the approach you describe, today, even without an AP/AS2 specification reinterpretation.
I like “Interoperability Profile” better than “Compliance Profile”, but that’s just my preference. I also think that the word “extension” is adding to confusion on this topic (not just in this thread). Are you talking specifically about an AS2 “vocabulary extension”? Or are you talking about behavior extensions, in general, which could be related to non-extended AS2 messages? For example, non-extended Add/Remove messages could be used to implement some Profile-specific Collection management with Profile-specific behavioral requirements, side effects, and so on. A Profile’s specified behavior will typically restrict AS2 more than it extends it (only some types are supported, only some properties are supported on those types, some properties are required rather than optional, some property values can only be scalars and not arrays, etc.).
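The kind of restriction described above could be sketched in code. This is a minimal Python sketch, not from any published profile: the allowed types, required properties, and scalar-only rules are hypothetical examples of how a Profile might restrict AS2 rather than extend it.

```python
# Hypothetical Profile restrictions on plain AS2 objects.
# The property names are real AS2; the specific rules are invented examples.

ALLOWED_TYPES = {"Note", "Article"}                    # only some types supported
REQUIRED_PROPS = {"type", "content", "attributedTo"}   # required rather than optional
SCALAR_ONLY = {"attributedTo"}                         # scalar values only, no arrays

def check_profile(obj: dict) -> list[str]:
    """Return a list of profile violations (empty list means conformant)."""
    errors = []
    if obj.get("type") not in ALLOWED_TYPES:
        errors.append(f"unsupported type: {obj.get('type')!r}")
    for prop in REQUIRED_PROPS - obj.keys():
        errors.append(f"missing required property: {prop}")
    for prop in SCALAR_ONLY:
        if isinstance(obj.get(prop), list):
            errors.append(f"{prop} must be a single value, not an array")
    return errors

note = {"type": "Note", "content": "hi", "attributedTo": ["a", "b"]}
print(check_profile(note))
# -> ['attributedTo must be a single value, not an array']
```

Note that every rule here is a *restriction* on what AS2 already allows; a conformant message is still a perfectly ordinary AS2 message.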
I am not really saying spec changes are required. I am suggesting a different way to consider the specs, and perhaps adding an additional mechanism (e.g. JSON Schemas). But exactly how this is done should be discussed among current implementers faced with bad DX, thinking of the needs of future implementers who will love fedi’s DX.
“Extension” is the word we currently use without a clear definition, and we can surely find better terminology. When I say “extension” I mean an “AP extension”: any combination of vocabulary + behavior. With regard to behavior I have often mentioned “message exchange patterns”, but if you have a vocabulary extension, e.g. additional properties, those properties may also have particular business logic associated with them.
In that light I don’t think it is valuable to distinguish between vocab-only extensions and behavior extensions. An extension is an extension… one might say: given that the protocol is pluggable, extensions add additional specs for particular use cases. Hence, by definition, they extend the spec. You must take them into account to interoperate with an extended endpoint. But a whole different terminology may be chosen.
So given the definition above, this is an extension (to the spec) even though it may restrict the behavior of the endpoint.
I think I’ll try to use the term “Interoperability Profile”, for now, since “extension” in an AP/AS2 discussion is typically used to describe @context additions beyond the normative AS2 context. Calling an Interoperability Profile that doesn’t have a vocabulary extension (behavior-only and maybe restriction-only) an “extension” is probably going to lead to confusion in related discussions.
I’ll need to go back and reread what you’ve written then. Some aspects of your proposal are not clear to me. I might be confused by the conflation of JSON Schema and JSON-LD topics, the mention of a “Linked Data profile” (I’m not exactly sure what this is), non-Linked Data extension mechanisms (also useful/needed for Linked Data apps), and so on.
We won’t know until we have some concrete examples, but I’d expect an “interoperability profile” to include information like which, if any, FEPs are implemented. Any application can decide to support any number of interoperability profiles. I wouldn’t call that an interoperability profile itself since it’s about a specific application’s implementation choices.
Yes, we can do that now and most major implementations are consistent with that JSON-only or JSON-first perspective. I’m struggling to understand what’s different with what you’re suggesting.
This is needed with either a JSON or JSON-LD perspective. I’ve made that claim before. I’ll stop repeating myself now unless there’s a counterpoint.
JSON Schema is useful for either JSON or JSON-LD. The purpose of JSON-LD is to support mapping between a JSON data structure and Linked Data (RDF). It has very limited message format definition support or message validation support.
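To illustrate that point: at its core, a JSON-LD context is a term-to-IRI mapping. The deliberately naive sketch below shows only that mapping. Real JSON-LD processors do far more (`@type` coercion, nested contexts, compaction, etc.), and the `expand` function here is a toy illustration, not a conformant algorithm.

```python
# A context is essentially a term -> IRI map; it says nothing about
# value shapes or constraints. (Toy illustration, not real JSON-LD expansion.)

AS2_CONTEXT = {
    "summary": "https://www.w3.org/ns/activitystreams#summary",
    "content": "https://www.w3.org/ns/activitystreams#content",
}

def expand(doc: dict, context: dict) -> dict:
    """Replace known terms with their IRIs; keys only, no validation at all."""
    return {context.get(k, k): v for k, v in doc.items() if k != "@context"}

note = {"@context": "...", "summary": "A note", "content": 42}
print(expand(note, AS2_CONTEXT))
# "Expansion" succeeds even though "content" is a number rather than a
# string: the context carries no schema shape or constraint information.
```

This is why JSON Schema (or similar) remains useful even for a JSON-LD consumer: the mapping to RDF and the validation of message shape are separate concerns.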
The same as now.
My recommendation is:
(Given that AP is intended to be compatible with JSON and JSON-LD)
Educate developers about best practices for JSON-only/JSON-first development with AP.
Develop interoperability profiles for specific applications and crosscutting features.
Include FEP references, message schemas, documentation, JSON-LD context extensions, etc. in the profile.
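As a rough illustration of what such a bundled profile might contain, here is a hypothetical manifest. Every name, identifier, file path, and URL below is invented for illustration; no such manifest format has been specified.

```python
# Hypothetical "interoperability profile" manifest bundling the items
# listed above. All identifiers and paths are illustrative placeholders.
import json

profile = {
    "name": "example-microblog-profile",        # hypothetical profile name
    "feps": ["fep-xxxx"],                       # FEPs implemented (placeholder id)
    "messageSchemas": ["schemas/note.json"],    # JSON Schema files (illustrative)
    "contexts": ["context/extension.jsonld"],   # JSON-LD context extensions, if any
    "documentation": "https://example.org/profile/docs",
}
print(json.dumps(profile, indent=2))
```

An application could then declare which such profiles it supports, which keeps the profile itself separate from any one application’s implementation choices.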
I think we’re mostly in agreement on what needs to be done, just not the specific reasons for it.
I also think much (not all) of the criticism of AP JSON-LD is focused on the wrong and/or secondary issues. The thread you quote, for example, is focused on JSON-LD usage in the context of the major implementations (“800 lb gorillas”), but those applications all do exclusively JSON processing (or ~100% is JSON processing, in Mastodon’s case), so the relevance of JSON-LD processing-related criticism is questionable, IMO.
The Linked Data discussion keeps us occupied. On chat there was a discussion about focusing on the biggest pain points, triggered by @hrefna musing about Linked Data.
My reaction this morning, with a bit of ranty frustration, on what I see as the biggest pain point:
Technology adoption lifecycle of AS/AP is nearly stuck.
Besides coordinating / collaborating in a grassroots movement, the prominent role of LD is stifling, imho. On chat I called it “PITA delivery”:
To me the biggest pain point is that today there’s no real interop facilitated by the standards, other than “I copy a bit of what you do, add a bit I found elsewhere, and sprinkle some of my own fairy dust on top”. This hampers evolution and limits potential while tech debt accrues. I called it the “Fixing the broken technology adoption lifecycle” challenge in my notes.
Fediverse is becoming popular, right? But how much of that is due to an app, Mastodon, “crossing the chasm”, rather than ActivityPub?
The original dream/promise of LD was to make life easier by connecting all the things together. Define your own data/information/type model in a human-readable format. “Don’t worry, the machines can read it too, and it will all fit into one big interwoven tapestry on which we dance and celebrate mankind.” It didn’t turn out that way, and 24 years later we see some uptake in academic circles and new AI fields, while people are still working on complex standards to meet the original promise.
As I mentioned before, I still like the ideas of LD and would love to see us go to the moon with it and venture into the galaxy. But as the primary extension mechanism for this particular AP protocol, it may not be the best choice. LD support? Perfect. AP on LD life support? Umm… not a fan.
Linked Data is the footgun of AS/AP…
Update:
As I said before, I deliberately chose to volunteer most of my time on the community and advocacy side of the Fediverse. To me it is the future potential where things get really interesting. And in that future, the decentralized protocol and technology ecosystem needs to be easily accessible to a broad range of people, not a small elite group of experts who have managed to eat through all the complexity with grit and pure dedication and come out victorious.
With that in mind, I am seriously considering taking a step back from this focus on AS/AP. After all, the protocol should be just an implementation detail in a delightfully decentralized, interwoven social networking environment. My vision is a Peopleverse (social) on top of a Fediverse (technical), and hence considering the “Social Web” in its entirety may make more sense to me.