I have deeply mixed feelings about #ActivityPub's adoption of JSON-LD, as someone who's spent way too long dealing with it while building #Fedify.

Tags: fedify, jsonld, fedidev, activitypub
34 Posts, 15 Posters
  Sebastian Lasse

    @julian @mat

    We implemented this standard: you can create/describe your rooms [Place, `redaktor:fictional`], and the chessboard is just a geohash as described in the geosocial CG, so the use is the same, just `redaktor:fictional` too.
    You load the Collection of chess figures (pawn1 ...), can name them, they `Travel` over the chessboard, and the `Arrive` describes the `result`.
    As always, you can get very detailed with wikidata properties and entities, but bare AS Vocabulary is enough.
    In the end you have a Collection of the Travels, which is your played game, which you can replay or do whatever with.

    But you can still install immers - it is worth a try https://github.com/immers-space

    The reasons for its end are the same as for the gup.pe groups, and I hope people know about it …
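
As a rough sketch of that modeling (every ID, the `redaktor:` prefix URL, and the geohash below are made-up placeholders; only `Travel`, `Place`, and the Collection idea come from the AS Vocabulary as described above):

```typescript
// Hypothetical sketch of a single chess move as an AS2 Travel activity.
const move = {
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    { redaktor: "https://example.com/redaktor/ns#" }, // assumed prefix URL, not the real one
  ],
  type: "Travel",
  actor: "https://chess.example/games/42/pawn1",      // one figure from the game's Collection
  target: {
    type: "Place",
    "redaktor:fictional": true,
    name: "e4",                                 // the square; addressed as a geohash per the geosocial CG
    id: "https://chess.example/board/u33db2c",  // made-up geohash-based identifier
  },
  context: "https://chess.example/games/42",    // the played game: a Collection of these Travels
};
```

The matching `Arrive` would then carry the `result`, and replaying the game is just walking that Collection of Travels.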

    Matthew Exon (#13)

    @sl007 @julian I admit I didn't pay attention to immers at the time - I don't play games, not even chess. I was just using chess as an example, didn't mean to trigger anyone's trauma!

    Still, it kinda proves my point. You have to use standard AS vocabulary because Mastodon, and if you squint then sure, Travel and Arrive, why not? But given some of the conversations I've seen on this forum, I shudder to think how that would go down if you tried to get approval for that usage from "the community" first.

      Luke Kanies (#14)

      @hongminhee @jalefkowit huh. I’ve been pondering using it for some projects of mine, so this is good to know.

      Is it a fundamental problem with JSON-LD, such that it should just be avoided, or a problem with how ActivityPub uses it?

      And is there something else you’d recommend that fulfills the same goals?

        洪 民憙 (Hong Minhee) :nonbinary: (#15)

        @lkanies@hachyderm.io @jalefkowit@vmst.io To be honest, I'm not too sure myself. I just know that JSON-LD was originally planned as a foundation for the Semantic Web. I can only guess that if ontology is useful in a certain area, then JSON-LD would probably be useful there too.

          Evan Prodromou (#16)

          @hongminhee do you use the activitystrea.ms module from npm? It takes a lot of the pain out.

            洪 民憙 (Hong Minhee) :nonbinary: (#17)

            @evan@cosocial.ca I don't remember exactly, but I think I came across it while doing research before developing Fedify. I probably didn't use it because the TypeScript type definitions were missing. In the end, I ended up making something similar in Fedify anyway.

              Doug Webb (#18)

              @pintoch read this thread?

                kopper :colon_three: (#19)
                @hongminhee from the point of view of someone who is "maintaining" a JSON-LD processing fedi software and has implemented their own JSON-LD processing library (which is, to my knowledge, the fastest in its programming language), JSON-LD is pure overhead. there is nothing it allows for that can't be done with

                1. making fields which take multiple values explicit
                2. always using namespaces and letting HTTP compression take care of minimizing the transfer

                without JSON-LD, fedi software could use zero-ish-copy deserialization for a majority of their objects (when strings aren't escaped) through tools like serde_json and Cow<str>, or System.Text.Json.JsonDocument. JSON-LD processing effectively mandates a JSON node DOM (in the algorithms as standardized; you may be able to get rid of it with Clever Programming)

                additionally, due to JSON-LD 1.1 features like @type:@json, you cannot even fetch contexts in parallel, meaning all JSON-LD code has to be async (in the languages that have the concept), potentially losing out on significant optimizations that can't be done in coroutines for various reasons (e.g. C# async methods can't have ref structs, Rust async functions usually require thread safety due to tokio's prevalence, even if they're run in a single-threaded runtime)

                this is after context processing introduces a network dependency into the deserialization of data, wasting time and data in non-server cases (e.g. activitypub C2S). sure, you can cache individual contexts, but then the context can change underneath you, desynchronizing your cached context and, in the worst case, opening you up to security vulnerabilities

                json-ld is not my favorite part of this protocol
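
A rough sketch of what those two rules would buy (the `as:`/`toot:` key spellings and the Note shape are illustrative assumptions, not an agreed wire format): with arrays always explicit and prefixes always written out, an inbox payload could be consumed straight from `JSON.parse`, with no context fetching, no expansion pass, and no async requirement.

```typescript
// Sketch only: assumes a fixed profile where multi-valued properties are always
// arrays and extension terms always carry an explicit, well-known prefix.
interface PlainNote {
  id: string;
  type: string[];                // always an array, even for a single value
  "as:content": string[];
  "as:to": string[];
  "toot:sensitive"?: boolean[];  // extension property with its prefix written out
}

function readNote(body: string): PlainNote {
  // No @context processing, no document loader, no network I/O, no async.
  return JSON.parse(body) as PlainNote;
}
```
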
                  kopper :colon_three: (#20)
                  @hongminhee take this part with a grain of salt because my benchmarks for it are with dotNetRdf, which is the slowest C# implementation i know of (hence my replacement library), but JSON-LD processing is slower than RSA signature validation, which is one of the pain points around authorized fetch scalability

                  wetdry.world/@kopper/114678924693500011
                    kopper :colon_three: (#21)
                    @hongminhee if i can give one piece of advice to devs who want to process JSON-LD: don't bother compacting. you already know the schema you output (or you're just passing through what the user gives and it doesn't matter to you), so serialize directly to the compacted representation, and only run expansion on incoming data

                    expansion is the cheapest JSON-LD operation (since all other operations depend on it and run it internally anyhow), and this will get you all the compatibility benefits of JSON-LD with few downsides (beyond more annoying deserialization code, as you have to map the expanded representation onto your internal structure, which will likely be modeled after the compacted one)
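
With the jsonld.js npm package, for instance (used purely as an illustration; the payload and IDs are made up), that advice amounts to building outgoing activities directly in their compacted shape and running only `expand()` over whatever arrives:

```typescript
import jsonld from "jsonld"; // https://www.npmjs.com/package/jsonld

// Outgoing: no compaction call; you already know the schema you produce,
// so emit the compacted representation directly.
const outgoing = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Create",
  actor: "https://example.social/users/alice",   // made-up IDs
  object: { type: "Note", content: "hello" },
};

// Incoming: expansion is the only JSON-LD algorithm that runs.
async function normalizeIncoming(body: any) {
  const expanded = await jsonld.expand(body);
  // Every property now uses its full IRI regardless of the sender's @context;
  // map this onto your internal (compacted-shaped) model by hand.
  return expanded;
}
```
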
                      pancake :butterfly_: :neofox_lesbian: (#22)

                      @kopper@not-brain.d.on-t.work @hongminhee@hollo.social expansion is actually really annoying because the resulting JSON has annoyingly similar keys to look up in a hashmap, wasting cache lines and CPU time
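
An illustrative (made-up) expanded object shows the point: every property key shares the same long `https://www.w3.org/ns/activitystreams#` prefix, so comparisons only start to differ near the end of each key:

```typescript
// Illustrative expanded form of a small Note (not output from a real processor).
const expandedNote = {
  "@type": ["https://www.w3.org/ns/activitystreams#Note"],
  "https://www.w3.org/ns/activitystreams#content": [{ "@value": "hello" }],
  "https://www.w3.org/ns/activitystreams#published": [{ "@value": "2025-01-01T00:00:00Z" }],
  "https://www.w3.org/ns/activitystreams#attributedTo": [
    { "@id": "https://example.social/users/alice" },
  ],
};
```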

                        kopper :colon_three: (#23)
                        @natty @hongminhee i would imagine a Good hash algorithm wouldn't care about the similarity of the keys, no?
                          kopper :colon_three: (#24)
                          @hongminhee i put this in a quote but people reading the thread may also be interested: json-ld compaction does not really save that much bandwidth over having all the prefixes explicitly defined if you're gzipping (and you are gzipping, right? this is json. make sure your nginx gzip_types includes ld+json and activity+json)

                          RE:
                          not-brain.d.on-t.work/notes/aihftmbjpxdyb9k7
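
A quick way to check that claim (toy payloads and assumed key spellings; real activities are larger, which is where the effect actually shows up):

```typescript
import { gzipSync } from "node:zlib";

// The same toy Note twice: compacted against the remote AS2 context, and with
// the prefix defined inline and every key prefixed (illustrative spelling only).
const compacted = JSON.stringify({
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Note",
  content: "hello fedi",
  to: ["https://www.w3.org/ns/activitystreams#Public"],
});

const prefixed = JSON.stringify({
  "@context": { as: "https://www.w3.org/ns/activitystreams#" },
  "@type": "as:Note",
  "as:content": "hello fedi",
  "as:to": ["as:Public"],
});

// gzip folds the repeated prefixes (and repeated IRIs) away almost entirely,
// so the gzipped sizes come out close to each other.
console.log("compacted:", gzipSync(compacted).length, "bytes gzipped");
console.log("prefixed: ", gzipSync(prefixed).length, "bytes gzipped");
```
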
                            Sebastian Lasse (#25)

                            @kopper
                            @julian
                            @hongminhee

                            hm, we really need to differentiate between users' responsibility and devs' responsibility.

                            Not sure if Hong saw the draft about the AP kv thing; it supports either JSON-LD fields _or_ as:attachment / as:context …
                            wtf do I want to say.

                            user story:
                            We are working full-time on 2 major and 3 further projects, which are:
                            - federation of wikibase / wikidata
                            - federation of Public Broadcasters https://www.publicmediaalliance.org/public-broadcasters-create-public-spaces-incubator/
                            and these https://codeberg.org/Menschys/fedi-codebase

                            Let's say we want to federate a Country; then all the knowledge is sent in `attachment` with the fully qualified wikidata url in `context` [as:context - not @context ! - this is so confusing :)]
                            For example, the corresponding entries from the PressFreedomIndex `collection` (co-founder of freelens here 🙂)

                            But anyway, the idea about having
                            "wd": "https://www.wikidata.org/wiki/Special:EntityData/",
                            "wdt": "https://www.wikidata.org/prop/direct/" in the `@context` was that any user can consume and federate wikibase
                            incl.
                            🧵 1/2

                              Sebastian Lasse (#26)

                              @kopper @julian @hongminhee

                              incl.
                              - the properties in all the languages of the world
                              - the knowledge of the world in all the languages
                              - the wikidata relations and qualified statements including the nameMap etc. and all the urls to all wikiprojects incl. their languages and knowledge

                              How else could I say to other software: if they want all users' qualified data, use wikidata vocabulary?
                              wikipedia, wikidata, EBU, Public Broadcasters, taxi data: it is _all_ JSON-LD …
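
A sketch of the kind of object described in the previous two posts (the attachment shape, the property ID, and the entity are illustrative assumptions, not actual redaktor output):

```typescript
// Illustrative: a federated Country whose qualified knowledge rides along as
// wikidata-prefixed attachments, with the full entity URL in as:context.
const country = {
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    {
      wd: "https://www.wikidata.org/wiki/Special:EntityData/",
      wdt: "https://www.wikidata.org/prop/direct/",
    },
  ],
  type: "Place",
  name: "Norway",
  attachment: [
    {
      type: "PropertyValue",  // assumed Mastodon-style name/value attachment shape
      name: "wdt:P1552",      // hypothetical property ID, only here to show the prefix in use
      value: "wd:Q20",        // assumed to be the country's wikidata entity
    },
  ],
  // as:context (not @context!) carrying the fully qualified wikidata URL:
  context: "https://www.wikidata.org/wiki/Special:EntityData/Q20",
};
```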

                                kopper :colon_three: (#27)
                                @sl007 @hongminhee @julian i feel like you're falling into a trap i've seen a lot around AP spaces: just because the data can be contorted to represent something does not mean software will interpret it as such.

                                any software that wants to support wikidata statements and relations will have to go out of its way to implement that manually, with or without json-ld in the mix, and interoperability between those implementations will have to specify how that works. and in your specification you can indeed make it so Simply Linking to the wikidata json-ld (which i don't believe it provides out of the box; it does for xml and n-triples, if we're talking about rdf. if not, their bespoke json format is just as authoritative) can work (but i'd say using the Qxxx and Pxx IDs and letting the software figure out how to access it would be better!)

                                if you have the dream of making an as:Note and having its as:attributedTo be the wikidata entity for alan turing... sorry, nobody other than maybe your own software will even attempt interpreting that
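
Concretely, the kind of object being described would look something like this (a made-up example; syntactically fine, but nothing in the existing ecosystem will resolve that `attributedTo`):

```typescript
// A Note attributed to a wikidata entity (Q7251, Alan Turing): valid JSON-LD,
// but no current fedi software will fetch or render that entity as an actor.
const note = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Note",
  content: "On computable numbers…",
  attributedTo: "http://www.wikidata.org/entity/Q7251",
};
```
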
                                  kopper :colon_three: (#28)
                                  @hongminhee @sl007 @julian attempting to support this kind of "data contortion" (i made this up and prolly isn't the right way to describe this) would rapidly balloon the scope of every fedi software ever. i don't believe anyone would want to develop for such an ecosystem

                                  a similar example i saw was someone attempting to explain how you can partially inline an as:object you as:Like'd in order to specify that you only liked that past version of it, and that if it changed your like shouldn't count. without describing this exact scenario i don't believe any software, json-ld capable or not, would interpret that Like as such. same thing with the long-form text FEP, which attempts to support non-activitypub authors
                                    Sebastian Lasse (#29)

                                    @kopper @hongminhee @julian

                                    it is just damned simple: your AS client can do so much by asking wikidata, OSM, or federated geocoding instead of our system. When you use a property for the first time, the client can cache its names in the languages of the user, etc.

                                      kopper :colon_three: (#30)
                                      @sl007 @hongminhee @julian i genuinely can't see where json-ld is relevant here. if your client wants to support wikidata and OSM then it can do that with or without json-ld being involved. you are going to have to document how this integration works anyhow if you want anyone else to do so
                                        mcc (#31)

                                        @hongminhee How hard would it be for a future version of ActivityPub to simply back out JSON-LD support? Would there be a downside to this?

                                          洪 民憙 (Hong Minhee) :nonbinary: (#32)

                                          @mcc@mastodon.social I'm not sure, but that would break some ActivityPub implementations relying on JSON-LD processors. 🤔
