• RonSijm@programming.dev · 10 days ago

    I haven’t used json(b) in a Spring app, so I can’t say much about that.

    json vs jsonb depends on the use case. Inserting json is faster than inserting jsonb, because json is stored as raw text. For reads that filter on specific json properties, jsonb is faster, because jsonb is parsed at write time into a more optimized binary tree.
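
    A minimal sketch of the difference, with made-up table and property names:

    ```sql
    -- json keeps the raw text; jsonb parses it into a binary format on write
    CREATE TABLE events (
        id     bigserial PRIMARY KEY,
        raw    json,  -- cheap to insert, re-parsed on every read
        parsed jsonb  -- parsed once at insert, fast to search
    );

    -- jsonb supports GIN indexes and containment operators that plain json lacks
    CREATE INDEX idx_events_parsed ON events USING gin (parsed);

    -- This containment query can use the GIN index
    SELECT id FROM events WHERE parsed @> '{"status": "failed"}';
    ```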

    From my experience, I don’t really like doing selects based on json properties. If I know I’ll be selecting on a certain property, I usually add an additional column next to the json and insert that property there (at least in C#/.NET with EF). The frameworks don’t have that much support for selecting within json: you can do it, it’s just a lot more natively supported to use proper columns. A database-side sketch of the same idea is below.
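
    One way to keep such a column in sync at the database level (rather than in application code, as EF does) is a generated column; a minimal sketch, assuming Postgres 12+ and hypothetical table and property names:

    ```sql
    -- Hypothetical table: extract a frequently-queried property into a real column
    CREATE TABLE orders (
        id          bigserial PRIMARY KEY,
        payload     jsonb NOT NULL,
        -- generated column, kept in sync with the json automatically
        customer_id text GENERATED ALWAYS AS (payload ->> 'customerId') STORED
    );

    CREATE INDEX idx_orders_customer ON orders (customer_id);

    -- Selects hit a plain indexed column instead of digging into the json
    SELECT id FROM orders WHERE customer_id = '42';
    ```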

  • db0@lemmy.dbzer0.com · 10 days ago

    I am using it in the AI Horde for mostly immutable fields which I don’t need to search often (although sometimes I might use them for DB filtering).

    They work well in the way I use them, no complaints so far.

  • Loup@programming.dev · 10 days ago

    I had a use case for something similar to this in the past and it worked pretty well. Data projections were stored as json payloads in jsonb columns and served through an API.

    The tables contained the appropriate key(s) and the jsonb column only. The majority of the json data was updated using the functions that PostgreSQL provides to update specific sections of the document (see the sketch below). There were a few tables where the value of the jsonb column was replaced completely instead of updating a specific segment. At least one table had the jsonb column changed to a json column because it was replaced so often and never searched.
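
    jsonb_set is the usual PostgreSQL function for these partial updates; a minimal sketch, with hypothetical table, key, and path names:

    ```sql
    -- Update one section of the document in place (jsonb_set works on jsonb only)
    UPDATE projections
    SET payload = jsonb_set(payload, '{address,city}', '"Berlin"')
    WHERE projection_key = 'abc-123';

    -- A full replacement works on json and jsonb alike, which is why a
    -- replace-only, never-searched column can be plain json
    UPDATE projections
    SET payload = '{"address": {"city": "Berlin"}}'
    WHERE projection_key = 'abc-123';
    ```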

    This was done in a Spring Boot application with jOOQ. At the time, the json and jsonb data types were not supported by jOOQ, so I had to build that piece myself, but it was worth it. This may be supported out of the box now.

    I would take the same approach again in the future.

  • Galactic@programming.dev · 9 days ago

    I have used jsonb in production and it is plenty fast for general-purpose workloads. If you need high performance you will want to stick with a normalized schema, but in general jsonb is pretty slick and highly performant compared with things like MongoDB.

    With that said, if you find yourself writing lots of queries against jsonb properties, you should really reconsider using it. Dumping denormalized data into a single blob is going to make your life really painful in a few years, when you have giant piles of highly differentiated data. Postgres will happily let you do it, and I don’t know your use case, but think carefully about the tech debt you may be incurring by using this over the long term.