Just make it an LJM (Large JSON Model) capable of predicting the next JSON token from the previous JSON tokens, and you would have massive savings in file storage and network traffic from not having to store and transmit full JSON documents, all in exchange for an “acceptable” error rate.
Maybe it’s time we invent JPUs (JSON processing units) to level the playing field.
The latest Nvidia co-processor can perform 60 million curly brace instructions per second.
Finally, something to process “databases” that ditched excel for json!
60 million CLOPS? No way!
Until then, we have simdjson https://github.com/simdjson/simdjson
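And simdjson really does chew through gigabytes of fully correct JSON per second on a boring CPU, no JPU required. The snippet below is roughly the quickstart from the project’s README, shown as a sketch rather than gospel — it assumes a local `twitter.json` file containing a `search_metadata.count` field:

```cpp
#include <iostream>
#include "simdjson.h"
using namespace simdjson;

int main() {
    ondemand::parser parser;
    // Load a JSON file into padded memory (file name is the README's example, swap in your own).
    padded_string json = padded_string::load("twitter.json");
    // Lazily iterate the document and pull out one numeric field.
    ondemand::document tweets = parser.iterate(json);
    std::cout << uint64_t(tweets["search_metadata"]["count"]) << " results." << std::endl;
}
```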
JSON and the Argonaut RISC processors
The best I can do is an ML model running on an NPU that parses JSON in subtly wrong and impossible-to-debug ways
So you’re saying it’s already feature complete with most json libraries out there?
Did you know? By indiscriminately removing every 3rd letter, you can ethically decrease input size by up to 33%!
Hardware accelerated JSON Markov chain operations when?
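Until the JPU tapes out, here’s a software fallback in the same spirit: a toy first-order Markov chain over single-character JSON “tokens”. Everything in it (the training string, the RNG seed, the 40-step cutoff) is invented for illustration, and producing valid JSON is strictly optional — that’s the “acceptable” error rate at work:

```cpp
// Toy "LJM": a first-order Markov chain over single-character JSON tokens.
#include <iostream>
#include <map>
#include <random>
#include <string>
#include <vector>

int main() {
    // "Training corpus": a made-up JSON document.
    std::string corpus = R"({"a":1,"b":[true,null],"c":{"d":"e"}})";

    // Count character-to-character transitions.
    std::map<char, std::map<char, int>> transitions;
    for (size_t i = 0; i + 1 < corpus.size(); ++i)
        ++transitions[corpus[i]][corpus[i + 1]];

    // "Predict" a document by sampling the chain, starting from '{'.
    std::mt19937 rng{42};
    char current = '{';
    std::string generated{current};
    for (int step = 0; step < 40 && transitions.count(current); ++step) {
        auto& next = transitions[current];
        std::vector<char> chars;
        std::vector<int> weights;
        for (auto& [c, n] : next) { chars.push_back(c); weights.push_back(n); }
        std::discrete_distribution<size_t> pick(weights.begin(), weights.end());
        current = chars[pick(rng)];
        generated += current;
    }
    std::cout << generated << "\n";  // Valid JSON not guaranteed. That's the point.
}
```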