Maybe it's time we invented JPUs (JSON Processing Units) to level the playing field.
Well, do you have dedicated JSON hardware?
Everybody gangsta till we invent hardware-accelerated JSON parsing.
Render the JSON as polygons?
That is sometimes the issue when your code editor is a web browser in disguise 😅
There are SIMD-accelerated JSON decoders.
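For instance, simdjson scans JSON at gigabytes per second by doing the structural indexing with SIMD instructions and decoding values lazily. A minimal sketch along the lines of its README (the file name is a placeholder):

```cpp
#include <iostream>
#include "simdjson.h"  // https://github.com/simdjson/simdjson

int main() {
    simdjson::ondemand::parser parser;
    // padded_string::load reads the file and appends the padding
    // the SIMD kernels require; "big.json" is a placeholder path.
    simdjson::padded_string json = simdjson::padded_string::load("big.json");
    // iterate() builds the structural index; fields are only decoded
    // when you actually access them through `doc`.
    simdjson::ondemand::document doc = parser.iterate(json);
    std::cout << "parsed " << json.size() << " bytes of JSON\n";
    return 0;
}
```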
CPU vs GPU tasks, I suppose.
Would you rather have 100,000 kg of tasty supreme pizza, or 200 kg of steaming manure?
Choose wisely.
I have the same problem with XML too. Notepad++ has a plugin that can format a 50 MB XML file in a few seconds, but my current client won't allow plugins to be installed. So I have to use VS Code, which chokes on anything bigger than what I could format by hand if I were determined.
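If plugins are off the table but a standalone tool isn't, a tiny pugixml-based reformatter handles files that size easily. A sketch, assuming placeholder input/output paths:

```cpp
#include <iostream>
#include "pugixml.hpp"  // https://pugixml.org

int main() {
    pugi::xml_document doc;
    // Load the whole file into a DOM; pugixml copes with 50 MB inputs fine.
    pugi::xml_parse_result result = doc.load_file("big.xml");
    if (!result) {
        std::cerr << "parse error: " << result.description() << '\n';
        return 1;
    }
    // Re-serialize pretty-printed with two-space indentation.
    doc.save_file("big.pretty.xml", "  ");
    return 0;
}
```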
Someone just needs to make a GPU-accelerated JSON decoder.
Works fine in vim.
Reject MB, embrace MiB.
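For the unit-pedantic: MB is decimal (10^6 bytes) while MiB is binary (2^20 = 1,048,576 bytes), so the 4.2 MB file mentioned elsewhere in the thread is only about 4.01 MiB. A quick check:

```cpp
#include <cstdio>

int main() {
    // MB is decimal (10^6 bytes); MiB is binary (2^20 bytes).
    constexpr double file_bytes = 4.2e6;            // 4.2 MB
    constexpr double mib        = 1024.0 * 1024.0;  // 1 MiB = 1,048,576 bytes
    std::printf("4.2 MB = %.2f MiB\n", file_bytes / mib);  // prints 4.01
    return 0;
}
```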
Rockstar making GTA Online be like: "Computer, here is a 512 MB JSON file. Please download it from the server and then do nothing with it."
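Context: the notorious GTA Online load-time bug was reportedly a roughly 10 MB JSON blob parsed with sscanf in a loop. On common C libraries, sscanf performs an implicit strlen of the entire remaining buffer on every call, so the parse goes quadratic. A hypothetical minimal reproduction of the pitfall, not Rockstar's actual code:

```cpp
#include <cstdio>

// Quadratic-parse pitfall: each sscanf call must find the end of the
// string it is given (an implicit strlen), so reading N tokens out of
// an N-byte buffer costs O(N^2) instead of O(N).
void parse_all_tokens(const char* buf) {
    char token[256];
    int consumed = 0;
    const char* p = buf;
    while (std::sscanf(p, "%255s%n", token, &consumed) == 1) {
        p += consumed;  // advance past the token just read
        // ... handle token ...
    }
}
```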
You jest, but I asked for a similar (but much simpler) vector/polygon model, and it generated it.
Let it be known that heat death is not the last event in the universe.
The obvious solution is parsing JSON with GPUs? Maybe not...
C++ vs JavaScript
I've never had any problems with 4.2 MB (and bigger) JSON files. Which languages/libraries/editors choke on it?
Given that it's the CPU limiting the parsing of the file, I wonder how a GPU-based editor like Zed would handle it.
I've been wanting to test out the editor ever since it was partially open-sourced, but I'm too lazy to get around to doing it.