• Phoenixz@lemmy.ca
    1 year ago

    Just give me plain UTF-32 with ~4 billion code points; that really should be enough for any symbol we can come up with. Give everything its own code point, no bullshit with combined glyphs that make text processing a nightmare. I need to be able to do a strlen on either byte length or number of characters without the CPU spending minutes counting each individual character.
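    For what it's worth, here's a minimal Rust sketch of the mismatch I mean (the strings are just example values):

    ```rust
    // Byte length vs. code-point count: in UTF-8 counting code points
    // is an O(n) scan; in a fixed-width encoding like UTF-32 it would
    // just be byte_length / 4.
    fn main() {
        let precomposed = "é";       // U+00E9, a single code point
        let combining = "e\u{0301}"; // 'e' + U+0301 COMBINING ACUTE ACCENT, two code points

        println!("{} bytes, {} code points",
                 precomposed.len(), precomposed.chars().count()); // 2 bytes, 1 code point
        println!("{} bytes, {} code points",
                 combining.len(), combining.chars().count());     // 3 bytes, 2 code points

        // Both render as "é", so even a code-point strlen disagrees with
        // what a human would count — the combined-glyph problem above.
    }
    ```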

    I think Unicode started as a great idea and then kind of blundered into aimless "everybody kinda does what everyone wants" territory. Unicode is for humans, sure, but we shouldn't forget that computers actually have to do the work.