
For reference, it looks like this on Safari v9.0.2 on OS X v10.11.2 'El Capitan':

http://f.cl.ly/items/0E182U1p3r430M073w2i/braille.png

It seems that OS X’s text rendering system substitutes Apple Braille Regular for the braille characters (U+28E3, etc.), picking it over the other installed fonts that also have glyphs for those codepoints: Apple Symbols, Everson Mono (a font I installed myself), and Symbola (also a font I installed myself).

As an aside (and yes, I’m copying part of a post I made more than a year ago¹; I’m still interested in knowing the answer to this!), I’m pretty interested in how OS X and Windows decide which font to use when there are multiple fonts installed containing the required glyph. For example, I have two other fonts on my OS X system that have a glyph for U+2705 (Everson Mono and Symbola), but OS X consistently picks Apple Color Emoji’s glyph. Maybe OS X’s text rendering system goes through the fonts in alphabetical order and uses the first one it finds containing the required glyph? It would be great if end users could have a bit more control over the font substitution process. I know it’s possible in some text editors like Emacs², but I believe programs like that use their own text rendering systems instead of the one supplied by the OS (could be wrong though).
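To make the guess concrete, here's a minimal sketch (in Python) of the hypothesized lookup: scan installed fonts in alphabetical order and return the first one whose character map covers the codepoint. The font names and coverage sets below are made up for illustration; real systems read coverage from each font's cmap table and weight some fonts (like the system emoji font) above others.

```python
# Hypothetical fallback: font name -> set of covered codepoints.
# Coverage sets here are invented for the sketch, not real cmap data.
installed_fonts = {
    "Apple Color Emoji": {0x2705, 0x1F600},
    "Everson Mono": {0x2705, 0x28E3},
    "Symbola": {0x2705, 0x28E3},
}

def fallback_font(codepoint, fonts):
    """Return the first font, in alphabetical order, covering codepoint."""
    for name in sorted(fonts):            # alphabetical order, per the guess
        if codepoint in fonts[name]:      # font's cmap covers the codepoint
            return name
    return None  # no installed font has a glyph; renderer shows .notdef

print(fallback_font(0x2705, installed_fonts))  # "Apple Color Emoji" sorts first
```

Under this (probably too simple) model, "Apple Color Emoji" wins for U+2705 purely because it sorts first alphabetically, which happens to match the observed behavior, so it can't distinguish alphabetical ordering from a deliberate preference for the system emoji font.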

――――――

¹ — https://news.ycombinator.com/item?id=8865067

² — http://stackoverflow.com/questions/6491202/overriding-emacs-...
