Adding a font that should support special characters to a website doesn't display them

New Here, Mar 17, 2022


Hello, 

I'm talking about the Degular font. Degular should support special characters (Czech, in my case), but even though I added the font to my website (WordPress), some special characters (č, ě, ř ...) are not displayed in Degular.

In the settings of the web project, I have ticked that the font should support these characters.

I have tried loading the font in the header and in the CSS using @import; neither of these works as expected.

I tried this:

[screenshot of first attempt: honzab82122210_0-1647522263098.png]

and this: 

[screenshot of second attempt: honzab82122210_1-1647522290338.png]

I tried both options with and without OpenType Features.

Thank you in advance for any help.
 

Views: 122

Adobe Community Professional, Mar 29, 2022


Have you tried using the Unicode character codes?

https://www.w3schools.com/charsets/ref_html_utf8.asp
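For instance, the Czech letters in question have standard Unicode code points and can be written directly in HTML as numeric character references (a minimal illustration, in both decimal and hexadecimal forms):

```html
<!-- č is U+010D, ě is U+011B, ř is U+0159 -->
<p>&#269; &#283; &#345;</p>       <!-- decimal references -->
<p>&#x10D; &#x11B; &#x159;</p>    <!-- hexadecimal references -->
```

Note that character references only sidestep file-encoding issues; if the font in use lacks the glyphs, the browser will still fall back to another font for those letters.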

Adobe Community Professional, Apr 10, 2022


Does it work with other fonts, like Arial, Times New Roman, or similar? There are always two possibilities:

  1. The font does not work as advertised (improbable, but possible).
  2. The implementation on the website is not correct (probable and possible).
ABAMBO | Hard- and Software Engineer | Photographer

Adobe Community Professional, Jul 13, 2022


This may not be a satisfactory answer. But hopefully it explains why Czech and Slovak characters may not appear correctly on your WordPress website.


It most likely has to do with a low-level technology standard that most people never worry about: text encoding.


I have Czech and Slovak colleagues who experience the same problem online every day. One colleague – whose name includes the Slovak character »Ľ« – has simply given up, and uses the »L« + »’« character combination instead. For those of us who have to work with Native American languages like Navajo and Apache – combining letters with multiple accents – reliably typesetting text is even more difficult.


Back in the early days of computing and the internet, there were severe limitations on power, memory, and networking. That meant that anything that wasn’t absolutely necessary – such as accented letters – was left out of early text encoding standards like ASCII (1963). When the later 8-bit ASCII and ANSI standards were introduced, they still only covered characters commonly used in Western European languages. Think of this as a legacy of the Cold War.


When the Cold War ended around 1990, most computers sold worldwide had limited support for specific encoding standards and languages. That’s one reason why Unicode was developed in the early 1990s: to support all languages in a single, universal text encoding standard.


To achieve this, Unicode divides up the Latin alphabet into sets of characters. Since the Latin alphabet is used by so many languages, each character set couldn’t possibly include every variation of every letter used by every language. The result: the Latin character sets look remarkably like their historical predecessors. The two most important Latin sets are:


• Basic Latin;
• Extended Latin.


Basic Latin covers letters – both regular and accented – that were supported by the older ASCII and ANSI encoding standards. That is: support for most Western European languages.


Extended Latin covers accented and specialized letters used by other languages. That is: support for Central European, Eastern European, Asian, African, and other languages that use the Latin alphabet.


What does that mean for languages like Czech and Slovak? All regular letters – and some accented letters – are part of the Basic Latin set. But certain accented letters specifically used in Czech and Slovak – like č, ě, ľ, and ř – are part of the Extended Latin set.


When type designers create typefaces, they have a choice of which languages to support. Inevitably, many historical and older typefaces included support for only the Basic Latin set, and thus only Western European languages. In the world of OpenType fonts, most of these designs are normally labelled as ‘Std’ (Standard) fonts.


But when a type designer decides to produce fonts that also support parts of the Extended Latin set, these designs are normally labelled as ‘Pro’ (Professional) fonts. Which languages and characters an OpenType Pro font specifically supports is entirely up to the type designer: some Pro fonts even include support for languages that use the Greek and Cyrillic alphabets.


Even though the original Cold War is over, its historical consequences still affect us today. And with font standards, that means the long-gone Iron Curtain is still an unfortunate and clear marker between which languages are considered ‘normal’ and which ‘special’.


Before Unicode became established as the default encoding standard, many fonts limited to Basic Latin character support had already been released. And the World Wide Web already existed. Even today, many popular applications and web browsers have settings that assume that the default text encoding of files is something other than Unicode.


But as time passes by, more and more typefaces are released with extensive character support. Many typefaces on Adobe Fonts – like James Edmondson’s Degular family – reflect this. The long-term hope is that the challenges that you face today typesetting Czech and Slovak online will become less of a problem as time goes by. But in the meantime, you may not be so lucky.


There are operating systems, web browsers, fonts, and websites in active use that may still default to older text encoding standards like Ansi or ISO 8859-1: standards that don’t natively support specific Czech and Slovak characters. If the HTML files that make up your WordPress site don’t include code like:


<meta charset="utf-8" />


[Translation: this web page is encoded using the Unicode UTF-8 standard.]


…then you’re leaving it up to individual devices’ operating systems and web browser settings to decide whether to respect Czech and Slovak typesetting and font embedding. Results will vary, depending on the device.


I’ve checked James Edmondson’s type specimen for Degular on the OH no typefoundry website: the original publisher of the typeface. The specimen’s character set clearly supports a range of Extended Latin languages, including Czech and Slovak. The Adobe Fonts website also indicates support for Czech and Slovak. And the settings that you used on Adobe Fonts to define which languages should be supported on your WordPress site are also correct.


Based on what I can see, you’ve diligently done everything that you can on Adobe Fonts to ensure that any Czech and Slovak text on your website should render accurately. And as long as your HTML files point to a CSS file that defines that you’re using Degular for typesetting text – using the ‘@font-face’ declaration – then you’ve done as much as you practically can. But that unfortunately may not get the results that you want.
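For completeness, a minimal sketch of what such a CSS setup might look like. This is illustrative only: the file paths below are placeholders, not actual Degular files, and Adobe Fonts normally generates its own embed code and stylesheet for you.

```css
/* Hypothetical self-hosted example; paths are placeholders. */
@font-face {
  font-family: "Degular";
  src: url("/fonts/degular-regular.woff2") format("woff2");
  font-weight: 400;
  font-style: normal;
  /* Basic Latin + Latin Extended-A, the block containing č, ě, ř */
  unicode-range: U+0000-00FF, U+0100-017F;
}

body {
  font-family: "Degular", sans-serif;
}
```

One detail worth checking: if a `unicode-range` descriptor is present but omits U+0100–017F, browsers will silently fall back to another font for exactly the characters discussed in this thread, reproducing the symptom described above.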


If it makes you feel any better, I’m currently looking at your forum posting on the Adobe Support Community website. Your example Czech letters – č, ě, and ř – should be rendering like the rest of your posting using Adobe’s Clean Serif corporate font. But they’re not: they’re rendering in Lucida Grande, one of Apple’s fallback fonts for macOS.


The same thing happens, regardless of the web browser that I use: Chromium, Firefox, and Safari all fail. And it also happens in Safari on iOS, in Kindle’s Silk Browser on Android OS, and in Firefox on Ubuntu Linux.


The reason: the fonts used on the Adobe Support Community were most likely defined to use the ‘Default’ character set encoding that you diligently avoided using yourself. What does ‘Default’ mean in this case? No surprise there: use fonts encoded in the Basic Latin character set, which is only good for Western Europe.


But as I write my response to your posting in plaintext using Apple’s TextEdit application built into macOS: č, ě, and ř render correctly, even when I switch fonts. That’s how fickle devices, operating systems, applications, and web browsers can be: if just one thing’s out of place, you’ll get a different result.


I hope that answers your question. Unfortunately, it isn’t a solution. If it’s any consolation, Adobe’s own website suffers from a variation of the problem that you’re observing on yours. You’re not alone, and you did things right.


Best regards
Andrew


–30–

 

 

ANDREW KEITH STRAUSS / ACTP / CTT+ / ACI / ACE / ACP
