Byte or Character Length semantics?
When defining a CHAR/VARCHAR database column or program variable, you must specify a size. When using a multibyte character set, the unit of this size matters: it can be specified in bytes or in characters. In programs, the size unit of CHAR/VARCHAR variables depends on the length semantics defined by the FGL_LENGTH_SEMANTICS environment variable. In databases, the size unit of CHAR/VARCHAR columns can be expressed in bytes or characters, depending on the database server and its configuration.
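As an illustration, the following minimal sketch defines a VARCHAR(10) program variable and assigns it a string containing a multibyte UTF-8 character. Whether the size 10 means 10 bytes or 10 characters depends on the value of FGL_LENGTH_SEMANTICS in the program's environment (BYTE being the default). The variable name and the sample string are illustrative assumptions, not taken from the original text.

MAIN
    -- Size unit depends on FGL_LENGTH_SEMANTICS:
    --   BYTE (default): city can hold up to 10 bytes
    --   CHAR          : city can hold up to 10 characters
    DEFINE city VARCHAR(10)

    -- "São Paulo" is 9 characters but 10 bytes in UTF-8 ("ã" takes 2 bytes),
    -- so it fits with either semantics; a string of 10 multibyte characters
    -- would only fit when CHAR length semantics is in effect.
    LET city = "São Paulo"
    DISPLAY city
END MAIN

A corresponding choice exists on the database side: depending on the database server and its configuration, a column declared with size 10 may store 10 bytes or 10 characters, so the program and database length semantics should be kept consistent.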