Import Upstream version 0.15.2
commit 161638757f
@ -0,0 +1,6 @@
{
  "git": {
    "sha1": "d1640215b0f0acec06bb24bd088be45df49bf7ca"
  },
  "path_in_vcs": ""
}
@ -0,0 +1,5 @@
target
Cargo.lock
.directory
.DS_Store
.vscode
@ -0,0 +1,357 @@
|
|||
# Change Log
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](http://keepachangelog.com/)
|
||||
and this project adheres to [Semantic Versioning](http://semver.org/).
|
||||
|
||||
## [Unreleased]
|
||||
|
||||
## [0.15.2] - 2022-06-17
|
||||
### Fixed
|
||||
- Missing advance and side bearing offsets in `HVAR`/`VVAR` are no longer treated as an error; they are simply ignored.
|
||||
|
||||
## [0.15.1] - 2022-06-04
|
||||
### Fixed
|
||||
- (cmap) `cmap::Subtable4::glyph_index` correctly handles malformed glyph offsets now.
|
||||
- (cmap) `cmap::Subtable4::codepoints` no longer includes `0xFFFF` codepoint.
|
||||
- (SVG) Fixed table parsing. Thanks to [Shubhamj280](https://github.com/Shubhamj280)
|
||||
|
||||
## [0.15.0] - 2022-02-20
|
||||
### Added
|
||||
- `apple-layout` build feature.
|
||||
- `ankr`, `feat`, `kerx`, `morx` and `trak` tables.
|
||||
- `kern` AAT subtable format 1.
|
||||
- `RawFace`
|
||||
|
||||
### Changed
|
||||
- The `parser` module is private now again.
|
||||
|
||||
## [0.14.0] - 2021-12-28
|
||||
### Changed
|
||||
- (cmap) `cmap::Subtable::glyph_index` and `cmap::Subtable::glyph_variation_index` accept
  `u32` instead of `char` now (see the sketch after this list).
|
||||
- (glyf) ~7% faster outline parsing.
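
A minimal migration sketch (the `subtable` variable is illustrative and assumed to be an
already-parsed `cmap::Subtable`):

```rust
// 0.13: subtable.glyph_index('A')
// 0.14: pass the code point as `u32`
let glyph_id = subtable.glyph_index(u32::from('A'));
```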
|
||||
|
||||
## [0.13.4] - 2021-11-23
|
||||
### Fixed
|
||||
- (CFF) Panic during `seac` resolving.
|
||||
- (CFF) Stack overflow during `seac` resolving.
|
||||
|
||||
## [0.13.3] - 2021-11-19
|
||||
### Fixed
|
||||
- (glyf) Endless loop during malformed file parsing.
|
||||
|
||||
## [0.13.2] - 2021-10-28
|
||||
### Added
|
||||
- `gvar-alloc` build feature that unlocks `gvar` table limits by using heap.
|
||||
Thanks to [OrionNebula](https://github.com/OrionNebula)
|
||||
|
||||
## [0.13.1] - 2021-10-27
|
||||
### Fixed
|
||||
- `Face::line_gap` logic.
|
||||
|
||||
## [0.13.0] - 2021-10-24
|
||||
### Added
|
||||
- Complete GSUB and GPOS tables support. Available under the `opentype-layout` feature.
|
||||
- Public access to all supported TrueType tables. This allows low-level, but still safe,
  access to the internal data layout, which can be used for performance optimizations such as caching.
|
||||
- `Style` enum and `Face::style` method.
|
||||
- `Face::glyph_name` can be disabled via the `glyph-names` feature to reduce binary size.
|
||||
|
||||
### Changed
|
||||
- Improved ascender/descender/line_gap resolving logic.
|
||||
- `Face` methods: `has_glyph_classes`, `glyph_class`, `glyph_mark_attachment_class`,
|
||||
`is_mark_glyph` and `glyph_variation_delta` are moved to `gdef::Table`.
|
||||
- The `Names` struct is no longer an iterator, but a container.
  You have to call `into_iter()` manually (see the sketch after this list).
|
||||
- The `VariationAxes` struct is no longer an iterator, but a container.
|
||||
You have to call `into_iter()` manually.
|
||||
- Most of the `Name` struct methods become public fields.
|
||||
- `Face::units_per_em` no longer returns `Option`.
|
||||
- (`cmap`) Improved subtable 12 performance. Thanks to [xnuk](https://github.com/xnuk)
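
A minimal sketch of the new iteration style, mirroring the bundled `font-info` example
(assumes an already-parsed `face: ttf_parser::Face`):

```rust
// `Face::names()` now returns a container, so iterate it explicitly.
for name in face.names().into_iter() {
    // e.g. inspect `name.name_id`
}
```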
|
||||
|
||||
### Removed
|
||||
- (c-api) `ttfp_glyph_class`, `ttfp_get_glyph_class`, `ttfp_get_glyph_mark_attachment_class`,
|
||||
`ttfp_is_mark_glyph`, `ttfp_glyph_variation_delta` and `ttfp_has_table`.
|
||||
- `TableName` enum and `Face::has_table`. Tables can be accessed directly now (see the sketch after this list).
|
||||
- `Face::character_mapping_subtables`. Use `Face::tables().cmap` instead.
|
||||
- `Face::kerning_subtables`. Use `Face::tables().kern` instead.
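
A migration sketch (assumes a parsed `face: ttf_parser::Face` and that the `cmap` field is
optional, like the `gpos`/`gsub` fields used in the bundled examples):

```rust
// before 0.13: face.character_mapping_subtables()
// since 0.13: go through `Face::tables()`
if let Some(cmap) = face.tables().cmap {
    // work with the raw `cmap` table directly
}
```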
|
||||
|
||||
### Fixed
|
||||
- `Iterator::count` implementation for `cmap::Subtables`, `name::Names` and `LazyArrayIter32`.
|
||||
|
||||
## [0.12.3] - 2021-06-27
|
||||
### Changed
|
||||
- (`glyf`) Always use a calculated bbox.
|
||||
|
||||
## [0.12.2] - 2021-06-11
|
||||
### Fixed
|
||||
- `Face::glyph_bounding_box` for variable `glyf`.
|
||||
- (`glyf`) Do not skip glyphs with zero-sized bbox.
|
||||
|
||||
## [0.12.1] - 2021-05-24
|
||||
### Added
|
||||
- Support Format 13 subtables in `cmap::Subtable::is_unicode`.
|
||||
Thanks to [csmulhern](https://github.com/csmulhern)
|
||||
- Derive more traits by default. Thanks to [dhardy](https://github.com/dhardy)
|
||||
|
||||
## [0.12.0] - 2021-02-14
|
||||
### Changed
|
||||
- `Face::ascender` and `Face::descender` will use
  [usWinAscent](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswinascent) and
  [usWinDescent](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswindescent)
  when the `USE_TYPO_METRICS` flag is not set in the `OS/2` table.
  Previously, those values were ignored and
  [hhea::ascender](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea#ascender) and
  [hhea::descender](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea#descender)
  were used. Now the `hhea` table values are used only when the `OS/2` table is not present
  (see the sketch after this list).
|
||||
- `Face::outline_glyph` and `Face::glyph_bounding_box` in case of a `glyf` table
  can fall back to a calculated bbox when the embedded bbox is malformed now.
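
An illustrative sketch of the resolution order described above; the types and names are
hypothetical stand-ins and do not mirror ttf-parser's internal API:

```rust
// `(use_typo_metrics, typo_ascender, win_ascent)` stands in for the relevant OS/2 fields.
fn resolve_ascender(os2: Option<(bool, i16, i16)>, hhea_ascender: i16) -> i16 {
    match os2 {
        Some((true, typo_ascender, _)) => typo_ascender, // USE_TYPO_METRICS is set
        Some((false, _, win_ascent)) => win_ascent,      // flag not set: use usWinAscent
        None => hhea_ascender,                           // no OS/2 table at all
    }
}
```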
|
||||
|
||||
## [0.11.0] - 2021-02-04
|
||||
### Added
|
||||
- `FaceTables`, which allows loading a `Face` not only from a single chunk of data,
  but also table-by-table, which is useful for WOFF parsing.
  No changes to the API.
|
||||
Thanks to [fschutt](https://github.com/fschutt)
|
||||
|
||||
## [0.10.1] - 2021-01-21
|
||||
### Changed
|
||||
- Update a font used for tests.
|
||||
|
||||
## [0.10.0] - 2021-01-16
|
||||
### Added
|
||||
- `variable-fonts` build feature. Enabled by default.
|
||||
  By disabling it you can cut the `ttf-parser` binary size overhead almost in half.
|
||||
|
||||
### Changed
|
||||
- (`gvar`) Increase the maximum number of variation tuples from 16 to 32.
  This increases stack usage and makes `gvar` parsing ~10% slower.
|
||||
|
||||
### Fixed
|
||||
- (`CFF`) Fix `seac` processing. Thanks to [wezm](https://github.com/wezm)
|
||||
|
||||
## [0.9.0] - 2020-12-05
|
||||
### Removed
|
||||
- `kern` AAT subtable 1 aka `kern::state_machine`.
|
||||
  Mainly because it's useless without proper shaping.
|
||||
|
||||
## [0.8.3] - 2020-11-15
|
||||
### Added
|
||||
- `Face::glyph_variation_delta`
|
||||
|
||||
### Fixed
|
||||
- `Iterator::nth` implementation for `cmap::Subtables` and `Names`.
|
||||
|
||||
## [0.8.2] - 2020-07-31
|
||||
### Added
|
||||
- `cmap::Subtable::codepoints`
|
||||
|
||||
### Fixed
|
||||
- (cmap) Incorrectly returning glyph ID `0` instead of `None` for format 0
|
||||
- (cmap) Possible invalid glyph mapping for format 2
|
||||
|
||||
## [0.8.1] - 2020-07-29
|
||||
### Added
|
||||
- `Face::is_monospaced`
|
||||
- `Face::italic_angle`
|
||||
- `Face::typographic_ascender`
|
||||
- `Face::typographic_descender`
|
||||
- `Face::typographic_line_gap`
|
||||
- `Face::capital_height`
|
||||
|
||||
## [0.8.0] - 2020-07-21
|
||||
### Added
|
||||
- Allow `true` magic.
|
||||
- `FaceParsingError`
|
||||
- `NormalizedCoordinate`
|
||||
- `Face::variation_coordinates`
|
||||
- `Face::has_non_default_variation_coordinates`
|
||||
- `Face::glyph_name` can lookup CFF names too.
|
||||
- `Face::table_data`
|
||||
- `Face::character_mapping_subtables`
|
||||
|
||||
### Changed
|
||||
- (CFF,CFF2) 10% faster parsing.
|
||||
- `Face::from_slice` returns `Result` now (see the sketch after this list).
|
||||
- `Name::platform_id` returns `PlatformId` instead of `Option<PlatformId>` now.
|
||||
- The `cmap` module became public.
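
A minimal sketch of the new error handling, matching the bundled `font-info` example:

```rust
let face = match ttf_parser::Face::from_slice(&font_data, 0) {
    Ok(face) => face,
    Err(e) => {
        eprintln!("Error: {}.", e);
        std::process::exit(1);
    }
};
```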
|
||||
|
||||
### Fixed
|
||||
- `Face::width` parsing.
|
||||
- Possible u32 overflow on 32-bit platforms during `Face::from_slice`.
|
||||
- (cmap) `Face::glyph_variation_index` processing when the encoding table has only one glyph.
|
||||
|
||||
## [0.7.0] - 2020-07-16
|
||||
### Added
|
||||
- (CFF) CID fonts support.
|
||||
- (CFF) `seac` support.
|
||||
- `Font::global_bounding_box`
|
||||
|
||||
### Changed
|
||||
- Rename `Font` to `Face`, because this is what it actually is.
|
||||
- Rename `Font::from_data` to `Font::from_slice` to match serde and other libraries.
|
||||
- Rename `Name::name_utf8` to `Name::to_string`.
|
||||
|
||||
### Removed
|
||||
- `Font::family_name` and `Font::post_script_name`. They were a bit confusing.
|
||||
Prefer:
|
||||
```
|
||||
face.names().find(|name| name.name_id() == name_id::FULL_NAME).and_then(|name| name.to_string())
|
||||
```
|
||||
|
||||
## [0.6.2] - 2020-07-02
|
||||
### Added
|
||||
- `Name::is_unicode`
|
||||
- `Font::family_name` will load names with Windows Symbol encoding now.
|
||||
|
||||
### Fixed
|
||||
- `Font::glyph_bounding_box` will apply variation in case of `gvar` fonts.
|
||||
|
||||
## [0.6.1] - 2020-05-19
|
||||
### Fixed
|
||||
- (`kern`) Support fonts that ignore the subtable size limit.
|
||||
|
||||
## [0.6.0] - 2020-05-18
|
||||
### Added
|
||||
- `sbix`, `CBLC`, `CBDT` and `SVG` tables support.
|
||||
- `Font::glyph_raster_image` and `Font::glyph_svg_image`.
|
||||
- `Font::kerning_subtables` with subtable formats 0..3 support.
|
||||
|
||||
### Changed
|
||||
- (c-api) The library doesn't allocate `ttfp_font` anymore. All allocations should be
  handled by the caller from now on.
|
||||
|
||||
### Removed
|
||||
- `Font::glyphs_kerning`. Use `Font::kerning_subtables` instead.
|
||||
- (c-api) `ttfp_create_font` and `ttfp_destroy_font`.
|
||||
Use `ttfp_font_size_of` + `ttfp_font_init` instead.
|
||||
```c
|
||||
ttfp_font *font = (ttfp_font*)alloca(ttfp_font_size_of());
|
||||
ttfp_font_init(font_data, font_data_size, 0, font);
|
||||
```
|
||||
- Logging support. We haven't used it anyway.
|
||||
|
||||
### Fixed
|
||||
- (`gvar`) Integer overflow.
|
||||
- (`cmap`) Integer overflow during subtable format 2 parsing.
|
||||
- (`CFF`, `CFF2`) DICT number parsing.
|
||||
- `Font::glyph_*_advance` will return `None` when glyph ID
|
||||
is larger than the number of metrics in the table.
|
||||
- Ignore variation offset in `Font::glyph_*_advance` and `Font::glyph_*_side_bearing`
|
||||
when `HVAR`/`VVAR` tables are missing.
|
||||
Previously returned `None` which is incorrect.
|
||||
|
||||
## [0.5.0] - 2020-03-19
|
||||
### Added
|
||||
- Variable fonts support.
|
||||
- C API.
|
||||
- `gvar`, `CFF2`, `avar`, `fvar`, `HVAR`, `VVAR` and `MVAR` tables support.
|
||||
- `Font::variation_axes`
|
||||
- `Font::set_variation`
|
||||
- `Font::is_variable`
|
||||
- `Tag` type.
|
||||
|
||||
### Fixed
|
||||
- Multiple issues due to arithmetic overflow.
|
||||
|
||||
## [0.4.0] - 2020-02-24
|
||||
|
||||
**A major rewrite.**
|
||||
|
||||
### Added
|
||||
- `Font::glyph_bounding_box`
|
||||
- `Font::glyph_name`
|
||||
- `Font::has_glyph_classes`
|
||||
- `Font::glyph_class`
|
||||
- `Font::glyph_mark_attachment_class`
|
||||
- `Font::is_mark_glyph`
|
||||
- `Font::glyph_y_origin`
|
||||
- `Font::vertical_ascender`
|
||||
- `Font::vertical_descender`
|
||||
- `Font::vertical_height`
|
||||
- `Font::vertical_line_gap`
|
||||
- Optional `log` dependency.
|
||||
|
||||
### Changed
|
||||
- `Font::outline_glyph` now accepts `&mut dyn OutlineBuilder` and not `&mut impl OutlineBuilder`.
|
||||
- `Font::ascender`, `Font::descender` and `Font::line_gap` will check `USE_TYPO_METRICS`
|
||||
flag in OS/2 table now.
|
||||
- `glyph_hor_metrics` was split into `glyph_hor_advance` and `glyph_hor_side_bearing`.
|
||||
- `glyph_ver_metrics` was split into `glyph_ver_advance` and `glyph_ver_side_bearing`.
|
||||
- `CFFError` is no longer public.
|
||||
|
||||
### Removed
|
||||
- `Error` enum. All methods will return `Option<T>` now.
|
||||
- All `unsafe`.
|
||||
|
||||
### Fixed
|
||||
- `glyph_hor_side_bearing` parsing when the number of metrics is less than the total number of glyphs.
|
||||
- Multiple CFF parsing fixes. The parser is more strict now.
|
||||
|
||||
## [0.3.0] - 2019-09-26
|
||||
### Added
|
||||
- `no_std` compatibility.
|
||||
|
||||
### Changed
|
||||
- The library has one `unsafe` block now.
|
||||
- 35% faster `family_name()` method.
|
||||
- 25% faster `from_data()` method for TrueType fonts.
|
||||
- The `Name` struct has a new API. Public fields became public functions
|
||||
and data is parsed on demand and not beforehand.
|
||||
|
||||
## [0.2.2] - 2019-08-12
|
||||
### Fixed
|
||||
- Allow format 12 subtables with *Unicode full repertoire* in `cmap`.
|
||||
|
||||
## [0.2.1] - 2019-08-12
|
||||
### Fixed
|
||||
- Check that `cmap` subtable encoding is Unicode.
|
||||
|
||||
## [0.2.0] - 2019-07-10
|
||||
### Added
|
||||
- CFF support.
|
||||
- Basic kerning support.
|
||||
- All `cmap` subtable formats except Mixed Coverage (8) are supported.
|
||||
- Vertical metrics querying from the `vmtx` table.
|
||||
- OpenType fonts are allowed now.
|
||||
|
||||
### Changed
|
||||
- A major rewrite. TrueType tables are no longer public.
|
||||
- Use `GlyphId` instead of `u16`.
|
||||
|
||||
### Removed
|
||||
- `GDEF` table parsing.
|
||||
|
||||
[Unreleased]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.2...HEAD
|
||||
[0.15.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.1...v0.15.2
|
||||
[0.15.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.15.0...v0.15.1
|
||||
[0.15.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.14.0...v0.15.0
|
||||
[0.14.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.4...v0.14.0
|
||||
[0.13.4]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.3...v0.13.4
|
||||
[0.13.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.2...v0.13.3
|
||||
[0.13.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.1...v0.13.2
|
||||
[0.13.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.13.0...v0.13.1
|
||||
[0.13.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.3...v0.13.0
|
||||
[0.12.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.2...v0.12.3
|
||||
[0.12.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.1...v0.12.2
|
||||
[0.12.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.12.0...v0.12.1
|
||||
[0.12.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.11.0...v0.12.0
|
||||
[0.11.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.10.1...v0.11.0
|
||||
[0.10.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.10.0...v0.10.1
|
||||
[0.10.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.9.0...v0.10.0
|
||||
[0.9.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.3...v0.9.0
|
||||
[0.8.3]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.2...v0.8.3
|
||||
[0.8.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.1...v0.8.2
|
||||
[0.8.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.8.0...v0.8.1
|
||||
[0.8.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.7.0...v0.8.0
|
||||
[0.7.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.2...v0.7.0
|
||||
[0.6.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.1...v0.6.2
|
||||
[0.6.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.6.0...v0.6.1
|
||||
[0.6.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.5.0...v0.6.0
|
||||
[0.5.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.4.0...v0.5.0
|
||||
[0.4.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.3.0...v0.4.0
|
||||
[0.3.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.2...v0.3.0
|
||||
[0.2.2]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.1...v0.2.2
|
||||
[0.2.1]: https://github.com/RazrFalcon/ttf-parser/compare/v0.2.0...v0.2.1
|
||||
[0.2.0]: https://github.com/RazrFalcon/ttf-parser/compare/v0.1.0...v0.2.0
|
|
@ -0,0 +1,30 @@
|
|||
# This file is automatically @generated by Cargo.
|
||||
# It is not intended for manual editing.
|
||||
version = 3
|
||||
|
||||
[[package]]
|
||||
name = "base64"
|
||||
version = "0.13.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "904dfeac50f3cdaba28fc6f57fdcddb75f49ed61346676a78c4ffe55877802fd"
|
||||
|
||||
[[package]]
|
||||
name = "pico-args"
|
||||
version = "0.5.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "5be167a7af36ee22fe3115051bc51f6e6c7054c9348e28deb4f49bd6f705a315"
|
||||
|
||||
[[package]]
|
||||
name = "ttf-parser"
|
||||
version = "0.15.2"
|
||||
dependencies = [
|
||||
"base64",
|
||||
"pico-args",
|
||||
"xmlwriter",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "xmlwriter"
|
||||
version = "0.1.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "ec7a2a501ed189703dba8b08142f057e887dfc4b2cc4db2d343ac6376ba3e0b9"
|
|
@ -0,0 +1,52 @@
|
|||
# THIS FILE IS AUTOMATICALLY GENERATED BY CARGO
|
||||
#
|
||||
# When uploading crates to the registry Cargo will automatically
|
||||
# "normalize" Cargo.toml files for maximal compatibility
|
||||
# with all versions of Cargo and also rewrite `path` dependencies
|
||||
# to registry (e.g., crates.io) dependencies.
|
||||
#
|
||||
# If you are reading this file be aware that the original Cargo.toml
|
||||
# will likely look very different (and much more reasonable).
|
||||
# See Cargo.toml.orig for the original contents.
|
||||
|
||||
[package]
|
||||
edition = "2018"
|
||||
name = "ttf-parser"
|
||||
version = "0.15.2"
|
||||
authors = ["Yevhenii Reizner <razrfalcon@gmail.com>"]
|
||||
exclude = ["benches/**"]
|
||||
description = "A high-level, safe, zero-allocation TrueType font parser."
|
||||
documentation = "https://docs.rs/ttf-parser/"
|
||||
readme = "README.md"
|
||||
keywords = [
|
||||
"ttf",
|
||||
"truetype",
|
||||
"opentype",
|
||||
]
|
||||
categories = ["parser-implementations"]
|
||||
license = "MIT OR Apache-2.0"
|
||||
repository = "https://github.com/RazrFalcon/ttf-parser"
|
||||
|
||||
[dev-dependencies.base64]
|
||||
version = "0.13"
|
||||
|
||||
[dev-dependencies.pico-args]
|
||||
version = "0.5"
|
||||
|
||||
[dev-dependencies.xmlwriter]
|
||||
version = "0.1"
|
||||
|
||||
[features]
|
||||
apple-layout = []
|
||||
default = [
|
||||
"std",
|
||||
"opentype-layout",
|
||||
"apple-layout",
|
||||
"variable-fonts",
|
||||
"glyph-names",
|
||||
]
|
||||
glyph-names = []
|
||||
gvar-alloc = ["std"]
|
||||
opentype-layout = []
|
||||
std = []
|
||||
variable-fonts = []
|
|
@ -0,0 +1,41 @@
|
|||
[package]
|
||||
name = "ttf-parser"
|
||||
version = "0.15.2"
|
||||
authors = ["Yevhenii Reizner <razrfalcon@gmail.com>"]
|
||||
keywords = ["ttf", "truetype", "opentype"]
|
||||
categories = ["parser-implementations"]
|
||||
license = "MIT OR Apache-2.0"
|
||||
description = "A high-level, safe, zero-allocation TrueType font parser."
|
||||
repository = "https://github.com/RazrFalcon/ttf-parser"
|
||||
documentation = "https://docs.rs/ttf-parser/"
|
||||
readme = "README.md"
|
||||
edition = "2018"
|
||||
exclude = ["benches/**"]
|
||||
|
||||
[features]
|
||||
default = ["std", "opentype-layout", "apple-layout", "variable-fonts", "glyph-names"]
|
||||
std = []
|
||||
# Enables variable fonts support. Nearly doubles the binary size.
# Includes avar, CFF2, fvar, gvar, HVAR, MVAR and VVAR tables.
|
||||
variable-fonts = []
|
||||
# Enables GDEF, GPOS and GSUB tables.
|
||||
opentype-layout = []
|
||||
# Enables ankr, feat, format1 subtable in kern, kerx, morx and trak tables.
|
||||
apple-layout = []
|
||||
# Enables glyph name query via `Face::glyph_name`.
|
||||
# TrueType fonts do not store the default glyph names (to reduce file size),
# which means we have to store them in ttf-parser, and there are almost 500 of them.
# By disabling this feature a user can reduce the binary size a bit.
|
||||
glyph-names = []
|
||||
# Enables heap allocations during gvar table parsing used by Apple's variable fonts.
|
||||
# Due to the way gvar table is structured, we cannot avoid allocations.
|
||||
# By default, only up to 32 variable tuples will be allocated on the stack,
|
||||
# while the spec allows up to 4095. Most variable fonts use 10-20 tuples,
# so our limit is suitable for most cases. But if you need full support, you have to
|
||||
# enable this feature.
|
||||
gvar-alloc = ["std"]
|
||||
|
||||
[dev-dependencies]
|
||||
base64 = "0.13"
|
||||
pico-args = "0.5"
|
||||
xmlwriter = "0.1"
|
|
@ -0,0 +1,201 @@
|
|||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
|
@ -0,0 +1,20 @@
|
|||
Copyright (c) 2018 Yevhenii Reizner
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE.
|
||||
|
|
@ -0,0 +1,196 @@
|
|||
## ttf-parser
|
||||
![Build Status](https://github.com/RazrFalcon/ttf-parser/workflows/Rust/badge.svg)
|
||||
[![Crates.io](https://img.shields.io/crates/v/ttf-parser.svg)](https://crates.io/crates/ttf-parser)
|
||||
[![Documentation](https://docs.rs/ttf-parser/badge.svg)](https://docs.rs/ttf-parser)
|
||||
[![Rust 1.42+](https://img.shields.io/badge/rust-1.42+-orange.svg)](https://www.rust-lang.org)
|
||||
![](https://img.shields.io/badge/unsafe-forbidden-brightgreen.svg)
|
||||
|
||||
A high-level, safe, zero-allocation TrueType font parser.
|
||||
|
||||
Supports [TrueType](https://docs.microsoft.com/en-us/typography/truetype/),
|
||||
[OpenType](https://docs.microsoft.com/en-us/typography/opentype/spec/)
|
||||
and [AAT](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html)
|
||||
fonts.
|
||||
|
||||
Can be used as a Rust or a C library.
|
||||
|
||||
### Features
|
||||
|
||||
- A high-level API for most common properties, hiding all parsing and data resolving logic (see the example after this list).
|
||||
- A low-level, but safe API to access TrueType tables data.
|
||||
- Highly configurable. You can disable most of the features, reducing binary size.
|
||||
You can also parse TrueType tables separately, without loading the whole font/face.
|
||||
- Zero heap allocations.
|
||||
- Zero unsafe.
|
||||
- Zero dependencies.
|
||||
- `no_std`/WASM compatible.
|
||||
- A basic [C API](./c-api).
|
||||
- Fast.
|
||||
- Stateless. All parsing methods are immutable.
|
||||
- Simple and maintainable code (no magic numbers).
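
A minimal example of the high-level API, trimmed down from the bundled `font-info` example
(error handling is elided; `font.ttf` is a placeholder path):

```rust
let font_data = std::fs::read("font.ttf").unwrap();
let face = ttf_parser::Face::from_slice(&font_data, 0).unwrap();

println!("Units per EM: {:?}", face.units_per_em());
println!("Ascender: {}", face.ascender());
println!("Number of glyphs: {}", face.number_of_glyphs());
println!("Variable: {}", face.is_variable());
```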
|
||||
|
||||
### Safety
|
||||
|
||||
- The library must not panic. Any panic is considered a critical bug and should be reported.
|
||||
- The library forbids unsafe code.
|
||||
- No heap allocations, so crashes due to OOM are not possible.
|
||||
- All recursive methods have a depth limit.
|
||||
- Technically, it should use less than 64KiB of stack in the worst-case scenario.
|
||||
- Most arithmetic operations are checked.
- Most numeric casts are checked.
|
||||
|
||||
### Alternatives
|
||||
|
||||
It's very hard to compare different libraries, so we use a table-based comparison.
|
||||
There are roughly three types of TrueType tables:
|
||||
|
||||
- A table with a list of properties (like `head`, `OS/2`, etc.).<br/>
|
||||
If a library tries to parse it at all then we mark it as supported.
|
||||
- A table that contains a single type of data (`glyf`, `CFF` (kinda), `hmtx`, etc.).<br/>
|
||||
Can only be supported or not.
|
||||
- A table that contains multiple subtables (`cmap`, `kern`, `GPOS`, etc.).<br/>
|
||||
Can be partially supported and we note which subtables are actually supported.
|
||||
|
||||
| Feature/Library | ttf-parser | FreeType | stb_truetype |
|
||||
| ----------------- | :--------------------: | :-----------------: | :----------------------------: |
|
||||
| Memory safe | ✓ | | |
|
||||
| Thread safe | ✓ | | ~ (mostly reentrant) |
|
||||
| Zero allocation | ✓ | | |
|
||||
| Variable fonts | ✓ | ✓ | |
|
||||
| Rendering | -<sup>1</sup> | ✓ | ~ (very primitive) |
|
||||
| `ankr` table | ✓ | | |
|
||||
| `avar` table | ✓ | ✓ | |
|
||||
| `bdat` table | | ✓ | |
|
||||
| `bloc` table | | ✓ | |
|
||||
| `CBDT` table | ✓ | ✓ | |
|
||||
| `CBLC` table | ✓ | ✓ | |
|
||||
| `COLR` table | | ✓ | |
|
||||
| `CPAL` table | | ✓ | |
|
||||
| `CFF ` table | ✓ | ✓ | ~ (no `seac` support) |
|
||||
| `CFF2` table | ✓ | ✓ | |
|
||||
| `cmap` table | ~ (no 8) | ✓ | ~ (no 2,8,10,14; Unicode-only) |
|
||||
| `EBDT` table | | ✓ | |
|
||||
| `EBLC` table | | ✓ | |
|
||||
| `feat` table | ✓ | | |
|
||||
| `fvar` table | ✓ | ✓ | |
|
||||
| `gasp` table | | ✓ | |
|
||||
| `GDEF` table | ~ | | |
|
||||
| `glyf` table | ~<sup>2</sup> | ✓ | ~<sup>2</sup> |
|
||||
| `GPOS` table | ✓ | | ~ (only 2) |
|
||||
| `GSUB` table | ✓ | | |
|
||||
| `gvar` table | ✓ | ✓ | |
|
||||
| `head` table | ✓ | ✓ | ✓ |
|
||||
| `hhea` table | ✓ | ✓ | ✓ |
|
||||
| `hmtx` table | ✓ | ✓ | ✓ |
|
||||
| `HVAR` table | ✓ | ✓ | |
|
||||
| `kern` table | ✓ | ~ (only 0) | ~ (only 0) |
|
||||
| `kerx` table | ✓ | | |
|
||||
| `maxp` table | ✓ | ✓ | ✓ |
|
||||
| `morx` table | ✓ | | |
|
||||
| `MVAR` table | ✓ | ✓ | |
|
||||
| `name` table | ✓ | ✓ | |
|
||||
| `OS/2` table | ✓ | ✓ | |
|
||||
| `post` table | ✓ | ✓ | |
|
||||
| `sbix` table | ~ (PNG only) | ~ (PNG only) | |
|
||||
| `SVG ` table | ✓ | ✓ | ✓ |
|
||||
| `trak` table | ✓ | | |
|
||||
| `vhea` table | ✓ | ✓ | |
|
||||
| `vmtx` table | ✓ | ✓ | |
|
||||
| `VORG` table | ✓ | ✓ | |
|
||||
| `VVAR` table | ✓ | ✓ | |
|
||||
| Language | Rust + C API | C | C |
|
||||
| Tested version | 0.15.0 | 2.12.0 | 1.24 |
|
||||
| License | MIT / Apache-2.0 | FTL / GPLv2 | public domain |
|
||||
|
||||
Legend:
|
||||
|
||||
- ✓ - supported
|
||||
- ~ - partial
|
||||
- *nothing* - not supported
|
||||
|
||||
Notes:
|
||||
|
||||
1. While `ttf-parser` doesn't support rendering by itself,
|
||||
there are multiple rendering libraries on top of it:
|
||||
[rusttype](https://gitlab.redox-os.org/redox-os/rusttype),
|
||||
[ab-glyph](https://github.com/alexheretic/ab-glyph)
|
||||
and [fontdue](https://github.com/mooman219/fontdue).
|
||||
2. Matching points are not supported.
|
||||
|
||||
### Performance
|
||||
|
||||
TrueType fonts are designed for fast querying, so most of the methods are very fast.
The main exception is glyph outlining. Glyphs can be stored using two different methods:
the [Glyph Data Format](https://docs.microsoft.com/en-us/typography/opentype/spec/glyf)
and the [Compact Font Format](http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5176.CFF.pdf) (pdf).
The first one is fairly simple, which makes it faster to process.
The second one is basically a tiny language with a stack-based VM, which makes it much harder to process.
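
The sketch below shows the outlining API, adapted from the bundled `font2svg` example:
outlines are emitted through an `OutlineBuilder` callback regardless of whether they come
from `glyf` or `CFF`:

```rust
use std::fmt::Write;

// Collects outline segments into an SVG path string.
struct SvgPathBuilder(String);

impl ttf_parser::OutlineBuilder for SvgPathBuilder {
    fn move_to(&mut self, x: f32, y: f32) { write!(self.0, "M {} {} ", x, y).unwrap(); }
    fn line_to(&mut self, x: f32, y: f32) { write!(self.0, "L {} {} ", x, y).unwrap(); }
    fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
        write!(self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap();
    }
    fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
        write!(self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap();
    }
    fn close(&mut self) { self.0.push_str("Z "); }
}

fn outline(face: &ttf_parser::Face, id: ttf_parser::GlyphId) -> Option<String> {
    let mut builder = SvgPathBuilder(String::new());
    // Returns the glyph bounding box on success; the builder receives the segments.
    face.outline_glyph(id, &mut builder)?;
    Some(builder.0)
}
```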
|
||||
|
||||
The [benchmark](./benches/outline/) tests how long it takes to outline all glyphs in a font.
|
||||
|
||||
x86 (AMD 3700X)
|
||||
|
||||
| Table/Library | ttf-parser | FreeType | stb_truetype |
|
||||
| ------------- | -------------: | ---------: | -------------: |
|
||||
| `glyf` | `0.901 ms` | `1.171 ms` | **`0.675 ms`** |
|
||||
| `gvar` | **`2.972 ms`** | `4.132 ms` | - |
|
||||
| `CFF` | **`1.197 ms`** | `5.647 ms` | `2.813 ms` |
|
||||
| `CFF2` | **`1.968 ms`** | `6.392 ms` | - |
|
||||
|
||||
ARM (Apple M1)
|
||||
|
||||
| Table/Library | ttf-parser | FreeType | stb_truetype |
|
||||
| ------------- | -------------: | ---------: | -------------: |
|
||||
| `glyf` | **`0.550 ms`** | `0.854 ms` | `0.703 ms` |
|
||||
| `gvar` | **`2.270 ms`** | `4.594 ms` | - |
|
||||
| `CFF` | **`1.054 ms`** | `5.223 ms` | `3.262 ms` |
|
||||
| `CFF2` | **`1.765 ms`** | `5.995 ms` | - |
|
||||
|
||||
**Note:** FreeType is surprisingly slow, so I'm worried that I've messed something up.
|
||||
|
||||
And here are some methods benchmarks:
|
||||
|
||||
```text
|
||||
test outline_glyph_276_from_cff2 ... bench: 867 ns/iter (+/- 15)
|
||||
test from_data_otf_cff ... bench: 968 ns/iter (+/- 13)
|
||||
test from_data_otf_cff2 ... bench: 887 ns/iter (+/- 25)
|
||||
test outline_glyph_276_from_cff ... bench: 678 ns/iter (+/- 41)
|
||||
test outline_glyph_276_from_glyf ... bench: 649 ns/iter (+/- 11)
|
||||
test outline_glyph_8_from_cff2 ... bench: 534 ns/iter (+/- 14)
|
||||
test from_data_ttf ... bench: 467 ns/iter (+/- 11)
|
||||
test glyph_name_post_276 ... bench: 223 ns/iter (+/- 5)
|
||||
test outline_glyph_8_from_cff ... bench: 315 ns/iter (+/- 13)
|
||||
test outline_glyph_8_from_glyf ... bench: 291 ns/iter (+/- 5)
|
||||
test family_name ... bench: 183 ns/iter (+/- 102)
|
||||
test glyph_name_cff_276 ... bench: 62 ns/iter (+/- 1)
|
||||
test glyph_index_u41 ... bench: 16 ns/iter (+/- 0)
|
||||
test glyph_name_cff_8 ... bench: 5 ns/iter (+/- 0)
|
||||
test glyph_name_post_8 ... bench: 2 ns/iter (+/- 0)
|
||||
test subscript_metrics ... bench: 2 ns/iter (+/- 0)
|
||||
test glyph_hor_advance ... bench: 2 ns/iter (+/- 0)
|
||||
test glyph_hor_side_bearing ... bench: 2 ns/iter (+/- 0)
|
||||
test glyph_name_8 ... bench: 1 ns/iter (+/- 0)
|
||||
test ascender ... bench: 1 ns/iter (+/- 0)
|
||||
test underline_metrics ... bench: 1 ns/iter (+/- 0)
|
||||
test strikeout_metrics ... bench: 1 ns/iter (+/- 0)
|
||||
test x_height ... bench: 1 ns/iter (+/- 0)
|
||||
test units_per_em ... bench: 0.5 ns/iter (+/- 0)
|
||||
test width ... bench: 0.2 ns/iter (+/- 0)
|
||||
```
|
||||
|
||||
### License
|
||||
|
||||
Licensed under either of
|
||||
|
||||
- Apache License, Version 2.0
|
||||
([LICENSE-APACHE](LICENSE-APACHE) or http://www.apache.org/licenses/LICENSE-2.0)
|
||||
- MIT license
|
||||
([LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT)
|
||||
|
||||
at your option.
|
||||
|
||||
### Contribution
|
||||
|
||||
Unless you explicitly state otherwise, any contribution intentionally submitted
|
||||
for inclusion in the work by you, as defined in the Apache-2.0 license, shall be
|
||||
dual licensed as above, without any additional terms or conditions.
|
|
@ -0,0 +1,95 @@
|
|||
fn main() {
|
||||
let args: Vec<_> = std::env::args().collect();
|
||||
if args.len() != 2 {
|
||||
println!("Usage:\n\tfont-info font.ttf");
|
||||
std::process::exit(1);
|
||||
}
|
||||
|
||||
let font_data = std::fs::read(&args[1]).unwrap();
|
||||
|
||||
let now = std::time::Instant::now();
|
||||
|
||||
let face = match ttf_parser::Face::from_slice(&font_data, 0) {
|
||||
Ok(f) => f,
|
||||
Err(e) => {
|
||||
eprint!("Error: {}.", e);
|
||||
std::process::exit(1);
|
||||
},
|
||||
};
|
||||
|
||||
let family_name = face.names().into_iter()
|
||||
.find(|name| name.name_id == ttf_parser::name_id::FULL_NAME && name.is_unicode())
|
||||
.and_then(|name| name.to_string());
|
||||
|
||||
let post_script_name = face.names().into_iter()
|
||||
.find(|name| name.name_id == ttf_parser::name_id::POST_SCRIPT_NAME && name.is_unicode())
|
||||
.and_then(|name| name.to_string());
|
||||
|
||||
println!("Family name: {:?}", family_name);
|
||||
println!("PostScript name: {:?}", post_script_name);
|
||||
println!("Units per EM: {:?}", face.units_per_em());
|
||||
println!("Ascender: {}", face.ascender());
|
||||
println!("Descender: {}", face.descender());
|
||||
println!("Line gap: {}", face.line_gap());
|
||||
println!("Global bbox: {:?}", face.global_bounding_box());
|
||||
println!("Number of glyphs: {}", face.number_of_glyphs());
|
||||
println!("Underline: {:?}", face.underline_metrics());
|
||||
println!("X height: {:?}", face.x_height());
|
||||
println!("Weight: {:?}", face.weight());
|
||||
println!("Width: {:?}", face.width());
|
||||
println!("Regular: {}", face.is_regular());
|
||||
println!("Italic: {}", face.is_italic());
|
||||
println!("Bold: {}", face.is_bold());
|
||||
println!("Oblique: {}", face.is_oblique());
|
||||
println!("Strikeout: {:?}", face.strikeout_metrics());
|
||||
println!("Subscript: {:?}", face.subscript_metrics());
|
||||
println!("Superscript: {:?}", face.superscript_metrics());
|
||||
println!("Variable: {:?}", face.is_variable());
|
||||
|
||||
#[cfg(feature = "opentype-layout")] {
|
||||
if let Some(ref table) = face.tables().gpos {
|
||||
print_opentype_layout("positioning", table);
|
||||
}
|
||||
|
||||
if let Some(ref table) = face.tables().gsub {
|
||||
print_opentype_layout("substitution", table);
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(feature = "variable-fonts")] {
|
||||
if face.is_variable() {
|
||||
println!("Variation axes:");
|
||||
for axis in face.variation_axes() {
|
||||
println!(" {} {}..{}, default {}",
|
||||
axis.tag, axis.min_value, axis.max_value, axis.def_value);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
println!("Elapsed: {}us", now.elapsed().as_micros());
|
||||
}
|
||||
|
||||
fn print_opentype_layout(name: &str, table: &ttf_parser::opentype_layout::LayoutTable) {
|
||||
println!("OpenType {}:", name);
|
||||
println!(" Scripts:");
|
||||
for script in table.scripts {
|
||||
println!(" {}", script.tag);
|
||||
|
||||
if script.languages.is_empty() {
|
||||
println!(" No languages");
|
||||
continue;
|
||||
}
|
||||
|
||||
println!(" Languages:");
|
||||
for lang in script.languages {
|
||||
println!(" {}", lang.tag);
|
||||
}
|
||||
}
|
||||
|
||||
let mut features: Vec<_> = table.features.into_iter().map(|f| f.tag).collect();
|
||||
features.dedup();
|
||||
println!(" Features:");
|
||||
for feature in features {
|
||||
println!(" {}", feature);
|
||||
}
|
||||
}
|
|
@ -0,0 +1,292 @@
|
|||
use std::path::PathBuf;
|
||||
use std::io::Write;
|
||||
|
||||
use ttf_parser as ttf;
|
||||
|
||||
const FONT_SIZE: f64 = 128.0;
|
||||
const COLUMNS: u32 = 100;
|
||||
|
||||
const HELP: &str = "\
|
||||
Usage:
|
||||
font2svg font.ttf out.svg
|
||||
font2svg --variations 'wght:500;wdth:200' font.ttf out.svg
|
||||
";
|
||||
|
||||
struct Args {
|
||||
#[allow(dead_code)] variations: Vec<ttf::Variation>,
|
||||
ttf_path: PathBuf,
|
||||
svg_path: PathBuf,
|
||||
}
|
||||
|
||||
fn main() {
|
||||
let args = match parse_args() {
|
||||
Ok(v) => v,
|
||||
Err(e) => {
|
||||
eprintln!("Error: {}.", e);
|
||||
print!("{}", HELP);
|
||||
std::process::exit(1);
|
||||
}
|
||||
};
|
||||
|
||||
if let Err(e) = process(args) {
|
||||
eprintln!("Error: {}.", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
fn parse_args() -> Result<Args, Box<dyn std::error::Error>> {
|
||||
let mut args = pico_args::Arguments::from_env();
|
||||
|
||||
if args.contains(["-h", "--help"]) {
|
||||
print!("{}", HELP);
|
||||
std::process::exit(0);
|
||||
}
|
||||
|
||||
let variations = args.opt_value_from_fn("--variations", parse_variations)?;
|
||||
let free = args.finish();
|
||||
if free.len() != 2 {
|
||||
return Err("invalid number of arguments".into());
|
||||
}
|
||||
|
||||
Ok(Args {
|
||||
variations: variations.unwrap_or_default(),
|
||||
ttf_path: PathBuf::from(&free[0]),
|
||||
svg_path: PathBuf::from(&free[1]),
|
||||
})
|
||||
}
|
||||
|
||||
fn parse_variations(s: &str) -> Result<Vec<ttf::Variation>, &'static str> {
|
||||
let mut variations = Vec::new();
|
||||
for part in s.split(';') {
|
||||
let mut iter = part.split(':');
|
||||
|
||||
let axis = iter.next().ok_or("failed to parse a variation")?;
|
||||
let axis = ttf::Tag::from_bytes_lossy(axis.as_bytes());
|
||||
|
||||
let value = iter.next().ok_or("failed to parse a variation")?;
|
||||
let value: f32 = value.parse().map_err(|_| "failed to parse a variation")?;
|
||||
|
||||
variations.push(ttf::Variation { axis, value });
|
||||
}
|
||||
|
||||
Ok(variations)
|
||||
}
|
||||
|
||||
fn process(args: Args) -> Result<(), Box<dyn std::error::Error>> {
|
||||
let font_data = std::fs::read(&args.ttf_path)?;
|
||||
|
||||
// Exclude IO operations.
|
||||
let now = std::time::Instant::now();
|
||||
|
||||
#[allow(unused_mut)]
|
||||
let mut face = ttf::Face::from_slice(&font_data, 0)?;
|
||||
if face.is_variable() {
|
||||
#[cfg(feature = "variable-fonts")] {
|
||||
for variation in args.variations {
|
||||
face.set_variation(variation.axis, variation.value)
|
||||
.ok_or("failed to create variation coordinates")?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
let units_per_em = face.units_per_em();
|
||||
let scale = FONT_SIZE / units_per_em as f64;
|
||||
|
||||
let cell_size = face.height() as f64 * FONT_SIZE / units_per_em as f64;
|
||||
let rows = (face.number_of_glyphs() as f64 / COLUMNS as f64).ceil() as u32;
|
||||
|
||||
let mut svg = xmlwriter::XmlWriter::with_capacity(
|
||||
face.number_of_glyphs() as usize * 512,
|
||||
xmlwriter::Options::default(),
|
||||
);
|
||||
svg.start_element("svg");
|
||||
svg.write_attribute("xmlns", "http://www.w3.org/2000/svg");
|
||||
svg.write_attribute("xmlns:xlink", "http://www.w3.org/1999/xlink");
|
||||
svg.write_attribute_fmt(
|
||||
"viewBox",
|
||||
format_args!("{} {} {} {}", 0, 0, cell_size * COLUMNS as f64, cell_size * rows as f64),
|
||||
);
|
||||
|
||||
draw_grid(face.number_of_glyphs(), cell_size, &mut svg);
|
||||
|
||||
let mut path_buf = String::with_capacity(256);
|
||||
let mut row = 0;
|
||||
let mut column = 0;
|
||||
for id in 0..face.number_of_glyphs() {
|
||||
let x = column as f64 * cell_size;
|
||||
let y = row as f64 * cell_size;
|
||||
|
||||
svg.start_element("text");
|
||||
svg.write_attribute("x", &(x + 2.0));
|
||||
svg.write_attribute("y", &(y + cell_size - 4.0));
|
||||
svg.write_attribute("font-size", "36");
|
||||
svg.write_attribute("fill", "gray");
|
||||
svg.write_text_fmt(format_args!("{}", &id));
|
||||
svg.end_element();
|
||||
|
||||
if let Some(img) = face.glyph_raster_image(ttf::GlyphId(id), std::u16::MAX) {
|
||||
svg.start_element("image");
|
||||
svg.write_attribute("x", &(x + 2.0 + img.x as f64));
|
||||
svg.write_attribute("y", &(y - img.y as f64));
|
||||
svg.write_attribute("width", &img.width);
|
||||
svg.write_attribute("height", &img.height);
|
||||
svg.write_attribute_raw("xlink:href", |buf| {
|
||||
buf.extend_from_slice(b"data:image/png;base64, ");
|
||||
|
||||
let mut enc = base64::write::EncoderWriter::new(buf, base64::STANDARD);
|
||||
enc.write_all(img.data).unwrap();
|
||||
enc.finish().unwrap();
|
||||
});
|
||||
svg.end_element();
|
||||
} else if let Some(img) = face.glyph_svg_image(ttf::GlyphId(id)) {
|
||||
svg.start_element("image");
|
||||
svg.write_attribute("x", &(x + 2.0));
|
||||
svg.write_attribute("y", &(y + cell_size));
|
||||
svg.write_attribute("width", &cell_size);
|
||||
svg.write_attribute("height", &cell_size);
|
||||
svg.write_attribute_raw("xlink:href", |buf| {
|
||||
buf.extend_from_slice(b"data:image/svg+xml;base64, ");
|
||||
|
||||
let mut enc = base64::write::EncoderWriter::new(buf, base64::STANDARD);
|
||||
enc.write_all(img).unwrap();
|
||||
enc.finish().unwrap();
|
||||
});
|
||||
svg.end_element();
|
||||
} else {
|
||||
glyph_to_path(
|
||||
x,
|
||||
y,
|
||||
&face,
|
||||
ttf::GlyphId(id),
|
||||
cell_size,
|
||||
scale,
|
||||
&mut svg,
|
||||
&mut path_buf,
|
||||
);
|
||||
}
|
||||
|
||||
column += 1;
|
||||
if column == COLUMNS {
|
||||
column = 0;
|
||||
row += 1;
|
||||
}
|
||||
}
|
||||
|
||||
println!("Elapsed: {}ms", now.elapsed().as_micros() as f64 / 1000.0);
|
||||
|
||||
std::fs::write(&args.svg_path, &svg.end_document())?;
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn draw_grid(
|
||||
n_glyphs: u16,
|
||||
cell_size: f64,
|
||||
svg: &mut xmlwriter::XmlWriter,
|
||||
) {
|
||||
let columns = COLUMNS;
|
||||
let rows = (n_glyphs as f64 / columns as f64).ceil() as u32;
|
||||
|
||||
let width = columns as f64 * cell_size;
|
||||
let height = rows as f64 * cell_size;
|
||||
|
||||
svg.start_element("path");
|
||||
svg.write_attribute("fill", "none");
|
||||
svg.write_attribute("stroke", "black");
|
||||
svg.write_attribute("stroke-width", "5");
|
||||
|
||||
let mut path = String::with_capacity(256);
|
||||
|
||||
use std::fmt::Write;
|
||||
let mut x = 0.0;
|
||||
for _ in 0..=columns {
|
||||
write!(&mut path, "M {} {} L {} {} ", x, 0.0, x, height).unwrap();
|
||||
x += cell_size;
|
||||
}
|
||||
|
||||
let mut y = 0.0;
|
||||
for _ in 0..=rows {
|
||||
write!(&mut path, "M {} {} L {} {} ", 0.0, y, width, y).unwrap();
|
||||
y += cell_size;
|
||||
}
|
||||
|
||||
path.pop();
|
||||
|
||||
svg.write_attribute("d", &path);
|
||||
svg.end_element();
|
||||
}
|
||||
|
||||
fn glyph_to_path(
|
||||
x: f64,
|
||||
y: f64,
|
||||
face: &ttf::Face,
|
||||
glyph_id: ttf::GlyphId,
|
||||
cell_size: f64,
|
||||
scale: f64,
|
||||
svg: &mut xmlwriter::XmlWriter,
|
||||
path_buf: &mut String,
|
||||
) {
|
||||
path_buf.clear();
|
||||
let mut builder = Builder(path_buf);
|
||||
let bbox = match face.outline_glyph(glyph_id, &mut builder) {
|
||||
Some(v) => v,
|
||||
None => return,
|
||||
};
|
||||
if !path_buf.is_empty() {
|
||||
path_buf.pop(); // remove trailing space
|
||||
}
|
||||
|
||||
let bbox_w = (bbox.x_max as f64 - bbox.x_min as f64) * scale;
|
||||
let dx = (cell_size - bbox_w) / 2.0;
|
||||
let y = y + cell_size + face.descender() as f64 * scale;
|
||||
|
||||
let transform = format!("matrix({} 0 0 {} {} {})", scale, -scale, x + dx, y);
|
||||
|
||||
svg.start_element("path");
|
||||
svg.write_attribute("d", path_buf);
|
||||
svg.write_attribute("transform", &transform);
|
||||
svg.end_element();
|
||||
|
||||
{
|
||||
let bbox_h = (bbox.y_max as f64 - bbox.y_min as f64) * scale;
|
||||
let bbox_x = x + dx + bbox.x_min as f64 * scale;
|
||||
let bbox_y = y - bbox.y_max as f64 * scale;
|
||||
|
||||
svg.start_element("rect");
|
||||
svg.write_attribute("x", &bbox_x);
|
||||
svg.write_attribute("y", &bbox_y);
|
||||
svg.write_attribute("width", &bbox_w);
|
||||
svg.write_attribute("height", &bbox_h);
|
||||
svg.write_attribute("fill", "none");
|
||||
svg.write_attribute("stroke", "green");
|
||||
svg.end_element();
|
||||
}
|
||||
}
|
||||
|
||||
struct Builder<'a>(&'a mut String);
|
||||
|
||||
impl ttf::OutlineBuilder for Builder<'_> {
|
||||
fn move_to(&mut self, x: f32, y: f32) {
|
||||
use std::fmt::Write;
|
||||
write!(self.0, "M {} {} ", x, y).unwrap()
|
||||
}
|
||||
|
||||
fn line_to(&mut self, x: f32, y: f32) {
|
||||
use std::fmt::Write;
|
||||
write!(self.0, "L {} {} ", x, y).unwrap()
|
||||
}
|
||||
|
||||
fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
|
||||
use std::fmt::Write;
|
||||
write!(self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap()
|
||||
}
|
||||
|
||||
fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
|
||||
use std::fmt::Write;
|
||||
write!(self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap()
|
||||
}
|
||||
|
||||
fn close(&mut self) {
|
||||
self.0.push_str("Z ")
|
||||
}
|
||||
}
|
|
@ -0,0 +1 @@
*.wasm
@ -0,0 +1,18 @@
# ttf-parser as a WebAssembly module

## Build

```sh
rustup target add wasm32-unknown-unknown

cargo build --target wasm32-unknown-unknown --release --manifest-path ../../c-api/Cargo.toml
cp ../../c-api/target/wasm32-unknown-unknown/release/ttfparser.wasm .
```

## Run

You can use any webserver that can serve `index.html`. Here is a Python example:

```sh
python -m http.server
```
Binary file not shown.
|
@ -0,0 +1,67 @@
|
|||
<h2>ttf-parser in WebAssembly</h2>
|
||||
<p><small>(supports font files drag and drop)</small></p>
|
||||
<p><span id="fileName">TTC.ttc</span>:</p>
|
||||
<p><code>ttfp_fonts_in_collection():</code> <code id="fontsInCollection"></code></p>
|
||||
<p><code>ttfp_is_variable():</code> <code id="isVariable"></code></p>
|
||||
<p><code>ttfp_get_weight():</code> <code id="fontWeight"></code></p>
|
||||
<script>
|
||||
'use strict';
|
||||
|
||||
let wasm;
|
||||
|
||||
function update(fontBlob) {
|
||||
const exports = wasm.instance.exports;
|
||||
|
||||
// How our heap is structured: since ttf-parser doesn't allocate anything,
// it is all ours and we can decide how to use it.
|
||||
const heapStart = exports.__heap_base.value;
|
||||
const fontHandlerAddress = heapStart;
|
||||
const fontHandlerLength = exports.ttfp_face_size_of();
|
||||
const fontDataAddress = heapStart + fontHandlerLength;
|
||||
const fontDataLength = fontBlob.length;
|
||||
|
||||
// Copy the fetched blob into the WebAssembly instance's memory.
|
||||
const heapu8 = new Uint8Array(exports.memory.buffer);
|
||||
heapu8.set(fontBlob, fontDataAddress);
|
||||
|
||||
fontsInCollection.textContent = exports.ttfp_fonts_in_collection(fontDataAddress, fontDataLength);
|
||||
|
||||
// Create font handler
|
||||
exports.ttfp_face_init(fontDataAddress, fontDataLength, 0/*face index*/, fontHandlerAddress);
|
||||
|
||||
isVariable.textContent = exports.ttfp_is_variable(fontHandlerAddress);
|
||||
fontWeight.textContent = exports.ttfp_get_weight(fontHandlerAddress);
|
||||
}
|
||||
|
||||
fetch('ttfparser.wasm').then(x => x.arrayBuffer()).then(WebAssembly.instantiate).then(result => {
|
||||
wasm = result;
|
||||
// Extend the wasm machine's heap once, now that we are here; each page is 64 KiB
|
||||
wasm.instance.exports.memory.grow(400);
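// (400 pages * 64 KiB = 25 MiB of heap for the face handle plus the raw font data)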
|
||||
|
||||
// Could be done in parallel using Promise.all
|
||||
fetch('TTC.ttc').then(x => x.arrayBuffer()).then(result => {
|
||||
update(new Uint8Array(result));
|
||||
});
|
||||
});
|
||||
|
||||
document.addEventListener('dragover', e => {
|
||||
e.stopPropagation(); e.preventDefault();
|
||||
}, false);
|
||||
document.addEventListener('dragleave', e => {
|
||||
e.stopPropagation(); e.preventDefault();
|
||||
}, false);
|
||||
document.addEventListener('drop', e => {
|
||||
e.stopPropagation(); e.preventDefault();
|
||||
handleFontUpdate(e.dataTransfer.files[0]);
|
||||
});
|
||||
// document.addEventListener('paste', e => {
|
||||
// handleFontUpdate(e.clipboardData.files[0]);
|
||||
// });
|
||||
function handleFontUpdate(file) {
|
||||
if (!file) return;
|
||||
fileName.textContent = file.name;
|
||||
const reader = new FileReader();
|
||||
reader.addEventListener('load', () => update(new Uint8Array(reader.result)));
|
||||
reader.readAsArrayBuffer(file);
|
||||
}
|
||||
</script>
|
|
@ -0,0 +1,12 @@
|
|||
project('ttf-parser', 'rust')
|
||||
|
||||
add_project_arguments(['--edition=2018'], language: 'rust')
|
||||
|
||||
ttf_parser = static_library('ttf_parser_capi', 'c-api/lib.rs', rust_crate_type: 'staticlib',
|
||||
link_with: static_library('ttf_parser', 'src/lib.rs'),
|
||||
)
|
||||
|
||||
ttf_parser_dep = declare_dependency(
|
||||
link_with: ttf_parser,
|
||||
include_directories: 'c-api/',
|
||||
)
|
|
@ -0,0 +1,557 @@
|
|||
/*!
|
||||
A collection of [Apple Advanced Typography](
|
||||
https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6AATIntro.html)
|
||||
related types.
|
||||
*/
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, LazyArray16, Offset, Offset16, Offset32, NumFrom};
|
||||
|
||||
/// Predefined states.
|
||||
pub mod state {
|
||||
#![allow(missing_docs)]
|
||||
pub const START_OF_TEXT: u16 = 0;
|
||||
}
|
||||
|
||||
/// Predefined classes.
|
||||
///
|
||||
/// Search for _Class Code_ in [Apple Advanced Typography Font Tables](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
|
||||
pub mod class {
|
||||
#![allow(missing_docs)]
|
||||
pub const END_OF_TEXT: u8 = 0;
|
||||
pub const OUT_OF_BOUNDS: u8 = 1;
|
||||
pub const DELETED_GLYPH: u8 = 2;
|
||||
}
|
||||
|
||||
/// A State Table entry.
|
||||
///
|
||||
/// Used by legacy and extended tables.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct GenericStateEntry<T: FromData> {
|
||||
/// A new state.
|
||||
pub new_state: u16,
|
||||
/// Entry flags.
|
||||
pub flags: u16,
|
||||
/// Additional data.
|
||||
///
|
||||
/// Use `()` if no data is expected.
|
||||
pub extra: T,
|
||||
}
|
||||
|
||||
impl<T: FromData> FromData for GenericStateEntry<T> {
|
||||
const SIZE: usize = 4 + T::SIZE;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(GenericStateEntry {
|
||||
new_state: s.read::<u16>()?,
|
||||
flags: s.read::<u16>()?,
|
||||
extra: s.read::<T>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl<T: FromData> GenericStateEntry<T> {
|
||||
/// Checks that entry has an offset.
|
||||
#[inline]
|
||||
pub fn has_offset(&self) -> bool {
|
||||
self.flags & 0x3FFF != 0
|
||||
}
|
||||
|
||||
/// Returns a value offset.
|
||||
///
|
||||
/// Used by kern::format1 subtable.
|
||||
#[inline]
|
||||
pub fn value_offset(&self) -> ValueOffset {
|
||||
ValueOffset(self.flags & 0x3FFF)
|
||||
}
|
||||
|
||||
/// If set, reset the kerning data (clear the stack).
|
||||
#[inline]
|
||||
pub fn has_reset(&self) -> bool {
|
||||
self.flags & 0x2000 != 0
|
||||
}
|
||||
|
||||
/// If true, advance to the next glyph before going to the new state (this inverts the AAT *don't advance* flag).
|
||||
#[inline]
|
||||
pub fn has_advance(&self) -> bool {
|
||||
self.flags & 0x4000 == 0
|
||||
}
|
||||
|
||||
/// If set, push this glyph on the kerning stack.
|
||||
#[inline]
|
||||
pub fn has_push(&self) -> bool {
|
||||
self.flags & 0x8000 != 0
|
||||
}
|
||||
|
||||
/// If set, remember this glyph as the marked glyph.
|
||||
///
|
||||
/// Used by kerx::format4 subtable.
|
||||
///
|
||||
/// Yes, the same as [`has_push`](Self::has_push).
|
||||
#[inline]
|
||||
pub fn has_mark(&self) -> bool {
|
||||
self.flags & 0x8000 != 0
|
||||
}
|
||||
}
|
||||
|
||||
/// A legacy state entry used by [StateTable].
|
||||
pub type StateEntry = GenericStateEntry<()>;
|
||||
|
||||
/// A type-safe wrapper for a kerning value offset.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct ValueOffset(u16);
|
||||
|
||||
impl ValueOffset {
|
||||
/// Returns the next offset.
|
||||
///
|
||||
/// After reaching `u16::MAX` it wraps around to 0.
|
||||
#[inline]
|
||||
pub fn next(self) -> Self {
|
||||
ValueOffset(self.0.wrapping_add(u16::SIZE as u16))
|
||||
}
|
||||
}
|
||||
|
||||
/// A [State Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
|
||||
///
|
||||
/// Also called `STHeader`.
|
||||
///
|
||||
/// Currently used by `kern` table.
|
||||
#[derive(Clone)]
|
||||
pub struct StateTable<'a> {
|
||||
number_of_classes: u16,
|
||||
first_glyph: GlyphId,
|
||||
class_table: &'a [u8],
|
||||
state_array_offset: u16,
|
||||
state_array: &'a [u8],
|
||||
entry_table: &'a [u8],
|
||||
actions: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> StateTable<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let number_of_classes: u16 = s.read()?;
|
||||
// Note that in a format1 subtable, offsets are not from the subtable start,
// but from subtable start + `header_size`,
// so there is no need to subtract the `header_size`.
|
||||
let class_table_offset = s.read::<Offset16>()?.to_usize();
|
||||
let state_array_offset = s.read::<Offset16>()?.to_usize();
|
||||
let entry_table_offset = s.read::<Offset16>()?.to_usize();
|
||||
// Ignore `values_offset` since we don't use it.
|
||||
|
||||
// Parse class subtable.
|
||||
let mut s = Stream::new_at(data, class_table_offset)?;
|
||||
let first_glyph: GlyphId = s.read()?;
|
||||
let number_of_glyphs: u16 = s.read()?;
|
||||
// The class table contains u8, so it's easier to use just a slice
|
||||
// instead of a LazyArray.
|
||||
let class_table = s.read_bytes(usize::from(number_of_glyphs))?;
|
||||
|
||||
Some(StateTable {
|
||||
number_of_classes,
|
||||
first_glyph,
|
||||
class_table,
|
||||
state_array_offset: state_array_offset as u16,
|
||||
// We don't know the actual data size and it's fairly expensive to calculate,
// so we simply store all the data past the offset,
// even though the regions may overlap.
|
||||
state_array: data.get(state_array_offset..)?,
|
||||
entry_table: data.get(entry_table_offset..)?,
|
||||
// `ValueOffset` defines an offset from the start of the subtable data.
|
||||
// We do not check that the provided offset is actually after `values_offset`.
|
||||
actions: data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a glyph class.
|
||||
#[inline]
|
||||
pub fn class(&self, glyph_id: GlyphId) -> Option<u8> {
|
||||
if glyph_id.0 == 0xFFFF {
|
||||
return Some(class::DELETED_GLYPH as u8);
|
||||
}
|
||||
|
||||
let idx = glyph_id.0.checked_sub(self.first_glyph.0)?;
|
||||
self.class_table.get(usize::from(idx)).copied()
|
||||
}
|
||||
|
||||
/// Returns a class entry.
|
||||
#[inline]
|
||||
pub fn entry(&self, state: u16, mut class: u8) -> Option<StateEntry> {
|
||||
if u16::from(class) >= self.number_of_classes {
|
||||
class = class::OUT_OF_BOUNDS as u8;
|
||||
}
|
||||
|
||||
let entry_idx = self.state_array.get(
|
||||
usize::from(state) * usize::from(self.number_of_classes) + usize::from(class)
|
||||
)?;
|
||||
|
||||
Stream::read_at(self.entry_table, usize::from(*entry_idx) * StateEntry::SIZE)
|
||||
}
|
||||
|
||||
/// Returns kerning at offset.
|
||||
#[inline]
|
||||
pub fn kerning(&self, offset: ValueOffset) -> Option<i16> {
|
||||
Stream::read_at(self.actions, usize::from(offset.0))
|
||||
}
|
||||
|
||||
/// Produces a new state.
|
||||
#[inline]
|
||||
pub fn new_state(&self, state: u16) -> u16 {
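// The `new_state` stored in a legacy state table entry is a byte offset into the state array,
// so convert it back into a state index (each state row is `number_of_classes` bytes wide).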
|
||||
let n = (i32::from(state) - i32::from(self.state_array_offset))
|
||||
/ i32::from(self.number_of_classes);
|
||||
|
||||
use core::convert::TryFrom;
|
||||
u16::try_from(n).unwrap_or(0)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for StateTable<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "StateTable {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Extended State Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
|
||||
///
|
||||
/// Also called `STXHeader`.
|
||||
///
|
||||
/// Currently used by `kerx` and `morx` tables.
|
||||
#[derive(Clone)]
|
||||
pub struct ExtendedStateTable<'a, T> {
|
||||
number_of_classes: u32,
|
||||
lookup: Lookup<'a>,
|
||||
state_array: &'a [u8],
|
||||
entry_table: &'a [u8],
|
||||
entry_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> ExtendedStateTable<'a, T> {
|
||||
// TODO: make private
|
||||
/// Parses an Extended State Table from a stream.
|
||||
///
|
||||
/// `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, s: &mut Stream<'a>) -> Option<Self> {
|
||||
let data = s.tail()?;
|
||||
|
||||
let number_of_classes = s.read::<u32>()?;
|
||||
// Note that offsets are not from the subtable start,
// but from subtable start + `header_size`,
// so there is no need to subtract the `header_size`.
|
||||
let lookup_table_offset = s.read::<Offset32>()?.to_usize();
|
||||
let state_array_offset = s.read::<Offset32>()?.to_usize();
|
||||
let entry_table_offset = s.read::<Offset32>()?.to_usize();
|
||||
|
||||
Some(ExtendedStateTable {
|
||||
number_of_classes,
|
||||
lookup: Lookup::parse(number_of_glyphs, data.get(lookup_table_offset..)?)?,
|
||||
// We don't know the actual data size and it's fairly expensive to calculate,
// so we simply store all the data past the offset,
// even though the regions may overlap.
|
||||
state_array: data.get(state_array_offset..)?,
|
||||
entry_table: data.get(entry_table_offset..)?,
|
||||
entry_type: core::marker::PhantomData,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a glyph class.
|
||||
#[inline]
|
||||
pub fn class(&self, glyph_id: GlyphId) -> Option<u16> {
|
||||
if glyph_id.0 == 0xFFFF {
|
||||
return Some(u16::from(class::DELETED_GLYPH));
|
||||
}
|
||||
|
||||
self.lookup.value(glyph_id)
|
||||
}
|
||||
|
||||
/// Returns a class entry.
|
||||
#[inline]
|
||||
pub fn entry(&self, state: u16, mut class: u16) -> Option<GenericStateEntry<T>> {
|
||||
if u32::from(class) >= self.number_of_classes {
|
||||
class = u16::from(class::OUT_OF_BOUNDS);
|
||||
}
|
||||
|
||||
let state_idx =
|
||||
usize::from(state) * usize::num_from(self.number_of_classes) + usize::from(class);
|
||||
|
||||
let entry_idx: u16 = Stream::read_at(self.state_array, state_idx * u16::SIZE)?;
|
||||
Stream::read_at(self.entry_table, usize::from(entry_idx) * GenericStateEntry::<T>::SIZE)
|
||||
}
|
||||
}
|
||||
|
||||
impl<T> core::fmt::Debug for ExtendedStateTable<'_, T> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "ExtendedStateTable {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [lookup table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html).
|
||||
///
|
||||
/// u32 values in Format10 tables will be truncated to u16.
|
||||
/// u64 values in Format10 tables are not supported.
|
||||
#[derive(Clone)]
|
||||
pub struct Lookup<'a> {
|
||||
data: LookupInner<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Lookup<'a> {
|
||||
/// Parses a lookup table from raw data.
|
||||
///
|
||||
/// `number_of_glyphs` is from the `maxp` table.
|
||||
#[inline]
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
LookupInner::parse(number_of_glyphs, data).map(|data| Self { data })
|
||||
}
|
||||
|
||||
/// Returns a value associated with the specified glyph.
|
||||
#[inline]
|
||||
pub fn value(&self, glyph_id: GlyphId) -> Option<u16> {
|
||||
self.data.value(glyph_id)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Lookup<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Lookup {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone)]
|
||||
enum LookupInner<'a> {
|
||||
Format1(LazyArray16<'a, u16>),
|
||||
Format2(BinarySearchTable<'a, LookupSegment>),
|
||||
Format4(BinarySearchTable<'a, LookupSegment>, &'a [u8]),
|
||||
Format6(BinarySearchTable<'a, LookupSingle>),
|
||||
Format8 {
|
||||
first_glyph: u16,
|
||||
values: LazyArray16<'a, u16>
|
||||
},
|
||||
Format10 {
|
||||
value_size: u16,
|
||||
first_glyph: u16,
|
||||
glyph_count: u16,
|
||||
data: &'a [u8],
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> LookupInner<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let format = s.read::<u16>()?;
|
||||
match format {
|
||||
0 => {
|
||||
let values = s.read_array16::<u16>(number_of_glyphs.get())?;
|
||||
Some(Self::Format1(values))
|
||||
}
|
||||
2 => {
|
||||
let bsearch = BinarySearchTable::<LookupSegment>::parse(s.tail()?)?;
|
||||
Some(Self::Format2(bsearch))
|
||||
}
|
||||
4 => {
|
||||
let bsearch = BinarySearchTable::<LookupSegment>::parse(s.tail()?)?;
|
||||
Some(Self::Format4(bsearch, data))
|
||||
}
|
||||
6 => {
|
||||
let bsearch = BinarySearchTable::<LookupSingle>::parse(s.tail()?)?;
|
||||
Some(Self::Format6(bsearch))
|
||||
}
|
||||
8 => {
|
||||
let first_glyph = s.read::<u16>()?;
|
||||
let glyph_count = s.read::<u16>()?;
|
||||
let values = s.read_array16::<u16>(glyph_count)?;
|
||||
Some(Self::Format8 { first_glyph, values })
|
||||
}
|
||||
10 => {
|
||||
let value_size = s.read::<u16>()?;
|
||||
let first_glyph = s.read::<u16>()?;
|
||||
let glyph_count = s.read::<u16>()?;
|
||||
Some(Self::Format10 { value_size, first_glyph, glyph_count, data: s.tail()? })
|
||||
}
|
||||
_ => {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn value(&self, glyph_id: GlyphId) -> Option<u16> {
|
||||
match self {
|
||||
Self::Format1(values) => {
|
||||
values.get(glyph_id.0)
|
||||
}
|
||||
Self::Format2(ref bsearch) => {
|
||||
bsearch.get(glyph_id).map(|v| v.value)
|
||||
}
|
||||
Self::Format4(ref bsearch, data) => {
|
||||
// In format 4, LookupSegment contains an offset to a list of u16 values,
// one value for each glyph in the LookupSegment range.
|
||||
let segment = bsearch.get(glyph_id)?;
|
||||
let index = glyph_id.0.checked_sub(segment.first_glyph)?;
|
||||
let offset = usize::from(segment.value) + u16::SIZE * usize::from(index);
|
||||
Stream::read_at::<u16>(data, offset)
|
||||
}
|
||||
Self::Format6(ref bsearch) => {
|
||||
bsearch.get(glyph_id).map(|v| v.value)
|
||||
}
|
||||
Self::Format8 { first_glyph, values } => {
|
||||
let idx = glyph_id.0.checked_sub(*first_glyph)?;
|
||||
values.get(idx)
|
||||
}
|
||||
Self::Format10 { value_size, first_glyph, glyph_count, data } => {
|
||||
let idx = glyph_id.0.checked_sub(*first_glyph)?;
|
||||
let mut s = Stream::new(data);
|
||||
match value_size {
|
||||
1 => s.read_array16::<u8>(*glyph_count)?.get(idx).map(u16::from),
|
||||
2 => s.read_array16::<u16>(*glyph_count)?.get(idx),
|
||||
// TODO: we should return u32 here, but this is not supported yet
|
||||
4 => s.read_array16::<u32>(*glyph_count)?.get(idx).map(|n| n as u16),
|
||||
_ => None, // the spec also defines value_size 8 (u64), which is not supported
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A binary searching table as defined at
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6Tables.html
|
||||
#[derive(Clone)]
|
||||
struct BinarySearchTable<'a, T: BinarySearchValue> {
|
||||
values: LazyArray16<'a, T>,
|
||||
len: NonZeroU16, // values length excluding termination segment
|
||||
}
|
||||
|
||||
impl<'a, T: BinarySearchValue + core::fmt::Debug> BinarySearchTable<'a, T> {
|
||||
#[inline(never)]
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let segment_size = s.read::<u16>()?;
|
||||
let number_of_segments = s.read::<u16>()?;
|
||||
s.advance(6); // search_range + entry_selector + range_shift
|
||||
|
||||
if usize::from(segment_size) != T::SIZE {
|
||||
return None;
|
||||
}
|
||||
|
||||
if number_of_segments == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let values = s.read_array16::<T>(number_of_segments)?;
|
||||
|
||||
// 'The number of termination values that need to be included is table-specific.
|
||||
// The value that indicates binary search termination is 0xFFFF.'
|
||||
let mut len = number_of_segments;
|
||||
if values.last()?.is_termination() {
|
||||
len = len.checked_sub(1)?;
|
||||
}
|
||||
|
||||
Some(BinarySearchTable {
|
||||
len: NonZeroU16::new(len)?,
|
||||
values,
|
||||
})
|
||||
}
|
||||
|
||||
fn get(&self, key: GlyphId) -> Option<T> {
|
||||
let mut min = 0;
|
||||
let mut max = (self.len.get() as isize) - 1;
|
||||
while min <= max {
|
||||
let mid = (min + max) / 2;
|
||||
let v = self.values.get(mid as u16)?;
|
||||
match v.contains(key) {
|
||||
core::cmp::Ordering::Less => max = mid - 1,
|
||||
core::cmp::Ordering::Greater => min = mid + 1,
|
||||
core::cmp::Ordering::Equal => return Some(v),
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
trait BinarySearchValue: FromData {
|
||||
fn is_termination(&self) -> bool;
|
||||
fn contains(&self, glyph_id: GlyphId) -> core::cmp::Ordering;
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct LookupSegment {
|
||||
last_glyph: u16,
|
||||
first_glyph: u16,
|
||||
value: u16,
|
||||
}
|
||||
|
||||
impl FromData for LookupSegment {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(LookupSegment {
|
||||
last_glyph: s.read::<u16>()?,
|
||||
first_glyph: s.read::<u16>()?,
|
||||
value: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl BinarySearchValue for LookupSegment {
|
||||
#[inline]
|
||||
fn is_termination(&self) -> bool {
|
||||
self.last_glyph == 0xFFFF && self.first_glyph == 0xFFFF
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn contains(&self, id: GlyphId) -> core::cmp::Ordering {
|
||||
if id.0 < self.first_glyph {
|
||||
core::cmp::Ordering::Less
|
||||
} else if id.0 <= self.last_glyph {
|
||||
core::cmp::Ordering::Equal
|
||||
} else {
|
||||
core::cmp::Ordering::Greater
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct LookupSingle {
|
||||
glyph: u16,
|
||||
value: u16,
|
||||
}
|
||||
|
||||
impl FromData for LookupSingle {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(LookupSingle {
|
||||
glyph: s.read::<u16>()?,
|
||||
value: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl BinarySearchValue for LookupSingle {
|
||||
#[inline]
|
||||
fn is_termination(&self) -> bool {
|
||||
self.glyph == 0xFFFF
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn contains(&self, id: GlyphId) -> core::cmp::Ordering {
|
||||
id.0.cmp(&self.glyph)
|
||||
}
|
||||
}
|
|
@ -0,0 +1,130 @@
|
|||
use crate::parser::{FromSlice, LazyArray16, LazyOffsetArray16, Stream};
|
||||
use super::{ClassDefinition, Coverage, SequenceLookupRecord};
|
||||
|
||||
/// A [Chained Contextual Lookup Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#chseqctxt1).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum ChainedContextLookup<'a> {
|
||||
/// Simple glyph contexts.
|
||||
Format1 {
|
||||
coverage: Coverage<'a>,
|
||||
sets: ChainedSequenceRuleSets<'a>,
|
||||
},
|
||||
/// Class-based glyph contexts.
|
||||
Format2 {
|
||||
coverage: Coverage<'a>,
|
||||
backtrack_classes: ClassDefinition<'a>,
|
||||
input_classes: ClassDefinition<'a>,
|
||||
lookahead_classes: ClassDefinition<'a>,
|
||||
sets: ChainedSequenceRuleSets<'a>,
|
||||
},
|
||||
/// Coverage-based glyph contexts.
|
||||
Format3 {
|
||||
coverage: Coverage<'a>,
|
||||
backtrack_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
input_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
lookahead_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
lookups: LazyArray16<'a, SequenceLookupRecord>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> ChainedContextLookup<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self::Format1 {
|
||||
coverage,
|
||||
sets: ChainedSequenceRuleSets::new(data, offsets),
|
||||
})
|
||||
}
|
||||
2 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let backtrack_classes = ClassDefinition::parse(s.read_at_offset16(data)?)?;
|
||||
let input_classes = ClassDefinition::parse(s.read_at_offset16(data)?)?;
|
||||
let lookahead_classes = ClassDefinition::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self::Format2 {
|
||||
coverage,
|
||||
backtrack_classes,
|
||||
input_classes,
|
||||
lookahead_classes,
|
||||
sets: LazyOffsetArray16::new(data, offsets),
|
||||
})
|
||||
}
|
||||
3 => {
|
||||
let backtrack_count = s.read::<u16>()?;
|
||||
let backtrack_coverages = s.read_array16(backtrack_count)?;
|
||||
let input_count = s.read::<u16>()?;
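// The first of the `input_count` input coverage offsets is read as the subtable's
// main coverage; the remaining `input_count - 1` are stored as `input_coverages`.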
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let input_coverages = s.read_array16(input_count.checked_sub(1)?)?;
|
||||
let lookahead_count = s.read::<u16>()?;
|
||||
let lookahead_coverages = s.read_array16(lookahead_count)?;
|
||||
let lookup_count = s.read::<u16>()?;
|
||||
let lookups = s.read_array16(lookup_count)?;
|
||||
Some(Self::Format3 {
|
||||
coverage,
|
||||
backtrack_coverages: LazyOffsetArray16::new(data, backtrack_coverages),
|
||||
input_coverages: LazyOffsetArray16::new(data, input_coverages),
|
||||
lookahead_coverages: LazyOffsetArray16::new(data, lookahead_coverages),
|
||||
lookups,
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Format1 { coverage, .. } => *coverage,
|
||||
Self::Format2 { coverage, .. } => *coverage,
|
||||
Self::Format3 { coverage, .. } => *coverage,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A list of [`ChainedSequenceRule`] sets.
|
||||
pub type ChainedSequenceRuleSets<'a> = LazyOffsetArray16<'a, ChainedSequenceRuleSet<'a>>;
|
||||
|
||||
/// A set of [`ChainedSequenceRule`].
|
||||
pub type ChainedSequenceRuleSet<'a> = LazyOffsetArray16<'a, ChainedSequenceRule<'a>>;
|
||||
|
||||
impl<'a> FromSlice<'a> for ChainedSequenceRuleSet<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
Self::parse(data)
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Chained Sequence Rule](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#chained-sequence-context-format-1-simple-glyph-contexts).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct ChainedSequenceRule<'a> {
|
||||
/// Contains either glyph IDs or glyph Classes.
|
||||
pub backtrack: LazyArray16<'a, u16>,
|
||||
pub input: LazyArray16<'a, u16>,
|
||||
/// Contains either glyph IDs or glyph Classes.
|
||||
pub lookahead: LazyArray16<'a, u16>,
|
||||
pub lookups: LazyArray16<'a, SequenceLookupRecord>,
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for ChainedSequenceRule<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let backtrack_count = s.read::<u16>()?;
|
||||
let backtrack = s.read_array16(backtrack_count)?;
|
||||
let input_count = s.read::<u16>()?;
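// The input sequence starts with the second glyph (the first one is matched
// via the parent coverage/rule set), hence `input_count - 1` entries.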
|
||||
let input = s.read_array16(input_count.checked_sub(1)?)?;
|
||||
let lookahead_count = s.read::<u16>()?;
|
||||
let lookahead = s.read_array16(lookahead_count)?;
|
||||
let lookup_count = s.read::<u16>()?;
|
||||
let lookups = s.read_array16(lookup_count)?;
|
||||
Some(Self { backtrack, input, lookahead, lookups })
|
||||
}
|
||||
}
|
|
@ -0,0 +1,129 @@
|
|||
use crate::parser::{FromData, FromSlice, LazyArray16, LazyOffsetArray16, Stream};
|
||||
use super::{ClassDefinition, Coverage, LookupIndex};
|
||||
|
||||
/// A [Contextual Lookup Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#seqctxt1).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum ContextLookup<'a> {
|
||||
/// Simple glyph contexts.
|
||||
Format1 {
|
||||
coverage: Coverage<'a>,
|
||||
sets: SequenceRuleSets<'a>,
|
||||
},
|
||||
/// Class-based glyph contexts.
|
||||
Format2 {
|
||||
coverage: Coverage<'a>,
|
||||
classes: ClassDefinition<'a>,
|
||||
sets: SequenceRuleSets<'a>,
|
||||
},
|
||||
/// Coverage-based glyph contexts.
|
||||
Format3 {
|
||||
coverage: Coverage<'a>,
|
||||
coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
lookups: LazyArray16<'a, SequenceLookupRecord>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> ContextLookup<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self::Format1 {
|
||||
coverage,
|
||||
sets: SequenceRuleSets::new(data, offsets),
|
||||
})
|
||||
}
|
||||
2 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let classes = ClassDefinition::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self::Format2 {
|
||||
coverage,
|
||||
classes,
|
||||
sets: SequenceRuleSets::new(data, offsets),
|
||||
})
|
||||
}
|
||||
3 => {
|
||||
let input_count = s.read::<u16>()?;
|
||||
let lookup_count = s.read::<u16>()?;
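// The first coverage offset becomes the subtable's main coverage;
// the remaining `input_count - 1` are stored as `coverages`.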
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let coverages = s.read_array16(input_count.checked_sub(1)?)?;
|
||||
let lookups = s.read_array16(lookup_count)?;
|
||||
Some(Self::Format3 {
|
||||
coverage,
|
||||
coverages: LazyOffsetArray16::new(data, coverages),
|
||||
lookups,
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Format1 { coverage, .. } => *coverage,
|
||||
Self::Format2 { coverage, .. } => *coverage,
|
||||
Self::Format3 { coverage, .. } => *coverage,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A list of [`SequenceRuleSet`]s.
|
||||
pub type SequenceRuleSets<'a> = LazyOffsetArray16<'a, SequenceRuleSet<'a>>;
|
||||
|
||||
impl<'a> FromSlice<'a> for SequenceRuleSet<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
Self::parse(data)
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for SequenceRule<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let input_count = s.read::<u16>()?;
|
||||
let lookup_count = s.read::<u16>()?;
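// The input sequence starts with the second glyph (the first one is matched
// via the parent coverage/rule set), hence `input_count - 1` entries.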
|
||||
let input = s.read_array16(input_count.checked_sub(1)?)?;
|
||||
let lookups = s.read_array16(lookup_count)?;
|
||||
Some(Self { input, lookups })
|
||||
}
|
||||
}
|
||||
|
||||
/// A set of [`SequenceRule`]s.
|
||||
pub type SequenceRuleSet<'a> = LazyOffsetArray16<'a, SequenceRule<'a>>;
|
||||
|
||||
/// A sequence rule.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct SequenceRule<'a> {
|
||||
pub input: LazyArray16<'a, u16>,
|
||||
pub lookups: LazyArray16<'a, SequenceLookupRecord>,
|
||||
}
|
||||
|
||||
/// A sequence rule record.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct SequenceLookupRecord {
|
||||
pub sequence_index: u16,
|
||||
pub lookup_list_index: LookupIndex,
|
||||
}
|
||||
|
||||
impl FromData for SequenceLookupRecord {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
sequence_index: s.read::<u16>()?,
|
||||
lookup_list_index: s.read::<LookupIndex>()?,
|
||||
})
|
||||
}
|
||||
}
|
|
@ -0,0 +1,174 @@
|
|||
use crate::{NormalizedCoordinate, Tag};
|
||||
use crate::parser::{FromData, LazyArray16, LazyArray32};
|
||||
use crate::parser::{Offset, Offset32, Stream};
|
||||
use super::{Feature, FeatureIndex, VariationIndex, RecordListItem};
|
||||
|
||||
/// A [Feature Variations Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#featurevariations-table).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct FeatureVariations<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray32<'a, FeatureVariationRecord>,
|
||||
}
|
||||
|
||||
impl<'a> FeatureVariations<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let major_version = s.read::<u16>()?;
|
||||
s.skip::<u16>(); // minor version
|
||||
if major_version != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let count = s.read::<u32>()?;
|
||||
let records = s.read_array32(count)?;
|
||||
Some(Self { data, records })
|
||||
}
|
||||
|
||||
/// Returns a [`VariationIndex`] for variation coordinates.
|
||||
pub fn find_index(&self, coords: &[NormalizedCoordinate]) -> Option<VariationIndex> {
|
||||
for i in 0..self.records.len() {
|
||||
let record = self.records.get(i)?;
|
||||
let offset = record.conditions.to_usize();
|
||||
let set = ConditionSet::parse(self.data.get(offset..)?)?;
|
||||
if set.evaluate(coords) {
|
||||
return Some(i);
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
|
||||
/// Returns a [`Feature`] at specified indices.
|
||||
pub fn find_substitute(
|
||||
&self,
|
||||
feature_index: FeatureIndex,
|
||||
variation_index: VariationIndex,
|
||||
) -> Option<Feature<'a>> {
|
||||
let offset = self.records.get(variation_index)?.substitutions.to_usize();
|
||||
let subst = FeatureTableSubstitution::parse(self.data.get(offset..)?)?;
|
||||
subst.find_substitute(feature_index)
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct FeatureVariationRecord {
|
||||
conditions: Offset32,
|
||||
substitutions: Offset32,
|
||||
}
|
||||
|
||||
impl FromData for FeatureVariationRecord {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
conditions: s.read::<Offset32>()?,
|
||||
substitutions: s.read::<Offset32>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct ConditionSet<'a> {
|
||||
data: &'a [u8],
|
||||
conditions: LazyArray16<'a, Offset32>,
|
||||
}
|
||||
|
||||
impl<'a> ConditionSet<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let conditions = s.read_array16(count)?;
|
||||
Some(Self { data, conditions })
|
||||
}
|
||||
|
||||
fn evaluate(&self, coords: &[NormalizedCoordinate]) -> bool {
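// A condition set is satisfied only when every condition in it evaluates to true (logical AND).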
|
||||
self.conditions.into_iter().all(|offset| {
|
||||
self.data.get(offset.to_usize()..)
|
||||
.and_then(Condition::parse)
|
||||
.map_or(false, |c| c.evaluate(coords))
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
enum Condition {
|
||||
Format1 {
|
||||
axis_index: u16,
|
||||
filter_range_min: i16,
|
||||
filter_range_max: i16,
|
||||
}
|
||||
}
|
||||
|
||||
impl Condition {
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let format = s.read::<u16>()?;
|
||||
match format {
|
||||
1 => {
|
||||
let axis_index = s.read::<u16>()?;
|
||||
let filter_range_min = s.read::<i16>()?;
|
||||
let filter_range_max = s.read::<i16>()?;
|
||||
Some(Self::Format1 { axis_index, filter_range_min, filter_range_max })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
fn evaluate(&self, coords: &[NormalizedCoordinate]) -> bool {
|
||||
let Self::Format1 { axis_index, filter_range_min, filter_range_max } = *self;
|
||||
let coord = coords.get(usize::from(axis_index)).map(|c| c.get()).unwrap_or(0);
|
||||
filter_range_min <= coord && coord <= filter_range_max
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct FeatureTableSubstitution<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, FeatureTableSubstitutionRecord>,
|
||||
}
|
||||
|
||||
impl<'a> FeatureTableSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let major_version = s.read::<u16>()?;
|
||||
s.skip::<u16>(); // minor version
|
||||
if major_version != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16(count)?;
|
||||
Some(Self { data, records })
|
||||
}
|
||||
|
||||
fn find_substitute(&self, feature_index: FeatureIndex) -> Option<Feature<'a>> {
|
||||
for record in self.records {
|
||||
if record.feature_index == feature_index {
|
||||
let offset = record.feature.to_usize();
|
||||
// TODO: set tag
|
||||
return Feature::parse(Tag::from_bytes(b"DFLT"), self.data.get(offset..)?);
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct FeatureTableSubstitutionRecord {
|
||||
feature_index: FeatureIndex,
|
||||
feature: Offset32,
|
||||
}
|
||||
|
||||
impl FromData for FeatureTableSubstitutionRecord {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
feature_index: s.read::<FeatureIndex>()?,
|
||||
feature: s.read::<Offset32>()?,
|
||||
})
|
||||
}
|
||||
}
|
|
@ -0,0 +1,251 @@
|
|||
// Suppresses the unused `minor_version` warning when the `variable-fonts` feature is disabled.
|
||||
#![allow(unused_variables)]
|
||||
|
||||
use super::LookupList;
|
||||
use crate::parser::{FromData, LazyArray16, Offset, Offset16, Stream};
|
||||
use crate::Tag;
|
||||
#[cfg(feature = "variable-fonts")] use super::FeatureVariations;
|
||||
#[cfg(feature = "variable-fonts")] use crate::parser::Offset32;
|
||||
|
||||
/// A [Layout Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#table-organization).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct LayoutTable<'a> {
|
||||
/// A list of all supported scripts.
|
||||
pub scripts: ScriptList<'a>,
|
||||
/// A list of all supported features.
|
||||
pub features: FeatureList<'a>,
|
||||
/// A list of all lookups.
|
||||
pub lookups: LookupList<'a>,
|
||||
/// Used to substitute an alternate set of lookup tables
|
||||
/// to use for any given feature under specified conditions.
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
pub variations: Option<FeatureVariations<'a>>,
|
||||
}
|
||||
|
||||
impl<'a> LayoutTable<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let major_version = s.read::<u16>()?;
|
||||
let minor_version = s.read::<u16>()?;
|
||||
if major_version != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let scripts = ScriptList::parse(s.read_at_offset16(data)?)?;
|
||||
let features = FeatureList::parse(s.read_at_offset16(data)?)?;
|
||||
let lookups = LookupList::parse(s.read_at_offset16(data)?)?;
|
||||
|
||||
#[cfg(feature = "variable-fonts")] {
|
||||
let mut variations_offset = None;
|
||||
if minor_version >= 1 {
|
||||
variations_offset = s.read::<Option<Offset32>>()?;
|
||||
}
|
||||
|
||||
let variations = match variations_offset {
|
||||
Some(offset) => data.get(offset.to_usize()..).and_then(FeatureVariations::parse),
|
||||
None => None,
|
||||
};
|
||||
|
||||
Some(Self { scripts, features, lookups, variations })
|
||||
}
|
||||
|
||||
#[cfg(not(feature = "variable-fonts"))] {
|
||||
Some(Self { scripts, features, lookups })
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An index in [`ScriptList`].
|
||||
pub type ScriptIndex = u16;
|
||||
/// An index in [`LanguageSystemList`].
|
||||
pub type LanguageIndex = u16;
|
||||
/// An index in [`FeatureList`].
|
||||
pub type FeatureIndex = u16;
|
||||
/// An index in [`LookupList`].
|
||||
pub type LookupIndex = u16;
|
||||
/// An index in [`FeatureVariations`].
|
||||
pub type VariationIndex = u32;
|
||||
|
||||
/// A trait to parse item in [`RecordList`].
|
||||
///
|
||||
/// Internal use only.
|
||||
pub trait RecordListItem<'a>: Sized {
|
||||
/// Parses raw data.
|
||||
fn parse(tag: Tag, data: &'a [u8]) -> Option<Self>;
|
||||
}
|
||||
|
||||
/// A data storage used by [`ScriptList`], [`LanguageSystemList`] and [`FeatureList`] data types.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct RecordList<'a, T: RecordListItem<'a>> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, TagRecord>,
|
||||
data_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<'a, T: RecordListItem<'a>> RecordList<'a, T> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16(count)?;
|
||||
Some(Self { data, records, data_type: core::marker::PhantomData })
|
||||
}
|
||||
|
||||
/// Returns the number of items in the RecordList.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
|
||||
/// Checks that RecordList is empty.
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.records.is_empty()
|
||||
}
|
||||
|
||||
/// Returns RecordList value by index.
|
||||
pub fn get(&self, index: u16) -> Option<T> {
|
||||
let record = self.records.get(index)?;
|
||||
self.data.get(record.offset.to_usize()..).and_then(|data| T::parse(record.tag, data))
|
||||
}
|
||||
|
||||
/// Returns RecordList value by [`Tag`].
|
||||
pub fn find(&self, tag: Tag) -> Option<T> {
|
||||
let record = self.records.binary_search_by(|record| record.tag.cmp(&tag)).map(|p| p.1)?;
|
||||
self.data.get(record.offset.to_usize()..).and_then(|data| T::parse(record.tag, data))
|
||||
}
|
||||
|
||||
/// Returns RecordList value index by [`Tag`].
|
||||
pub fn index(&self, tag: Tag) -> Option<u16> {
|
||||
self.records.binary_search_by(|record| record.tag.cmp(&tag)).map(|p| p.0)
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: RecordListItem<'a>> IntoIterator for RecordList<'a, T> {
|
||||
type Item = T;
|
||||
type IntoIter = RecordListIter<'a, T>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
RecordListIter {
|
||||
list: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`RecordList`] values.
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct RecordListIter<'a, T: RecordListItem<'a>> {
|
||||
list: RecordList<'a, T>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a, T: RecordListItem<'a>> Iterator for RecordListIter<'a, T> {
|
||||
type Item = T;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.list.len() {
|
||||
self.index += 1;
|
||||
self.list.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A list of [`Script`] records.
|
||||
pub type ScriptList<'a> = RecordList<'a, Script<'a>>;
|
||||
/// A list of [`LanguageSystem`] records.
|
||||
pub type LanguageSystemList<'a> = RecordList<'a, LanguageSystem<'a>>;
|
||||
/// A list of [`Feature`] records.
|
||||
pub type FeatureList<'a> = RecordList<'a, Feature<'a>>;
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct TagRecord {
|
||||
tag: Tag,
|
||||
offset: Offset16,
|
||||
}
|
||||
|
||||
impl FromData for TagRecord {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
tag: s.read::<Tag>()?,
|
||||
offset: s.read::<Offset16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Script Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#script-table-and-language-system-record).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Script<'a> {
|
||||
/// Script tag.
|
||||
pub tag: Tag,
|
||||
/// Default language.
|
||||
pub default_language: Option<LanguageSystem<'a>>,
|
||||
/// List of supported languages, excluding the default one. Listed alphabetically.
|
||||
pub languages: LanguageSystemList<'a>,
|
||||
}
|
||||
|
||||
impl<'a> RecordListItem<'a> for Script<'a> {
|
||||
fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let mut default_language = None;
|
||||
if let Some(offset) = s.read::<Option<Offset16>>()? {
|
||||
default_language = LanguageSystem::parse(
|
||||
Tag::from_bytes(b"dflt"),
|
||||
data.get(offset.to_usize()..)?
|
||||
);
|
||||
}
|
||||
let mut languages = RecordList::parse(s.tail()?)?;
|
||||
// Offsets are relative to this table.
|
||||
languages.data = data;
|
||||
Some(Self { tag, default_language, languages })
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Language System Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#language-system-table).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct LanguageSystem<'a> {
|
||||
/// Language tag.
|
||||
pub tag: Tag,
|
||||
/// Index of a feature required for this language system.
|
||||
pub required_feature: Option<FeatureIndex>,
|
||||
/// Array of indices into the FeatureList, in arbitrary order.
|
||||
pub feature_indices: LazyArray16<'a, FeatureIndex>
|
||||
}
|
||||
|
||||
impl<'a> RecordListItem<'a> for LanguageSystem<'a> {
|
||||
fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let _lookup_order = s.read::<Offset16>()?; // Unsupported.
|
||||
let required_feature = match s.read::<FeatureIndex>()? {
|
||||
0xFFFF => None,
|
||||
v => Some(v),
|
||||
};
|
||||
let count = s.read::<u16>()?;
|
||||
let feature_indices = s.read_array16(count)?;
|
||||
Some(Self { tag, required_feature, feature_indices })
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Feature](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#feature-table).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Feature<'a> {
|
||||
pub tag: Tag,
|
||||
pub lookup_indices: LazyArray16<'a, LookupIndex>,
|
||||
}
|
||||
|
||||
impl<'a> RecordListItem<'a> for Feature<'a> {
|
||||
fn parse(tag: Tag, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let _params_offset = s.read::<Offset16>()?; // Unsupported.
|
||||
let count = s.read::<u16>()?;
|
||||
let lookup_indices = s.read_array16(count)?;
|
||||
Some(Self { tag, lookup_indices })
|
||||
}
|
||||
}
|
|
@ -0,0 +1,152 @@
|
|||
use crate::parser::{FromData, FromSlice, LazyArray16, LazyOffsetArray16, Offset, Offset16, Offset32, Stream};
|
||||
|
||||
/// A list of [`Lookup`] values.
|
||||
pub type LookupList<'a> = LazyOffsetArray16<'a, Lookup<'a>>;
|
||||
|
||||
/// A [Lookup Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#lookup-table).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Lookup<'a> {
|
||||
/// Lookup qualifiers.
|
||||
pub flags: LookupFlags,
|
||||
/// Available subtables.
|
||||
pub subtables: LookupSubtables<'a>,
|
||||
/// Index into GDEF mark glyph sets structure.
|
||||
pub mark_filtering_set: Option<u16>,
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for Lookup<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let kind = s.read::<u16>()?;
|
||||
let flags = s.read::<LookupFlags>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
|
||||
let mut mark_filtering_set: Option<u16> = None;
|
||||
if flags.use_mark_filtering_set() {
|
||||
mark_filtering_set = Some(s.read::<u16>()?);
|
||||
}
|
||||
|
||||
Some(Self {
|
||||
flags,
|
||||
subtables: LookupSubtables { kind, data, offsets },
|
||||
mark_filtering_set,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A trait for parsing Lookup subtables.
|
||||
///
|
||||
/// Internal use only.
|
||||
pub trait LookupSubtable<'a>: Sized {
|
||||
/// Parses raw data.
|
||||
fn parse(data: &'a [u8], kind: u16) -> Option<Self>;
|
||||
}
|
||||
|
||||
/// A list of lookup subtables.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct LookupSubtables<'a> {
|
||||
kind: u16,
|
||||
data: &'a [u8],
|
||||
offsets: LazyArray16<'a, Offset16>,
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for LookupSubtables<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "LookupSubtables {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> LookupSubtables<'a> {
|
||||
/// Returns the number of items in the LookupSubtables.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.offsets.len()
|
||||
}
|
||||
|
||||
/// Parses a subtable at index.
|
||||
///
|
||||
/// Accepts either
|
||||
/// [`PositioningSubtable`](crate::gpos::PositioningSubtable)
|
||||
/// or [`SubstitutionSubtable`](crate::gsub::SubstitutionSubtable).
|
||||
///
|
||||
/// Technically, we could enforce it at compile time, but it makes the code too convoluted.
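///
/// A typical call looks like `lookup.subtables.get::<PositioningSubtable>(0)`.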
|
||||
pub fn get<T: LookupSubtable<'a>>(&self, index: u16) -> Option<T> {
|
||||
let offset = self.offsets.get(index)?.to_usize();
|
||||
let data = self.data.get(offset..)?;
|
||||
T::parse(data, self.kind)
|
||||
}
|
||||
|
||||
/// Creates an iterator over subtables.
|
||||
///
|
||||
/// We cannot use `IntoIterator` here, because we have to use a user-provided base type.
|
||||
pub fn into_iter<T: LookupSubtable<'a>>(self) -> LookupSubtablesIter<'a, T> {
|
||||
LookupSubtablesIter {
|
||||
data: self,
|
||||
index: 0,
|
||||
data_type: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over lookup subtables.
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct LookupSubtablesIter<'a, T: LookupSubtable<'a>> {
|
||||
data: LookupSubtables<'a>,
|
||||
index: u16,
|
||||
data_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<'a, T: LookupSubtable<'a>> Iterator for LookupSubtablesIter<'a, T> {
|
||||
type Item = T;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.data.len() {
|
||||
self.index += 1;
|
||||
self.data.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Lookup table flags.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct LookupFlags(pub u16);
|
||||
|
||||
#[allow(missing_docs)]
|
||||
impl LookupFlags {
|
||||
#[inline] pub fn right_to_left(self) -> bool { self.0 & 0x0001 != 0 }
|
||||
#[inline] pub fn ignore_base_glyphs(self) -> bool { self.0 & 0x0002 != 0 }
|
||||
#[inline] pub fn ignore_ligatures(self) -> bool { self.0 & 0x0004 != 0 }
|
||||
#[inline] pub fn ignore_marks(self) -> bool { self.0 & 0x0008 != 0 }
|
||||
#[inline] pub fn ignore_flags(self) -> bool { self.0 & 0x000E != 0 }
|
||||
#[inline] pub fn use_mark_filtering_set(self) -> bool { self.0 & 0x0010 != 0 }
|
||||
#[inline] pub fn mark_attachment_type(self) -> u8 { ((self.0 & 0xFF00) >> 8) as u8 }
|
||||
}
|
||||
|
||||
impl FromData for LookupFlags {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
u16::parse(data).map(Self)
|
||||
}
|
||||
}
|
||||
|
||||
pub(crate) fn parse_extension_lookup<'a, T: 'a>(
|
||||
data: &'a [u8],
|
||||
parse: impl FnOnce(&'a [u8], u16) -> Option<T>,
|
||||
) -> Option<T> {
|
||||
let mut s = Stream::new(data);
|
||||
let format = s.read::<u16>()?;
|
||||
match format {
|
||||
1 => {
|
||||
let kind = s.read::<u16>()?;
|
||||
let offset = s.read::<Offset32>()?.to_usize();
|
||||
parse(data.get(offset..)?, kind)
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
|
@ -0,0 +1,168 @@
|
|||
//! Common data types used by GDEF/GPOS/GSUB tables.
|
||||
//!
|
||||
//! <https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2>
|
||||
|
||||
// A heavily modified port of https://github.com/RazrFalcon/rustybuzz implementation
|
||||
// originally written by https://github.com/laurmaedje
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, FromSlice, LazyArray16};
|
||||
|
||||
mod context;
|
||||
mod chained_context;
|
||||
mod lookup;
|
||||
mod layout_table;
|
||||
#[cfg(feature = "variable-fonts")] mod feature_variations;
|
||||
|
||||
pub use context::*;
|
||||
pub use chained_context::*;
|
||||
pub use lookup::*;
|
||||
pub use layout_table::*;
|
||||
#[cfg(feature = "variable-fonts")] pub use feature_variations::*;
|
||||
|
||||
/// A record that describes a range of glyph IDs.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct RangeRecord {
|
||||
/// First glyph ID in the range
|
||||
pub start: GlyphId,
|
||||
/// Last glyph ID in the range
|
||||
pub end: GlyphId,
|
||||
/// Coverage Index of first glyph ID in range.
|
||||
pub value: u16,
|
||||
}
|
||||
|
||||
impl LazyArray16<'_, RangeRecord> {
|
||||
/// Returns a [`RangeRecord`] for a glyph.
|
||||
pub fn range(&self, glyph: GlyphId) -> Option<RangeRecord> {
|
||||
self.binary_search_by(|record| {
|
||||
if glyph < record.start {
|
||||
core::cmp::Ordering::Greater
|
||||
} else if glyph <= record.end {
|
||||
core::cmp::Ordering::Equal
|
||||
} else {
|
||||
core::cmp::Ordering::Less
|
||||
}
|
||||
}).map(|p| p.1)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for RangeRecord {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(RangeRecord {
|
||||
start: s.read::<GlyphId>()?,
|
||||
end: s.read::<GlyphId>()?,
|
||||
value: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Coverage Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#coverage-table).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum Coverage<'a> {
|
||||
Format1 {
|
||||
/// Array of glyph IDs. Sorted.
|
||||
glyphs: LazyArray16<'a, GlyphId>,
|
||||
},
|
||||
Format2 {
|
||||
/// Array of glyph ranges. Ordered by `RangeRecord.start`.
|
||||
records: LazyArray16<'a, RangeRecord>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for Coverage<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let count = s.read::<u16>()?;
|
||||
let glyphs = s.read_array16(count)?;
|
||||
Some(Self::Format1 { glyphs })
|
||||
}
|
||||
2 => {
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16(count)?;
|
||||
Some(Self::Format2 { records })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Coverage<'a> {
|
||||
/// Checks that glyph is present.
|
||||
pub fn contains(&self, glyph: GlyphId) -> bool {
|
||||
self.get(glyph).is_some()
|
||||
}
|
||||
|
||||
/// Returns the coverage index of the glyph or `None` if it is not covered.
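///
/// For example, in a format 1 table over the sorted glyphs `[10, 15, 42]`,
/// `get(GlyphId(15))` returns `Some(1)`.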
|
||||
pub fn get(&self, glyph: GlyphId) -> Option<u16> {
|
||||
match self {
|
||||
Self::Format1 { glyphs } => {
|
||||
glyphs.binary_search(&glyph).map(|p| p.0)
|
||||
}
|
||||
Self::Format2 { records } => {
|
||||
let record = records.range(glyph)?;
|
||||
let offset = glyph.0 - record.start.0;
|
||||
record.value.checked_add(offset)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A value of [Class Definition Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#class-definition-table).
|
||||
pub type Class = u16;
|
||||
|
||||
/// A [Class Definition Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#class-definition-table).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum ClassDefinition<'a> {
|
||||
Format1 {
|
||||
start: GlyphId,
|
||||
classes: LazyArray16<'a, Class>,
|
||||
},
|
||||
Format2 {
|
||||
records: LazyArray16<'a, RangeRecord>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> ClassDefinition<'a> {
|
||||
#[inline]
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let start = s.read::<GlyphId>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let classes = s.read_array16(count)?;
|
||||
Some(Self::Format1 { start, classes })
|
||||
},
|
||||
2 => {
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16(count)?;
|
||||
Some(Self::Format2 { records })
|
||||
},
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the glyph class of the glyph (zero if it is not defined).
|
||||
pub fn get(&self, glyph: GlyphId) -> Class {
|
||||
match self {
|
||||
Self::Format1 { start, classes } => {
|
||||
glyph.0.checked_sub(start.0).and_then(|index| classes.get(index))
|
||||
}
|
||||
Self::Format2 { records } => {
|
||||
records.range(glyph).map(|record| record.value)
|
||||
}
|
||||
}.unwrap_or(0)
|
||||
}
|
||||
}
|
File diff suppressed because it is too large
|
@ -0,0 +1,864 @@
|
|||
//! Binary parsing utils.
|
||||
//!
|
||||
//! This module should not be used directly, unless you're planning to parse
|
||||
//! some tables manually.
|
||||
|
||||
use core::ops::Range;
|
||||
use core::convert::{TryFrom, TryInto};
|
||||
|
||||
/// A trait for parsing raw binary data of fixed size.
|
||||
///
|
||||
/// This is a low-level, internal trait that should not be used directly.
|
||||
pub trait FromData: Sized {
|
||||
/// Object's raw data size.
|
||||
///
|
||||
/// Not always the same as `mem::size_of`.
|
||||
const SIZE: usize;
|
||||
|
||||
/// Parses an object from raw data.
|
||||
fn parse(data: &[u8]) -> Option<Self>;
|
||||
}
|
||||
|
||||
/// A trait for parsing raw binary data of variable size.
|
||||
///
|
||||
/// This is a low-level, internal trait that should not be used directly.
|
||||
pub trait FromSlice<'a>: Sized {
|
||||
/// Parses an object from raw data.
|
||||
fn parse(data: &'a [u8]) -> Option<Self>;
|
||||
}
|
||||
|
||||
impl FromData for () {
|
||||
const SIZE: usize = 0;
|
||||
|
||||
#[inline]
|
||||
fn parse(_: &[u8]) -> Option<Self> {
|
||||
Some(())
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for u8 {
|
||||
const SIZE: usize = 1;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.get(0).copied()
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for i8 {
|
||||
const SIZE: usize = 1;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.get(0).copied().map(|n| n as i8)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for u16 {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.try_into().ok().map(u16::from_be_bytes)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for i16 {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.try_into().ok().map(i16::from_be_bytes)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for u32 {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.try_into().ok().map(u32::from_be_bytes)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for i32 {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.try_into().ok().map(i32::from_be_bytes)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for u64 {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.try_into().ok().map(u64::from_be_bytes)
|
||||
}
|
||||
}
|
||||
|
||||
/// A u24 number.
|
||||
///
|
||||
/// Stored as u32, but encoded as 3 bytes in the font.
|
||||
///
|
||||
/// <https://docs.microsoft.com/en-us/typography/opentype/spec/otff#data-types>
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct U24(pub u32);
|
||||
|
||||
impl FromData for U24 {
|
||||
const SIZE: usize = 3;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let data: [u8; 3] = data.try_into().ok()?;
|
||||
Some(U24(u32::from_be_bytes([0, data[0], data[1], data[2]])))
|
||||
}
|
||||
}
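A small, hedged illustration of the 3-byte big-endian decoding above; the byte values are arbitrary.
#[test]
fn u24_parse_sketch() {
    // [0x01, 0x00, 0x00] is big-endian 0x010000 == 65536.
    assert_eq!(U24::parse(&[0x01, 0x00, 0x00]).map(|n| n.0), Some(65536));
}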
|
||||
|
||||
|
||||
/// A 16-bit signed fixed number with the low 14 bits of fraction (2.14).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct F2DOT14(pub i16);
|
||||
|
||||
impl F2DOT14 {
|
||||
/// Converts i16 to f32.
|
||||
#[inline]
|
||||
pub fn to_f32(&self) -> f32 {
|
||||
f32::from(self.0) / 16384.0
|
||||
}
|
||||
}
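A hedged numeric illustration of the 2.14 conversion; the value is arbitrary.
#[test]
fn f2dot14_to_f32_sketch() {
    // 8192 / 16384 == 0.5, which is exactly representable in f32.
    assert_eq!(F2DOT14(8192).to_f32(), 0.5);
}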
|
||||
|
||||
impl FromData for F2DOT14 {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
i16::parse(data).map(F2DOT14)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A 32-bit signed fixed-point number (16.16).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Fixed(pub f32);
|
||||
|
||||
impl FromData for Fixed {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
// TODO: is it safe to cast?
|
||||
i32::parse(data).map(|n| Fixed(n as f32 / 65536.0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A safe u32 to usize casting.
|
||||
///
|
||||
/// Rust doesn't implement `From<u32> for usize`,
|
||||
/// because it has to support 16 bit targets.
|
||||
/// We don't, so we can allow this.
|
||||
pub trait NumFrom<T>: Sized {
|
||||
/// Converts u32 into usize.
|
||||
fn num_from(_: T) -> Self;
|
||||
}
|
||||
|
||||
impl NumFrom<u32> for usize {
|
||||
#[inline]
|
||||
fn num_from(v: u32) -> Self {
|
||||
#[cfg(any(target_pointer_width = "32", target_pointer_width = "64"))]
|
||||
{
|
||||
v as usize
|
||||
}
|
||||
|
||||
// compilation error on 16 bit targets
|
||||
}
|
||||
}
|
||||
|
||||
impl NumFrom<char> for usize {
|
||||
#[inline]
|
||||
fn num_from(v: char) -> Self {
|
||||
#[cfg(any(target_pointer_width = "32", target_pointer_width = "64"))]
|
||||
{
|
||||
v as usize
|
||||
}
|
||||
|
||||
// compilation error on 16 bit targets
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Just like TryFrom<N>, but for numeric types not supported by the Rust's std.
|
||||
pub trait TryNumFrom<T>: Sized {
|
||||
/// Casts between numeric types.
|
||||
fn try_num_from(_: T) -> Option<Self>;
|
||||
}
|
||||
|
||||
impl TryNumFrom<f32> for u8 {
|
||||
#[inline]
|
||||
fn try_num_from(v: f32) -> Option<Self> {
|
||||
i32::try_num_from(v).and_then(|v| u8::try_from(v).ok())
|
||||
}
|
||||
}
|
||||
|
||||
impl TryNumFrom<f32> for i16 {
|
||||
#[inline]
|
||||
fn try_num_from(v: f32) -> Option<Self> {
|
||||
i32::try_num_from(v).and_then(|v| i16::try_from(v).ok())
|
||||
}
|
||||
}
|
||||
|
||||
impl TryNumFrom<f32> for u16 {
|
||||
#[inline]
|
||||
fn try_num_from(v: f32) -> Option<Self> {
|
||||
i32::try_num_from(v).and_then(|v| u16::try_from(v).ok())
|
||||
}
|
||||
}
|
||||
|
||||
impl TryNumFrom<f32> for i32 {
|
||||
#[inline]
|
||||
fn try_num_from(v: f32) -> Option<Self> {
|
||||
// Based on https://github.com/rust-num/num-traits/blob/master/src/cast.rs
|
||||
|
||||
// Float as int truncates toward zero, so we want to allow values
|
||||
// in the exclusive range `(MIN-1, MAX+1)`.
|
||||
|
||||
// We can't represent `MIN-1` exactly, but there's no fractional part
|
||||
// at this magnitude, so we can just use a `MIN` inclusive boundary.
|
||||
const MIN: f32 = core::i32::MIN as f32;
|
||||
// We can't represent `MAX` exactly, but it will round up to exactly
|
||||
// `MAX+1` (a power of two) when we cast it.
|
||||
const MAX_P1: f32 = core::i32::MAX as f32;
|
||||
if v >= MIN && v < MAX_P1 {
|
||||
Some(v as i32)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
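A hedged sketch of the boundary behaviour described in the comments above; the values are illustrative only.
#[test]
fn try_num_from_bounds_sketch() {
    // Truncates toward zero within range; rejects values at or beyond 2^31.
    assert_eq!(i32::try_num_from(1.9), Some(1));
    assert_eq!(i32::try_num_from(2147483648.0), None);
    assert_eq!(i32::try_num_from(-2147483648.0), Some(i32::MIN));
}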
|
||||
|
||||
|
||||
/// A slice-like container that converts internal binary data only on access.
|
||||
///
|
||||
/// Array values are stored in a contiguous data chunk.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct LazyArray16<'a, T> {
|
||||
data: &'a [u8],
|
||||
data_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<T> Default for LazyArray16<'_, T> {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
LazyArray16 {
|
||||
data: &[],
|
||||
data_type: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> LazyArray16<'a, T> {
|
||||
/// Creates a new `LazyArray`.
|
||||
#[inline]
|
||||
pub fn new(data: &'a [u8]) -> Self {
|
||||
LazyArray16 {
|
||||
data,
|
||||
data_type: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns a value at `index`.
|
||||
#[inline]
|
||||
pub fn get(&self, index: u16) -> Option<T> {
|
||||
if index < self.len() {
|
||||
let start = usize::from(index) * T::SIZE;
|
||||
let end = start + T::SIZE;
|
||||
self.data.get(start..end).and_then(T::parse)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the last value.
|
||||
#[inline]
|
||||
pub fn last(&self) -> Option<T> {
|
||||
if !self.is_empty() {
|
||||
self.get(self.len() - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns sub-array.
|
||||
#[inline]
|
||||
pub fn slice(&self, range: Range<u16>) -> Option<Self> {
|
||||
let start = usize::from(range.start) * T::SIZE;
|
||||
let end = usize::from(range.end) * T::SIZE;
|
||||
Some(LazyArray16 {
|
||||
data: self.data.get(start..end)?,
|
||||
..LazyArray16::default()
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns array's length.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
(self.data.len() / T::SIZE) as u16
|
||||
}
|
||||
|
||||
/// Checks if array is empty.
|
||||
#[inline]
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.len() == 0
|
||||
}
|
||||
|
||||
/// Performs a binary search by specified `key`.
|
||||
#[inline]
|
||||
pub fn binary_search(&self, key: &T) -> Option<(u16, T)>
|
||||
where T: Ord
|
||||
{
|
||||
self.binary_search_by(|p| p.cmp(key))
|
||||
}
|
||||
|
||||
/// Performs a binary search using specified closure.
|
||||
#[inline]
|
||||
pub fn binary_search_by<F>(&self, mut f: F) -> Option<(u16, T)>
|
||||
where F: FnMut(&T) -> core::cmp::Ordering
|
||||
{
|
||||
// Based on Rust std implementation.
|
||||
|
||||
use core::cmp::Ordering;
|
||||
|
||||
let mut size = self.len();
|
||||
if size == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut base = 0;
|
||||
while size > 1 {
|
||||
let half = size / 2;
|
||||
let mid = base + half;
|
||||
// mid is always in [0, size), that means mid is >= 0 and < size.
|
||||
// mid >= 0: by definition
|
||||
// mid < size: mid = size / 2 + size / 4 + size / 8 ...
|
||||
let cmp = f(&self.get(mid)?);
|
||||
base = if cmp == Ordering::Greater { base } else { mid };
|
||||
size -= half;
|
||||
}
|
||||
|
||||
// base is always in [0, size) because base <= mid.
|
||||
let value = self.get(base)?;
|
||||
if f(&value) == Ordering::Equal { Some((base, value)) } else { None }
|
||||
}
|
||||
}
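A hedged usage sketch of reading and querying a lazy array; the byte values are illustrative, not taken from any real font table.
#[allow(dead_code)]
fn lazy_array16_sketch() -> Option<()> {
    // Three big-endian u16 values: [1, 5, 9].
    let data = [0x00, 0x01, 0x00, 0x05, 0x00, 0x09];
    let mut s = Stream::new(&data);
    let array = s.read_array16::<u16>(3)?;
    debug_assert_eq!(array.get(1), Some(5));
    debug_assert_eq!(array.binary_search(&9), Some((2, 9)));
    Some(())
}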
|
||||
|
||||
impl<'a, T: FromData + core::fmt::Debug + Copy> core::fmt::Debug for LazyArray16<'a, T> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_list().entries(self.into_iter()).finish()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> IntoIterator for LazyArray16<'a, T> {
|
||||
type Item = T;
|
||||
type IntoIter = LazyArrayIter16<'a, T>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
LazyArrayIter16 {
|
||||
data: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An iterator over `LazyArray16`.
|
||||
#[derive(Clone, Copy)]
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct LazyArrayIter16<'a, T> {
|
||||
data: LazyArray16<'a, T>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<T: FromData> Default for LazyArrayIter16<'_, T> {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
LazyArrayIter16 {
|
||||
data: LazyArray16::new(&[]),
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> Iterator for LazyArrayIter16<'a, T> {
|
||||
type Item = T;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
self.index += 1; // TODO: check
|
||||
self.data.get(self.index - 1)
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn count(self) -> usize {
|
||||
usize::from(self.data.len().checked_sub(self.index).unwrap_or(0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A slice-like container that converts internal binary data only on access.
|
||||
///
|
||||
/// This is a low-level, internal structure that should not be used directly.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct LazyArray32<'a, T> {
|
||||
data: &'a [u8],
|
||||
data_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<T> Default for LazyArray32<'_, T> {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
LazyArray32 {
|
||||
data: &[],
|
||||
data_type: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> LazyArray32<'a, T> {
|
||||
/// Creates a new `LazyArray`.
|
||||
#[inline]
|
||||
pub fn new(data: &'a [u8]) -> Self {
|
||||
LazyArray32 {
|
||||
data,
|
||||
data_type: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns a value at `index`.
|
||||
#[inline]
|
||||
pub fn get(&self, index: u32) -> Option<T> {
|
||||
if index < self.len() {
|
||||
let start = usize::num_from(index) * T::SIZE;
|
||||
let end = start + T::SIZE;
|
||||
self.data.get(start..end).and_then(T::parse)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns array's length.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u32 {
|
||||
(self.data.len() / T::SIZE) as u32
|
||||
}
|
||||
|
||||
/// Performs a binary search by specified `key`.
|
||||
#[inline]
|
||||
pub fn binary_search(&self, key: &T) -> Option<(u32, T)>
|
||||
where T: Ord
|
||||
{
|
||||
self.binary_search_by(|p| p.cmp(key))
|
||||
}
|
||||
|
||||
/// Performs a binary search using specified closure.
|
||||
#[inline]
|
||||
pub fn binary_search_by<F>(&self, mut f: F) -> Option<(u32, T)>
|
||||
where F: FnMut(&T) -> core::cmp::Ordering
|
||||
{
|
||||
// Based on Rust std implementation.
|
||||
|
||||
use core::cmp::Ordering;
|
||||
|
||||
let mut size = self.len();
|
||||
if size == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut base = 0;
|
||||
while size > 1 {
|
||||
let half = size / 2;
|
||||
let mid = base + half;
|
||||
// mid is always in [0, size), that means mid is >= 0 and < size.
|
||||
// mid >= 0: by definition
|
||||
// mid < size: mid = size / 2 + size / 4 + size / 8 ...
|
||||
let cmp = f(&self.get(mid)?);
|
||||
base = if cmp == Ordering::Greater { base } else { mid };
|
||||
size -= half;
|
||||
}
|
||||
|
||||
// base is always in [0, size) because base <= mid.
|
||||
let value = self.get(base)?;
|
||||
if f(&value) == Ordering::Equal { Some((base, value)) } else { None }
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData + core::fmt::Debug + Copy> core::fmt::Debug for LazyArray32<'a, T> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_list().entries(self.into_iter()).finish()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> IntoIterator for LazyArray32<'a, T> {
|
||||
type Item = T;
|
||||
type IntoIter = LazyArrayIter32<'a, T>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
LazyArrayIter32 {
|
||||
data: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An iterator over `LazyArray32`.
|
||||
#[derive(Clone, Copy)]
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct LazyArrayIter32<'a, T> {
|
||||
data: LazyArray32<'a, T>,
|
||||
index: u32,
|
||||
}
|
||||
|
||||
impl<'a, T: FromData> Iterator for LazyArrayIter32<'a, T> {
|
||||
type Item = T;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
self.index += 1; // TODO: check
|
||||
self.data.get(self.index - 1)
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn count(self) -> usize {
|
||||
usize::num_from(self.data.len().checked_sub(self.index).unwrap_or(0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [`LazyArray16`]-like container, but data is accessed by offsets.
|
||||
///
|
||||
/// Unlike [`LazyArray16`], internal storage is not contiguous.
|
||||
///
|
||||
/// Multiple offsets can point to the same data.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct LazyOffsetArray16<'a, T: FromSlice<'a>> {
|
||||
data: &'a [u8],
|
||||
// Zero offsets must be ignored, therefore we're using `Option<Offset16>`.
|
||||
offsets: LazyArray16<'a, Option<Offset16>>,
|
||||
data_type: core::marker::PhantomData<T>,
|
||||
}
|
||||
|
||||
impl<'a, T: FromSlice<'a>> LazyOffsetArray16<'a, T> {
|
||||
/// Creates a new `LazyOffsetArray16`.
|
||||
#[allow(dead_code)]
|
||||
pub fn new(data: &'a [u8], offsets: LazyArray16<'a, Option<Offset16>>) -> Self {
|
||||
Self { data, offsets, data_type: core::marker::PhantomData }
|
||||
}
|
||||
|
||||
/// Parses `LazyOffsetArray16` from raw data.
|
||||
#[allow(dead_code)]
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self { data, offsets, data_type: core::marker::PhantomData })
|
||||
}
|
||||
|
||||
/// Returns a value at `index`.
|
||||
#[inline]
|
||||
pub fn get(&self, index: u16) -> Option<T> {
|
||||
let offset = self.offsets.get(index)??.to_usize();
|
||||
self.data.get(offset..).and_then(T::parse)
|
||||
}
|
||||
|
||||
/// Returns array's length.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.offsets.len()
|
||||
}
|
||||
|
||||
/// Checks if array is empty.
|
||||
#[inline]
|
||||
#[allow(dead_code)]
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.len() == 0
|
||||
}
|
||||
}
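A hedged sketch of the layout `parse` expects (a u16 count followed by that many 16-bit offsets into the same data block); the helper below is an illustrative assumption, not an upstream API.
// A generic sketch: parse an offset array and return its first value.
#[allow(dead_code)]
fn offset_array_first<'a, T: FromSlice<'a>>(data: &'a [u8]) -> Option<T> {
    let array = LazyOffsetArray16::<T>::parse(data)?;
    array.get(0)
}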
|
||||
|
||||
impl<'a, T: FromSlice<'a> + core::fmt::Debug + Copy> core::fmt::Debug for LazyOffsetArray16<'a, T> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_list().entries(self.into_iter()).finish()
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`LazyOffsetArray16`] values.
|
||||
#[derive(Clone, Copy)]
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct LazyOffsetArrayIter16<'a, T: FromSlice<'a>> {
|
||||
array: LazyOffsetArray16<'a, T>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a, T: FromSlice<'a>> IntoIterator for LazyOffsetArray16<'a, T> {
|
||||
type Item = T;
|
||||
type IntoIter = LazyOffsetArrayIter16<'a, T>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
LazyOffsetArrayIter16 {
|
||||
array: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T: FromSlice<'a>> Iterator for LazyOffsetArrayIter16<'a, T> {
|
||||
type Item = T;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.array.len() {
|
||||
self.index += 1;
|
||||
self.array.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn count(self) -> usize {
|
||||
usize::from(self.array.len().checked_sub(self.index).unwrap_or(0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A streaming binary parser.
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct Stream<'a> {
|
||||
data: &'a [u8],
|
||||
offset: usize,
|
||||
}
|
||||
|
||||
impl<'a> Stream<'a> {
|
||||
/// Creates a new `Stream` parser.
|
||||
#[inline]
|
||||
pub fn new(data: &'a [u8]) -> Self {
|
||||
Stream { data, offset: 0 }
|
||||
}
|
||||
|
||||
/// Creates a new `Stream` parser at offset.
|
||||
///
|
||||
/// Returns `None` when `offset` is out of bounds.
|
||||
#[inline]
|
||||
pub fn new_at(data: &'a [u8], offset: usize) -> Option<Self> {
|
||||
if offset <= data.len() {
|
||||
Some(Stream { data, offset })
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Checks whether the stream has reached the end of the data.
|
||||
#[inline]
|
||||
pub fn at_end(&self) -> bool {
|
||||
self.offset >= self.data.len()
|
||||
}
|
||||
|
||||
/// Jumps to the end of the stream.
|
||||
///
|
||||
/// Useful to indicate that we parsed all the data.
|
||||
#[inline]
|
||||
pub fn jump_to_end(&mut self) {
|
||||
self.offset = self.data.len();
|
||||
}
|
||||
|
||||
/// Returns the current offset.
|
||||
#[inline]
|
||||
pub fn offset(&self) -> usize {
|
||||
self.offset
|
||||
}
|
||||
|
||||
/// Returns the trailing data.
|
||||
///
|
||||
/// Returns `None` when the `Stream` has reached the end.
|
||||
#[inline]
|
||||
pub fn tail(&self) -> Option<&'a [u8]> {
|
||||
self.data.get(self.offset..)
|
||||
}
|
||||
|
||||
/// Advances by `FromData::SIZE`.
|
||||
///
|
||||
/// Doesn't check bounds.
|
||||
#[inline]
|
||||
pub fn skip<T: FromData>(&mut self) {
|
||||
self.advance(T::SIZE);
|
||||
}
|
||||
|
||||
/// Advances by the specified `len`.
|
||||
///
|
||||
/// Doesn't check bounds.
|
||||
#[inline]
|
||||
pub fn advance(&mut self, len: usize) {
|
||||
self.offset += len;
|
||||
}
|
||||
|
||||
/// Advances by the specified `len` and checks for bounds.
|
||||
#[inline]
|
||||
pub fn advance_checked(&mut self, len: usize) -> Option<()> {
|
||||
if self.offset + len <= self.data.len() {
|
||||
self.advance(len);
|
||||
Some(())
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Parses the type from the stream.
|
||||
///
|
||||
/// Returns `None` when there is not enough data left in the stream
|
||||
/// or the type parsing failed.
|
||||
#[inline]
|
||||
pub fn read<T: FromData>(&mut self) -> Option<T> {
|
||||
self.read_bytes(T::SIZE).and_then(T::parse)
|
||||
}
|
||||
|
||||
/// Parses the type from the stream at the specified offset.
|
||||
#[inline]
|
||||
pub fn read_at<T: FromData>(data: &[u8], offset: usize) -> Option<T> {
|
||||
data.get(offset..offset + T::SIZE).and_then(T::parse)
|
||||
}
|
||||
|
||||
/// Reads `len` bytes from the stream.
|
||||
#[inline]
|
||||
pub fn read_bytes(&mut self, len: usize) -> Option<&'a [u8]> {
|
||||
let v = self.data.get(self.offset..self.offset + len)?;
|
||||
self.advance(len);
|
||||
Some(v)
|
||||
}
|
||||
|
||||
/// Reads the next `count` types as a slice.
|
||||
#[inline]
|
||||
pub fn read_array16<T: FromData>(&mut self, count: u16) -> Option<LazyArray16<'a, T>> {
|
||||
let len = usize::from(count) * T::SIZE;
|
||||
self.read_bytes(len).map(LazyArray16::new)
|
||||
}
|
||||
|
||||
/// Reads the next `count` types as a slice.
|
||||
#[inline]
|
||||
pub fn read_array32<T: FromData>(&mut self, count: u32) -> Option<LazyArray32<'a, T>> {
|
||||
let len = usize::num_from(count) * T::SIZE;
|
||||
self.read_bytes(len).map(LazyArray32::new)
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
#[inline]
|
||||
pub(crate) fn read_at_offset16(&mut self, data: &'a [u8]) -> Option<&'a [u8]> {
|
||||
let offset = self.read::<Offset16>()?.to_usize();
|
||||
data.get(offset..)
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
#[inline]
|
||||
pub(crate) fn read_at_offset32(&mut self, data: &'a [u8]) -> Option<&'a [u8]> {
|
||||
let offset = self.read::<Offset32>()?.to_usize();
|
||||
data.get(offset..)
|
||||
}
|
||||
}
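A hedged sketch of a typical header parse with `Stream`; the layout (a version, a count, then `count` 16-bit offsets) is an assumption for illustration only.
#[allow(dead_code)]
fn stream_header_sketch<'a>(data: &'a [u8]) -> Option<LazyArray16<'a, Option<Offset16>>> {
    let mut s = Stream::new(data);
    let version = s.read::<u16>()?;
    if version != 0 {
        return None;
    }
    let count = s.read::<u16>()?;
    s.read_array16::<Option<Offset16>>(count)
}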
|
||||
|
||||
|
||||
/// Common offset methods.
|
||||
pub trait Offset {
|
||||
/// Converts the offset to `usize`.
|
||||
fn to_usize(&self) -> usize;
|
||||
|
||||
/// Checks whether the offset is null.
|
||||
fn is_null(&self) -> bool { self.to_usize() == 0 }
|
||||
}
|
||||
|
||||
|
||||
/// A type-safe u16 offset.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Offset16(pub u16);
|
||||
|
||||
impl Offset for Offset16 {
|
||||
#[inline]
|
||||
fn to_usize(&self) -> usize {
|
||||
usize::from(self.0)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for Offset16 {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
u16::parse(data).map(Offset16)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for Option<Offset16> {
|
||||
const SIZE: usize = Offset16::SIZE;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let offset = Offset16::parse(data)?;
|
||||
if offset.0 != 0 { Some(Some(offset)) } else { Some(None) }
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A type-safe u32 offset.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Offset32(pub u32);
|
||||
|
||||
impl Offset for Offset32 {
|
||||
#[inline]
|
||||
fn to_usize(&self) -> usize {
|
||||
usize::num_from(self.0)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for Offset32 {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
u32::parse(data).map(Offset32)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
impl FromData for Option<Offset32> {
|
||||
const SIZE: usize = Offset32::SIZE;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let offset = Offset32::parse(data)?;
|
||||
if offset.0 != 0 { Some(Some(offset)) } else { Some(None) }
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[inline]
|
||||
pub(crate) fn i16_bound(min: i16, val: i16, max: i16) -> i16 {
|
||||
use core::cmp;
|
||||
cmp::max(min, cmp::min(max, val))
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub(crate) fn f32_bound(min: f32, val: f32, max: f32) -> f32 {
|
||||
debug_assert!(min.is_finite());
|
||||
debug_assert!(val.is_finite());
|
||||
debug_assert!(max.is_finite());
|
||||
|
||||
if val > max {
|
||||
return max;
|
||||
} else if val < min {
|
||||
return min;
|
||||
}
|
||||
|
||||
val
|
||||
}
|
|
@ -0,0 +1,80 @@
|
|||
//! An [Anchor Point Table](
|
||||
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6ankr.html) implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, LazyArray32};
|
||||
use crate::aat;
|
||||
|
||||
/// An anchor point.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, PartialEq, Default, Debug)]
|
||||
pub struct Point {
|
||||
pub x: i16,
|
||||
pub y: i16,
|
||||
}
|
||||
|
||||
impl FromData for Point {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Point {
|
||||
x: s.read::<i16>()?,
|
||||
y: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Anchor Point Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6ankr.html).
|
||||
#[derive(Clone)]
|
||||
pub struct Table<'a> {
|
||||
lookup: aat::Lookup<'a>,
|
||||
// Ideally, Glyphs Data can be represented as an array,
|
||||
// but Apple's spec doesn't specify whether Glyphs Data members have padding.
|
||||
// Meaning we cannot simply iterate over them.
|
||||
glyphs_data: &'a [u8],
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u16>()?;
|
||||
if version != 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
s.skip::<u16>(); // reserved
|
||||
// TODO: we should probably check that offset is larger than the header size (8)
|
||||
let lookup_table = s.read_at_offset32(data)?;
|
||||
let glyphs_data = s.read_at_offset32(data)?;
|
||||
|
||||
Some(Table {
|
||||
lookup: aat::Lookup::parse(number_of_glyphs, lookup_table)?,
|
||||
glyphs_data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a list of anchor points for the specified glyph.
|
||||
pub fn points(&self, glyph_id: GlyphId) -> Option<LazyArray32<'a, Point>> {
|
||||
let offset = self.lookup.value(glyph_id)?;
|
||||
|
||||
let mut s = Stream::new_at(self.glyphs_data, usize::from(offset))?;
|
||||
let number_of_points = s.read::<u32>()?;
|
||||
s.read_array32::<Point>(number_of_points)
|
||||
}
|
||||
}
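A hedged usage sketch with assumed inputs: `ankr_data` would be the raw `ankr` table and `glyph_count` comes from `maxp`; the glyph id is arbitrary.
#[allow(dead_code)]
fn first_anchor_point(ankr_data: &[u8], glyph_count: NonZeroU16) -> Option<Point> {
    let table = Table::parse(glyph_count, ankr_data)?;
    let points = table.points(GlyphId(2))?;
    points.get(0)
}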
|
|
@ -0,0 +1,175 @@
|
|||
//! An [Axis Variations Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/avar) implementation.
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::NormalizedCoordinate;
|
||||
use crate::parser::{Stream, FromData, LazyArray16};
|
||||
|
||||
/// An axis value map.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct AxisValueMap {
|
||||
/// A normalized coordinate value obtained using default normalization.
|
||||
pub from_coordinate: i16,
|
||||
/// The modified, normalized coordinate value.
|
||||
pub to_coordinate: i16,
|
||||
}
|
||||
|
||||
impl FromData for AxisValueMap {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(AxisValueMap {
|
||||
from_coordinate: s.read::<i16>()?,
|
||||
to_coordinate: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of segment maps.
|
||||
///
|
||||
/// Can be empty.
|
||||
///
|
||||
/// The internal data layout is not designed for random access,
|
||||
/// therefore we provide only an iterator and no `get()` method.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct SegmentMaps<'a> {
|
||||
count: u16,
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> SegmentMaps<'a> {
|
||||
/// Returns the number of segments.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.count
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for SegmentMaps<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "SegmentMaps {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for SegmentMaps<'a> {
|
||||
type Item = LazyArray16<'a, AxisValueMap>;
|
||||
type IntoIter = SegmentMapsIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SegmentMapsIter {
|
||||
stream: Stream::new(self.data),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over maps.
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct SegmentMapsIter<'a> {
|
||||
stream: Stream<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SegmentMapsIter<'a> {
|
||||
type Item = LazyArray16<'a, AxisValueMap>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
let count = self.stream.read::<u16>()?;
|
||||
self.stream.read_array16::<AxisValueMap>(count)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Axis Variations Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/avar).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// The segment maps array — one segment map for each axis
|
||||
/// in the order of axes specified in the `fvar` table.
|
||||
pub segment_maps: SegmentMaps<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
s.skip::<u16>(); // reserved
|
||||
Some(Self {
|
||||
segment_maps: SegmentMaps {
|
||||
// TODO: check that `axisCount` is the same as in `fvar`?
|
||||
count: s.read::<u16>()?,
|
||||
data: s.tail()?,
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
/// Maps coordinates.
|
||||
pub fn map_coordinates(&self, coordinates: &mut [NormalizedCoordinate]) -> Option<()> {
|
||||
if usize::from(self.segment_maps.count) != coordinates.len() {
|
||||
return None;
|
||||
}
|
||||
|
||||
for (map, coord) in self.segment_maps.into_iter().zip(coordinates) {
|
||||
*coord = NormalizedCoordinate::from(map_value(&map, coord.0)?);
|
||||
}
|
||||
|
||||
Some(())
|
||||
}
|
||||
}
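A hedged usage sketch with assumed inputs: `avar_data` is the raw `avar` table and `coords` holds one normalized coordinate per `fvar` axis, in axis order.
#[allow(dead_code)]
fn remap_coordinates(avar_data: &[u8], coords: &mut [NormalizedCoordinate]) -> Option<()> {
    let table = Table::parse(avar_data)?;
    table.map_coordinates(coords)
}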
|
||||
|
||||
fn map_value(map: &LazyArray16<AxisValueMap>, value: i16) -> Option<i16> {
|
||||
// This code is based on the harfbuzz implementation.
|
||||
|
||||
if map.len() == 0 {
|
||||
return Some(value);
|
||||
} else if map.len() == 1 {
|
||||
let record = map.get(0)?;
|
||||
return Some(value - record.from_coordinate + record.to_coordinate);
|
||||
}
|
||||
|
||||
let record_0 = map.get(0)?;
|
||||
if value <= record_0.from_coordinate {
|
||||
return Some(value - record_0.from_coordinate + record_0.to_coordinate);
|
||||
}
|
||||
|
||||
let mut i = 1;
|
||||
while i < map.len() && value > map.get(i)?.from_coordinate {
|
||||
i += 1;
|
||||
}
|
||||
|
||||
if i == map.len() {
|
||||
i -= 1;
|
||||
}
|
||||
|
||||
let record_curr = map.get(i)?;
|
||||
let curr_from = record_curr.from_coordinate;
|
||||
let curr_to = record_curr.to_coordinate;
|
||||
if value >= curr_from {
|
||||
return Some(value - curr_from + curr_to);
|
||||
}
|
||||
|
||||
let record_prev = map.get(i - 1)?;
|
||||
let prev_from = record_prev.from_coordinate;
|
||||
let prev_to = record_prev.to_coordinate;
|
||||
if prev_from == curr_from {
|
||||
return Some(prev_to);
|
||||
}
|
||||
|
||||
let curr_from = i32::from(curr_from);
|
||||
let curr_to = i32::from(curr_to);
|
||||
let prev_from = i32::from(prev_from);
|
||||
let prev_to = i32::from(prev_to);
|
||||
|
||||
let denom = curr_from - prev_from;
|
||||
let k = (curr_to - prev_to) * (i32::from(value) - prev_from) + denom / 2;
|
||||
let value = prev_to + k / denom;
|
||||
i16::try_from(value).ok()
|
||||
}
|
|
@ -0,0 +1,90 @@
|
|||
//! A [Color Bitmap Data Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cbdt) implementation.
|
||||
|
||||
use crate::{GlyphId, RasterGlyphImage, RasterImageFormat};
|
||||
use crate::parser::{Stream, NumFrom};
|
||||
use super::cblc::{self, BitmapFormat};
|
||||
|
||||
/// A [Color Bitmap Data Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cbdt).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
locations: cblc::Table<'a>,
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(locations: cblc::Table<'a>, data: &'a [u8]) -> Option<Self> {
|
||||
Some(Self { locations, data })
|
||||
}
|
||||
|
||||
/// Returns a raster image for the glyph.
|
||||
pub fn get(&self, glyph_id: GlyphId, pixels_per_em: u16) -> Option<RasterGlyphImage<'a>> {
|
||||
let location = self.locations.get(glyph_id, pixels_per_em)?;
|
||||
let mut s = Stream::new_at(self.data, location.offset)?;
|
||||
match location.format {
|
||||
BitmapFormat::Format17 => {
|
||||
let height = s.read::<u8>()?;
|
||||
let width = s.read::<u8>()?;
|
||||
let bearing_x = s.read::<i8>()?;
|
||||
let bearing_y = s.read::<i8>()?;
|
||||
s.skip::<u8>(); // advance
|
||||
let data_len = s.read::<u32>()?;
|
||||
let data = s.read_bytes(usize::num_from(data_len))?;
|
||||
Some(RasterGlyphImage {
|
||||
x: i16::from(bearing_x),
|
||||
// `y` in CBDT is a bottom bound, not top one.
|
||||
y: i16::from(bearing_y) - i16::from(height),
|
||||
width: u16::from(width),
|
||||
height: u16::from(height),
|
||||
pixels_per_em: location.ppem,
|
||||
format: RasterImageFormat::PNG,
|
||||
data,
|
||||
})
|
||||
}
|
||||
BitmapFormat::Format18 => {
|
||||
let height = s.read::<u8>()?;
|
||||
let width = s.read::<u8>()?;
|
||||
let hor_bearing_x = s.read::<i8>()?;
|
||||
let hor_bearing_y = s.read::<i8>()?;
|
||||
s.skip::<u8>(); // hor_advance
|
||||
s.skip::<i8>(); // ver_bearing_x
|
||||
s.skip::<i8>(); // ver_bearing_y
|
||||
s.skip::<u8>(); // ver_advance
|
||||
let data_len = s.read::<u32>()?;
|
||||
let data = s.read_bytes(usize::num_from(data_len))?;
|
||||
Some(RasterGlyphImage {
|
||||
x: i16::from(hor_bearing_x),
|
||||
// `y` in CBDT is a bottom bound, not top one.
|
||||
y: i16::from(hor_bearing_y) - i16::from(height),
|
||||
width: u16::from(width),
|
||||
height: u16::from(height),
|
||||
pixels_per_em: location.ppem,
|
||||
format: RasterImageFormat::PNG,
|
||||
data,
|
||||
})
|
||||
}
|
||||
BitmapFormat::Format19 => {
|
||||
let data_len = s.read::<u32>()?;
|
||||
let data = s.read_bytes(usize::num_from(data_len))?;
|
||||
Some(RasterGlyphImage {
|
||||
x: i16::from(location.metrics.x),
|
||||
// `y` in CBDT is a bottom bound, not top one.
|
||||
y: i16::from(location.metrics.y) - i16::from(location.metrics.height),
|
||||
width: u16::from(location.metrics.width),
|
||||
height: u16::from(location.metrics.height),
|
||||
pixels_per_em: location.ppem,
|
||||
format: RasterImageFormat::PNG,
|
||||
data,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
}
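A hedged usage sketch with assumed inputs: `cblc_data` and `cbdt_data` are the raw CBLC and CBDT tables; the glyph id and pixels-per-em values are arbitrary.
#[allow(dead_code)]
fn embedded_png<'a>(cblc_data: &'a [u8], cbdt_data: &'a [u8]) -> Option<RasterGlyphImage<'a>> {
    let locations = cblc::Table::parse(cblc_data)?;
    let table = Table::parse(locations, cbdt_data)?;
    table.get(GlyphId(1), 64)
}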
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,226 @@
|
|||
//! A [Color Bitmap Location Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cblc) implementation.
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, Offset, Offset16, Offset32, NumFrom};
|
||||
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub(crate) enum BitmapFormat {
|
||||
Format17,
|
||||
Format18,
|
||||
Format19,
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub(crate) struct Metrics {
|
||||
pub x: i8,
|
||||
pub y: i8,
|
||||
pub width: u8,
|
||||
pub height: u8,
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct Location {
|
||||
pub format: BitmapFormat,
|
||||
pub offset: usize,
|
||||
pub metrics: Metrics,
|
||||
pub ppem: u16,
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct BitmapSizeTable {
|
||||
subtable_array_offset: Offset32,
|
||||
number_of_subtables: u32,
|
||||
ppem: u16,
|
||||
// Many fields are omitted.
|
||||
}
|
||||
|
||||
fn select_bitmap_size_table(
|
||||
glyph_id: GlyphId,
|
||||
pixels_per_em: u16,
|
||||
mut s: Stream,
|
||||
) -> Option<BitmapSizeTable> {
|
||||
let subtable_count = s.read::<u32>()?;
|
||||
let orig_s = s.clone();
|
||||
|
||||
let mut idx = None;
|
||||
let mut max_ppem = 0;
|
||||
for i in 0..subtable_count {
|
||||
// Check that the current subtable contains the provided glyph id.
|
||||
s.advance(40); // Jump to `start_glyph_index`.
|
||||
let start_glyph_id = s.read::<GlyphId>()?;
|
||||
let end_glyph_id = s.read::<GlyphId>()?;
|
||||
let ppem = u16::from(s.read::<u8>()?);
|
||||
|
||||
if !(start_glyph_id..=end_glyph_id).contains(&glyph_id) {
|
||||
s.advance(4); // Jump to the end of the subtable.
|
||||
continue;
|
||||
}
|
||||
|
||||
// Select the best matching subtable based on `pixels_per_em`.
|
||||
if (pixels_per_em <= ppem && ppem < max_ppem) || (pixels_per_em > max_ppem && ppem > max_ppem) {
|
||||
idx = Some(usize::num_from(i));
|
||||
max_ppem = ppem;
|
||||
}
|
||||
}
|
||||
|
||||
let mut s = orig_s;
|
||||
s.advance(idx? * 48); // 48 is BitmapSize Table size
|
||||
|
||||
let subtable_array_offset = s.read::<Offset32>()?;
|
||||
s.skip::<u32>(); // index_tables_size
|
||||
let number_of_subtables = s.read::<u32>()?;
|
||||
|
||||
Some(BitmapSizeTable {
|
||||
subtable_array_offset,
|
||||
number_of_subtables,
|
||||
ppem: max_ppem,
|
||||
})
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct IndexSubtableInfo {
|
||||
start_glyph_id: GlyphId,
|
||||
offset: usize, // absolute offset
|
||||
}
|
||||
|
||||
fn select_index_subtable(
|
||||
data: &[u8],
|
||||
size_table: BitmapSizeTable,
|
||||
glyph_id: GlyphId,
|
||||
) -> Option<IndexSubtableInfo> {
|
||||
let mut s = Stream::new_at(data, size_table.subtable_array_offset.to_usize())?;
|
||||
for _ in 0..size_table.number_of_subtables {
|
||||
let start_glyph_id = s.read::<GlyphId>()?;
|
||||
let end_glyph_id = s.read::<GlyphId>()?;
|
||||
let offset = s.read::<Offset32>()?;
|
||||
|
||||
if (start_glyph_id..=end_glyph_id).contains(&glyph_id) {
|
||||
let offset = size_table.subtable_array_offset.to_usize() + offset.to_usize();
|
||||
return Some(IndexSubtableInfo {
|
||||
start_glyph_id,
|
||||
offset,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct GlyphIdOffsetPair {
|
||||
glyph_id: GlyphId,
|
||||
offset: Offset16,
|
||||
}
|
||||
|
||||
impl FromData for GlyphIdOffsetPair {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(GlyphIdOffsetPair {
|
||||
glyph_id: s.read::<GlyphId>()?,
|
||||
offset: s.read::<Offset16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
// TODO: rewrite
|
||||
|
||||
/// A [Color Bitmap Location Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cblc).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
Some(Self { data })
|
||||
}
|
||||
|
||||
pub(crate) fn get(
|
||||
&self,
|
||||
glyph_id: GlyphId,
|
||||
pixels_per_em: u16,
|
||||
) -> Option<Location> {
|
||||
let mut s = Stream::new(self.data);
|
||||
|
||||
// The CBLC table version is a bit tricky, so we are ignoring it for now.
|
||||
// The CBLC table is based on the EBLC table, which in turn was based on the `bloc` table.
|
||||
// And before the CBLC table specification was finished, some fonts,
|
||||
// notably Noto Emoji, have used version 2.0, but the final spec allows only 3.0.
|
||||
// So there are perfectly valid fonts in the wild, which have an invalid version.
|
||||
s.skip::<u32>(); // version
|
||||
|
||||
let size_table = select_bitmap_size_table(glyph_id, pixels_per_em, s)?;
|
||||
let info = select_index_subtable(self.data, size_table, glyph_id)?;
|
||||
|
||||
let mut s = Stream::new_at(self.data, info.offset)?;
|
||||
let index_format = s.read::<u16>()?;
|
||||
let image_format = s.read::<u16>()?;
|
||||
let mut image_offset = s.read::<Offset32>()?.to_usize();
|
||||
|
||||
let image_format = match image_format {
|
||||
17 => BitmapFormat::Format17,
|
||||
18 => BitmapFormat::Format18,
|
||||
19 => BitmapFormat::Format19,
|
||||
_ => return None, // Invalid format.
|
||||
};
|
||||
|
||||
// TODO: I wasn't able to find fonts with index formats 4 and 5, so they are untested.
|
||||
|
||||
let glyph_diff = glyph_id.0.checked_sub(info.start_glyph_id.0)?;
|
||||
let metrics = Metrics::default();
|
||||
match index_format {
|
||||
1 => {
|
||||
s.advance(usize::from(glyph_diff) * Offset32::SIZE);
|
||||
let offset = s.read::<Offset32>()?;
|
||||
image_offset += offset.to_usize();
|
||||
}
|
||||
2 => {
|
||||
let image_size = s.read::<u32>()?;
|
||||
image_offset += usize::from(glyph_diff).checked_mul(usize::num_from(image_size))?;
|
||||
}
|
||||
3 => {
|
||||
s.advance(usize::from(glyph_diff) * Offset16::SIZE);
|
||||
let offset = s.read::<Offset16>()?;
|
||||
image_offset += offset.to_usize();
|
||||
}
|
||||
4 => {
|
||||
let num_glyphs = s.read::<u32>()?;
|
||||
let num_glyphs = num_glyphs.checked_add(1)?;
|
||||
let pairs = s.read_array32::<GlyphIdOffsetPair>(num_glyphs)?;
|
||||
let pair = pairs.into_iter().find(|pair| pair.glyph_id == glyph_id)?;
|
||||
image_offset += pair.offset.to_usize();
|
||||
}
|
||||
5 => {
|
||||
let image_size = s.read::<u32>()?;
|
||||
s.advance(8); // big metrics
|
||||
let num_glyphs = s.read::<u32>()?;
|
||||
let glyphs = s.read_array32::<GlyphId>(num_glyphs)?;
|
||||
let (index, _) = glyphs.binary_search(&glyph_id)?;
|
||||
image_offset = image_offset
|
||||
.checked_add(usize::num_from(index).checked_mul(usize::num_from(image_size))?)?;
|
||||
}
|
||||
_ => return None, // Invalid format.
|
||||
}
|
||||
|
||||
Some(Location {
|
||||
format: image_format,
|
||||
offset: image_offset,
|
||||
metrics,
|
||||
ppem: size_table.ppem,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,64 @@
|
|||
use super::CFFError;
|
||||
|
||||
pub struct ArgumentsStack<'a> {
|
||||
pub data: &'a mut [f32],
|
||||
pub len: usize,
|
||||
pub max_len: usize,
|
||||
}
|
||||
|
||||
impl<'a> ArgumentsStack<'a> {
|
||||
#[inline]
|
||||
pub fn len(&self) -> usize {
|
||||
self.len
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.len == 0
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn push(&mut self, n: f32) -> Result<(), CFFError> {
|
||||
if self.len == self.max_len {
|
||||
Err(CFFError::ArgumentsStackLimitReached)
|
||||
} else {
|
||||
self.data[self.len] = n;
|
||||
self.len += 1;
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn at(&self, index: usize) -> f32 {
|
||||
self.data[index]
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn pop(&mut self) -> f32 {
|
||||
debug_assert!(!self.is_empty());
|
||||
self.len -= 1;
|
||||
self.data[self.len]
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn reverse(&mut self) {
|
||||
if self.is_empty() {
|
||||
return;
|
||||
}
|
||||
|
||||
// Reverse only the actual data and not the whole stack.
|
||||
let (first, _) = self.data.split_at_mut(self.len);
|
||||
first.reverse();
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn clear(&mut self) {
|
||||
self.len = 0;
|
||||
}
|
||||
}
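A hedged sketch of how this stack is used: it is backed by a fixed, caller-provided buffer and errors out instead of growing.
#[allow(dead_code)]
fn argstack_sketch() -> Result<f32, CFFError> {
    let mut buf = [0.0; 4];
    let mut stack = ArgumentsStack { data: &mut buf, len: 0, max_len: 4 };
    stack.push(1.0)?;
    stack.push(2.0)?;
    // Pops in LIFO order; pushing beyond `max_len` would return
    // `CFFError::ArgumentsStackLimitReached`.
    Ok(stack.pop())
}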
|
||||
|
||||
impl core::fmt::Debug for ArgumentsStack<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_list().entries(&self.data[..self.len]).finish()
|
||||
}
|
||||
}
|
|
@ -0,0 +1,842 @@
|
|||
//! A [Compact Font Format Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cff) implementation.
|
||||
|
||||
// Useful links:
|
||||
// http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5176.CFF.pdf
|
||||
// http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/font/pdfs/5177.Type2.pdf
|
||||
// https://github.com/opentypejs/opentype.js/blob/master/src/tables/cff.js
|
||||
|
||||
use core::convert::TryFrom;
|
||||
use core::ops::Range;
|
||||
|
||||
use crate::{GlyphId, OutlineBuilder, Rect, BBox};
|
||||
use crate::parser::{Stream, LazyArray16, NumFrom, TryNumFrom};
|
||||
use super::{Builder, IsEven, CFFError, StringId, calc_subroutine_bias, conv_subroutine_index};
|
||||
use super::argstack::ArgumentsStack;
|
||||
use super::charset::{STANDARD_ENCODING, Charset, parse_charset};
|
||||
use super::charstring::CharStringParser;
|
||||
use super::dict::DictionaryParser;
|
||||
use super::index::{Index, parse_index, skip_index};
|
||||
#[cfg(feature = "glyph-names")] use super::std_names::STANDARD_NAMES;
|
||||
|
||||
// Limits according to the Adobe Technical Note #5176, chapter 4 DICT Data.
|
||||
const MAX_OPERANDS_LEN: usize = 48;
|
||||
|
||||
// Limits according to the Adobe Technical Note #5177 Appendix B.
|
||||
const STACK_LIMIT: u8 = 10;
|
||||
const MAX_ARGUMENTS_STACK_LEN: usize = 48;
|
||||
|
||||
const TWO_BYTE_OPERATOR_MARK: u8 = 12;
|
||||
|
||||
/// Enumerates some operators defined in the Adobe Technical Note #5177.
|
||||
mod operator {
|
||||
pub const HORIZONTAL_STEM: u8 = 1;
|
||||
pub const VERTICAL_STEM: u8 = 3;
|
||||
pub const VERTICAL_MOVE_TO: u8 = 4;
|
||||
pub const LINE_TO: u8 = 5;
|
||||
pub const HORIZONTAL_LINE_TO: u8 = 6;
|
||||
pub const VERTICAL_LINE_TO: u8 = 7;
|
||||
pub const CURVE_TO: u8 = 8;
|
||||
pub const CALL_LOCAL_SUBROUTINE: u8 = 10;
|
||||
pub const RETURN: u8 = 11;
|
||||
pub const ENDCHAR: u8 = 14;
|
||||
pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18;
|
||||
pub const HINT_MASK: u8 = 19;
|
||||
pub const COUNTER_MASK: u8 = 20;
|
||||
pub const MOVE_TO: u8 = 21;
|
||||
pub const HORIZONTAL_MOVE_TO: u8 = 22;
|
||||
pub const VERTICAL_STEM_HINT_MASK: u8 = 23;
|
||||
pub const CURVE_LINE: u8 = 24;
|
||||
pub const LINE_CURVE: u8 = 25;
|
||||
pub const VV_CURVE_TO: u8 = 26;
|
||||
pub const HH_CURVE_TO: u8 = 27;
|
||||
pub const SHORT_INT: u8 = 28;
|
||||
pub const CALL_GLOBAL_SUBROUTINE: u8 = 29;
|
||||
pub const VH_CURVE_TO: u8 = 30;
|
||||
pub const HV_CURVE_TO: u8 = 31;
|
||||
pub const HFLEX: u8 = 34;
|
||||
pub const FLEX: u8 = 35;
|
||||
pub const HFLEX1: u8 = 36;
|
||||
pub const FLEX1: u8 = 37;
|
||||
pub const FIXED_16_16: u8 = 255;
|
||||
}
|
||||
|
||||
/// Enumerates some operators defined in the Adobe Technical Note #5176,
|
||||
/// Table 9 Top DICT Operator Entries
|
||||
mod top_dict_operator {
|
||||
pub const CHARSET_OFFSET: u16 = 15;
|
||||
pub const CHAR_STRINGS_OFFSET: u16 = 17;
|
||||
pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18;
|
||||
pub const ROS: u16 = 1230;
|
||||
pub const FD_ARRAY: u16 = 1236;
|
||||
pub const FD_SELECT: u16 = 1237;
|
||||
}
|
||||
|
||||
/// Enumerates some operators defined in the Adobe Technical Note #5176,
|
||||
/// Table 23 Private DICT Operators
|
||||
mod private_dict_operator {
|
||||
pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19;
|
||||
}
|
||||
|
||||
/// Enumerates Charset IDs defined in the Adobe Technical Note #5176, Table 22
|
||||
mod charset_id {
|
||||
pub const ISO_ADOBE: usize = 0;
|
||||
pub const EXPERT: usize = 1;
|
||||
pub const EXPERT_SUBSET: usize = 2;
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) enum FontKind<'a> {
|
||||
SID(SIDMetadata<'a>),
|
||||
CID(CIDMetadata<'a>),
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub(crate) struct SIDMetadata<'a> {
|
||||
local_subrs: Index<'a>,
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub(crate) struct CIDMetadata<'a> {
|
||||
fd_array: Index<'a>,
|
||||
fd_select: FDSelect<'a>,
|
||||
}
|
||||
|
||||
#[derive(Default)]
|
||||
struct TopDict {
|
||||
charset_offset: Option<usize>,
|
||||
char_strings_offset: usize,
|
||||
private_dict_range: Option<Range<usize>>,
|
||||
has_ros: bool,
|
||||
fd_array_offset: Option<usize>,
|
||||
fd_select_offset: Option<usize>,
|
||||
}
|
||||
|
||||
fn parse_top_dict(s: &mut Stream) -> Option<TopDict> {
|
||||
let mut top_dict = TopDict::default();
|
||||
|
||||
let index = parse_index::<u16>(s)?;
|
||||
|
||||
// The Top DICT INDEX should have only one dictionary.
|
||||
let data = index.get(0)?;
|
||||
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
match operator.get() {
|
||||
top_dict_operator::CHARSET_OFFSET => {
|
||||
top_dict.charset_offset = dict_parser.parse_offset();
|
||||
}
|
||||
top_dict_operator::CHAR_STRINGS_OFFSET => {
|
||||
top_dict.char_strings_offset = dict_parser.parse_offset()?;
|
||||
}
|
||||
top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET => {
|
||||
top_dict.private_dict_range = dict_parser.parse_range();
|
||||
}
|
||||
top_dict_operator::ROS => {
|
||||
top_dict.has_ros = true;
|
||||
}
|
||||
top_dict_operator::FD_ARRAY => {
|
||||
top_dict.fd_array_offset = dict_parser.parse_offset();
|
||||
}
|
||||
top_dict_operator::FD_SELECT => {
|
||||
top_dict.fd_select_offset = dict_parser.parse_offset();
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
Some(top_dict)
|
||||
}
|
||||
|
||||
// TODO: move to integration
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn private_dict_size_overflow() {
|
||||
let data = &[
|
||||
0x00, 0x01, // count: 1
|
||||
0x01, // offset size: 1
|
||||
0x01, // index [0]: 1
|
||||
0x0C, // index [1]: 14
|
||||
0x1D, 0x7F, 0xFF, 0xFF, 0xFF, // length: i32::MAX
|
||||
0x1D, 0x7F, 0xFF, 0xFF, 0xFF, // offset: i32::MAX
|
||||
0x12 // operator: 18 (private)
|
||||
];
|
||||
|
||||
let top_dict = parse_top_dict(&mut Stream::new(data)).unwrap();
|
||||
assert_eq!(top_dict.private_dict_range, Some(2147483647..4294967294));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn private_dict_negative_char_strings_offset() {
|
||||
let data = &[
|
||||
0x00, 0x01, // count: 1
|
||||
0x01, // offset size: 1
|
||||
0x01, // index [0]: 1
|
||||
0x03, // index [1]: 3
|
||||
// Item 0
|
||||
0x8A, // offset: -1
|
||||
0x11, // operator: 17 (char_string)
|
||||
];
|
||||
|
||||
assert!(parse_top_dict(&mut Stream::new(data)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn private_dict_no_char_strings_offset_operand() {
|
||||
let data = &[
|
||||
0x00, 0x01, // count: 1
|
||||
0x01, // offset size: 1
|
||||
0x01, // index [0]: 1
|
||||
0x02, // index [1]: 2
|
||||
// Item 0
|
||||
// <-- No number here.
|
||||
0x11, // operator: 17 (char_string)
|
||||
];
|
||||
|
||||
assert!(parse_top_dict(&mut Stream::new(data)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn negative_private_dict_offset_and_size() {
|
||||
let data = &[
|
||||
0x00, 0x01, // count: 1
|
||||
0x01, // offset size: 1
|
||||
0x01, // index [0]: 1
|
||||
0x04, // index [1]: 4
|
||||
// Item 0
|
||||
0x8A, // length: -1
|
||||
0x8A, // offset: -1
|
||||
0x12, // operator: 18 (private)
|
||||
];
|
||||
|
||||
let top_dict = parse_top_dict(&mut Stream::new(data)).unwrap();
|
||||
assert!(top_dict.private_dict_range.is_none());
|
||||
}
|
||||
}
|
||||
|
||||
fn parse_private_dict(data: &[u8]) -> Option<usize> {
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
if operator.get() == private_dict_operator::LOCAL_SUBROUTINES_OFFSET {
|
||||
return dict_parser.parse_offset();
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
fn parse_font_dict(data: &[u8]) -> Option<Range<usize>> {
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
if operator.get() == top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET {
|
||||
return dict_parser.parse_range();
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
/// In CID fonts, to get local subroutines we have to:
|
||||
/// 1. Find Font DICT index via FDSelect by GID.
|
||||
/// 2. Get Font DICT data from FDArray using this index.
|
||||
/// 3. Get a Private DICT offset from a Font DICT.
|
||||
/// 4. Get a local subroutine offset from Private DICT.
|
||||
/// 5. Parse a local subroutine at offset.
|
||||
fn parse_cid_local_subrs<'a>(
|
||||
data: &'a [u8],
|
||||
glyph_id: GlyphId,
|
||||
cid: &CIDMetadata,
|
||||
) -> Option<Index<'a>> {
|
||||
let font_dict_index = cid.fd_select.font_dict_index(glyph_id)?;
|
||||
let font_dict_data = cid.fd_array.get(u32::from(font_dict_index))?;
|
||||
let private_dict_range = parse_font_dict(font_dict_data)?;
|
||||
let private_dict_data = data.get(private_dict_range.clone())?;
|
||||
let subroutines_offset = parse_private_dict(private_dict_data)?;
|
||||
|
||||
// 'The local subroutines offset is relative to the beginning
|
||||
// of the Private DICT data.'
|
||||
let start = private_dict_range.start.checked_add(subroutines_offset)?;
|
||||
let subrs_data = data.get(start..)?;
|
||||
let mut s = Stream::new(subrs_data);
|
||||
parse_index::<u16>(&mut s)
|
||||
}
|
||||
|
||||
struct CharStringParserContext<'a> {
|
||||
metadata: &'a Table<'a>,
|
||||
width_parsed: bool,
|
||||
stems_len: u32,
|
||||
has_endchar: bool,
|
||||
has_seac: bool,
|
||||
glyph_id: GlyphId, // Required to parse local subroutine in CID fonts.
|
||||
local_subrs: Option<Index<'a>>,
|
||||
}
|
||||
|
||||
fn parse_char_string(
|
||||
data: &[u8],
|
||||
metadata: &Table,
|
||||
glyph_id: GlyphId,
|
||||
builder: &mut dyn OutlineBuilder,
|
||||
) -> Result<Rect, CFFError> {
|
||||
let local_subrs = match metadata.kind {
|
||||
FontKind::SID(ref sid) => Some(sid.local_subrs),
|
||||
FontKind::CID(_) => None, // Will be resolved on request.
|
||||
};
|
||||
|
||||
let mut ctx = CharStringParserContext {
|
||||
metadata,
|
||||
width_parsed: false,
|
||||
stems_len: 0,
|
||||
has_endchar: false,
|
||||
has_seac: false,
|
||||
glyph_id,
|
||||
local_subrs,
|
||||
};
|
||||
|
||||
let mut inner_builder = Builder {
|
||||
builder,
|
||||
bbox: BBox::new(),
|
||||
};
|
||||
|
||||
let stack = ArgumentsStack {
|
||||
data: &mut [0.0; MAX_ARGUMENTS_STACK_LEN], // 192B
|
||||
len: 0,
|
||||
max_len: MAX_ARGUMENTS_STACK_LEN,
|
||||
};
|
||||
let mut parser = CharStringParser {
|
||||
stack,
|
||||
builder: &mut inner_builder,
|
||||
x: 0.0,
|
||||
y: 0.0,
|
||||
has_move_to: false,
|
||||
is_first_move_to: true,
|
||||
};
|
||||
_parse_char_string(&mut ctx, data, 0, &mut parser)?;
|
||||
|
||||
if !ctx.has_endchar {
|
||||
return Err(CFFError::MissingEndChar);
|
||||
}
|
||||
|
||||
let bbox = parser.builder.bbox;
|
||||
|
||||
// Check that bbox was changed.
|
||||
if bbox.is_default() {
|
||||
return Err(CFFError::ZeroBBox);
|
||||
}
|
||||
|
||||
bbox.to_rect().ok_or(CFFError::BboxOverflow)
|
||||
}
|
||||
|
||||
|
||||
fn _parse_char_string(
|
||||
ctx: &mut CharStringParserContext,
|
||||
char_string: &[u8],
|
||||
depth: u8,
|
||||
p: &mut CharStringParser,
|
||||
) -> Result<(), CFFError> {
|
||||
let mut s = Stream::new(char_string);
|
||||
while !s.at_end() {
|
||||
let op = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
match op {
|
||||
0 | 2 | 9 | 13 | 15 | 16 | 17 => {
|
||||
// Reserved.
|
||||
return Err(CFFError::InvalidOperator);
|
||||
}
|
||||
operator::HORIZONTAL_STEM |
|
||||
operator::VERTICAL_STEM |
|
||||
operator::HORIZONTAL_STEM_HINT_MASK |
|
||||
operator::VERTICAL_STEM_HINT_MASK => {
|
||||
// y dy {dya dyb}* hstem
|
||||
// x dx {dxa dxb}* vstem
|
||||
// y dy {dya dyb}* hstemhm
|
||||
// x dx {dxa dxb}* vstemhm
|
||||
|
||||
// If the stack length is odd, then the first value is a `width`.
|
||||
let len = if p.stack.len().is_odd() && !ctx.width_parsed {
|
||||
ctx.width_parsed = true;
|
||||
p.stack.len() - 1
|
||||
} else {
|
||||
p.stack.len()
|
||||
};
|
||||
|
||||
ctx.stems_len += len as u32 >> 1;
|
||||
|
||||
// We are ignoring the hint operators.
|
||||
p.stack.clear();
|
||||
}
|
||||
operator::VERTICAL_MOVE_TO => {
|
||||
let mut i = 0;
|
||||
if p.stack.len() == 2 && !ctx.width_parsed {
|
||||
i += 1;
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
p.parse_vertical_move_to(i)?;
|
||||
}
|
||||
operator::LINE_TO => {
|
||||
p.parse_line_to()?;
|
||||
}
|
||||
operator::HORIZONTAL_LINE_TO => {
|
||||
p.parse_horizontal_line_to()?;
|
||||
}
|
||||
operator::VERTICAL_LINE_TO => {
|
||||
p.parse_vertical_line_to()?;
|
||||
}
|
||||
operator::CURVE_TO => {
|
||||
p.parse_curve_to()?;
|
||||
}
|
||||
operator::CALL_LOCAL_SUBROUTINE => {
|
||||
if p.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if depth == STACK_LIMIT {
|
||||
return Err(CFFError::NestingLimitReached);
|
||||
}
|
||||
|
||||
// Parse and remember the local subroutine for the current glyph.
|
||||
// Since it's a pretty complex task, we're doing it only when
|
||||
// a local subroutine is actually requested by the glyph's charstring.
|
||||
if ctx.local_subrs.is_none() {
|
||||
if let FontKind::CID(ref cid) = ctx.metadata.kind {
|
||||
ctx.local_subrs = parse_cid_local_subrs(
|
||||
ctx.metadata.table_data, ctx.glyph_id, cid
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(local_subrs) = ctx.local_subrs {
|
||||
let subroutine_bias = calc_subroutine_bias(local_subrs.len());
|
||||
let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
|
||||
let char_string = local_subrs.get(index).ok_or(CFFError::InvalidSubroutineIndex)?;
|
||||
_parse_char_string(ctx, char_string, depth + 1, p)?;
|
||||
} else {
|
||||
return Err(CFFError::NoLocalSubroutines);
|
||||
}
|
||||
|
||||
if ctx.has_endchar && !ctx.has_seac {
|
||||
if !s.at_end() {
|
||||
return Err(CFFError::DataAfterEndChar);
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
}
|
||||
operator::RETURN => {
|
||||
break;
|
||||
}
|
||||
TWO_BYTE_OPERATOR_MARK => {
|
||||
// flex
|
||||
let op2 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
match op2 {
|
||||
operator::HFLEX => p.parse_hflex()?,
|
||||
operator::FLEX => p.parse_flex()?,
|
||||
operator::HFLEX1 => p.parse_hflex1()?,
|
||||
operator::FLEX1 => p.parse_flex1()?,
|
||||
_ => return Err(CFFError::UnsupportedOperator),
|
||||
}
|
||||
}
|
||||
operator::ENDCHAR => {
|
||||
if p.stack.len() == 4 || (!ctx.width_parsed && p.stack.len() == 5) {
|
||||
// Process 'seac'.
|
||||
let accent_char = seac_code_to_glyph_id(&ctx.metadata.charset, p.stack.pop())
|
||||
.ok_or(CFFError::InvalidSeacCode)?;
|
||||
let base_char = seac_code_to_glyph_id(&ctx.metadata.charset, p.stack.pop())
|
||||
.ok_or(CFFError::InvalidSeacCode)?;
|
||||
let dy = p.stack.pop();
|
||||
let dx = p.stack.pop();
|
||||
|
||||
if !ctx.width_parsed && !p.stack.is_empty() {
|
||||
p.stack.pop();
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
ctx.has_seac = true;
|
||||
|
||||
if depth == STACK_LIMIT {
|
||||
return Err(CFFError::NestingLimitReached);
|
||||
}
|
||||
|
||||
let base_char_string = ctx.metadata.char_strings.get(u32::from(base_char.0))
|
||||
.ok_or(CFFError::InvalidSeacCode)?;
|
||||
_parse_char_string(ctx, base_char_string, depth + 1, p)?;
|
||||
p.x = dx;
|
||||
p.y = dy;
|
||||
|
||||
let accent_char_string = ctx.metadata.char_strings.get(u32::from(accent_char.0))
|
||||
.ok_or(CFFError::InvalidSeacCode)?;
|
||||
_parse_char_string(ctx, accent_char_string, depth + 1, p)?;
|
||||
} else if p.stack.len() == 1 && !ctx.width_parsed {
|
||||
p.stack.pop();
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
if !p.is_first_move_to {
|
||||
p.is_first_move_to = true;
|
||||
p.builder.close();
|
||||
}
|
||||
|
||||
if !s.at_end() {
|
||||
return Err(CFFError::DataAfterEndChar);
|
||||
}
|
||||
|
||||
ctx.has_endchar = true;
|
||||
|
||||
break;
|
||||
}
|
||||
operator::HINT_MASK | operator::COUNTER_MASK => {
|
||||
let mut len = p.stack.len();
|
||||
|
||||
// We are ignoring the hint operators.
|
||||
p.stack.clear();
|
||||
|
||||
// If the stack length is uneven, then the first value is a `width`.
|
||||
if len.is_odd() && !ctx.width_parsed {
|
||||
len -= 1;
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
ctx.stems_len += len as u32 >> 1;
|
||||
|
||||
s.advance(usize::num_from((ctx.stems_len + 7) >> 3));
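// The mask that follows stores one bit per stem hint, so it occupies
// (stems_len + 7) / 8 bytes.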
|
||||
}
|
||||
operator::MOVE_TO => {
|
||||
let mut i = 0;
|
||||
if p.stack.len() == 3 && !ctx.width_parsed {
|
||||
i += 1;
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
p.parse_move_to(i)?;
|
||||
}
|
||||
operator::HORIZONTAL_MOVE_TO => {
|
||||
let mut i = 0;
|
||||
if p.stack.len() == 2 && !ctx.width_parsed {
|
||||
i += 1;
|
||||
ctx.width_parsed = true;
|
||||
}
|
||||
|
||||
p.parse_horizontal_move_to(i)?;
|
||||
}
|
||||
operator::CURVE_LINE => {
|
||||
p.parse_curve_line()?;
|
||||
}
|
||||
operator::LINE_CURVE => {
|
||||
p.parse_line_curve()?;
|
||||
}
|
||||
operator::VV_CURVE_TO => {
|
||||
p.parse_vv_curve_to()?;
|
||||
}
|
||||
operator::HH_CURVE_TO => {
|
||||
p.parse_hh_curve_to()?;
|
||||
}
|
||||
operator::SHORT_INT => {
|
||||
let n = s.read::<i16>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
p.stack.push(f32::from(n))?;
|
||||
}
|
||||
operator::CALL_GLOBAL_SUBROUTINE => {
|
||||
if p.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if depth == STACK_LIMIT {
|
||||
return Err(CFFError::NestingLimitReached);
|
||||
}
|
||||
|
||||
let subroutine_bias = calc_subroutine_bias(ctx.metadata.global_subrs.len());
|
||||
let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
|
||||
let char_string = ctx.metadata.global_subrs.get(index)
|
||||
.ok_or(CFFError::InvalidSubroutineIndex)?;
|
||||
_parse_char_string(ctx, char_string, depth + 1, p)?;
|
||||
|
||||
if ctx.has_endchar && !ctx.has_seac {
|
||||
if !s.at_end() {
|
||||
return Err(CFFError::DataAfterEndChar);
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
}
|
||||
operator::VH_CURVE_TO => {
|
||||
p.parse_vh_curve_to()?;
|
||||
}
|
||||
operator::HV_CURVE_TO => {
|
||||
p.parse_hv_curve_to()?;
|
||||
}
|
||||
32..=246 => {
|
||||
p.parse_int1(op)?;
|
||||
}
|
||||
247..=250 => {
|
||||
p.parse_int2(op, &mut s)?;
|
||||
}
|
||||
251..=254 => {
|
||||
p.parse_int3(op, &mut s)?;
|
||||
}
|
||||
operator::FIXED_16_16 => {
|
||||
p.parse_fixed(&mut s)?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// TODO: 'A charstring subroutine must end with either an endchar or a return operator.'
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
fn seac_code_to_glyph_id(charset: &Charset, n: f32) -> Option<GlyphId> {
|
||||
let code = u8::try_num_from(n)?;
|
||||
|
||||
let sid = STANDARD_ENCODING[code as usize];
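// `seac` codes are interpreted via the Standard Encoding, which maps the 8-bit
// code to a SID in the standard strings.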
|
||||
let sid = StringId(u16::from(sid));
|
||||
|
||||
match charset {
|
||||
Charset::ISOAdobe => {
|
||||
// ISO Adobe charset only defines string ids up to 228 (zcaron)
|
||||
if code <= 228 { Some(GlyphId(sid.0)) } else { None }
|
||||
}
|
||||
Charset::Expert | Charset::ExpertSubset => None,
|
||||
_ => charset.sid_to_gid(sid),
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
enum FDSelect<'a> {
|
||||
Format0(LazyArray16<'a, u8>),
|
||||
Format3(&'a [u8]), // It's easier to parse it in-place.
|
||||
}
|
||||
|
||||
impl Default for FDSelect<'_> {
|
||||
fn default() -> Self {
|
||||
FDSelect::Format0(LazyArray16::default())
|
||||
}
|
||||
}
|
||||
|
||||
impl FDSelect<'_> {
|
||||
fn font_dict_index(&self, glyph_id: GlyphId) -> Option<u8> {
|
||||
match self {
|
||||
FDSelect::Format0(ref array) => array.get(glyph_id.0),
|
||||
FDSelect::Format3(ref data) => {
|
||||
let mut s = Stream::new(data);
|
||||
let number_of_ranges = s.read::<u16>()?;
|
||||
if number_of_ranges == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// 'A sentinel GID follows the last range element and serves
|
||||
// to delimit the last range in the array.'
|
||||
// So we can simply increase the number of ranges by one.
|
||||
let number_of_ranges = number_of_ranges.checked_add(1)?;
|
||||
|
||||
// Range is: GlyphId + u8
|
||||
let mut prev_first_glyph = s.read::<GlyphId>()?;
|
||||
let mut prev_index = s.read::<u8>()?;
|
||||
for _ in 1..number_of_ranges {
|
||||
let curr_first_glyph = s.read::<GlyphId>()?;
|
||||
if (prev_first_glyph..curr_first_glyph).contains(&glyph_id) {
|
||||
return Some(prev_index);
|
||||
} else {
|
||||
prev_index = s.read::<u8>()?;
|
||||
}
|
||||
|
||||
prev_first_glyph = curr_first_glyph;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn parse_fd_select<'a>(number_of_glyphs: u16, s: &mut Stream<'a>) -> Option<FDSelect<'a>> {
|
||||
let format = s.read::<u8>()?;
|
||||
match format {
|
||||
0 => Some(FDSelect::Format0(s.read_array16::<u8>(number_of_glyphs)?)),
|
||||
3 => Some(FDSelect::Format3(s.tail()?)),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
fn parse_sid_metadata(data: &[u8], top_dict: TopDict) -> Option<FontKind> {
|
||||
let subroutines_offset = if let Some(range) = top_dict.private_dict_range.clone() {
|
||||
parse_private_dict(data.get(range)?)
|
||||
} else {
|
||||
None
|
||||
};
|
||||
|
||||
// Parse Global Subroutines INDEX.
|
||||
let mut metadata = SIDMetadata::default();
|
||||
|
||||
match (top_dict.private_dict_range, subroutines_offset) {
|
||||
(Some(private_dict_range), Some(subroutines_offset)) => {
|
||||
// 'The local subroutines offset is relative to the beginning
|
||||
// of the Private DICT data.'
|
||||
if let Some(start) = private_dict_range.start.checked_add(subroutines_offset) {
|
||||
let data = data.get(start..data.len())?;
|
||||
let mut s = Stream::new(data);
|
||||
metadata.local_subrs = parse_index::<u16>(&mut s)?;
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
|
||||
Some(FontKind::SID(metadata))
|
||||
}
|
||||
|
||||
fn parse_cid_metadata(data: &[u8], top_dict: TopDict, number_of_glyphs: u16) -> Option<FontKind> {
|
||||
let (charset_offset, fd_array_offset, fd_select_offset) =
|
||||
match (top_dict.charset_offset, top_dict.fd_array_offset, top_dict.fd_select_offset) {
|
||||
(Some(a), Some(b), Some(c)) => (a, b, c),
|
||||
_ => return None, // charset, FDArray and FDSelect must be set.
|
||||
};
|
||||
|
||||
if charset_offset <= charset_id::EXPERT_SUBSET {
|
||||
// 'There are no predefined charsets for CID fonts.'
|
||||
// Adobe Technical Note #5176, chapter 18 CID-keyed Fonts
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut metadata = CIDMetadata::default();
|
||||
|
||||
metadata.fd_array = {
|
||||
let mut s = Stream::new_at(data, fd_array_offset)?;
|
||||
parse_index::<u16>(&mut s)?
|
||||
};
|
||||
|
||||
metadata.fd_select = {
|
||||
let mut s = Stream::new_at(data, fd_select_offset)?;
|
||||
parse_fd_select(number_of_glyphs, &mut s)?
|
||||
};
|
||||
|
||||
Some(FontKind::CID(metadata))
|
||||
}
|
||||
|
||||
|
||||
/// A [Compact Font Format Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cff).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
// The whole CFF table.
|
||||
// Used to resolve a local subroutine in a CID font.
|
||||
table_data: &'a [u8],
|
||||
|
||||
#[allow(dead_code)] strings: Index<'a>,
|
||||
global_subrs: Index<'a>,
|
||||
charset: Charset<'a>,
|
||||
char_strings: Index<'a>,
|
||||
kind: FontKind<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
// Parse Header.
|
||||
let major = s.read::<u8>()?;
|
||||
s.skip::<u8>(); // minor
|
||||
let header_size = s.read::<u8>()?;
|
||||
s.skip::<u8>(); // offSize (absolute offset size)
|
||||
|
||||
if major != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// Jump to Name INDEX. It's not necessarily right after the header.
|
||||
if header_size > 4 {
|
||||
s.advance(usize::from(header_size) - 4);
|
||||
}
|
||||
|
||||
// Skip Name INDEX.
|
||||
skip_index::<u16>(&mut s)?;
|
||||
|
||||
let top_dict = parse_top_dict(&mut s)?;
|
||||
|
||||
// Must be set, otherwise there is nothing to parse.
|
||||
if top_dict.char_strings_offset == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// String INDEX.
|
||||
let strings = parse_index::<u16>(&mut s)?;
|
||||
|
||||
// Parse Global Subroutines INDEX.
|
||||
let global_subrs = parse_index::<u16>(&mut s)?;
|
||||
|
||||
let char_strings = {
|
||||
let mut s = Stream::new_at(data, top_dict.char_strings_offset)?;
|
||||
parse_index::<u16>(&mut s)?
|
||||
};
|
||||
|
||||
if char_strings.len() == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// 'The number of glyphs is the value of the count field in the CharStrings INDEX.'
|
||||
let number_of_glyphs = u16::try_from(char_strings.len()).ok()?;
|
||||
|
||||
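// Charset offsets 0, 1 and 2 are not real offsets but the predefined
// ISOAdobe, Expert and ExpertSubset charsets.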
let charset = match top_dict.charset_offset {
|
||||
Some(charset_id::ISO_ADOBE) => Charset::ISOAdobe,
|
||||
Some(charset_id::EXPERT) => Charset::Expert,
|
||||
Some(charset_id::EXPERT_SUBSET) => Charset::ExpertSubset,
|
||||
Some(offset) => parse_charset(number_of_glyphs, &mut Stream::new_at(data, offset)?)?,
|
||||
None => Charset::ISOAdobe, // default
|
||||
};
|
||||
|
||||
let kind = if top_dict.has_ros {
|
||||
parse_cid_metadata(data, top_dict, number_of_glyphs)?
|
||||
} else {
|
||||
parse_sid_metadata(data, top_dict)?
|
||||
};
|
||||
|
||||
Some(Self {
|
||||
table_data: data,
|
||||
strings,
|
||||
global_subrs,
|
||||
charset,
|
||||
char_strings,
|
||||
kind,
|
||||
})
|
||||
}
|
||||
|
||||
/// Outlines a glyph.
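///
/// A minimal usage sketch; `cff_data` and `builder` are assumed to be provided
/// by the caller (`builder` being any `OutlineBuilder` implementation):
///
/// ```ignore
/// let table = Table::parse(cff_data).unwrap();
/// let rect = table.outline(GlyphId(0), &mut builder).unwrap();
/// ```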
|
||||
pub fn outline(
|
||||
&self,
|
||||
glyph_id: GlyphId,
|
||||
builder: &mut dyn OutlineBuilder,
|
||||
) -> Result<Rect, CFFError> {
|
||||
let data = self.char_strings.get(u32::from(glyph_id.0)).ok_or(CFFError::NoGlyph)?;
|
||||
parse_char_string(data, self, glyph_id, builder)
|
||||
}
|
||||
|
||||
/// Returns a glyph name.
|
||||
#[cfg(feature = "glyph-names")]
|
||||
pub fn glyph_name(&self, glyph_id: GlyphId) -> Option<&'a str> {
|
||||
match self.kind {
|
||||
FontKind::SID(_) => {
|
||||
let sid = self.charset.gid_to_sid(glyph_id)?;
|
||||
let sid = usize::from(sid.0);
|
||||
match STANDARD_NAMES.get(sid) {
|
||||
Some(name) => Some(name),
|
||||
None => {
|
||||
let idx = u32::try_from(sid - STANDARD_NAMES.len()).ok()?;
|
||||
let name = self.strings.get(idx)?;
|
||||
core::str::from_utf8(name).ok()
|
||||
}
|
||||
}
|
||||
}
|
||||
FontKind::CID(_) => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,549 @@
|
|||
//! A [Compact Font Format 2 Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/cff2) implementation.
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr
|
||||
|
||||
use core::convert::TryFrom;
|
||||
use core::ops::Range;
|
||||
|
||||
use crate::{GlyphId, OutlineBuilder, Rect, BBox, NormalizedCoordinate};
|
||||
use crate::parser::{Stream, NumFrom, TryNumFrom};
|
||||
use crate::var_store::*;
|
||||
use super::{Builder, CFFError, calc_subroutine_bias, conv_subroutine_index};
|
||||
use super::argstack::ArgumentsStack;
|
||||
use super::charstring::CharStringParser;
|
||||
use super::dict::DictionaryParser;
|
||||
use super::index::{Index, parse_index};
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#7-top-dict-data
|
||||
// 'Operators in DICT may be preceded by up to a maximum of 513 operands.'
|
||||
const MAX_OPERANDS_LEN: usize = 513;
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr#appendix-b-cff2-charstring-implementation-limits
|
||||
const STACK_LIMIT: u8 = 10;
|
||||
const MAX_ARGUMENTS_STACK_LEN: usize = 513;
|
||||
|
||||
const TWO_BYTE_OPERATOR_MARK: u8 = 12;
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2charstr#4-charstring-operators
|
||||
mod operator {
|
||||
pub const HORIZONTAL_STEM: u8 = 1;
|
||||
pub const VERTICAL_STEM: u8 = 3;
|
||||
pub const VERTICAL_MOVE_TO: u8 = 4;
|
||||
pub const LINE_TO: u8 = 5;
|
||||
pub const HORIZONTAL_LINE_TO: u8 = 6;
|
||||
pub const VERTICAL_LINE_TO: u8 = 7;
|
||||
pub const CURVE_TO: u8 = 8;
|
||||
pub const CALL_LOCAL_SUBROUTINE: u8 = 10;
|
||||
pub const VS_INDEX: u8 = 15;
|
||||
pub const BLEND: u8 = 16;
|
||||
pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18;
|
||||
pub const HINT_MASK: u8 = 19;
|
||||
pub const COUNTER_MASK: u8 = 20;
|
||||
pub const MOVE_TO: u8 = 21;
|
||||
pub const HORIZONTAL_MOVE_TO: u8 = 22;
|
||||
pub const VERTICAL_STEM_HINT_MASK: u8 = 23;
|
||||
pub const CURVE_LINE: u8 = 24;
|
||||
pub const LINE_CURVE: u8 = 25;
|
||||
pub const VV_CURVE_TO: u8 = 26;
|
||||
pub const HH_CURVE_TO: u8 = 27;
|
||||
pub const SHORT_INT: u8 = 28;
|
||||
pub const CALL_GLOBAL_SUBROUTINE: u8 = 29;
|
||||
pub const VH_CURVE_TO: u8 = 30;
|
||||
pub const HV_CURVE_TO: u8 = 31;
|
||||
pub const HFLEX: u8 = 34;
|
||||
pub const FLEX: u8 = 35;
|
||||
pub const HFLEX1: u8 = 36;
|
||||
pub const FLEX1: u8 = 37;
|
||||
pub const FIXED_16_16: u8 = 255;
|
||||
}
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-9-top-dict-operator-entries
|
||||
mod top_dict_operator {
|
||||
pub const CHAR_STRINGS_OFFSET: u16 = 17;
|
||||
pub const VARIATION_STORE_OFFSET: u16 = 24;
|
||||
pub const FONT_DICT_INDEX_OFFSET: u16 = 1236;
|
||||
}
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-10-font-dict-operator-entries
|
||||
mod font_dict_operator {
|
||||
pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18;
|
||||
}
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2#table-16-private-dict-operators
|
||||
mod private_dict_operator {
|
||||
pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19;
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Default)]
|
||||
struct TopDictData {
|
||||
char_strings_offset: usize,
|
||||
font_dict_index_offset: Option<usize>,
|
||||
variation_store_offset: Option<usize>,
|
||||
}
|
||||
|
||||
fn parse_top_dict(data: &[u8]) -> Option<TopDictData> {
|
||||
let mut dict_data = TopDictData::default();
|
||||
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
if operator.get() == top_dict_operator::CHAR_STRINGS_OFFSET {
|
||||
dict_data.char_strings_offset = dict_parser.parse_offset()?;
|
||||
} else if operator.get() == top_dict_operator::FONT_DICT_INDEX_OFFSET {
|
||||
dict_data.font_dict_index_offset = dict_parser.parse_offset();
|
||||
} else if operator.get() == top_dict_operator::VARIATION_STORE_OFFSET {
|
||||
dict_data.variation_store_offset = dict_parser.parse_offset();
|
||||
}
|
||||
}
|
||||
|
||||
// Must be set, otherwise there is nothing to parse.
|
||||
if dict_data.char_strings_offset == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
Some(dict_data)
|
||||
}
|
||||
|
||||
fn parse_font_dict(data: &[u8]) -> Option<Range<usize>> {
|
||||
let mut private_dict_range = None;
|
||||
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
if operator.get() == font_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET {
|
||||
dict_parser.parse_operands()?;
|
||||
let operands = dict_parser.operands();
|
||||
|
||||
if operands.len() == 2 {
|
||||
let len = usize::try_from(operands[0]).ok()?;
|
||||
let start = usize::try_from(operands[1]).ok()?;
|
||||
let end = start.checked_add(len)?;
|
||||
private_dict_range = Some(start..end);
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
private_dict_range
|
||||
}
|
||||
|
||||
fn parse_private_dict(data: &[u8]) -> Option<usize> {
|
||||
let mut subroutines_offset = None;
|
||||
let mut operands_buffer = [0; MAX_OPERANDS_LEN];
|
||||
let mut dict_parser = DictionaryParser::new(data, &mut operands_buffer);
|
||||
while let Some(operator) = dict_parser.parse_next() {
|
||||
if operator.get() == private_dict_operator::LOCAL_SUBROUTINES_OFFSET {
|
||||
dict_parser.parse_operands()?;
|
||||
let operands = dict_parser.operands();
|
||||
|
||||
if operands.len() == 1 {
|
||||
subroutines_offset = usize::try_from(operands[0]).ok();
|
||||
}
|
||||
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
subroutines_offset
|
||||
}
|
||||
|
||||
|
||||
/// CFF2 allows up to 65535 scalars, but an average font will have 3-5.
|
||||
/// So 64 is more than enough.
|
||||
const SCALARS_MAX: u8 = 64;
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
pub(crate) struct Scalars {
|
||||
d: [f32; SCALARS_MAX as usize], // 256B
|
||||
len: u8,
|
||||
}
|
||||
|
||||
impl Default for Scalars {
|
||||
fn default() -> Self {
|
||||
Scalars {
|
||||
d: [0.0; SCALARS_MAX as usize],
|
||||
len: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Scalars {
|
||||
pub fn len(&self) -> u8 {
|
||||
self.len
|
||||
}
|
||||
|
||||
pub fn clear(&mut self) {
|
||||
self.len = 0;
|
||||
}
|
||||
|
||||
pub fn at(&self, i: u8) -> f32 {
|
||||
if i < self.len {
|
||||
self.d[usize::from(i)]
|
||||
} else {
|
||||
0.0
|
||||
}
|
||||
}
|
||||
|
||||
pub fn push(&mut self, n: f32) -> Option<()> {
|
||||
if self.len < SCALARS_MAX {
|
||||
self.d[usize::from(self.len)] = n;
|
||||
self.len += 1;
|
||||
Some(())
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
struct CharStringParserContext<'a> {
|
||||
metadata: &'a Table<'a>,
|
||||
coordinates: &'a [NormalizedCoordinate],
|
||||
scalars: Scalars,
|
||||
had_vsindex: bool,
|
||||
had_blend: bool,
|
||||
stems_len: u32,
|
||||
}
|
||||
|
||||
impl CharStringParserContext<'_> {
|
||||
fn update_scalars(&mut self, index: u16) -> Result<(), CFFError> {
|
||||
self.scalars.clear();
|
||||
|
||||
let indices = self.metadata.item_variation_store.region_indices(index)
|
||||
.ok_or(CFFError::InvalidItemVariationDataIndex)?;
|
||||
for index in indices {
|
||||
let scalar = self.metadata.item_variation_store.regions
|
||||
.evaluate_region(index, self.coordinates);
|
||||
self.scalars.push(scalar)
|
||||
.ok_or(CFFError::BlendRegionsLimitReached)?;
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
fn parse_char_string(
|
||||
data: &[u8],
|
||||
metadata: &Table,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
builder: &mut dyn OutlineBuilder,
|
||||
) -> Result<Rect, CFFError> {
|
||||
let mut ctx = CharStringParserContext {
|
||||
metadata,
|
||||
coordinates,
|
||||
scalars: Scalars::default(),
|
||||
had_vsindex: false,
|
||||
had_blend: false,
|
||||
stems_len: 0,
|
||||
};
|
||||
|
||||
// Load scalars at default index.
|
||||
ctx.update_scalars(0)?;
|
||||
|
||||
let mut inner_builder = Builder {
|
||||
builder,
|
||||
bbox: BBox::new(),
|
||||
};
|
||||
|
||||
let stack = ArgumentsStack {
|
||||
data: &mut [0.0; MAX_ARGUMENTS_STACK_LEN], // 2052B
|
||||
len: 0,
|
||||
max_len: MAX_ARGUMENTS_STACK_LEN,
|
||||
};
|
||||
let mut parser = CharStringParser {
|
||||
stack,
|
||||
builder: &mut inner_builder,
|
||||
x: 0.0,
|
||||
y: 0.0,
|
||||
has_move_to: false,
|
||||
is_first_move_to: true,
|
||||
};
|
||||
_parse_char_string(&mut ctx, data, 0, &mut parser)?;
|
||||
// let _ = _parse_char_string(&mut ctx, data, 0.0, 0.0, &mut stack, 0, &mut inner_builder)?;
|
||||
|
||||
let bbox = parser.builder.bbox;
|
||||
|
||||
// Check that bbox was changed.
|
||||
if bbox.is_default() {
|
||||
return Err(CFFError::ZeroBBox);
|
||||
}
|
||||
|
||||
bbox.to_rect().ok_or(CFFError::BboxOverflow)
|
||||
}
|
||||
|
||||
fn _parse_char_string(
|
||||
ctx: &mut CharStringParserContext,
|
||||
char_string: &[u8],
|
||||
depth: u8,
|
||||
p: &mut CharStringParser,
|
||||
) -> Result<(), CFFError> {
|
||||
let mut s = Stream::new(char_string);
|
||||
while !s.at_end() {
|
||||
let op = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
match op {
|
||||
0 | 2 | 9 | 11 | 13 | 14 | 17 => {
|
||||
// Reserved.
|
||||
return Err(CFFError::InvalidOperator);
|
||||
}
|
||||
operator::HORIZONTAL_STEM |
|
||||
operator::VERTICAL_STEM |
|
||||
operator::HORIZONTAL_STEM_HINT_MASK |
|
||||
operator::VERTICAL_STEM_HINT_MASK => {
|
||||
// y dy {dya dyb}* hstem
|
||||
// x dx {dxa dxb}* vstem
|
||||
// y dy {dya dyb}* hstemhm
|
||||
// x dx {dxa dxb}* vstemhm
|
||||
|
||||
ctx.stems_len += p.stack.len() as u32 >> 1;
|
||||
|
||||
// We are ignoring the hint operators.
|
||||
p.stack.clear();
|
||||
}
|
||||
operator::VERTICAL_MOVE_TO => {
|
||||
p.parse_vertical_move_to(0)?;
|
||||
}
|
||||
operator::LINE_TO => {
|
||||
p.parse_line_to()?;
|
||||
}
|
||||
operator::HORIZONTAL_LINE_TO => {
|
||||
p.parse_horizontal_line_to()?;
|
||||
}
|
||||
operator::VERTICAL_LINE_TO => {
|
||||
p.parse_vertical_line_to()?;
|
||||
}
|
||||
operator::CURVE_TO => {
|
||||
p.parse_curve_to()?;
|
||||
}
|
||||
operator::CALL_LOCAL_SUBROUTINE => {
|
||||
if p.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if depth == STACK_LIMIT {
|
||||
return Err(CFFError::NestingLimitReached);
|
||||
}
|
||||
|
||||
let subroutine_bias = calc_subroutine_bias(ctx.metadata.local_subrs.len());
|
||||
let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
|
||||
let char_string = ctx.metadata.local_subrs.get(index)
|
||||
.ok_or(CFFError::InvalidSubroutineIndex)?;
|
||||
_parse_char_string(ctx, char_string, depth + 1, p)?;
|
||||
}
|
||||
TWO_BYTE_OPERATOR_MARK => {
|
||||
// flex
|
||||
let op2 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
match op2 {
|
||||
operator::HFLEX => p.parse_hflex()?,
|
||||
operator::FLEX => p.parse_flex()?,
|
||||
operator::HFLEX1 => p.parse_hflex1()?,
|
||||
operator::FLEX1 => p.parse_flex1()?,
|
||||
_ => return Err(CFFError::UnsupportedOperator),
|
||||
}
|
||||
}
|
||||
operator::VS_INDEX => {
|
||||
// |- ivs vsindex (15) |-
|
||||
|
||||
// `vsindex` must precede the first `blend` operator, and may occur only once.
|
||||
if ctx.had_blend || ctx.had_vsindex {
|
||||
// TODO: maybe add a custom error
|
||||
return Err(CFFError::InvalidOperator);
|
||||
}
|
||||
|
||||
if p.stack.len() != 1 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let index = u16::try_num_from(p.stack.pop())
|
||||
.ok_or(CFFError::InvalidItemVariationDataIndex)?;
|
||||
ctx.update_scalars(index)?;
|
||||
|
||||
ctx.had_vsindex = true;
|
||||
|
||||
p.stack.clear();
|
||||
}
|
||||
operator::BLEND => {
|
||||
// num(0)..num(n-1), delta(0,0)..delta(k-1,0),
|
||||
// delta(0,1)..delta(k-1,1) .. delta(0,n-1)..delta(k-1,n-1)
|
||||
// n blend (16) val(0)..val(n-1)
|
||||
|
||||
ctx.had_blend = true;
|
||||
|
||||
let n = u16::try_num_from(p.stack.pop())
|
||||
.ok_or(CFFError::InvalidNumberOfBlendOperands)?;
|
||||
let k = ctx.scalars.len();
|
||||
|
||||
let len = usize::from(n) * (usize::from(k) + 1);
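// `blend` consumes n default values plus n*k deltas (k = number of active
// regions) and leaves n blended values on the stack.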
|
||||
if p.stack.len() < len {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let start = p.stack.len() - len;
|
||||
for i in (0..n).rev() {
|
||||
for j in 0..k {
|
||||
let delta = p.stack.pop();
|
||||
p.stack.data[start + usize::from(i)] += delta * ctx.scalars.at(k - j - 1);
|
||||
}
|
||||
}
|
||||
}
|
||||
operator::HINT_MASK | operator::COUNTER_MASK => {
|
||||
ctx.stems_len += p.stack.len() as u32 >> 1;
|
||||
s.advance(usize::num_from((ctx.stems_len + 7) >> 3));
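// Skip the hint mask: one bit per stem, rounded up to whole bytes.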
|
||||
|
||||
// We are ignoring the hint operators.
|
||||
p.stack.clear();
|
||||
}
|
||||
operator::MOVE_TO => {
|
||||
p.parse_move_to(0)?;
|
||||
}
|
||||
operator::HORIZONTAL_MOVE_TO => {
|
||||
p.parse_horizontal_move_to(0)?;
|
||||
}
|
||||
operator::CURVE_LINE => {
|
||||
p.parse_curve_line()?;
|
||||
}
|
||||
operator::LINE_CURVE => {
|
||||
p.parse_line_curve()?;
|
||||
}
|
||||
operator::VV_CURVE_TO => {
|
||||
p.parse_vv_curve_to()?;
|
||||
}
|
||||
operator::HH_CURVE_TO => {
|
||||
p.parse_hh_curve_to()?;
|
||||
}
|
||||
operator::SHORT_INT => {
|
||||
let n = s.read::<i16>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
p.stack.push(f32::from(n))?;
|
||||
}
|
||||
operator::CALL_GLOBAL_SUBROUTINE => {
|
||||
if p.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if depth == STACK_LIMIT {
|
||||
return Err(CFFError::NestingLimitReached);
|
||||
}
|
||||
|
||||
let subroutine_bias = calc_subroutine_bias(ctx.metadata.global_subrs.len());
|
||||
let index = conv_subroutine_index(p.stack.pop(), subroutine_bias)?;
|
||||
let char_string = ctx.metadata.global_subrs.get(index)
|
||||
.ok_or(CFFError::InvalidSubroutineIndex)?;
|
||||
_parse_char_string(ctx, char_string, depth + 1, p)?;
|
||||
}
|
||||
operator::VH_CURVE_TO => {
|
||||
p.parse_vh_curve_to()?;
|
||||
}
|
||||
operator::HV_CURVE_TO => {
|
||||
p.parse_hv_curve_to()?;
|
||||
}
|
||||
32..=246 => {
|
||||
p.parse_int1(op)?;
|
||||
}
|
||||
247..=250 => {
|
||||
p.parse_int2(op, &mut s)?;
|
||||
}
|
||||
251..=254 => {
|
||||
p.parse_int3(op, &mut s)?;
|
||||
}
|
||||
operator::FIXED_16_16 => {
|
||||
p.parse_fixed(&mut s)?;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
|
||||
/// A [Compact Font Format 2 Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cff2).
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct Table<'a> {
|
||||
global_subrs: Index<'a>,
|
||||
local_subrs: Index<'a>,
|
||||
char_strings: Index<'a>,
|
||||
item_variation_store: ItemVariationStore<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
// Parse Header.
|
||||
let major = s.read::<u8>()?;
|
||||
s.skip::<u8>(); // minor
|
||||
let header_size = s.read::<u8>()?;
|
||||
let top_dict_length = s.read::<u16>()?;
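// The CFF2 header is 5 bytes: majorVersion, minorVersion, headerSize and a
// 16-bit topDictLength.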
|
||||
|
||||
if major != 2 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// Jump to Top DICT. It's not necessarily right after the header.
|
||||
if header_size > 5 {
|
||||
s.advance(usize::from(header_size) - 5);
|
||||
}
|
||||
|
||||
let top_dict_data = s.read_bytes(usize::from(top_dict_length))?;
|
||||
let top_dict = parse_top_dict(top_dict_data)?;
|
||||
|
||||
let mut metadata = Self::default();
|
||||
|
||||
// Parse Global Subroutines INDEX.
|
||||
metadata.global_subrs = parse_index::<u32>(&mut s)?;
|
||||
|
||||
metadata.char_strings = {
|
||||
let mut s = Stream::new_at(data, top_dict.char_strings_offset)?;
|
||||
parse_index::<u32>(&mut s)?
|
||||
};
|
||||
|
||||
if let Some(offset) = top_dict.variation_store_offset {
|
||||
let mut s = Stream::new_at(data, offset)?;
|
||||
s.skip::<u16>(); // length
|
||||
metadata.item_variation_store = ItemVariationStore::parse(s)?;
|
||||
}
|
||||
|
||||
// TODO: simplify
|
||||
if let Some(offset) = top_dict.font_dict_index_offset {
|
||||
let mut s = Stream::new_at(data, offset)?;
|
||||
'outer: for font_dict_data in parse_index::<u32>(&mut s)? {
|
||||
if let Some(private_dict_range) = parse_font_dict(font_dict_data) {
|
||||
// 'Private DICT size and offset, from start of the CFF2 table.'
|
||||
let private_dict_data = data.get(private_dict_range.clone())?;
|
||||
if let Some(subroutines_offset) = parse_private_dict(private_dict_data) {
|
||||
// 'The local subroutines offset is relative to the beginning
|
||||
// of the Private DICT data.'
|
||||
if let Some(start) = private_dict_range.start.checked_add(subroutines_offset) {
|
||||
let data = data.get(start..data.len())?;
|
||||
let mut s = Stream::new(data);
|
||||
metadata.local_subrs = parse_index::<u32>(&mut s)?;
|
||||
break 'outer;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Some(metadata)
|
||||
}
|
||||
|
||||
/// Outlines a glyph.
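///
/// A minimal usage sketch; `cff2_data`, `coords` (normalized variation
/// coordinates) and `builder` are assumed to be provided by the caller:
///
/// ```ignore
/// let table = Table::parse(cff2_data).unwrap();
/// let rect = table.outline(coords, GlyphId(0), &mut builder).unwrap();
/// ```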
|
||||
pub fn outline(
|
||||
&self,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
glyph_id: GlyphId,
|
||||
builder: &mut dyn OutlineBuilder,
|
||||
) -> Result<Rect, CFFError> {
|
||||
let data = self.char_strings.get(u32::from(glyph_id.0)).ok_or(CFFError::NoGlyph)?;
|
||||
parse_char_string(data, self, coordinates, builder)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,248 @@
|
|||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, LazyArray16};
|
||||
use super::StringId;
|
||||
|
||||
/// The Standard Encoding as defined in the Adobe Technical Note #5176 Appendix B.
|
||||
pub const STANDARD_ENCODING: [u8;256] = [
|
||||
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
|
||||
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
|
||||
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
|
||||
17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32,
|
||||
33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48,
|
||||
49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64,
|
||||
65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80,
|
||||
81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 0,
|
||||
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
|
||||
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
|
||||
0, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110,
|
||||
0, 111, 112, 113, 114, 0, 115, 116, 117, 118, 119, 120, 121, 122, 0, 123,
|
||||
0, 124, 125, 126, 127, 128, 129, 130, 131, 0, 132, 133, 0, 134, 135, 136,
|
||||
137, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
|
||||
0, 138, 0, 139, 0, 0, 0, 0, 140, 141, 142, 143, 0, 0, 0, 0,
|
||||
0, 144, 0, 0, 0, 145, 0, 0, 146, 147, 148, 149, 0, 0, 0, 0,
|
||||
];
|
||||
|
||||
/// The Expert Encoding conversion as defined in the Adobe Technical Note #5176 Appendix C.
|
||||
#[cfg(feature = "glyph-names")]
|
||||
const EXPERT_ENCODING: &[u16] = &[
|
||||
0, 1, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 13, 14, 15, 99,
|
||||
239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 27, 28, 249, 250, 251, 252,
|
||||
253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 109, 110,
|
||||
267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282,
|
||||
283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298,
|
||||
299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314,
|
||||
315, 316, 317, 318, 158, 155, 163, 319, 320, 321, 322, 323, 324, 325, 326, 150,
|
||||
164, 169, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340,
|
||||
341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356,
|
||||
357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372,
|
||||
373, 374, 375, 376, 377, 378,
|
||||
];
|
||||
|
||||
/// The Expert Subset Encoding conversion as defined in the Adobe Technical Note #5176 Appendix C.
|
||||
#[cfg(feature = "glyph-names")]
|
||||
const EXPERT_SUBSET_ENCODING: &[u16] = &[
|
||||
0, 1, 231, 232, 235, 236, 237, 238, 13, 14, 15, 99, 239, 240, 241, 242,
|
||||
243, 244, 245, 246, 247, 248, 27, 28, 249, 250, 251, 253, 254, 255, 256, 257,
|
||||
258, 259, 260, 261, 262, 263, 264, 265, 266, 109, 110, 267, 268, 269, 270, 272,
|
||||
300, 301, 302, 305, 314, 315, 158, 155, 163, 320, 321, 322, 323, 324, 325, 326,
|
||||
150, 164, 169, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339,
|
||||
340, 341, 342, 343, 344, 345, 346
|
||||
];
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct Format1Range {
|
||||
first: StringId,
|
||||
left: u8,
|
||||
}
|
||||
|
||||
impl FromData for Format1Range {
|
||||
const SIZE: usize = 3;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Format1Range {
|
||||
first: s.read::<StringId>()?,
|
||||
left: s.read::<u8>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct Format2Range {
|
||||
first: StringId,
|
||||
left: u16,
|
||||
}
|
||||
|
||||
impl FromData for Format2Range {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Format2Range {
|
||||
first: s.read::<StringId>()?,
|
||||
left: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) enum Charset<'a> {
|
||||
ISOAdobe,
|
||||
Expert,
|
||||
ExpertSubset,
|
||||
Format0(LazyArray16<'a, StringId>),
|
||||
Format1(LazyArray16<'a, Format1Range>),
|
||||
Format2(LazyArray16<'a, Format2Range>),
|
||||
}
|
||||
|
||||
impl Charset<'_> {
|
||||
pub fn sid_to_gid(&self, sid: StringId) -> Option<GlyphId> {
|
||||
if sid.0 == 0 {
|
||||
return Some(GlyphId(0));
|
||||
}
|
||||
|
||||
match self {
|
||||
Charset::ISOAdobe | Charset::Expert | Charset::ExpertSubset => None,
|
||||
Charset::Format0(ref array) => {
|
||||
// First glyph is omitted, so we have to add 1.
|
||||
array.into_iter().position(|n| n == sid).map(|n| GlyphId(n as u16 + 1))
|
||||
}
|
||||
Charset::Format1(array) => {
|
||||
let mut glyph_id = GlyphId(1);
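// Ranges are stored as (first SID, nLeft) and cover nLeft + 1 consecutive SIDs;
// glyph ids simply count through the ranges.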
|
||||
for range in *array {
|
||||
let last = u32::from(range.first.0) + u32::from(range.left);
|
||||
if range.first <= sid && u32::from(sid.0) <= last {
|
||||
glyph_id.0 += sid.0 - range.first.0;
|
||||
return Some(glyph_id)
|
||||
}
|
||||
|
||||
glyph_id.0 += u16::from(range.left) + 1;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
Charset::Format2(array) => {
|
||||
// The same as format 1, but Range::left is u16.
|
||||
let mut glyph_id = GlyphId(1);
|
||||
for range in *array {
|
||||
let last = u32::from(range.first.0) + u32::from(range.left);
|
||||
if sid >= range.first && u32::from(sid.0) <= last {
|
||||
glyph_id.0 += sid.0 - range.first.0;
|
||||
return Some(glyph_id)
|
||||
}
|
||||
|
||||
glyph_id.0 += range.left + 1;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(feature = "glyph-names")]
|
||||
pub fn gid_to_sid(&self, gid: GlyphId) -> Option<StringId> {
|
||||
match self {
|
||||
Charset::ISOAdobe => {
|
||||
if gid.0 <= 228 { Some(StringId(gid.0)) } else { None }
|
||||
}
|
||||
Charset::Expert => {
|
||||
EXPERT_ENCODING.get(usize::from(gid.0)).cloned().map(StringId)
|
||||
}
|
||||
Charset::ExpertSubset => {
|
||||
EXPERT_SUBSET_ENCODING.get(usize::from(gid.0)).cloned().map(StringId)
|
||||
}
|
||||
Charset::Format0(ref array) => {
|
||||
if gid.0 == 0 {
|
||||
Some(StringId(0))
|
||||
} else {
|
||||
array.get(gid.0 - 1)
|
||||
}
|
||||
}
|
||||
Charset::Format1(array) => {
|
||||
if gid.0 == 0 {
|
||||
Some(StringId(0))
|
||||
} else {
|
||||
let mut sid = gid.0 - 1;
|
||||
for range in *array {
|
||||
if sid <= u16::from(range.left) {
|
||||
sid = sid.checked_add(range.first.0)?;
|
||||
return Some(StringId(sid));
|
||||
}
|
||||
|
||||
sid = sid.checked_sub(u16::from(range.left) + 1)?;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
Charset::Format2(array) => {
|
||||
if gid.0 == 0 {
|
||||
Some(StringId(0))
|
||||
} else {
|
||||
let mut sid = gid.0 - 1;
|
||||
for range in *array {
|
||||
if sid <= range.left {
|
||||
sid = sid.checked_add(range.first.0)?;
|
||||
return Some(StringId(sid));
|
||||
}
|
||||
|
||||
sid = sid.checked_sub(range.left.checked_add(1)?)?;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub(crate) fn parse_charset<'a>(number_of_glyphs: u16, s: &mut Stream<'a>) -> Option<Charset<'a>> {
|
||||
if number_of_glyphs < 2 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// -1 everywhere, since `.notdef` is omitted.
|
||||
let format = s.read::<u8>()?;
|
||||
match format {
|
||||
0 => Some(Charset::Format0(s.read_array16::<StringId>(number_of_glyphs - 1)?)),
|
||||
1 => {
|
||||
// The number of ranges is not defined, so we have to
|
||||
// read until no glyphs are left.
|
||||
let mut count = 0;
|
||||
{
|
||||
let mut s = s.clone();
|
||||
let mut total_left = number_of_glyphs - 1;
|
||||
while total_left > 0 {
|
||||
s.skip::<StringId>(); // first
|
||||
let left = s.read::<u8>()?;
|
||||
total_left = total_left.checked_sub(u16::from(left) + 1)?;
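// Each range covers `left + 1` glyphs: the first one plus `left` more.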
|
||||
count += 1;
|
||||
}
|
||||
}
|
||||
|
||||
s.read_array16::<Format1Range>(count).map(Charset::Format1)
|
||||
}
|
||||
2 => {
|
||||
// The same as format 1, but Range::left is u16.
|
||||
let mut count = 0;
|
||||
{
|
||||
let mut s = s.clone();
|
||||
let mut total_left = number_of_glyphs - 1;
|
||||
while total_left > 0 {
|
||||
s.skip::<StringId>(); // first
|
||||
let left = s.read::<u16>()?.checked_add(1)?;
|
||||
total_left = total_left.checked_sub(left)?;
|
||||
count += 1;
|
||||
}
|
||||
}
|
||||
|
||||
s.read_array16::<Format2Range>(count).map(Charset::Format2)
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
|
@ -0,0 +1,600 @@
|
|||
use crate::parser::{Stream, Fixed};
|
||||
use super::argstack::ArgumentsStack;
|
||||
use super::{Builder, CFFError, IsEven, f32_abs};
|
||||
|
||||
pub(crate) struct CharStringParser<'a> {
|
||||
pub stack: ArgumentsStack<'a>,
|
||||
pub builder: &'a mut Builder<'a>,
|
||||
pub x: f32,
|
||||
pub y: f32,
|
||||
pub has_move_to: bool,
|
||||
pub is_first_move_to: bool,
|
||||
}
|
||||
|
||||
impl CharStringParser<'_> {
|
||||
#[inline]
|
||||
pub fn parse_move_to(&mut self, offset: usize) -> Result<(), CFFError> {
|
||||
// dx1 dy1
|
||||
|
||||
if self.stack.len() != offset + 2 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if self.is_first_move_to {
|
||||
self.is_first_move_to = false;
|
||||
} else {
|
||||
self.builder.close();
|
||||
}
|
||||
|
||||
self.has_move_to = true;
|
||||
|
||||
self.x += self.stack.at(offset + 0);
|
||||
self.y += self.stack.at(offset + 1);
|
||||
self.builder.move_to(self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_horizontal_move_to(&mut self, offset: usize) -> Result<(), CFFError> {
|
||||
// dx1
|
||||
|
||||
if self.stack.len() != offset + 1 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if self.is_first_move_to {
|
||||
self.is_first_move_to = false;
|
||||
} else {
|
||||
self.builder.close();
|
||||
}
|
||||
|
||||
self.has_move_to = true;
|
||||
|
||||
self.x += self.stack.at(offset);
|
||||
self.builder.move_to(self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_vertical_move_to(&mut self, offset: usize) -> Result<(), CFFError> {
|
||||
// dy1
|
||||
|
||||
if self.stack.len() != offset + 1 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if self.is_first_move_to {
|
||||
self.is_first_move_to = false;
|
||||
} else {
|
||||
self.builder.close();
|
||||
}
|
||||
|
||||
self.has_move_to = true;
|
||||
|
||||
self.y += self.stack.at(offset);
|
||||
self.builder.move_to(self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_line_to(&mut self) -> Result<(), CFFError> {
|
||||
// {dxa dya}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len().is_odd() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() {
|
||||
self.x += self.stack.at(i + 0);
|
||||
self.y += self.stack.at(i + 1);
|
||||
self.builder.line_to(self.x, self.y);
|
||||
i += 2;
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_horizontal_line_to(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 {dya dxb}*
|
||||
// {dxa dyb}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() {
|
||||
self.x += self.stack.at(i);
|
||||
i += 1;
|
||||
self.builder.line_to(self.x, self.y);
|
||||
|
||||
if i == self.stack.len() {
|
||||
break;
|
||||
}
|
||||
|
||||
self.y += self.stack.at(i);
|
||||
i += 1;
|
||||
self.builder.line_to(self.x, self.y);
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_vertical_line_to(&mut self) -> Result<(), CFFError> {
|
||||
// dy1 {dxa dyb}*
|
||||
// {dya dxb}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.is_empty() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() {
|
||||
self.y += self.stack.at(i);
|
||||
i += 1;
|
||||
self.builder.line_to(self.x, self.y);
|
||||
|
||||
if i == self.stack.len() {
|
||||
break;
|
||||
}
|
||||
|
||||
self.x += self.stack.at(i);
|
||||
i += 1;
|
||||
self.builder.line_to(self.x, self.y);
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_curve_to(&mut self) -> Result<(), CFFError> {
|
||||
// {dxa dya dxb dyb dxc dyc}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() % 6 != 0 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() {
|
||||
let x1 = self.x + self.stack.at(i + 0);
|
||||
let y1 = self.y + self.stack.at(i + 1);
|
||||
let x2 = x1 + self.stack.at(i + 2);
|
||||
let y2 = y1 + self.stack.at(i + 3);
|
||||
self.x = x2 + self.stack.at(i + 4);
|
||||
self.y = y2 + self.stack.at(i + 5);
|
||||
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
i += 6;
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_curve_line(&mut self) -> Result<(), CFFError> {
|
||||
// {dxa dya dxb dyb dxc dyc}+ dxd dyd
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() < 8 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if (self.stack.len() - 2) % 6 != 0 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() - 2 {
|
||||
let x1 = self.x + self.stack.at(i + 0);
|
||||
let y1 = self.y + self.stack.at(i + 1);
|
||||
let x2 = x1 + self.stack.at(i + 2);
|
||||
let y2 = y1 + self.stack.at(i + 3);
|
||||
self.x = x2 + self.stack.at(i + 4);
|
||||
self.y = y2 + self.stack.at(i + 5);
|
||||
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
i += 6;
|
||||
}
|
||||
|
||||
self.x += self.stack.at(i + 0);
|
||||
self.y += self.stack.at(i + 1);
|
||||
self.builder.line_to(self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_line_curve(&mut self) -> Result<(), CFFError> {
|
||||
// {dxa dya}+ dxb dyb dxc dyc dxd dyd
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() < 8 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
if (self.stack.len() - 6).is_odd() {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
while i < self.stack.len() - 6 {
|
||||
self.x += self.stack.at(i + 0);
|
||||
self.y += self.stack.at(i + 1);
|
||||
|
||||
self.builder.line_to(self.x, self.y);
|
||||
i += 2;
|
||||
}
|
||||
|
||||
let x1 = self.x + self.stack.at(i + 0);
|
||||
let y1 = self.y + self.stack.at(i + 1);
|
||||
let x2 = x1 + self.stack.at(i + 2);
|
||||
let y2 = y1 + self.stack.at(i + 3);
|
||||
self.x = x2 + self.stack.at(i + 4);
|
||||
self.y = y2 + self.stack.at(i + 5);
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_hh_curve_to(&mut self) -> Result<(), CFFError> {
|
||||
// dy1? {dxa dxb dyb dxc}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
|
||||
// An odd argument count indicates a Y position.
|
||||
if self.stack.len().is_odd() {
|
||||
self.y += self.stack.at(0);
|
||||
i += 1;
|
||||
}
|
||||
|
||||
if (self.stack.len() - i) % 4 != 0 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
while i < self.stack.len() {
|
||||
let x1 = self.x + self.stack.at(i + 0);
|
||||
let y1 = self.y;
|
||||
let x2 = x1 + self.stack.at(i + 1);
|
||||
let y2 = y1 + self.stack.at(i + 2);
|
||||
self.x = x2 + self.stack.at(i + 3);
|
||||
self.y = y2;
|
||||
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
i += 4;
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_vv_curve_to(&mut self) -> Result<(), CFFError> {
|
||||
// dx1? {dya dxb dyb dyc}+
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
let mut i = 0;
|
||||
|
||||
// An odd argument count indicates an X position.
|
||||
if self.stack.len().is_odd() {
|
||||
self.x += self.stack.at(0);
|
||||
i += 1;
|
||||
}
|
||||
|
||||
if (self.stack.len() - i) % 4 != 0 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
while i < self.stack.len() {
|
||||
let x1 = self.x;
|
||||
let y1 = self.y + self.stack.at(i + 0);
|
||||
let x2 = x1 + self.stack.at(i + 1);
|
||||
let y2 = y1 + self.stack.at(i + 2);
|
||||
self.x = x2;
|
||||
self.y = y2 + self.stack.at(i + 3);
|
||||
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
i += 4;
|
||||
}
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_hv_curve_to(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 dx2 dy2 dy3 {dya dxb dyb dxc dxd dxe dye dyf}* dxf?
|
||||
// {dxa dxb dyb dyc dyd dxe dye dxf}+ dyf?
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
self.stack.reverse();
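// The stack is reversed so `pop()` yields the arguments in their original order;
// if exactly one argument remains at the end of a segment, it is the optional
// final coordinate in the other direction.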
|
||||
while !self.stack.is_empty() {
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let x1 = self.x + self.stack.pop();
|
||||
let y1 = self.y;
|
||||
let x2 = x1 + self.stack.pop();
|
||||
let y2 = y1 + self.stack.pop();
|
||||
self.y = y2 + self.stack.pop();
|
||||
self.x = x2 + if self.stack.len() == 1 { self.stack.pop() } else { 0.0 };
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
if self.stack.is_empty() {
|
||||
break;
|
||||
}
|
||||
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let x1 = self.x;
|
||||
let y1 = self.y + self.stack.pop();
|
||||
let x2 = x1 + self.stack.pop();
|
||||
let y2 = y1 + self.stack.pop();
|
||||
self.x = x2 + self.stack.pop();
|
||||
self.y = y2 + if self.stack.len() == 1 { self.stack.pop() } else { 0.0 };
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
}
|
||||
|
||||
debug_assert!(self.stack.is_empty());
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_vh_curve_to(&mut self) -> Result<(), CFFError> {
|
||||
// dy1 dx2 dy2 dx3 {dxa dxb dyb dyc dyd dxe dye dxf}* dyf?
|
||||
// {dya dxb dyb dxc dxd dxe dye dyf}+ dxf?
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
self.stack.reverse();
|
||||
while !self.stack.is_empty() {
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let x1 = self.x;
|
||||
let y1 = self.y + self.stack.pop();
|
||||
let x2 = x1 + self.stack.pop();
|
||||
let y2 = y1 + self.stack.pop();
|
||||
self.x = x2 + self.stack.pop();
|
||||
self.y = y2 + if self.stack.len() == 1 { self.stack.pop() } else { 0.0 };
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
if self.stack.is_empty() {
|
||||
break;
|
||||
}
|
||||
|
||||
if self.stack.len() < 4 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let x1 = self.x + self.stack.pop();
|
||||
let y1 = self.y;
|
||||
let x2 = x1 + self.stack.pop();
|
||||
let y2 = y1 + self.stack.pop();
|
||||
self.y = y2 + self.stack.pop();
|
||||
self.x = x2 + if self.stack.len() == 1 { self.stack.pop() } else { 0.0 };
|
||||
self.builder.curve_to(x1, y1, x2, y2, self.x, self.y);
|
||||
}
|
||||
|
||||
debug_assert!(self.stack.is_empty());
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_flex(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 dy1 dx2 dy2 dx3 dy3 dx4 dy4 dx5 dy5 dx6 dy6 fd
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() != 13 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let dx1 = self.x + self.stack.at(0);
|
||||
let dy1 = self.y + self.stack.at(1);
|
||||
let dx2 = dx1 + self.stack.at(2);
|
||||
let dy2 = dy1 + self.stack.at(3);
|
||||
let dx3 = dx2 + self.stack.at(4);
|
||||
let dy3 = dy2 + self.stack.at(5);
|
||||
let dx4 = dx3 + self.stack.at(6);
|
||||
let dy4 = dy3 + self.stack.at(7);
|
||||
let dx5 = dx4 + self.stack.at(8);
|
||||
let dy5 = dy4 + self.stack.at(9);
|
||||
self.x = dx5 + self.stack.at(10);
|
||||
self.y = dy5 + self.stack.at(11);
|
||||
self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3);
|
||||
self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_flex1(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 dy1 dx2 dy2 dx3 dy3 dx4 dy4 dx5 dy5 d6
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() != 11 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let dx1 = self.x + self.stack.at(0);
|
||||
let dy1 = self.y + self.stack.at(1);
|
||||
let dx2 = dx1 + self.stack.at(2);
|
||||
let dy2 = dy1 + self.stack.at(3);
|
||||
let dx3 = dx2 + self.stack.at(4);
|
||||
let dy3 = dy2 + self.stack.at(5);
|
||||
let dx4 = dx3 + self.stack.at(6);
|
||||
let dy4 = dy3 + self.stack.at(7);
|
||||
let dx5 = dx4 + self.stack.at(8);
|
||||
let dy5 = dy4 + self.stack.at(9);
|
||||
|
||||
if f32_abs(dx5 - self.x) > f32_abs(dy5 - self.y) {
|
||||
self.x = dx5 + self.stack.at(10);
|
||||
} else {
|
||||
self.y = dy5 + self.stack.at(10);
|
||||
}
|
||||
|
||||
self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3);
|
||||
self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_hflex(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 dx2 dy2 dx3 dx4 dx5 dx6
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() != 7 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let dx1 = self.x + self.stack.at(0);
|
||||
let dy1 = self.y;
|
||||
let dx2 = dx1 + self.stack.at(1);
|
||||
let dy2 = dy1 + self.stack.at(2);
|
||||
let dx3 = dx2 + self.stack.at(3);
|
||||
let dy3 = dy2;
|
||||
let dx4 = dx3 + self.stack.at(4);
|
||||
let dy4 = dy2;
|
||||
let dx5 = dx4 + self.stack.at(5);
|
||||
let dy5 = self.y;
|
||||
self.x = dx5 + self.stack.at(6);
|
||||
self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3);
|
||||
self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_hflex1(&mut self) -> Result<(), CFFError> {
|
||||
// dx1 dy1 dx2 dy2 dx3 dx4 dx5 dy5 dx6
|
||||
|
||||
if !self.has_move_to {
|
||||
return Err(CFFError::MissingMoveTo);
|
||||
}
|
||||
|
||||
if self.stack.len() != 9 {
|
||||
return Err(CFFError::InvalidArgumentsStackLength);
|
||||
}
|
||||
|
||||
let dx1 = self.x + self.stack.at(0);
|
||||
let dy1 = self.y + self.stack.at(1);
|
||||
let dx2 = dx1 + self.stack.at(2);
|
||||
let dy2 = dy1 + self.stack.at(3);
|
||||
let dx3 = dx2 + self.stack.at(4);
|
||||
let dy3 = dy2;
|
||||
let dx4 = dx3 + self.stack.at(5);
|
||||
let dy4 = dy2;
|
||||
let dx5 = dx4 + self.stack.at(6);
|
||||
let dy5 = dy4 + self.stack.at(7);
|
||||
self.x = dx5 + self.stack.at(8);
|
||||
self.builder.curve_to(dx1, dy1, dx2, dy2, dx3, dy3);
|
||||
self.builder.curve_to(dx4, dy4, dx5, dy5, self.x, self.y);
|
||||
|
||||
self.stack.clear();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_int1(&mut self, op: u8) -> Result<(), CFFError> {
|
||||
let n = i16::from(op) - 139;
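// Single-byte operands 32..=246 encode the values -107..=107.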
|
||||
self.stack.push(f32::from(n))?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_int2(&mut self, op: u8, s: &mut Stream) -> Result<(), CFFError> {
|
||||
let b1 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
let n = (i16::from(op) - 247) * 256 + i16::from(b1) + 108;
|
||||
debug_assert!((108..=1131).contains(&n));
|
||||
self.stack.push(f32::from(n))?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_int3(&mut self, op: u8, s: &mut Stream) -> Result<(), CFFError> {
|
||||
let b1 = s.read::<u8>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
let n = -(i16::from(op) - 251) * 256 - i16::from(b1) - 108;
|
||||
debug_assert!((-1131..=-108).contains(&n));
|
||||
self.stack.push(f32::from(n))?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_fixed(&mut self, s: &mut Stream) -> Result<(), CFFError> {
|
||||
let n = s.read::<Fixed>().ok_or(CFFError::ReadOutOfBounds)?;
|
||||
self.stack.push(n.0)?;
|
||||
Ok(())
|
||||
}
|
||||
}
|
|
@ -0,0 +1,225 @@
|
|||
use core::convert::TryFrom;
|
||||
use core::ops::Range;
|
||||
|
||||
use crate::Stream;
|
||||
|
||||
// Limits according to the Adobe Technical Note #5176, chapter 4 DICT Data.
|
||||
const TWO_BYTE_OPERATOR_MARK: u8 = 12;
|
||||
const END_OF_FLOAT_FLAG: u8 = 0xf;
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Operator(pub u16);
|
||||
|
||||
impl Operator {
|
||||
#[inline]
|
||||
pub fn get(self) -> u16 { self.0 }
|
||||
}
|
||||
|
||||
|
||||
pub struct DictionaryParser<'a> {
|
||||
data: &'a [u8],
|
||||
// The current offset.
|
||||
offset: usize,
|
||||
// Offset to the start of the last operands.
|
||||
operands_offset: usize,
|
||||
// Actual operands.
|
||||
operands: &'a mut [i32],
|
||||
// The number of operands in the `operands` array.
|
||||
operands_len: u16,
|
||||
}
|
||||
|
||||
impl<'a> DictionaryParser<'a> {
|
||||
#[inline]
|
||||
pub fn new(data: &'a [u8], operands_buffer: &'a mut [i32]) -> Self {
|
||||
DictionaryParser {
|
||||
data,
|
||||
offset: 0,
|
||||
operands_offset: 0,
|
||||
operands: operands_buffer,
|
||||
operands_len: 0,
|
||||
}
|
||||
}
|
||||
|
||||
#[inline(never)]
|
||||
pub fn parse_next(&mut self) -> Option<Operator> {
|
||||
let mut s = Stream::new_at(self.data, self.offset)?;
|
||||
self.operands_offset = self.offset;
|
||||
while !s.at_end() {
|
||||
let b = s.read::<u8>()?;
|
||||
// Byte values 0..=21 are operators.
|
||||
if is_dict_one_byte_op(b) {
|
||||
let mut operator = u16::from(b);
|
||||
|
||||
// Check whether the operator is two bytes long.
|
||||
if b == TWO_BYTE_OPERATOR_MARK {
|
||||
// Use a 1200 'prefix' to make two byte operators more readable.
|
||||
// 12 3 => 1203
|
||||
operator = 1200 + u16::from(s.read::<u8>()?);
|
||||
}
|
||||
|
||||
self.offset = s.offset();
|
||||
return Some(Operator(operator));
|
||||
} else {
|
||||
skip_number(b, &mut s)?;
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
/// Parses operands of the current operator.
|
||||
///
|
||||
/// In the DICT structure, operands are defined before an operator.
|
||||
/// So we have to find an operator first, and only then can we actually parse the operands.
|
||||
///
|
||||
/// Since this method is pretty expensive and we do not care about most of the operators,
|
||||
/// we can speed up parsing by parsing operands only for required operators.
|
||||
///
|
||||
/// We still have to "skip" operands during operators search (see `skip_number()`),
|
||||
/// but it's still faster than a naive approach.
|
||||
pub fn parse_operands(&mut self) -> Option<()> {
|
||||
let mut s = Stream::new_at(self.data, self.operands_offset)?;
|
||||
self.operands_len = 0;
|
||||
while !s.at_end() {
|
||||
let b = s.read::<u8>()?;
|
||||
// Byte values 0..=21 are operators.
|
||||
if is_dict_one_byte_op(b) {
|
||||
break;
|
||||
} else {
|
||||
let op = parse_number(b, &mut s)?;
|
||||
self.operands[usize::from(self.operands_len)] = op;
|
||||
self.operands_len += 1;
|
||||
|
||||
if usize::from(self.operands_len) >= self.operands.len() {
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Some(())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn operands(&self) -> &[i32] {
|
||||
&self.operands[..usize::from(self.operands_len)]
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_offset(&mut self) -> Option<usize> {
|
||||
self.parse_operands()?;
|
||||
let operands = self.operands();
|
||||
if operands.len() == 1 {
|
||||
usize::try_from(operands[0]).ok()
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn parse_range(&mut self) -> Option<Range<usize>> {
|
||||
self.parse_operands()?;
|
||||
let operands = self.operands();
|
||||
if operands.len() == 2 {
|
||||
let len = usize::try_from(operands[0]).ok()?;
|
||||
let start = usize::try_from(operands[1]).ok()?;
|
||||
let end = start.checked_add(len)?;
|
||||
Some(start..end)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
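// A minimal usage sketch (not part of the upstream crate) showing how `DictionaryParser`
// is meant to be driven: scan for the operator you care about first, then parse its
// operands lazily. The DICT bytes and buffer size below are hypothetical; operator 17
// (CharStrings) is used purely as an example.
#[cfg(test)]
mod usage_sketch {
    use super::*;

    #[test]
    fn find_charstrings_offset() {
        // 0x1D 0x00 0x00 0x01 0x00 encodes the 32-bit operand 256,
        // followed by the one-byte operator 17 (CharStrings).
        const TOP_DICT: &[u8] = &[0x1D, 0x00, 0x00, 0x01, 0x00, 0x11];
        let mut operands_buffer = [0i32; 16];
        let mut parser = DictionaryParser::new(TOP_DICT, &mut operands_buffer);

        let mut charstrings_offset = None;
        while let Some(operator) = parser.parse_next() {
            if operator.get() == 17 {
                // Operands are parsed only when we actually ask for them.
                charstrings_offset = parser.parse_offset();
            }
        }

        assert_eq!(charstrings_offset, Some(256));
    }
}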
|
||||
|
||||
// One-byte CFF DICT Operators according to the
|
||||
// Adobe Technical Note #5176, Appendix H CFF DICT Encoding.
|
||||
pub fn is_dict_one_byte_op(b: u8) -> bool {
|
||||
match b {
|
||||
0..=27 => true,
|
||||
28..=30 => false, // numbers
|
||||
31 => true, // Reserved
|
||||
32..=254 => false, // numbers
|
||||
255 => true, // Reserved
|
||||
}
|
||||
}
|
||||
|
||||
// Adobe Technical Note #5177, Table 3 Operand Encoding
|
||||
pub fn parse_number(b0: u8, s: &mut Stream) -> Option<i32> {
|
||||
match b0 {
|
||||
28 => {
|
||||
let n = i32::from(s.read::<i16>()?);
|
||||
Some(n)
|
||||
}
|
||||
29 => {
|
||||
let n = s.read::<i32>()?;
|
||||
Some(n)
|
||||
}
|
||||
30 => {
|
||||
// We do not parse floats, because we don't use them.
|
||||
// And by skipping it we can remove the core::num::dec2flt dependency.
|
||||
while !s.at_end() {
|
||||
let b1 = s.read::<u8>()?;
|
||||
let nibble1 = b1 >> 4;
|
||||
let nibble2 = b1 & 15;
|
||||
if nibble1 == END_OF_FLOAT_FLAG || nibble2 == END_OF_FLOAT_FLAG {
|
||||
break;
|
||||
}
|
||||
}
|
||||
Some(0)
|
||||
}
|
||||
32..=246 => {
|
||||
let n = i32::from(b0) - 139;
|
||||
Some(n)
|
||||
}
|
||||
247..=250 => {
|
||||
let b1 = i32::from(s.read::<u8>()?);
|
||||
let n = (i32::from(b0) - 247) * 256 + b1 + 108;
|
||||
Some(n)
|
||||
}
|
||||
251..=254 => {
|
||||
let b1 = i32::from(s.read::<u8>()?);
|
||||
let n = -(i32::from(b0) - 251) * 256 - b1 - 108;
|
||||
Some(n)
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
// Just like `parse_number`, but doesn't actually parse the data.
|
||||
pub fn skip_number(b0: u8, s: &mut Stream) -> Option<()> {
|
||||
match b0 {
|
||||
28 => s.skip::<u16>(),
|
||||
29 => s.skip::<u32>(),
|
||||
30 => {
|
||||
while !s.at_end() {
|
||||
let b1 = s.read::<u8>()?;
|
||||
let nibble1 = b1 >> 4;
|
||||
let nibble2 = b1 & 15;
|
||||
if nibble1 == END_OF_FLOAT_FLAG || nibble2 == END_OF_FLOAT_FLAG {
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
32..=246 => {}
|
||||
247..=250 => s.skip::<u8>(),
|
||||
251..=254 => s.skip::<u8>(),
|
||||
_ => return None,
|
||||
}
|
||||
|
||||
Some(())
|
||||
}
|
||||
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn parse_dict_number() {
|
||||
assert_eq!(parse_number(0xFA, &mut Stream::new(&[0x7C])).unwrap(), 1000);
|
||||
assert_eq!(parse_number(0xFE, &mut Stream::new(&[0x7C])).unwrap(), -1000);
|
||||
assert_eq!(parse_number(0x1C, &mut Stream::new(&[0x27, 0x10])).unwrap(), 10000);
|
||||
assert_eq!(parse_number(0x1C, &mut Stream::new(&[0xD8, 0xF0])).unwrap(), -10000);
|
||||
assert_eq!(parse_number(0x1D, &mut Stream::new(&[0x00, 0x01, 0x86, 0xA0])).unwrap(), 100000);
|
||||
assert_eq!(parse_number(0x1D, &mut Stream::new(&[0xFF, 0xFE, 0x79, 0x60])).unwrap(), -100000);
|
||||
}
|
||||
}
|
|
@ -0,0 +1,226 @@
|
|||
use crate::parser::{Stream, U24, NumFrom, FromData};
|
||||
|
||||
pub trait IndexSize: FromData {
|
||||
fn to_u32(self) -> u32;
|
||||
}
|
||||
|
||||
impl IndexSize for u16 {
|
||||
fn to_u32(self) -> u32 { u32::from(self) }
|
||||
}
|
||||
|
||||
impl IndexSize for u32 {
|
||||
fn to_u32(self) -> u32 { self }
|
||||
}
|
||||
|
||||
|
||||
#[inline]
|
||||
pub fn parse_index<'a, T: IndexSize>(s: &mut Stream<'a>) -> Option<Index<'a>> {
|
||||
let count = s.read::<T>()?;
|
||||
parse_index_impl(count.to_u32(), s)
|
||||
}
|
||||
|
||||
#[inline(never)]
|
||||
fn parse_index_impl<'a>(count: u32, s: &mut Stream<'a>) -> Option<Index<'a>> {
|
||||
if count == 0 || count == core::u32::MAX {
|
||||
return Some(Index::default());
|
||||
}
|
||||
|
||||
let offset_size = s.read::<OffsetSize>()?;
|
||||
let offsets_len = (count + 1).checked_mul(offset_size.to_u32())?;
|
||||
let offsets = VarOffsets {
|
||||
data: &s.read_bytes(usize::num_from(offsets_len))?,
|
||||
offset_size,
|
||||
};
|
||||
|
||||
// Last offset indicates a Data Index size.
|
||||
match offsets.last() {
|
||||
Some(last_offset) => {
|
||||
let data = s.read_bytes(usize::num_from(last_offset))?;
|
||||
Some(Index { data, offsets })
|
||||
}
|
||||
None => {
|
||||
Some(Index::default())
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn skip_index<T: IndexSize>(s: &mut Stream) -> Option<()> {
|
||||
let count = s.read::<T>()?;
|
||||
skip_index_impl(count.to_u32(), s)
|
||||
}
|
||||
|
||||
#[inline(never)]
|
||||
fn skip_index_impl(count: u32, s: &mut Stream) -> Option<()> {
|
||||
if count == 0 || count == core::u32::MAX {
|
||||
return Some(());
|
||||
}
|
||||
|
||||
let offset_size = s.read::<OffsetSize>()?;
|
||||
let offsets_len = (count + 1).checked_mul(offset_size.to_u32())?;
|
||||
let offsets = VarOffsets {
|
||||
data: &s.read_bytes(usize::num_from(offsets_len))?,
|
||||
offset_size,
|
||||
};
|
||||
|
||||
if let Some(last_offset) = offsets.last() {
|
||||
s.advance(usize::num_from(last_offset));
|
||||
}
|
||||
|
||||
Some(())
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct VarOffsets<'a> {
|
||||
pub data: &'a [u8],
|
||||
pub offset_size: OffsetSize,
|
||||
}
|
||||
|
||||
impl<'a> VarOffsets<'a> {
|
||||
pub fn get(&self, index: u32) -> Option<u32> {
|
||||
if index >= self.len() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let start = usize::num_from(index) * self.offset_size.to_usize();
|
||||
let mut s = Stream::new_at(self.data, start)?;
|
||||
let n: u32 = match self.offset_size {
|
||||
OffsetSize::Size1 => u32::from(s.read::<u8>()?),
|
||||
OffsetSize::Size2 => u32::from(s.read::<u16>()?),
|
||||
OffsetSize::Size3 => s.read::<U24>()?.0,
|
||||
OffsetSize::Size4 => s.read::<u32>()?,
|
||||
};
|
||||
|
||||
// Offsets are offset by one byte in the font,
|
||||
// so we have to shift them back.
|
||||
n.checked_sub(1)
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn last(&self) -> Option<u32> {
|
||||
if !self.is_empty() {
|
||||
self.get(self.len() - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn len(&self) -> u32 {
|
||||
self.data.len() as u32 / self.offset_size as u32
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.len() == 0
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Index<'a> {
|
||||
pub data: &'a [u8],
|
||||
pub offsets: VarOffsets<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Default for Index<'a> {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
Index {
|
||||
data: b"",
|
||||
offsets: VarOffsets { data: b"", offset_size: OffsetSize::Size1 },
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Index<'a> {
|
||||
type Item = &'a [u8];
|
||||
type IntoIter = IndexIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
IndexIter {
|
||||
data: self,
|
||||
offset_index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Index<'a> {
|
||||
#[inline]
|
||||
pub fn len(&self) -> u32 {
|
||||
// Last offset points to the byte after the `Object data`. We should skip it.
|
||||
self.offsets.len().checked_sub(1).unwrap_or(0)
|
||||
}
|
||||
|
||||
pub fn get(&self, index: u32) -> Option<&'a [u8]> {
|
||||
let next_index = index.checked_add(1)?; // make sure we do not overflow
|
||||
let start = usize::num_from(self.offsets.get(index)?);
|
||||
let end = usize::num_from(self.offsets.get(next_index)?);
|
||||
self.data.get(start..end)
|
||||
}
|
||||
}
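// A small sketch (not from the upstream test suite) of how an INDEX is laid out and read:
// a count, an offset size, `count + 1` one-based offsets and then the object data.
// The bytes below describe a hypothetical INDEX with a single one-byte object.
#[cfg(test)]
mod index_usage_sketch {
    use super::*;

    #[test]
    fn read_single_object_index() {
        let data = [
            0x00, 0x01, // count: 1
            0x01,       // offset size: 1
            0x01, 0x02, // offsets: 1, 2 (one-based)
            0xAB,       // object data
        ];
        let mut s = Stream::new(&data);
        let index = parse_index::<u16>(&mut s).unwrap();
        assert_eq!(index.len(), 1);
        assert_eq!(index.get(0), Some(&[0xAB][..]));
        assert_eq!(index.get(1), None);
    }
}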
|
||||
|
||||
pub struct IndexIter<'a> {
|
||||
data: Index<'a>,
|
||||
offset_index: u32,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for IndexIter<'a> {
|
||||
type Item = &'a [u8];
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.offset_index == self.data.len() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let index = self.offset_index;
|
||||
self.offset_index += 1;
|
||||
self.data.get(index)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub enum OffsetSize {
|
||||
Size1 = 1,
|
||||
Size2 = 2,
|
||||
Size3 = 3,
|
||||
Size4 = 4,
|
||||
}
|
||||
|
||||
impl OffsetSize {
|
||||
#[inline] pub fn to_u32(self) -> u32 { self as u32 }
|
||||
#[inline] pub fn to_usize(self) -> usize { self as usize }
|
||||
}
|
||||
|
||||
impl FromData for OffsetSize {
|
||||
const SIZE: usize = 1;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
match data.get(0)? {
|
||||
1 => Some(OffsetSize::Size1),
|
||||
2 => Some(OffsetSize::Size2),
|
||||
3 => Some(OffsetSize::Size3),
|
||||
4 => Some(OffsetSize::Size4),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn parse_offset_size() {
|
||||
assert_eq!(core::mem::size_of::<OffsetSize>(), 1);
|
||||
|
||||
assert_eq!(Stream::new(&[0x00]).read::<OffsetSize>(), None);
|
||||
assert_eq!(Stream::new(&[0x01]).read::<OffsetSize>(), Some(OffsetSize::Size1));
|
||||
assert_eq!(Stream::new(&[0x05]).read::<OffsetSize>(), None);
|
||||
}
|
||||
}
|
|
@ -0,0 +1,139 @@
|
|||
pub mod cff1;
|
||||
#[cfg(feature = "variable-fonts")] pub mod cff2;
|
||||
mod argstack;
|
||||
mod charset;
|
||||
mod charstring;
|
||||
mod dict;
|
||||
mod index;
|
||||
#[cfg(feature = "glyph-names")] mod std_names;
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::{OutlineBuilder, BBox};
|
||||
use crate::parser::{FromData, TryNumFrom};
|
||||
|
||||
|
||||
/// A list of errors that can occur during CFF glyph outlining.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub enum CFFError {
|
||||
NoGlyph,
|
||||
ReadOutOfBounds,
|
||||
ZeroBBox,
|
||||
InvalidOperator,
|
||||
UnsupportedOperator,
|
||||
MissingEndChar,
|
||||
DataAfterEndChar,
|
||||
NestingLimitReached,
|
||||
ArgumentsStackLimitReached,
|
||||
InvalidArgumentsStackLength,
|
||||
BboxOverflow,
|
||||
MissingMoveTo,
|
||||
InvalidSubroutineIndex,
|
||||
NoLocalSubroutines,
|
||||
InvalidSeacCode,
|
||||
#[cfg(feature = "variable-fonts")] InvalidItemVariationDataIndex,
|
||||
#[cfg(feature = "variable-fonts")] InvalidNumberOfBlendOperands,
|
||||
#[cfg(feature = "variable-fonts")] BlendRegionsLimitReached,
|
||||
}
|
||||
|
||||
|
||||
pub(crate) struct Builder<'a> {
|
||||
builder: &'a mut dyn OutlineBuilder,
|
||||
bbox: BBox,
|
||||
}
|
||||
|
||||
impl<'a> Builder<'a> {
|
||||
#[inline]
|
||||
fn move_to(&mut self, x: f32, y: f32) {
|
||||
self.bbox.extend_by(x, y);
|
||||
self.builder.move_to(x, y);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn line_to(&mut self, x: f32, y: f32) {
|
||||
self.bbox.extend_by(x, y);
|
||||
self.builder.line_to(x, y);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
|
||||
self.bbox.extend_by(x1, y1);
|
||||
self.bbox.extend_by(x2, y2);
|
||||
self.bbox.extend_by(x, y);
|
||||
self.builder.curve_to(x1, y1, x2, y2, x, y);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn close(&mut self) {
|
||||
self.builder.close();
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A type-safe wrapper for string ID.
|
||||
#[derive(Clone, Copy, PartialEq, PartialOrd, Debug)]
|
||||
pub struct StringId(u16);
|
||||
|
||||
impl FromData for StringId {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
u16::parse(data).map(StringId)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
pub trait IsEven {
|
||||
fn is_even(&self) -> bool;
|
||||
fn is_odd(&self) -> bool;
|
||||
}
|
||||
|
||||
impl IsEven for usize {
|
||||
#[inline]
|
||||
fn is_even(&self) -> bool { (*self) & 1 == 0 }
|
||||
|
||||
#[inline]
|
||||
fn is_odd(&self) -> bool { !self.is_even() }
|
||||
}
|
||||
|
||||
|
||||
#[cfg(feature = "std")]
|
||||
#[inline]
|
||||
pub fn f32_abs(n: f32) -> f32 {
|
||||
n.abs()
|
||||
}
|
||||
|
||||
#[cfg(not(feature = "std"))]
|
||||
#[inline]
|
||||
pub fn f32_abs(n: f32) -> f32 {
|
||||
if n.is_sign_negative() { -n } else { n }
|
||||
}
|
||||
|
||||
|
||||
#[inline]
|
||||
pub fn conv_subroutine_index(index: f32, bias: u16) -> Result<u32, CFFError> {
|
||||
conv_subroutine_index_impl(index, bias).ok_or(CFFError::InvalidSubroutineIndex)
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn conv_subroutine_index_impl(index: f32, bias: u16) -> Option<u32> {
|
||||
let index = i32::try_num_from(index)?;
|
||||
let bias = i32::from(bias);
|
||||
|
||||
let index = index.checked_add(bias)?;
|
||||
u32::try_from(index).ok()
|
||||
}
|
||||
|
||||
// Adobe Technical Note #5176, Chapter 16 "Local / Global Subrs INDEXes"
|
||||
#[inline]
|
||||
pub fn calc_subroutine_bias(len: u32) -> u16 {
|
||||
if len < 1240 {
|
||||
107
|
||||
} else if len < 33900 {
|
||||
1131
|
||||
} else {
|
||||
32768
|
||||
}
|
||||
}
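// A few sanity checks (a sketch, not upstream tests) for the bias rules referenced above
// and for `conv_subroutine_index`, which applies that bias to a charstring operand.
#[cfg(test)]
mod bias_sketch {
    use super::*;

    #[test]
    fn subroutine_bias_thresholds() {
        assert_eq!(calc_subroutine_bias(1239), 107);
        assert_eq!(calc_subroutine_bias(1240), 1131);
        assert_eq!(calc_subroutine_bias(33899), 1131);
        assert_eq!(calc_subroutine_bias(33900), 32768);
    }

    #[test]
    fn biased_subroutine_index() {
        // An operand of -107.0 with a bias of 107 selects the first subroutine.
        assert_eq!(conv_subroutine_index(-107.0, 107), Ok(0));
        // An operand below -bias cannot address any subroutine.
        assert!(conv_subroutine_index(-108.0, 107).is_err());
    }
}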
|
|
@ -0,0 +1,393 @@
|
|||
pub const STANDARD_NAMES: &[&str] = &[
|
||||
".notdef",
|
||||
"space",
|
||||
"exclam",
|
||||
"quotedbl",
|
||||
"numbersign",
|
||||
"dollar",
|
||||
"percent",
|
||||
"ampersand",
|
||||
"quoteright",
|
||||
"parenleft",
|
||||
"parenright",
|
||||
"asterisk",
|
||||
"plus",
|
||||
"comma",
|
||||
"hyphen",
|
||||
"period",
|
||||
"slash",
|
||||
"zero",
|
||||
"one",
|
||||
"two",
|
||||
"three",
|
||||
"four",
|
||||
"five",
|
||||
"six",
|
||||
"seven",
|
||||
"eight",
|
||||
"nine",
|
||||
"colon",
|
||||
"semicolon",
|
||||
"less",
|
||||
"equal",
|
||||
"greater",
|
||||
"question",
|
||||
"at",
|
||||
"A",
|
||||
"B",
|
||||
"C",
|
||||
"D",
|
||||
"E",
|
||||
"F",
|
||||
"G",
|
||||
"H",
|
||||
"I",
|
||||
"J",
|
||||
"K",
|
||||
"L",
|
||||
"M",
|
||||
"N",
|
||||
"O",
|
||||
"P",
|
||||
"Q",
|
||||
"R",
|
||||
"S",
|
||||
"T",
|
||||
"U",
|
||||
"V",
|
||||
"W",
|
||||
"X",
|
||||
"Y",
|
||||
"Z",
|
||||
"bracketleft",
|
||||
"backslash",
|
||||
"bracketright",
|
||||
"asciicircum",
|
||||
"underscore",
|
||||
"quoteleft",
|
||||
"a",
|
||||
"b",
|
||||
"c",
|
||||
"d",
|
||||
"e",
|
||||
"f",
|
||||
"g",
|
||||
"h",
|
||||
"i",
|
||||
"j",
|
||||
"k",
|
||||
"l",
|
||||
"m",
|
||||
"n",
|
||||
"o",
|
||||
"p",
|
||||
"q",
|
||||
"r",
|
||||
"s",
|
||||
"t",
|
||||
"u",
|
||||
"v",
|
||||
"w",
|
||||
"x",
|
||||
"y",
|
||||
"z",
|
||||
"braceleft",
|
||||
"bar",
|
||||
"braceright",
|
||||
"asciitilde",
|
||||
"exclamdown",
|
||||
"cent",
|
||||
"sterling",
|
||||
"fraction",
|
||||
"yen",
|
||||
"florin",
|
||||
"section",
|
||||
"currency",
|
||||
"quotesingle",
|
||||
"quotedblleft",
|
||||
"guillemotleft",
|
||||
"guilsinglleft",
|
||||
"guilsinglright",
|
||||
"fi",
|
||||
"fl",
|
||||
"endash",
|
||||
"dagger",
|
||||
"daggerdbl",
|
||||
"periodcentered",
|
||||
"paragraph",
|
||||
"bullet",
|
||||
"quotesinglbase",
|
||||
"quotedblbase",
|
||||
"quotedblright",
|
||||
"guillemotright",
|
||||
"ellipsis",
|
||||
"perthousand",
|
||||
"questiondown",
|
||||
"grave",
|
||||
"acute",
|
||||
"circumflex",
|
||||
"tilde",
|
||||
"macron",
|
||||
"breve",
|
||||
"dotaccent",
|
||||
"dieresis",
|
||||
"ring",
|
||||
"cedilla",
|
||||
"hungarumlaut",
|
||||
"ogonek",
|
||||
"caron",
|
||||
"emdash",
|
||||
"AE",
|
||||
"ordfeminine",
|
||||
"Lslash",
|
||||
"Oslash",
|
||||
"OE",
|
||||
"ordmasculine",
|
||||
"ae",
|
||||
"dotlessi",
|
||||
"lslash",
|
||||
"oslash",
|
||||
"oe",
|
||||
"germandbls",
|
||||
"onesuperior",
|
||||
"logicalnot",
|
||||
"mu",
|
||||
"trademark",
|
||||
"Eth",
|
||||
"onehalf",
|
||||
"plusminus",
|
||||
"Thorn",
|
||||
"onequarter",
|
||||
"divide",
|
||||
"brokenbar",
|
||||
"degree",
|
||||
"thorn",
|
||||
"threequarters",
|
||||
"twosuperior",
|
||||
"registered",
|
||||
"minus",
|
||||
"eth",
|
||||
"multiply",
|
||||
"threesuperior",
|
||||
"copyright",
|
||||
"Aacute",
|
||||
"Acircumflex",
|
||||
"Adieresis",
|
||||
"Agrave",
|
||||
"Aring",
|
||||
"Atilde",
|
||||
"Ccedilla",
|
||||
"Eacute",
|
||||
"Ecircumflex",
|
||||
"Edieresis",
|
||||
"Egrave",
|
||||
"Iacute",
|
||||
"Icircumflex",
|
||||
"Idieresis",
|
||||
"Igrave",
|
||||
"Ntilde",
|
||||
"Oacute",
|
||||
"Ocircumflex",
|
||||
"Odieresis",
|
||||
"Ograve",
|
||||
"Otilde",
|
||||
"Scaron",
|
||||
"Uacute",
|
||||
"Ucircumflex",
|
||||
"Udieresis",
|
||||
"Ugrave",
|
||||
"Yacute",
|
||||
"Ydieresis",
|
||||
"Zcaron",
|
||||
"aacute",
|
||||
"acircumflex",
|
||||
"adieresis",
|
||||
"agrave",
|
||||
"aring",
|
||||
"atilde",
|
||||
"ccedilla",
|
||||
"eacute",
|
||||
"ecircumflex",
|
||||
"edieresis",
|
||||
"egrave",
|
||||
"iacute",
|
||||
"icircumflex",
|
||||
"idieresis",
|
||||
"igrave",
|
||||
"ntilde",
|
||||
"oacute",
|
||||
"ocircumflex",
|
||||
"odieresis",
|
||||
"ograve",
|
||||
"otilde",
|
||||
"scaron",
|
||||
"uacute",
|
||||
"ucircumflex",
|
||||
"udieresis",
|
||||
"ugrave",
|
||||
"yacute",
|
||||
"ydieresis",
|
||||
"zcaron",
|
||||
"exclamsmall",
|
||||
"Hungarumlautsmall",
|
||||
"dollaroldstyle",
|
||||
"dollarsuperior",
|
||||
"ampersandsmall",
|
||||
"Acutesmall",
|
||||
"parenleftsuperior",
|
||||
"parenrightsuperior",
|
||||
"twodotenleader",
|
||||
"onedotenleader",
|
||||
"zerooldstyle",
|
||||
"oneoldstyle",
|
||||
"twooldstyle",
|
||||
"threeoldstyle",
|
||||
"fouroldstyle",
|
||||
"fiveoldstyle",
|
||||
"sixoldstyle",
|
||||
"sevenoldstyle",
|
||||
"eightoldstyle",
|
||||
"nineoldstyle",
|
||||
"commasuperior",
|
||||
"threequartersemdash",
|
||||
"periodsuperior",
|
||||
"questionsmall",
|
||||
"asuperior",
|
||||
"bsuperior",
|
||||
"centsuperior",
|
||||
"dsuperior",
|
||||
"esuperior",
|
||||
"isuperior",
|
||||
"lsuperior",
|
||||
"msuperior",
|
||||
"nsuperior",
|
||||
"osuperior",
|
||||
"rsuperior",
|
||||
"ssuperior",
|
||||
"tsuperior",
|
||||
"ff",
|
||||
"ffi",
|
||||
"ffl",
|
||||
"parenleftinferior",
|
||||
"parenrightinferior",
|
||||
"Circumflexsmall",
|
||||
"hyphensuperior",
|
||||
"Gravesmall",
|
||||
"Asmall",
|
||||
"Bsmall",
|
||||
"Csmall",
|
||||
"Dsmall",
|
||||
"Esmall",
|
||||
"Fsmall",
|
||||
"Gsmall",
|
||||
"Hsmall",
|
||||
"Ismall",
|
||||
"Jsmall",
|
||||
"Ksmall",
|
||||
"Lsmall",
|
||||
"Msmall",
|
||||
"Nsmall",
|
||||
"Osmall",
|
||||
"Psmall",
|
||||
"Qsmall",
|
||||
"Rsmall",
|
||||
"Ssmall",
|
||||
"Tsmall",
|
||||
"Usmall",
|
||||
"Vsmall",
|
||||
"Wsmall",
|
||||
"Xsmall",
|
||||
"Ysmall",
|
||||
"Zsmall",
|
||||
"colonmonetary",
|
||||
"onefitted",
|
||||
"rupiah",
|
||||
"Tildesmall",
|
||||
"exclamdownsmall",
|
||||
"centoldstyle",
|
||||
"Lslashsmall",
|
||||
"Scaronsmall",
|
||||
"Zcaronsmall",
|
||||
"Dieresissmall",
|
||||
"Brevesmall",
|
||||
"Caronsmall",
|
||||
"Dotaccentsmall",
|
||||
"Macronsmall",
|
||||
"figuredash",
|
||||
"hypheninferior",
|
||||
"Ogoneksmall",
|
||||
"Ringsmall",
|
||||
"Cedillasmall",
|
||||
"questiondownsmall",
|
||||
"oneeighth",
|
||||
"threeeighths",
|
||||
"fiveeighths",
|
||||
"seveneighths",
|
||||
"onethird",
|
||||
"twothirds",
|
||||
"zerosuperior",
|
||||
"foursuperior",
|
||||
"fivesuperior",
|
||||
"sixsuperior",
|
||||
"sevensuperior",
|
||||
"eightsuperior",
|
||||
"ninesuperior",
|
||||
"zeroinferior",
|
||||
"oneinferior",
|
||||
"twoinferior",
|
||||
"threeinferior",
|
||||
"fourinferior",
|
||||
"fiveinferior",
|
||||
"sixinferior",
|
||||
"seveninferior",
|
||||
"eightinferior",
|
||||
"nineinferior",
|
||||
"centinferior",
|
||||
"dollarinferior",
|
||||
"periodinferior",
|
||||
"commainferior",
|
||||
"Agravesmall",
|
||||
"Aacutesmall",
|
||||
"Acircumflexsmall",
|
||||
"Atildesmall",
|
||||
"Adieresissmall",
|
||||
"Aringsmall",
|
||||
"AEsmall",
|
||||
"Ccedillasmall",
|
||||
"Egravesmall",
|
||||
"Eacutesmall",
|
||||
"Ecircumflexsmall",
|
||||
"Edieresissmall",
|
||||
"Igravesmall",
|
||||
"Iacutesmall",
|
||||
"Icircumflexsmall",
|
||||
"Idieresissmall",
|
||||
"Ethsmall",
|
||||
"Ntildesmall",
|
||||
"Ogravesmall",
|
||||
"Oacutesmall",
|
||||
"Ocircumflexsmall",
|
||||
"Otildesmall",
|
||||
"Odieresissmall",
|
||||
"OEsmall",
|
||||
"Oslashsmall",
|
||||
"Ugravesmall",
|
||||
"Uacutesmall",
|
||||
"Ucircumflexsmall",
|
||||
"Udieresissmall",
|
||||
"Yacutesmall",
|
||||
"Thornsmall",
|
||||
"Ydieresissmall",
|
||||
"001.000",
|
||||
"001.001",
|
||||
"001.002",
|
||||
"001.003",
|
||||
"Black",
|
||||
"Bold",
|
||||
"Book",
|
||||
"Light",
|
||||
"Medium",
|
||||
"Regular",
|
||||
"Roman",
|
||||
"Semibold",
|
||||
];
|
|
@ -0,0 +1,47 @@
|
|||
use crate::parser::{Stream, NumFrom};
|
||||
use crate::GlyphId;
|
||||
|
||||
/// A [format 0](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-0-byte-encoding-table)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable0<'a> {
|
||||
/// Just a list of 256 8-bit glyph IDs.
|
||||
pub glyph_ids: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable0<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // length
|
||||
s.skip::<u16>(); // language
|
||||
let glyph_ids = s.read_bytes(256)?;
|
||||
Some(Self { glyph_ids })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
let glyph_id = *self.glyph_ids.get(usize::num_from(code_point))?;
|
||||
// Make sure that the glyph is not zero: the array always has 256 IDs,
|
||||
// but some codepoints may be mapped to zero.
|
||||
if glyph_id != 0 {
|
||||
Some(GlyphId(u16::from(glyph_id)))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for (i, glyph_id) in self.glyph_ids.iter().enumerate() {
|
||||
// In contrast to every other format, here we take a look at the glyph
|
||||
// id and check whether it is zero because otherwise this method would
|
||||
// always simply call `f` for `0..256` which would be kind of pointless
|
||||
// (this array always has length 256 even when the face has fewer glyphs).
|
||||
if *glyph_id != 0 {
|
||||
f(i as u32);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
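// A minimal sketch (hypothetical bytes, not an upstream test) of a format 0 subtable:
// a 6-byte header followed by exactly 256 one-byte glyph IDs indexed by code point.
#[cfg(test)]
mod format0_sketch {
    use super::*;

    #[test]
    fn byte_encoding_lookup() {
        // The header (format, length, language) is skipped by the parser, so it can stay zeroed.
        let mut data = [0u8; 6 + 256];
        data[6 + 0x41] = 10; // map 'A' to glyph 10

        let subtable = Subtable0::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(u32::from('A')), Some(GlyphId(10)));
        // Code points mapped to glyph 0 are treated as unmapped.
        assert_eq!(subtable.glyph_index(u32::from('B')), None);
        // Code points outside the 0..256 range cannot be present at all.
        assert_eq!(subtable.glyph_index(0x1F600), None);
    }
}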
|
|
@ -0,0 +1,42 @@
|
|||
use crate::parser::{LazyArray32, Stream};
|
||||
use crate::GlyphId;
|
||||
|
||||
/// A [format 10](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-10-trimmed-array)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable10<'a> {
|
||||
/// First character code covered.
|
||||
pub first_code_point: u32,
|
||||
/// Array of glyph indices for the character codes covered.
|
||||
pub glyphs: LazyArray32<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable10<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // reserved
|
||||
s.skip::<u32>(); // length
|
||||
s.skip::<u32>(); // language
|
||||
let first_code_point = s.read::<u32>()?;
|
||||
let count = s.read::<u32>()?;
|
||||
let glyphs = s.read_array32::<GlyphId>(count)?;
|
||||
Some(Self { first_code_point, glyphs })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
let idx = code_point.checked_sub(self.first_code_point)?;
|
||||
self.glyphs.get(idx)
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for i in 0..self.glyphs.len() {
|
||||
if let Some(code_point) = self.first_code_point.checked_add(i) {
|
||||
f(code_point);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
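// A small sketch (hypothetical bytes) of a format 10 subtable: a 20-byte header followed
// by a dense array of 16-bit glyph IDs starting at `first_code_point`.
#[cfg(test)]
mod format10_sketch {
    use super::*;

    #[test]
    fn trimmed_array_lookup() {
        let data = [
            0x00, 0x0A, // format
            0x00, 0x00, // reserved
            0x00, 0x00, 0x00, 0x18, // length
            0x00, 0x00, 0x00, 0x00, // language
            0x00, 0x01, 0x00, 0x00, // first code point: U+10000
            0x00, 0x00, 0x00, 0x02, // number of glyphs
            0x00, 0x01, 0x00, 0x02, // glyph IDs
        ];
        let subtable = Subtable10::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x10001), Some(GlyphId(2)));
        assert_eq!(subtable.glyph_index(0x0041), None);
    }
}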
|
|
@ -0,0 +1,80 @@
|
|||
use core::convert::TryFrom;
|
||||
|
||||
use crate::parser::{FromData, LazyArray32, Stream};
|
||||
use crate::GlyphId;
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct SequentialMapGroup {
|
||||
pub start_char_code: u32,
|
||||
pub end_char_code: u32,
|
||||
pub start_glyph_id: u32,
|
||||
}
|
||||
|
||||
impl FromData for SequentialMapGroup {
|
||||
const SIZE: usize = 12;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(SequentialMapGroup {
|
||||
start_char_code: s.read::<u32>()?,
|
||||
end_char_code: s.read::<u32>()?,
|
||||
start_glyph_id: s.read::<u32>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [format 12](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-12-segmented-coverage)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable12<'a> {
|
||||
groups: LazyArray32<'a, SequentialMapGroup>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable12<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // reserved
|
||||
s.skip::<u32>(); // length
|
||||
s.skip::<u32>(); // language
|
||||
let count = s.read::<u32>()?;
|
||||
let groups = s.read_array32::<SequentialMapGroup>(count)?;
|
||||
Some(Self { groups })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
let (_, group) = self.groups.binary_search_by(|range| {
|
||||
use core::cmp::Ordering;
|
||||
|
||||
if range.start_char_code > code_point {
|
||||
Ordering::Greater
|
||||
} else if range.end_char_code < code_point {
|
||||
Ordering::Less
|
||||
} else {
|
||||
Ordering::Equal
|
||||
}
|
||||
})?;
|
||||
|
||||
let id = group.start_glyph_id.checked_add(code_point)?.checked_sub(group.start_char_code)?;
|
||||
return u16::try_from(id).ok().map(GlyphId);
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for group in self.groups {
|
||||
for code_point in group.start_char_code..=group.end_char_code {
|
||||
f(code_point);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
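// A minimal sketch (hypothetical bytes) of a format 12 subtable: sequential map groups,
// each covering a code point range and mapping it to consecutive glyph IDs.
#[cfg(test)]
mod format12_sketch {
    use super::*;

    #[test]
    fn segmented_coverage_lookup() {
        let data = [
            0x00, 0x0C, // format
            0x00, 0x00, // reserved
            0x00, 0x00, 0x00, 0x1C, // length
            0x00, 0x00, 0x00, 0x00, // language
            0x00, 0x00, 0x00, 0x01, // number of groups
            0x00, 0x01, 0x00, 0x00, // start char code: U+10000
            0x00, 0x01, 0x00, 0x01, // end char code:   U+10001
            0x00, 0x00, 0x00, 0x05, // start glyph ID
        ];
        let subtable = Subtable12::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x10000), Some(GlyphId(5)));
        assert_eq!(subtable.glyph_index(0x10001), Some(GlyphId(6)));
        assert_eq!(subtable.glyph_index(0x10002), None);
    }
}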
|
||||
|
||||
impl core::fmt::Debug for Subtable12<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable12 {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,55 @@
|
|||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-13-many-to-one-range-mappings
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::parser::{LazyArray32, Stream};
|
||||
use super::format12::SequentialMapGroup;
|
||||
use crate::GlyphId;
|
||||
|
||||
/// A [format 13](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-13-many-to-one-range-mappings)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable13<'a> {
|
||||
groups: LazyArray32<'a, SequentialMapGroup>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable13<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // reserved
|
||||
s.skip::<u32>(); // length
|
||||
s.skip::<u32>(); // language
|
||||
let count = s.read::<u32>()?;
|
||||
let groups = s.read_array32::<SequentialMapGroup>(count)?;
|
||||
Some(Self { groups })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
for group in self.groups {
|
||||
let start_char_code = group.start_char_code;
|
||||
if code_point >= start_char_code && code_point <= group.end_char_code {
|
||||
return u16::try_from(group.start_glyph_id).ok().map(GlyphId);
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for group in self.groups {
|
||||
for code_point in group.start_char_code..=group.end_char_code {
|
||||
f(code_point);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
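// A small sketch (hypothetical bytes) of a format 13 subtable. It reuses the format 12
// group layout, but every code point in a group maps to the same glyph ID.
#[cfg(test)]
mod format13_sketch {
    use super::*;

    #[test]
    fn many_to_one_lookup() {
        let data = [
            0x00, 0x0D, // format
            0x00, 0x00, // reserved
            0x00, 0x00, 0x00, 0x1C, // length
            0x00, 0x00, 0x00, 0x00, // language
            0x00, 0x00, 0x00, 0x01, // number of groups
            0x00, 0x00, 0x00, 0x20, // start char code: U+0020
            0x00, 0x00, 0x00, 0x7E, // end char code:   U+007E
            0x00, 0x00, 0x00, 0x03, // glyph ID for the whole range
        ];
        let subtable = Subtable13::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(u32::from(' ')), Some(GlyphId(3)));
        assert_eq!(subtable.glyph_index(u32::from('~')), Some(GlyphId(3)));
        assert_eq!(subtable.glyph_index(0x7F), None);
    }
}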
|
||||
|
||||
impl core::fmt::Debug for Subtable13<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable13 {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,140 @@
|
|||
use crate::GlyphId;
|
||||
use crate::parser::{FromData, LazyArray32, Offset, Offset32, Stream, U24};
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct VariationSelectorRecord {
|
||||
var_selector: u32,
|
||||
default_uvs_offset: Option<Offset32>,
|
||||
non_default_uvs_offset: Option<Offset32>,
|
||||
}
|
||||
|
||||
impl FromData for VariationSelectorRecord {
|
||||
const SIZE: usize = 11;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(VariationSelectorRecord {
|
||||
var_selector: s.read::<U24>()?.0,
|
||||
default_uvs_offset: s.read::<Option<Offset32>>()?,
|
||||
non_default_uvs_offset: s.read::<Option<Offset32>>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct UVSMappingRecord {
|
||||
unicode_value: u32,
|
||||
glyph_id: GlyphId,
|
||||
}
|
||||
|
||||
impl FromData for UVSMappingRecord {
|
||||
const SIZE: usize = 5;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(UVSMappingRecord {
|
||||
unicode_value: s.read::<U24>()?.0,
|
||||
glyph_id: s.read::<GlyphId>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct UnicodeRangeRecord {
|
||||
start_unicode_value: u32,
|
||||
additional_count: u8,
|
||||
}
|
||||
|
||||
impl UnicodeRangeRecord {
|
||||
fn contains(&self, c: u32) -> bool {
|
||||
// Never overflows, since `start_unicode_value` is actually u24.
|
||||
let end = self.start_unicode_value + u32::from(self.additional_count);
|
||||
(self.start_unicode_value..=end).contains(&c)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for UnicodeRangeRecord {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(UnicodeRangeRecord {
|
||||
start_unicode_value: s.read::<U24>()?.0,
|
||||
additional_count: s.read::<u8>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A result of a variation glyph mapping.
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub enum GlyphVariationResult {
|
||||
/// Glyph was found in the variation encoding table.
|
||||
Found(GlyphId),
|
||||
/// Glyph should be looked up in other, non-variation tables.
|
||||
///
|
||||
/// Basically, you should use `Subtable::glyph_index` or `Face::glyph_index`
|
||||
/// in this case.
|
||||
UseDefault,
|
||||
}
|
||||
|
||||
|
||||
/// A [format 14](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-14-unicode-variation-sequences)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable14<'a> {
|
||||
records: LazyArray32<'a, VariationSelectorRecord>,
|
||||
// The whole subtable data.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable14<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u32>(); // length
|
||||
let count = s.read::<u32>()?;
|
||||
let records = s.read_array32::<VariationSelectorRecord>(count)?;
|
||||
Some(Self { records, data })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
pub fn glyph_index(&self, code_point: u32, variation: u32) -> Option<GlyphVariationResult> {
|
||||
let (_, record) = self.records.binary_search_by(|v| v.var_selector.cmp(&variation))?;
|
||||
|
||||
if let Some(offset) = record.default_uvs_offset {
|
||||
let data = self.data.get(offset.to_usize()..)?;
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u32>()?;
|
||||
let ranges = s.read_array32::<UnicodeRangeRecord>(count)?;
|
||||
for range in ranges {
|
||||
if range.contains(code_point) {
|
||||
return Some(GlyphVariationResult::UseDefault);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(offset) = record.non_default_uvs_offset {
|
||||
let data = self.data.get(offset.to_usize()..)?;
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u32>()?;
|
||||
let uvs_mappings = s.read_array32::<UVSMappingRecord>(count)?;
|
||||
let (_, mapping) = uvs_mappings.binary_search_by(|v| v.unicode_value.cmp(&code_point))?;
|
||||
return Some(GlyphVariationResult::Found(mapping.glyph_id));
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable14<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable14 {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,157 @@
|
|||
// This table has a pretty complex parsing algorithm.
|
||||
// A detailed explanation can be found here:
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-2-high-byte-mapping-through-table
|
||||
// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6cmap.html
|
||||
// https://github.com/fonttools/fonttools/blob/a360252709a3d65f899915db0a5bd753007fdbb7/Lib/fontTools/ttLib/tables/_c_m_a_p.py#L360
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::parser::{FromData, LazyArray16, Stream};
|
||||
use crate::GlyphId;
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct SubHeaderRecord {
|
||||
first_code: u16,
|
||||
entry_count: u16,
|
||||
id_delta: i16,
|
||||
id_range_offset: u16,
|
||||
}
|
||||
|
||||
impl FromData for SubHeaderRecord {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(SubHeaderRecord {
|
||||
first_code: s.read::<u16>()?,
|
||||
entry_count: s.read::<u16>()?,
|
||||
id_delta: s.read::<i16>()?,
|
||||
id_range_offset: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A [format 2](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-2-high-byte-mapping-through-table)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable2<'a> {
|
||||
sub_header_keys: LazyArray16<'a, u16>,
|
||||
sub_headers_offset: usize,
|
||||
sub_headers: LazyArray16<'a, SubHeaderRecord>,
|
||||
// The whole subtable data.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable2<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // length
|
||||
s.skip::<u16>(); // language
|
||||
let sub_header_keys = s.read_array16::<u16>(256)?;
|
||||
// The maximum index in `sub_header_keys` is the number of sub-headers.
|
||||
let sub_headers_count = sub_header_keys.into_iter().map(|n| n / 8).max()? + 1;
|
||||
|
||||
// Remember the sub_headers offset before reading it. It will be used later.
|
||||
let sub_headers_offset = s.offset();
|
||||
let sub_headers = s.read_array16::<SubHeaderRecord>(sub_headers_count)?;
|
||||
|
||||
Some(Self {
|
||||
sub_header_keys,
|
||||
sub_headers_offset,
|
||||
sub_headers,
|
||||
data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
///
|
||||
/// Returns `None` when `code_point` is larger than `u16`.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
// This subtable supports code points only in a u16 range.
|
||||
let code_point = u16::try_from(code_point).ok()?;
|
||||
|
||||
|
||||
let high_byte = code_point >> 8;
|
||||
let low_byte = code_point & 0x00FF;
|
||||
|
||||
let i = if code_point < 0xff {
|
||||
// 'SubHeader 0 is special: it is used for single-byte character codes.'
|
||||
0
|
||||
} else {
|
||||
// 'Array that maps high bytes to subHeaders: value is subHeader index × 8.'
|
||||
self.sub_header_keys.get(high_byte)? / 8
|
||||
};
|
||||
|
||||
let sub_header = self.sub_headers.get(i)?;
|
||||
|
||||
let first_code = sub_header.first_code;
|
||||
let range_end = first_code.checked_add(sub_header.entry_count)?;
|
||||
if low_byte < first_code || low_byte >= range_end {
|
||||
return None;
|
||||
}
|
||||
|
||||
// SubHeaderRecord::id_range_offset points to SubHeaderRecord::first_code
|
||||
// in the glyphIndexArray. So we have to advance to our code point.
|
||||
let index_offset = usize::from(low_byte.checked_sub(first_code)?) * u16::SIZE;
|
||||
|
||||
// 'The value of the idRangeOffset is the number of bytes
|
||||
// past the actual location of the idRangeOffset'.
|
||||
let offset =
|
||||
self.sub_headers_offset
|
||||
// Advance to required subheader.
|
||||
+ SubHeaderRecord::SIZE * usize::from(i + 1)
|
||||
// Move back to idRangeOffset start.
|
||||
- u16::SIZE
|
||||
// Use defined offset.
|
||||
+ usize::from(sub_header.id_range_offset)
|
||||
// Advance to required index in the glyphIndexArray.
|
||||
+ index_offset;
|
||||
|
||||
let glyph: u16 = Stream::read_at(self.data, offset)?;
|
||||
if glyph == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
u16::try_from((i32::from(glyph) + i32::from(sub_header.id_delta)) % 65536).ok().map(GlyphId)
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, f: impl FnMut(u32)) {
|
||||
let _ = self.codepoints_inner(f);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn codepoints_inner(&self, mut f: impl FnMut(u32)) -> Option<()> {
|
||||
for first_byte in 0u16..256 {
|
||||
let i = self.sub_header_keys.get(first_byte)? / 8;
|
||||
let sub_header = self.sub_headers.get(i)?;
|
||||
let first_code = sub_header.first_code;
|
||||
|
||||
if i == 0 {
|
||||
// This is a single byte code.
|
||||
let range_end = first_code.checked_add(sub_header.entry_count)?;
|
||||
if first_byte >= first_code && first_byte < range_end {
|
||||
f(u32::from(first_byte));
|
||||
}
|
||||
} else {
|
||||
// This is a two byte code.
|
||||
let base = first_code.checked_add(first_byte << 8)?;
|
||||
for k in 0..sub_header.entry_count {
|
||||
let code_point = base.checked_add(k)?;
|
||||
f(u32::from(code_point));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Some(())
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable2<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable2 {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,121 @@
|
|||
use core::convert::TryFrom;
|
||||
|
||||
use crate::parser::{LazyArray16, Stream};
|
||||
use crate::GlyphId;
|
||||
|
||||
/// A [format 4](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-4-segment-mapping-to-delta-values)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable4<'a> {
|
||||
start_codes: LazyArray16<'a, u16>,
|
||||
end_codes: LazyArray16<'a, u16>,
|
||||
id_deltas: LazyArray16<'a, i16>,
|
||||
id_range_offsets: LazyArray16<'a, u16>,
|
||||
id_range_offset_pos: usize,
|
||||
// The whole subtable data.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable4<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.advance(6); // format + length + language
|
||||
let seg_count_x2 = s.read::<u16>()?;
|
||||
if seg_count_x2 < 2 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let seg_count = seg_count_x2 / 2;
|
||||
s.advance(6); // searchRange + entrySelector + rangeShift
|
||||
|
||||
let end_codes = s.read_array16::<u16>(seg_count)?;
|
||||
s.skip::<u16>(); // reservedPad
|
||||
let start_codes = s.read_array16::<u16>(seg_count)?;
|
||||
let id_deltas = s.read_array16::<i16>(seg_count)?;
|
||||
let id_range_offset_pos = s.offset();
|
||||
let id_range_offsets = s.read_array16::<u16>(seg_count)?;
|
||||
|
||||
Some(Self {
|
||||
start_codes,
|
||||
end_codes,
|
||||
id_deltas,
|
||||
id_range_offsets,
|
||||
id_range_offset_pos,
|
||||
data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
///
|
||||
/// Returns `None` when `code_point` is larger than `u16`.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
// This subtable supports code points only in a u16 range.
|
||||
let code_point = u16::try_from(code_point).ok()?;
|
||||
|
||||
// A custom binary search.
|
||||
let mut start = 0;
|
||||
let mut end = self.start_codes.len();
|
||||
while end > start {
|
||||
let index = (start + end) / 2;
|
||||
let end_value = self.end_codes.get(index)?;
|
||||
if end_value >= code_point {
|
||||
let start_value = self.start_codes.get(index)?;
|
||||
if start_value > code_point {
|
||||
end = index;
|
||||
} else {
|
||||
let id_range_offset = self.id_range_offsets.get(index)?;
|
||||
let id_delta = self.id_deltas.get(index)?;
|
||||
if id_range_offset == 0 {
|
||||
return Some(GlyphId(code_point.wrapping_add(id_delta as u16)));
|
||||
} else if id_range_offset == 0xFFFF {
|
||||
// Some malformed fonts have 0xFFFF as the last offset,
|
||||
// which is invalid and should be ignored.
|
||||
return None;
|
||||
}
|
||||
|
||||
let delta = (u32::from(code_point) - u32::from(start_value)) * 2;
|
||||
let delta = u16::try_from(delta).ok()?;
|
||||
|
||||
let id_range_offset_pos = (self.id_range_offset_pos + usize::from(index) * 2) as u16;
|
||||
let pos = id_range_offset_pos.wrapping_add(delta);
|
||||
let pos = pos.wrapping_add(id_range_offset);
|
||||
|
||||
let glyph_array_value: u16 = Stream::read_at(self.data, usize::from(pos))?;
|
||||
|
||||
// 0 indicates missing glyph.
|
||||
if glyph_array_value == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let glyph_id = (glyph_array_value as i16).wrapping_add(id_delta);
|
||||
return u16::try_from(glyph_id).ok().map(GlyphId);
|
||||
}
|
||||
} else {
|
||||
start = index + 1;
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for (start, end) in self.start_codes.into_iter().zip(self.end_codes) {
|
||||
// The 0xFFFF value is special and indicates the end of the character codes.
|
||||
if start == end && start == 0xFFFF {
|
||||
break;
|
||||
}
|
||||
|
||||
for code_point in start..=end {
|
||||
f(u32::from(code_point));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
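// A minimal format 4 subtable sketch (hypothetical bytes) with one real segment
// ('A'..='Z' mapped to glyphs 1..=26 via idDelta) plus the required 0xFFFF terminator.
#[cfg(test)]
mod format4_sketch {
    use super::*;

    #[test]
    fn delta_mapped_segment() {
        let data = [
            0x00, 0x04, // format
            0x00, 0x20, // length
            0x00, 0x00, // language
            0x00, 0x04, // segCountX2
            0x00, 0x04, // searchRange (ignored by the parser)
            0x00, 0x01, // entrySelector (ignored)
            0x00, 0x00, // rangeShift (ignored)
            0x00, 0x5A, 0xFF, 0xFF, // end codes: 'Z', 0xFFFF
            0x00, 0x00, // reservedPad
            0x00, 0x41, 0xFF, 0xFF, // start codes: 'A', 0xFFFF
            0xFF, 0xC0, 0x00, 0x01, // idDelta: -64, 1
            0x00, 0x00, 0x00, 0x00, // idRangeOffset: 0, 0
        ];
        let subtable = Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(u32::from('A')), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(u32::from('Z')), Some(GlyphId(26)));
        assert_eq!(subtable.glyph_index(u32::from('a')), None);
    }
}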
|
||||
|
||||
impl core::fmt::Debug for Subtable4<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable4 {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,47 @@
|
|||
use core::convert::TryFrom;
|
||||
|
||||
use crate::parser::{LazyArray16, Stream};
|
||||
use crate::GlyphId;
|
||||
|
||||
/// A [format 6](https://docs.microsoft.com/en-us/typography/opentype/spec/cmap#format-6-trimmed-table-mapping)
|
||||
/// subtable.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable6<'a> {
|
||||
/// First character code of subrange.
|
||||
pub first_code_point: u16,
|
||||
/// Array of glyph indexes for character codes in the range.
|
||||
pub glyphs: LazyArray16<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable6<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // format
|
||||
s.skip::<u16>(); // length
|
||||
s.skip::<u16>(); // language
|
||||
let first_code_point = s.read::<u16>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let glyphs = s.read_array16::<GlyphId>(count)?;
|
||||
Some(Self { first_code_point, glyphs })
|
||||
}
|
||||
|
||||
/// Returns a glyph index for a code point.
|
||||
///
|
||||
/// Returns `None` when `code_point` is larger than `u16`.
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
// This subtable supports code points only in a u16 range.
|
||||
let code_point = u16::try_from(code_point).ok()?;
|
||||
let idx = code_point.checked_sub(self.first_code_point)?;
|
||||
self.glyphs.get(idx)
|
||||
}
|
||||
|
||||
/// Calls `f` for each codepoint defined in this table.
|
||||
pub fn codepoints(&self, mut f: impl FnMut(u32)) {
|
||||
for i in 0..self.glyphs.len() {
|
||||
if let Some(code_point) = self.first_code_point.checked_add(i) {
|
||||
f(u32::from(code_point));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
|
@ -0,0 +1,275 @@
|
|||
/*!
|
||||
A [Character to Glyph Index Mapping Table](
|
||||
https://docs.microsoft.com/en-us/typography/opentype/spec/cmap) implementation.
|
||||
|
||||
This module provides a low-level alternative to
|
||||
[`Face::glyph_index`](../struct.Face.html#method.glyph_index) and
|
||||
[`Face::glyph_variation_index`](../struct.Face.html#method.glyph_variation_index)
|
||||
methods.
|
||||
*/
|
||||
|
||||
use crate::{GlyphId, name::PlatformId};
|
||||
use crate::parser::{FromData, LazyArray16, Offset, Offset32, Stream};
|
||||
|
||||
mod format0;
|
||||
mod format2;
|
||||
mod format4;
|
||||
mod format6;
|
||||
mod format10;
|
||||
mod format12;
|
||||
mod format13;
|
||||
mod format14;
|
||||
|
||||
pub use format0::Subtable0;
|
||||
pub use format2::Subtable2;
|
||||
pub use format4::Subtable4;
|
||||
pub use format6::Subtable6;
|
||||
pub use format10::Subtable10;
|
||||
pub use format12::Subtable12;
|
||||
pub use format13::Subtable13;
|
||||
pub use format14::{Subtable14, GlyphVariationResult};
|
||||
|
||||
/// A character encoding subtable variant.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum Format<'a> {
|
||||
ByteEncodingTable(Subtable0<'a>),
|
||||
HighByteMappingThroughTable(Subtable2<'a>),
|
||||
SegmentMappingToDeltaValues(Subtable4<'a>),
|
||||
TrimmedTableMapping(Subtable6<'a>),
|
||||
MixedCoverage, // unsupported
|
||||
TrimmedArray(Subtable10<'a>),
|
||||
SegmentedCoverage(Subtable12<'a>),
|
||||
ManyToOneRangeMappings(Subtable13<'a>),
|
||||
UnicodeVariationSequences(Subtable14<'a>),
|
||||
}
|
||||
|
||||
|
||||
/// A character encoding subtable.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable<'a> {
|
||||
/// Subtable platform.
|
||||
pub platform_id: PlatformId,
|
||||
/// Subtable encoding.
|
||||
pub encoding_id: u16,
|
||||
/// A subtable format.
|
||||
pub format: Format<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable<'a> {
|
||||
/// Checks that the current encoding is Unicode compatible.
|
||||
#[inline]
|
||||
pub fn is_unicode(&self) -> bool {
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/name#windows-encoding-ids
|
||||
const WINDOWS_UNICODE_BMP_ENCODING_ID: u16 = 1;
|
||||
const WINDOWS_UNICODE_FULL_REPERTOIRE_ENCODING_ID: u16 = 10;
|
||||
|
||||
match self.platform_id {
|
||||
PlatformId::Unicode => true,
|
||||
PlatformId::Windows if self.encoding_id == WINDOWS_UNICODE_BMP_ENCODING_ID => true,
|
||||
PlatformId::Windows => {
|
||||
// "Note: Subtable format 13 has the same structure as format 12; it differs only
|
||||
// in the interpretation of the startGlyphID/glyphID fields".
|
||||
let is_format_12_compatible =
|
||||
matches!(self.format, Format::SegmentedCoverage(..) | Format::ManyToOneRangeMappings(..));
|
||||
|
||||
// "Fonts that support Unicode supplementary-plane characters (U+10000 to U+10FFFF)
|
||||
// on the Windows platform must have a format 12 subtable for platform ID 3,
|
||||
// encoding ID 10."
|
||||
self.encoding_id == WINDOWS_UNICODE_FULL_REPERTOIRE_ENCODING_ID
|
||||
&& is_format_12_compatible
|
||||
}
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
||||
/// Maps a character to a glyph ID.
|
||||
///
|
||||
/// This is a low-level method and unlike `Face::glyph_index` it doesn't
|
||||
/// check that the current encoding is Unicode.
|
||||
/// It simply maps a `u32` codepoint number to a glyph ID.
|
||||
///
|
||||
/// Returns `None`:
|
||||
/// - when glyph ID is `0`.
|
||||
/// - when format is `MixedCoverage`, since it's not supported.
|
||||
/// - when format is `UnicodeVariationSequences`. Use `glyph_variation_index` instead.
|
||||
#[inline]
|
||||
pub fn glyph_index(&self, code_point: u32) -> Option<GlyphId> {
|
||||
match self.format {
|
||||
Format::ByteEncodingTable(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::HighByteMappingThroughTable(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::SegmentMappingToDeltaValues(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::TrimmedTableMapping(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::MixedCoverage => None,
|
||||
Format::TrimmedArray(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::SegmentedCoverage(ref subtable) => subtable.glyph_index(code_point),
|
||||
Format::ManyToOneRangeMappings(ref subtable) => subtable.glyph_index(code_point),
|
||||
// This subtable should be accessed via glyph_variation_index().
|
||||
Format::UnicodeVariationSequences(_) => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Resolves a variation of a glyph ID from two code points.
|
||||
///
|
||||
/// Returns `None`:
|
||||
/// - when glyph ID is `0`.
|
||||
/// - when format is not `UnicodeVariationSequences`.
|
||||
#[inline]
|
||||
pub fn glyph_variation_index(&self, code_point: u32, variation: u32) -> Option<GlyphVariationResult> {
|
||||
match self.format {
|
||||
Format::UnicodeVariationSequences(ref subtable) => {
|
||||
subtable.glyph_index(code_point, variation)
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Calls `f` for all codepoints contained in this subtable.
|
||||
///
|
||||
/// This is a low-level method and it doesn't check that the current
|
||||
/// encoding is Unicode. It simply calls the function `f` for all `u32`
|
||||
/// codepoints that are present in this subtable.
|
||||
///
|
||||
/// Note that this may list codepoints for which `glyph_index` still returns
|
||||
/// `None` because this method finds all codepoints which were _defined_ in
|
||||
/// this subtable. The subtable may still map them to glyph ID `0`.
|
||||
///
|
||||
/// Returns without doing anything:
|
||||
/// - when format is `MixedCoverage`, since it's not supported.
|
||||
/// - when format is `UnicodeVariationSequences`, since it's not supported.
|
||||
pub fn codepoints<F: FnMut(u32)>(&self, f: F) {
|
||||
match self.format {
|
||||
Format::ByteEncodingTable(ref subtable) => subtable.codepoints(f),
|
||||
Format::HighByteMappingThroughTable(ref subtable) => subtable.codepoints(f),
|
||||
Format::SegmentMappingToDeltaValues(ref subtable) => subtable.codepoints(f),
|
||||
Format::TrimmedTableMapping(ref subtable) => subtable.codepoints(f),
|
||||
Format::MixedCoverage => {} // unsupported
|
||||
Format::TrimmedArray(ref subtable) => subtable.codepoints(f),
|
||||
Format::SegmentedCoverage(ref subtable) => subtable.codepoints(f),
|
||||
Format::ManyToOneRangeMappings(ref subtable) => subtable.codepoints(f),
|
||||
Format::UnicodeVariationSequences(_) => {} // unsupported
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct EncodingRecord {
|
||||
platform_id: PlatformId,
|
||||
encoding_id: u16,
|
||||
offset: Offset32,
|
||||
}
|
||||
|
||||
impl FromData for EncodingRecord {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(EncodingRecord {
|
||||
platform_id: s.read::<PlatformId>()?,
|
||||
encoding_id: s.read::<u16>()?,
|
||||
offset: s.read::<Offset32>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of subtables.
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct Subtables<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, EncodingRecord>,
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtables<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtables {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Subtables<'a> {
|
||||
/// Returns a subtable at an index.
|
||||
pub fn get(&self, index: u16) -> Option<Subtable<'a>> {
|
||||
let record = self.records.get(index)?;
|
||||
let data = self.data.get(record.offset.to_usize()..)?;
|
||||
let format = match Stream::read_at::<u16>(data, 0)? {
|
||||
0 => Format::ByteEncodingTable(Subtable0::parse(data)?),
|
||||
2 => Format::HighByteMappingThroughTable(Subtable2::parse(data)?),
|
||||
4 => Format::SegmentMappingToDeltaValues(Subtable4::parse(data)?),
|
||||
6 => Format::TrimmedTableMapping(Subtable6::parse(data)?),
|
||||
8 => Format::MixedCoverage, // unsupported
|
||||
10 => Format::TrimmedArray(Subtable10::parse(data)?),
|
||||
12 => Format::SegmentedCoverage(Subtable12::parse(data)?),
|
||||
13 => Format::ManyToOneRangeMappings(Subtable13::parse(data)?),
|
||||
14 => Format::UnicodeVariationSequences(Subtable14::parse(data)?),
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
Some(Subtable {
|
||||
platform_id: record.platform_id,
|
||||
encoding_id: record.encoding_id,
|
||||
format,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the number of subtables.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Subtables<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
type IntoIter = SubtablesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SubtablesIter {
|
||||
subtables: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`Subtables`].
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct SubtablesIter<'a> {
|
||||
subtables: Subtables<'a>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SubtablesIter<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.subtables.len() {
|
||||
self.index += 1;
|
||||
self.subtables.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Character to Glyph Index Mapping Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/cmap).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of subtables.
|
||||
pub subtables: Subtables<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // version
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16::<EncodingRecord>(count)?;
|
||||
Some(Table { subtables: Subtables { data, records }})
|
||||
}
|
||||
}
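// A short end-to-end sketch (hypothetical bytes, not an upstream test): parse a tiny
// `cmap` with a single Unicode-platform format 6 subtable, pick a Unicode subtable
// and map a couple of characters.
#[cfg(test)]
mod table_usage_sketch {
    use super::*;

    #[test]
    fn pick_unicode_subtable() {
        let data = [
            0x00, 0x00, // version
            0x00, 0x01, // number of encoding records
            0x00, 0x00, // platform ID: Unicode
            0x00, 0x03, // encoding ID
            0x00, 0x00, 0x00, 0x0C, // offset to the subtable
            // The format 6 subtable itself.
            0x00, 0x06, // format
            0x00, 0x0E, // length
            0x00, 0x00, // language
            0x00, 0x41, // first code point: 'A'
            0x00, 0x02, // entry count
            0x00, 0x01, 0x00, 0x02, // glyph IDs
        ];
        let table = Table::parse(&data).unwrap();
        let subtable = table.subtables.into_iter().find(|s| s.is_unicode()).unwrap();
        assert_eq!(subtable.glyph_index(u32::from('A')), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(u32::from('B')), Some(GlyphId(2)));
    }
}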
|
|
@ -0,0 +1,180 @@
|
|||
//! A [Feature Name Table](
|
||||
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6feat.html) implementation.
|
||||
|
||||
use crate::parser::{FromData, LazyArray16, Offset, Offset32, Stream};
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct FeatureNameRecord {
|
||||
feature: u16,
|
||||
setting_table_records_count: u16,
|
||||
// Offset from the beginning of the table.
|
||||
setting_table_offset: Offset32,
|
||||
flags: u8,
|
||||
default_setting_index: u8,
|
||||
name_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for FeatureNameRecord {
|
||||
const SIZE: usize = 12;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(FeatureNameRecord {
|
||||
feature: s.read::<u16>()?,
|
||||
setting_table_records_count: s.read::<u16>()?,
|
||||
setting_table_offset: s.read::<Offset32>()?,
|
||||
flags: s.read::<u8>()?,
|
||||
default_setting_index: s.read::<u8>()?,
|
||||
name_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A setting name.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct SettingName {
|
||||
/// The setting.
|
||||
pub setting: u16,
|
||||
/// The `name` table index for the feature's name in a 256..32768 range.
|
||||
pub name_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for SettingName {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(SettingName {
|
||||
setting: s.read::<u16>()?,
|
||||
name_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A feature name.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct FeatureName<'a> {
|
||||
/// The feature's ID.
|
||||
pub feature: u16,
|
||||
/// The feature's setting names.
|
||||
pub setting_names: LazyArray16<'a, SettingName>,
|
||||
/// The index of the default setting in the `setting_names`.
|
||||
pub default_setting_index: u8,
|
||||
/// The feature's exclusive settings. If set, the feature settings are mutually exclusive.
|
||||
pub exclusive: bool,
|
||||
/// The `name` table index for the feature's name in a 256..32768 range.
|
||||
pub name_index: u16,
|
||||
}
|
||||
|
||||
|
||||
/// A list of feature names.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct FeatureNames<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, FeatureNameRecord>,
|
||||
}
|
||||
|
||||
impl<'a> FeatureNames<'a> {
|
||||
/// Returns a feature name at an index.
|
||||
pub fn get(&self, index: u16) -> Option<FeatureName<'a>> {
|
||||
let record = self.records.get(index)?;
|
||||
let data = self.data.get(record.setting_table_offset.to_usize()..)?;
|
||||
let mut s = Stream::new(data);
|
||||
let setting_names = s.read_array16::<SettingName>(record.setting_table_records_count)?;
|
||||
Some(FeatureName {
|
||||
feature: record.feature,
|
||||
setting_names,
|
||||
default_setting_index:
|
||||
if record.flags & 0x40 != 0 { record.default_setting_index } else { 0 },
|
||||
exclusive: record.flags & 0x80 != 0,
|
||||
name_index: record.name_index,
|
||||
})
|
||||
}
|
||||
|
||||
/// Finds a feature name by ID.
|
||||
pub fn find(&self, feature: u16) -> Option<FeatureName<'a>> {
|
||||
let index = self.records
|
||||
.binary_search_by(|name| name.feature.cmp(&feature)).map(|(i, _)| i)?;
|
||||
self.get(index)
|
||||
}
|
||||
|
||||
/// Returns the number of feature names.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> core::fmt::Debug for FeatureNames<'a> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_list().entries(self.into_iter()).finish()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for FeatureNames<'a> {
|
||||
type Item = FeatureName<'a>;
|
||||
type IntoIter = FeatureNamesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
FeatureNamesIter {
|
||||
names: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`FeatureNames`].
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct FeatureNamesIter<'a> {
|
||||
names: FeatureNames<'a>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for FeatureNamesIter<'a> {
|
||||
type Item = FeatureName<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.names.len() {
|
||||
self.index += 1;
|
||||
self.names.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Feature Name Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6feat.html).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of feature names. Sorted by `FeatureName.feature`.
|
||||
pub names: FeatureNames<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let count = s.read::<u16>()?;
|
||||
s.advance_checked(6)?; // reserved
|
||||
let records = s.read_array16::<FeatureNameRecord>(count)?;
|
||||
|
||||
Some(Table {
|
||||
names: FeatureNames {
|
||||
data,
|
||||
records,
|
||||
}
|
||||
})
|
||||
}
|
||||
}
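
// Hedged usage sketch, not part of the upstream sources: parse a raw `feat` table and
// count the setting names exposed across all features. Only items defined above are
// used; the helper is illustrative only.
fn count_feature_settings(feat_data: &[u8]) -> u32 {
    let mut total = 0u32;
    if let Some(table) = Table::parse(feat_data) {
        for feature in table.names {
            total += u32::from(feature.setting_names.len());
        }
    }
    total
}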
|
|
@@ -0,0 +1,97 @@
|
|||
//! A [Font Variations Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/fvar) implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::{Tag, NormalizedCoordinate};
|
||||
use crate::parser::{Stream, FromData, Fixed, Offset16, Offset, LazyArray16, f32_bound};
|
||||
|
||||
/// A [variation axis](https://docs.microsoft.com/en-us/typography/opentype/spec/fvar#variationaxisrecord).
|
||||
#[repr(C)]
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub struct VariationAxis {
|
||||
pub tag: Tag,
|
||||
pub min_value: f32,
|
||||
pub def_value: f32,
|
||||
pub max_value: f32,
|
||||
/// An axis name in the `name` table.
|
||||
pub name_id: u16,
|
||||
pub hidden: bool,
|
||||
}
|
||||
|
||||
impl FromData for VariationAxis {
|
||||
const SIZE: usize = 20;
|
||||
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let tag = s.read::<Tag>()?;
|
||||
let min_value = s.read::<Fixed>()?;
|
||||
let def_value = s.read::<Fixed>()?;
|
||||
let max_value = s.read::<Fixed>()?;
|
||||
let flags = s.read::<u16>()?;
|
||||
let name_id = s.read::<u16>()?;
|
||||
|
||||
Some(VariationAxis {
|
||||
tag,
|
||||
min_value: def_value.0.min(min_value.0),
|
||||
def_value: def_value.0,
|
||||
max_value: def_value.0.max(max_value.0),
|
||||
name_id,
|
||||
hidden: (flags >> 3) & 1 == 1,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl VariationAxis {
|
||||
/// Returns a normalized variation coordinate for this axis.
|
||||
pub(crate) fn normalized_value(&self, mut v: f32) -> NormalizedCoordinate {
|
||||
// Based on
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/avar#overview
|
||||
|
||||
v = f32_bound(self.min_value, v, self.max_value);
|
||||
if v == self.def_value {
|
||||
v = 0.0;
|
||||
} else if v < self.def_value {
|
||||
v = (v - self.def_value) / (self.def_value - self.min_value);
|
||||
} else {
|
||||
v = (v - self.def_value) / (self.max_value - self.def_value);
|
||||
}
|
||||
|
||||
NormalizedCoordinate::from(v)
|
||||
}
|
||||
}
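
// Worked example for the mapping above (illustrative numbers, not from a real font):
// for a weight axis with min_value = 100.0, def_value = 400.0, max_value = 900.0,
//   normalized_value(700.0) = (700 - 400) / (900 - 400) =  0.6
//   normalized_value(250.0) = (250 - 400) / (400 - 100) = -0.5
//   normalized_value(400.0) = 0.0, since the value equals the default.
// Inputs outside [min_value, max_value] are clamped first by `f32_bound`.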
|
||||
|
||||
|
||||
/// A [Font Variations Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/fvar).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of variation axes.
|
||||
pub axes: LazyArray16<'a, VariationAxis>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let axes_array_offset = s.read::<Offset16>()?;
|
||||
s.skip::<u16>(); // reserved
|
||||
let axis_count = s.read::<u16>()?;
|
||||
|
||||
// 'If axisCount is zero, then the font is not functional as a variable font,
|
||||
// and must be treated as a non-variable font;
|
||||
// any variation-specific tables or data is ignored.'
|
||||
let axis_count = NonZeroU16::new(axis_count)?;
|
||||
|
||||
let mut s = Stream::new_at(data, axes_array_offset.to_usize())?;
|
||||
let axes = s.read_array16::<VariationAxis>(axis_count.get())?;
|
||||
|
||||
Some(Table { axes })
|
||||
}
|
||||
}
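
// Hedged usage sketch, not part of the upstream sources: parse a raw `fvar` table and
// look up the weight axis by tag. `Tag::from_bytes` is assumed to be the crate's tag
// constructor; if its name differs, adjust accordingly.
fn weight_axis(fvar_data: &[u8]) -> Option<VariationAxis> {
    let table = Table::parse(fvar_data)?;
    table
        .axes
        .into_iter()
        .find(|axis| axis.tag == Tag::from_bytes(b"wght"))
}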
|
|
@@ -0,0 +1,194 @@
|
|||
//! A [Glyph Definition Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/gdef) implementation.
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::opentype_layout::{Class, ClassDefinition, Coverage};
|
||||
use crate::parser::{LazyArray16, Offset, Offset16, Offset32, Stream, FromSlice};
|
||||
|
||||
#[cfg(feature = "variable-fonts")] use crate::NormalizedCoordinate;
|
||||
#[cfg(feature = "variable-fonts")] use crate::var_store::ItemVariationStore;
|
||||
|
||||
|
||||
/// A [glyph class](https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Debug, Hash)]
|
||||
pub enum GlyphClass {
|
||||
Base = 1,
|
||||
Ligature = 2,
|
||||
Mark = 3,
|
||||
Component = 4,
|
||||
}
|
||||
|
||||
|
||||
/// A [Glyph Definition Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gdef).
|
||||
#[allow(missing_debug_implementations)]
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct Table<'a> {
|
||||
glyph_classes: Option<ClassDefinition<'a>>,
|
||||
mark_attach_classes: Option<ClassDefinition<'a>>,
|
||||
mark_glyph_coverage_offsets: Option<(&'a [u8], LazyArray16<'a, Offset32>)>,
|
||||
#[cfg(feature = "variable-fonts")] variation_store: Option<ItemVariationStore<'a>>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u32>()?;
|
||||
if !(version == 0x00010000 || version == 0x00010002 || version == 0x00010003) {
|
||||
return None;
|
||||
}
|
||||
|
||||
let glyph_class_def_offset = s.read::<Option<Offset16>>()?;
|
||||
s.skip::<Offset16>(); // attachListOffset
|
||||
s.skip::<Offset16>(); // ligCaretListOffset
|
||||
let mark_attach_class_def_offset = s.read::<Option<Offset16>>()?;
|
||||
|
||||
let mut mark_glyph_sets_def_offset: Option<Offset16> = None;
|
||||
if version > 0x00010000 {
|
||||
mark_glyph_sets_def_offset = s.read::<Option<Offset16>>()?;
|
||||
}
|
||||
|
||||
#[allow(unused_mut)]
|
||||
#[allow(unused_variables)]
|
||||
let mut var_store_offset: Option<Offset32> = None;
|
||||
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
{
|
||||
if version > 0x00010002 {
|
||||
var_store_offset = s.read::<Option<Offset32>>()?;
|
||||
}
|
||||
}
|
||||
|
||||
let mut table = Table::default();
|
||||
|
||||
if let Some(offset) = glyph_class_def_offset {
|
||||
|
||||
if let Some(subdata) = data.get(offset.to_usize()..) {
|
||||
table.glyph_classes = ClassDefinition::parse(subdata);
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(offset) = mark_attach_class_def_offset {
|
||||
if let Some(subdata) = data.get(offset.to_usize()..) {
|
||||
table.mark_attach_classes = ClassDefinition::parse(subdata);
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(offset) = mark_glyph_sets_def_offset {
|
||||
if let Some(subdata) = data.get(offset.to_usize()..) {
|
||||
let mut s = Stream::new(subdata);
|
||||
let format = s.read::<u16>()?;
|
||||
if format == 1 {
|
||||
if let Some(count) = s.read::<u16>() {
|
||||
if let Some(array) = s.read_array16::<Offset32>(count) {
|
||||
table.mark_glyph_coverage_offsets = Some((subdata, array));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
{
|
||||
if let Some(offset) = var_store_offset {
|
||||
if let Some(subdata) = data.get(offset.to_usize()..) {
|
||||
let s = Stream::new(subdata);
|
||||
table.variation_store = ItemVariationStore::parse(s);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Some(table)
|
||||
}
|
||||
|
||||
/// Checks that the face has a
|
||||
/// [Glyph Class Definition Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table).
|
||||
#[inline]
|
||||
pub fn has_glyph_classes(&self) -> bool {
|
||||
self.glyph_classes.is_some()
|
||||
}
|
||||
|
||||
/// Returns glyph's class according to
|
||||
/// [Glyph Class Definition Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#glyph-class-definition-table).
|
||||
///
|
||||
/// Returns `None` when *Glyph Class Definition Table* is not set
|
||||
/// or glyph class is not set or invalid.
|
||||
#[inline]
|
||||
pub fn glyph_class(&self, glyph_id: GlyphId) -> Option<GlyphClass> {
|
||||
match self.glyph_classes?.get(glyph_id) {
|
||||
1 => Some(GlyphClass::Base),
|
||||
2 => Some(GlyphClass::Ligature),
|
||||
3 => Some(GlyphClass::Mark),
|
||||
4 => Some(GlyphClass::Component),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns glyph's mark attachment class according to
|
||||
/// [Mark Attachment Class Definition Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#mark-attachment-class-definition-table).
|
||||
///
|
||||
/// All glyphs not assigned to a class fall into Class 0.
|
||||
#[inline]
|
||||
pub fn glyph_mark_attachment_class(&self, glyph_id: GlyphId) -> Class {
|
||||
self.mark_attach_classes
|
||||
.map(|def| def.get(glyph_id))
|
||||
.unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Checks that the glyph is a mark according to
|
||||
/// [Mark Glyph Sets Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#mark-glyph-sets-table).
|
||||
///
|
||||
/// `set_index` allows checking a specific glyph coverage set.
|
||||
/// Otherwise all sets will be checked.
|
||||
#[inline]
|
||||
pub fn is_mark_glyph(&self, glyph_id: GlyphId, set_index: Option<u16>) -> bool {
|
||||
is_mark_glyph_impl(self, glyph_id, set_index).is_some()
|
||||
}
|
||||
|
||||
/// Returns glyph's variation delta at a specified index according to
|
||||
/// [Item Variation Store Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gdef#item-variation-store-table).
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
#[inline]
|
||||
pub fn glyph_variation_delta(
|
||||
&self,
|
||||
outer_index: u16,
|
||||
inner_index: u16,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
) -> Option<f32> {
|
||||
self.variation_store
|
||||
.and_then(|store| store.parse_delta(outer_index, inner_index, coordinates))
|
||||
}
|
||||
}
|
||||
|
||||
#[inline(never)]
|
||||
fn is_mark_glyph_impl(
|
||||
table: &Table,
|
||||
glyph_id: GlyphId,
|
||||
set_index: Option<u16>,
|
||||
) -> Option<()> {
|
||||
let (data, offsets) = table.mark_glyph_coverage_offsets?;
|
||||
|
||||
if let Some(set_index) = set_index {
|
||||
if let Some(offset) = offsets.get(set_index) {
|
||||
let table = Coverage::parse(data.get(offset.to_usize()..)?)?;
|
||||
if table.contains(glyph_id) {
|
||||
return Some(());
|
||||
}
|
||||
}
|
||||
} else {
|
||||
for offset in offsets {
|
||||
let table = Coverage::parse(data.get(offset.to_usize()..)?)?;
|
||||
if table.contains(glyph_id) {
|
||||
return Some(());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
None
|
||||
}
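
// Hedged usage sketch, not part of the upstream sources: parse a raw `GDEF` table and
// check whether a glyph is classified as a mark. `Table::parse`, `glyph_class` and
// `GlyphClass` are all defined above.
fn is_mark(gdef_data: &[u8], glyph_id: GlyphId) -> bool {
    Table::parse(gdef_data)
        .and_then(|table| table.glyph_class(glyph_id))
        .map_or(false, |class| class == GlyphClass::Mark)
}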
|
|
@@ -0,0 +1,677 @@
|
|||
//! A [Glyph Data Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/glyf) implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::parser::{Stream, F2DOT14, LazyArray16, NumFrom};
|
||||
use crate::{loca, GlyphId, OutlineBuilder, Rect, BBox};
|
||||
|
||||
pub(crate) struct Builder<'a> {
|
||||
pub builder: &'a mut dyn OutlineBuilder,
|
||||
pub transform: Transform,
|
||||
is_default_ts: bool, // `bool` is faster than `Option` or `is_default`.
|
||||
// We have to always calculate the bbox, because `gvar` doesn't store one
|
||||
// and because the bbox stored in `glyf` may be malformed.
|
||||
pub bbox: BBox,
|
||||
first_on_curve: Option<Point>,
|
||||
first_off_curve: Option<Point>,
|
||||
last_off_curve: Option<Point>,
|
||||
}
|
||||
|
||||
impl<'a> Builder<'a> {
|
||||
#[inline]
|
||||
pub fn new(
|
||||
transform: Transform,
|
||||
bbox: BBox,
|
||||
builder: &'a mut dyn OutlineBuilder,
|
||||
) -> Self {
|
||||
Builder {
|
||||
builder,
|
||||
transform,
|
||||
is_default_ts: transform.is_default(),
|
||||
bbox,
|
||||
first_on_curve: None,
|
||||
first_off_curve: None,
|
||||
last_off_curve: None,
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn move_to(&mut self, mut x: f32, mut y: f32) {
|
||||
if !self.is_default_ts {
|
||||
self.transform.apply_to(&mut x, &mut y);
|
||||
}
|
||||
|
||||
self.bbox.extend_by(x, y);
|
||||
|
||||
self.builder.move_to(x, y);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn line_to(&mut self, mut x: f32, mut y: f32) {
|
||||
if !self.is_default_ts {
|
||||
self.transform.apply_to(&mut x, &mut y);
|
||||
}
|
||||
|
||||
self.bbox.extend_by(x, y);
|
||||
|
||||
self.builder.line_to(x, y);
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn quad_to(&mut self, mut x1: f32, mut y1: f32, mut x: f32, mut y: f32) {
|
||||
if !self.is_default_ts {
|
||||
self.transform.apply_to(&mut x1, &mut y1);
|
||||
self.transform.apply_to(&mut x, &mut y);
|
||||
}
|
||||
|
||||
self.bbox.extend_by(x1, y1);
|
||||
self.bbox.extend_by(x, y);
|
||||
|
||||
self.builder.quad_to(x1, y1, x, y);
|
||||
}
|
||||
|
||||
// Useful links:
|
||||
//
|
||||
// - https://developer.apple.com/fonts/TrueType-Reference-Manual/RM01/Chap1.html
|
||||
// - https://stackoverflow.com/a/20772557
|
||||
#[inline]
|
||||
pub fn push_point(&mut self, x: f32, y: f32, on_curve_point: bool, last_point: bool) {
|
||||
let p = Point { x, y };
|
||||
if self.first_on_curve.is_none() {
|
||||
if on_curve_point {
|
||||
self.first_on_curve = Some(p);
|
||||
self.move_to(p.x, p.y);
|
||||
} else {
|
||||
if let Some(offcurve) = self.first_off_curve {
|
||||
let mid = offcurve.lerp(p, 0.5);
|
||||
self.first_on_curve = Some(mid);
|
||||
self.last_off_curve = Some(p);
|
||||
self.move_to(mid.x, mid.y);
|
||||
} else {
|
||||
self.first_off_curve = Some(p);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
match (self.last_off_curve, on_curve_point) {
|
||||
(Some(offcurve), true) => {
|
||||
self.last_off_curve = None;
|
||||
self.quad_to(offcurve.x, offcurve.y, p.x, p.y);
|
||||
}
|
||||
(Some(offcurve), false) => {
|
||||
self.last_off_curve = Some(p);
|
||||
let mid = offcurve.lerp(p, 0.5);
|
||||
self.quad_to(offcurve.x, offcurve.y, mid.x, mid.y);
|
||||
}
|
||||
(None, true) => {
|
||||
self.line_to(p.x, p.y);
|
||||
}
|
||||
(None, false) => {
|
||||
self.last_off_curve = Some(p);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if last_point {
|
||||
self.finish_contour();
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn finish_contour(&mut self) {
|
||||
if let (Some(offcurve1), Some(offcurve2)) = (self.first_off_curve, self.last_off_curve) {
|
||||
self.last_off_curve = None;
|
||||
let mid = offcurve2.lerp(offcurve1, 0.5);
|
||||
self.quad_to(offcurve2.x, offcurve2.y, mid.x, mid.y);
|
||||
}
|
||||
|
||||
if let (Some(p), Some(offcurve1)) = (self.first_on_curve, self.first_off_curve) {
|
||||
self.quad_to(offcurve1.x, offcurve1.y, p.x, p.y);
|
||||
} else if let (Some(p), Some(offcurve2)) = (self.first_on_curve, self.last_off_curve) {
|
||||
self.quad_to(offcurve2.x, offcurve2.y, p.x, p.y);
|
||||
} else if let Some(p) = self.first_on_curve {
|
||||
self.line_to(p.x, p.y);
|
||||
}
|
||||
|
||||
self.first_on_curve = None;
|
||||
self.first_off_curve = None;
|
||||
self.last_off_curve = None;
|
||||
|
||||
self.builder.close();
|
||||
}
|
||||
}
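
// Illustrative note for the off-curve handling above (TrueType quadratic outlines):
// when two consecutive points are both off-curve, an on-curve point is implied at
// their midpoint. For example, off-curve points (0, 0) and (10, 4) imply an on-curve
// point at (5, 2), which is exactly what `offcurve.lerp(p, 0.5)` computes.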
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
pub(crate) struct Transform {
|
||||
pub a: f32, pub b: f32, pub c: f32,
|
||||
pub d: f32, pub e: f32, pub f: f32,
|
||||
}
|
||||
|
||||
impl Transform {
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
#[inline]
|
||||
pub fn new_translate(tx: f32, ty: f32) -> Self {
|
||||
Transform { a: 1.0, b: 0.0, c: 0.0, d: 1.0, e: tx, f: ty }
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn combine(ts1: Self, ts2: Self) -> Self {
|
||||
Transform {
|
||||
a: ts1.a * ts2.a + ts1.c * ts2.b,
|
||||
b: ts1.b * ts2.a + ts1.d * ts2.b,
|
||||
c: ts1.a * ts2.c + ts1.c * ts2.d,
|
||||
d: ts1.b * ts2.c + ts1.d * ts2.d,
|
||||
e: ts1.a * ts2.e + ts1.c * ts2.f + ts1.e,
|
||||
f: ts1.b * ts2.e + ts1.d * ts2.f + ts1.f,
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn apply_to(&self, x: &mut f32, y: &mut f32) {
|
||||
let tx = *x;
|
||||
let ty = *y;
|
||||
*x = self.a * tx + self.c * ty + self.e;
|
||||
*y = self.b * tx + self.d * ty + self.f;
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn is_default(&self) -> bool {
|
||||
// A direct float comparison is fine in our case.
|
||||
self.a == 1.0
|
||||
&& self.b == 0.0
|
||||
&& self.c == 0.0
|
||||
&& self.d == 1.0
|
||||
&& self.e == 0.0
|
||||
&& self.f == 0.0
|
||||
}
|
||||
}
|
||||
|
||||
impl Default for Transform {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
Transform { a: 1.0, b: 0.0, c: 0.0, d: 1.0, e: 0.0, f: 0.0 }
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Transform {
|
||||
#[inline]
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Transform({} {} {} {} {} {})", self.a, self.b, self.c, self.d, self.e, self.f)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct CompositeGlyphInfo {
|
||||
pub glyph_id: GlyphId,
|
||||
pub transform: Transform,
|
||||
#[allow(dead_code)] pub flags: CompositeGlyphFlags,
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone)]
|
||||
pub(crate) struct CompositeGlyphIter<'a> {
|
||||
stream: Stream<'a>,
|
||||
}
|
||||
|
||||
impl<'a> CompositeGlyphIter<'a> {
|
||||
#[inline]
|
||||
pub fn new(data: &'a [u8]) -> Self {
|
||||
CompositeGlyphIter { stream: Stream::new(data) }
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Iterator for CompositeGlyphIter<'a> {
|
||||
type Item = CompositeGlyphInfo;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
let flags = CompositeGlyphFlags(self.stream.read::<u16>()?);
|
||||
let glyph_id = self.stream.read::<GlyphId>()?;
|
||||
|
||||
let mut ts = Transform::default();
|
||||
|
||||
if flags.args_are_xy_values() {
|
||||
if flags.arg_1_and_2_are_words() {
|
||||
ts.e = f32::from(self.stream.read::<i16>()?);
|
||||
ts.f = f32::from(self.stream.read::<i16>()?);
|
||||
} else {
|
||||
ts.e = f32::from(self.stream.read::<i8>()?);
|
||||
ts.f = f32::from(self.stream.read::<i8>()?);
|
||||
}
|
||||
}
|
||||
|
||||
if flags.we_have_a_two_by_two() {
|
||||
ts.a = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
ts.b = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
ts.c = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
ts.d = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
} else if flags.we_have_an_x_and_y_scale() {
|
||||
ts.a = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
ts.d = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
} else if flags.we_have_a_scale() {
|
||||
ts.a = self.stream.read::<F2DOT14>()?.to_f32();
|
||||
ts.d = ts.a;
|
||||
}
|
||||
|
||||
if !flags.more_components() {
|
||||
// Finish the iterator even if stream still has some data.
|
||||
self.stream.jump_to_end();
|
||||
}
|
||||
|
||||
Some(CompositeGlyphInfo {
|
||||
glyph_id,
|
||||
transform: ts,
|
||||
flags,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
// Due to some optimization magic, using f32 instead of i16
|
||||
// makes the code ~10% slower. At least on my machine.
|
||||
// I guess it's due to the fact that with i16 the struct
|
||||
// fits into the machine word.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct GlyphPoint {
|
||||
pub x: i16,
|
||||
pub y: i16,
|
||||
/// Indicates that a point is a point on curve
|
||||
/// and not a control point.
|
||||
pub on_curve_point: bool,
|
||||
pub last_point: bool,
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Default)]
|
||||
pub(crate) struct GlyphPointsIter<'a> {
|
||||
endpoints: EndpointsIter<'a>,
|
||||
flags: FlagsIter<'a>,
|
||||
x_coords: CoordsIter<'a>,
|
||||
y_coords: CoordsIter<'a>,
|
||||
pub points_left: u16, // Number of points left in the glyph.
|
||||
}
|
||||
|
||||
#[cfg(feature = "variable-fonts")]
|
||||
impl GlyphPointsIter<'_> {
|
||||
#[inline]
|
||||
pub fn current_contour(&self) -> u16 {
|
||||
self.endpoints.index - 1
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Iterator for GlyphPointsIter<'a> {
|
||||
type Item = GlyphPoint;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
self.points_left = self.points_left.checked_sub(1)?;
|
||||
|
||||
// TODO: skip empty contours
|
||||
|
||||
let last_point = self.endpoints.next();
|
||||
let flags = self.flags.next()?;
|
||||
Some(GlyphPoint {
|
||||
x: self.x_coords.next(flags.x_short(), flags.x_is_same_or_positive_short()),
|
||||
y: self.y_coords.next(flags.y_short(), flags.y_is_same_or_positive_short()),
|
||||
on_curve_point: flags.on_curve_point(),
|
||||
last_point,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A simple flattening iterator for glyph's endpoints.
|
||||
///
|
||||
/// Translates endpoints like: 2 4 7
|
||||
/// into flags: 0 0 1 0 1 0 0 1
|
||||
#[derive(Clone, Copy, Default)]
|
||||
struct EndpointsIter<'a> {
|
||||
endpoints: LazyArray16<'a, u16>, // Each endpoint indicates a contour end.
|
||||
index: u16,
|
||||
left: u16,
|
||||
}
|
||||
|
||||
impl<'a> EndpointsIter<'a> {
|
||||
#[inline]
|
||||
fn new(endpoints: LazyArray16<'a, u16>) -> Option<Self> {
|
||||
Some(EndpointsIter {
|
||||
endpoints,
|
||||
index: 1,
|
||||
left: endpoints.get(0)?,
|
||||
})
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> bool {
|
||||
if self.left == 0 {
|
||||
if let Some(end) = self.endpoints.get(self.index) {
|
||||
let prev = self.endpoints.get(self.index - 1).unwrap_or(0);
|
||||
// Malformed font can have endpoints not in increasing order,
|
||||
// so we have to use checked_sub.
|
||||
self.left = end.checked_sub(prev).unwrap_or(0);
|
||||
self.left = self.left.checked_sub(1).unwrap_or(0);
|
||||
}
|
||||
|
||||
// Always advance the index, so we can check the current contour number.
|
||||
if let Some(n) = self.index.checked_add(1) {
|
||||
self.index = n;
|
||||
}
|
||||
|
||||
true
|
||||
} else {
|
||||
self.left -= 1;
|
||||
false
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Default)]
|
||||
struct FlagsIter<'a> {
|
||||
stream: Stream<'a>,
|
||||
// Number of times the `flags` should be used
|
||||
// before reading the next one from `stream`.
|
||||
repeats: u8,
|
||||
flags: SimpleGlyphFlags,
|
||||
}
|
||||
|
||||
impl<'a> FlagsIter<'a> {
|
||||
#[inline]
|
||||
fn new(data: &'a [u8]) -> Self {
|
||||
FlagsIter {
|
||||
stream: Stream::new(data),
|
||||
repeats: 0,
|
||||
flags: SimpleGlyphFlags(0),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Iterator for FlagsIter<'a> {
|
||||
type Item = SimpleGlyphFlags;
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.repeats == 0 {
|
||||
self.flags = SimpleGlyphFlags(self.stream.read::<u8>().unwrap_or(0));
|
||||
if self.flags.repeat_flag() {
|
||||
self.repeats = self.stream.read::<u8>().unwrap_or(0);
|
||||
}
|
||||
} else {
|
||||
self.repeats -= 1;
|
||||
}
|
||||
|
||||
Some(self.flags)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Default)]
|
||||
struct CoordsIter<'a> {
|
||||
stream: Stream<'a>,
|
||||
prev: i16, // Points are stored as deltas, so we have to keep the previous one.
|
||||
}
|
||||
|
||||
impl<'a> CoordsIter<'a> {
|
||||
#[inline]
|
||||
fn new(data: &'a [u8]) -> Self {
|
||||
CoordsIter {
|
||||
stream: Stream::new(data),
|
||||
prev: 0,
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self, is_short: bool, is_same_or_short: bool) -> i16 {
|
||||
// See https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#simple-glyph-description
|
||||
// for details about Simple Glyph Flags processing.
|
||||
|
||||
// We've already checked the coords data, so it's safe to fall back to 0.
|
||||
|
||||
let mut n = 0;
|
||||
if is_short {
|
||||
n = i16::from(self.stream.read::<u8>().unwrap_or(0));
|
||||
if !is_same_or_short {
|
||||
n = -n;
|
||||
}
|
||||
} else if !is_same_or_short {
|
||||
n = self.stream.read::<i16>().unwrap_or(0);
|
||||
}
|
||||
|
||||
self.prev = self.prev.wrapping_add(n);
|
||||
self.prev
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct Point {
|
||||
x: f32,
|
||||
y: f32,
|
||||
}
|
||||
|
||||
impl Point {
|
||||
#[inline]
|
||||
fn lerp(self, other: Point, t: f32) -> Point {
|
||||
Point {
|
||||
x: self.x + t * (other.x - self.x),
|
||||
y: self.y + t * (other.y - self.y),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#simple-glyph-description
|
||||
#[derive(Clone, Copy, Default)]
|
||||
struct SimpleGlyphFlags(u8);
|
||||
|
||||
impl SimpleGlyphFlags {
|
||||
#[inline] fn on_curve_point(self) -> bool { self.0 & 0x01 != 0 }
|
||||
#[inline] fn x_short(self) -> bool { self.0 & 0x02 != 0 }
|
||||
#[inline] fn y_short(self) -> bool { self.0 & 0x04 != 0 }
|
||||
#[inline] fn repeat_flag(self) -> bool { self.0 & 0x08 != 0 }
|
||||
#[inline] fn x_is_same_or_positive_short(self) -> bool { self.0 & 0x10 != 0 }
|
||||
#[inline] fn y_is_same_or_positive_short(self) -> bool { self.0 & 0x20 != 0 }
|
||||
}
|
||||
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/glyf#composite-glyph-description
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub(crate) struct CompositeGlyphFlags(u16);
|
||||
|
||||
impl CompositeGlyphFlags {
|
||||
#[inline] pub fn arg_1_and_2_are_words(self) -> bool { self.0 & 0x0001 != 0 }
|
||||
#[inline] pub fn args_are_xy_values(self) -> bool { self.0 & 0x0002 != 0 }
|
||||
#[inline] pub fn we_have_a_scale(self) -> bool { self.0 & 0x0008 != 0 }
|
||||
#[inline] pub fn more_components(self) -> bool { self.0 & 0x0020 != 0 }
|
||||
#[inline] pub fn we_have_an_x_and_y_scale(self) -> bool { self.0 & 0x0040 != 0 }
|
||||
#[inline] pub fn we_have_a_two_by_two(self) -> bool { self.0 & 0x0080 != 0 }
|
||||
}
|
||||
|
||||
|
||||
// It's not defined in the spec, so we are using our own value.
|
||||
pub(crate) const MAX_COMPONENTS: u8 = 32;
|
||||
|
||||
#[inline]
|
||||
fn outline_impl(
|
||||
loca_table: loca::Table,
|
||||
glyf_table: &[u8],
|
||||
data: &[u8],
|
||||
depth: u8,
|
||||
builder: &mut Builder,
|
||||
) -> Option<Option<Rect>> {
|
||||
if depth >= MAX_COMPONENTS {
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
let number_of_contours = s.read::<i16>()?;
|
||||
s.advance(8); // Skip bbox. We use calculated one.
|
||||
|
||||
if number_of_contours > 0 {
|
||||
// Simple glyph.
|
||||
|
||||
// u16 casting is safe, since we already checked that the value is positive.
|
||||
let number_of_contours = NonZeroU16::new(number_of_contours as u16)?;
|
||||
for point in parse_simple_outline(s.tail()?, number_of_contours)? {
|
||||
builder.push_point(f32::from(point.x), f32::from(point.y),
|
||||
point.on_curve_point, point.last_point);
|
||||
}
|
||||
} else if number_of_contours < 0 {
|
||||
// Composite glyph.
|
||||
for comp in CompositeGlyphIter::new(s.tail()?) {
|
||||
if let Some(range) = loca_table.glyph_range(comp.glyph_id) {
|
||||
if let Some(glyph_data) = glyf_table.get(range) {
|
||||
let transform = Transform::combine(builder.transform, comp.transform);
|
||||
let mut b = Builder::new(transform, builder.bbox, builder.builder);
|
||||
outline_impl(loca_table, glyf_table, glyph_data, depth + 1, &mut b)?;
|
||||
|
||||
// Take updated bbox.
|
||||
builder.bbox = b.bbox;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if builder.bbox.is_default() {
|
||||
return Some(None);
|
||||
}
|
||||
|
||||
Some(builder.bbox.to_rect())
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub(crate) fn parse_simple_outline(
|
||||
glyph_data: &[u8],
|
||||
number_of_contours: NonZeroU16,
|
||||
) -> Option<GlyphPointsIter> {
|
||||
let mut s = Stream::new(glyph_data);
|
||||
let endpoints = s.read_array16::<u16>(number_of_contours.get())?;
|
||||
|
||||
let points_total = endpoints.last()?.checked_add(1)?;
|
||||
|
||||
// Contours with a single point should be ignored.
|
||||
// But this is not an error, so we should return an "empty" iterator.
|
||||
if points_total == 1 {
|
||||
return Some(GlyphPointsIter::default());
|
||||
}
|
||||
|
||||
// Skip instructions byte code.
|
||||
let instructions_len = s.read::<u16>()?;
|
||||
s.advance(usize::from(instructions_len));
|
||||
|
||||
let flags_offset = s.offset();
|
||||
let (x_coords_len, y_coords_len) = resolve_coords_len(&mut s, points_total)?;
|
||||
let x_coords_offset = s.offset();
|
||||
let y_coords_offset = x_coords_offset + usize::num_from(x_coords_len);
|
||||
let y_coords_end = y_coords_offset + usize::num_from(y_coords_len);
|
||||
|
||||
Some(GlyphPointsIter {
|
||||
endpoints: EndpointsIter::new(endpoints)?,
|
||||
flags: FlagsIter::new(glyph_data.get(flags_offset..x_coords_offset)?),
|
||||
x_coords: CoordsIter::new(glyph_data.get(x_coords_offset..y_coords_offset)?),
|
||||
y_coords: CoordsIter::new(glyph_data.get(y_coords_offset..y_coords_end)?),
|
||||
points_left: points_total,
|
||||
})
|
||||
}
|
||||
|
||||
/// Resolves coordinate arrays length.
|
||||
///
|
||||
/// The length depends on *Simple Glyph Flags*, so we have to process them all to find it.
|
||||
fn resolve_coords_len(
|
||||
s: &mut Stream,
|
||||
points_total: u16,
|
||||
) -> Option<(u32, u32)> {
|
||||
let mut flags_left = u32::from(points_total);
|
||||
let mut repeats;
|
||||
let mut x_coords_len = 0;
|
||||
let mut y_coords_len = 0;
|
||||
while flags_left > 0 {
|
||||
let flags = SimpleGlyphFlags(s.read::<u8>()?);
|
||||
|
||||
// The number of times a glyph point repeats.
|
||||
repeats = if flags.repeat_flag() {
|
||||
let repeats = s.read::<u8>()?;
|
||||
u32::from(repeats) + 1
|
||||
} else {
|
||||
1
|
||||
};
|
||||
|
||||
if repeats > flags_left {
|
||||
return None;
|
||||
}
|
||||
|
||||
// No need to check for `*_coords_len` overflow since u32 is more than enough.
|
||||
|
||||
// Non-obfuscated code below.
|
||||
// Branchless version is surprisingly faster.
|
||||
//
|
||||
// if flags.x_short() {
|
||||
// // Coordinate is 1 byte long.
|
||||
// x_coords_len += repeats;
|
||||
// } else if !flags.x_is_same_or_positive_short() {
|
||||
// // Coordinate is 2 bytes long.
|
||||
// x_coords_len += repeats * 2;
|
||||
// }
|
||||
// if flags.y_short() {
|
||||
// // Coordinate is 1 byte long.
|
||||
// y_coords_len += repeats;
|
||||
// } else if !flags.y_is_same_or_positive_short() {
|
||||
// // Coordinate is 2 bytes long.
|
||||
// y_coords_len += repeats * 2;
|
||||
// }
|
||||
|
||||
x_coords_len += (flags.0 & 0x02 != 0) as u32 * repeats;
|
||||
x_coords_len += (flags.0 & (0x02 | 0x10) == 0) as u32 * (repeats * 2);
|
||||
|
||||
y_coords_len += (flags.0 & 0x04 != 0) as u32 * repeats;
|
||||
y_coords_len += (flags.0 & (0x04 | 0x20) == 0) as u32 * (repeats * 2);
|
||||
|
||||
|
||||
flags_left -= repeats;
|
||||
}
|
||||
|
||||
Some((x_coords_len, y_coords_len))
|
||||
}
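
// Worked example for the branchless length counting above (illustrative flag byte):
// a flag byte of 0x13 sets ON_CURVE_POINT (0x01), X_SHORT_VECTOR (0x02) and
// X_IS_SAME_OR_POSITIVE_X_SHORT_VECTOR (0x10), so with no repeat it contributes
// 1 byte to `x_coords_len` (short X) and 2 bytes to `y_coords_len`
// (neither Y flag is set, so Y is a full i16).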
|
||||
|
||||
|
||||
/// A [Glyph Data Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/glyf).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
pub(crate) data: &'a [u8],
|
||||
loca_table: loca::Table<'a>,
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
#[inline]
|
||||
pub fn parse(loca_table: loca::Table<'a>, data: &'a [u8]) -> Option<Self> {
|
||||
Some(Table { loca_table, data })
|
||||
}
|
||||
|
||||
/// Outlines a glyph.
|
||||
#[inline]
|
||||
pub fn outline(
|
||||
&self,
|
||||
glyph_id: GlyphId,
|
||||
builder: &mut dyn OutlineBuilder,
|
||||
) -> Option<Rect> {
|
||||
let mut b = Builder::new(Transform::default(), BBox::new(), builder);
|
||||
let glyph_data = self.get(glyph_id)?;
|
||||
outline_impl(self.loca_table, self.data, glyph_data, 0, &mut b)?
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub(crate) fn get(&self, glyph_id: GlyphId) -> Option<&'a [u8]> {
|
||||
let range = self.loca_table.glyph_range(glyph_id)?;
|
||||
self.data.get(range)
|
||||
}
|
||||
}
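
// Hedged usage sketch, not part of the upstream sources: count the segments produced
// while outlining a glyph. `OutlineBuilder` is the crate's public outlining trait
// (imported at the top of this module); the counter implements it just for illustration.
struct SegmentCounter {
    segments: usize,
}

impl OutlineBuilder for SegmentCounter {
    fn move_to(&mut self, _x: f32, _y: f32) {}
    fn line_to(&mut self, _x: f32, _y: f32) { self.segments += 1; }
    fn quad_to(&mut self, _x1: f32, _y1: f32, _x: f32, _y: f32) { self.segments += 1; }
    fn curve_to(&mut self, _x1: f32, _y1: f32, _x2: f32, _y2: f32, _x: f32, _y: f32) { self.segments += 1; }
    fn close(&mut self) {}
}

fn count_segments(glyf: Table<'_>, glyph_id: GlyphId) -> usize {
    let mut counter = SegmentCounter { segments: 0 };
    let _ = glyf.outline(glyph_id, &mut counter);
    counter.segments
}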
|
|
@@ -0,0 +1,953 @@
|
|||
//! A [Glyph Positioning Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos)
|
||||
//! implementation.
|
||||
|
||||
// A heavily modified port of https://github.com/RazrFalcon/rustybuzz implementation
|
||||
// originally written by https://github.com/laurmaedje
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::opentype_layout::{Class, ClassDefinition, ContextLookup, Coverage, LookupSubtable};
|
||||
use crate::opentype_layout::ChainedContextLookup;
|
||||
use crate::parser::{FromData, FromSlice, LazyArray16, LazyArray32, NumFrom, Offset, Offset16, Stream};
|
||||
|
||||
/// A [Device Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls)
|
||||
/// hinting values.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct HintingDevice<'a> {
|
||||
start_size: u16,
|
||||
end_size: u16,
|
||||
delta_format: u16,
|
||||
delta_values: LazyArray16<'a, u16>,
|
||||
}
|
||||
|
||||
impl HintingDevice<'_> {
|
||||
/// Returns X-axis delta.
|
||||
pub fn x_delta(&self, units_per_em: u16, pixels_per_em: Option<(u16, u16)>) -> Option<i32> {
|
||||
let ppem = pixels_per_em.map(|(x, _)| x)?;
|
||||
self.get_delta(ppem, units_per_em)
|
||||
}
|
||||
|
||||
/// Returns Y-axis delta.
|
||||
pub fn y_delta(&self, units_per_em: u16, pixels_per_em: Option<(u16, u16)>) -> Option<i32> {
|
||||
let ppem = pixels_per_em.map(|(_, y)| y)?;
|
||||
self.get_delta(ppem, units_per_em)
|
||||
}
|
||||
|
||||
fn get_delta(&self, ppem: u16, scale: u16) -> Option<i32> {
|
||||
let f = self.delta_format;
|
||||
debug_assert!(matches!(f, 1..=3));
|
||||
|
||||
if ppem == 0 || ppem < self.start_size || ppem > self.end_size {
|
||||
return None;
|
||||
}
|
||||
|
||||
let s = ppem - self.start_size;
|
||||
let byte = self.delta_values.get(s >> (4 - f))?;
|
||||
let bits = byte >> (16 - (((s & ((1 << (4 - f)) - 1)) + 1) << f));
|
||||
let mask = 0xFFFF >> (16 - (1 << f));
|
||||
|
||||
let mut delta = i64::from(bits & mask);
|
||||
if delta >= i64::from(mask + 1 >> 1) {
|
||||
delta -= i64::from(mask + 1);
|
||||
}
|
||||
|
||||
i32::try_from(delta * i64::from(scale) / i64::from(ppem)).ok()
|
||||
}
|
||||
}
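
// Illustrative note for `get_delta` above, per the OpenType Device table description:
// `delta_format` selects the bit width of each packed delta (1 -> 2 bits, 2 -> 4 bits,
// 3 -> 8 bits), so a single u16 word holds 8, 4 or 2 deltas respectively. For a ppem
// inside [start_size, end_size], `s >> (4 - f)` picks the word, the shift/mask pair
// extracts the signed delta, and the result is scaled from pixels to design units
// by `units_per_em / ppem`.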
|
||||
|
||||
impl core::fmt::Debug for HintingDevice<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "HintingDevice {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Device Table](https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls)
|
||||
/// indexes into [Item Variation Store](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#IVS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct VariationDevice {
|
||||
pub outer_index: u16,
|
||||
pub inner_index: u16,
|
||||
}
|
||||
|
||||
|
||||
/// A [Device Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/chapter2#devVarIdxTbls).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum Device<'a> {
|
||||
Hinting(HintingDevice<'a>),
|
||||
Variation(VariationDevice),
|
||||
}
|
||||
|
||||
impl<'a> Device<'a> {
|
||||
pub(crate) fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let first = s.read::<u16>()?;
|
||||
let second = s.read::<u16>()?;
|
||||
let format = s.read::<u16>()?;
|
||||
match format {
|
||||
1..=3 => {
|
||||
let start_size = first;
|
||||
let end_size = second;
|
||||
let count = 1 + (end_size - start_size) >> (4 - format);
|
||||
let delta_values = s.read_array16(count)?;
|
||||
Some(Self::Hinting(HintingDevice {
|
||||
start_size,
|
||||
end_size,
|
||||
delta_format: format,
|
||||
delta_values,
|
||||
}))
|
||||
}
|
||||
0x8000 => {
|
||||
Some(Self::Variation(VariationDevice {
|
||||
outer_index: first,
|
||||
inner_index: second,
|
||||
}))
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
struct ValueFormatFlags(u8);
|
||||
|
||||
impl ValueFormatFlags {
|
||||
#[inline] fn x_placement(self) -> bool { self.0 & 0x01 != 0 }
|
||||
#[inline] fn y_placement(self) -> bool { self.0 & 0x02 != 0 }
|
||||
#[inline] fn x_advance(self) -> bool { self.0 & 0x04 != 0 }
|
||||
#[inline] fn y_advance(self) -> bool { self.0 & 0x08 != 0 }
|
||||
#[inline] fn x_placement_device(self) -> bool { self.0 & 0x10 != 0 }
|
||||
#[inline] fn y_placement_device(self) -> bool { self.0 & 0x20 != 0 }
|
||||
#[inline] fn x_advance_device(self) -> bool { self.0 & 0x40 != 0 }
|
||||
#[inline] fn y_advance_device(self) -> bool { self.0 & 0x80 != 0 }
|
||||
|
||||
// The ValueRecord struct contains either i16 values or Offset16 offsets,
|
||||
// and its total size depends on how many flags are enabled.
|
||||
fn size(self) -> usize {
|
||||
// Only the low byte is stored (see `FromData` below), so counting the set bits
// gives the number of enabled fields, each 2 bytes long.
|
||||
u16::SIZE * usize::num_from(self.0.count_ones())
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for ValueFormatFlags {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
// There is no data in the high 8 bits, so we keep only the low byte.
|
||||
Some(Self(data[1]))
|
||||
}
|
||||
}
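
// Worked example for `ValueFormatFlags::size` above: a format with X_PLACEMENT (0x01)
// and X_ADVANCE (0x04) set has two bits enabled, so every ValueRecord in that subtable
// occupies 2 * 2 = 4 bytes (each enabled field is a 16-bit value or offset).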
|
||||
|
||||
|
||||
/// A [Value Record](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#value-record).
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct ValueRecord<'a> {
|
||||
/// Horizontal adjustment for placement, in design units.
|
||||
pub x_placement: i16,
|
||||
/// Vertical adjustment for placement, in design units.
|
||||
pub y_placement: i16,
|
||||
/// Horizontal adjustment for advance, in design units — only used for horizontal layout.
|
||||
pub x_advance: i16,
|
||||
/// Vertical adjustment for advance, in design units — only used for vertical layout.
|
||||
pub y_advance: i16,
|
||||
|
||||
/// A [`Device`] table with horizontal adjustment for placement.
|
||||
pub x_placement_device: Option<Device<'a>>,
|
||||
/// A [`Device`] table with vertical adjustment for placement.
|
||||
pub y_placement_device: Option<Device<'a>>,
|
||||
/// A [`Device`] table with horizontal adjustment for advance.
|
||||
pub x_advance_device: Option<Device<'a>>,
|
||||
/// A [`Device`] table with vertical adjustment for advance.
|
||||
pub y_advance_device: Option<Device<'a>>,
|
||||
}
|
||||
|
||||
impl<'a> ValueRecord<'a> {
|
||||
// Returns `None` only on parsing error.
|
||||
fn parse(table_data: &'a [u8], s: &mut Stream, flags: ValueFormatFlags) -> Option<ValueRecord<'a>> {
|
||||
let mut record = ValueRecord::default();
|
||||
|
||||
if flags.x_placement() {
|
||||
record.x_placement = s.read::<i16>()?;
|
||||
}
|
||||
|
||||
if flags.y_placement() {
|
||||
record.y_placement = s.read::<i16>()?;
|
||||
}
|
||||
|
||||
if flags.x_advance() {
|
||||
record.x_advance = s.read::<i16>()?;
|
||||
}
|
||||
|
||||
if flags.y_advance() {
|
||||
record.y_advance = s.read::<i16>()?;
|
||||
}
|
||||
|
||||
if flags.x_placement_device() {
|
||||
if let Some(offset) = s.read::<Option<Offset16>>()? {
|
||||
record.x_placement_device = table_data.get(offset.to_usize()..).and_then(Device::parse)
|
||||
}
|
||||
}
|
||||
|
||||
if flags.y_placement_device() {
|
||||
if let Some(offset) = s.read::<Option<Offset16>>()? {
|
||||
record.y_placement_device = table_data.get(offset.to_usize()..).and_then(Device::parse)
|
||||
}
|
||||
}
|
||||
|
||||
if flags.x_advance_device() {
|
||||
if let Some(offset) = s.read::<Option<Offset16>>()? {
|
||||
record.x_advance_device = table_data.get(offset.to_usize()..).and_then(Device::parse)
|
||||
}
|
||||
}
|
||||
|
||||
if flags.y_advance_device() {
|
||||
if let Some(offset) = s.read::<Option<Offset16>>()? {
|
||||
record.y_advance_device = table_data.get(offset.to_usize()..).and_then(Device::parse)
|
||||
}
|
||||
}
|
||||
|
||||
Some(record)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An array of
|
||||
/// [Value Records](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#value-record).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct ValueRecordsArray<'a> {
|
||||
// We have to store the original table data because ValueRecords can have
|
||||
// a offset to Device tables and offset is from the beginning of the table.
|
||||
table_data: &'a [u8],
|
||||
// A slice that contains all ValueRecords.
|
||||
data: &'a [u8],
|
||||
// Number of records.
|
||||
len: u16,
|
||||
// Size of the single record.
|
||||
value_len: usize,
|
||||
// Flags, used during ValueRecord parsing.
|
||||
flags: ValueFormatFlags,
|
||||
}
|
||||
|
||||
impl<'a> ValueRecordsArray<'a> {
|
||||
fn parse(
|
||||
table_data: &'a [u8],
|
||||
count: u16,
|
||||
flags: ValueFormatFlags,
|
||||
s: &mut Stream<'a>,
|
||||
) -> Option<Self> {
|
||||
Some(Self {
|
||||
table_data,
|
||||
flags,
|
||||
len: count,
|
||||
value_len: flags.size(),
|
||||
data: s.read_bytes(usize::from(count) * flags.size())?,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns array's length.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.len
|
||||
}
|
||||
|
||||
/// Returns a [`ValueRecord`] at index.
|
||||
pub fn get(&self, index: u16) -> Option<ValueRecord<'a>> {
|
||||
let start = usize::from(index) * self.value_len;
|
||||
let end = start + self.value_len;
|
||||
let data = self.data.get(start..end)?;
|
||||
let mut s = Stream::new(data);
|
||||
ValueRecord::parse(self.table_data, &mut s, self.flags)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for ValueRecordsArray<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "ValueRecordsArray {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Single Adjustment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#SP).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum SingleAdjustment<'a> {
|
||||
Format1 {
|
||||
coverage: Coverage<'a>,
|
||||
value: ValueRecord<'a>,
|
||||
},
|
||||
Format2 {
|
||||
coverage: Coverage<'a>,
|
||||
values: ValueRecordsArray<'a>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> SingleAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let flags = s.read::<ValueFormatFlags>()?;
|
||||
let value = ValueRecord::parse(data, &mut s, flags)?;
|
||||
Some(Self::Format1 {
|
||||
coverage,
|
||||
value,
|
||||
})
|
||||
}
|
||||
2 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let flags = s.read::<ValueFormatFlags>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let values = ValueRecordsArray::parse(data, count, flags, &mut s)?;
|
||||
Some(Self::Format2 {
|
||||
coverage,
|
||||
values,
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Format1 { coverage, .. } => *coverage,
|
||||
Self::Format2 { coverage, .. } => *coverage,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [`ValueRecord`] pairs set used by [`PairAdjustment`].
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct PairSet<'a> {
|
||||
data: &'a [u8],
|
||||
flags: (ValueFormatFlags, ValueFormatFlags),
|
||||
record_len: u8,
|
||||
}
|
||||
|
||||
impl<'a> PairSet<'a> {
|
||||
fn parse(data: &'a [u8], flags: (ValueFormatFlags, ValueFormatFlags)) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
// Max len is 34, so u8 is just enough.
|
||||
let record_len = (GlyphId::SIZE + flags.0.size() + flags.1.size()) as u8;
|
||||
let data = s.read_bytes(usize::from(count) * usize::from(record_len))?;
|
||||
Some(Self { data, flags, record_len })
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn binary_search(&self, second: GlyphId) -> Option<&'a [u8]> {
|
||||
// Based on Rust std implementation.
|
||||
|
||||
let mut size = self.data.len() / usize::from(self.record_len);
|
||||
if size == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let get_record = |index| {
|
||||
let start = index * usize::from(self.record_len);
|
||||
let end = start + usize::from(self.record_len);
|
||||
self.data.get(start..end)
|
||||
};
|
||||
|
||||
let get_glyph = |data: &[u8]| {
|
||||
GlyphId(u16::from_be_bytes([data[0], data[1]]))
|
||||
};
|
||||
|
||||
let mut base = 0;
|
||||
while size > 1 {
|
||||
let half = size / 2;
|
||||
let mid = base + half;
|
||||
// mid is always in [0, size), that means mid is >= 0 and < size.
|
||||
// mid >= 0: by definition
|
||||
// mid < size: mid = size / 2 + size / 4 + size / 8 ...
|
||||
let cmp = get_glyph(get_record(mid)?).cmp(&second);
|
||||
base = if cmp == core::cmp::Ordering::Greater { base } else { mid };
|
||||
size -= half;
|
||||
}
|
||||
|
||||
// base is always in [0, size) because base <= mid.
|
||||
let value = get_record(base)?;
|
||||
if get_glyph(value).cmp(&second) == core::cmp::Ordering::Equal { Some(value) } else { None }
|
||||
}
|
||||
|
||||
/// Returns a [`ValueRecord`] pair using the second glyph.
|
||||
pub fn get(&self, second: GlyphId) -> Option<(ValueRecord<'a>, ValueRecord<'a>)> {
|
||||
let record_data = self.binary_search(second)?;
|
||||
let mut s = Stream::new(record_data);
|
||||
s.skip::<GlyphId>();
|
||||
Some((
|
||||
ValueRecord::parse(self.data, &mut s, self.flags.0)?,
|
||||
ValueRecord::parse(self.data, &mut s, self.flags.1)?,
|
||||
))
|
||||
}
|
||||
}
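
// Illustrative note for `PairSet` above: each record is laid out as
// [second GlyphId (2 bytes)][ValueRecord for the first glyph][ValueRecord for the
// second glyph], and the records are sorted by the second glyph id, which is what
// makes the binary search in `get` possible.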
|
||||
|
||||
impl core::fmt::Debug for PairSet<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "PairSet {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
// Essentially a `LazyOffsetArray16` but stores additional data required to parse [`PairSet`].
|
||||
|
||||
/// A list of [`PairSet`]s.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct PairSets<'a> {
|
||||
data: &'a [u8],
|
||||
// Zero offsets must be ignored, therefore we're using `Option<Offset16>`.
|
||||
offsets: LazyArray16<'a, Option<Offset16>>,
|
||||
flags: (ValueFormatFlags, ValueFormatFlags),
|
||||
}
|
||||
|
||||
impl<'a> PairSets<'a> {
|
||||
fn new(
|
||||
data: &'a [u8],
|
||||
offsets: LazyArray16<'a, Option<Offset16>>,
|
||||
flags: (ValueFormatFlags, ValueFormatFlags),
|
||||
) -> Self {
|
||||
Self { data, offsets, flags }
|
||||
}
|
||||
|
||||
/// Returns a value at `index`.
|
||||
#[inline]
|
||||
pub fn get(&self, index: u16) -> Option<PairSet<'a>> {
|
||||
let offset = self.offsets.get(index)??.to_usize();
|
||||
self.data.get(offset..).and_then(|data| PairSet::parse(data, self.flags))
|
||||
}
|
||||
|
||||
/// Returns array's length.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.offsets.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for PairSets<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "PairSets {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [`ValueRecord`] pairs matrix used by [`PairAdjustment`].
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct ClassMatrix<'a> {
|
||||
// We have to store the table's original slice,
|
||||
// because offsets in ValueRecords are from the beginning of the table.
|
||||
table_data: &'a [u8],
|
||||
matrix: &'a [u8],
|
||||
counts: (u16, u16),
|
||||
flags: (ValueFormatFlags, ValueFormatFlags),
|
||||
record_len: u8,
|
||||
}
|
||||
|
||||
impl<'a> ClassMatrix<'a> {
|
||||
fn parse(
|
||||
table_data: &'a [u8],
|
||||
counts: (u16, u16),
|
||||
flags: (ValueFormatFlags, ValueFormatFlags),
|
||||
s: &mut Stream<'a>,
|
||||
) -> Option<Self> {
|
||||
let count = usize::num_from(u32::from(counts.0) * u32::from(counts.1));
|
||||
// Max len is 32, so u8 is just enough.
|
||||
let record_len = (flags.0.size() + flags.1.size()) as u8;
|
||||
let matrix = s.read_bytes(usize::from(count) * usize::from(record_len))?;
|
||||
Some(Self { table_data, matrix, counts, flags, record_len })
|
||||
}
|
||||
|
||||
/// Returns a [`ValueRecord`] pair using specified classes.
|
||||
pub fn get(&self, classes: (u16, u16)) -> Option<(ValueRecord<'a>, ValueRecord<'a>)> {
|
||||
if classes.0 >= self.counts.0 || classes.1 >= self.counts.1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let idx = usize::from(classes.0) * usize::from(self.counts.1) + usize::from(classes.1);
|
||||
let record = self.matrix.get(idx * usize::from(self.record_len)..)?;
|
||||
|
||||
let mut s = Stream::new(record);
|
||||
Some((
|
||||
ValueRecord::parse(self.table_data, &mut s, self.flags.0)?,
|
||||
ValueRecord::parse(self.table_data, &mut s, self.flags.1)?,
|
||||
))
|
||||
}
|
||||
}
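
// Worked example for `ClassMatrix::get` above: with `counts` = (3, 4) the matrix is
// stored row-major as 3 rows of 4 records, so classes (2, 1) map to
// index 2 * 4 + 1 = 9, and the record starts at byte 9 * record_len.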
|
||||
|
||||
impl core::fmt::Debug for ClassMatrix<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "ClassMatrix {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Pair Adjustment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#PP).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum PairAdjustment<'a> {
|
||||
Format1 {
|
||||
coverage: Coverage<'a>,
|
||||
sets: PairSets<'a>,
|
||||
},
|
||||
Format2 {
|
||||
coverage: Coverage<'a>,
|
||||
classes: (ClassDefinition<'a>, ClassDefinition<'a>),
|
||||
matrix: ClassMatrix<'a>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> PairAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let flags = (
|
||||
s.read::<ValueFormatFlags>()?,
|
||||
s.read::<ValueFormatFlags>()?,
|
||||
);
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self::Format1 {
|
||||
coverage,
|
||||
sets: PairSets::new(data, offsets, flags)
|
||||
})
|
||||
}
|
||||
2 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let flags = (
|
||||
s.read::<ValueFormatFlags>()?,
|
||||
s.read::<ValueFormatFlags>()?,
|
||||
);
|
||||
let classes = (
|
||||
ClassDefinition::parse(s.read_at_offset16(data)?)?,
|
||||
ClassDefinition::parse(s.read_at_offset16(data)?)?,
|
||||
);
|
||||
let counts = (
|
||||
s.read::<u16>()?,
|
||||
s.read::<u16>()?,
|
||||
);
|
||||
Some(Self::Format2 {
|
||||
coverage,
|
||||
classes,
|
||||
matrix: ClassMatrix::parse(data, counts, flags, &mut s)?,
|
||||
})
|
||||
},
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Format1 { coverage, .. } => *coverage,
|
||||
Self::Format2 { coverage, .. } => *coverage,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct EntryExitRecord {
|
||||
entry_anchor_offset: Option<Offset16>,
|
||||
exit_anchor_offset: Option<Offset16>,
|
||||
}
|
||||
|
||||
impl FromData for EntryExitRecord {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
entry_anchor_offset: s.read::<Option<Offset16>>()?,
|
||||
exit_anchor_offset: s.read::<Option<Offset16>>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of entry and exit [`Anchor`] pairs.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct CursiveAnchorSet<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, EntryExitRecord>,
|
||||
}
|
||||
|
||||
impl<'a> CursiveAnchorSet<'a> {
|
||||
/// Returns an entry [`Anchor`] at index.
|
||||
pub fn entry(&self, index: u16) -> Option<Anchor<'a>> {
|
||||
let offset = self.records.get(index)?.entry_anchor_offset?.to_usize();
|
||||
self.data.get(offset..).and_then(Anchor::parse)
|
||||
}
|
||||
|
||||
/// Returns an exit [`Anchor`] at index.
|
||||
pub fn exit(&self, index: u16) -> Option<Anchor<'a>> {
|
||||
let offset = self.records.get(index)?.exit_anchor_offset?.to_usize();
|
||||
self.data.get(offset..).and_then(Anchor::parse)
|
||||
}
|
||||
|
||||
/// Returns the number of items.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for CursiveAnchorSet<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "CursiveAnchorSet {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Cursive Attachment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#CAP).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct CursiveAdjustment<'a> {
|
||||
pub coverage: Coverage<'a>,
|
||||
pub sets: CursiveAnchorSet<'a>,
|
||||
}
|
||||
|
||||
impl<'a> CursiveAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16(count)?;
|
||||
Some(Self {
|
||||
coverage,
|
||||
sets: CursiveAnchorSet { data, records }
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Mark-to-Base Attachment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MBP).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct MarkToBaseAdjustment<'a> {
|
||||
/// A mark coverage.
|
||||
pub mark_coverage: Coverage<'a>,
|
||||
/// A base coverage.
|
||||
pub base_coverage: Coverage<'a>,
|
||||
/// A list of mark anchors.
|
||||
pub marks: MarkArray<'a>,
|
||||
/// An anchor matrix.
|
||||
pub anchors: AnchorMatrix<'a>,
|
||||
}
|
||||
|
||||
impl<'a> MarkToBaseAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let mark_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let base_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let class_count = s.read::<u16>()?;
|
||||
let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
|
||||
let anchors = AnchorMatrix::parse(s.read_at_offset16(data)?, class_count)?;
|
||||
Some(Self { mark_coverage, base_coverage, marks, anchors })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A [Mark-to-Ligature Attachment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MLP).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct MarkToLigatureAdjustment<'a> {
|
||||
pub mark_coverage: Coverage<'a>,
|
||||
pub ligature_coverage: Coverage<'a>,
|
||||
pub marks: MarkArray<'a>,
|
||||
pub ligature_array: LigatureArray<'a>,
|
||||
}
|
||||
|
||||
impl<'a> MarkToLigatureAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let mark_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let ligature_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let class_count = s.read::<u16>()?;
|
||||
let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
|
||||
let ligature_array = LigatureArray::parse(s.read_at_offset16(data)?, class_count)?;
|
||||
Some(Self { mark_coverage, ligature_coverage, marks, ligature_array })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An array of ligature anchor matrices.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct LigatureArray<'a> {
|
||||
data: &'a [u8],
|
||||
class_count: u16,
|
||||
offsets: LazyArray16<'a, Offset16>,
|
||||
}
|
||||
|
||||
impl<'a> LigatureArray<'a> {
|
||||
fn parse(data: &'a [u8], class_count: u16) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self { data, class_count, offsets })
|
||||
}
|
||||
|
||||
/// Returns an [`AnchorMatrix`] at index.
|
||||
pub fn get(&self, index: u16) -> Option<AnchorMatrix<'a>> {
|
||||
let offset = self.offsets.get(index)?.to_usize();
|
||||
let data = self.data.get(offset..)?;
|
||||
AnchorMatrix::parse(data, self.class_count)
|
||||
}
|
||||
|
||||
/// Returns the array length.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.offsets.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for LigatureArray<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "LigatureArray {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct MarkRecord {
|
||||
class: Class,
|
||||
mark_anchor: Offset16,
|
||||
}
|
||||
|
||||
impl FromData for MarkRecord {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Self {
|
||||
class: s.read::<Class>()?,
|
||||
mark_anchor: s.read::<Offset16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Mark Array](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#mark-array-table).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct MarkArray<'a> {
|
||||
data: &'a [u8],
|
||||
array: LazyArray16<'a, MarkRecord>,
|
||||
}
|
||||
|
||||
impl<'a> MarkArray<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let array = s.read_array16(count)?;
|
||||
Some(Self { data, array })
|
||||
}
|
||||
|
||||
/// Returns contained data at index.
|
||||
pub fn get(&self, index: u16) -> Option<(Class, Anchor<'a>)> {
|
||||
let record = self.array.get(index)?;
|
||||
let anchor = self.data.get(record.mark_anchor.to_usize()..).and_then(Anchor::parse)?;
|
||||
Some((record.class, anchor))
|
||||
}
|
||||
|
||||
/// Returns the array length.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.array.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for MarkArray<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "MarkArray {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Anchor Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#anchor-tables).
|
||||
///
|
||||
/// The *Anchor Table Format 2: Design Units Plus Contour Point* is not supported.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Anchor<'a> {
|
||||
/// Horizontal value, in design units.
|
||||
pub x: i16,
|
||||
/// Vertical value, in design units.
|
||||
pub y: i16,
|
||||
/// A [`Device`] table with horizontal value.
|
||||
pub x_device: Option<Device<'a>>,
|
||||
/// A [`Device`] table with vertical value.
|
||||
pub y_device: Option<Device<'a>>,
|
||||
}
|
||||
|
||||
impl<'a> Anchor<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let format = s.read::<u16>()?;
|
||||
if !matches!(format, 1..=3) {
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut table = Anchor {
|
||||
x: s.read::<i16>()?,
|
||||
y: s.read::<i16>()?,
|
||||
x_device: None,
|
||||
y_device: None,
|
||||
};
|
||||
|
||||
// Note: Format 2 is not handled since there is currently no way to
|
||||
// get a glyph contour point by index.
|
||||
|
||||
if format == 3 {
|
||||
table.x_device = s.read::<Option<Offset16>>()?
|
||||
.and_then(|offset| data.get(offset.to_usize()..))
|
||||
.and_then(Device::parse);
|
||||
|
||||
table.y_device = s.read::<Option<Offset16>>()?
|
||||
.and_then(|offset| data.get(offset.to_usize()..))
|
||||
.and_then(Device::parse);
|
||||
}
|
||||
|
||||
Some(table)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [`Anchor`] parsing helper.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct AnchorMatrix<'a> {
|
||||
data: &'a [u8],
|
||||
/// Number of rows in the matrix.
|
||||
pub rows: u16,
|
||||
/// Number of columns in the matrix.
|
||||
pub cols: u16,
|
||||
matrix: LazyArray32<'a, Offset16>,
|
||||
}
|
||||
|
||||
impl<'a> AnchorMatrix<'a> {
|
||||
fn parse(data: &'a [u8], cols: u16) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let rows = s.read::<u16>()?;
|
||||
let count = u32::from(rows) * u32::from(cols);
|
||||
let matrix = s.read_array32(count)?;
|
||||
Some(Self { data, rows, cols, matrix })
|
||||
}
|
||||
|
||||
/// Returns an [`Anchor`] at position.
|
||||
pub fn get(&self, row: u16, col: u16) -> Option<Anchor> {
|
||||
let idx = u32::from(row) * u32::from(self.cols) + u32::from(col);
|
||||
let offset = self.matrix.get(idx)?.to_usize();
|
||||
Anchor::parse(self.data.get(offset..)?)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for AnchorMatrix<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "AnchorMatrix {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Mark-to-Mark Attachment Positioning Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#MMP).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct MarkToMarkAdjustment<'a> {
|
||||
pub mark1_coverage: Coverage<'a>,
|
||||
pub mark2_coverage: Coverage<'a>,
|
||||
pub marks: MarkArray<'a>,
|
||||
pub mark2_matrix: AnchorMatrix<'a>,
|
||||
}
|
||||
|
||||
impl<'a> MarkToMarkAdjustment<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let mark1_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let mark2_coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let class_count = s.read::<u16>()?;
|
||||
let marks = MarkArray::parse(s.read_at_offset16(data)?)?;
|
||||
let mark2_matrix = AnchorMatrix::parse(s.read_at_offset16(data)?, class_count)?;
|
||||
Some(Self { mark1_coverage, mark2_coverage, marks, mark2_matrix })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A glyph positioning
|
||||
/// [lookup subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gpos#table-organization)
|
||||
/// enumeration.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum PositioningSubtable<'a> {
|
||||
Single(SingleAdjustment<'a>),
|
||||
Pair(PairAdjustment<'a>),
|
||||
Cursive(CursiveAdjustment<'a>),
|
||||
MarkToBase(MarkToBaseAdjustment<'a>),
|
||||
MarkToLigature(MarkToLigatureAdjustment<'a>),
|
||||
MarkToMark(MarkToMarkAdjustment<'a>),
|
||||
Context(ContextLookup<'a>),
|
||||
ChainContext(ChainedContextLookup<'a>),
|
||||
}
|
||||
|
||||
impl<'a> LookupSubtable<'a> for PositioningSubtable<'a> {
|
||||
fn parse(data: &'a [u8], kind: u16) -> Option<Self> {
|
||||
match kind {
|
||||
1 => SingleAdjustment::parse(data).map(Self::Single),
|
||||
2 => PairAdjustment::parse(data).map(Self::Pair),
|
||||
3 => CursiveAdjustment::parse(data).map(Self::Cursive),
|
||||
4 => MarkToBaseAdjustment::parse(data).map(Self::MarkToBase),
|
||||
5 => MarkToLigatureAdjustment::parse(data).map(Self::MarkToLigature),
|
||||
6 => MarkToMarkAdjustment::parse(data).map(Self::MarkToMark),
|
||||
7 => ContextLookup::parse(data).map(Self::Context),
|
||||
8 => ChainedContextLookup::parse(data).map(Self::ChainContext),
|
||||
9 => crate::ggg::parse_extension_lookup(data, Self::parse),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> PositioningSubtable<'a> {
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Single(t) => t.coverage(),
|
||||
Self::Pair(t) => t.coverage(),
|
||||
Self::Cursive(t) => t.coverage,
|
||||
Self::MarkToBase(t) => t.mark_coverage,
|
||||
Self::MarkToLigature(t) => t.mark_coverage,
|
||||
Self::MarkToMark(t) => t.mark1_coverage,
|
||||
Self::Context(t) => t.coverage(),
|
||||
Self::ChainContext(t) => t.coverage(),
|
||||
}
|
||||
}
|
||||
}
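// Illustrative sketch (not part of the upstream sources): a caller could use
// `PositioningSubtable::coverage()` to pre-filter subtables before applying
// them. `Coverage::get` returning a coverage index for a covered glyph is
// assumed here.
//
// fn applies_to(subtable: &PositioningSubtable, glyph: crate::GlyphId) -> bool {
//     subtable.coverage().get(glyph).is_some()
// }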
|
|
@ -0,0 +1,294 @@
|
|||
//! A [Glyph Substitution Table](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub)
|
||||
//! implementation.
|
||||
|
||||
// A heavily modified port of https://github.com/RazrFalcon/rustybuzz implementation
|
||||
// originally written by https://github.com/laurmaedje
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::opentype_layout::{ChainedContextLookup, ContextLookup, Coverage, LookupSubtable};
|
||||
use crate::parser::{FromSlice, LazyArray16, LazyOffsetArray16, Stream};
|
||||
|
||||
/// A [Single Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#SS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum SingleSubstitution<'a> {
|
||||
Format1 {
|
||||
coverage: Coverage<'a>,
|
||||
delta: i16,
|
||||
},
|
||||
Format2 {
|
||||
coverage: Coverage<'a>,
|
||||
substitutes: LazyArray16<'a, GlyphId>,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'a> SingleSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let delta = s.read::<i16>()?;
|
||||
Some(Self::Format1 { coverage, delta })
|
||||
}
|
||||
2 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let substitutes = s.read_array16(count)?;
|
||||
Some(Self::Format2 { coverage, substitutes })
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Format1 { coverage, .. } => *coverage,
|
||||
Self::Format2 { coverage, .. } => *coverage,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A sequence of glyphs for
|
||||
/// [Multiple Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#MS).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Sequence<'a> {
|
||||
/// A list of substitute glyphs.
|
||||
pub substitutes: LazyArray16<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for Sequence<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let substitutes = s.read_array16(count)?;
|
||||
Some(Self { substitutes })
|
||||
}
|
||||
}
|
||||
|
||||
/// A list of [`Sequence`] tables.
|
||||
pub type SequenceList<'a> = LazyOffsetArray16<'a, Sequence<'a>>;
|
||||
|
||||
/// A [Multiple Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#MS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct MultipleSubstitution<'a> {
|
||||
pub coverage: Coverage<'a>,
|
||||
pub sequences: SequenceList<'a>,
|
||||
}
|
||||
|
||||
impl<'a> MultipleSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self {
|
||||
coverage,
|
||||
sequences: SequenceList::new(data, offsets),
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of glyphs for
|
||||
/// [Alternate Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#AS).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct AlternateSet<'a> {
|
||||
/// Array of alternate glyph IDs, in arbitrary order.
|
||||
pub alternates: LazyArray16<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for AlternateSet<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let count = s.read::<u16>()?;
|
||||
let alternates = s.read_array16(count)?;
|
||||
Some(Self { alternates })
|
||||
}
|
||||
}
|
||||
|
||||
/// A set of [`AlternateSet`].
|
||||
pub type AlternateSets<'a> = LazyOffsetArray16<'a, AlternateSet<'a>>;
|
||||
|
||||
/// An [Alternate Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#AS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct AlternateSubstitution<'a> {
|
||||
pub coverage: Coverage<'a>,
|
||||
pub alternate_sets: AlternateSets<'a>,
|
||||
}
|
||||
|
||||
impl<'a> AlternateSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self {
|
||||
coverage,
|
||||
alternate_sets: AlternateSets::new(data, offsets),
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Glyph components for one ligature.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Ligature<'a> {
|
||||
/// Ligature to substitute.
|
||||
pub glyph: GlyphId,
|
||||
/// Glyph components for one ligature.
|
||||
pub components: LazyArray16<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> FromSlice<'a> for Ligature<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let glyph = s.read::<GlyphId>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let components = s.read_array16(count.checked_sub(1)?)?;
|
||||
Some(Self { glyph, components })
|
||||
}
|
||||
}
|
||||
|
||||
/// A [`Ligature`] set.
|
||||
pub type LigatureSet<'a> = LazyOffsetArray16<'a, Ligature<'a>>;
|
||||
|
||||
impl<'a> FromSlice<'a> for LigatureSet<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
Self::parse(data)
|
||||
}
|
||||
}
|
||||
|
||||
/// A list of [`Ligature`] sets.
|
||||
pub type LigatureSets<'a> = LazyOffsetArray16<'a, LigatureSet<'a>>;
|
||||
|
||||
/// A [Ligature Substitution Subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#LS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct LigatureSubstitution<'a> {
|
||||
pub coverage: Coverage<'a>,
|
||||
pub ligature_sets: LigatureSets<'a>,
|
||||
}
|
||||
|
||||
impl<'a> LigatureSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(count)?;
|
||||
Some(Self {
|
||||
coverage,
|
||||
ligature_sets: LigatureSets::new(data, offsets),
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Reverse Chaining Contextual Single Substitution Subtable](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#RCCS).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct ReverseChainSingleSubstitution<'a> {
|
||||
pub coverage: Coverage<'a>,
|
||||
pub backtrack_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
pub lookahead_coverages: LazyOffsetArray16<'a, Coverage<'a>>,
|
||||
pub substitutes: LazyArray16<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> ReverseChainSingleSubstitution<'a> {
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
match s.read::<u16>()? {
|
||||
1 => {
|
||||
let coverage = Coverage::parse(s.read_at_offset16(data)?)?;
|
||||
let backtrack_count = s.read::<u16>()?;
|
||||
let backtrack_coverages = s.read_array16(backtrack_count)?;
|
||||
let lookahead_count = s.read::<u16>()?;
|
||||
let lookahead_coverages = s.read_array16(lookahead_count)?;
|
||||
let substitute_count = s.read::<u16>()?;
|
||||
let substitutes = s.read_array16(substitute_count)?;
|
||||
Some(Self {
|
||||
coverage,
|
||||
backtrack_coverages: LazyOffsetArray16::new(data, backtrack_coverages),
|
||||
lookahead_coverages: LazyOffsetArray16::new(data, lookahead_coverages),
|
||||
substitutes,
|
||||
})
|
||||
}
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A glyph substitution
|
||||
/// [lookup subtable](https://docs.microsoft.com/en-us/typography/opentype/spec/gsub#table-organization)
|
||||
/// enumeration.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum SubstitutionSubtable<'a> {
|
||||
Single(SingleSubstitution<'a>),
|
||||
Multiple(MultipleSubstitution<'a>),
|
||||
Alternate(AlternateSubstitution<'a>),
|
||||
Ligature(LigatureSubstitution<'a>),
|
||||
Context(ContextLookup<'a>),
|
||||
ChainContext(ChainedContextLookup<'a>),
|
||||
ReverseChainSingle(ReverseChainSingleSubstitution<'a>),
|
||||
}
|
||||
|
||||
impl<'a> LookupSubtable<'a> for SubstitutionSubtable<'a> {
|
||||
fn parse(data: &'a [u8], kind: u16) -> Option<Self> {
|
||||
match kind {
|
||||
1 => SingleSubstitution::parse(data).map(Self::Single),
|
||||
2 => MultipleSubstitution::parse(data).map(Self::Multiple),
|
||||
3 => AlternateSubstitution::parse(data).map(Self::Alternate),
|
||||
4 => LigatureSubstitution::parse(data).map(Self::Ligature),
|
||||
5 => ContextLookup::parse(data).map(Self::Context),
|
||||
6 => ChainedContextLookup::parse(data).map(Self::ChainContext),
|
||||
7 => crate::ggg::parse_extension_lookup(data, Self::parse),
|
||||
8 => ReverseChainSingleSubstitution::parse(data).map(Self::ReverseChainSingle),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> SubstitutionSubtable<'a> {
|
||||
/// Returns the subtable coverage.
|
||||
#[inline]
|
||||
pub fn coverage(&self) -> Coverage<'a> {
|
||||
match self {
|
||||
Self::Single(t) => t.coverage(),
|
||||
Self::Multiple(t) => t.coverage,
|
||||
Self::Alternate(t) => t.coverage,
|
||||
Self::Ligature(t) => t.coverage,
|
||||
Self::Context(t) => t.coverage(),
|
||||
Self::ChainContext(t) => t.coverage(),
|
||||
Self::ReverseChainSingle(t) => t.coverage,
|
||||
}
|
||||
}
|
||||
|
||||
/// Checks whether the current subtable is *Reverse Chaining Contextual Single*.
|
||||
#[inline]
|
||||
pub fn is_reverse(&self) -> bool {
|
||||
matches!(self, Self::ReverseChainSingle(_))
|
||||
}
|
||||
}
|
File diff suppressed because it is too large
|
@ -0,0 +1,72 @@
|
|||
//! A [Font Header Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/head) implementation.
|
||||
|
||||
use crate::Rect;
|
||||
use crate::parser::{Stream, Fixed};
|
||||
|
||||
/// An index format used by the [Index to Location Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub enum IndexToLocationFormat {
|
||||
Short,
|
||||
Long,
|
||||
}
|
||||
|
||||
|
||||
/// A [Font Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/head).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table {
|
||||
/// Units per EM.
|
||||
///
|
||||
/// Guaranteed to be in the 16..=16384 range.
|
||||
pub units_per_em: u16,
|
||||
/// A bounding box large enough to enclose any glyph from the face.
|
||||
pub global_bbox: Rect,
|
||||
/// An index format used by the [Index to Location Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
|
||||
pub index_to_location_format: IndexToLocationFormat,
|
||||
}
|
||||
|
||||
impl Table {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &[u8]) -> Option<Self> {
|
||||
if data.len() != 54 {
|
||||
return None
|
||||
}
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u32>(); // version
|
||||
s.skip::<Fixed>(); // font revision
|
||||
s.skip::<u32>(); // checksum adjustment
|
||||
s.skip::<u32>(); // magic number
|
||||
s.skip::<u16>(); // flags
|
||||
let units_per_em = s.read::<u16>()?;
|
||||
s.skip::<u64>(); // created time
|
||||
s.skip::<u64>(); // modified time
|
||||
let x_min = s.read::<i16>()?;
|
||||
let y_min = s.read::<i16>()?;
|
||||
let x_max = s.read::<i16>()?;
|
||||
let y_max = s.read::<i16>()?;
|
||||
s.skip::<u16>(); // mac style
|
||||
s.skip::<u16>(); // lowest PPEM
|
||||
s.skip::<i16>(); // font direction hint
|
||||
let index_to_location_format = s.read::<u16>()?;
|
||||
|
||||
if !(units_per_em >= 16 && units_per_em <= 16384) {
|
||||
return None;
|
||||
}
|
||||
|
||||
let index_to_location_format = match index_to_location_format {
|
||||
0 => IndexToLocationFormat::Short,
|
||||
1 => IndexToLocationFormat::Long,
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
Some(Table {
|
||||
units_per_em,
|
||||
global_bbox: Rect { x_min, y_min, x_max, y_max },
|
||||
index_to_location_format,
|
||||
})
|
||||
}
|
||||
}
|
|
@ -0,0 +1,41 @@
|
|||
//! A [Horizontal Header Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hhea) implementation.
|
||||
|
||||
use crate::parser::Stream;
|
||||
|
||||
/// A [Horizontal Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/hhea).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table {
|
||||
/// Face ascender.
|
||||
pub ascender: i16,
|
||||
/// Face descender.
|
||||
pub descender: i16,
|
||||
/// Face line gap.
|
||||
pub line_gap: i16,
|
||||
/// Number of metrics in the `hmtx` table.
|
||||
pub number_of_metrics: u16,
|
||||
}
|
||||
|
||||
impl Table {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &[u8]) -> Option<Self> {
|
||||
if data.len() != 36 {
|
||||
return None
|
||||
}
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u32>(); // version
|
||||
let ascender = s.read::<i16>()?;
|
||||
let descender = s.read::<i16>()?;
|
||||
let line_gap = s.read::<i16>()?;
|
||||
s.advance(24);
|
||||
let number_of_metrics = s.read::<u16>()?;
|
||||
|
||||
Some(Table {
|
||||
ascender,
|
||||
descender,
|
||||
line_gap,
|
||||
number_of_metrics,
|
||||
})
|
||||
}
|
||||
}
|
|
@ -0,0 +1,110 @@
|
|||
//! A [Horizontal/Vertical Metrics Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hmtx) implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, LazyArray16};
|
||||
|
||||
/// Horizontal/Vertical Metrics.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Metrics {
|
||||
/// Width/Height advance for `hmtx`/`vmtx`.
|
||||
pub advance: u16,
|
||||
/// Left/Top side bearing for `hmtx`/`vmtx`.
|
||||
pub side_bearing: i16,
|
||||
}
|
||||
|
||||
impl FromData for Metrics {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Metrics {
|
||||
advance: s.read::<u16>()?,
|
||||
side_bearing: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Horizontal/Vertical Metrics Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/hmtx).
|
||||
///
|
||||
/// `hmtx` and `vmtx` tables have the same structure, so we're reusing the same struct for both.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of metrics indexed by glyph ID.
|
||||
pub metrics: LazyArray16<'a, Metrics>,
|
||||
/// Side bearings for glyph IDs greater than or equal to the number of `metrics` values.
|
||||
pub bearings: LazyArray16<'a, i16>,
|
||||
/// Sum of long metrics + bearings.
|
||||
pub number_of_metrics: u16,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// - `number_of_metrics` is from the `hhea`/`vhea` table.
|
||||
/// - `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(
|
||||
mut number_of_metrics: u16,
|
||||
number_of_glyphs: NonZeroU16,
|
||||
data: &'a [u8],
|
||||
) -> Option<Self> {
|
||||
if number_of_metrics == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
let metrics = s.read_array16::<Metrics>(number_of_metrics)?;
|
||||
|
||||
// 'If the number_of_metrics is less than the total number of glyphs,
|
||||
// then that array is followed by an array for the left side bearing values
|
||||
// of the remaining glyphs.'
|
||||
let bearings_count = number_of_glyphs.get().checked_sub(number_of_metrics);
|
||||
let bearings = if let Some(count) = bearings_count {
|
||||
number_of_metrics += count;
|
||||
s.read_array16::<i16>(count)?
|
||||
} else {
|
||||
LazyArray16::default()
|
||||
};
|
||||
|
||||
Some(Table {
|
||||
metrics,
|
||||
bearings,
|
||||
number_of_metrics,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns advance for a glyph.
|
||||
#[inline]
|
||||
pub fn advance(&self, glyph_id: GlyphId) -> Option<u16> {
|
||||
if glyph_id.0 >= self.number_of_metrics {
|
||||
return None;
|
||||
}
|
||||
|
||||
if let Some(metrics) = self.metrics.get(glyph_id.0) {
|
||||
Some(metrics.advance)
|
||||
} else {
|
||||
// 'As an optimization, the number of records can be less than the number of glyphs,
|
||||
// in which case the advance value of the last record applies
|
||||
// to all remaining glyph IDs.'
|
||||
self.metrics.last().map(|m| m.advance)
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns side bearing for a glyph.
|
||||
#[inline]
|
||||
pub fn side_bearing(&self, glyph_id: GlyphId) -> Option<i16> {
|
||||
if let Some(metrics) = self.metrics.get(glyph_id.0) {
|
||||
Some(metrics.side_bearing)
|
||||
} else {
|
||||
// 'If the number_of_metrics is less than the total number of glyphs,
|
||||
// then that array is followed by an array for the side bearing values
|
||||
// of the remaining glyphs.'
|
||||
self.bearings.get(glyph_id.0.checked_sub(self.metrics.len())?)
|
||||
}
|
||||
}
|
||||
}
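// Worked example of the fallback rules quoted above (illustrative): for a
// monospaced font with a single long metric (`metrics.len() == 1`), `advance()`
// returns `metrics[0].advance` for every valid glyph ID, while `side_bearing()`
// for glyph ID `n >= 1` comes from `bearings[n - 1]`.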
|
|
@ -0,0 +1,126 @@
|
|||
//! A [Horizontal/Vertical Metrics Variations Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/hvar) implementation.
|
||||
|
||||
use core::convert::TryFrom;
|
||||
|
||||
use crate::{GlyphId, NormalizedCoordinate};
|
||||
use crate::parser::{Stream, Offset, Offset32};
|
||||
use crate::var_store::ItemVariationStore;
|
||||
|
||||
struct DeltaSetIndexMap<'a> {
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> DeltaSetIndexMap<'a> {
|
||||
#[inline]
|
||||
fn new(data: &'a [u8]) -> Self {
|
||||
DeltaSetIndexMap { data }
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn map(&self, glyph_id: GlyphId) -> Option<(u16, u16)> {
|
||||
let mut idx = glyph_id.0;
|
||||
|
||||
let mut s = Stream::new(self.data);
|
||||
let entry_format = s.read::<u16>()?;
|
||||
let map_count = s.read::<u16>()?;
|
||||
|
||||
if map_count == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
// 'If a given glyph ID is greater than mapCount-1, then the last entry is used.'
|
||||
if idx >= map_count {
|
||||
idx = map_count - 1;
|
||||
}
|
||||
|
||||
let entry_size = ((entry_format >> 4) & 3) + 1;
|
||||
let inner_index_bit_count = u32::from((entry_format & 0xF) + 1);
|
||||
|
||||
s.advance(usize::from(entry_size) * usize::from(idx));
|
||||
|
||||
let mut n = 0u32;
|
||||
for b in s.read_bytes(usize::from(entry_size))? {
|
||||
n = (n << 8) + u32::from(*b);
|
||||
}
|
||||
|
||||
let outer_index = n >> inner_index_bit_count;
|
||||
let inner_index = n & ((1 << inner_index_bit_count) - 1);
|
||||
Some((
|
||||
u16::try_from(outer_index).ok()?,
|
||||
u16::try_from(inner_index).ok()?
|
||||
))
|
||||
}
|
||||
}
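// Worked example (illustrative, not from the spec text): with
// `entry_format = 0x0013`, `entry_size = ((0x0013 >> 4) & 3) + 1 = 2` bytes and
// `inner_index_bit_count = (0x0013 & 0xF) + 1 = 4`. For an entry with the raw
// bytes `[0x01, 0x2A]`, `n = 0x012A`, so `outer_index = 0x012A >> 4 = 18` and
// `inner_index = 0x012A & 0xF = 10`.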
|
||||
|
||||
|
||||
/// A [Horizontal/Vertical Metrics Variations Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/hvar).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
data: &'a [u8],
|
||||
variation_store: ItemVariationStore<'a>,
|
||||
advance_width_mapping_offset: Option<Offset32>,
|
||||
lsb_mapping_offset: Option<Offset32>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let variation_store_offset = s.read::<Offset32>()?;
|
||||
let var_store_s = Stream::new_at(data, variation_store_offset.to_usize())?;
|
||||
let variation_store = ItemVariationStore::parse(var_store_s)?;
|
||||
|
||||
Some(Table {
|
||||
data,
|
||||
variation_store,
|
||||
advance_width_mapping_offset: s.read::<Option<Offset32>>()?,
|
||||
lsb_mapping_offset: s.read::<Option<Offset32>>()?,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns advance offset for a glyph.
|
||||
#[inline]
|
||||
pub fn advance_offset(
|
||||
&self,
|
||||
glyph_id: GlyphId,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
) -> Option<f32> {
|
||||
let (outer_idx, inner_idx) = if let Some(offset) = self.advance_width_mapping_offset {
|
||||
DeltaSetIndexMap::new(self.data.get(offset.to_usize()..)?).map(glyph_id)?
|
||||
} else {
|
||||
// 'If there is no delta-set index mapping table for advance widths,
|
||||
// then glyph IDs implicitly provide the indices:
|
||||
// for a given glyph ID, the delta-set outer-level index is zero,
|
||||
// and the glyph ID is the delta-set inner-level index.'
|
||||
(0, glyph_id.0)
|
||||
};
|
||||
|
||||
self.variation_store.parse_delta(outer_idx, inner_idx, coordinates)
|
||||
}
|
||||
|
||||
/// Returns side bearing offset for a glyph.
|
||||
#[inline]
|
||||
pub fn side_bearing_offset(
|
||||
&self,
|
||||
glyph_id: GlyphId,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
) -> Option<f32> {
|
||||
let set_data = self.data.get(self.lsb_mapping_offset?.to_usize()..)?;
|
||||
let (outer_idx, inner_idx) = DeltaSetIndexMap::new(set_data).map(glyph_id)?;
|
||||
self.variation_store.parse_delta(outer_idx, inner_idx, coordinates)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
|
@ -0,0 +1,463 @@
|
|||
/*!
|
||||
A [Kerning Table](
|
||||
https://docs.microsoft.com/en-us/typography/opentype/spec/kern) implementation.
|
||||
|
||||
Supports both
|
||||
[OpenType](https://docs.microsoft.com/en-us/typography/opentype/spec/kern)
|
||||
and
|
||||
[Apple Advanced Typography](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kern.html)
|
||||
variants.
|
||||
|
||||
Since there is no single correct way to process kerning data,
|
||||
we have to provide access to the kerning subtables, so a caller can implement
|
||||
a kerning algorithm manually.
|
||||
But we still try to keep the API as high-level as possible.
|
||||
*/
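// Illustrative sketch (not part of the upstream sources): a minimal caller-side
// kerning lookup built on the subtable API below. Only horizontal,
// non-state-machine subtables are considered; `Face::tables().kern` exposing
// this table is an assumption based on the rest of the crate.
//
// fn simple_kerning(face: &crate::Face, left: GlyphId, right: GlyphId) -> Option<i16> {
//     face.tables().kern?
//         .subtables
//         .into_iter()
//         .filter(|st| st.horizontal && !st.has_state_machine)
//         .find_map(|st| st.glyphs_kerning(left, right))
// }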
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{FromData, LazyArray16, NumFrom, Offset, Offset16, Stream};
|
||||
#[cfg(feature = "apple-layout")] use crate::aat;
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct OTCoverage(u8);
|
||||
|
||||
impl OTCoverage {
|
||||
#[inline] fn is_horizontal(self) -> bool { self.0 & (1 << 0) != 0 }
|
||||
#[inline] fn has_cross_stream(self) -> bool { self.0 & (1 << 2) != 0 }
|
||||
}
|
||||
|
||||
impl FromData for OTCoverage {
|
||||
const SIZE: usize = 1;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.get(0).copied().map(OTCoverage)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct AATCoverage(u8);
|
||||
|
||||
impl AATCoverage {
|
||||
#[inline] fn is_horizontal(self) -> bool { self.0 & (1 << 7) == 0 }
|
||||
#[inline] fn has_cross_stream(self) -> bool { self.0 & (1 << 6) != 0 }
|
||||
#[inline] fn is_variable(self) -> bool { self.0 & (1 << 5) != 0 }
|
||||
}
|
||||
|
||||
impl FromData for AATCoverage {
|
||||
const SIZE: usize = 1;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
data.get(0).copied().map(AATCoverage)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A kerning pair.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct KerningPair {
|
||||
/// Glyphs pair.
|
||||
///
|
||||
/// In the kern table spec, a kerning pair is stored as two u16,
|
||||
/// but we are using one u32, so we can binary search it directly.
|
||||
pub pair: u32,
|
||||
/// Kerning value.
|
||||
pub value: i16,
|
||||
}
|
||||
|
||||
impl KerningPair {
|
||||
/// Returns left glyph ID.
|
||||
#[inline]
|
||||
pub fn left(&self) -> GlyphId {
|
||||
GlyphId((self.pair >> 16) as u16)
|
||||
}
|
||||
|
||||
/// Returns right glyph ID.
|
||||
#[inline]
|
||||
pub fn right(&self) -> GlyphId {
|
||||
GlyphId(self.pair as u16)
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for KerningPair {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(KerningPair {
|
||||
pair: s.read::<u32>()?,
|
||||
value: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A kerning subtable format.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Debug)]
|
||||
pub enum Format<'a> {
|
||||
Format0(Subtable0<'a>),
|
||||
#[cfg(feature = "apple-layout")] Format1(aat::StateTable<'a>),
|
||||
#[cfg(not(feature = "apple-layout"))] Format1,
|
||||
Format2(Subtable2<'a>),
|
||||
Format3(Subtable3<'a>),
|
||||
}
|
||||
|
||||
|
||||
/// A kerning subtable.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Subtable<'a> {
|
||||
/// Indicates that the subtable is for horizontal text.
|
||||
pub horizontal: bool,
|
||||
/// Indicates that the subtable is variable.
|
||||
pub variable: bool,
|
||||
/// Indicates that the subtable has cross-stream values.
|
||||
pub has_cross_stream: bool,
|
||||
/// Indicates that the subtable uses a state machine.
|
||||
///
|
||||
/// In this case `glyphs_kerning()` will return `None`.
|
||||
pub has_state_machine: bool,
|
||||
/// Subtable format.
|
||||
pub format: Format<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable<'a> {
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
///
|
||||
/// Returns `None` in case of a state-machine-based subtable.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
match self.format {
|
||||
Format::Format0(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
Format::Format2(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
Format::Format3(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of subtables.
|
||||
///
|
||||
/// The internal data layout is not designed for random access,
|
||||
/// therefore we don't provide a `get()` method, only an iterator.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtables<'a> {
|
||||
/// Indicates an Apple Advanced Typography format.
|
||||
is_aat: bool,
|
||||
/// The total number of tables.
|
||||
count: u32,
|
||||
/// Actual data. Starts right after the `kern` header.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtables<'a> {
|
||||
/// Returns the number of subtables.
|
||||
pub fn len(&self) -> u32 {
|
||||
self.count
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtables<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtables {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Subtables<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
type IntoIter = SubtablesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SubtablesIter {
|
||||
is_aat: self.is_aat,
|
||||
table_index: 0,
|
||||
number_of_tables: self.count,
|
||||
stream: Stream::new(self.data),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An iterator over kerning subtables.
|
||||
#[allow(missing_debug_implementations)]
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct SubtablesIter<'a> {
|
||||
/// Indicates an Apple Advanced Typography format.
|
||||
is_aat: bool,
|
||||
/// The current table index.
|
||||
table_index: u32,
|
||||
/// The total number of tables.
|
||||
number_of_tables: u32,
|
||||
/// Actual data. Starts right after the `kern` header.
|
||||
stream: Stream<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SubtablesIter<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.table_index == self.number_of_tables {
|
||||
return None;
|
||||
}
|
||||
|
||||
if self.stream.at_end() {
|
||||
return None;
|
||||
}
|
||||
|
||||
if self.is_aat {
|
||||
const HEADER_SIZE: u8 = 8;
|
||||
|
||||
let table_len = self.stream.read::<u32>()?;
|
||||
let coverage = self.stream.read::<AATCoverage>()?;
|
||||
let format_id = self.stream.read::<u8>()?;
|
||||
self.stream.skip::<u16>(); // variation tuple index
|
||||
|
||||
if format_id > 3 {
|
||||
// Unknown format.
|
||||
return None;
|
||||
}
|
||||
|
||||
// Subtract the header size.
|
||||
let data_len = usize::num_from(table_len).checked_sub(usize::from(HEADER_SIZE))?;
|
||||
let data = self.stream.read_bytes(data_len)?;
|
||||
|
||||
let format = match format_id {
|
||||
0 => Format::Format0(Subtable0::parse(data)?),
|
||||
#[cfg(feature = "apple-layout")]
|
||||
1 => Format::Format1(aat::StateTable::parse(data)?),
|
||||
#[cfg(not(feature = "apple-layout"))]
|
||||
1 => Format::Format1,
|
||||
2 => Format::Format2(Subtable2::parse(HEADER_SIZE, data)?),
|
||||
3 => Format::Format3(Subtable3::parse(data)?),
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
Some(Subtable {
|
||||
horizontal: coverage.is_horizontal(),
|
||||
variable: coverage.is_variable(),
|
||||
has_cross_stream: coverage.has_cross_stream(),
|
||||
has_state_machine: format_id == 1,
|
||||
format,
|
||||
})
|
||||
} else {
|
||||
const HEADER_SIZE: u8 = 6;
|
||||
|
||||
self.stream.skip::<u16>(); // version
|
||||
let table_len = self.stream.read::<u16>()?;
|
||||
// In the OpenType variant, `format` comes first.
|
||||
let format_id = self.stream.read::<u8>()?;
|
||||
let coverage = self.stream.read::<OTCoverage>()?;
|
||||
|
||||
if format_id != 0 && format_id != 2 {
|
||||
// Unknown format.
|
||||
return None;
|
||||
}
|
||||
|
||||
let data_len = if self.number_of_tables == 1 {
|
||||
// An OpenType `kern` table with just one subtable is a special case.
|
||||
// The `table_len` property is mainly required to jump to the next subtable,
|
||||
// but if there is only one subtable, this property can be ignored.
|
||||
// Some fonts abuse this to get around the `u16` size limit.
|
||||
self.stream.tail()?.len()
|
||||
} else {
|
||||
// Subtract the header size.
|
||||
usize::from(table_len).checked_sub(usize::from(HEADER_SIZE))?
|
||||
};
|
||||
|
||||
let data = self.stream.read_bytes(data_len)?;
|
||||
|
||||
let format = match format_id {
|
||||
0 => Format::Format0(Subtable0::parse(data)?),
|
||||
2 => Format::Format2(Subtable2::parse(HEADER_SIZE, data)?),
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
Some(Subtable {
|
||||
horizontal: coverage.is_horizontal(),
|
||||
variable: false, // Only AAT supports it.
|
||||
has_cross_stream: coverage.has_cross_stream(),
|
||||
has_state_machine: format_id == 1,
|
||||
format,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A format 0 subtable.
|
||||
///
|
||||
/// Ordered List of Kerning Pairs.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable0<'a> {
|
||||
/// A list of kerning pairs.
|
||||
pub pairs: LazyArray16<'a, KerningPair>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable0<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let number_of_pairs = s.read::<u16>()?;
|
||||
s.advance(6); // search_range (u16) + entry_selector (u16) + range_shift (u16)
|
||||
let pairs = s.read_array16::<KerningPair>(number_of_pairs)?;
|
||||
Some(Self { pairs })
|
||||
}
|
||||
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
let needle = u32::from(left.0) << 16 | u32::from(right.0);
|
||||
self.pairs.binary_search_by(|v| v.pair.cmp(&needle)).map(|(_, v)| v.value)
|
||||
}
|
||||
}
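// Worked example (illustrative): for `left = GlyphId(2)` and `right = GlyphId(5)`
// the needle is `0x0002_0005`, matching how `KerningPair::pair` packs the left
// glyph into the high 16 bits and the right glyph into the low 16 bits.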
|
||||
|
||||
|
||||
/// A format 2 subtable.
|
||||
///
|
||||
/// Simple n x m Array of Kerning Values.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable2<'a> {
|
||||
// TODO: parse actual structure
|
||||
data: &'a [u8],
|
||||
header_len: u8,
|
||||
}
|
||||
|
||||
impl<'a> Subtable2<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(header_len: u8, data: &'a [u8]) -> Option<Self> {
|
||||
Some(Self { header_len, data })
|
||||
}
|
||||
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
let mut s = Stream::new(self.data);
|
||||
s.skip::<u16>(); // row_width
|
||||
|
||||
// Offsets are from beginning of the subtable and not from the `data` start,
|
||||
// so we have to subtract the header.
|
||||
let header_len = usize::from(self.header_len);
|
||||
let left_hand_table_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;
|
||||
let right_hand_table_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;
|
||||
let array_offset = s.read::<Offset16>()?.to_usize().checked_sub(header_len)?;
|
||||
|
||||
// 'The array can be indexed by completing the left-hand and right-hand class mappings,
|
||||
// adding the class values to the address of the subtable,
|
||||
// and fetching the kerning value to which the new address points.'
|
||||
|
||||
let left_class = get_format2_class(left.0, left_hand_table_offset, self.data).unwrap_or(0);
|
||||
let right_class = get_format2_class(right.0, right_hand_table_offset, self.data).unwrap_or(0);
|
||||
|
||||
// 'Values within the left-hand offset table should not be less than the kerning array offset.'
|
||||
if usize::from(left_class) < array_offset {
|
||||
return None;
|
||||
}
|
||||
|
||||
// Classes are already premultiplied, so we only need to sum them.
|
||||
let index = usize::from(left_class) + usize::from(right_class);
|
||||
let value_offset = index.checked_sub(header_len)?;
|
||||
Stream::read_at::<i16>(self.data, value_offset)
|
||||
}
|
||||
}
|
||||
|
||||
pub(crate) fn get_format2_class(glyph_id: u16, offset: usize, data: &[u8]) -> Option<u16> {
|
||||
let mut s = Stream::new_at(data, offset)?;
|
||||
let first_glyph = s.read::<u16>()?;
|
||||
let index = glyph_id.checked_sub(first_glyph)?;
|
||||
|
||||
let number_of_classes = s.read::<u16>()?;
|
||||
let classes = s.read_array16::<u16>(number_of_classes)?;
|
||||
classes.get(index)
|
||||
}
|
||||
|
||||
|
||||
/// A format 3 subtable.
|
||||
///
|
||||
/// Simple n x m Array of Kerning Indices.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable3<'a> {
|
||||
// TODO: parse actual structure
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable3<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
Some(Self { data })
|
||||
}
|
||||
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
let mut s = Stream::new(self.data);
|
||||
let glyph_count = s.read::<u16>()?;
|
||||
let kerning_values_count = s.read::<u8>()?;
|
||||
let left_hand_classes_count = s.read::<u8>()?;
|
||||
let right_hand_classes_count = s.read::<u8>()?;
|
||||
s.skip::<u8>(); // reserved
|
||||
let indices_count = u16::from(left_hand_classes_count) * u16::from(right_hand_classes_count);
|
||||
|
||||
let kerning_values = s.read_array16::<i16>(u16::from(kerning_values_count))?;
|
||||
let left_hand_classes = s.read_array16::<u8>(glyph_count)?;
|
||||
let right_hand_classes = s.read_array16::<u8>(glyph_count)?;
|
||||
let indices = s.read_array16::<u8>(indices_count)?;
|
||||
|
||||
let left_class = left_hand_classes.get(left.0)?;
|
||||
let right_class = right_hand_classes.get(right.0)?;
|
||||
|
||||
if left_class > left_hand_classes_count || right_class > right_hand_classes_count {
|
||||
return None;
|
||||
}
|
||||
|
||||
let index = u16::from(left_class) * u16::from(right_hand_classes_count) + u16::from(right_class);
|
||||
let index = indices.get(index)?;
|
||||
kerning_values.get(u16::from(index))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Kerning Table](https://docs.microsoft.com/en-us/typography/opentype/spec/kern).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of subtables.
|
||||
pub subtables: Subtables<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
// The `kern` table has two variants: OpenType and Apple.
|
||||
// And they both have different headers.
|
||||
// There is no robust way to distinguish them, so we have to guess.
|
||||
//
|
||||
// The OpenType variant has the first two bytes (UInt16) as a version set to 0,
|
||||
// while the Apple variant has the first four bytes (Fixed) set to 1.0.
|
||||
// So the first two bytes will be 0x0000 for the OpenType format
|
||||
// and 0x0001 for the Apple format.
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u16>()?;
|
||||
let subtables = if version == 0 {
|
||||
let count = s.read::<u16>()?;
|
||||
Subtables {
|
||||
is_aat: false,
|
||||
count: u32::from(count),
|
||||
data: s.tail()?,
|
||||
}
|
||||
} else {
|
||||
s.skip::<u16>(); // Skip the second part of u32 version.
|
||||
// Note that AAT stores the number of tables as u32 and not as u16.
|
||||
let count = s.read::<u32>()?;
|
||||
Subtables {
|
||||
is_aat: true,
|
||||
count,
|
||||
data: s.tail()?,
|
||||
}
|
||||
};
|
||||
|
||||
Some(Self { subtables })
|
||||
}
|
||||
}
|
|
@ -0,0 +1,480 @@
|
|||
//! An [Extended Kerning Table](
|
||||
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kerx.html) implementation.
|
||||
|
||||
// TODO: find a way to test this table
|
||||
// This table is basically untested because it uses Apple's State Tables
|
||||
// and I have no idea how to generate them.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::{aat, GlyphId};
|
||||
use crate::kern::KerningPair;
|
||||
use crate::parser::{Stream, FromData, NumFrom, Offset32, Offset, LazyArray32};
|
||||
|
||||
const HEADER_SIZE: usize = 12;
|
||||
|
||||
/// A format 0 subtable.
|
||||
///
|
||||
/// Ordered List of Kerning Pairs.
|
||||
///
|
||||
/// The same as in `kern`, but uses `LazyArray32` instead of `LazyArray16`.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Subtable0<'a> {
|
||||
/// A list of kerning pairs.
|
||||
pub pairs: LazyArray32<'a, KerningPair>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable0<'a> {
|
||||
/// Parses a subtable from raw data.
|
||||
fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let number_of_pairs = s.read::<u32>()?;
|
||||
s.advance(12); // search_range (u32) + entry_selector (u32) + range_shift (u32)
|
||||
let pairs = s.read_array32::<KerningPair>(number_of_pairs)?;
|
||||
Some(Self { pairs })
|
||||
}
|
||||
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
let needle = u32::from(left.0) << 16 | u32::from(right.0);
|
||||
self.pairs.binary_search_by(|v| v.pair.cmp(&needle)).map(|(_, v)| v.value)
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A state machine entry.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct EntryData {
|
||||
/// An action index.
|
||||
pub action_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for EntryData {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(EntryData {
|
||||
action_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A format 1 subtable.
|
||||
///
|
||||
/// State Table for Contextual Kerning.
|
||||
#[derive(Clone)]
|
||||
pub struct Subtable1<'a> {
|
||||
/// A state table.
|
||||
pub state_table: aat::ExtendedStateTable<'a, EntryData>,
|
||||
actions_data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Subtable1<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let state_table = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
|
||||
|
||||
// Actions offset is right after the state table.
|
||||
let actions_offset = s.read::<Offset32>()?;
|
||||
// Actions offset is from the start of the state table and not from the start of the subtable.
|
||||
// And since we don't know the length of the actions data,
|
||||
// simply store all the data after the offset.
|
||||
let actions_data = data.get(actions_offset.to_usize()..)?;
|
||||
|
||||
Some(Subtable1 {
|
||||
state_table,
|
||||
actions_data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns kerning at action index.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, action_index: u16) -> Option<i16> {
|
||||
Stream::read_at(self.actions_data, usize::from(action_index) * i16::SIZE)
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> core::ops::Deref for Subtable1<'a> {
|
||||
type Target = aat::ExtendedStateTable<'a, EntryData>;
|
||||
|
||||
fn deref(&self) -> &Self::Target {
|
||||
&self.state_table
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable1<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable1 {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A format 2 subtable.
|
||||
///
|
||||
/// Simple n x m Array of Kerning Values.
|
||||
///
|
||||
/// The same as in `kern`, but uses 32-bit offsets instead of 16-bit ones.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable2<'a>(&'a [u8]); // TODO: parse actual structure
|
||||
|
||||
impl<'a> Subtable2<'a> {
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
let mut s = Stream::new(self.0);
|
||||
s.skip::<u32>(); // row_width
|
||||
|
||||
// Offsets are from beginning of the subtable and not from the `data` start,
|
||||
// so we have to subtract the header.
|
||||
let left_hand_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
let right_hand_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
let array_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
|
||||
// 'The array can be indexed by completing the left-hand and right-hand class mappings,
|
||||
// adding the class values to the address of the subtable,
|
||||
// and fetching the kerning value to which the new address points.'
|
||||
|
||||
let left_class = crate::kern::get_format2_class(left.0, left_hand_table_offset, self.0).unwrap_or(0);
|
||||
let right_class = crate::kern::get_format2_class(right.0, right_hand_table_offset, self.0).unwrap_or(0);
|
||||
|
||||
// 'Values within the left-hand offset table should not be less than the kerning array offset.'
|
||||
if usize::from(left_class) < array_offset {
|
||||
return None;
|
||||
}
|
||||
|
||||
// Classes are already premultiplied, so we only need to sum them.
|
||||
let index = usize::from(left_class) + usize::from(right_class);
|
||||
let value_offset = index.checked_sub(HEADER_SIZE)?;
|
||||
Stream::read_at::<i16>(self.0, value_offset)
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable2<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable2 {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A container of Anchor Points used by [`Subtable4`].
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct AnchorPoints<'a>(&'a [u8]);
|
||||
|
||||
impl AnchorPoints<'_> {
|
||||
/// Returns a mark and current anchor points at action index.
|
||||
pub fn get(&self, action_index: u16) -> Option<(u16, u16)> {
|
||||
let offset = usize::from(action_index) * u16::SIZE;
|
||||
let mut s = Stream::new_at(self.0, offset)?;
|
||||
Some((s.read::<u16>()?, s.read::<u16>()?))
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for AnchorPoints<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "AnchorPoints {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
/// A format 4 subtable.
|
||||
///
|
||||
/// State Table for Control Point/Anchor Point Positioning.
|
||||
///
|
||||
/// Note: I wasn't able to find any fonts that actually use
|
||||
/// `ControlPointActions` and/or `ControlPointCoordinateActions`,
|
||||
/// therefore only `AnchorPointActions` is supported.
|
||||
#[derive(Clone)]
|
||||
pub struct Subtable4<'a> {
|
||||
/// A state table.
|
||||
pub state_table: aat::ExtendedStateTable<'a, EntryData>,
|
||||
/// Anchor points.
|
||||
pub anchor_points: AnchorPoints<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable4<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let state_table = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
|
||||
let flags = s.read::<u32>()?;
|
||||
let action_type = ((flags & 0xC0000000) >> 30) as u8;
|
||||
let points_offset = usize::num_from(flags & 0x00FFFFFF);
|
||||
|
||||
// We support only Anchor Point Actions.
|
||||
if action_type != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
Some(Self {
|
||||
state_table,
|
||||
anchor_points: AnchorPoints(data.get(points_offset..)?),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> core::ops::Deref for Subtable4<'a> {
|
||||
type Target = aat::ExtendedStateTable<'a, EntryData>;
|
||||
|
||||
fn deref(&self) -> &Self::Target {
|
||||
&self.state_table
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable4<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable4 {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A format 6 subtable.
|
||||
///
|
||||
/// Simple Index-based n x m Array of Kerning Values.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtable6<'a>{
|
||||
data: &'a [u8],
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> Subtable6<'a> {
|
||||
// TODO: parse actual structure
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Self {
|
||||
Subtable6 { number_of_glyphs, data }
|
||||
}
|
||||
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
use core::convert::TryFrom;
|
||||
|
||||
let mut s = Stream::new(self.data);
|
||||
let flags = s.read::<u32>()?;
|
||||
s.skip::<u16>(); // row_count
|
||||
s.skip::<u16>(); // col_count
|
||||
// All offsets are from the start of the subtable, while `data` starts after the header, hence the subtraction.
|
||||
let row_index_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
let column_index_table_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
let kerning_array_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
let kerning_vector_offset = s.read::<Offset32>()?.to_usize().checked_sub(HEADER_SIZE)?;
|
||||
|
||||
let row_index_table_data = self.data.get(row_index_table_offset..)?;
|
||||
let column_index_table_data = self.data.get(column_index_table_offset..)?;
|
||||
let kerning_array_data = self.data.get(kerning_array_offset..)?;
|
||||
let kerning_vector_data = self.data.get(kerning_vector_offset..)?;
|
||||
|
||||
let has_long_values = flags & 0x00000001 != 0;
|
||||
if has_long_values {
|
||||
let l: u32 = aat::Lookup::parse(self.number_of_glyphs, row_index_table_data)?
|
||||
.value(left).unwrap_or(0) as u32;
|
||||
|
||||
let r: u32 = aat::Lookup::parse(self.number_of_glyphs, column_index_table_data)?
|
||||
.value(right).unwrap_or(0) as u32;
|
||||
|
||||
let array_offset = usize::try_from(l + r).ok()?.checked_mul(i32::SIZE)?;
|
||||
let vector_offset: u32 = Stream::read_at(kerning_array_data, array_offset)?;
|
||||
|
||||
Stream::read_at(kerning_vector_data, usize::num_from(vector_offset))
|
||||
} else {
|
||||
let l: u16 = aat::Lookup::parse(self.number_of_glyphs, row_index_table_data)?
|
||||
.value(left).unwrap_or(0);
|
||||
|
||||
let r: u16 = aat::Lookup::parse(self.number_of_glyphs, column_index_table_data)?
|
||||
.value(right).unwrap_or(0);
|
||||
|
||||
let array_offset = usize::try_from(l + r).ok()?.checked_mul(i16::SIZE)?;
|
||||
let vector_offset: u16 = Stream::read_at(kerning_array_data, array_offset)?;
|
||||
|
||||
Stream::read_at(kerning_vector_data, usize::from(vector_offset))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtable6<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtable6 {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An extended kerning subtable format.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Debug)]
|
||||
pub enum Format<'a> {
|
||||
Format0(Subtable0<'a>),
|
||||
Format1(Subtable1<'a>),
|
||||
Format2(Subtable2<'a>),
|
||||
Format4(Subtable4<'a>),
|
||||
Format6(Subtable6<'a>),
|
||||
}
|
||||
|
||||
/// A kerning subtable.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Subtable<'a> {
|
||||
/// Indicates that the subtable is for horizontal text.
pub horizontal: bool,
/// Indicates that the subtable is variable.
pub variable: bool,
/// Indicates that the subtable has cross-stream values.
pub has_cross_stream: bool,
/// Indicates that the subtable uses a state machine.
|
||||
///
|
||||
/// In this case `glyphs_kerning()` will return `None`.
|
||||
pub has_state_machine: bool,
|
||||
/// The tuple count.
|
||||
///
|
||||
/// This value is only used with variation fonts and should be 0 for all other fonts.
|
||||
pub tuple_count: u32,
|
||||
/// Subtable format.
|
||||
pub format: Format<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Subtable<'a> {
|
||||
/// Returns kerning for a pair of glyphs.
|
||||
///
|
||||
/// Returns `None` for state-machine-based subtables.
|
||||
#[inline]
|
||||
pub fn glyphs_kerning(&self, left: GlyphId, right: GlyphId) -> Option<i16> {
|
||||
match self.format {
|
||||
Format::Format0(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
Format::Format1(_) => None,
|
||||
Format::Format2(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
Format::Format4(_) => None,
|
||||
Format::Format6(ref subtable) => subtable.glyphs_kerning(left, right),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct Coverage(u8);
|
||||
|
||||
impl Coverage {
|
||||
#[inline] pub fn is_horizontal(self) -> bool { self.0 & 0x80 == 0 }
#[inline] pub fn has_cross_stream(self) -> bool { self.0 & 0x40 != 0 }
#[inline] pub fn is_variable(self) -> bool { self.0 & 0x20 != 0 }
|
||||
}
|
||||
|
||||
|
||||
/// A list of extended kerning subtables.
|
||||
///
|
||||
/// The internal data layout is not designed for random access,
/// therefore we provide only an iterator and no `get()` method.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtables<'a> {
|
||||
/// The number of glyphs from the `maxp` table.
|
||||
number_of_glyphs: NonZeroU16,
|
||||
/// The total number of tables.
|
||||
number_of_tables: u32,
|
||||
/// Actual data. Starts right after the `kerx` header.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtables<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtables {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Subtables<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
type IntoIter = SubtablesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SubtablesIter {
|
||||
number_of_glyphs: self.number_of_glyphs,
|
||||
table_index: 0,
|
||||
number_of_tables: self.number_of_tables,
|
||||
stream: Stream::new(self.data),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over extended kerning subtables.
|
||||
#[allow(missing_debug_implementations)]
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct SubtablesIter<'a> {
|
||||
/// The number of glyphs from the `maxp` table.
|
||||
number_of_glyphs: NonZeroU16,
|
||||
/// The current table index.
|
||||
table_index: u32,
|
||||
/// The total number of tables.
|
||||
number_of_tables: u32,
|
||||
/// Actual data. Starts right after the `kerx` header.
|
||||
stream: Stream<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SubtablesIter<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.table_index == self.number_of_tables {
|
||||
return None;
|
||||
}
|
||||
|
||||
if self.stream.at_end() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let s = &mut self.stream;
|
||||
|
||||
let table_len = s.read::<u32>()?;
|
||||
let coverage = Coverage(s.read::<u8>()?);
|
||||
s.skip::<u16>(); // unused
|
||||
let raw_format = s.read::<u8>()?;
|
||||
let tuple_count = s.read::<u32>()?;
|
||||
|
||||
// Subtract the header size.
|
||||
let data_len = usize::num_from(table_len).checked_sub(HEADER_SIZE)?;
|
||||
let data = s.read_bytes(data_len)?;
|
||||
|
||||
let format = match raw_format {
|
||||
0 => Subtable0::parse(data).map(Format::Format0)?,
|
||||
1 => Subtable1::parse(self.number_of_glyphs, data).map(Format::Format1)?,
|
||||
2 => Format::Format2(Subtable2(data)),
|
||||
4 => Subtable4::parse(self.number_of_glyphs, data).map(Format::Format4)?,
|
||||
6 => Format::Format6(Subtable6::parse(self.number_of_glyphs, data)),
|
||||
_ => {
|
||||
// Unknown format.
|
||||
return None;
|
||||
}
|
||||
};
|
||||
|
||||
Some(Subtable {
|
||||
horizontal: coverage.is_horizontal(),
|
||||
variable: coverage.is_variable(),
|
||||
has_cross_stream: coverage.has_cross_stream(),
|
||||
has_state_machine: raw_format == 1 || raw_format == 4,
|
||||
tuple_count,
|
||||
format,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Extended Kerning Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6kerx.html).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of subtables.
|
||||
pub subtables: Subtables<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // version
|
||||
s.skip::<u16>(); // padding
|
||||
let number_of_tables = s.read::<u32>()?;
|
||||
let subtables = Subtables {
|
||||
number_of_glyphs,
|
||||
number_of_tables,
|
||||
data: s.tail()?,
|
||||
};
|
||||
|
||||
Some(Table { subtables })
|
||||
}
|
||||
}
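// A minimal usage sketch, not part of the upstream sources. `kerx_data` and
// `number_of_glyphs` are hypothetical: the raw `kerx` table bytes and the glyph count
// from `maxp`. It assumes a surrounding function that returns `Option`.
//
//     let table = Table::parse(number_of_glyphs, kerx_data)?;
//     for subtable in table.subtables {
//         if !subtable.horizontal || subtable.has_state_machine {
//             continue;
//         }
//         if let Some(kerning) = subtable.glyphs_kerning(GlyphId(36), GlyphId(37)) {
//             // `kerning` is in font units.
//         }
//     }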
@ -0,0 +1,89 @@
//! An [Index to Location Table](https://docs.microsoft.com/en-us/typography/opentype/spec/loca)
|
||||
//! implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
use core::ops::Range;
|
||||
|
||||
use crate::{GlyphId, IndexToLocationFormat};
|
||||
use crate::parser::{Stream, LazyArray16, NumFrom};
|
||||
|
||||
/// An [Index to Location Table](https://docs.microsoft.com/en-us/typography/opentype/spec/loca).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub enum Table<'a> {
|
||||
/// Short offsets.
|
||||
Short(LazyArray16<'a, u16>),
|
||||
/// Long offsets.
|
||||
Long(LazyArray16<'a, u32>),
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// - `number_of_glyphs` is from the `maxp` table.
|
||||
/// - `format` is from the `head` table.
|
||||
pub fn parse(
|
||||
number_of_glyphs: NonZeroU16,
|
||||
format: IndexToLocationFormat,
|
||||
data: &'a [u8],
|
||||
) -> Option<Self> {
|
||||
// The number of offsets is `maxp.numGlyphs + 1`.
|
||||
//
|
||||
// Check for overflow first.
|
||||
let total = if number_of_glyphs.get() == core::u16::MAX {
|
||||
number_of_glyphs.get()
|
||||
} else {
|
||||
number_of_glyphs.get() + 1
|
||||
};
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
match format {
|
||||
IndexToLocationFormat::Short => {
|
||||
Some(Table::Short(s.read_array16::<u16>(total)?))
|
||||
}
|
||||
IndexToLocationFormat::Long => {
|
||||
Some(Table::Long(s.read_array16::<u32>(total)?))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns the number of offsets.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
match self {
|
||||
Table::Short(ref array) => array.len(),
|
||||
Table::Long(ref array) => array.len(),
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns glyph's range in the `glyf` table.
|
||||
#[inline]
|
||||
pub fn glyph_range(&self, glyph_id: GlyphId) -> Option<Range<usize>> {
|
||||
let glyph_id = glyph_id.0;
|
||||
if glyph_id == core::u16::MAX {
|
||||
return None;
|
||||
}
|
||||
|
||||
// The glyph ID must be smaller than the total number of values in the `loca` array.
|
||||
if glyph_id + 1 >= self.len() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let range = match self {
|
||||
Table::Short(ref array) => {
|
||||
// 'The actual local offset divided by 2 is stored.'
|
||||
usize::from(array.get(glyph_id)?) * 2 .. usize::from(array.get(glyph_id + 1)?) * 2
|
||||
}
|
||||
Table::Long(ref array) => {
|
||||
usize::num_from(array.get(glyph_id)?) .. usize::num_from(array.get(glyph_id + 1)?)
|
||||
}
|
||||
};
|
||||
|
||||
if range.start >= range.end {
|
||||
// 'The offsets must be in ascending order.'
|
||||
// And range cannot be empty.
|
||||
None
|
||||
} else {
|
||||
Some(range)
|
||||
}
|
||||
}
|
||||
}
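// A minimal usage sketch, not part of the upstream sources. `loca_data`, `glyf_data`
// and `format` are hypothetical: the raw `loca` and `glyf` table bytes and the
// `indexToLocFormat` value from `head`. It assumes a surrounding function that returns `Option`.
//
//     let loca = Table::parse(number_of_glyphs, format, loca_data)?;
//     let range = loca.glyph_range(GlyphId(36))?;
//     let glyph_data = glyf_data.get(range)?;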
@ -0,0 +1,30 @@
//! A [Maximum Profile Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/maxp) implementation.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::parser::Stream;
|
||||
|
||||
/// A [Maximum Profile Table](https://docs.microsoft.com/en-us/typography/opentype/spec/maxp).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table {
|
||||
/// The total number of glyphs in the face.
|
||||
pub number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl Table {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u32>()?;
|
||||
if !(version == 0x00005000 || version == 0x00010000) {
|
||||
return None;
|
||||
}
|
||||
|
||||
let n = s.read::<u16>()?;
|
||||
let number_of_glyphs = NonZeroU16::new(n)?;
|
||||
Some(Table {
|
||||
number_of_glyphs
|
||||
})
|
||||
}
|
||||
}
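// A minimal usage sketch, not part of the upstream sources. `maxp_data` is hypothetical:
// the raw `maxp` table bytes. It assumes a surrounding function that returns `Option`.
//
//     let maxp = Table::parse(maxp_data)?;
//     let number_of_glyphs: NonZeroU16 = maxp.number_of_glyphs;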
@ -0,0 +1,38 @@
pub mod cbdt;
|
||||
pub mod cblc;
|
||||
mod cff;
|
||||
pub mod cmap;
|
||||
pub mod glyf;
|
||||
pub mod head;
|
||||
pub mod hhea;
|
||||
pub mod hmtx;
|
||||
pub mod kern;
|
||||
pub mod loca;
|
||||
pub mod maxp;
|
||||
pub mod name;
|
||||
pub mod os2;
|
||||
pub mod post;
|
||||
pub mod sbix;
|
||||
pub mod svg;
|
||||
pub mod vhea;
|
||||
pub mod vorg;
|
||||
|
||||
#[cfg(feature = "opentype-layout")] pub mod gdef;
|
||||
#[cfg(feature = "opentype-layout")] pub mod gsub;
|
||||
#[cfg(feature = "opentype-layout")] pub mod gpos;
|
||||
|
||||
#[cfg(feature = "apple-layout")] pub mod ankr;
|
||||
#[cfg(feature = "apple-layout")] pub mod feat;
|
||||
#[cfg(feature = "apple-layout")] pub mod kerx;
|
||||
#[cfg(feature = "apple-layout")] pub mod morx;
|
||||
#[cfg(feature = "apple-layout")] pub mod trak;
|
||||
|
||||
#[cfg(feature = "variable-fonts")] pub mod avar;
|
||||
#[cfg(feature = "variable-fonts")] pub mod fvar;
|
||||
#[cfg(feature = "variable-fonts")] pub mod gvar;
|
||||
#[cfg(feature = "variable-fonts")] pub mod hvar;
|
||||
#[cfg(feature = "variable-fonts")] pub mod mvar;
|
||||
|
||||
pub use cff::cff1;
|
||||
pub use cff::CFFError;
|
||||
#[cfg(feature = "variable-fonts")] pub use cff::cff2;
@ -0,0 +1,483 @@
//! An [Extended Glyph Metamorphosis Table](
|
||||
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6morx.html) implementation.
|
||||
|
||||
// Note: We do not have tests for this table because it has a very complicated structure.
// Specifically, the State Machine Tables. I have no idea how to generate them.
// And all fonts that use this table are mainly Apple ones, so we cannot use them for legal reasons.
//
// On the other hand, this table is tested indirectly by https://github.com/RazrFalcon/rustybuzz
// and its roughly 170 tests, which is pretty good.
// Therefore, after applying any changes to this table,
// you have to check that all rustybuzz tests still pass.
|
||||
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::{aat, GlyphId};
|
||||
use crate::parser::{Stream, FromData, LazyArray32, NumFrom, Offset32, Offset};
|
||||
|
||||
/// The feature table is used to compute the sub-feature flags
|
||||
/// for a list of requested features and settings.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Feature {
|
||||
/// The type of feature.
|
||||
pub kind: u16,
|
||||
/// The feature's setting (aka selector).
|
||||
pub setting: u16,
|
||||
/// Flags for the settings that this feature and setting enables.
|
||||
pub enable_flags: u32,
|
||||
/// Complement of flags for the settings that this feature and setting disable.
|
||||
pub disable_flags: u32,
|
||||
}
|
||||
|
||||
impl FromData for Feature {
|
||||
const SIZE: usize = 12;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(Feature {
|
||||
kind: s.read::<u16>()?,
|
||||
setting: s.read::<u16>()?,
|
||||
enable_flags: s.read::<u32>()?,
|
||||
disable_flags: s.read::<u32>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Trailing data of a contextual subtable state table entry.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct ContextualEntryData {
|
||||
/// A mark index.
|
||||
pub mark_index: u16,
|
||||
/// A current index.
|
||||
pub current_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for ContextualEntryData {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(ContextualEntryData {
|
||||
mark_index: s.read::<u16>()?,
|
||||
current_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A contextual subtable.
|
||||
#[derive(Clone)]
|
||||
pub struct ContextualSubtable<'a> {
|
||||
/// The contextual glyph substitution state table.
|
||||
pub state: aat::ExtendedStateTable<'a, ContextualEntryData>,
|
||||
offsets_data: &'a [u8],
|
||||
offsets: LazyArray32<'a, Offset32>,
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> ContextualSubtable<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
|
||||
|
||||
// While the spec clearly states that this is an
// 'offset from the beginning of the state subtable',
// it's actually not. The subtable header should not be included.
|
||||
let offset = s.read::<Offset32>()?.to_usize();
|
||||
|
||||
// The offsets list is unsized.
|
||||
let offsets_data = data.get(offset..)?;
|
||||
let offsets = LazyArray32::<Offset32>::new(offsets_data);
|
||||
|
||||
Some(ContextualSubtable {
|
||||
state,
|
||||
offsets_data,
|
||||
offsets,
|
||||
number_of_glyphs,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a [Lookup](aat::Lookup) at index.
|
||||
pub fn lookup(&self, index: u32) -> Option<aat::Lookup<'a>> {
|
||||
let offset = self.offsets.get(index)?.to_usize();
|
||||
let lookup_data = self.offsets_data.get(offset..)?;
|
||||
aat::Lookup::parse(self.number_of_glyphs, lookup_data)
|
||||
}
|
||||
}
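// A minimal usage sketch, not part of the upstream sources. `subtable`, `mark_index`
// and `marked_glyph` are hypothetical: a `ContextualSubtable` taken from a parsed chain,
// an index taken from a `ContextualEntryData` and the currently marked glyph.
// It assumes a surrounding function that returns `Option`.
//
//     let lookup = subtable.lookup(u32::from(mark_index))?;
//     let substitute = GlyphId(lookup.value(marked_glyph)?);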
|
||||
|
||||
impl core::fmt::Debug for ContextualSubtable<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "ContextualSubtable {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A ligature subtable.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct LigatureSubtable<'a> {
|
||||
/// A state table.
|
||||
pub state: aat::ExtendedStateTable<'a, u16>,
|
||||
/// Ligature actions.
|
||||
pub ligature_actions: LazyArray32<'a, u32>,
|
||||
/// Ligature components.
|
||||
pub components: LazyArray32<'a, u16>,
|
||||
/// Ligatures.
|
||||
pub ligatures: LazyArray32<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> LigatureSubtable<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
|
||||
|
||||
// Offsets are from the `ExtendedStateTable`/`data` start, not from the subtable start.
|
||||
let ligature_action_offset = s.read::<Offset32>()?.to_usize();
|
||||
let component_offset = s.read::<Offset32>()?.to_usize();
|
||||
let ligature_offset = s.read::<Offset32>()?.to_usize();
|
||||
|
||||
// All three arrays are unsized, so we simply read/map all the data past the offset.
|
||||
let ligature_actions = LazyArray32::<u32>::new(data.get(ligature_action_offset..)?);
|
||||
let components = LazyArray32::<u16>::new(data.get(component_offset..)?);
|
||||
let ligatures = LazyArray32::<GlyphId>::new(data.get(ligature_offset..)?);
|
||||
|
||||
Some(LigatureSubtable {
|
||||
state,
|
||||
ligature_actions,
|
||||
components,
|
||||
ligatures,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Trailing data of an insertion subtable state table entry.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct InsertionEntryData {
|
||||
/// A current insert index.
|
||||
pub current_insert_index: u16,
|
||||
/// A marked insert index.
|
||||
pub marked_insert_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for InsertionEntryData {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(InsertionEntryData {
|
||||
current_insert_index: s.read::<u16>()?,
|
||||
marked_insert_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An insertion subtable.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct InsertionSubtable<'a> {
|
||||
/// A state table.
|
||||
pub state: aat::ExtendedStateTable<'a, InsertionEntryData>,
|
||||
/// Insertion glyphs.
|
||||
pub glyphs: LazyArray32<'a, GlyphId>,
|
||||
}
|
||||
|
||||
impl<'a> InsertionSubtable<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let state = aat::ExtendedStateTable::parse(number_of_glyphs, &mut s)?;
|
||||
let offset = s.read::<Offset32>()?.to_usize();
|
||||
|
||||
// TODO: unsized array?
|
||||
// The list is unsized.
|
||||
let glyphs = LazyArray32::<GlyphId>::new(data.get(offset..)?);
|
||||
|
||||
Some(InsertionSubtable {
|
||||
state,
|
||||
glyphs,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A subtable kind.
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Debug)]
|
||||
pub enum SubtableKind<'a> {
|
||||
Rearrangement(aat::ExtendedStateTable<'a, ()>),
|
||||
Contextual(ContextualSubtable<'a>),
|
||||
Ligature(LigatureSubtable<'a>),
|
||||
NonContextual(aat::Lookup<'a>),
|
||||
Insertion(InsertionSubtable<'a>),
|
||||
}
|
||||
|
||||
|
||||
/// A subtable coverage.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Coverage(u8);
|
||||
|
||||
impl Coverage {
|
||||
/// If true, this subtable will process glyphs in logical order
|
||||
/// (or reverse logical order if [`is_vertical`](Self::is_vertical) is also true).
|
||||
#[inline] pub fn is_logical(self) -> bool { self.0 & 0x10 != 0 }
|
||||
/// If true, this subtable will be applied to both horizontal and vertical text
|
||||
/// ([`is_vertical`](Self::is_vertical) should be ignored).
|
||||
#[inline] pub fn is_all_directions(self) -> bool { self.0 & 0x20 != 0 }
|
||||
/// If true, this subtable will process glyphs in descending order.
|
||||
#[inline] pub fn is_backwards(self) -> bool { self.0 & 0x40 != 0 }
|
||||
/// If true, this subtable will only be applied to vertical text.
|
||||
#[inline] pub fn is_vertical(self) -> bool { self.0 & 0x80 != 0 }
|
||||
}
|
||||
|
||||
|
||||
/// A subtable in a metamorphosis chain.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Subtable<'a> {
|
||||
/// A subtable kind.
|
||||
pub kind: SubtableKind<'a>,
|
||||
/// A subtable coverage.
|
||||
pub coverage: Coverage,
|
||||
/// Subtable feature flags.
|
||||
pub feature_flags: u32,
|
||||
}
|
||||
|
||||
|
||||
/// A list of subtables in a metamorphosis chain.
|
||||
///
|
||||
/// The internal data layout is not designed for random access,
/// therefore we provide only an iterator and no `get()` method.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Subtables<'a> {
|
||||
count: u32,
|
||||
data: &'a [u8],
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Subtables<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
type IntoIter = SubtablesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SubtablesIter {
|
||||
index: 0,
|
||||
count: self.count,
|
||||
stream: Stream::new(self.data),
|
||||
number_of_glyphs: self.number_of_glyphs,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Subtables<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Subtables {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An iterator over a metamorphosis chain subtables.
|
||||
#[allow(missing_debug_implementations)]
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct SubtablesIter<'a> {
|
||||
index: u32,
|
||||
count: u32,
|
||||
stream: Stream<'a>,
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SubtablesIter<'a> {
|
||||
type Item = Subtable<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index == self.count {
|
||||
return None;
|
||||
}
|
||||
|
||||
let s = &mut self.stream;
|
||||
if s.at_end() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let len = s.read::<u32>()?;
|
||||
let coverage = Coverage(s.read::<u8>()?);
|
||||
s.skip::<u16>(); // reserved
|
||||
let kind = s.read::<u8>()?;
|
||||
let feature_flags = s.read::<u32>()?;
|
||||
|
||||
const HEADER_LEN: usize = 12;
|
||||
let len = usize::num_from(len).checked_sub(HEADER_LEN)?;
|
||||
let subtables_data = s.read_bytes(len)?;
|
||||
|
||||
let kind = match kind {
|
||||
0 => {
|
||||
let mut s = Stream::new(subtables_data);
|
||||
let table = aat::ExtendedStateTable::parse(self.number_of_glyphs, &mut s)?;
|
||||
SubtableKind::Rearrangement(table)
|
||||
}
|
||||
1 => {
|
||||
let table = ContextualSubtable::parse(self.number_of_glyphs, subtables_data)?;
|
||||
SubtableKind::Contextual(table)
|
||||
}
|
||||
2 => {
|
||||
let table = LigatureSubtable::parse(self.number_of_glyphs, subtables_data)?;
|
||||
SubtableKind::Ligature(table)
|
||||
}
|
||||
// 3 - reserved
|
||||
4 => {
|
||||
SubtableKind::NonContextual(aat::Lookup::parse(self.number_of_glyphs, subtables_data)?)
|
||||
}
|
||||
5 => {
|
||||
let table = InsertionSubtable::parse(self.number_of_glyphs, subtables_data)?;
|
||||
SubtableKind::Insertion(table)
|
||||
}
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
Some(Subtable {
|
||||
kind,
|
||||
coverage,
|
||||
feature_flags,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A metamorphosis chain.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Chain<'a> {
|
||||
/// Default chain features.
|
||||
pub default_flags: u32,
|
||||
/// A list of chain features.
|
||||
pub features: LazyArray32<'a, Feature>,
|
||||
/// A list of chain subtables.
|
||||
pub subtables: Subtables<'a>,
|
||||
}
|
||||
|
||||
|
||||
/// A list of metamorphosis chains.
|
||||
///
|
||||
/// The internal data layout is not designed for random access,
/// therefore we provide only an iterator and no `get()` method.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Chains<'a> {
|
||||
data: &'a [u8],
|
||||
count: u32,
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> Chains<'a> {
|
||||
fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
s.skip::<u16>(); // version
|
||||
s.skip::<u16>(); // reserved
|
||||
let count = s.read::<u32>()?;
|
||||
|
||||
Some(Chains {
|
||||
count,
|
||||
data: s.tail()?,
|
||||
number_of_glyphs,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Chains<'a> {
|
||||
type Item = Chain<'a>;
|
||||
type IntoIter = ChainsIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
ChainsIter {
|
||||
index: 0,
|
||||
count: self.count,
|
||||
stream: Stream::new(self.data),
|
||||
number_of_glyphs: self.number_of_glyphs,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Chains<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Chains {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over metamorphosis chains.
|
||||
#[allow(missing_debug_implementations)]
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct ChainsIter<'a> {
|
||||
index: u32,
|
||||
count: u32,
|
||||
stream: Stream<'a>,
|
||||
number_of_glyphs: NonZeroU16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for ChainsIter<'a> {
|
||||
type Item = Chain<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index == self.count {
|
||||
return None;
|
||||
}
|
||||
|
||||
if self.stream.at_end() {
|
||||
return None;
|
||||
}
|
||||
|
||||
let default_flags = self.stream.read::<u32>()?;
|
||||
let len = self.stream.read::<u32>()?;
|
||||
let features_count = self.stream.read::<u32>()?;
|
||||
let subtables_count = self.stream.read::<u32>()?;
|
||||
|
||||
let features = self.stream.read_array32::<Feature>(features_count)?;
|
||||
|
||||
const HEADER_LEN: usize = 16;
|
||||
let len = usize::num_from(len)
|
||||
.checked_sub(HEADER_LEN)?
|
||||
.checked_sub(Feature::SIZE * usize::num_from(features_count))?;
|
||||
|
||||
let subtables_data = self.stream.read_bytes(len)?;
|
||||
|
||||
let subtables = Subtables {
|
||||
data: subtables_data,
|
||||
count: subtables_count,
|
||||
number_of_glyphs: self.number_of_glyphs,
|
||||
};
|
||||
|
||||
Some(Chain {
|
||||
default_flags,
|
||||
features,
|
||||
subtables,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [Extended Glyph Metamorphosis Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6morx.html).
|
||||
///
|
||||
/// Subtable Glyph Coverage used by morx v3 is not supported.
|
||||
#[derive(Clone)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of metamorphosis chains.
|
||||
pub chains: Chains<'a>,
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
Chains::parse(number_of_glyphs, data).map(|chains| Self { chains })
|
||||
}
|
||||
}
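// A minimal usage sketch, not part of the upstream sources. `morx_data` and
// `number_of_glyphs` are hypothetical: the raw `morx` table bytes and the glyph count
// from `maxp`. It assumes a surrounding function that returns `Option`.
//
//     let table = Table::parse(number_of_glyphs, morx_data)?;
//     for chain in table.chains {
//         for subtable in chain.subtables {
//             if let SubtableKind::NonContextual(ref lookup) = subtable.kind {
//                 let _ = lookup.value(GlyphId(10));
//             }
//         }
//     }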
@ -0,0 +1,85 @@
//! A [Metrics Variations Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/mvar) implementation.
|
||||
|
||||
use crate::{Tag, NormalizedCoordinate};
|
||||
use crate::parser::{Stream, FromData, Offset, Offset16, LazyArray16};
|
||||
use crate::var_store::ItemVariationStore;
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct ValueRecord {
|
||||
value_tag: Tag,
|
||||
delta_set_outer_index: u16,
|
||||
delta_set_inner_index: u16,
|
||||
}
|
||||
|
||||
impl FromData for ValueRecord {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(ValueRecord {
|
||||
value_tag: s.read::<Tag>()?,
|
||||
delta_set_outer_index: s.read::<u16>()?,
|
||||
delta_set_inner_index: s.read::<u16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Metrics Variations Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/mvar).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
variation_store: ItemVariationStore<'a>,
|
||||
records: LazyArray16<'a, ValueRecord>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
s.skip::<u16>(); // reserved
|
||||
let value_record_size = s.read::<u16>()?;
|
||||
|
||||
if usize::from(value_record_size) != ValueRecord::SIZE {
|
||||
return None;
|
||||
}
|
||||
|
||||
let count = s.read::<u16>()?;
|
||||
if count == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let var_store_offset = s.read::<Option<Offset16>>()??.to_usize();
|
||||
let records = s.read_array16::<ValueRecord>(count)?;
|
||||
let variation_store = ItemVariationStore::parse(Stream::new_at(data, var_store_offset)?)?;
|
||||
|
||||
Some(Table {
|
||||
variation_store,
|
||||
records,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns a metric offset by tag.
|
||||
pub fn metric_offset(&self, tag: Tag, coordinates: &[NormalizedCoordinate]) -> Option<f32> {
|
||||
let (_, record) = self.records.binary_search_by(|r| r.value_tag.cmp(&tag))?;
|
||||
self.variation_store.parse_delta(
|
||||
record.delta_set_outer_index,
|
||||
record.delta_set_inner_index,
|
||||
coordinates
|
||||
)
|
||||
}
|
||||
}
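// A minimal usage sketch, not part of the upstream sources. `mvar_data` and `coords`
// are hypothetical: the raw `MVAR` table bytes and the face's normalized variation
// coordinates. `Tag::from_bytes` is this crate's tag constructor.
// It assumes a surrounding function that returns `Option`.
//
//     let table = Table::parse(mvar_data)?;
//     // Delta to apply to the horizontal ascender at the current coordinates.
//     let hasc_delta = table.metric_offset(Tag::from_bytes(b"hasc"), coords)?;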
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
@ -0,0 +1,319 @@
//! A [Naming Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/name) implementation.
|
||||
|
||||
#[cfg(feature = "std")] use std::vec::Vec;
|
||||
#[cfg(feature = "std")] use std::string::String;
|
||||
|
||||
use crate::parser::{LazyArray16, FromData, Offset, Offset16, Stream};
|
||||
|
||||
/// A list of [name IDs](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-ids).
|
||||
pub mod name_id {
|
||||
#![allow(missing_docs)]
|
||||
|
||||
pub const COPYRIGHT_NOTICE: u16 = 0;
|
||||
pub const FAMILY: u16 = 1;
|
||||
pub const SUBFAMILY: u16 = 2;
|
||||
pub const UNIQUE_ID: u16 = 3;
|
||||
pub const FULL_NAME: u16 = 4;
|
||||
pub const VERSION: u16 = 5;
|
||||
pub const POST_SCRIPT_NAME: u16 = 6;
|
||||
pub const TRADEMARK: u16 = 7;
|
||||
pub const MANUFACTURER: u16 = 8;
|
||||
pub const DESIGNER: u16 = 9;
|
||||
pub const DESCRIPTION: u16 = 10;
|
||||
pub const VENDOR_URL: u16 = 11;
|
||||
pub const DESIGNER_URL: u16 = 12;
|
||||
pub const LICENSE: u16 = 13;
|
||||
pub const LICENSE_URL: u16 = 14;
|
||||
// RESERVED = 15
|
||||
pub const TYPOGRAPHIC_FAMILY: u16 = 16;
|
||||
pub const TYPOGRAPHIC_SUBFAMILY: u16 = 17;
|
||||
pub const COMPATIBLE_FULL: u16 = 18;
|
||||
pub const SAMPLE_TEXT: u16 = 19;
|
||||
pub const POST_SCRIPT_CID: u16 = 20;
|
||||
pub const WWS_FAMILY: u16 = 21;
|
||||
pub const WWS_SUBFAMILY: u16 = 22;
|
||||
pub const LIGHT_BACKGROUND_PALETTE: u16 = 23;
|
||||
pub const DARK_BACKGROUND_PALETTE: u16 = 24;
|
||||
pub const VARIATIONS_POST_SCRIPT_NAME_PREFIX: u16 = 25;
|
||||
}
|
||||
|
||||
|
||||
/// A [platform ID](https://docs.microsoft.com/en-us/typography/opentype/spec/name#platform-ids).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, PartialEq, Debug)]
|
||||
pub enum PlatformId {
|
||||
Unicode,
|
||||
Macintosh,
|
||||
Iso,
|
||||
Windows,
|
||||
Custom,
|
||||
}
|
||||
|
||||
impl FromData for PlatformId {
|
||||
const SIZE: usize = 2;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
match u16::parse(data)? {
|
||||
0 => Some(PlatformId::Unicode),
|
||||
1 => Some(PlatformId::Macintosh),
|
||||
2 => Some(PlatformId::Iso),
|
||||
3 => Some(PlatformId::Windows),
|
||||
4 => Some(PlatformId::Custom),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[inline]
|
||||
fn is_unicode_encoding(platform_id: PlatformId, encoding_id: u16) -> bool {
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/name#windows-encoding-ids
|
||||
const WINDOWS_SYMBOL_ENCODING_ID: u16 = 0;
|
||||
const WINDOWS_UNICODE_BMP_ENCODING_ID: u16 = 1;
|
||||
|
||||
match platform_id {
|
||||
PlatformId::Unicode => true,
|
||||
PlatformId::Windows => match encoding_id {
|
||||
WINDOWS_SYMBOL_ENCODING_ID |
|
||||
WINDOWS_UNICODE_BMP_ENCODING_ID => true,
|
||||
_ => false,
|
||||
}
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct NameRecord {
|
||||
platform_id: PlatformId,
|
||||
encoding_id: u16,
|
||||
language_id: u16,
|
||||
name_id: u16,
|
||||
length: u16,
|
||||
offset: Offset16,
|
||||
}
|
||||
|
||||
impl FromData for NameRecord {
|
||||
const SIZE: usize = 12;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(NameRecord {
|
||||
platform_id: s.read::<PlatformId>()?,
|
||||
encoding_id: s.read::<u16>()?,
|
||||
language_id: s.read::<u16>()?,
|
||||
name_id: s.read::<u16>()?,
|
||||
length: s.read::<u16>()?,
|
||||
offset: s.read::<Offset16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Name Record](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-records).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Name<'a> {
|
||||
/// A platform ID.
|
||||
pub platform_id: PlatformId,
|
||||
/// A platform-specific encoding ID.
|
||||
pub encoding_id: u16,
|
||||
/// A language ID.
|
||||
pub language_id: u16,
|
||||
/// A [Name ID](https://docs.microsoft.com/en-us/typography/opentype/spec/name#name-ids).
|
||||
///
|
||||
/// A predefined list of IDs can be found in the [`name_id`](name_id/index.html) module.
|
||||
pub name_id: u16,
|
||||
/// Raw name data.
|
||||
///
|
||||
/// Can be in any encoding. Can be empty.
|
||||
pub name: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Name<'a> {
|
||||
/// Returns the Name's data as a UTF-8 string.
|
||||
///
|
||||
/// Only Unicode names are supported. Since they are stored as UTF-16BE,
/// we can't return a `&str` and have to allocate a `String`.
|
||||
///
|
||||
/// Supports:
|
||||
/// - Unicode Platform ID
|
||||
/// - Windows Platform ID + Symbol
|
||||
/// - Windows Platform ID + Unicode BMP
|
||||
#[cfg(feature = "std")]
|
||||
#[inline(never)]
|
||||
pub fn to_string(&self) -> Option<String> {
|
||||
if self.is_unicode() {
|
||||
self.name_from_utf16_be()
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Checks that the current Name data has a Unicode encoding.
|
||||
#[inline]
|
||||
pub fn is_unicode(&self) -> bool {
|
||||
is_unicode_encoding(self.platform_id, self.encoding_id)
|
||||
}
|
||||
|
||||
#[cfg(feature = "std")]
|
||||
#[inline(never)]
|
||||
fn name_from_utf16_be(&self) -> Option<String> {
|
||||
let mut name: Vec<u16> = Vec::new();
|
||||
for c in LazyArray16::<u16>::new(self.name) {
|
||||
name.push(c);
|
||||
}
|
||||
|
||||
String::from_utf16(&name).ok()
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(feature = "std")]
|
||||
impl<'a> core::fmt::Debug for Name<'a> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
// TODO: https://github.com/rust-lang/rust/issues/50264
|
||||
|
||||
let name = self.to_string();
|
||||
f.debug_struct("Name")
|
||||
.field("name", &name.as_ref().map(core::ops::Deref::deref)
|
||||
.unwrap_or("unsupported encoding"))
|
||||
.field("platform_id", &self.platform_id)
|
||||
.field("encoding_id", &self.encoding_id)
|
||||
.field("language_id", &self.language_id)
|
||||
.field("name_id", &self.name_id)
|
||||
.finish()
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(not(feature = "std"))]
|
||||
impl<'a> core::fmt::Debug for Name<'a> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
f.debug_struct("Name")
|
||||
.field("name", &self.name)
|
||||
.field("platform_id", &self.platform_id)
|
||||
.field("encoding_id", &self.encoding_id)
|
||||
.field("language_id", &self.language_id)
|
||||
.field("name_id", &self.name_id)
|
||||
.finish()
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of face names.
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct Names<'a> {
|
||||
records: LazyArray16<'a, NameRecord>,
|
||||
storage: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Names<'a> {
|
||||
/// Returns a name at index.
|
||||
pub fn get(&self, index: u16) -> Option<Name<'a>> {
|
||||
let record = self.records.get(index)?;
|
||||
let name_start = record.offset.to_usize();
|
||||
let name_end = name_start + usize::from(record.length);
|
||||
let name = self.storage.get(name_start..name_end)?;
|
||||
Some(Name {
|
||||
platform_id: record.platform_id,
|
||||
encoding_id: record.encoding_id,
|
||||
language_id: record.language_id,
|
||||
name_id: record.name_id,
|
||||
name,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the number of name records.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Names<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Names {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Names<'a> {
|
||||
type Item = Name<'a>;
|
||||
type IntoIter = NamesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
NamesIter {
|
||||
names: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over face names.
|
||||
#[derive(Clone, Copy)]
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct NamesIter<'a> {
|
||||
names: Names<'a>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for NamesIter<'a> {
|
||||
type Item = Name<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.names.len() {
|
||||
self.index += 1;
|
||||
self.names.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn count(self) -> usize {
|
||||
usize::from(self.names.len().checked_sub(self.index).unwrap_or(0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Naming Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/name).
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of names.
|
||||
pub names: Names<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/name#naming-table-format-1
|
||||
const LANG_TAG_RECORD_SIZE: u16 = 4;
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u16>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let storage_offset = s.read::<Offset16>()?.to_usize();
|
||||
|
||||
if version == 0 {
|
||||
// Do nothing.
|
||||
} else if version == 1 {
|
||||
let lang_tag_count = s.read::<u16>()?;
|
||||
let lang_tag_len = lang_tag_count.checked_mul(LANG_TAG_RECORD_SIZE)?;
|
||||
s.advance(usize::from(lang_tag_len)); // langTagRecords
|
||||
} else {
|
||||
// Unsupported version.
|
||||
return None;
|
||||
}
|
||||
|
||||
let records = s.read_array16::<NameRecord>(count)?;
|
||||
|
||||
if s.offset() < storage_offset {
|
||||
s.advance(storage_offset - s.offset());
|
||||
}
|
||||
|
||||
let storage = s.tail()?;
|
||||
|
||||
Some(Table { names: Names { records, storage } })
|
||||
}
|
||||
}
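// A minimal usage sketch, not part of the upstream sources. `name_data` is hypothetical:
// the raw `name` table bytes. `Name::to_string` requires the `std` feature.
// It assumes a surrounding function that returns `Option`.
//
//     let table = Table::parse(name_data)?;
//     for name in table.names {
//         if name.name_id == name_id::FAMILY && name.is_unicode() {
//             if let Some(family) = name.to_string() {
//                 // Use the decoded family name.
//             }
//         }
//     }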
@ -0,0 +1,359 @@
//! An [OS/2 and Windows Metrics Table](https://docs.microsoft.com/en-us/typography/opentype/spec/os2)
|
||||
//! implementation.
|
||||
|
||||
use crate::LineMetrics;
|
||||
use crate::parser::Stream;
|
||||
|
||||
const WEIGHT_CLASS_OFFSET: usize = 4;
|
||||
const WIDTH_CLASS_OFFSET: usize = 6;
|
||||
const Y_SUBSCRIPT_X_SIZE_OFFSET: usize = 10;
|
||||
const Y_SUPERSCRIPT_X_SIZE_OFFSET: usize = 18;
|
||||
const Y_STRIKEOUT_SIZE_OFFSET: usize = 26;
|
||||
const Y_STRIKEOUT_POSITION_OFFSET: usize = 28;
|
||||
const FS_SELECTION_OFFSET: usize = 62;
|
||||
const TYPO_ASCENDER_OFFSET: usize = 68;
|
||||
const TYPO_DESCENDER_OFFSET: usize = 70;
|
||||
const TYPO_LINE_GAP_OFFSET: usize = 72;
|
||||
const WIN_ASCENT: usize = 74;
|
||||
const WIN_DESCENT: usize = 76;
|
||||
const X_HEIGHT_OFFSET: usize = 86;
|
||||
const CAP_HEIGHT_OFFSET: usize = 88;
|
||||
|
||||
/// A face [weight](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#usweightclass).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Eq, PartialEq, Debug, Hash)]
|
||||
pub enum Weight {
|
||||
Thin,
|
||||
ExtraLight,
|
||||
Light,
|
||||
Normal,
|
||||
Medium,
|
||||
SemiBold,
|
||||
Bold,
|
||||
ExtraBold,
|
||||
Black,
|
||||
Other(u16),
|
||||
}
|
||||
|
||||
impl Weight {
|
||||
/// Returns a numeric representation of a weight.
|
||||
#[inline]
|
||||
pub fn to_number(self) -> u16 {
|
||||
match self {
|
||||
Weight::Thin => 100,
|
||||
Weight::ExtraLight => 200,
|
||||
Weight::Light => 300,
|
||||
Weight::Normal => 400,
|
||||
Weight::Medium => 500,
|
||||
Weight::SemiBold => 600,
|
||||
Weight::Bold => 700,
|
||||
Weight::ExtraBold => 800,
|
||||
Weight::Black => 900,
|
||||
Weight::Other(n) => n,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<u16> for Weight {
|
||||
#[inline]
|
||||
fn from(value: u16) -> Self {
|
||||
match value {
|
||||
100 => Weight::Thin,
|
||||
200 => Weight::ExtraLight,
|
||||
300 => Weight::Light,
|
||||
400 => Weight::Normal,
|
||||
500 => Weight::Medium,
|
||||
600 => Weight::SemiBold,
|
||||
700 => Weight::Bold,
|
||||
800 => Weight::ExtraBold,
|
||||
900 => Weight::Black,
|
||||
_ => Weight::Other(value),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Default for Weight {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
Weight::Normal
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A face [width](https://docs.microsoft.com/en-us/typography/opentype/spec/os2#uswidthclass).
|
||||
#[allow(missing_docs)]
|
||||
#[derive(Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Debug, Hash)]
|
||||
pub enum Width {
|
||||
UltraCondensed,
|
||||
ExtraCondensed,
|
||||
Condensed,
|
||||
SemiCondensed,
|
||||
Normal,
|
||||
SemiExpanded,
|
||||
Expanded,
|
||||
ExtraExpanded,
|
||||
UltraExpanded,
|
||||
}
|
||||
|
||||
impl Width {
|
||||
/// Returns a numeric representation of a width.
|
||||
#[inline]
|
||||
pub fn to_number(self) -> u16 {
|
||||
match self {
|
||||
Width::UltraCondensed => 1,
|
||||
Width::ExtraCondensed => 2,
|
||||
Width::Condensed => 3,
|
||||
Width::SemiCondensed => 4,
|
||||
Width::Normal => 5,
|
||||
Width::SemiExpanded => 6,
|
||||
Width::Expanded => 7,
|
||||
Width::ExtraExpanded => 8,
|
||||
Width::UltraExpanded => 9,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Default for Width {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
Width::Normal
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A face style.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Debug, Hash)]
|
||||
pub enum Style {
|
||||
/// A face that is neither italic nor obliqued.
|
||||
Normal,
|
||||
/// A form that is generally cursive in nature.
|
||||
Italic,
|
||||
/// A typically-sloped version of the regular face.
|
||||
Oblique,
|
||||
}
|
||||
|
||||
impl Default for Style {
|
||||
#[inline]
|
||||
fn default() -> Style {
|
||||
Style::Normal
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Script metrics used by subscripts and superscripts.
|
||||
#[repr(C)]
|
||||
#[derive(Clone, Copy, Eq, PartialEq, Debug, Hash)]
|
||||
pub struct ScriptMetrics {
|
||||
/// Horizontal face size.
|
||||
pub x_size: i16,
|
||||
|
||||
/// Vertical face size.
|
||||
pub y_size: i16,
|
||||
|
||||
/// X offset.
|
||||
pub x_offset: i16,
|
||||
|
||||
/// Y offset.
|
||||
pub y_offset: i16,
|
||||
}
|
||||
|
||||
|
||||
// https://docs.microsoft.com/en-us/typography/opentype/spec/os2#fsselection
|
||||
#[derive(Clone, Copy)]
|
||||
struct SelectionFlags(u16);
|
||||
|
||||
impl SelectionFlags {
|
||||
#[inline] fn italic(self) -> bool { self.0 & (1 << 0) != 0 }
|
||||
#[inline] fn bold(self) -> bool { self.0 & (1 << 5) != 0 }
|
||||
// #[inline] fn regular(self) -> bool { self.0 & (1 << 6) != 0 }
|
||||
#[inline] fn use_typo_metrics(self) -> bool { self.0 & (1 << 7) != 0 }
|
||||
#[inline] fn oblique(self) -> bool { self.0 & (1 << 9) != 0 }
|
||||
}
|
||||
|
||||
|
||||
/// An [OS/2 and Windows Metrics Table](https://docs.microsoft.com/en-us/typography/opentype/spec/os2).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Table<'a> {
|
||||
/// Table version.
|
||||
pub version: u8,
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let version = s.read::<u16>()?;
|
||||
|
||||
let table_len = match version {
|
||||
0 => 78,
|
||||
1 => 86,
|
||||
2 => 96,
|
||||
3 => 96,
|
||||
4 => 96,
|
||||
5 => 100,
|
||||
_ => return None,
|
||||
};
|
||||
|
||||
if data.len() != table_len {
|
||||
return None;
|
||||
}
|
||||
|
||||
Some(Table {
|
||||
version: version as u8,
|
||||
data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns weight class.
|
||||
#[inline]
|
||||
pub fn weight(&self) -> Weight {
|
||||
Weight::from(Stream::read_at::<u16>(self.data, WEIGHT_CLASS_OFFSET).unwrap_or(0))
|
||||
}
|
||||
|
||||
/// Returns face width.
|
||||
#[inline]
|
||||
pub fn width(&self) -> Width {
|
||||
match Stream::read_at::<u16>(self.data, WIDTH_CLASS_OFFSET).unwrap_or(0) {
|
||||
1 => Width::UltraCondensed,
|
||||
2 => Width::ExtraCondensed,
|
||||
3 => Width::Condensed,
|
||||
4 => Width::SemiCondensed,
|
||||
5 => Width::Normal,
|
||||
6 => Width::SemiExpanded,
|
||||
7 => Width::Expanded,
|
||||
8 => Width::ExtraExpanded,
|
||||
9 => Width::UltraExpanded,
|
||||
_ => Width::Normal,
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns subscript metrics.
|
||||
#[inline]
|
||||
pub fn subscript_metrics(&self) -> ScriptMetrics {
|
||||
let mut s = Stream::new_at(self.data, Y_SUBSCRIPT_X_SIZE_OFFSET).unwrap_or_default();
|
||||
ScriptMetrics {
|
||||
x_size: s.read::<i16>().unwrap_or(0),
|
||||
y_size: s.read::<i16>().unwrap_or(0),
|
||||
x_offset: s.read::<i16>().unwrap_or(0),
|
||||
y_offset: s.read::<i16>().unwrap_or(0),
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns superscript metrics.
|
||||
#[inline]
|
||||
pub fn superscript_metrics(&self) -> ScriptMetrics {
|
||||
let mut s = Stream::new_at(self.data, Y_SUPERSCRIPT_X_SIZE_OFFSET).unwrap_or_default();
|
||||
ScriptMetrics {
|
||||
x_size: s.read::<i16>().unwrap_or(0),
|
||||
y_size: s.read::<i16>().unwrap_or(0),
|
||||
x_offset: s.read::<i16>().unwrap_or(0),
|
||||
y_offset: s.read::<i16>().unwrap_or(0),
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns strikeout metrics.
|
||||
#[inline]
|
||||
pub fn strikeout_metrics(&self) -> LineMetrics {
|
||||
LineMetrics {
|
||||
thickness: Stream::read_at::<i16>(self.data, Y_STRIKEOUT_SIZE_OFFSET).unwrap_or(0),
|
||||
position: Stream::read_at::<i16>(self.data, Y_STRIKEOUT_POSITION_OFFSET).unwrap_or(0),
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn fs_selection(&self) -> u16 {
|
||||
Stream::read_at::<u16>(self.data, FS_SELECTION_OFFSET).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns style.
|
||||
pub fn style(&self) -> Style {
|
||||
let flags = SelectionFlags(self.fs_selection());
|
||||
if flags.italic() {
|
||||
Style::Italic
|
||||
} else if self.version >= 4 && flags.oblique() {
|
||||
Style::Oblique
|
||||
} else {
|
||||
Style::Normal
|
||||
}
|
||||
}
|
||||
|
||||
/// Checks if face is bold.
|
||||
///
|
||||
/// Do not confuse with [`Weight::Bold`].
|
||||
#[inline]
|
||||
pub fn is_bold(&self) -> bool {
|
||||
SelectionFlags(self.fs_selection()).bold()
|
||||
}
|
||||
|
||||
/// Checks if typographic metrics should be used.
|
||||
#[inline]
|
||||
pub fn use_typographic_metrics(&self) -> bool {
|
||||
if self.version < 4 {
|
||||
false
|
||||
} else {
|
||||
SelectionFlags(self.fs_selection()).use_typo_metrics()
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns typographic ascender.
|
||||
#[inline]
|
||||
pub fn typographic_ascender(&self) -> i16 {
|
||||
Stream::read_at::<i16>(self.data, TYPO_ASCENDER_OFFSET).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns typographic descender.
|
||||
#[inline]
|
||||
pub fn typographic_descender(&self) -> i16 {
|
||||
Stream::read_at::<i16>(self.data, TYPO_DESCENDER_OFFSET).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns typographic line gap.
|
||||
#[inline]
|
||||
pub fn typographic_line_gap(&self) -> i16 {
|
||||
Stream::read_at::<i16>(self.data, TYPO_LINE_GAP_OFFSET).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns Windows ascender.
|
||||
#[inline]
|
||||
pub fn windows_ascender(&self) -> i16 {
|
||||
Stream::read_at::<i16>(self.data, WIN_ASCENT).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns Windows descender.
|
||||
#[inline]
|
||||
pub fn windows_descender(&self) -> i16 {
|
||||
// Should be negated.
|
||||
-Stream::read_at::<i16>(self.data, WIN_DESCENT).unwrap_or(0)
|
||||
}
|
||||
|
||||
/// Returns x height.
|
||||
///
|
||||
/// Returns `None` if the table version is < 2.
|
||||
#[inline]
|
||||
pub fn x_height(&self) -> Option<i16> {
|
||||
if self.version < 2 {
|
||||
None
|
||||
} else {
|
||||
Stream::read_at::<i16>(self.data, X_HEIGHT_OFFSET)
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns capital height.
|
||||
///
|
||||
/// Returns `None` if the table version is < 2.
|
||||
#[inline]
|
||||
pub fn capital_height(&self) -> Option<i16> {
|
||||
if self.version < 2 {
|
||||
None
|
||||
} else {
|
||||
Stream::read_at::<i16>(self.data, CAP_HEIGHT_OFFSET)
|
||||
}
|
||||
}
|
||||
}
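// A minimal usage sketch, not part of the upstream sources. `os2_data` is hypothetical:
// the raw `OS/2` table bytes. It assumes a surrounding function that returns `Option`.
//
//     let table = Table::parse(os2_data)?;
//     let weight = table.weight();
//     let width = table.width();
//     let style = table.style();
//     let use_typo = table.use_typographic_metrics();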
|
||||
|
||||
impl core::fmt::Debug for Table<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Table {{ ... }}")
|
||||
}
|
||||
}
@ -0,0 +1,394 @@
//! A [PostScript Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/post) implementation.
|
||||
|
||||
use crate::LineMetrics;
|
||||
use crate::parser::{Stream, Fixed, LazyArray16};
|
||||
#[cfg(feature = "glyph-names")] use crate::GlyphId;
|
||||
|
||||
const TABLE_SIZE: usize = 32;
|
||||
const ITALIC_ANGLE_OFFSET: usize = 4;
|
||||
const UNDERLINE_POSITION_OFFSET: usize = 8;
|
||||
const UNDERLINE_THICKNESS_OFFSET: usize = 10;
|
||||
const IS_FIXED_PITCH_OFFSET: usize = 12;
|
||||
|
||||
// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6post.html
|
||||
/// A list of Macintosh glyph names.
|
||||
#[cfg(feature = "glyph-names")]
|
||||
const MACINTOSH_NAMES: &[&str] = &[
|
||||
".notdef",
|
||||
".null",
|
||||
"nonmarkingreturn",
|
||||
"space",
|
||||
"exclam",
|
||||
"quotedbl",
|
||||
"numbersign",
|
||||
"dollar",
|
||||
"percent",
|
||||
"ampersand",
|
||||
"quotesingle",
|
||||
"parenleft",
|
||||
"parenright",
|
||||
"asterisk",
|
||||
"plus",
|
||||
"comma",
|
||||
"hyphen",
|
||||
"period",
|
||||
"slash",
|
||||
"zero",
|
||||
"one",
|
||||
"two",
|
||||
"three",
|
||||
"four",
|
||||
"five",
|
||||
"six",
|
||||
"seven",
|
||||
"eight",
|
||||
"nine",
|
||||
"colon",
|
||||
"semicolon",
|
||||
"less",
|
||||
"equal",
|
||||
"greater",
|
||||
"question",
|
||||
"at",
|
||||
"A",
|
||||
"B",
|
||||
"C",
|
||||
"D",
|
||||
"E",
|
||||
"F",
|
||||
"G",
|
||||
"H",
|
||||
"I",
|
||||
"J",
|
||||
"K",
|
||||
"L",
|
||||
"M",
|
||||
"N",
|
||||
"O",
|
||||
"P",
|
||||
"Q",
|
||||
"R",
|
||||
"S",
|
||||
"T",
|
||||
"U",
|
||||
"V",
|
||||
"W",
|
||||
"X",
|
||||
"Y",
|
||||
"Z",
|
||||
"bracketleft",
|
||||
"backslash",
|
||||
"bracketright",
|
||||
"asciicircum",
|
||||
"underscore",
|
||||
"grave",
|
||||
"a",
|
||||
"b",
|
||||
"c",
|
||||
"d",
|
||||
"e",
|
||||
"f",
|
||||
"g",
|
||||
"h",
|
||||
"i",
|
||||
"j",
|
||||
"k",
|
||||
"l",
|
||||
"m",
|
||||
"n",
|
||||
"o",
|
||||
"p",
|
||||
"q",
|
||||
"r",
|
||||
"s",
|
||||
"t",
|
||||
"u",
|
||||
"v",
|
||||
"w",
|
||||
"x",
|
||||
"y",
|
||||
"z",
|
||||
"braceleft",
|
||||
"bar",
|
||||
"braceright",
|
||||
"asciitilde",
|
||||
"Adieresis",
|
||||
"Aring",
|
||||
"Ccedilla",
|
||||
"Eacute",
|
||||
"Ntilde",
|
||||
"Odieresis",
|
||||
"Udieresis",
|
||||
"aacute",
|
||||
"agrave",
|
||||
"acircumflex",
|
||||
"adieresis",
|
||||
"atilde",
|
||||
"aring",
|
||||
"ccedilla",
|
||||
"eacute",
|
||||
"egrave",
|
||||
"ecircumflex",
|
||||
"edieresis",
|
||||
"iacute",
|
||||
"igrave",
|
||||
"icircumflex",
|
||||
"idieresis",
|
||||
"ntilde",
|
||||
"oacute",
|
||||
"ograve",
|
||||
"ocircumflex",
|
||||
"odieresis",
|
||||
"otilde",
|
||||
"uacute",
|
||||
"ugrave",
|
||||
"ucircumflex",
|
||||
"udieresis",
|
||||
"dagger",
|
||||
"degree",
|
||||
"cent",
|
||||
"sterling",
|
||||
"section",
|
||||
"bullet",
|
||||
"paragraph",
|
||||
"germandbls",
|
||||
"registered",
|
||||
"copyright",
|
||||
"trademark",
|
||||
"acute",
|
||||
"dieresis",
|
||||
"notequal",
|
||||
"AE",
|
||||
"Oslash",
|
||||
"infinity",
|
||||
"plusminus",
|
||||
"lessequal",
|
||||
"greaterequal",
|
||||
"yen",
|
||||
"mu",
|
||||
"partialdiff",
|
||||
"summation",
|
||||
"product",
|
||||
"pi",
|
||||
"integral",
|
||||
"ordfeminine",
|
||||
"ordmasculine",
|
||||
"Omega",
|
||||
"ae",
|
||||
"oslash",
|
||||
"questiondown",
|
||||
"exclamdown",
|
||||
"logicalnot",
|
||||
"radical",
|
||||
"florin",
|
||||
"approxequal",
|
||||
"Delta",
|
||||
"guillemotleft",
|
||||
"guillemotright",
|
||||
"ellipsis",
|
||||
"nonbreakingspace",
|
||||
"Agrave",
|
||||
"Atilde",
|
||||
"Otilde",
|
||||
"OE",
|
||||
"oe",
|
||||
"endash",
|
||||
"emdash",
|
||||
"quotedblleft",
|
||||
"quotedblright",
|
||||
"quoteleft",
|
||||
"quoteright",
|
||||
"divide",
|
||||
"lozenge",
|
||||
"ydieresis",
|
||||
"Ydieresis",
|
||||
"fraction",
|
||||
"currency",
|
||||
"guilsinglleft",
|
||||
"guilsinglright",
|
||||
"fi",
|
||||
"fl",
|
||||
"daggerdbl",
|
||||
"periodcentered",
|
||||
"quotesinglbase",
|
||||
"quotedblbase",
|
||||
"perthousand",
|
||||
"Acircumflex",
|
||||
"Ecircumflex",
|
||||
"Aacute",
|
||||
"Edieresis",
|
||||
"Egrave",
|
||||
"Iacute",
|
||||
"Icircumflex",
|
||||
"Idieresis",
|
||||
"Igrave",
|
||||
"Oacute",
|
||||
"Ocircumflex",
|
||||
"apple",
|
||||
"Ograve",
|
||||
"Uacute",
|
||||
"Ucircumflex",
|
||||
"Ugrave",
|
||||
"dotlessi",
|
||||
"circumflex",
|
||||
"tilde",
|
||||
"macron",
|
||||
"breve",
|
||||
"dotaccent",
|
||||
"ring",
|
||||
"cedilla",
|
||||
"hungarumlaut",
|
||||
"ogonek",
|
||||
"caron",
|
||||
"Lslash",
|
||||
"lslash",
|
||||
"Scaron",
|
||||
"scaron",
|
||||
"Zcaron",
|
||||
"zcaron",
|
||||
"brokenbar",
|
||||
"Eth",
|
||||
"eth",
|
||||
"Yacute",
|
||||
"yacute",
|
||||
"Thorn",
|
||||
"thorn",
|
||||
"minus",
|
||||
"multiply",
|
||||
"onesuperior",
|
||||
"twosuperior",
|
||||
"threesuperior",
|
||||
"onehalf",
|
||||
"onequarter",
|
||||
"threequarters",
|
||||
"franc",
|
||||
"Gbreve",
|
||||
"gbreve",
|
||||
"Idotaccent",
|
||||
"Scedilla",
|
||||
"scedilla",
|
||||
"Cacute",
|
||||
"cacute",
|
||||
"Ccaron",
|
||||
"ccaron",
|
||||
"dcroat",
|
||||
];
|
||||
|
||||
|
||||
/// A list of glyph names.
|
||||
#[derive(Clone, Copy, Default)]
|
||||
pub struct Names<'a> {
|
||||
indexes: LazyArray16<'a, u16>,
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
// TODO: add low-level iterator
|
||||
impl<'a> Names<'a> {
|
||||
/// Returns a glyph name by ID.
|
||||
#[cfg(feature = "glyph-names")]
|
||||
pub fn get(&self, glyph_id: GlyphId) -> Option<&'a str> {
|
||||
let mut index = self.indexes.get(glyph_id.0)?;
|
||||
|
||||
// 'If the name index is between 0 and 257, treat the name index
|
||||
// as a glyph index in the Macintosh standard order.'
|
||||
if usize::from(index) < MACINTOSH_NAMES.len() {
|
||||
Some(MACINTOSH_NAMES[usize::from(index)])
|
||||
} else {
|
||||
// 'If the name index is between 258 and 65535, then subtract 258 and use that
|
||||
// to index into the list of Pascal strings at the end of the table.'
|
||||
index -= MACINTOSH_NAMES.len() as u16;
|
||||
|
||||
let mut s = Stream::new(self.data);
|
||||
let mut i = 0;
|
||||
while !s.at_end() && i < core::u16::MAX {
|
||||
let len = s.read::<u8>()?;
|
||||
|
||||
if i == index {
|
||||
if len == 0 {
|
||||
// Empty name is an error.
|
||||
break;
|
||||
} else {
|
||||
let name = s.read_bytes(usize::from(len))?;
|
||||
return core::str::from_utf8(name).ok();
|
||||
}
|
||||
} else {
|
||||
s.advance(usize::from(len));
|
||||
}
|
||||
|
||||
i += 1;
|
||||
}
|
||||
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
/// Returns names count.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
self.indexes.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Names<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Names {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [PostScript Table](https://docs.microsoft.com/en-us/typography/opentype/spec/post).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// Italic angle in counter-clockwise degrees from the vertical.
|
||||
pub italic_angle: f32,
|
||||
/// Underline metrics.
|
||||
pub underline_metrics: LineMetrics,
|
||||
/// Flag that indicates that the font is monospaced.
|
||||
pub is_monospaced: bool,
|
||||
/// A list of glyph names.
|
||||
pub names: Names<'a>,
|
||||
}
|
||||
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
if data.len() < TABLE_SIZE {
|
||||
return None;
|
||||
}
|
||||
|
||||
let version = Stream::new(data).read::<u32>()?;
|
||||
if !(version == 0x00010000 || version == 0x00020000 ||
|
||||
version == 0x00025000 || version == 0x00030000 ||
|
||||
version == 0x00040000)
|
||||
{
|
||||
return None;
|
||||
}
|
||||
|
||||
let italic_angle = Stream::read_at::<Fixed>(data, ITALIC_ANGLE_OFFSET)?.0;
|
||||
|
||||
let underline_metrics = LineMetrics {
|
||||
position: Stream::read_at::<i16>(data, UNDERLINE_POSITION_OFFSET)?,
|
||||
thickness: Stream::read_at::<i16>(data, UNDERLINE_THICKNESS_OFFSET)?,
|
||||
};
|
||||
|
||||
let is_monospaced = Stream::read_at::<u32>(data, IS_FIXED_PITCH_OFFSET)? != 0;
|
||||
|
||||
let mut names = Names::default();
|
||||
// Only version 2.0 of the table has data at the end.
|
||||
if version == 0x00020000 {
|
||||
let mut s = Stream::new_at(data, TABLE_SIZE)?;
|
||||
let count = s.read::<u16>()?;
|
||||
names.indexes = s.read_array16::<u16>(count)?;
|
||||
names.data = s.tail()?;
|
||||
}
|
||||
|
||||
Some(Table {
|
||||
italic_angle,
|
||||
underline_metrics,
|
||||
is_monospaced,
|
||||
names,
|
||||
})
|
||||
}
|
||||
}
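// Illustrative usage sketch (not from the upstream file): assumes `data`
// holds the raw `post` table bytes and the `glyph-names` feature is enabled;
// glyph 36 is an arbitrary example ID.
#[cfg(feature = "glyph-names")]
#[allow(dead_code)]
fn example_glyph_name(data: &[u8]) -> Option<&str> {
    // Indexes below 258 resolve to the built-in Macintosh names above;
    // larger indexes read the Pascal strings stored after the index list.
    Table::parse(data)?.names.get(GlyphId(36))
}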
|
|
@ -0,0 +1,247 @@
|
|||
//! A [Standard Bitmap Graphics Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/sbix) implementation.
|
||||
|
||||
use core::convert::TryFrom;
|
||||
use core::num::NonZeroU16;
|
||||
|
||||
use crate::{GlyphId, RasterGlyphImage, RasterImageFormat, Tag};
|
||||
use crate::parser::{Stream, FromData, Offset, Offset32, LazyArray16, LazyArray32};
|
||||
|
||||
/// A strike of glyphs.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Strike<'a> {
|
||||
/// The pixels per EM size for which this strike was designed.
|
||||
pub pixels_per_em: u16,
|
||||
/// The device pixel density (in PPI) for which this strike was designed.
|
||||
pub ppi: u16,
|
||||
offsets: LazyArray16<'a, Offset32>,
|
||||
/// Data from the beginning of the `Strikes` table.
|
||||
data: &'a [u8],
|
||||
}
|
||||
|
||||
impl<'a> Strike<'a> {
|
||||
fn parse(number_of_glyphs: u16, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
let pixels_per_em = s.read::<u16>()?;
|
||||
let ppi = s.read::<u16>()?;
|
||||
let offsets = s.read_array16(number_of_glyphs)?;
|
||||
Some(Strike {
|
||||
pixels_per_em,
|
||||
ppi,
|
||||
offsets,
|
||||
data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the glyph's raster image data.
|
||||
pub fn get(&self, glyph_id: GlyphId) -> Option<RasterGlyphImage<'a>> {
|
||||
self.get_inner(glyph_id, 0)
|
||||
}
|
||||
|
||||
fn get_inner(&self, glyph_id: GlyphId, depth: u8) -> Option<RasterGlyphImage<'a>> {
|
||||
// Recursive `dupe`. Bail.
|
||||
if depth == 10 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let start = self.offsets.get(glyph_id.0)?.to_usize();
|
||||
let end = self.offsets.get(glyph_id.0.checked_add(1)?)?.to_usize();
|
||||
|
||||
if start == end {
|
||||
return None;
|
||||
}
|
||||
|
||||
let data_len = end.checked_sub(start)?.checked_sub(8)?; // 8 is a Glyph data header size.
|
||||
|
||||
let mut s = Stream::new_at(self.data, start)?;
|
||||
let x = s.read::<i16>()?;
|
||||
let y = s.read::<i16>()?;
|
||||
let image_type = s.read::<Tag>()?;
|
||||
let image_data = s.read_bytes(data_len)?;
|
||||
|
||||
// We intentionally ignore `pdf` and `mask`, because the Apple docs state that:
|
||||
// 'Support for the 'pdf ' and 'mask' data types and sbixDrawOutlines flag
|
||||
// are planned for future releases of iOS and OS X.'
|
||||
let format = match &image_type.to_bytes() {
|
||||
b"png " => RasterImageFormat::PNG,
|
||||
b"dupe" => {
|
||||
// 'The special graphicType of 'dupe' indicates that
|
||||
// the data field contains a glyph ID. The bitmap data for
|
||||
// the indicated glyph should be used for the current glyph.'
|
||||
let glyph_id = GlyphId::parse(image_data)?;
|
||||
// TODO: The spec isn't clear about which x/y values we should use:
// the current glyph's or the referenced one's.
|
||||
return self.get_inner(glyph_id, depth + 1);
|
||||
}
|
||||
_ => {
|
||||
// TODO: support JPEG and TIFF
|
||||
return None;
|
||||
}
|
||||
};
|
||||
|
||||
let (width, height) = png_size(image_data)?;
|
||||
|
||||
Some(RasterGlyphImage {
|
||||
x,
|
||||
y,
|
||||
width,
|
||||
height,
|
||||
pixels_per_em: self.pixels_per_em,
|
||||
format,
|
||||
data: image_data,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the number of glyphs in this strike.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u16 {
|
||||
// The last offset simply indicates the glyph data end. We don't need it.
|
||||
self.offsets.len() - 1
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Strike<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Strike {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of [`Strike`]s.
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct Strikes<'a> {
|
||||
/// `sbix` table data.
|
||||
data: &'a [u8],
|
||||
// Offsets from the beginning of the `sbix` table.
|
||||
offsets: LazyArray32<'a, Offset32>,
|
||||
// The total number of glyphs in the face + 1. From the `maxp` table.
|
||||
number_of_glyphs: u16,
|
||||
}
|
||||
|
||||
impl<'a> Strikes<'a> {
|
||||
/// Returns a strike at the index.
|
||||
pub fn get(&self, index: u32) -> Option<Strike<'a>> {
|
||||
let offset = self.offsets.get(index)?.to_usize();
|
||||
let data = self.data.get(offset..)?;
|
||||
Strike::parse(self.number_of_glyphs, data)
|
||||
}
|
||||
|
||||
/// Returns the number of strikes.
|
||||
#[inline]
|
||||
pub fn len(&self) -> u32 {
|
||||
self.offsets.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for Strikes<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "Strikes {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Strikes<'a> {
|
||||
type Item = Strike<'a>;
|
||||
type IntoIter = StrikesIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
StrikesIter {
|
||||
strikes: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`Strikes`].
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct StrikesIter<'a> {
|
||||
strikes: Strikes<'a>,
|
||||
index: u32,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for StrikesIter<'a> {
|
||||
type Item = Strike<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.strikes.len() {
|
||||
self.index += 1;
|
||||
self.strikes.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Standard Bitmap Graphics Table](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/sbix).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of [`Strike`]s.
|
||||
pub strikes: Strikes<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
///
|
||||
/// - `number_of_glyphs` is from the `maxp` table.
|
||||
pub fn parse(number_of_glyphs: NonZeroU16, data: &'a [u8]) -> Option<Self> {
|
||||
let number_of_glyphs = number_of_glyphs.get().checked_add(1)?;
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u16>()?;
|
||||
if version != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
s.skip::<u16>(); // flags
|
||||
|
||||
let strikes_count = s.read::<u32>()?;
|
||||
if strikes_count == 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let offsets = s.read_array32::<Offset32>(strikes_count)?;
|
||||
|
||||
Some(Table {
|
||||
strikes: Strikes {
|
||||
data,
|
||||
offsets,
|
||||
number_of_glyphs,
|
||||
},
|
||||
})
|
||||
}
|
||||
|
||||
/// Selects the best matching [`Strike`] based on `pixels_per_em`.
|
||||
pub fn best_strike(&self, pixels_per_em: u16) -> Option<Strike<'a>> {
|
||||
let mut idx = 0;
|
||||
let mut max_ppem = 0;
|
||||
for (i, strike) in self.strikes.into_iter().enumerate() {
|
||||
if (pixels_per_em <= strike.pixels_per_em && strike.pixels_per_em < max_ppem) ||
|
||||
(pixels_per_em > max_ppem && strike.pixels_per_em > max_ppem)
|
||||
{
|
||||
idx = i as u32;
|
||||
max_ppem = strike.pixels_per_em;
|
||||
}
|
||||
}
|
||||
|
||||
self.strikes.get(idx)
|
||||
}
|
||||
}
|
||||
|
||||
// The `sbix` table doesn't store the image size, so we have to parse it manually.
// That is simple for PNG, but far more involved for JPEG, so JPEG sizes are omitted for now.
|
||||
fn png_size(data: &[u8]) -> Option<(u16, u16)> {
|
||||
// PNG stores its size as u32 BE at a fixed offset.
|
||||
let mut s = Stream::new_at(data, 16)?;
|
||||
let width = s.read::<u32>()?;
|
||||
let height = s.read::<u32>()?;
|
||||
|
||||
// PNG size larger than u16::MAX is an error.
|
||||
Some((
|
||||
u16::try_from(width).ok()?,
|
||||
u16::try_from(height).ok()?,
|
||||
))
|
||||
}
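// Illustrative usage sketch (not from the upstream file): assumes `sbix_data`
// holds the raw `sbix` table and `glyph_count` comes from `maxp`; 20 px/em
// and glyph 1 are arbitrary example values.
#[allow(dead_code)]
fn example_best_png(sbix_data: &[u8], glyph_count: NonZeroU16) -> Option<RasterGlyphImage<'_>> {
    let table = Table::parse(glyph_count, sbix_data)?;
    // Pick the strike whose pixels-per-em best matches the requested size,
    // then fetch the embedded PNG (if any) for a single glyph.
    table.best_strike(20)?.get(GlyphId(1))
}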
|
|
@ -0,0 +1,136 @@
|
|||
//! An [SVG Table](https://docs.microsoft.com/en-us/typography/opentype/spec/svg) implementation.
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{FromData, LazyArray16, NumFrom, Offset, Offset32, Stream};
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct SvgDocumentRecord {
|
||||
start_glyph_id: GlyphId,
|
||||
end_glyph_id: GlyphId,
|
||||
svg_doc_offset: Option<Offset32>,
|
||||
svg_doc_length: u32,
|
||||
}
|
||||
|
||||
impl FromData for SvgDocumentRecord {
|
||||
const SIZE: usize = 12;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(SvgDocumentRecord {
|
||||
start_glyph_id: s.read::<GlyphId>()?,
|
||||
end_glyph_id: s.read::<GlyphId>()?,
|
||||
svg_doc_offset: s.read::<Option<Offset32>>()?,
|
||||
svg_doc_length: s.read::<u32>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A list of [SVG documents](
|
||||
/// https://docs.microsoft.com/en-us/typography/opentype/spec/svg#svg-document-list).
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct SvgDocumentsList<'a> {
|
||||
data: &'a [u8],
|
||||
records: LazyArray16<'a, SvgDocumentRecord>,
|
||||
}
|
||||
|
||||
impl<'a> SvgDocumentsList<'a> {
|
||||
/// Returns SVG document data at index.
|
||||
///
|
||||
/// `index` is not a GlyphId. You should use [`find()`](SvgDocumentsList::find) instead.
|
||||
#[inline]
|
||||
pub fn get(&self, index: u16) -> Option<&'a [u8]> {
|
||||
let record = self.records.get(index)?;
|
||||
let offset = record.svg_doc_offset?.to_usize();
|
||||
self.data.get(offset..offset + usize::num_from(record.svg_doc_length))
|
||||
}
|
||||
|
||||
/// Returns the SVG document data for the given glyph ID.
|
||||
#[inline]
|
||||
pub fn find(&self, glyph_id: GlyphId) -> Option<&'a [u8]> {
|
||||
let index = self.records.into_iter()
|
||||
.position(|v| (v.start_glyph_id..=v.end_glyph_id).contains(&glyph_id))?;
|
||||
self.get(index as u16)
|
||||
}
|
||||
|
||||
/// Returns the number of SVG documents in the list.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl core::fmt::Debug for SvgDocumentsList<'_> {
|
||||
fn fmt(&self, f: &mut core::fmt::Formatter) -> core::fmt::Result {
|
||||
write!(f, "SvgDocumentsList {{ ... }}")
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for SvgDocumentsList<'a> {
|
||||
type Item = &'a [u8];
|
||||
type IntoIter = SvgDocumentsListIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
SvgDocumentsListIter {
|
||||
list: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An iterator over [`SvgDocumentsList`] values.
|
||||
#[derive(Clone, Copy)]
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct SvgDocumentsListIter<'a> {
|
||||
list: SvgDocumentsList<'a>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for SvgDocumentsListIter<'a> {
|
||||
type Item = &'a [u8];
|
||||
|
||||
#[inline]
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.list.len() {
|
||||
self.index += 1;
|
||||
self.list.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn count(self) -> usize {
|
||||
usize::from(self.list.len().checked_sub(self.index).unwrap_or(0))
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// An [SVG Table](https://docs.microsoft.com/en-us/typography/opentype/spec/svg).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// A list of SVG documents.
|
||||
pub documents: SvgDocumentsList<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u16>(); // version
|
||||
let doc_list_offset = s.read::<Option<Offset32>>()??;
|
||||
|
||||
let mut s = Stream::new_at(data, doc_list_offset.to_usize())?;
|
||||
let count = s.read::<u16>()?;
|
||||
let records = s.read_array16::<SvgDocumentRecord>(count)?;
|
||||
|
||||
Some(Table {
|
||||
documents: SvgDocumentsList {
|
||||
data: &data[doc_list_offset.0 as usize..],
|
||||
records,
|
||||
}
|
||||
})
|
||||
}
|
||||
}
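// Illustrative usage sketch (not from the upstream file): assumes `svg_data`
// holds the raw `SVG ` table; glyph 5 is an arbitrary example ID.
#[allow(dead_code)]
fn example_svg_document(svg_data: &[u8]) -> Option<&[u8]> {
    // `find` scans the document records for the range covering the glyph and
    // returns the raw (possibly gzip-compressed) SVG document.
    Table::parse(svg_data)?.documents.find(GlyphId(5))
}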
|
|
@ -0,0 +1,179 @@
|
|||
//! A [Tracking Table](
|
||||
//! https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6trak.html) implementation.
|
||||
|
||||
use crate::parser::{FromData, LazyArray16, Offset, Offset16, Offset32, Fixed, Stream};
|
||||
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
struct TrackTableRecord {
|
||||
value: Fixed,
|
||||
name_id: u16,
|
||||
offset: Offset16, // Offset from start of the table.
|
||||
}
|
||||
|
||||
impl FromData for TrackTableRecord {
|
||||
const SIZE: usize = 8;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(TrackTableRecord {
|
||||
value: s.read::<Fixed>()?,
|
||||
name_id: s.read::<u16>()?,
|
||||
offset: s.read::<Offset16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
/// A single track.
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Track<'a> {
|
||||
/// A track value.
|
||||
pub value: f32,
|
||||
/// The `name` table index for the track's name.
|
||||
pub name_index: u16,
|
||||
/// A list of tracking values for each size.
|
||||
pub values: LazyArray16<'a, i16>,
|
||||
}
|
||||
|
||||
|
||||
/// A list of tracks.
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct Tracks<'a> {
|
||||
data: &'a [u8], // the whole table
|
||||
records: LazyArray16<'a, TrackTableRecord>,
|
||||
sizes_count: u16,
|
||||
}
|
||||
|
||||
impl<'a> Tracks<'a> {
|
||||
/// Returns a track at index.
|
||||
pub fn get(&self, index: u16) -> Option<Track<'a>> {
|
||||
let record = self.records.get(index)?;
|
||||
let mut s = Stream::new(self.data.get(record.offset.to_usize()..)?);
|
||||
Some(Track {
|
||||
value: record.value.0,
|
||||
values: s.read_array16::<i16>(self.sizes_count)?,
|
||||
name_index: record.name_id,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the number of tracks.
|
||||
pub fn len(&self) -> u16 {
|
||||
self.records.len()
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> IntoIterator for Tracks<'a> {
|
||||
type Item = Track<'a>;
|
||||
type IntoIter = TracksIter<'a>;
|
||||
|
||||
#[inline]
|
||||
fn into_iter(self) -> Self::IntoIter {
|
||||
TracksIter {
|
||||
tracks: self,
|
||||
index: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// An iterator over [`Tracks`].
|
||||
#[allow(missing_debug_implementations)]
|
||||
pub struct TracksIter<'a> {
|
||||
tracks: Tracks<'a>,
|
||||
index: u16,
|
||||
}
|
||||
|
||||
impl<'a> Iterator for TracksIter<'a> {
|
||||
type Item = Track<'a>;
|
||||
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
if self.index < self.tracks.len() {
|
||||
self.index += 1;
|
||||
self.tracks.get(self.index - 1)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// Track data for a single direction (horizontal or vertical).
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct TrackData<'a> {
|
||||
/// A list of tracks.
|
||||
pub tracks: Tracks<'a>,
|
||||
/// A list of sizes.
|
||||
pub sizes: LazyArray16<'a, Fixed>,
|
||||
}
|
||||
|
||||
impl<'a> TrackData<'a> {
|
||||
fn parse(offset: usize, data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new_at(data, offset)?;
|
||||
let tracks_count = s.read::<u16>()?;
|
||||
let sizes_count = s.read::<u16>()?;
|
||||
let size_table_offset = s.read::<Offset32>()?; // Offset from start of the table.
|
||||
|
||||
let tracks = Tracks {
|
||||
data,
|
||||
records: s.read_array16::<TrackTableRecord>(tracks_count)?,
|
||||
sizes_count,
|
||||
};
|
||||
|
||||
// TODO: Isn't the size table directly after the tracks table?
// Why do we need an offset then?
|
||||
let sizes = {
|
||||
let mut s = Stream::new_at(data, size_table_offset.to_usize())?;
|
||||
s.read_array16::<Fixed>(sizes_count)?
|
||||
};
|
||||
|
||||
Some(TrackData { tracks, sizes })
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Tracking Table](
|
||||
/// https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6trak.html).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// Horizontal track data.
|
||||
pub horizontal: TrackData<'a>,
|
||||
/// Vertical track data.
|
||||
pub vertical: TrackData<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let format = s.read::<u16>()?;
|
||||
if format != 0 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let hor_offset = s.read::<Option<Offset16>>()?;
|
||||
let ver_offset = s.read::<Option<Offset16>>()?;
|
||||
s.skip::<u16>(); // reserved
|
||||
|
||||
let horizontal = if let Some(offset) = hor_offset {
|
||||
TrackData::parse(offset.to_usize(), data)?
|
||||
} else {
|
||||
TrackData::default()
|
||||
};
|
||||
|
||||
let vertical = if let Some(offset) = ver_offset {
|
||||
TrackData::parse(offset.to_usize(), data)?
|
||||
} else {
|
||||
TrackData::default()
|
||||
};
|
||||
|
||||
Some(Table {
|
||||
horizontal,
|
||||
vertical,
|
||||
})
|
||||
}
|
||||
}
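// Illustrative usage sketch (not from the upstream file): assumes `trak_data`
// holds the raw `trak` table. Per Apple's spec, the track with value 0.0 is
// the "normal" tracking setting.
#[allow(dead_code)]
fn example_normal_track(trak_data: &[u8]) -> Option<Track<'_>> {
    let table = Table::parse(trak_data)?;
    // Each track pairs a tracking value with one per-size adjustment for
    // every entry in `horizontal.sizes`.
    table.horizontal.tracks.into_iter().find(|t| t.value == 0.0)
}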
|
|
@ -0,0 +1,41 @@
|
|||
//! A [Vertical Header Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/vhea) implementation.
|
||||
|
||||
use crate::parser::Stream;
|
||||
|
||||
/// A [Vertical Header Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vhea).
|
||||
#[derive(Clone, Copy, Default, Debug)]
|
||||
pub struct Table {
|
||||
/// Face ascender.
|
||||
pub ascender: i16,
|
||||
/// Face descender.
|
||||
pub descender: i16,
|
||||
/// Face line gap.
|
||||
pub line_gap: i16,
|
||||
/// Number of metrics in the `vmtx` table.
|
||||
pub number_of_metrics: u16,
|
||||
}
|
||||
|
||||
impl Table {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &[u8]) -> Option<Self> {
|
||||
if data.len() != 36 {
|
||||
return None
|
||||
}
|
||||
|
||||
let mut s = Stream::new(data);
|
||||
s.skip::<u32>(); // version
|
||||
let ascender = s.read::<i16>()?;
|
||||
let descender = s.read::<i16>()?;
|
||||
let line_gap = s.read::<i16>()?;
|
||||
s.advance(24);
|
||||
let number_of_metrics = s.read::<u16>()?;
|
||||
|
||||
Some(Table {
|
||||
ascender,
|
||||
descender,
|
||||
line_gap,
|
||||
number_of_metrics,
|
||||
})
|
||||
}
|
||||
}
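// Illustrative usage sketch (not from the upstream file): assumes `vhea_data`
// holds the raw, 36-byte `vhea` table.
#[allow(dead_code)]
fn example_vertical_metrics(vhea_data: &[u8]) -> Option<(i16, i16, i16)> {
    let table = Table::parse(vhea_data)?;
    // These describe the vertical layout axis, mirroring `hhea` for horizontal layout.
    Some((table.ascender, table.descender, table.line_gap))
}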
|
|
@ -0,0 +1,68 @@
|
|||
//! A [Vertical Origin Table](
|
||||
//! https://docs.microsoft.com/en-us/typography/opentype/spec/vorg) implementation.
|
||||
|
||||
use crate::GlyphId;
|
||||
use crate::parser::{Stream, FromData, LazyArray16};
|
||||
|
||||
/// Vertical origin metrics for the
|
||||
/// [Vertical Origin Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vorg).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct VerticalOriginMetrics {
|
||||
/// Glyph ID.
|
||||
pub glyph_id: GlyphId,
|
||||
/// Y coordinate, in the font's design coordinate system, of the vertical origin.
|
||||
pub y: i16,
|
||||
}
|
||||
|
||||
impl FromData for VerticalOriginMetrics {
|
||||
const SIZE: usize = 4;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(VerticalOriginMetrics {
|
||||
glyph_id: s.read::<GlyphId>()?,
|
||||
y: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
/// A [Vertical Origin Table](https://docs.microsoft.com/en-us/typography/opentype/spec/vorg).
|
||||
#[derive(Clone, Copy, Debug)]
|
||||
pub struct Table<'a> {
|
||||
/// Default origin.
|
||||
pub default_y: i16,
|
||||
/// A list of metrics for each glyph.
|
||||
///
|
||||
/// Ordered by `glyph_id`.
|
||||
pub metrics: LazyArray16<'a, VerticalOriginMetrics>,
|
||||
}
|
||||
|
||||
impl<'a> Table<'a> {
|
||||
/// Parses a table from raw data.
|
||||
pub fn parse(data: &'a [u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
|
||||
let version = s.read::<u32>()?;
|
||||
if version != 0x00010000 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let default_y = s.read::<i16>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let metrics = s.read_array16::<VerticalOriginMetrics>(count)?;
|
||||
|
||||
Some(Table {
|
||||
default_y,
|
||||
metrics,
|
||||
})
|
||||
}
|
||||
|
||||
/// Returns the glyph's Y origin.
|
||||
pub fn glyph_y_origin(&self, glyph_id: GlyphId) -> i16 {
|
||||
self.metrics.binary_search_by(|m| m.glyph_id.cmp(&glyph_id))
|
||||
.map(|(_, m)| m.y)
|
||||
.unwrap_or(self.default_y)
|
||||
}
|
||||
}
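// Illustrative usage sketch (not from the upstream file): assumes `vorg_data`
// holds the raw `VORG` table; glyph 2 is an arbitrary example ID.
#[allow(dead_code)]
fn example_y_origin(vorg_data: &[u8]) -> Option<i16> {
    // Binary-searches the glyph-ID-ordered metrics and falls back to
    // `default_y` for glyphs without an explicit record.
    Some(Table::parse(vorg_data)?.glyph_y_origin(GlyphId(2)))
}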
|
|
@ -0,0 +1,193 @@
|
|||
//! Implementation of Item Variation Store
|
||||
//!
|
||||
//! <https://docs.microsoft.com/en-us/typography/opentype/spec/otvarcommonformats#item-variation-store>
|
||||
|
||||
use crate::NormalizedCoordinate;
|
||||
use crate::parser::{Stream, FromData, LazyArray16, NumFrom};
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
pub(crate) struct ItemVariationStore<'a> {
|
||||
data: &'a [u8],
|
||||
data_offsets: LazyArray16<'a, u32>,
|
||||
pub regions: VariationRegionList<'a>,
|
||||
}
|
||||
|
||||
impl<'a> Default for ItemVariationStore<'a> {
|
||||
#[inline]
|
||||
fn default() -> Self {
|
||||
ItemVariationStore {
|
||||
data: &[],
|
||||
data_offsets: LazyArray16::new(&[]),
|
||||
regions: VariationRegionList {
|
||||
axis_count: 0,
|
||||
regions: LazyArray16::new(&[]),
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> ItemVariationStore<'a> {
|
||||
#[inline]
|
||||
pub fn parse(mut s: Stream) -> Option<ItemVariationStore> {
|
||||
let data = s.tail()?;
|
||||
|
||||
let mut regions_s = s.clone();
|
||||
let format = s.read::<u16>()?;
|
||||
if format != 1 {
|
||||
return None;
|
||||
}
|
||||
|
||||
let region_list_offset = s.read::<u32>()?;
|
||||
let count = s.read::<u16>()?;
|
||||
let offsets = s.read_array16::<u32>(count)?;
|
||||
|
||||
let regions = {
|
||||
regions_s.advance(usize::num_from(region_list_offset));
|
||||
// TODO: should be the same as in `fvar`
|
||||
let axis_count = regions_s.read::<u16>()?;
|
||||
let count = regions_s.read::<u16>()?;
|
||||
let total = count.checked_mul(axis_count)?;
|
||||
VariationRegionList {
|
||||
axis_count,
|
||||
regions: regions_s.read_array16::<RegionAxisCoordinatesRecord>(total)?,
|
||||
}
|
||||
};
|
||||
|
||||
Some(ItemVariationStore { data, data_offsets: offsets, regions })
|
||||
}
|
||||
|
||||
pub fn region_indices(&self, index: u16) -> Option<LazyArray16<u16>> {
|
||||
// Offsets in bytes from the start of the item variation store
|
||||
// to each item variation data subtable.
|
||||
let offset = self.data_offsets.get(index)?;
|
||||
let mut s = Stream::new_at(self.data, usize::num_from(offset))?;
|
||||
s.skip::<u16>(); // item_count
|
||||
s.skip::<u16>(); // short_delta_count
|
||||
let count = s.read::<u16>()?;
|
||||
s.read_array16::<u16>(count)
|
||||
}
|
||||
|
||||
pub fn parse_delta(
|
||||
&self,
|
||||
outer_index: u16,
|
||||
inner_index: u16,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
) -> Option<f32> {
|
||||
let offset = self.data_offsets.get(outer_index)?;
|
||||
let mut s = Stream::new_at(self.data, usize::num_from(offset))?;
|
||||
let item_count = s.read::<u16>()?;
|
||||
let short_delta_count = s.read::<u16>()?;
|
||||
let region_index_count = s.read::<u16>()?;
|
||||
let region_indices = s.read_array16::<u16>(region_index_count)?;
|
||||
|
||||
if inner_index >= item_count {
|
||||
return None;
|
||||
}
|
||||
|
||||
let delta_set_len = usize::from(short_delta_count) + usize::from(region_index_count);
|
||||
s.advance(usize::from(inner_index).checked_mul(delta_set_len)?);
|
||||
|
||||
let mut delta = 0.0;
|
||||
let mut i = 0;
|
||||
while i < short_delta_count {
|
||||
let idx = region_indices.get(i)?;
|
||||
delta += f32::from(s.read::<i16>()?) * self.regions.evaluate_region(idx, coordinates);
|
||||
i += 1;
|
||||
}
|
||||
|
||||
while i < region_index_count {
|
||||
let idx = region_indices.get(i)?;
|
||||
delta += f32::from(s.read::<i8>()?) * self.regions.evaluate_region(idx, coordinates);
|
||||
i += 1;
|
||||
}
|
||||
|
||||
Some(delta)
|
||||
}
|
||||
}
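// Illustrative sketch (not from the upstream file) of resolving one delta:
// `store` and `coords` are assumed to come from a parsed variable face, and
// (outer, inner) is a delta-set index as referenced by e.g. `HVAR`.
#[allow(dead_code)]
fn example_resolve_delta(
    store: &ItemVariationStore<'_>,
    coords: &[NormalizedCoordinate],
    outer: u16,
    inner: u16,
) -> f32 {
    // delta = sum(word deltas * region scalar) + sum(byte deltas * region scalar),
    // where each region scalar is the product of its per-axis factors.
    store.parse_delta(outer, inner, coords).unwrap_or(0.0)
}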
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
pub struct VariationRegionList<'a> {
|
||||
axis_count: u16,
|
||||
regions: LazyArray16<'a, RegionAxisCoordinatesRecord>,
|
||||
}
|
||||
|
||||
impl<'a> VariationRegionList<'a> {
|
||||
#[inline]
|
||||
pub(crate) fn evaluate_region(
|
||||
&self,
|
||||
index: u16,
|
||||
coordinates: &[NormalizedCoordinate],
|
||||
) -> f32 {
|
||||
let mut v = 1.0;
|
||||
for (i, coord) in coordinates.iter().enumerate() {
|
||||
let region = match self.regions.get(index * self.axis_count + i as u16) {
|
||||
Some(r) => r,
|
||||
None => return 0.0,
|
||||
};
|
||||
|
||||
let factor = region.evaluate_axis(coord.get());
|
||||
if factor == 0.0 {
|
||||
return 0.0;
|
||||
}
|
||||
|
||||
v *= factor;
|
||||
}
|
||||
|
||||
v
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
#[derive(Clone, Copy)]
|
||||
struct RegionAxisCoordinatesRecord {
|
||||
start_coord: i16,
|
||||
peak_coord: i16,
|
||||
end_coord: i16,
|
||||
}
|
||||
|
||||
impl RegionAxisCoordinatesRecord {
|
||||
#[inline]
|
||||
pub fn evaluate_axis(&self, coord: i16) -> f32 {
|
||||
let start = self.start_coord;
|
||||
let peak = self.peak_coord;
|
||||
let end = self.end_coord;
|
||||
|
||||
if start > peak || peak > end {
|
||||
return 1.0;
|
||||
}
|
||||
|
||||
if start < 0 && end > 0 && peak != 0 {
|
||||
return 1.0;
|
||||
}
|
||||
|
||||
if peak == 0 || coord == peak {
|
||||
return 1.0;
|
||||
}
|
||||
|
||||
if coord <= start || end <= coord {
|
||||
return 0.0;
|
||||
}
|
||||
|
||||
if coord < peak {
|
||||
f32::from(coord - start) / f32::from(peak - start)
|
||||
} else {
|
||||
f32::from(end - coord) / f32::from(end - peak)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl FromData for RegionAxisCoordinatesRecord {
|
||||
const SIZE: usize = 6;
|
||||
|
||||
#[inline]
|
||||
fn parse(data: &[u8]) -> Option<Self> {
|
||||
let mut s = Stream::new(data);
|
||||
Some(RegionAxisCoordinatesRecord {
|
||||
start_coord: s.read::<i16>()?,
|
||||
peak_coord: s.read::<i16>()?,
|
||||
end_coord: s.read::<i16>()?,
|
||||
})
|
||||
}
|
||||
}
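// Illustrative sketch (not from the upstream file) of the per-axis factor:
// with start = 0, peak = 8192 and end = 16384 (0.0, 0.5 and 1.0 in 2.14
// fixed point), a coordinate of 4096 sits halfway up the ramp, so the
// region contributes a factor of 0.5 on this axis.
#[allow(dead_code)]
fn example_axis_factor() -> f32 {
    let region = RegionAxisCoordinatesRecord {
        start_coord: 0,
        peak_coord: 8192,
        end_coord: 16384,
    };
    region.evaluate_axis(4096) // 4096 / 8192 == 0.5
}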
|
|
@ -0,0 +1,7 @@
|
|||
*.o
moc_*.cpp
moc_*.h
ui_*.h
font-view.app
Makefile
.qmake.stash
|
|
@ -0,0 +1,33 @@
|
|||
# font-view

A simple tool to preview all glyphs in a font using `ttf-parser`, `freetype`, and `harfbuzz`.

## Build

```sh
# build the ttf-parser C API first
cargo build --release --manifest-path ../../c-api/Cargo.toml

# build only with ttf-parser support
qmake
make

# or build with freetype support
qmake DEFINES+=WITH_FREETYPE
make

# or build with harfbuzz support
# note that harfbuzz must be built from source using meson,
# because we're using an unstable API
#
# build harfbuzz first
meson builddir -Dexperimental_api=true --buildtype release
ninja -C builddir
# build font-view
qmake DEFINES+=WITH_HARFBUZZ HARFBUZZ_SRC=/path/to/harfbuzz-master/
make

# or with all
qmake DEFINES+=WITH_FREETYPE DEFINES+=WITH_HARFBUZZ HARFBUZZ_SRC=/path/to/harfbuzz-master/
make
```
|
|
@ -0,0 +1,50 @@
|
|||
QT += widgets
|
||||
|
||||
CONFIG += c++14
|
||||
|
||||
CONFIG(release, debug|release): LIBS += -L$$PWD/../../c-api/target/release/ -lttfparser
|
||||
else:CONFIG(debug, debug|release): LIBS += -L$$PWD/../../c-api/target/debug/ -lttfparser
|
||||
|
||||
INCLUDEPATH += $$PWD/../../c-api
|
||||
DEPENDPATH += $$PWD/../../c-api
|
||||
|
||||
SOURCES += \
|
||||
glyphsview.cpp \
|
||||
main.cpp \
|
||||
mainwindow.cpp \
|
||||
ttfparserfont.cpp
|
||||
|
||||
HEADERS += \
|
||||
glyph.h \
|
||||
glyphsview.h \
|
||||
mainwindow.h \
|
||||
ttfparserfont.h
|
||||
|
||||
FORMS += \
|
||||
mainwindow.ui
|
||||
|
||||
macx {
|
||||
QT_CONFIG -= no-pkg-config
|
||||
PKG_CONFIG = /opt/homebrew/bin/pkg-config
|
||||
}
|
||||
|
||||
# qmake DEFINES+=WITH_FREETYPE
|
||||
contains(DEFINES, WITH_FREETYPE) {
|
||||
SOURCES += freetypefont.cpp
|
||||
HEADERS += freetypefont.h
|
||||
|
||||
CONFIG += link_pkgconfig
|
||||
PKGCONFIG += freetype2
|
||||
}
|
||||
|
||||
# qmake DEFINES+=WITH_HARFBUZZ HARFBUZZ_SRC=/path/to/harfbuzz-master/
|
||||
contains(DEFINES, WITH_HARFBUZZ) {
|
||||
DEFINES += HB_EXPERIMENTAL_API
|
||||
|
||||
SOURCES += harfbuzzfont.cpp
|
||||
HEADERS += harfbuzzfont.h
|
||||
|
||||
# harfbuzz should be built with meson
|
||||
LIBS += -L$$HARFBUZZ_SRC/builddir/src/ -lharfbuzz
|
||||
INCLUDEPATH += $$HARFBUZZ_SRC/src
|
||||
}
|
|
@ -0,0 +1,175 @@
|
|||
// Based on https://www.freetype.org/freetype2/docs/tutorial/example5.cpp
|
||||
|
||||
#include <QDebug>
|
||||
|
||||
#include "freetypefont.h"
|
||||
|
||||
const FT_Fixed MULTIPLIER_FT = 65536L;
|
||||
|
||||
const char* getErrorMessage(FT_Error err)
|
||||
{
|
||||
#undef __FTERRORS_H__
|
||||
#define FT_ERRORDEF( e, v, s ) case e: return s;
|
||||
#define FT_ERROR_START_LIST switch (err) {
|
||||
#define FT_ERROR_END_LIST }
|
||||
#include FT_ERRORS_H
|
||||
return "(Unknown error)";
|
||||
}
|
||||
|
||||
struct Outliner
|
||||
{
|
||||
static int moveToFn(const FT_Vector *to, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.moveTo(to->x, to->y);
|
||||
return 0;
|
||||
}
|
||||
|
||||
static int lineToFn(const FT_Vector *to, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.lineTo(to->x, to->y);
|
||||
return 0;
|
||||
}
|
||||
|
||||
static int quadToFn(const FT_Vector *control, const FT_Vector *to, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.quadTo(control->x, control->y, to->x, to->y);
|
||||
return 0;
|
||||
}
|
||||
|
||||
static int cubicToFn(const FT_Vector *controlOne,
|
||||
const FT_Vector *controlTwo,
|
||||
const FT_Vector *to,
|
||||
void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.cubicTo(controlOne->x, controlOne->y, controlTwo->x, controlTwo->y, to->x, to->y);
|
||||
return 0;
|
||||
}
|
||||
|
||||
QPainterPath path;
|
||||
};
|
||||
|
||||
FreeTypeFont::FreeTypeFont()
|
||||
{
|
||||
const auto error = FT_Init_FreeType(&m_ftLibrary);
|
||||
if (error) {
|
||||
throw tr("Failed to init FreeType.\n%1").arg(getErrorMessage(error));
|
||||
}
|
||||
}
|
||||
|
||||
FreeTypeFont::~FreeTypeFont()
|
||||
{
|
||||
if (m_ftFace) {
|
||||
FT_Done_Face(m_ftFace);
|
||||
}
|
||||
|
||||
FT_Done_FreeType(m_ftLibrary);
|
||||
}
|
||||
|
||||
void FreeTypeFont::open(const QString &path, const quint32 index)
|
||||
{
|
||||
if (isOpen()) {
|
||||
FT_Done_Face(m_ftFace);
|
||||
m_ftFace = nullptr;
|
||||
}
|
||||
|
||||
const auto utf8Path = path.toUtf8();
|
||||
const auto error = FT_New_Face(m_ftLibrary, utf8Path.constData(), index, &m_ftFace);
|
||||
if (error) {
|
||||
throw tr("Failed to open a font.\n%1").arg(getErrorMessage(error));
|
||||
}
|
||||
}
|
||||
|
||||
bool FreeTypeFont::isOpen() const
|
||||
{
|
||||
return m_ftFace != nullptr;
|
||||
}
|
||||
|
||||
FontInfo FreeTypeFont::fontInfo() const
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
return FontInfo {
|
||||
m_ftFace->ascender,
|
||||
m_ftFace->height,
|
||||
(quint16)m_ftFace->num_glyphs, // TrueType allows only u16.
|
||||
};
|
||||
}
|
||||
|
||||
Glyph FreeTypeFont::outline(const quint16 gid) const
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
auto error = FT_Load_Glyph(m_ftFace, gid, FT_LOAD_NO_SCALE | FT_LOAD_NO_BITMAP);
|
||||
if (error) {
|
||||
throw tr("Failed to load a glyph.\n%1").arg(getErrorMessage(error));
|
||||
}
|
||||
|
||||
Outliner outliner;
|
||||
|
||||
FT_Outline_Funcs funcs;
|
||||
funcs.move_to = outliner.moveToFn;
|
||||
funcs.line_to = outliner.lineToFn;
|
||||
funcs.conic_to = outliner.quadToFn;
|
||||
funcs.cubic_to = outliner.cubicToFn;
|
||||
funcs.shift = 0;
|
||||
funcs.delta = 0;
|
||||
|
||||
auto slot = m_ftFace->glyph;
|
||||
auto &outline = slot->outline;
|
||||
|
||||
// Flip outline around x-axis.
|
||||
FT_Matrix matrix;
|
||||
matrix.xx = 1L * MULTIPLIER_FT;
|
||||
matrix.xy = 0L * MULTIPLIER_FT;
|
||||
matrix.yx = 0L * MULTIPLIER_FT;
|
||||
matrix.yy = -1L * MULTIPLIER_FT;
|
||||
FT_Outline_Transform(&outline, &matrix);
|
||||
|
||||
FT_BBox bboxFt;
|
||||
FT_Outline_Get_BBox(&outline, &bboxFt);
|
||||
|
||||
const QRect bbox(
|
||||
(int)bboxFt.xMin,
|
||||
(int)bboxFt.yMin,
|
||||
(int)bboxFt.xMax - (int)bboxFt.xMin,
|
||||
(int)bboxFt.yMax - (int)bboxFt.yMin
|
||||
);
|
||||
|
||||
error = FT_Outline_Decompose(&outline, &funcs, &outliner);
|
||||
if (error) {
|
||||
throw tr("Failed to outline a glyph.\n%1").arg(getErrorMessage(error));
|
||||
}
|
||||
|
||||
outliner.path.setFillRule(Qt::WindingFill);
|
||||
|
||||
return Glyph {
|
||||
outliner.path,
|
||||
bbox,
|
||||
};
|
||||
}
|
||||
|
||||
void FreeTypeFont::setVariations(const QVector<Variation> &variations)
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
QVector<FT_Fixed> ftCoords;
|
||||
|
||||
for (const auto &var : variations) {
|
||||
ftCoords << var.value * MULTIPLIER_FT;
|
||||
}
|
||||
|
||||
const auto error = FT_Set_Var_Design_Coordinates(m_ftFace, ftCoords.size(), ftCoords.data());
|
||||
if (error) {
|
||||
throw tr("Failed to set variation.\n%1").arg(getErrorMessage(error));
|
||||
}
|
||||
}
|
|
@ -0,0 +1,32 @@
|
|||
#pragma once
|
||||
|
||||
#include <ft2build.h>
|
||||
#include FT_FREETYPE_H
|
||||
#include FT_OUTLINE_H
|
||||
#include FT_BBOX_H
|
||||
#include FT_MULTIPLE_MASTERS_H
|
||||
|
||||
#include <QCoreApplication>
|
||||
|
||||
#include "glyph.h"
|
||||
|
||||
class FreeTypeFont
|
||||
{
|
||||
Q_DECLARE_TR_FUNCTIONS(FreeTypeFont)
|
||||
|
||||
public:
|
||||
FreeTypeFont();
|
||||
~FreeTypeFont();
|
||||
|
||||
void open(const QString &path, const quint32 index = 0);
|
||||
bool isOpen() const;
|
||||
|
||||
FontInfo fontInfo() const;
|
||||
Glyph outline(const quint16 gid) const;
|
||||
|
||||
void setVariations(const QVector<Variation> &variations);
|
||||
|
||||
private:
|
||||
FT_Library m_ftLibrary = nullptr;
|
||||
FT_Face m_ftFace = nullptr;
|
||||
};
|
|
@ -0,0 +1,48 @@
|
|||
#pragma once
|
||||
|
||||
#include <QPainterPath>
|
||||
|
||||
struct Tag
|
||||
{
|
||||
Tag(quint32 v) : value(v) {}
|
||||
|
||||
QString toString() const
|
||||
{
|
||||
QString s;
|
||||
s.append(QChar(value >> 24 & 0xff));
|
||||
s.append(QChar(value >> 16 & 0xff));
|
||||
s.append(QChar(value >> 8 & 0xff));
|
||||
s.append(QChar(value >> 0 & 0xff));
|
||||
return s;
|
||||
}
|
||||
|
||||
quint32 value;
|
||||
};
|
||||
|
||||
struct FontInfo
|
||||
{
|
||||
qint16 ascender = 0;
|
||||
qint16 height = 1000;
|
||||
quint16 numberOfGlyphs = 0;
|
||||
};
|
||||
|
||||
struct Glyph
|
||||
{
|
||||
QPainterPath outline;
|
||||
QRect bbox;
|
||||
};
|
||||
|
||||
struct VariationInfo
|
||||
{
|
||||
QString name;
|
||||
Tag tag;
|
||||
qint16 min = 0;
|
||||
qint16 def = 0;
|
||||
qint16 max = 0;
|
||||
};
|
||||
|
||||
struct Variation
|
||||
{
|
||||
Tag tag;
|
||||
int value;
|
||||
};
|
|
@ -0,0 +1,286 @@
|
|||
#include <QDebug>
|
||||
#include <QMouseEvent>
|
||||
#include <QPainter>
|
||||
#include <QScrollBar>
|
||||
|
||||
#include <cmath>
|
||||
|
||||
#include "glyphsview.h"
|
||||
|
||||
static const int COLUMNS_COUNT = 100;
|
||||
|
||||
GlyphsView::GlyphsView(QWidget *parent) : QAbstractScrollArea(parent)
|
||||
{
|
||||
setHorizontalScrollBarPolicy(Qt::ScrollBarAlwaysOn);
|
||||
setVerticalScrollBarPolicy(Qt::ScrollBarAlwaysOn);
|
||||
}
|
||||
|
||||
void GlyphsView::setFontInfo(const FontInfo &fi)
|
||||
{
|
||||
m_fontInfo = fi;
|
||||
m_glyphs.resize(fi.numberOfGlyphs);
|
||||
#ifdef WITH_FREETYPE
|
||||
m_ftGlyphs.resize(fi.numberOfGlyphs);
|
||||
#endif
|
||||
#ifdef WITH_HARFBUZZ
|
||||
m_hbGlyphs.resize(fi.numberOfGlyphs);
|
||||
#endif
|
||||
|
||||
m_indexes.clear();
|
||||
for (int i = 0; i < fi.numberOfGlyphs; ++i) {
|
||||
QStaticText text(QString::number(i));
|
||||
text.prepare();
|
||||
m_indexes << text;
|
||||
}
|
||||
|
||||
updateScrollBars();
|
||||
horizontalScrollBar()->setValue(0);
|
||||
verticalScrollBar()->setValue(0);
|
||||
}
|
||||
|
||||
void GlyphsView::setGlyph(int idx, const Glyph &glyph)
|
||||
{
|
||||
m_glyphs.replace(idx, glyph);
|
||||
}
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
void GlyphsView::setFTGlyph(int idx, const Glyph &glyph)
|
||||
{
|
||||
m_ftGlyphs.replace(idx, glyph);
|
||||
}
|
||||
#endif
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
void GlyphsView::setHBGlyph(int idx, const Glyph &glyph)
|
||||
{
|
||||
m_hbGlyphs.replace(idx, glyph);
|
||||
}
|
||||
#endif
|
||||
|
||||
void GlyphsView::setDrawBboxes(const bool flag)
|
||||
{
|
||||
m_drawBboxes = flag;
|
||||
viewport()->update();
|
||||
}
|
||||
|
||||
void GlyphsView::setDrawGlyphs(const bool flag)
|
||||
{
|
||||
m_drawGlyphs = flag;
|
||||
viewport()->update();
|
||||
}
|
||||
|
||||
void GlyphsView::setDrawFTGlyphs(const bool flag)
|
||||
{
|
||||
m_drawFTGlyphs = flag;
|
||||
viewport()->update();
|
||||
}
|
||||
|
||||
void GlyphsView::setDrawHBGlyphs(const bool flag)
|
||||
{
|
||||
m_drawHBGlyphs = flag;
|
||||
viewport()->update();
|
||||
}
|
||||
|
||||
void GlyphsView::paintEvent(QPaintEvent *)
|
||||
{
|
||||
QPainter p(viewport());
|
||||
p.translate(-horizontalScrollBar()->value(), -verticalScrollBar()->value());
|
||||
|
||||
const double cellHeight = m_fontInfo.height * m_scale;
|
||||
drawGrid(p, cellHeight);
|
||||
|
||||
p.setRenderHint(QPainter::Antialiasing);
|
||||
|
||||
{
|
||||
auto font = p.font();
|
||||
font.setPointSize(10);
|
||||
p.setFont(font);
|
||||
}
|
||||
|
||||
int x = 0;
|
||||
int y = m_fontInfo.ascender;
|
||||
int num_y = m_fontInfo.height;
|
||||
for (int i = 0; i < m_glyphs.size(); ++i) {
|
||||
// Text rendering is the slowest part, so we are using preprocessed text.
|
||||
p.setPen(palette().color(QPalette::Text));
|
||||
p.drawStaticText(
|
||||
qRound(x * m_scale + 1),
|
||||
qRound(num_y * m_scale - p.fontMetrics().ascent() - 2),
|
||||
m_indexes.at(i)
|
||||
);
|
||||
|
||||
if (m_drawGlyphs) {
|
||||
p.save();
|
||||
|
||||
const int dx = qRound((m_fontInfo.height - m_glyphs.at(i).bbox.width()) / 2.0)
|
||||
- m_glyphs.at(i).bbox.x();
|
||||
|
||||
p.scale(m_scale, m_scale);
|
||||
p.translate(x + dx, y);
|
||||
|
||||
if (m_drawBboxes) {
|
||||
p.setPen(QPen(Qt::darkGreen, 0.5 / m_scale));
|
||||
p.setBrush(Qt::NoBrush);
|
||||
p.drawRect(m_glyphs.at(i).bbox);
|
||||
}
|
||||
|
||||
p.setPen(Qt::NoPen);
|
||||
if (m_drawFTGlyphs || m_drawHBGlyphs) {
|
||||
p.setBrush(Qt::red);
|
||||
} else {
|
||||
p.setBrush(palette().color(QPalette::Text));
|
||||
}
|
||||
|
||||
p.drawPath(m_glyphs.at(i).outline);
|
||||
|
||||
p.restore();
|
||||
}
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
if (m_drawHBGlyphs) {
|
||||
p.save();
|
||||
|
||||
const int dx = qRound((m_fontInfo.height - m_hbGlyphs.at(i).bbox.width()) / 2.0)
|
||||
- m_hbGlyphs.at(i).bbox.x();
|
||||
|
||||
p.scale(m_scale, m_scale);
|
||||
p.translate(x + dx, y);
|
||||
|
||||
if (m_drawBboxes) {
|
||||
p.setPen(QPen(Qt::darkGreen, 0.5 / m_scale));
|
||||
p.setBrush(Qt::NoBrush);
|
||||
p.drawRect(m_hbGlyphs.at(i).bbox);
|
||||
}
|
||||
|
||||
p.setPen(Qt::NoPen);
|
||||
if (m_drawFTGlyphs) {
|
||||
p.setBrush(Qt::blue);
|
||||
} else {
|
||||
p.setBrush(palette().color(QPalette::Text));
|
||||
}
|
||||
|
||||
p.drawPath(m_hbGlyphs.at(i).outline);
|
||||
|
||||
p.restore();
|
||||
}
|
||||
#endif
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
if (m_drawFTGlyphs) {
|
||||
p.save();
|
||||
|
||||
const int dx = qRound((m_fontInfo.height - m_ftGlyphs.at(i).bbox.width()) / 2.0)
|
||||
- m_ftGlyphs.at(i).bbox.x();
|
||||
|
||||
p.scale(m_scale, m_scale);
|
||||
p.translate(x + dx, y);
|
||||
|
||||
if (m_drawBboxes) {
|
||||
p.setPen(QPen(Qt::darkGreen, 0.5 / m_scale));
|
||||
p.setBrush(Qt::NoBrush);
|
||||
p.drawRect(m_ftGlyphs.at(i).bbox);
|
||||
}
|
||||
|
||||
p.setPen(Qt::NoPen);
|
||||
p.setBrush(palette().color(QPalette::Text));
|
||||
|
||||
if (m_drawGlyphs || m_drawHBGlyphs) {
|
||||
p.setBrush(palette().color(QPalette::Base));
|
||||
}
|
||||
|
||||
p.drawPath(m_ftGlyphs.at(i).outline);
|
||||
|
||||
p.restore();
|
||||
}
|
||||
#endif
|
||||
|
||||
x += m_fontInfo.height;
|
||||
if (i > 0 && (i + 1) % COLUMNS_COUNT == 0) {
|
||||
x = 0;
|
||||
y += m_fontInfo.height;
|
||||
num_y += m_fontInfo.height;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
void GlyphsView::drawGrid(QPainter &p, const double cellHeight)
|
||||
{
|
||||
p.setRenderHint(QPainter::Antialiasing, false);
|
||||
p.setPen(QPen(palette().color(QPalette::Text), 0.25));
|
||||
p.setBrush(Qt::NoBrush);
|
||||
|
||||
const int rows = qRound(floor(m_glyphs.size() / COLUMNS_COUNT)) + 1;
|
||||
const auto maxH = qMin(rows * cellHeight, (double)horizontalScrollBar()->maximum());
|
||||
|
||||
double x = cellHeight;
|
||||
for (int c = 1; c < COLUMNS_COUNT; ++c) {
|
||||
p.drawLine(QLineF(x, 0, x, maxH));
|
||||
x += cellHeight;
|
||||
}
|
||||
|
||||
double y = cellHeight;
|
||||
for (int r = 1; r <= rows; ++r) {
|
||||
p.drawLine(QLineF(0, y, horizontalScrollBar()->maximum() + viewport()->width(), y));
|
||||
y += cellHeight;
|
||||
}
|
||||
}
|
||||
|
||||
void GlyphsView::mousePressEvent(QMouseEvent *e)
|
||||
{
|
||||
if (e->button() & Qt::LeftButton) {
|
||||
m_mousePressPos = e->pos();
|
||||
m_origOffset = QPoint(horizontalScrollBar()->value(), verticalScrollBar()->value());
|
||||
}
|
||||
}
|
||||
|
||||
void GlyphsView::mouseMoveEvent(QMouseEvent *e)
|
||||
{
|
||||
if (m_mousePressPos.isNull()) {
|
||||
return;
|
||||
}
|
||||
|
||||
const auto diff = m_mousePressPos - e->pos();
|
||||
horizontalScrollBar()->setValue(m_origOffset.x() + diff.x());
|
||||
verticalScrollBar()->setValue(m_origOffset.y() + diff.y());
|
||||
}
|
||||
|
||||
void GlyphsView::mouseReleaseEvent(QMouseEvent *)
|
||||
{
|
||||
m_mousePressPos = QPoint();
|
||||
m_origOffset = QPoint();
|
||||
}
|
||||
|
||||
void GlyphsView::wheelEvent(QWheelEvent *e)
|
||||
{
|
||||
e->accept();
|
||||
|
||||
if (e->angleDelta().y() > 0) {
|
||||
m_scale += 0.01;
|
||||
} else {
|
||||
m_scale -= 0.01;
|
||||
}
|
||||
|
||||
m_scale = qBound(0.03, m_scale, 1.0);
|
||||
|
||||
updateScrollBars();
|
||||
viewport()->update();
|
||||
}
|
||||
|
||||
void GlyphsView::resizeEvent(QResizeEvent *e)
|
||||
{
|
||||
QAbstractScrollArea::resizeEvent(e);
|
||||
updateScrollBars();
|
||||
}
|
||||
|
||||
void GlyphsView::updateScrollBars()
|
||||
{
|
||||
const double cellHeight = m_fontInfo.height * m_scale;
|
||||
const int rows = qRound(floor(m_glyphs.size() / COLUMNS_COUNT)) + 1;
|
||||
const auto w = COLUMNS_COUNT * cellHeight - viewport()->width();
|
||||
const auto h = rows * cellHeight - viewport()->height();
|
||||
horizontalScrollBar()->setMinimum(0);
|
||||
verticalScrollBar()->setMinimum(0);
|
||||
horizontalScrollBar()->setMaximum(qMax(0, qRound(w)));
|
||||
verticalScrollBar()->setMaximum(qMax(0, qRound(h)));
|
||||
}
|
|
@ -0,0 +1,61 @@
|
|||
#pragma once
|
||||
|
||||
#include <QAbstractScrollArea>
|
||||
#include <QStaticText>
|
||||
|
||||
#include "glyph.h"
|
||||
|
||||
class GlyphsView : public QAbstractScrollArea
|
||||
{
|
||||
Q_OBJECT
|
||||
|
||||
public:
|
||||
explicit GlyphsView(QWidget *parent = nullptr);
|
||||
|
||||
void setFontInfo(const FontInfo &fi);
|
||||
void setGlyph(int idx, const Glyph &glyph);
|
||||
#ifdef WITH_FREETYPE
|
||||
void setFTGlyph(int idx, const Glyph &glyph);
|
||||
#endif
|
||||
#ifdef WITH_HARFBUZZ
|
||||
void setHBGlyph(int idx, const Glyph &glyph);
|
||||
#endif
|
||||
|
||||
void setDrawBboxes(const bool flag);
|
||||
void setDrawGlyphs(const bool flag);
|
||||
void setDrawFTGlyphs(const bool flag);
|
||||
void setDrawHBGlyphs(const bool flag);
|
||||
|
||||
private:
|
||||
void paintEvent(QPaintEvent *);
|
||||
void drawGrid(QPainter &p, const double cellHeight);
|
||||
|
||||
void mousePressEvent(QMouseEvent *e);
|
||||
void mouseMoveEvent(QMouseEvent *e);
|
||||
void mouseReleaseEvent(QMouseEvent *e);
|
||||
void wheelEvent(QWheelEvent *e);
|
||||
|
||||
void resizeEvent(QResizeEvent *);
|
||||
|
||||
void updateScrollBars();
|
||||
|
||||
private:
|
||||
QPoint m_mousePressPos;
|
||||
QPoint m_origOffset;
|
||||
|
||||
double m_scale = 0.05;
|
||||
bool m_drawBboxes = true;
|
||||
bool m_drawGlyphs = true;
|
||||
bool m_drawFTGlyphs = false;
|
||||
bool m_drawHBGlyphs = false;
|
||||
|
||||
FontInfo m_fontInfo;
|
||||
QVector<Glyph> m_glyphs;
|
||||
#ifdef WITH_FREETYPE
|
||||
QVector<Glyph> m_ftGlyphs;
|
||||
#endif
|
||||
#ifdef WITH_HARFBUZZ
|
||||
QVector<Glyph> m_hbGlyphs;
|
||||
#endif
|
||||
QVector<QStaticText> m_indexes;
|
||||
};
|
|
@ -0,0 +1,160 @@
|
|||
#include <QTransform>
|
||||
#include <QDebug>
|
||||
|
||||
#include <hb.h>
|
||||
|
||||
#include "harfbuzzfont.h"
|
||||
|
||||
struct Outliner
|
||||
{
|
||||
static void moveToFn(hb_position_t to_x, hb_position_t to_y, Outliner &outliner)
|
||||
{
|
||||
outliner.path.moveTo(to_x, to_y);
|
||||
}
|
||||
|
||||
static void lineToFn(hb_position_t to_x, hb_position_t to_y, Outliner &outliner)
|
||||
{
|
||||
outliner.path.lineTo(to_x, to_y);
|
||||
}
|
||||
|
||||
static void quadToFn(hb_position_t control_x, hb_position_t control_y,
|
||||
hb_position_t to_x, hb_position_t to_y,
|
||||
Outliner &outliner)
|
||||
{
|
||||
outliner.path.quadTo(control_x, control_y, to_x, to_y);
|
||||
}
|
||||
|
||||
static void cubicToFn(hb_position_t control1_x, hb_position_t control1_y,
|
||||
hb_position_t control2_x, hb_position_t control2_y,
|
||||
hb_position_t to_x, hb_position_t to_y,
|
||||
Outliner &outliner)
|
||||
{
|
||||
outliner.path.cubicTo(control1_x, control1_y, control2_x, control2_y, to_x, to_y);
|
||||
}
|
||||
|
||||
static void closePathFn(Outliner &outliner)
|
||||
{
|
||||
outliner.path.closeSubpath();
|
||||
}
|
||||
|
||||
QPainterPath path;
|
||||
};
|
||||
|
||||
HarfBuzzFont::HarfBuzzFont()
|
||||
{
|
||||
|
||||
}
|
||||
|
||||
HarfBuzzFont::~HarfBuzzFont()
|
||||
{
|
||||
reset();
|
||||
}
|
||||
|
||||
void HarfBuzzFont::open(const QString &path, const quint32 index)
|
||||
{
|
||||
if (isOpen()) {
|
||||
reset();
|
||||
}
|
||||
|
||||
const auto utf8Path = path.toUtf8();
|
||||
hb_blob_t *blob = hb_blob_create_from_file(utf8Path.constData());
|
||||
if (!blob) {
|
||||
throw tr("Failed to open a font.");
|
||||
}
|
||||
|
||||
hb_face_t *face = hb_face_create(blob, index);
|
||||
if (!face) {
|
||||
throw tr("Failed to open a font.");
|
||||
}
|
||||
|
||||
hb_font_t *font = hb_font_create(face);
|
||||
if (!font) {
|
||||
throw tr("Failed to open a font.");
|
||||
}
|
||||
|
||||
m_blob = blob;
|
||||
m_face = face;
|
||||
m_font = font;
|
||||
}
|
||||
|
||||
bool HarfBuzzFont::isOpen() const
|
||||
{
|
||||
return m_font != nullptr;
|
||||
}
|
||||
|
||||
Glyph HarfBuzzFont::outline(const quint16 gid) const
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
Outliner outliner;
|
||||
|
||||
hb_draw_funcs_t *funcs = hb_draw_funcs_create();
|
||||
hb_draw_funcs_set_move_to_func(funcs, (hb_draw_move_to_func_t)outliner.moveToFn);
|
||||
hb_draw_funcs_set_line_to_func(funcs, (hb_draw_line_to_func_t)outliner.lineToFn);
|
||||
hb_draw_funcs_set_quadratic_to_func(funcs, (hb_draw_quadratic_to_func_t)outliner.quadToFn);
|
||||
hb_draw_funcs_set_cubic_to_func(funcs, (hb_draw_cubic_to_func_t)outliner.cubicToFn);
|
||||
hb_draw_funcs_set_close_path_func(funcs, (hb_draw_close_path_func_t)outliner.closePathFn);
|
||||
|
||||
if (!hb_font_draw_glyph(m_font, gid, funcs, &outliner)) {
|
||||
throw tr("Failed to outline a glyph %1.").arg(gid);
|
||||
}
|
||||
|
||||
hb_draw_funcs_destroy(funcs);
|
||||
|
||||
hb_glyph_extents_t extents = {0, 0, 0, 0};
|
||||
if (!hb_font_get_glyph_extents(m_font, gid, &extents)) {
|
||||
throw tr("Failed to query glyph extents.");
|
||||
}
|
||||
|
||||
const QRect bbox(
|
||||
extents.x_bearing,
|
||||
-extents.y_bearing,
|
||||
extents.width,
|
||||
-extents.height
|
||||
);
|
||||
|
||||
// Flip outline around x-axis.
|
||||
QTransform ts(1, 0, 0, -1, 0, 0);
|
||||
outliner.path = ts.map(outliner.path);
|
||||
|
||||
outliner.path.setFillRule(Qt::WindingFill);
|
||||
|
||||
return Glyph {
|
||||
outliner.path,
|
||||
bbox,
|
||||
};
|
||||
}
|
||||
|
||||
void HarfBuzzFont::setVariations(const QVector<Variation> &variations)
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
QVector<hb_variation_t> hbVariations;
|
||||
for (const auto &var : variations) {
|
||||
hbVariations.append({ var.tag.value, (float)var.value });
|
||||
}
|
||||
|
||||
hb_font_set_variations(m_font, hbVariations.constData(), hbVariations.size());
|
||||
}
|
||||
|
||||
void HarfBuzzFont::reset()
|
||||
{
|
||||
if (m_blob) {
|
||||
hb_blob_destroy(m_blob);
|
||||
m_blob = nullptr;
|
||||
}
|
||||
|
||||
if (m_font) {
|
||||
hb_font_destroy(m_font);
|
||||
m_font = nullptr;
|
||||
}
|
||||
|
||||
if (m_face) {
|
||||
hb_face_destroy(m_face);
|
||||
m_face = nullptr;
|
||||
}
|
||||
}
|
|
@ -0,0 +1,34 @@
|
|||
#pragma once
|
||||
|
||||
#include <QCoreApplication>
|
||||
|
||||
#include "glyph.h"
|
||||
|
||||
struct hb_blob_t;
|
||||
struct hb_face_t;
|
||||
struct hb_font_t;
|
||||
struct hb_draw_funcs_t;
|
||||
|
||||
class HarfBuzzFont
|
||||
{
|
||||
Q_DECLARE_TR_FUNCTIONS(HarfBuzzFont)
|
||||
|
||||
public:
|
||||
HarfBuzzFont();
|
||||
~HarfBuzzFont();
|
||||
|
||||
void open(const QString &path, const quint32 index = 0);
|
||||
bool isOpen() const;
|
||||
|
||||
Glyph outline(const quint16 gid) const;
|
||||
|
||||
void setVariations(const QVector<Variation> &variations);
|
||||
|
||||
private:
|
||||
void reset();
|
||||
|
||||
private:
|
||||
hb_blob_t *m_blob = nullptr;
|
||||
hb_face_t *m_face = nullptr;
|
||||
hb_font_t *m_font = nullptr;
|
||||
};
|
|
@ -0,0 +1,13 @@
|
|||
#include <QApplication>

#include "mainwindow.h"

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);

    MainWindow w;
    w.show();

    return a.exec();
}
|
|
@ -0,0 +1,162 @@
|
|||
#include <QElapsedTimer>
|
||||
#include <QSlider>
|
||||
#include <QTimer>
|
||||
#include <QMessageBox>
|
||||
#include <QDebug>
|
||||
|
||||
#include "mainwindow.h"
|
||||
#include "ui_mainwindow.h"
|
||||
|
||||
MainWindow::MainWindow(QWidget *parent)
|
||||
: QMainWindow(parent)
|
||||
, ui(new Ui::MainWindow)
|
||||
{
|
||||
ui->setupUi(this);
|
||||
|
||||
#ifndef WITH_FREETYPE
|
||||
ui->chBoxDrawFreeType->hide();
|
||||
#endif
|
||||
|
||||
#ifndef WITH_HARFBUZZ
|
||||
ui->chBoxDrawHarfBuzz->hide();
|
||||
#endif
|
||||
|
||||
if (qApp->arguments().size() == 2) {
|
||||
QTimer::singleShot(1, this, [this](){
|
||||
loadFont(qApp->arguments().at(1));
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
MainWindow::~MainWindow()
|
||||
{
|
||||
delete ui;
|
||||
}
|
||||
|
||||
void MainWindow::loadFont(const QString &path)
|
||||
{
|
||||
try {
|
||||
m_ttfpFont.open(path);
|
||||
|
||||
const auto variations = m_ttfpFont.loadVariations();
|
||||
if (!variations.isEmpty()) {
|
||||
ui->widgetVariations->show();
|
||||
|
||||
// Clear layout.
|
||||
while (ui->layVariations->count()) {
|
||||
delete ui->layVariations->takeAt(0);
|
||||
}
|
||||
|
||||
m_variationSliders.clear();
|
||||
|
||||
QVector<Variation> newVariations;
|
||||
|
||||
for (const auto &var : variations) {
|
||||
auto hlay = new QHBoxLayout();
|
||||
hlay->setContentsMargins(0, 0, 0, 0);
|
||||
hlay->addWidget(new QLabel(var.name));
|
||||
|
||||
auto slider = new QSlider(Qt::Horizontal);
|
||||
slider->setMinimum(var.min);
|
||||
slider->setMaximum(var.max);
|
||||
slider->setValue(var.def);
|
||||
hlay->addWidget(slider);
|
||||
ui->layVariations->addLayout(hlay);
|
||||
|
||||
m_variationSliders.append({ slider, var.tag });
|
||||
|
||||
connect(slider, &QSlider::valueChanged, this, &MainWindow::onVariationChanged);
|
||||
|
||||
newVariations.append({ var.tag, var.def });
|
||||
}
|
||||
|
||||
m_ttfpFont.setVariations(newVariations);
|
||||
} else {
|
||||
ui->widgetVariations->hide();
|
||||
}
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
m_ftFont.open(path);
|
||||
#endif
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
m_hbFont.open(path);
|
||||
#endif
|
||||
|
||||
ui->glyphsView->setFontInfo(m_ttfpFont.fontInfo());
|
||||
reloadGlyphs();
|
||||
} catch (const QString &err) {
|
||||
QMessageBox::warning(this, tr("Error"), err);
|
||||
}
|
||||
}
|
||||
|
||||
void MainWindow::reloadGlyphs()
|
||||
{
|
||||
const auto fi = m_ttfpFont.fontInfo();
|
||||
for (quint16 i = 0; i < fi.numberOfGlyphs; ++i) {
|
||||
try {
|
||||
ui->glyphsView->setGlyph(i, m_ttfpFont.outline(i));
|
||||
} catch (...) {
|
||||
}
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
try {
|
||||
ui->glyphsView->setFTGlyph(i, m_ftFont.outline(i));
|
||||
} catch (...) {
|
||||
}
|
||||
#endif
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
try {
|
||||
ui->glyphsView->setHBGlyph(i, m_hbFont.outline(i));
|
||||
} catch (...) {
|
||||
}
|
||||
#endif
|
||||
}
|
||||
|
||||
ui->glyphsView->viewport()->update();
|
||||
}
|
||||
|
||||
void MainWindow::onVariationChanged()
|
||||
{
|
||||
try {
|
||||
QVector<Variation> variations;
|
||||
|
||||
for (auto var : m_variationSliders) {
|
||||
variations.append({ var.tag, var.slider->value() });
|
||||
}
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
m_ftFont.setVariations(variations);
|
||||
#endif
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
m_hbFont.setVariations(variations);
|
||||
#endif
|
||||
m_ttfpFont.setVariations(variations);
|
||||
|
||||
reloadGlyphs();
|
||||
} catch (const QString &err) {
|
||||
QMessageBox::warning(this, tr("Error"), err);
|
||||
}
|
||||
}
|
||||
|
||||
void MainWindow::on_chBoxDrawBboxes_stateChanged(int flag)
|
||||
{
|
||||
ui->glyphsView->setDrawBboxes(flag);
|
||||
}
|
||||
|
||||
void MainWindow::on_chBoxDrawTtfParser_stateChanged(int flag)
|
||||
{
|
||||
ui->glyphsView->setDrawGlyphs(flag);
|
||||
}
|
||||
|
||||
void MainWindow::on_chBoxDrawFreeType_stateChanged(int flag)
|
||||
{
|
||||
ui->glyphsView->setDrawFTGlyphs(flag);
|
||||
}
|
||||
|
||||
void MainWindow::on_chBoxDrawHarfBuzz_stateChanged(int flag)
|
||||
{
|
||||
ui->glyphsView->setDrawHBGlyphs(flag);
|
||||
}
|
|
@ -0,0 +1,54 @@
|
|||
#pragma once
|
||||
|
||||
#include <QMainWindow>
|
||||
|
||||
#ifdef WITH_FREETYPE
|
||||
#include "freetypefont.h"
|
||||
#endif
|
||||
|
||||
#ifdef WITH_HARFBUZZ
|
||||
#include "harfbuzzfont.h"
|
||||
#endif
|
||||
|
||||
#include "ttfparserfont.h"
|
||||
|
||||
namespace Ui { class MainWindow; }
|
||||
|
||||
class QSlider;
|
||||
|
||||
class MainWindow : public QMainWindow
|
||||
{
|
||||
Q_OBJECT
|
||||
|
||||
public:
|
||||
MainWindow(QWidget *parent = nullptr);
|
||||
~MainWindow();
|
||||
|
||||
private:
|
||||
void loadFont(const QString &path);
|
||||
void reloadGlyphs();
|
||||
void onVariationChanged();
|
||||
|
||||
private slots:
|
||||
void on_chBoxDrawBboxes_stateChanged(int flag);
|
||||
void on_chBoxDrawTtfParser_stateChanged(int flag);
|
||||
void on_chBoxDrawFreeType_stateChanged(int flag);
|
||||
void on_chBoxDrawHarfBuzz_stateChanged(int flag);
|
||||
|
||||
private:
|
||||
struct VariationSlider
|
||||
{
|
||||
QSlider *slider;
|
||||
Tag tag;
|
||||
};
|
||||
|
||||
Ui::MainWindow * const ui;
|
||||
QVector<VariationSlider> m_variationSliders;
|
||||
TtfParserFont m_ttfpFont;
|
||||
#ifdef WITH_FREETYPE
|
||||
FreeTypeFont m_ftFont;
|
||||
#endif
|
||||
#ifdef WITH_HARFBUZZ
|
||||
HarfBuzzFont m_hbFont;
|
||||
#endif
|
||||
};
|
|
@ -0,0 +1,156 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<ui version="4.0">
|
||||
<class>MainWindow</class>
|
||||
<widget class="QMainWindow" name="MainWindow">
|
||||
<property name="geometry">
|
||||
<rect>
|
||||
<x>0</x>
|
||||
<y>0</y>
|
||||
<width>800</width>
|
||||
<height>600</height>
|
||||
</rect>
|
||||
</property>
|
||||
<property name="windowTitle">
|
||||
<string>FontView</string>
|
||||
</property>
|
||||
<widget class="QWidget" name="centralwidget">
|
||||
<layout class="QHBoxLayout" name="horizontalLayout">
|
||||
<property name="leftMargin">
|
||||
<number>3</number>
|
||||
</property>
|
||||
<property name="topMargin">
|
||||
<number>3</number>
|
||||
</property>
|
||||
<property name="rightMargin">
|
||||
<number>3</number>
|
||||
</property>
|
||||
<property name="bottomMargin">
|
||||
<number>3</number>
|
||||
</property>
|
||||
<item>
|
||||
<widget class="GlyphsView" name="glyphsView" native="true">
|
||||
<property name="sizePolicy">
|
||||
<sizepolicy hsizetype="Expanding" vsizetype="Preferred">
|
||||
<horstretch>0</horstretch>
|
||||
<verstretch>0</verstretch>
|
||||
</sizepolicy>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<widget class="QWidget" name="sidebar" native="true">
|
||||
<property name="sizePolicy">
|
||||
<sizepolicy hsizetype="Fixed" vsizetype="Preferred">
|
||||
<horstretch>0</horstretch>
|
||||
<verstretch>0</verstretch>
|
||||
</sizepolicy>
|
||||
</property>
|
||||
<property name="minimumSize">
|
||||
<size>
|
||||
<width>200</width>
|
||||
<height>0</height>
|
||||
</size>
|
||||
</property>
|
||||
<layout class="QVBoxLayout" name="verticalLayout">
|
||||
<item>
|
||||
<widget class="QCheckBox" name="chBoxDrawTtfParser">
|
||||
<property name="text">
|
||||
<string>Draw ttf-parser</string>
|
||||
</property>
|
||||
<property name="checked">
|
||||
<bool>true</bool>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<widget class="QCheckBox" name="chBoxDrawFreeType">
|
||||
<property name="text">
|
||||
<string>Draw FreeType</string>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<widget class="QCheckBox" name="chBoxDrawHarfBuzz">
|
||||
<property name="text">
|
||||
<string>Draw HarfBuzz</string>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<widget class="QCheckBox" name="chBoxDrawBboxes">
|
||||
<property name="text">
|
||||
<string>Draw bboxes</string>
|
||||
</property>
|
||||
<property name="checked">
|
||||
<bool>true</bool>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<widget class="QWidget" name="widgetVariations" native="true">
|
||||
<layout class="QVBoxLayout" name="verticalLayout_2">
|
||||
<property name="leftMargin">
|
||||
<number>0</number>
|
||||
</property>
|
||||
<property name="topMargin">
|
||||
<number>0</number>
|
||||
</property>
|
||||
<property name="rightMargin">
|
||||
<number>0</number>
|
||||
</property>
|
||||
<property name="bottomMargin">
|
||||
<number>0</number>
|
||||
</property>
|
||||
<item>
|
||||
<widget class="QLabel" name="label">
|
||||
<property name="text">
|
||||
<string>Variations:</string>
|
||||
</property>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<layout class="QVBoxLayout" name="layVariations"/>
|
||||
</item>
|
||||
</layout>
|
||||
</widget>
|
||||
</item>
|
||||
<item>
|
||||
<spacer name="verticalSpacer">
|
||||
<property name="orientation">
|
||||
<enum>Qt::Vertical</enum>
|
||||
</property>
|
||||
<property name="sizeHint" stdset="0">
|
||||
<size>
|
||||
<width>20</width>
|
||||
<height>40</height>
|
||||
</size>
|
||||
</property>
|
||||
</spacer>
|
||||
</item>
|
||||
</layout>
|
||||
</widget>
|
||||
</item>
|
||||
</layout>
|
||||
</widget>
|
||||
<widget class="QMenuBar" name="menubar">
|
||||
<property name="geometry">
|
||||
<rect>
|
||||
<x>0</x>
|
||||
<y>0</y>
|
||||
<width>800</width>
|
||||
<height>32</height>
|
||||
</rect>
|
||||
</property>
|
||||
</widget>
|
||||
</widget>
|
||||
<customwidgets>
|
||||
<customwidget>
|
||||
<class>GlyphsView</class>
|
||||
<extends>QWidget</extends>
|
||||
<header>glyphsview.h</header>
|
||||
<container>1</container>
|
||||
</customwidget>
|
||||
</customwidgets>
|
||||
<resources/>
|
||||
<connections/>
|
||||
</ui>
|
|
@ -0,0 +1,165 @@
|
|||
#include <QTransform>
|
||||
#include <QFile>
|
||||
#include <QDebug>
|
||||
|
||||
#include "ttfparserfont.h"
|
||||
|
||||
struct Outliner
|
||||
{
|
||||
static void moveToFn(float x, float y, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.moveTo(double(x), double(y));
|
||||
}
|
||||
|
||||
static void lineToFn(float x, float y, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.lineTo(double(x), double(y));
|
||||
}
|
||||
|
||||
static void quadToFn(float x1, float y1, float x, float y, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.quadTo(double(x1), double(y1), double(x), double(y));
|
||||
}
|
||||
|
||||
static void curveToFn(float x1, float y1, float x2, float y2, float x, float y, void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.cubicTo(double(x1), double(y1), double(x2), double(y2), double(x), double(y));
|
||||
}
|
||||
|
||||
static void closePathFn(void *user)
|
||||
{
|
||||
auto self = static_cast<Outliner *>(user);
|
||||
self->path.closeSubpath();
|
||||
}
|
||||
|
||||
QPainterPath path;
|
||||
};
|
||||
|
||||
TtfParserFont::TtfParserFont()
|
||||
{
|
||||
}
|
||||
|
||||
void TtfParserFont::open(const QString &path, const quint32 index)
|
||||
{
|
||||
if (isOpen()) {
|
||||
m_face.reset();
|
||||
}
|
||||
|
||||
QFile file(path);
if (!file.open(QFile::ReadOnly)) {
    throw tr("Failed to open a font.");
}
m_fontData = file.readAll();
|
||||
|
||||
m_face.reset((ttfp_face*)malloc(ttfp_face_size_of()));
|
||||
const auto res = ttfp_face_init(m_fontData.constData(), m_fontData.size(), index, m_face.get());
|
||||
|
||||
if (!res) {
|
||||
throw tr("Failed to open a font.");
|
||||
}
|
||||
}
|
||||
|
||||
bool TtfParserFont::isOpen() const
|
||||
{
|
||||
return m_face != nullptr;
|
||||
}
|
||||
|
||||
FontInfo TtfParserFont::fontInfo() const
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
return FontInfo {
|
||||
ttfp_get_ascender(m_face.get()),
|
||||
ttfp_get_height(m_face.get()),
|
||||
ttfp_get_number_of_glyphs(m_face.get()),
|
||||
};
|
||||
}
|
||||
|
||||
Glyph TtfParserFont::outline(const quint16 gid) const
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
Outliner outliner;
|
||||
ttfp_outline_builder builder;
|
||||
builder.move_to = outliner.moveToFn;
|
||||
builder.line_to = outliner.lineToFn;
|
||||
builder.quad_to = outliner.quadToFn;
|
||||
builder.curve_to = outliner.curveToFn;
|
||||
builder.close_path = outliner.closePathFn;
|
||||
|
||||
ttfp_rect rawBbox;
|
||||
|
||||
const bool ok = ttfp_outline_glyph(
|
||||
m_face.get(),
|
||||
builder,
|
||||
&outliner,
|
||||
gid,
|
||||
&rawBbox
|
||||
);
|
||||
|
||||
if (!ok) {
|
||||
return Glyph {
|
||||
QPainterPath(),
|
||||
QRect(),
|
||||
};
|
||||
}
|
||||
|
||||
const QRect bbox(
|
||||
rawBbox.x_min,
|
||||
-rawBbox.y_max,
|
||||
rawBbox.x_max - rawBbox.x_min,
|
||||
rawBbox.y_max - rawBbox.y_min
|
||||
);
|
||||
|
||||
// Flip the outline around the x-axis: font units have y growing up,
// while Qt's coordinate system has y growing down (hence -y_max in the bbox above).
|
||||
QTransform ts(1, 0, 0, -1, 0, 0);
|
||||
outliner.path = ts.map(outliner.path);
|
||||
|
||||
outliner.path.setFillRule(Qt::WindingFill);
|
||||
|
||||
return Glyph {
|
||||
outliner.path,
|
||||
bbox,
|
||||
};
|
||||
}
|
||||
|
||||
QVector<VariationInfo> TtfParserFont::loadVariations()
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
QVector<VariationInfo> variations;
|
||||
|
||||
for (uint16_t i = 0; i < ttfp_get_variation_axes_count(m_face.get()); ++i) {
|
||||
ttfp_variation_axis axis;
|
||||
ttfp_get_variation_axis(m_face.get(), i, &axis);
|
||||
|
||||
variations.append(VariationInfo {
|
||||
Tag(axis.tag).toString(),
|
||||
{ static_cast<quint32>(axis.tag) },
|
||||
static_cast<qint16>(axis.min_value),
|
||||
static_cast<qint16>(axis.def_value),
|
||||
static_cast<qint16>(axis.max_value),
|
||||
});
|
||||
}
|
||||
|
||||
return variations;
|
||||
}
|
||||
|
||||
void TtfParserFont::setVariations(const QVector<Variation> &variations)
|
||||
{
|
||||
if (!isOpen()) {
|
||||
throw tr("Font is not loaded.");
|
||||
}
|
||||
|
||||
for (const auto &variation : variations) {
|
||||
ttfp_set_variation(m_face.get(), variation.tag.value, variation.value);
|
||||
}
|
||||
}
|
|
@ -0,0 +1,35 @@
|
|||
#pragma once
|
||||
|
||||
#include <QCoreApplication>
|
||||
#include <QPainterPath>
|
||||
|
||||
#include <cstdlib> // free()/malloc() used with ttfp_face
#include <memory>
|
||||
|
||||
#define TTFP_VARIABLE_FONTS
|
||||
#include <ttfparser.h>
|
||||
|
||||
#include "glyph.h"
|
||||
|
||||
class TtfParserFont
|
||||
{
|
||||
Q_DECLARE_TR_FUNCTIONS(TtfParserFont)
|
||||
|
||||
public:
|
||||
TtfParserFont();
|
||||
|
||||
void open(const QString &path, const quint32 index = 0);
|
||||
bool isOpen() const;
|
||||
|
||||
FontInfo fontInfo() const;
|
||||
Glyph outline(const quint16 gid) const;
|
||||
|
||||
QVector<VariationInfo> loadVariations();
|
||||
void setVariations(const QVector<Variation> &variations);
|
||||
|
||||
private:
|
||||
struct FreeCPtr
|
||||
{ void operator()(void* x) { free(x); } };
|
||||
|
||||
QByteArray m_fontData;
|
||||
std::unique_ptr<ttfp_face, FreeCPtr> m_face;
|
||||
};
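// A minimal usage sketch, assuming a valid "font.ttf" path; every method above
// throws a QString on failure, so real callers wrap this in try/catch
// (see MainWindow::loadFont):
//
//   TtfParserFont font;
//   font.open("font.ttf");
//   const FontInfo info = font.fontInfo();
//   for (quint16 gid = 0; gid < info.numberOfGlyphs; ++gid) {
//       const Glyph glyph = font.outline(gid); // QPainterPath + bounding box
//   }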
|
Binary file not shown.
|
@ -0,0 +1,119 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<ttFont sfntVersion="\x00\x01\x00\x00">
|
||||
|
||||
<GlyphOrder>
|
||||
<GlyphID id="0" name=".notdef"/>
|
||||
<GlyphID id="1" name="A"/>
|
||||
</GlyphOrder>
|
||||
|
||||
<head>
|
||||
<tableVersion value="1.0"/>
|
||||
<fontRevision value="1.0"/>
|
||||
<checkSumAdjustment value="0x00000000"/>
|
||||
<magicNumber value="0x5f0f3cf5"/>
|
||||
<flags value="00000000 00000000"/>
|
||||
<unitsPerEm value="1000"/>
|
||||
<created value="Sat Jan 1 00:00:00 2000"/>
|
||||
<modified value="Sat Jan 1 00:00:00 2000"/>
|
||||
<xMin value="0"/>
|
||||
<yMin value="0"/>
|
||||
<xMax value="0"/>
|
||||
<yMax value="0"/>
|
||||
<macStyle value="00000000 00000000"/>
|
||||
<lowestRecPPEM value="3"/>
|
||||
<fontDirectionHint value="2"/>
|
||||
<indexToLocFormat value="0"/>
|
||||
<glyphDataFormat value="0"/>
|
||||
</head>
|
||||
|
||||
<hhea>
|
||||
<tableVersion value="0x00010000"/>
|
||||
<ascent value="1024"/>
|
||||
<descent value="-400"/>
|
||||
<lineGap value="0"/>
|
||||
<advanceWidthMax value="0"/>
|
||||
<minLeftSideBearing value="0"/>
|
||||
<minRightSideBearing value="0"/>
|
||||
<xMaxExtent value="0"/>
|
||||
<caretSlopeRise value="1"/>
|
||||
<caretSlopeRun value="0"/>
|
||||
<caretOffset value="0"/>
|
||||
<reserved0 value="0"/>
|
||||
<reserved1 value="0"/>
|
||||
<reserved2 value="0"/>
|
||||
<reserved3 value="0"/>
|
||||
<metricDataFormat value="0"/>
|
||||
<numberOfHMetrics value="0"/>
|
||||
</hhea>
|
||||
|
||||
<maxp>
|
||||
<tableVersion value="0x10000"/>
|
||||
<numGlyphs value="0"/>
|
||||
<maxPoints value="0"/>
|
||||
<maxContours value="0"/>
|
||||
<maxCompositePoints value="0"/>
|
||||
<maxCompositeContours value="0"/>
|
||||
<maxZones value="0"/>
|
||||
<maxTwilightPoints value="0"/>
|
||||
<maxStorage value="0"/>
|
||||
<maxFunctionDefs value="0"/>
|
||||
<maxInstructionDefs value="0"/>
|
||||
<maxStackElements value="0"/>
|
||||
<maxSizeOfInstructions value="0"/>
|
||||
<maxComponentElements value="0"/>
|
||||
<maxComponentDepth value="0"/>
|
||||
</maxp>
|
||||
|
||||
<hmtx>
|
||||
<mtx name=".notdef" width="600" lsb="100"/>
|
||||
<mtx name="A" width="540" lsb="6"/>
|
||||
</hmtx>
|
||||
|
||||
<cmap>
|
||||
<tableVersion version="0"/>
|
||||
<cmap_format_4 platformID="0" platEncID="3" language="0">
|
||||
<map code="0x41" name="A"/>
|
||||
</cmap_format_4>
|
||||
</cmap>
|
||||
|
||||
<loca>
|
||||
</loca>
|
||||
|
||||
<glyf>
|
||||
<TTGlyph name=".notdef">
|
||||
<contour>
|
||||
<pt x="100" y="0" on="1"/>
|
||||
<pt x="100" y="700" on="1"/>
|
||||
<pt x="600" y="700" on="1"/>
|
||||
<pt x="600" y="0" on="1"/>
|
||||
</contour>
|
||||
<contour>
|
||||
<pt x="140" y="40" on="1"/>
|
||||
<pt x="560" y="40" on="1"/>
|
||||
<pt x="560" y="660" on="1"/>
|
||||
<pt x="140" y="660" on="1"/>
|
||||
</contour>
|
||||
<instructions/>
|
||||
</TTGlyph>
|
||||
|
||||
<TTGlyph name="A">
|
||||
<contour>
|
||||
<pt x="173" y="267" on="1"/>
|
||||
<pt x="369" y="267" on="1"/>
|
||||
<pt x="270" y="587" on="1"/>
|
||||
</contour>
|
||||
<contour>
|
||||
<pt x="6" y="0" on="1"/>
|
||||
<pt x="224" y="656" on="1"/>
|
||||
<pt x="320" y="656" on="1"/>
|
||||
<pt x="541" y="0" on="1"/>
|
||||
<pt x="452" y="0" on="1"/>
|
||||
<pt x="390" y="200" on="1"/>
|
||||
<pt x="151" y="200" on="1"/>
|
||||
<pt x="85" y="0" on="1"/>
|
||||
</contour>
|
||||
<instructions/>
|
||||
</TTGlyph>
|
||||
</glyf>
|
||||
|
||||
</ttFont>
|
|
@ -0,0 +1,389 @@
|
|||
use std::num::NonZeroU16;
|
||||
use ttf_parser::GlyphId;
|
||||
use ttf_parser::apple_layout::Lookup;
|
||||
use crate::{convert, Unit::*};
|
||||
|
||||
mod format0 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // format
|
||||
UInt16(10), // value
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn not_enough_glyphs() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // format
|
||||
UInt16(10), // value
|
||||
]);
|
||||
|
||||
assert!(Lookup::parse(NonZeroU16::new(2).unwrap(), &data).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn too_many_glyphs() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // format
|
||||
UInt16(10), // value
|
||||
UInt16(11), // value <-- will be ignored
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
mod format2 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(2), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(6), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(118), // last glyph
|
||||
UInt16(118), // first glyph
|
||||
UInt16(10), // value
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(118)).unwrap(), 10);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn range() {
|
||||
let data = convert(&[
|
||||
UInt16(2), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(6), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(7), // last glyph
|
||||
UInt16(5), // first glyph
|
||||
UInt16(18), // offset
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert!(table.value(GlyphId(4)).is_none());
|
||||
assert_eq!(table.value(GlyphId(5)).unwrap(), 18);
|
||||
assert_eq!(table.value(GlyphId(6)).unwrap(), 18);
|
||||
assert_eq!(table.value(GlyphId(7)).unwrap(), 18);
|
||||
assert!(table.value(GlyphId(8)).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
mod format4 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(4), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(6), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(118), // last glyph
|
||||
UInt16(118), // first glyph
|
||||
UInt16(18), // offset
|
||||
|
||||
// Values [0]
|
||||
UInt16(10), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(118)).unwrap(), 10);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn range() {
|
||||
let data = convert(&[
|
||||
UInt16(4), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(6), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(7), // last glyph
|
||||
UInt16(5), // first glyph
|
||||
UInt16(18), // offset
|
||||
|
||||
// Values [0]
|
||||
UInt16(10), // value [0]
|
||||
UInt16(11), // value [1]
|
||||
UInt16(12), // value [2]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert!(table.value(GlyphId(4)).is_none());
|
||||
assert_eq!(table.value(GlyphId(5)).unwrap(), 10);
|
||||
assert_eq!(table.value(GlyphId(6)).unwrap(), 11);
|
||||
assert_eq!(table.value(GlyphId(7)).unwrap(), 12);
|
||||
assert!(table.value(GlyphId(8)).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
mod format6 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(10), // value
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn multiple() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(3), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(10), // value
|
||||
// Segment [1]
|
||||
UInt16(5), // glyph
|
||||
UInt16(20), // value
|
||||
// Segment [2]
|
||||
UInt16(10), // glyph
|
||||
UInt16(30), // value
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 10);
|
||||
assert_eq!(table.value(GlyphId(5)).unwrap(), 20);
|
||||
assert_eq!(table.value(GlyphId(10)).unwrap(), 30);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
// Tests below are indirectly testing BinarySearchTable.
|
||||
|
||||
#[test]
|
||||
fn no_segments() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(0), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
]);
|
||||
|
||||
assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn ignore_termination() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(2), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(10), // value
|
||||
// Segment [1]
|
||||
UInt16(0xFFFF), // glyph
|
||||
UInt16(0xFFFF), // value
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert!(table.value(GlyphId(0xFFFF)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn only_termination() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0xFFFF), // glyph
|
||||
UInt16(0xFFFF), // value
|
||||
]);
|
||||
|
||||
assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn invalid_segment_size() {
|
||||
let data = convert(&[
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(8), // segment size <-- must be 4
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(10), // value
|
||||
]);
|
||||
|
||||
assert!(Lookup::parse(NonZeroU16::new(1).unwrap(), &data).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
mod format8 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(8), // format
|
||||
UInt16(0), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
UInt16(2), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 2);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn non_zero_first() {
|
||||
let data = convert(&[
|
||||
UInt16(8), // format
|
||||
UInt16(5), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
UInt16(2), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(5)).unwrap(), 2);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
assert!(table.value(GlyphId(6)).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
mod format10 {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(10), // format
|
||||
UInt16(1), // value size: u8
|
||||
UInt16(0), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
UInt8(2), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 2);
|
||||
assert!(table.value(GlyphId(1)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn invalid_value_size() {
|
||||
let data = convert(&[
|
||||
UInt16(10), // format
|
||||
UInt16(50), // value size <-- invalid
|
||||
UInt16(0), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
UInt8(2), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert!(table.value(GlyphId(0)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn unsupported_value_size() {
|
||||
let data = convert(&[
|
||||
UInt16(10), // format
|
||||
UInt16(8), // value size <-- we do not support u64
|
||||
UInt16(0), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
Raw(&[0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02]), // value [0]
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert!(table.value(GlyphId(0)).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn u32_value_size() {
|
||||
let data = convert(&[
|
||||
UInt16(10), // format
|
||||
UInt16(4), // value size
|
||||
UInt16(0), // first glyph
|
||||
UInt16(1), // glyphs count
|
||||
UInt32(0xFFFF + 10), // value [0] = 0x0001_0009 <-- will be truncated to u16 = 9
|
||||
]);
|
||||
|
||||
let table = Lookup::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
assert_eq!(table.value(GlyphId(0)).unwrap(), 9);
|
||||
}
|
||||
}
|
|
@ -0,0 +1,134 @@
|
|||
use std::num::NonZeroU16;
|
||||
use ttf_parser::GlyphId;
|
||||
use ttf_parser::ankr::{Table, Point};
|
||||
use crate::{convert, Unit::*};
|
||||
|
||||
#[test]
|
||||
fn empty() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // version
|
||||
UInt16(0), // reserved
|
||||
UInt32(0), // offset to lookup table
|
||||
UInt32(0), // offset to glyphs data
|
||||
]);
|
||||
|
||||
let _ = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn single() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // version
|
||||
UInt16(0), // reserved
|
||||
UInt32(12), // offset to lookup table
|
||||
UInt32(12 + 16), // offset to glyphs data
|
||||
|
||||
// Lookup Table
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(0), // offset
|
||||
|
||||
// Glyphs Data
|
||||
UInt32(1), // number of points
|
||||
// Point [0]
|
||||
Int16(-5), // x
|
||||
Int16(11), // y
|
||||
]);
|
||||
|
||||
let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
let points = table.points(GlyphId(0)).unwrap();
|
||||
assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn two_points() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // version
|
||||
UInt16(0), // reserved
|
||||
UInt32(12), // offset to lookup table
|
||||
UInt32(12 + 16), // offset to glyphs data
|
||||
|
||||
// Lookup Table
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(1), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(0), // offset
|
||||
|
||||
// Glyphs Data
|
||||
// Glyph Data [0]
|
||||
UInt32(2), // number of points
|
||||
// Point [0]
|
||||
Int16(-5), // x
|
||||
Int16(11), // y
|
||||
// Point [1]
|
||||
Int16(10), // x
|
||||
Int16(-40), // y
|
||||
]);
|
||||
|
||||
let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
let points = table.points(GlyphId(0)).unwrap();
|
||||
assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });
|
||||
assert_eq!(points.get(1).unwrap(), Point { x: 10, y: -40 });
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn two_glyphs() {
|
||||
let data = convert(&[
|
||||
UInt16(0), // version
|
||||
UInt16(0), // reserved
|
||||
UInt32(12), // offset to lookup table
|
||||
UInt32(12 + 20), // offset to glyphs data
|
||||
|
||||
// Lookup Table
|
||||
UInt16(6), // format
|
||||
|
||||
// Binary Search Table
|
||||
UInt16(4), // segment size
|
||||
UInt16(2), // number of segments
|
||||
UInt16(0), // search range: we don't use it
|
||||
UInt16(0), // entry selector: we don't use it
|
||||
UInt16(0), // range shift: we don't use it
|
||||
|
||||
// Segment [0]
|
||||
UInt16(0), // glyph
|
||||
UInt16(0), // offset
|
||||
// Segment [1]
|
||||
UInt16(1), // glyph
|
||||
UInt16(8), // offset
|
||||
|
||||
// Glyphs Data
|
||||
// Glyph Data [0]
|
||||
UInt32(1), // number of points
|
||||
// Point [0]
|
||||
Int16(-5), // x
|
||||
Int16(11), // y
|
||||
// Glyph Data [1]
|
||||
UInt32(1), // number of points
|
||||
// Point [0]
|
||||
Int16(40), // x
|
||||
Int16(10), // y
|
||||
]);
|
||||
|
||||
let table = Table::parse(NonZeroU16::new(1).unwrap(), &data).unwrap();
|
||||
let points = table.points(GlyphId(0)).unwrap();
|
||||
assert_eq!(points.get(0).unwrap(), Point { x: -5, y: 11 });
|
||||
let points = table.points(GlyphId(1)).unwrap();
|
||||
assert_eq!(points.get(0).unwrap(), Point { x: 40, y: 10 });
|
||||
}
|
|
@ -0,0 +1,998 @@
|
|||
// TODO: simplify/rewrite
|
||||
|
||||
use std::fmt::Write;
|
||||
|
||||
use ttf_parser::{cff, GlyphId, CFFError, Rect};
|
||||
|
||||
struct Builder(String);
|
||||
impl ttf_parser::OutlineBuilder for Builder {
|
||||
fn move_to(&mut self, x: f32, y: f32) {
|
||||
write!(&mut self.0, "M {} {} ", x, y).unwrap();
|
||||
}
|
||||
|
||||
fn line_to(&mut self, x: f32, y: f32) {
|
||||
write!(&mut self.0, "L {} {} ", x, y).unwrap();
|
||||
}
|
||||
|
||||
fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
|
||||
write!(&mut self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap();
|
||||
}
|
||||
|
||||
fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
|
||||
write!(&mut self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap();
|
||||
}
|
||||
|
||||
fn close(&mut self) {
|
||||
write!(&mut self.0, "Z ").unwrap();
|
||||
}
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
mod operator {
|
||||
pub const HORIZONTAL_STEM: u8 = 1;
|
||||
pub const VERTICAL_STEM: u8 = 3;
|
||||
pub const VERTICAL_MOVE_TO: u8 = 4;
|
||||
pub const LINE_TO: u8 = 5;
|
||||
pub const HORIZONTAL_LINE_TO: u8 = 6;
|
||||
pub const VERTICAL_LINE_TO: u8 = 7;
|
||||
pub const CURVE_TO: u8 = 8;
|
||||
pub const CALL_LOCAL_SUBROUTINE: u8 = 10;
|
||||
pub const RETURN: u8 = 11;
|
||||
pub const ENDCHAR: u8 = 14;
|
||||
pub const HORIZONTAL_STEM_HINT_MASK: u8 = 18;
|
||||
pub const HINT_MASK: u8 = 19;
|
||||
pub const COUNTER_MASK: u8 = 20;
|
||||
pub const MOVE_TO: u8 = 21;
|
||||
pub const HORIZONTAL_MOVE_TO: u8 = 22;
|
||||
pub const VERTICAL_STEM_HINT_MASK: u8 = 23;
|
||||
pub const CURVE_LINE: u8 = 24;
|
||||
pub const LINE_CURVE: u8 = 25;
|
||||
pub const VV_CURVE_TO: u8 = 26;
|
||||
pub const HH_CURVE_TO: u8 = 27;
|
||||
pub const SHORT_INT: u8 = 28;
|
||||
pub const CALL_GLOBAL_SUBROUTINE: u8 = 29;
|
||||
pub const VH_CURVE_TO: u8 = 30;
|
||||
pub const HV_CURVE_TO: u8 = 31;
|
||||
pub const HFLEX: u8 = 34;
|
||||
pub const FLEX: u8 = 35;
|
||||
pub const HFLEX1: u8 = 36;
|
||||
pub const FLEX1: u8 = 37;
|
||||
pub const FIXED_16_16: u8 = 255;
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
mod top_dict_operator {
|
||||
pub const CHARSET_OFFSET: u16 = 15;
|
||||
pub const CHAR_STRINGS_OFFSET: u16 = 17;
|
||||
pub const PRIVATE_DICT_SIZE_AND_OFFSET: u16 = 18;
|
||||
pub const ROS: u16 = 1230;
|
||||
pub const FD_ARRAY: u16 = 1236;
|
||||
pub const FD_SELECT: u16 = 1237;
|
||||
}
|
||||
|
||||
mod private_dict_operator {
|
||||
pub const LOCAL_SUBROUTINES_OFFSET: u16 = 19;
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
#[derive(Clone, Copy)]
|
||||
enum TtfType {
|
||||
Raw(&'static [u8]),
|
||||
TrueTypeMagic,
|
||||
OpenTypeMagic,
|
||||
FontCollectionMagic,
|
||||
Int8(i8),
|
||||
UInt8(u8),
|
||||
Int16(i16),
|
||||
UInt16(u16),
|
||||
Int32(i32),
|
||||
UInt32(u32),
|
||||
CFFInt(i32),
|
||||
}
|
||||
|
||||
use TtfType::*;
|
||||
|
||||
fn convert(values: &[TtfType]) -> Vec<u8> {
|
||||
let mut data = Vec::with_capacity(256);
|
||||
for v in values {
|
||||
convert_type(*v, &mut data);
|
||||
}
|
||||
|
||||
data
|
||||
}
|
||||
|
||||
fn convert_type(value: TtfType, data: &mut Vec<u8>) {
|
||||
match value {
|
||||
TtfType::Raw(bytes) => {
|
||||
data.extend_from_slice(bytes);
|
||||
}
|
||||
TtfType::TrueTypeMagic => {
|
||||
data.extend_from_slice(&[0x00, 0x01, 0x00, 0x00]);
|
||||
}
|
||||
TtfType::OpenTypeMagic => {
|
||||
data.extend_from_slice(&[0x4F, 0x54, 0x54, 0x4F]);
|
||||
}
|
||||
TtfType::FontCollectionMagic => {
|
||||
data.extend_from_slice(&[0x74, 0x74, 0x63, 0x66]);
|
||||
}
|
||||
TtfType::Int8(n) => {
|
||||
data.extend_from_slice(&i8::to_be_bytes(n));
|
||||
}
|
||||
TtfType::UInt8(n) => {
|
||||
data.extend_from_slice(&u8::to_be_bytes(n));
|
||||
}
|
||||
TtfType::Int16(n) => {
|
||||
data.extend_from_slice(&i16::to_be_bytes(n));
|
||||
}
|
||||
TtfType::UInt16(n) => {
|
||||
data.extend_from_slice(&u16::to_be_bytes(n));
|
||||
}
|
||||
TtfType::Int32(n) => {
|
||||
data.extend_from_slice(&i32::to_be_bytes(n));
|
||||
}
|
||||
TtfType::UInt32(n) => {
|
||||
data.extend_from_slice(&u32::to_be_bytes(n));
|
||||
}
|
||||
TtfType::CFFInt(n) => {
|
||||
match n {
|
||||
-107..=107 => {
|
||||
data.push((n as i16 + 139) as u8);
|
||||
}
|
||||
108..=1131 => {
|
||||
let n = n - 108;
|
||||
data.push(((n >> 8) + 247) as u8);
|
||||
data.push((n & 0xFF) as u8);
|
||||
}
|
||||
-1131..=-108 => {
|
||||
let n = -n - 108;
|
||||
data.push(((n >> 8) + 251) as u8);
|
||||
data.push((n & 0xFF) as u8);
|
||||
}
|
||||
-32768..=32767 => {
|
||||
data.push(28);
|
||||
data.extend_from_slice(&i16::to_be_bytes(n as i16));
|
||||
}
|
||||
_ => {
|
||||
data.push(29);
|
||||
data.extend_from_slice(&i32::to_be_bytes(n));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
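// A small sanity check of the CFF number encoding implemented in `convert_type`
// above; an illustrative sketch, not part of the upstream test data. The expected
// bytes follow directly from the ranges handled there (a single byte is
// value + 139, the 247/251 prefixes cover the two-byte ranges).
#[test]
fn cff_int_encoding_sanity() {
    assert_eq!(convert(&[CFFInt(0)]), vec![139]);
    assert_eq!(convert(&[CFFInt(107)]), vec![246]);
    assert_eq!(convert(&[CFFInt(-107)]), vec![32]);
    assert_eq!(convert(&[CFFInt(108)]), vec![247, 0]);
    assert_eq!(convert(&[CFFInt(-108)]), vec![251, 0]);
    assert_eq!(convert(&[CFFInt(1131)]), vec![250, 255]);
}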
|
||||
|
||||
#[derive(Debug)]
|
||||
struct Writer {
|
||||
data: Vec<u8>,
|
||||
}
|
||||
|
||||
impl Writer {
|
||||
fn new() -> Self {
|
||||
Writer { data: Vec::with_capacity(256) }
|
||||
}
|
||||
|
||||
fn offset(&self) -> usize {
|
||||
self.data.len()
|
||||
}
|
||||
|
||||
fn write(&mut self, value: TtfType) {
|
||||
convert_type(value, &mut self.data);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
fn gen_cff(
|
||||
global_subrs: &[&[TtfType]],
|
||||
local_subrs: &[&[TtfType]],
|
||||
chars: &[TtfType],
|
||||
) -> Vec<u8> {
|
||||
fn gen_global_subrs(subrs: &[&[TtfType]]) -> Vec<u8> {
|
||||
let mut w = Writer::new();
|
||||
for v1 in subrs {
|
||||
for v2 in v1.iter() {
|
||||
w.write(*v2);
|
||||
}
|
||||
}
|
||||
w.data
|
||||
}
|
||||
|
||||
fn gen_local_subrs(subrs: &[&[TtfType]]) -> Vec<u8> {
|
||||
let mut w = Writer::new();
|
||||
for v1 in subrs {
|
||||
for v2 in v1.iter() {
|
||||
w.write(*v2);
|
||||
}
|
||||
}
|
||||
w.data
|
||||
}
|
||||
|
||||
const EMPTY_INDEX_SIZE: usize = 2;
|
||||
const INDEX_HEADER_SIZE: usize = 5;
|
||||
|
||||
// TODO: support multiple subrs
|
||||
assert!(global_subrs.len() <= 1);
|
||||
assert!(local_subrs.len() <= 1);
|
||||
|
||||
let global_subrs_data = gen_global_subrs(global_subrs);
|
||||
let local_subrs_data = gen_local_subrs(local_subrs);
|
||||
let chars_data = convert(chars);
|
||||
|
||||
assert!(global_subrs_data.len() < 255);
|
||||
assert!(local_subrs_data.len() < 255);
|
||||
assert!(chars_data.len() < 255);
|
||||
|
||||
let mut w = Writer::new();
|
||||
// Header
|
||||
w.write(UInt8(1)); // major version
|
||||
w.write(UInt8(0)); // minor version
|
||||
w.write(UInt8(4)); // header size
|
||||
w.write(UInt8(0)); // absolute offset
|
||||
|
||||
// Name INDEX
|
||||
w.write(UInt16(0)); // count
|
||||
|
||||
// Top DICT
|
||||
// INDEX
|
||||
w.write(UInt16(1)); // count
|
||||
w.write(UInt8(1)); // offset size
|
||||
w.write(UInt8(1)); // index[0]
|
||||
|
||||
let top_dict_idx2 = if local_subrs.is_empty() { 3 } else { 6 };
|
||||
w.write(UInt8(top_dict_idx2)); // index[1]
|
||||
// Item 0
|
||||
let mut charstr_offset = w.offset() + 2;
|
||||
charstr_offset += EMPTY_INDEX_SIZE; // String INDEX
|
||||
|
||||
// Global Subroutines INDEX
|
||||
if !global_subrs_data.is_empty() {
|
||||
charstr_offset += INDEX_HEADER_SIZE + global_subrs_data.len();
|
||||
} else {
|
||||
charstr_offset += EMPTY_INDEX_SIZE;
|
||||
}
|
||||
|
||||
if !local_subrs_data.is_empty() {
|
||||
charstr_offset += 3;
|
||||
}
|
||||
|
||||
w.write(CFFInt(charstr_offset as i32));
|
||||
w.write(UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8));
|
||||
|
||||
if !local_subrs_data.is_empty() {
|
||||
// Item 1
|
||||
w.write(CFFInt(2)); // length
|
||||
w.write(CFFInt((charstr_offset + INDEX_HEADER_SIZE + chars_data.len()) as i32)); // offset
|
||||
w.write(UInt8(top_dict_operator::PRIVATE_DICT_SIZE_AND_OFFSET as u8));
|
||||
}
|
||||
|
||||
// String INDEX
|
||||
w.write(UInt16(0)); // count
|
||||
|
||||
// Global Subroutines INDEX
|
||||
if global_subrs_data.is_empty() {
|
||||
w.write(UInt16(0)); // count
|
||||
} else {
|
||||
w.write(UInt16(1)); // count
|
||||
w.write(UInt8(1)); // offset size
|
||||
w.write(UInt8(1)); // index[0]
|
||||
w.write(UInt8(global_subrs_data.len() as u8 + 1)); // index[1]
|
||||
w.data.extend_from_slice(&global_subrs_data);
|
||||
}
|
||||
|
||||
// CharString INDEX
|
||||
w.write(UInt16(1)); // count
|
||||
w.write(UInt8(1)); // offset size
|
||||
w.write(UInt8(1)); // index[0]
|
||||
w.write(UInt8(chars_data.len() as u8 + 1)); // index[1]
|
||||
w.data.extend_from_slice(&chars_data);
|
||||
|
||||
if !local_subrs_data.is_empty() {
|
||||
// The local subroutines offset is relative to the beginning of the Private DICT data.
|
||||
|
||||
// Private DICT
|
||||
w.write(CFFInt(2));
|
||||
w.write(UInt8(private_dict_operator::LOCAL_SUBROUTINES_OFFSET as u8));
|
||||
|
||||
// Local Subroutines INDEX
|
||||
w.write(UInt16(1)); // count
|
||||
w.write(UInt8(1)); // offset size
|
||||
w.write(UInt8(1)); // index[0]
|
||||
w.write(UInt8(local_subrs_data.len() as u8 + 1)); // index[1]
|
||||
w.data.extend_from_slice(&local_subrs_data);
|
||||
}
|
||||
|
||||
w.data
|
||||
}
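// For reference, the byte layout that `gen_cff` above produces (a deliberately
// minimal CFF, just enough for these tests): Header, empty Name INDEX, Top DICT
// INDEX (CharStrings offset, plus the Private DICT size/offset when local
// subroutines are present), empty String INDEX, Global Subroutines INDEX,
// CharStrings INDEX, and finally the Private DICT followed by the Local
// Subroutines INDEX.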
|
||||
|
||||
#[test]
|
||||
fn unsupported_version() {
|
||||
let data = convert(&[
|
||||
UInt8(10), // major version, only 1 is supported
|
||||
UInt8(0), // minor version
|
||||
UInt8(4), // header size
|
||||
UInt8(0), // absolute offset
|
||||
]);
|
||||
|
||||
assert!(cff::Table::parse(&data).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn non_default_header_size() {
|
||||
let data = convert(&[
|
||||
// Header
|
||||
UInt8(1), // major version
|
||||
UInt8(0), // minor version
|
||||
UInt8(8), // header size
|
||||
UInt8(0), // absolute offset
|
||||
|
||||
// no-op, should be skipped
|
||||
UInt8(0),
|
||||
UInt8(0),
|
||||
UInt8(0),
|
||||
UInt8(0),
|
||||
|
||||
// Name INDEX
|
||||
UInt16(0), // count
|
||||
|
||||
// Top DICT
|
||||
// INDEX
|
||||
UInt16(1), // count
|
||||
UInt8(1), // offset size
|
||||
UInt8(1), // index[0]
|
||||
UInt8(3), // index[1]
|
||||
// Data
|
||||
CFFInt(21),
|
||||
UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8),
|
||||
|
||||
// String INDEX
|
||||
UInt16(0), // count
|
||||
|
||||
// Global Subroutines INDEX
|
||||
UInt16(0), // count
|
||||
|
||||
// CharString INDEX
|
||||
UInt16(1), // count
|
||||
UInt8(1), // offset size
|
||||
UInt8(1), // index[0]
|
||||
UInt8(4), // index[1]
|
||||
// Data
|
||||
CFFInt(10),
|
||||
UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
]);
|
||||
|
||||
let table = cff::Table::parse(&data).unwrap();
|
||||
let mut builder = Builder(String::new());
|
||||
let rect = table.outline(GlyphId(0), &mut builder).unwrap();
|
||||
|
||||
assert_eq!(builder.0, "M 10 0 Z ");
|
||||
assert_eq!(rect, Rect { x_min: 10, y_min: 0, x_max: 10, y_max: 0 });
|
||||
}
|
||||
|
||||
fn rect(x_min: i16, y_min: i16, x_max: i16, y_max: i16) -> Rect {
|
||||
Rect { x_min, y_min, x_max, y_max }
|
||||
}
|
||||
|
||||
macro_rules! test_cs_with_subrs {
|
||||
($name:ident, $glob:expr, $loc:expr, $values:expr, $path:expr, $rect_res:expr) => {
|
||||
#[test]
|
||||
fn $name() {
|
||||
let data = gen_cff($glob, $loc, $values);
|
||||
let table = cff::Table::parse(&data).unwrap();
|
||||
let mut builder = Builder(String::new());
|
||||
let rect = table.outline(GlyphId(0), &mut builder).unwrap();
|
||||
|
||||
assert_eq!(builder.0, $path);
|
||||
assert_eq!(rect, $rect_res);
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
macro_rules! test_cs {
|
||||
($name:ident, $values:expr, $path:expr, $rect_res:expr) => {
|
||||
test_cs_with_subrs!($name, &[], &[], $values, $path, $rect_res);
|
||||
};
|
||||
}
|
||||
|
||||
macro_rules! test_cs_err {
|
||||
($name:ident, $values:expr, $err:expr) => {
|
||||
#[test]
|
||||
fn $name() {
|
||||
let data = gen_cff(&[], &[], $values);
|
||||
let table = cff::Table::parse(&data).unwrap();
|
||||
let mut builder = Builder(String::new());
|
||||
let res = table.outline(GlyphId(0), &mut builder);
|
||||
assert_eq!(res.unwrap_err(), $err);
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
test_cs!(move_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 Z ",
|
||||
rect(10, 20, 10, 20)
|
||||
);
|
||||
|
||||
test_cs!(move_to_with_width, &[
|
||||
CFFInt(5), CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 Z ",
|
||||
rect(10, 20, 10, 20)
|
||||
);
|
||||
|
||||
test_cs!(hmove_to, &[
|
||||
CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 0 Z ",
|
||||
rect(10, 0, 10, 0)
|
||||
);
|
||||
|
||||
test_cs!(hmove_to_with_width, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 20 0 Z ",
|
||||
rect(20, 0, 20, 0)
|
||||
);
|
||||
|
||||
test_cs!(vmove_to, &[
|
||||
CFFInt(10), UInt8(operator::VERTICAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 0 10 Z ",
|
||||
rect(0, 10, 0, 10)
|
||||
);
|
||||
|
||||
test_cs!(vmove_to_with_width, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 0 20 Z ",
|
||||
rect(0, 20, 0, 20)
|
||||
);
|
||||
|
||||
test_cs!(line_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 40 60 Z ",
|
||||
rect(10, 20, 40, 60)
|
||||
);
|
||||
|
||||
test_cs!(line_to_with_multiple_pairs, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 40 60 L 90 120 Z ",
|
||||
rect(10, 20, 90, 120)
|
||||
);
|
||||
|
||||
test_cs!(hline_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), UInt8(operator::HORIZONTAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 40 20 Z ",
|
||||
rect(10, 20, 40, 20)
|
||||
);
|
||||
|
||||
test_cs!(hline_to_with_two_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), UInt8(operator::HORIZONTAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 40 20 L 40 60 Z ",
|
||||
rect(10, 20, 40, 60)
|
||||
);
|
||||
|
||||
test_cs!(hline_to_with_three_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::HORIZONTAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 40 20 L 40 60 L 90 60 Z ",
|
||||
rect(10, 20, 90, 60)
|
||||
);
|
||||
|
||||
test_cs!(vline_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), UInt8(operator::VERTICAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 10 50 Z ",
|
||||
rect(10, 20, 10, 50)
|
||||
);
|
||||
|
||||
test_cs!(vline_to_with_two_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), UInt8(operator::VERTICAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 10 50 L 50 50 Z ",
|
||||
rect(10, 20, 50, 50)
|
||||
);
|
||||
|
||||
test_cs!(vline_to_with_three_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::VERTICAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 L 10 50 L 50 50 L 50 100 Z ",
|
||||
rect(10, 20, 50, 100)
|
||||
);
|
||||
|
||||
test_cs!(curve_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80),
|
||||
UInt8(operator::CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 40 60 90 120 160 200 Z ",
|
||||
rect(10, 20, 160, 200)
|
||||
);
|
||||
|
||||
test_cs!(curve_to_with_two_sets_of_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80),
|
||||
CFFInt(90), CFFInt(100), CFFInt(110), CFFInt(120), CFFInt(130), CFFInt(140),
|
||||
UInt8(operator::CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 40 60 90 120 160 200 C 250 300 360 420 490 560 Z ",
|
||||
rect(10, 20, 490, 560)
|
||||
);
|
||||
|
||||
test_cs!(hh_curve_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::HH_CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 40 20 80 70 140 70 Z ",
|
||||
rect(10, 20, 140, 70)
|
||||
);
|
||||
|
||||
test_cs!(hh_curve_to_with_y, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), UInt8(operator::HH_CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 50 50 100 110 170 110 Z ",
|
||||
rect(10, 20, 170, 110)
|
||||
);
|
||||
|
||||
test_cs!(vv_curve_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::VV_CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 10 50 50 100 50 160 Z ",
|
||||
rect(10, 20, 50, 160)
|
||||
);
|
||||
|
||||
test_cs!(vv_curve_to_with_x, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), UInt8(operator::VV_CURVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], "M 10 20 C 40 60 90 120 90 190 Z ",
|
||||
rect(10, 20, 90, 190)
|
||||
);
|
||||
|
||||
#[test]
|
||||
fn only_endchar() {
|
||||
let data = gen_cff(&[], &[], &[UInt8(operator::ENDCHAR)]);
|
||||
let table = cff::Table::parse(&data).unwrap();
|
||||
let mut builder = Builder(String::new());
|
||||
assert!(table.outline(GlyphId(0), &mut builder).is_err());
|
||||
}
|
||||
|
||||
test_cs_with_subrs!(local_subr,
|
||||
&[],
|
||||
&[&[
|
||||
CFFInt(30),
|
||||
CFFInt(40),
|
||||
UInt8(operator::LINE_TO),
|
||||
UInt8(operator::RETURN),
|
||||
]],
|
||||
&[
|
||||
CFFInt(10),
|
||||
UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
CFFInt(0 - 107), // subr index - subr bias
|
||||
UInt8(operator::CALL_LOCAL_SUBROUTINE),
|
||||
UInt8(operator::ENDCHAR),
|
||||
],
|
||||
"M 10 0 L 40 40 Z ",
|
||||
rect(10, 0, 40, 40)
|
||||
);
|
||||
|
||||
test_cs_with_subrs!(endchar_in_subr,
|
||||
&[],
|
||||
&[&[
|
||||
CFFInt(30),
|
||||
CFFInt(40),
|
||||
UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
]],
|
||||
&[
|
||||
CFFInt(10),
|
||||
UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
CFFInt(0 - 107), // subr index - subr bias
|
||||
UInt8(operator::CALL_LOCAL_SUBROUTINE),
|
||||
],
|
||||
"M 10 0 L 40 40 Z ",
|
||||
rect(10, 0, 40, 40)
|
||||
);
|
||||
|
||||
test_cs_with_subrs!(global_subr,
|
||||
&[&[
|
||||
CFFInt(30),
|
||||
CFFInt(40),
|
||||
UInt8(operator::LINE_TO),
|
||||
UInt8(operator::RETURN),
|
||||
]],
|
||||
&[],
|
||||
&[
|
||||
CFFInt(10),
|
||||
UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
CFFInt(0 - 107), // subr index - subr bias
|
||||
UInt8(operator::CALL_GLOBAL_SUBROUTINE),
|
||||
UInt8(operator::ENDCHAR),
|
||||
],
|
||||
"M 10 0 L 40 40 Z ",
|
||||
rect(10, 0, 40, 40)
|
||||
);
|
||||
|
||||
test_cs_err!(reserved_operator, &[
|
||||
CFFInt(10), UInt8(2),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidOperator);
|
||||
|
||||
test_cs_err!(line_to_without_move_to, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::MissingMoveTo);
|
||||
|
||||
// Width must be set only once.
|
||||
test_cs_err!(two_vmove_to_with_width, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO),
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::VERTICAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(move_to_with_too_many_coords, &[
|
||||
CFFInt(10), CFFInt(10), CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(move_to_with_not_enought_coords, &[
|
||||
CFFInt(10), UInt8(operator::MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(hmove_to_with_too_many_coords, &[
|
||||
CFFInt(10), CFFInt(10), CFFInt(10), UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(hmove_to_with_not_enought_coords, &[
|
||||
UInt8(operator::HORIZONTAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(vmove_to_with_too_many_coords, &[
|
||||
CFFInt(10), CFFInt(10), CFFInt(10), UInt8(operator::VERTICAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(vmove_to_with_not_enought_coords, &[
|
||||
UInt8(operator::VERTICAL_MOVE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(line_to_with_single_coord, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(line_to_with_odd_number_of_coord, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(hline_to_without_coords, &[
|
||||
CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
|
||||
UInt8(operator::HORIZONTAL_LINE_TO),
|
||||
UInt8(operator::ENDCHAR),
|
||||
], CFFError::InvalidArgumentsStackLength);
|
||||
|
||||
test_cs_err!(vline_to_without_coords, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    UInt8(operator::VERTICAL_LINE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(curve_to_with_invalid_num_of_coords_1, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), UInt8(operator::CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(curve_to_with_invalid_num_of_coords_2, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(60), CFFInt(70), CFFInt(80), CFFInt(90),
    UInt8(operator::CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(hh_curve_to_with_not_enought_coords, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::HH_CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(hh_curve_to_with_too_many_coords, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(30), CFFInt(40), CFFInt(50),
    UInt8(operator::HH_CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(vv_curve_to_with_not_enought_coords, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), UInt8(operator::VV_CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(vv_curve_to_with_too_many_coords, &[
    CFFInt(10), CFFInt(20), UInt8(operator::MOVE_TO),
    CFFInt(30), CFFInt(40), CFFInt(50), CFFInt(30), CFFInt(40), CFFInt(50),
    UInt8(operator::VV_CURVE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::InvalidArgumentsStackLength);

test_cs_err!(multiple_endchar, &[
    UInt8(operator::ENDCHAR),
    UInt8(operator::ENDCHAR),
], CFFError::DataAfterEndChar);

test_cs_err!(seac_with_not_enough_data, &[
    CFFInt(0),
    CFFInt(0),
    CFFInt(0),
    CFFInt(0),
    UInt8(operator::ENDCHAR),
], CFFError::NestingLimitReached);

test_cs_err!(operands_overflow, &[
    CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9),
    CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9),
    CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9),
    CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9),
    CFFInt(0), CFFInt(1), CFFInt(2), CFFInt(3), CFFInt(4), CFFInt(5), CFFInt(6), CFFInt(7), CFFInt(8), CFFInt(9),
], CFFError::ArgumentsStackLimitReached);

test_cs_err!(operands_overflow_with_4_byte_ints, &[
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
    CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000), CFFInt(30000),
], CFFError::ArgumentsStackLimitReached);

test_cs_err!(bbox_overflow, &[
    CFFInt(32767), UInt8(operator::HORIZONTAL_MOVE_TO),
    CFFInt(32767), UInt8(operator::HORIZONTAL_LINE_TO),
    UInt8(operator::ENDCHAR),
], CFFError::BboxOverflow);

#[test]
fn endchar_in_subr_with_extra_data_1() {
    let data = gen_cff(
        &[],
        &[&[
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
            UInt8(operator::ENDCHAR),
        ]],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar);
}
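
// The `CFFInt(0 - 107)` operands in the subroutine tests above and below rely
// on the standard CFF subroutine bias: for an INDEX with fewer than 1240
// subroutines the bias is 107, so calling subroutine 0 is encoded as -107.
// A minimal sketch of that rule from the CFF spec (a hypothetical helper, shown
// only for context and not used by these tests):
#[allow(dead_code)]
fn subr_bias_sketch(subr_count: usize) -> u16 {
    if subr_count < 1240 {
        107
    } else if subr_count < 33900 {
        1131
    } else {
        32768
    }
}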

#[test]
fn endchar_in_subr_with_extra_data_2() {
    let data = gen_cff(
        &[],
        &[&[
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
            UInt8(operator::ENDCHAR),
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
        ]],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar);
}

#[test]
fn subr_without_return() {
    let data = gen_cff(
        &[],
        &[&[
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
            UInt8(operator::ENDCHAR),
            CFFInt(30),
            CFFInt(40),
            UInt8(operator::LINE_TO),
        ]],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::DataAfterEndChar);
}

#[test]
fn recursive_local_subr() {
    let data = gen_cff(
        &[],
        &[&[
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
        ]],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached);
}

#[test]
fn recursive_global_subr() {
    let data = gen_cff(
        &[&[
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_GLOBAL_SUBROUTINE),
        ]],
        &[],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_GLOBAL_SUBROUTINE),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached);
}

#[test]
fn recursive_mixed_subr() {
    let data = gen_cff(
        &[&[
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_LOCAL_SUBROUTINE),
        ]],
        &[&[
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_GLOBAL_SUBROUTINE),
        ]],
        &[
            CFFInt(10),
            UInt8(operator::HORIZONTAL_MOVE_TO),
            CFFInt(0 - 107), // subr index - subr bias
            UInt8(operator::CALL_GLOBAL_SUBROUTINE),
        ]
    );

    let table = cff::Table::parse(&data).unwrap();
    let mut builder = Builder(String::new());
    let res = table.outline(GlyphId(0), &mut builder);
    assert_eq!(res.unwrap_err(), CFFError::NestingLimitReached);
}

#[test]
fn zero_char_string_offset() {
    let data = convert(&[
        // Header
        UInt8(1), // major version
        UInt8(0), // minor version
        UInt8(4), // header size
        UInt8(0), // absolute offset

        // Name INDEX
        UInt16(0), // count

        // Top DICT
        // INDEX
        UInt16(1), // count
        UInt8(1), // offset size
        UInt8(1), // index[0]
        UInt8(3), // index[1]
        // Data
        CFFInt(0), // zero offset!
        UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8),
    ]);

    assert!(cff::Table::parse(&data).is_none());
}

#[test]
fn invalid_char_string_offset() {
    let data = convert(&[
        // Header
        UInt8(1), // major version
        UInt8(0), // minor version
        UInt8(4), // header size
        UInt8(0), // absolute offset

        // Name INDEX
        UInt16(0), // count

        // Top DICT
        // INDEX
        UInt16(1), // count
        UInt8(1), // offset size
        UInt8(1), // index[0]
        UInt8(3), // index[1]
        // Data
        CFFInt(2), // invalid offset!
        UInt8(top_dict_operator::CHAR_STRINGS_OFFSET as u8),
    ]);

    assert!(cff::Table::parse(&data).is_none());
}

// TODO: return from main
// TODO: return without endchar
// TODO: data after return
// TODO: recursive subr
// TODO: HORIZONTAL_STEM
// TODO: VERTICAL_STEM
// TODO: HORIZONTAL_STEM_HINT_MASK
// TODO: HINT_MASK
// TODO: COUNTER_MASK
// TODO: VERTICAL_STEM_HINT_MASK
// TODO: CURVE_LINE
// TODO: LINE_CURVE
// TODO: VH_CURVE_TO
// TODO: HFLEX
// TODO: FLEX
// TODO: HFLEX1
// TODO: FLEX1
@ -0,0 +1,555 @@
mod format0 {
    use ttf_parser::{cmap, GlyphId};
    use crate::{convert, Unit::*};

    #[test]
    fn maps_not_all_256_codepoints() {
        let mut data = convert(&[
            UInt16(0), // format
            UInt16(262), // subtable size
            UInt16(0), // language ID
        ]);

        // Map (only) codepoint 0x40 to 100.
        data.extend(std::iter::repeat(0).take(256));
        data[6 + 0x40] = 100;

        let subtable = cmap::Subtable0::parse(&data).unwrap();

        assert_eq!(subtable.glyph_index(0), None);
        assert_eq!(subtable.glyph_index(0x40), Some(GlyphId(100)));
        assert_eq!(subtable.glyph_index(100), None);

        let mut vec = vec![];
        subtable.codepoints(|c| vec.push(c));
        assert_eq!(vec, [0x40]);
    }
}
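
// A cmap format 0 subtable is just a 256-entry byte array after the 6-byte
// header, so the lookup exercised above boils down to indexing that array with
// the codepoint, where 0 means "no glyph". A minimal sketch of that lookup over
// the raw subtable bytes (a simplified stand-in, not the ttf-parser API):
#[allow(dead_code)]
fn format0_lookup_sketch(subtable: &[u8], codepoint: u8) -> Option<u16> {
    let glyph = *subtable.get(6 + usize::from(codepoint))?;
    if glyph != 0 { Some(u16::from(glyph)) } else { None }
}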

mod format2 {
    use ttf_parser::{cmap, GlyphId};
    use crate::{convert, Unit::*};

    const U16_SIZE: usize = std::mem::size_of::<u16>();

    #[test]
    fn collect_codepoints() {
        let mut data = convert(&[
            UInt16(2), // format
            UInt16(534), // subtable size
            UInt16(0), // language ID
        ]);

        // Make only high byte 0x28 multi-byte.
        data.extend(std::iter::repeat(0x00).take(256 * U16_SIZE));
        data[6 + 0x28 * U16_SIZE + 1] = 0x08;

        data.extend(convert(&[
            // First sub header (for single byte mapping)
            UInt16(254), // first code
            UInt16(2), // entry count
            UInt16(0), // id delta: uninteresting
            UInt16(0), // id range offset: uninteresting
            // Second sub header (for high byte 0x28)
            UInt16(16), // first code: (0x28 << 8) + 0x10 = 10256
            UInt16(3), // entry count
            UInt16(0), // id delta: uninteresting
            UInt16(0), // id range offset: uninteresting
        ]));

        // Now only glyph IDs would follow. Not interesting for codepoints.

        let subtable = cmap::Subtable2::parse(&data).unwrap();

        let mut vec = vec![];
        subtable.codepoints(|c| vec.push(c));
        assert_eq!(vec, [10256, 10257, 10258, 254, 255]);
    }
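
    // Some context on the data above: the 256 u16 subHeaderKeys that follow the
    // header store `sub_header_index * 8`, so writing 0x08 into the entry for
    // high byte 0x28 points that byte at sub header 1, while every other byte
    // stays on sub header 0 (the single-byte range). That is why the collected
    // codepoints mix the two-byte range 10256..=10258 with the single-byte
    // codes 254 and 255.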

    #[test]
    fn codepoint_at_range_end() {
        let mut data = convert(&[
            UInt16(2), // format
            UInt16(532), // subtable size
            UInt16(0), // language ID
        ]);

        // Only single bytes.
        data.extend(std::iter::repeat(0x00).take(256 * U16_SIZE));
        data.extend(convert(&[
            // First sub header (for single byte mapping)
            UInt16(40), // first code
            UInt16(2), // entry count
            UInt16(0), // id delta
            UInt16(2), // id range offset
            // Glyph index
            UInt16(100), // glyph ID [0]
            UInt16(1000), // glyph ID [1]
            UInt16(10000), // glyph ID [2] (unused)
        ]));

        let subtable = cmap::Subtable2::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(39), None);
        assert_eq!(subtable.glyph_index(40), Some(GlyphId(100)));
        assert_eq!(subtable.glyph_index(41), Some(GlyphId(1000)));
        assert_eq!(subtable.glyph_index(42), None);
    }
}

mod format4 {
    use ttf_parser::{cmap, GlyphId};
    use crate::{convert, Unit::*};

    #[test]
    fn single_glyph() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(32), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x42), None);
    }
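
    // In the segment above, the id range offset is 0, so the mapping is pure
    // delta arithmetic: glyph = (codepoint + delta) mod 65536, i.e. 65 + (-64)
    // = 1. The trailing 0xFFFF segment with delta 1 is the mandatory terminator
    // and resolves to glyph 0, i.e. "missing".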

    #[test]
    fn continuous_range() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(32), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(73), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x40), None);
        assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x42), Some(GlyphId(2)));
        assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(3)));
        assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(4)));
        assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(5)));
        assert_eq!(subtable.glyph_index(0x46), Some(GlyphId(6)));
        assert_eq!(subtable.glyph_index(0x47), Some(GlyphId(7)));
        assert_eq!(subtable.glyph_index(0x48), Some(GlyphId(8)));
        assert_eq!(subtable.glyph_index(0x49), Some(GlyphId(9)));
        assert_eq!(subtable.glyph_index(0x4A), None);
    }

    #[test]
    fn multiple_ranges() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(48), // subtable size
            UInt16(0), // language ID
            UInt16(8), // 2 x segCount
            UInt16(4), // search range
            UInt16(1), // entry selector
            UInt16(4), // range shift
            // End character codes
            UInt16(65), // char code [0]
            UInt16(69), // char code [1]
            UInt16(73), // char code [2]
            UInt16(65535), // char code [3]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(67), // char code [1]
            UInt16(71), // char code [2]
            UInt16(65535), // char code [3]
            // Deltas
            Int16(-64), // delta [0]
            Int16(-65), // delta [1]
            Int16(-66), // delta [2]
            Int16(1), // delta [3]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
            UInt16(0), // offset [2]
            UInt16(0), // offset [3]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x40), None);
        assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x42), None);
        assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(2)));
        assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(3)));
        assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(4)));
        assert_eq!(subtable.glyph_index(0x46), None);
        assert_eq!(subtable.glyph_index(0x47), Some(GlyphId(5)));
        assert_eq!(subtable.glyph_index(0x48), Some(GlyphId(6)));
        assert_eq!(subtable.glyph_index(0x49), Some(GlyphId(7)));
        assert_eq!(subtable.glyph_index(0x4A), None);
    }

    #[test]
    fn unordered_ids() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(42), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(69), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(0), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(4), // offset [0]
            UInt16(0), // offset [1]
            // Glyph index array
            UInt16(1), // glyph ID [0]
            UInt16(10), // glyph ID [1]
            UInt16(100), // glyph ID [2]
            UInt16(1000), // glyph ID [3]
            UInt16(10000), // glyph ID [4]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x40), None);
        assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x42), Some(GlyphId(10)));
        assert_eq!(subtable.glyph_index(0x43), Some(GlyphId(100)));
        assert_eq!(subtable.glyph_index(0x44), Some(GlyphId(1000)));
        assert_eq!(subtable.glyph_index(0x45), Some(GlyphId(10000)));
        assert_eq!(subtable.glyph_index(0x46), None);
    }
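
    // Context for the offsets above: a non-zero id range offset redirects the
    // lookup into the glyph index array. The glyph for codepoint `c` is read at
    // `offset + 2 * (c - start_code)` bytes from the offset's own position, so
    // `offset [0] = 4` skips the two offset words and lands on glyph IDs 1, 10,
    // 100, 1000 and 10000 for codes 0x41..=0x45 (delta 0 leaves them unchanged).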

    #[test]
    fn unordered_chars_and_ids() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(64), // subtable size
            UInt16(0), // language ID
            UInt16(12), // 2 x segCount
            UInt16(8), // search range
            UInt16(2), // entry selector
            UInt16(4), // range shift
            // End character codes
            UInt16(80), // char code [0]
            UInt16(256), // char code [1]
            UInt16(336), // char code [2]
            UInt16(512), // char code [3]
            UInt16(592), // char code [4]
            UInt16(65535), // char code [5]
            UInt16(0), // reserved
            // Start character codes
            UInt16(80), // char code [0]
            UInt16(256), // char code [1]
            UInt16(336), // char code [2]
            UInt16(512), // char code [3]
            UInt16(592), // char code [4]
            UInt16(65535), // char code [5]
            // Deltas
            Int16(-79), // delta [0]
            Int16(-246), // delta [1]
            Int16(-236), // delta [2]
            Int16(488), // delta [3]
            Int16(9408), // delta [4]
            Int16(1), // delta [5]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
            UInt16(0), // offset [2]
            UInt16(0), // offset [3]
            UInt16(0), // offset [4]
            UInt16(0), // offset [5]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x40), None);
        assert_eq!(subtable.glyph_index(0x50), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x100), Some(GlyphId(10)));
        assert_eq!(subtable.glyph_index(0x150), Some(GlyphId(100)));
        assert_eq!(subtable.glyph_index(0x200), Some(GlyphId(1000)));
        assert_eq!(subtable.glyph_index(0x250), Some(GlyphId(10000)));
        assert_eq!(subtable.glyph_index(0x300), None);
    }

    #[test]
    fn no_end_codes() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(28), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(73), // char code [0]
            // 0xFF, 0xFF, // char code [1] <-- removed
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            // 0xFF, 0xFF, // char code [1] <-- removed
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        assert!(cmap::Subtable4::parse(&data).is_none());
    }

    #[test]
    fn invalid_segment_count() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(32), // subtable size
            UInt16(0), // language ID
            UInt16(1), // 2 x segCount <-- must be more than 1
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        assert!(cmap::Subtable4::parse(&data).is_none());
    }

    #[test]
    fn only_end_segments() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(32), // subtable size
            UInt16(0), // language ID
            UInt16(2), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        // Should not loop forever.
        assert_eq!(subtable.glyph_index(0x41), None);
    }

    #[test]
    fn invalid_length() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(16), // subtable size <-- the size should be 32, but we don't check it anyway
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x41), Some(GlyphId(1)));
        assert_eq!(subtable.glyph_index(0x42), None);
    }

    #[test]
    fn codepoint_out_of_range() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(32), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(-64), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(0), // offset [0]
            UInt16(0), // offset [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        // Format 4 supports only u16 codepoints, so we have to bail immediately otherwise.
        assert_eq!(subtable.glyph_index(0x1FFFF), None);
    }

    #[test]
    fn zero() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(42), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(69), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(0), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(4), // offset [0]
            UInt16(0), // offset [1]
            // Glyph index array
            UInt16(0), // glyph ID [0] <-- indicates missing glyph
            UInt16(10), // glyph ID [1]
            UInt16(100), // glyph ID [2]
            UInt16(1000), // glyph ID [3]
            UInt16(10000), // glyph ID [4]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(0x41), None);
    }

    #[test]
    fn invalid_offset() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(42), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(69), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(65), // char code [0]
            UInt16(65535), // char code [1]
            // Deltas
            Int16(0), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(4), // offset [0]
            UInt16(65535), // offset [1]
            // Glyph index array
            UInt16(1), // glyph ID [0]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();
        assert_eq!(subtable.glyph_index(65535), None);
    }

    #[test]
    fn collect_codepoints() {
        let data = convert(&[
            UInt16(4), // format
            UInt16(24), // subtable size
            UInt16(0), // language ID
            UInt16(4), // 2 x segCount
            UInt16(2), // search range
            UInt16(0), // entry selector
            UInt16(2), // range shift
            // End character codes
            UInt16(34), // char code [0]
            UInt16(65535), // char code [1]
            UInt16(0), // reserved
            // Start character codes
            UInt16(27), // char code [0]
            UInt16(65533), // char code [1]
            // Deltas
            Int16(0), // delta [0]
            Int16(1), // delta [1]
            // Offsets into Glyph index array
            UInt16(4), // offset [0]
            UInt16(0), // offset [1]
            // Glyph index array
            UInt16(0), // glyph ID [0]
            UInt16(10), // glyph ID [1]
        ]);

        let subtable = cmap::Subtable4::parse(&data).unwrap();

        let mut vec = vec![];
        subtable.codepoints(|c| vec.push(c));
        assert_eq!(vec, [27, 28, 29, 30, 31, 32, 33, 34, 65533, 65534, 65535]);
    }
}
@ -0,0 +1,83 @@
use ttf_parser::feat::Table;
use crate::{convert, Unit::*};

#[test]
fn basic() {
    let data = convert(&[
        Fixed(1.0), // version
        UInt16(4), // number of features
        UInt16(0), // reserved
        UInt32(0), // reserved

        // Feature Name [0]
        UInt16(0), // feature
        UInt16(1), // number of settings
        UInt32(60), // offset to settings table
        UInt16(0), // flags: none
        UInt16(260), // name index

        // Feature Name [1]
        UInt16(1), // feature
        UInt16(1), // number of settings
        UInt32(64), // offset to settings table
        UInt16(0), // flags: none
        UInt16(256), // name index

        // Feature Name [2]
        UInt16(3), // feature
        UInt16(3), // number of settings
        UInt32(68), // offset to settings table
        Raw(&[0x80, 0x00]), // flags: exclusive
        UInt16(262), // name index

        // Feature Name [3]
        UInt16(6), // feature
        UInt16(2), // number of settings
        UInt32(80), // offset to settings table
        Raw(&[0xC0, 0x01]), // flags: exclusive and other
        UInt16(258), // name index

        // Setting Name [0]
        UInt16(0), // setting
        UInt16(261), // name index

        // Setting Name [1]
        UInt16(2), // setting
        UInt16(257), // name index

        // Setting Name [2]
        UInt16(0), // setting
        UInt16(268), // name index
        UInt16(3), // setting
        UInt16(264), // name index
        UInt16(4), // setting
        UInt16(265), // name index

        // Setting Name [3]
        UInt16(0), // setting
        UInt16(259), // name index
        UInt16(1), // setting
        UInt16(260), // name index
    ]);

    let table = Table::parse(&data).unwrap();
    assert_eq!(table.names.len(), 4);

    let feature0 = table.names.get(0).unwrap();
    assert_eq!(feature0.feature, 0);
    assert_eq!(feature0.setting_names.len(), 1);
    assert_eq!(feature0.exclusive, false);
    assert_eq!(feature0.name_index, 260);

    let feature2 = table.names.get(2).unwrap();
    assert_eq!(feature2.feature, 3);
    assert_eq!(feature2.setting_names.len(), 3);
    assert_eq!(feature2.exclusive, true);

    assert_eq!(feature2.setting_names.get(1).unwrap().setting, 3);
    assert_eq!(feature2.setting_names.get(1).unwrap().name_index, 264);

    let feature3 = table.names.get(3).unwrap();
    assert_eq!(feature3.default_setting_index, 1);
    assert_eq!(feature3.exclusive, true);
}
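
// Context for the flag bytes used above: in the `feat` table, bit 15 (0x8000)
// marks a feature as exclusive and bit 14 (0x4000) indicates that the low byte
// holds the default setting index, which is why the `[0xC0, 0x01]` feature is
// reported as exclusive with `default_setting_index == 1`.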
@ -0,0 +1,47 @@
use std::fmt::Write;

struct Builder(String);

impl ttf_parser::OutlineBuilder for Builder {
    fn move_to(&mut self, x: f32, y: f32) {
        write!(&mut self.0, "M {} {} ", x, y).unwrap();
    }

    fn line_to(&mut self, x: f32, y: f32) {
        write!(&mut self.0, "L {} {} ", x, y).unwrap();
    }

    fn quad_to(&mut self, x1: f32, y1: f32, x: f32, y: f32) {
        write!(&mut self.0, "Q {} {} {} {} ", x1, y1, x, y).unwrap();
    }

    fn curve_to(&mut self, x1: f32, y1: f32, x2: f32, y2: f32, x: f32, y: f32) {
        write!(&mut self.0, "C {} {} {} {} {} {} ", x1, y1, x2, y2, x, y).unwrap();
    }

    fn close(&mut self) {
        write!(&mut self.0, "Z ").unwrap();
    }
}
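
// The builder above serializes outline callbacks into an SVG-style path string
// ("M", "L", "Q", "C", "Z"). The regression test below only needs a sink for
// the callbacks; the produced string itself is discarded.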

#[test]
fn endless_loop() {
    let data = b"\x00\x01\x00\x00\x00\x0f\x00\x10\x00PTT-W\x002h\xd7\x81x\x00\
        \x00\x00?L\xbaN\x00c\x9a\x9e\x8f\x96\xe3\xfeu\xff\x00\xb2\x00@\x03\x00\xb8\
        cvt 5:\x00\x00\x00\xb5\xf8\x01\x00\x03\x9ckEr\x92\xd7\xe6\x98M\xdc\x00\x00\
        \x03\xe0\x00\x00\x00dglyf\"\t\x15`\x00\x00\x03\xe0\x00\x00\x00dglyf\"\t\x15\
        `\x00\x00\x00 \x00\x00\x00\xfc\x97\x9fmx\x87\xc9\xc8\xfe\x00\x00\xbad\xff\
        \xff\xf1\xc8head\xc7\x17\xce[\x00\x00\x00\xfc\x00\x00\x006hhea\x03\xc6\x05\
        \xe4\x00\x00\x014\x00\x00\x00$hmtx\xc9\xfdq\xed\x00\x00\xb5\xf8\x01\x00\x03\
        \x9ckEr\x92\xd7\xe6\xdch\x00\x00\xc9d\x00\x00\x04 loca\x00M\x82\x11\x00\x00\
        \x00\x06\x00\x00\x00\xa0maxp\x17\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00 name\
        \xf4\xd6\xfe\xad\x00OTTO\x00\x02gpost5;5\xe1\x00\x00\xb0P\x00\x00\x01\xf0perp%\
        \xb0{\x04\x93D\x00\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x01\x00\x00\xe1!yf%1\
        \x08\x95\x00\x00\x00\x00\x00\xaa\x06\x80fmtx\x02\x00\x00\x00\x00\x00\x00\x00\
        \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\
        \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00a\xcc\xff\
        \xce\x03CCCCCCCCC\x00\x00\x00\x00\x00C\x00\x00\x00\x00\xb5\xf8\x01\x00\x00\x9c";

    let face = ttf_parser::Face::from_slice(data, 0).unwrap();
    let _ = face.outline_glyph(ttf_parser::GlyphId(0), &mut Builder(String::new()));
}
@ -0,0 +1,99 @@
use std::num::NonZeroU16;
use ttf_parser::GlyphId;
use ttf_parser::hmtx::Table;
use crate::{convert, Unit::*};

macro_rules! nzu16 {
    ($n:expr) => { NonZeroU16::new($n).unwrap() };
}

#[test]
fn simple_case() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]
    ]);

    let table = Table::parse(1, nzu16!(1), &data).unwrap();
    assert_eq!(table.advance(GlyphId(0)), Some(1));
    assert_eq!(table.side_bearing(GlyphId(0)), Some(2));
}

#[test]
fn empty() {
    assert!(Table::parse(1, nzu16!(1), &[]).is_none());
}

#[test]
fn zero_metrics() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]
    ]);

    assert!(Table::parse(0, nzu16!(1), &data).is_none());
}

#[test]
fn smaller_than_glyphs_count() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]

        Int16(3), // side bearing [1]
    ]);

    let table = Table::parse(1, nzu16!(2), &data).unwrap();
    assert_eq!(table.advance(GlyphId(0)), Some(1));
    assert_eq!(table.side_bearing(GlyphId(0)), Some(2));
    assert_eq!(table.advance(GlyphId(1)), Some(1));
    assert_eq!(table.side_bearing(GlyphId(1)), Some(3));
}

#[test]
fn less_metrics_than_glyphs() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]

        UInt16(3), // advance width [1]
        Int16(4), // side bearing [1]

        Int16(5), // side bearing [2]
    ]);

    let table = Table::parse(2, nzu16!(1), &data).unwrap();
    assert_eq!(table.side_bearing(GlyphId(0)), Some(2));
    assert_eq!(table.side_bearing(GlyphId(1)), Some(4));
    assert_eq!(table.side_bearing(GlyphId(2)), None);
}
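
// The tests above and below depend on the hmtx layout: `number_of_metrics`
// (advance width, side bearing) pairs come first, followed by bare side
// bearings for the remaining glyphs, which all reuse the last listed advance.
// A minimal sketch of that rule over the raw metric arrays (a simplified
// stand-in, not the ttf-parser API):
#[allow(dead_code)]
fn advance_sketch(advances: &[u16], extra_side_bearings: usize, glyph: usize) -> Option<u16> {
    if glyph < advances.len() {
        advances.get(glyph).copied()
    } else if glyph < advances.len() + extra_side_bearings {
        advances.last().copied()
    } else {
        None
    }
}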

#[test]
fn glyph_out_of_bounds_0() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]
    ]);

    let table = Table::parse(1, nzu16!(1), &data).unwrap();
    assert_eq!(table.advance(GlyphId(0)), Some(1));
    assert_eq!(table.side_bearing(GlyphId(0)), Some(2));
    assert_eq!(table.advance(GlyphId(1)), None);
    assert_eq!(table.side_bearing(GlyphId(1)), None);
}

#[test]
fn glyph_out_of_bounds_1() {
    let data = convert(&[
        UInt16(1), // advance width [0]
        Int16(2), // side bearing [0]

        Int16(3), // side bearing [1]
    ]);

    let table = Table::parse(1, nzu16!(2), &data).unwrap();
    assert_eq!(table.advance(GlyphId(1)), Some(1));
    assert_eq!(table.side_bearing(GlyphId(1)), Some(3));
    assert_eq!(table.advance(GlyphId(2)), None);
    assert_eq!(table.side_bearing(GlyphId(2)), None);
}
@ -0,0 +1,146 @@
mod aat;
mod ankr;
mod cff1;
mod cmap;
mod feat;
mod glyf;
mod hmtx;
mod maxp;
mod sbix;
mod trak;

use ttf_parser::{Face, FaceParsingError, fonts_in_collection};

#[allow(dead_code)]
#[derive(Clone, Copy)]
pub enum Unit {
    Raw(&'static [u8]),
    Int8(i8),
    UInt8(u8),
    Int16(i16),
    UInt16(u16),
    Int32(i32),
    UInt32(u32),
    Fixed(f32),
}

pub fn convert(units: &[Unit]) -> Vec<u8> {
    let mut data = Vec::with_capacity(256);
    for v in units {
        convert_unit(*v, &mut data);
    }

    data
}

fn convert_unit(unit: Unit, data: &mut Vec<u8>) {
    match unit {
        Unit::Raw(bytes) => {
            data.extend_from_slice(bytes);
        }
        Unit::Int8(n) => {
            data.extend_from_slice(&i8::to_be_bytes(n));
        }
        Unit::UInt8(n) => {
            data.extend_from_slice(&u8::to_be_bytes(n));
        }
        Unit::Int16(n) => {
            data.extend_from_slice(&i16::to_be_bytes(n));
        }
        Unit::UInt16(n) => {
            data.extend_from_slice(&u16::to_be_bytes(n));
        }
        Unit::Int32(n) => {
            data.extend_from_slice(&i32::to_be_bytes(n));
        }
        Unit::UInt32(n) => {
            data.extend_from_slice(&u32::to_be_bytes(n));
        }
        Unit::Fixed(n) => {
            data.extend_from_slice(&i32::to_be_bytes((n * 65536.0) as i32));
        }
    }
}
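
// `convert` simply concatenates the big-endian encoding of each unit, so, for
// example, `convert(&[UInt16(1), Int16(-2)])` yields `[0x00, 0x01, 0xFF, 0xFE]`,
// and `Fixed(1.0)` becomes the 16.16 fixed-point value `0x00010000`.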

#[test]
fn empty_font() {
    assert_eq!(Face::from_slice(&[], 0).unwrap_err(),
               FaceParsingError::UnknownMagic);
}

#[test]
fn zero_tables() {
    use Unit::*;
    let data = convert(&[
        Raw(&[0x00, 0x01, 0x00, 0x00]), // magic
        UInt16(0), // numTables
        UInt16(0), // searchRange
        UInt16(0), // entrySelector
        UInt16(0), // rangeShift
    ]);

    assert_eq!(Face::from_slice(&data, 0).unwrap_err(),
               FaceParsingError::NoHeadTable);
}

#[test]
fn tables_count_overflow() {
    use Unit::*;
    let data = convert(&[
        Raw(&[0x00, 0x01, 0x00, 0x00]), // magic
        UInt16(std::u16::MAX), // numTables
        UInt16(0), // searchRange
        UInt16(0), // entrySelector
        UInt16(0), // rangeShift
    ]);

    assert_eq!(Face::from_slice(&data, 0).unwrap_err(),
               FaceParsingError::MalformedFont);
}

#[test]
fn empty_font_collection() {
    use Unit::*;
    let data = convert(&[
        Raw(&[0x74, 0x74, 0x63, 0x66]), // magic
        UInt16(0), // majorVersion
        UInt16(0), // minorVersion
        UInt32(0), // numFonts
    ]);

    assert_eq!(fonts_in_collection(&data), Some(0));
    assert_eq!(Face::from_slice(&data, 0).unwrap_err(),
               FaceParsingError::FaceIndexOutOfBounds);
}

#[test]
fn font_collection_num_fonts_overflow() {
    use Unit::*;
    let data = convert(&[
        Raw(&[0x74, 0x74, 0x63, 0x66]), // magic
        UInt16(0), // majorVersion
        UInt16(0), // minorVersion
        UInt32(std::u32::MAX), // numFonts
    ]);

    assert_eq!(fonts_in_collection(&data), Some(std::u32::MAX));
    assert_eq!(Face::from_slice(&data, 0).unwrap_err(),
               FaceParsingError::MalformedFont);
}

#[test]
fn font_index_overflow() {
    use Unit::*;
    let data = convert(&[
        Raw(&[0x74, 0x74, 0x63, 0x66]), // magic
        UInt16(0), // majorVersion
        UInt16(0), // minorVersion
        UInt32(1), // numFonts
        UInt32(12), // offset [0]
    ]);

    assert_eq!(fonts_in_collection(&data), Some(1));
    assert_eq!(Face::from_slice(&data, std::u32::MAX).unwrap_err(),
               FaceParsingError::FaceIndexOutOfBounds);
}