Remove incompatible Unicode compatibility types.
As well-intentioned as these were, uint16_t and C++11's char16_t are _not_ actually compatible. They are not implicitly convertible, and they mangle differently, so they are not even ABI compatible. In our now wondrous world of C++11, no one should be using these, so just kill them.

Bug: 18300613
Change-Id: I06d92d7f1d937dd94a620874323d4c50eb6a31bd
This commit is contained in:
parent
606bb5f2e5
commit
c59932f937
@@ -22,12 +22,6 @@
 extern "C" {
 
-// Definitions exist in C++11
-#if defined __cplusplus && __cplusplus < 201103L
-typedef unsigned int char32_t;
-typedef unsigned short char16_t;
-#endif
-
 // Standard string functions on char16_t strings.
 int strcmp16(const char16_t *, const char16_t *);
 int strncmp16(const char16_t *s1, const char16_t *s2, size_t n);