Is there a way to tell Jackson to use UTF-8 encoding when using ObjectMapper
to serialize and deserialize Objects?
Jackson automatically detects the encoding used in the source: as per the JSON specification, the only valid encodings are UTF-8, UTF-16 and UTF-32. No other encodings (like Latin-1) can be used. Because of this, auto-detection is easy and is handled by the parser, which is also why no encoding argument is accepted (or needed). So if the input is UTF-8, it will be detected as such.
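For example, here is a minimal sketch of reading without specifying any charset (the file name is hypothetical); the parser inspects the first bytes of the stream and picks UTF-8, UTF-16 or UTF-32 on its own:

import java.io.FileInputStream;
import java.io.InputStream;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ReadAutoDetect {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // No charset anywhere: the parser auto-detects it from the raw bytes.
        try (InputStream in = new FileInputStream("input.json")) {
            JsonNode node = mapper.readTree(in);
            System.out.println(node);
        }
    }
}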
For output, UTF-8 is the default; but if you explicitly want to use another encoding, you can construct a JsonGenerator explicitly (with a factory method that takes a JsonEncoding argument) and pass it to ObjectMapper.
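A minimal sketch of that pattern, with a hypothetical output file and payload (any of the JsonEncoding constants -- UTF8, UTF16_BE, UTF16_LE, UTF32_BE, UTF32_LE -- can be used):

import java.io.File;
import java.util.Collections;

import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.ObjectMapper;

public class WriteWithExplicitEncoding {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        Object payload = Collections.singletonMap("greeting", "hello");

        // Create the generator with the desired encoding, then hand it to ObjectMapper.
        try (JsonGenerator gen = mapper.getFactory()
                .createGenerator(new File("output.json"), JsonEncoding.UTF16_LE)) {
            mapper.writeValue(gen, payload);
        }
    }
}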
Alternatively, in both cases you can of course manually construct a java.io.Reader / java.io.Writer and make it use whatever encoding you want.
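For instance, a sketch where the charset is fixed on the JDK side via OutputStreamWriter / InputStreamReader (the file name is hypothetical), and Jackson simply consumes or produces characters:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Reader;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ExplicitReaderWriter {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Write: the Writer decides the byte encoding, not Jackson.
        try (Writer w = new OutputStreamWriter(
                new FileOutputStream("data.json"), StandardCharsets.UTF_8)) {
            mapper.writeValue(w, Collections.singletonMap("key", "value"));
        }

        // Read back with the same charset.
        try (Reader r = new InputStreamReader(
                new FileInputStream("data.json"), StandardCharsets.UTF_8)) {
            JsonNode node = mapper.readValue(r, JsonNode.class);
            System.out.println(node);
        }
    }
}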
The default is UTF-8 if you pass a java.io.OutputStream. But there are other defaults: the JDK has its own default encoding if you choose to construct the Writer instance yourself, or some other lib/framework does it for you. These are outside of Jackson - StaxMan 2014-03-17 19:56
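To illustrate the difference described in that comment (file names are hypothetical): passing a raw OutputStream lets Jackson encode as UTF-8, while constructing the Writer yourself without a charset picks up the JDK's platform default.

import java.io.FileOutputStream;
import java.io.OutputStreamWriter;
import java.nio.charset.Charset;
import java.util.Collections;

import com.fasterxml.jackson.databind.ObjectMapper;

public class DefaultEncodingDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        Object value = Collections.singletonMap("name", "Jürgen"); // non-ASCII on purpose

        // Jackson controls the encoding here: always UTF-8 for a raw OutputStream.
        try (FileOutputStream out = new FileOutputStream("utf8.json")) {
            mapper.writeValue(out, value);
        }

        // The JDK controls the encoding here: whatever the platform default charset is.
        System.out.println("Platform default charset: " + Charset.defaultCharset());
        try (OutputStreamWriter writer = new OutputStreamWriter(new FileOutputStream("platform.json"))) {
            mapper.writeValue(writer, value);
        }
    }
}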
JsonGenerator jsonGeneratorWithUtf16 = new JsonFactory().createGenerator(new File("C:/outputJson.json"), JsonEncoding.UTF16_BE);
new ObjectMapper().writeValue(jsonGeneratorWithUtf16, objectToBeSerialized);
but when I run file -i C:/outputJson.json it shows charset=binary - Max 2017-05-19 19:38
file is something Jackson can do nothing about; perhaps it only detects ASCII/UTF-7/Latin-1, and not UTF-16 encodings. Your usage looks fine, so I am not quite sure what your question here is - StaxMan 2017-05-20 00:12