LeetCode 271. Encode and Decode String
Question
Design an algorithm to encode a list of strings to a string. The encoded string is then sent over the network and is decoded back to the original list of strings.
Machine 1 (sender) has the function:
```cpp
string encode(vector<string> strs) {
  // ... your code
  return encoded_string;
}
```
Machine 2 (receiver) has the function:
```cpp
vector<string> decode(string s) {
  // ... your code
  return strs;
}
```
So Machine 1 does:
```cpp
string encoded_string = encode(strs);
```
and Machine 2 does:
```cpp
vector<string> strs2 = decode(encoded_string);
```
`strs2` in Machine 2 should be the same as `strs` in Machine 1.
Implement the `encode` and `decode` methods.
You are not allowed to solve the problem using any serialize method (such as `eval`).
Example 1:
```
Input: dummy_input = ["Hello","World"]
Output: ["Hello","World"]
```
Example 2:
```
Input: dummy_input = [""]
Output: [""]
```
Constraints:
1 <= strs.length <= 200
0 <= strs[i].length <= 200
`strs[i]` contains any possible characters out of 256 valid ASCII characters.
Source: https://leetcode.com/problems/encode-and-decode-strings/
Solution
The best solution is to prepend a header that stores the length of each string, instead of using a delimiter. The strings are concatenated as `header-payload-header-payload-...`.
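As a concrete illustration of the layout (a sketch, assuming a 4-char little-endian length header as in the code below), encoding `["ab", "c"]` produces a string whose characters are:

```
'\u0002' '\u0000' '\u0000' '\u0000' 'a' 'b'    <- header (length 2), then payload "ab"
'\u0001' '\u0000' '\u0000' '\u0000' 'c'        <- header (length 1), then payload "c"
```

Because the decoder always reads exactly 4 header chars and then exactly `length` payload chars, the payload may contain any character, including anything that would break a delimiter-based scheme.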
Note that we call the `toCharArray` method rather than `getBytes()`: we do not care about the content of the payload, so we do not need to encode/decode the strings through a character set. Also, on some OA platforms we may need to import the desired character set manually.
```java
// encode a 32-bit integer to a char array, according to little endian
```
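The code block above is truncated in this copy; the following is a self-contained sketch of the scheme it describes (the class and method names are my assumptions): each string is prefixed with its 32-bit length encoded little-endian into 4 chars, and the decoder reads the header, then slices out exactly that many payload chars.

```java
import java.util.ArrayList;
import java.util.List;

class Codec {
    // encode a 32-bit integer into 4 chars, little endian (lowest byte first)
    private static void writeLength(StringBuilder sb, int len) {
        for (int i = 0; i < 4; i++) {
            sb.append((char) ((len >> (8 * i)) & 0xFF));
        }
    }

    // decode a 32-bit integer from the 4 chars starting at pos
    private static int readLength(String s, int pos) {
        int len = 0;
        for (int i = 0; i < 4; i++) {
            len |= (s.charAt(pos + i) & 0xFF) << (8 * i);
        }
        return len;
    }

    public String encode(List<String> strs) {
        StringBuilder sb = new StringBuilder();
        for (String str : strs) {
            writeLength(sb, str.length()); // header: payload length
            sb.append(str);                // payload: copied verbatim as chars
        }
        return sb.toString();
    }

    public List<String> decode(String s) {
        List<String> res = new ArrayList<>();
        int pos = 0;
        while (pos < s.length()) {
            int len = readLength(s, pos); // read 4-char header
            pos += 4;
            res.add(s.substring(pos, pos + len)); // slice out the payload
            pos += len;
        }
        return res;
    }
}
```

Since the header has a fixed width, empty strings round-trip correctly: `encode` emits 4 zero chars and no payload, and `decode` reads back a length of 0.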
http://yenotes.org/2021/12/30/LeetCode-271-Encode-and-Decode-String/