Answer
The computer code would have to use 2 bytes per character.
Work Step by Step
A bit can be either a 1 or a 0, which gives 2 choices. Each added bit doubles the number of characters that we can represent. (See Exercise 31.) Therefore each added byte multiplies the number of characters that we can represent by $2^8=256$.
(One byte can represent $2^8=256$ different characters.)
Two bytes can represent $2^8\cdot 2^8=2^{16}=65{,}536$ different characters. Since $50{,}000\lt 65{,}536$, two bytes are enough to represent 50,000 characters.
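The reasoning above can be checked with a short sketch (the function name `bytes_needed` is just an illustrative choice, not from the text): each byte contributes 8 bits, so $n$ bytes can represent $2^{8n}$ distinct characters, and we look for the smallest $n$ that reaches 50,000.

```python
# Minimal sketch: find the fewest whole bytes needed to encode a given
# number of distinct characters. Each byte has 8 bits, so n bytes
# give 2**(8*n) possible codes.
def bytes_needed(num_characters: int) -> int:
    n = 1
    while 2 ** (8 * n) < num_characters:
        n += 1
    return n

print(2 ** 8)                 # one byte: 256 codes — not enough for 50,000
print(2 ** 16)                # two bytes: 65,536 codes
print(bytes_needed(50_000))   # smallest sufficient size: 2 bytes
```

Running this confirms that one byte (256 codes) falls short, while two bytes (65,536 codes) suffice.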