Finite Math and Applied Calculus (6th Edition)

Published by Brooks Cole
ISBN 10: 1133607705
ISBN 13: 978-1-13360-770-0

Chapter 6 - Section 6.3 - Decision Algorithms: The Addition and Multiplication Principles - Exercises - Page 424: 32

Answer

The computer code would have to use 2 bytes per character.

Work Step by Step

A bit can be either a 1 or a 0, which gives 2 choices, so each added bit doubles the number of characters that can be represented. (See Exercise 31.) Therefore each added byte multiplies the number of representable characters by $2^8=256$; in particular, one byte can represent $2^8=256$ different characters, which is not enough for 50,000. Two bytes can represent $2^8\cdot 2^8=2^{16}=65{,}536$ different characters. Since $50{,}000\lt 65{,}536$, two bytes are enough to represent 50,000 characters.
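As an illustrative check (not part of the textbook's solution), here is a minimal Python sketch of the same counting argument: it finds the smallest whole number of bytes $b$ such that $2^{8b}$ meets or exceeds the required number of characters. The function name `bytes_needed` is our own label for this calculation.

```python
def bytes_needed(num_characters: int) -> int:
    """Return the minimum whole bytes b such that 2**(8*b) >= num_characters.

    Each byte contributes 8 bits, and n bits can encode 2**n distinct values.
    """
    b = 1
    while 2 ** (8 * b) < num_characters:
        b += 1
    return b

if __name__ == "__main__":
    # One byte covers only 256 characters; two bytes cover 65,536 >= 50,000.
    print(bytes_needed(50_000))  # prints 2
```

Running this confirms the answer: $2^8=256$ falls short of 50,000, so the loop advances to $b=2$, where $2^{16}=65{,}536$ suffices.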