
Question 1 (1 point)

Information can be measured in units of bits.

Question 1 options:

True

False

Question 2 (1 point)

Which of the following equations is used to compute the information (N) per event (aka outcome or trial)?

Question 2 options:

$N = \log_{2}\left[\frac{1}{P(s)}\right]$

$N = \log_{2}(P(s))$

$N = \frac{1}{\log_{2} P(s)}$

$N = \log_{2}\left[\frac{s}{P(s)}\right]$
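The first formula above can be checked numerically; a minimal Python sketch (the probability value 1/8 is just an illustrative choice):

```python
import math

def self_information(p):
    """Information content in bits of an event with probability p: N = log2(1/p)."""
    return math.log2(1.0 / p)

# An event with probability 1/8 carries 3 bits of information.
print(self_information(1 / 8))  # → 3.0
```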

Question 3 (1 point)

For M equally likely messages, the average amount of information N per message is computed as _______.

Question 3 options:

$N = \log_{2} M$

$N = \log_{2} M^{2}$

$N = \log_{e} M$

$N = \log_{10} M$
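The equally-likely case follows directly from the per-event formula: if each of M messages has probability 1/M, then N = log2(1/(1/M)) = log2(M). A quick check in Python (M = 8 is an arbitrary example):

```python
import math

M = 8  # arbitrary example: 8 equally likely messages
per_event = math.log2(1 / (1 / M))  # log2(1/P(s)) with P(s) = 1/M
direct = math.log2(M)               # N = log2(M)
print(per_event, direct)  # both 3.0
```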

Question 4 (1 point)

“GO BEARS!” is encoded using fixed-length codes and then more efficiently using Huffman codes (based on probability of occurrence). The fixed-length code requires 36 bits while the Huffman code only uses 30 bits. What is the compression ratio of this code?

Question 4 options:

0.833

36 Bits

30 Bits

1.2
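The arithmetic here is a one-liner; note that the two numeric options correspond to the two conventions for a compression ratio (original/compressed, where a value above 1 means space was saved, versus compressed/original):

```python
fixed_bits = 36    # fixed-length encoding of "GO BEARS!"
huffman_bits = 30  # Huffman encoding of the same message

ratio = fixed_bits / huffman_bits    # original / compressed
inverse = huffman_bits / fixed_bits  # compressed / original
print(ratio)             # → 1.2
print(round(inverse, 3)) # → 0.833
```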

Question 5 (1 point)

Rolling a 7 is the most common combination when rolling two fair dice. When compared to rolling any other number, rolling a 7 has ____________________ information or surprise associated with it.

Question 5 options:

Less

More

The same

Zero

Question 6 (1 point)

Suppose Alice performs a statistical analysis on a bag of M&Ms. She shares the results with Bob, who pulls out a green M&M and declares there is zero information obtained from this event. What can be said about the bag?

Question 6 options:

It is mostly green M&Ms

The bag is now empty

The bag has no more green M&Ms in it

It is all green M&Ms

Question 7 (1 point)

When a message is compressed below Entropy, ________________________________________.

Question 7 options:

it doesn’t matter

It is impossible to compress below Entropy

redundant data is removed

information is lost

Question 8 (1 point)

Codes such as the Huffman scheme presented in class must have a uniquely identifiable code for each symbol, such that the short codes cannot be confused as the prefix of one of the longer codes. Which of the following code sets does not comply with this requirement? (Hint: there is more than one)

Question 8 options:

0, 10, 110, 1110, 1111

0, 11, 111, 1111

10, 01, 101

00, 01, 10, 110, 111
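The prefix condition is mechanical to check: no codeword may be a proper prefix of another. A small sketch that tests each of the four sets above:

```python
def is_prefix_free(codes):
    """True if no codeword is a proper prefix of another codeword."""
    for a in codes:
        for b in codes:
            if a != b and b.startswith(a):
                return False
    return True

code_sets = [
    ["0", "10", "110", "1110", "1111"],
    ["0", "11", "111", "1111"],
    ["10", "01", "101"],
    ["00", "01", "10", "110", "111"],
]
for cs in code_sets:
    print(cs, is_prefix_free(cs))
```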

Question 9 (1 point)

Decode the Huffman encoded message 001110111010101000 given the following dictionary:

A 00
B 01
C 11
D 100
E 1010
F 1011

Note: your answer can be in all caps or all lowercase

Question 9 options:
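Because the dictionary above is prefix-free, the bit string can be decoded greedily from left to right, emitting a symbol as soon as the buffered bits match a codeword; a minimal sketch:

```python
codebook = {"00": "A", "01": "B", "11": "C", "100": "D", "1010": "E", "1011": "F"}

def decode(bits, codebook):
    """Greedy prefix-code decoding: consume bits until a codeword matches."""
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in codebook:
            out.append(codebook[buf])
            buf = ""
    if buf:
        raise ValueError("leftover bits: " + buf)
    return "".join(out)

print(decode("001110111010101000", codebook))  # → ACFEEA
```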

Question 10 (1 point)

Huffman codes use the following characteristic of the message to create compressed coding schemes:

Question 10 options:

The compression style desired

The probability of occurrence of symbols within the message

The length of the message

The Entropy of individual symbols in the message

Question 11 (1 point)

When using Huffman codes, shorter codes are assigned to the symbols with the ______________________ probability of occurrence in the data set.

Question 11 options:

Probability does not factor into the coding scheme

Lowest

Highest
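This property falls out of the Huffman construction itself: the two least-probable subtrees are merged repeatedly, so rare symbols sit deepest in the tree. A sketch using Python's heapq with hypothetical probabilities (the distribution below is illustrative, not the lab's data):

```python
import heapq

def huffman_code(probs):
    """Build Huffman codes from {symbol: probability}.
    Higher-probability symbols end up with shorter codes."""
    # Heap entries: (probability, tiebreak, {symbol: partial_code})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prepend a bit to every code in each merged subtree.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Illustrative distribution -- NOT the lab's Page 7 data:
codes = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
print(codes)  # "a" gets a 1-bit code, "c" and "d" get 3-bit codes
```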

Question 12 (1 point)

Please refer to Page 7 of your lab handout, which provides a data set for the individual portion of this lab. What is the Entropy of the data set in units of BITS/SYMBOL?

Question 12 options:

16.89

2.56

2.81

2.54
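The Page 7 data set isn't reproduced here, but the entropy calculation itself is straightforward; a sketch with hypothetical symbol counts (the counts below are illustrative only, so the printed value will not match the lab answer):

```python
import math

def entropy(counts):
    """Shannon entropy in bits/symbol: H = -sum(p * log2(p))."""
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical M&M color counts -- NOT the lab's Page 7 data:
counts = {"blue": 12, "orange": 10, "green": 8, "yellow": 7, "red": 6, "brown": 5}
print(round(entropy(counts), 2))
```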

Question 13 (1 point)

Your average Code Length is a measure of how well your code performed. When using a Huffman code scheme, your Average Code Length should NEVER be lower than Entropy. Verify that your calculations comply with this rule and then select the reason WHY:

Question 13 options:

Entropy was defined by Mr. Huffman and his equation requires compliance with this rule

Huffman codes are lossless and going below Entropy would imply data loss

My average code length was less, so I think I may have developed a more efficient code and should immediately validate this class!!

Huffman codes are lossy as reflected by the lower value in Average Code Length

Question 14 (1 point)

Given the data set provided on Page 7 of the lab (individual work), what two M&M colors yielded the shortest Huffman codes?

Question 14 options:

Question 15 (1 point)

Given the data set provided on Page 7 of the lab (individual work), and the M&M sequence of

Blue | Blue | Red | Yellow | Orange

What is the compression ratio between a 3-bit fixed-width code and your Huffman code?

Question 15 options:

0.8

1.25

Cannot be determined with the info provided

12
