When the computer truncates a long to a short, what result would you expect?
That is, do you have any formal definition of what truncation does? And is
the computer violating this definition?
Here is how I view your assignment: "Given two signed integers A and B,
stored in binary 2's complement, where A is A' bits wide and B is B' bits
wide, with A' > B', assigning A to B loads the least significant B' bits of
A into B. The result is (of course) interpreted as 2's complement binary."
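To make that concrete, here is a small sketch. It assumes the common case
of a 16-bit short with 2's complement storage; strictly speaking, C leaves
the result of an out-of-range signed conversion implementation-defined,
but this is what virtually every machine you'll meet does:

    #include <stdio.h>

    int main(void)
    {
        long a = 100000L;     /* 0x186A0 -- needs more than 16 bits */
        short b = (short) a;  /* keeps the least significant 16 bits */

        /* The low 16 bits of 0x186A0 are 0x86A0.  The high bit is
           set, so 2's complement reads it as 0x86A0 - 0x10000,
           which is -31072. */
        printf("a = %ld, b = %d\n", a, (int) b);
        return 0;
    }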
Three terms in that definition do all the work: "2's complement",
"least significant", and "binary".
You seriously need to understand all three concepts.
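Here is one way to see how the three interact, doing the interpretation by
hand instead of letting the compiler do it. The 16-bit width, the mask, and
the 0x10000 are my assumptions for the example, not anything magic:

    #include <stdio.h>

    int main(void)
    {
        unsigned long bits = 100000UL & 0xFFFFUL; /* "least significant": keep the low 16 bits */
        long value = (bits & 0x8000UL)            /* "2's complement": is the high bit set?    */
                   ? (long) bits - 0x10000L       /* yes: the pattern means bits - 2^16        */
                   : (long) bits;                 /* no: the pattern means itself              */
        printf("low 16 bits = 0x%04lX, value = %ld\n", bits, value);
        return 0;
    }

That prints 0x86A0 and -31072, the same answer the cast above gives you.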
It would be useful to understand 1's complement as well, although offhand
I don't know of any modern computers that actually use it.
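For contrast, a sketch of the difference (again assuming 16-bit widths; no
modern compiler will give you 1's complement arithmetic, so this just
builds both bit patterns by hand):

    #include <stdio.h>

    int main(void)
    {
        unsigned short x = 5;
        unsigned short twos = (unsigned short) (~x + 1u); /* 2's complement -5: 0xFFFB */
        unsigned short ones = (unsigned short) ~x;        /* 1's complement -5: 0xFFFA */

        /* Note the off-by-one: 1's complement negation is inversion
           alone, which is also why those machines had a +0 and a -0. */
        printf("2's: 0x%04X  1's: 0x%04X\n", (unsigned) twos, (unsigned) ones);
        return 0;
    }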