[cdt-debug-dev] Memory Window Hex/ASCII Renderer

Hi All,


I am seeing some inconsistencies in the memory window: the behavior of the hex/ASCII renderer differs from that of the integer renderer. The first thing I noticed is that the hex/ASCII renderer shows values in big-endian format even though the target is little endian. The unsigned integer renderer, on the other hand, does consider the target's endianness and sets the default display format from the target. The second thing I noticed is that when you write to the target through the hex/ASCII renderer, the value is written in big-endian format, which doesn't work for a little-endian target; the byte order is reversed.


Here is an example. I have a little-endian target, with a hex renderer and an unsigned integer renderer both displaying memory at address 0, where the value is 1. The unsigned renderer displays 1, but the hex renderer displays 01000000. When I enter the value 1 in the hex renderer, the unsigned renderer shows 16777216. I think this behavior is incorrect: I would expect to see 1 in the hex renderer (if it considered endianness) and 1 in the unsigned renderer, as long as both renderers use the same endianness.
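The mismatch above can be reproduced outside the debugger. The following is just an illustrative sketch (not CDT code): the value 1 on a little-endian target is stored as the bytes 01 00 00 00, and reading those same bytes with the wrong byte order yields 16777216 (0x01000000), exactly what the unsigned renderer shows after the hex renderer's write.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        // The bytes at address 0 on a little-endian target holding the value 1
        byte[] mem = {0x01, 0x00, 0x00, 0x00};

        // Renderer that honors the target's endianness (like the unsigned integer renderer)
        int le = ByteBuffer.wrap(mem).order(ByteOrder.LITTLE_ENDIAN).getInt();

        // Renderer that always assumes big endian (like the hex/ASCII renderer described above)
        int be = ByteBuffer.wrap(mem).order(ByteOrder.BIG_ENDIAN).getInt();

        System.out.println(le); // prints 1
        System.out.println(be); // prints 16777216
    }
}
```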


On a big-endian target, the hex renderer works perfectly fine. So I am wondering whether this is a general problem for everyone, or something specific to my setup.


Thank you,

Patrick
