I'm following the guide at micromouseonline.com/2010/07/14/bit-banding-in-the-stm32, targeting an STM32L151xD (Cortex-M3) with the IAR EWARM compiler. Everything compiles and runs, but I'm not able to set bits in a variable through the bit-band alias.
This is how the guide defines the macros:
/* Bit-band alias address = RAM_BB_BASE + (byte offset from RAM_BASE) * 32 + (bit number) * 4 */
#define RAM_BASE 0x20000000
#define RAM_BB_BASE 0x22000000
#define Var_ResetBit_BB(VarAddr, BitNumber) (*(vu32 *) (RAM_BB_BASE | ((VarAddr - RAM_BASE) << 5) | ((BitNumber) << 2)) = 0)
#define Var_SetBit_BB(VarAddr, BitNumber) (*(vu32 *) (RAM_BB_BASE | ((VarAddr - RAM_BASE) << 5) | ((BitNumber) << 2)) = 1)
#define Var_GetBit_BB(VarAddr, BitNumber) (*(vu32 *) (RAM_BB_BASE | ((VarAddr - RAM_BASE) << 5) | ((BitNumber) << 2)))
#define varSetBit(var,bit) (Var_SetBit_BB((u32)&var,bit))
#define varGetBit(var,bit) (Var_GetBit_BB((u32)&var,bit))
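If I expand varSetBit(flags, 1) by hand, the arithmetic looks right to me. Assuming, for example, that flags lives at 0x20000400 (a made-up address), the alias works out to:

alias = RAM_BB_BASE | ((VarAddr - RAM_BASE) << 5) | (BitNumber << 2)
      = 0x22000000 | ((0x20000400 - 0x20000000) << 5) | (1 << 2)
      = 0x22000000 | 0x00008000 | 0x00000004
      = 0x22008004

so writing 1 to 0x22008004 should set bit 1 of the word at 0x20000400.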
The call looks like this:
uint32_t flags = 0;  /* start with all bits clear */
varSetBit(flags, 1);
However, bit 1 of flags always reads 0 in the debugger. Since flags starts at 0, every bit is 0 initially, but even after varSetBit(flags, 1) the bit still reads back as 0. I don't think I'm doing anything wrong. Is it a compiler problem? Am I missing a project setting? Any help would be appreciated.
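For reference, here is the smallest self-contained version of what I'm trying. The name testFlags, the typedefs, and the volatile qualifier (so the compiler can't keep the variable in a register and the debugger view stays current) are my own additions:

#include <stdint.h>

typedef volatile uint32_t vu32;
typedef uint32_t u32;

#define RAM_BASE    0x20000000
#define RAM_BB_BASE 0x22000000
#define Var_SetBit_BB(VarAddr, BitNumber) \
    (*(vu32 *)(RAM_BB_BASE | (((VarAddr) - RAM_BASE) << 5) | ((BitNumber) << 2)) = 1)
#define varSetBit(var, bit) (Var_SetBit_BB((u32)&(var), (bit)))

/* Global so it sits in SRAM at 0x20000000, which is covered by the bit-band region. */
static volatile uint32_t testFlags = 0;

int main(void)
{
    varSetBit(testFlags, 1);   /* write 1 to the bit-band alias of bit 1 */
    /* testFlags should now read 0x00000002 in the debugger */
    for (;;) { }
}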