For my project I need to reduce the noise of an ADC output, so I implemented a simple moving average filter in VHDL.
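The idea is the standard running-sum form of the moving average over the last N = SAMPLES_COUNT samples, where x[n] denotes the current ADC sample; since N is a power of two, the division is realized as a right shift in the code:

$$\mathrm{sum}[n] = \mathrm{sum}[n-1] + x[n] - x[n-N], \qquad \mathrm{avg}[n] = \frac{\mathrm{sum}[n]}{N}$$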
Although it works correctly in simulation (see the picture):
it shows some strange behavior when I observe it with ChipScope while the system is running on the FPGA (see the picture):
The VHDL code I use for the moving average is as follows:
library ieee;
use ieee.std_logic_1164.all;
use ieee.math_real.all;
use ieee.numeric_std.all;

entity moving_avg is
    generic (
        SAMPLES_COUNT : integer := 32
    );
    port (
        clk_i    : in  std_logic;
        rst_n_i  : in  std_logic;
        sample_i : in  std_logic_vector(11 downto 0);
        avg_o    : out std_logic_vector(11 downto 0)
    );
end;

architecture rtl of moving_avg is

    type sample_buff_t is array (1 to SAMPLES_COUNT) of std_logic_vector(11 downto 0);
    signal sample_buffer : sample_buff_t;

    signal sum : std_logic_vector(31 downto 0);

    -- number of bit positions to shift by when dividing the sum by SAMPLES_COUNT
    constant wid_shift : integer := integer(ceil(log2(real(SAMPLES_COUNT))));

    signal avg_interm_s : std_logic_vector(31 downto 0);

begin

    process (clk_i, rst_n_i) begin
        if rst_n_i = '1' then
            -- reset branch: preload the whole buffer with the current sample and
            -- initialize the running sum to SAMPLES_COUNT times that sample
            sample_buffer <= (others => sample_i);
            sum <= std_logic_vector(unsigned(resize(unsigned(sample_i), sum'length)) sll wid_shift);
        elsif rising_edge(clk_i) then
            -- shift register: newest sample in, oldest sample dropped
            sample_buffer <= sample_i & sample_buffer(1 to SAMPLES_COUNT-1);
            -- running sum: add the new sample, subtract the oldest one
            sum <= std_logic_vector(unsigned(sum) + unsigned(sample_i) - unsigned(sample_buffer(SAMPLES_COUNT)));
        end if;
    end process;

    -- divide by SAMPLES_COUNT with a right shift (SAMPLES_COUNT is a power of two)
    avg_interm_s <= std_logic_vector(unsigned(sum) srl wid_shift);
    avg_o <= avg_interm_s(11 downto 0);

end;
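In simulation I drive the filter from a simple testbench; a minimal sketch along those lines is below (the 100 MHz clock and the ramp stimulus are placeholders, not my actual ADC data):

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity moving_avg_tb is
end;

architecture sim of moving_avg_tb is
    signal clk_s    : std_logic := '0';
    signal rst_n_s  : std_logic := '1';   -- the reset branch above is taken while this is '1'
    signal sample_s : std_logic_vector(11 downto 0) := (others => '0');
    signal avg_s    : std_logic_vector(11 downto 0);
begin
    -- unit under test
    uut : entity work.moving_avg
        generic map (SAMPLES_COUNT => 32)
        port map (
            clk_i    => clk_s,
            rst_n_i  => rst_n_s,
            sample_i => sample_s,
            avg_o    => avg_s
        );

    -- 100 MHz clock (placeholder period)
    clk_s <= not clk_s after 5 ns;

    stimulus : process
    begin
        wait for 50 ns;
        rst_n_s <= '0';                    -- leave the reset branch, start filtering
        -- drive a slow ramp as a stand-in for the ADC samples
        for i in 0 to 1023 loop
            sample_s <= std_logic_vector(to_unsigned(i, 12));
            wait until rising_edge(clk_s);
        end loop;
        wait;
    end process;
end;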
I use Xilinx Vivado 2015.2 running on Ubuntu 14.04 x64.
Could you please help me identify the problem, so that the results in simulation match the results after synthesis?