
I am trying to write a block decoder that is part of an ADC. The input and output are digital vectors consisting of logic '1' or '0'. The input vector is 1023 bits long (1023 = 2^10 - 1) and the output vector is 10 bits long, since the ADC has 10 bits.

The idea of the decoder is: first, I count the number of '1's in the input vector. Then I use the functions div() and mod() to convert that count from the decimal system to the binary system. The remaining bits of the digital output are all set to '0'.
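As a side note, the intended conversion can be sketched in plain Python (a hypothetical illustration of the logic only, not the Modelica block itself; the function name `decode` and the list representation are assumptions):

```python
def decode(bits, res=10):
    """bits: list of 0/1 of length 2**res - 1 (thermometer code);
    returns the res-bit binary output, MSB first."""
    a = sum(1 for bit in bits if bit == 1)  # count the '1's in the input
    out = [0] * res                         # remaining bits stay '0'
    i = res - 1                             # fill from the LSB end
    while a != 0:
        out[i] = a % 2                      # mod() picks the current bit
        a = a // 2                          # div() shifts right
        i -= 1
    return out

# 1023-bit input containing five '1's -> binary 0000000101
print(decode([1] * 5 + [0] * 1018))
```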

But the compiler reports:

" The given system is mixed-determined. [index > 3] Please checkout the option "--maxMixedDeterminedIndex". "

Then I deleted the initial algorithm section and moved its contents into the algorithm section. But now the output has no initial value, and I get:

'Chattering detected around time 0..9.22009327831e-011 (100 state events in a row with a total time delta less than the step size 0.002). This can be a performance bottleneck. Use -lv LOG_EVENTS for more information. The zero-crossing was: div(decoder1.a, 2, 2)'

block decoder
  parameter Integer Res(min = 1, start = 10, fixed = true);
  parameter Integer Step(min = 1, start = 1023, fixed = true);
  //Resolution
  Modelica.Electrical.Digital.Interfaces.DigitalInput result[Step] annotation(
    Placement(visible = true, transformation(origin = {-100, 0}, extent = {{-10, -10}, {10, 10}}, rotation = 0), iconTransformation(origin = {-100, 0}, extent = {{-10, -10}, {10, 10}}, rotation = 0)));
  Modelica.Electrical.Digital.Interfaces.DigitalOutput Binary[Res] annotation(
    Placement(visible = true, transformation(origin = {106, 0}, extent = {{-10, -10}, {10, 10}}, rotation = 0), iconTransformation(origin = {106, 0}, extent = {{-10, -10}, {10, 10}}, rotation = 0)));
  Integer a;
  Integer b;
  import L = Modelica.Electrical.Digital.Interfaces.Logic;
  Modelica.Electrical.Digital.Interfaces.DigitalInput De_clk annotation(
    Placement(visible = true, transformation(origin = {-6, 98}, extent = {{-10, -10}, {10, 10}}, rotation = 0), iconTransformation(origin = {-6, 98}, extent = {{-10, -10}, {10, 10}}, rotation = 0)));
initial algorithm
  for i in 1:Res loop
    Binary[i] := L.'0';
  end for;
algorithm
  a := 0;
  b := 0;
  if De_clk == L.'1' or De_clk == L.'H' then
    for i in 1:Step loop
      if result[i] == L.'1' then
        a := a + 1;
      end if;
    end for;
    while div(a, 2) <> 0 loop
      when mod(a, 2) == 1 then
        Binary[Res - b] := L.'1';
      end when;
      when mod(a, 2) == 0 then
        Binary[Res - b] := L.'0';
      end when;
      a := div(a, 2);
      b := b + 1;
    end while;
    Binary[Res - b] := L.'1';
  end if;
  if Res - b - 1 > 0 then
    for i in 1:Res - b - 1 loop
      Binary[i] := L.'0';
    end for;
  end if;
equation

end decoder;

I have no idea how to fix this. Should I keep the initial algorithm section or not?

  • There are some fundamental troubles in your code: `when` cannot be used in while loops. Dymola, for example, refuses to translate your model. Why did you use the `when` statements? Have you fully understood the difference between `when` and `if`? You should rewrite your algorithm section. Maybe your chattering troubles will then vanish as well. – marco Jan 20 '20 at 06:37
  • Thank you very much. I changed the when-statements to if-statements and it works now. But can you tell me the difference between 'if' and 'when'? – Zhou Tianjing Jan 21 '20 at 10:29
  • See e.g. the answers to the questions https://stackoverflow.com/q/26149006/8725275 and https://stackoverflow.com/questions/20016232/difference-between-when-and-if-in-openmodelica – marco Jan 21 '20 at 12:06
