
DSP/BIOS: Pros and Cons?

Anonymous

Hi All,

 

I would like to ask a beginner's question on DSP/BIOS (or SYS/BIOS):

 

Are they real-time operating systems? For anything to be called an OS, the defining functionality is scheduling, and probably also inter-process communication (IPC). Is this all that DSP/BIOS is about?

 

Adding components to an originally simple program will inevitably introduce extra code, and I don't think DSP/BIOS can be the exception. If I am able to handle interrupts satisfactorily using a simple vectors.asm and linker.cmd, is there any incentive for me to switch to DSP/BIOS? When I use these simple, although initially awkward, files, I have full knowledge of the program's execution down to the assembly level, and its behavior is deterministic to me. Will DSP/BIOS change this? For example, I need only a few lines of assembly code to branch to an ISR:

part of vectors.asm said:

INT4:
  b _my_isr
  NOP
  NOP
  NOP
  NOP
  NOP
  NOP
  NOP

 

If I switch to DSP/BIOS, will it introduce redundant code? Will it make the program's behavior non-deterministic to me because of my lack of knowledge of this mini operating system? If, without DSP/BIOS, my mathematical calculation assures me that the CPU's computational power is enough to handle my computational load, will DSP/BIOS change this? By how much (as a percentage)?

I would also like some statistics from TI experts.

  1. Among DSP users, what percentage use DSP/BIOS, and how many use "naked" code (C/assembly source + vectors.asm + linker.cmd)? Is there a precise or rough figure?
  2. For ARM users, what is the figure ("naked" code vs. an OS such as embedded Linux)?

 

The last question sounds like a dilemma:

  1. For those choosing DSP/BIOS, is it because it simplifies their work?
  2. For those not choosing DSP/BIOS, is it because many of them find it difficult to get started, even though it is designed to make things simpler?

 

And how many of those choosing not to use an OS do so out of concern for "deterministic / controllable behavior" and "deterministic computational complexity"? Is there a precise figure?

 

 

Sincerely,

Zheng

 

 

  • Zheng Zhao,

    The reason that we have developed DSP/BIOS is to allow you to get a system running as quickly as possible. Most of my customers use DSP/BIOS on their DSP cores, even when they are using another OS on another core (such as Linux on a co-existing ARM core).

    There is overhead associated with using DSP/BIOS, but it can be reduced by controlling the portions that are used and that have memory allocated.

    If you have gone through the process of getting your program running without DSP/BIOS, there is no need to go back to insert DSP/BIOS. But for a new project, it is usually easier and quicker to use DSP/BIOS. I am not yet familiar with SYS/BIOS, but the impact should be the same - easier to get started and to get your program running.

    Go through the training material at TMS320C64x+ DSP System Integration Workshop using DSP/BIOS to understand the features and benefits of using DSP/BIOS.

    Determinism is a vague concept for a cache-based system with interrupts. DSP/BIOS does not insert additional non-deterministic behavior, and can in fact improve the determinism of your debug code by the use of LOG_printf rather than printf.

    Regards,
    RandyP

     

    If this answers your question, please click the Verify Answer button below. If not, please reply with more information.

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

     

    RandyP said:

    Determinism is a vague concept for a cache-based system with interrupts. DSP/BIOS does not insert additional non-deterministic behavior, and can in fact improve the determinism of your debug code by the use of LOG_printf rather than printf.

     

    I understand that pure determinism is hard to define. I need to study this more.

     

     

    Sincerely,

    Zheng

     


  • Zheng,

    An RTOS like SYS/BIOS provides an appropriate abstraction for managing low-level system resources. SYS/BIOS is extremely modular, which allows you to choose only the facilities required by your application (such as threading services, memory management, etc.) and in turn keeps your memory footprint down. The implementation is designed to meet real-time deadlines. From a development perspective you may find it useful to reuse the SYS/BIOS services rather than reinvent the wheel on issues like dynamic memory allocation. This will certainly reduce your development time and allow you to focus more on your application.

    SYS/BIOS is a free RTOS, and its source code is released under an open-source license. This is great if you are interested in thumbing through the kernel sources.

    Another great benefit that I see from using an RTOS like SYS/BIOS is the availability of quality runtime visualization tools like ROV and RTA. These tools are invaluable in finding bugs and tuning the performance of your application. 

    To get started I would suggest reading the SYS/BIOS users guide and then use CCSv4 to create some simple SYS/BIOS applications.  

    Regards

    Amit

     

  • Anonymous
    0 Anonymous in reply to AmitMookerjee

    Dear Amit,

    I am very sorry for failing to notice the answer; there is probably something wrong with Outlook, or I mistakenly marked the email notification as "read" before opening it.

    Thanks very much for the detailed answer and I might ask some other questions in new threads.

     

    Regards,

    Zheng

     

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

    RandyP said:

    Determinism is a vague concept for a cache-based system with interrupts.

    What happens if the caches are disabled? Can they be disabled? I have found that when creating a custom RTSC platform, the default values of the L1D, L1P and L2 cache sizes are all zero (please see this post); can this work? If the code/EVM works when the caches are all set to zero, plus all interrupts disabled, do we then have strict determinism?

     

     

    Zheng

  • Zheng Zhao said:
    What happens if cache are disabled? Can they be disabled?  I have found that in creating a custom RTSC platform the default values of L1D, L1P and L2 cache are all zero (please see this post), can this work?

    Caches can be disabled as you have shown above: setting the cache size to 0KB turns the cache off. But this will slow down your performance a lot, unless you put all program and data in L1P and L1D.

    Zheng Zhao said:
    If the code/EVM could work when caches are all set to zero, plus disabling all interrupts, do we then have the strict determinism here?

    Almost. Your CPU clock is generated by a PLL that is based on a crystal source. The CPU clock has no fixed phase relationship to the input clock source, and this would also be true if you supplied a direct full-speed clock at the input (because of internal delays that vary over voltage and temperature and from one device to another).

    If you mean only that the number of cycles to execute a linear list of instructions is fixed, then yes, you will have determinism by disabling cache and interrupts. Peripherals may need to be avoided, too, but that depends on what you want your system to do.

    Determinism comes at a very high price: giving up the benefits of a cache-based, interrupt-driven complex system. The timers can be used to synchronize interaction with the peripherals, if needed, and the timer pulses can occur at a very fixed rate. Unfortunately, timer interrupts and even timer-based EDMA events will not give you single-cycle accuracy.

    Why do you seek "strict determinism"?

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

    However sophisticated the logic is, essentially all processors, as well as their constituent components, are silicon implementations of discrete state machines. For any particular input sequence, all intermediate state transitions, and consequently the processing time and the eventual output, are determined as a causal chain of consequences that exists a priori, before the actual execution. It is therefore like a "time-invariant" system: whether the input is given today or a year later, as long as the environment (chip, memory, etc.) is initialized to the same state, every step will be the same, like the playback of a recorded frame sequence.

    The cache is the processor's immediate surroundings, and the cache management unit is like its handmaiden, adding an intermediate logistic layer for improved performance. However, the cache management unit is itself still a state machine, and its introduction only results in a more efficient one, but a state machine nonetheless. It will therefore still have the same a priori deterministic and "time-invariant" property as a system without cache. This is why I couldn't fully agree with your "vague concept" description.

    In real applications, of course, we will not always have the same repeated input. However, if the input is already known beforehand (for example, the HD Avatar film), and assuming no external interrupts (button presses, etc.), then the output can still be predicted before the execution.

     

    Sincerely,
    Zheng

  • Zheng Zhao,

    I have three comments on the excellent statements above:

    1. This is a very true and clarifying statement about the nature of determinism in digital logic. It shows your detailed knowledge of digital logic and its behavior. And it is only a small stretch to say that it follows the logic of the quote that "insanity is doing the same thing over and over and expecting different results" (variously attributed to Benjamin Franklin, Albert Einstein, and Rita Mae Brown).

    2. Under this understanding of determinism, the simple and correct answer to your original question on whether DSP/BIOS hurts the determinism of your system is, "no, it does not".

    3. You also have a simple logical step from the above to the willingness to use a fully cache-enabled system, which will improve performance substantially and will not affect determinism per these statements.

    Regards,
    RandyP

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

     

    It is simply great to get your confirmation on 2 :)

     

    A bit further: I think this reproducibility is actually an advantage of digital (binary) systems over analog systems. In the physical world we will always have errors, but as long as they are within the error margin (in voltage, etc.) of the logic gates, we will still get the correct 0 or 1 result.

     

    Setting the state of a discrete state machine in this sense is hence much easier than for a continuous causal system, where thermal noise, EMI and other factors are hard to manage. Errors in a continuous system can propagate, be amplified, and affect a much larger area, but in a binary system we can almost always eliminate small errors (within the margin) at the first logic gate.

     

     

    Sincerely,

    Zheng

     

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

    You are still correct with the "vague determinism" characterization.

    The word "determinism" has different meanings in different contexts. What I meant in my initial post belongs more properly to complexity analysis on a hypothetical machine with uniform memory (equal access speed to all parts) rather than a hierarchical structure, so the analysis is relatively simple.

    The introduction of a memory hierarchy and caching mechanism adds another intermediate layer between the CPU and DDR2 memory. The cache management unit has its own algorithm for making decisions based on the situation (percentage of cache that has been filled, etc.), and this makes the analysis more complicated than in the previous simplistic model. Some instructions and memory references might be served entirely from the cache's copy of memory contents, but often not, so additional load and wait time can be incurred. This definitely complicates the analysis, and whether "absolute determinism" is guaranteed is harder to judge than in the uniform-memory case. (I had a look at the first volume of TAOCP by Dr. Knuth, page 126, section 1.3.1, and it seems that his hypothetical MIX machine contains three types of memory: 1) registers, 2) memory cells, 3) magnetic tape units. There doesn't appear to be a cache layer.)

    Thanks very much for all the replies in this thread which helped me to truly understand the use of cache.


    Zheng

  • Anonymous
    0 Anonymous in reply to RandyP

    Dear Randy,

    I have found that there can be a number of complications when the cache is enabled, and they all make the program's behavior difficult to predict. Could you have a look at this post: Memory hierarchy related questions?

     

    Zheng

  • Why do you care about prediction?

  • Anonymous
    0 Anonymous in reply to RandyP

    I just want to know these things. They are important.

     

    Zheng