Slow simulation time in Simulink
1 view (last 30 days)
Hi All,
I have been working on simulating HDL logic in Simulink, and I am trying to simulate a 32-bit counter at a 250 MHz clock rate. To do this I am using the HDL-optimized counter block, which lets me set the word length and sample time. I am setting the word length to 32 bits and the sample time to 1/250e6, with a step size of 5000 and a simulation stop time of 1 second. I am experiencing extremely slow simulation and need assistance with that.
I understand that such a big counter with an extremely small sample time would obviously make the simulation slower, but I am trying to find a workaround to this problem.
How can I simulate this without a significant performance loss?
Is there another way to simulate HDL logic?
Any suggestions would be extremely valuable.
Thanks in advance.
Aditya
0 Comments
Accepted Answer
Kiran Kintali
2020-11-16
Edited: Kiran Kintali
2020-11-16
“How do I model the clock signal?” is a question frequently asked by hardware engineers who are new to Simulink and HDL Coder. Here’s how it works:
- In Simulink, global signals such as clock, clock enable, and reset are not explicitly modeled. Instead, they are created during code generation. You represent clock cycles in a Simulink model using sample time.
- For a single-rate model, one time step in Simulink maps to one clock cycle in HDL. You can use a relative mapping (e.g. a sample time of 1 = 1 HDL clock cycle) or an absolute mapping (e.g. a sample time of 10e-9 = one 10 ns clock cycle in HDL), depending on your preference and design requirements.
- For a multi-rate model, the fastest sample time maps to one clock cycle in HDL.
- Some optional optimization settings (e.g. sharing factor) and alternative block architectures (e.g. the Newton-Raphson square root) introduce additional sample rates not present in the original model. In those cases, the fastest generated sample time maps to one HDL clock cycle.
- Tip: Define sample times and rate ratios using MATLAB variables (e.g. Ts = 10e-9, upsamp = 4). This makes it easy to change all sample time settings quickly (see the sample time sketch below).
- If you model your sample time to equal the target clock period, that does not mean the design will achieve that clock speed during RTL synthesis and implementation. Delays from logic gates and wires are introduced during synthesis and implementation, and the slowest path dictates your fastest achievable clock frequency.
- HDL Coder can estimate some of these delays and provide direct feedback from RTL synthesis (see the critical path estimation sketch below).
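A minimal sketch of the sample time tip, assuming you define the variables in the base workspace (or a model callback) and then enter the variable names in the block sample time fields:
% Define the base sample time and rate ratio once, then enter the
% variable names (not literal numbers) in block dialogs, e.g. "Ts"
% for the fast rate and "Ts*upsamp" for the slower rate.
Ts     = 10e-9;   % 10 ns fast sample time (maps to one HDL clock cycle)
upsamp = 4;       % ratio between the fast and slow rates
% Retargeting the whole model to a different clock assumption is then
% a one-line change, for example Ts = 5e-9 for a 200 MHz clock.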
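And a hedged sketch of the critical path estimation feedback, which in HDL Coder does not require launching a synthesis tool. This assumes an HDL Coder license; the model name 'myModel' and the subsystem 'myModel/DUT' are placeholders:
% Turn on critical path estimation so the code generation report
% highlights the estimated slowest path in the generated design.
load_system('myModel');                                   % placeholder model name
hdlset_param('myModel', 'CriticalPathEstimation', 'on');  % enable estimation
makehdl('myModel/DUT');                                   % generate HDL for the DUT subsystem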
More Answers (0)