Re: RT Testing
Daniel Wagner <wagi@...>
Hi Chris,

On Tue, Jan 14, 2020 at 05:01:54PM +0000, Chris Paterson wrote:
> Hello Pavel, Hayashi-san, Jan, Daniel,

Welcome on board :)

> I've created an RT configuration for the RZ/G1 boards:

I am using merge_config.sh to build the configuration. There are some
catches, but generally it works well. I've hacked up a small tool to
automate this; with it, the configuration is always generated from
scratch using kconfig. (There is a small sketch of the merge_config.sh
step at the end of this mail.)

> Built it with linux-4.4.y-cip-rt and ran cyclictest:

Without load, it's not that interesting.

> Currently there is an issue with the way that the cyclictest case [...]

My current test suite for LAVA contains these:

    rt_suites = ['0_jd-hackbench', '0_jd-compile', '0_jd-stress_ptrace',
                 '0_cyclicdeadline', '0_cyclictest', '0_pi-stress',
                 '0_pmqtest', '0_ptsematest', '0_rt-migrate-test',
                 '0_signaltest', '0_sigwaittest', '0_svsematest']

> That means that the test parsing now depends on Python, which isn't
> included in the cip-core RFS [1] that is currently being used.

Sorry about that. I have changed this so that a test is marked as failed
if a value higher than the maximum is seen. Again, I am using my hack
script to read out the results from the tests and color them:

    $ srt-build c2d jobs result
    0_jd-hackbench        t1-max-latency    : pass      11.00
    0_jd-hackbench        t1-avg-latency    : pass       2.37
    0_jd-hackbench        t1-min-latency    : pass       1.00
    0_jd-hackbench        t0-max-latency    : pass      10.00
    0_jd-hackbench        t0-avg-latency    : pass       2.31
    0_jd-hackbench        t0-min-latency    : pass       1.00
    0_jd-compile          t1-max-latency    : pass      14.00
    0_jd-compile          t1-avg-latency    : pass       3.37
    0_jd-compile          t1-min-latency    : pass       1.00
    0_jd-compile          t0-max-latency    : pass      14.00
    0_jd-compile          t0-avg-latency    : pass       3.37
    0_jd-compile          t0-min-latency    : pass       1.00
    0_jd-stress_ptrace    t1-max-latency    : pass       7.00
    0_jd-stress_ptrace    t1-avg-latency    : pass       2.19
    0_jd-stress_ptrace    t1-min-latency    : pass       2.00
    0_jd-stress_ptrace    t0-max-latency    : pass       9.00
    0_jd-stress_ptrace    t0-avg-latency    : pass       2.22
    0_jd-stress_ptrace    t0-min-latency    : pass       2.00
    0_cyclicdeadline      t1-max-latency    : fail    3462.00
    0_cyclicdeadline      t1-avg-latency    : pass    1594.00
    0_cyclicdeadline      t1-min-latency    : pass       1.00
    0_cyclicdeadline      t0-max-latency    : fail    3470.00
    0_cyclicdeadline      t0-avg-latency    : pass    1602.00
    0_cyclicdeadline      t0-min-latency    : pass       8.00
    0_cyclictest          t1-max-latency    : pass      11.00
    0_cyclictest          t1-avg-latency    : pass       2.00
    0_cyclictest          t1-min-latency    : pass       1.00
    0_cyclictest          t0-max-latency    : pass      13.00
    0_cyclictest          t0-avg-latency    : pass       3.00
    0_cyclictest          t0-min-latency    : pass       1.00
    0_pi-stress           pi-stress         : pass       0.00
    0_pmqtest             t3-2-max-latency  : pass      13.00
    0_pmqtest             t3-2-avg-latency  : pass       4.00
    0_pmqtest             t3-2-min-latency  : pass       2.00
    0_pmqtest             t1-0-max-latency  : pass      23.00
    0_pmqtest             t1-0-avg-latency  : pass       4.00
    0_pmqtest             t1-0-min-latency  : pass       2.00
    0_ptsematest          t3-2-max-latency  : pass      11.00
    0_ptsematest          t3-2-avg-latency  : pass       3.00
    0_ptsematest          t3-2-min-latency  : pass       2.00
    0_ptsematest          t1-0-max-latency  : pass      14.00
    0_ptsematest          t1-0-avg-latency  : pass       3.00
    0_ptsematest          t1-0-min-latency  : pass       2.00
    0_rt-migrate-test     t2-p98-avg        : pass      11.00
    0_rt-migrate-test     t2-p98-tot        : pass     581.00
    0_rt-migrate-test     t2-p98-min        : pass       9.00
    0_rt-migrate-test     t2-p98-max        : pass      28.00
    0_rt-migrate-test     t1-p97-avg        : pass      13.00
    0_rt-migrate-test     t1-p97-tot        : pass     654.00
    0_rt-migrate-test     t1-p97-min        : pass       8.00
    0_rt-migrate-test     t1-p97-max        : pass      34.00
    0_rt-migrate-test     t0-p96-avg        : pass   19213.00
    0_rt-migrate-test     t0-p96-tot        : pass  960652.00
    0_rt-migrate-test     t0-p96-min        : pass      13.00
    0_rt-migrate-test     t0-p96-max        : pass   20031.00
    0_signaltest          t0-max-latency    : pass      29.00
    0_signaltest          t0-avg-latency    : pass       8.00
    0_signaltest          t0-min-latency    : pass       4.00
    0_sigwaittest         t3-2-max-latency  : pass      16.00
    0_sigwaittest         t3-2-avg-latency  : pass       4.00
    0_sigwaittest         t3-2-min-latency  : pass       2.00
    0_sigwaittest         t1-0-max-latency  : pass      24.00
    0_sigwaittest         t1-0-avg-latency  : pass       4.00
    0_sigwaittest         t1-0-min-latency  : pass       2.00
    0_svsematest          t3-2-max-latency  : pass      13.00
    0_svsematest          t3-2-avg-latency  : pass       4.00
    0_svsematest          t3-2-min-latency  : pass       2.00
    0_svsematest          t1-0-max-latency  : pass      13.00
    0_svsematest          t1-0-avg-latency  : pass       4.00
    0_svsematest          t1-0-min-latency  : pass       2.00
    0_smoke-tests         linux-posix-lsb_release : pass
    0_smoke-tests         linux-posix-lscpu       : pass
    0_smoke-tests         linux-posix-ifconfig    : pass
    0_smoke-tests         linux-posix-vmstat      : pass
    0_smoke-tests         linux-posix-uname       : pass
    0_smoke-tests         linux-posix-pwd         : pass
    0_smoke-tests         linux-posix-lsb_release : pass
    0_smoke-tests         linux-posix-lscpu       : pass
    0_smoke-tests         linux-posix-ifconfig    : pass
    0_smoke-tests         linux-posix-vmstat      : pass
    0_smoke-tests         linux-posix-uname       : pass
    0_smoke-tests         linux-posix-pwd         : pass
    0_smoke-tests         linux-posix-lsb_release : pass
    0_smoke-tests         linux-posix-lscpu       : pass
    0_smoke-tests         linux-posix-ifconfig    : pass
    0_smoke-tests         linux-posix-vmstat      : pass
    0_smoke-tests         linux-posix-uname       : pass
    0_smoke-tests         linux-posix-pwd         : pass
    0_smoke-tests         linux-posix-lsb_release : pass
    0_smoke-tests         linux-posix-lscpu       : pass
    0_smoke-tests         linux-posix-ifconfig    : pass
    0_smoke-tests         linux-posix-vmstat      : pass
    0_smoke-tests         linux-posix-uname       : pass
    0_smoke-tests         linux-posix-pwd         : pass

Trying to figure this out from the web interface of LAVA is a bit
cumbersome.

> Do either of the CIP Core profiles include Python support?

See above.

> Which of the above would be valuable to run on CIP RT Kernels?

Basically you want to run all of the tests in rt-tests.

> A while back Daniel Wagner also did some work on a Jitterdebugger

Yeah, I was holding back a bit until I am happy with my setup and
workflow. One of the major limitations of the current test-definitions
is that it is difficult to set up background workload. (There is a small
cyclictest example at the end of this mail.) To make it short, I am not
too happy with my current version, but it works.

> Is anyone able to provide RT config/defconfigs for the x86 and arm [...]

If you are interested in my configs: I have configs for ARMv7 (bbb),
ARMv8 (RPi3) and x86_64 via my hack tool. It also builds the kernel,
because I didn't set up kernelci for it. In short, I get a complete
config/build/test setup via 'srt-build bbb lava'.
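As mentioned above, here is a minimal sketch of the merge_config.sh
step. The names are only placeholders, not the exact files my tool uses:
shmobile_defconfig stands in as a base defconfig for the RZ/G1 boards,
and "rt.config" is a hypothetical fragment with the RT options:

    # In the kernel source tree (e.g. linux-4.4.y-cip-rt).
    # rt.config is a placeholder fragment, e.g. CONFIG_PREEMPT_RT_FULL=y
    # plus whatever tracing/debug options you want.
    make ARCH=arm shmobile_defconfig
    ARCH=arm ./scripts/kconfig/merge_config.sh -m .config rt.config
    make ARCH=arm olddefconfig

The -m switch only merges the fragments; the final olddefconfig resolves
the remaining dependencies and can silently drop options whose
dependencies are not met, which is one of the typical catches.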
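And the cyclictest-under-load example mentioned above. The hackbench
loop and the flags are just one common way to do it, not the exact
settings from my LAVA jobs:

    # Background load: hackbench (ships with rt-tests) in a loop.
    (while :; do hackbench > /dev/null 2>&1; done) &
    load_pid=$!

    # cyclictest from rt-tests: lock memory, one measurement thread per
    # CPU (--smp), SCHED_FIFO priority 80, 200us interval, 5 minutes.
    cyclictest -m --smp -p80 -i200 -D 5m

    kill "$load_pid"

Without such a load the max latency numbers say very little, which is
what I meant with "not that interesting" above.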
Thanks,
Daniel