Hi
I'm experimenting with the TMDSPLCKIT-V3 PLC kits by connecting them through 100m of RG58 coaxial cable as a model power line and injecting white Gaussian noise with a signal generator. I was hoping to see how the packet error rate (PER) and bit error rate (BER) respond for different modulation schemes as I increase the noise level, but the result from the GUI is never anything other than 0.00% PER when connected or 100% when disconnected (which is the spike at 200 in fig 3).
I've tried isolating the data packet on an oscilloscope (fig 1) and flooding the cable with noise (fig 2), which I'm sure should be causing some errors, but none show up in the GUI. In fact, injecting large amounts of noise seems to have very little effect on the reported SNR, which should definitely show noticeable variation.
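For context, here's roughly the behaviour I was expecting: even with simple BPSK over an AWGN channel, the textbook BER should fall smoothly with SNR rather than jumping between 0% and 100%. A quick sketch (this is just the standard Q-function formula, not anything specific to the TI kit or its GUI):

```python
import math

def bpsk_ber(ebn0_db):
    """Textbook BPSK bit error rate over AWGN: Q(sqrt(2*Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)           # convert dB to linear
    return 0.5 * math.erfc(math.sqrt(ebn0))  # Q(x) = 0.5*erfc(x/sqrt(2))

for snr_db in (0, 4, 8, 12):
    print(f"Eb/N0 = {snr_db:2d} dB -> BER ~ {bpsk_ber(snr_db):.2e}")
```

So as I raise the injected noise floor I'd expect the measured BER to climb through this kind of intermediate range, and the PER to follow, instead of staying pinned at one extreme.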
Could anyone tell me why this might be the case? Or, under what conditions would the zero-configuration GUI read an actual PER or BER other than 0% or 100%?
Many Thanks!


