I'm designing a custom radar PCB heavily based on the IWR6843ISK - I was planning on pretty much copying the antenna from that board since this is more of a test design for us rather than something being designed with specific requirements in mind.
I'm aware of the various topics to discuss with the PCB manufacturer, having gone through both the Hardware Checklist and the 'TI mmWave Radar sensor RF PCB Design, Manufacturing and Validation Guide' (SPRACG5). That said, what kind of testing should I expect on the test coupon to confirm that the board is OK?
A standard offering from PCB fab houses seems to be placing a coupon on the panel and using a test device to measure the characteristic impedance and check how close it is to 50 ohm. However, that measurement isn't done at the antenna's operating frequency of 60-64 GHz; one of the manufacturers I talked to said their standard measurement is at 1 GHz.
After reading a little more on the subject, I've seen that a 50-ohm microstrip can be expected to show a decrease in impedance starting around 1 MHz to 10 MHz, levelling off at about 100 MHz. If that's the case, I suppose the 1 GHz measurement would be sufficient?
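To sanity-check my reading, here is how I understand the flattening, sketched with the standard RLGC transmission-line model: at low frequency the series resistance R and shunt conductance G dominate, while above roughly 100 MHz the impedance converges to the lossless value sqrt(L/C). The per-unit-length values below are purely illustrative (chosen so sqrt(L/C) = 50 ohm), not from any particular stackup:

```python
import math
import cmath

# Illustrative per-unit-length RLGC values for a nominal 50-ohm line
# (hypothetical numbers, not measured from any real microstrip):
R = 5.0      # ohm/m, series conductor resistance
L = 300e-9   # H/m,   series inductance
G = 1e-5     # S/m,   shunt dielectric conductance
C = 120e-12  # F/m,   shunt capacitance

def z0(f):
    """Characteristic impedance of an RLGC line at frequency f (Hz)."""
    w = 2 * math.pi * f
    return cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))

for f in (1e3, 1e6, 10e6, 100e6, 1e9):
    print(f"{f:>12.0f} Hz : |Z0| = {abs(z0(f)):6.1f} ohm")

# With these numbers |Z0| falls from hundreds of ohms at audio
# frequencies and settles near sqrt(L/C) = 50.0 ohm by ~100 MHz,
# so a 1 GHz coupon measurement already sees the asymptotic value.
print(f"sqrt(L/C) = {math.sqrt(L / C):.1f} ohm")
```

With these example values the 1 GHz and 100 MHz results are essentially identical, which is consistent with what I read about the impedance having stopped decreasing well before 1 GHz.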
Is there any reason to look for further validation of the PCB beyond this type of testing? I understand that other manufacturing tolerances should ideally be taken into account to determine absolute limits, as discussed in section 2.3.1 of the above-mentioned fabrication guide, but since we don't have specific antenna requirements for this design, I'm wondering if this is good enough.
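For what it's worth, my rough take on the tolerance analysis in section 2.3.1 is a worst-case corner sweep like the one below, using the common IPC-2141 microstrip approximation. All geometry and tolerance numbers here are hypothetical placeholders, not values from the guide or from our stackup:

```python
import math

def microstrip_z0(w, h, t, er):
    """Approximate microstrip impedance (IPC-2141 formula), dims in mils."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

# Hypothetical nominal geometry (mils) and fab tolerances:
w, h, t, er = 9.0, 5.7, 1.4, 4.2   # trace width, dielectric height, Cu thickness, Dk
dw, dh, der = 0.5, 0.5, 0.1        # assumed etch, prepreg, and Dk tolerances

nominal = microstrip_z0(w, h, t, er)
# Worst-case corners: Z0 falls as the trace widens / dielectric thins /
# Dk rises, and rises in the opposite corner.
lo = microstrip_z0(w + dw, h - dh, t, er + der)
hi = microstrip_z0(w - dw, h + dh, t, er - der)
print(f"Z0 nominal {nominal:.1f} ohm, worst-case range {lo:.1f}..{hi:.1f} ohm")
```

Even with these made-up tolerances the corner-to-corner spread is several ohms, which is why I'd like to understand whether the single coupon measurement is enough or whether we should be bounding the limits this way.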