IRIG standards (Inter-Range Instrumentation Group) define a family of timecode formats for synchronising devices over a local network. Each standard specifies a frame format and a transmission method (carrier frequency, modulation, etc.). The standards differ in both the resolution and the accuracy they can achieve.
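As an illustration of such a frame format: in the unmodulated (DC level shift) variant of IRIG B, each 10 ms bit slot carries a pulse whose width encodes the symbol, nominally 2 ms for a binary 0, 5 ms for a binary 1 and 8 ms for a position marker. Here is a minimal decoding sketch; the classification thresholds are our own assumption (midpoints between the nominal widths), not part of the standard:

```python
# Sketch: classify IRIG-B (DC level shift) pulses by their width.
# Nominal widths: ~2 ms -> binary 0, ~5 ms -> binary 1, ~8 ms -> marker.

def classify_pulse(width_ms: float) -> str:
    """Map a measured pulse width (in milliseconds) to an IRIG-B symbol."""
    if width_ms < 3.5:   # midpoint between 2 ms and 5 ms (assumed threshold)
        return "0"
    if width_ms < 6.5:   # midpoint between 5 ms and 8 ms (assumed threshold)
        return "1"
    return "marker"      # nominal 8 ms position identifier

# Illustrative measured widths, in milliseconds:
print([classify_pulse(w) for w in (8.0, 2.1, 5.0, 2.0)])
# -> ['marker', '0', '1', '0']
```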
Accuracy and resolution?
Here are the definitions of these two concepts:
- The resolution of a time synchronisation protocol is the smallest unit of time it can represent and transmit.
- The accuracy of a time synchronisation protocol is the maximum difference between a network clock and the reference clock.
Together, these two concepts define the quality of time synchronisation on a network. They are independent: it is possible to have high resolution but low accuracy, and vice versa. For example, a protocol could send timestamps expressed in whole seconds while achieving nanosecond accuracy, or send timestamps containing microseconds while being accurate only to the second. Ideally, both parameters should be optimised simultaneously to achieve the best possible time synchronisation.
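As a toy illustration of this independence (all numbers invented):

```python
import math

t_ref = 1234.567890  # reference time in seconds (illustrative value)

# Fine resolution, poor accuracy: microsecond digits, but the clock is 2 s off.
t_fine = round(t_ref + 2.0, 6)

# Coarse resolution, good accuracy: whole seconds only, but never more
# than 1 s away from the reference.
t_coarse = math.floor(t_ref)

print(f"fine-resolution clock offset:   {abs(t_fine - t_ref):.6f} s")    # ~2.0
print(f"coarse-resolution clock offset: {abs(t_coarse - t_ref):.6f} s")  # <1.0
```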
For a given network, these two factors depend not only on the chosen protocol but also on the hardware used and its configuration. Both metrics can therefore be improved through better protocol settings, better hardware configuration, or a hardware upgrade.
Resolution of IRIG standards
Since IRIG is a family of standards, there is no single IRIG resolution. Each standard is designated by a format letter (A, B, D, E, G or H) followed by three digits. The second digit, which specifies the carrier frequency, determines the resolution of the standard, which ranges from 1 microsecond to 1 minute.
| Second digit | Resolution |
|---|---|
| 0 | Variable (see below) |
| 1 | 10 ms |
| 2 | 1 ms |
| 3 | 0.1 ms |
| 4 | 10 µs |
| 5 | 1 µs |
If the second digit is 0, the resolution is equal to the interval between two index counts. This interval ranges from 0.1 milliseconds for the IRIG G standard to 1 minute for the IRIG D standard.
Beyond the protocol specifications, the reference clock also plays a large part in the final resolution: if the reference clock's resolution is coarser than that of the IRIG standard, the effective resolution is that of the clock.
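A small sketch of how the resolution could be looked up from a designation such as B122 (format letter plus three digits), including the index-count-interval fallback for a second digit of 0 and the cap imposed by the reference clock. The function names are ours, not part of any IRIG specification or vendor API:

```python
# Resolution implied by the second digit of an IRIG designation, in seconds.
CARRIER_RESOLUTION = {
    "1": 10e-3,   # 100 Hz carrier  -> 10 ms
    "2": 1e-3,    # 1 kHz carrier   -> 1 ms
    "3": 0.1e-3,  # 10 kHz carrier  -> 0.1 ms
    "4": 10e-6,   # 100 kHz carrier -> 10 µs
    "5": 1e-6,    # 1 MHz carrier   -> 1 µs
}

# Index count interval per format, used when the second digit is 0 (no carrier).
INDEX_COUNT_INTERVAL = {
    "A": 1e-3, "B": 10e-3, "D": 60.0, "E": 0.1, "G": 0.1e-3, "H": 1.0,
}

def standard_resolution(designation: str) -> float:
    """Resolution in seconds of a designation such as 'B122' or 'D001'."""
    fmt, second_digit = designation[0], designation[2]
    if second_digit == "0":
        return INDEX_COUNT_INTERVAL[fmt]
    return CARRIER_RESOLUTION[second_digit]

def effective_resolution(designation: str, clock_resolution: float) -> float:
    """The coarser of the standard's and the reference clock's resolution."""
    return max(standard_resolution(designation), clock_resolution)

print(standard_resolution("B122"))         # 0.001 (1 ms)
print(effective_resolution("B122", 0.01))  # 0.01: the clock is the limit
```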
Accuracy of IRIG standards
The accuracy of IRIG standards depends on several factors, the quality of the reference clock at the start of the chain being crucial: if synchronisation is carried out against an inaccurate time source, the lost accuracy cannot be recovered further down the chain.
To guarantee the highest possible accuracy, avoid anything that could introduce latency into communications. In particular, minimise the electromagnetic interference to which the network is exposed, and check the hardware and cabling regularly to make sure they are not damaged.
To measure accuracy, compare the time of the reference clock with the time of the clocks on the machines in the IRIG network; this confirms that they are correctly synchronised. With IRIG standards, timestamps are broadcast continuously over the network, but not necessarily at a high rate, so a local clock can drift between two updates. It is therefore important to measure the network's accuracy regularly to make sure it meets expectations.
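A minimal sketch of such a regular check, assuming two hypothetical callables that return the reference time and a device's time as Unix timestamps:

```python
import time
from typing import Callable

def measure_accuracy(read_reference: Callable[[], float],
                     read_device: Callable[[], float],
                     samples: int = 10,
                     interval_s: float = 1.0) -> float:
    """Return the largest observed offset (in seconds) between the two clocks.

    Per the definition above, accuracy is the maximum difference between
    a network clock and the reference clock over the sampling window.
    """
    worst = 0.0
    for _ in range(samples):
        offset = abs(read_device() - read_reference())
        worst = max(worst, offset)
        time.sleep(interval_s)
    return worst

# Illustrative stand-ins: the system clock as "reference", and a
# device clock drifting 50 µs behind it.
def ref() -> float:
    return time.time()

def dev() -> float:
    return time.time() - 50e-6

print(f"worst offset: {measure_accuracy(ref, dev, samples=3, interval_s=0.1):.6f} s")
```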
To conclude
To obtain the best accuracy and resolution on a network, it is important to choose the right IRIG standard and implement it correctly. This means configuring the network equipment so that it introduces as little delay as possible, and monitoring the network regularly to make sure there are no unexpected delays.
With over 150 years of expertise in time management and a presence in more than 140 countries, Bodet Time is a French leader in time synchronisation and time-frequency solutions. The Netsilon range of time servers includes IRIG timecode generators and receivers.