Where do the requirements for CPU clock accuracy come from?

General discussions about V-USB, our firmware-only implementation of a low speed USB device on Atmel's AVR microcontrollers
cpldcpu
Rank 2
Posts: 44
Joined: Sun Nov 10, 2013 11:26 am

Where do the requirements for CPU clock accuracy come from?

Post by cpldcpu » Wed Jan 22, 2014 11:35 pm

I am wondering about the reasoning behind the clock accuracy requirements of V-USB. After all, the USB specification itself is much more relaxed:

  • High speed data is clocked at 480.00Mb/s with a data signalling tolerance of ± 500ppm.
  • Full speed data is clocked at 12.000Mb/s with a data signalling tolerance of ±0.25% or 2,500ppm.
  • Low speed data is clocked at 1.50Mb/s with a data signalling tolerance of ±1.5% or 15,000ppm.
from http://www.beyondlogic.org/usbnutshell/usb2.shtml

The USB spec allows a clock deviation of ±1.5% for low-speed devices, yet even the "PLL" implementations of V-USB at 12.8 MHz and 16.5 MHz require ±1%. Why is that?

As I understand it, there are two sources of timing error:
a) jitter caused by discrete sampling of the input during SYNC, and
b) clock-rate divergence between host and client.

For the 16.5 MHz implementation, each USB bit-time equals 11 clock cycles.

To sample the center of a bit, we have to sample about 6 cycles after the last edge of the SYNC pattern; these 6 cycles are also the total allowable clock-error margin.
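
Spelled out in a few lines of throwaway C (purely my own arithmetic, not code from the V-USB sources):

    #include <stdio.h>

    int main(void)
    {
        double cycles_per_bit = 16.5e6 / 1.5e6;        /* = 11 CPU cycles per low-speed bit        */
        double sample_point   = cycles_per_bit / 2.0;  /* = 5.5 -> sample ~6 cycles after the edge */
        printf("%.1f cycles per bit, ideal sample point %.1f cycles after the edge\n",
               cycles_per_bit, sample_point);
        return 0;
    }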

a)
"waitForK" samples the input every two cycles. That means that up to one cycle jitter is introduced.
Furthermore, another cycle can be added due to the phase difference between cpu clock and host.

Therefore, jitter reduces the clock-error margin by two cycles to 4.

b)
Since the clocks are resynchronized at the beginning of every packet, only the clock deviation accumulated between the end of SYNC and the beginning of EOP (SE0) is relevant. EOP consists of two bits of SE0 and is therefore immune to single-bit clock errors.
The maximum data packet payload for low-speed USB is 8 bytes. The total relevant packet length is therefore 1 PID + 8 data + 2 CRC16 = 11 bytes, or only 9 bytes when the CRC16 is ignored.
This equals 88 bits. In the hypothetical worst case all data bytes are 0xFF, which is practically impossible; in that case 14 bits are "stuffed", resulting in a maximum critical packet length of 102 bits, or 84 bits without the CRC.

102 bit-times equal 1122 CPU clock cycles. If a maximum deviation of 4 clock cycles is allowed, the allowed CPU clock deviation is 4/1122 ≈ 0.35%. When the CRC16 is ignored, the maximum allowed deviation is 4/924 ≈ 0.43%. Note that almost all V-USB projects ignore the CRC16.
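
The whole receive budget as a small, checkable C program (again my own back-of-the-envelope arithmetic, nothing taken from the V-USB sources):

    #include <stdio.h>

    int main(void)
    {
        const double cycles_per_bit = 16.5e6 / 1.5e6; /* 11 CPU cycles per low-speed bit                          */
        const int    bits_with_crc  = 11 * 8 + 14;    /* 1 PID + 8 data + 2 CRC bytes, plus worst-case stuffing   */
        const int    bits_no_crc    =  9 * 8 + 12;    /* same without the CRC16                                   */
        const double margin         = 4.0;            /* 6 cycles minus 2 for jitter (part a)                     */

        printf("with CRC16   : %.3f%%\n", 100.0 * margin / (bits_with_crc * cycles_per_bit));
        printf("without CRC16: %.3f%%\n", 100.0 * margin / (bits_no_crc  * cycles_per_bit));
        return 0;
    }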

So, my conclusion is that the allowed clock deviation is 0.35% or 0.43% at 16.5 MHz for receiving data, and 1.5% for sending.

Did I miss anything?

From my experience, it is possible to adjust the internal RC oscillator of the newer ATtinys to within these specs. I have tried several corner cases (ATtiny84 at 12 MHz, ATtiny10 at 12 MHz, ATtiny85 at 12/16 MHz, ATtiny841 at 16 MHz) and never found an obvious timing issue.

Of course, there are some potential pitfalls:
- It is possible that the host itself does not use an accurate clock. In the worst case, both timing errors would then add up. However, this is not relevant when the RC oscillator is calibrated from the keepalive pulses.
- The RC oscillator is not immune to long-term drift. This is a serious issue and requires either an application that is only active for a short time (e.g. a bootloader) or continuous recalibration (a sketch of such a calibration routine follows below). The newest-generation oscillator in the ATtiny841 is temperature compensated and should be more stable.
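
For reference, the keepalive-based calibration mentioned above boils down to something like the sketch below: a binary search on OSCCAL until the measured 1 ms USB frame length matches the expected number of CPU cycles. This is only an illustration of the idea; the real implementation is V-USB's libs-device/osccal.c (or the osccalASM.S mentioned in a later post), and the target constant is simply what I read out of osccal.c, so check it against your copy.

    /* Sketch only -- modelled on V-USB's libs-device/osccal.c, not a drop-in
     * replacement. Requires USB_CFG_HAVE_MEASURE_FRAME_LENGTH set to 1 in
     * usbconfig.h so that usbMeasureFrameLength() is compiled in. */
    #include <avr/io.h>
    #include "usbdrv.h"

    static void calibrateOscillator(void)
    {
        uint8_t  step = 128;
        uint8_t  trialValue = 0;
        /* usbMeasureFrameLength() returns a count proportional to F_CPU; this
         * target value is the constant used in osccal.c. */
        unsigned targetValue = (unsigned)(1499 * (double)F_CPU / 10.5e6 + 0.5);
        unsigned x;

        do {                              /* binary search over the OSCCAL range            */
            OSCCAL = trialValue + step;   /* note: the register is OSCCAL0 on the ATtiny841 */
            x = usbMeasureFrameLength();  /* proportional to the real clock rate            */
            if (x < targetValue)          /* oscillator still too slow                      */
                trialValue += step;
            step >>= 1;
        } while (step > 0);
        OSCCAL = trialValue;
        /* The real osccal.c follows up with a +/-1 neighbourhood search; devices
         * with a split OSCCAL range need additional care. */
    }

The routine is meant to be called once while the bus is idle after a USB reset; the V-USB example projects show where to hook it.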

cpldcpu
Rank 2
Posts: 44
Joined: Sun Nov 10, 2013 11:26 am

Re: Where do the requirements for CPU clock accuracy come fr

Post by cpldcpu » Thu Jan 30, 2014 8:36 am

No answer? Where did all the USB experts go? :(

dnhkng

Re: Where do the requirements for CPU clock accuracy come fr

Post by dnhkng » Tue Jul 22, 2014 7:13 pm

Could you post a blog on getting the ATTINY841 working with V-USB?

Thanks in advance!

blargg
Rank 3
Posts: 102
Joined: Thu Nov 14, 2013 10:01 pm

Re: Where do the requirements for CPU clock accuracy come fr

Post by blargg » Tue Jul 22, 2014 9:39 pm

So, my conclusion is that the allowed clock deviation is 0.35% or 0.43% at 16.5 MHz for receiving data, and 1.5% for sending.


I recently did some experiments with a high-precision clock generator (AD9850) to see how much deviation was tolerated. I tested the 16 MHz version of V-USB, and it worked reliably (receiving and sending data) from 15.94 MHz to 16.04 MHz, which is -0.375% to +0.25% error tolerance. I didn't test the 16.5 MHz version with its PLL. I was going to eliminate the one-cycle jitter introduced by waitForK and wanted to measure the improvement it made.

cpldcpu
Rank 2
Posts: 44
Joined: Sun Nov 10, 2013 11:26 am

Re: Where do the requirements for CPU clock accuracy come fr

Post by cpldcpu » Wed Jul 23, 2014 5:45 am

dnhkng wrote:Could you post a blog on getting the ATTINY841 working with V-USB?


Actually it is no more complicated than getting it to work with an ATtiny85. You just have to include the osccal code (you can use my optimized osccalASM.s*) and set F_CPU to 12 MHz or 12.8 MHz. The latter results in larger code size, but may be a bit more reliable.

*Unfortunately obdev does not seem to accept submissions to V-USB anymore, but you can find the latest version of osccalasm.s with some fixes by blargg here: https://github.com/micronucleus/micronu ... sccalASM.S
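
(Illustrative only: F_CPU is passed in Hz, usually from the Makefile via -DF_CPU=12000000 or -DF_CPU=12800000, and a compile-time guard like the following one, which is my own suggestion rather than part of V-USB, catches a wrong setting early:)

    #if !defined(F_CPU) || (F_CPU != 12000000 && F_CPU != 12800000)
    #   error "Set F_CPU to 12000000 or 12800000 for crystal-less operation on the ATtiny841"
    #endif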
Last edited by cpldcpu on Wed Jul 23, 2014 5:52 am, edited 4 times in total.

cpldcpu
Rank 2
Posts: 44
Joined: Sun Nov 10, 2013 11:26 am

Re: Where do the requirements for CPU clock accuracy come fr

Post by cpldcpu » Wed Jul 23, 2014 5:47 am

blargg wrote:
So, my conclusion is that the allowed clock deviation is 0.35% or 0.43% at 16.5 MHz for receiving data, and 1.5% for sending.


I recently did some experiments with a high-precision clock generator (AD9850) to see how much deviation was tolerated. I tested the 16 MHz version of V-USB, and it worked reliably (receiving and sending data) from 15.94 MHz to 16.04 MHz, which is -0.375% to +0.25% error tolerance. I didn't test the 16.5 MHz version with its PLL. I was going to eliminate the one-cycle jitter introduced by waitForK and wanted to measure the improvement it made.


That's very interesting! Did you include CRC checking? If yes, then your numbers seem to be very close to the predicted deviation.

blargg
Rank 3
Posts: 102
Joined: Thu Nov 14, 2013 10:01 pm

Re: Where do the requirements for CPU clock accuracy come fr

Post by blargg » Wed Jul 23, 2014 8:49 am

Yeah, the V-USB code checked the CRC of every packet, and I had the host send 1000 config packets and verify the one-byte reply to each. As far as I could tell I implemented single-cycle synchronization, but it only improved the margin by 0.01 MHz in either direction. I have put the project aside until I can practice writing single-cycle synchronization code in a test environment and become more confident that the code really does what I think it does. I think the PLL versions (16.5 MHz, 12.8 MHz) are probably superior, since from what I gather they resynchronize to edges throughout the packet.

cpldcpu
Rank 2
Posts: 44
Joined: Sun Nov 10, 2013 11:26 am

Re: Where do the requirements for CPU clock accuracy come fr

Post by cpldcpu » Wed Jul 23, 2014 8:00 pm

Very interesting, so the theoretical limit of ±0.35% with CRC is spot on. It should be ±0.43% when the CRC of incoming packets is ignored.

An additional cycle of margin should increase the allowable clock deviation to 0.44% with CRC (5/1122). At 16 MHz this is ±71 kHz, while 0.35% equals ±57 kHz. The expected improvement is therefore only ~14 kHz of extra tolerance, so an observed improvement of 10 kHz is well within measurement accuracy.
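
Spelled out (my own arithmetic again, just for checking):

    #include <stdio.h>

    int main(void)
    {
        const double cycles = 102 * 11.0;  /* 1122 critical cycles, as derived above */
        for (int margin = 4; margin <= 5; margin++)
            printf("%d cycles of margin: %.3f%% = +/-%.0f kHz at 16 MHz\n",
                   margin, 100.0 * margin / cycles, 16e3 * margin / cycles);
        return 0;
    }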

