26 Aug 2007

Continued investigating the issue of counting the time for t1 and t2. Previously (15th July report), when the delay was set to 1 ms the count was correct but only an integer, so we could not get the decimal places and hence could not see the change in temperature. After changing to a 1 µs delay, the counts for t1 and t2 came out at only roughly 2 ms and 5 ms, compared to the expected values of ~9 ms and 17 ms. We guessed this is because the execution time of the statement cnt = cnt + 1 becomes significant when the delay is only 1 µs. And when I tried outputting the count value to the screen to check whether the count was correct, the t1 and t2 times I got were even smaller. This shows that all the statements in the count loop take up time on the order of microseconds, and the smaller the delay we use, the less accurate t1 and t2 become.

To minimize this problem, I referred back to the datasheet of the temperature sensor and used the following figure:
As t1 does not change much within my temperature control range (22-40 °C), its slope being only about 0.001 ms/°C, and since the computation capability of the program cannot guarantee accuracy when the delay is 1 µs, we can make the delay for t1 slightly larger so that the count comes much closer to the accurate value. For t2, the slope is about 0.075 ms/°C, so we could set its delay to about 100 µs as well and see how much that makes the count change. This is also because the change in t2 is much more significant than in t1, and t2 mainly decides the temperature value.

From some quantitative tests trying out different delays for counting t1 and t2, there is no single accurate answer. The dilemma: to capture a change of 0.5 °C, the delay has to be as small as about 36 µs for t2; but when the delay is that small, the time taken by the count statement becomes significant and introduces error into the total value we count. Conversely, if we want the count statement's time to be insignificant relative to the total t1 or t2, we have to set the delay large, and then we cannot capture the 0.5 °C change.

So far I can see only two methods that could solve this problem. One is to change to using a timer to count: the timer module in the PIC counts in the background, making it possible to get the actual elapsed time. The other is to stick with the counting method in my program, but choose the delay time that makes the counted t2 as close to the real value as possible, which is about 36 µs, so that it can catch a 0.5 °C change in temperature.

The following are some references I collected in order to calculate the elapsed time:
On the PIC16F77, every instruction cycle consists of 4 oscillator periods; for an oscillator frequency of 4 MHz this gives a normal instruction execution time of 1 µs. A prescaler can be applied during setup_timer_1(). According to the CCS manual, with an internal clock at 20 MHz (this seems to refer to the oscillator frequency) and the T1_DIV_BY_8 mode, the timer will increment every 1.6 µs and overflow every 104.8576 ms. => the "internal clock" frequency quoted there is the same as the oscillator frequency.

Fosc is the oscillator (crystal) clock rate, = 20 MHz in my case
instruction cycle time = 4/Fosc
internal clock frequency = Fosc/4

oscillator crystal (external clock input)

the prescaler divides the clock down, i.e. it multiplies the timer's tick period by the chosen division factor

For my case, the 20 MHz clock gives an instruction time of 1/(5 MHz) = 0.2 µs; choosing T1_DIV_BY_2, the timer tick is (1/5 MHz) × 2 = 0.4 µs, and a full 65536-count overflow takes roughly 26 ms.
