Now that we have the equipment list in mind, we can start programming the PIC. The software I will be using for this project includes 'Proteus' and 'mikroC PRO for PIC'. To follow the code you will need some prior knowledge of C, but I will try to explain things the best I can.
A time switch is a product that deals with time, so we need a mechanism to measure or count time. Once we have that, the rest of the project is just coding. How can we count one second on a PIC16F628A? What methods are available?
- We can use the command 'delay_ms(1000);' in MicroC to generate a delay of 1000 milliseconds.
- We can use the Timer modules in the PIC and use their overflow interrupts to count time.
Method 1:
This is straightforward and easy. Still, looking at what actually happens inside the PIC when we execute this command gives us better insight into its capabilities.
You need to understand the PIC architecture in order to interpret any command.
Figure 1: How a delay is generated in a PIC.
The flowchart above shows one way to generate a delay in a microcontroller: by choosing appropriate values for the loop counters A, B and C we can produce a delay of a given length. But with this method the processor is always busy updating those registers, so the user cannot change any configuration without interrupting the delay. In other words, the processor cannot multitask while the delay runs. That is why I prefer not to use this method.
Method 2: [Please go through the PIC16F628A datasheet]
To use this method you must know about the 'Timer' modules available in the PIC. There are three: Timer0, Timer1 and Timer2.
'Timer0' is an 8-bit register that can be clocked from the device clock, with or without a prescaler. Whenever Timer0 overflows it generates an interrupt, and by handling these interrupts we can easily count time. The user can still interact with the system, because the main program keeps running between interrupts.
That is, Timer0 will overflow roughly every 51us.
(system clock = 20MHz, instruction clock = 20MHz / 4 = 5MHz,
256 / 5MHz = 51.2us, about 51us)
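To make this concrete, a minimal Timer0 interrupt setup in mikroC PRO for PIC might look like the sketch below. The register and bit names (OPTION_REG, TMR0, INTCON, T0IF, T0IE, GIE) come from the PIC16F628A datasheet, but treat the exact compiler spellings and the OPTION_REG value as assumptions to verify against the datasheet and compiler headers; this is not the final TS code.

```c
/* Sketch: Timer0 overflow interrupt on a PIC16F628A (mikroC PRO for PIC).
   Assumed configuration: Timer0 driven by the instruction clock,
   prescaler assigned to the WDT (i.e. no prescaler for Timer0). */

unsigned int overflows = 0;

void interrupt() {
    if (INTCON.T0IF) {        /* Timer0 overflow caused the interrupt? */
        overflows++;          /* count one more overflow */
        INTCON.T0IF = 0;      /* clear the flag for the next overflow */
    }
}

void main() {
    OPTION_REG = 0b11011000;  /* T0CS = 0 (instruction clock), PSA = 1 */
    TMR0 = 0;                 /* start counting from zero */
    INTCON.T0IE = 1;          /* enable Timer0 overflow interrupt */
    INTCON.GIE  = 1;          /* enable global interrupts */
    while (1) {
        /* main program: free to interact with the user */
    }
}
```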
So it will overflow approximately 19,600 times per second.
(1 / 51us ≈ 19,600)
Let us assume that handling each interrupt takes 10us. Then each overflow consumes 61us in total (51 + 10), which gives around 16,400 overflows per second. Out of each second, that leaves the user a window of about 0.84 seconds to interact with the device, without disturbing the time-counting process.
(51us * 16,400 ≈ 0.84 seconds)
If this all sounds Greek, I am truly sorry, but going through the datasheet and understanding the microcontroller will solve the problem. In the TS device I will use Method 2 to count seconds.
The next step will be coding. :)


