'PC timing industry standards'
Industry standard question for every professional out there!
I would like to keep my design theories current with industry trends.
We use high-speed instruments interfaced to PCs, and as I redesign the
system I am aiming to move all timing-critical functions to an array
of PICs, with the PC simply communicating with them to retrieve data.
Does this method of releasing the PC from timing tasks jibe with the rest
of you engineers' sense of design, or not?
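For what it's worth, a minimal sketch of what the PC-side link to such a PIC array might look like: a small checksummed packet format, so the PC can pull data at its leisure while the PICs keep their own timing. The STX/length/XOR framing here is hypothetical, not any particular product's protocol.

```python
def build_packet(payload: bytes) -> bytes:
    """Frame a payload as STX, length byte, payload, XOR checksum.

    Hypothetical framing for a PC <-> PIC serial link.
    """
    if len(payload) > 255:
        raise ValueError("payload too long for one-byte length field")
    body = bytes([len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return b"\x02" + body + bytes([checksum])

def parse_packet(packet: bytes) -> bytes:
    """Validate framing and checksum; return the payload or raise ValueError."""
    if len(packet) < 3 or packet[0] != 0x02:
        raise ValueError("bad frame")
    body, checksum = packet[1:-1], packet[-1]
    calc = 0
    for b in body:
        calc ^= b
    if calc != checksum or body[0] != len(body) - 1:
        raise ValueError("corrupt packet")
    return bytes(body[1:])
```

The point is that the PC never touches timing: it just asks for the latest reading whenever it gets around to it, and a bad byte on the wire is caught by the checksum rather than by a missed deadline.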
In message <merc.rx.uga.edu>, 3F609DC6DCBMITVMA.MIT.EDU writes:
> Industry standard question for every professional out there!
> I would like to keep my design theories current with industry trends.
> We use high-speed instruments interfaced to PCs, and as I redesign the
> system I am aiming to move all timing-critical functions to an array
> of PICs, with the PC simply communicating with them to retrieve data.
> Does this method of releasing the PC from timing tasks jibe with the rest
> of you engineers' sense of design, or not?
In our system, the PC is used as the user interface and data
logger. The time-critical work is done in a separate controller
built around a Harris RTX Forth chip and an ADSP2111 DSP. There are
several PICs in the system, including one whose only function is
to generate a 5 kHz sine wave to excite LVDTs.
Our systems control machines that pull apart metals and other
materials in a controlled manner, sometimes with several hundred
kN of force.
Some of the tests run for several months, with extensions of
the material measured at sub-micron levels.
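A PIC whose only job is a 5 kHz sine would most likely do it with a lookup table and a phase accumulator (direct digital synthesis). This is a sketch of that idea in Python, not the original firmware; the 40 kHz update rate and 24-bit accumulator width are assumptions for illustration.

```python
import math

SAMPLE_RATE = 40_000   # hypothetical DAC update rate: 8 samples per 5 kHz cycle
FREQ_HZ = 5_000
TABLE_SIZE = 256

# 8-bit sine lookup table, as a small PIC might hold in program memory
SINE_TABLE = [round(127.5 + 127.5 * math.sin(2 * math.pi * i / TABLE_SIZE))
              for i in range(TABLE_SIZE)]

def sine_samples(n, phase_step=None):
    """Yield n DAC codes using a 24-bit fixed-point phase accumulator (DDS)."""
    if phase_step is None:
        # step = freq / sample_rate * 2^24; exact here since 40 kHz / 5 kHz = 8
        phase_step = round(FREQ_HZ / SAMPLE_RATE * (1 << 24))
    acc = 0
    for _ in range(n):
        yield SINE_TABLE[acc >> 16]   # top 8 bits of the accumulator index the table
        acc = (acc + phase_step) & 0xFFFFFF
```

On the real chip the per-sample work is just an add and a table read, which is why even a tiny PIC can hold a clean excitation frequency without bothering the main controller.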
There was no way we were going to let DOS or Windoze anywhere
near the control loop. The controller will carry on with the test
should the PC, DOS or Windoze decide to stop playing ball.
Prior to this we had a custom-made user interface with an LCD
graphics screen and a small keypad, all run from the RTX. The move
to a PC and Windoze, whilst customer-driven, had the added
benefit of releasing the RTX from display and keyboard
duties, leaving more time for it to do what it was intended to
do: control the machine.
In developing my programmer, I ran into many of the problems everybody
is talking about here (i.e., different delays for different hardware in
different PCs) and finally gave up.
Instead, I did just as you suggest: I pass the data to the PIC and let the
PIC sort it out (actually, I use a terminal-emulator download to provide the
delay for programming the PIC).
It works really well (I have never had a problem), is completely device-independent
(we were screwing around at work and ended up running it off an SGI Origin
2000), and I didn't have to write any PC software.
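The terminal-emulator trick amounts to paced ASCII download: the emulator inserts a fixed inter-character delay, and the PIC uses that gap as its programming timing. A sketch of the idea, assuming a hypothetical `send_byte` callback standing in for the serial port:

```python
import time

def paced_send(data, send_byte, char_delay_s=0.002):
    """Send bytes one at a time with a fixed inter-character delay,
    the way a terminal emulator's paced ASCII download works.

    send_byte is a hypothetical stand-in for writing one byte to the port;
    the 2 ms default delay is illustrative, not a required value.
    """
    for b in data:
        send_byte(b)
        time.sleep(char_delay_s)
```

Any terminal emulator with a character-delay setting gives you this for free, which is why no custom PC software was needed: the timing the PIC sees is set by the emulator, not by whichever PC happens to be running it.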
Just my two cents,
"I don't do anything that anybody else in good physical condition and
unlimited funds couldn't do" - Bruce Wayne