PICList Thread
'Introduction, Questions...'
1998\02\12@185757 by Jon Hylands

Hello all,

I have recently subscribed to this list, and thought I would introduce
myself and ask a couple of questions.

Anyway, here are my questions:

- Has anyone used the 17C756? Comments? I plan on using one as the
main CPU in my project (MicroSeeker - see my sig for details).

- Microchip seems to claim in the data sheet for it that the 17C756
*does* handle I2C master mode in hardware. Is that the case, or is it
like the 16-series chips and only does slave mode in HW?

- For my project, I am planning on using a handful of 12C509's to
handle interfacing with non-I2C hardware (like the compass and the
FLASH memory). I will need to do a software implementation of an I2C
slave on this processor. I have heard this is a problem, but I'm not
sure why. Given that these chips will be dedicated to doing this, is
there some other problem I am not aware of?

Thanks in advance...

Later,
Jon

--------------------------------------------------------------
  Jon Hylands      Jon@huv.com      http://www.huv.com/jon

 Project: Micro Seeker (Micro Autonomous Underwater Vehicle)
          http://www.huv.com

1998\02\13@084109 by Keith Howell

Dear Jon,

I have not used the 17C756.
Project MicroSeeker sounds neat.

Regarding I2C, arbitration is done on a bit by bit basis so
the CPU needs to supervise each bit's progress. So you don't gain
much by having a byte-to-bit hardware interface because you'd only
have to write software to manage that.
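
For a single-master bus, that bit-level work is small enough that a
software master is usually fine. A minimal C sketch of one byte of a
master write, assuming hypothetical SDA_HIGH/SDA_LOW/SCL_HIGH/SCL_LOW/
SDA_READ macros that drive the open-collector pins and a delay_us()
routine for bus timing (no multi-master arbitration check shown):

    /* Bit-banged I2C master: clock one byte out and return the slave's ACK.
       The pin macros and delay_us() are assumed, not from any library. */
    static int i2c_write_byte(unsigned char b)
    {
        int bit;

        for (bit = 7; bit >= 0; bit--) {
            if (b & (1 << bit))
                SDA_HIGH();          /* release SDA (pulled high externally) */
            else
                SDA_LOW();
            SCL_HIGH();              /* clock the bit out */
            delay_us(5);
            SCL_LOW();
            delay_us(5);
        }
        SDA_HIGH();                  /* release SDA so the slave can ACK */
        SCL_HIGH();
        bit = SDA_READ();            /* 0 = ACK, 1 = NACK */
        SCL_LOW();
        return bit == 0;
    }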

An asynchronous UART is worth its silicon because it handles
time-critical stuff like baud rate generation, and the data
is sent out over relatively long time periods.
An I2C data transfer is in the order of microseconds, and is
far less time critical, so masters would not gain much.

Anyone could legitimately claim to have an I2C interface
if they have only got a pair of open-collector lines.

If Microchip _have_ got I2C master duty done in byte-oriented
hardware (which I doubt), this is no big bonus.

Having an I2C _slave_ interface done in hardware _is_ worth
having in silicon, because it will sit around and transfer
whole bytes before interrupting the CPU, instead of the CPU
having to watch and respond to the I2C signals within
5 microseconds.
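
To make the contrast concrete, with a byte-oriented slave module the
firmware shrinks to an interrupt handler along these lines. This is
only a sketch: the register and bit names (SSPBUF, the SSPSTAT D/A and
R/W bits, CKP, SSPIF) follow the usual PIC SSP convention, and the
interrupt syntax is compiler-specific, so check the device header for
the real spellings:

    /* Sketch of a byte-oriented I2C slave interrupt handler.
       Register/bit names are assumed, typical of a PIC SSP module. */
    static unsigned char rx_buf[8], tx_buf[8];
    static unsigned char rx_idx, tx_idx;

    void interrupt isr(void)
    {
        unsigned char dummy;

        if (SSPIF) {                      /* module has finished a byte */
            SSPIF = 0;
            if (!D_A) {                   /* address byte just received */
                dummy = SSPBUF;           /* read it to clear the buffer */
                rx_idx = tx_idx = 0;
                if (R_W) {                /* master wants to read: preload */
                    SSPBUF = tx_buf[tx_idx++];
                    CKP = 1;              /* release the stretched clock */
                }
            } else if (!R_W) {            /* master wrote a data byte to us */
                rx_buf[rx_idx++ & 7] = SSPBUF;
            } else {                      /* master is reading from us */
                SSPBUF = tx_buf[tx_idx++ & 7];
                CKP = 1;
            }
        }
    }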

> I will need to do a software implementation of an I2C slave
> I have heard this is a problem, but I'm not sure why.

Because the Microchip documentation and example code is crap.
Which is pretty shameful for a product conceived around
25 years ago. They don't go into it in anywhere like
adequate detail.

I have actually got my PIC16C65 working as both I2C slave
and master, after much effort. I can send you the e-mails
on the subject rather than repeat them here.
This will save you a lot of time-consuming headaches.

Changing subject, tell us about your cool app?
Is it going to be braving the North Sea (been there done that)
or just retrieving your soap from the bottom of your bath? :-)

1998\02\13@114303 by John Payson

> Regarding I2C, arbitration is done on a bit by bit basis so
> the CPU needs to supervise each bit's progress. So you don't gain
> much by having a byte-to-bit hardware interface because you'd only
> have to write software to manage that.

It would be possible for master-mode I2C hardware to handle the arbitration.
In certain situations this would allow faster throughput than would otherwise
be available since the CPU wouldn't have to spend many cycles processing each
bit.

On the other hand, consider as well that 98+% of I2C projects don't need any
arbitration or handshaking, since there's only a single master and all the
slaves can clock at full I2C speed.

> An asynchronous UART is worth its silicon because it handles
> time-critical stuff like baud rate generation, and the data
> is sent out over relatively long time periods.
> An I2C data transfer is in the order of microseconds, and is
> far less time critical, so masters would not gain much.

For something like a PIC-basic-style interpreter, an I2C hardware interface
could allow for an enormous performance gain since the interpreter could
start fetching each byte while the previous one was being read in.  It would
thus be possible to save more than 30 cycles for each fetched byte.
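
As a rough picture of the overlap (the i2c_* helpers and execute()
here are hypothetical, not from any Microchip library):

    /* Pipelined fetch: let the SSP hardware clock in byte N+1
       while the interpreter is still executing byte N. */
    i2c_start_read_byte();                  /* prime the first fetch */
    for (;;) {
        unsigned char op = i2c_wait_byte(); /* collect byte N */
        i2c_start_read_byte();              /* hardware clocks byte N+1 ... */
        execute(op);                        /* ... while we interpret byte N */
    }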

> Anyone could legitimately claim to have an I2C interface
> if they have only got a pair of open-collector lines.

Not for slave mode; that requires circuitry to detect and latch start/stop
transitions, and to latch the clock low under certain conditions (i.e. data
which might be for that slave) without CPU intervention.  Note that I2C
hardware need not be byte-oriented; Philips (creator of I2C) has some micros
with bit-oriented I2C stuff to latch appropriate states, etc.

> If Microchip _have_ got I2C master duty done in byte-oriented
> hardware (which I doubt), this is no big bonus.

Depends upon the application.  The gain from incorporating the full I2C spec
as opposed to the commonly-used subset might be fairly small, though.

> Having an I2C _slave_ interface done in hardware _is_ worth
> having in silicon, because it will sit around and transfer
> whole bytes before interrupting the CPU, instead of the CPU
> having to watch and respond to the I2C signals within
> 5 microseconds.

Having something in silicon to latch state transitions is essential; having
a byte buffer is convenient, but not as essential.

> > I will need to do a software implementation of an I2C slave
> > I have heard this is a problem, but I'm not sure why.
>
> Because the Microchip documentation and example code is crap.
> Which is pretty shameful for a product conceived around
> 25 years ago. They don't go into it in anywhere like
> adequate detail.

Good code is tricky; the problem IMHO is probably in part that nobody looks
at Microchip application examples because they're buggy, and nobody at
Microchip feels like fixing them since nobody looks at them anyway.

1998\02\13@132524 by Jon Hylands

On Fri, 13 Feb 1998 13:19:26 +0000, Keith Howell <keithh@ARCAM.CO.UK>
wrote:

> Regarding I2C, arbitration is done on a bit by bit basis so
> the CPU needs to supervise each bit's progress. So you don't gain
> much by having a byte-to-bit hardware interface because you'd only
> have to write software to manage that.

According to the 17C75X data sheet:

> A typical transmit sequence would go as follows:
> 1. The user generates a Start Condition by setting the START enable bit (SEN) in SSPCON2.
> 2. SSPIF is set. The module will wait the required start time before any other operation takes place.
> 3. The user loads the SSPBUF with address to transmit.
> 4. Address is shifted out the SDA pin until all 8 bits are transmitted.
> 5. The SSP Module shifts in the ACK bit from the slave device, and writes its value into the SSPCON2 register (SSPCON2<6>).
> 6. The module generates an interrupt at the end of the ninth clock cycle by setting SSPIF.
> 7. The user loads the SSPBUF with eight bits of data.
> 8. DATA is shifted out the SDA pin until all 8 bits are transmitted.

So I assume this is all doing something for you. It appears that the
hardware is taking care of all the bit-level management, and you only
have to give it bytes when it tells you to. Of course, I may be
completely out to lunch here :-) It looks like the HW also does full
multi-master arbitration for you.
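
Read literally, those steps would reduce a master write to something
like the sketch below. The bit names (SEN, SSPIF, ACKSTAT, plus PEN
for the Stop condition) are the ones the 17C75X data sheet uses, but
the exact C spellings depend on the compiler's device header:

    /* Sketch of a hardware-I2C master write, following the quoted
       data sheet sequence. Register/bit spellings are assumptions. */
    void i2c_master_write(unsigned char addr, const unsigned char *data, int len)
    {
        int i;

        SEN = 1;                     /* 1. generate the Start condition     */
        while (!SSPIF) ;             /* 2. module sets SSPIF when it's done */
        SSPIF = 0;

        SSPBUF = addr;               /* 3./4. shift the address out on SDA  */
        while (!SSPIF) ;             /* 5./6. interrupt after the 9th clock */
        SSPIF = 0;

        if (!ACKSTAT) {              /* slave ACKed (SSPCON2<6> clear)      */
            for (i = 0; i < len; i++) {
                SSPBUF = data[i];    /* 7./8. shift a data byte out         */
                while (!SSPIF) ;
                SSPIF = 0;
            }
        }

        PEN = 1;                     /* generate a Stop condition           */
        while (!SSPIF) ;
        SSPIF = 0;
    }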

> An asynchronous UART is worth its silicon because it handles
> time-critical stuff like baud rate generation, and the data
> is sent out over relatively long time periods.

Well, the 17C756 has two of them, with independent baud rate
generators. Unfortunately, right now it looks like I won't be using
them. I may end up changing how I do things, though...

> Changing subject, tell us about your cool app?

Well, I'm basically building a very small autonomous underwater
vehicle (a micro submarine). I'm using the big PIC (17C756) as the
main CPU, and it will
talk to a bunch of other small PICs over an I2C bus. Each of the other
PICs will have the responsibility of talking to one hardware device,
and providing data to the master when requested on the I2C bus.
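
If it helps to picture the bus traffic, each request is just a write
of a register index followed by a read. A small sketch, with made-up
slave addresses and register numbers and hypothetical i2c_* helpers
wrapped around whichever master implementation ends up being used:

    /* Master polling one sensor PIC for a 16-bit reading.
       Everything named here is illustrative, not from the project. */
    #define COMPASS_ADDR  0x20               /* example 7-bit slave address */
    #define REG_HEADING   0x01               /* example register index      */

    static unsigned int read_compass_heading(void)
    {
        unsigned char hi, lo;

        i2c_start();
        i2c_write_byte(COMPASS_ADDR << 1);        /* address + write bit */
        i2c_write_byte(REG_HEADING);              /* which value we want */
        i2c_restart();
        i2c_write_byte((COMPASS_ADDR << 1) | 1);  /* address + read bit  */
        hi = i2c_read_byte(1);                    /* ACK, more to come   */
        lo = i2c_read_byte(0);                    /* NACK on last byte   */
        i2c_stop();

        return ((unsigned int)hi << 8) | lo;
    }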

> Is it going to be braving the North Sea (been there done that)
> or just retrieving your soap from the bottom of your bath? :-)

I doubt this one will be braving the North Sea, but I will be testing
it in swimming pools at first, and then hopefully (if it works) lakes.

The next one will be larger, and it *will* be braving the North Sea
:-)

Later,
Jon

--------------------------------------------------------------
  Jon Hylands      Jon@huv.com      http://www.huv.com/jon

 Project: Micro Seeker (Micro Autonomous Underwater Vehicle)
          http://www.huv.com
