'No clue [LONG]'
>Can anyone help me or get me started in the right direction?
Maybe I can lend some insight. To me, programming is like making a "little
black box" do the things you want it to do, only you have to talk to it in
its own language (if you're writing in assembly language) or a third
language (e.g. C). And from the get-go you must understand the capabilities
of the chip and the language that you can use to control it. For example,
you can't just tell a PIC to do a Fast Fourier Transform (FFT) with a
single instruction. But if you are willing to "explain" to the PIC in
exacting detail every last little piece of information that's required to
do an FFT, then you have effectively written a program to do just that.
There are chips designed to do FFTs with a single instruction, mind you,
but you "talk" to them at an even higher level -- e.g. "Analyze this signal
and give me some meaningful results."
Microprocessors are exceedingly stupid little black boxes, but to their
credit they ALWAYS follow your instructions to the letter. The challenge
to you as a programmer is to make sure there is absolutely no
misunderstanding between you and the box -- all your instructions (your
programs) must be explicit in order for your program to work properly. The
benefit to this explicitness is that the resultant discipline will quickly
lead you to a better understanding of what exactly the chip is doing, and
hence what you can do with it.
As a beginner it's a bit overwhelming because there are (usually) so many
things the chip can do, and many parts of the chip are tied together in
non-obvious ways. Plus, you have to learn your tools (e.g. MPLAB, MPASM,
etc.), and you can get really stuck without some experienced help to point
out what's obvious to others but eludes you.
I would suggest that to get started you try to write a program that does
something REALLY simple -- like increment an 8-bit value somewhere in
memory. In a PIC, it takes one line of assembly (and hence one program
word) to do this, but your program will be a bit longer because you have to
set up for the target processor (e.g. an '84), etc. Nonetheless, this simple
program will give you confidence to try something more advanced.
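To make that concrete, here is a minimal sketch of such a program in MPASM syntax. The register address (0x0C) and the directives are assumptions -- check them against your device's data sheet before using this:

```asm
; Hypothetical starter sketch for a PIC16F84.
; Assumptions: MPASM syntax, 0x0C is the first general-purpose
; register on the '84 -- verify against the data sheet.
        list    p=16F84
        include <p16F84.inc>

COUNT   equ     0x0C            ; an 8-bit value somewhere in memory

        org     0x000           ; reset vector
start   incf    COUNT,f         ; the one "real" instruction: COUNT = COUNT + 1
done    goto    done            ; park the processor in an endless loop
        end
```

Assemble it with MPASM, single-step it in the MPLAB simulator, and watch the register change -- that's the confidence-builder.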
Starting with PICs is an excellent way to go because they are relatively
simple and you have this resource (the list) to post your questions. If you
can master assembly-language programming you may find yourself suddenly
appreciating how this $1, 1 MIPS piece of computing power can enable you to do
some very interesting things.
Since you seem to have a handle on analog design, the I/O of the PIC
shouldn't frighten you -- it's pretty simple.
Stay well away from interrupts until you're quite experienced -- they're the
hardest thing to get right once your code is more than a few pages long.
Don't feel bad if you encounter long periods of being really and truly
stuck because your code doesn't work. Work on something else, and READ READ
READ. Re-read what the app notes and data sheets say until you either fully
understand the author's intent, or you know they must be wrong ;-) In over
15 years of programming I can safely say that each and every bug in my code
(that wasn't an accidental typo) has been due to an INCORRECT ASSUMPTION.
It can be very difficult to actually realize what all your assumptions are
-- for example, if your car is running badly it's unlikely you'd think of
the fuel, because after all we all assume that fuel is fuel and shouldn't
make a difference. But hey, you might actually have bad fuel in your tank,
and until you've removed that possibility, you're really just flailing
around looking for something to explain your car's problems. When tracking
down a bug, by examining all your assumptions I believe you will eventually
find the exact reason why what you think the chip should be doing is in
fact not what it is really doing.
Lastly, I urge "my" beginning embedded programmers to do two things: 1)
explain to me exactly what it is they want the chip to do (I call this
"defining the task(s)"), and 2) explain to me how they're going to make the
chip do it (I call this "thinking like the chip"). Knowing 1) shows that
they know what needs to be done, and knowing 2) shows that they know how to
get it done.
I don't really subscribe to "top-down" or "bottom-up" design as much as I
subscribe to "see the big picture" design. Code is unforgiving -- without
sweating the low-level details it won't work reliably, and without a good
overall design it's a big mess that no one else will want to touch with a
ten-foot pole.
You don't need to go to night school, but you do need to put the hours in.
It just takes time, an open mind and a strong desire to learn.
| Andrew E. Kalman, Ph.D. netcom.com | aek
| standard disclaimers apply |
Andrew Kalman wrote:
> >Can anyone help me or get me started in the right direction?
> Maybe I can lend some insight. To me, programming is like making a "little
> black box" do the things you want it to do,
- - - Nick - - -
At 16:58 28/04/99 -0700, you wrote:
I must say a long and very interesting response, but I do see that the
second last paragraph seems to be a bit of a contradiction. I always
thought that "Top down" and "Bottom up" design was what gave you the "Big
picture", and the "low level details" that went to making the product
reliable, by forcing the programmer to cover all bases. Perhaps you would
like to elaborate on this point a bit more so that the cloudiness is removed.
I do think that your last comment for embedded programmers can and does
apply to nearly all levels of software, perhaps with some minor variations.
>>I don't really subscribe to "top-down" or "bottom-up" design as much as I
>>subscribe to "see the big picture" design. Code is unforgiving -- without
>>sweating the low-level details it won't work reliably, and without a good
>>overall design it's a big mess that no one else will want to touch with a
>>ten-foot pole.
>>You don't need to go to night school, but you do need to put the hours in.
>>It just takes time, an open mind and a strong desire to learn.
And Dennis wrote:
>I must say a long and very interesting response, but I do see that the
>second last paragraph seems to be a bit of a contradiction. I always
>thought that "Top down" and "Bottom up" design was what gave you the "Big
>picture", and the "low level details" that went to making the product
>reliable, by forcing the programmer to cover all bases. Perhaps you would
>like to elaborate on this point a bit more so that the cloudiness is removed.
Perhaps the best way I can explain my intent is just to say that promoting
any portion of a software design over the others is, in my opinion, myopic.
I think that when you have a prescribed "directionality" to your design
process, you may tend to favor one end (e.g. "top" in "top-down") over the
other. If a programmer or a programming team has the discipline to truly
cover all the bases, then the approach they take is immaterial -- it's the
correctness, reliability, functionality, cost and documentation that
ultimately matter. Then it's up to Sales and Marketing ;-o
My style is to iterate many times over the whole design before delivering a
product that I'm happy with. Some of this happens in the planning stages,
and some during coding. I freely admit that's partly because I cannot
foresee every eventuality when I'm in the planning stages. But even more so,
I've found that my best aggregate designs arose when the code at all levels
evolved together.
I've found that code I've written sequentially (i.e. write the top first,
then the middle, etc.) can sometimes be improved upon in incremental
amounts. But if it's all being developed at the same time, occasionally a
quantum leap in performance is possible. It does slow the process down,
because some effects ripple through many levels. In my code, no level is
sacrosanct; all levels are always candidates for improvement. So I try to keep a
"big picture" in my mind regardless of the level I'm coding at.
As you can see, I seem to rather enjoy coding. :-)
| Andrew E. Kalman, Ph.D. netcom.com | aek
| standard disclaimers apply |