I started learning C at the start of this year by going through K&R and Understanding and Using C Pointers. I will likely pick this up; it looks good and I really like No Starch Press books.
Does anyone have pointers on where to start with actual embedded programming? I have a couple Arduinos and RPis laying around, but I'm wondering if there are more 'real' ways to do it.
If you are looking for a full course this one isn't bad: Embedded Systems Shape the World[0]. It hews pretty close to platform specifics (Windows/Keil/TI Launchpad), which is good for hands-on detail but less good for holistic understanding.
I have had a lot of fun following Ben Eater's[1] projects, which aren't always embedded-specific (sometimes they're TTL, sometimes Arduino) but are excellent for understanding concepts deeply.
I tend to learn best with a specific project that can grow or morph as my interest or experience dictates. You might find something to build with an Arduino, using the toolchain/IDE/libraries, get it working and then start stripping out libraries for your own implementations, or getting a toolchain of your own to cross-compile and flash.
From a C-veteran perspective, what do you think about type-safe languages (e.g. dependently typed ones such as ATS, or F*/KreMLin, or other DSLs) that compile to C, or to a subset of C?
How could one get the safety promises that are observed in Rust in C?
I'm not the OP you're asking, I'll provide my own answer to the latter question though.
C is a language that doesn't come with many guarantees. I personally like to think of C as a 'higher-level assembler' targeting a virtual machine. I've been led to believe that this figurative description of the language was more common in the past than it is today. I find it a helpful description since it offers an explanation for many of C's design choices, such as its weak types and use of pointers. If I'm correct, it's also an accurate description of the language's original aims in system development.
Also, Rust isn't the only systems programming language with a focus on safety. Ada has been around for some time now and is a much more mature language and arguably more suited for the job. It has a demonstrable track record of successful use in safety-critical software. Rust is definitely more 'C-like' than Ada, which might make it preferable to many.
I always thought that after so many years, there must be testing frameworks, development tools, and methodology to give a C developer the safety their problem requires. What do people use when they are programming critical systems, e.g. defense, health, flight control, etc.? Problems like Heartbleed et al. are not something that can be ignored in the industry.
That is why I wondered about advanced tools I heard about e.g. ATS, Compcert, and so on. As I understand, the model that is used in Rust comes with limitations in regard to program design.
Thanks for the recommendations! That's how I like to learn as well, and kind of why I started learning C in the first place. Tinkering with electronics is super fun (I think).
I actually have built the clock module from Ben Eater with the intent of building the 6502 computer project at some point in the future. I really like his stuff.
I can recommend the following for embedded programming:
* Michael Pont's Embedded C and Patterns for Time-Triggered Embedded Systems (PTTES). They are chock full of invaluable C code (for the 8051); in particular, beg, borrow, or steal PTTES (free PDF available). Also check out his other books and his company SafeTTy Systems.
* Make: AVR Programming by Elliot Williams teaches you to directly program the ATmega328P on an Arduino Uno.
* Introduction to Embedded Systems: Using Microcontrollers and the MSP430 by Jimenez, Palomera et al. is an excellent textbook explaining each hardware aspect of an embedded system and how to program them.
Note: All the above are for bare-metal embedded programming. For Embedded Linux on RPi, I suggest Exploring Raspberry Pi: Interfacing to the Real World with Embedded Linux by Derek Molloy.
A good and motivational project imo can be to buy a simple sensor board for the Arduino and write a driver for it yourself (even if drivers exist). Pick a simple sensor - some will require you to jump through all kinds of hoops, but others are more like "start up, trigger sensor read, read out".
For this part, it's also fun to have a logic analyzer (starts at about 10 bucks) to see changes in code manifest on physical pins. It's also helpful to verify that what you think you are doing is actually happening. E.g., the SPI chip select pin may be inverted (high when it should be low and vice versa).
Then start off with a simple program that does init and periodically reads the sensor. Perhaps add thresholds that trigger e.g. an LED. Then you can extend this to pipe data over the serial port to the RPi and push it to some server of your choice (e.g. MQTT), or display it on a local webserver dashboard.
Go with sensors that speak ordinary SPI or I2C, not some one-wire protocol. Suggestions: BMP180 (temperature, pressure), TSL2561 (light).
Have fun!
Edit: if you are doing it on the Arduino, you can start off with the Arduino SPI/I2C libs, and later on, if you wish, fire up the AVR datasheet (or whatever CPU is on your Arduino) and implement I2C/SPI yourself by changing registers etc. on the CPU.
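To give a flavor of that register-level work, here's a minimal, untested sketch of a TWI (I2C) master init/start/write on an ATmega328P. Register and bit names come from the AVR datasheet; real code also needs to check the status codes in TWSR after each step:

    #include <avr/io.h>
    #include <stdint.h>

    /* Set the TWI bit rate for ~100 kHz SCL (assumes F_CPU is defined,
       e.g. 16000000UL on an Uno). */
    static void twi_init(void) {
        TWSR = 0;                                   /* prescaler = 1 */
        TWBR = (uint8_t)(((F_CPU / 100000UL) - 16) / 2);
    }

    /* Send a START condition and busy-wait until the hardware is done. */
    static void twi_start(void) {
        TWCR = (1 << TWINT) | (1 << TWSTA) | (1 << TWEN);
        while (!(TWCR & (1 << TWINT)))
            ;
    }

    /* Clock one byte out (slave address or data) and wait for completion. */
    static void twi_write(uint8_t data) {
        TWDR = data;
        TWCR = (1 << TWINT) | (1 << TWEN);
        while (!(TWCR & (1 << TWINT)))
            ;
    }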
Much of my job is embedded engineering, to establish a bit of credibility for the following recommendations. My employer is partial to ST Micro and Renesas, so keep that in mind. Others may suggest Silicon Labs, Microchip, or Texas Instruments. This will also be biased by my own education and work experience (e.g. no robotics, more sensors/lights stuff).
I'd recommend getting a dev kit like the STM32F4DISCOVERY (https://www.st.com/en/evaluation-tools/stm32f4discovery.html). ST Micro's boards are often used for courses (https://www.udemy.com/course/cortex-m/) so you may like to take some of those courses. You'll often hear about the TI MSP430 as another microcontroller but AFAIK it's beginning to be a bit dated. Although come to think of it, there's probably more educational material out there for it, if you're willing to search.
Grab a kit like the Sparkfun Beginner's Kit (https://www.sparkfun.com/products/13973) and read some of the tutorials on their website about creating circuits. Tutorials or courses for your dev kit should get you to a point where you can light an LED controlled by the micro.
From there, you may like to do more advanced stuff like communicating with sensors over specific protocols (Sparkfun's Tinker Kit and associated guides may be of use https://www.sparkfun.com/products/14556 though you will have to translate from Arduino to C code, which can be good practice for knowing how Arduino works under-the-hood).
At this point, you'll probably know whether you want to keep learning more about sensors/lights/IoT type stuff, or want to branch out to other embedded-related topics. More advanced IoT material will be things like taking sensor measurements, storing measurements to memory, interfacing with displays, sending data via WiFi or Bluetooth.
Edit: I skimmed over a lot to keep it short. There's a lot hiding behind how casually these recommendations are made, so feel free to reach out with any questions (email in profile).
Obviously, there's the iconic K&R, which some people don't like for its terseness. I loved it - pure and to the point. However, the second book that made a huge impact on me was "Reusable Data Structures For C" by Roger Sessions.
If you internalize these two books you'll be a highly competent C programmer.
https://www.amazon.com/Reusable-Data-Structures-Prentice-hal...
Can any more experienced C developers give an opinion on the author/contents of this book?
I'm a Python/Julia developer starting to learn C. I have K&R already, and Test Driven Development for Embedded C (Grenning).
I did order 'Modern C' by Gustedt but the publisher never delivered to Waterstones so they had to cancel the order (about 6 months ago, book still unavailable from Waterstones as of today).
Looking over the contents, it seems expensive for what you get, and what it covers might not be the most practical day-to-day C issues. If you want to use C more effectively, I'd look at a systems programming book for whatever system you prefer. When hiring engineers, knowing C isn't the issue; knowing how to code daemons in Unix, how to use sockets, pipes, I/O, handle signals, etc. is the issue.
Contents are:
* How to identify and handle undefined behavior in a C program
* The range and representations of integers and floating-point values
* How integer promotions are performed and how they may affect portability
I think it's incredibly important to understand how numbers on computers work, what the limits of 32-bit and 64-bit values are, and how doubles/floats play into it.
* The order of evaluation of expressions
Most coding styles avoid ambiguity by just using (enough parentheses) around (important statements).
* Dynamic memory allocation including the use of non-standard functions
Non-standard worries me here. Memory allocation isn't particularly
* The philosophy underlying the use of character types in C
Character encodings and types
I'm curious as to what this is covering? Is it system specific?
* How to perform input/output (I/O) with terminals and file systems using C Standard Streams and POSIX file descriptors
* The translation phases implemented by the C compiler and the role of the preprocessor
* How to test, debug, and analyze C programs
IO is an interesting topic, but I suspect it's best covered by a systems programming manual. There are several Unix books, Stevens historically being the go-to guide, and I would go straight to the source instead.
These seem like really odd criticisms to level against a book which is clearly not about systems programming. The title itself makes it clear it's an introductory text on the subject of C. And from the overview:
>"You'll learn how to develop correct, portable, professional-quality code and build a foundation for developing security-critical and safety-critical systems."
So the stated aim of the book is to build a "foundation" from which you could then go on to digest and effectively use advanced things like the Stevens book.
My reply was for montalbano directly. He has K&R, which lays a foundation of the language. Maybe you're right and this builds off of that as a #2 type of book.
Safety-critical and security-critical are interesting for an intro book. I'd like people to be aware and conscious of them. Anyway, we could debate this one. It really depends on how paranoid a beginner programmer becomes.
Gustedt’s book is distributed by the author for free as a PDF; if you are having trouble getting your print copy, the digital version could hold you over.
I took Seacord's virtual class (CMU SEI? Can't remember) on Secure C coding a few years back and own, love, and regularly use The CERT C Secure Coding Standard.
I learned from K&R, but highly recommend Seacord's books if you're looking for how to write secure C and a more modern take on some of the trickier parts of C.
Thanks! NCC Group is reselling the online Secure Coding Training and we also deliver instructor-led courses, although these might need to be delivered by webinar until the current crisis abates https://www.nccgroup.trust/us/our-services/cyber-security/se...
Looks quite reasonable. I'm hopeful for a better introductory C text[1]. The author has plenty of street cred, the material covered looks good.
K&R is a good reference, but not good for learning the language IMO.
1: I love Plum's books (starting with learning to program in C), but can't recommend them since the language has changed so much in the 37 years since it came out.
It's nice to see a good quality effort on a modern C book.
Looking at the current language/job markets outside the center, I feel like we are hitting the same problems as in open source. People add C++ to every C job to have something with the same level of innovation going on as in new languages, even if it is about embedded Linux and you wouldn't let a C++ construct near the system.
Yes, and given that an Intel PC is a valid build target for almost every language and we were paying by the word in job classifieds, we almost went broke before we realized we had to throw them out too.
But after initial turbulence, life has gotten much simpler (and dare I say quiet) for our HR since we moved to using the stock job profiles shipped with the platforms we buy.
Could someone who purchased the early release say how many chapters are currently available? Usually No Starch bolds the currently available chapters in the table of contents, and this looks like all of them.
This book is just about to go to the printer. The book business has been turned upside down by COVID-19. We still offer free ebooks with print books purchased directly from our site but we don't know when we'll be able to ship to our customers directly again.
We'll also be pushing out a promotional offer on this book, likely this coming week so you may want to wait for that. Just trying to keep things on track now that our company is completely remote and physical book stores are closing left and right.
The book is in its final stages. We're just making final corrections.
I'm chatting with Robert about whether there's a chapter that he's comfortable releasing now. If not we should have something Monday or Tuesday and the entire book within a week or so. I need to confirm on Monday.
The 30% off coupon listed on the book's page is good on the print purchase, which also includes all the ebooks.
What is going on with the sudden profusion of books on C?
21st century C, Modern C, Understanding and Using C Pointers?
Okay, I realize "profusion" might seem a bit overblown, but honestly, in C world (and in comparison to other languages), this is practically a publishing boom.
Not that I'm complaining; C was my first programming language (back some time in the mid-90s), and it's still my favorite. But I wonder why we're suddenly getting new books on it? The language itself hasn't undergone any substantial changes recently, and if anything, "memory safety" is all the rage -- a thing that C most assuredly is not.
If this can get the non-C programmers to learn and understand the usefulness of a simple, minimal and direct language ("modern" languages are just too bloated), it is well worth it.
I have said it before and will say it again: C will allow you to program everything from itty-bitty MCUs to honking server machines. It is also the de facto universal "glue" language, and its real-world benefits far outweigh any perceived difficulties.
Bottom line - every programmer should know C, whether they use/like it or not.
Glue language aside (which arguably is a lot more true of unices than other platforms), I'd argue Forth might make a better language / minimal environment for learning the low-level parts of a computing system.
It's got interactivity (bundled compiler/interpreter in one system) and a large number of implementations (possibly more supported CPUs than C++, though that might be due more to its age), and it's easier to "pick apart" a Forth system and learn how its compiler is implemented than a C compiler.
May be true. However, my point was that C is everywhere and you just cannot escape it. Knowledge of this one language allows you to program real-world high-level applications to low-level systems (eg. OS/Compilers etc.) and everything in-between.
Given the proliferation of processors everywhere, it is of immeasurable value to a programmer to become familiar with one "universal" language/runtime/toolchain so that he can program literally anything. It simplifies "incidental complexity" (avoids "tower of babel") which is key to getting things done.
Mastering C and systems programming on Unix-like systems might be the best long-term career effort, since the field is not moving as fast as other technology stacks and old knowledge is still valuable knowledge. C jobs might not be the most widespread, but once you land one, it is highly-likely that it would be a stable one and you wouldn't be easily replaced by koolaid-drinking youngsters.
If you know C++, you know C (ignore dark corners and changes in later C standards). C++ is first and foremost a "better C" and only later everything else.
Personally (as in: this explicitly reflects my experiences and I have no idea if anyone else feels this way) I am worried that we are getting far too abstracted, and I am unhappy with the fact that despite the power of our hardware increasing exponentially, everything is running slower and slower and taking up massive chunks of memory for simple functions.
E: Also, C can be run anywhere. ANYWHERE. It can run on home computers and consoles from the 80's! I like that fact.
The only thing C missed was a stdnet lib, with things like starting a network daemon and HTTP parsers. Oh, and a decent list stdlib that everyone could agree on and build the rest on top of.
Had that happened a while ago, we would never have seen PHP et al.
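For contrast, even a minimal TCP connect means dropping down to POSIX sockets - nothing in ISO C itself covers networking. A rough sketch (error handling mostly omitted):

    #include <netdb.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Open a TCP connection to host:port using POSIX (not ISO C) APIs. */
    static int tcp_connect(const char *host, const char *port) {
        struct addrinfo hints, *res;
        memset(&hints, 0, sizeof hints);
        hints.ai_family = AF_UNSPEC;      /* IPv4 or IPv6 */
        hints.ai_socktype = SOCK_STREAM;  /* TCP */
        if (getaddrinfo(host, port, &hints, &res) != 0)
            return -1;
        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd >= 0 && connect(fd, res->ai_addr, res->ai_addrlen) != 0) {
            close(fd);
            fd = -1;
        }
        freeaddrinfo(res);
        return fd;
    }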
Actually, you should research the story of why PHP was created... it was primarily envisioned as a templating language, and the creator wanted people to use C/C++ for business logic etc.; however, people didn't do that because it was easier with PHP.
There is absolutely no reason why a website should be built with C, sorry...
Encouraging C use is, at this point, bordering on malpractice. I understand that the author is a leading authority in "secure C coding". While the ABI doesn't have a lot of affordances, we don't have to keep using the language because of the ABI.
For 272 pages written by probably the best person to be writing modern C books, from a small publisher who is known for quality books, sixty dollars doesn't sound unreasonable at all
When our authors ask how long their book should be I always say: Long enough to cover the subject, short enough to keep it interesting. My company is called No Starch Press for a reason. Think of the word starch as a nicer way to say "BS", as in No BS Press.
There's a lot of work behind these pages, as with all of our books. Unlike any publisher in this field, we have several people who read and craft every line of every book as necessary, together with each author, before a book goes off to a copy editor. That's where most books start, but not ours.
The real cost in creating a good book is not in the paper. It's in the time it takes to actually craft the words.
Would you be interested in launching translations of some of your books? I would love to make these technical books available in French and Esperanto.
I know the Rust book has already been translated into some languages. I even started the Esperanto translation myself[1], but had other priorities in the meantime. Doing that in spare time doesn't help accelerate the process, but I already achieved the translation of the Lua reference manual this way.
If you are interested, just send me an email at mathieu at culture-libre dot org, or reply to this message with some instruction on which channel you would prefer to use.
So it's hard to say that I'm the best person to write a modern C book. A slightly lower bar might be, am I the best person who has actually written a modern C Book?
Resume-wise, I've been writing C code since 1985, and I've been an expert on the C Standards Committee (WG14) since 2004. I've written two prior books on C Programming including "Secure Coding in C and C++" and "The CERT C Coding Standard". I also teach Secure Coding in C for NCC Group https://www.nccgroup.trust/us/our-services/cyber-security/se... and I also taught these topics to Computer Science undergraduates and graduate students at CMU for over a decade. So I think I have a good balance of technical skills and communications skills, but you know, pick up a copy and judge for yourself.
This is an interesting coincidence, because I just started drafting a 'defer' paper. In general, this is not an abuse of my position. I would be happy to collaborate with you (or anyone else in the community) to develop papers and champion them to the committee. They would have to be sensible ideas, which both of yours are.
That's awesome!!! I'm actually really interested in the semantics of how a defer{} statement might work in C. If you have any kind of a draft or anything, I'd get a kick out of reading over it.
Re: developing a paper for constexpr, that's an interesting idea. I hadn't really thought about writing one up myself but it'd be an interesting thing to try out. Can you recommend a good example proposal that I could read? Shoot me an email at nicholas dot clark on the gmail if you ever see this.
I get your point. C is not C++, and it shouldn't get every feature under the sun.
But it's also not exactly -complete-. The standards committee makes changes every few years, including language additions.
I'm a fulltime C programmer, and I would _love_ to have both of the features I suggested - for the following reasons.
- defer: I could defer a free() statement after every malloc, guaranteeing that I won't miss one or forget about it. Lots of memory leaks and file-descriptor leaks could be easily avoided if C provided 'defer' as a language feature. GCC already offers something kind of similar with its __cleanup__ attribute, and a lot of programs rely on it. How much better would it be for the language to support it natively?
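For anyone who hasn't seen it, the GCC/Clang workaround looks roughly like this (a minimal sketch; the cleanup handler receives a pointer to the annotated variable):

    #include <stdio.h>
    #include <stdlib.h>

    /* Cleanup handler: invoked with a pointer to the annotated variable
       when it goes out of scope. */
    static void free_ptr(void *p) {
        free(*(void **)p);
    }

    int main(void) {
        /* buf is freed automatically at the end of its scope. */
        __attribute__((cleanup(free_ptr))) char *buf = malloc(64);
        if (buf == NULL)
            return 1;
        snprintf(buf, 64, "no leak here");
        puts(buf);
        return 0; /* free_ptr(&buf) runs on return */
    }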
- constexpr: I am _so tired_ of writing really complex macros. Like say I want to populate a lookup table for a sine-wave (for fast use in a realtime system). Wouldn't it be nice if I could just populate my lookup table with a constexpr function? Then I wouldn't need really nasty macros, and I'd also be able to ensure that the calculations all happen at compile-time.
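To make the pain concrete: today the table either comes from a build script, a macro mess, or runtime initialization like the sketch below (names illustrative) - a constexpr function would let the compiler bake the values into read-only data instead:

    #include <math.h>

    #define TABLE_SIZE 256

    static double sine_table[TABLE_SIZE];

    /* Must run once at startup; there is no standard way to have the
       compiler evaluate this at build time. */
    static void init_sine_table(void) {
        for (int i = 0; i < TABLE_SIZE; i++)
            sine_table[i] = sin(2.0 * 3.14159265358979323846 * i / TABLE_SIZE);
    }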
I probably could have chosen a different wording, but everything you've just said is exactly why I suggested it; C isn't a language just anyone should be writing about, especially when it comes to security.
Oh, a standards committee member! We have requests...
Now that C allows mixing variable declarations in with statements, it becomes annoying that variable declarations cannot have labels. This hits particularly hard with switch/case, but can also apply with ordinary named labels. The syntax to work around this defect is ugly.
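For readers who haven't hit this, the workaround is the lone semicolon (an empty statement) after the label, since a label must precede a statement and a declaration is not a statement:

    #include <stdio.h>

    void f(int n) {
        switch (n) {
        case 1:;  /* empty statement so the label binds to something */
            int count = n * 2;
            printf("%d\n", count);
            break;
        }
    }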
The gcc extension for case ranges is really valuable.
Setting the sign bit via a cast should not have implementation-defined results (for example, going from uint32_t to int32_t). It should just work in the obvious way, wrapping as two's complement. Avoiding the problem requires extremely strange code.
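The strange code in question is something like this memcpy dance, which is the portable way to reinterpret the bit pattern today (a small sketch):

    #include <stdint.h>
    #include <string.h>

    /* Portably reinterpret a uint32_t bit pattern as int32_t; a plain
       cast is implementation-defined once the value exceeds INT32_MAX. */
    static int32_t as_int32(uint32_t u) {
        int32_t s;
        memcpy(&s, &u, sizeof s);
        return s;
    }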
I'd like a way to prevent arrays from being replaced by pointers. Assignment could work. Sometimes I really want to pass an array to a function, and I don't mind if that means a megabyte is copied into the function args on the stack. Sometimes I really want to force the huge copy, and other times I'd rather have the compiler keep it in the caller's frame (but the callee can mangle it unless it is const) and just pretend that the callee got more than a pointer. Array dimensions need to survive. The callee's prototype should be able to demand specific dimensions or receive them as variables, and the caller should be able to pass a portion of a larger array.
The default function parameters of C++ would be useful. The UNIX API for open() would be best done this way, allowing a prototype without the need for stdarg. There doesn't seem to be any reason why default parameters would have to be at the end; a pair of adjacent commas in the middle is a fine way to indicate that the default is to be used for that missing parameter.
It's time to standardize bitfield layout so that bitfields can be used in portable code for purposes like assemblers and disassemblers. (in other words, not for purposes like access to MMIO registers) Microsoft and GNU compilers are already quite compatible on x86_64, so that would be the basis of standardization.
When a bitfield happens to have the size and alignment of a normal integer type, it should be possible to take the address. The resulting type would be a pointer to the integer type of lowest rank having the correct size.
Anonymous unions and structs would be valuable in all scopes, including at file level. This would allow careful data organization to save space, improve cache locality, prevent undesired cache line aliasing, or allow the intentional aliasing of types. Current technology typically involves abuse of the linker, which is well outside the C language.
Being able to do something like an #include, but with a blob of binary data, would be helpful for initializing big arrays. Current practice is to have build scripts convert binary files to C, to have the linker do it, or to rely on assemblers with the capability. None of that is nice to use.
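The build-script route typically generates something of this shape (e.g. the output of `xxd -i logo.png`, truncated here for illustration):

    /* Generated file; compiled and linked into the program. */
    unsigned char logo_png[] = {
        0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a  /* PNG signature */
    };
    unsigned int logo_png_len = 8;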
So that we don't have to invoke m4 or do nasty things with recursive macros, the preprocessor could support loops.
There are a few gcc extensions that make macros far more reasonable, including statement expressions and typeof. Add those.
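The classic example of what those two buy you is a max() macro that evaluates each argument exactly once (GCC/Clang only, illustrative):

    /* Statement expression + typeof: MAX(i++, j) behaves sanely because
       each argument is evaluated once. */
    #define MAX(a, b) ({        \
        typeof(a) _a = (a);     \
        typeof(b) _b = (b);     \
        _a > _b ? _a : _b;      \
    })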
So there are many ideas here. I commented separately that I would be happy to help members of the community develop quality proposals and champion these to the committee. You can look here http://www.open-std.org/jtc1/sc22/wg14/www/wg14_document_log... to see what proposals normally look like. You would have to develop your suggestions further before they would have a reasonable chance of being adopted.
So it's hard to say that Ford is the best company to make trucks. A slightly lower bar might be, is Ford the best company that has actually made a truck?
I have a 295-page book in production. The print cost is $5.46 for paper bound and $10.02 for case laminate. Contrary to what you may think, print costs are not trending toward $0. Quite the other way: they have risen over the past seven years from $4.72 and $9.82 (roughly inflation for the paper bound).
This is what has annoyed me about pulp paperback fiction. They skyrocketed from $5 to $15 in 15 years. Then you can buy the eBook for $12 and feel like you're getting a "deal" on a product with near zero production costs.
It's as fair for you to refuse to pay $60 as it is for the author to ask for it. It strikes me that most people who would pay $30 would also be willing to pay $60 in this instance, as it's not as if many new C books are coming out nowadays. So I'm guessing that the author will do better at $60. Medieval notions of profit and the physical costs are mostly irrelevant, but the effort and expertise of the author are not.
Maybe the others here in the forum can tell me, why I should risk buying the pre-order (30% off), when I have no chance of reading a sample chapter yet. (I'm actually thinking of doing this, because I'm looking for a good second book on C after having worked through K&R before.)
When I see that Brian W. Kernighan releases a book, it's an instant buy for me.
It is a risk to buy the preorder sight unseen. If you are not familiar with the author and his style, you could end up feeling like you wasted that money.
So until a preview chapter or similar is available, I'll throw out this link to Modern C, 2nd Ed. by Gustedt. It has been well received, and I thought Jens's writing style was solid.
I don’t know if it is the perfect second book to read on C, but it seems well paced and things are well explained.
I'm checking with No Starch on whether we have plans to release a sample chapter of this book. There is a list of previous articles I've written, including sample chapters from my previous books on Secure Coding in C and C++ and The CERT C Coding Standard, here: https://www.informit.com/authors/bio/3312572e-d904-45d5-afcd...
I have your book on Secure Coding and I have the latest version of Jens’ book as well. I like both your style and his, in terms of writing style and presentation.
If you think there is going to be a large difference in applicability and focus when comparing your 'Effective C' and Jens's 'Modern C', I am kind of excited - not just to get your book, but also that even though some people would like to see C relegated to the trash bin, there is still some life left in a language I enjoy using.
No Starch never let me down so far, and paying USD 60 (or 40, or whatever) is such a marginal difference for a book you're going to spend dozens of hours on.
> for a book you're going to spend dozens of hours on.
Is that a feature? I'm constantly frustrated by books that use many words to say little. Wasting hours on such books tends to be negative value, and I shouldn't buy them even for two cents.
This book has a very concise writing style. I really wanted to keep it small enough to carry on an airplane (when we are allowed to fly again). I'm really proud of the final product and very happy with the editing staff at No Starch who helped me produce a very polished product.
Thanks for the insight! I really appreciate when the authors interact with the community.
Many books nowadays, also from No Starch, are released as early access books. Your book does not run under the early access program, or at least not yet.
Is this a decision made by you as an author? Or is the kind of book not really suited for early access? I could imagine that introductory books work well as early access books; readers could work through the first couple of chapters, while later chapters are still being written or refined.
We're getting caught up on our Early Access titles. Our staff is very particular about what they release and they don't like to put chapters in until they feel like they're really final. Given the current world situation I'm making sure that we relax that policy a bit and you'll see more books moving into Early Access in the next 2 or 3 weeks. Just a little hectic around my company at the moment.
My intention was really not to put pressure on the publisher as a buyer of this book. I rather wonder why certain books have an early access option and others don't.
I probably won't be able to read it before July, anyway, because I have to finish my bachelor's degree first. But I think it's not a bad time for a financial contribution to my favorite tech book publisher, so I already bought the book "blindly".
No, I usually defer decisions concerning the marketing and availability of the book to the publisher. I try to stick to the areas where I'm actually a knowledgeable expert.
Working through a technical book of any substance, as opposed to just reading or skimming it, will take dozens of hours, no matter how terse or verbose the style is.
The book is supposed to be around 270 pages, so verbosity probably won't be an issue.
I am right now working through "Python Crash Course" as an intermediate Python programmer. I very quickly read through chapters 1-8 so far and did all the exercises (rather simple ones), just to make sure that I'm not missing out anything on the basics. In the last couple of chapters, I actually picked up two or three pieces of knowledge I wasn't aware of. I spent around five to ten hours on this so far. This approach sounds terribly inefficient.
However, working through a beginner's book as an intermediate Python programmer, and only getting half a dozen pieces of really new information out of it, gives me the confidence I need at this stage. So I don't consider it a waste, but rather an exercise in patience and repetition.
Why am I telling you all this? Because I think people overrate the monetary costs of books.
I kind of agree with your point, with the caveat that I just don't want to spend much money if I'm not getting much out of it. I'd happily buy a book for 200 EUR if I knew it was going to be worth my time. In recent years, I've mostly regretted the time I spent on books. Life is short, free time is scarce, and there are so many other things I could be doing.
In this case, I'm definitely curious about the book and would like to skim through it just to satisfy that curiosity, but I have my doubts as to whether I'll get much out of it (I've been writing C for 15 years and I do it professionally). Maybe there are gems of wisdom (or things I've overlooked) in it that would make it worthwhile, but it's possible I'd just regret the time and money spent :-(
This is an interesting issue. I remember when I started studying discrete math, I actually bought two books. One was by Ross, and it's the one I prefer, but at places it was a bit terse. So when I had trouble understanding something right away, I looked at the same subject in Susanna Epp's book and usually I could find a relevant explanation or example straight away, but in the long term I couldn't stand this verbose style. (By the way, both books cost $80, but you can buy used copies at 1/10.)
I agree. That book can be replaced by reading the classic K&R2 and the C standard; you can get your tips/tricks by reading Stack Overflow posts, and man pages for your API reference.
It's a luxury item. For comparison K&R2 sells for $51.99 new on amazon.