As you can see, the answers are all over the map. Nothing wrong with the question, nothing wrong with a discussion, but "write a program of some kind" might not help you at all.
Btw, I switched to "C" for a stupid reason. I was writing a file manager for DOS in Turbo Pascal (like Norton Commander, but single-pane), and I got stuck on how to move a file from one directory to another: the built-in Pascal function could only rename a file, not move it. I was told that "C" can do anything, so I dove deep into it. To be honest, while the book taught me things, at first I was putting everything in .h files, as I was so used to the much superior (even today) Unit system in Turbo/Borland Pascal.
Then, some months later, I got Ralf Brown's Interrupt List and found that I could just call a specific interrupt from assembly to rename a file (which is exactly what the "C" library had implemented), so Pascal was no longer a limitation...
But I haven't touched Pascal since 1999; I've been on C/C++ ever since. I still miss the Units (.TPU files).
I really loved SWAG, which was basically a collection of 'how do I?' questions with answers in the form of Pascal sample code. Interestingly enough, it seems that someone converted it into a website:
As a kid I went through and studied many of the SWAG snippets.
Also, the compile times and the debugger were awesome. The Turbo Pascal IDE was still better than many current environments. I stopped using Turbo Pascal when I switched to Linux more or less full-time (near the end of the nineties), though I did use FreePascal on Linux a bit.
Coming from a webdev and Python background, I learned many things from it: data types, pointers, byte shifting, serial communication. The most exhilarating part is when you have to optimize your code in every corner to fit into the small memory of the microcontroller.
I would suggest looking at the classic "The UNIX Programming Environment", which is dated in its details but very good on the UNIX/C design philosophy.
But as others have said, you need a specific reason to learn C now, rather than more modern languages.
The first assignment we had to do was to brute-force passwords on a Linux machine. There was a user text file and an encrypted password text file. This was supposed to be easy; I barely passed it.
The second assignment was to simulate an attack that Kevin Mitnick did on a supercomputer. We had to spoof TCP sequence numbers from a computer that we didn't control; back in the day TCP sequence numbers were guessable, and in this assignment they made the guessability a bit easier. I think I used libraries such as libpcap and maybe another one. I did a bit better than just barely passing.
Assignment three was to look at binaries that came with C source code, find the vulnerability, and exploit it. I nailed it.
That's how I 'learned' C -- got some inkling of workable knowledge of it.
I'm a bit more reserved about how easy it is to be language agnostic from the get-go, though.
I don't recommend doing that in a real app. But it was a good way to learn.
It is a good project, though, for learning about high-performance arithmetic and data structures, and it has good potential for parallel implementations (I eventually rewrote it on top of PVM to run on a cluster of computers).
C was not an “approved language”, but I was told that if the grad student who was actually grading all the papers agreed, then I could use it. I went to talk to him, and turned out that he also wanted to learn C. So, he approved my request.
Because I was happy learning a new language, I made sure all my programs executed correctly, were written in what was then the correct style, and I still managed to turn in my homework before anyone else, for every single assignment.
I learned it from the book "Programming Linux Games" by Loki Software (remember them?).
My first real C project on x86 was a graphical user interface with Gtk+. However, I soon switched to C++ using Gtkmm, then using Qt.
Writing a GC and tagged values in particular are where pointers and memory management finally really clicked with me.
Personally, my attention span got completely shot once I graduated college, and I can't learn for the sake of learning like you're suggesting anymore.
For those who don't know the backstory (many here will) -- crypto is hard to implement correctly; protocols are hard to implement correctly; and C isn't the easiest language to use. [Disclaimer: I've used C "since forever", love it, and am pretty decent with it, but I've made my goofs. I've also implemented crypto & protocols myself, neither probably very well...]
Apparently in ~1995, Eric A. Young ("EAY") decided to implement his own SSL stack (called "SSLeay"), at least partially with the goal of learning the C programming language.
At some point, SSLeay became OpenSSL, EAY moved off the project, and OpenSSL went on to become a staple of network computer security (and insecurity).
I can't find a reference to back this up right now, but I know I've seen it in the past somewhere credible enough that I'm here repeating the folklore.