On the educational side, you will learn about how hardware works. Some will argue that there are better languages for learning these things, and I'll agree: Forth could be better for learning about raw memory addresses and systems programming; Scheme could be better for learning about data abstraction. However, C has several salient things going for it: C is UNIX (for good or bad, the majority of serious non-embedded -- and even a great deal of embedded -- programming outside of Microsoft or Windows desktop application development targets UNIX) and C is the lingua franca of programming languages.
From the practical side, you will have an advantage too. Higher-level languages generally still have pointers (Quiz: is Java pass by value or pass by reference? Answer: it's pass by value; it's just that the value of anything other than the primitive types is a pointer). Higher-level languages have built abstractions around I/O and VM, but those typically fall into two categories: thin wrappers around C and POSIX APIs (Perl, Python, Ruby) or leaky abstractions that occasionally force you to use FFI to C (JVM-based languages). In either case, if you are able to program in C you have a competitive advantage over coders brought up purely on those languages.
C is also a good language for learning data structures, partly because learning data structures in C is really about the data structures, abstract data types, and algorithms themselves and not (as C++- and Java-centric courses often make it out to be) about OO. This is the book I recommend for data structures in C: http://www.amazon.com/Data-Structures-Pseudocode-Approach-C/... (I'm biased, as I learned from it myself and have had classes taught by Professor Forouzan).
Note, I never said anything about performance. Just as it's nearly impossible to hand-write assembly that beats "gcc -O3" (let alone "gcc -O4" or icc) performance-wise on modern hardware (out-of-order execution, NUMA, specialized instruction sets), I am pretty confident that languages will emerge that beat C even for things like inner numeric loops, by being able to reason more about mutability and side effects (or the lack thereof) and by exploiting the parallelism and caching available in hardware. You won't be writing C to make computation faster; you'll be writing C to deal with I/O, memory and the like more efficiently. Many times that will mean your code runs faster, but not specifically because it's written in C (to be blunt, it will be because the implementers of runtimes and programming languages shelter those languages/runtimes from the hardware/OS; that may not always be the wrong thing to do). See http://pl.atyp.us/wordpress/?p=2947 for a great discussion on this.
Finally, in terms of C++: don't study it before knowing C and a non-C++ object-oriented language. I was lucky enough to learn C before learning C++, but C++ was the first object-oriented language I tried to learn. Because of that, I thought of OO as a black art. Oddly enough, I truly learned OO in Perl: I was writing Perl 5 (loosely adhering to Damian Conway's "Perl Best Practices") and, despite OO in Perl being an ugly hack (it's similar to Python's, but without even the syntactic sugar that hides __dict__ away), the Perl code I was writing was far more OO than any C++ I had written up until then. I then learned Python and Java; when I came back to writing C++, I was much better at it. Josh Bloch (along with Doug Lea, the author of Java's collections) in "Coders at Work" spoke of a similar experience (in his case, he didn't do any OO programming before Java) and likewise speaks very favourably about C.
I strongly recommend you do the same: learn C and do a non-trivial project in it (writing a compiler for a subset of C in C did the trick for me) to truly grok it; continue building projects in Python (or another OO language, e.g., Ruby, Java, Scala, Smalltalk, ObjC, Common Lisp+CLOS; pick as many of these as you'd like) to make sure you have a firm grasp of OO before you go after C++.