- The very bottom of the 2012 PDF page mentions the related doc generation project, MarkBind.
- The MarkBind site mentions two software engineering courses whose materials were written using it.
- One of these is TE3201, which links to the current version of the book.
It is also frustrating that this book from 2012 is still applicable in 2018. I'm not sure what the solution is or what would drive change here.
And while big companies can train their own workers, many choose to offload that competency to the universities as well. In Germany, many of the largest export employers have a symbiotic relationship with the universities local to their factories. The companies help determine the course structure and syllabus, and in return the graduates are offered good living-wage jobs straight out of university, with continued training and certification.
Australia is a long way from the German model, so all I know is what I've read in The Economist and a couple of other publications, but there are certainly murmurs of moving to such a model in Australia as well.
"University" just doesn't mean what it used to mean.
They are learning and research institutions; they've never just been research institutions. And the vast majority of fields don't really bother teaching any R&D to undergrads; they just teach them the subject and its practical application in the real, working world.
In the UK we used to have vocational schools too; they all got changed into universities. That didn't mean no vocational training went on in universities.
Sure, my local vocational school does offer a certificate and diploma of software development - a 2/3-semester program which teaches a couple of Java and C# courses and how to use a database. Which is to say, it teaches you the bare minimum you need to know to become the most junior level of applications developer in an enterprise IT department - vastly, vastly underqualified to go into many of the kinds of jobs that expect CS degrees.
The other problem is that, well, I actually liked the theoretical and conceptual side of doing a CS degree, learning from lecturers who were researchers in the field. I, and most of the good software engineers I know, would have regretted just going to a vocational school.
IMHO, there's a third option worth considering. Here in Australia, lawyers start off by obtaining a law degree (a Bachelor of Laws or a Juris Doctor) from a university. However, before they're admitted to practice, they have to obtain a Graduate Diploma of Legal Practice - a 6-month course which focuses on the practical skills of being a lawyer.
I don't see why there couldn't be a similar type of program to teach practical software development workplace skills - obviously, not as some kind of mandatory licensing program, but as an optional extra.
... most of which don't actually need CS degrees, but in the absence of rigorous vocational programs, it's a reasonable filter for candidates who have a clue.
The architects don't all learn abstract, obscure theory; they spend a (crazy) amount of time doing drafts and models. The engineers don't all learn irrelevant and out-of-date techniques. The pharmacists spend half their time in labs, as do the chemists. The doctors have to do rounds on real wards.
So why in CS is your defence that a university is a research institution?
That's simply not true, and looking at any other discipline shows how wrong you are.
I don't think it's a good idea to teach students whatever is currently in vogue in the industry. If you for some reason need to have an applied subject at a university instead of a vocational school, you should call it Software Engineering or whatever.
All of whom spend all of their degree on practical, applied learning. And have done for decades/centuries.
If you then want to go into research in those fields you do a PhD. The vast majority of their graduates go into industry.
And of course, the vast majority of CS students go into industry, just woefully under-prepared, unlike other disciplines.
There's no reason for CS to stay theoretical only. The field of computer science/engineering/making/whatever you want to call it doesn't need loads of researchers; it needs loads of practical, professionally trained programmers.
The ridiculous defence that it's computer science, not engineering, is over and dead; that ship sailed decades ago. It's just a name, in the same way that a PhD, a Doctor of Philosophy, in History doesn't make you an expert in philosophy.
There are software engineering courses at university.
For all the talk of incessant change, it's often surprising how many technologies from the 70s are still in use today.
I find the principles of good software engineering change a lot less than the framework du jour.
FWIW I think the “software engineering” course was the most useless “CS” course I took.
When you understand version control, you can make changes without fearing that you can't get back to where you are right now.
Without thinking about it, just by reflex, I'd create a project directory for a new project: web pages that were almost entirely English text, or a Photoshop project, or an e-book project... and init a git repo to track it and set up my server to push it to.
Before long, usually while trying to come up with a good commit message, I'd wake up and ask myself, "Wait, why am I tracking this? Am I just assuming it's the responsible, proper thing to do?"
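For context, the reflex described above is only a handful of commands, which is probably part of why it becomes automatic. A minimal sketch of that setup (the server address and paths here are made up):

```sh
# inside a brand-new project directory (web pages, an e-book, whatever)
git init
git add .
git commit -m "initial import"

# hypothetical bare repository on a personal server, then push to it
git remote add origin user@myserver.example:/srv/git/project.git
git push -u origin master
```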
It's not enough that the project be something that iteratively improves. Git's usefulness comes mainly from tracking text with extremely demanding constraints (hard to get working, easy to break, harder to repair than to start again, where working/broken is not just a matter of taste), especially when you have multiple contributors, each of whom is more likely to break something than to fix it.
If you have non-text projects (ex: photo editing) or text that can usually be "fixed" by just pushing forward rather than starting again, Git still has benefits, but they may not be worth the costs.
You don't really see the value of Git unless you're writing code that you're a little scared to write.
In fact, I think this is a common behavior; almost every PC I've seen has a poor man's version control, implemented by copying the file and renaming it (e.g. Report_1.doc, Report_2.doc, Report_2_valid.doc, etc.), despite none of those users being programmers or working with code.
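For what it's worth, that habit maps almost one-to-one onto git; a rough sketch of the equivalent (file and tag names are just illustrative):

```sh
git init report && cd report
# ...create and write Report.doc, then:
git add Report.doc
git commit -m "first draft"     # plays the role of Report_1.doc
# ...edit Report.doc further, then:
git commit -am "second draft"   # plays the role of Report_2.doc
git tag valid                   # plays the role of Report_2_valid.doc
git log --oneline               # lists every saved version, newest first
```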
It's when doing a specialized kind of work with more demanding constraints that you see the benefits that justify the cost of learning and using Git. If you never do that specialized work, the specialized VC of Git might not be worth it. "Poor man's" VC might make more sense.
edit: And you can undo a deletion when you find you need something again, which happens once in a while too.
It's hard to see the benefit of using git when it's for a single assignment over two weeks where you're the only contributor.
I couldn't get my teammates to use it, but I did on my end, and it was a godsend. Other teams were merging stuff by hand and losing hours (in total) in the process.
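For anyone who hasn't felt that pain: the "merging by hand" those other teams were doing is roughly what one git command automates. A minimal sketch (branch names invented):

```sh
# a teammate's work lives on its own branch
git checkout -b alices-parser
# ...commit changes there...

# later, fold it back into the shared line of work
git checkout master
git merge alices-parser   # combines both histories, stopping only on real conflicts
```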
If you're working by yourself on the sorts of programs that are in introductory classes, you don't lose a lot without VC, and skipping it greatly simplifies things and lets students focus on the actual exercises/assignments.
Git would be horrendous overkill for a freshman programming class and it has so many footgun possibilities that the TAs would be going nuts helping students recover from various disasters.
Also, I would be curious to know if there is a 2017/18 version of it.
Just skimmed through a few pages and found a small name error:
> Java millennium (Java ME) which is made to create Embedded Systems
(Java ME stands for Micro Edition, not "millennium".)
I've recently been reading Elecia White's Making Embedded Systems, which is the closest thing to this book I can think of.
Really wanna read this!
Edit → Got it! The links there are PDFs for each chapter. I can download them individually and create a book :) Thanks all.