Ask HN: Old CS lecturer looking for advice from current and recent students
392 points by geophile on June 30, 2018 | 181 comments
I am 61, with an academic background in computer science, and many years in industry, mostly startups. I taught many years ago, and have resumed teaching, a database course: data modeling, relational algebra, SQL, application programming and architecture (e.g. 2-tier vs. 3-tier, web & mobile), database internals.

Student evaluations were pretty good for the most part, but quite a few students found the presentation a bit dry: I prepared every lecture as HTML ahead of time, made it available online, and presented it in class. A couple of times, I would do interactive things, e.g. tuning queries using EXPLAIN and playing with indexes. That proved pretty popular, but of course, it's difficult to capture this material (I recorded a log of the session, but extemporaneous discussion was not captured).
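For readers who haven't seen a session like this, here is a minimal sketch of the kind of EXPLAIN-and-index exercise described above. SQLite stands in for whatever RDBMS the course used, and the `orders` table and `idx_orders_customer` index are illustrative names, not from the original course:

```python
import sqlite3

# Build a small throwaway table to tune against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable step in column 3.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

# Without an index, the planner reports a full table scan.
before = plan("SELECT * FROM orders WHERE customer_id = 42")

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the same query becomes an index search.
after = plan("SELECT * FROM orders WHERE customer_id = 42")
print(before, after)
```

Running the same query before and after `CREATE INDEX`, and letting students predict what the plan will say, is the interactive part that is hard to capture in static notes.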

Looking for advice on how to balance prepared material and more spontaneous things. Also, any other advice on how to make material of this sort (theory + practice) easier to absorb.

Observation 1: You can’t please everyone.

Observation 2: If you can prepare something detailed in writing in advance, it’s more efficient to just give it to the students in advance and let them read or view it on their own time.

Observation 3: Interactive discussions are the one thing that classrooms are optimized for.

Don’t stop the prep work. But if you can lead classroom discussions with safe cold calling, the students will get more out of it. Don’t worry about capturing the results. You could record it, but it’s the process of struggling and discussing that creates learning. Written artifacts are supplementary.

I commend you for both teaching, and for caring enough about your craft to ask for help. The lifetime value your students can get is enormous.

One of my favorite classes in college was a physics course taught in this way:

1. Before class, we were assigned to read a chapter from the textbook, understand the material, and complete two or three homework problems from the material we had just self-learned.

2. After submitting homework, lectures focused on discussing the concepts more in-depth. Everybody already had a baseline knowledge, so the professor would highlight the important takeaways, applications, live demonstrations of concepts, etc. I found these lectures engaging because I had already learned the material - and the lectures focused on mastering it.

3. Sometimes there would be follow-up homework problems focusing on advanced applications or derivations. These advanced problems were closest to exam questions.

Some takeaways for me:

- If we didn't have homework to do before class, I doubt I would have consistently learned the material before lecture.

- Lectures taught us more than the "what" - they taught us the "why" and how these concepts related to other areas.

- Lectures focused on answering questions, exploring curiosities (like "what if" questions), demonstrations/experiments, and mastery. The professor added value beyond the written material!

I hope this helps OP because it sounds like they have material prepared beforehand, which means that the lectures could go beyond the material.

This sounds fantastic... but in my experience many students do not read the material as assigned. How did the lecturer manage to overcome that?

My favorite course ever had the same structure. It was an elective, with only about 15 third- and fourth-year students. Those not up for the reading dropped out. We covered about 20 high-quality economic papers in 6 months, quite impressive for a teacher to accomplish with students. I can pretty much still remember the contents of the course.

There was homework due before class from the material!

> Before class, we were assigned to read a chapter from the textbook, understand the material, and complete two or three homework problems from the material we had just self-learned. [Emphasis mine.]

Grade the students on this?

I had classes with the same structure. The homework was one of those online setups, and our professor had a policy that we should be ready with questions. If the questions stopped during class, the homework grade would be replaced by the grade on a quiz. The quizzes were fair game every day, were limited in scope, and only happened if the professor decided there was too little engagement.

This method sounds appealing and reasonable. My question would be: did you have a good sense of how many of your peers did in fact do the homework ahead of class? My instinctual fear is that, with this ‘different approach’, many students may not be bothered to do it.

> My question would be: did you have a good sense of how many of your peers did in fact do the homework ahead of class? My instinctual fear is that, with this ‘different approach’, many students may not be bothered to do it.

I took part in an online course where you had to answer 3 of 5 quick questions correctly before you were admitted to the lecture room.

I think it resulted in an engaged and knowledgeable class.

I've also taken a course where you can't proceed to the next lecture unless you get more than 70% grade on homework from the previous lecture (which you could submit multiple times but it was not multiple choice). This also worked really well.

Homework was required and graded, so I assume everybody did it! It also forced you to attend every lecture because you had to hand in homework on the way in. The professor claimed he would throw away homework from those who dropped it off and walked out. (In reality, with prior communication he was always flexible.)

I like this model, but it puts a huge workload on the TA to help with questions about the homework, since the students don't get to hear the professor explain the material until after.

If a student can't learn from a book, they first have to learn how to learn from a book.

I took a physics class taught like that last year, and my experience was the opposite. Because the teaching team expected that students had read and understood the material before class (hint: very few people read it, even fewer understood it), the TAs and professors went waaay faster than they should have, and the class was one of the most stressful experiences I've had at college. Nobody liked it.

I had mixed results like that years ago. It was 'repaired' later on by repeating the material you were supposed to already know, mostly as verification, so that the people who might not have done the reading had a chance to catch up.

Ultimately, it is of course the responsibility of the student to decide what they want to do. If you don't want to learn, see learning as a side-job, or live in an environment that does not leave room for learning outside of college hours, then you might be in more generic trouble anyway ;-)

Hey, mathattack and philip1209, those are really useful points of view, thank you.

+1 for cold-calling. What is "safe" cold-calling?

I find it better to develop strategies for 'safe' handling of student input than to resort to cold calling. If a student feels embarrassed because of a response, then not only will they never raise their hand again, but neither will anyone else.

The best way to do this is to try to identify the good part or motivation in a student's statement, regardless of anything wrong that might be in there too. Provide positive reinforcement on that part, and then turn the question back to the group again, with that extra foothold that the first student provided.

Example: Me: "If we wanted to look at every relationship between pairs of these objects, how many relationships would we have?"

Student A: "N squared!"

Me: "Sure that's a good start; there's N squared possible two-ples you can form. But I think you may be overcounting a bit; can anyone suggest a way to cut it down?"

Student B: "You don't need to pair objects with themselves."

Me: "Great; so how many are we at, then?"

Student B: "N^2 - N".

Me: "Awesome; I feel like we can still cut it down more. Any more ideas? Anyone want to try a small example?"

etc etc...
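The counting in the exchange above is easy to verify with a small example, which is also the "try a small example" move the last question invites. This sketch (the variable names are mine, not from the thread) checks that ordered pairs without self-pairs give N² − N, and that dropping order halves it again:

```python
from itertools import combinations, permutations

n = 5
objects = range(n)

# Ordered pairs of distinct objects: the N^2 - N that Student B reached.
ordered = list(permutations(objects, 2))

# The further cut the teacher hints at: ignore order entirely.
unordered = list(combinations(objects, 2))

assert len(ordered) == n**2 - n            # 20 for n = 5
assert len(unordered) == (n**2 - n) // 2   # 10 for n = 5
print(len(ordered), len(unordered))        # → 20 10
```

The final answer the class is being steered toward is N(N − 1)/2, the number of unordered pairs.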


Other bits of advice that come back to me with some writing:

* Provide seeds for ideas, and let the students flesh them out.

* Keep the conversation moving around the room. Try to engage a variety of voices.

* Provide at least some positive feedback for everyone who participates. Modulate for effect: eg, bigger praise for more complete, well communicated answers, or for people contributing for the first time.

* If the Super Student blurts out the final answer, take it back to the class and ask if anyone can explain it in more detail.

* Choose good questions. It's easy to ask a question that only you know the answer to... Mix up easy and hard questions, and use the easy questions as a way to 'onboard' quieter students.

Do it well, and there's always going to be enough hands to avoid cold calling. If I genuinely get no hands on a question, it generally means I've asked a bad question; rephrase and ask again. (And on day one, I have a dumb joke where I have everyone "raise your hand if you have a hand" and then do 'reverse' hand raising...)

> The best way to do this is to try to identify the good part or motivation in a student's statement, regardless of anything wrong that might be in there too

Your example sounds very reasonable to me, but do you think that this is ever taken too far?

Coming to the US as a student from Europe (not the UK), I noticed that some instructors (mostly TAs) took this effort to give a "positive" response to any answer in class to quite an extreme. I can think of many instances where a student's answer showed a very serious lack of understanding, and the instructor would say something like "okay, that would be one way to think about it, but how about..." without ever correcting the student.

Two negative consequences I saw were 1) a lot of missed opportunities for correcting students' misunderstandings, and 2) students seeming overly sensitive to any kind of criticism (I had a German friend who was a TA and had quite a bit of trouble with this).

Again, none of this is incompatible with your advice, I just wanted to take the opportunity to see if anyone has noticed this as a common problem.

Yes, there is a risk there. You need to make sure that you put some phrase in there indicating that the student's answer wasn't really right. Just saying "that's one way to think about it" without clarifying that it's not quite correct for reason X will definitely do more damage than good.

The trick is telling them they're wrong without making it sound like you are calling them stupid.

Yeah, I hear that.

My question in the 'serious misunderstanding' realm would be whether this is a problem that this individual student is having, or is it a common misunderstanding across the class? In the latter case, I turn the question back out to the class, eg, "Who thinks that's the right answer, who thinks it's Something Else?" And then we demonstrate that the individual student isn't alone in the Serious Misunderstanding, and provide a bit of personal buffer between the response and the correction.

I want students to come away with a self-description of "I have lots of ideas, and they need to be carefully checked because sometimes they're wrong," as opposed to the incredibly prevalent "I'm bad at math."

Polls in general are a good buffer to make a safe guess and involve everyone.

In my experience as a recent student in the UK, even with that style of questioning people are reluctant to give answers. Certainly it is better than lecturers who just pick on people to ask, though.

Totally. There's a relationship between the teacher and the classroom, and the students come in with a lot of expectations based on prior classes, too. A big part of the job is building and maintaining that relationship, and changing the expectations in your classroom.

It means there’s no shame for a wrong answer. You aren’t grading the response. (I’ve seen classes where participation grades were based on getting facts correct rather than group discussion)

It's of course the kind of cold-calling where you will never have to worry about type-safety or memory-safety. You will never endure a dangling pointer, a use-after-free, or any other kind of Undefined Behavior.

I'm a current student taking CS classes. I find there are two types of CS classes for me:

1. Classes where the content is better presented on YouTube, so I don't go to class and just watch the YouTube videos on it. There are some really great YouTube videos on a lot of CS content, so it might even make sense to assign those to students so you aren't wasting your time re-explaining basic content that's already well documented.

2. Classes where the content is highly interactive, and most of the class is us digging into a topic together as a class.

I think the most interesting classes are the ones where the teacher just asks questions, and then only provides answers when the class is stuck. I think it'd be a good exercise to ponder how you could structure your classes such that you only ask questions. You may find that it's easier when they present a guess or 2 at an answer, and then you help out with any logical jumps necessary, or show them how specifically it works in a particular database.

I think there's a tipping point with asking questions in class—if you ask too few, students aren't really in their "question answering" mode, and they won't want to interact. But if you ask enough, there will always be at least a few students offering ideas. It also helps to have many softball questions, especially in the beginning, so students who might not be following as closely can still hop in.

It also might help for the students to review some material before class to be ready for it.

Best of luck!

I always wanted my CS lecturers/professors to start with a business case and work backwards. For example: “Let’s build an e-commerce site.” What are the tools we need here? What kind of database? How should we design the front end? What tools for the front end? Etc.

I’m not sure about others, but I would have found such an approach very involving. Instead I was taught chapter by chapter of a preset textbook syllabus, followed by standardized exams.

I hope this helps:

I wouldn't want academia to focus on business. That's more technical than academic. If I see a CS (computer "science") course teaching the latest front-end JS tech, I would be disappointed.

> I wouldn't want academia to focus on business.

I don't believe that OP's comment counts as focusing on business. To me, it's an effective way to engage students with a real-world use of the tech they are about to learn, thus establishing a baseline from which to appreciate why the academic part matters.

Otherwise you risk concentrating the course on a perspective taken from the top of the proverbial ivory tower that leads students to just go through the course without taking anything out of it.

Maybe what the OP was getting at is that before a topic is presented, it needs to be contextualized.

For a single presentation, you do this by presenting some motivation and the surrounding space before diving in.

For courses, the best I've seen is to present the structure at the beginning, and continuously revisit that as the semester progresses. What's the purpose of the course? To build to a solution to X or understand problem space Y.

Most of the courses I have had have followed this structure, and whether I enjoyed the topic or not, it was engaging enough. Unfortunately, I've had a few courses by people who clearly didn't care and started the class by telling me why they are smart and I should follow them, then reading off slides prepared by the book author or whomever, without understanding the purpose or direction of the course themselves.

> it needs to be contextualized

Contextualization can and should occur not just by need, as in where are we going with this, but also

    * Where did this come from
    * How hard was it to get there
    * When was it discovered
    * How was it discovered
    * When was it accepted
Oftentimes areas of mathematics or physics are presented in a way that is completely disjoint from their history: these n pages of results actually took 50+ years or more to discover and argue over. Careers diverged and were ended over who was on what side.

Too many concepts are communicated as a series of steps, do A, do B, do C ... but you have no idea that you are actually baking a cake. If it isn't a surprise, tell the other party the destination.

Most people will go to industry after graduation, though.

When I hire, I prefer someone with some practical knowledge rather than the capacity to build a compiler.

So having the "let's build something" experience is useful in the real world afterwards.

> So having the "let's build something" experience is useful in the real world afterwards.

It is, but it’s not the job of academia to teach it. You have side-projects, internships, and your whole life to learn practice.

Where did this whole thought process of "it’s not the job of college to help you become employable" come from?

It comes from afar, and is an ever-ongoing discussion. I would recommend reading [1] if you are really interested. If you are, but not so much, the last paragraph is revealing:

"If we seek guidance from the past, it is better to see the 'idea of the university' not as a fixed set of characteristics, but as a set of tensions, permanently present, but resolved differently according to time and place. Tensions between teaching and research, and between autonomy and accountability, most obviously. [...] between the transmission of established knowledge, and the search for original truth; between the inevitable connection of universities with the state and the centres of economic and social power, and the need to maintain critical distance; between reproducing the existing occupational structure, and renewing it from below by promoting social mobility; between serving the economy, and providing a space free from immediate utilitarian pressures; between teaching as the encouragement of open and critical attitudes, and society's expectation that universities will impart qualifications and skills. To come down too heavily on one side of these balances will usually mean that the aims of the university are being simplified and distorted."

[1] http://www.historyandpolicy.org/policy-papers/papers/the-ide...

I don’t have a problem in theory, but with the median household income in the US being $60K a year [1], how many families can afford to invest in college for any reason other than trying to give their children a better opportunity?

[1] https://www.thebalance.com/what-is-average-income-in-usa-fam...

College should be a right, not an investment.


What should happen and what does happen are completely different. In the US, where you are spending tens of thousands of dollars on an education and where the only real way for most people to achieve upward social mobility is a college education that gives them practical skills, you need to be able to hit the ground running.

If college doesn’t give you real world skills and companies won’t train employees (https://www.washingtonpost.com/news/on-leadership/wp/2014/09...) where does that leave a college graduate?

I don't know how to respond to this. Companies are being more aggressive in off-loading their costs onto the workers. And given the consolidation in tech, the employers have even more power to dictate the terms.

We are seeing similar behavior with unpaid internships.

I have experience with FANG companies, and most employees have huge gaps in social, life, cultural, and historical experience. When it comes to a liberal education, they skipped leg day.

Are you saying that new grads need to understand packer, docker, 3-way git merges and react?

What I’m saying is quite simple - understand the world or at least the U.S. for what it is and act accordingly.

There is no loyalty from any company. The days when there was an understood social contract with corporate America, where they would take you in, train you, and where you could work your way up the ranks and stay for years, are long gone. Corporate America is only interested in “increasing shareholder value”.

Since corporate America has no loyalty to you and will not train you - it’s your responsibility to train yourself and be ready to either jump ship for a better opportunity at the first chance you get or be ready to swim to the next opportunity when corporate America throws you off the boat.

The reality is that whether or not you believe that a college education is “a right” and that people shouldn’t have to go tens of thousands of dollars into debt for a chance at a better life, American society disagrees. If you are investing tens of thousands of dollars in college, your reward can’t be just “being a better member of society”; it has to be: can I get a job to provide my own income, health care, retirement, etc., and have enough saved to provide my own safety net?

A liberal education if you are a software developer won’t help you achieve those goals.

Why is it so crazy that a college should teach you how to use source control? No matter what type of software development you are doing, you’re going to need it. What’s wrong with teaching you whichever is the most popular marketable front end MVC framework during your senior year so you can be productive on a job that’s not going to train you?

True, you have to be a lifelong learner, but six months after you graduate from college you’re going to have to start paying that student loan back, and with the way the government is trying to overturn the ACA, you’re not going to be able to stay on mommy and daddy’s health insurance after you get out of college - if they even have health insurance.

It's also worth asking 'Where did this whole thought process of "it is the job of college to help you become employable" come from?' There are long-established, deeply-entrenched advocates for both viewpoints, but the 'it is the job' viewpoint is the new one. It's worth understanding the debate.

When students go to college and either their parents spend between $40K and $120K for a four-year degree or the student takes out a loan, what do you think their expectations are?

If I have a choice between hiring a junior developer who has a computer science degree and only knows theory, and a junior whose only experience is a boot camp where they learned practical skills, I’m going to hire the boot camp graduate first. I got a “computer science” degree in the early 90s from an unknown state school, didn’t learn anything, and the only reason I was employable was my prior knowledge from years of programming in AppleSoft Basic and assembly language; I fell in love with C and stayed on the comp.lang.c newsgroups.

> If I have a choice between hiring a junior developer who has a computer science degree and only knows theory, and a junior whose only experience is a boot camp where they learned practical skills, I’m going to hire the boot camp graduate first.

Google, Facebook, and many others don't seem to agree with that statement.

Also, would your answer change if your company's goal was to develop better machine learning algorithms, or network technologies, etc.?

There is life outside of Silicon Valley. There are 60,000 computer science graduates every year in the US [1]; most software developers aren’t working for the FAANG companies or the cool startups trying to change the world. They are working boring corporate jobs writing bespoke internal apps and yet another software-as-a-service offering. Yes, I know there are others writing embedded apps, mobile apps, games, etc. But most of those companies are also interested in people who can hit the ground running.

[1] https://qz.com/929275/you-probably-should-have-majored-in-co...

My point is not that "we should do what SV wants" (I'm not even American).

My point is... where would people get the education that bootstraps them to understand the state of the art then? To me that is the University's goal. In CS that means learning lots that may not be practical for a regular job (advanced algebra/calculus, computer architecture, data structure theory, etc.). However, in the long run this should leave you better prepared for the "boring jobs" too.

Now, you argue that this kind of education may not be the most efficient training process to regular job excellence? I totally agree. College as devised today is designed as the first step into eventually being able to contribute to the world's knowledge, or at least making good use of it. It is not devised as a way to prepare you for the job world, where immediate applicability trumps everything (in general).

I don't know the US system well enough, but here in Spain (and at least in France and Germany) there are alternatives to college tuned to that goal. However, most parents (in Spain) still want their children to go the college route. Why? Because college is socially reputable (there are non-college-educated leaders, but they are the exception, not the norm), and data shows that college-educated people do have a higher expected income throughout their lives. It is perceived as a ladder up the social chain.

When they graduate from college and don’t have the applicable skills to get a job then what? Sallie Mae (the government agency in the US that oversees the student loan program) is going to come knocking on the students door six months after they graduate looking for them to start paying back their loans. Most students are more concerned with the most practical way of putting food on their table and many parents are concerned with getting their kids out of their basements than they are with “increasing the world’s knowledge.” That’s not a luxury the middle class has.

If the issue is as clear-cut as you claim, why don't you found a company that provides this kind of education? Why aren't there more students chosing other paths? Why are bootcamps struggling lately?


College is supposed to give you strong foundations so you can build on them. This means that college grads should be able to pick up "applicable skills" in a short time (they certainly should be able to become productive at any company in less than 6 months), and from there they should be able to improve faster than their non-graduate peers, eventually surpassing their skills. If that wasn't your case, sorry mate: you are surrounded by exceptional non-graduate peers, or you are an exceptionally bad graduate (which may be your university's fault or your own).

In any case, those parents and students coughing up loads of money for graduate education seem to disagree with you.

How many parents know enough about the computer industry to know whether a curriculum will give their children real world experience?

On the other hand, how many students go into CS because they look at the starting salaries and expect to get a job paying well?

> they certainly should be able to become productive at any company in less than 6 months

Most companies aren’t willing to do on the job training even for juniors who don’t at least know the language they are using or a similar language on a junior level.

I don't think you can discuss only one direction (theory > practice). I went back to academia after several years working in industry. I would conclude that practical experience allows you to gain a deeper understanding of theoretical knowledge, also when it comes to teaching, as one can present better and more realistic use cases. If you consider statistics, machine learning, etc., this is absolutely the case. Learning just formulas without real use cases is mostly a waste of time for a student. Only with good examples does this go into long-term memory and deep understanding.

I think most people would agree here. Nearly all CS college courses include practical projects, precisely to try to provide these good and memorable examples.

It is also true that practical projects in 12-week courses just cannot compare to the amount of problem insight you get if you spend N years working on it from the business side. There are many courses that I would enjoy (and benefit from) much more now than when I took them some ~15 years ago!

The original post was arguing against a practical project.

> So having the "let's build something" experience is useful in the real world afterwards. It is, but it’s not the job of academia to teach it. You have side-projects, internships, and your whole life to learn practice.

Have you never encountered a junior Dev fresh out of college who could talk theory all day long but couldn’t write an implementation of FizzBuzz?
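For anyone unfamiliar with the reference: FizzBuzz is the canonical trivial screening exercise, and the whole point of invoking it is that it fits in a dozen lines. A minimal sketch (any language would do; Python here):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15)[-1])  # → FizzBuzz
```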

> The original post was arguing against a practical project.

As I understand, it was arguing against business-oriented teaching (i.e.: teaching to experts in the current fashionable web framework/language/ML system/whatever) instead of teaching the basics. It hardly said anything about practical projects.

> Have you never encountered a junior Dev fresh out of college who could talk theory all day long but couldn’t write an implementation of FizzBuzz?

Out of college? No. I've encountered a few who couldn't write FizzBuzz, but they also couldn't talk any theory either!

I also went to graduate school for an MBA (I didn’t finish). We had real world case studies to discuss. Why shouldn’t CS do the same?

I taught myself how to program in AppleSoft Basic in 6th grade because I wanted to write a game. Then I learned assembly and how the interpreter worked, how the extension cards mapped memory, the OS, etc.

Get them interested in the high level first and then drill down.

I claim no authority on this, but my understanding was that community college and professional schools were the institutions designed to teach people directly employable skills, and that the university existed to allow people to explore academic subject material for its own inherent value.

Do you think parents or students are paying tens of thousands of dollars and going into debt for the “inherent value”, or with the hope that they will be more employable? Relatively few people can afford to spend that kind of money to be a “better citizen of the world” without expecting a return.

The institutions GP suggested do have that role are significantly cheaper.

And companies would much rather hire someone from a four-year school than a two-year college, and some jobs require a four-year degree.

Well, did you choose a vocational course at college, or did you choose a degree at a university?

The former is for learning a job/role; the latter is for learning about a subject. Academic development in a subject may make you better suited for some jobs, but that is not its primary purpose; you often need to add your own extras if you want to be career-ready.

I chose a degree in the early 90s. My first job out of college in 1996 was as a computer operator. Shortly after I was hired, they got a new contract where they needed someone to build a fairly complicated data entry system. I was the only person who knew how to program. I got it done, put it on my resume and got another job two years later.

I had no training, and my university degree didn’t prepare me for it. I already knew how to program, had played around with assembly on both my //e and later my Mac, and had taught myself C; I really only went to my little unknown state college for the piece of paper, not to learn. Where would I have been if I had expected to actually learn something useful in college?

No one tells any other profession that they shouldn’t go to college expecting to get a job, so why is CS different? Are you really going to say that most college students and parents spend tens of thousands of dollars on a college education just so they can be “a better member of society”?

>No one tells any other profession that they shouldn’t go to college expecting to get a job //

Erm, isn't it your choice: study a subject or get a vocational qualification? I realise there's middle ground; those are the poles.

Why should we ruin universities just so businesses don't have to filter candidates, train recruits from a younger age, or have apprenticeships?

Seems better to use universities to expand human knowledge (individually and universally) than it does to use them as a hugely expensive way to do a first filter on company recruitment processes.

> Why should we ruin universities just so businesses don't have to filter candidates, train recruits from a younger age, or have apprenticeships?

It’s not about should, it’s about reality. The corporations have already spoken. We can either accept reality for what it is, or graduate a bunch of students with CS degrees who are saddled with debt and have a hard time finding a job. In the immortal words of Kosh, “Once the avalanche has started, the pebbles no longer have a vote”.

CS grads coming out of college are competing with foreign developers who were trained to get a job, have more experience, and will work for less.

> It is, but it’s not the job of academia to teach it. You have side-projects, internships, and your whole life to learn practice.

and then you get employers saying they can't find skilled people.

> and then you get employers saying they can't find skilled people.

That’s not the problem of academia.

It should be. Try telling students and their parents that the degree they went into debt for won't help them get a job.

It helps tremendously. But is it sufficient to get you a job? No; you have to work on the side to practice what you learn. Employers often care more about your GitHub than your degree.

Yes, students need to work on the side as well, but you cannot just wash your hands of it and say "That’s not the problem of academia."

> Yes, students need to work on the side as well, but you cannot just wash your hands of it and say "That’s not the problem of academia."

My point is that academia must prioritize theory over practice: (1) knowing the theory makes learning the practice easy, and not the other way around; (2) the theory is useful for your whole career; (3) languages and frameworks evolve quickly, quicker than university courses [1].

[1]: In the university I attended, they (re)define all courses every 3 years. That means if you want to teach practice, you must find something stable enough to stay relevant for at least 3 years. They teach Java as a first language, Python for scripting, PHP for the Web, C for system stuff, and OCaml then Scala as functional languages. As far as I remember it was roughly 80% theory, 20% general practice (i.e. writing vanilla PHP instead of learning a Web framework).

yes it is - the aim of academia is to prepare people for what they will do in the future, not some reality-disconnected curriculum.

> yes it is - the aim of academia is to prepare people for what they will do in the future, not some reality-disconnected curriculum.

They already do that. The fact that some people believe going in class and doing nothing on the side will suffice is the issue here, not academia itself.

This was a constant discussion when I was in school for mechanical engineering. How do you balance learning theory and learning how to apply that theory, given the limited amount of time available? It came down to a common gripe that we didn’t get enough time in the shop actually fabricating, and the usual refrain in response was that we'd learn that stuff on the job.

My take was always that a strong theoretical understanding is what differentiates an engineering degree from a technical or trade degree that focuses more (or sometimes solely) on application and practical knowledge. In my mind, the ideal engineering degree focuses on the theory and provides just enough practical instruction to give you a starting point to begin applying what you’ve learned. I’m open to discussion on that, though.

For me it's the other way round: an engineer must know how to do things; the theory is a nice-to-have.

I am an engineer, with a PhD in physics. The theory had exactly zero use as soon as I quit academia.

> I am an engineer, with a PhD in physics. The theory had exactly zero use as soon as I quit academia.

Counter-example: I have a M.Sc in CS, with a specialization in programming languages. I know how to build a compiler, and it helps me almost every day. I’m not building a compiler every day, but the things I’ve learnt from compilers influence every piece of code I write. I can better optimize my code knowing how it’ll be compiled/interpreted by the machine.

In fact, I spent a lot of time at the university complaining I was learning useless stuff. Nowadays I’m super-grateful to my teachers for providing me all that knowledge I use every day, even if it’s not a direct application.

Someone with the capacity to build a compiler will be smart enough to pick up whatever toy frontend/backend languages/frameworks are currently in vogue. The real question is: do they _really_ have the capacity to build a compiler, or are they just saying they do?

I do not think so. I can talk for two semesters about quantum mechanics but would not be able to build a nuclear plant.

Your analogy is backwards imo but I totally get where you are coming from. I would still say though that it is a lot harder to write a compiler than it is to write a web app, at least in general. Building your own compiler is like building a nuclear power plant. Building a web app, from an engineering perspective, is like drunk texting your friend about quantum mechanics, and I say this as someone who works on web apps every day. Every once in a while a real problem does crop up, but the vast majority of problems faced by web apps are just about finding the right existing tool for the job, doing frontend stuff, tweaking SQL queries, and making CRUD endpoints.

You are right in one sense though. Being good at engineering doesn't mean you have a good eye for design and good common sense for UX, but programmers are expected to more and more these days. This is why it's still important to get a well-rounded liberal arts education instead of going all-in on engineering and coming out with no creative/artistic ability.

Precisely! I'd rather want to see the materials being presented with both theoretical and applied motivations.

Right, so I have a mix of theoretical background and practice, including toy examples and (not full case studies, but) brief discussions of real situations from experience.

Now my question is how to combine prepared notes (the HTML lectures), live sessions, and perhaps other techniques in class.

For something beyond mere toy examples, I would look into online language-specific playgrounds.

Those allow you to run code directly from browser - and if you include basic instrumentation (timing, step counters, item/memory use counters, ...) then it becomes really easy to show how a simple change makes things either better or worse.
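As a concrete sketch of that kind of instrumentation, here is a minimal timing comparison in plain Python (the container sizes and iteration count are arbitrary choices for illustration); it makes the cost of a "simple change" from a list to a set immediately visible:

```python
import timeit

def membership_time(container, probe, number=200):
    """Time `number` repeated membership tests against a container."""
    return timeit.timeit(lambda: probe in container, number=number)

n = 100_000
data_list = list(range(n))   # membership test scans the whole list: O(n)
data_set = set(range(n))     # membership test is a hash lookup: O(1) average

missing = -1  # worst case for the list: every element gets compared
t_list = membership_time(data_list, missing)
t_set = membership_time(data_set, missing)
print(f"list: {t_list:.4f}s  set: {t_set:.4f}s")
```

Students can flip the container type or the probe value themselves and watch the numbers move, which is exactly the kind of feedback loop a playground gives for free.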

To keep things language-agnostic, you can cherry-pick a different language for different purposes. That way you ensure the lessons are not tied to a particular language and the concepts have to apply universally.[1] For an interesting twist, maybe pick some historical examples too, to show how things have evolved in languages over time.

Then, when the basic concept of the lesson is [hopefully] understood, expand to a short case study to show why the dry theory matters in practice. For example, there's an old DoS attack against DNS servers, where a 56k modem could take down a root server due to unbounded O(n^2) worst-case complexity in the hash table implementations.[0]

0: https://www.usenix.org/legacy/publications/library/proceedin...

1: For extra points, you can pick examples where a particular language has chosen an implementation with amusing problems.

You've also surely got loads of anecdotes on how not to do it and tales from the coalface from your years in industry.

I'm sure you can sprinkle some of those into the classes to give a little light relief if the students are really finding things too dry. You don't have to go as far as recreating TheDailyWTF. Perhaps use some to provide worked examples of designing yourself into blind alleys and how to get out again, or just to provide awareness of a few real world issues that might come up with whatever topic is the day's lecture.

There are examples aplenty of companies that have serious problems because they did not understand the impact of their early decisions due to insufficient CS knowledge.

Those are great cases to review, working backwards to the root cause and discussing possible fixes. That's both CS and practical knowledge in a way that sticks; the best school, I've found, is other people's mistakes.

Do you just give lectures, or can you also do laboratories? I've found that having one session a week dedicated to practical exercises - using pre-prepared virtual machines where appropriate - is a very good teaching method, and popular with the students, who can see themselves achieving the tasks.

What exactly do you do with a computer science degree that doesn’t make you hireable?

Same as what you do with the degree, say, in physics. You become a software engineer.

We had a class (mid-2000s, UMBC) that was, I think, part of the required major courses, and it taught design processes. The semester-long part of the course involved getting paired as a small group with a professor who acted as a business user, and you had to gather business use cases and build an MVP for them. It was definitely a great course and provided this kind of real-world applicable knowledge without every course needing to be shoehorned into the same setup, which would have become redundant and kind of not the point of university.

I wouldn't worry too much about student feedback. I have gotten feedback like "This class was terribly organised" and "This class was extremely clear and well organised" from the exact same class; just look for the general trends.

Interactive activities are great! Do be aware that even with a very interactive activity there may be lots of students who are not actively participating. Two or three students who loudly share their opinions can easily make you forget about the silent majority.

Don't worry too much about capturing the interactive parts and demonstrations. The key thing is they are unique enough to stick in people's memory. I still remember demos from my college days and the basic principles they were espousing, even if the technical jargon and equations are now long gone.

I agree not to put too much weight on the end-of-class course evaluations, although some universities use them formally so you might not be able to avoid caring about them. I do think student feedback is useful, but end of course feedback is often too myopic, primarily influenced by student perceptions of grading: easy classes without much work tend to get high ratings, even if nobody learns much.

I personally had very different opinions about my university courses even 1-2 years after graduation vs. right at the time of taking them. I wish there were some more systematic way of collecting that kind of student feedback with a little more hindsight.

If you are truly excited about the material, you shouldn't shy away from sharing why you find it exciting and why relational algebra and data modeling is the best thing since sliced bread. Dryness is often just another word for "why do I need this" syndrome. Relational databases are the bedrock foundation for some incredibly cool technology platforms, and have been for decades. Just figure out how to translate the technical merit of your material -- potential benefit to society, complexity, interesting theoretical properties -- into genuine excitement and a contagious feeling of "this is why this thing is so cool and you should devote your time to it". I had some professors like that in college who got me very excited about data structures and compilers, and I could see someone doing it effectively with a databases course. It's the difference between going through the motions of teaching, and using this one hour period as your chance to impress upon a room of people how databases matter and are fascinating and worth studying and developing.

Hey, this is a great thread! I am a current CS student and have noticed a similar issue with many of my classes. I think one of the biggest differences is just the generational expectation of the level of engagement from the content. Many people in my generation (millennial+) grew up with fast-paced interactive content. We have come to expect more from the content we consume, including a back-and-forth dialogue and a narrative. Don't get me wrong, I love a good old-fashioned lecture, but for technical topics with no built-in narrative it's not the most effective way to convey a topic.

Including all these elements sounds like a lot of work. Realistically, I think you could improve your content by looking at some of the higher quality YouTube CS tutorials. Most of the instructors there pull out the most important aspects of the material they are teaching and make it more engaging.

A simple way to do this is to break the material down into the parts that are absolutely essential to understand in order to have a framework for the remainder of the material. If what you are teaching is available before and after the lectures online, you do not need to cover it all in class.

As long as you make sure the issues you know students will have questions on are addressed in class that is enough for the lecture portion of the course.

SUMMARY: Don't read lectures in class or review everything in the lesson notes. Hit the wave tops in class, weaving in a narrative whenever possible. You want students to come away from the lecture remembering enough to have a framework for the topic in their head, so they know where to look, or have context for their questions when they come up.

Hope that helps.

I don't really understand your concept of narrative here. Without turning it into a history-of-CS lesson, how do you cover something (which I assume is basic in CS) like the halting problem or the Church-Turing hypothesis? Don't you need a formal, abstract, mathematically rigorous approach; isn't that rather at odds with the concept of narration?

Surely in CS you're covering stuff that doesn't have ready implementations - like, say, a programming class would cover various common sorting algos?

I'm sure you can tell I'm not a Comp. Sci. person; I've done maths, computing, programming, logic, etc. on the periphery though. It's the general concept I'm interested in; I have the feeling I'd have found fractal geometry or QFT maddening with (extraneous) narrative.

Assuming you're lecturing at a university, many institutions now have a "centre for teaching and learning" or similar office ("teaching innovation", etc.) where lecturers can receive support and feedback on lecturing skills and course design (among other things). Not all CTLs are well promoted on their campuses, so you may need to ask around.

Re: some of the other commenters' suggestions, you might want to research the "flipped classroom", which is the popular term for the model they're discussing (students read materials pre-lecture, and lecture time is re-purposed for discussion and engagement activities).


I would add that they can also come in and observe and confidentially give you input.

Not just the flipped classroom, which some students find annoying / 'cheating'; look at active learning as well.

The advice I got from an observation for a big presentation was invaluable. I talk too fast and, somewhat weirdly, hardly move my jaw.

I just recently graduated and here are my insights for material absorption:

- I don't like it when lecturers use slides. Most don't do it right.

- I learned more from programming projects than exams (it's a hands-on thing).

- Open exams are a true test. Classes that offer these teach students that learning is about understanding.

- Multiple choice is BS. Free response rocks. Partial credit is commendable. Human knowledge isn't binary; you can know bits and pieces of information and derive the truth.

- The problem with college and grades is that we're wired to think that doing well on exams is a good measure of how much we know; in reality, that's not the case.

- The professors I gave the highest reviews were, in fact, the easy ones, but that's because in those classes I was more focused on learning about the stories, the concepts and the presenter instead of my learning "strategy" (gaming the class for the highest outcome). I learn more in "easy" classes. Easy doesn't mean no work; more like do the work and get the grade.

- I've found incentivized answering to be more effective than cold calling. Discussions are powerful tools. To get people discussing, incentivize with points on an exam or something. I also appreciated when lecturers offered extra points for attending outside class events; that helped encourage interest in the subject. We often lose sight of real-world applications when we're absorbed in theory.

It sounds like a well done class! I've seen that many lectures follow a pattern of starting in the low-level details of a topic and slowly working up to something useful by the end. I find I don't appreciate or particularly remember the details because I don't understand the motivation. So, I prefer an approach which starts out with the big picture of what cool thing is possible, and then the details take on more value and meaning for me.

This is huge. It's difficult to learn the process of solving a problem without having a deep understanding of the problem itself first.

First of all, it's awesome you're putting in the effort your students deserve!

Your question and many of the answers exactly fit a framework taught to me in university pedagogy classes (which were a part of my PhD program). According to this framework, there are three levels of teaching:

Level 1: teachers focus on what the student is.

Level 2: teachers focus on what the teacher does.

Level 3: teachers focus on what the student _is doing_.

I think you are mostly operating on level 2 now. You should try to switch to level 3! Review all the concrete activities the students are doing and ask yourself: how, and what, is the student learning when they are doing this?

See this reference: https://www.tandfonline.com/doi/pdf/10.1080/0729436990180105

Also think about this concept: https://en.wikipedia.org/wiki/Constructive_alignment

P.S. If you decide to apply these methods, be prepared for some initial resistance from the students; effective learning is not easy, and many students have rarely experienced it!

i have an anecdote that you might find interesting..

many moons ago when i was an undergrad, there was this one class that i hated. i got practically nothing out of it. and the reason was that the professor had prepared all the course material in powerpoint. every damn proof, every single block of code.. everything. he would come in and read the deck, line by line in the class. i can trace my hatred of powerpoint to that single class. i remember thinking "i can read damnit. if you're going to read the powerpoint back to me, why don't you email that to me and i can read it on my own. i am not a toddler that needs to be read to by an adult".

prepare ahead of time - by all means. plan the coursework, the lecture. but whatever you do, don't read your prepared material to the class.

incidentally, i was also taking a class on public speaking at the same time. and near the end of the semester we each had to give a speech as our final "project". and one gent stood out - Sean (don't remember his last name) - he put up one poster and gave a 20 minute speech that i remember to this day after some 20 years or so. he didn't "read down" to us with a wall of prepared material.

since then i have sat through countless presentations/meetings. i can't recall any presentation as vividly as i remember Sean's.

P.S.: Sean, if you find this, you convinced me that day that BMWs are the ultimate driving machines.

Various advice.

Prepare an evaluation form for the end of the class to get a better assessment of your class from the students. Write the questions such that you can get constructive feedback from them.

Try to balance the boring stuff you can avoid with more engaging material. Break the pace with questions or exercises.

If you can, do a lot of labs to "gamify" your class. Programming/hacking can be fun and addictive, so it's a great opportunity to make students happy and have them learn something at the same time. However, it takes time to write good labs.

I noticed that no matter the content, students are satisfied if the difficulty is adequate.

One suggestion is to also do your own survey somewhere in the middle of the course. Each group of students is going to be somewhat different and getting feedback early so you can adapt to that particular group is invaluable. Importantly, if you do this, the survey should be incredibly short. One suggestion I got from my father which I tried the last time I was teaching was basically these three questions:

1) What should I keep doing?

2) What should I stop doing?

3) What should I start doing?

The goal here is not detailed feedback, but a high response rate that will give you enough data to course correct.

The university solicited feedback, and the questions and answers were quite detailed. That was the basis for my question posted here.

I do a lot of teaching in my spare time on topics that use the command line.

I'd highly recommend you use: http://asciinema.org .

Asciinema is like YouTube for the terminal. It lets you record the terminal and embed it in a web page (such as your prepared HTML notes).

One of the nice things is that the students can copy and paste the text from the embed, including the command output.
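For anyone who hasn't used it: embedding a recording is just a script tag that asciinema generates for you; it looks roughly like this (the cast ID below is a made-up placeholder, not a real recording):

```html
<!-- paste into your HTML lecture notes; 123456 is a placeholder cast ID -->
<script src="https://asciinema.org/a/123456.js"
        id="asciicast-123456" async></script>
```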

> Looking for advice on how to balance prepared material and more spontaneous things.


I love watching some lectures... especially when they are out of my depth, but well presented with good spontaneous interaction. But having it on video makes it easy to go back and repeat for those of us who weren't quite up to speed to be able to appreciate it all on the spot. I'm all for old skool lectures, but do record them!!

This is probably too late, but my favourite CS lecturer is Richard Buckland and he is somewhat famous locally for being an excellent teacher. He has some lectures on Youtube that could serve as some inspiration: https://www.youtube.com/playlist?list=PLA4A262D5911C2721

+100 to this. I had him for H Comp 1A; it literally changed my life.

I just finished taking the DB course here and am waiting for exam results. Our lecturer was of Greek origin and, as such, had an accent. Further, his delivery could be a bit monotone. Those two things, which aren't really big deciders of whether a course is good or bad, drowned out any legitimate issues with the course.

Because he was monotone students started calling him "slow" whenever he tried to collect his thoughts and present things us students obviously hadn't quite grasped yet. He would take 5 seconds about 7 times per lecture to look at the slides to do this.

The point is, you would think CS students were a more logically inclined bunch. Here, at least, that is not the case. They are horrible and vindictive if they get the chance to be. Even people writing longer, well-structured responses to evaluations might be completely off base about what is happening, or be swept up in the meme (used in the academic sense here).

Student here.

One of the biggest things for me is not necessarily the content of the lectures, but what happens outside the lectures. Some lecturers give small non-graded "challenge" assignments that really get me engaged with the content. A couple of my favourite examples of this:

- "Factor this small RSA key" - I ended up reading a paper on, and implementing, Pollard's rho algorithm. It was on the extended reading list anyway, but I'm typically not motivated to read things just for the sake of it.

- "Implement OOP concepts in a "non-OO" language, such as C" - This one was just generally interesting, and helped to dispel some of the "magic" of OOP. It was also a chance to look under the hood of existing OO implementations.
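For anyone curious about that first challenge, the core of Pollard's rho is small enough to sketch in a few lines of Python. This is a minimal version with Floyd cycle detection; the toy modulus 10403 = 101 × 103 is my own example, not the key from the assignment:

```python
import math

def pollard_rho(n, c=1):
    """Pollard's rho with f(x) = x^2 + c (mod n), Floyd cycle detection.
    Returns a nontrivial factor of n, or None if this choice of c fails."""
    x, y, d = 2, 2, 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n          # hare: two steps
        y = (y * y + c) % n
        d = math.gcd(abs(x - y), n)  # collision mod a factor reveals it
    return d if d != n else None

def factor(n):
    """Retry with different constants c until a factor shows up."""
    for c in range(1, 20):
        d = pollard_rho(n, c)
        if d:
            return d
    return None

print(factor(10403))  # toy semiprime: 101 * 103
```

It only beats trial division for larger moduli, but even the toy case makes the "birthday paradox on a pseudo-random sequence" idea concrete.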

I think one cool thing teachers are doing is making homework assignments available as Jupyter notebooks. You can have the coding exercises there and a little write-up about what they need to make happen, and they can try things in real time.

I think CS can be a dry topic and not everyone will enjoy it. I think adding some visualizations like the videos from 3Blue1Brown can be cool. It's sometimes hard to visualize how an algorithm works by reading it. https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw

Best of luck

I actually dislike this trend toward Jupyter notebooks. I think they are only useful in the same contexts as a GitHub gist — in particular a notebook is only good if you can assume it can be entirely thrown away and never needs to persist as a long-term reference.

For pedagogy, if it needs to be interactive, it should just be source code that will be executed in whatever is the idiomatic manner for the underlying language or tool. (But very often, things don’t need to be interactive anyway, and only are shoehorned into it because it’s trendy.)

For example, the Scala extension to Jupyter is painful. Absolute hindrance to learning or teaching or working. Compared with just writing source in a powerful editor and then using sbt, it’s night and day.

I say this as someone who has spent many years working with IPython and Jupyter, including writing 0MQ kernels to use IPython with a company’s in-house language, and working on large data science and machine learning teams.

Thinking of notebooks as discrete units of sharable work, I think, is absolutely eroding more sincere collaboration, pedagogy and reproducibility skills.

It reminds me in some ways of the early 00’s MATLAB fad, especially how The Mathworks targeted students to get them hooked on this one way of working (with MATLAB as IDE, interpreter, presentation tool, etc.) to create a pressure on employers to offer MATLAB as a standard working environment.

Some institutions still haven't been weaned off of MATLAB. In this case, introducing notebooks feels like one way of moving to something that "sucks less".

Could you explain in a bit more detail what's wrong about notebook-style interfaces? I'm interested in them, but haven't used them extensively.

It depends on which intended domain of usage you are thinking about.

For "reproducibility" or "experimentation" in a production-like R&D environment or university lab, there is one set of downsides. For pedagogy there is another.

From a reproducibility point of view, notebooks are bad because you'll always need the underlying software you're writing to have good modularity anyway. One-off helper functions or any pieces of important business logic, etc., need to be factored out into separate packages or libraries that can be imported anywhere they are needed. "Hard coding" implementations into a notebook is very bad for this, because if you don't exercise an unrealistic degree of care, then certain cells or definitions will rely implicitly on values or imports from other cells, and the interdependencies are a mess to untangle if you are trying to port the code from the notebook into a more modular form. Experience has shown me that you don't gain any boost in productivity or speed for experimentation or tinkering if you ignore this and try to make things more modular later on. You just end up writing really sloppy notebooks that have to function as standalone scripts, and you waste the time of some other engineer who has to go back and undo the mess.

If you just start out developing in source files in the natural and idiomatic package / library approach of your underlying language, it gets you 90% of the way there for zero extra effort.

The other big thing is that regardless of whether you are writing experimental code to test an idea or a prototype, or you are writing production code, you should be using code review in both cases. Particularly in the experimental case, as in machine learning or statistics, many of the mistakes that make you go back and waste a lot of time are because of methodological errors or diagnostic errors that should have been reviewed by other researchers before you execute the experiment or prototype -- which means you want your code in a good format for simple code review tools, like PRs in GitHub or analyzing diffs. Notebook formats are notoriously bad for this, and since you can accomplish any of the same dev tasks without a notebook anyway, it's an argument against trying to shoehorn a code review process to use some extra type of tooling for analyzing code review artifacts directly from a notebook file.

Along the same lines, notebooks end up encouraging you to have a large block of imports that implicitly define your dependencies, including resources, settings, environment variables, local files, URL settings, etc. etc. This stuff should always be factored out into a maintainable settings artifact of some type that can also be version-controlled and reviewed.

So once you refactor any kind of reusable logical components, refactor any hard-coded settings, use proper dependency management, and put anything of any importance into source files that facilitate easier packaging / sharing / code review ... all you're left with in the notebook is shallow plotting code, which can just as easily be in a source file as well.

Once at that point, I don't think it matters if it's a notebook or not, and it would be fine to use notebooks as extremely shallow presentation tools that do nothing but import other source packages and produce plots / tables / etc. But again, if you've gone that far, there are better ways to produce the same plotting artifacts, etc., and to get away from reliance on the browser as the display medium, and to improve code review even of the plotting artifacts.

For pedagogy I think the problems are different. First you have to think about what is the goal? If the goal is to teach programming concepts, then you are better off teaching them in the idiomatic execution paradigm of whatever language you're using. So distribute source files and instructions to students, rather than isolated notebook cells. Consuming this stuff by looking at it through a web browser is not a relevant property in this case, because the lesson has nothing to do with how the source code is visualized. You're essentially just teaching the students to be impractically lazy by relying on notebook cell execution as the means to advance in the lesson, when they should be learning the appropriate idiomatic execution mechanism of the language they are learning.

A notebook could be useful for distributing a browser-based visualization of a concept if your goal is just to make a web app or something. Like, maybe you want to show an animation of the heat equation to teach a differential equation lesson, or plot a path of a projectile for physics. What's unique about these cases is that the lesson you're trying to teach is specifically unrelated to any source code, or even any particular language. In that case, it would be better to just make an actual web application using front-end tools. I admit substituting Jupyter is probably fine for this, because it also passes my earlier claimed test: students can treat these types of notebooks as throwaway gists... the source code is wholly irrelevant, only the take away concepts they are visualizing or poking around with using some type of visualization widget.

I think the big lie about notebooks is that these two cases (teaching a software concept and teaching a non-software concept that needs interactive visualization) should for some reason go together.

But 99% of the time they would never make sense together, and it makes the notebook suffer from a problem like PowerPoint being all things to all people. The 1% of the time when it might make sense is if you are actually visualizing a computer science concept itself, like visualizing the steps of quicksort and you'd like the actual source code to be implemented right next to the visualization. But even for this, the visualization of the concept should rarely be tied to a single language implementation, and studying the language-specific implementation should rarely involve visualization.

One advantage of this approach is that the instructor can provide scaffolding to get the students started and later provide solutions where they can see where they went wrong.

A second advantage is that the students have a written record to study. This should help motivated students with the drive to learn the material. Without such a drive, nothing you do will make a difference.

A third benefit to this approach is the ability to add links to good explanations of difficult concepts in the notebook.

Honestly, the things it sounds like you're doing are great. I'd rather have my lecturer be dry and provide all the resources it sounds like you do than the opposite.

Really, the biggest thing for me is making sure the lecture is properly contextualized. I can get the big picture of the lecture, but I can never really pay attention for the full thing, so I tend to learn best from reading the material on my own. For that reason, my favorite courses have been ones where the professor gives a good intuition as to why the content matters and how it relates to the greater body of work. Details can happen on my own time.

I learned data modeling at an internship before my CS class. I learned by watching a guy do it, working in ERwin, while he talked out loud, discussing when to break things up, when he was using a previous pattern, etc. He would then critique my models and others': where they made mistakes, the cost of a de-normalization to clarity, the choices for performance, cases the model didn't handle, etc.

When I later took my database class, it was a cake walk.

He was building a time tracking + project management tool. Easy enough business case for anyone to understand. Enough complexity to drive my learning.

If you can use 'clickers', aka "classroom response systems", then you can make the lecture lively by having it focus on questions, which importantly provide you with proper data on misconceptions in the students' minds. If the correction of such errors is key, making them visible will allow you to attack the most salient points: those errors widely shared. As a bonus, it gives the students something to do, increasing their attention. It's great that you are also giving out the material in advance.

Please no--clicker classes generally suck and distract from subtlety and overarching insights. Lectures without them have, in my experience, been vastly better designed.

I think you can do it right.

For example, my calculus professor used it during the last 5-10 minutes of the two-hour class to go over what we had seen during that class and the previous one.

'Recent' is 5 years ago for me. But as a wild guess, I think this has to do with your use of HTML as the presentation format. It is a text-based, syntax-heavy format. Although it does not need to be dry at all, since it is very powerful when combined with the right CSS and JavaScript, doing that well requires highly specialized knowledge, costs you a lot of your time, and does not provide many benefits for presentation purposes. For the same reasons I would not suggest LaTeX either. I would imagine an HTML-based presentation ending up looking like a 90's web site: plain blocks of text with some headers and an image here and there, no animations. That does tend to look dry and unpolished, not really at the level the average student is used to.

If you want to get creative, MS PowerPoint is much easier to work with, is WYSIWYG, and allows you to easily insert things as you think of them. It has a lot of good-looking templates, transitions between slides, et cetera. It also allows you to easily insert video content, enabling reuse of your past interactive sessions.

I know I'm suggesting that a university professor use a layman's tool, but I am no academic and do not know of a more academic alternative; maybe one exists, and you could take a look. Still, a lot of professors use PowerPoint. Most of them are not very good with it, so it might pay to invest a bit of time into learning it so you can produce a sleek, professional-level presentation. Of course it is MS; in case you prefer open source, you could use LibreOffice as well.

I've always enjoyed teachers who prepare their material well (as you apparently do), but the teachers who stood out for me prepared great material and also presented it well.

Being able to be funny or interesting at times, throwing some anecdotes into the mix, paying attention to the feelings of the students: some things in computer science are hard and tedious.

Recognizing that, and that it can make students have a hard time paying attention or enjoying themselves, might make your class better: for those topics, you would try to maximize the students' attention with what I've mentioned above, throwing in a joke, giving it some pause, etc.

Other than that, I don't believe much more can be achieved. The student must also be willing to learn, and unfortunately not all of them will be willing, at all times. It's just how it is.

I don't teach because it can't pay as well as doing software dev for me (and I don't have a master's), but I've done a lot of stuff at university and taught a lot (was a TA, helped people before exams, etc.) and really enjoyed that time. But I saw that no matter how hard I try, if the other party is not willing to listen, there is not much I can do. That doesn't get me down, though, because for every 1 student that wasn't willing, there were always 20... I hope that later in my life I can teach at college again. I like it a lot, but the financial obligations people have early in life make it hard to go that route, at least in most locations in the world.

Good luck, you are doing a favor to humanity!

There are books that are full of bells and whistles. And there are books where the writing is not necessarily the most polished, and does not follow the most recent hyped style, and they still easily thrive for centuries on the sheer power of the ideas inside. Dostoevsky and Philip K. Dick are probably the standard examples. They give the reader something of value, something that allows the reader to process the world in a new, or additional, light.

I do not know much about databases, but I'd imagine they're one of the backbones of the information age, in particular probably the efficiency aspects. It is powerful material. And so as long as you come into the classroom and make it clear that today you are going to deal with something that makes the student more powerful, the students are going to listen, no matter which manner of delivery you choose. It may not always seem like it, but they are young adults, not little kids whom you need to trick into eating with stories or something.

Being an industry veteran, you likely have a great perspective on what works and why. The usual things are useful: examples, critical distinctions between what could be a good idea and what could be a bad one. But whether it's eventually provided as HTML or not, I don't think, matters.

Some MIT professors use a method termed "practice-theory-practice".



I see this as a "practical" lecture, also from MIT: the professor tells relatable stories to motivate the new concepts. Though I imagine good storytelling is harder than it looks.


(I've actually been so inspired by this approach that I sometimes incorporate it into conversation whenever I really want someone to understand something.)


This is perhaps worth reading: Knuth - Theory and Practice. https://arxiv.org/pdf/cs/9301114.pdf

Disclaimer: I have not read it yet.

This sounds quite reasonable. The biggest criticism I have of some of my lecturers is that some material was only ever available in lectures; this doesn't work for everyone and certainly didn't for me. I found it much better when there were good materials that could be accessed outside of lectures, before or after.

I always found lectures more engaging when they provided something more than the essential course content. Demos and live examples can work well but they can end up being rather lengthy which can be a bad thing. Reading through slides or course notes that are available anyway doesn't really help.

As I mentioned elsewhere on this thread, discussion was in my experience a mixed bag. Some lecturers tried to get discussion or class participation by asking specific people at random, I think this caused people to just avoid these lectures for fear of being picked. On the other hand, just asking questions can result in a long silence before anyone decides to answer, perhaps this is a thing more in the UK, I'm not honestly sure. It certainly can work and it's definitely better with smaller groups.

I found practical assignments were better when they felt less synthetic, it's much easier to motivate to learn for something when the task seems more authentic. For theory, it's always good to have the opportunity to ask questions if given homework, my lecturers were varied in this regard. Questions in lectures can work but some will be reluctant to ask in case it's a "silly" question. My university ran labs with TAs for most units with a practical element, I think these were a better environment as it meant that you didn't have to announce your question in front of everyone else. We had tutorials for some more theory heavy units which provided the same facility.

> I found practical assignments were better when they felt less synthetic, it's much easier to motivate to learn for something when the task seems more authentic.

So much this.

When I was doing organic chem back in the day, I found a lab textbook that pitched each lab specifically in the context of "You are a [petroleum engineer] hired to analyze whether this sample of [whatever] contains [something]," and built up a whole - surprisingly detailed - business context for what you were doing, and why you were doing it that way. It was like OChem: the RPG. It made the actual assignment feel engaging and useful - like leveling up - rather than being a random hoop to jump through because someone somewhere decided this was something that needed learning.

Feeling useless is one of the worst things in the world. Grounding things in reality, building things in the context of what you'd actually utilize it for, is amazing. I wish every class did.

Have you gone through SIGCSE related materials? Checked out the CS Ed Researcher's Facebook group? Maybe looked into the CS Ed Stack Exchange? There's a whole, deep world of CS Ed pedagogy and practice that can probably dramatically improve your teaching.

Not to insult Hacker News, but I think you can get A LOT more from the expert teacher community than the expert software engineering community.

I once attended a CS education seminar. Part of the required reading was 'How Learning Works: Seven Research-Based Principles for Smart Teaching', Ambrose, et al. Out of the flood of materials acbart refers to, I'd suggest it as a starting point and a useful addition to the toolkit.

I figured there were people here for whom the pain was fresh :)

Thanks for the pointers, I will check out those resources.

Are you sure that you should be focusing on your lecture style/content?

I recently graduated from a top-tier CS school that emphasizes systems programming (C/C++, OS concepts, embedded systems, etc.) and found that the overwhelming majority of learning happens when working through implementations and actually writing code (10+ hrs/week). Lectures (2-3 hrs/week) are really just a supplemental overview of concepts; the quality of a course is largely a function of the quality/rigor of the projects. The exception may be courses for students new to CS, who need lectures to understand fundamentals.

Well designed projects with detailed writeups, built-in tests, and live scoreboards created an effective curriculum. As did factoring program performance (runtime, memory utilization, cycle count, etc) into the grade (in addition to correctness).

Really appreciate your willingness to take feedback and your desire to improve, especially given that you already have a wealth of experience. Your students are lucky to have you! Best of luck.

There were five lengthy assignments, three of which involved programming. I think that could be stepped up a little, but what I've already concluded is that next time I need more rapid feedback through smaller, quicker assignments.

I want to second this idea: not getting feedback quickly is one way students fail to realize how far behind they are. In this regard, I think it's worth taking a look at your assignments and seeing how you can make them easier on yourself (e.g. an automated test suite with well-defined inputs and outputs that runs on shared infrastructure and unambiguously tests "works or not" goes a long way).
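The "unambiguously tests works-or-not" idea can be sketched very simply. Here is a minimal, hypothetical autograder harness in Python: it assumes each submission is a module exposing a `solve` function with a fixed signature (the module name, function name, and test cases are all invented for illustration):

```python
# Minimal autograder sketch: import the student's submission and run it
# against fixed input/output cases. Module and function names are hypothetical.
import importlib

TEST_CASES = [          # (input, expected output) pairs for the assignment
    ([3, 1, 2], [1, 2, 3]),
    ([], []),
    ([5], [5]),
]

def grade(module_name: str) -> str:
    try:
        submission = importlib.import_module(module_name)
    except ImportError:
        return "FAIL: module not found"
    passed = 0
    for given, expected in TEST_CASES:
        try:
            if submission.solve(list(given)) == expected:
                passed += 1
        except Exception:
            pass  # a crash simply counts as a failed case
    return f"{passed}/{len(TEST_CASES)} cases passed"
```

A real course setup would add timeouts and sandboxing, but even this level of automation gives students an immediate, unambiguous pass/fail signal.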

Even just splitting the longer assignments into distinct parts so that feedback could be attained part way through would help. That said, 5 for I assume one semester is still pretty good, at least compared to my experience where it would often be 1 or 2.

I just graduated. My favorite form of course, and the one that seemed to be preferred in discussions with friends and peers, was something like the following:

- Homework is between 10-20% of the grade. Collaboration must be allowed. Even allow copying, and allow one or two drops instead of accepting late assignments.

- Weekly quizzes account for ~20% of the grade in total. At my university, we have a computer-based testing facility where students can schedule quizzes at times best for them. Plus, the grading is automatic :). Quizzes should be a little more difficult than the homework, and maybe more expansive/general.

- The rest is exams. Exams should test conceptual knowledge over practical problem solving or particular algorithms/definitions, in my opinion. Also, as I'm sure you're aware, try to avoid subjective or ambiguous questions. Unfortunately, I had my fair share of both.

- Having students write out code on an exam is probably the only way to make sure they understand how to program what is taught. Many people will cheat on homeworks or programming assignments. If you don't have a secure testing facility, they will do anything in their power to cheat on quizzes and tests, too. I worked as a TA and saw some of their techniques first hand.

With that out of the way: try to make homework online and automatically graded. PLEASE record your lectures and allow students to watch them. An auxiliary YouTube channel with videos explaining the concepts is a huge plus. Please try to recommend texts instead of requiring them, UNLESS the text is actually used in the course; it has to be used substantially to warrant requiring it. Do have a few texts listed regardless, though. It will help supplement what you teach. Maybe link to some good sites or references if you know of any.

Our generation isn't all that different; perhaps we prefer more things online!

Competitive programming assignments were very enjoyable for me, by the way.

Thank you for reading my rambling post. Good luck.

I taught for around four years at a coding school that gave me free rein to try whatever crazy experiments I wanted. With the compressed timeline, I also got the chance to try out different things with a new batch of students every three weeks! It was a tight feedback loop, with weekly feedback forms from the students.

As an example, here’s one experiment that proved really useful, a ruby object graph visualizer to explain objects and references: https://github.com/mattbaker/ruby-heap-viz

Your experience teaching is far greater than mine, but I think I might actually have a lot to share in relation to the questions you’re asking, and I love talking with educators.

If you want to chat sometime drop me a line at mbaker.pdx at gmail.

I think we all love stories about how a certain technique was applied in practice to reduce memory use or running time. You can even play with their intuitions, asking how much memory or time they can save by switching an algorithm. And then reveal the truth!

Some students learn most by being taught/shown by someone else, others prefer written material/transcripts, and others like to try things out themselves. There is so much documentation/guides/etc. online that I think, especially for a database course, it would make sense to provide students with pointers on how to play around with a database/dataset at home. Tasks like "optimize xy as best as you can, for pointers see here, here, here" can really spur a student's ambition. You could probably also arrange your "live sessions" around those (extra? -- you didn't mention if students have mandatory homework for your course) assignments.

Current student here!

I really the prepare ahead type of course layout. It gives me the chance to read through, maybe code up a couple of examples and come to class with questions. It also seems like that format of class leads to more interesting discussions on how concepts can be used.

Another thing that I've come to appreciate in University courses is a professor who was in the industry for a while. One professor in particular had a lot of stories about his time at IBM, talked about perspectives on building large systems in teams, thoughts on how to deal with management as well as the actual concepts. It's very neat to hear about what practicing computer science looks like off campus.

> I really the prepare

It seems like you're missing a word here.

To get my attention in a prepared lecture:

- Tell stories that touch on the facts in passing. It doesn't matter whether they're stories about the founders of a field, about your personal experiences, about a fictional startup building a database, etc. As long as it's wrapped in a story, I will probably find it interesting.

- Start with why. Before explaining solutions, explain the problem those solutions solve.

- Gradual build-up of a system, instead of a serial description of its parts. The chapter in DDIA about LSM databases is one of the most engaging technical chapters I've ever read because it starts with a 2 line shell script and evolves it until it is Google's Bigtable.
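For reference, the starting point of that DDIA chapter really is a two-line shell database. A rough Python transcription of the same idea (append-only log, scan for the latest value) might look like this; the file name is arbitrary, and this is the naive starting point the chapter then evolves, not an LSM tree:

```python
# The simplest possible database, per DDIA's storage chapter: writes append
# a "key,value" line to a log file; reads scan the log and return the most
# recent value written for the key.
import os

DB_FILE = "database.log"

def db_set(key: str, value: str) -> None:
    with open(DB_FILE, "a") as f:
        f.write(f"{key},{value}\n")  # append-only: never overwrite in place

def db_get(key: str):
    if not os.path.exists(DB_FILE):
        return None
    result = None
    with open(DB_FILE) as f:
        for line in f:  # keep scanning: the last write for a key wins
            k, _, v = line.rstrip("\n").partition(",")
            if k == key:
                result = v
    return result
```

Writes are O(1) appends and reads are O(n) scans, which is exactly the tension the rest of the chapter resolves with indexes, compaction, and eventually SSTables.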

Flyby comment: As someone who straddled academia and industry (but mostly research; never lectured), you have a unique advantage you should utilise: You've done both.

First, most of your students are not going to be academics. They're going into industry. Second, most academics don't have industry experience. They can't link, with the passion of personal stories, the knowledge in the lectures to the work their students will actually be doing.

You can. Use this; liven up the drier parts of the course with war stories. Give concrete context to the abstract. Give them the experience that only a teacher who's been in the trenches can offer.

So, wow, this is a topic in which I am extremely interested as well. So much so, in fact, that I wrote a book (woohoo!! Only took me a decade) and it has been on the market for a few months. The experienced CS teacher looking for new ways to engage with students is one of the primary target audiences. Anyway, if you think it might be good for some summertime beach reading, check it out: https://www.amazon.com/Computer-Science-K-12-possibilities-i....

I just finished my Associates in CS, and my favourite classes were the ones that were really, really hard. Case in point, my digital systems design final project had only 6 students complete it, and of those projects only 4 actually worked perfectly.

The opportunity to actually struggle in a "safe" environment is extremely valuable as a learning aid.

Nothing demands a thorough understanding of the material like a fairly complex term project.

My databases class had a similar project, and my information security class had weekly long-form research essays. Doesn't matter, as long as it's hard enough to matter.

I had a professor in my undergraduate degree who had a very dry delivery -- just part of his personality, nothing he could do. He started shaking things up in my junior year by working in very funny cs jokes, but with his dry delivery, and the result was everyone began listening to every word he said in case there was some hidden punchline. This could backfire badly though if the jokes aren't truly hilarious, but maybe something to try if you have the knack for it. I say this because some people are just so monotone that this is really their only shot.

For assignments: I would recommend using continuous integration in the classroom. You could have grad students help you set up the system. Here is my related blog post: https://blog.github.com/2017-03-01-real-time-feedback-for-st... For the lecture: this could be subjective, but I would look at online MOOCs to see what presentation styles lecturers use these days.

My recollection is that I usually found the lectures to be as exciting as the lecturer did. So if he was excited, so was I.

Are you perhaps not sufficiently conveying your interest in the topics to the students?

Right now, I am taking the equivalent course to yours in an online CS masters program.

The best thing they've done is structure the entire course around building a web database app. They provide a real-world description of what's needed, and break it down into big chunks that are synchronized with the content of the course's three exams. This gives a clear over-arching goal to the class. People who have never built web db apps will really feel like they've accomplished something by the end.

I took this course in 2012:


Probably the best course I've ever taken, on any subject -- though this one happens to be on relational databases! The model of 15 minutes of lecture, 15 minutes of in-class exercises, and 15 minutes of review of student solutions (and, if needed, "correct" ones) worked really well.

Might be able to incorporate some ideas into what you do.

I had a machine learning class last semester, and one thing I found really useful, which isn't really done in any of my other classes, was that the lectures were recorded and uploaded to YouTube. This allowed me to look at the given slides and essentially "relive" the entire lecture at my own pace, since there were parts that I'd understood quite well the first time I heard them, but other complex parts needed a few pauses and rewinds to better understand what was being taught.

One of my lecturers does a short recap with questions related to the previous lecture, which are examples of what may be asked during the oral exam (if you do that). And there is the small detail that they aren't posted anywhere or handed out, so you have to attend the lecture and follow along to get any use out of them. If you want to see some of his material, Google "imada jamik" and his page should be the first result. He recently did a DBMS course and I have the questions if you'd like to see them.

I'm a student in a Computer Engineering undergrad program.

I agree with some of the answers already here, but here's what I think was missing (at least from my experience).

For the CS classes, I have mostly enjoyed when we actually see things being used (or use them ourselves), and see how they apply to practical use.

To give you an example, when learning about SQLite, we made an application that could 'cache' some content so the application could be used offline, and we also made the back-end, to see how to use two different DBMSes, their strengths and weaknesses, and why some products/services would use DBMS X or Y.
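The offline-cache idea is small enough to demo in a single class session. Here is a hypothetical sketch using Python's built-in sqlite3 module (the table layout and function names are invented for illustration, not taken from the commenter's actual project):

```python
# Sketch of an offline content cache backed by SQLite: fetched content is
# stored keyed by URL, so the app can fall back to the cache when offline.
import sqlite3

conn = sqlite3.connect(":memory:")  # a real app would use a file path
conn.execute("""
    CREATE TABLE IF NOT EXISTS cache (
        url        TEXT PRIMARY KEY,
        content    TEXT NOT NULL,
        fetched_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def cache_put(url: str, content: str) -> None:
    # Upsert: newer content for the same URL replaces the stale row.
    conn.execute(
        "INSERT OR REPLACE INTO cache (url, content) VALUES (?, ?)",
        (url, content),
    )
    conn.commit()

def cache_get(url: str):
    row = conn.execute(
        "SELECT content FROM cache WHERE url = ?", (url,)
    ).fetchone()
    return row[0] if row else None
```

An exercise like this touches schema design, primary keys, and upsert semantics all at once, which is part of why it works well pedagogically.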

The CS class I liked most was given by a professor who did something similar to what you describe: he had pre-prepared material, and in class he mostly focused on answering questions (both questions he had had when he was learning, and questions we asked him), and on showing us the things we were learning about in a public website or a product (or sometimes, tools other professors had made to handle their classes' paperwork).

That specific course wasn't about databases, but you could show the different ways some DBMSes implement a feature, and the trade-offs made for performance, correctness, etc. (think MySQL vs PostgreSQL: easy to use but buggy vs standards-compliant but kind of difficult to set up).

Generally, when a professor shows that he not only understands the material, but can show us how it's useful, or how to think about how it intuitively solves some problem, the class is more enlightening and fun.

Ultimately, I think you can benefit students more if you show them stuff that gets used a lot in the businesses where they might end up working, though there might be exceptions, like academic/subject fundamentals. For example, in my DB course I didn't really find relational algebra useful, because it was covered in class after using SQL, so I didn't see any benefit to learning it. Maybe if I had learned it before SQL, I would have understood SQL faster or more easily, but I ended up seeing it as useless because I thought the actually useful thing was SQL, until we learned SQL optimization. So maybe the ordering was chosen so people wouldn't have forgotten relational algebra by then, but it seemed completely tedious, boring, and useless before we actually used it for something.

Hope this helps!

If you can explain how different textbook scenarios are from real life, that would be great. E.g. why linked lists are not often used, due to cache misses; what the advantage of one DB brand is over another; etc. Even though vendor-specific knowledge is not as useful as general-purpose theoretical CS, it's still pretty important, as ultimately that's what you will be working with in the real world.

Ask a student to come up and explain a small portion of the material. It will expose holes in your explanation and make that individual feel important.

I used to like active exercises: a moderately difficult task, exercise, or activity posed at the end of lecture that you can ponder in your free time (so that I have a chance to solve it before someone yells the answer).

Or some kind of question-and-answer quiz in that HTML, simple exercises to train on, etc. Having simple problems to be solved by yourself works great for learning.

One thing that could spice it up is maybe exploring cloud databases and scalability. Like, what happens when you need to serve thousands of requests a second? How do you need to change your configuration to adapt to that kind of load? That might make the use of EXPLAIN more apparent, as the need to shave milliseconds off your response becomes more pressing.
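Even without a cloud setup, the EXPLAIN effect is easy to demonstrate live. A sketch using SQLite's EXPLAIN QUERY PLAN (the table and column names here are made up purely for the demo; Postgres's EXPLAIN would show the same before/after contrast):

```python
# Show how adding an index changes the query plan, using SQLite's
# EXPLAIN QUERY PLAN. Table/column names are invented for the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE requests (id INTEGER PRIMARY KEY, user_id INTEGER, path TEXT)"
)
conn.executemany(
    "INSERT INTO requests (user_id, path) VALUES (?, ?)",
    [(i % 100, f"/page/{i}") for i in range(1000)],
)

def plan(sql: str) -> str:
    # The last column of each EXPLAIN QUERY PLAN row is the readable detail.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r[-1]) for r in rows)

query = "SELECT * FROM requests WHERE user_id = 42"
plan_before = plan(query)   # typically reports a full table scan
conn.execute("CREATE INDEX idx_requests_user ON requests(user_id)")
plan_after = plan(query)    # typically reports a search using the new index
print(plan_before)
print(plan_after)
```

Running the same query before and after `CREATE INDEX`, with the plan printed side by side, makes the payoff concrete in under a minute of class time.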

Not a teacher or student (for a long time), but you might benefit from general training on giving presentations. There are plenty of YouTube videos discussing this, mostly for business, but I’m sure many concepts could apply in the classroom as well. General concepts on what to put on slides (not too much), how to get discussion going, etc.

One thing that made a huge difference for me: I always did well in, and really liked, classes where the professor wrote their own textbook. I know this approach isn't always feasible, but if you want to really control the coursework and the students' learning flow, I really highly recommend it.

Even if your first few times using the textbook are “choppy” it has some big benefits such as:

You are controlling the flow of the course

You know exactly what is in the textbook and should have a good sense of the quality of answers to the practice homework (this is important for identifying problem points you may want to focus on)

You will also, from my experience, have much more latitude in how you approach things. The professors I have had who wrote their own course materials always seemed to have a good ability to judge what they should or shouldn't go deep on, depending on how the class has been responding

You can save students a ton of money potentially (always a bonus!)

Since you have industry experience, consider adding some of your own war stories about how you learned certain things the hard way at the appropriate points in each lecture. You wouldn’t include these stories in the published material, but people will enjoy them, and the stories may help cement some of the lessons for some people.

I think you have already done a huge amount in the course. You should keep your mind open to the possibility that the students just found the topic of databases to be dry in spite of your presentation. It is quite possible that those people would have really hated the subject if the presentation were not as good as it sounds.

Maybe you could hook up a projector to your laptop and pair program? You could create databases that are suffering from some issue, or need some modification, and just work your way through it with a student. I think it would be really intimidating, but it might be an awesome way to make the class more interactive.

This is such a fantastic question, and I applaud you for improving your craft like this.

Hands-on material is the most unique thing that you can show them. Show them things that a textbook won't. Show them what it's like to tinker and experiment, poke and prod your way towards a solution.

I do engineering education research (and some CS education research) along with friends who teach and are active in the open source community. You should be able to email me via my profile. Go for it; I love talking education with folks and helping them improve their classes.

I’m in a somewhat unique position of being a current undergraduate student, but also a TA, so I can somewhat comment from both the student and the teacher side of things.

* Making the lectures available in an accessible format (I have a slight preference for PDF, but HTML is just fine too) is a huge benefit. Before exams I like to aggregate all of the lecture materials to date into a single monolithic document so I can ctrl+f the whole thing while studying. If you have a proper hierarchy / table of contents this is even easier.

* Learning from a book / lecture alone is really hard. It’s important to not just show examples, but show me how I can run the examples on my computer myself. Something I can interact with live, tweak, play with, add code to is hugely useful for building understanding.

* Don’t assume that I understand the boilerplate, tools, and so on. I’ve had a lot of professors who explain the core material well, but not how to actually open up a text editor / IDE, write code, compile it, and run it. I had already been using UNIX for years before starting college so this didn’t affect me that much, but tooling is one of the #1 issues I see my peers (and the students in courses I TA) struggle with.

* Use lecture to explain concepts, not code, as much as possible. If you show me code in lecture I’m probably not going to remember it well enough, or write it down in my notes well enough, to replicate your example if it’s at all non-trivial. Instead, make video/HTML/PDF tutorials that walk the student through the code example. If you want to show a code example in lecture, walk through one of those tutorials in lecture! Make sure these tutorials explain how to go from sitting at my desktop with nothing open to writing code and having it run, especially early in the course. See [1] and [2] as concrete examples. If you spontaneously come up with a cool demo or something, go for it, but try to record your screen / terminal, and if you can’t get it working, move on quickly. When I TAed my institution’s intro to UNIX systems class, I kept a terminal open on the projector at all times with a `script` session running. I would upload the transcript after each lab session so the students could reference it.

* If it is possible, set aside scheduled time for the students to be in a computer lab working on assignments, with you or a TA there to help them if they have questions. It can be hard to articulate code problems during lecture or in office hours without being in front of a computer with an IDE/editor open. If you have large class sections this may not be a viable option though.

* For assignments and homework, include a clear list of deliverables which the student should turn in. For example “I want a zip file where /myprog.c implements the API described on page X of the homework 3 assignment sheet” and so on.

* If you want students to do something, attach a grade to it. In my experience, ungraded exercises usually result in the exercises remaining undone by the majority of the class.

* Provide a reference library on your course site of functioning code examples, each with a README explaining how to run it, what it does, and so on. Ideally try to demonstrate one concept per sample. This will both provide students with working examples to learn from, as well as be a valuable resource when you get asked questions in lecture and need to demonstrate a particular function call or technique off the cuff.

* Something that one of my past professors did which I found very valuable was to have an "A" and a "B" version of each assignment. Essentially the "A" version would be "get it to compile and implement some trivial facet of the assignment", and the "B" version would be "implement everything in the assignment sheet". The A version would always be due a few days after the assignment was posted, and the B version a week or two later. The A version would be worth like 10% of the assignment score and the B version the other 90%. The value here is that it forces people to start the assignment early (no more students waiting until the last minute to start working on a homework), and gives students an idea of whether they are on the right track; it also gives you a way to ding the student for making legitimate mistakes in a way that won't tank their overall course grade.

* The biggest thing in my opinion is that you are asking these sorts of questions. In my experience, the best learning outcomes always happen with teachers who care and want to teach and share their expertise, and students who want to learn and put forth a genuine effort to do so. When the teacher does not care, even the most motivated students will struggle and have to go learn on their own. When the students don't care, they aren't going to get much out of the class no matter how good the professor is. In that vein, however, try to engage with the students and get them interested - don't just focus on the students who are already motivated and interested in the subject; make sure to emphasize how the course material is practically useful and how understanding it will benefit the students in their careers.

* As some other people have pointed out, take student evaluation forms with a grain of salt. People who get bad grades are likely going to give you a bad rating, and people who get good grades are likely to give you a good rating. At most institutions, the eval forms will have a comments section - that's what you want to look at the most.

If you want to discuss more, feel free to reach out to me. My contact information can be found on my website[3] (I try to avoid posting it on forums and message boards to avoid bots).

1 - https://cse.sc.edu/~jbakos/613/tutorials/setup_tools.shtml

2 - https://cse.sc.edu/~jbakos/613/tutorials/scells_schematic.sh...

3 - http://cdaniels.net/about.html

Don't read your page. Expect your students to read the page. Then go deep and discuss concepts in class. Keep homework interesting, but don't ask for more than one assignment a week, or expect hours to be willingly ground out on make-work problems.

One thing you may consider is, if you aren't already, interspersing a real-world motivation.

So take the current material and make it slightly more relatable via an anecdote about why one may need to use the current technique, or the motivations behind its creation.

I'm a recent grad of a good CS program. I know that I prefer written material to lectures, and my favorite professors were those who had written material (course reader format, not slides) covering lecture content.

Why not just include interactive exercises/puzzles in the HTML documents? I took a db course not too long ago and would have loved it to be more "hands on".

I was thinking about online exercises, although not integrated into the lectures. I'd rather have students stay with the material than wander off to questions.

See David Malan @ CS50 https://youtu.be/y62zj9ozPOM

Watch Prof Hilfingers lectures on YouTube.. brilliant lecturer, was able to keep a room of about 500 students almost fully engaged..

The topics are inherently dry and boring. Let's face it, we aren't splitting the atom or sending someone to the moon here.

Here's a small thing that annoys me: Presentations in HTML. It feels almost like a form of DRM! Please release slides in PDF.

How so?

First of all, it's trivial to convert HTML to a PDF, just hit Ctrl+P in your browser.

HTML works much better when viewed across multiple screen sizes, and has better accessibility features. If the HTML is well written, it will also be perfectly readable in CLI viewers.

PDF is a much more opaque format.

HTML is fine. The problem is when JS is necessary to even navigate. Printing doesn't work well on those. They also tend to not work well on flaky internet connections (4G hotspot) because they load slides lazily or something. So, if it's not too much trouble, please provide PDF as an option for slides.

From an employer view, I’d really like if fresh students actually knew how to deploy the stuff they write.

Look, the material is dry. A lot of it is boring work. How many times have we (HN) collectively started with sign up/log in functionality on a project? The first time it is fun, but after that you're just rewriting code you have written before.

Honestly - if a student thinks your class is dry they are not going to make it to 10 years of experience to get to the actual work.

find some interesting data - aggregate.

A couple of comments from another industry adjunct

1. The terms 'lecturer', 'lecturing' and 'presenting' all lead down the wrong path: a 'lecture' is one person standing at a board talking. I prefer to call it 'classroom time', which gives me complete discretion on the format.

People learn better through interactive experiences. Lectures are not interactive. Showing code is a demo/'applied' but it is not interactive. Interactive means the students are highly involved and are co-driving the class.

To give an example, here is how I taught agile:

- I went to a party store: bought one gold crown, one gold birthday hat, 5 blue hats, and two different colors of stickies.

- I opened the lecture with "there are many agile development processes. I think the best use of time is, rather than talk about it, to do it. So, if you choose, we can either role play it, which requires your involvement, or I can kill you with slides. What do you want to do?" (their choice was obvious)

- I picked a few volunteers: the 'user' got the gold crown, the product manager got the gold hat, and the 5 engineers got the blue hats. The stickies were for features or tasks.

- We then role played two sprints for an airbnb clone from scratch, which was possible in 75 mins. (yes, I covered scrum != agile)

- It was a lot of fun, and the issues that came out in the role play were perfect and gave me the opportunity to stop the role play and talk about the theory, the issues, and how to resolve them (engineers misunderstanding the product manager, developing a common language, engineers asking to extend the sprint, the user being demanding on timelines, value vs. # of features, etc.)

- The class left super excited, and after that day I tried to incorporate a lot more fun and interactive games.

(In a 25-lecture course, over half had interactive games. It took me a few years of learning / experimentation to understand this and to throw away the pressure to 'lecture'. My feedback is positive, and despite >50% of lectures being highly interactive, students want more interactive lectures. Exam scores / grasp of material strongly correlated with interactive sessions.)

2. 'Lecturing' means you need to know your field but also read up on education as a field in itself. How do people learn? How do you structure a lecture? How many points can people handle per hour? How do you control a class / classroom management? (this last one is a lot of fun and easier with a few techniques but that is for another day)

A lecture may then look like a variant of:

- capture attention with clickbait: why this issue really matters, how it is hard, how it really applies in industry, etc.

- open out on the theory

- apply it to a problem / example

- interactive class exercise

- post mortem of the class exercise (what did we learn)

- take a 5-minute break with a joke / discuss the time I failed / saw this fail (this is the value of an industry adjunct ...)

- some more theory

- wrap up / re-summarize

(Sometimes the class exercise came second, to make the point, and then we did the theory.)

Set expectations in the first lecture. My class is over-subscribed so I set expectations and if they don't want to meet them then let someone on the waitlist take the class.

Care about your students. Learn their names.

Make content relevant to them. "If you want to do X you need to know this", "This is the thing first-year software engineers sometimes struggle with", "this is a super common interview question".

Be open with your class about your failures.

Ask your class if you have convinced them yet! Make it a challenge. If one of them says 'no' then gladly accept the challenge! (if one person says no then >30% of the class probably feel the same)

Some students were nervous about contributing for fear they may be 'wrong'. Always thank them for their input and acknowledge any merit their answer has ("it would be correct if X; however, Y causes Z to be more important").

3. Oh my word does it take time. The first year I found it took me ~8 hours to produce each lecture from structuring course, research, crafting learning outcomes, devising lecture, creating materials, practice, tweak, deliver, etc.

4. Lecturing is like coding. You should look back on the lectures you delivered a year ago and cringe. That means you are developing. I still cringe hard =)

I've started teaching part time - web dev for grad students in IS, halfway through my 5th semester.

It has been fascinating, learning what does and doesn't work in teaching. I'm still a fumbling amateur, but I'm getting rave review scores so far and I know what I'm trying to emulate.

Given your experience you likely know everything I've gotten so far, but on the off chance there is something helpful here:

My best reactions come from the fact that I have a class Slack workspace and am pretty reachable there. I've found Blackboard to be monstrous and Outlook is terrible when it comes to sharing code. I have a github repo for each student that I send class notes, examples, and assignments to (also how they submit their assignments). Because I use tools they will likely use on the job, I can talk about those tools from a future professional stance (such as how code reviews are NOT judgements, but opportunities, or posting code snippets, or pull requests and merging). My availability to answer simple questions at random hours is highly praised, even if 50% of the questions just require me to restate something that was explicitly raised in class.

Next big praise from students comes from discussing coding on the job. Things like "most classes teach you to write a new program to solve an issue, but the vast majority of your work will be changing existing code" and then giving them practice at that, showing them how solutions that work in the short term can cost you when you need to make changes.

I make a point of trying to build from fundamentals. For example, while my class is JS and React specific for the tools, I first have them write HTML to interact with a backend of any language, then write a dynamic front end with vanilla JS, then write with React, while stressing that the lessons they learn should be conceptual and not React specific, because React will get replaced like every library before it. I have to walk a fine line between giving them enough to be suggestive while not reducing them to just copying verbatim. Can't say I've nailed that down, but I feel my students end up better prepared to learn something else than if they had just churned out an app using a framework without having a clue WHY it works or how to use any other tools. (On the whole, this experience has not left me impressed with the normal curriculum/habits of schools, even as I have gained sympathy for the difficulty.)

Where I'm weak is having prepared material... I end up doing most classes on the fly, with only a vague list of what to cover each day. I reuse some material from semester to semester, but I'm experimenting so much that often it is just that the material is more familiar to me, not that I'm actually reusing slides or written code or even assignments.

I'm also weak on TAs and office hours. I'm available before class for an hour each week and have TAs for the larger semesters, but these are the wants that students complain are the least fulfilled.

I have separate Slack channels for good topical articles that aren't required for class, as well as for job hunting/interview tips. These get positive comments but not rave reviews, yet former students who weigh in (I have a separate Slack workspace for former students) often benefitted from these.

Being web dev, this involves minimal algorithm work, or even much in the way of performance concerns, and though I make a point to cover SQL injection, salting, hashing, and the basics of public key encryption, these are easier to tie to practical demonstrations than what you list. (It's easier to google for companies with hugely embarrassing exposures due to poor web security than for poorly written outer joins, because the latter stay private.)
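To make the SQL injection demo concrete: here's a minimal, self-contained sketch using Python's built-in sqlite3 module (the table and the malicious input are invented for illustration) showing why parameterized queries matter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Classic injection payload: closes the string literal, then ORs in a tautology.
evil = "' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the query.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '%s'" % evil
).fetchall()
print(len(vulnerable))  # 1 -- the injected OR clause matched every row

# Safe: a ? placeholder treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (evil,)
).fetchall()
print(len(safe))  # 0 -- nobody is literally named "' OR '1'='1"
```

In class this pairs nicely with letting students craft their own payloads against the vulnerable version before showing the one-line fix.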

Hopefully something there is helpful, and if not I'm very open to hear where I can improve myself.

Not sure if it'll help, since I don't have any teaching experience, but coming at this from the perspective of a student, thinking of what your students might be thinking given what they're saying:

First, "not dry" doesn't necessarily mean "not prepared" or "spontaneous". Consider things like the SR-71 story [1] or the unprotected VNC endpoints presentation [2].

Second, keep in mind what you can give students that they can't get from a textbook or another lecture that they find on youtube. The best professors I had weren't the ones that could quote the textbook; the best professors I had were the ones that could provide context for the textbook, or tell stories that illustrated important edge cases or institutional knowledge that isn't written in any textbook. There's a huge amount of wisdom that could never be "professional" enough to make it into a textbook but is nevertheless fantastic for helping a student understand or remember the material. For example, consider the "Things I Won't Work With" series [3]; it would never get past a textbook review committee, but it's absolutely riveting on top of the information transfer.

Third, and finally, how these relate to helping students absorb knowledge. Consider spaced repetition and the method of loci [4], and what they mean about how the human brain forms knowledge: people remember networks of knowledge, not single facts. That's why I cite war stories above - giving students a story to wrap around an idea lets them link to it from multiple angles, each one reinforcing the memory or giving the brain a second or third chance to successfully record it. Interactive discussions are the same way; remembering a monologue is much harder than remembering the same information packaged as a Socratic dialog with a few good jokes thrown in. That's why everyone else is recommending question-and-answer sessions and cold-calling; human memory is better at handling conversations than it is at speeches.

[1] https://oppositelock.kinja.com/favorite-sr-71-story-10791270..., https://www.youtube.com/watch?v=Lg73GKm7GgI

[2] https://www.youtube.com/watch?v=hMtu7vV_HmY

[3] http://blogs.sciencemag.org/pipeline/archives/2010/02/23/thi...

[4] https://en.wikipedia.org/wiki/Method_of_loci

Depending on what school you're working in, a lot of them have an academic technology office or whatever, that will come and record classes for you: both your screen and you talking, and if you're really lucky, you'll get a camera-person focusing on the person talking when in-class discussion is happening. Absent that, you can get very far by just doing your class on youtube live or google hangouts with recording enabled - and just repeating questions for the mic.

Whatever the difficulty in recording though, I really would encourage you to keep that format, as it's much more engaging and educational than a lecture. When someone in class poses a question and others answer, it creates a natural segue into a thinking and questioning process in every student, which otherwise requires laser focus and reasoning while also listening to the material.

I'll second philip1209's comment elsewhere in this thread, though: during my undergrad and my master's, my favourite classes were always very discussion-heavy. Some of them required reading up ahead of time, plus a menial task to make sure you actually did it, e.g. read a paper or a chapter, then summarize it in half a page to a page and submit it online. Then the rest of the class is mostly Q&A where the professor steers the agenda. In graduate-level ones, sometimes the students had to present each class and then everyone discussed during.

One thing that discourages teachers often is that it's difficult to get students to say anything. Believe me that this changes quickly if it's nurtured. One prof had a goofy question mark face symbol and explicitly said he expected a discussion whenever you saw that logo when he was teaching. The first few times he had to sit and awkwardly wait for a whole minute or two before someone said a word - like a game of awkwardness chicken, where someone eventually can't take it anymore and blurts out something, which is exactly what you want, so you can build the discussion on that. Once someone says something the ball gets rolling. He also did a thing where he'd gradually simplify the question every time no one answered ("ok well how would this work if you didn't have to worry about ..."), to the point where often the question would be something self-evident that was hard to not answer. Another tactic was to never shame a "silly" answer but to pull out a good bit about it and use it to direct the next question.

One thing that probably helped was that these were often graduate or junior/senior level classes, so people had mostly chosen to be there. Another thing was that the premise of the class was that it was mostly discussion based with some presenting in the middle, which works a lot better than mostly presenting with a tiny bit of discussion, because of the overhead of effort of getting the ball rolling in a discussion environment. Finally, there was a no laptops or phones rule made very clear during class selection period, which really did do wonders. The justification was "we tried it a million times and it just doesn't work" - apparently it's really hard to maintain attention for an hour and a half when distraction is a glance away, and you don't want to miss anything or it's easy to get left behind.

To add to this, even in non-discussion classes, what helps a lot is structuring things like:

- What is our problem? (queries are slow)

- Why do we have this problem? (scanning the whole table takes a lot of I/O, which is expensive)

- How can we solve this problem? (indexes help you know where content is so you can do less I/O)

- Demo of it

- When does this not work? (when your indexes aren't selective, sometimes they can be slower than just a scan, or when it hurts insertion speed)

etc etc.

Even if you mentioned what we were trying to do once in the beginning of the class (or worse, beginning of the semester), it helps a lot to go back and keep referring to it every time you introduce some new thing just so we know how this new little tidbit helps with the original issue.
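That problem → cause → fix → demo arc maps directly onto a live demo. Here's a minimal sketch with Python's built-in sqlite3 (the table, column, and index names are made up for illustration, and SQLite's plan wording varies a bit by version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("cust%d" % (i % 100), i * 1.5) for i in range(10_000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports the access path the optimizer chose.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return rows[0][-1]  # the human-readable "detail" column

q = "SELECT * FROM orders WHERE customer = 'cust42'"

# Why it's slow: no index, so SQLite must scan the whole table.
print(plan(q))  # e.g. "SCAN orders" (full table scan)

# The fix: an index on the filtered column...
conn.execute("CREATE INDEX idx_customer ON orders(customer)")

# ...and now the plan shows an index search instead of a scan.
print(plan(q))  # e.g. "SEARCH orders USING INDEX idx_customer (customer=?)"
```

Re-running the same comparison against a low-selectivity column makes a natural follow-up for the "when does this not work" step.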

You're probably a wonderful person and I mean absolutely no mean-spiritedness whatsoever. But why do we actually need new lectures anymore? Nowadays any time-limited, only locally accessible lecture will have to compete with the globally maximal presentation style, namely whatever Stanford or Harvard or whoever creates. If you don't think you can do better than that, why not just borrow what they've already done and use your time more appropriately, e.g. by spending time on one-on-one meetings with students - something that online resources rarely if ever provide.

If lectures are only treated as talking heads, you might have a valid point. However, I think lectures ideally involve a lot of interaction between students and the lecturer which you can't get with pre-recorded material.
