Hacker News
Advanced Data Structures (mit.edu)
1276 points by ingve on Nov 4, 2016 | 141 comments



I never hear anybody mention him, but Jeff Erickson's 'Algorithms' textbook [1] has some of the most lucid explanations I've come across. CLRS is oftentimes impenetrable, and whenever I didn't like its explanation of something I turned to Jeff Erickson's book; it hasn't failed me yet. I'd urge anybody trying to solidify algorithms and data structures to take a look at it.

[1] http://jeffe.cs.illinois.edu/teaching/algorithms/


Thanks for the shout-out! If you can't remember the long URL, http://algorithms.wtf also works. Please send me bug reports!


Looks great, thanks a lot!

While casually browsing I noted the following in the 8-queens backtracking example:

"The following recursive algorithm, essentially due to Gauss (who called it “methodical groping”), ..."

Hilarious, especially given the current situation. I'd like to know what his original German term was.


Oh this is a fun rabbit hole to go down:

Gauss indeed wrote "methodisches tatonniren"[1], the latter of which is not a common German word, nor does it suggest groping. I believe it to be an expression used at the time, borrowed from French "tâtonner", which means "fumble", to describe a step-by-step process of "approaching" a solution. The closest I can get in modern German is "herantasten", which in turn has no elegant English translation; you will have to take its elements "heran" (onto) and "tasten" (touch/feel/grope) individually to judge how close it is to "fumble".

So, methodical groping is not as far off as you might have thought!

[1] https://books.google.com/books?id=3jEDAAAAQAAJ&pg=RA1-PA106&... This may be a typo—the regular verb form would be "tatonnieren"—then again a lot of German spelling has changed since the time of those letters.


The word "tâtonner" should be the same as Italian "procedere (proceed) a tastoni", since the circumflex accent in French usually corresponds to an "s" in Italian.

The meaning is "feel your way": proceed carefully and at every step feel what is around you, as if you were walking in complete darkness.


>proceed carefully and at every step feel what is around you, as if you were walking in complete darkness.

This would be a fitting translation of the German word "herantasten", which has nothing to do with groping, btw.


Yup, "groping around" is not at all the same as "groping"!!!


I just skimmed the skip list material in point 10 of your book. The concept and definition of a skip list are easy to grok and well explained, but in point 10 there isn't any information about when I should use a skip list or what kinds of problems it's a good tool for. I have bookmarked your page; it seems like a useful resource, especially if complemented with comparisons and use cases for all these data structures.


A skiplist is a good candidate for problems where you are considering some kind of self-balancing tree. I find a skiplist to be the simpler structure in those cases.

I personally used it in an observable collection library, where I needed efficient indexed insert and the ability to compute the index of a node.
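For a sense of why a skip list is often the simpler structure, here's a minimal sketch (the class names and the 1/2 coin-flip probability are my own choices; it supports search and insert but omits deletion and the indexed access mentioned above):

```python
import random

class SkipNode:
    __slots__ = ("key", "forward")
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level  # forward[i] = next node at level i

class SkipList:
    """Sorted set with O(log n) expected search and insert."""
    MAX_LEVEL = 16

    def __init__(self):
        self.head = SkipNode(None, self.MAX_LEVEL)  # sentinel, never compared
        self.level = 1  # number of levels currently in use

    def _random_level(self):
        # Coin flips: each extra level with probability 1/2.
        lvl = 1
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def contains(self, key):
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # update[i] = rightmost node at level i that precedes `key`
        update = [self.head] * self.MAX_LEVEL
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = SkipNode(key, lvl)
        for i in range(lvl):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

No rotations, no rebalancing: the randomized levels do the balancing for you, which is most of the appeal compared to a red-black or AVL tree.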


Thanks.

Given a directed graph with a maximum flow (capacity) and a cost per unit of flow on each arc, how do you find the least-cost flow for a given total flow? That is linear programming. The simplex algorithm takes on a special form: a simplex basic solution corresponds to a spanning tree, a simplex pivot adds an arc to the tree to yield a circuit, and computing the cost of sending a unit of flow around that circuit evaluates the new arc. Etc.

W. Cunningham defined strongly feasible basic solutions, which avoid cycling.

IIRC, D. Bertsekas has a good (polynomial) algorithm for that problem.
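Not network simplex, but for a concrete baseline: a successive-shortest-paths sketch of min-cost flow (the function name and interface are my own; Bellman-Ford handles the negative residual costs, and the running time is pseudo-polynomial, unlike Bertsekas's polynomial methods):

```python
def min_cost_flow(n, edges, s, t, target_flow):
    """Send `target_flow` units from s to t at minimum total cost.
    edges: list of (u, v, capacity, cost). Returns (flow_sent, total_cost)."""
    INF = float("inf")
    graph = [[] for _ in range(n)]  # adjacency lists of edge indices
    to, cap, cost = [], [], []
    def add(u, v, c, w):
        # Forward edge at even index, residual reverse edge at odd index.
        graph[u].append(len(to)); to.append(v); cap.append(c); cost.append(w)
        graph[v].append(len(to)); to.append(u); cap.append(0); cost.append(-w)
    for u, v, c, w in edges:
        add(u, v, c, w)

    flow = total = 0
    while flow < target_flow:
        # Bellman-Ford shortest path by cost in the residual graph.
        dist = [INF] * n
        parent = [-1] * n  # edge index used to reach each node
        dist[s] = 0
        for _ in range(n - 1):
            changed = False
            for u in range(n):
                if dist[u] == INF:
                    continue
                for ei in graph[u]:
                    if cap[ei] > 0 and dist[u] + cost[ei] < dist[to[ei]]:
                        dist[to[ei]] = dist[u] + cost[ei]
                        parent[to[ei]] = ei
                        changed = True
            if not changed:
                break
        if dist[t] == INF:
            break  # no augmenting path; the requested flow is infeasible
        # Bottleneck along the path, then push flow (ei ^ 1 is the partner edge).
        push = target_flow - flow
        v = t
        while v != s:
            ei = parent[v]
            push = min(push, cap[ei])
            v = to[ei ^ 1]
        v = t
        while v != s:
            ei = parent[v]
            cap[ei] -= push
            cap[ei ^ 1] += push
            v = to[ei ^ 1]
        flow += push
        total += push * dist[t]
    return flow, total
```

Each augmentation pushes flow along the cheapest residual path, which is the LP-dual-friendly cousin of the pivot-around-a-circuit step described above.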

I didn't see such mentioned in your table of contents.

On my 14" screen, your PDF files would be much easier to read if the maximum number of characters per line was about 50.


Thanks. Easy to remember domain :)


TIL there is a .wtf TLD.


There's also a .study TLD. And algorithms.study is available!


Hi, I looked at some of your lecture videos which also look great. You have a wonderful teaching style. Is there any possibility of these lecture videos getting on youtube? Watching is a little choppy. Cheers.


This is awesome. I'm loving the models of computation notes.


Thanks for making the resources free.


What resource bro...


For beginners, I've always thought Aho, Hopcroft and Ullman's Data Structures & Algorithms is a bit under-rated [1]. It's short, mathematically inclined and contains very clear Pascal pseudo-code.

[1] https://www.amazon.com/dp/0201000237


(edit) Their book, The Design and Analysis of Computer Algorithms, impressed me greatly for the clarity of the thinking and conciseness of explanation, not just compared to other algorithm books, but against anything I've read in any subject.


Jeff is also a great professor. If you ever get the chance to take a course with him, do it. You won't regret it. I had him for both 173 and 373 back in the day, and credit his teaching style for the immense amount of material I picked up.

Also, this: http://jeffe.cs.illinois.edu/teaching/pikachu.html


I'm currently taking 374 (the new 373). Unfortunately Jeff isn't leading my lectures, but the couple times he's subbed in were excellent. Really looking forward to taking 473 with Jeff next semester!


I personally favor TAOCP much more than CLRS. TAOCP is so much fun to read; the historical accounts are so appealing.

"The Algorithm Design Manual" is another favorite of mine.

I never found CLRS appealing to read outside of its use as a textbook in a formal class.


It is strange that neither Google nor Amazon can find this book:

https://www.google.com/search?q=Algorithms+Jeff+Erickson+sit...


According to the author, it isn't a textbook: "Am I writing a textbook? No. All textbooks suck."


So why can't Google find it? If it is an article, an essay, or a series of lectures, Google should find it.


It's only available as an ebook.

I found it trivially at

https://www.google.com/search?q=algorithms+jeff+erickson


I had no idea about this. This looks fantastic! There's a whole chapter on "advanced dynamic programming"!

I know people like to debate the merits of "the algorithm book", be it CLRS or Dasgupta etc., but I find that having many such books is extremely beneficial: occasionally one author will articulate a particular insight in a way that resonates with me and makes me realize I maybe hadn't completely internalized that thing before, despite having read similar content from other sources on many occasions and believing I had.

Anyway this looks like a great such addition to my personal canon. Thanks for sharing!


is there a print version?


No but I printed this PDF (http://jeffe.cs.illinois.edu/teaching/algorithms/everything....). 1250 pages. Not sure about the legality of it.


Totally legal! (Just don't sell it for profit.)


Why don't you sell it for profit? ;-) Lulu or Kindle if you don't want to deal with editors.


+1

As a kindle user I'd love to see a kindle edition.


I would happily pay for both a kindle and print version.


Fantastic resource. Do you have any more like it? I'm always hungry for stuff like this. :)


> Not sure about the legality of it.

Says on the title page: Creative Commons!

"Attribution-NonCommercial-ShareAlike 4.0 International"

https://creativecommons.org/licenses/by-nc-sa/4.0/


Suppose we want to prove that every string is perfectly cromulent. I love it already.


Great link, thanks for sharing.



ooh, I had forgotten that Erik Demaine taught that course. He's great (MacArthur Fellowship, MIT professor at 20, etc.)


Ah; thanks!

I've been trying to find them on the site to no avail.


this looks really excellent!


This is very nice.

But right now, as a programmer, I use data structures more on an as-needed basis. Sometimes it is difficult to find the right data structure for the job. So it would be nice to have a resource that presents the functionality of data structures as black boxes. Learning that would make more sense than also learning all of the gory technical details upfront.


Part II of Skiena's wonderful "The Algorithm Design Manual" is basically a 300 page, high-level overview of a wide range of advanced data structures/algorithms, practical insight into when to use them, and advice on which implementations to use.

Even if you already have and are happy with CLRS, Skiena's book is a great addition to your library.


I started reading it this week -- it's a lot more conversational than CLRS and the explanation of big-O, omega, theta was much clearer.


As you get into advanced data structures, you often need the gory technical details to make correct decisions about tradeoffs and usage. Succinct data structures, for instance, which are a specific thing and not just a generic adjective, are almost miraculously powerful, but you'd better understand exactly what the preconditions for using them are (which is a fairly mathematical discussion) and what the tradeoffs are before you use them because you will not be able to just "tweak" them and fix them up later if you're wrong.
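As a toy illustration of the flavor (this is my own simplified sketch, not one of the structures from the lectures; real succinct rank structures use two block levels plus word-level popcounts to reach n + o(n) bits):

```python
class RankBitvector:
    """Static bitvector with fast rank queries.

    Stores the raw bits plus one running count per `block` bits, so the
    extra space is o(n) for a growing block size; rank then scans at
    most one block (constant work for a fixed block)."""

    def __init__(self, bits, block=64):
        self.bits = bits
        self.block = block
        # blocks[i] = number of 1s strictly before position i * block
        self.blocks = [0]
        count = 0
        for i, b in enumerate(bits):
            count += b
            if (i + 1) % block == 0:
                self.blocks.append(count)

    def rank1(self, i):
        """Number of 1s in bits[0:i]."""
        bi = i // self.block
        return self.blocks[bi] + sum(self.bits[bi * self.block : i])
```

The point is the precondition: the bits are static. Rank and select over such a structure then power wavelet trees, compressed suffix arrays, and the like, but an update invalidates the counts, which is exactly the "you can't tweak it later" trap.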


I read some of the scribe notes on succinct data structures from the link; it jumped out at me as something interesting. But going from `O(n)` to `n + o(n)` doesn't seem especially game-changing -- what led you to describe it as "miraculously powerful"?


The constant reduction is amazing, for those precise use cases where you can afford to use them. Not all linear is created equal: O(10n) = O(n/10), but in practice the performance difference of the inner expressions can be huge.

(Note that this is a correct usage of the O notation; because those two things are equal, we generally just write O(n), since for clarity you might as well use the simplest member of the equivalence class the O notation defines, but it is still valid to write O(n/10). It's just an error to treat it as anything other than equal to O(n).)


With that being said, would you agree that in general one does not need to be intimately familiar with these to be a productive, useful software engineer?


I'm often amazed at how far a key/value store and an array type can get you.

But when you need more, it's good to know your options.

In fact I'd say nobody can afford to be intimately familiar with all the various advanced data structures. But most people will, at most, just need to know the things they might need, then go learn about them when they need it, and many won't even need that much.


Depends on what counts as productive. If your job is to get tens of billions of things done in a second, then your productivity depends a lot on the cache locality properties of your data structure and on whether you can allocate your space up front.


I generally would.


Or you just use strings (and regexps to process them)!


Now you have two problems.


Now you have a number of problems unbounded in the worst-case.


Often shied away from as too complicated, this book deserves the obligatory mention: Knuth's The Art of Computer Programming. Although it doesn't use Python and is perhaps too analytical and detailed for the average programmer's taste, it is the single biggest classic treatise on algorithms and data structures.

At the other end of the spectrum (accessible and brief), I find Dasgupta et al.'s Algorithms a refreshingly engaging read.


This book is often recommended, but I would dearly love to know how many of its recommenders have actually read it.


I have read the first 50 pages about a 100 times.

Does that count?

P.S. I have never recommended it to anyone, though; I don't recommend books I haven't read all the way through. I will get through TAOCP one day (it's now on my retirement bucket list, along with learning Sanskrit), but first I need to read through a bunch of math books so I can get past the 50-page hump.


I wonder, what might be a suitable set of books to read before TAOCP?

Might this one suffice: Concrete Mathematics: A Foundation for Computer Science (Knuth et al.)?


I think yes, but then I have not read TAOCP beyond the first chapter.

What I can say is that "Concrete Mathematics" is a terrific book. It is the best general "math for algorithms" book I have seen. Its exercises are wonderfully picked, with problems at all levels of difficulty and classified neatly into categories. It also discusses topics that for strange reasons are not as widely known as one would expect, even though they are extremely elegant and quite useful. A concrete example of this is the Euler-Maclaurin formula relating sums and integrals.


off the topic, but what made you curious about Sanskrit?


I'm not the OP, but can comment about Sanskrit a bit. (I had studied Sanskrit in school.) [Edit:] As a third language, apart from English and Hindi.

This is mostly all IMO, except where specified otherwise:

- it's an interesting language, because it is the ancestor of many Indian languages (except the languages of the southern Indian states - Tamil, Telugu, Malayalam and Kannada - and their dialects - and except the languages of the North-Eastern states beyond Assam, maybe - not sure on this one). In fact, even those languages (southern Indian at least) have some Sanskrit-derived words; I think the term is loanwords.

- its grammar rules are very regular (compared to many other languages, I've heard), which makes it less of a chore and easier to remember and learn;

- (probably many) parts of the language are sort of mathematical / logical in the sense that some of the rules build upon other rules, so you can use your knowledge of the latter to understand the former; e.g. a part of the language called sandhi, which gives the rules for how to combine smaller words/sounds into larger ones. Once you learn those rules (and there are only a handful), a) you can use them to create new words according to those sandhi rules, by combining existing words (and those new words would be legal Sanskrit, BTW - poets and prose writers did it all the time), and b) you can use them to figure out what newly encountered compound words mean, i.e. the reverse process of a). So in that and maybe some other aspects it's a bit like a programming language [1], where you build up higher abstractions from lower-level ones. [Edit:] It's my guess that people who like languages like Lisp, Haskell, Clojure, etc. - languages built upon mathematical principles and so on - might find Sanskrit interesting to learn.

- [1] I've also read a few times that, due to the mentioned regularity of the grammar (and also, I think, due to the unambiguous nature of its sounds - each letter or small letter grouping can be (legally) pronounced in only one way), CS and Sanskrit-knowing people have done research on using Sanskrit in computing in various ways (programming language research, AI?). No idea about the success or not of those efforts, though.


Re: programming languages, it's worth mentioning that a Sanskrit grammarian invented something like context-free grammars around 2500 years ago: https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_form#Histo...


Cool. I knew about Panini as a grammarian, but don't think I knew the part you mention about context-free grammars.


An interesting thing from that Wikipedia article:

"The name Pāṇini Backus form has also been suggested in view of the fact that the expansion Backus normal form may not be accurate, and that Pāṇini had independently developed a similar notation earlier."

So that may be one of the (few or common?) cases where something was named (or considered for naming) after a person thousands of years after the person lived :)


Cool context! Thanks. I've often been tempted to learn Esperanto because I've heard arguments made for its general logical consistency. There's also a constructed language called Toki Pona [1] that is fascinating for similar reasons.

[1] https://en.wikipedia.org/wiki/Toki_Pona


Interesting, will take a look.


Indeed, these are all good qualities of a fine language. It's just that I am not sure it is those qualities that attract people and make them want to learn it. There must also be something else.


Shame I saw this comment so late, but I do have a reason for this.

I'm an enthusiastic student of Buddhism. A long time ago, during university (late 80s), I attended a guest seminar called "What the Buddha Said". It was fascinating for a number of reasons, but the one that stuck with me over the years was about translation.

As we know, language changes over time (my favorite example is from rap: "Not bad meaning bad, but bad meaning good"). What had happened in the translation of a key Buddhist text hinged on the understanding of one word: year. It was translated literally to mean a year, but at that moment in time the word actually meant a season. So "20 years passed" really meant 20 seasons, i.e. 5 years had passed. And this had a knock-on effect on the meaning of the words used.

So after that, I decided that one day, I will learn to read the texts in their original language.


Who mentioned anything about Sanskrit...


It's a wonderful read, and a good education.

On the other hand, it isn't a very practical solution to most people who are thinking "I'm not sure what algorithm to use here".


I don't think the point of this book is to read it cover to cover, or in this case cover to cover to cover to cover...since it's going on 4 volumes now. I think it's meant more as a reference. Pull it off the shelf and read the relevant section when you need it.


I wouldn't recommend reading this book end to end, but it is good to own. When I used to compete in programming competitions, from time to time there were interesting problems whose algorithms were explained in this book but not in others. It is a good reference overall.


I'm in the process of reading Vol 1 atm. It's densely packed but really well presented so far. I should caveat that I'm still near the beginning (learning MIX), so I can't claim that it will remain that way, but so far I'm really enjoying it.


Well, it is a bible, after all.


There's an algorithm for that ...


It ends up in an infinite loop for me :(


List of videos/notes 2014: http://courses.csail.mit.edu/6.851/spring14/lectures/

(spent 2 min trying to find them)


They're also on YouTube under the OCW channel.

https://www.youtube.com/playlist?list=PLUl4u3cNGP61hsJNdULdu...


I've been watching the lectures & recitations from 6.006 Introduction to Algorithms (Fall 2011) to brush up prior to an interview. Erik Demaine, Srini Devadas & Victor Costan (recitations) have been an amazing resource.

I've learned so much and am really impressed with their depth of knowledge and how they are able to convey complex ideas in a very easy to understand way. I can't wait to start the next courses.


A very underrated book and one of my favorites is Udi Manber's Introduction to Algorithms. Highly recommended.


I used that book for my dissertation. It had the best explanation of the knapsack algorithm, if I recall correctly.


Thank you for sharing this.

Would anyone recommend resources for learning the fundamentals of data structures?

Books, videos, or courses are all welcome. I don't care which programming languages are used for the implementations. I am OK with C.


Skiena's book is excellent and is definitely one to get.

Another would be Sedgewick's Algorithms. He has a pair of courses on Coursera [0] that follow his 4th edition which uses Java.

You're in for such a treat! Learning data structures and algorithms is so fun and really changes the way you think about programming.

[0] https://www.coursera.org/learn/introduction-to-algorithms


Thank you for your recommendation and advice. I will go for Skiena's book since a lot of people recommend it.


The Algorithm Design Manual by Steven Skiena

Really enjoyed that book! It intersperses real world use cases which helps when your motivation starts to wane and has a warmer tone than most algorithm books. That being said, it is still pretty rigorous and will take a lot of work to get through it all.

Whichever resource you choose, make sure it has exercises and do them! Even if you can't 100% figure some of them out just trying to will help your thinking immensely.


I'll second that recommendation and op makes an important point that is rarely expressed when threads on learning algorithms show up on HN: Don't be discouraged if you find the material hard at first, "Even if you can't 100% figure some of them out just trying to will help your thinking immensely."


Thanks a lot.

I like your advice.


Open Data Structures [0] by Pat Morin is a good one, and it comes in 3 editions (pseudocode, java, and c++) so you can pick whichever you prefer.

[0]: http://opendatastructures.org/


In their book The Practice of Programming, Kernighan and Pike explain the really important stuff in about 30 pages, chapter 2 in the book. They cover four data structures: array, list, tree and hash table, and two kinds of algorithms: sorting and searching. They say that just these are enough for most common situations. (They use C.)


Thank you.


If you are interested in Video lectures, there are plenty of resources available at this link: https://github.com/Developer-Y/cs-video-courses

For data structures in specific, You can go through following courses

1. CS 61B, Prof Jonathan Shewchuk, UC Berkeley -> [1]

2. COP 3530 Data Structures and Algorithms, Prof Sahni, UFL -> [2], videos are available at [3]

3. COP 5536 Advanced Data Structures, Prof Sahni - UFL [4]

All the above courses focus more on ideas than on mathematical rigour.

[1] https://www.youtube.com/playlist?list=PL4BBB74C7D2A1049C

[2] http://www.cise.ufl.edu/~sahni/cop3530/

[3] http://www.cise.ufl.edu/academics/courses/preview/cop3530sah...

[4] http://www.cise.ufl.edu/~sahni/cop5536/index.html


Thank you so much.


The preliminary undergraduate data structures course at my uni was taught using the Data Structures and Algorithms book by Michael T. Goodrich, Roberto Tamassia & Michael H. Goldwasser. The book is available for Python, Java and C++, though I can only vouch for the Java one as I haven't had the opportunity to read the others.

http://ww3.datastructures.net/


Thank you.


I also recommend the Algorithm Design Manual. In my algs class in school, we used Cormen's Introduction to Algorithms (of course), which is considered to be seminal in the field, but that's a bit of a bear to get through.

Alg Design Manual is great and gives shorter shrift to complexity notation and the math behind it, while still giving you a robust introduction to it. Also, it's not as long as it appears as much of the book is devoted to going through specific algorithms so you can be familiar and add them to your mental "tool kit."


OK. I am looking forward to the Algorithm Design Manual.

When I took the analysis and design of algorithms class at my school, my lecturer used the CLRS book as the main text. I think that book is hard to learn from. Some parts of the book are easy to understand; some other parts are not.


Take a look at the recently released "Grokking Algorithms" [1] by Bhargava. Uses Python for code examples. I haven't read it personally, but has good reviews on Amazon [2].

  [1] http://adit.io/posts/2016-05-25-Grokking-Algorithms-Is-Out.html
  [2] https://www.amazon.com/dp/1617292230/ref=cm_sw_su_dp


Thank you.


I am a fan of Mark Allen Weiss' Data structures and algorithm analysis in C (https://www.amazon.ca/Data-Structures-Algorithm-Analysis-2nd...). It is much smaller than CLRS and the algorithms are listed as C source code rather than pseudo code. I also like the coding style presented in the book.


Thanks. It seems interesting.


Does anyone use any of these ideas day to day? That's not to knock it, I'm genuinely curious.


Yup, in particular when I want to prove/disprove the impact of a design proposal by preparing a simulation. Pay 1-2 days writing a simulation to save 6 months of wasted engineering effort.


Anyone knows which year has the best scribe notes?


Why are these hand-drawn diagrams easier for me to understand and remember?


Probably because they're visually more distinct. We tend to just glance over things that look very familiar because "we know this" (not a conscious act). Using different colors for different components also helps; it allows details to "pop" out at you, versus a diagram that's blue/black/gray/white.


Stylised fonts are harder to read but make content easier to remember, it's not a huge leap to assume drawn diagrams like these are similar. Comic Sans is good for you.

https://faculty.washington.edu/chudler/font.html


These are great, Erik is a really smart guy. His intro to algorithms class is also fantastic, which should be on MIT OpenCourseWare.


> If you haven't taken 6.854, you must have a strong understanding of algorithms at the undergraduate level, such as receiving an A in 6.046, having had a relevant UROP, involvement in computer competitions, etc.

Quite the pre-reqs...


My small contribution to the field: http://zvrba.net/downloads/fusion.pdf


I take it solutions by students are mostly done in Python now?


I took this class in 2010. Most assignments consisted of proving one or two theorems from a recent research paper.


It seems as though the instructor expects a compiled pdf, so maybe it's a more theoretical course?

http://courses.csail.mit.edu/6.851/spring14/hmwk.html


I've watched the lecture series before, it focuses a decent amount on cache oblivious data structures and doing realllllllly cool integer tricks to get things fitting into compact bit maps.


I think it's more likely to be in something much more baremetal, given the topic. My university (UNSW) teaches a similar course with C.


No, it's more likely going to be in LaTeX, given Erik Demaine's highly theoretical inclinations.


He talks about projects at the beginning of the second session, and implementation (of something that hasn't yet been implemented) is one of the possibilities.


Also very interesting is Erik Demaine's work on geometric folding; at the least, it's fun to watch the various structures he prints and plays with.


Erik is the prof.


He is awesome. As a student in a third world country (where the universities suck), I watched all his online lectures and learned more than I did in my 4 years of college. I always wonder how a person as young as he is can achieve so much success.

I don't care about financial success, I am curious about scientific success.

God, I wish there were a mind recorder. I would listen to his mind (and the minds of people like him) all day: how they think, what they think, how they manage their time, everything.

Scientists should study these guys. Imagine a world filled with people like Erik (in all careers); how beautiful that world would be.


Thank you for posting this! I love the perspective that your comment brings to course material being available online. It really helps to understand the power of educational material being abundantly available!


To be honest, I worship the teachers and instructors whose courses I have watched. I have been at a sucky university; I know how it feels to waste your time. The moment I found my first course (Erik's algorithms course), it was a revolution in my life.

I literally worship people like Douglas Schmidt, Erik Demaine, Tim Roughgarden, Robert Sedgewick ...

They literally changed my life.

Can you imagine third world universities? I am sure you can't. After 4 years, you end up knowing nothing, literally nothing, not even how to write hello world.

Thanks to these guys I am reading a book every month. For the past two months I have been reading Silberschatz's database books.


Honestly this is why I find the recent decision by the Dept of Education to sue universities for not captioning their lecture videos very, very selfish.


So you're saying that we should clone him ? Or clone top(10) maths/psychs/brain-surgeon/etc ?


Of course not. You are so obsessed with your own perspective that you didn't see what I said. I said we should study these guys: how they think, how they improve their performance, to uncover their success formula.

I cannot understand how on earth you get from that to cloning them.


>>imagine a world filled with people like Erik (in all careers) how beautiful that world would be

Do you think this will happen without cloning or messing with the genetic code?


Damn you got so offended! I'm sorry.

Anyway, haven't we/they studied them? Is that what's missing?

I refer to cloning because you can't take `normies`, tell them what to do, and have them become a science guy. Because they don't want to, don't enjoy the process, or don't have the mental capacity (pick any of them, or all of them).

So you need good genes (being able to learn, wanting/enjoying learning), a good enough life (not being born in a war-torn country), a good program for how to do it (what you suggest), and a little guidance/mentoring.


You can't divide people into 2 categories: `normies` (you probably meant mediocre) and top(10). Leaving philosophy aside, there are a lot of people who do want to know what makes the top(10) tick, people who enjoy learning and who have the mental capacity. And I would argue their number is significant.


I know I can't. But to do:

>>imagine a world filled with people like Erik (in all careers) how beautiful that world would be

You have to also include `normies`, right?


Erik Demaine is such a great guy. I met him at a conference once, where he gave an invited talk on self-assembling robots. Fascinating stuff; we had a very interesting discussion with him afterwards, and his intricate knowledge of even the smallest details was quite impressive. If his talk is anything to go by, this lecture should be very worthwhile.


Completely agree. Haven't met him in person. I was taking a graduate algorithms course, a few years ago, and was searching online to gain better understanding. That's when I found Erik Demaine's lecture videos online. He does a great job of explaining the material. Loved how the concepts were broken down so that they were easy to understand. Learning from someone so good and fluent with the material was also very inspiring. It motivated me to strive for solid understanding of the course. I had a good professor, but I don't think I would have understood the subject as well, had I not watched Erik Demaine's lecture videos.



They had better start closed-captioning or they will have to take the videos down.


from the two-birds-with-one-stone-dept i've been looking for a good excuse to dive into pyret, and using it to do the exercises from this course might just be it. would anyone like to join me in a slow-paced workthrough?


Will there be a 2016 version?


I'm not sure the algorithms would have changed between then and now.


Thank you for this, I'm so excited to take it.


is the site down?


That's a great teacher.


This is a great resource for anybody who isn't formally trained in computer science. A lot of programmers use an abstract data type like a dictionary or hash table, but many of the self-taught and even some formally trained treat it like a magical black box that stores key-value entries very efficiently. What gives a hash table/dictionary its near-O(1) properties is a good hashing function for the key, plus a good distribution of keys across buckets when collisions occur.

I think a lot of programmers have a good understanding of many data structures, but hashes and dictionaries are still taken for granted. What they really need is to think of a hash as many magical black boxes, with the hashing function directing each key to its magical bucket. :)
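To make the bucket picture concrete, here's a toy chained hash table (the names are my own; real implementations also resize when buckets grow, to keep the expected chain length constant):

```python
class ChainedHashTable:
    """Toy dict: the hash picks a bucket; collisions chain inside it.

    Average O(1) lookups rely on the hash function spreading keys
    evenly, so each bucket stays short."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # The "magic": hash maps an arbitrary key to one bucket index.
        return self.buckets[hash(key) % len(self.buckets)]

    def set(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing entry
                return
        bucket.append((key, value))

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default
```

With a bad hash (say, one that returns a constant), every key lands in the same bucket and lookups degrade to O(n) list scans, which is exactly the degenerate case the comment above is warning about.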


Are you serious? This class is intended for students with deep knowledge of CS theory who want to learn more about recent data structures research. 6.006 and 6.046 are much more suitable if you aren't formally trained in computer science.


Can anyone recommend a good book about... the time and space complexity of algorithms?



