
Big-O Quiz: Graph Theory

I’ve been studying more graph theory recently. To take notes that I might remember, I added a Graphs quiz to my Big-O Quiz site. It covers the graph theory mentioned in The Algorithm Design Manual and in the Stanford/Coursera “Algorithms: Design and Analysis, Parts 1 & 2” courses by Tim Roughgarden. I’m adding to it as I work through Tim Roughgarden’s follow-up “Second Course in Algorithms”.

It’s often a slightly silly quiz, offering algorithm descriptions that make no sense for the question’s problem alongside the correct one, but the aim is just to reinforce my memory by exercising it. So I’ve tried to use concise descriptions of problems and algorithms, without trying to be exhaustively correct.

Coursera/Stanford course: Algorithms: Design and Analysis, Part 2

A few weeks ago I mentioned completing Part 1 of the online Coursera/Stanford “Algorithms: Design and Analysis” course. Part 2 of Algorithms: Design and Analysis isn’t due to start again until next year, but I didn’t want to wait, so I enrolled in the archived version of the course to watch the videos and do the assignments. I should be ready to just reuse my work when Part 2 starts again for real.

Part 2 was where things got really interesting. The assignments required implementing these algorithms, though the course covered others too:

  • A Greedy Algorithm for job scheduling.
  • Prim’s and Kruskal’s minimum spanning tree algorithms. (Both O(m log n), but Prim’s does better on dense graphs, with more edges.)
  • Modifying a minimum spanning tree to identify clusters.
  • The Knapsack problem (Dynamic Programming – both bottom-up and recursive).
  • Shortest Path with the Bellman-Ford SSSP (Single-Source Shortest Path) algorithm (O(mn); works with negative edge weights, but fails with negative cycles) as an alternative to Dijkstra’s Shortest Path algorithm (O(m log n); works only with non-negative edge weights). See the sketch just after this list.
  • All Pairs Shortest Path with the Floyd-Warshall algorithm (dynamic programming) (O(n³); works with negative edge weights, though it fails when it detects a negative cycle). Best for dense graphs.
  • All Pairs Shortest Path with Johnson’s algorithm, via one call to Bellman-Ford on a modified graph and repeated calls to Dijkstra on a reweighted graph. (O(mn log n); works with negative edge weights, but fails with negative cycles.) Best for sparse graphs.
  • A dynamic programming algorithm for the Traveling Salesman Problem.
  • Local search for the 2-SAT problem, using Papadimitriou’s algorithm.
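
Since I keep forgetting exactly how Bellman-Ford copes with negative edge weights and negative cycles, here is a minimal sketch of it in C++ (my own illustration, not my homework code; the Edge struct and names are just for this example):

```cpp
#include <limits>
#include <vector>

struct Edge {
  int from, to;
  long long weight;  // may be negative
};

// Bellman-Ford SSSP: O(m * n). Fills `dist` with shortest distances from
// `source` and returns false if a negative cycle is reachable from it.
bool bellman_ford(int n, const std::vector<Edge>& edges, int source,
                  std::vector<long long>& dist) {
  const long long INF = std::numeric_limits<long long>::max() / 4;
  dist.assign(n, INF);
  dist[source] = 0;

  // n - 1 rounds of relaxing every edge.
  for (int round = 0; round + 1 < n; ++round) {
    for (const Edge& e : edges) {
      if (dist[e.from] != INF && dist[e.from] + e.weight < dist[e.to])
        dist[e.to] = dist[e.from] + e.weight;
    }
  }

  // One extra pass: any further improvement means a negative cycle.
  for (const Edge& e : edges) {
    if (dist[e.from] != INF && dist[e.from] + e.weight < dist[e.to])
      return false;
  }
  return true;
}
```

The extra pass at the end is the part I tend to forget: if any edge can still be relaxed after n − 1 rounds, there must be a negative cycle reachable from the source.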

I particularly enjoyed exploring “dynamic programming”, which is really just avoiding unnecessary repeated work after you’ve identified the appropriate sub-problems. It’s identifying the sub-problems that is really hard. I enjoyed playing with bottom-up dynamic programming (filling in an array as you go, often discarding all but the most recent results as you go), and top-down dynamic programming, also known as memoization (usually recursing into sub-problems, and ideally not visiting as many sub-problems as you would working bottom-up).
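
As a toy illustration of the two styles (my own example, not from the course), here is Fibonacci done both ways; the bottom-up version keeps only the previous two results, just as a bottom-up knapsack only needs the previous row of its table:

```cpp
#include <cstdint>
#include <unordered_map>

// Bottom-up: fill in results in order, discarding all but the last two.
uint64_t fib_bottom_up(unsigned n) {
  uint64_t prev = 0, curr = 1;  // fib(0), fib(1)
  for (unsigned i = 0; i < n; ++i) {
    uint64_t next = prev + curr;
    prev = curr;  // the oldest result is discarded here
    curr = next;
  }
  return prev;
}

// Top-down (memoization): recurse into sub-problems, caching each answer
// so it is computed at most once.
uint64_t fib_memo(unsigned n, std::unordered_map<unsigned, uint64_t>& cache) {
  if (n < 2)
    return n;
  auto found = cache.find(n);
  if (found != cache.end())
    return found->second;
  uint64_t result = fib_memo(n - 1, cache) + fib_memo(n - 2, cache);
  cache[n] = result;
  return result;
}
```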

While implementing a dynamic programming solution for the Traveling Salesman Problem, I learned about Gosper’s Hack for iterating over subsets. It’s now a personal favorite in my toolbox.
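
The hack itself is only a few lines. Given a bitmask with k bits set, it produces the next-larger bitmask with the same number of set bits, which is exactly what you need to enumerate all k-element subsets of the cities. This is my own sketch, with illustrative names:

```cpp
#include <cstdint>
#include <vector>

// Gosper's Hack: given a bitmask x with k bits set, return the next-larger
// bitmask that also has exactly k bits set.
uint64_t next_subset_same_size(uint64_t x) {
  uint64_t lowest = x & (~x + 1);                  // lowest set bit
  uint64_t carry = x + lowest;                     // bump that block of 1s up
  uint64_t shifted = ((x ^ carry) / lowest) >> 2;  // refill the low bits
  return carry | shifted;
}

// Example: enumerate every k-element subset of {0, ..., n-1}.
// Assumes 1 <= k <= n < 64.
std::vector<uint64_t> subsets_of_size(unsigned n, unsigned k) {
  std::vector<uint64_t> result;
  const uint64_t limit = uint64_t(1) << n;
  for (uint64_t s = (uint64_t(1) << k) - 1; s < limit;
       s = next_subset_same_size(s)) {
    result.push_back(s);
  }
  return result;
}
```

For example, starting from 0b0011 it produces 0b0101, 0b0110, 0b1001, and so on.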

As with Part 1 of the course, I am not allowed to publish the code of my homework solutions. But I did create a public, knapsack-style dynamic programming solution to the Make Change problem without a canonical currency (not a real-world set of coins), though you shouldn’t use that as a first way to understand the classic knapsack problem. I also implemented a simple greedy algorithm for the Make Change problem with a canonical currency (a real-world set of coins).
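
The two approaches look roughly like this (a sketch in the spirit of that public code, not a copy of it; the coin values and function names are mine):

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Dynamic programming: minimum number of coins that make `amount`, for any
// set of (positive) denominations, or -1 if it cannot be made.
// O(amount * coins.size()).
int make_change_dp(int amount, const std::vector<int>& coins) {
  const int IMPOSSIBLE = amount + 1;  // sentinel larger than any real answer
  std::vector<int> min_coins(amount + 1, IMPOSSIBLE);
  min_coins[0] = 0;
  for (int value = 1; value <= amount; ++value) {
    for (int coin : coins) {
      if (coin <= value)
        min_coins[value] =
            std::min(min_coins[value], min_coins[value - coin] + 1);
    }
  }
  return min_coins[amount] == IMPOSSIBLE ? -1 : min_coins[amount];
}

// Greedy: repeatedly take the largest coin that fits. Correct only for
// "canonical" coin systems such as real-world currencies.
int make_change_greedy(int amount, std::vector<int> coins) {
  std::sort(coins.begin(), coins.end(), std::greater<int>());
  int count = 0;
  for (int coin : coins) {
    count += amount / coin;
    amount %= coin;
  }
  return amount == 0 ? count : -1;  // -1: couldn't make exact change
}
```

For example, with coins {1, 3, 4} and amount 6, the greedy approach uses 3 coins (4 + 1 + 1) while the DP finds 2 (3 + 3); with a real-world set such as {1, 2, 5, 10, 20, 50} the two agree.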

The Make Change problem was interesting because I’ve read that people can and should learn to recognize NP-complete problems, such as the Traveling Salesman Problem. However, although it might at first seem like a minor detail, it is not obvious which sets of coins make the Make Change problem solvable with a greedy algorithm and which need dynamic programming. (I haven’t actually read that paper yet.)

I’ve also been reading through Steven Skiena’s The Algorithm Design Manual book, which I can highly recommend. It’s more practical and enjoyable than the classic Introduction to Algorithms book by Cormen et al.

Update: I did part 2 for real when it started again. Here is my certificate:

[Image: Coursera/Stanford “Algorithms: Design and Analysis, Part 2” certificate]

gtkmm 3.18 and glibmm 2.46

A couple of days ago I made the usual bunch of *mm releases for GNOME 3.18, including glibmm 2.46 and gtkmm 3.18, wrapping glib and GTK+ for C++. This adds the usual collection of new API from glib and GTK+, but the big change is the use of C++11.

Until C++11 is the default for g++, you’ll probably want to use the AX_CXX_COMPILE_STDCXX_11() autoconf macro. For instance, I used this to use C++11 in PrefixSuffix. Most people won’t need to make any other changes to their code or to their build. You won’t notice any change unless you care about using C++11 features.
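
For anyone who hasn’t used that macro before, the configure.ac change is tiny; something like this, where the [noext] and [mandatory] arguments are just the choice I find most useful:

```
dnl configure.ac: ask for a C++11 compiler, failing configure if none is found.
AX_CXX_COMPILE_STDCXX_11([noext], [mandatory])
```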

Using C++11 in gtkmm and friends has been the most fun I’ve had with gtkmm since I had to delve deep into the GObject lifecycle back during gtkmm 1.2. I even made actual code changes to libsigc++, which I’m usually afraid to touch even though I’m the official maintainer.

Learning in general about the deep implications of C++11’s new features reminded me how much I enjoyed learning about C++ at the beginning. It’s once again an interesting time for C++.

Coursera/Stanford course: Algorithms: Design and Analysis, Part 1

[Image: screenshot of my algorithms code on GitHub]

Although I’ve been developing software for years, I noticed recently that I lacked the basic computer science knowledge that other people got at university, though it’s never been an issue outside of job interviews. I knew the basics of Big-O notation and how to use data structures, but couldn’t describe exactly how various sort algorithms worked or how to analyze an algorithm’s performance from pseudo-code.

But this is all standard stuff now, so filling in the gaps in my knowledge seemed like a solvable problem. Over the last few weeks, I’ve worked through Coursera’s “Algorithms: Design and Analysis, Part 1” online course, provided by Stanford University. I was surprised to find myself enjoying it. It’s nice to get some insight into commonly used algorithms and data structures, and I guess it does help to inform choices made at a higher level, even if that’s only occasionally useful.

Mostly I enjoyed the programming exercises, writing implementations of Mergesort, Quicksort, Karger’s Minimum Cut Algorithm, Strongly Connected Components, Dijkstra’s Shortest-Path Algorithm, a 2-Sum Algorithm, and a Median Maintenance algorithm. Likewise, each week had a theoretical test, checking knowledge of stuff such as the Master Method, sorting algorithms, graph algorithms, heaps, (balanced) binary trees, hash tables, and bloom filters.
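
Of those, Median Maintenance is the one whose trick I’m most likely to forget, so here is a minimal sketch of the usual two-heap approach in C++ (my own illustration, not the homework code): keep the lower half of the numbers in a max-heap and the upper half in a min-heap, rebalancing so the sizes never differ by more than one.

```cpp
#include <functional>
#include <queue>
#include <vector>

// Median maintenance with two heaps: the lower half of the numbers seen so
// far lives in a max-heap, the upper half in a min-heap.
class MedianMaintainer {
public:
  void add(int x) {
    if (lower_.empty() || x <= lower_.top())
      lower_.push(x);
    else
      upper_.push(x);

    // Rebalance so the sizes differ by at most one, with lower_ never smaller.
    if (upper_.size() > lower_.size()) {
      lower_.push(upper_.top());
      upper_.pop();
    } else if (lower_.size() > upper_.size() + 1) {
      upper_.push(lower_.top());
      lower_.pop();
    }
  }

  // The current median: the middle element for an odd count, or the lower of
  // the two middle elements for an even count.
  int median() const { return lower_.top(); }

private:
  std::priority_queue<int> lower_;  // max-heap of the smaller half
  std::priority_queue<int, std::vector<int>, std::greater<int>> upper_;  // min-heap of the larger half
};
```

Both heap operations are O(log n), so each new number is handled in logarithmic time, which is the whole point of the exercise.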

Hopefully I’ll keep it all in my head from now on, but that is always easier when you’ve written actual code that you can look back at. I used C++, but you can use any programming language, and nobody checks your code. Of course, I’m not allowed to publish my homework solutions.

I had already been slowly reading through Introduction to Algorithms, by Cormen, Leiserson, Rivest and Stein, which is hard going. It also has exercises, but I was far more motivated to complete the Coursera exercises, whose aim was always to get the correct specific numerical answer, so you knew when you had the code working properly. After I finished the course, I found it easy to then read the relevant chapters, which offer much more in-depth analysis than the Coursera course.

Although the participants are all still waiting for their certificates (presumably some skeuomorphic image file), I’m sure that I’ve passed with 80-something percent. I’d have done better, but I started the course after some homework deadlines had already passed. The final test also showed that I need to be more familiar with logarithm equivalences and geometric progressions. I do have Maths and Further Maths A-Levels, but it’s been a long time.
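
For my own future reference, these are the standard identities involved (nothing course-specific):

```latex
\[
\log_b n = \frac{\ln n}{\ln b}, \qquad
a^{\log_b n} = n^{\log_b a}, \qquad
\sum_{i=0}^{k} r^i = \frac{r^{k+1} - 1}{r - 1} \;\; (r \neq 1), \qquad
\sum_{i=0}^{\infty} r^i = \frac{1}{1 - r} \;\; (|r| < 1).
\]
```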

Unfortunately, part 2 isn’t due to start again until some time in 2016. But I think I can do the course in the meantime, just without earning an official score.

Update: Here is my certificate for part 1:

[Image: Coursera/Stanford “Algorithms: Design and Analysis, Part 1” certificate, 2015-09-24]

Update: I did part 2 too.