
Tangential, but in my Java class recently I was teaching the students some basic Big O stuff. I wrote a few versions of a program that found the "biggest delta" between two numbers in a list. The first version was O(n^2) time, the next version was just O(n).

I showed them how long it took to run the O(n^2) version on 100,000 elements: about two minutes on my beefy server. Then I showed them the O(n) version running on my comparatively weaker laptop, and it took about ten seconds on a million elements. The students were genuinely kind of impressed, and I stressed the point: "these are the kind of numbers that make it important to take theory seriously."
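A minimal sketch of what the two versions might look like, assuming "biggest delta" means the largest absolute difference between any two elements (in which case the O(n) version reduces to max minus min; the class names and method names here are illustrative, not the original assignment):

```java
public class BiggestDelta {
    // O(n^2): compare every pair of elements.
    static int deltaQuadratic(int[] xs) {
        int best = 0;
        for (int i = 0; i < xs.length; i++)
            for (int j = 0; j < xs.length; j++)
                best = Math.max(best, Math.abs(xs[i] - xs[j]));
        return best;
    }

    // O(n): the largest difference is just max - min,
    // found in a single pass over the array.
    static int deltaLinear(int[] xs) {
        int min = xs[0], max = xs[0];
        for (int x : xs) {
            min = Math.min(min, x);
            max = Math.max(max, x);
        }
        return max - min;
    }

    public static void main(String[] args) {
        int[] xs = {3, 10, 6, 1, 8};
        System.out.println(deltaQuadratic(xs)); // 9
        System.out.println(deltaLinear(xs));    // 9
    }
}
```

At n = 100,000 the quadratic version does on the order of 10^10 pair comparisons, while the linear version does a single pass, which is why the gap is so dramatic even on weaker hardware.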



O(n) for 1 million elements taking ten seconds is pretty long. What’s the constant factor?



