
Related is Dijkstra explaining the pros and cons of different styles of indexing and slicing: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EW...


Right. The author discusses Dijkstra's argument and dismisses it as not rising to the level of wrong. Personally, I find Dijkstra's argument unassailable. (It is, roughly, what a person would express in Python syntax as range(0,k) + range(k,n) == range(0,n).)
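Dijkstra's identity is easy to check concretely in Python (a minimal sketch; the particular numbers are arbitrary):

```python
# Adjacent half-open ranges concatenate with no element duplicated
# or skipped at the boundary k.
n, k = 10, 4
assert list(range(0, k)) + list(range(k, n)) == list(range(0, n))

# With closed intervals [0, k] and [k, n], the boundary element k
# would appear in both halves.
```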


And the reason why I don't want to assail Dijkstra's argument is that I've programmed fairly extensively in both types of languages. 1-indexing means I have to think every single time I want to slice an array. Half-open intervals mean it just does what I want, and I literally cannot think of a case where I wrote an off-by-one bug under that regime. And, again, let me emphasize: extensive experience in both regimes, which is to say, it's not just a matter of what I was "used to". Indeed, I believe my first two languages (possibly three) were 1-based. I'll take the style that produces the fewest bugs and happily call it "right"; others are free to argue that their intuition says they should use the buggier style, I guess, but I'm not going to find it convincing anytime soon.
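A concrete illustration of the no-off-by-one point, using Python's half-open slices (the list contents here are arbitrary):

```python
# Splitting a list at any point k needs no +1/-1 bookkeeping:
# xs[:k] is the first k elements, xs[k:] is the rest.
xs = ["a", "b", "c", "d", "e"]
for k in range(len(xs) + 1):       # every split point, including both ends
    head, tail = xs[:k], xs[k:]
    assert head + tail == xs       # nothing lost, nothing duplicated
    assert len(head) == k          # the length falls straight out of the bounds
```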

I consider the idea that we should follow human intuition to itself "not rise to the level of wrong". There is nothing intuitive about programming; it is probably the second least intuitive activity known to man, behind only pure mathematics. Your intuition needs to be trained from what is basically scratch, not blindly copied from friendlier, human domains. We know what that leads to; we've been doing it for decades. Some people seem to argue "oh, if only we'd try more human-friendliness it would all be better" as if we've never tried it, but we have, repeatedly. We get AppleScript and COBOL and other languages that strive to be "human friendly" and produce an impenetrable morass of "human friendliness" that becomes incredibly hard to use as the size of the project exceeds trivial. As counter-intuitive as it may be, the track record is pretty clear at this point that the best languages are those that do not try to cloak themselves in false human skin.

(Which is of course not to say that we should all be programming assembler. Don't strawman me there. But the task of a language should be seen as a bridge between the human world and the computer world; it is not the job of the language to actually be in the human world itself.)


The human-friendly version, alluded to by the author, is foreach.

For the love of avoiding mutation, let's start pitching indexing into the dustbin of history when it comes to programming languages for general use. The fact that the justifications come from pointers and arrays ought to be a warning flag that we are looking backward rather than designing the future.


Except you often need an index, so you can't just tell everyone to use a foreach and be done with it.


Foreach is used as an example of working at a higher level of abstraction than iterating over a loop like

    for (i = 0; i <= myarray.length - 1; i++) {
      foo(myarray[i]);
    }
I'm not suggesting that foreach is all-singing, all-dancing and all-powerful, so of course it doesn't serve for accessing arrays by position. What I am suggesting is that the syntax used for arrays was not inscribed on stone tablets upon Mount Olympus, and that

    myarray.get(6)              // return the sixth element
    myarray.foreach('foo 6 19)  // operate on part of the array
may be better abstractions for general-purpose programming. Syntax should facilitate the programmer.
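In Python the same contrast might be sketched like this (foo and myarray are placeholders, and the slice plays the role of the hypothetical foreach('foo 6 19)):

```python
results = []

def foo(x):                  # a placeholder operation
    results.append(x * 2)

myarray = [10, 20, 30, 40]

for item in myarray:         # index-free iteration: the loop variable
    foo(item)                # is the element itself, not an offset

for item in myarray[1:3]:    # operating on part of the array, still
    foo(item)                # without manual index arithmetic

assert results == [20, 40, 60, 80, 40, 60]
```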


Fair enough, but let's not fool ourselves into thinking that pointers are going away. Sure, for higher level stuff I don't want to mess with manual memory management, but what about the massive mountain of software that all of these high level programs rely upon?

We are always going to need this stuff for certain tasks. For example, I'd love to see a simple 3x3 filter operation on an image without using indices. I am also a bit biased; I'm a systems programmer, I don't write UIs and web apps. I tend to work more in the trenches, where abstraction gets in your way almost as often as it helps you get things done more quickly.
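For what it's worth, here is roughly what such a filter looks like: a sketch of a 3x3 box (mean) filter over a grayscale image stored as nested lists, where explicit indices do the neighborhood and border bookkeeping (the function name and clipping strategy are my own choices):

```python
def box_filter_3x3(img):
    # Average each pixel with its 3x3 neighborhood, clipping at borders.
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:   # stay inside the image
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out
```

A uniform image stays uniform, and in a 2x2 image every pixel averages all four values: box_filter_3x3([[0, 0], [0, 8]]) gives [[2, 2], [2, 2]].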


That's why I said "slice an array" rather than "iterate", actually. Whenever I'm interviewing someone and they choose a language with a "foreach", I mentally deduct a point if they insist on iterating over an array with the C-style three-part for loop even so.


I don't find Dijkstra's argument unassailable: let's claim instead that half-open intervals are ugly (they are asymmetrical, and not typically used in mathematics) and closed intervals are preferable.

Given that, his arguments support 1-indexing better.


The claim doesn't hold. Half-open intervals appear in important places. That happens mainly because the family of finite unions of half-open intervals is closed under complement (that is, they form a semi-ring; this is unrelated to the "ring" concept in abstract algebra). In measure theory, one form of the Carathéodory extension theorem [0] says that you can uniquely extend a measure on a semi-ring to an actual measure (defined on a sigma-algebra).
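The closure-under-difference property is concrete enough to sketch in a few lines of Python (a toy model: an interval is a (lo, hi) pair standing for [lo, hi)):

```python
def interval_difference(i, j):
    """[a, b) minus [c, d), as a list of at most two half-open pieces."""
    (a, b), (c, d) = i, j
    pieces = []
    if a < min(b, c):
        pieces.append((a, min(b, c)))   # the part of [a, b) left of [c, d)
    if max(a, d) < b:
        pieces.append((max(a, d), b))   # the part of [a, b) right of [c, d)
    return pieces

# The difference is again a finite union of half-open intervals;
# with closed intervals, [0, 10] minus [3, 7] would need open endpoints.
assert interval_difference((0, 10), (3, 7)) == [(0, 3), (7, 10)]
assert interval_difference((0, 5), (0, 5)) == []
```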

The equivalent statement in probability theory says that a probability is uniquely determined by its cumulative distribution function, which is sometimes nicer to take limits of. You can also get a probability from a cumulative distribution function, provided your function is right-continuous [1] (which is directly related to half-open intervals). For more examples of half-open intervals in probability, you can look at stochastic processes. See, for example, càdlàgs [2] and Skorohod spaces; they capture the notion that processes that "decide to jump" (think of Markov chains changing states if you like) are right-continuous.

IMO, half-open intervals are just nicer whenever you have to union them or intersect them, and are no worse in other aspects when compared to closed intervals. Also, I think the author makes a big cultural blunder when he dismisses "mathematical aesthetics" as a valid reason. A significant number of mathematicians think of elegance as the ultimate goal in mathematics; as Hardy famously said, "There is no permanent place in the world for ugly mathematics".

[0] http://en.wikipedia.org/wiki/Carath%C3%A9odory%27s_extension...

[1] Well, we also need the obvious conditions: Correct limits at infinity and monotonicity.

[2] http://en.wikipedia.org/wiki/C%C3%A0dl%C3%A0g


> provided your function is right-continuous

Yes, but we are talking about indexing discrete arrays. All of your examples are continuous, and are not analogous.


I was replying to your "not typically used in mathematics" quote.

As to the "and [thus] are not analogous" part, I recommend the Concrete Mathematics book. You'll certainly see at least a hundred continuous examples alongside their discrete counterparts, as you can imagine from the book's title. Half-open intervals feature prominently when doing discrete calculus in the Sums chapter.
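For instance, the discrete analogue of the fundamental theorem of calculus: summing the forward difference Δf(k) = f(k+1) - f(k) over the half-open range 0 <= k < n telescopes with no endpoint fiddling (f here is an arbitrary example function, not anything from the book):

```python
def f(k):
    return k * k            # any function works; squares as an example

n = 7
# The sum of forward differences over [0, n) telescopes to f(n) - f(0),
# mirroring the continuous integral of a derivative.
assert sum(f(k + 1) - f(k) for k in range(0, n)) == f(n) - f(0)
```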


What's more, half-open intervals are more naturally 1-based, with the excluded endpoint set at 0. Dijkstra's arguments are very assailable, despite being trotted out in every discussion as an argument from authority.
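In Python terms, the observation is that the integers in the left-open interval (0, n] are exactly the 1-based indices (a trivial sketch; n is arbitrary):

```python
n = 5
one_based = list(range(1, n + 1))    # the integers in (0, n]
assert one_based == [1, 2, 3, 4, 5]
assert 0 not in one_based            # 0 is the excluded endpoint
assert len(one_based) == n           # the length property still holds
```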

The crux of the issue is treating indices as offsets rather than identifiers, which is perfectly reasonable if and only if one is directly working with sequential memory.




