Hacker News

I realize I sound like the world's largest wet blanket but imagine if this effort was spent on improving current versions of free software.


It was FUN. KDE is turning 20; I've been there for 16 years now. No, nobody paid me. I did it because I have passion for the project, and it shaped my professional career. And YES, we need to keep our memories for future reference.


It looks like he already does a lot of work on current versions of free software; it seems he's on the KDE team.

Or do you feel life should be all work and no play? I think stuff like this gives people the energy to work on current stuff.


I could have done a better job elaborating in my original post; I was too glib. To me it's less about this particular example (it's small potatoes overall) and more about the double standard for the tech industry. Imagine if university science departments switched to a 1940s curriculum just for fun. Imagine if Ford started producing Model Ts again just for fun. Tech is the only industry where people goof off, get paid (in dollars or praise), and yet there are still complaints about how bad it is to work in technology.

Edit:

To add on to this, I think what frustrates me most is the lack of collective self-reflection or appreciation of privilege in the tech industry. I'm not trying to deny the existence of real problems in the tech industry but every time I (as a non-tech industry worker) hear my friends or see some article about how underpaid tech workers are (at 3 to 4 times my salary) I have to clench my bowels and try not to have an aneurysm. I would love to see articles regularly hit the top of Hacker News titled "Look at how good we have it" (including some recognition about how bad the economy is for everyone else) but I don't think I've ever seen that.


There'd be value in a museum making a reproduction of a Model T, & there's value in the history of science: understanding what kind of world scientists were thinking in terms of when they created their various hypotheses.

I've invested a lot of my personal time in creating the fastest Befunge implementations I know of. I intend to spend even more time eventually, when I get around to having it JIT to assembly. I find it fun, I don't get paid for it, & I have a modest salary (under 40k/yr).
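For context: a Befunge program is a 2D grid of one-character instructions walked by a program counter that can move in any of four directions. A naive interpreter, the baseline any faster implementation or JIT would be measured against, can be sketched roughly as below. This is a hypothetical minimal subset of Befunge-93, not the commenter's implementation:

```python
def run(src):
    """Interpret a tiny subset of Befunge-93 given as a multi-line string.

    Supported: digits (push), + * (arithmetic), > < ^ v (direction),
    : (duplicate), . (pop and output an integer), @ (halt).
    """
    grid = [list(line) for line in src.splitlines()]
    stack, out = [], []
    x = y = 0          # program counter position
    dx, dy = 1, 0      # direction: execution starts moving right
    while True:
        c = grid[y][x]
        if c.isdigit():
            stack.append(int(c))
        elif c == '+':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif c == '*':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif c == ':':
            stack.append(stack[-1])
        elif c == '.':
            out.append(stack.pop())
        elif c == '>': dx, dy = 1, 0
        elif c == '<': dx, dy = -1, 0
        elif c == '^': dx, dy = 0, -1
        elif c == 'v': dx, dy = 0, 1
        elif c == '@':
            return out
        # any other cell (including space) is a no-op
        x = (x + dx) % len(grid[y])
        y = (y + dy) % len(grid)

# "25*.@" pushes 2 and 5, multiplies, outputs 10, then halts.
print(run("25*.@"))  # prints [10]
```

Real Befunge-93 pads the playfield to 80×25 and wraps on that torus; the per-row wrap here is a simplification. The appeal of a JIT is exactly that this per-cell dispatch loop disappears for straight-line runs of the grid.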

It _did_ end up having positive outcomes: while JITing Befunge code to Python bytecode, I got thinking about the bytecode's performance, so to optimize my Befunge implementation I went a level deeper & optimized the Python implementation to use wordcode instead of bytecode, giving anyone who uses Python 3.6 a 1% improvement.
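The wordcode change referred to here is visible from Python itself: since CPython 3.6, `co_code` is a flat sequence of fixed two-byte instructions (opcode byte plus argument byte) rather than the older variable-length bytecode, which simplifies instruction decoding in the eval loop. A quick illustration of the format (just a demonstration, not the commenter's actual patch):

```python
import dis

def f(x):
    return x + 1

raw = f.__code__.co_code
# On CPython >= 3.6 every instruction occupies exactly 2 bytes,
# so the raw code object is always an even number of bytes long.
assert len(raw) % 2 == 0

# dis.get_instructions decodes those 2-byte units; every offset is even.
for ins in dis.get_instructions(f):
    print(ins.offset, ins.opname)
```

On 3.5 and earlier, instructions were 1 or 3 bytes depending on whether they carried an argument, so the decoder had to branch on every instruction; the fixed width removed that branch.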

But maybe working on improving the performance of Python by 1% is frivolous, since Python's just some big codebase whose only use is in silly things like my Befunge interpreter...

That said, I agree with you somewhat on that tech industry rant. I mention my salary's modest, but it was less while I was working as general labor in flooring, which was way more than when I was vending ice cream off a bike (which was much more than when they'd have me put together coolant packs for $25 in a 6-hour day).

You're directing it at the wrong thing. Writing a blog about some hobby project isn't what's wrong with the world


I think I'm not making my point particularly well. It's also possible that my point itself is terrible, as evidenced by the raft of downvotes and the fact that I was banned from posting this comment until hours later. I guess HN is only a place for popular opinions.

Your Python example is undeniably progress because Python is still in wide use; a better example would be if you improved the performance of Dylan (or some other language that has long since fallen out of use). For your flooring example, imagine if you were contracted to work on someone's house and went, "Instead of using fresh floorboards, we decided to install floorboards that have been sitting out in the rain for 40 years because we enjoyed the challenge!"

I've seen enough people comment that contributing to free software is a fight for the future of political and social freedom, since those who control technology will have a disproportionate impact on the systems of power, and I find that hard to square with the attitude that free software is no big deal and we should just have fun.


It's because there is no consensus on any of those things, particularly political beliefs surrounding free software. The people who are saying free software is about scratching itches are generally not the same people who equate contributing to free software with fighting for the future of political and social freedom. You're not seeing contradictory positions because you're (usually) not seeing the same people state those positions.

If it seems that way, it's because a given online discussion isn't a random sample of people's beliefs. For example, to someone reading Reddit or Slashdot (or HN sometimes, unfortunately), it's easy to jump to the conclusion that everything sucks and everyone hates everything. Post an article about Python, and all the comments will be about how Python is terrible and everyone should use Ruby (or whatever). Post an article about Ruby, and all the comments will be about how Ruby is terrible and everyone should use Python (or whatever). Read the same site for long enough and it will start to look like a mass of contradictions: "wait, everyone here said Python sucked before and I should use Ruby, but now they're all saying Ruby sucks and I should use Python?!?" But really it's just different people showing up to disparage whatever the topic of discussion is. I think it's just the nature of the medium that makes it really hard to gauge consensus within a community.


> Imagine if university science departments switched to 1940s curriculum just for fun.

My university physics curriculum was pretty much all already around in the 1940s (classical mechanics, quantum mechanics, electromagnetism, special and general relativity, etc.)

I'm currently in a computer science department where the curriculum includes a bunch of stuff like Cassandra, Mongo, JavaScript, Unity, etc.

I think it would be good for those of us in computing to take more notice of our field's past (even by the 1940s we had lambda calculus, combinatory logic, etc.)


If you took more than an intro class, then I would say your university did you a disservice. And of course history has its place in all fields of knowledge, but not at the expense of progress, which is the point I'm (poorly) attempting to make.


To me, it sounds more like that's the only point you can salvage from your original post. And that's stretching it.


I appreciate your contribution.


> Imagine if university science departments switched to 1940s curriculum just for fun.

Well, here's Stephen Granade's awesome video where he explains how the luminiferous aether works, and how it transports light, and even (helps to) perform an experiment to demonstrate it. It's got real science in it!

https://www.youtube.com/watch?v=CpNVG33awq4

So, while I get your point, I'm still slightly in the 'that actually sounds quite cool' camp...


But people do restore classic cars, for fun.


It's not a perfect example because free software doesn't map onto any traditional industry that well.


Was he paid to do it? It might just be a fun side project for him. If you don't like it, don't use it.


If you think that free software could be "brought forward" in some way then go and do it yourself.

But telling other people what to do with their time is highly immoral.


In what way is it "highly immoral"? Is it more or less immoral than burglary (for example)?



