Hacker News | blauditore's comments

That's... Normal. Technology has always been moving towards higher-level abstractions. In terms of software, many engineers nowadays know how to code in high-level languages like JS or Java, while maybe 30-40 years ago many folks probably knew C, assembly, and all the low-level stuff like e.g. explicit memory management that most modern devs never deal with.


How abstract can it get before it becomes a problem? These days they want to outsource thinking itself to technology. At that rate, are we failing Descartes' "I think, therefore I am"?


It's the same in Europe. There are many car drivers who would never admit that, but they just don't want to leave their comfort zone and learn how to use public transport. But when asked they will say stuff like "well, we live a bit outside the city", or "now with kids you basically need a car".


I've seen what looks like 10-year-old kids taking the S-Bahn to school on their own. Apparently, that's quite common, and no excuse.


Not sure if it's a class thing, but rather the fact that software engineers often make good money, especially at places like Meta. It's the same for me: if I lost my job tomorrow, I'd have enough savings to take some time before needing another job. Not sure if this would have been true for my parents.


Software folks love over-engineering things. If you look at the web coding craze of a few years ago, people started piling up tooling on top of tooling (frameworks, build pipelines, linting, generators etc.) for something that could also be zero-config, and just a handful of files for simple projects.

I guess this happens when you're too deep in a topic and forget that eventually the overhead of maintaining the tooling outweighs the benefits. It's a curse of our profession. We build and automate things, so we naturally want to build and automate tooling for doing the things we do.


I don’t think those web tooling piles are over-engineered per se; they address huge challenges at Google and Facebook. But the profession is way too driven by hype and fashion, and the result is a lot of unquestioning cargo-culting of stuff from the big dogs. Wrong tooling for the job creates that bubble of over-complicated app development.

Inventing GraphQL and React and making your own PHP compiler are absolutely insane and obviously wrong decisions for everyone who isn’t Facebook. With Facebook’s revenue and Facebook’s army of resume-obsessed PHP monkeys, they strike me as elegant technological solutions to otherwise intractable organizational issues. Insane, but highly profitable and fast-moving. Outside of that context, using React should address clear pain points, not be a dogmatic default.

We’re seeing some active pushback on it now online, but so much damage has been done. Embracing progressive complexity of web apps/sites should leave the majority as barebones with minimal if any JavaScript.

Facebook solutions for Facebook problems. Most of us can be deeply happy our 99 problems don’t include theirs, and live a simpler easier life.


Not sure why you lumped React in there. Hack is loopy, and GraphQL was overhyped but conditionally useful, but React was legitimately useful and a real improvement over other ways of doing things at the time. Compare React to contemporary stuff like jQuery, Backbone, Knockout, Angular 1.x, etc.


I agree with you very much, if what you are building actually benefits from that much client-side interactivity. I think the counterpoint is that most products could be server-rendered HTML templates with a tiny amount of plain JS rather than complex frontend applications.


Translating this page to English is quite funny


Ah yes, like no-code programming in the past, or what was it called again?


It's called Excel, and there's probably more logic written in it driving the world economy than in all the rest of the programming languages combined.


I've been around for a while. The closest we ever got was probably RPA. This time it's different. In my organisation we have non-programmers writing software that brings them business value on quite a large scale. Right now it's mainly through the chat framework we provide them so that they aren't just spamming data into ChatGPT or similar. A couple of them figured out how to work the API and set up their own agents, though.

Most of it is rather terrible, but a lot of the time it really doesn't matter. At least most of it scales better than Excel, and for the most part they can debug/fix their issues with more prompts. The stuff that turns out to matter eventually makes it to my team, and then it usually gets rewritten from scratch.

I think you underestimate how easy it is to get something to work well enough with AI.


The SO copy-pasting is actually quite accurate. The same folks are now just blindly generating code. That's why most software in the world is shit, and will continue to be in the future. There might just be more of it.


There will most definitely be much more of it; maybe machines are doing this on purpose to increase dependency on them, haha. Ultimately, wagging a finger at someone will have no outcome; letting someone make real mistakes while vibe coding will be a much better learning experience. Someone who drops a prod database using Claude will have a very lasting memory of that (not saying that should be the goal; critical thinking obviously matters A LOT). Cars didn't use to have seatbelts, a lot of people died, then they got seatbelts, and now the world is a better place.


I can't help but keep finding it ridiculous how everyone now discovers basic best practices (linting, documentation, small incremental changes) that have been known for ages. It's not needed because of AI, you should have been doing it like this before as well.


Anyone who’s been a developer for more than 10 minutes knows that best practices are hard to always follow through on when there’s pressure to ship.

But there’s more time to do some of these other things if the actual coding time is trending toward zero.

And the importance of it can go up with AI systems because they do actually use the documentation you write as part of their context! Direct visible value can lead people to finally take more seriously things that previously felt like luxuries they didn’t have time for.

Again, if you’ve been a developer for more than 10 minutes, you’ve had the discouraging experience of painstakingly writing very good documentation only for it to be ignored by the next guy. This isn’t how LLMs work. They read your docs.


> Anyone who’s been a developer for more than 10 minutes knows that best practices are hard to always follow through on when there’s pressure to ship.

> But there’s more time to do some of these other things if the actual coding time is trending toward zero.

I think you'll find even less time - as "AI" drives the target time to ship toward zero.


I agree that this will be the end result over time, maybe even faster than we expect. And as those speed pressures increase, AI will take over more and more of the development process.


These best practice protections become essential only when you give the work to really bad programmers - such as parrots.


Completely disagree. That's like saying that user manuals and driving assistance features in cars (e.g. alerts about approaching an object) are only for bad drivers.


The Kessler syndrome is mentioned: satellites colliding and causing a cascade of follow-up collisions. This gets brought up a lot, but people have a poor intuition of how large the orbital space is. Think of it this way: it's obviously larger than Earth's surface, and placing, say, a million objects on Earth's surface still leaves a lot of space between them (and there are thousands of times as many humans). Yes, satellites move in certain orbits, not in random places, but space is large, and humans are bad at imagining large numbers and distances. The illustrations with fat dots on tiny Earth images are misleading too, IMO.
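A rough back-of-the-envelope for that area argument (the 550 km shell altitude and the one-million-object count here are illustrative assumptions of mine, not figures from any source):

```python
import math

EARTH_RADIUS_KM = 6371.0
SHELL_ALTITUDE_KM = 550.0   # assumed: a typical LEO shell altitude
N_OBJECTS = 1_000_000       # assumed object count, for illustration

# Surface area of a spherical shell at that altitude
shell_radius_km = EARTH_RADIUS_KM + SHELL_ALTITUDE_KM
shell_area_km2 = 4 * math.pi * shell_radius_km ** 2

area_per_object_km2 = shell_area_km2 / N_OBJECTS
mean_spacing_km = math.sqrt(area_per_object_km2)

print(f"shell area:      {shell_area_km2:.2e} km^2")    # ~6.0e8 km^2
print(f"area per object: {area_per_object_km2:.0f} km^2")
print(f"mean spacing:    {mean_spacing_km:.1f} km")     # ~25 km apart
```

Even a million objects confined to a single thin shell would average roughly 25 km apart, and real objects are spread across many altitudes; to be fair, it's the relative velocities of several km/s that make even rare conjunctions dangerous.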

Apart from that, I do agree that space data centers are probably just a marketing stunt at this point, although some things could obviously be done to improve their chances, like more lightweight GPU designs, something that was never a big topic before.


The other way people get confused about Kessler syndrome is that they imagine it's like the movie Gravity where it happens suddenly rather than a slow process that plays out over years/decades.


What hotkey-driven and fast-paced workflows are you referring to? I used to be an Office user, now G Docs, and I hardly miss anything. Hotkeys do exist, and more complex stuff can be automated quite well with AppsScript.

Maybe I'm not enough of a power user, but these things often sound to me like the 0.1% productivity boosts that are nice to have, but often hardly relevant in the grand scheme of things.

