Hacker News

Wtf, this has a name now? I thought of this exact idea literally months ago but never had the time to do any experiments on it.

At the time I dismissed it as potentially being incredibly expensive for the improvement you get, and it runs into the typical pitfalls of evolutionary algorithms: in the same way evolution doesn't let an organism grow a wheel, your LLM evolution algorithm will never come up with something that requires a far bigger leap than what you allow the LLM to perturb in a single step. Also, the genetic algorithm will probably produce a vibecoded mess of short-sighted decisions, just as evolution produces a spaghetti genome in real life.
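The loop being described (keep the fittest candidates, let an LLM propose small perturbations, repeat) can be sketched generically. This is a toy sketch, not any published system: the `mutate` step is a hypothetical stand-in for an LLM call, and the objective is a made-up one-dimensional function.

```python
import random

def fitness(x):
    # Toy objective with a single peak at x = 3.0. In the scheme discussed
    # above, this would instead score a candidate program or prompt, e.g.
    # by running a benchmark or test suite.
    return -(x - 3.0) ** 2

def mutate(x, step=0.5):
    # Stand-in for the LLM perturbation step (hypothetical: no real model
    # is called here). Note the limitation raised in the comment: no single
    # mutation can jump further than `step` allows.
    return x + random.uniform(-step, step)

def evolve(pop_size=20, generations=50, keep=5):
    population = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:keep]         # selection: keep the fittest
        population = parents + [
            mutate(random.choice(parents))  # variation: perturb a parent
            for _ in range(pop_size - keep)
        ]
    return max(population, key=fitness)

print(evolve())
```

The step size makes the "no wheels" objection concrete: any optimum separated from the current population by more than the mutation operator can bridge is simply unreachable.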

I'll definitely need to look into how people have improved the idea and whether it is practical now.




This is not a new idea at all. Many, many people have had it; no one can really claim it.


Wikipedia has humor:

> The same observation had previously also been made by many others.


I genuinely laughed reading the first words. Yeah, it's hard to be novel.

Don’t worry, Twitter bros already coined it.

Genetic algorithms have existed since the 60s / 70s, e.g. computers learning to play a game. LLMs aren't particularly good at it.

I think hyperparameter tuning may actually be a kind of genetic algorithm.


Hyperparameter tuning could be done by genetic algorithm. I think it’s a bit of a category error to say that it is a genetic algorithm though.
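For concreteness, a genetic-algorithm-style hyperparameter search might look like the sketch below. Everything here is made up for illustration: `score` stands in for a real validation metric, and the two parameters (learning rate and batch size) are just common examples.

```python
import random

def score(params):
    # Hypothetical validation metric that peaks at lr = 0.1, batch = 64.
    # In practice this would train a model and return validation accuracy.
    lr, batch = params
    return -((lr - 0.1) ** 2 * 100 + (batch - 64) ** 2 / 1000)

def mutate(params):
    # Multiplicative jitter keeps both parameters positive.
    lr, batch = params
    return (max(1e-4, lr * random.uniform(0.5, 2.0)),
            max(1, int(batch * random.uniform(0.5, 2.0))))

def ga_search(pop_size=10, generations=30, keep=3):
    # Initial population: log-uniform learning rates, common batch sizes.
    pop = [(10 ** random.uniform(-4, 0), random.choice([8, 16, 32, 64, 128]))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        elite = pop[:keep]                  # selection
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - keep)]
    return max(pop, key=score)

print(ga_search())
```

This is the sense in which tuning "could be done by" a GA: the GA is one possible search strategy over the hyperparameter space, not what tuning inherently is.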

Hyperparam tuning is usually done by Bayesian Optimization though.


Yeah, that’s correct. Tuning could use a genetic algorithm, but there are better alternatives for this particular problem.

You know this doesn’t work most of the time…




