After quite some time, and actually after reading this post[0], I took another look at GNU TeXmacs, this time with a little more depth and patience. And indeed, the program is an incredibly powerful tool for creating beautiful documents. I'm also currently on a roll where I'm reappreciating the philosophical advantages of WYSIWYG. Anyway, for me it's definitely a hidden gem for anyone who is annoyed by LaTeX and is open enough to try WYSIWYG.
To save people’s time: this thing is not LaTeX and you won’t be able to use any of the LaTeX packages that you need if you are preparing a manuscript for a journal (for example).
I am somewhat concerned about the volatility. All three languages have their merits, and each has a stable foundation developed and established over many years. The fact that the implementation language has been changed within such a short period, or rather that the direction has been reversed, does not inspire confidence in the overall continuity of Ladybird's design decisions.
Not just volatility but also flip-flopping. Rust was explicitly a contender when they decided to go with Swift 18 months ago, and they've already done a 180 on it despite the language being more or less the same as it was.
they tried swift, it didn't work, and they figured rust was the best remaining option. that's not "flip-flopping" (by which I assume you mean random indecisiveness that leads to them changing their mind for no reason)
Yup, this was not flip-flopping, it was willingness to be open to options, even if it means going back on a decision branch made earlier in the process.
For the Ladybird project, now is the best time to be making a big decision like this, and it's commendable that the project lead was honest enough to recognize when an earlier attempt was not working, to re-think, and to come to a better decision. I'm no fan of Rust, but for this project I think most of us would agree it's a better language than Swift for their purpose.
They made a very pragmatic and sensible decision after reviewing Swift that it wouldn't be suitable for their purposes, so they shifted to the next best alternative. I think they reasoned it very well and made a great decision.
There's been some fun volatility with the author over the years. I told him once that he might want to consider another language to which he replied slightly insultingly. Then he tried to write another language. Then he tried to switch from C++ to Swift, and now to Rust :P
Indeed, and as a learning experience those 18 months are well worth it, but in many ways they are also 18 months wasted. There is a strong sense of NIH with the Ladybird dev(s), and I wonder if that isn't their whole reason for doing this.
I've seen another team doing something similar, they went through endless rewrite cycles of a major package but never shipped, and eventually the project was axed when they proposed to do it all over again, but this time even better.
> Indeed, and as a learning experience those 18 months are well worth it, but in many ways they are also 18 months wasted.
the thing for me is (and maybe i've missed something?) that after 18 months of struggle i'd really like to see a more insightful blog post* that goes into detail about what exactly failed and the process that led to it... as a language enthusiast i think getting valuable lessons/reflections would be cool (was the cause swift c++ interop progressing too slowly? or some other technical hurdle? was there politics involved? etc etc)
* of course i'm just an internet person, i don't deserve anything from anybody ^^
The sense of NIH is from Serenity, and that was probably the reason for Jakt's existence too. Now that it's spun off into its own project, there is a lot more pragmatism.
Well, here's hoping, because we really need a stand-in for FF. I realize the irony here, in that a new browser is the ultimate 'NIH' project, but that one I can get behind because the browser landscape is much too fragile. Of course, they might end up taking users away from FF rather than from Chrome, Edge, or Safari.
In case you didn't know, they're now using a lot of third-party libraries for pretty major things: libcurl for HTTP, Skia/HarfBuzz for rendering, libxml, OpenSSL, ffmpeg, etc.
> The role of the human engineer […] has been to reduce risk in the face of ambiguity, constraints, and change. That responsibility not only endures in a world of Write-Only Code, if anything it expands.
> The next generation of software engineering excellence will be defined not by how well we review the code we ship, but by how well we design systems that remain correct, resilient, and accountable even when no human ever reads the code that runs in production.
As a mechanical engineer, I have learned how to design systems that meet your needs. Many tools are used in this process that you cannot audit by yourself. The industry has evolved to the point that there are many checks at every level, backed by standards, governing bodies, third parties, and so on. Trust is a major ingredient, but it is institutionalized. Our entire profession relies on the laws of physics and mathematics. In other words, we have a deterministic system where every step is understood and cast into trust in one way or another. The journey began with the Industrial Revolution and is never-ending; we are always learning and improving.
Given what I have learned and read about LLM-based technology, I don't think it's fit for the purpose you describe as a future goal. Technology breakthroughs will be evaluated retrospectively, and we are in the very early stages right now. Let's evaluate again in 20 years, but I doubt that "write-only code" without human understanding is the way forward for our civilization.
If this worked, it'd have worked on low-cost devs already. We've had the ability to produce large amounts of cheap code (more than any dev can review) for a long time.
The root issue is it’s much faster to do something yourself if you can’t trust the author to do it right. Especially since you can use an LLM to speed up your understanding.
What stands out for me is that the productivity gains for small and medium-sized enterprises are actually negative. But in Germany, for example, these companies are the backbone of the entire economy. That means it would be interesting to know how the average was calculated, what method was used, what weighting was applied, etc.
All in all, it's an interesting study, but it leaves out a lot, such as long-term effects, new dependencies, loss of skills, employee motivation, and much more.
> For the first time in my life, I’m suddenly wary of meeting other "computer programmers" in the wild. I feel like there’s a decent chance we won’t actually have much in common, let alone values or morality.
Maybe the social group the author is referring to has split up (forked)? For example, I wouldn't call anyone producing vibe-coded AI stuff a "programmer". That noun will be reserved for the original group.
Prompting AI feels a bit like the feeling I had when I was copy-pasting random JavaScript and Java applets for my first website. The website had MIDI, a scrolling status bar on the bottom, particles that followed the mouse, you name it. I never understood any of it, nor did it make me dig deeper, because it was too complicated. But then at some age I reached for the fundamentals, and that's when I started building things.
All this to say, no, software developer probably not. Script kiddie?
[0]: https://news.ycombinator.com/item?id=47152982