The Encyclopedia Galactica defines a robot as a mechanical apparatus designed to do the work of a man. The marketing division of the Sirius Cybernetics Corporation defines a robot as “Your Plastic Pal Who’s Fun to Be With.” The Hitchhiker’s Guide to the Galaxy defines the marketing division of the Sirius Cybernetics Corporation as “a bunch of mindless jerks who’ll be the first against the wall when the revolution comes,” with a footnote to the effect that the editors would welcome applications from anyone interested in taking over the post of robotics correspondent. Curiously enough, an edition of the Encyclopedia Galactica that had the good fortune to fall through a time warp from a thousand years in the future defined the marketing division of the Sirius Cybernetics Corporation as “a bunch of mindless jerks who were the first against the wall when the revolution came.”
It's like low-background steel. There's a finite amount of information out there that's not AI garbage, and the percentage of scrapeable information that isn't tainted grows smaller by the moment.
I imagine this trend will continue, and the price will go up.
This is not creating widgets or lines of code, not creating a product for consumption, this is fostering the development of inquisitive minds, hopefully encouraging them to become critical thinkers and ultimately the next generation of leaders who will push the bounds of human knowledge further than ever before.
Why would better tools be expected to enable teachers to do that for more students at a time?
There is a lot of research out there showing worse educational outcomes as class sizes increase. This is one of the areas where wealth disparities in education manifest; rich areas tend to have smaller class sizes, and historically the very rich would pay for private tutors for their kids, whereas poor kids are stuck with bigger class sizes, less individual attention from educators, and typically worse educational outcomes on average.
>This is not creating widgets or lines of code, not creating a product for consumption, this is fostering the development of inquisitive minds, hopefully encouraging them to become critical thinkers and ultimately the next generation of leaders who will push the bounds of human knowledge further than ever before.
There's plenty of drudge work teachers do that's not "fostering the development of inquisitive minds": grading papers, preparing lesson plans, etc. I don't see why at least some of that can't be offloaded to AI.
> Why would better tools be expected to enable teachers to do that for more students at a time?
Khan Academy showed that one great teacher distributed to millions does that pretty well. It doesn't make sense for every teacher in the country (the worst and the best) to create their own syllabus and teach the same thing over and over again.
Corroborates that zero-trust until now has been largely marketing gibberish. Security by design means incorporating concepts like these rather than assuming your upstream providers won't be utterly owned in a supply chain attack.
I'm a hot dog chef with over 20 years of experience. Credited with inventing 274 hot dog styles. International awards. World-renowned industry figure.
My entire team, very competent hot dog experts, was laid off after a hot dog cooking machine could do what took us 3 months, in just one day. I've been out of a job for 12 months. The reason? All hot dog making has been offloaded to Claudog Hotdog. "Sorry. Hot dog manual cooking is a thing of the past", one recruiter told me.
I'm working as a software engineer as we speak. I keep applying to hot dog related positions but I get no interviews. Even positions significantly below my pay grade and skillset. No one is hiring. Hot dog cooking is over. We are entering a new era.
I'd take these options from several companies (all selling hotdogs) and wrap them up in Collateral Hotdog Obligations which I'd then offer to investors.
I build bicycles. I was shocked when our internal team built a bicycle that goes to the moon.
We are afraid to release it to the public! And thus we are shutting down the company. We don't want humans polluting the Moon, the atmosphere, and space!
Very cool that these companies can scrape basically all extant human knowledge, utterly disregard IP/copyright/etc, and they cry foul when the tables turn.
We should treat LLMs somewhat like patents or drugs. After 5 years or so, the models should become open source, or at the very least the weights. To compensate for the distilling of human knowledge.
All extant human knowledge SO FAR. Remember, by the nature of the beast, the companies will always be operating in hindsight with outdated human knowledge.