Cancer also stems from non-heated tobacco, because the plant itself contains carcinogens. Smokeless tobacco held against the lining of the mouth, for example, often causes lesions and other damage.
> This makes them AMD’s first desktop chips to qualify for Microsoft’s Copilot+ PC label, which enables a handful of unique Windows 11 features like Recall and Click to Do.
This is not the selling point they think it is.
The problem I see with the AM5 socket is simply the fact that the DDR5 RAM needed to support it is just too expensive. So this will not really make the big impact they were hoping for.
I have a hard time believing ANYONE thinks this is a selling point. Literally anyone, including marketing and execs at Microsoft. I think they have just sunk too much money in it to quit, so they keep doubling down.
"The problem I see with the AM5 socket is simply the fact that the DDR5 RAM needed to support it is just too expensive."
DDR4 is basically just as expensive. At least DDR5 gives on-chip error correction (not as good as full ECC).
For a general computer, there's not that much difference between AM4 and AM5 unless you really want the extra speed of DDR5, PCIe 5.0, and the newest processors. You can build a very capable AM4 machine for slightly less money, but that savings comes from the CPU and motherboard, not from the RAM.
DDR4 looks to be around half the price of DDR5 on the used market to me. I wouldn't call that slightly less money unless you weren't planning to install much RAM.
On the new market, they are really close. Most consumers are buying new. If we want to get pedantic, we can compare buying used systems on Facebook marketplace to cannibalize the parts and resell the others to see net cost.
I don't think most people building AM4 systems currently are buying new, or at least not everything new, simply because depending on what you're looking for there might not even be any new parts.
The ECC in DDR5 is there just to make it work because of the errors caused by the density and the data rates. The ECC isn’t there for you, it is for the manufacturers.
I think for brand-new computers/builds that's correct, but where it hit me was wanting to upgrade an existing desktop. I already have more DDR4 RAM than I need and would have been willing to purchase a new CPU and motherboard, but being forced to also purchase new RAM at the same time made it too big of a price tag all at once. I just found the best Zen 3 CPU I could on eBay and called it a day.
I think your point still stands overall for AMD's business though, I assume a vast majority of CPUs are purchased in new desktops?
I am still running DDR2 & DDR3 machines! I was going to finally make the big upgrade this year but am now holding off until the market finds a little bit more sanity.
Agreed, I just built a desktop for the first time in ages; it's a big change from using laptops with numerous components plugged into them. Everything was comparably reasonable except the RAM, or anything that has a memory chip on it (RAM, NVMe, etc.), so I did some research just to make sure. All in all I'm happy with the result: I went with an AMD 9900X and skipped the graphics card for now.
I would like to add that looking for bundles helps a lot. If you have a Micro Center near you, use it to your full advantage; from what I've seen, they are the only ones offering promotional items with bundles at the moment. The main objective is to skip the price gouging on RAM, which costs more than the CPU at the moment. I got a CPU and motherboard plus 32 GB of RAM for $600, and that was a save: the RAM alone was $445.
HN is overloaded with AI stuff; it's hard to break through all the noise. I say this as someone very interested in AI. Even I skip some links because it's just too much.
I see it making claims about 10x efficiency, but how is tokens/second/watt? The only machines I've seen with the memory bandwidth to effectively do local inference are M-series ARM chips on Macs.
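The metric being asked about is straightforward to compute once you measure throughput and power draw; a minimal sketch with purely illustrative numbers (none of these are real benchmarks):

```python
# Back-of-envelope tokens/second/watt for a local inference run.
# All figures below are hypothetical assumptions, not measurements.
tokens_generated = 512     # tokens produced during decode
elapsed_seconds = 20.0     # wall-clock decode time
avg_power_watts = 55.0     # average package power over the run

tokens_per_second = tokens_generated / elapsed_seconds
efficiency = tokens_per_second / avg_power_watts
print(f"{tokens_per_second:.1f} tok/s, {efficiency:.3f} tok/s/W")
```

Comparing chips on this number (rather than raw tok/s) is what would substantiate or refute a "10x efficiency" claim.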
Because it's not faster than the Ryzen 395's GPU. Power efficiency doesn't matter as much as TTFT (time to first token) for desktop users, especially when they're tasking their AMD box as a dedicated inference machine.
Some older pre-395 AMD articles suggested it would be possible to use the NPU for prefill and the GPU for decoding, and that this would be faster than using either alone, but we have yet to see that (even on Windows) for any usefully sized models, just toys like LLaMA-8B.
The main thing I've noticed with local NPUs is that very few applications take advantage of them, and IIRC it's because documentation for them is lacking and there are quite a few different kinds out there, each with different APIs, libraries, etc.
To really take advantage of those GPU cores you need memory bandwidth; modern transformer-based LLMs are really bandwidth hungry. I am really happy to see this first push. NVIDIA's discrete GPU/memory approach is an option, but not great for a lot of different reasons. Unified memory architectures like what AMD and Apple have are the way to go for the future. Put 256 GB of RAM on the main board and let it be accessed at speed for LLM use, please.
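The "bandwidth hungry" point can be made concrete with the common rule of thumb that single-stream decode reads roughly the full weight set once per token, so throughput is capped at bandwidth divided by model size. A sketch under assumed (hypothetical) hardware numbers:

```python
# Rule of thumb: single-stream LLM decode is memory-bandwidth bound,
#   tokens/s  <=  memory_bandwidth / bytes_of_weights_read_per_token.
# All numbers are illustrative assumptions, not specs of any real part.
model_params = 70e9            # 70B-parameter model
bytes_per_param = 0.5          # ~4-bit quantization
weight_bytes = model_params * bytes_per_param   # 35 GB of weights

bandwidth_bytes_per_s = 256e9  # assumed 256 GB/s unified-memory system
est_tokens_per_s = bandwidth_bytes_per_s / weight_bytes
print(f"~{est_tokens_per_s:.1f} tok/s upper bound")
```

This is why unified memory with high bandwidth matters more than raw compute for this workload: doubling bandwidth roughly doubles the decode ceiling, while extra GPU cores sit idle waiting on memory.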
I think the article is being a bit disingenuous. The real problem is AAA developers and publishers pushing for it to replace a not-insignificant amount of creative work.
While still overcharging everyone and scalping every dollar from everyone with microtransactions and game mechanics that need XP boosters.
I don't think the article pointing out the existence of AI use outside of that “real problem” is disingenuity. It's possible to simultaneously condone indie developers using AI to make the most of their limited budgets and condemn AAA studios for using AI to lay off their workers; conflating one with the other just makes it harder to have an honest and productive conversation about whether AI helps or hurts software development as a profession and a craft.
> I don't see the discourse of indie or single game developers being ostracized in some public shaming trend
Not specifically “game” developers, but I do see attempts at that ostracization on the OSDev subreddit; at least one participant there has posted progress updates on a vibe-coded hobby OS, and each of those updates ends up deluged with people complaining specifically about the AI use.
> and each of those updates ends up deluged with people complaining specifically about the AI use
I would genuinely like to see this thread, because if the comments are legitimate and backed up by examples, e.g. "This is XSS vulnerable," then even with the prefix of "AI slop" I'm fine with it.
I think it's fair that people don't get too comfortable just trusting vibe-coded agents, when in my own experience the bugs they leave around are often harder to identify in a simple review than a simple architecture misalignment.
It would elevate the conversation significantly if people didn't use "vibe coding" and AI use interchangeably.
I don't use Reddit, but you don't have to look for a specific thread. It often feels like there's a mob of people just waiting for fresh meat to wander into their camp. Literally any thread referencing AI on this site is full of people who appear to have nothing but venom and contempt for people who use these tools.
It's not everyone, but loud minorities are still loud.
I'll try to only use the term AI, as vibe coding is more a derivative of it. There definitely are some people who are just doomers about AI entirely, and I think that's always the case with any new technology. That said, you can't deny there is just an unholy amount of useless applications of AI tooling that aren't providing anything useful other than generating 'slop'.
Using AI tools for protein folding or medical breakthroughs for example will impact the world in a positive way. People will champion that. Using them to automate your creativity hasn't been in demand by anyone except shareholders or people looking to milk a quick ad revenue for little to no effort. So of course, there's a negative sentiment.
No, I'm asking why a website where someone fills in a few fields and gets the optimized LLM as a result would need to run in a container. It's a web form.
The use cases they give here are so bad: "Customer service automatically creates a ticket. Shops automatically for you. Books a flight automatically for you."
> Maybe it’s just more visible now but it seems like these companies are really accelerating in their evil lately.
Well I mean they have no incentive to behave any other way, if anything they are rewarded via the shareholders. Number goes up when they perform mass layoffs. That tells you everything you need to know.
Not particularly 'educational' at all, but "My Dad Wrote a Porno". A friend recommended it and I have been wetting myself laughing on the work commute.
Also "Stuff you should know" is a super popular one that always gets a listen.
I'm not a doctor, though, so while I might sound sure, this is based on what I've read on the topic over many years.
Edit: rightly corrected; it's not just heating and burning, it's tobacco and others in general. But nicotine itself is not carcinogenic.