Hacker News | serg_chernata's comments

Love the reference. On a more serious note, I'm really curious how this will play out. Nvidia seems to be doing its best to prop up the prices of existing models as it prepares to launch the 4k series. The big question seems to be whether most of these miners will start mining some other token or get out of GPU mining entirely.


If the cards have CUDA support, I would guess they're off to some sort of p2p AI/ML marketplace. Unfortunately, AMD cards were actually better for mining. If anybody knows of something like vast.ai or Render for AMD, I'm all ears.


> they're off to some sort of p2p AI / ML marketplace

Seems to be at least slightly true, by my amateur judgement. I sometimes use https://vast.ai, and there seem to be more offers than usual; currently ~180 instances are available for rent.


Morgenrot Cloud (https://morgenrot.cloud/) is the main consumer grade AMD compute provider I know of. Not quite vast.ai in that they're centralized, but they've got the hardware.


> Nvidia seems to be doing its best to prop up the prices of existing models as it prepares to launch the 4k series.

Not really, no:

https://wccftech.com/hours-into-the-eth-merge-nvidia-geforce...

New 3000-series retail prices on the high-end cards have been steadily dropping, and used prices on eBay seem to have dropped 10% in the last month.

As for the 4000 series cards - they've stated in SEC filings that they will be trickling out stock to keep prices high.

AMD are the ones who are really fucked; their cards suck, and nobody bought them out of choice but desperation. Now that the market is glutted, people will heavily prefer nvidia cards.


> AMD are the ones who are really fucked; their cards suck, and nobody bought them out of choice but desperation.

Do they really suck? From some benchmarks I saw, there are cards comparable to the 3060–3070 (the 6800, IIRC), which is solid midrange competition.


It depends on the game. In some games AMD cards blow Nvidia cards (of the same "tier") away; in others it's the opposite. The GP commenter who said they "suck" is incredibly wrong. I can't understand people who fan-boy over giant corporations.


Since at least the Pascal microarchitecture, NVIDIA has beaten AMD in performance per watt, and AMD is infamous for how unreliable and poorly performing their Windows drivers are. AMD manages to do decently in benchmarks because they overclock and overvolt their cards with massive heatsinks, and the cards last just long enough to run the benchmark before overheating and downvolting/downclocking.

All of this is accepted industry fact, and your devolving the discussion into personal insults is proof of this; otherwise you would have come armed with actual tests and reviews.


Yet I have none of these issues with an AMD card. Apparently I'm the only exception in the whole world, considering it's "industry fact."


The last paragraph there about AMD seems completely baseless and overstated.

NVIDIA does have much higher market share and brand recognition from both ML and gamers for now, but AMD has been firing on all cylinders for quite a few years, and has built a terrific open source driver codebase to further refine.

Even throughout the 2010s AMD offered price-competitive models that definitely didn’t “suck” for what they cost, or require “desperation” to purchase.


> AMD are the ones who are really fucked; their cards suck, and nobody bought them out of choice but desperation. Now that the market is glutted, people will heavily prefer nvidia cards.

I bought only AMD equipment for the last few years out of disdain for the market manipulation by Intel/Nvidia (see the Intel x86 compiler and Nvidia GameWorks, both not so much optimizing for their own hardware as deoptimizing for competitors'), and I have gotten completely adequate gear for the price paid. Not so much desperation here.


Technically, the video covers this at the very end. You'd need a lens between the eye and the LED to focus the light.


Happy birthday. Today I'm 33 and we sound eerily alike. I'll try to heed some advice from this thread and hope that I won't post the same thing in 11 years.


Agreed. We tried bolting a js front onto our elixir app and instead ended up using liveview. It saves us a ton of effort and has most of the bells and whistles our users expect anyway.


Jamie's channel is one of my all time favorites. I love watching him work exactly because of his general attitude toward problem solving and how different it is from mine.



Can you suggest one as an 'entrypoint'? I watched a couple after Matthias Wandel mentioned him, but they were a bit.. well, even the comments were saying things like 'that's it he's really fallen off the deep end now'.


Not sure how far back his YT channel goes, but I used to follow Jamie way back when he was based in a crappy flat in Southern England developing some kind of hexapod toy, and thereafter when he was in [somewhere in North America] building a crazy place in a forest somewhere.

I've not been able to get into his Southern American adventures so much.


But is TSMC producing these ASIC chips?


Yes, but more generally they use the same raw materials.


I received my MBA in IT from WGU and couldn't be happier. It cost me a total of $8,000 and was one of the best things I have ever done for my education. This MBA cost me a fraction of what my BS did.


Happy New Year!


Same here; I cancelled immediately. I swallowed the previous price increases, but this is getting close to regular cable TV pricing. I switched to YTV to avoid cable to begin with.


Then you'll be just as disappointed as I am. This thing comes with 4 USB-C ports and an audio jack. That's it.

