this post was submitted on 28 Nov 2023

Gadgets

[–] Shoshke@alien.top 1 points 9 months ago (1 children)

And this is part of the reason why Nvidia can price their consumer GPUs the way they do.

They're going full throttle on B2B, and anything that might've been dedicated to the retail market just frees up production for the AI business, which is absolutely ridiculously profitable for them ATM.

[–] XYHopGuy@alien.top 1 points 9 months ago

Data center products are bottlenecked by chip-on-wafer-on-substrate (CoWoS) packaging, since they use chiplets.

Gaming GPUs are not. Nvidia has pricing power on consumer GPUs because they're a generation ahead of their competitors.

[–] skyflex@alien.top 1 points 9 months ago

I got to enable a special instance tier for one of our engineering teams in AWS the other day. The instances come with 6 or 7 of these GPUs. I had to coordinate with our TAMs because they're basically bare-metal hosts and cost so, so much. He told me even internal AWS folks aren't allowed to play with them because of the cost and demand. Crazy.
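For anyone curious what such a host looks like: below is a minimal sketch of how you could inspect the GPU configuration of an AWS GPU instance type with boto3. The instance type p5.48xlarge is my assumption for illustration; the comment above doesn't name the exact tier that was enabled.

```python
# Minimal sketch (not from the thread): inspect the GPU configuration of an AWS
# GPU instance type. "p5.48xlarge" is an assumed example; the commenter didn't
# name the exact instance tier they enabled.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.describe_instance_types(InstanceTypes=["p5.48xlarge"])

for itype in resp["InstanceTypes"]:
    for gpu in itype.get("GpuInfo", {}).get("Gpus", []):
        print(
            itype["InstanceType"],
            gpu["Manufacturer"],                      # e.g. "NVIDIA"
            gpu["Name"],                              # accelerator model
            gpu["Count"],                             # GPUs per instance
            f'{gpu["MemoryInfo"]["SizeInMiB"]} MiB',  # memory per GPU
        )
```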

[–] watduhdamhell@alien.top 1 points 9 months ago (1 children)

AMD and their MI300 should gain similar traction and propel AMD to much greater heights, market-cap wise. Now might be an excellent time to buy...

I mean, who else makes this level of AI accelerator? Nobody. Nobody but AMD and Nvidia can do this right now. Seems to me they are both going to be much, much larger companies in the next 10 years than anyone thought they might be.

[–] Sirisian@alien.top 1 points 9 months ago (1 children)

I mean, who else makes this level of AI accelerator?

Google, Microsoft, and Amazon are making AI accelerators for their datacenters. Some are for training and others are for inference (running the trained models for services).

I'd be somewhat hesitant until the full benchmarks are out. AMD's higher memory capacity, I think it was, sounded neat, but that advantage could prove very fleeting.

[–] watduhdamhell@alien.top 1 points 9 months ago

None of those companies are real chip makers. While they may be able to produce a nice custom solution for their own applications, they'll never compete with the actual chip makers. AMD and Nvidia will continue to supply a greater and greater share of AI super chips, eventually squeezing out most of the custom solutions.

[–] Lewd_Pinocchio@alien.top 1 points 9 months ago (2 children)

Average MSRP is $30K USD.

Off one customer, Nvidia made $15 billion. Jesus.

[–] wakIII@alien.top 1 points 9 months ago

Hyperscalers don't pay anywhere close to MSRP; expect more like $5-8K per chip.
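For a rough sense of scale, here's a back-of-the-envelope sketch of the unit counts those figures would imply. The ~$15 billion single-customer figure and the per-chip prices are the numbers quoted in this thread, not official ones, and the real mix across products isn't public.

```python
# Back-of-the-envelope only: implied GPU unit counts from the numbers quoted in
# this thread (~$15B from one customer; $30K MSRP vs. $5-8K hyperscaler pricing).
# These are the thread's figures, not official Nvidia numbers.
revenue = 15e9

for label, price in [
    ("MSRP ($30K)", 30_000),
    ("hyperscaler, low ($5K)", 5_000),
    ("hyperscaler, high ($8K)", 8_000),
]:
    units = revenue / price
    print(f"{label}: ~{units:,.0f} GPUs")

# MSRP ($30K): ~500,000 GPUs
# hyperscaler, low ($5K): ~3,000,000 GPUs
# hyperscaler, high ($8K): ~1,875,000 GPUs
```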

[–] shalol@alien.top 1 points 9 months ago

Yeah, their pricing already reflects investors' ambitions of selling the pickaxes for the gold rush. IMO, it's never about current success but about the potential behind a company.

[–] Acceptable-Truck3803@alien.top 0 points 9 months ago (1 children)

Yeah, I'm in this industry, and to get an H100/A100 card you are looking at around 40-60 weeks, or at least 4 months. It's not Meta/Facebook alone that uses these cards…

[–] FightOnForUsc@alien.top 1 points 9 months ago

40 weeks is a lot more than 4 months (it's roughly 9 months)… unless you're saying the low end isn't 40 weeks.