this post was submitted on 03 Nov 2024
Technology
Panther Lake and Nova Lake laptops will return to traditional RAM sticks

InverseParallax@lemmy.world 12 points 2 weeks ago

Arcs are OK, and the competition is good. Their video encode performance is absolutely otherworldly though, just incredible.

Mostly, they help bring the iGPU graphics stack and performance up to par, and keep games targeting them well. They're needed for that alone, if nothing else.

Buffalox@lemmy.world 6 points 2 weeks ago

They were competitive for customers, but only because Intel sold them at no profit.

InverseParallax@lemmy.world 4 points 2 weeks ago

I mean, fine, but it's a first gen; they can fix the features and yields over time.

First-gen chips are rarely blockbusters; with my first-gen chips we were happy just to make it through bring-up and customer eval.

Better yet, because software is so much of their stack, they have huge headroom to grow.

Buffalox@lemmy.world 1 point 2 weeks ago

First gen chips are rarely blockbusters

True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards and thoroughly beat everybody else: ATi, S3, 3dfx, Matrox, etc.

But you are right, these things usually take time; for instance, Microsoft was prepared to go 10 years without making money on Xbox, because they saw its potential in the long run.

I'm surprised Intel considers itself so hard pressed that it's already thinking of giving up.

InverseParallax@lemmy.world 11 points 2 weeks ago

True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards, and beat everybody else thoroughly. ATi, S3, 3Dfx, Matrox etc.

Actually, they didn't.

This was their first: https://en.wikipedia.org/wiki/NV1

A complete failure: overpriced, under-capable, one of the worst cards on the market at the time, and it used quadratic surfaces instead of triangles.

The NV2 was supposed to power the Dreamcast and kept the quads, but it was cancelled.

But the third one stayed up! https://youtu.be/w82CqjaDKmA?t=23

Buffalox@lemmy.world 4 points 2 weeks ago

You are right.

and used quadratics instead of triangles.

Now that you mention it, I remember reading about that, but completely forgot.
I remembered it as the Riva coming out of nowhere. As the saying goes, first impressions last. And I only learned about NV1 much later.

But the third one stayed up!

👍 😋

But Intel also made the i815 GPU, so Arc isn't really their first.

InverseParallax@lemmy.world 4 points 2 weeks ago

Oof, yeah, they actually had another one they didn't release, based on Pentium cores with AVX-512, basically Knights Landing with software support for graphics.

They were cancelling projects like it was going out of style, which is sad; that one would have been amazing for AI.

Buffalox@lemmy.world 3 points 2 weeks ago

Yes, there was the Xeon Phi, Knights Landing, with up to 72 cores and 4 threads per core!
Knights Landing did go into production, but it was more a compute accelerator than a GPU.

I'm not aware they tried to sell it as a GPU too? Although, if I recall correctly, they made some real-time ray-tracing demos with it.

InverseParallax@lemmy.world 7 points 2 weeks ago

So, trying not to dox myself, I worked with the architect twice.

Knights Ferry was derived directly from Larrabee (the GPU): P54C cores with a pre-AVX-512 vector ISA.

KNC was a die shrink with more cores. Both of these were PCIe accelerators only.

KNL had full Airmont Atom cores with SMT4, basically meaningful cores with proper AVX-512. You could also boot it standalone under Linux, or run it as a PCIe accelerator.

KNM added ML instructions, basically 8/16-bit floats and faster SIMD.

They cancelled KNH.

I interviewed some of the actual Larrabee guys, and they were wild. There was a lot of talk about dynamic translation; they were trying to do really complex things. But when people talk like that, it makes me think they were floundering on the software and just looking for tech-magic solutions to fundamental problems.

Intel always dies when the software gets more complex than really simple drivers; it's their Achilles' heel.

KNL also had the whole MCDRAM on package, for basically HBM bandwidth, but that didn't actually work very well in practice, again due to software issues (you have to pick where you allocate, and using it as an L4 cache was not always effective).