Good, at least the execs didn't go with some awful AI inspired name. This way, all the old datasheets they never reformatted are still correct.
Intel completely missed the boat on low-end stuff with Altera. Only recently, since 2021, have they started to change that. Last year they were soliciting use cases and market research for the Agilex 3. From my talks with them, they have a pretty good handle on what they need to do for the low end.
The presentation today was mostly fluff about AI, which gets dedicated die area in their Agilex 5. One use case they showed: AI running on the chip in a cell base station to detect traffic patterns and manage QoS better.
I'm not sure the AI bet is worth all the marketing though. FPGAs are almost always used in hard real-time, embedded applications where behavior must be deterministic in many ways. How is AI gonna help there?
What could it possibly mean to have an FPGA infused with AI? Does anyone know what claim they're making? It seems LUTs and AI are at odds with one another.
I hope these new Altera parts are more reliable than the original ones. The original EP300 was sensitive to power supply rise times and one time in a thousand would come up brain-dead (a rate we counted after several field complaints). Hence the EP310, and never any more Altera designs for us.
Those are 30-year-old parts; of course they're different now. Errata exist, but they're always mitigated or documented, and I've not run into any show-stoppers in the past 10 years. The software, though? Different story.