Big Pink wrote on 2024-08-28, 00:38:
Ozzuneoj wrote on 2024-08-27, 17:28:It seems like that would have been a pretty huge oversight in the design, but they may have had other good reasons to make the connector this way. Maybe they did this to manage the force required to insert or remove cards? If half the pins are at different depths then it would require half the force spread out over two rows of pins.
Given the number of pins, if AGP card edges weren't staggered it would end up being like VLB all over again. No one wanted the "thousand pin apocalypse".
Normal AGP has only around 16 more pins than PCI, so I don't think it was purely a connector-size issue. AGP Pro is, of course, much larger and has more pins, but it came several years later and might as well never have been created (auxiliary power connectors on cards made AGP Pro irrelevant). Perhaps the designers anticipated AGP growing over time while expecting PC motherboards to shrink over that same period, necessitating a denser slot, but AGP 2x, 4x, and 8x all use the same size slot, so I don't know how much that really affected things.
PCI-Express x16 slots have 32 more pins than AGP and are physically longer, yet their designers apparently didn't feel the need to stagger the pin depths to save space.
Anyway, it's surprising to me just how little information exists about the AGP standard and why it was designed the way it was. Or, I should say, my Google searches mostly turn up barely related results and nothing very technical. I'd be curious to know exactly why it was made the way it was.
The only standard PC interfaces I can think of that use staggered pin-depth connectors like these are EISA, Slot 1, Slot A, and AGP. None of these are known for being exceptionally reliable or stable connectors, which may be why PCI-E and other later interfaces moved away from staggered pins.