38 Comments

  • sethborg - Thursday, December 22, 2005 - link

    How about a follow-up article? Where are we now? Yeah, we'd be missing out on SLI, but most people don't use it anyway. I'd rather have a huge heatsink on my GPU and get 100MHz more out of it than have SLI anyway. Do you think we'd ever see two cores on one chip? I dunno if that makes sense since it's all parallel anyway - just use that chip area for more lanes. I dunno.
  • Rock Hydra - Sunday, October 30, 2005 - link

    This seems like a great idea, especially for SLI, but the memory would have to be on the die or package, or else memory latency will be a killer. Motherboards are already cramped as they are, so the size of the package is going to make a big difference. If there is a Zalman cooler like the CPU coolers, then there won't be room for PCI/PCI Express devices. Additionally, GPUs generate more heat than CPUs, and CPUs already have dedicated exhausts in the case (rear panel/PSU fans). That would most likely mean an exhaust fan near the PCI cutouts for cases. The idea is very interesting, but I think it will be hard to implement. I also think there has to be a third-party bus committee that both companies go to, adhering to its standards.
  • SniperWulf - Tuesday, October 25, 2005 - link

    I remember when I mentioned sockets back in the 3dfx days and people laughed at me...
  • Rock Hydra - Sunday, October 30, 2005 - link

    Don't worry, man. Most ideas that seem revolutionary or extraordinary now were laughed at in their infancy.
  • tyborg - Tuesday, October 25, 2005 - link

    I predict that NVidia will become the chipset/gfx king, and ATI will become dedicated to PPUs
  • KingofCamelot - Wednesday, October 19, 2005 - link

    Given that ATI and NVIDIA never seem to get along, I doubt there would be one standard socket. That would lead to four models for every mobo maker: an AMD motherboard with an ATI socket, one with an NVIDIA socket, an Intel motherboard with an ATI socket, and one with an NVIDIA socket. Also, CPU sockets and GPU sockets would likely not be synchronized time-wise. What if ATI comes out with a new socket while AMD and Intel still have their old sockets? That would mean buying a new mobo to upgrade your graphics card. What about backward compatibility? If GPU sockets change, will you have to buy a new graphics card when you want a newer mobo?

    Don't tell me the sockets won't change. Just look at the history of CPUs. You can't limit next-generation architecture by a pin count. Even if NVIDIA's next GPUs are pin-compatible, we don't know if they will continue to be that way down the road of GPU architecture.
  • bob661 - Thursday, October 20, 2005 - link

    quote:

    Due to the fact that ATI and NVIDIA never seem to get along
    They'll agree or the mobo manufacturers won't make the sockets. Look at BTX.
  • Regs - Wednesday, October 19, 2005 - link

    We all know what's going to happen. Just like with our CPU sockets, we would have to upgrade our motherboards to the latest and greatest every year to get the best-performing CPU. You can't run a 3GHz Barton on a Socket 939 board, right? This is just another way for vendors to increase profits mid-year.
  • gibhunter - Wednesday, October 19, 2005 - link

    It would work great if you had RAM embedded in the chip or on the GPU itself, like they do with notebook parts. Cool both the GPU and CPU with one big heatsink with dual contact points and one huge, slow, quiet fan, and you have a killer solution on your hands. I like this idea a lot.
  • tuteja1986 - Wednesday, October 19, 2005 - link

    Great idea, but what is NVIDIA really trying to achieve with this? Are they making sure people need to buy motherboards that are only compatible with NVIDIA cards? Standards need to be put in place, I think, or else it will turn into a very ugly war. SLI and CrossFire are starting to look like the beginning of total domination plans. I want a standard technology that will let the user do either CrossFire or SLI on the same motherboard. Anyway, I am not liking where the motherboard and graphics card industry is going. This is all thanks to the stupid software I'm now starting to hate, called "3DMark".
  • DigitalFreak - Wednesday, October 19, 2005 - link

    I would rather see something like this in a laptop.
  • KristopherKubicki - Wednesday, October 19, 2005 - link

    quote:

    I would rather see something like this in a laptop.


    It's called MXM. LOL :)

    Kristopher
  • A554SS1N - Wednesday, October 19, 2005 - link

    I quite like the idea of it myself - it could hopefully give people a chance to use larger, CPU-sized coolers. Imagine also how this would help the graphics companies increase GPU frequencies by lowering heat... OK, it might not work out like that.

    For a GPU socket, there would have to be video RAM sockets where fast video RAM could be swapped in. This could spell more choice for the consumer (or simply make things even more confusing), as a user could potentially choose individual memory and GPU upgrades. In the long run it would mean not having to upgrade everything at once - for example, if you already had fast enough video memory installed but could only afford the GPU, you could stick in whatever suits your needs. Basically, more customisation. The biggest downsides are potential cost (not an issue if you intended to buy a new system anyway) and, as importantly mentioned here already, space on the motherboard PCB - although in one sense, some of the space required by the GPU socket is gained back from the removal of a standard graphics slot.

    One potential problem with having different video memory and GPU combinations is that more complex drivers would likely be required, with more potential for driver errors.

    The more I think about the GPU socket, the more it seems like a good idea in theory that may just turn into another headache in practice.
  • Visual - Wednesday, October 19, 2005 - link

    This goes against the whole idea of having a separate graphics card...
    but in a way, it might be the right thing to do - do we still need separate cards?

    Originally, the benefit of a separate graphics card was the significantly faster memory access - system memory just wasn't fast enough, and giving the graphics chip a small amount of ridiculously expensive, fast RAM gave quite a boost. Look at things now... we get technologies like TurboCache and whatnot that use system memory instead of dedicated memory, and it actually turns out faster than certain budget solutions used on video cards. Sure, the high end still has 1200MHz RAM or a 512-bit memory bus, making it still quite a bit faster than our system RAM, but that can't last long - dual-channel (maybe even quad-channel) DDR2-800, or one of Rambus's quad-pumped alternatives, might soon make our normal RAM faster than common video RAM (a rough back-of-envelope comparison follows at the end of this post). Let's face it - RAM tech isn't advancing quickly now only because there is almost no demand for it. DDR3 prototypes are already working at 1.3GHz and up, but who needs that when the fastest FSB from Intel is 1066MHz? We've already offloaded the things that would benefit from fast RAM off the CPU, so we're not getting much development in this area :( For a similar reason we're being offered a physics card now instead of better general-purpose CPUs :(

    The graphics processor itself is getting more and more general-purpose functions. It only makes sense, then, to stop looking at this processor as something graphics-specific at some point. It'll just be something like a co-processor that helps with intensive, highly parallel computations alongside the main CPU - i.e., it may be used for graphics and physics. At some point or other we'll have to be able to upgrade it without the memory (it'll be using our super-fast system memory, after all), so it has to get socketed instead of sitting on a card with its own memory.

    A whole different question is whether we need this co-processor to be physically separate from the main CPU (maybe for upgrade flexibility or whatever), or whether we'll get it integrated into the CPU - like AMD's plans have indicated, something like the Cell architecture on steroids.
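
    A rough back-of-envelope comparison of those bandwidth numbers, as a sketch: the clock speeds and bus widths below are approximate 2005-era figures assumed for illustration (dual-channel DDR2-800 versus roughly 1200MHz-effective GDDR3 on a 256-bit bus), not exact product specs.

```python
# Peak theoretical memory bandwidth = effective transfer rate * bus width in bytes.
# The figures below are rough 2005-era illustrations, not exact product specs.

def peak_bandwidth_gb_s(effective_mt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given effective data rate (MT/s) and bus width."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# Dual-channel DDR2-800 system memory: 800 MT/s on a combined 128-bit bus.
system_ram = peak_bandwidth_gb_s(800, 128)    # ~12.8 GB/s

# High-end graphics card of the era: ~1200 MT/s effective GDDR3 on a 256-bit bus.
video_ram = peak_bandwidth_gb_s(1200, 256)    # ~38.4 GB/s

print(f"system RAM: {system_ram:.1f} GB/s  vs  video RAM: {video_ram:.1f} GB/s")
```

    Even by this rough estimate, high-end video RAM still had roughly a 3x peak-bandwidth edge over dual-channel DDR2-800, so the gap was narrowing rather than closed.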
  • Calin - Wednesday, October 19, 2005 - link

    Leading-class GPUs are "better" than CPUs whether you consider thermal power, die area or transistor count. If you chose to put them together, you would get some monstrous result.
    The leading edge will stay with separate GPU and CPU, and on the low end (trailing edge :D) the GPU is in the chipset. That arrangement is better for Intel architectures (and it is easier to mix and match), but on AMD architectures graphics would stay closer to memory if it were in the processor.
    Would there be different processors based on the GPU capabilities included? I'm not sure about that.
  • Saist - Wednesday, October 19, 2005 - link

    I think ATi's new RingBus could be the solution for the memory. Since it's already compatible with everything from DDR to GDDR4, the RingBus should allow socketed GPUs to make it to market.

    Something I actually started wondering about with the HyperMemory and TurboCache boards, and something I recall ATi talking about during an E3 demo of Radeon Xpress, is the capability in integrated video mode to have one channel of the memory controller dedicated to graphics and the other to the processor.

    Something else I've wondered about is whether we would ever see integrated boards with an open memory slot specifically for graphics.
  • Beh - Wednesday, October 19, 2005 - link

    Although it sounds like it may be more of a headache than it's worth. Motherboards seem fairly packed already these days; I can't imagine integrating something like a 7800 GTX into one. Plus, making it upgradeable with sockets and memory slots would take more room, and it would be a heck of a thing to keep all that cooled down. It would also drastically cut down on the number of expansion slots - for one, you wouldn't need as many, and besides, you simply wouldn't have the room! And what about upgrade paths? I already have to worry about the upcoming Socket M2 and the changeover to DDR2; this would just add to my hair loss if I also had to take into account an NVIDIA G70 socket or an ATI R520 socket or GDDR3/4 slots. Factor in SLI/CrossFire and we're talking mobos the size of a pizza box. I don't know if I can take it...
  • Griswold - Wednesday, October 19, 2005 - link

    Agreed. Such a move will cause mucho trouble for the customer, but NVIDIA doesn't care about that. It smells like they're in bed with some mobo makers, due to their chipset ties...

    "You want to use our new G5000? Sure can, but you'll have to use a board with GDDR10 RAM on it, otherwise the GPU will be memory-bandwidth limited... What? Your mobo only has GDDR8? Well, maybe you should buy our latest mainboard with GDDR10 for only $500!"

    I don't think we'll be swapping out video RAM like we do normal RAM now. If you want today's video RAM performance, you can't get that with some sort of DIMM socket.

    I, for one, don't like that idea.
  • bob661 - Wednesday, October 19, 2005 - link

    No more trouble than changing memory and CPUs. I don't see the problem. You can use the space previously occupied by the x16 slot or slots. One really only needs about three x1/x4 slots in a machine anyway (I would only need one or two max). Workstation users can buy a workstation motherboard (larger boards) if they need more slots.
  • Frackal - Wednesday, October 19, 2005 - link

    .
  • JarredWalton - Wednesday, October 19, 2005 - link

    Unrelated to your post, but does everyone else with Firefox see a big with the rendering of the above post? The subject is moved down to the body for me and everything is rather borked for this thread. Hmm. I wonder if there's a problem in the subject?
  • JarredWalton - Wednesday, October 19, 2005 - link

    Just a theory. Ignore this otherwise....
  • zemane - Wednesday, October 19, 2005 - link

    Could the size of your browser font be the problem? You may increase it using Ctrl++, but the blue background height seems fixed and will not follow. To see that post's subject, decrease your font size using Ctrl+-.
  • JarredWalton - Monday, October 24, 2005 - link

    Ah. That's it. Long subject lines with a different font get borked.
  • bersl2 - Wednesday, October 19, 2005 - link

    A big what?

    Anyway, no.
  • JarredWalton - Wednesday, October 19, 2005 - link

    big bug. LOL. Yeah, that's what I meant. (Bad fingers - stop making typos!)
  • sxr7171 - Wednesday, October 19, 2005 - link

    Yeah I do.
  • DigitalFreak - Wednesday, October 19, 2005 - link

    I do as well. Firefox 1.07
  • Sunrise089 - Wednesday, October 19, 2005 - link

    I do as well - Firefox 1.5 Beta 1
  • DeanO - Wednesday, October 19, 2005 - link

    You might be right - unless a standard for the socket can be agreed upon, which admittedly will be less likely than it has been in the past with PCI/AGP/PCI-E, but still it's possible. Otherwise it'd be no different to the current SLI/Crossfire situation...
    - you're probably starting to see from all my posts that I like the idea ;-)
  • DeanO - Wednesday, October 19, 2005 - link

    Does this mean we might see dual socket GPUs? Or would there be dual core GPUs in place of SLI configs? Hmmm...
  • squeezee - Wednesday, October 19, 2005 - link

    This is kind of a half-good idea. While being able to swap in a faster GPU would be nice, it offers a limited upgrade path as well. Some things benefit from pure GPU power, but you still need a fair bit of memory bandwidth these days. With a GPU socket design, the user would be stuck with the same memory performance no matter how fast the GPU gets, and when the memory technology changes from, say, GDDR3 to GDDR4, or if they want more or faster video RAM, they will have to buy a whole new motherboard.
  • RaynorWolfcastle - Wednesday, October 19, 2005 - link

    People forget that one of the reasons that video cards have fast RAM is because the RAM chips are soldered directly to the PCB.

    For one thing, that makes trace-length management simpler, since you always know exactly where each chip will be. It also saves the manufacturer money because they don't have to put in a socket. Thirdly, it makes it possible to add many more contacts (and thus wider data paths), since high-precision alignment for reflow soldering is done at the factory instead of having Joe Six-Pack trying to force the RAM into place in his basement. This also means you get a much cleaner signal path to the soldered RAM.

    With that said, if upcoming GPUs are no longer limited by bandwidth but by pixel and vertex shading power, this could be a viable solution.

    Then again, maybe NVIDIA will push to have RAM chips soldered directly to the mobo. I'm not sure how good an idea that is with the current state of RAM and video cards, however. Mobos generally use fewer PCB layers than video cards (6 vs. 10, last I knew), so routing could become a nightmare.

    Just my 2 cents.
  • DeanO - Wednesday, October 19, 2005 - link

    I don't see why the RAM for the graphics core couldn't be changeable like RAM for the CPU currently is... In fact, it looks like you could have more control over how much RAM you have for the GPU.
    True about the change from, say, GDDR3 to GDDR4, though :-(
  • Schadenfroh - Wednesday, October 19, 2005 - link

    Great idea: NVIDIA could sell the GPUs directly to the consumer and cut out the AIB makers. I bet they are not happy at all about this idea, save the ones that also make mobos.
  • bersl2 - Wednesday, October 19, 2005 - link

    I wonder: can the whole graphics card concept be split up into discrete components, as the components that support the CPU are? Can we have a GPU daughterboard, pluggable GDDR RAM, and a socketed GPU, all made by different manufacturers? Of course, that would seem to raise costs in some areas but reduce them in others; by how much, I wouldn't know.

    Another thing I wonder about is whether this is a prelude to finally having an open graphics ISA (which need not be standard for all or any GPUs). I certainly hope so.
  • semo - Thursday, October 20, 2005 - link

    I think I read somewhere that a 6800 GPU costs $40. I really doubt that if you could upgrade just the GPU now, you would be paying that price for it.
  • bob661 - Wednesday, October 19, 2005 - link

    It would be fantastic to be able to swap out GPUs like CPUs. For someone like me, that would mean no cards in any of my slots.
