AMD Fusion

Discussion in 'AMD x86 CPUs and chipsets' started by Polski Radon, Oct 26, 2006.

  1. Polski Radon

    Polski Radon Member

    Joined:
    Apr 3, 2003
    Messages:
    1,014
    With the completion of AMD's acquisition of ATi, there have been new reports of plans by AMD to integrate the GPU into the CPU, bypassing the chipset. The current timeframe for this fusion is 2008/2009.

    This will be great for low-end and OEM machines, because fewer subcomponents on the motherboard means lower cost. In my opinion, though, it is not a viable option for the high end, given the high transistor counts of current high-end GPUs as well as the relatively faster turnover of GPUs over the lifecycle of a microprocessor.

    Is AMD going to be the predominant player in the low/middle end of the computer industry?
     
  2. stmok

    stmok Member

    Joined:
    Jul 24, 2001
    Messages:
    8,878
    Location:
    Sydney
    Yeah, a shift from IGPs. I guess chipsets will be simplified and cheaper?

    OT: LOL...
    Try this in your browser: www.ati.com
    And you'll get this: ati.amd.com

    ATI has turned green! :lol:
     
  3. mercho

    mercho Member

    Joined:
    May 7, 2004
    Messages:
    1,502
    Location:
    Bathurst, NSW
    lol... once you select something like CrossFire though, you get the good old ATI red back :lol:
     
  4. joyufat

    joyufat Member

    Joined:
    Jun 27, 2001
    Messages:
    1,015
    Location:
    Moral High Ground
    I doubt it; Intel is pretty dominant in the integrated graphics sector, and this "fusion" concept is completely unproven. They need to demonstrate why having the GPU integrated into the CPU is better than having it integrated on the motherboard.
     
  5. fr33lanc3r

    fr33lanc3r Member

    Joined:
    Apr 23, 2006
    Messages:
    418
    Will be interesting to see what effect this has on the market in a few years' time.
     
  6. <pRo>ToSs

    <pRo>ToSs Member

    Joined:
    Sep 22, 2003
    Messages:
    3,538
    Location:
    BKK
    Well, I do hope this brings about parity between AMD and Intel, and boosts AMD's sales of IGPs, which is where the big bucks are, thereby building AMD's competitive positioning.
     
  7. MR CHILLED

    MR CHILLED D'oh!

    Joined:
    Jan 2, 2002
    Messages:
    136,478
    Location:
    Omicron Persei 8
    Wow, the new ATi website just took me by surprise......heh

    Thought I'd hit the wrong bookmark!
     
  8. Polski Radon (OP)

    Polski Radon Member

    Joined:
    Apr 3, 2003
    Messages:
    1,014
    AMD's stock price rose 3.22% just yesterday; time to sell some of my stock :).

    What I foresee is AMD giving the high-end FX line an on-die PCI Express controller and leaving the GPU as a standalone card (or connecting the GPU directly via HT). With the latest DX10 graphics cards from nVidia sporting 700 million transistors at 1.2 GHz, it would be mad for AMD to include such a beast with the CPU. They could shave some of those transistors off by running the graphics component in sync with the CPU, which should be at least 2.5 GHz at 65 nm.

    What I would really like to see is AMD doing something like what Intel is planning, that is, 80 mini-cores on the die, but with most of those cores being FPGAs, giving the CPU full performance when it needs it and then switching to a mixture of GPUs and PPUs for games etc.
     
  9. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,823
    Location:
    Canberra
    I agree with what Polski Radon said - AMD is very unlikely to try a high-end GPU and CPU combination. Rumours suggest that R600 will draw about 200 W; add a dual-core or quad-core CPU and that turns into 300 W, which is a lot of power to deal with through one heatsink.
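
    A rough back-of-the-envelope sketch of those numbers (the 200 W figure is the rumour quoted above; the CPU TDPs are just typical 2006-era dual- and quad-core classes assumed here for illustration, not official specs):

    ```python
    # Combined-package power: rumoured R600 board power plus assumed CPU TDP classes.
    R600_RUMOURED_W = 200
    CPU_TDP_CLASSES_W = {"dual-core (~89 W class)": 89, "quad-core (~125 W class)": 125}

    for cpu, tdp_w in CPU_TDP_CLASSES_W.items():
        print(f"{cpu}: ~{R600_RUMOURED_W + tdp_w} W through a single heatsink")
    ```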

    If AMD can make a CPU/GPU that costs less than Intel's Celeron/IGP combo, then they'll probably do well in the low-end market. Whether AMD can actually do that is another question.
     
    Last edited by a moderator: Oct 29, 2006
  10. T-O-D

    T-O-D Member

    Joined:
    Oct 23, 2003
    Messages:
    1,316
    Location:
    Brisvegas!
    In their press release at the time they mentioned that this was slated for 2008.

    I can foresee cheap 3D machines (read: OS use only, i.e. Vista) in the near future with both CPU and GPU in the one unit.

    If they were to scale the 3D power down so that it was just enough for common tasks and not gaming, put it on a micro-ATX board, and add a small (cheap) flash-based storage system...

    I can see uses for nodes like that :)
     
  11. <pRo>ToSs

    <pRo>ToSs Member

    Joined:
    Sep 22, 2003
    Messages:
    3,538
    Location:
    BKK
    Hell, even for HTPC usage that should be fine! It'd be great if they can pull it off :cool:
     
  12. iAl3xi

    iAl3xi (Banned or Deleted)

    Joined:
    Jan 26, 2006
    Messages:
    896
    Location:
    Eltham, Melbourne
    They should have most of it onboard and make it so you change out the GPU like when you change a CPU :D

    EDIT: misspelled CPU as "cup"
     
  13. T-O-D

    T-O-D Member

    Joined:
    Oct 23, 2003
    Messages:
    1,316
    Location:
    Brisvegas!
    Ehh??

    I don't know about you, but I tend to change my GPU more often than my CPU.

    I don't want to have to pay a few hundred for something that's not really much better than what I had if I don't have to!

    EDIT: unless I misread and you meant having the GPU socketed like we currently have a CPU, i.e. just a chip replacement rather than a card?

    It's possible... although it'd be interesting when it comes to memory speed/bandwidth: my dual-channel DDR is a 128-bit memory interface, while the G80 has a 384-bit one. HUGE difference.
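
    For a rough sense of scale, here is the peak theoretical bandwidth for those two bus widths (assuming dual-channel DDR400 on the desktop side and the 8800 GTX's 1800 MT/s GDDR3 for the G80; figures assumed for illustration, not taken from the thread):

    ```python
    # Peak theoretical memory bandwidth = bus width (bytes) x transfer rate.
    def peak_bw_gb_s(bus_width_bits: int, transfers_per_s: float) -> float:
        return bus_width_bits / 8 * transfers_per_s / 1e9

    desktop = peak_bw_gb_s(128, 400e6)  # dual-channel DDR400: ~6.4 GB/s
    g80 = peak_bw_gb_s(384, 1.8e9)      # 384-bit GDDR3 @ 1800 MT/s: ~86.4 GB/s

    print(f"dual-channel DDR400: {desktop:.1f} GB/s")
    print(f"G80 (8800 GTX):      {g80:.1f} GB/s (~{g80 / desktop:.0f}x the desktop figure)")
    ```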
     
    Last edited: Oct 27, 2006
  14. laleo

    laleo New Member

    Joined:
    Nov 12, 2003
    Messages:
    20
    Everyone talks about the GPU being merged with the CPU. What about a graphics card with a full CPU on it?
     
  15. SLATYE

    SLATYE SLATYE, not SLAYTE

    Joined:
    Nov 11, 2002
    Messages:
    26,823
    Location:
    Canberra
    In addition to that, GPUs use RAM at 2 GHz+ while most desktops use 800 MHz at most.

    You can't put the main CPU on the graphics card - there needs to be something to organise data going to/from the RAM/PCIe/IO (and that won't work very well if the CPU is on the video card). There'd also be rather large latencies and limited RAM bandwidth if the CPU had to access the system RAM through the PCIe bus.
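
    To put rough numbers on that bottleneck (assuming PCIe 1.x's commonly quoted 250 MB/s per lane per direction and a dual-channel DDR2-800 memory controller; illustrative figures, not from the thread):

    ```python
    # Why a CPU living on the graphics card would starve: peak PCIe 1.x x16
    # bandwidth vs. a local dual-channel DDR2-800 memory controller.
    # (Theoretical peaks only; the added latency would hurt even more.)
    pcie_x16_gb_s = 16 * 0.25                      # 16 lanes x 250 MB/s = ~4 GB/s per direction
    ddr2_800_dual_gb_s = 2 * 64 / 8 * 800e6 / 1e9  # 2 channels x 64-bit x 800 MT/s = ~12.8 GB/s

    print(f"PCIe 1.x x16:          ~{pcie_x16_gb_s:.1f} GB/s per direction")
    print(f"dual-channel DDR2-800: ~{ddr2_800_dual_gb_s:.1f} GB/s")
    ```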

    Putting a 'secondary' CPU on the GPU is already happening. GPUs are getting better at general-purpose work, with the result that Folding@Home now runs very nicely on ATI's X1900 series of GPUs. Of course, they're only good at some things - but why bother to put a pure general-purpose CPU on the GPU when you've already got a perfectly good general-purpose CPU on the mainboard?
     
  16. T-O-D

    T-O-D Member

    Joined:
    Oct 23, 2003
    Messages:
    1,316
    Location:
    Brisvegas!
    You beat me by a coupla hours Chris :p

    I was gonna reply saying that the first thing I can see happening is having the CPU talk directly to the GPU over an HT link (like the transition of the memory controller to being on-die).

    One would imagine that for the lower-end parts there's more than enough bandwidth to do that already with HT 1.0 tech, whereas the newer GPUs would probably need an extra link or three of HT 2.0.
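
    Very roughly, the aggregate bandwidth of a single 16-bit HT link at the commonly quoted maximum clocks for each revision (assumed figures for illustration, not from AMD's announcement):

    ```python
    # Peak aggregate bandwidth of a 16-bit HyperTransport link:
    # link clock x 2 (double data rate) x 2 bytes wide x 2 directions.
    def ht16_aggregate_gb_s(link_clock_hz: float) -> float:
        per_direction = link_clock_hz * 2 * 16 / 8  # bytes per second, one direction
        return 2 * per_direction / 1e9

    print(f"HT 1.0 @ 800 MHz: ~{ht16_aggregate_gb_s(800e6):.1f} GB/s")   # ~6.4 GB/s
    print(f"HT 2.0 @ 1.4 GHz: ~{ht16_aggregate_gb_s(1.4e9):.1f} GB/s")   # ~11.2 GB/s
    ```

    Either way, that's well short of the ~86 GB/s a G80-class card gets from its local memory, so the high-end parts would presumably still want their own GDDR even with a direct HT link to the CPU.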
     
  17. iAl3xi

    iAl3xi (Banned or Deleted)

    Joined:
    Jan 26, 2006
    Messages:
    896
    Location:
    Eltham, Melbourne
    Yeah, that's what I was trying to say :p
     
  18. Ze.

    Ze. Member

    Joined:
    Sep 13, 2003
    Messages:
    7,871
    Location:
    Newcastle, NSW
    Modern graphics cards have already exceeded that bandwidth in the middle and high-end markets.
    http://en.wikipedia.org/wiki/Comparison_of_ATI_Graphics_Processing_Units
    http://en.wikipedia.org/wiki/Comparison_of_NVIDIA_Graphics_Processing_Units#GeForce_7_series
    It's an interesting option, but I don't know how practical it would be :) An alternative in the short term would be a GDDR memory controller integrated onto the CPU/GPU for the higher-specification parts. In the longer term we could see a consolidation of system and graphics memory towards the same standard. If that happens, then we are likely to see the age of DIMMs end and memory go back to being soldered onto the motherboard.

    It'll start out in the low end and media appliance sectors and gradually move upwards. I think the home computer market is moving towards an integrated appliance as the way of the future.
     
  19. nitestick

    nitestick Member

    Joined:
    Apr 17, 2006
    Messages:
    234
    Location:
    perth.wa.au
    Personally, I think the potential upgrade issues of a CPU-integrated GPU would be solved if developers stopped being slack in their coding of games :p. How is it people can play 1-2 year old games (e.g. Far Cry, HL2, F.E.A.R.) at high settings that look awesome, and then pick up a game off the shelves today and have to play at bare minimum, which looks god awful? Lazy programmers <_<

    More than that, what I think will come of the merger is that AMD may adapt some GPU technologies for use in their CPUs, perhaps a little more parallelism; I can see a fusion of AMD and ATI technologies yielding a number-crunching monster. They could probably then usurp Intel's position as top dog in number crunching and scientific use.
     
  20. zpnq

    zpnq Member

    Joined:
    Jan 9, 2005
    Messages:
    122
    Location:
    south australia
    It could be done in a dual-socket manner, where you have a motherboard similar to today's dual-CPU motherboards, but the second CPU socket is actually a GPU socket.
    This would mean you could upgrade the graphics component or the CPU component independently. It would also deal with the extra heat issues.
     
