Discussion in 'Video Cards & Monitors' started by eva2000, Apr 27, 2006.
Yeah, you can get a 2nd pci-e card for that much, why bother with a proprietary physics card???
Because a CPU, even if it's only doing physics, can only handle a few hundred interacting rigid bodies, if that. These cards can handle many thousands... CPUs are general-purpose calculators; they aren't specialised at any one thing.
My sample was shipped today w00h00
The price is $499 for the BFG but comes with no games as far as I have been told
I am not sure if the sample has to go back; if it does, I will buy one without a second thought.
Start downloading Ghost Recon demo
$500 is a rip off. It's as essential as a sound card is. I was thinking $150-300. I'd rather run a game at higher graphics settings, at a higher framerate, than have more shit flying around realistically.
i guess it is all relative and depends on the person
same goes for pc/home theatre sound systems... for me $10-69 cheap speakers is all i need while some folks spend $1000s on state of the art sound systems heh
Any word if there will be other versions of the BFG Tox? I thought i heard that they might make low, mid and high range versions.
I only know of the one model at the moment. The UK had an OEM and a retail one, but both were the same card.
And yes Ghost Recon is coming down *wink wink*
499 $$$$........bloody leeching sob's.
Quote from everyone after release of 3DFX voodoo card...
"Who would ever spend $500 on a Graphics accelerator???"
Guys, start looking at the big picture. Give it time…
I've seen on other sites people very upset because it doesn't increase their 3DMark score or frame rate...
In 5 years, running a game without some sort of physics acceleration will be the equivalent of trying to run F.E.A.R. in software rendering: the stupidest thing you have ever heard.....
I think the real question is who will end up buying them out....
I don't quite think it is as essential as a sound card. In fact, nowhere near as essential at this point in time. If it were so essential, why would you consider buying another graphics card instead? Personally I'd prefer to have shit flying around realistically than have a few pretty pictures. I play games for the enjoyment; better environments, and your relation to those environments, can only improve the joy of playing games. Some people like watching pretty pictures; I think DVDs were made for them.
Meh, for a product that has no real support yet, it seems to be getting a decent following already. I'm willing to fork out a few hundred bucks if it will make me happier. There is obviously a reason it's being pushed out into the market.
For me, if it can do what they say it can, and developers get behind it and start punching out some quality titles for it, I'll be standing in line with my fistful of cash.
I have a question about the physics processor: does it do anything more appealing than that debris flying around in the air (GRAW)? Hmmm, $500 seems a bit too much for the effect it generates in GRAW. (Well, if it can improve other aspects of a game then it's fine.)
I welcome the future, but I just hope it's worth the price.
I definitely remember the 3DFx Voodoo2 days... I still have my Voodoo2 12MB in SLi with a 4MB S3 Virge which I refuse to part with but it was a worthwhile investment. Same with these PhysX cards... bide your time gents.
That was my first thought when I heard of the PhysX solution. I talked to a few professors at Sydney Uni about this, and in theory, it would work. AGEIA has provided the SDK and such required to develop applications for it... All you need is some C/C++ programming skills. (I also talked to AGEIA, and they know it's possible, but they're focusing on games first, before anything else.)
You set it up so that if it detects a PhysX card, it uses that; if none is present, it switches to software/emulation mode (which is slow). The concept is the same as the DirectX APIs.
I might consider buying one of these PhysX cards just to try and write an application. (Maybe a rough FEA application to see if things work?). And then look into reverse-engineering a driver for Linux.
I suspect those benchmark apps have used the AGEIA software SDK but not enabled support for the PhysX hardware. That would explain why the "Before" and "After" benchmark numbers don't change: they are just showing software emulation of physics.
So it's too early to draw conclusions until applications have been fully implemented with PhysX chip support. At the moment, it looks like it's all in software-emulation mode.
No. Both the Xbox 360 and PS3 do NOT have PhysX chips in them.
They use AGEIA's API to assign a core (or SPEs) to the role of physics calculations. In a way, they emulate a PhysX processor using the multi-core hardware they have in them. So I'm guessing the PS3 or Xbox 360 versions of PhysX-enabled games will have an additional bit of code somewhere to take advantage of the console's hardware.
You have to understand it from the Developer's view. The physics processor allows you to essentially have a fully interactive environment with enormous detail. It opens the door up to many things that were previously not possible with existing hardware. (Even the highest spec PC, overclocked, would not be able to handle such an environment...Unless you like slide shows).
At this point, it's too early to judge the product until it has matured to a certain level (where software can fully take advantage of the PhysX chip).
GRAW is an early adopter of AGEIA's solution, hence you don't see much. But as time goes by, developers get more experienced with the tools and will be able to make things far more impressive than just "flying debris" (like flowing water, lava, falling rocks, rain, and so on)... It's like learning a new activity: you start off really crap and uncoordinated, but as experience builds, you become more confident with the tools you use. That's when you get really creative.
My overall point in this thread is not to judge this solution too quickly. Give it time. Better yet, don't buy anything until it's shown to really benefit YOUR needs.
Proffesso - is that Avid Liquid you've shown - as I'm thinking of making the jump from Matrox/PremPro to HD 1080i Liquid Pro for use with a Sony HVR-Z1P.
Should work nicely with the Intel 3GHz dual core / MSI 7800GTX combo. If this card adds even more processing grunt, it would suit my needs and I'd be happy to fork out $500 to speed up the job.
Edit - just noticed the background of the shot shows it's for animation, bugger.
I guess soon we will start seeing physics as an option in the in-game menu, like we already get for sound (hardware or software support), etc.
I am keen to see it kick off.....
I did a bit of thinking about this, and for engineering applications, calculations need to be very accurate. The card is tailored for real-time use (i.e. games), not so much precision number crunching (just as you might use a CPU to render a scene accurately rather than relying on the graphics card and API).
The calculations need to be spot on for the results to be of any use (we could be talking 64-bit floating point!), otherwise they'd be worth next to nothing.
If it were capable of number crunching at high speeds, it'd be somewhat like a generic CPU. If it operated like other hardware accelerators, in the sense that it offloaded certain calculations, then memory bandwidth becomes important, especially at high precision. That could be the real bottleneck. If that's the case, then the calculations might just be better off on dual processors.
Though we will wait and see with interest
go download the cellfactor HDTV clip... wow! particles are cool! if this is what gaming is coming to.. i'm in
Is this the beastie?
And games need software designed for it to work, right? So if I play BF2, for example, it won't help in any way?
499 $$$$......these leechers must be on crack