
Unsupported GPU for CS5

New Here, May 05, 2010

With all the interest in the "unsupported" GPUs, I thought it was time to start a specific thread.

Please post your questions and experiences.

Hacking is not advised and the unsupported cards are not ready for production use.

You've been warned! 

Views: 92.3K

218 Replies
New Here, May 08, 2010

I always stay with the big-name brands that are commonly found: EVGA, BFG, MSI, Gigabyte.

Pricing is all pretty much the same...probably more now since everyone is looking for them.

That is why I bought the GTX 470 since the price was the same and is a much faster card in the games I play.  Only time will tell if CS5 Pr will support it...Officially.  Adobe says there are "render errors" but I can't find them.

If you are serious about your work, get a 285 though.

New Here, May 08, 2010

I have the BFG http://www.newegg.com/Product/Product.aspx?Item=N82E16814143190&cm_re=285_gtx-_-14-143-190-_-Product

I think Chuck does too.

--------------------------------

8800 GTS and GTX actually have the same G80 GPU for soft mod and they both do it.  In fact, it actually gives you the option to choose the 5600 as well, but I don't think that would make sense since the 5600 has 1.5 GB of memory or something like that.  Enough of the soft modding though.  The GTX 285 is the answer for me so far.  Basically, this just confirms that Adobe is on track with the supported cards.

Engaged, May 08, 2010

That is the one...  I recommend using the downloadable Nvidia utility that allows you to program the GPU fan speed.  I set ours up so that the fan runs at 100% when Premiere launches and goes back to stock speed when you close out of Premiere.

http://www.nvidia.com/object/nvidia_system_tools_6.06.html

New Here, May 08, 2010

8800 GTS and GTX actually have the same G80 GPU for soft mod and they both do it.  In fact, it actually gives you the option to choose the 5600 as well, but I don't think that would make sense since the 5600 has 1.5 GB of memory or something like that.  Enough of the soft modding though.  The GTX 285 is the answer for me so far.  Basically, this just confirms that Adobe is on track with the supported cards.

I just read the softmod instructions.  It refers to an NVSTRAP.sys file that does not exist in today's drivers (GeForce or Quadro).  There goes that possibility.

Community Beginner, May 30, 2010

Hey, do you think that in future updates Adobe will block this hack that turns on MPE on many Nvidia GPUs?

LEGEND, May 30, 2010

Why would they?  Think of all the "live" beta testing going on.  Think of all the useful data to be gathered.  Think of all the word-of-mouth endorsements for CS5 and MPE.  And every bit of it support-free and liability-free.

-Jeff

LEGEND, May 30, 2010

I absolutely agree with you, Jeff.

I have been feeding Adobe material on my experiences with the GTX 480, and they happily devour it. It is another source of information they don't otherwise have access to, and if properly documented with hardware, software, conditions, project and source material, it is very valuable info for them. Such info they would not cut off.

LEGEND, May 30, 2010

Harm,

Something similar happened with PrElements 8 and graphics card issues (both ATI/AMD and nVidia), and Adobe even ran two surveys looking for user input. In rather short order, the PrE 8.0.1 update came out, fixing many issues. Adobe cannot have every possible hardware configuration in the lab, so having useful data fed to them, with relevant details, would be invaluable. Keep up the good work.

Hunt

Community Beginner, May 30, 2010

I asked this question because I am thinking about changing my GTX 285 to a GTX 260. That would save money and let me buy an extra 4 GB of RAM (8 GB now). Do you think that is a good change? The GTX 260 is only about 10% slower than the GTX 285.

New Here, May 30, 2010

We know that the Hack works...

The question now is whether cards like the GTX 260 and any other card based on the same GPU, have any artifacts or errors versus the "official" GTX 285.

Considering that the GPUs are close cousins, differing only in bus width, available cores and clock, I would imagine that they should perform very similarly, with the only difference being speed.

Community Beginner, May 30, 2010

KamilekPOL wrote:

Hey, do you think that in future updates Adobe will block this hack that turns on MPE on many Nvidia GPUs?

I think there would be too much outcry and too many lost sales. Maybe Nvidia wouldn't mind if they locked it out, who knows, but I no longer think Adobe would do this.

Contributor, Jun 14, 2010

Hi All

I have a GTX 260

I am trying to decide whether to try the 'hack'.

I see some people on this thread saying this could be dangerous - in what way?

If it is just a software hack, I assume if it didn't work, then you could go back to the original?

Surely no hardware could be damaged?

Thanks for your help in advance

Chris

Guest, Jun 14, 2010

Do it!

There is no danger of breaking your card. I would say the only very remote danger is screwing up your install of Premiere, and that is a very long shot...

New Here, Jun 14, 2010

Valter Vilar wrote:

Do it!

Yeah I agree, just do it.  It's awesome.

New Here, Jun 14, 2010

Hi Guys,

Since I started this thread there have been no failures or any kind of damage to cards/systems.

Also, keep in mind that Adobe themselves (Wil) have acknowledged the "hack".  Field testing without the liability for Adobe.

In general, if you have a CUDA-capable card with 1 GB of RAM, you are able to use the hack... albeit with varying degrees of quality due to the varying cores, speed and bus width of the cards.  The hack has worked from the Fermi cards down to the GeForce 8xxx series, if I am correct.

New Here, Jun 14, 2010

Hey Chris, the hack is to the software, not to your card.  All it does is give you an option of using GPU Hardware Acceleration vs Software Acceleration.  So yes, my understanding is that if it does give you problems all you need to do is select Software Acceleration and your Premiere Pro CS5 will act the same as it did before.  I can't think of any reason why reverting to Software Acceleration wouldn't work.

However, I have a GTX 260 also and have experienced 0 problems using GPU Hardware Acceleration with the GPU accelerated effects.

Contributor, Jun 14, 2010

Ok guys

thanks for your quick replies

I'm off to 'do it'

back later .....

Chris

ps Will this hack help with the playback output window? - At times it is jittery, and I have an i7 quad chip and 6 GB

New Here, Jun 14, 2010

GPU acceleration should give you smooth playback in the viewport window when applying GPU accelerated effects such as Ultrakey.  I still get jitters with AVCHD files when using non-accelerated effects and playback resolution is set to Full Resolution.  When set to 1/2 resolution or 1/4 resolution it's smooth on my i7 860.  To change the playback resolution, I think you just right click on the viewport window and it'll be in the menu.

Community Beginner, Jun 14, 2010

cjacs987654 wrote:

Hi All

I have a GTX 260

I am trying to decide whether to try the 'hack'.

I see some people on this thread saying this could be dangerous - in what way?

If it is just a software hack, I assume if it didn't work, then you could go back to the original?

Surely no hardware could be damaged?

Thanks for your help in advance

Chris

There is no way in the world it can possibly be dangerous. I mean, all it does is let you select the choice within the program that tells the GPU to do some calculations instead of the CPU! If you are afraid of this you might as well not have purchased a graphics card to begin with, since you should be equally afraid to let a game play on it, let a 3D modeling/rendering program run on it, etc.

It is not an overclocking tool. And it is barely even a hack. All it does is remove the block on cards that Adobe/Nvidia have not officially tested, so they don't have to worry if by some odd chance it makes a calculation error or some weird bug pops up (from Nvidia's point of view, I'm sure they just want to push expensive Quadros and hope this scares some people into getting one of those instead of a GTX 275 or GTX 420 or something).

All the hack consists of is typing the name of your graphics card into a text file! That's it! And you don't even need to delete the name to remove it; you can toggle CUDA on and off within the program!

The only way hardware can be damaged is if you have such flimsy cooling that running your graphics card overheats things, in which case your system is doomed anyway, since as soon as you play your first game for a little while, or watch a Blu-ray, etc., you'd overheat anyway.

Once again, there is nothing to worry about. If you are worried about this then never run any games, watch any movies, carry out any GPU work for protein modeling, 3D modeling, rendering, etc. and just sell off your graphics card.

So do iiiit. You cannnn doooo iittttt.

Contributor, Jun 14, 2010

Hi,

Thanks for your detailed reply.

I didn't think it would do any damage, but saw some messages on this thread which seemed to suggest it.

Anyway, got it working now. Will give it some welly tomorrow

Thanks

Chris

Contributor, Jun 14, 2010

Hi All,

Tried the hack - can't get it working. When I say it didn't work: in Adobe project settings - General, I can't see anything for CUDA.

I've modified cuda_supported_cards.txt to:

GeForce GTX 285
Quadro CX
Quadro FX 3800
Quadro FX 4800
Quadro FX 5800
Geforce GTX 260

The output of the sniffer is:


Device: 00000000003642C8 has video RAM(MB): 896
Vendor string:    NVIDIA Corporation
Renderer string:  GeForce GTX 260/PCI/SSE2
Version string:   3.0.0

OpenGL version as determined by Extensionator...
OpenGL Version 3.0
Supports shaders!
Supports BGRA -> BGRA Shader
Supports VUYA Shader -> BGRA
Supports UYVY/YUYV ->BGRA Shader
Supports YUV 4:2:0 -> BGRA Shader
Testing for CUDA support...
   Found 1 devices supporting CUDA.
   CUDA Device # 0 properties -
   CUDA device details:
      Name: GeForce GTX 260      Compute capability: 1.3
      Total Video Memory: 896MB
   CUDA driver version: 3000
CUDA Device # 0 not choosen because it did not match the named list of cards
Completed shader test!
Internal return value: 7

I have upgraded Nvidia to 197.45.

After I followed steps 1 to 7, I ran GPUSniffer.exe again and it didn't change - it still says 'not choosen because' etc.

However, I noted in step 5, when I added Adobe Premiere Pro.exe, that when I applied and came back to the list of programs it showed as 'Adobe Premiere CS4 (adobe premiere pro.exe)' and not 5 as I expected.

Anyone got any ideas?

Thanks

Chris

Guest, Jun 14, 2010

Did you try a reboot? Are you on a dual monitor setup?

The hack works fine with the GTX 260, so it is just a matter of finding out what is wrong.

Contributor, Jun 14, 2010

Hi Valter

Thanks for your reply

Yes, I rebooted after installing 187.45, then rebooted after the hack.

I don't have a dual monitor.

Chris

Contributor, Jun 14, 2010

Oops - 197.45, not 187.45.

Community Beginner, Jun 14, 2010

cjacs987654 wrote:

oops 197.45 not 187.45

You didn't type the name of your graphics card properly. It needs to be EXACTLY the same name that GPUSniffer reads it as.

You typed "Geforce" when it spells it "GeForce" which won't match what you entered with a simple string compare.

And just in case it does a case-insensitive string match, then maybe the problem is that you are looking for a CUDA option when they renamed it to Mercury Engine Hardware or something like that - I forget, but I don't believe they reference CUDA in the option name anymore.
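The spelling point is easy to demonstrate. Assuming the sniffer does a literal line-by-line string compare (which the 'did not match the named list' message suggests), the single lower-case 'f' is enough to fail it:

```python
# The detected name must match a line in the list byte-for-byte.
detected = "GeForce GTX 260"   # as GPUSniffer reports it
listed = "Geforce GTX 260"     # as typed into cuda_supported_cards.txt

exact_match = detected == listed                   # case-sensitive compare: fails
loose_match = detected.lower() == listed.lower()   # would pass only if case were ignored
```

So copying the renderer string straight out of the sniffer output is the safest way to fill in the file.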
