
Pr CS5 - List of supported CUDA Cards

Advisor, Apr 01, 2010

Adobe is working on a playback and rendering engine for Adobe Premiere Pro called the Mercury Playback Engine. This new engine is NVIDIA® GPU-accelerated, 64-bit native, and architected for the future. Native 64-bit support enables you to work more fluidly on HD and higher resolution projects, and GPU acceleration speeds effects processing and rendering.

The Mercury Playback Engine offers these benefits:

  • Open projects faster, refine effects-rich HD and higher resolution sequences in real time, enjoy smooth scrubbing, and play back complex projects without rendering.
  • See results instantly when applying multiple color corrections and effects across many video layers.
  • Work in real time on complex timelines and long-form projects with thousands of clips — whether your project is SD, HD, 2K, 4K, or beyond.

Ensure your system is ready to take advantage of the Mercury Playback Engine in a future version of Adobe Premiere Pro. The Mercury Playback Engine works hand in hand with NVIDIA® CUDA™ technology to give you amazingly fluid, real-time performance.

Premiere Pro CS5 supports the following CUDA cards:

  • GeForce GTX 285 (Windows and Mac)
  • Quadro FX 3800 (Windows)
  • Quadro FX 4800 (Windows and Mac)
  • Quadro FX 5800 (Windows)
  • Quadro CX (Windows)

More hardware details:

http://www.adobe.com/products/premiere/systemreqs/
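For anyone who wants to check what CUDA actually sees on their machine, here is a minimal sketch (my own illustration, not an Adobe tool) that enumerates the installed GPUs and compares their names against the list above. The name-matching logic is an assumption for demonstration only; compile with nvcc.

    // check_card.cu - hypothetical sketch, not an Adobe utility.
    // Enumerates CUDA devices and reports whether each one appears
    // on the CS5 supported-card list quoted in this post.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstring>

    int main() {
        const char* supported[] = { "GeForce GTX 285", "Quadro FX 3800",
                                    "Quadro FX 4800", "Quadro FX 5800",
                                    "Quadro CX" };
        const int numSupported = sizeof(supported) / sizeof(supported[0]);

        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            printf("No CUDA-capable device found.\n");
            return 1;
        }
        for (int d = 0; d < count; ++d) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, d);
            int listed = 0;
            for (int i = 0; i < numSupported; ++i)
                if (strstr(prop.name, supported[i]) != NULL)
                    listed = 1;
            printf("Device %d: %s -> %s\n", d, prop.name,
                   listed ? "on the CS5 list" : "NOT on the CS5 list");
        }
        return 0;
    }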


1 Correct answer

Adobe Employee, Apr 15, 2010

Now that the launch is done and this information is all public, I'm going to summarize all the bits of information that have been floating around into one distilled post:

The Mercury Playback Engine comprises 3 areas (our chief weapons are surprise, surprise and fear... nevermind...):

- 64-bit support, and better memory management / frame cache management / sharing between the Adobe apps (i.e. Premiere, After Effects, and the Media Encoder now have a notion of shared memory, and are aware of how

...

265 Replies
Guest, Apr 22, 2010

Harm Millaard wrote:

Although the remark is incorrect, I assume it was based on the underlying assumption that GTX cards are limited in their support of MPE to 3 tracks versus unlimited tracks with Quadro cards, even though the Quadros have inferior specs, which leads to the speculation that the limitation is artificial and may have been coerced by nVidia for marketing reasons.

Whatever the reasoning behind the limitation, do not be fooled. MPE will give a significant performance gain. Ask yourself: how often do you need more than 3 tracks with CUDA-enabled effects, and when you do, what negative effect will it have if you need to render a small part of your timeline?

The sustained rah-rahs for MPE in its current form continue to confound me (unless, of course, the comments are coming from pro-Adobe P.R.).  And the logic of this specific comment is strange.  Any and all hardware acceleration will work to the benefit of any MPE experience, offloading chores from the CPU.  So whether the user has one track or several tracks, hardware acceleration will improve the user experience.  Obviously.

The problem is, Adobe has simply disabled hardware acceleration for specific CUDA cards, whether or not those cards exceed or almost perfectly match the capabilities of, say, the GTX 285.  It is a classic substitution of judgment against the interest of computer-savvy people who would deal with potential instabilities at their own risk.  What has happened here -- and this should be obvious from reading this thread, a controversy that I "accelerated" at the beginning -- is that Adobe has upset many if not most of its customers by taking away their choice to proceed with the supposed "risk" of instability, which is merely a guess rather than a proven fact.

Saddest of all would be a retrospective analysis that, for example, MPE hardware acceleration in its present form works perfectly well for the complete high end of nVidia's CUDA product line.  I consider that to be a sure likelihood; but of course, I cannot prove it.  And even if Adobe could, they wouldn't go into the details on our account.

LEGEND, Apr 22, 2010

Adobe has upset many if not most of its customers by taking away their choice to proceed with the supposed "risk" of instability, which is merely a guess rather than a proven fact.

If you are right, then it is Adobe who will suffer when those angered refuse to buy the upgrade to CS5.  If your logic is correct, then it will be a self-correcting situation and you may get what you want sooner rather than later.

OTOH, if there is enough added value in CS5 separate from CUDA acceleration, then the list of supported CUDA cards will remain very short for the time being.

You have the opportunity to vote with your wallet, and I suggest that you do exactly that.

-Jeff

Participant, Apr 22, 2010

You know how Apple is saying they won't let Flash on their phones/pads because it 'could' make performance worse for the customer? Adobe slammed them, saying they just wanted to control the market. Now Adobe is doing the same thing by selectively allowing just a few high-end cards, their reasoning being the same as Apple's.

Just saying... I don't care either way. I'm buying a 285 next month; no complaints if cheaper cards or my current 260 aren't supported.

Maybe Adobe should just say, "Look, Nvidia spent tens of thousands of dollars helping us make MPE a reality. This is the deal we made with them. If we didn't make that deal, there would be no MPE in CS5. Also, the money saved on not having to give tech support to the dozens of other low-end CUDA-enabled cards (some years old) helped pay for the R&D that made MPE possible."

If they just came out and told the truth, would everyone be happy?

Advisor, Apr 22, 2010

zenviolence wrote:

-- is that Adobe has upset many if not most of its customers by taking away their choice to proceed with the supposed "risk" of instability, which is merely a guess rather than a proven fact.

...

saddest of all would be a retrospective analysis that, for example, MPE hardware acceleration in its present form works perfectly well for the complete high end of nVidia's CUDA product line.  I consider that to be a sure likelihood; but of course, I cannot prove it.  And even if Adobe could, they wouldn't go into the details on our account.

Adobe has taken away choice? You may run any graphics card you want, and you will get significant improvements in performance over CS4. Add an approved card and you will get exponential performance improvement. I don't see that making too many people mad.

Your last statement is based on what knowledge of the testing, engineering, and integration of any CUDA card with CS5? I suspect none. Adobe has been very open about which cards they chose and why, even long before launch.

Guest, Apr 22, 2010

Curt Wrigley wrote:

zenviolence wrote:

-- is that Adobe has upset many if not most of its customers by taking away their choice to proceed with the supposed "risk" of instability, which is merely a guess rather than a proven fact.

...

saddest of all would be a retrospective analysis that, for example, MPE hardware acceleration in its present form works perfectly well for the complete high end of nVidia's CUDA product line.  I consider that to be a sure likelihood; but of course, I cannot prove it.  And even if Adobe could, they wouldn't go into the details on our account.

Adobe has taken away choice? You may run any graphics card you want, and you will get significant improvements in performance over CS4. Add an approved card and you will get exponential performance improvement. I don't see that making too many people mad.

Your last statement is based on what knowledge of the testing, engineering, and integration of any CUDA card with CS5? I suspect none. Adobe has been very open about which cards they chose and why, even long before launch.

Curt, methinks you're rather late to the game in understanding the situation, though it should have been clear anyway from my response.  You're simply wrong to say that "any graphics card you want" will "get significant improvements in performance over CS4."  You could have the cheapest graphics card in the world and it wouldn't make a spit of difference in this context.  Adobe has strategically decided to disable hardware acceleration in all but one non-Quadro nVidia card among dozens of CUDA products, some better, some worse than the GTX 285.  The suspicion voiced here that Adobe and nVidia are colluding against us is well-founded, but possibly inaccurate.

Yet, again, as I wrote before, the rah-rahs for MPE are absolutely strange (or perhaps stubborn) -- yes, we get it, MPE will speed things up on the software side. Even more dramatic, though, is the effect of video card hardware acceleration. That latter area is where Adobe flubbed massively (so far).

Advisor, Apr 22, 2010

I am late to the game? I think you misunderstand. MPE is part of CS5 regardless of the graphics card you have. The 64-bit and SW improvements in CS5 deliver improved performance, period. Add a compatible CUDA card and you get exponential performance improvements. You're entitled to your opinion, but I fail to see how this is a flub.

Guest, Apr 22, 2010

Curt Wrigley wrote:

I am late to the game? I think you misunderstand. MPE is part of CS5 regardless of the graphics card you have. The 64-bit and SW improvements in CS5 deliver improved performance, period. Add a compatible CUDA card and you get exponential performance improvements. You're entitled to your opinion, but I fail to see how this is a flub.

Harm Millaard wrote:

You're simply wrong to say that "any graphics card you want" will "get significant improvements in performance over CS4."  You could have the cheapest graphics card in the world and it wouldn't make a spit of difference in this context.

You are wrong to say that. It is unsubstantiated and absolutely incorrect.

Without a CUDA-certified card, performance gains are around 40%. That is a significant improvement.

Curt and Harm:  You're wasting everyone's time.  Re-read my previous two postings, rather than skimming them, and you'll understand the issue.  No one disputes that MPE will speed up non-GPU operations.  The controversy has to do with Adobe's botched roll-out of GPU acceleration within the MPE framework.  Thus my use of the phrase "in this context."  Got it?

Adobe Employee, Apr 22, 2010

Gentlemen, please don your flame retardant suits...

Zenviolence (interesting moniker, by the by), I completely understand your point about wanting choice.  As an engineer, I too like to dabble - I custom build my home computer rigs, overclock them & quietly gloat to myself how I stuck it to the 'man' (not sure who that is, but boy, I'd hate to be him) on how fast my machine runs for bargain costs, etc.

But, flipping to the standpoint of Adobe on this, the problem with handing over the keys to the goods is a support issue.  You or I may be knowledgeable enough to tune things correctly, but for a phone rep to try to diagnose & troubleshoot issues with the majority of the customer base?  Think of it like aftermarket performance car parts - they might work fine, probably work better, but from an official car dealership stance, you're on your own.

Guest, Apr 22, 2010

Thanks, Wil; my proposal to Adobe (voiced elsewhere) has been that the middle ground/compromise is to offer tiny little patches to those of us Adobe smarties who would sift deep into Adobe.com for the download; for Adobe to issue a file patch that merely flips the switch on for, say, the GTX 295, which is practically the same thing as (only better than) the GTX 285, is a ten-minute job for any competent programmer. In the alternative, a pop-up dialogue could warn "non-power-users" of the apocalyptic dangers of proceeding to enable GPU acceleration for the GTX 295 -- which is, incidentally, a risk that may never materialize into a problem.

I really don't think that Adobe would have to worry about Joe Average CS5 customers in this context.
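To illustrate how small the proposed change really is, here is a hypothetical sketch of a whitelist check with exactly that kind of opt-in override. None of this is Adobe's actual code; the whitelist contents and the userAcceptedRisk flag are assumptions for illustration.

    // Hypothetical sketch of the proposed "flip the switch" override.
    // NOT Adobe's code: the whitelist and the opt-in flag are
    // illustrative assumptions only.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstring>

    bool gpuAccelerationAllowed(bool userAcceptedRisk) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess)
            return false;                  // no CUDA device: software MPE only

        const char* whitelist[] = { "GeForce GTX 285", "Quadro FX 3800",
                                    "Quadro FX 4800", "Quadro FX 5800",
                                    "Quadro CX" };
        const int n = sizeof(whitelist) / sizeof(whitelist[0]);
        for (int i = 0; i < n; ++i)
            if (strstr(prop.name, whitelist[i]) != NULL)
                return true;               // certified card: enable silently

        if (userAcceptedRisk) {            // the warning dialog proposed above
            printf("Warning: %s is untested for GPU acceleration; "
                   "proceeding at your own risk.\n", prop.name);
            return true;
        }
        return false;                      // default: fall back to software
    }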

LEGEND, Apr 22, 2010

I really don't think that Adobe would have to worry about Joe Average CS5 customers in this context.

Having been around these forums for a number of years now, I can safely say that I think you might be underestimating the amount of trouble that Joe Average can get into, either accidentally or on purpose.

-Jeff

Community Beginner, Apr 22, 2010

Just wondering... does anyone know if there are any brand-name PCs out there that ship with a GTX 285? I've found Dells and HPs that have the higher-end cards, but not this one.

Community Beginner, Apr 22, 2010

Find a custom builder or build your own, as the "name brand" stores limit your choices... or buy it with the cheapest card possible, order the GTX 285 from your favorite online store, and install it yourself. Dell et al. would do well to stock it for this very reason. You may be able to get one via their workstation class of machines.

Advisor, Apr 22, 2010

If you install it yourself, be aware that it takes up two slots and is a full-length card.

Community Expert, Apr 23, 2010

Also be sure that the power supply has the dedicated power connectors that go directly to the card (I think the card takes two 6-pin PCIe connectors).

Participant, Apr 23, 2010

And be sure you have at least a 650-watt power supply.

I have a friend I'm gonna build a computer for, because none of the big-brand computers come with 285s as an option.

Advisor, Apr 23, 2010

And, it generates heat. Don't put it in a system that already runs hot without enhancing the cooling. The card itself has its own fan, but that won't help a system that already runs hot.

Of course, a properly built system won't be running hot, so it shouldn't be a problem at all. But we run into a lot of improperly built systems from time to time.

Community Beginner, May 04, 2010

Wil Renczes wrote:

Gentlemen, please don your flame retardant suits...

Zenviolence (interesting moniker, by the by), I completely understand your point about wanting choice.  As an engineer, I too like to dabble - I custom build my home computer rigs, overclock them & quietly gloat to myself how I stuck it to the 'man' (not sure who that is, but boy, I'd hate to be him) on how fast my machine runs for bargain costs, etc.

But, flipping to the standpoint of Adobe on this, the problem with handing over the keys to the goods is a support issue.  You or I may be knowledgeable enough to tune things correctly, but for a phone rep to try to diagnose & troubleshoot issues with the majority of the customer base?  Think of it like aftermarket performance car parts - they might work fine, probably work better, but from an official car dealership stance, you're on your own.

I have to wonder, though, why the sudden big production about this? Why does no other HW-accelerated multimedia software make such a big deal out of it? Does PowerDVD go around disabling most cards in the lineup? Does Lightwave suddenly revert to a software engine? Wouldn't the general mess of someone's OS and system install/motherboard/CPU type/etc. likely be more variable than CUDA between many of these cards? And mention is made of possible driver bugs in certain cases; well, from what I have seen they come and go, so even if it is all fine with this card today, it might be the opposite story with the next release from nvidia for all you know.

I thought the whole point of CUDA is that it is supposed to be a general standard and that you are not programming individual cards entirely to the metal. I don't recall reading anything yet about having to do special coding in CUDA for certain seemingly more or less identical video cards. Why is the 275 any different from the 285 when it comes to CUDA? Nvidia is bragging right and left on their page about CUDA in the 200 series. I'm looking at using CUDA/DirectCompute/OpenCL for speeding up a non-realtime photo-realistic renderer, and I don't, as yet, see any dire warnings about CUDA varying from card to card. Nvidia didn't even restrict their new real-time ray-tracing demo to the Fermi series, but let it work on the 200 series as well (although it is a good deal slower on, say, a 275 than on the 400). It is certainly way faster on Fermi cards than 200-series cards, though.
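To make the 275-versus-285 question concrete: the CUDA runtime does expose per-device differences, such as compute capability, multiprocessor count, and memory size, and that is the sort of thing a vendor could key a certification list on. A minimal query, as my own sketch (nothing from Adobe or Nvidia):

    // Sketch: print the per-device properties that CUDA reports.
    // Even cards in the same product generation can differ in compute
    // capability, multiprocessor count, and memory size.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int d = 0; d < count; ++d) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, d);
            printf("%s: compute capability %d.%d, %d multiprocessors, %lu MB\n",
                   p.name, p.major, p.minor, p.multiProcessorCount,
                   (unsigned long)(p.totalGlobalMem / (1024 * 1024)));
        }
        return 0;
    }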

Now, I could be wrong, but something seems odd here.

Why not at least a "this card is not officially supported, Adobe will not be responsible for providing any technical support to users using the CUDA engine on this card, agree? yes no"?

Anyway, thank you for participating in the forum, though; not trying to be harsh. I think everyone very much appreciates that you are posting here.

Guest, Apr 22, 2010

Harm Millaard wrote:

"Oh Lord, help me to keep my big mouth shut until I know what I'm talking about."

Or do better:  Remind yourself that the topic of this entire discussion is nVidia GPU hardware acceleration, not non-GPU MPE improvements.  If all you want to do is cheer on MPE's non-GPU-related speed improvements, there's a big huge forum waiting for you.

LEGEND, Apr 22, 2010

You're simply wrong to say that "any graphics card you want" will "get significant improvements in performance over CS4."  You could have the cheapest graphics card in the world and it wouldn't make a spit of difference in this context.

You are wrong to say that. It is unsubstantiated and absolutely incorrect.

Without a CUDA-certified card, performance gains are around 40%. That is a significant improvement.

Advisor, Apr 22, 2010

joshtownsend wrote:

Adobe is disabling motherboards that haven't been fully tested by Adobe, and older computers with a shortage of RAM that won't run Premiere stable and fast like advertised. Right now only high-end Intel i7s are supported, with more processors possibly being added before release next month. Also, hard drives that are too slow to play back native 4K will not be accessible through Premiere. This is to guarantee stability to all customers.

This is not correct.   The source of your info?

Participant, Apr 22, 2010

I was joking.....thought it was obvious. Sorry.

Advisor, Apr 22, 2010

I'm too old for anything to be obvious; except the fact that I'm old and complaining about taxes almost full-time now.

Explorer, Apr 22, 2010

Peculiar joke.

Very close to hitting "ignore_this_user_button".

New Here, Apr 23, 2010

I want to say what I find wrong with all of this, because I think there are things that need to be said. The problem I see here is that when Nvidia promoted the CUDA standard, it promoted it precisely as such, and as far as I know a standard is supposed to be exactly that. But now some people are finding out the hard way (an expensive way) that the "standard" may really be a "quasi-standard" or "pseudo-standard".

The problem I see is that people expect something that is CUDA-compatible or OpenCL-compatible to be exactly that, and now they are realizing that this is not the case. It is like buying a Direct3D- or OpenGL-compatible card and then, when you buy Direct3D- or OpenGL-compatible software or a game, finding out that despite the "standard" it still doesn't work. This is exactly the same thing happening, but with another standard.

Now think of the history of PC 3D standards, and precisely of some that are very closely related to this issue, like 3D acceleration, because they live right there on these same graphics cards. One thing that comes to mind is that in the early days of both DirectX and OpenGL there were growing pains, and it took time to work the bugs out; there are still some issues.

Nowadays you are less likely to find problems with DirectX or OpenGL, but you still find them. How many times have you brought a PC game home only to find out that it doesn't run on your graphics card, even with the latest drivers? It has happened to me a couple of times, and I'm not talking about performance issues; I'm talking about your hardware far exceeding the minimum system requirements, only to find that your 3D program or game looks all crazy on the screen and is unusable.

Now, like I said, nowadays it is harder to find such an issue, and I wonder if the same thing is not happening here again. Adobe may say whatever they want, and Nvidia may say whatever they want, but the bottom line for those who bought expensive hardware is that it didn't work the way it was touted. When people bought those CUDA-capable or CUDA-enabled cards, they bought them with the idea that any CUDA-capable software would be able to take advantage of their investment, and now they find that this is not the case. How can we not expect them to be very angry? And whose fault is it? Adobe's, Nvidia's, or both? Who is taking responsibility for the problem?

These are all valid questions, because they entail other questions, such as whether this is a problem that people who possess CUDA-capable hardware can expect in the future with other software. Are people who have such CUDA-capable cards going to get spotty compatibility from many of the CUDA-enabled programs out there? Is this a growing pain similar to what happened in the early days of the DirectX and OpenGL standards? Will it go away as the "standard" matures?

Has Nvidia had enough time already to mature the CUDA standard to the point that this is not supposed to happen? Is the standard robust enough to prevent this kind of thing, and if it is, is it Adobe's fault for not working properly with it? If there are kinks that need to be worked out, will Nvidia work them out soon? Will we see a CUDA 2.0 that has fewer of these types of problems? Good questions, but at the moment the bottom line is that many people who bought CUDA-capable hardware are going to feel disappointed that their expensive graphics cards didn't work as touted.

In particular, I have two EVGA GTX 260 graphics cards in SLI mode, but at the moment I have no use for Premiere. I only have Creative Suite CS4 Web Premium, so I didn't buy Premiere, and I don't plan to buy CS5 anytime soon because I'm very happy with my CS4 suite. But if I had bought Premiere CS5 thinking that my CUDA-capable graphics cards could give me the extra performance Adobe was promising, and then got home and found that it didn't work like that, I would probably be as upset as those people.

What if I had bought the Hypershot renderer (a very expensive, thousands-of-dollars CUDA-capable renderer) thinking, "Oh, with my two GTX 260 cards it is going to run much faster," and then found something like, "We are sorry, but Hypershot doesn't accelerate at all with your graphics cards; you will have to use CPU rendering only"? And worse, what if instead of those two GTX 260 cards I had bought two very expensive Quadro cards and found out that they were not compatible with my also very expensive Hypershot? Uffffffffff!!!!! Boy, would I be angry!

So the best practice now, of course, is that from now on, when people are in the market for those CUDA-"capable" cards and CUDA-"capable" programs, they have to check first and ask the software manufacturer to make sure their hardware is compatible. For those who already made the investment, sorry for them, and let's hope that either Adobe or Nvidia fixes the problem, and soon.

Engaged, Apr 24, 2010

This thread has been pretty interesting and in some ways entertaining to read and follow. I'm really starting to believe that many people just can't be satisfied. I've been a part of these forums for years now, and with every release people have complaints. And this release seems to be no different - even though it has yet to hit the shelves.

Users have complained for a long time that PPro is unstable, bug-ridden, not "Pro" enough...I don't agree with that, but that's what seems to come up time and time again. Some of this stems from (by my observation) users who don't always follow the recommended hardware requirements, sometimes it comes from using odd consumer formats, and other times it's just a bug - every piece of software has them...even FCP and Avid.

But getting back to the bit about hardware - the other two big A editors have hardware requirements. FCP has the most restrictive - you have to buy a Mac. Avid works on both, but has its own set of requirements for it to work properly. Adobe hasn't really ever set any strict hardware requirements (compared to the other big A companies) other than processor speed, memory, and hard drive space - and for cutting DV, those are pretty low. And I've always felt that many people who try to use home-built machines that were built after a weekend at Fry's Electronics, without really consulting anyone who knows anything about video editing, often seem to complain the most. Now, some on these forums REALLY know their stuff and can build themselves one hell of a computer. Others (and I admit - I'm not really a hardware guy) should leave it to the pros.

So this time around, Adobe comes up with a great, powerful new feature - the catch is that you have to have one of a handful of video cards to use it. Why? So that they could focus on those cards during development to create the most stable, most reliable, and best-performing feature they could. They finally got some strict guidelines. And you know what - I'm glad. Why? Because I've seen MPE work, and it's amazing. I would much rather have a limited number of video card choices and have software that performs brilliantly, than have a large number of cards that half work, or work some of the time.

I believe that Adobe listed what cards would be supported some time ago - it's not like this is any big surprise. Will cards be added...I'm sure they will, but I'm also guessing that it will be a short list to keep the level of performance high. Adobe has limited resources, and can't be expected to test and certify every card out there. By limiting it, they have done us all a favor - given us something that works great with the right equipment.

I'm glad Adobe has finally laid down some ground rules for working with PPro. I'd welcome more of them if it meant a more stable NLE with more great features like MPE.
