Does the NVIDIA GeForce GTX Titan X work with the newest AE CC version?

New Here, Apr 29, 2015

Hello

I would like to know whether I can use the GeForce GTX Titan X with the newest AE CC version. On the system requirements page (2014), the GTX Titan is listed for the ray-traced 3D renderer, but not the Titan X. Does the Titan X work as well? Who can help me?

I would like to run it on an HP Z640 (Windows) workstation.

Thanks for all the help.

Tom

Views: 20.9K

Participant, Apr 29, 2015

Yes. Been running it for a few weeks.

New Here, Apr 29, 2015

Thanks for your answer. And you can really use it for ray tracing in AE?

Then why doesn't Adobe put the GeForce GTX Titan X on that list?

What computer are you using to run the Titan X?

Participant, Apr 29, 2015

I'm running it on a custom build with an Asus X99-E WS, an Intel 5930K, and 64 GB of RAM.

I misread the original post: I can confirm that it does not work with the 3D ray tracer. It works with every other aspect of CC that I have tested. My apologies. May I recommend Element 3D instead? It opens up far more possibilities than the AE ray tracer can.

The Titan X and the Quadro M6000 use the same GM200 chip, so you will see no performance increase in AE with the Quadro. Stay away from Quadro cards when buying GPUs for Adobe software: Quadros are wildly overpriced and only offer minor gains in software written specifically for Quadro drivers, which Adobe software is not.

New Here, Apr 29, 2015

Thank you.

Sad to hear that it doesn't work. But you are saying that it does work with Element 3D, right? That would be great! I plan to buy an HP Z640 with a Xeon E5-2630v3 and 2× 8 GB RAM, so 16 GB total, but I thought the Titan would handle almost every workload, so I wouldn't need more. Am I wrong?

Guru, Apr 29, 2015

The Titan is overkill for this system. The 2630 is fairly slow, cannot be overclocked, and it is not a dual-CPU configuration. Most importantly, 16 GB is not enough memory and is wrongly configured for the quad-channel architecture: you would need 4 or 8 memory sticks, so either 32 or 64 GB.

Participant, Apr 30, 2015

To be honest, Titans are overkill on most systems, but it wasn't my money I was spending when I got mine. For my own machine I've got a 980, which is perfectly good enough for Element. The Titan will give you about 25% more for twice the money.

I'd have to agree with cc_merchant. If you're buying a new system, I couldn't recommend a Xeon, even the nicely priced 2630. Look at an X99 i7 with a 980. There are rumours of a 980 Ti coming out this summer, so you may want to hold out for that; it will essentially be the same GM200 as in the Titan X, but with 6 GB.

New Here, Apr 30, 2015

When you talk about overkill, what exactly do you mean? Does the GPU depend so much on the CPU and RAM that the Titan X will not work properly?

Can you recommend an HP Z640 build that would work with the card? One with a dual-CPU configuration and 32 GB RAM.

I need a computer for AE movie work in 4K.

Thanks for all your help and criticism.

LEGEND, Apr 30, 2015

If you want a system optimized for AE, get all the memory you can afford to install in it.

Guru, May 01, 2015

Overkill means driving a Porsche 928 with its 800+ HP engine in rush hour. The engine is mostly idle; the Porsche is no faster than the Fiat 500 in the next lane, despite its enormous power.

The CPU and memory need to feed the video card with data in order for it to do its magic. If the feed is no more than a trickle, the video card sits mostly idle, waiting for the next trickle of data.

A faster CPU and more memory turn that trickle into a steady stream of data.

Community Expert, May 01, 2015

No 928 has ever had 800 bhp.

New Here, May 01, 2015

To get back on the subject: what would be a good enough workstation for the Titan X?

- HP z640 - 1x Xeon E5-2630v3 and 32GB RAM?

- HP z640 - 2x Xeon E5-2620v3 and 64GB RAM?

- HP z640 - 2x Xeon E5-2697v3 and 64GB RAM?

- HP z840 - 1x Xeon E5-2680v3 and 32GB RAM?

- ???

Or is it just stupid to put a Titan X in a system costing less than €5000? If so, what card would you recommend instead?

Thanks for all the comments and help.

Guru, May 01, 2015

2× Xeon E5-2697v3, 64+ GB RAM, and transfer rates on all volumes above 800 MB/s.
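A quick way to sanity-check the "above 800 MB/s on all volumes" requirement is to time a large sequential write and read. A minimal Python sketch (the file path, test size, and chunk size are arbitrary choices of mine, not from this thread):

```python
import os
import time

def seq_throughput_mb_s(path, size_mb=256, chunk_mb=8):
    """Sequentially write then read `size_mb` MB at `path`; return (write, read) in MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    n = max(1, size_mb // chunk_mb)

    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure data reaches the disk, not just the OS cache
    write_mbs = n * chunk_mb / (time.perf_counter() - t0)

    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_mb * 1024 * 1024):
            pass
    read_mbs = n * chunk_mb / (time.perf_counter() - t0)

    os.remove(path)
    return write_mbs, read_mbs

# Example (hypothetical path -- point it at the volume you want to test):
# w, r = seq_throughput_mb_s(r"D:\cache\ae_disk_test.bin")
# print(f"write: {w:.0f} MB/s, read: {r:.0f} MB/s")
```

The read figure is usually inflated by the OS page cache, so treat the write figure as the more honest one; a dedicated benchmark tool is more trustworthy.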

New Here, May 03, 2015

What about:

HP Z640 with 2× Xeon E5-2643v3 and 64 GB RAM?

How can I tell which GPU would suit that system and not just be the "Porsche 928 in rush hour"?

Guru, May 03, 2015

Sorry for my earlier typo. I meant the Porsche 918 Spyder.

To determine whether the Titan X is worth it, use a rough rule of thumb:

MIN(N, 1.5) × (physical cores per CPU) × (clock speed in GHz), where N is the number of CPUs.

Your example with dual E5-2643v3 gives: 1.5 × 6 × 3.4 = 30.6

With dual E5-2697v3: 1.5 × 14 × 2.6 = 54.6

With a single i7-5960X overclocked to 4.5 GHz: 1 × 8 × 4.5 = 36

When the result is below 40, it is highly doubtful the Titan X can be used to its full potential; between 40 and 50, it may occasionally reach full potential; above 50, it is very likely a very good choice.

This statement assumes that memory is at least 64 GB and the disk setup is very fast on all volumes with transfer rates in excess of 800 MB/s.
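The rule of thumb above is easy to script; a small sketch (the 40/50 cutoffs are the ones stated in this post, nothing more authoritative):

```python
def titan_x_score(num_cpus, cores_per_cpu, clock_ghz):
    """MIN(N, 1.5) x (physical cores per CPU) x (clock speed in GHz)."""
    return min(num_cpus, 1.5) * cores_per_cpu * clock_ghz

def verdict(score):
    # Cutoffs as stated in the post above.
    if score < 40:
        return "Titan X unlikely to reach full potential"
    if score <= 50:
        return "may occasionally reach full potential"
    return "very likely a good choice"

# The three examples from the post:
for label, args in [("2x E5-2643v3", (2, 6, 3.4)),
                    ("2x E5-2697v3", (2, 14, 2.6)),
                    ("1x i7-5960X @ 4.5", (1, 8, 4.5))]:
    s = titan_x_score(*args)
    print(f"{label}: {s:.1f} -> {verdict(s)}")
```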

Community Beginner, May 19, 2015

You should say: no stock 928.

New Here, Apr 29, 2015

What can you tell me about the quality and speed of this card (GeForce GTX Titan X) compared with the NVIDIA Quadro M6000? These two cards were tested, but I'm not sure I understand the result correctly. To me, it looks like the Titan is as good as the Quadro for work in After Effects. Here is a link to the test (it is in German):

http://www.tomshardware.de/geforce-quadro-workstation-grafikkarte-gpu,testberichte-241759.html

Guru, Apr 29, 2015

The Titan X would be better than the Quadro K6000 because it is a 900-series GPU instead of 700-series: less heat and power draw at the very least, plus HDMI 2.0. The other 900-series cards have not worked with the CS6 ray tracer, since it is legacy now and was not updated for CC 2014. I suspect it won't work with the Titan X cards either, though the latest NVIDIA drivers may have addressed that. However, the Titan X works fine for GPU acceleration in all the other applications in the Adobe suite.

Eric
ADK

New Here, Sep 30, 2015

Does anyone know if there has been any progress on this question?

My triple Titan X system really delivers in Octane Render, but AE still fails to launch the ray tracer.

Guru, Sep 30, 2015

That is to be expected. The 9xx series does not support ray tracing. In AE, any $30 video card is just as good as three Titans for $3,300, which says something about how wise an investment that is.

New Here, Oct 13, 2015

Any news on when this update will hit? http://blogs.adobe.com/aftereffects/2015/09/whats-new-and-changed-in-the-upcoming-update-to-after-ef...

It would be great to use the 900-series cards as intended.

New Here, Feb 21, 2016

Adobe CC2015 works in mysterious ways:

Just finished this build:

Single Xeon 2697v3 on an X99 Deluxe with 64 GB 2400 MHz DDR4 RAM

OS/program drive: Samsung 950 Pro M.2 256 GB

Cache drive: Samsung 950 Pro M.2 512 GB

Media drive: RAID 0 Samsung (2× 1 TB) 850 Pros

Titan X

This system is very fast in 3ds Max, but in After Effects it is shockingly slower than my old setup (i7 3770K OC'd to 4.3 GHz with a GTX 780, 32 GB DDR3, and old SSDs).

I then bought an i7 5960X because I thought something was wrong! OC'd to 4.5 GHz, it is basically the same speed as my old machine in After Effects, in both previews and renders.

Adobe Media Encoder was even more frustrating:

The i7 3770K is generally 20% faster than the i7 5960X with CUDA enabled.

With CUDA disabled, the i7 3770K was 10% faster.

The Xeon 2697v3, however, was 50% SLOWER than the i7 3770K (CUDA enabled).

I can live with Media Encoder being slow, but it is really frustrating to see such stalled performance in After Effects.

Now comes the interesting part:

I did a third build that wasn't meant for AE, but decided to test it:

i7 6700K OC'd at 4.8 GHz, 16 GB DDR4 RAM, and onboard graphics

OS on a pair of RAID 0 Samsung 850 EVOs

All files on the C: drive

This system ended up being the fastest in AE at both previews and renders,

so faster than the i7 3770K with the GTX 780, and faster than the i7 5960X with the Titan X.

In Media Encoder, the i7 6700K (no CUDA) tied with the i7 5960X (with CUDA), but was still 20% slower than the i7 3770K (with CUDA).

What's interesting about this is:

1: The Titan X doesn't make much difference, and since it is not officially supported, the 780 seems to work far better than the Titan X, even with it enabled in the preview settings. (There was no ray tracing going on in my renders, so After Effects must still use the GPU, perhaps via OpenGL, for previews and renders as well as for encoding inside Media Encoder, and an unsupported GPU just doesn't integrate well.)

2: Hard disk configuration seems overhyped as a potential bottleneck, and the RAM myth is also just a myth.

The test on the i7 6700K clearly showed that a system with only a C: drive lost nothing versus multiple lightning-fast M.2 drives.

Also, the i7 6700K with only 16 GB DDR4 RAM was the only setup that was faster in AE in both preview and rendering, and this was with onboard graphics.

The i7 3770K with 32 GB of DDR3 RAM, and even the i7 5960X with 64 GB of DDR4 RAM, didn't make a difference.

3: The multi-core dilemma is also not valid. The Xeon 2697v3 has 14 cores (28 threads) and they all worked equally hard in AE; the only problem was that AE could not feed them enough data, so they didn't exceed 10-15%.

The i7 5960X with 8 cores (16 threads) showed the same issue: all cores were working, but AE failed to use the full potential of this CPU as well.

The i7 6700K with 4 cores, and without CUDA cores, was the fastest. It didn't max out the CPU, but sat at around 70%.

The i7 3770K with 4 cores and the 780 seemed to be the best-balanced setup, given that it is the "slowest CPU of them all". The CPU was allowed to run at 70-100%, which clearly made a difference.

Conclusion:

I believe the culprit here could be the Titan X: when AE can use CPU and GPU in nicely orchestrated conjunction, as it does with the i7 3770K and the 780, it seems able to take full advantage of your hardware.

The i7 6700K wins, maybe because Adobe has geared its latest CC 2015 towards the Skylake architecture, and/or because at 4.8 GHz per core it has the fastest clock speed of all the CPUs I tested; without the unsupported Titan X to interfere, it outperformed the i7 5960X and the Xeon 2697v3.

Having 16 or 64 GB of DDR4 RAM didn't make any difference. AE just slowly fills the assigned RAM, with no impact on speed. (Maybe you can have longer RAM previews, which is nice, but it does nothing for speed.)

The old hard disk configuration argument is also not valid. The fastest system had one drive (a relatively fast drive, admittedly, which can read/write simultaneously), but I bet it would have been just as fast on a normal 5400 rpm SATA HDD.

My advice to the consumer:

Get the i7 6700K and a GTX 780, or another card that is supported by AE.

Don't break the bank on the latest and fastest drives; just make sure you back up your files every now and then. Maybe consider RAID 1 or 10.

I would, however, get a decent SSD for the OS, just because apps and booting are faster.

My advice to Adobe:

Rumor has it that all these strange results are only temporary, and that since changing the render engine in CC 2014 you have been working on a new approach which will hopefully stay as simple as the current CC 2015 but integrate GPUs and CPUs so they can be fully utilized.

If that is the case, this should be your most important update ever, since the current issues have a really negative impact on professional users of your platform.

Community Beginner, Jul 28, 2016

Hello guys!

After Effects has *always* had issues with CPU cores:

- In the past you had to use the late, lamented "GridIron Nucleo Pro":

GridIron Nucleo Pro - YouTube

The beginning of this video is still accurate, after all this time...

- Then Adobe "bought" (or "copied") this idea and integrated it into AE's options panel:

AE only using 25 percent of my CPU

Also, if you enable Render Multiple Frames Simultaneously, the main After Effects application is just one instance. The other instances (depending on your number of cores) are called "AEselfLink". You may want to check CPU usage for those as well. But as Todd said, in many cases other factors become bottlenecks. And some features disable simultaneous frame rendering for that Comp.

- Even later, they realized that you can cache many things from the composition, on an SSD if possible. Another great idea from Nucleo Pro.

- This lasted until the CC 2015 update, which completely removed the feature:

features not available in After Effects CC 2015 | Creative Cloud blog by Adobe

The Render Multiple Frames Simultaneously feature has been superseded by the new architecture in After Effects CC 2015 (13.5). The new architecture will allow a future version of After Effects to utilize processor threads and RAM more efficiently than the Render Multiple Frames Simultaneously functionality ever could.

- So they got rid of the multiple "aftereffects.exe" tasks in the background, because that approach never worked well or easily. BUT they then disabled much of the multi-core use; I think even a 6-core CPU is too much for AE now.

So a HIGHER clock (and overclock) is the key today. That is why Xeons are so bad at rendering, until Adobe finds a much better way to use all cores.

Multiple ways of rendering:

In AE there are at least four different ways of rendering.

- It has *always* been a CPU-only render, or a GPU-assisted render (meaning you have to feed your GPU, and it is the CPU that does it).

- For a very long time, AE has used OpenGL in 2D mode: for drawing the display, for deforming images/videos, and for some effects.

It is also used in so-called "2.5D", when you manipulate 3D objects (see the famous Video Copilot plug-in), and for those effects too.

* So having a GPU with OpenCL acceleration is useful, and that is easy since all GPUs have it. Besides, the more VRAM you have, the more complex the textures you can use; 4 GB is very good for HD video.

- In AE CS6, Adobe introduced a true "ray-traced 3D renderer". In fact they integrated the famous NVIDIA OptiX library, which obviously works only on NVIDIA cards with CUDA.

http://www.provideocoalition.com/the-ray-traced-3d-renderer-in-after-effects/

I think that around the end of 2014 they had a "fight" over NVIDIA OptiX licensing inside AE, so they stopped developing around it.

GTX 970 not supported? After Effects error: Ray-traced 3D: Initial shader compile failed ( 5070 :: 2...

... and advised going the Cinema 4D route.

Then, a long time later, in 2016, they provided an update: "The OptiX library from Nvidia is being updated on Windows, which adds the ability for Maxwell-architecture GPUs (e.g., GTX 970, GTX 980) to be used for the ray-traced 3D renderer on Windows."

what’s new and changed in the After Effects CC 2015 (13.6) update | Creative Cloud blog by Adobe

A very good trick:

*******************

Since optix1.dll is the core of the OptiX renderer in AE, you can update it manually. OptiX regularly releases updates that support more GPUs.

Download OptiX™ Versions | NVIDIA Developer. Here you can see that OptiX 3.6 supports Maxwell.

But the very latest is 3.9.1, and it works perfectly with a GTX 980 (and I guess with a Titan X)!

https://devtalk.nvidia.com/default/board/90/

=> Better to update manually than to wait for Adobe to ship the latest version!

* Only the just-released OptiX 4.0 supports the Pascal chip; I have not tested it in AE yet...
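The manual swap described above amounts to backing up AE's bundled optix1.dll and dropping the newer one in its place. A hedged sketch (the install path is a typical Windows default and the file names are assumptions, not verified for every AE version; check your own install, and run it with AE closed):

```python
import shutil
from pathlib import Path

# Assumed default install path -- adjust for your AE version and locale.
AE_SUPPORT = Path(r"C:\Program Files\Adobe\Adobe After Effects CC 2015\Support Files")

def swap_optix(support_dir, new_dll):
    """Back up the bundled optix1.dll, then copy the newer OptiX DLL in its place."""
    support_dir, new_dll = Path(support_dir), Path(new_dll)
    old = support_dir / "optix1.dll"
    backup = support_dir / "optix1.dll.bak"
    if old.exists() and not backup.exists():
        shutil.copy2(old, backup)  # keep the original so the swap is reversible
    shutil.copy2(new_dll, old)
    return backup
```

Keep the .bak around so you can revert if the ray-traced renderer refuses to start after the swap.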

- Since AE CC, they have bundled Cinema 4D, so it is the Cinema 4D rendering engine that is used. There are plug-ins that use your GPU, but they are costly; see Octane from OTOY.

To sum up, if you want a fast rig for AE with 3D, the best build is:

- a very fast, overclockable CPU (an i7 ...K); 4 cores is good, 6 cores a little too much

- 32 GB RAM (longer previews, more complex scenes)

- an SSD (for cache)

- an NVIDIA GPU with 4 GB+ VRAM; the GTX 980 Ti is the best, I think. The Titan X is costly, but you get 12 GB of VRAM...

- an optix1.dll at version 3.9.1

[GPUs with the Pascal chipset require OptiX 4; I have never tested it...]

fred

New Here, Jul 28, 2016

I am using the latest Ae.

I must say it is the best After Effects I have seen so far, but it is sad to see my cores unused, especially when rendering 9 minutes of animation with huge comps, where even a 960×540 preview takes 2-3 hours. I have three high-end 2015 graphics cards in my machine, and none is used at full power by Ae. I must admit the new Lumetri Color is indeed fast, but I would be happier with real multi-core support.

I think it was clear many years ago, when Moore's law hit its limits, that multi-core is the future and the present. I understand that the Ae architecture needs to change for that, but it is not acceptable these days, when a Xeon can have 20 physical cores, giving machines with 80 logical cores: insane machines for anything except Ae, which uses only 1-2 cores, just because it is hard to recode. Rebuild Ae from scratch, use source code from others, make it room-based like Smoke by integrating Premiere into it, make it node-based (you already have a node view built into Ae). Ae needs to change. Even Nuke has serious competition now, and I don't think Ae will escape the same fate. Each year new software appears, so it is just a matter of time until we get something, maybe not necessarily better, but faster or more efficient.

Understand that the kids doing Andrew Kramer tutorials on YouTube are not your market; most of them are using cracked software. The film and advertising industry is, and there time is money, and waiting 3 hours for a half-HD-resolution iteration is a waste of money.

Anyway, I am very happy with the current Ae version, don't get me wrong. To make it render a bit faster I am using command-line rendering to an image sequence, but for some comps even that won't help, because of the huge RAM usage, where even 64 GB is sometimes not enough.

Community Beginner, Jul 29, 2016

The latest news I got from Adobe is that they are pushing to get GPU video encoding working; this means a HUGE speed gain in Premiere, Media Encoder, and AE.

Some codecs already work well with GPU encoding, like MPEG-2 and H.264.

See some professional libraries here: Video Processing Library

I guess this is the future, rather than huge CPU core counts (which are always difficult to code for and optimize). It is worth much more to double your speed each year by simply swapping the GPU than to rebuild a machine with more CPUs.

Also, why not add a ray-trace-only accelerator card to your machine?

https://home.otoy.com/otoy-and-imagination-unveil-breakthrough-powervr-ray-tracing-platform/

EDIT: found a benchmark that shows how much faster AE 2014 is than 2015 with many "cores" (note that a core here is really a core, not a thread),

and again, benchmarks show that not every render benefits from 20 cores:

https://www.pugetsystems.com/labs/articles/Adobe-After-Effects-CC-2014-Multi-Core-Performance-716

Much better is to overclock a 6- or 8-core i7 (like the i7-5960X) above 4 GHz (max 4.5 GHz):

Overclocking Five Retail Intel Core i7-5960X CPUs

and especially in AE 2015, which does not use more than 8 cores at all:

https://www.pugetsystems.com/labs/articles/Adobe-After-Effects-CC-2015-Multi-Core-Performance-714/

With that said, if you need to use AE 2015 (13.5), it is clear that dual-CPU systems are a bad choice for After Effects 2015.

BTW, all these special/custom uses of CPU and GPU show what a bad money investment a closed-hardware machine, like any Apple Macintosh, is for AE...

fred
