⚠️ Please note that this topic or post has been archived. The information contained here may no longer be accurate or up-to-date. ⚠️

Workstation recommendation please

Comments

112 comments

  • grasjeroen
    [quote="Christian Gruner" wrote:
    [quote="NNN635397352773288520" wrote:
    The Quadro is because of the 10-bit option, its low working temperature, and the fact that the drivers are tested for our kind of work. But yes, it costs a lot compared to what it can do. 😄


    For Capture One use, 10 bit support is a waste of money.
    Drivers are only verified/certified for CAD work, not for graphical work. This is basically what you pay for in a card like this.

    The low working temperature is a result of very limited computational power. And if you do go up to the cards that seem to perform "ok", like the Quadro M5000 or FirePro W8000, they basically have the same form factor and noisy fans as larger gaming cards.


    But are other cards verified to support 10-bit for Capture One use? 10-bit would be very useful when using a professional monitor, e.g. from Eizo, which now even has support for calibration in CO. So can we utilize 10-bit on the monitor, the card, and in CO? If so, which cards support 10-bit?
  • Christian Gruner
    CO only outputs the user interface in 8 bit, so the last 2 bits are not used by CO.
  • grasjeroen
    [quote="Christian Gruner" wrote:
    CO only outputs the user interface in 8 bit, so the last 2 bits are not used by CO.

    To be completely clear: that is for the entire user interface, so including the image(s)? So images are only shown in 8 bit?
  • Robert Whetton
    [quote="grasjeroen" wrote:
    [quote="Christian Gruner" wrote:
    CO only outputs the user interface in 8 bit, so the last 2 bits are not used by CO.

    To be completely clear: that is for the entire user interface, so including the image(s)? So images are only shown in 8 bit?

    Yup, so only 16.7 million colours! And the human eye is capable of detecting around 10 million unique colours.
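    A quick arithmetic check on those figures (a rough Python sketch, nothing C1-specific):

    ```python
    # Colours representable at a given bit depth per channel (R, G, B)
    for bits in (8, 10):
        levels = 2 ** bits          # levels per channel
        colours = levels ** 3       # total combinations over 3 channels
        print(f"{bits}-bit: {levels} levels/channel -> {colours:,} colours")
    # 8-bit gives 16,777,216 (~16.7 million); 10-bit gives 1,073,741,824 (~1.07 billion)
    ```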
  • photoBy
    We need to buy 2 Windows machines for Capture One Pro 9 and the in-house tech support wants me to buy Dell.

    From what I have read here, we should be going for an Alienware Dell with a fast gaming card rather than the Precision workstations that have Quadro and FirePro cards.

    The biggest factors I should look for with the GPU are CUDA cores/processing streams, memory speed, and memory bandwidth. On top of that, we should check the CompuBench Video Composition specs for comparable CO9 performance.

    We can go with the AMD RX 480 (I have seen it said AMD is cheaper for more power in OpenCL), but due to the nature of Dell's discounts, it's only an extra ~$150 to get the new NVIDIA GeForce GTX 1080 Founders Edition, which has better specs across the board than the AMD and is ~17% faster on the CompuBench Video Composition test.

    So . . . has anyone tested the new Pascal-based NVIDIA GPUs with Capture One Pro 9?

    Thanks!
  • Robert Whetton
    I'm hoping to have a C1 number for a 1080 soon!
  • Christian Gruner
    The new NVIDIA and AMD 16 nm cards (like the GTX 1080 and Radeon RX 480) work just fine in CO 9. They run the same drivers as the older cards, so CO doesn't see much difference.
  • grasjeroen
    [quote="photoBy" wrote:
    The biggest factors I should look for with the GPU are CUDA cores/processing streams, memory speed, and memory bandwidth. On top of that, we should check the CompuBench Video Composition specs for comparable CO9 performance.


    Is CompuBench's Video Composition benchmark indeed a good measure of CO9 performance? If not, what other benchmark would be better? Or should I look at two or more ...
  • Christian Gruner
    Usually it is, but of course with some margin on both sides.
  • Robert Whetton
    OK so.
    i7 6700K
    16GB DDR4 (3000MHz)
    EVGA GTX 1080 Founders Ed

    Capture One OCL Benchmark score 0.084560

    Considering my R9 390 running on an i5 2500K gets 0.068889, looks like AMD is definitely more bang for buck!
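    Putting those two scores side by side (a quick Python sketch using only the numbers above; note the CPUs differ, so this isn't a GPU-only comparison):

    ```python
    # Capture One OCL Benchmark scores reported in this thread (higher is better)
    scores = {
        "GTX 1080 on i7 6700K": 0.084560,
        "R9 390 on i5 2500K": 0.068889,
    }
    ratio = scores["GTX 1080 on i7 6700K"] / scores["R9 390 on i5 2500K"]
    print(f"The 1080 setup scores {ratio:.2f}x the R9 390 setup "
          f"(~{ratio - 1:.0%} higher)")
    ```

    So the 1080 system is roughly 23% faster on this benchmark, which is where the bang-for-buck argument comes in once you factor in the price difference.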
  • grasjeroen
    [quote="Bobtographer" wrote:
    OK so.
    i7 6700K
    16GB DDR4 (3000MHz)
    EVGA GTX 1080 Founders Ed

    Capture One OCL Benchmark score 0.084560

    Considering my R9 390 running on an i5 2500K gets 0.068889, looks like AMD is definitely more bang for buck!


    I'm currently configuring a new system, and am thinking about: i7 6850K, 32GB DDR4 (2400MHz), GTX 1060. My HW friend tells me that AMD GPUs are less reliable, mostly due to the drivers. What is your experience?

    Furthermore, the benchmark results are impressive for AMD. Anyone in the know regarding the whys of this difference? BTW, when I look at CompuBench results for Video, Nvidia is not doing that badly.
  • Christian Gruner
    [quote="grasjeroen" wrote:
    [quote="Bobtographer" wrote:
    OK so.
    i7 6700K
    16GB DDR4 (3000MHz)
    EVGA GTX 1080 Founders Ed

    Capture One OCL Benchmark score 0.084560

    Considering my R9 390 running on an i5 2500K gets 0.068889, looks like AMD is definitely more bang for buck!


    I'm currently configuring a new system, and am thinking about: i7 6850K, 32GB DDR4 (2400MHz), GTX 1060. My HW friend tells me that AMD GPUs are less reliable, mostly due to the drivers. What is your experience?

    Furthermore, the benchmark results are impressive for AMD. Anyone in the know regarding the why's of this difference? BTW, when I look at CompuBench results for Video, Nvidia is not doing that bad.


    We don't see any big or small difference in reliability between the two vendors.
    I can't comment on the difference in performance, as it would be guesswork.
  • Robert Whetton
    [quote="grasjeroen" wrote:
    I'm currently configuring a new system, and am thinking about: i7 6850K, 32GB DDR4 (2400MHz), GTX 1060. My HW friend tells me that AMD GPUs are less reliable, mostly due to the drivers. What is your experience?

    Last time I had Nvidia was maybe 10 years ago? Everyone has different tales to tell about their failed hardware... I've found ATi/AMD reliable, and they update their drivers often too.
  • grasjeroen
    [quote="Bobtographer" wrote:
    [quote="grasjeroen" wrote:
    [quote="Christian Gruner" wrote:
    CO only outputs the user interface in 8 bit, so the last 2 bits are not used by CO.

    To be completely clear: that is for the entire user interface, so including the image(s)? So images are only shown in 8 bit?

    Yup, so only 16.7 million colours! And the human eye is capable of detecting around 10 million unique colours.


    So 10-bit is just technology hype, according to you? If so, I should not spend my money on displays that can show many more colors.
  • Robert Whetton
    [quote="grasjeroen" wrote:

    So 10-bit is just technology hype, according to you? If so, I should not spend my money on displays that can show many more colors.

    10-bit displays apparently give better grey tones? But unless your prints are showing something you don't see on screen, why upgrade?
  • Pavel Derka
    This is an interesting thread, and I really appreciate that Christian has contributed so much good-to-know information.

    I normally use a Mac Mini for C1, or a 2012 MacBook Pro, but have been considering using one of my PCs or servers, due to the fact that I like my drives inside rather than attached.

    I'm a bit put off, because my main PC (Xeon 3.2 GHz with 24 GB ECC RAM and an ATI Radeon 7750) is not accelerated in C1. I spec my systems first for Linux and FreeBSD needs, and boot into Windows 7 or 8 only once in a while. While I don't mind spending money on a better card, I have two major reservations. Firstly, many of the new cards take a long time to be well supported under Linux, but hey, that's their fault. The bigger problem for me is that I can't stand the noise that many of my past gaming cards made. It drives me nuts.

    In that spirit, do any of you have suggestions for decent video cards that are accelerated, don't necessarily have to be extremely fast, and are also low-power with a very low fan sound?

    I'm also pretty surprised, though perhaps it's my background in virtualization that has made me a bit paranoid, that more people are not talking robustness over simply speed. RAID zero? Non-ECC memory and large hard drives? That's a problem in my book, and something I thought more people would be concerned about as a first priority. Don't people whose data is valuable care about things like single-level cell advantages, instead of a hair better performance? Heck, in my book both Apple's HFS+ and NTFS are "bottlenecks" as far as reliability is concerned. Many people think that SSDs are more reliable than good "spin" drives, but that is not at all the case.

    But on that topic, does Capture One run on Windows Server running ReFS, by any chance - or is that an unworkable (or just plain dumb) idea for some reason?

    But back to my initial concern - what card is low power (no need for a better power supply), costs less than ~250 bucks, and does not impersonate a turbine when stressed a bit doing Capture One's bidding? Linux support under Slack, Debian or Ubuntu a major plus! 😊
  • Alain Decamps
    [quote="Pavel" wrote:

    In that spirit do any of you have any suggestions as far as decent video cards that are accelerated, but don't necessarily have to be extremely fast, and are also low power and with a very low fan sound?

    I'm also pretty surprised, though perhaps it's my background in virtualization that has made me a bit paranoid, that more people are not talking robustness over simply speed. RAID zero? Non-ECC memory and large hard drives? That's a problem in my book, and something I thought more people would be concerned about as a first priority. Don't people whose data is valuable care about things like single-level cell advantages, instead of a hair better performance? Heck, in my book both Apple's HFS+ and NTFS are "bottlenecks" as far as reliability is concerned. Many people think that SSDs are more reliable than good "spin" drives, but that is not at all the case.



    I would look at an AMD RX 470 or RX 480. The 470 is slightly slower and uses less energy.

    Robustness is important, but backing up often (snapshots) and verifying files are possible solutions. I don't know of GPUs that have ECC built in 😉
  • Pavel Derka
    [quote="Alain" wrote:
    [quote="Pavel" wrote:

    In that spirit do any of you have any suggestions as far as decent video cards that are accelerated, but don't necessarily have to be extremely fast, and are also low power and with a very low fan sound?

    I'm also pretty surprised, though perhaps it's my background in virtualization that has made me a bit paranoid, that more people are not talking robustness over simply speed. RAID zero? Non-ECC memory and large hard drives? That's a problem in my book, and something I thought more people would be concerned about as a first priority. Don't people whose data is valuable care about things like single-level cell advantages, instead of a hair better performance? Heck, in my book both Apple's HFS+ and NTFS are "bottlenecks" as far as reliability is concerned. Many people think that SSDs are more reliable than good "spin" drives, but that is not at all the case.



    I would look at an AMD RX 470 or RX 480. The 470 is slightly slower and uses less energy.

    Robustness is important, but backing up often (snapshots) and verifying files are possible solutions. I don't know of GPUs that have ECC built in 😉


    I will look into those two in terms of Linux support and price/performance ratio. I guess looking through gaming reviews could be an effective way to gauge the noise floor.

    True about GPUs and ECC 😉 ... but the corruption happens at the writes, and ECC fixes memory flips etc. The GPU isn't involved. 😊
  • Robert Whetton
    In all the years I've been using a computer, I only ever used ECC once in a workstation - back in the dual 1.3 PIII days on a Supermicro board. Can't say I noticed any difference in reliability with non-ECC RAM..
  • Pavel Derka
    [quote="Bobtographer" wrote:
    In all the years I've been using a computer, I only ever used ECC once in a workstation - back in the dual 1.3 PIII days on a Supermicro board. Can't say I noticed any difference in reliability with non-ECC RAM..


    I imagine it's not a thing that happens a lot, but bit flips in memory are not all that uncommon, nor is bit rot, and without checksums in RAM they get written out. Rarely does this damage a file, but once in a while it can, and when you get back to it one day you may find that out of every several thousand JPG file openings, one turns out corrupt. Of course hard drives run checksums, but if they get faulty data from RAM, that does not help. I'm surprised if you've never had that happen, Bob. I get it once in a while.

    The larger the hard drive, the more likely it will bite one day, especially if you use hardware RAID controllers (like the infamous RAID 5 write holes - which are greatly mitigated by ECC). Paradoxically, the most reliable antidote to all this is to use a file system like ZFS or Btrfs, but it's important to NEVER use hardware RAID with those. People have cried the blues over that simple mistake. ZFS, of course, is meant to be used only with ECC.

    Of course this does not create stability problems with the OS, just once in a while a file eats itself. I get that every so often, and I always know it because out of the blue Photo Mechanic shows me a black preview. But of course a good backup strategy is the more important part of all of this. 😊
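    To make that failure mode concrete, here's a toy Python sketch - a single bit flip in a buffer changes its checksum, which is exactly the kind of error ECC catches in RAM and ZFS/Btrfs catch on disk (CRC32 here just stands in for those checksums):

    ```python
    import zlib

    # Pretend this buffer is image data sitting in RAM before being written out
    data = bytearray(b"JPEG image bytes about to be written to disk")
    crc_clean = zlib.crc32(data)

    data[7] ^= 0b00000001  # simulate a single-bit memory flip (bit rot)

    crc_flipped = zlib.crc32(data)
    print(crc_clean == crc_flipped)  # prints False: the flipped bit is detectable
    ```

    Without a checksum (or ECC), that flipped byte would be written to disk silently, and the drive's own checksums would faithfully protect the corrupted data.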
  • Christopher Hauser
    I'm just finishing off a new computer and wanted to know what would give me the best performance GPU wise.

    What I'm considering:

    - 2 Radeon R9 390X
    vs
    - 2 GeForce GTX 1070
    vs
    - 1 GeForce GTX 1080

    Price-wise they are all close. What I gather so far is that, in general, AMD is better for C1. In addition, I was more than a little underwhelmed by the Capture One OCL Benchmark score of 0.084560 from the 1080... My single Radeon R9 390 (without X) already scores around 0.07.

    Now, do I understand correctly that two Radeon R9 390X would give me quite a processing boost over one? (I understand that both are benched separately.) I'm talking about real-world performance.

    Facts about the rest of the PC:
    - Intel Core i7 6900K
    - 128GB Mem
    - Working files on a 960Pro 2TB --> Export to another 960pro (Both m.2)

    Files: Raw files are all IQ3100 and IQ180
  • Robert Whetton
    [quote="ChristopherHauser" wrote:
    Now, do I understand correctly that two Radeon R9 390X would give me quite a processing boost over one? (I understand that both are benched separately.) I'm talking about real-world performance.

    Correct! But also a higher hit on the CPU... around a 40% increase, correct, Christian?
