
⚠️ Please note that this topic or post has been archived. The information contained here may no longer be accurate or up-to-date. ⚠️

iMac GPU - Vega 48 Vs. Radeon Pro 580X

Comments

14 comments

  • Okular
    I definitely can't answer this question, but we have this thread next door that is about benchmarks:
    GPU benchmarks - values only ( [The Capture One forum has migrated to a new platform; as a result, all links to Capture One related posts stopped working and have been removed] )

    Have a look there for the relevant hardware configurations and compare the benches; perhaps that gives you a clue.
  • Brad Trent
    See...that's the thing.....I can look at 'benchmarks' and stats all day long and it means nothing if I wanna know about real-world applications. I tried gleaning something from that thread earlier and it tells me nothing. Maybe if I was a gamer it might get me pumped, but as someone who just wants to push hundreds of gigs of big raw files through C1, it's kind of useless.

    I'd like to say it's not about the money...and the extra $450 upgrade is hardly gonna put me in the poorhouse...but honestly, I would rather use that cash to buy a nice case of 1er Cru Burgundy if I'm not gonna see a processing edge with a GPU I don't need for the job.
  • cdc
    While I can't speak to your case specifically, I've seen Phase One employees respond here to laggy performance issues on iMacs. The issue was poor performance on iMacs with 5K screens; the representative stated that even the best GPU available for the iMac still delivered poorer-than-preferred performance with Capture One. However, the base GPU that came with the iMac Pro was adequate. This was a couple of years ago if memory serves, and I'm not certain what offerings were available then versus now, but the point I'm making is that upgraded GPUs do make a difference on iMacs, though I don't know where the point of diminishing returns is. Perhaps the people at Phase One could help you out with some info if you sent them a message.

    I'd personally spring for an upgraded GPU before 128GB of memory, since you'll be working with a 5K display.
  • Brad Trent
    Well...actually, the iMac’s screen will be strictly used for palettes.....I’ll do all the real work on a second NEC monitor.

    But this is the first I’ve heard of poor performance with iMacs and C1...now it looks like I hafta do even more research!!!
  • photo by FA
    From the CO team’s point of view, all iMacs are underspecced, we shouldn’t expect good performance, and it’s not CO’s fault. However, if you dig deep, it is purely CO’s fault, as their code base is not optimized and doesn’t use a current version of the GPU languages, i.e. Metal or OpenCL v2.2.
  • Brad Trent
    fatihayoglu wrote:
    From the CO team’s point of view, all iMacs are underspecced, we shouldn’t expect good performance, and it’s not CO’s fault. However, if you dig deep, it is purely CO’s fault, as their code base is not optimized and doesn’t use a current version of the GPU languages, i.e. Metal or OpenCL v2.2.


    Is there something from Capture One you’ve read about this, or is this just one of those fanboy conspiracy theories that picks up steam and morphs into ‘fact’? But assuming it is true that the app isn’t optimized to take advantage of optimal GPU ‘languages’, then back to my original question...is the Vega 48 all anyone needs for Photoshop work?!!
  • Grant Hodgeon
    Brad,

    You're particular about the GPU but fail to mention the type of SSD, the speed of RAM, etc.

    All are factors in performance, but the key components are drive speed, CPU clock, and OpenCL cores. You will also see benefits in batch processing and multi-file previews with higher core counts.

    To answer your question, your performance 'increase' with Vega will be negligible, and a waste of $450.

    You would be better served buying two 580X's.

    Could C1 be more optimised for performance? Not sure. I believe as OpenCL continues to be phased out and deprecated, we'll see a shift in performance one way or another as the new technology is adopted. Whether that takes a hot while and a few versions to iron out any bottlenecks, we'll have to wait and see.

    Get your Burgundy and learn to read the benchmarks that matter to you in order to best decide what fits your use case and budget. Benchmarks are not only made for gamers. https://barefeats.com/ -- you'll get further quicker if you can infer from these types of results yourself.

    Cheers!
  • photo by FA
    Brad Trent wrote:
    fatihayoglu wrote:
    From the CO team’s point of view, all iMacs are underspecced, we shouldn’t expect good performance, and it’s not CO’s fault. However, if you dig deep, it is purely CO’s fault, as their code base is not optimized and doesn’t use a current version of the GPU languages, i.e. Metal or OpenCL v2.2.


    Is there something from Capture One you’ve read about this, or is this just one of those fanboy conspiracy theories that picks up steam and morphs into ‘fact’? But assuming it is true that the app isn’t optimized to take advantage of optimal GPU ‘languages’, then back to my original question...is the Vega 48 all anyone needs for Photoshop work?!!


    When I bought a top-spec iMac 5K, almost 4 years ago now, I wasn’t happy with CO performance, and I contacted TS multiple times. Although this machine was pretty good for editing videos, CO performance was somehow really bad. The feedback I received, multiple times, was that the GPU wasn’t enough for 5K. That was 4 years ago. Up until today, many iMac 5K users have complained about performance, and CO has always told us the GPU is the bottleneck. Meanwhile, Apple has shifted its support to Metal and dropped support for OpenCL. If you look at the Mac forum, users are still complaining about their top-spec MBPs because CO does not use the discrete GPU, etc. That made me check the OpenCL version that CO uses. Apparently they are still on v1.2, while the current version is v2.2. As a note, the v1.2 language does not fully support new GPUs, as it is nearly 10 years old. It may well be that, due to the old OpenCL, the software cannot maximize use of the GPU. I’m not a tech geek or a person who knows coding, but in my book, if a piece of software does not support the current version of its language, then there is a problem.
  • WPNL
    I still haven't heard of anyone on Lightroom who's optimistic about performance on a 4K or 5K display... Makes you wonder, doesn't it?
    You want a fast(er) system? Go back to Full HD.
    If you want a fast system at 4K or 5K, then you have to invest in faster hardware, but keep this in mind:
    from FHD to 4K = 4 times the pixels, but are GPUs/CPUs 4 times faster nowadays?
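    To put rough numbers on the pixel-count point, here's a quick sketch (assuming the standard FHD, 4K UHD, and iMac 5K panel resolutions):

```python
# Pixel counts for common panel resolutions, relative to Full HD.
# Resolutions assumed: standard FHD, 4K UHD, and the iMac 5K panel.
resolutions = {
    "FHD (1920x1080)": 1920 * 1080,
    "4K UHD (3840x2160)": 3840 * 2160,
    "5K (5120x2880)": 5120 * 2880,
}
fhd = resolutions["FHD (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / fhd:.1f}x FHD)")
# 4K UHD is exactly 4x the FHD pixel count, and 5K is about 7.1x.
```

    So a 5K iMac panel is pushing roughly seven times the pixels of a Full HD screen, which makes the "are GPUs/CPUs that much faster?" question even sharper.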
  • Grant Hodgeon
    WPNL wrote:
    I still haven't heard of anyone on Lightroom who's optimistic about performance on a 4K or 5K display... Makes you wonder, doesn't it?
    You want a fast(er) system? Go back to Full HD.


    And yet, with RED as an example, you're now able to play back 8K raw in Resolve whilst live grading with a gaming GPU.

    We're demosaicing a single raw file. Even doing a grade bypass, you can see the struggling multitude of rendering layers trying to switch on and off. The inefficiencies are beyond obvious.

    Look at SCRATCH, FLAME, BLENDER, NUKE... There are ways to hide the hard work of displaying the raw image from the interface itself. There are ways to mask slowness in the GUI by keeping it responsive and separate from the math. It's still like some single-threaded, 32-bit program. Renaming, for example. Moving files, for example. It's not just the displaying of the image that is slow with this software, and it's not just the GPU that will speed things up. Slowness rears its head in many ways, and regardless of the hardware you throw at it, you can't fix code.

    Take a look at LightSpace for Windows as a prime example. Phenomenal software, but it's cup-of-tea software. There is no rushing it, and there is no throwing sports-car money at it. It will chug along as fast as its little engine will allow, your go-faster stripes and slick wheels be gone.
  • photo by FA
    WPNL wrote:
    I still haven't heard of anyone on Lightroom who's optimistic about performance on a 4K or 5K display... Makes you wonder, doesn't it?
    You want a fast(er) system? Go back to Full HD.
    If you want a fast system at 4K or 5K, then you have to invest in faster hardware, but keep this in mind:
    from FHD to 4K = 4 times the pixels, but are GPUs/CPUs 4 times faster nowadays?


    OK, so we should accept that because Lightroom is slow, CO should also be slow? As I’ve said, this machine, and any other new iMac, is capable of editing videos without any problem, so a single RAW file is not a big deal.
    On the other hand, the properly optimized Affinity Photo works without any performance glitch when I apply similar adjustments to the exact same photo. AP is a single-RAW-file editing tool as well, without a DAM, and it doesn’t even use the GPU.

    So, unoptimized software runs slowly no matter what your hardware is.
  • WPNL
    We don't have to accept, I was just pointing out the "coincidence".
    I agree we could expect a faster UI/experience. I am just a hobbyist but I can imagine the frustration when working on large files. I could tell the difference when I went from 24MP to 42MP. Anyway, we can only wait and hope I'm afraid...
  • photo by FA
    WPNL wrote:
    We don't have to accept, I was just pointing out the "coincidence".
    I agree we could expect a faster UI/experience. I am just a hobbyist but I can imagine the frustration when working on large files. I could tell the difference when I went from 24MP to 42MP. Anyway, we can only wait and hope I'm afraid...

    Sorry, I misunderstood you; apologies for that.
    I think if we hope together, the gods of the software world will hear us mortals and somehow help optimize the code of the software we use 😊
  • SFA
    fatihayoglu wrote:
    WPNL wrote:
    I still haven't heard of anyone on Lightroom who's optimistic about performance on a 4K or 5K display... Makes you wonder, doesn't it?
    You want a fast(er) system? Go back to Full HD.
    If you want a fast system at 4K or 5K, then you have to invest in faster hardware, but keep this in mind:
    from FHD to 4K = 4 times the pixels, but are GPUs/CPUs 4 times faster nowadays?


    OK, so we should accept that because Lightroom is slow, CO should also be slow? As I’ve said, this machine, and any other new iMac, is capable of editing videos without any problem, so a single RAW file is not a big deal.
    On the other hand, the properly optimized Affinity Photo works without any performance glitch when I apply similar adjustments to the exact same photo. AP is a single-RAW-file editing tool as well, without a DAM, and it doesn’t even use the GPU.

    So, unoptimized software runs slowly no matter what your hardware is.


    But with Affinity you convert a RAW file to an internal file structure and then do the heavier work on the pre-converted file. With C1 you are always working with the source file if you want full functionality. (The option of using the preview file as the basis for editing in an offline catalogue is closer to what Affinity and many other applications do.)

    If, using Affinity, you wish to change the original RAW conversion parameters in some way you have to go back to the Develop mode and produce a revised conversion.

    I'm not criticising that approach. Everything seemed to work that way (and "destructively") until around the time Lightroom came along and, IIRC, Capture One appeared with a live tethering option offering direct access to RAW file conversion and editing in a studio. (There may be other applications that offered the same thing back then that I am not aware of, but that is not the point. It's a question of timing in the development timeline of digital cameras.)

    Another difference can be the way that a "stack" of adjustments is applied according to the design criteria of the application - but that's too big a subject to be discussed in this thread.


    Grant
