Feature request - GPU Acceleration for preview generation

Comments

33 comments

  • GrahamB3
    TechGage tested some aspects of GPU performance and C1 back in January.

    https://techgage.com/article/a-look-at- ... rformance/
  • AGLyons
    I have two cards in my laptop, an iGPU from Intel and an Nvidia Quadro M1000. C1 insists on using the Intel iGPU rather than the more powerful Nvidia card and there is no way to tell C1 which card to use in this scenario.

    The comments about GPUs being used more during rendering make sense when you think about it: creating thumbnails is basically just rendering out a small JPEG for every shot in the session/catalog.

    Another thing I'd like to see improved is overall responsiveness. I'm running the laptop off a Samsung 970 Pro M.2 drive - a fast NVMe drive. I'd really like to see C1 improve the preview loading speed, especially when the preview resolution is set to a measly 1920px. The fuzzy shot always loads first, and you have to wait 1-2 seconds, sometimes longer, for the sharper preview to show. When you are going through shots quickly you never get to see the sharper shot unless you slow down considerably.
  • WPNL
    ... Especially because there seems to be software that CAN present fast previews for culling.
    I'm using FastRawViewer in parallel with C1 and have set XMP and trash/select folders in sync, so I can do my rating and deleting in FRV and then edit in C1.
  • SFA
    AGLyons wrote:
    I have two cards in my laptop, an iGPU from Intel and an Nvidia Quadro M1000. C1 insists on using the Intel iGPU rather than the more powerful Nvidia card and there is no way to tell C1 which card to use in this scenario.

    The comments about GPUs being used more during rendering make sense when you think about it: creating thumbnails is basically just rendering out a small JPEG for every shot in the session/catalog.

    Another thing I'd like to see improved is overall responsiveness. I'm running the laptop off a Samsung 970 Pro M.2 drive - a fast NVMe drive. I'd really like to see C1 improve the preview loading speed, especially when the preview resolution is set to a measly 1920px. The fuzzy shot always loads first, and you have to wait 1-2 seconds, sometimes longer, for the sharper preview to show. When you are going through shots quickly you never get to see the sharper shot unless you slow down considerably.


    Do you mean a Quadro 1000M? Or maybe a K1000M?

    If so they are not very powerful. I have a K1000M in my 6 year old laptop and it just about qualifies to be used in processing for C1 for some small parts of the processing chain. The rest is allocated to the CPU (Intel i7 3820QM).

    What is set where will depend on many factors, but whether or not my system is using the Quadro makes little difference unless I am batch processing. But then, this is a 6-year-old laptop.

    The main drive is a Samsung SSD of its era and not an NVMe drive, but then the hardware probably does not support NVMe anyway. Programs and catalogs are on the C: SSD, but working data (sessions for me) will usually be on an internal 1 TB M.2 SSD running in a comms slot that reduces the transfer speed to a maximum of SATA 2 - limited to about half the potential transfer speed of the drive at SATA 3 performance.

    HOWEVER, it's difficult to notice a difference, simply because the speeds the drive specs advertise are very rarely going to be achievable in real-world use, for a number of reasons. Too many to go into here, and I'm not the person qualified to do it anyway. Just be aware that the "speed" quoted will likely be for very specific data transfers using hardware that can fully utilise the available transaction options.

    In effect, whether it is worth splitting the processing between the CPU and the GPU(s) depends very much on what calculations are being performed and whether the work of splitting the file's data and then recombining it (and timing the split processes so that recombination doesn't waste time waiting for some part to finish) is worth the effort. It may not produce any saving at all, depending on content.

    The NVIDIA control software allows you to specify that certain applications should use a specified GPU. However, it does not guarantee that the GPU will be used or that it will enhance performance.

    You should check the C1 Log file for the startup GPU assessment if you have not already done so.

    Also, the Nvidia software includes a GPU activity monitor (or at least mine does) that does not indicate performance but does identify which programs are supposedly prepared to use the GPU. In my case it shows one whose icon I don't recognise (it might be the control application itself), C1, and a business application that only uses graphics for input screen rendering as far as I know.

    I am, of course, assuming that you have the preference setting for Hardware Acceleration turned on.

    HTH.

    Grant
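    To check the startup GPU assessment mentioned above, a short script like this can pull the relevant lines out of the log. Note that both the log path and the exact keywords Capture One writes are assumptions here - inspect your own install to confirm them:

    ```python
    from pathlib import Path

    def gpu_log_lines(log_path):
        """Return log lines that look like GPU/OpenCL startup messages.

        The keywords are guesses at what Capture One writes to its log;
        adjust them after looking at a real log file.
        """
        keywords = ("opencl", "gpu", "hardware acceleration")
        hits = []
        for line in Path(log_path).read_text(errors="ignore").splitlines():
            if any(k in line.lower() for k in keywords):
                hits.append(line)
        return hits

    # Hypothetical log location -- verify on your own machine:
    # for line in gpu_log_lines(r"C:\Users\me\AppData\Local\CaptureOne\Log.txt"):
    #     print(line)
    ```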
  • JT Pennington
    AGLyons wrote:
    Going through generating previews for 1200 images. 6 core CPU is pegged and GPU(s) are sitting idle doing nothing.

    Please take advantage of modern hardware to speed these procedures up. It's done in the video world why can't we do this in the photo world too?


    This worked previously. I'm not sure, but I think there may be a regression of some kind in the 12.x series. I no longer get the message of 'Setting up hardware acceleration' when I start Capture One.

    I'm working with support on the issue. We've gone through the steps of completely disabling and removing OpenCL from Capture One by deleting the actual OpenCL binary that Capture One creates and uses, and then enabling it again so the binary gets rebuilt. We have confirmed that OpenCL is functioning, at least in the on-screen rendering.

    I'm using an overclocked i7-7700K (4 cores, 8 threads @ 5 GHz) alongside an RTX 2070, and when I import or export images I'm seeing the CPU pegged and no more than 10% utilization on the GPU. To make sure it's not a disk I/O issue, I'm running Capture One and rendering images from a RamDrive.
    It's possible that I'm being throttled by my CPU, but I don't really think that's the case.

    Again, I'm not 100% sure this is a Capture One issue, but hopefully we'll be able to track it down.
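    One way to confirm a GPU utilization figure like that ~10% is to poll `nvidia-smi` while an export runs. A rough sketch - it assumes the NVIDIA driver tools are on the PATH; the query flags are standard `nvidia-smi` options:

    ```python
    import subprocess
    import time

    def parse_utilization(output):
        """Parse nvidia-smi CSV output (one number per line) into a list
        of ints, one per installed GPU."""
        return [int(v) for v in output.split()]

    def gpu_utilization():
        """Query current GPU utilization (%) for every NVIDIA GPU."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return parse_utilization(out)

    def poll(seconds=30, interval=1.0):
        """Print utilization once per interval; run this during an export."""
        for _ in range(int(seconds / interval)):
            print(gpu_utilization())
            time.sleep(interval)
    ```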
  • C-M-B
    NNN636228568741858362 wrote:

    This worked previously. I'm not sure, but I think there may be a regression of some kind in the 12.x series. I no longer get the message 'Setting up hardware acceleration' when I start Capture One.

    I'm working with support on the issue. We've gone through the steps of completely disabling and removing OpenCL from Capture One by deleting the actual OpenCL binary that Capture One creates and uses, and then enabling it again so the binary gets rebuilt. We have confirmed that OpenCL is functioning, at least in the on-screen rendering.

    I'm using an overclocked i7-7700K (4 cores, 8 threads @ 5 GHz) alongside an RTX 2070, and when I import or export images I'm seeing the CPU pegged and no more than 10% utilization on the GPU. To make sure it's not a disk I/O issue, I'm running Capture One and rendering images from a RamDrive.
    It's possible that I'm being throttled by my CPU, but I don't really think that's the case.

    Again, I'm not 100% sure this is a Capture One issue, but hopefully we'll be able to track it down.


    Have you checked whether these imports are actually fully using the bandwidth of your SSD/HDD? Because in my experience they are very slow due to the small(ish) file sizes and random read/write actions. Not only are the files copied/imported, the Capture One files (including the settings, previews and other data) are also written at the same time into the CaptureOne folder of the import folder.
    That can't be done by the GPU; it has to be coordinated by the CPU.
    So the GPU is only "allowed" to get to each file at a very slow pace (since the other data and file info has to be fetched and written to your disk at the same time as well).
  • JT Pennington
    C-M-B wrote:

    Have you checked whether these imports are actually fully using the bandwidth of your SSD/HDD? Because in my experience they are very slow due to the small(ish) file sizes and random read/write actions. Not only are the files copied/imported, the Capture One files (including the settings, previews and other data) are also written at the same time into the CaptureOne folder of the import folder.
    That can't be done by the GPU; it has to be coordinated by the CPU.
    So the GPU is only "allowed" to get to each file at a very slow pace (since the other data and file info has to be fetched and written to your disk at the same time as well).


    Read my post again. I'm using a RamDisk. I'm not using an SSD or an HDD for Capture One.
    I'm literally using the fastest-access memory that exists in a computer. There should be ZERO I/O bandwidth issues.
    The Capture One binaries, the photos, and the Capture One catalog all exist 100% in RAM.

    Here's a real-world benchmark of my RamDisk.
    https://i.imgur.com/8IysNTo.png
  • C-M-B
    My apologies.

    For me OpenCL works (1070) - "setting up hardware acceleration" should only show up when you change something (new driver, new Capture One version, ...) and even then you won't necessarily see the activity window that shows the hardware acceleration setup.

    Back to your issue: With no base for comparison it's hard to say what's at fault or what's to be blamed here. Perhaps you could make a video showing how slow your image rendering is.

    I happen to have the same CPU - though I'm running undervolted at stock clock speed (I like a silent system) - and my GPU is previous gen and I'm "only" using a pretty decent SSD (Samsung 960 PRO) so I can at least offer a base for comparison.

    I'll do a quick reboot and then I'll open a session with about 2,200 images (50MP, compressed RAW), and I'm going to tell you how long that takes. Then you can tell me how many images you're opening and how long that takes.
  • C-M-B
    It took a little less than 20 seconds to open the session and load and render the previews for 2,200 files (50MP compressed RAW, 42-55 MB each).

    So that's fetching the previews for 100 GB of data within 20 seconds, i.e. about 110 images/s.

    Of course CPU usage was very high, so that's definitely your bottleneck as well.
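    Those figures are easy to sanity-check, and they also hint at what C1 is actually reading: pulling all 100 GB of raw data in 20 seconds would need 5 GB/s, beyond what a 960 PRO sustains, so the load must mostly be touching the much smaller preview/thumbnail files rather than the raws:

    ```python
    images = 2200
    seconds = 20
    data_gb = 100

    print(images / seconds)          # 110.0 images per second
    print(data_gb * 1000 / seconds)  # 5000.0 MB/s if the full raws were read
    ```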

    I don't think RAM disk is able to really speed anything up for you.

    I think the best improvement PhaseOne could (and should) do would be to store the preview files in the session/catalog folder and only (re)load the images when you change something or when you "tell" C1 to empty the cache.
  • SFA
    C-M-B wrote:

    I think the best improvement PhaseOne could (and should) do would be to store the preview files in the session/catalog folder and only (re)load the images when you change something or when you "tell" C1 to empty the cache.


    Previews are stored in the session/catalog.

    I suppose, given unlimited RAM allocation for the entire system, all previews could be held in memory at all times - but while we are a lot closer to that sort of architectural capability than we once were, the reality is that current underlying operating system designs would not support it at scale.

    The nearest one could get would be to leave the system running and the catalogue(s) or session(s) open. Even then, for hibernation purposes (for example), the main disk read/write performance would need to be taken into account on resuming. There might be a case for choosing a specific type of disk (SSD or spinner) for such a purpose, just as would be the situation in a server farm.

    That said, extracting optimum performance might be rather expensive in hardware terms and still not without limitations - and impossible to control as a software supplier. Maybe there should be a Phase-branded "recommended specification" Mac and PC product for which some sort of performance guarantee could be offered. Everything else could then be left to the users to decide how to progress with their non-branded random configurations.

    Just a few thoughts to consider.


    Grant
  • C-M-B
    Where exactly are they stored?
    And why does it still load each individual image every time?
  • SFA
    C-M-B wrote:
    Where exactly are they stored?
    And why does it still load each individual image every time?


    In a session (I don't really use catalogues) there are files stored in a sub-folder (CaptureOne) of the image's location folder. Two subfolders, in fact, under the CaptureOne folder.

    One is called Settingsxxx, where xxx identifies the C1 version number to which they relate. The Settings folder contains the edit information file(s) for each image.

    The other folder is the Cache folder, where one finds the subfolders for Previews (Proxies) and Thumbnail files.

    The preview files have the extension .cop and may also have a .cof file associated with them (Focus Mask).

    In general the files in the Previews folder will be created at the point of import and recreated only if they are specifically regenerated at some point - either because the regenerate process has been run or because the Cache folder has been deleted by the user for some reason.

    When one opens a session, C1 will populate its working memory with the thumbnails (for the browser) and the preview files (for the viewer) and apply adjustments from the files in the Settings folder in order to display the latest version of the image. Or versions, if one has multiple variants of an image or images.
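    The layout described above can be checked with a short script that walks a session's image folder. The folder and extension names follow the description in this thread; treat them as observations to verify against your own session, not an official spec:

    ```python
    from pathlib import Path

    def summarize_session(image_folder):
        """Count edit-settings, preview (.cop) and focus-mask (.cof) files
        under the CaptureOne sub-folder of a session image folder."""
        c1 = Path(image_folder) / "CaptureOne"
        return {
            # Settings folder carries a version suffix, e.g. Settings120
            "settings": sum(1 for p in c1.glob("Settings*/**/*") if p.is_file()),
            "previews": len(list(c1.glob("Cache/**/*.cop"))),
            "focus_masks": len(list(c1.glob("Cache/**/*.cof"))),
        }
    ```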

    Exactly what processing needs to be applied may also depend on the currently active settings in C1 - Proof Profiling, for example - but there are, potentially, a number of current process settings that could require some adjustment to whatever content exists in a previously saved preview file to make it display as desired for the current user interface values. This will be done in memory as far as possible, but temporary files will be saved to the local disk as usual - hence the apparent disk activity.

    I have never seriously compared how the process works when using a catalogue, but I think it is much the same, other than the cache and settings files being contained within the catalogue folder structure even when the source file folders are external to the catalogue.

    I doubt that this outline tells the full story but I think it is a reasonable guide to the way things work. If anyone knows differently I would be delighted to learn more.


    Grant
  • C-M-B
    Well, when I contacted support they told me that preview files are _not_ stored and that C1 needs to regenerate them every time you start it - but I guess you are correct! My mistake!

    However, in that case the way these files are loaded when you start Capture One is incredibly stupid and takes way too long for what they are. The session file really only acts as a browser that remembers where you left off last time - instead it should at least (!) keep a fully indexed and cached overview of the files in your "CAPTURE" folder instantly available.
    Right now my session files are all about 14 MB even with 2,200 files in them, and I'm sure you'll agree that they could easily be 200+ MB if they were an actual database.
  • SFA
    C-M-B wrote:
    Well, when I contacted support they told me that preview files are _not_ stored and that C1 needs to regenerate them every time you start it - but I guess you are correct! My mistake!

    However, in that case the way these files are loaded when you start Capture One is incredibly stupid and takes way too long for what they are. The session file really only acts as a browser that remembers where you left off last time - instead it should at least (!) keep a fully indexed and cached overview of the files in your "CAPTURE" folder instantly available.
    Right now my session files are all about 14 MB even with 2,200 files in them, and I'm sure you'll agree that they could easily be 200+ MB if they were an actual database.


    Actually I think the concept works well as far as sessions are concerned. Especially since there is no requirement for a session to have anything in the Capture folder or indeed any associated folders (favourites) at all.

    Moreover, there is no need to assume that an image is related to a single session. For practical purposes, on my 6-year-old notebook a relatively large session (a few thousand images) will be fully presented from start-up in less than a minute, and if I have time to dive in and start to work before memory has been fully populated, I can.

    What you suggest makes sense for a catalog, since import is a requirement, but the size issue will crop up, and in any case it would be unlikely to be viable to keep everything in memory. The nearest you can get to that is to never (or rarely) turn off the machine or, perhaps, save a few seconds by using Hibernate.

    Databases, generally, have to compromise between number of records and speed of interaction. Personally, I don't see the point of tuning the DAM of an EDITOR - mainly intended for interactive editing of a single image for most of its operational time - to be super-fast at finding and listing a lot of images, the majority of which are not very relevant to "today's" editing task.

    With Capture One there is absolutely no need to add a few shots from today's new shoot or the most recent vacation trip - the images one is most likely to be working on in the foreseeable future - to a collection of tens of thousands of historic images that have not been touched (in terms of editing) for years and likely never will be.

    It's a bit like buying a house under the flight path of an airport because it will be convenient to catch a flight once every few years and then complaining about the noise.

    I guess Phase could always introduce a Server Grade version and gain speed that way but I doubt many here would be excited about the underlying hardware and system software costs involved.

    Just my opinions of course. YMMV.


    Grant
  • C-M-B
    I didn't mean they should dump the concept - I actually like the fact that the files and adjustments are not stored in a database.

    _BUT_ they could simply do both. Have a real session file with the previews stored inside to instantly display everything - and when you edit an image, store the settings just like they're doing right now inside the CaptureOne folder.

    The problem is that when you're working on sessions on a NAS or anything that has a slower read/write/access rate, it gets to a point where it's nearly impossible to use.
    When I open a session with 1-2k images on my SSD it reads about 110 images per second, so that takes 10-20 seconds to fully load and display.
    On a NAS it reads about 5 images per second, so that would take 3-6 MINUTES.

    That is ridiculous. That is abysmal. That is embarrassing.

    I'd just prefer it if the files were kind of "indexed" with the thumbnails in the session file - so when you open the sessions file it'll display everything without any kind of delay. Keep the settings and the previews inside the CaptureOne folder by all means, so nothing gets lost if the Sessions file gets deleted.

    But that effing wait for a session to open via NAS is just stupid.

    And what's the purpose of creating a session for each shoot or project if the session file serves no real purpose to display the actual photos?
  • SFA
    C-M-B wrote:
    I didn't mean they should dump the concept - I actually like the fact that the files and adjustments are not stored in a database.

    _BUT_ they could simply do both. Have a real session file with the previews stored inside to instantly display everything - and when you edit an image, store the settings just like they're doing right now inside the CaptureOne folder.

    The problem is that when you're working on sessions on a NAS or anything that has a slower read/write/access rate, it gets to a point where it's nearly impossible to use.
    When I open a session with 1-2k images on my SSD it reads about 110 images per second, so that takes 10-20 seconds to fully load and display.
    On a NAS it reads about 5 images per second, so that would take 3-6 MINUTES.

    That is ridiculous. That is abysmal. That is embarrassing.

    I'd just prefer it if the files were kind of "indexed" with the thumbnails in the session file - so when you open the sessions file it'll display everything without any kind of delay. Keep the settings and the previews inside the CaptureOne folder by all means, so nothing gets lost if the Sessions file gets deleted.

    But that effing wait for a session to open via NAS is just stupid.

    And what's the purpose of creating a session for each shoot or project if the session file serves no real purpose to display the actual photos?


    Yep, a NAS is slow in my experience. But then it is slow loading data directly from a USB3 disk as well - or at least mine is.

    Now I will admit that the drives - recommended for NAS use - are only 5400rpm but it's a NAS with a single device communicating with it most of the time so that really should not be an issue. But it is.

    If I am doing some casual work on a session that I have already archived to the NAS I can live with the performance once it is loaded.

    Usually.

    But if I'm still doing heavy editing, I keep the session on an internal drive (or sometimes an external USB3 drive, though I find they are compromised to some extent) and then I don't have a problem - unless I have allowed free disk capacity to drop extremely low OR I am running multiple applications that have between them used most of the system memory. Or, sometimes, both.

    But I think that in part those are exactly the issues that Phase have addressed by pre-loading the previews into memory.

    An alternative, for an apparently quick load, is to load only on demand, using a very small selection at first open, and then hide all subsequent on-demand loads behind some apparent on-screen activity.

    Since we see, from time to time, comments about C1 being slower than brand X, and then others saying C1 is faster than brand X, I can only assume that the factors involved - including the design concepts behind the architecture decisions - are either producing random apparent performance, machine by machine and day by day, for some reason, OR each application has different strengths and weaknesses in the way it is intended to work, and different workflows induce different perceptions about performance among the users. Or something along those lines.


    Grant
  • C-M-B
    Okay, let's just take Adobe Bridge as an example. When I go to a folder on my NAS that I have never opened before with AB, it instantly displays ALL images and then takes only a few seconds to render the thumbnails.
    And if I click on one it instantly displays a preview.

    So obviously my NAS is fast enough and it is possible to work on a NAS. Heck even with Photoshop I can open multiple large files (1GB-3GB), edit them and save them via NAS rather quickly.

    I just don't think there's an excuse for a program to take that long just to display files, especially when programs that have never even accessed the same files are able to display and render them in just a fraction of the same time.

    I like Capture One but I'm not blind.
  • SFA
    C-M-B wrote:
    Okay, let's just take Adobe Bridge as an example. When I go to a folder on my NAS that I have never opened before with AB, it instantly displays ALL images and then takes only a few seconds to render the thumbnails.
    And if I click on one it instantly displays a preview.

    So obviously my NAS is fast enough and it is possible to work on a NAS. Heck even with Photoshop I can open multiple large files (1GB-3GB), edit them and save them via NAS rather quickly.

    I just don't think there's an excuse for a program to take that long just to display files, especially when programs that have never even accessed the same files are able to display and render them in just a fraction of the same time.

    I like Capture One but I'm not blind.


    I've never used AB so I can't comment on that, but I do recall working with early LR V1, comparing it to the editor I favoured back then, and wondering why LR seemed relatively fast and the other editor relatively slow on screen at the time.

    The answer appeared to be the way the process was presented.

    LR would apparently show instant results, but if one looked carefully, after the initial screen update there were still changes rippling through the screen, and - on the machine I was using back then, probably mostly for jpg editing at the time - about 3 seconds elapsed before the change-and-save process completed. That was using the local C: drive.

    Using what came to be my preferred editor and applying as close as I could get to the same adjustments, the screen showed no changes until all of the adjustments were written at the end of processing in one update - which took about 3 seconds. So far as I could measure, and for all practical purposes, there was no difference in processing time to the point where my next instruction was accepted, but LR looked faster in that iteration. I assume that AB, PS and other members of the family are likely to use the same design philosophy - possibly even the same code.

    For both of those applications one had to be sure to save the file, of course. With C1 almost all edits are automatically saved as part of the process, the exception being things like Keystone correction, where the probable processing overhead and the nature of the change make it logical to work with a preview file (although 4k and 5k size previews probably negate the benefit) to make the visible overview adjustment and then apply the change when happy with it, rather than make a lot of micro changes automatically while achieving the desired alignment.

    The challenge, of course, is that external USB drives, and even NAS boxes whatever the chosen spec of the drives in them, tend to deliver much better data transfer rates for large files than small ones; the trade-off is the balance between how much data one needs to move between disk and memory, and back to disk, to record the changes.

    Writing to the Temp files on the local disk is likely to be equally fast (for practical purpose of human perception) in both cases. Updating the edits on an external drive in near real time rather than when instructed may give a different perception and is, perhaps, a process with broader considerations than simply perceived elapsed time.

    And, of course, there are many other factors that might or might not be in play when using an external drive - especially a NAS, which is likely to have a lot of internal file management going on to compete with external data delivery requests.


    Grant
  • C-M-B
    You still don't understand the issue.

    The issue is that when you open a session it starts to populate the window very slowly with images. 🙄

    I'm pretty sure it's not outlandish to ask for those files to remain in the session window and to not always start with an empty window all over again.

    What I would like to have is a session file with a kind of index of the last opened folder with all the photos and thumbnails already included, that would make opening and browsing through a session a breeze.

    But right now the sessions file is completely useless as everything (and I do mean everything) is stored in other files anyway.
  • SFA
    C-M-B wrote:

    What I would like to have is a session file with a kind of index of the last opened folder with all the photos and thumbnails already included, that would make opening and browsing through a session a breeze.

    But right now the sessions file is completely useless as everything (and I do mean everything) is stored in other files anyway.


    It would still have to load into memory (assuming the session was not already running) which is what it is doing.

    Very similar to a hibernating computer (re-loading its last working state from a "disk image") compared to a total re-boot or standby mode - which is reliant on using power so OK for a desktop but not quite as reliably useful for the intended effect if using a notebook.

    On my system - usually, but with some slight dependence on what I was doing last - what you describe as your 'would like to have' is pretty much what I get more often than not. To some extent it is likely to depend on the last action I took before closing the session.

    FWIW I nearly always close the session using the "Close Window" option.

    I have no real tested indication about whether that makes a difference but I use it because I often have multiple session open and usually wish to close them one at a time.


    Grant
  • C-M-B
    Yes. It would. But that would not take as much time (or disk access) as loading every single file/thumbnail.

    If you open a database (= session) it would be very stupid to first open the database and then fetch every individual value (= image/thumbnail) one after the other. It would make more sense to open the whole database at once.

    If the session file contains the thumbnails and an index of the session files, it will be bigger.
    Maybe 300MB (in reality it would be less, because it doesn't need to load all the preview files at once; it just needs to display the files and their thumbnails).
    That would take about 3 seconds to open over a 1Gb/s NAS, and a 5400 rpm HDD would be even faster - instead of 6 minutes.
    On an SSD, even an external USB3 SSD, it would be less than 1 second instead of 20 seconds.

    Right now it's a very slow, cumbersome and resource-intensive process, and absolutely unnecessary.
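    To make the "open the whole database at once" idea concrete, here is a minimal sketch of a hypothetical one-read session index: the file list and all thumbnail bytes packed into a single file, so reopening the browser is one sequential read instead of thousands of per-file fetches. The format, function names and layout are entirely made up for illustration - this is not how Capture One actually stores sessions.

```python
import json
import struct
from pathlib import Path

def write_session_index(index_path, thumbs):
    """Pack a file list plus thumbnail bytes into one file, written
    once when the session closes. `thumbs` maps filename -> thumbnail bytes."""
    entries = []
    blob = bytearray()
    for name, data in thumbs.items():
        # Each index entry records where this thumbnail lives in the blob.
        entries.append({"file": name, "offset": len(blob), "size": len(data)})
        blob.extend(data)
    header = json.dumps(entries).encode("utf-8")
    with open(index_path, "wb") as f:
        f.write(struct.pack("<I", len(header)))  # 4-byte header-length prefix
        f.write(header)
        f.write(blob)

def read_session_index(index_path):
    """One sequential read recovers the whole browser state:
    the list of files and every thumbnail."""
    raw = Path(index_path).read_bytes()
    (hlen,) = struct.unpack_from("<I", raw, 0)
    entries = json.loads(raw[4:4 + hlen].decode("utf-8"))
    base = 4 + hlen
    return {e["file"]: raw[base + e["offset"]: base + e["offset"] + e["size"]]
            for e in entries}
```

    The point of the sketch is the access pattern, not the format: a single sequential read is what makes the NAS and HDD numbers above plausible, since per-file round trips are what kill performance on network and spinning storage.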
  • SFA
    And would that include presenting the data in the file as an image?

    When you make a change to an image do you have to write the entire 300MB (or whatever) file back to disk for real time updating or just parts of it in something similar to single image updates?

    Having loaded a preview record for viewing, let's say you decide to edit it according to the latest system settings. How long would be acceptable to wait for the image to be re-presented? After editing, how long would be acceptable to update the revised preview once you have submitted it to be saved?

    Let's take an example.

    I sometime use Faststone, notably for quick viewing of processed jpgs since I don't want to read them in C1.

    But lets use it, as I do on the road if I have not travelled with the notebook and C1, as a RAW presenter.

    So I open a folder containing about 1700 RAW files (that it supports). Not big files by modern standards but good enough for this purpose. The folder is on an internal 1Tb SSD that is SATA3 capable but happens to be running in a SATA 2 slot - so half speed, but in effect that makes very little to no difference in perceived performance.

    It takes about 10 to 15 seconds to present me with a screen and a thumbnail strip of the first 10 images, plus the first image presented in full screen (more or less), that being an extraction of the embedded jpg from the RAW file. If I scroll to the next image the full screen takes about half to a full second to display. But that's just the embedded jpg.

    I can apply some edits to the image a tool at a time, making the adjustments and deciding whether I want to keep them. If I do keep them, the change seems to be written to a work file and I can move to another tool.

    Once I have finished all of my changes I am asked if I wish to save the file. If I choose to do so the application takes a few seconds to create a jpg with the applied changes.

    The settings suggest that the current values are set for intermediate performance. I can slow things down by changing some settings related, for example, to colour management, or speed things up by opting to work with small or lower resolution files.

    Although the application has some edit capabilities there seems to be nothing related to saving edits for RAW files for future work - other than saving the results as a new file, defaulting to jpg.

    It is meant to be a Viewer with some one-off editing capabilities that are somewhat similar to basic photo/graphics edits so this is no surprise and not a criticism.

    It does appear to be quite fast as a rapid displayer of quite a large number of files, BUT if editing is to be performed, the work involved using the tools will likely result in a slower overall experience come what may. (Ignore the lack of a saved edit file for the RAW and re-use; that is not what this post is attempting to illustrate.)

    Now, if you have an example of a program that is both fast for loading and display AND for the entire editing experience, I would be interested to experience it. So far I have not found a combination that always stands out from the crowd in every aspect, nor have I read any suggestions about such a development in this forum or from any other sources.

    If such a thing exists - great! Please advise where to find it - especially if it can run on some sort of PC.

    Thanks in advance.


    Grant

  • C-M-B
    Writing 300MB (only when closing!) would not be a big deal, and in ANY case that would be very fast.

    So I can't really take that as an excuse, sorry. And as I said, 300MB was just an example; in reality it would be way less, considering that the file list/index is basically text saying "THAT FILE EXISTED HERE THE LAST TIME YOU CLOSED THIS SESSION, SO WE'LL SHOW IT TO YOU". And thumbnails hardly take any space at all.

    I just checked a session with 2.2k images. Thumbnail size: 22MB.


    So really, my estimated 300MB was exaggerated - off by a factor of more than 10 😉

    I think writing 22 - or maybe 25 (with the index) - MB to the disk after you close a session isn't too much to ask for. Far more disk activity happens when you open a session and it's being populated.
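    The 22MB figure quoted above works out to roughly 10KB per thumbnail for 2,200 images. A back-of-envelope estimate of the proposed index file's size, with an assumed (not measured) per-file text record of a couple of hundred bytes, can be sketched as:

```python
def estimated_index_mb(num_images, thumb_kb=10, entry_bytes=200):
    """Rough size of a session index: one small text entry plus one
    thumbnail per image. thumb_kb=10 matches the ~22MB / 2,200 images
    observed above; entry_bytes is an assumed per-file record size."""
    total_bytes = num_images * (thumb_kb * 1024 + entry_bytes)
    return total_bytes / (1024 * 1024)
```

    For 2,200 images this lands at about 22MB, consistent with the measurement in the comment: the index text is noise next to the thumbnails themselves.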
  • SFA
    [quote="C-M-B" wrote:
    Writing 300MB (only when closing!) would not be a big deal, and in ANY case that would be very fast.

    So I can't really take that as an excuse, sorry. And as I said, 300MB was just an example; in reality it would be way less, considering that the file list/index is basically text saying "THAT FILE EXISTED HERE THE LAST TIME YOU CLOSED THIS SESSION, SO WE'LL SHOW IT TO YOU". And thumbnails hardly take any space at all.

    I just checked a session with 2.2k images. Thumbnail size: 22MB.


    So really, my estimated 300MB was exaggerated - off by a factor of more than 10 😉

    I think writing 22 - or maybe 25 (with the index) - MB to the disk after you close a session isn't too much to ask for. Far more disk activity happens when you open a session and it's being populated.


    Thumbnails, on my system and for my files, are about 20 to 30 KB each. So on my system the folder size for something over 1700 images is about 35MB, or just over 38MB "on disk".

    The Preview files for Focus mapping in the Proxies folder are about double that, and the entire size with the relevant Preview files is 1.16GB for about 3600 files. My resolution setting is 1920 pixels, as per my screen size.

    The file or files would only still be in memory for any given session if you have an enormous capacity of non-volatile memory (i.e. duplicated disk usage), which, for archived records, would seem excessive.

    However, you have reminded me of another application I used some years ago, developed by a team who had found renown at the time creating high-speed database methods. Using temporary file storage on the C drive, it kept a predefined, allocated section of disk (about 5GB back then as I recall - OK for jpgs but not much capacity for RAW files) to retain the files that had previously been worked upon. This was basically a session workflow, without the management aspects for the grouping of files by the application.

    As the storage filled up, they used an algorithm to decide which 'old' file in the temp folder was the best one to overwrite the next time space was needed.

    The result was a fully used 5GB of disk at all times, and the potential for a perceived slower response when the temp folder needed to be tidied up and files deleted before the next file could be created. It was cleverly done, but it still had the potential to appear slow - compared to earlier in the same edit session, for example - if it had decided to remove files one was just getting back to working with.
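    The overwrite-the-oldest scheme described above is essentially a least-recently-used cache with a byte budget. A minimal sketch, under the assumption that "best one to overwrite" means "least recently touched" (the real application's algorithm is not known):

```python
from collections import OrderedDict

class ScratchFolderCache:
    """Fixed-size scratch area: a capped byte budget, with the least
    recently used file evicted to make room for the next one."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.files = OrderedDict()  # name -> size, least recently used first
        self.used = 0

    def touch(self, name):
        # Re-opening a file marks it as recently used.
        if name in self.files:
            self.files.move_to_end(name)

    def add(self, name, size):
        # Evict the oldest files until the new one fits.
        while self.used + size > self.capacity and self.files:
            _, old_size = self.files.popitem(last=False)
            self.used -= old_size
        self.files[name] = size
        self.used += size
```

    The perceived slowdown SFA describes maps onto the eviction loop: once the budget is full, every new file pays for deletions first, and a file you were "just getting back to" may be exactly the one that was evicted.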

    The nature of a session is a bit variable for retaining the sort of file you suggested. It makes more sense for a catalogue, and indeed that is more or less what a catalogue has. But then there are still the potential differences between the current editing settings and the previous ones, and the potential load-time delay of having the catalogue folder structure on an external drive or NAS. And one is still dealing with data files, not rendered display images; if Proofing or displaying watermarks, the calculations and settings are not saved anyway, since they are likely to be highly variable for many users.

    Your best bet, as mentioned before, is to not close the edit activity unless you have to, and if you really want to shut down, use hibernate mode - which will save the current state of play to disk, much as you are suggesting.


    Grant
  • C-M-B
    Grant, please read carefully what I want.

    I'm not asking for all the previews to be loaded at once. I just want all the photos to show up in the browser, so when you're looking for one image you won't have to wait for 6 minutes until the browser view is populated.

    That only requires thumbnails and a list of the images that were there last time.
  • SFA
    [quote="C-M-B" wrote:
    Grant, please read carefully what I want.

    I'm not asking for all the previews to be loaded at once. I just want all the photos to show up in the browser, so when you're looking for one image you won't have to wait for 6 minutes until the browser view is populated.

    That only requires thumbnails and a list of the images that were there last time.


    I understand what you seem to want, but it seems rather specific if it goes beyond what you should be seeing anyway.

    There are probably a couple of thousand alternative opinions about what should be displayed. And how it should be displayed.

    Is it worth re-writing all the code around a different concept just for that? Is it even feasible?

    How would you want to deal with the situations when, for some reason, you did not want to see by default what you were last working on, and doing so automatically would be wasting your time?


    Grant
  • C-M-B
    Yes.
  • C-M-B
    Look, I'm not trying to be annoying, but just picture this:

    I have selected images from the session with the 2,200 photos and given them stars and color codes to easily find them. I filter the session to only display the green ones. There are 11 of them.

    I close the session.

    I open the session again and have to wait 20 seconds for it to slowly display 11 images. 🙄


    That's just ridiculous.
  • SFA
    20 seconds?

    Filtering is similar to using a dynamic smart album - though I think a smart album might shave a few seconds from the start-up time.

    As an alternative, select all of the images you want to return to just before you elect to close C1 and create a regular (indexed) album rather than a dynamic album.

    Close the session/catalogue.

    When you next open it, it should start from where you last left it - with the newly created, directly linked and indexed album of the specific images you want to see on restart.

    Once you have finessed the process you should be able to save a few seconds every time you open a C1 session or catalogue.

    You will have saved developers, testers and documenters untold hours writing code that tries to second-guess what a user wishes to do when they open ... something. For some purpose.


    HTH.

    Grant
  • C-M-B
    [quote="SFA" wrote:
    20 seconds?

    Filtering is similar to using a dynamic smart album - though I think a smart album might shave a few seconds from the start-up time.

    As an alternative, select all of the images you want to return to just before you elect to close C1 and create a regular (indexed) album rather than a dynamic album.

    Close the session/catalogue.

    When you next open it, it should start from where you last left it - with the newly created, directly linked and indexed album of the specific images you want to see on restart.

    Once you have finessed the process you should be able to save a few seconds every time you open a C1 session or catalogue.

    You will have saved developers, testers and documenters untold hours writing code that tries to second-guess what a user wishes to do when they open ... something. For some purpose.


    HTH.

    Grant


    Dude, I'm not going to create an album for every kind of image selection for every session.
