
⚠️ Please note that this topic or post has been archived. The information contained here may no longer be accurate or up-to-date. ⚠️

243,000+ RAW files in a catalog?

32 comments

  • ChrisM
    [quote="Terence2" wrote:
    I've got a Lightroom catalog with about 243,000 files, mostly RAWs from Canon 5d2/5d3 and some coming from a 5DSR now. The files and catalog all reside on a 28TB 7200rpm 8-bay RAID tower with Thunderbolt 2. There's about 20TB of free space currently. My main computer is a Macbook Pro Retina 15" with 16GB of RAM and a 756GB SSD with about 300GB free.

    My current workflow is to shoot into C1 Sessions, edit, process to JPG/TIFF for clients, then import into LR catalog after job completion. I use LR for keyword and metadata input as it is much easier than C1. I also use LR for making portfolio edits (in Collections) across my archive for syndication and promos. Can C1 catalogs replace LR for me in this scenario? Will C1 have any performance problems with so many images? Is there a limit to how many images I can manage in C1 Catalogs?

    I have a catalogue roughly that size, with referenced raw image management. It works OK as far as browsing and editing go, although some functions tend to slow down, such as image zoom.
    But I stopped creating smart albums and the like, simply because for my purposes they take too long to load - around a minute and a half or so. After loading, it works OK.
    I stopped using C1 for smart albums and such, and switched to Adobe Bridge for that. For now it feels better to separate the raw editor and the image manager. This goes for both a MacBook Pro and a powerful Windows PC with 16GB of RAM.

    Chris
  • CAPTURE NIKON D700
    What is the benefit of going back and forth between two pro programs ❓
    Why are you using C1 8?
    Why are you using Adobe?
    Suffering and wasting time - for what?
    Is it because you need to write keywords to images and manage them, like Aperture did before?
    I have posted here before and keep telling Phase One that C1 8 is missing a good DAM function.
    Media Pro does not work well with C1 either, but they don't seem to care; some Phase One customers are still putting up with these issues...
    I am still hoping to get a complete solution from one place, like C1.
  • NNN635158767546269381
    Hello,

    My catalog is about 100K images (mostly Canon RAWs). It performs "OK". Browsing into a folder/collection is slower than in Lr, but the thumbnail browser and the RAW development are better/faster.

    However, importing a Lr catalog is dead slow... it took me one full day! Fortunately, you only do that once.

    DAM is still a bit better in Lr, but RAW development is really better in C1. Hoping that C1P will catch up in the upcoming release.

    Regards
  • Helmut Kaufmann
    When you look at the underlying database, it becomes clear why there are performance issues. As long as the database schemas are not adapted, I would doubt that we will see significant improvements. This is really a shame, as the RAW processing engine of CO is just great compared to the competitors... I've got a PhD in databases, but Phase One has not approached me with a support request yet 😎
  • Andriy.Okhrimets
    I do believe that all the database-related stuff comes from a product that Phase One bought from Microsoft a long time ago.
    Phase One's main priority, as I recall, was always image quality; DAM was and is just an added feature.
    I have around 100K RAW files, but I tend to create catalogs per client, which greatly simplifies my workflow.

    I switched a 50K Lightroom catalog on an iMac with just 8GB of RAM; it was a really slow process, and it took ages to open. But I was able to chop it up by client name using the excellent "export as catalog" feature, and now things run much more smoothly.
  • Terence Patrick
    Thank you all for validating my intuitions that C1 is simply not ready for primetime as a robust DAM solution. It's a shame, really. But it seems cataloging/DAM is either very difficult to pull off or the demand isn't that strong in the small business market. Enterprise options are way too pricey and overbuilt. Guess I'll be balancing between C1 and LR for the time being.
  • Eric Nepean
    I have a 2009 iMac with a Core i5 processor, a magnetic hard drive and 16GB of RAM. OpenCL hardware acceleration is set to Auto for both display and processing.

    With this setup, my catalog of 17,000 images is getting a bit slow. Response within individual projects and albums is OK, but clicking on "All Images" with the Viewer off results in a 60-80 second delay before C1 Pro responds to the mouse and is able to scroll. If I go do something else and then come back to "All Images", I get a similarly long delay.

    Looking at Activity Monitor, I would say that CPU usage and RAM usage are not the limiting factors, as both are well below maximum. More likely it is disk usage, as I can hear the disk drive rumbling incessantly while C1 Pro works, and it is reading up to 20MB per second.

    Possibly a newer iMac with better HW acceleration and an SSD would speed this up.

    One thing I do is keep all my incoming images in a separate catalog until I've finished the initial sorting and editing; I only put finished projects in the main catalog.
  • Andriy.Okhrimets
    Hi, there is a reason for your slowdown. For some reason Macs do not release memory promptly, which is why 8GB is really tough when it comes to sorting and culling images. But here is a small trick that will make your life much easier on 8GB: install Memory Monitor from the App Store. It is excellent and FREE; click "free memory" in it from time to time while working with C1. For me it does an amazing, almost magical thing, speeding up my iMac and making it a beast again. BTW, I do the same after closing C1, to free the memory C1 used.
  • harald_walker
    I'm not even close to 240K images in one catalog yet and am already bothered by the slowness, especially when trying to search or filter images.

    I have been considering moving the catalog files (which include the SQLite database file), with referenced images, onto a fast USB3 flash drive. Currently they are on the internal SSD of my Retina MacBook Pro, which is not the fastest. Has anyone done this already? I did the same with virtual machines and that works quite well.

    I wish they used something other than SQLite for the catalog database. I would be happy to run MySQL or Postgres locally, but there might also be ways to get more performance out of the current solution.
  • Andriy.Okhrimets
    Hi, moving the files to an external HDD will make your work even slower, because the internal SSD is much faster.
  • harald_walker
    [quote="Andriy.Okhrimets" wrote:
    Hi, moving files to external HDD will make you work even slower. Because internal SDD is much faster.


    Depends on what you move and how fast the external storage is. There are external storage solutions that are faster than an average internal SSD (e.g. an external Thunderbolt 2 SSD RAID).

    My referenced raw image files are on an external Thunderbolt RAID system. The Capture One catalog (which includes the previews and image metadata) is on the internal SSD. The slowness I mentioned (search, filter, ...) involves only the catalog data on the internal SSD.

    The main reason to move the raw files to an external system is, of course, that the internal SSD is simply too small for the image files.
  • Permanently deleted user
    Totally agree with Harald; I'm using the same setup here, keeping the catalog on the internal drive and the images referenced on a 4-bay direct-attached Thunderbolt 2 RAID.

    As Terence2 is using an 8-bay Thunderbolt 2 RAID, it will probably outperform the internal SSD in his MacBook Pro by far.

    I am using a 150GB catalog on the internal SSD and 1TB+ of referenced images (43K images) on the direct-attached Thunderbolt 2 RAID without any lack of performance at all.

    The only thing is that loading the catalog when starting C1 takes a while and could preferably be faster.
  • harald_walker
    [quote="mercator" wrote:
    When you look at the underlying database, it becomes clear why there are performance issues. As long as the database schemas are not adapted, I would doubt that we will see significant improvements.

    I don't have a PhD in that field, but I see what you mean (I just had a look at the underlying database for the first time) and agree.

    For instance, I have a catalog with only 32K images. Filtering all images of a year takes about 10 seconds before it shows any thumbs (= 2,000-4,000 images selected). Filtering on keywords is just as slow. In the database, keywords are stored in one column as a comma-separated string value.
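
    To make that concrete, here is a sketch with made-up table names (I have not checked what Capture One's actual schema calls these) of why the comma-separated design is slow and what a normalized layout would look like:

        -- Hypothetical illustration only; table and column names are invented.

        -- Denormalized: keywords packed into one comma-separated column.
        CREATE TABLE images_flat (id INTEGER PRIMARY KEY, keywords TEXT);

        -- Finding one keyword forces a full-table scan with string matching;
        -- no index can help a LIKE with a leading wildcard.
        SELECT id FROM images_flat
        WHERE ',' || keywords || ',' LIKE '%,sunset,%';

        -- Normalized alternative: one row per image/keyword pair.
        CREATE TABLE keywords (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
        CREATE TABLE image_keywords (
          image_id   INTEGER,
          keyword_id INTEGER,
          PRIMARY KEY (image_id, keyword_id)
        );
        CREATE INDEX idx_ik_keyword ON image_keywords(keyword_id);

        -- The same search becomes two indexed lookups.
        SELECT ik.image_id
        FROM keywords k
        JOIN image_keywords ik ON ik.keyword_id = k.id
        WHERE k.name = 'sunset';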
  • Helmut Kaufmann
    Both searching by year and searching by keyword - even with hierarchies - would be straightforward, even with the SQLite database that sits below. What the software seems to do is cache large parts of the whole result set, which uses a lot of memory. This would need to be done quite differently from how Lightroom or Aperture do it. I would assume they could speed up the most common queries with very little effort...
  • Helmut Kaufmann
    Just in case anyone is interested: setting the page size of the underlying database to the size of the individual blocks on disk speeds up access (pragma page_size=512; vacuum;). I have also set the cache size to 2GB (pragma default_cache_size=4194304;), which seems to give an additional boost, along with the creation of a number of obvious indices on attributes such as the image capture date and filename (which are used for sorting in the browser). It would be great if it were possible to set additional parameters, such as threads, but it seems CO does not allow this due to its SQLite compile-time parameters.
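
    Roughly, that session in the sqlite3 command-line shell looks like the following - always on a copy of the catalog, never the original. The file path and the index/table names are placeholders of mine, not Capture One's real ones; check the actual schema first:

        -- Open a COPY of the catalog's database file, e.g.:
        --   sqlite3 "MyCatalog copy.cocatalogdb"
        -- (the exact file name inside the catalog package may differ)

        pragma page_size=512;               -- match the disk block size
        vacuum;                             -- rebuilds the file; the new page size takes effect here
        pragma default_cache_size=4194304;  -- 4194304 pages x 512 bytes = 2GB cache

        -- Hypothetical indices; look up the real table/column names
        -- with ".schema" first, they will differ from these.
        CREATE INDEX IF NOT EXISTS idx_capture_date ON images(capture_date);
        CREATE INDEX IF NOT EXISTS idx_filename     ON images(filename);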
  • Andriy.Okhrimets
    mercator, thank you for posting that nice idea of tweaking SQLite parameters; I will try it next time I work on a big catalog. But I'd rather not experiment on my production archive copy 😊
  • Helmut Kaufmann
    😄
  • Eric Nepean
    [quote="mercator" wrote:
    Just in case anyone is interested. Setting the page size of the underlying database to the size of the individual blocks on disks speeds up the access (pragma page_size=512; vacuum;). I have also set the cache size to 2GB (pragma default_cache_size=4194304;), which seems to give an additional boost along with the creation of a number of obvious indices on attributes such as the image capture date and filename (which are used for sorting in the browser). it would be great if it were possible to set additional parameters, such as threads but it seems CO is not allowing this due to the SQLite compile time parameters.


    What SW tool did you use for this?
  • Helmut Kaufmann
    I have used sqlite itself: http://sqlite.org/download.html. There is another post by me in this forum that illustrates how to use it. Sorry that I am brief today, as I have only very little connectivity right now.
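
    In short, the first look is something like this (the path is only an example, and again: only ever on a copy):

        -- sqlite3 "backup-copy/MyCatalog.cocatalogdb"

        .tables                 -- list the tables in the catalog
        .schema                 -- dump the full schema, including existing indices

        -- Quick sanity checks that you are looking at a healthy file:
        pragma page_size;
        pragma integrity_check;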
  • NNN635158767546269381
    Hello,

    Capture One's DAM is indeed not very fast...

    I deleted my "Cache" folder (previews) and noticed that the software was faster afterwards.
    Managing thousands of small files (previews) is pretty challenging for a file system/disk. It might be a good idea to pack, say, 100 previews into one file; reading that file back would be fast. Just an idea.

    Hoping to see improvements in the upcoming version.

    Regards
  • Eric Nepean
    [quote="mercator" wrote:
    I have used sqlite http://sqlite.org/download.html. There is another post by me in this forum that illustrates how to use it. Sorry that i am brief today as i actually have only little connectivity.


    Thanks Mercator, I got it.

    It says something when users can come up with simple changes that speed up a product.
  • SFA
    [quote="Eric Nepean" wrote:
    [quote="mercator" wrote:
    I have used sqlite http://sqlite.org/download.html. There is another post by me in this forum that illustrates how to use it. Sorry that i am brief today as i actually have only little connectivity.


    Thanks Mercator, I got it.

    It says something when the users can provide simple changes to speed up a product.


    It does, but what?

    Everyone I know who has addressed this sort of issue for commercial business applications is always seeking ways to "speed up" or "handle larger volumes" - or both.

    Very often these twin objectives start out seemingly in conflict when combined. Or at least in conflict if critical transaction reliability and long-term supportability across many users and their requirements (often on a corporate network) are to be deliverable and then maintainable release after release.

    Over time, enhancements may be identified as the development tooling improves, or new ways are found to do things, or some older "needs" are redesigned in the application and replaced. All of which takes time, resources and, of course, funding.

    And testing. A lot of testing, ideally across multiple machine configurations.

    What works for some people may not work as well for others - which is partly why disk drive manufacturers, for example, build different designs for different purposes, and companies building mass storage systems take that into account when creating specific versions of products for specific markets. That's just one example of the issues that have been described to me over the years. There are many more.

    I'm guessing that anyone playing with the DB parameters likely knows all about that. However, there may well be many others here who don't know, and perhaps have no wish to know, beyond expecting such matters to be simple to deal with.

    They rarely are, based on what I have seen, if the whole product's functionality is considered.


    Grant
  • Eric Nepean
    [quote="SFA" wrote:
    [quote="Eric Nepean" wrote:
    [quote="mercator" wrote:
    I have used sqlite http://sqlite.org/download.html. There is another post by me in this forum that illustrates how to use it. Sorry that i am brief today as i actually have only little connectivity.


    Thanks Mercator, I got it.

    It says something when the users can provide simple changes to speed up a product.


    It does, but what?

    Everyone I know who has addressed this sort of issue for commercial business applications is always seeking way to "speed up" or "handle larger volumes" - or both.

    Very often these twin objectives start out as seemingly in conflict when combined. Or at least in conflict if critical transaction reliability and long term supportability across many users and their requirements (often on a corporate network) are to be deliverable and then maintainable release after release.

    Over time enhancement may be identified as the development tooling improves or new ways are found to do things or some older "needs" are redesigned in the application and replaced. All of which takes time, resource and, of course, funding.

    And testing. A lot of testing, ideally across multiple machine configurations.

    What works for some people may not work as well for others - which is partly why disk drive manufacturers, for example, build different designs for different purposes and companies building mass storage systems will take that into account when creating specific versions of products for specific markets. That's just one example of the issues that have been described to me over the years. There are many more.

    I'm guessing that anyone playing with the DB parameters likely knows all about that. However there may well be many others here who don't know and perhaps have no wish to know other for expecting such matters to be simple to deal with.

    They rarely are based on what I have seen if the whole product functionality is considered.

    Grant


    For me, the speed (actually the slowness) of C1 Pro is such that it really interferes with the usability of the tool - 90 seconds to respond to the first keystroke after opening a catalog, and much longer yet for the second, makes fast work impossible.

    About every other week I think about switching to a more responsive tool - it has taken much of the joy and creativity out of my photography, replaced by slogging through the tool.

    I'm not the only one facing this; I have even received comments from Phase One reps that this slowness may be a factor.

    I note that there are settings changes that can be made to improve speed - and yet there is no guidance. There is a wide range of performance and features across Macs - and yet there is no guidance.

    I note that people with database expertise have remarked that the database engineering is less sophisticated than C1P's competitors'. Aperture is over 10x faster with the same library of images on the same machine.

    To me, it says there is a lack of focus on and consideration for the user.
  • SFA
    Well, Eric, I can't really comment on Macs or catalogues.

    What I can say is that running even large sessions, with several open at a time, on my notebook is not slow and never has been, unless the machine itself is behaving strangely. Certainly nothing like as slow as you have described. The dedicated GPUs I have are not powerful and so are ignored; everything goes via the CPU and usually, but not exclusively, on an internal SSD.

    It may be that those factors help. I certainly don't see the point of having a very large catalog open just to edit a few new images when 99.9% of the images in the catalog will be untouched for extended periods. But that, of course, is my personal preference.

    But skipping past that... whether or not there are further balanced database tuning activities to be designed and implemented, managing the DB parameters oneself may lead to additional work down the line. At worst - being unsupported and indeed unknown to the developers and support teams - you are not guaranteed continuity for the changes you have made from release to release, and support activity may be made more difficult (to say the least) should anything go wrong.

    It might depend on what you do and how well you can interpret any subsequent challenges and correct them.

    Whether or not the way the C1 DB is implemented at this time makes sense for good performance across a range of possible hardware and software configurations (feelings may not be unanimous about that!), it seems evident that some users, for whatever reasons, are more affected than others. So things can be OK.

    In that situation it really ought to be better to seek a solution either by helping the developers identify the reasons, so that they can, where possible, adapt things and carry that forward for the benefit of all - or by finding ways to eliminate all or most of the problem operationally.

    That could entail all sorts of things, of course - I would rule nothing out, other than trying to avoid manipulating settings not made available by the application unless you are very sure about what you are doing AND are prepared to self-fix anything that may at some point go wrong.

    One of the interesting things I have discovered about database parameter design for a specified list of tasks is that if you ask 100 people with related knowledge for their opinion about a defined and documented operational requirement specification, you are likely to get 100 different approaches, of which a small number may broadly agree. Sadly, general agreement alone does not, in my experience, make effective development certain.

    When it comes down to it, it is your call. But given the timings you mentioned, I have to wonder whether tinkering with a few settings is going to make a dramatic difference. I guess all you can do is try (on a test file?) and see.


    Grant
  • Eric Nepean
    Hi Grant

    If I could find any way of improving the speed of handling images without tinkering with the database, that would be preferable. (I have no problem with the speed of editing images).

    It may be that there are some bits of HW that are important (your SSD is much faster than my magnetic HD) or that there is some part of the data in the database that is hard to manage.

    It looks like I'll be doing some experiments with a test catalog one way or another.

    Could you provide a few details about your system that works so well?

    You mention that you have an SSD, is this where you store your image files?

    What kind of processor and how much RAM do you have?

    It sounds like you have OpenCL and OpenGL turned off, but could you mention the type of graphics card anyway?

    Which operating system are you using?

    If your machine is a Mac, which one?

    Approximately how many images do you have in a catalog, are they RAWs or JPEGS, and from what kind of camera? From one camera or many cameras?

    Finally (sorry for the many questions), do you store the image files inside the catalog (managed) or outside (referenced), and do you use many keywords? If so, are they arranged flat or in a hierarchical (tree) scheme?
  • SFA
    [quote="Eric Nepean" wrote:
    Hi Grant

    If I could find any way of improving the speed of handling images without tinkering with the database, that would be preferable. (I have no problem with the speed of editing images).

    It may be that there are some bits of HW that are important (your SSD is much faster than my magnetic HD) or that there is some part of the data in the database that is hard to manage.

    It looks like I'll be doing some experiments with a test catalog one way or another.

    Could you provide a few details about your system that works so well?

    You mention that you have an SSD, is this where you store your image files?

    What kind of processor and how much RAM do you have?

    It sounds like you have OpenCL and OpenGL turned off, but could you mention the type of graphics card anyway?

    Which operating system are you using?

    If your machine is a Mac, which one?

    Approximately how many images do you have in a catalog, are they RAWs or JPEGS, and from what kind of camera? From one camera or many cameras?

    Finally (sorry for the many questions) do you store the image files inside the catalog (managed) or outside (referenced), and do you use many keywords, if so they are arranged flat or in a hierarchical (tree) scheme?


    Hi Eric,

    I had probably better first stress, as before, that I use sessions and Windows 7, so we are not talking like for like here, and nor would I expect you to see the same start-up performance with a large catalog as I see with sessions of several thousand images. However, the point is that others posting in the forum here using Macs also seem to get better performance - though obviously not everyone.

    To set the scene for my equipment - after all, at least the processors, SSDs and memory are likely to be fairly similar specs between Mac and PC of the same age...

    It's a Dell Precision M4700, about 2.5 years in my possession, so it would have been however close it got to being "state of the art" about 3 years ago. As this is the premium business end of the Dell product range, I assume the internal components are also likely to be high specification in terms of bus speeds and so on, in order to make best use of the headline components. I suspect that makes a big difference. Given the price differentials through the various model ranges, there had better be a difference! 😉

    Intel i7 3820QM CPU @2.70 GHz

    24GB RAM, though it came with 8GB and performance seemed fine; the extra mostly allows me to run multiple applications easily.

    Built-in Intel graphics and a low-end NVIDIA Quadro K1000M GPU - neither of which is powerful enough to be used by C1 for processing.

    The original drive is a Samsung PM830 SSD, nominally 512GB but in effect about 480GB.

    This is an older-technology SSD and not up to the speed of current products.

    I added a Samsung 840 EVO mSATA SSD of nominally 1TB in a spare comms-card slot in the machine. The slot is restricted to a lower SATA speed than the card can manage - about half speed - but in practice I doubt it is really noticeable. Right now both disks are about 90% full; I need to do some further tidying across the data for all the applications I use!

    I also have a pair of 4TB external drives (USB3), which are intended to end up as mirrors of each other (more or less), with one destined for offsite backup. Running a large session from the externals is a little slower but not bad, unless the disks have dropped into ECO mode, at which point they can take a while to wake up and Windows seems to get a little confused about things for a while.

    The theory was to put all the images (and the session files, in my case) on the larger second SSD, keeping the original one free (sort of) for programs and OS stuff. In practice I still have sessions on both drives, and both work equally well, even at probably no more than half the potential performance obtainable from more recent SSD designs in the larger disk sizes.

    My sessions typically range in size from about 18GB to 40GB, sometimes a little larger. That may include selected output files processed to several sizes. For big shoots I may be importing anything up to 3,000 images a day for 3 days - perhaps close to 50GB, sometimes over 60GB. (My older "workhorse" camera creates sensibly small file sizes!)

    Maybe, if time allows and curiosity gets the better of me over the winter, I might try throwing everything into a test catalogue to see what happens.

    I would expect a Mac with a similar configuration to be at least as good for performance, and I anticipate that many would expect it to be better than a Windows 7 PC on a like-for-like basis. The trouble is, it's not really something that is easy to test reliably, even if one had both machines side by side.

    What I do know, based on business application testing with large data sets, is that there is a point at which DB performance can drop off quite dramatically, depending on how the DB settings have been defined compared to anticipated typical usage. A broad package addressing a broad market is likely to have some compromises built in (as do the performance parameters of disk drives, for example).

    If the application or the users have the luxury of being able to tune to a precise requirement that rarely varies, much can be gained when seeking, for example, absolute speed of processing or, alternatively, handling very large data volumes. Getting both does seem to be somewhat trickier to balance out.

    I think there are possibly some easily gained performance benefits from opening sessions rather than catalogues. There is no reason I can come up with for not using both approaches in combination when better efficiency can be obtained.

    HTH.


    Grant

    ETA: I forgot to say that I would always keep the master files as referenced images. This is, sort of, the natural way for a session anyway, and it allows me to work on the same file with different applications should I choose to do so.

    I tried Lightroom at the version 1 launch, and one of the things I was uncomfortable with was the need to use the catalogue for everything (as it was then).

    However, for best performance I would expect to have the files on the same disk, or at least an internal disk.

    I can keep session sets together easily and move them about as complete entities. That way, copying off an internal disk to external storage is not a problem, and the session is still accessible for occasional re-visits. Or I can copy back to internal storage if I choose to do some more extensive work on the session's images.

    Importing a session into a catalogue would offer further benefits, especially if the session was not an entirely stand-alone body of work.

    My use of keywords is fairly recent. Typically I shoot events, and although general keywording and location metadata are easy to identify and apply, getting down to the details of the content - names, descriptions, sub-events, etc. - is a lot of work just to obtain the information in usable form. Sometimes doing it is viable, and hierarchical structures can speed things up (although they can also confuse things at times).

    With different shoot types and subject matter, using keywords as an organisational tool would likely repay the effort of setting it up far more than I think I can get from most events.
  • Eric Nepean
    Hi Grant

    Wow - what a great insight into your way of working. Thanks.

    Here is a very short answer as I have to run; work and family will keep me fully booked over the next 3 days.

    At first glance the big differences that I see from you to me are:
      SSD vs Magnetic Hard drive
      Windows vs OSX
      Sessions vs Catalog


    (My setup is OS X 10.9 (same vintage as Win 7), a 2.66GHz i5 iMac about 5 years old, 16GB RAM, a 1TB magnetic drive (no SSD), and 1 catalog with about 18,000 referenced images (about 150GB).)

    This weekend I will try changing these factors to see if I can find one that creates a major improvement.
  • SFA
    Hi Eric,

    This was the first time with an SSD for me. I was just astounded at the performance when it was new and an empty drive. Even now it's a joy to use, though MS seems to have run some Win 7 updates that do strange things randomly - like Explorer stops working. Kill it and restart and it is fine. Strange.

    With that sort of strangeness, and of course the anti-virus stuff going through periods of wanting to restart every day, rebooting becomes a regular event. It usually takes seconds (post-Windows-Update somewhat longer, of course) and becomes almost a pleasure rather than a pain to be put off and put off.

    18k files in a catalog does not sound like enough to cause the sort of slowness you have described - not unless there is some other major event occurring each time you start up C1. Backup or something? 180k - well, maybe ....

    The C1 log file and maybe running the Mac performance stuff might give you some idea about what is going on and when under the hood.

    Bear in mind that the overall hardware configuration may have a big role to play for you. Older hardware MAY not be able to make full use of the performance of newer SSD drives, for example, but should still show improvements if you can spring for one. Likewise, my understanding is that Intel processors have quite a performance jump between the i5 and i7, and with each new release within the families. My processor is now about 3 generations behind current, and there have been a couple of generations of enhanced bus technology too, which makes my device quite "old school" these days.

    And, of course, I have mostly avoided camera purchases or upgrades that result in huge files. At the volumes I have often shot over a day or weekend it would start to get expensive trying to keep up with newly introduced high capacity and high speed memory cards to cope with the larger files and disk capacity to store them!


    Enjoy your experiments. I would recommend trying a large session at some time to give a point of reference.


    Grant
  • harald_walker
    I tried some of the mentioned SQLite changes but didn't measure any improvement.

    But last weekend I found a way to significantly increase the speed in my use case. I am searching for old images by year and keyword (I have to re-edit hundreds of old photos), and that was unbearably slow; the combination with the year filter was obviously the main problem. Instead, I have now simply created albums containing all the images of a year. From a database point of view that is obviously much more efficient, since it is a fixed relationship. With the filter, it took up to 10 seconds to select all images of a year in a catalog of ca. 40K, and then another 6-7 seconds to show the matches for a keyword. With the album per year, that is down to ca. 2 seconds, and searching by keyword within this selection is also done within 1-2 seconds.
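
    For what it's worth, my mental model of the difference, sketched with invented table names (not C1's real schema):

        -- The year filter has to compute the year for every row, so an index
        -- on the capture date cannot be used (the predicate is not sargable):
        SELECT id FROM images
        WHERE strftime('%Y', capture_date) = '2012';

        -- An album is a stored, fixed relationship: membership is just an
        -- indexed join, with no per-row computation.
        CREATE TABLE album_images (
          album_id INTEGER,
          image_id INTEGER,
          PRIMARY KEY (album_id, image_id)
        );
        SELECT image_id FROM album_images WHERE album_id = 42;

        -- (A range predicate would let a date index work as well:
        --  WHERE capture_date >= '2012-01-01' AND capture_date < '2013-01-01')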
  • Jonathan Knights
    [quote="Terence2" wrote:
    I've got a Lightroom catalog with about 243,000 files, mostly RAWs from Canon 5d2/5d3 and some coming from a 5DSR now. The files and catalog all reside on a 28TB 7200rpm 8-bay RAID tower with Thunderbolt 2. There's about 20TB of free space currently. My main computer is a Macbook Pro Retina 15" with 16GB of RAM and a 756GB SSD with about 300GB free.

    My current workflow is to shoot into C1 Sessions, edit, process to JPG/TIFF for clients, then import into LR catalog after job completion. I use LR for keyword and metadata input as it is much easier than C1. I also use LR for making portfolio edits (in Collections) across my archive for syndication and promos. Can C1 catalogs replace LR for me in this scenario? Will C1 have any performance problems with so many images? Is there a limit to how many images I can manage in C1 Catalogs?


    I use a MacPro with 2x 3GHz processors (8 cores), 16GB RAM and loads of disk, external and internal.
    I have tried to import all my RAW images (120K) into a C1P catalog in versions 7 and 8 without any success. I can do a one-step import into LR in 24 hours by pointing it at the top directory and telling it to catalog everything, including sub-directories!

    I have recently been testing the process again, and given the way the C1P catalog import works, it seems best to do the import in sections, directory by directory through the hierarchy, so that sets of 5,000-10,000 images are imported in each step. In this way I recently managed to get all my images into a single C1P catalog.
    Hope this works for you.

    That said, I have Media Pro, which I use for my cataloguing, but I don't find it as easy to use as LR. However, C1P wins on image quality!
