Mac - Copying a Filled Layer from portrait to landscape image results in a non-scaled mask being applied
Implemented
Update 20th February:
This bug was fixed in 16.3.6.
See the release notes here.
Update 25th November:
There are specific cases in which this behavior still occurs when copying a filled mask between images from different camera models.
Our team is back on the case, and working on a new fix. Apologies for any inconvenience. Follow this post for the latest updates.
_________________
Update 19th September:
We've received reports from several users about an ongoing issue with copying masks that is still present in the latest version, Capture One 16.2.4. We want to assure you that we are actively working on a solution.
Keep an eye out for the upcoming updates!
------------------------------------
Result:
Screenshot provided by Gauthier Mignot.
#193473
#197909
-
Wait! There’s more ;) Now radial masks sometimes invert. It's a nasty glitch, as you often won't see it in the thumbnails, and it persists all the way through export. You need to toggle the mask on and off, and usually it returns to normal. I've only noticed it in a few cases.
Also, in the past two releases, AI masking is completely broken. It just gives me weird pixelated jittery noise.
So there’s deeper trouble and something is very broken with masks lately. Basically since they introduced AI support.
-
Have you looked at release notes of newer versions?
-
Walter Rowe, please be more specific.
-
Helen, why is this bug marked as "Implemented" again?
-
In fairness, I'm on 16.3.7.10 and I haven't encountered this bug since updating to this version. That's not to say that it's definitely gone for good, but it's a lot more usable now.
It's a pity that I had to basically give up on 16.2 to get here, especially given the new upgrade/pricing model, but on the upside, the AI masking has been fantastic!
-
Yes, and it's a pity for them too: I stopped upgrading Capture One. And I was with this software from the beginning. I'll upgrade when forced to do it.
-
I always used the latest version. In the past year, I haven't done that. It's the same problem I reported 7 months ago. The issue occurs when copying settings from different cameras. For example, now the problem only occurs when I copy from a landscape photo taken with the Z7II to a portrait photo taken with the Z6II. I haven't been as bothered by this bug because I started editing some of my jobs in Lightroom again. I find it incredible that the testing department doesn't find these bugs and that they aren't resolved in a timely manner. I am going to cancel my subscription because the upgrade is useless. I haven't even been interested in the new features introduced because at the end of the subscription year I plan to return to the latest version of my perpetual licence.
-
@sergiu, you have an issue when you copy a landscape mask from a photo shot with a Z7II to a portrait orientation photo shot with a Z6II - and think Capture One (or any company, really) should have tested for this? Seriously? Think for a second just how many permutations of cameras there are out there. Now double that. That’s the number of test cases they’d have to run through for a scenario like this. No update would ever be pushed. I think it’s both interesting and tragic you found the needle in the haystack. Report it and move on. The indignation is almost funny.
-
I disagree. Copying masks worked for years between all permutations of camera combinations and does so in other software. It is simply not an option to have basic functionality break in a software that is catering to a professional market.
-
Conrad von Schubert: 1) You don't know that every permutation worked, only that the permutations you used worked. 2) It's impossible at any reasonable scale to test every permutation. I get being frustrated that a permutation you or I use doesn't work, but falling back on belittling people because they were, to paraphrase, too stupid, too lazy, or too incompetent to test just screams either a gross lack of understanding of the QA process or entitlement.
-
Some software testers create automated test cases to verify the correct functionality of software and identify bugs. While it's understandable not to invest time and money in writing automated tests for new bugs, this issue has persisted for a long time and is impacting cameras currently in use. The Nikon Z6ii and Z7ii models are still on sale, making it crucial to address these problems promptly.
By running multiple virtual machines, you could potentially complete all tests on the currently selling models within a day (worst-case scenario).
So, Brian Jordan, "Incompetence" is an appropriate word to describe this situation. It may not be the developers at fault but rather the lead developer or the management, who prioritizes new features over fixing bugs. This is a well-known issue in the software industry.
This particular problem has been ongoing for nearly a year and should have been addressed a year ago.
-
Sergiu Bacioiu No hill is difficult for a man who's never climbed.
I want you to just think for a minute....how many cameras can you name? Now start a list - pair camera 1 with every other camera. Now pair camera 2 with every other camera. And so on... Even with the relatively few cameras you can name off the top of your head, the list quickly becomes huge. Now just think of writing a script or testing protocol to test Every Digital Camera with Every Other Digital Camera. It's economically impossible.
-
Brian Jordan have you ever written any code or had any involvement in the software industry?
-
Sergiu Bacioiu 20+ years. Have you? Ever written a QA script? I have. Spent maybe 5 years with Rose QA software. What's your background? If I'm mathing right, given 1000 cameras (and I'm confident there are more), there are almost 500,000 unique combinations of cameras. Now add that just to test this you need to test portrait to landscape from A to B and from B to A, then you need to test the reverse, so you've quadrupled your test count. How do you justify the staffing to write a QA sequence with half a million nodes AND the staffing to QA that script? How long does that take? How much does that cost? Where do you find customers willing to foot the bill? Heck, the script to write the script would be awesome! Permutations are a nightmare and, at some point, you have to call it a day. Not everything can be tested.
Given your challenge to my credentials, I'm guessing you have written some code? Show me your testing protocol and I'll give you a buck for every item you tested against, but then you give me 10 for every item I can come up with that you missed. Unless you're NASA, I'll come out ahead.
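The pair-counting argument above can be checked in a few lines. A quick sketch (the figure of 1000 camera models is the commenter's assumption, not a real count):

```python
from math import comb

# Back-of-the-envelope sizing of the test matrix described above.
cameras = 1000
pairs = comb(cameras, 2)   # unordered pairs of distinct cameras
print(pairs)               # 499500 -- "almost 500,000"

# Each pair would need copying A->B and B->A, each in both
# portrait->landscape and landscape->portrait, quadrupling the count.
cases = pairs * 4
print(cases)               # 1998000
```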
-
Let's not test all the cameras that have been available so far. Instead, if we consider the approximately 200 camera models currently on sale, we would have 400 files—both landscape and portrait. I believe these files already exist in the company's database.
Now, let's adjust the curves. For the image layer, set the input to 0 and the output to 255. For the new filled layer, set the input to 255 and the output to 0. This will result in a black image. When you copy and apply these settings to the rest of the files, you will clearly see any issues. Here is an example.
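The effect of those two curves can be modeled in a few lines (a rough sketch with illustrative pixel values, not Capture One's actual curve math): the base curve pushes every pixel to white, the filled layer's curve turns everything under its mask black, and any region the copied mask fails to cover stays white and is immediately visible.

```python
# One row of pixels: 255 = white, 0 = black.
width = 8

# A mask that was not rescaled on copy, leaving the last pixels uncovered.
mask = [True] * 5 + [False] * 3

# Base layer curve (input 0 -> output 255): everything goes white.
base = [255] * width

# Filled layer curve (input 255 -> output 0): masked pixels go black.
result = [0 if covered else px for covered, px in zip(mask, base)]
print(result)  # [0, 0, 0, 0, 0, 255, 255, 255] -- the white gap exposes the bug
```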
Let's take each case one by one. For the first image in the set, apply the settings mentioned above, then copy these settings to all images and export them to a folder named after the camera model. Reset all images and repeat this process with the next image. In the end, we will have 400 folders containing 400 images each. With a total of 160,000 images, even a human reviewer, once the images are sorted by orientation, can easily spot problems with a simple scroll. I estimate that running the script won't take more than 48 hours, and human verification will take a few additional hours.
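The human-verification pass at the end could itself be partly automated. A minimal sketch, assuming the exports have already been decoded into grayscale pixel buffers (the folder and file names here are hypothetical, and Capture One exposes no scripting API for driving the export step itself):

```python
def is_black(pixels, tolerance=8):
    """True if every grayscale value is near 0, i.e. the inverted-curve
    protocol described above produced the expected all-black export."""
    return all(p <= tolerance for p in pixels)

def find_failures(decoded):
    """decoded maps 'CameraModel/file.jpg' -> flat grayscale pixel list.
    Returns the exports that are not black and need human review."""
    return sorted(name for name, px in decoded.items() if not is_black(px))

# Hypothetical stand-ins for two decoded exports:
exports = {
    "Z7II/portrait_copy.jpg": [0, 2, 1, 0],       # mask applied correctly
    "Z6II/portrait_copy.jpg": [0, 0, 130, 255],   # non-scaled mask leaks white
}
print(find_failures(exports))  # ['Z6II/portrait_copy.jpg']
```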
Let's set that aside for now and consider my case. In my previous posts, it was reported that these cameras were causing issues. Even the cameras that have already been reported are not being tested. What is your opinion on this?
-
Why would we do this? Seriously. 200 cameras? Where do you get the files? Do you keep 200 cameras on hand at all times? That's a cool half million to sit around. Likely not, so you somehow need to solicit those photos from somewhere. Which cameras do we pick? The 200 top selling? The 200 most frequently used across the user base? How do we track those stats? Whose job is that? Who chases new files when a new camera pushes into the top 200? What do we tell the guy who has a problem but his camera is number 201?
Do we only test the top 200 cameras against the other top 200 cameras? What about firmware? Will a firmware update affect this at all? MAAAYBE???... So we probably need to test that. Or do we just ignore it? Latest firmware only. Someone made that call, so let's hope it's right. But then someone has to keep track of firmware updates across 200 cameras and make sure we have test files from all the latest firmware pushes on all the top 200 cameras. Whew... there's one full-time employee!
Now, where does this rank in all the testing? Surely somewhere below "does the software open on all the supported OS/versions?", but where exactly? What else will we be testing? Oh... one of those other tests necessitates a change in code that may or may not impact this, so we gotta test all this again... so there's another full round of someone scrolling through 160,000 images. When's the last time you did that? Mind numbing, huh? How many times can you do that and be accurate? How many times before your eyes bleed?
But, assuming an automated test suite, no one is actually looking at this anyway. You program the tests into software built to automate this, have the "bots" look at things and kick out any anomalies, and only those get looked at by a person. But, again, there was that change that necessitates some segment of the overall test suite be run again.
So we start from the top and run things until we decide we have run all the things that should/could have been impacted because we're really, really pushing up against a release date and running the FULL SUITE of test protocols will cost us time we really don't have. So where does it all end?
I'll tell you where it ends. Across the countless permutations of x and y and z on this OS on that version, some percentage of errors will kick up. We have tested to a point where we are within a given reliability window where none of them should be show-stoppers. 99%? 99.9%? 99.999%? Every decimal adds cost and time, so you have to pick a number. Beyond that, you deal with it. You stand up a support desk and you, sadly, turn your users into a bunch of little guinea pigs. You hope and pray and burn sage and whatever you do, hoping that no show-stoppers crop up from that .01% or .001%, and you address the issues that arise that slipped through the testing you can do. Then you track those issues. Anywhere you see hotspots, you add them into your standard test suite so those areas get tested more thoroughly, but even so, you can never test to 100% certainty.
I get that you're frustrated. I never would have spoken up without what I felt was an attack on people and their professionalism. QA is hard. There are practically an infinite number of permutations to test for even the smallest of changes, and QA is always seen as an "expense". Revenue departments get love and funding; expense departments get cuts. That's the way it is. I don't know these people at all, but I feel confident in saying they are neither lazy nor incompetent.
I think we generally look at things from the micro-focus. I'm a user and I want MY issues fixed. THEY ARE THE MOST IMPORTANT. Seriously. Just ask me. But flip the coin so that you're running the business, and the bottom line is that one user's issues copying masks made on images from this camera to images made on that camera are not top-line priorities. "Users" having issues copying masks, sure. But likely not you specifically. As a user, that's a tough pill to swallow.
I don't know what their QA looks like. Have no idea. Could I do better given my experience? Likely no. Probably not. Maybe. But what I can do is report problems that I have and give specific detail so the issue can be researched. If I have enough issues, I may have to reevaluate the relationship. If I have an issue and it's not getting addressed, I'll certainly be angry. Might even take to the boards and fuss about it. Might have to reconsider that relationship more urgently. But that still doesn't warrant a personal attack on people, in my opinion.
-
Why didn’t you explain from the beginning that something that worked flawlessly before might never work again, and even if it works now, it might not work in the next iteration? This isn’t just my issue; I know other photographers with the same problem using different cameras, and you can find similar complaints from photographers online. Now that I understand the issue won’t be fixed, I can make an informed business decision. Thank you.
-
I downloaded RAW files for 13 camera models, both landscape and portrait. I copied the settings from the landscape files of 5 cameras to the portrait files of the 13 cameras. You can see the results below. Already, there are 65 combinations that do not work, and I didn’t want to go further because it seemed pointless.
Please note that these are random files found on the first page of Google when I searched for RAW files. The failure rate is 100%.
-
What are you talking about? I don't work for C1. I don't speak for them. I'm just another user who took issue with you launching a personal attack against people and decided to call you out over it. I'm not here to explain anything to you. Heck, I'm here because I'm bored right now and have nothing better to do. 10 days from now I'm off again and y'all won't see me around here for at least 3 months.
Also, to your downloaded RAW files, sounds like a you problem. Working fine for me. Have you updated?
-
I understand that you’re defending the profession and that it’s not always possible to test every scenario. However, when you randomly select files from the internet and encounter the same issue with all of them, I believe there’s a problem that needs to be addressed.
-
"What are you talking about? I don't work for C1. I don't speak for them."
The use of “we” in the previous post indicates that you are working for the company.
-
Brian Jordan First of all, I wouldn’t add new questions in an edited post after a reply. Secondly, I’m not crazy enough to run these tests without using the latest version. I’ve even tried it on multiple computer configurations. Is 16.4.3.15 not the latest version?
-
@Sergiu, I don’t see where I can help you. I thought you were a jerk. I called that out. Then I continued the conversation as long as I thought there was value. That time’s passed.
(And I’m not a mod. That’s a bug.)
-
I realize this may not add much value to our discussion and perhaps I come across as a jerk, but I’ve heard the same complaint from other photographers. This remains an unaddressed issue that continues to cause problems. I am still able to reproduce the bug that I reported to the support team seven months ago.
-
How is this still a problem a year later? Ridiculousness or incompetence?
-
Is this funny or not?
-
Hello all,
This issue is marked as fixed in Capture One 16.3.6 and onwards.
If you can reproduce it in a later version, please file a bug report here:
https://support.captureone.com/
Please follow the instructions so logs and screen recordings are included. Thank you.
-
Hello, I’m getting that issue on 16.4.something (I have automatic updates turned on), just submitted a report on 8/6.
-
Hey Brian Jordan, I was 100% sure you were a moderator! Anyway, it looks like I was able to remove the mod badge. Feel free to ping me if it comes back.
As for this post: please remember that one of our community rules is to treat people with respect. You are one of our most active community members, and we appreciate your involvement, but please avoid name-calling and similar behavior in the future.
thank you!