I recently discovered that Darktable (DT) is supported on Windows computers, so I'm trying to learn how to use it. The way it interfaces with the file system seems quite strange. I'm not yet at the point where I completely grasp the distinction between things like libraries, collections, film rolls, etc., but my attempts to organize the raw files are running into difficulties. A very basic problem is that on Windows the drive letters of secondary drives that get mounted and dismounted may change from one use to the next. Since I need to store raw files, as well as the files produced by developing them, on such drives, it looks like DT is NOT able to find them when the drive letter changes from one execution to another.
This is a pretty standard problem on Windows that can be dealt with fairly easily, as long as the program design provides some way of specifying the location of what DT may call collections via parameters/arguments that can be set or altered before invoking the program. This would seem to be a very common situation faced by lots of, possibly most, users, so I suspect DT does have a means to accommodate it. However, as a novice, I've now spent a good bit of time looking for such a means without any luck. Might there be a simple/straightforward solution to this problem that someone with more expertise can point me to?
There is a command line option to indicate a different library or configuration at the start, but the problem is that all file references in the library database are absolute paths.
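For example, assuming darktable is installed in its default location and the library sits on an external drive (both paths below are only examples, adjust them to your setup), the invocation looks something like this:

```
rem launch darktable against a library database on the external drive
rem (paths are placeholders for wherever your install and library actually live)
"C:\Program Files\darktable\bin\darktable.exe" --library "E:\photos\library.db"
```

The catch is the one mentioned above: the folder paths recorded inside that library are still absolute, so they keep whatever drive letter was current at import time.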
Besides the option indicated above by @g-man, you can try forcing Windows to always assign the same letter by giving your external drive a high letter (Z:) in Disk Management. That way, there is no chance that a random USB drive will take the letter assigned to your photo drive.
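If you prefer the command line, the same assignment can be done with diskpart from an elevated prompt; the volume number below is only an example, so check the output of list volume first:

```
diskpart
list volume
select volume 3
assign letter=Z
```

Once the drive reliably comes up as Z:, the absolute paths stored in the library keep working.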
Yes! That is the kind of thing I was hoping to find. I’d even done a lot of right clicking looking for something like that. I’m afraid the term “Search Filmroll” did NOT come across as what I was after.
I must admit I do NOT appreciate what the library database is accomplishing. At present I’m left to conclude that this is something more sophisticated photographers (e.g., pros with large volumes of raw files for different customers maybe) must like. However, I have also noticed that DT uses variables quite extensively. This suggests to me that some kind of common root path that serves as a prefix for lots of (possibly all) individual entries could be defined and then managed separately from that portion of the path that differs within the collections.
Having quite a bit of experience with Windows and lots of different software packages, I think I can fairly say that including the drive letter in a DB entry for each and every file is a bad idea.
Yes, I make pretty extensive use of that capability. However, that only works well for a specific drive. What I'd like to be able to do is have a bunch of different thumb/flash drives that contain raw files. As it turns out, I have received some other help figuring out how to use DT in a portable fashion, where I can have a self-contained installation on each flash drive. Compared to the amount of space needed for raw files and the resulting by-products, the space required for storing DT is pretty minimal.
This also helps with another problem all software (especially on Windows) has: compatibility issues as new releases come out. By putting a new release on a new drive (or even a new folder on an existing drive), there is NO risk of messing up past work you've already done in order to migrate to a new release of the software. That is probably the main reason I run just about all application software on Windows in portable fashion: the ability to easily revert to a prior release when the new one presents problems is preserved.
Even another command-line parameter, specified at startup, that simply gives the drive letter in effect for that execution would be helpful, but NOT without also removing the drive letter from the entries in the library database.
Portable definitely helps. I normally use the drive-assignment method; I do a combination of the two. I don't rely on the dt database, so I don't know how much friction that would add to the user experience. Another option is to use a symbolic link. It just takes a one-liner command.
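Roughly like this, from an elevated prompt (folder names are just examples): create a directory symlink at a fixed location that points at the photo folder on the removable drive, import into dt through that fixed path, and recreate the link if the drive ever shows up under a different letter.

```
rem remove the old link if the drive letter changed (this does not touch the files)
rmdir "C:\photos"
rem create a directory symlink at a fixed path pointing at the removable drive
mklink /D "C:\photos" "F:\photos"
```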
I think this is the opposite of how most commercial products work. A Lightroom library points to a specific path, and it has a similar feature to help Lightroom find the new path if you move folders around.
If you don't need the database DAM features, but use dt only for image editing, you can define your database to be :memory:. The side effect is that you always need to import your images again if you want to re-edit them.
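If you want to try it, it is the same --library switch as above, just with the special value :memory: (the darktable path is whatever it is on your install):

```
rem run darktable with a throwaway in-memory library database
"C:\Program Files\darktable\bin\darktable.exe" --library :memory:
```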
If by "commercial products" we mean software that costs money, I will have to concede that it has been a long time since I've used any, and even back in the prior century it would have been my employer who was paying for it. This has a good bit to do with my lack of interest in Lightroom or the various other products Adobe sells for photo processing.
I think my point comes down to the idea that drive letters are a Windows anomaly that has nothing to do with the file system and how it is structured for the purpose of storing the files in question. I'd argue that my files have NOT moved, and this would be better reflected in the library database if the mount point were handled separately. There should be no reason to change the database, which by the way I have located on the same flash drive as the raw files and the DT program, just because the mount point changed. I write some pretty simple scripts that can detect which drive the script is invoked from and pass that to any software that can make use of that information.
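For what it's worth, the kind of script I mean is nothing more than a launcher batch file kept on the flash drive itself; the folder layout below is just how I happen to have things arranged, not anything DT requires:

```
@echo off
rem %~d0 expands to the drive letter this script was started from,
rem so the same launcher works no matter which letter Windows assigns.
set "PHOTODRIVE=%~d0"
"%PHOTODRIVE%\darktable\bin\darktable.exe" ^
  --configdir "%PHOTODRIVE%\dt-config" ^
  --library "%PHOTODRIVE%\dt-config\library.db"
```

Of course, the paths already recorded inside library.db still carry whatever drive letter was current at import time, which is exactly the part I would like handled separately.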
With that said I also acknowledge that I need to better learn what DT is doing, in this case with the library database, before recommending design changes. The help provided herein is making that possible. Thanks for that.
I did receive some other help that referred me to those parameters. That was what allowed me to get everything (DT, library database, and raw files) installed on the same USB drive. As previously mentioned I don’t yet know what benefit the library database is trying to offer. I expect that to come with more overall knowledge of the specific capabilities of DT.
As it happens I do understand relational database design. Is it possible that a data model has been published somewhere for this database? Having access to that could accelerate the learning process for me.
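In the meantime, my understanding is that the library is an ordinary SQLite file, so presumably its schema can be dumped directly with the sqlite3 command-line tool. The table name below is only a guess on my part, so listing the tables first would be the place to start:

```
rem list the tables in the library, then dump the definition of one of them
sqlite3.exe "E:\dt-config\library.db" ".tables"
sqlite3.exe "E:\dt-config\library.db" ".schema film_rolls"
```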
Drives, partitions, and filesystems are only meaningful in relation to their operating system. This is just bad design on Windows' part. Your files have changed location when the drive letter changes, because the drive letter is part of the URI.
I'd argue that this would cause more confusion among Windows users, a group that, in general, doesn't seem to have a good grasp of how their computers work. It would increase the support load for us volunteers and increase stress on the project.
It seems like a much better idea to make sure your drive is mounted at the same letter. People have been doing that on Windows for a long time, the procedures are well understood and documented, and people can help themselves with this issue, which is not specific to darktable at all.
To be sure, I was NOT thinking of using DT as a viewer/organizer. However, I do expect it may take more than one session to develop some raw files. If everything needed to normally develop a raw file is retained in the sidecar (.xmp) file, then it sounds like the :memory: option could be worth trying. That is, there is nothing in the library database that affects raw file development. Yes?
:memory: just creates a temporary library for the files you work on in that session and then discards it at the end, so it's like setting up DT to be a pure editor… that is my understanding. I have used it that way most of the time… no issues.