My goal here is to read from one directory and write out to another. The database is running version 12.0.1 build 3592. I think I have already addressed any permissions issues, as I can successfully read from and write to both folders. The source folder table contains 94,000+ entries of various folders, subfolders, and files.

I am trying to use a procedure that takes a documentID, reads the path from one of our link tables, pulls the contents of the file with that same path, and inserts those contents with a new name into the destination table, writing the file to the folder.

The first issue that I ran into is that using "... where file_name = XYZ" to compare the filename to a string (file path) crashes the database. I have not checked whether this is addressed in an EBF, so I am using a LIKE comparison instead. When I run the select statement with the LIKE comparison, it returns the results instantly:

    select first '87test.jpg' as file_name,
           SourceFile.contents as contents
      from ClientData as SourceFile
     where SourceFile.File_Name like 'Long Client Name\2012\Classification\Long Address\Images\testpic.jpg';

But when I use the same select with an insert (either with auto name or with the columns specified), it does not finish; I have let it run for 20 minutes without a result. I added the FIRST to the select to try to make it faster, but it does not seem to help.
    insert into clientdataland (file_name, contents)
    select first '87test.jpg' as file_name,
           SourceFile.contents as contents
      from ClientData as SourceFile
     where SourceFile.File_Name like 'Long Client Name\2012\Classification\Long Address\Images\testpic.jpg';

This statement runs quickly, but I assume that is because the file name is not as long as the previous one, so the LIKE performs more quickly:

    insert into ClientDataLand with auto name
    select first 'NewFileName.pdf' as File_Name, Contents
      from clientdata
     where ClientData.file_name like 'Scan 001.pdf';

Since the LIKE with a long file name seems to be causing the issue, I would rather use =, but that crashes the database. This is how I create the directory servers:

    CREATE SERVER ClientDataSRV CLASS 'DIRECTORY'
       USING 'ROOT=D:\Data\ClientData\;SUBDIRS=100;CREATEDIRS=YES';
    CREATE EXTERNLOGIN user1 TO ClientDataSRV;
    CREATE EXISTING TABLE ClientData AT 'ClientDataSRV;;;.';

    CREATE SERVER "ClientDataLND" CLASS 'DIRECTORY'
       USING 'ROOT=D:\Data\DestData\;SUBDIRS=100;CREATEDIRS=YES';
    CREATE EXTERNLOGIN user1 TO ClientDataLND;
    CREATE EXISTING TABLE ClientDataLand AT 'ClientDataLND;;;.';

With all that background, my questions are: how do I make the insert (or the LIKE comparison, if that is the real issue) perform faster, and is the = issue a bug?
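For context, the procedure I described looks roughly like this. This is only a sketch: DocLink, document_id, and file_path are placeholder names standing in for our real link-table schema.

```sql
-- Rough sketch of the copy procedure; DocLink / document_id /
-- file_path are placeholder names for our real link table.
CREATE PROCEDURE CopyDocument(IN in_doc_id INTEGER, IN in_new_name LONG VARCHAR)
BEGIN
    DECLARE doc_path LONG VARCHAR;

    -- Look up the source path for this document.
    SELECT file_path INTO doc_path
      FROM DocLink
     WHERE document_id = in_doc_id;

    -- Inserting into the destination proxy table writes the file
    -- out to the destination folder under the new name.
    INSERT INTO ClientDataLand (file_name, contents)
    SELECT FIRST in_new_name, SourceFile.contents
      FROM ClientData AS SourceFile
     WHERE SourceFile.File_Name LIKE doc_path;
END;
```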
Just as another hint: since you have the documentID and can access the document path, wouldn't it be easier/faster to use xp_read_file()/xp_write_file() to copy the file contents between source and destination?
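A minimal sketch of what I mean, using the sample source path and the '87test.jpg' destination name from your question (adjust both to your real values):

```sql
-- Sketch: copy one file by path, bypassing the proxy tables entirely.
-- Backslashes are doubled here defensively, since \n and \x can be
-- treated as escape sequences inside SQL Anywhere string literals.
BEGIN
    DECLARE file_contents LONG BINARY;

    -- xp_read_file returns NULL if the file cannot be read.
    SET file_contents = xp_read_file(
        'D:\\Data\\ClientData\\Long Client Name\\2012\\Classification\\Long Address\\Images\\testpic.jpg');

    IF file_contents IS NULL THEN
        MESSAGE 'Read failed' TO CLIENT;
    -- xp_write_file returns 0 on success.
    ELSEIF xp_write_file('D:\\Data\\DestData\\87test.jpg', file_contents) <> 0 THEN
        MESSAGE 'Write failed' TO CLIENT;
    END IF;
END;
```

That avoids scanning the 94,000-row directory listing at all, since the file is addressed directly by its path.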
If you have a repro for the crash, please send it to me and I will have a look at it.
Question: How long is your long file name?
Of course I tried to set up a repro on my test DB, and it works just fine. The file name I'm using to test is 70 characters, but real names could be longer or shorter than that. For now, I think what I am going to do is create the proxy table on the fly, rooted as far down the directory structure as possible (just above the file name), and then compare only the bare file name.
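To illustrate what I have in mind, something like this, where the ROOT reuses the sample path from my question and TmpSRV/TmpFiles are throwaway names:

```sql
-- Create a temporary directory server rooted as deep as possible,
-- so the proxy table lists only the files in that one folder
-- instead of all 94,000+ entries:
CREATE SERVER TmpSRV CLASS 'DIRECTORY'
   USING 'ROOT=D:\Data\ClientData\Long Client Name\2012\Classification\Long Address\Images\;SUBDIRS=1';
CREATE EXTERNLOGIN user1 TO TmpSRV;
CREATE EXISTING TABLE TmpFiles AT 'TmpSRV;;;.';

-- The comparison is now against the bare file name only:
INSERT INTO ClientDataLand (file_name, contents)
SELECT FIRST '87test.jpg', contents
  FROM TmpFiles
 WHERE file_name LIKE 'testpic.jpg';

-- Clean up the temporary objects:
DROP TABLE TmpFiles;
DROP EXTERNLOGIN user1 TO TmpSRV;
DROP SERVER TmpSRV;
```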
Once I get that working I'll try to get the crash replicated.
Could missing doubled backslashes as path separators be an issue here?
(Or are they just not shown correctly here?)
Yep, I'm wild-guessing...
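To spell out the guess: if I remember correctly, SQL Anywhere can treat sequences like \n and \x inside a string literal as escape sequences, so doubling every backslash in the pattern would rule that out:

```sql
-- Your original query with each backslash doubled, in case single
-- backslashes are being consumed as escape sequences (just a guess):
select first '87test.jpg' as file_name,
       SourceFile.contents as contents
  from ClientData as SourceFile
 where SourceFile.File_Name like
       'Long Client Name\\2012\\Classification\\Long Address\\Images\\testpic.jpg';
```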