2013-11-20 01:24 PM
For the longest time I had a custom datasource that would read entries from a CSV file on the WFA server and copy them into a dictionary object that I could query.
I used Goodrum's tutorial to set this up. Now, for the last several weeks, pulling in this CSV file has been failing.
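For context, the datasource script is essentially the following (a simplified sketch, not the exact script; the path and table name are taken from the error message, and the general shape follows Goodrum's tutorial, where the script leaves a CSV named after the dictionary table in its working directory for WFA to load):

# Sketch of a script-based datasource acquisition step.
# Assumption: WFA picks up .\dfs_region.csv from the script's working
# directory and loads it into the scheme table dfs.dfs_region.
$sourceFile = "C:\wfa\dfs_region.csv"

Get-WFALogger -message ("Copying " + $sourceFile + " for acquisition")

# Copy the source CSV into the working directory under the expected name
Copy-Item -Path $sourceFile -Destination ".\dfs_region.csv"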
The file still exists in the same place, yet it throws "The file named dfs_region.csv required to populate table dfs.dfs_region was not found"
I have tried moving the file to the WFA installation directory and updating the script to point to the new location and it still doesn't work.
Any ideas would be really helpful. Thanks
Solved!
2013-11-20 01:47 PM
Also, I know that WFA sees the CSV file, because if I rename the CSV file to something else while the script is still looking for the original name, I get this instead:
"Cannot find path 'C:\wfa\dfs_region.csv' because it does not exist."
Additionally, I found that I cannot delete the dictionary entry related to the custom datasource. When I attempt to, I receive this cryptic error:
org.hibernate.exception.ConstraintViolationException: Cannot delete or update a parent row: a foreign key constraint fails (`wfa`.`workflow_command_mapped_object`, CONSTRAINT `fk_workflow_command_mapped_object_dictionary_entry_id` FOREIGN KEY (`dictionary_entry_id`) REFERENCES `dictionary_entry` (`id`))
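That constraint error names both tables involved, so (a sketch, assuming direct access to WFA's backing MySQL `wfa` schema; the `name` column and the 'dfs_region' value are assumptions, not confirmed by the error) the rows blocking the delete could be listed with a query along these lines:

-- Hypothetical diagnostic: find which workflow command mappings still
-- reference the dictionary entry, using the tables named in the error.
SELECT wcmo.*
FROM wfa.workflow_command_mapped_object AS wcmo
JOIN wfa.dictionary_entry AS de
  ON de.id = wcmo.dictionary_entry_id
WHERE de.name = 'dfs_region';   -- column name and value are assumptions

Any rows returned would be the references that have to be removed (e.g. by editing or deleting the workflow commands that map to the entry) before the dictionary entry itself can be deleted.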
2013-11-20 01:56 PM
I had a weird issue like this pop up a couple of times. Ultimately, I ended up deleting all of the files in the tmp location, and then the import worked with no issues. I also saw that the copy itself was working: if I deleted the file manually, a copy would show up even though the acquisition reported a failure. After I dumped everything and got the import to work, I had no more issues.
Jeremy Goodrum, NetApp
2013-11-20 02:07 PM
Did you just delete the files in the mysql\data\tmp directory?
When I stop the WFA services these files are removed automatically. I then restarted the service, tried to run the data acquisition, and it still fails.
This all started happening after I imported a custom workflow that was provided to us by Yaron. This workflow was for testing Cluster-Mode failover and added its own dictionary entries. Unfortunately, I didn't realize there were issues until after the 2 days of backups that WFA keeps.
Not being able to delete that dictionary object makes me think something happened to the custom tables within the database. I tried to open a NetApp case on this, but because it is custom content, they told me to resolve it here.
2013-11-20 02:15 PM
So the files that are imported would be found here - C:\Program Files\NetApp\WFA\jboss\standalone\tmp\wfa
The issue with the foreign key is likely because you have a command using a reference variable.
Jeremy Goodrum, NetApp
2015-10-01 11:15 AM
I know this thread is pretty old but I've been having the same problems with WFA 3.0P1 on Windows Server 2012.
If there were any temporary .dump files in the temp directory, my custom datasource acquisition would fail. I added this code to the beginning of the data source script to remove the .dump files before trying to acquire my datasource, and things have been working well since:
Get-WFALogger -message ("Removing all .dump files from temporary directory")
$tempDir = "C:\Program Files\NetApp\WFA\jboss\standalone\tmp\wfa"
Get-ChildItem $tempDir *.dump | Remove-Item
2017-01-03 10:15 AM
Kudos for the suggestion on removing .dump files from the WFA temp directory before importing the datasource file - this worked perfectly! WFA v4.0.
Previously my symptoms were: I'd create the datasource and the initial import would be successful, so I'd think I was all set. But subsequent imports would fail with exactly the error described at the top of the thread.
2017-05-21 05:01 PM
I would like to add that if you're creating a custom data source in WFA running on a Linux server, the path for the tmp directory is this:
I experienced the same error message as described in this thread, but the cause was different. The cause in my case was that the source filename contained upper-case letters. When I changed the source and destination filenames to be all lower-case letters, the data acquisition succeeded. Go figure!