Custom Script Based Data Source - Time Outs

I have a custom data source script that discovers all of our DFS namespaces and all of their targets. The first part, which gathers the namespaces and writes them to a CSV, acquires and updates fine. The second table, which pulls all the DFS links inside the namespaces, takes about an hour and a half to complete; that part is fine and expected as well. The part that fails is the upload of the CSV file into the database, which is taking too long. Below is the error I get.

Acquisition timed out as it took longer than 15 minutes to load acquired data into the cache database - ARJUNA016063: The transaction is not active!    

Anyone have any suggestions? Are there any settings I can modify to increase this timeout?

Re: Custom Script Based Data Source - Time Outs

What is your WFA version? The possible setting change depends on the WFA version.

For WFA 2.1 and above, see my reply (towards the end) in this post:

Note: The default config setting provided by NetApp is optimal for the most common use cases. Modifying the JBoss config may have unwanted side effects, and not all of them have been identified or verified by NetApp.
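For readers looking for where such a timeout would live: in JBoss-based releases the transaction timeout is set in the transactions subsystem of the server configuration. The fragment below is illustrative only; the exact file name, subsystem namespace, and default value vary by WFA version, and changing it is unsupported, as noted above.

```xml
<!-- Illustrative excerpt from a JBoss standalone.xml (location and
     namespace version are assumptions; consult your WFA release). -->
<subsystem xmlns="urn:jboss:domain:transactions:1.1">
    <core-environment>
        <process-id><uuid/></process-id>
    </core-environment>
    <!-- default-timeout is in seconds; 1800 = 30 minutes -->
    <coordinator-environment default-timeout="1800"/>
</subsystem>
```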

If this post resolved your issue, help others by selecting ACCEPT AS SOLUTION or adding a KUDO.

Re: Custom Script Based Data Source - Time Outs

We are running

Re: Custom Script Based Data Source - Time Outs

How big is the data the custom script is trying to acquire?

The transaction timeout happens if the acquired data is very large.

You need to see why the custom script is taking so long.

Where is the most time taken when the acquisition happens?


Re: Custom Script Based Data Source - Time Outs

The file it is trying to pull in is 79,000 lines long with three columns: DFS Path, Target Path, and the state of that link (ONLINE/OFFLINE).

I'm not sure how I would check where the acquisition is getting hung up.
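One generic way to find where the time goes is to wrap each phase of the script in a timer and log the elapsed time. The acquisition script here is PowerShell, but the idea is language-agnostic; a minimal Python sketch (the phase function names in the comments are hypothetical):

```python
import time

def timed(label, fn, *args, **kwargs):
    """Run fn with the given arguments, print how long it took,
    and return its result unchanged."""
    start = time.monotonic()
    result = fn(*args, **kwargs)
    print(f"{label}: {time.monotonic() - start:.1f}s")
    return result

# Hypothetical phases of an acquisition script:
# namespaces = timed("discover namespaces", get_dfs_namespaces)
# links = timed("enumerate links", get_dfs_links, namespaces)
# timed("write CSV", write_csv, links)
```

Comparing the per-phase numbers against the 15-minute cache-load window shows whether the discovery, the CSV write, or the database load is the slow step.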

Re: Custom Script Based Data Source - Time Outs

Hi Cory,

That does not seem like a very large dataset. I would attempt a manual import using MySQL Workbench and see how long that takes. I would also look for a data formatting issue or a bad character in the input file. If you are retrieving the data via PowerShell, you may want to consider using the MySQL insert process documented in other scripts on the WFA site.
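If row-at-a-time inserts turn out to be the bottleneck, batching many rows into each INSERT statement cuts the number of round-trips dramatically (79,000 rows become ~80 statements at 1,000 rows per batch). A sketch of the statement-building idea, with made-up table and column names; the actual WFA scripts are PowerShell, but the logic is the same, and real code should use the connector's parameterized queries rather than this naive quoting:

```python
def batched_inserts(table, columns, rows, batch_size=1000):
    """Yield multi-row INSERT statements, batch_size rows per statement.
    Quoting here is naive (doubles single quotes only) and is for
    illustration; use parameterized queries in real code."""
    def quote(v):
        return "'" + str(v).replace("'", "''") + "'"
    for i in range(0, len(rows), batch_size):
        values = ",".join(
            "(" + ",".join(quote(v) for v in row) + ")"
            for row in rows[i:i + batch_size]
        )
        yield f"INSERT INTO {table} ({','.join(columns)}) VALUES {values};"

# Hypothetical usage with a DB-API cursor:
# rows = [(r"\\corp\dfs\share", r"\\fs01\share", "ONLINE"), ...]
# for stmt in batched_inserts("dfs_link",
#                             ["dfs_path", "target_path", "state"], rows):
#     cursor.execute(stmt)
```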