2013-11-07 07:54 AM
I have a couple of Oracle data sources configured on WFA 2.0.0.
These have been connecting with no issues, until I upgraded to 2.1 yesterday.
I then received the following error:
Listener refused the connection with the following error:
ORA-12505, TNS:listener does not currently know of SID given in connect descriptor
Now, I have been through this error before, when I first configured the Oracle data sources, and it was down to the SID being incorrect. In 2.0.0 there is a field named 'Default database' when you configure the Data Source Type; it should really be named 'SID', as that is what the field holds. Anyway, once this was figured out the error went away and the connection succeeded.
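For context on why the SID matters here: the Oracle thin JDBC driver accepts two URL forms, and ORA-12505 is what the listener returns when the SID form is used with a SID it does not recognize (the hostname and port below are hypothetical; LNDSC1P1 is the SID from the listener log later in this thread):

```
jdbc:oracle:thin:@dbhost:1521:LNDSC1P1     -- SID form (colon separator)
jdbc:oracle:thin:@//dbhost:1521/LNDSC1P1   -- service name form (slash separator)
```

If the application passes SID=null in the connect descriptor, as in the failed attempt below, the listener cannot match it to any registered instance and rejects the connection with 12505.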
Fast forward to 2.1 upgrade - the same error appears!
I looked at the listener logs from before and after the upgrade and found that the SID is no longer being passed; the 'Default database' field no longer exists in v2.1.
Successful connection on v2.0.0:
06-NOV-2013 11:11:33 * (CONNECT_DATA=(CID=(PROGRAM=JDBC Thin Client)(HOST=__jdbc__)(USER=LONINASP0011$))(SID=LNDSC1P1)) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.240.152.76)(PORT=50805)) * establish * LNDSC1P1 * 0
Failed attempt on v2.1:
06-NOV-2013 22:51:48 * (CONNECT_DATA=(CID=(PROGRAM=JDBC Thin Client)(HOST=__jdbc__)(USER=LONINASP0011$))(SID=null)) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.240.152.76)(PORT=62347)) * establish * null * 12505
Has anyone come across this issue, or does anyone know how to set the SID in v2.1? Or, if service names are being used instead, which field in 2.1 should the service name go in? I have tried all sorts of permutations.
2013-11-07 06:24 PM
Unfortunately this is a bug that was introduced in 2.1 when the 'Add new Data Source type' page was redesigned and the 'Default database' field was omitted. In WFA 2.1 there is simply no way to specify a database name for any data source type, so any data source that connects using the Oracle JDBC or MS SQL drivers will always fail during acquisition.
This bug has been identified and its fix will be available in the next WFA release. Until then, the inconvenience is regretted.
In the meantime, you can redesign your data source type using Method: Script and write a PowerShell script to connect to your Oracle DB using the JDBC or ODBC drivers. To help you with this, I can provide a sample script and the steps to do it; just give me some time.
2013-11-08 03:36 AM
Below are the steps:
You'll need PowerShell 3.0 on your WFA server for this. PowerShell 3.0 is a big improvement over 2.0, so having it is worthwhile anyway, and WFA supports it. Windows Server 2012 already ships with it, Windows Server 2008 will need an upgrade, and Windows Server 2003 will not work at all.
1. Download the 64-bit Oracle Data Access Components (ODAC) from here: http://www.oracle.com/technetwork/database/windows/downloads/index-090165.html . Take the first one, named ODP.NET_Managed121010.zip (1.9 MB). Extract the zip and locate the file called Oracle.ManagedDataAccess.dll. That is the only file we need for connecting, so copy it to a location such as C:\Oracle .
2. Import the attached Data Source Type file. It's a sample script (crudely written, but working).
3. You can retain all your dictionary items. For every dictionary item you'll need a .csv file whose name matches the dictionary item's name. In my example the dictionary item was named student_oracle, hence the file name student_oracle.csv. Keep the same SQL queries you were using in your older data source.
You can get details about creating a script-type data source from the WFA documentation or some videos. It's very simple.
Add a new Datasource of this type and you are done.
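To give you an idea of what the script does, here is a minimal sketch using the managed ODP.NET driver from step 1. The host, port, service name, credentials, table, and query are all placeholders you would swap for your own; it assumes Oracle.ManagedDataAccess.dll was copied to C:\Oracle and writes one CSV per dictionary item, named after that item as described in step 3:

```powershell
# Load the managed ODP.NET driver (assumed location from step 1).
Add-Type -Path "C:\Oracle\Oracle.ManagedDataAccess.dll"

# EZConnect-style data source: host:port/service_name (placeholder values).
$connString = "User Id=scott;Password=tiger;Data Source=dbhost:1521/LNDSC1P1"

$conn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connString)
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    # Reuse the same SQL query you had in your old data source definition.
    $cmd.CommandText = "SELECT id, name FROM student"
    $reader = $cmd.ExecuteReader()

    # Collect rows, then write a CSV named after the dictionary item.
    $rows = @()
    while ($reader.Read()) {
        $rows += [pscustomobject]@{
            id   = $reader["id"]
            name = $reader["name"]
        }
    }
    $rows | Export-Csv -Path ".\student_oracle.csv" -NoTypeInformation
}
finally {
    $conn.Close()
}
```

WFA's script-type acquisition picks up the generated CSV files from the script's working directory, so the script only needs to produce one correctly named file per dictionary item.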
Tell me if you need more help.
2013-11-08 04:20 AM
That is disappointing, but thanks for the workaround. As I'm strapped for time I will stick with version 2.0, but I will try your workaround in UAT.
Do you have any idea of when the fix will be released?
While on the data source subject: the update interval seems to have a maximum of 1440 minutes. Could this limit be lifted? I have some very large LDAP data which takes some time to process, but the data is largely static, so it would be preferable to run this acquisition once a week.