Problem:
Need to connect to SQL Server Database using SQL Developer
Solution:
SQL Developer has an option that allows connecting to third-party database systems. Once configured, SQL Developer can connect to a SQL Server database.
The steps below set up SQL Server connectivity:
1) You need a JAR file in place for SQL Server.
2) Download the jTDS SQL Server driver, an open-source JDBC driver, from 'http://sourceforge.net/projects/jtds/' to your local drive.
3) Unzip the folder
4) Open SQL Developer
5) Go to Tools -> Preferences -> Database -> Third Party JDBC Drivers
6) Click 'Add Entry', navigate to the unzipped folder
7) Select jtds-1.2.5.jar and click OK
8) Now go to the Connections window and create a new connection
9) Enter a connection name along with the proper username and password
10) Select the 'SQLServer' tab
11) Enter the proper hostname and leave the port as 1433 (default); change the port number only if SQL Server is running on a different port
12) Click the 'Retrieve database' button and select the database you want to connect to
13) Test the connection
14) On success, save the connection and press the Connect button
It works!!!
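For reference, the same connectivity the wizard sets up can be exercised from plain Java using the jTDS driver. A minimal sketch, where the host, database, username, and password are placeholders (it needs jtds-1.2.5.jar on the classpath, just like SQL Developer does):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class JtdsConnectExample {

    // Builds a jTDS JDBC URL of the form jdbc:jtds:sqlserver://host:port/database
    static String buildJtdsUrl(String host, int port, String database) {
        return "jdbc:jtds:sqlserver://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) {
        // Placeholder values: substitute your own host, database, and credentials.
        String url = buildJtdsUrl("myhost", 1433, "mydb");
        System.out.println("Connecting to: " + url);
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected: " + !conn.isClosed());
        } catch (SQLException e) {
            // Without the jTDS JAR (or a reachable server) this path is taken.
            System.out.println("Connection failed: " + e.getMessage());
        }
    }
}
```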
Friday, January 28, 2011
Sunday, January 16, 2011
Handling Large Payload Files
Problem:
Want to create a BPEL process to read, transform and translate a large payload file (e.g. size 1GB)?
Thoughts:
Reading and traversing large payload files was always a problem in SOA 10g, where the maximum size for such operations was reportedly under 7 MB. 11g answers this problem: you can read and transform huge payload files.
Solution:
Suppose a CSV file arrives in a source directory and you want to transform it to a fixed-length file. Keep the source and destination schemas (XSD) ready, and also prepare the transformation file (XSL) and keep it handy.
To accomplish this, you will use only one File Adapter, which takes care of the read, transform, and write - all three operations take place in a single I/O interaction.
Follow the steps below for making it possible:
1) Drag and drop a file adapter in the external references swim lane. (For e.g. Give name as FileMove.)
2) Select Synchronous File Read
3) Give any dummy value for File physical path and File Name. It will be changed manually later.
4) Select Native format translation is not required (Opaque Schema)
5) Outbound File Adapter is now configured, click on Finish
6) Open the relevant .jca file (in this case FileMove_file.jca)
7) Ensure that the className is
"oracle.tip.adapter.file.outbound.FileIoInteractionSpec"
8) Add the extra properties as shown in the file below:
<adapter-config name="FileMove" adapter="File Adapter"
xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
<connection-factory location="eis/FileAdapter" adapterRef=""/>
<endpoint-interaction portType="FileMove_ptt" operation="FileMove">
<interaction-spec
className="oracle.tip.adapter.file.outbound.FileIoInteractionSpec">
<property name="SourcePhysicalDirectory" value="testDir1"/>
<property name="SourceFileName" value="test1"/>
<property name="SourceSchema" value="xsd/source-csv.xsd"/>
<property name="SourceSchemaRoot" value="Root-Element"/>
<property name="SourceType" value="native"/>
<property name="TargetPhysicalDirectory" value="testDir2"/>
<property name="TargetFileName" value="test2"/>
<property name="TargetSchema" value="xsd/destination-fixedLength.xsd"/>
<property name="TargetSchemaRoot" value="Root-Element"/>
<property name="TargetType" value="native"/>
<property name="Xsl" value="xsl/SourceToDestination.xsl"/>
<property name="Type" value="MOVE"/>
</interaction-spec>
</endpoint-interaction>
</adapter-config>
9) Save the file and deploy the process.
Note:
This works only if all the records in the data file are of the same type.
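The adapter performs the CSV-to-fixed-length translation natively from the XSD definitions. Purely as an illustration of what that translation amounts to (this is not the adapter's code, and the field widths are hypothetical), each delimited field is padded or truncated to a declared width:

```java
public class FixedLengthExample {

    // Pads (left-justified) or truncates each comma-separated field
    // to its fixed width; in the adapter, widths come from the XSD.
    static String toFixedLength(String csvLine, int[] widths) {
        String[] fields = csvLine.split(",", -1);
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < widths.length; i++) {
            String f = i < fields.length ? fields[i] : "";
            if (f.length() > widths[i]) {
                f = f.substring(0, widths[i]); // truncate overlong values
            }
            // %-Ns left-justifies and pads with trailing spaces to width N
            out.append(String.format("%-" + widths[i] + "s", f));
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // "John" -> 8 chars, "Smith" -> 10 chars, "42" -> 4 chars
        System.out.println(toFixedLength("John,Smith,42", new int[] {8, 10, 4}));
    }
}
```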
Exporting ESB Metadata
Requirement:
Exporting entire ESB metadata
Solution:
Follow the steps below to achieve the result:
1) Log in to the SOA Server (10g) from the backend, e.g. using PuTTY, and set your environment.
2) Navigate to <SOA_HOME>/integration/esb/bin
3) Run the export.sh command to export the entire ESB metadata, like:
./export.sh ESBmetadata.zip
4) You will see an output message like:
Setting Protocol = http
Dec 29, 2010 11:14:16 AM oracle.tip.esb.lifecycle.HttpHandler logInfo
INFO: Exporting metadata from the server [host=oradev,port=7777,user=oc4jadmin] to
file "/u01/app/oracle/product/10g/OracleAS_3/integration/esb/bin/ESBmetadata.zip"
Dec 29, 2010 11:14:16 AM oracle.tip.esb.lifecycle.HttpHandler main
INFO: Export completed successfully!!!
Note:
Steps shown above are done on SOA Server 10g running on Linux environment.
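The same export can also be scripted. A minimal sketch in Java, assuming it runs on the server host with <SOA_HOME>/integration/esb/bin as the working directory (the archive name is a placeholder):

```java
import java.util.Arrays;
import java.util.List;

public class EsbExportExample {

    // Builds the command line equivalent to: ./export.sh <archive>.zip
    static List<String> buildExportCommand(String archiveName) {
        return Arrays.asList("./export.sh", archiveName);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildExportCommand("ESBmetadata.zip");
        System.out.println("Would run: " + String.join(" ", cmd));
        // Uncomment on a real 10g host, from <SOA_HOME>/integration/esb/bin:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```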
Monday, January 3, 2011
ORABPEL Spying-Part2
Requirement:
To get all BPEL instances and their duration that run longer than 'n' seconds
Solution:
SELECT
process_id, creation_date,
SUBSTR(modify_date-creation_date,12) Duration,
SUBSTR(REGEXP_SUBSTR(title, '[^ ]+', 1, 2), 2) InstanceId
FROM
cube_instance
WHERE
TO_CHAR(creation_date, 'YYYY-MM-DD HH24') >= '<Date_value>'
AND
TO_CHAR(creation_date, 'YYYY-MM-DD HH24') <= '<Date_value>'
AND
(modify_date-creation_date) > '0 0:0:n.0'
AND
process_id IN ('<Name of Process>')
ORDER BY
modify_date DESC
Example:
Here n=45 seconds
SELECT
process_id, creation_date,
SUBSTR(modify_date-creation_date,12) Duration,
SUBSTR(REGEXP_SUBSTR(title, '[^ ]+', 1, 2), 2) InstanceId
FROM
cube_instance
WHERE
TO_CHAR(creation_date, 'YYYY-MM-DD HH24') >= '2010-10-12 15'
AND
TO_CHAR(creation_date, 'YYYY-MM-DD HH24') <= '2011-01-01 21'
AND
(modify_date-creation_date) > '0 0:0:45.0'
AND
process_id IN ('TestProjectBPEL')
ORDER BY
modify_date DESC
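The duration filter compares the modify_date-creation_date interval against a 'days hours:minutes:seconds.fraction' literal. A small hypothetical helper shows how 'n' seconds maps onto that literal form:

```java
public class DurationLiteralExample {

    // Formats n seconds as the 'D H:M:S.F' literal used in the
    // query's duration predicate (e.g. 45 -> "0 0:0:45.0").
    static String toDurationLiteral(long totalSeconds) {
        long days = totalSeconds / 86400;
        long hours = (totalSeconds % 86400) / 3600;
        long minutes = (totalSeconds % 3600) / 60;
        long seconds = totalSeconds % 60;
        return days + " " + hours + ":" + minutes + ":" + seconds + ".0";
    }

    public static void main(String[] args) {
        System.out.println(toDurationLiteral(45));   // the n=45 case from the example
        System.out.println(toDurationLiteral(3725)); // 1h 2m 5s
    }
}
```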
ORABPEL Spying-Part1
Requirement:
Want information on currently running processes - the shortest- and longest-running instances on the server?
Solution:
The query below will give you the desired result:
SELECT * FROM (
SELECT
bpel_process_name AS "ProcessName",
TO_CHAR(MIN(creation_date),'YYYY-MM-DD HH:MI') AS "EarliestDate",
COUNT(*) AS "TotalRunningProcesses",
TO_NUMBER(SUBSTR(MIN(sysdate-creation_date), 1,
INSTR(MIN(sysdate-creation_date), ' '))) AS "ShortestRunning (Days)",
SUBSTR(MIN(sysdate-creation_date),
INSTR(min(sysdate-creation_date),' ')+1,8) AS "ShortestRunning (Hours)",
TO_NUMBER(SUBSTR(MAX(sysdate-creation_date), 1,
INSTR(MAX(sysdate-creation_date), ' '))) AS "LongestRunning (Days)",
SUBSTR(max(sysdate-creation_date),
INSTR(MAX(sysdate-creation_date),' ')+1,8) AS "LongestRunning (Hours)"
FROM ORABPEL.bpel_process_instances
WHERE state = 1
GROUP BY bpel_process_name
ORDER BY "EarliestDate" DESC
)
NOTE: Query for Oracle BPEL Process Manager 10g (10.1.3.x)
HAPPY NEW YEAR
I wish all the readers a very happy and prosperous New Year 2011.
Hope to see new things coming this year from Oracle.