Thursday, December 14, 2017

OSB Error : Failed to set the value of context variable "body"

A common error you can get while assigning values to context variables in OSB is:

Failed to set the value of context variable "body". Value must be an instance of {http://schemas.xmlsoap.org/soap/envelope/}Body.

OSB has predefined context variables such as body, and it expects the content of the body variable to always be wrapped in a <soap-env:Body> element. Inside this element you can assign any value.

If you don't want to worry about this every time you manipulate the body variable, use a "Replace" action instead of "Assign". Select "Replace node contents" and the Replace action will keep the <soap-env:Body> wrapper intact while updating its contents.
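For example, assuming a hypothetical order payload already held in a variable $orderPayload (the ord:Order element and its namespace below are made up for illustration), the two approaches look like this:

Assign to variable "body" - the expression must produce the whole Body element:

<soap-env:Body xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/">
   <!-- ord:Order and $orderPayload are hypothetical names used only for this sketch -->
   <ord:Order xmlns:ord="http://example.com/order">{$orderPayload}</ord:Order>
</soap-env:Body>

Replace in variable "body", XPath ".", with "Replace node contents" selected - the <soap-env:Body> wrapper is kept, so the expression is just the new payload:

<ord:Order xmlns:ord="http://example.com/order">{$orderPayload}</ord:Order>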

Monday, December 11, 2017

Difference between Route - Publish and Service Callout actions in OSB


There are mainly three types of MEPs (Message Exchange Patterns) used to interact with end applications:
1) Synchronous interaction
2) Asynchronous interaction
3) One way interaction

Synchronous interaction means calling an endpoint and waiting for its reply before processing the response.

Service Callout: A Service Callout is meant exactly for that. If you use a Service Callout in either the request or the response pipeline, the proxy service will wait for the reply from the end application.
So whenever you want to interact with a synchronous process you can opt for a Service Callout, although it is not the only action you can use for synchronous interactions.

Publish: As mentioned earlier, there are three types of interactions between applications. For asynchronous interaction OSB does not provide a specific action, and such interactions do not suit its architecture well. So if you need to implement asynchronous service interaction, it is better to choose BPEL (Oracle SOA) instead of OSB; you can implement asynchronous interaction in OSB, but it is not as straightforward as in BPEL.

Apart from asynchronous interaction there is another type, the one-way interaction.
OSB provides an action for this type of interaction, named Publish. By default, the Publish action only sends the request and never waits for a response from the back-end service (you can use Routing Options to change this behavior if required). Even if an exception occurs, you cannot catch it with a Publish action, because it does not block the request thread and does not wait for a reply.

So far we have covered the Service Callout and Publish actions. Now we will discuss the most important action in an OSB proxy flow for integrating with different applications: the Route action.


Route: The Route node has some special characteristics in a proxy flow. If you use a Route node, it will be the last node in the flow; you cannot add any other node after it. The Route node is where the context is switched, meaning the switch from the request context to the response context: in the Route node the request body is mapped to the response body, and similarly for the other context variables (body, fault, etc.).

Inside a Route node you can use Routing Options to control how you communicate with back-end applications. You can interact with both synchronous and one-way processes this way.
Moreover, it can make the processing/interaction happen within the same transactional boundary (unless configured otherwise), which ensures that the response comes back from the back-end application regardless of the process type (synchronous/one-way).

There is also an action called Routing Options, which is very useful for managing the threading and transaction behavior of the interaction.


A few more differentiating parameters:

Route
1. Last node in request processing. It can be thought of as a bridge between the request pipeline processing and the response pipeline processing.
2. You can execute only one Route in your proxy service.
3. Can only be created in a Route node.
4. OSB will wait for the Route call to finish before continuing to process.
   a. If you are calling a business service and you specify Best Effort for QoS (Quality of Service), then OSB will release the thread it is holding while the business service executes.
   b. If you are calling a business service and you specify Exactly Once or At Least Once for QoS, then OSB will hold onto the thread while the business service executes.
   c. If you are calling a local proxy service, then OSB will hold onto the thread until the proxy service finishes executing.

Service Callout
1. You can have multiple Service Callouts in a proxy service.
2. Pipeline processing will continue after a Service Callout.
3. Can be invoked from the request and/or response pipelines.
4. Used to enrich the incoming request or outgoing response, for example a call to look up a country code.
5. Used for real-time request/response calls (synchronous calls).
6. OSB will hold a thread and not continue until the Service Callout completes.
7. Can tie up resources and degrade performance under heavy loads.

Publish
1. Can be synchronous or asynchronous.
   a. If you call a business service with a Quality of Service of Best Effort, then it will be an asynchronous call.
   b. If you call a business service with a Quality of Service of Exactly Once or At Least Once, OSB will wait until processing in the business service completes before proceeding, so it is effectively a synchronous call.
   c. If you are calling a local proxy service, OSB will wait until processing in the local proxy service completes, so it is effectively a synchronous call.
2. Can be invoked from the request and/or response pipelines.
3. Best to use when you do not need to wait for a response from the process you are calling (fire and forget, i.e. asynchronous calls).

Monday, December 4, 2017

Meta Data Store (MDS)

Meta Data Store:


The MDS is basically an XML store with transaction capabilities, versioning, merging and a persistence framework optimized to work with XML nodes and attributes. The persistence framework has two adapters that can persist the store to a database or to a folder on disk. The database store has more advanced features and is the recommended way of working with MDS. The file store is useful at development time, because you can change the files manually.
SOA 11g and 12c provide an MDS repository (both file based and DB based) where common artifacts can be stored and referenced by all projects in the SOA infrastructure.
File and DB Based MDS:
File-based MDS is used at design/deploy time, whereas DB-based MDS is used at run time by the SOA infrastructure. A configuration file named adf-config.xml, part of every SOA project, manages the partition mapping used by JDeveloper at design/deploy time.
File Based MDS:
The idea behind file-based repositories is to allow developers to have a light repository available in their local environment that can be easily adapted for development and tests; a file-based repository relieves developers of having to configure and maintain an external database while providing necessary functionality, such as file referencing and customizations. These kinds of repositories are easily modified and maintained, since they define a directory structure similar to any other directory structure inside an operating system. They can be navigated and altered using common shell commands or any kind of visual file explorer application. The file-based repository is usually located inside the Oracle JDeveloper home (JDEV_HOME/integration) if the default configuration is used.
The file-based MDS is created under the <JDEV_HOME>/integration/seed directory. The default folder “soa” is used to store common system SOA artifacts. All custom artifacts should be stored under a folder called “apps”, since this folder already exists on the server.

Create a directory structure under the apps folder. In my case, I have created the folder structure <JDEV_HOME>/integration/seed/apps/<myproject>/common/xsd to store XML schema files. This should ideally match your schema structure.
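For reference, the on-disk layout then looks roughly like this (the wsdl folder is only an example of how you might extend the structure; it is not required):

<JDEV_HOME>/integration/seed
    soa/                      (system artifacts shipped with the product; leave untouched)
    apps/
        <myproject>/
            common/
                xsd/          (your *.xsd files)
                wsdl/         (WSDLs, if you share those as well)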
 
·         First, go to your local 12c JDeveloper installation folder. In my case it is C:\1221\Oracle\Middleware\Oracle_Home\jdeveloper\integration\seed. You should see the seed folder; if not, create it.
·         Now create a folder named apps under seed.
·         Under apps you can create your own folder structure to hold all your XSD and WSDL files, for example apps/xsd/<*.xsd>.
·         Create the MDS connection in JDeveloper as below:
                I. Find the Resource palette pane on the left side of JDeveloper.
                II. Click on New SOA-MDS Connection to create a new SOA-MDS connection, as shown below.


In the Create SOA-MDS Connection window, select File Based MDS as the connection type. Point it to the local folder in your system where all the WSDL and XSD files are placed, as shown below. When we use the WSDL and XSD files in the project, they will then be referenced from the local system. Click Test Connection; it should show the status 'Success'.

Click OK. The file-based MDS connection is now created, and we can see its tree structure in the left pane under SOA-MDS.

Once the artifacts that need to be shared/referenced are placed under the appropriate namespace (in this case 'apps'), we can let the SOA project in JDeveloper point to MDS for these artifacts from a common location.

adf-config.xml configuration:

As noted earlier, adf-config.xml is an important configuration file that holds the details of the MDS (both file based and DB based). By default, this file declares the 'seed' partition and the metadata store for the 'soa' namespace.

This file can be found under <YourApplication>\.adf\META-INF.
Alternatively, in JDeveloper, expand the 'Application Resources' panel; drill through Descriptors -> ADF META-INF to find this file

Step 1: Add a metadata store namespace for '/apps' in addition to the default '/soa/shared' namespace. This lets JDeveloper reference the artifacts under '/apps' at design time from the file-system metadata store, as shown in the fragment below.
adf-config.xml:
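The relevant fragment looks roughly like the following. This is a sketch based on the default file-based store entry; the metadata-path and id values are taken from this example, so adjust them to match what your generated adf-config.xml already contains:

<!-- sketch: metadata-path and ids below are from this example, keep your own generated values -->
<metadata-namespaces>
   <namespace path="/soa/shared" metadata-store-usage="mstore-usage_1"/>
   <namespace path="/apps" metadata-store-usage="mstore-usage_1"/>
</metadata-namespaces>
<metadata-store-usages>
   <metadata-store-usage id="mstore-usage_1">
      <metadata-store class-name="oracle.mds.persistence.stores.file.FileMetadataStore">
         <property name="metadata-path"
                   value="C:\1221\Oracle\Middleware\Oracle_Home\jdeveloper\integration"/>
         <property name="partition-name" value="seed"/>
      </metadata-store>
   </metadata-store-usage>
</metadata-store-usages>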

Step 2: In the SOA composite.xml, reference the common artifacts from MDS as shown below:

 <import namespace="http://xmlns.oracle.com/bpmn/bpmnProcess/CommenErrorHandlerProcess"
          location="oramds:/apps/Tesco/common/CommenErrorHandlerProcess.wsdl"
          importType="wsdl"/>
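In the same way, an XSD stored under /apps can be referenced from another schema or WSDL through the oramds: protocol; a hypothetical example (the namespace, xsd subfolder and file name below are made up for illustration):

<!-- hypothetical schema import; namespace and path are illustrative only -->
<xsd:import namespace="http://example.com/common/faults"
            schemaLocation="oramds:/apps/Tesco/common/xsd/CommonFaults.xsd"/>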

There are two methods available to deploy the artifacts to the SOA file- or DB-based MDS.

Approach 1: An Ant-based script that deploys the artifacts into the SOA MDS. I am not going to go into the details here, as this has been blogged in detail by Edwin Biemond.


Note: The same process is applicable for 12c too.

Approach 2: For a simple Manual Process SOA MDS deployment follow the steps below.

  • Open the EM console and click on soa-infra (soa_server; in my case the Admin Server) under SOA in the left pane of the EM console.
  • Click on Administration --> MDS Configuration.


  • In the Import section, choose the ZIP file that contains your MDS files.



  • Click on Import.


  • The artifacts will be imported into the server MDS.



  • Click the Close button. This completes the SOA-MDS deployment.
DB Based MDS:
Database-based repositories are used in production environments where robustness is needed. These repositories are created using the Repository Creation Utility (RCU) application from Oracle. This utility helps with the creation of a new database schema with its corresponding tables and objects. Repositories can later be registered or deregistered via the Oracle Enterprise Manager Fusion Middleware Control console.

Let's see how to create a DB-based MDS connection. In the Create SOA-MDS Connection window, select DB Based MDS as the connection type.

For a DB-based MDS connection, the WebLogic schemas must already have been created in the database by running the Repository Creation Utility (RCU) wizard. After RCU execution, the DEV_MDS schema is created in the database. Use the DEV_MDS schema connection and soa-infra as the MDS partition while creating the DB-based MDS connection, as shown in the screenshot below.

Click OK. Now the DB Based MDS connection is created.

adf-config.xml configuration:

Modify adf-config.xml to register the DB-based metadata store, as sketched below.
 adf-config.xml:
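A sketch of the corresponding fragment, assuming the DEV_MDS schema and the soa-infra partition mentioned above (the JDBC URL and password are placeholders):

<!-- sketch: JDBC details are placeholders, replace with your environment's values -->
<metadata-store-usage id="mstore-usage_2">
   <metadata-store class-name="oracle.mds.persistence.stores.db.DBMetadataStore">
      <property name="jdbc-userid" value="DEV_MDS"/>
      <property name="jdbc-password" value="password"/>
      <property name="jdbc-url" value="jdbc:oracle:thin:@dbhost:1521:orcl"/>
      <property name="partition-name" value="soa-infra"/>
   </metadata-store>
</metadata-store-usage>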


Sunday, September 18, 2016

Difference Between Concrete WSDL and Abstract WSDL

  
●  An abstract WSDL document describes what the web service does, but not how it does it or how to contact it. An abstract WSDL document defines:

The operations provided by the web service.

The input, output and fault messages used by each operation to communicate with the web service, and their format.




●  A concrete WSDL document adds information about how the web service communicates and where you can reach it. A concrete WSDL document contains the abstract WSDL definitions and also defines the following, as illustrated in the skeleton below:

The communication protocols and data encodings used by the web service.

The port address that must be used to contact the web service.
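Put differently, in a WSDL 1.1 document the types, message and portType elements make up the abstract part, while binding and service/port make up the concrete part. A stripped-down skeleton (all names and namespaces are illustrative):

<!-- skeleton only; names and namespaces are illustrative -->
<definitions name="OrderService"
             targetNamespace="http://example.com/order"
             xmlns:tns="http://example.com/order"
             xmlns="http://schemas.xmlsoap.org/wsdl/">

   <!-- abstract part: what the service does -->
   <types> ... </types>
   <message name="OrderRequest"> ... </message>
   <message name="OrderResponse"> ... </message>
   <portType name="OrderPortType">
      <operation name="submitOrder">
         <input message="tns:OrderRequest"/>
         <output message="tns:OrderResponse"/>
      </operation>
   </portType>

   <!-- concrete part: how and where to call it -->
   <binding name="OrderBinding" type="tns:OrderPortType">
      <!-- protocol and encoding details, e.g. SOAP over HTTP -->
   </binding>
   <service name="OrderService">
      <port name="OrderPort" binding="tns:OrderBinding">
         <!-- endpoint address goes here -->
      </port>
   </service>
</definitions>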




Tuesday, September 13, 2016

Multiple Directories supported in File & FTP Adapter



 The Oracle File and FTP Adapters support polling multiple directories within a single activation. You can specify multiple directories in JDeveloper as opposed to a single directory. This is applicable to both physical and logical directories.
SOA Composite:
File Service with read operation:
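For reference, the polled directories end up in a single PhysicalDirectory property of the inbound JCA activation spec. The snippet below is a hand-written sketch, not a generated file: the directory paths, separator and other property values are illustrative, and the separator your JDeveloper version writes out may differ.

<!-- sketch of the inbound activation spec; paths and values are illustrative -->
<activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
   <!-- two polled directories, delimited by the DirectorySeparator value -->
   <property name="PhysicalDirectory" value="/data/in/orders;/data/in/invoices"/>
   <property name="DirectorySeparator" value=";"/>
   <property name="Recursive" value="false"/>
   <property name="IncludeFiles" value=".*\.xml"/>
   <property name="PollingFrequency" value="10"/>
   <property name="DeleteFile" value="true"/>
</activation-spec>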

Wednesday, May 27, 2015

Human workflow migration In Oracle BPM 11g

Moving Human Workflow data from a test environment to a production environment
To move Oracle Business Process Management to a new production environment, you move Oracle Business Process Management user metadata, such as organizations and dashboards, from the test environment to the production environment using the migration tool. The migration tool is available as an Ant target that can be executed from the command line. It reads a configuration file that you create, specifying the input parameters for the migration of data.

For Organizations, the following objects are moved to the production environment: Organizational Units, Roles, Calendars, Organization Role, and Extended User Properties.
Follow these steps to export Human Workflow data to an XML file.
EXPORT
1.       Ensure that the PATH environment variable includes JAVA_HOME and ANT_HOME, and that they point to locations within the Oracle SOA Suite installation.
2.       Create a configuration file to export organizations. The following shows a sample configuration file that exports organizations.
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<testToProductionMigrationConfiguration xmlns="http://xmlns.oracle.com/bpm/t2p/migration/config"
                                        xmlns:ns2="http://xmlns.oracle.com/bpm/common" override="true" skip="true">
<sourceEndPoint>
    <serverEndPoint>
       <serverURL>t3://hostname:port</serverURL>
       <adminUserLogin>admin_username</adminUserLogin>
       <adminUserPassword>admin_password</adminUserPassword>
       <realm>jazn.com</realm>
    </serverEndPoint>
</sourceEndPoint>
<targetEndPoint>
     <fileEndPoint>
     <migrationFile>/tmp/bpm_organization.xml</migrationFile>
     </fileEndPoint>
</targetEndPoint>
<operation>EXPORT</operation>
<object>ORGANIZATION</object>
   <objectDetails>
     <login>username</login>
     <password>password</password>
    <identityContext>jazn.com</identityContext>
  <organization/>
</objectDetails>
</testToProductionMigrationConfiguration>
                   
•  In the configuration file you need to specify the values for the test environment in the following elements:

                    Migration file: the file to which the exported data will be written (here /tmp/bpm_organization.xml)
                    Server URL: the SOA server URL of the test environment
                    AdminUser Login
                    AdminUser Password
                    Object details: update the login and password elements

3.       Export organizations using the following command

       ant -f ant-t2p-workspace.xml
           -Dbea.home=BEA_HOME
           -Dbpm.home=BPM_HOME
           -Dbpm.t2p.migration.config=ORG_MIGRATION_CONFIG_FILE
           -Dbpm.admin.password=admin_password

After running this command, the bpm_organization.xml file will be generated in the path specified in the configuration file.

With this step the export is complete; next we need to import the data into the production environment.
Follow these steps to import Human Workflow data from the XML file.
IMPORT
1.       Create a configuration file to import organizations.

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<testToProductionMigrationConfiguration xmlns="http://xmlns.oracle.com/bpm/t2p/migration/config" xmlns:ns2="http://xmlns.oracle.com/bpm/common" override="true" skip="true">
   <sourceEndPoint>
      <fileEndPoint>
            <migrationFile>/tmp/bpm_organization.xml</migrationFile>
      </fileEndPoint>
   </sourceEndPoint>
   <targetEndPoint>
       <serverEndPoint>
           <serverURL>t3://hostname:port</serverURL>
           <adminUserLogin>admin_username</adminUserLogin>
           <adminUserPassword>admin_password</adminUserPassword>
           <realm>jazn.com</realm>
       </serverEndPoint>
   </targetEndPoint>
    <operation>IMPORT</operation>
    <object>ORGANIZATION</object>
       <objectDetails>
         <login>username</login>
         <password>password</password>
         <identityContext>jazn.com</identityContext>
       <organization/>
       </objectDetails>
</testToProductionMigrationConfiguration>

•  In the configuration file you must update the following elements with the values for the production environment.
                    Migration file: This element specifies the file that was generated by the export operation.
                    Server URL: SOA server URL.
                    Adminuser Login
                    AdminUser Password
                    Object details: update the login and password elements.

2.       Import organizations using the following command:

                      ant -f ant-t2p-workspace.xml
                          -Dbea.home=BEA_HOME
                          -Dbpm.home=BPM_HOME
                          -Dbpm.t2p.migration.config=ORG_MIGRATION_CONFIG_FILE
                          -Dbpm.admin.password=admin_password
That's it. Open the BPM Workspace and check whether the data has been imported.