Extraction of data from Siebel database to .dat file and staging table

+1 vote
540 views

I am working on a new requirement and I am new to this, so I am seeking your help. Requirement: from the Siebel base tables (S_ORG_EXT, S_CONTACT, S_PROD_INT) I have to export data into two staging tables (S1 and S2), and from these staging tables I need to create pipe-delimited .dat files that also include a row count. Staging table S1 should hold Accounts with their associated Contacts, and S2 should hold Accounts with their associated Contacts and Products.

How should I go about this? Should I use an Informatica job directly to pull data from the Siebel base tables, or run an EIM export job to get the data into the EIM tables and from there into the staging tables?

Kindly help me decide which way to go.

posted Apr 17, 2014 by Pooja Bhanout


1 Answer

0 votes

Access the base tables directly using Informatica, limiting the extract to only the rows and columns you need.

I'd recommend unloading these to flat files before loading them into the staging tables (it gives you a point of recovery if something goes wrong in your staging-table load, and means you don't have to hit the Siebel DB again).

Then from there you can either unload the staging tables, or just use your flat file extract, to generate your delimited files with row counts.
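As a rough illustration of that extract-to-flat-file step (not the actual Informatica mapping, which you'd build in the Designer), here is a minimal Python sketch. The DSN, credentials, column list and the PR_DEPT_OU_ID join are all assumptions; adapt them to however your Siebel schema links contacts to accounts:

    import csv
    import pyodbc  # assumes an ODBC DSN pointing at the Siebel database

    # Pull only the rows and columns the staging tables need; the join
    # via PR_DEPT_OU_ID is an assumption, not the definitive Siebel link.
    conn = pyodbc.connect("DSN=SIEBEL_DB;UID=readonly;PWD=secret")
    cur = conn.cursor()
    cur.execute("""
        SELECT o.ROW_ID, o.NAME, c.ROW_ID, c.FST_NAME, c.LAST_NAME
        FROM   S_ORG_EXT o
        JOIN   S_CONTACT c ON c.PR_DEPT_OU_ID = o.ROW_ID
    """)

    with open("accounts_contacts.txt", "w", newline="") as fh:
        writer = csv.writer(fh, delimiter="|")
        for row in cur:
            writer.writerow(row)  # one flat-file record per joined row
    conn.close()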

I tend to favour modular processes, with sensible recovery points, over 'streaming' the data through for (arguably) faster execution time, so here's what I'd do (one mapping for each):

1. Unload from Base Tables to flat files.
2. Join the flat file entities as required and create new flat files in the Staging Table format.
3. Load staging tables.
4. Unload staging tables (optional, if you can get away with using the files created in Step 2).
5. Generate .dat files in pipe-delimited format with the row count.

If the staging tables are loaded only for audit purposes etc., and you can base Step 5 on the files you created in Step 2, then you could run Step 3 concurrently with Step 5, which may reduce overall runtime.
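For Step 5, the .dat generation amounts to writing the rows pipe-delimited and appending the count. A minimal sketch, assuming a trailer record of the form TRL|<count> and a file named S1.dat; check your target spec for the actual header/trailer layout:

    import csv

    def write_dat(rows, path):
        """Write rows pipe-delimited, ending with a TRL|<count> trailer."""
        count = 0
        with open(path, "w", newline="") as fh:
            writer = csv.writer(fh, delimiter="|")
            for row in rows:
                writer.writerow(row)
                count += 1
            writer.writerow(["TRL", count])  # trailer carries the row count
        return count

    # Usage: feed it the Step 2 flat file, so Step 3 can run concurrently.
    with open("staging_s1.txt", newline="") as src:
        n = write_dat(csv.reader(src, delimiter="|"), "S1.dat")
        print(f"S1.dat written with {n} data rows")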

If this is a one-off process, or you just want to write it in a hurry, you could skip writing out the flat files and do it all in one or two mappings. I wouldn't do this, though, because:

a) it's harder to test, and

b) there are fewer recovery points.

answered Apr 18, 2014 by Shweta Singh
Similar Questions
+1 vote

I have to load a copy of the target table data into a text file while the workflow is running, i.e. whatever data goes into the target table should also go into the text file.

+1 vote

I am new to Informatica

I have one staging table where data arrives every day, and I need to extract data from this staging table, convert it into .dat file format, and place it into a folder, so that these .dat files can feed another process.

I don't know how Informatica does this (converting data from a staging table to a .dat file), so please help me understand how Informatica fetches the data from the staging table, transforms it into a .dat file, and places it into a folder.
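In Informatica this is typically a mapping with the staging table as the relational source and a flat-file target defined with a pipe delimiter, with the session's output directory pointed at the drop folder. As a hedged sketch of the equivalent logic outside Informatica (the table name, DSN and folder path are assumptions):

    import csv
    import datetime
    import pathlib
    import pyodbc

    FEED_DIR = pathlib.Path("/data/feeds/outbound")  # assumed drop folder

    # Read the staged rows and write them as a pipe-delimited .dat file.
    conn = pyodbc.connect("DSN=STAGE_DB;UID=etl;PWD=secret")
    cur = conn.cursor()
    cur.execute("SELECT * FROM STG_DAILY")  # assumed staging table name

    target = FEED_DIR / f"stg_daily_{datetime.date.today():%Y%m%d}.dat"
    with open(target, "w", newline="") as fh:
        csv.writer(fh, delimiter="|").writerows(cur)
    conn.close()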

0 votes

I have a problem in INFA where the metadata definitions of source/target tables get out of sync with the actual definitions in the underlying database. I'm working with the SFDC connector in INFA, and when the SFDC object is changed, my integration fails. Is it possible to script an update of my source/target table metadata before processing?


+2 votes

I am creating an Informatica workflow that extracts data from two tables of a DB2 database and loads it into a temporary table.

Suppose the two source tables are Account (parent) and Activities (child).
They have a 1:M relationship, meaning an Account can have many Activities (Account.PK = Activities.FK).

The Activities table has two relevant columns: 'Type', whose value can be 'Paid', 'Will-Pay' or 'Not-Paid', and 'Created_Date', a datetime column that is stamped with the date and time whenever a new activity record is created.

Now, the condition to load data into the temporary table is: "For an Account record, first check the Activities table for today's Paid activities (Type = Paid). If there is more than one Paid activity, pick the latest created one (by Created_Date). If there is no Paid activity record for the Account, pick the latest created 'Will-Pay' activity."

In other words, it should pick the latest Paid activity for today (sysdate) for an Account; only if none is present should it pick the latest Will-Pay activity for today.

Please help me understand how I can implement this logic in an Informatica workflow, and which transformations I should use and how?

Thanks a lot. Kindly help.
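One common shape for this in PowerCenter is to sort the joined rows by account, a type-priority port, and Created_Date descending, then keep the first row per account (for example with an Expression transformation holding variable ports, or a Rank transformation). The selection rule itself is easier to see in a small sketch; the field names below are assumptions, not the actual DB2 columns:

    import datetime

    # Rule: per account, keep today's latest 'Paid' activity; if the
    # account has none, keep today's latest 'Will-Pay' activity.
    # 'Not-Paid' rows are never loaded.
    PRIORITY = {"Paid": 0, "Will-Pay": 1}  # lower number wins

    def pick_activities(activities, today=None):
        """activities: iterable of dicts with account_id, type, created_date."""
        today = today or datetime.date.today()
        chosen = {}  # account_id -> winning activity row
        for act in activities:
            if act["type"] not in PRIORITY or act["created_date"].date() != today:
                continue
            cur = chosen.get(act["account_id"])
            # Rank by (type priority, most recent first); lower tuple wins.
            key = (PRIORITY[act["type"]], -act["created_date"].timestamp())
            if cur is None or key < (
                PRIORITY[cur["type"]], -cur["created_date"].timestamp()
            ):
                chosen[act["account_id"]] = act
        return list(chosen.values())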

...