Validation Transform In BODS

Hi All,
I'm trying to use the Validation transform in BODS. The scenario is:
I have declared a variable $Product holding the value 'Oil','Gas','Fuel'.
When I use the validation rule product IN ($Product), the validation fails even for valid product entries.
But when I supply the list of values directly instead of through the variable, like product IN ('Oil','Gas','Fuel'), it works fine.
I need to use the variable only. Any workaround? Please help.
Thanks and Regards,
Prateek

Prateek, please try this and let me know if it helps you.
Create a custom function (say Get_Match_Status) that accepts two parameters: the column value and the list of values ('Oil', 'Gas' and 'Fuel', in your case). It should return 0 if the column value doesn't match any of the list values, and 1 otherwise. Add a new column (Match_Status) to your Query transform (just before the Validation transform) and populate it by passing the column Product as the first argument and the variable $Product as the second: Get_Match_Status(Product, $Product). In the Validation transform you can then check the value of the column Match_Status (either 1 or 0).
Function body:
$Flag = 0;
# Walk the comma-separated list, comparing one token at a time
while (length($In_Compare_Text) > 0)
begin
     $Break_Text = word_ext($In_Compare_Text, 1, ',');
     if ($Break_Text = $In_Text)
     begin
          $Flag = 1;
     end
     # Drop the first token (and its trailing comma) from the list
     $In_Compare_Text = substr($In_Compare_Text, index($In_Compare_Text, ',', 1) + 1, length($In_Compare_Text) - index($In_Compare_Text, ',', 1));
end
if ($Flag = 1)
begin
     return 1;
end
else
begin
     return 0;
end
You need to create two input parameters:
$In_Compare_Text   varchar   4000
$In_Text           varchar   30
and two local variables:
$Break_Text   varchar   30
$Flag         int
The return parameter has to be of type int.
Regards,
Shine
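For readers more comfortable outside the DS scripting language, here is an illustrative Python sketch of the same matching logic. The quote-stripping step is an added assumption: if $Product holds the literal text 'Oil','Gas','Fuel', each token carries surrounding quotes, which is one plausible reason a plain IN ($Product) comparison fails while the hard-coded list works.

```python
def get_match_status(in_text, in_compare_text):
    """Return 1 if in_text matches any token of the comma-separated list, else 0.

    Mirrors the Get_Match_Status custom function above. Stripping the
    surrounding single quotes is an assumption, since a variable such as
    $Product may hold the literal text 'Oil','Gas','Fuel'.
    """
    for token in in_compare_text.split(','):
        if token.strip().strip("'") == in_text:
            return 1
    return 0

print(get_match_status("Gas", "'Oil','Gas','Fuel'"))   # 1
print(get_match_status("Coal", "'Oil','Gas','Fuel'"))  # 0
```

The same idea carries back to DS script: compare each extracted token against the column value rather than relying on IN over a single variable.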

Similar Messages

  • Large field even after validation transform

    Hi, I am working on BODS 4.0 with Oracle 11.2 as the repository. I am using the Validation transform to check the length of a string. Validation fails if the record length is greater than 60. I send passed records to table temp1 and failed records to table temp2. I have defined the field length as 60 in temp1.
    When I run the job, I get an error on table temp1 saying the record length is 65 while the field is 60. I am working with Russian data; Russian uses the multi-byte Cyrillic character set.
    Now if SAP has the field defined as varchar(60), how can I make sure that only records with length 60 or less are sent?
    Even though I am checking for length 60 in the validation, in actual fact the data takes more space and cannot be inserted into a field of length 60.
    Thanks,

    Hi,
    I've done the same test on BODS 3.1 and didn't see any error messages; the job ran successfully with correct results.
    I gave one field as 90 chars.
    Please let me know if you need any further help. My email is mohan.salla at telegraph.co.uk
    cheers
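The length mismatch described above is the classic character-count versus byte-count gap: a length check in the Validation transform counts characters, while a varchar(60) column under byte semantics counts bytes, and Cyrillic characters take two bytes each in UTF-8. A minimal Python illustration (the 60-byte limit mirrors the varchar(60) field in the question; whether the database counts bytes or characters depends on its character-length semantics, so treat this as a sketch):

```python
# Cyrillic text: each letter is one character but two bytes in UTF-8
word = "Нефть"  # "oil" in Russian
print(len(word))                  # 5 characters
print(len(word.encode("utf-8")))  # 10 bytes

def fits_in_field(value, byte_limit=60, encoding="utf-8"):
    """Validate on the stored size (bytes), not the character count."""
    return len(value.encode(encoding)) <= byte_limit

print(fits_in_field("Н" * 35))  # False: 35 chars but 70 bytes
print(fits_in_field("N" * 35))  # True: 35 chars and 35 bytes
```

Validating on the encoded byte length rather than the character length would keep 65-byte Russian strings out of a 60-byte field.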

  • How to create User-defined Transform in BODS

    Hi,
    Are there documents explaining how to create a user-defined transform in BODS Designer?
    Thanks
    Rex

    Hi,
    Please follow the steps mentioned in the link below:
    http://help.sap.com/saphelp_nw04/helpdata/en/f8/2857cbc374da48993c8eb7d3c8c87a/frameset.htm
    Thanks
    Swarup
    Edited by: Swarup Sawant on Mar 3, 2008 3:59 PM

  • DI Job Failed with no reason since last week : Invalid schema is specified for PASS output in the Validation Transform

    Post Author: Nono44
    CA Forum: Data Integration
    Hi,
    I have a problem with a DI job. The job hasn't worked since Friday.
    The error is:
    (11.7) 04-21-08 09:48:56 (E) (2588:2396) XRN-130125: |SESSION OracleApps_GLFacts_Load|STATEMENT
    <GUID::'93342049-fbd9-4942-bab2-5565e8e248bf' TRANSFORM Validation OUTPUT(Validation_Pass, Validation_Fail)>
    Invalid schema is specified for PASS output in the Validation Transform
    Does anybody know what this is?
    Thanks a lot.
    Arnaud


  • Ways to retrieve Validation transform information for documentation

    Hi All,
    We have a requirement to document the validation rules in all Validation transforms of the dataflows in a project.
    Does anyone know a way to retrieve the validation rules from the DI repository, e.g. a SQL statement?
    I searched the DI repo tables but had no luck.
    Hope someone can shed some light.
    Thanks,
    Bobby

    In Auto Documentation in the Management Console, you can browse all the objects in your repository, and for the Validation transform it will show the rules used in that transform. This is for browsing only, though; the print feature does not print the details for any of the transforms, so for the Validation transform too the details (i.e. the validation rules) are not printed.
    Unfortunately, today there is not an easy way to extract the validation rules via a simple query. It is a common request though that we will address in a future release (import/export of validation rules).
    That being said, technically there are some options to get access to the rules. Not straightforward though...
    The validation rules are part of the transform definition, which in turn is part of the whole dataflow definition. So they are not stored as separate objects in a table somewhere, but embedded in the ATL language we use to describe DI objects. What you could do is export the repository to an ATL file (or select from the AL_LANG_TEXT table in the repo) and scan this file for the validation rules. Below is an example of how such a rule would look; in this case the rule name is "MyRuleName" and the rule is: Query.JOB_KEY = 1.
    As a side note, the part that would be easy to get via a SQL query is the run-time statistics of the validation transform. These statistics (number of records passed/failed for each validation rule) are stored in the repo tables al_qd_* and are also used by the Validation Dashboards in the Management Console. Keep in mind that you need to check the option to 'collect data validation statistics' in order to collect these details.
    CALL TRANSFORM Validation ()
    INPUT(Query)
    OUTPUT(Validation_Pass ( JOB_NAME varchar(192),
    JOB_KEY int,
    JOB_RUNID varchar(384),
    RUN_SEQ int,
    PATH varchar(765),
    OBJECT_NAME varchar(765),
    OBJECT_TYPE varchar(765),
    ROW_COUNT varchar(765),
    START_TIME varchar(765),
    END_TIME varchar(765),
    EXECUTION_TIME varchar(765),
    DATAFLOW_NAME varchar(765),
    JOB_ID int ) ,
    Validation_Fail ( JOB_NAME varchar(192),
    JOB_KEY int,
    JOB_RUNID varchar(384),
    RUN_SEQ int,
    PATH varchar(765),
    OBJECT_NAME varchar(765),
    OBJECT_TYPE varchar(765),
    ROW_COUNT varchar(765),
    START_TIME varchar(765),
    END_TIME varchar(765),
    EXECUTION_TIME varchar(765),
    DATAFLOW_NAME varchar(765),
    JOB_ID int,
    DI_ERRORACTION varchar(1),
    DI_ERRORCOLUMNS varchar(500) )  )
    SET("validation_rules" = '<?xml version="1.0" encoding="UTF-8"?>
    <Rules collectStats="true" collectData="false" >
    <Column name="Query.JOB_KEY" enableValidation="true" noValidationWhenNull="false" >
    <RuleName> MyRuleName </RuleName>
    <Description></Description>
    <Expression uiSelection="1">
    <UIValue1>=</UIValue1>
    <UIValue2>1</UIValue2>
    <Custom> Query.JOB_KEY = 1 </Custom>
    </Expression>
    <Action sendTo="0" substOnFail="false" substValue="" />
    </Column>
    </Rules>
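Since the rules are embedded as XML inside the ATL, a quick-and-dirty scan of the export can recover them. A hedged Python sketch, assuming the rule XML looks like the fragment above (the embedded sample here is trimmed, not a full export):

```python
import re

# A trimmed sample of ATL text containing an embedded validation_rules XML,
# modeled on the fragment above
atl_text = """
SET("validation_rules" = '<?xml version="1.0" encoding="UTF-8"?>
<Rules collectStats="true" collectData="false" >
<Column name="Query.JOB_KEY" enableValidation="true" noValidationWhenNull="false" >
<RuleName> MyRuleName </RuleName>
<Expression uiSelection="1">
<Custom> Query.JOB_KEY = 1 </Custom>
</Expression>
</Column>
</Rules>')
"""

# Pair each rule name with its custom expression
rules = re.findall(r"<RuleName>\s*(.*?)\s*</RuleName>", atl_text, re.S)
exprs = re.findall(r"<Custom>\s*(.*?)\s*</Custom>", atl_text, re.S)
for name, expr in zip(rules, exprs):
    print(f"{name}: {expr}")  # MyRuleName: Query.JOB_KEY = 1
```

The same regex scan would work on a full repository export; a real export will contain many rules per dataflow, so you would likely also capture the surrounding dataflow name for context.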

  • Tuning the validation transform in Data Integrator

    Hi,
    We have to bring 21 million records from a table to a new table using the Validation transform.
    We select the PK and put a validation condition "Exists in a table" on it. The secondary table selected is the target table, and we take the failed records and pass them to the target table.
    So we have a source table, and we take the failed records from a Validation transform with an "Exists in a table" condition, where the table selected is the target table itself.
    It is taking a lot of time.
    Please pour in some suggestions to improve its performance. We have to track changes on a daily basis in this table, and the source table doesn't have a last-updated timestamp.

    There are two possible problems here: one is that you are processing all of this data outside of the database when you should be pushing the work down to the database. If you must use the Validation transform, then you can try outer joining to the target table prior to the validation step. Simply add another Query transform before the Validation transform and add the target table in again as a source. Then outer join from the source to the target table (on the PK) and add the target table's PK to the output of the new Query transform.
    Then you can check the target table PK for NULL in your Validation transform. If it's NULL, the row doesn't exist in the target.
    The other problem might be related to the database. Because you are reading and writing the same table in this dataflow (the target table), you may be experiencing locking and/or some kind of bottleneck with the transaction logs.
    To correct this, try staging the data in a different table after you run it through the Validation transform and then, in a new dataflow, move the data to the target table. (You could also use a Data_Transfer transform to accomplish this.)
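The outer-join suggestion boils down to an anti-join: keep only the source rows whose PK finds no partner in the target. A minimal Python sketch of that logic (row shapes and values are illustrative):

```python
# Source rows and the set of PKs already present in the target (illustrative data)
source_rows = [{"pk": 1, "val": "a"}, {"pk": 2, "val": "b"}, {"pk": 3, "val": "c"}]
target_pks = {1}  # pk 1 is already loaded

# Outer join + NULL check, expressed as an anti-join: a source row whose pk
# is absent from the target would carry a NULL target-side pk after the
# outer join, so it is a new row
new_rows = [row for row in source_rows if row["pk"] not in target_pks]
print([row["pk"] for row in new_rows])  # [2, 3]
```

In the dataflow itself the join and NULL check run inside the database engine when pushed down, which is where the performance gain comes from.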

  • How validation & transformation handled in ETL

    Hi,
    I am new to OWB (we are using Oracle 8i). I have to extract data from a source, do the required validation and transformation, and then finally load the data into a target table. I need help on how validation and transformation are handled in ETL.
    Illustration:
    The source table is emp_src and the target table is emp_target. I have to validate deptcd (it should be present in the dept master). While loading, I have to replace invalid dept codes with a dummy value maintained in a parameter table, and insert all the correct records, plus the erroneous records with transformations, into the target table.
    Also, i want to store all these invalid records in error table or error file.
    My questions:
    1. How can these be handled in ETL?
    2. Do I have to modify the PL/SQL procedures/jobs generated by ETL?
    3. Can I use my own procedures to accomplish the above task?
    4. Can I execute my procedure within the ETL-generated procedure?
    Thanks,
    sanjay rastogi

    I will try to answer your questions:
    1) You could do this using a "key lookup" on deptcd against the dept master. The key lookup returns NULL for each invalid value. You could then use an expression to replace the null values with a default.
    In order to log the errors you could use a "split", which enables you to insert the erroneous data into a log table.
    2) No. I have yet to come across a situation where I have had to modify the generated PL/SQL. (Doing so will cause serious issues regarding maintenance, upgrades etc.)
    3) You can use your own procedure. Simply write your PL/SQL package/procedure/function and import it into your OWB transformation library.
    4) Yes. Same as before, import your own procedures, and use them in mappings as Transformations or in Expressions
    Hope this helps.
    Roald
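Roald's answer to question 1 can be sketched outside OWB as well: a lookup that yields NULL for unknown codes, a default substitution, and a split that routes the erroneous originals to a log table. An illustrative Python version (the master contents and the dummy value are assumptions):

```python
dept_master = {"10": "SALES", "20": "HR"}  # illustrative dept master
DUMMY_DEPT = "99"                          # dummy value from the parameter table

rows = [{"emp": "A", "deptcd": "10"},
        {"emp": "B", "deptcd": "55"}]      # 55 is not in the master

target, error_log = [], []
for row in rows:
    if row["deptcd"] in dept_master:       # "key lookup" hit
        target.append(row)
    else:                                  # lookup returned NULL: split
        error_log.append(dict(row))        # log the erroneous original
        target.append({**row, "deptcd": DUMMY_DEPT})  # load with dummy value

print([r["deptcd"] for r in target])     # ['10', '99']
print([r["deptcd"] for r in error_log])  # ['55']
```

In OWB the same shape is built from a Key Lookup operator, an expression supplying the default, and a Splitter feeding the log table.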

  • Validation/Transformation Question ....

    We need to use abbreviations in the Name field in the Vendor master. The incoming data may contain the full word, but the business requirement is to use abbreviations, for example:
    INCORPORATED must be INC
    ORGANIZATION must be ORG
    MANUFACTURING must be MFG
    HOSPITAL must be HOSP
    The above are some of the examples. The requirement is to replace the full word with the abbreviation wherever it is present.
    How do we handle this in MDM? I noticed in the Data Manager there is a tab called Transformation where we can specify the table field name, the old value and the new value, but I don't know how to execute it. Is Data Manager the place to handle this?
    Can anybody help me with this?

    Hi JP,
    Yes, you are right. Data Manager is responsible for doing transformations. As you said, we have to specify the field name on which you want to create the transformation, and also specify the value to be replaced in the From field and the new value in the To field.
    For example, specify INCORPORATED in the From field and INC in the To field; then INCORPORATED is replaced with INC. If you don't specify anything in the To field, it simply deletes the value given in the From field.
    A Token option is also available to replace the values token-wise.
    To get more details go through below link:
    Re: Transformation in matching mode in data manager
    Cheers
    Narendra
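The From/To pairs behave like a word-level substitution table. Just to pin down the intended outcome (outside MDM), a Python sketch:

```python
# From/To pairs, as they would appear on the Transformation tab
ABBREV = {
    "INCORPORATED": "INC",
    "ORGANIZATION": "ORG",
    "MANUFACTURING": "MFG",
    "HOSPITAL": "HOSP",
}

def abbreviate(name):
    """Replace each full word with its abbreviation wherever it appears."""
    return " ".join(ABBREV.get(word, word) for word in name.split())

print(abbreviate("ACME MANUFACTURING INCORPORATED"))  # ACME MFG INC
```

This whole-word form matches the token-wise option Narendra mentions; a plain substring replace would also rewrite words that merely contain the pattern.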

  • Major issue when working on Data_Transfer Transform in BODS.

    Hello friends,
    I have just worked through the Data_Transfer transform using this document: http://scn.sap.com/docs/DOC-39484. Everything went fine as per the document. I first created the file manually on my desktop and later pulled it into the Data_Transfer transform. But when I tried to check whether this file contains encrypted data by opening it with "Open with", all the components and applications on the desktop changed to encrypted data.
    Could anyone please help me bring it back to the normal state?
    And if the transfer type is a table, what should I do?
    Thanks & Regards,
    Bheem.

    Hi Arun,
    I used tables from the datastore, like the sales table. Please have a look at the log.

  • How to create a validation rule in SAP BODS Job

    Hi Experts
    I have created a BODS job, and in that job I have to create a validation rule: if cust_id is null, the load must stop.
    I don't know where I have to define this validation rule in the job, or how to stop the load job if the validation rule fails.
    My job is defined as in the image below.
    Please guide me on where to define the validation rule and how to stop the load job.
    Thanks in advance
    PrasannaKumar

    Hi Samatha B,
    Thanks for your response. I have done as you said and now I can raise the exception.
    I have another requirement: with the Validation transform, the data is loaded into a Pass table and a Fail table after job execution. If any data lands in the Fail table, I have to delete the data loaded into the Pass table.
    Here I am facing a problem, as my target tables are MySQL tables. When writing scripts I wrote:
    sql('database','delete from <tablename>');
    but the SQL query execution raises an exception.
    How can I delete the data loaded into the MySQL target table using scripts?
    Please guide me on this error also.
    Thanks in Advance
    PrasannaKumar
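The two requirements in this sub-thread (stop on a null cust_id, and undo the Pass load when anything fails) can be pinned down with a small Python sketch; the row shapes are illustrative, and in DS itself the rollback would be the sql() delete script discussed above:

```python
rows = [{"cust_id": 1}, {"cust_id": None}, {"cust_id": 3}]

# Validation: split rows on the null check
passed = [r for r in rows if r["cust_id"] is not None]
failed = [r for r in rows if r["cust_id"] is None]

# If anything failed, empty the Pass table again (mirrors the
# "delete from <pass table>" script) and stop the load
if failed:
    passed = []
    # raise RuntimeError("cust_id is null: stopping the load")  # stop-the-job variant

print(len(failed), len(passed))  # 1 0
```

The commented-out raise corresponds to raising an exception in a DS script to halt the job, which is what the earlier reply in this sub-thread enabled.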

  • Problem Validating & Processing Transformation file in NW 7.0 version

    I am trying to validate and process a transformation file against my data file, and I am getting the error message shown below. When I validate the conversion files, I see the creation of the corresponding .CDM files. I even deleted the .CDM files and recreated them.
    So my question is: why is it giving the "Sheet does not exist (CONVERSION)" warning, for which it is rejecting the records inside the data file?
    DATA FILE:
    C_Category,Time,R_ACCT,R_Entity,InputCurrency,Amount
    ACTUAL,2007.DEC,AVG,GLOBAL,USD,1
    ACTUAL,2007.DEC,END,GLOBAL,USD,1
    ACTUAL,2007.DEC,AVG,GLOBAL,JPY,110
    ACTUAL,2007.DEC,END,GLOBAL,JPY,110
    ACTUAL,2007.DEC,HIST,GLOBAL,JPY,110
    ACTUAL,2007.DEC,HIST,GLOBAL,USD,1
    ACTUAL,2008.MAR,AVG,GLOBAL,USD,1
    ACTUAL,2008.MAR,END,GLOBAL,USD,1
    ACTUAL,2008.MAR,AVG,GLOBAL,JPY,107.5
    ACTUAL,2008.MAR,END,GLOBAL,JPY,105
    ACTUAL,2008.MAR,HIST,GLOBAL,JPY,110
    ACTUAL,2008.MAR,HIST,GLOBAL,USD,1
    TRANSFORMATION FILE:
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    SPECIFICMAPPING=YES
    *MAPPING
    C_Category=*col(1)
    Time=*col(2)
    R_ACCT =*col(3)
    R_Entity=*col(4)
    InputCurrency=*col(5)
    Amount=*col(6)
    *CONVERSION
    C_Category=[COMPANY]C_Category.xls!CONVERSION
    Time=[COMPANY]Time.xls!CONVERSION
    R_ACCT=[COMPANY]R_ACCT.xls!CONVERSION
    R_Entity=[COMPANY]R_Entity.xls!CONVERSION
    InputCurrency=[COMPANY]InputCurrency.xls!CONVERSION
    ERROR
    [Start validating transformation file]
    Validating transformation file format
    Validating options...
    Validation on options was successful
    Validating mappings...
    Validation on mappings was successful
    Validating conversions...
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Validation on conversions was successful
    Creating the transformation xml file; wait a moment
    Transformation xml file saved successfully
    Connecting to server...
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    ValidateRecords = YES
    Task name CONVERT:
    No 1 Round:
    Record count: 12
    Accept count: 0
    Reject count: 12
    Skip count: 0
    Error: All records are rejected

    *CONVERSION
    C_Category=C_Category.xls
    Time=Time.xls
    R_ACCT=R_ACCT.xls
    R_Entity=R_Entity.xls
    InputCurrency=InputCurrency.xls
    On validating with the above format, I am still getting the same error:
    [Start validating transformation file]
    Validating transformation file format
    Validating options...
    Validation on options was successful
    Validating mappings...
    Validation on mappings was successful
    Validating conversions...
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Sheet does not exist  (CONVERSION)
    Validation on conversions was successful
    Creating the transformation xml file; wait a moment
    Transformation xml file saved successfully
    Connecting to server...
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    ValidateRecords = YES
    Task name CONVERT:
    No 1 Round:
    Record count: 12
    Accept count: 0
    Reject count: 12
    Skip count: 0
    Error: All records are rejected

  • BODS ABAP transform question

    Hi! This is a very simple question...
    I've been trying to use ABAP transforms in BODS, and I'm having some difficulty with the documentation. In particular, it's not clear how exactly the extracted information from the ABAP program makes it into the BODS dataflow.
    Does anyone have a clearer picture of how this is done?
    Thanks,
    Scott

    Hi,
    Are you asking about ABAP dataflows or custom ABAP transforms?
    Regards.

  • Error validating the transformation file

    Hi All,
    We have a Sales application for which we are loading data through a flat file. The data file has been created and uploaded using UJFS. The transformation file gives an error while validating, saying that the conversion files do not exist. We have maintained conversion files for each of the dimensions in BPC except the ones that have to be loaded with a constant. Any help on the following error log? The conversion files exist in the company folder. Please advise.
    [Start validating transformation file]
    Validating transformation file format
    Validating options...
    Validation on options was successful.
    Validating mappings...
    Validation on mappings was successful.
    Validating conversions...
    The conversion file does not exist.  (ZTIME.XLS)
    The conversion file does not exist.  (ZACCOUNT.XLS)
    The conversion file does not exist.  (ZPRODUCT.XLS)
    The conversion file does not exist.  (CHANNEL.XLS)
    The conversion file does not exist.  (ZENTITY.XLS)
    Validation on conversions was successful.
    Creating the transformation xml file. Please wait ...
    Transformation xml file saved successfully.
    Connecting to server ...
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    [The list of conversion file]
    Conversion file: DataManager\ConversionFiles\ZTIME.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\ZACCOUNT.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\ZPRODUCT.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\CHANNEL.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\ZENTITY.XLS!CONVERSION
    Task name CONVERT:
    XML file (...BUDGET\SALES\DATAMANAGER\CONVERSIONFILES\ZTIME.CDM) is empty or is not found
    Cannot find document/directory
    Error: Validate with data file failed
    Thanks,
    Santosh

    Hi Santosh,
    I am sure you have done it correctly. However, just to be on the safe side, let's revisit all the steps.
    From BPC Excel, we create a new conversion. Check the name of the worksheet (not the workbook). By default, it will be Conversion. You can change it to something else; let's say we change it to Account, so the sheet's name is Account. We validate and save the conversion as myconversion.xls. This will create another file named myconversion.cdm. Check the location properly while saving the conversion file. The location would be
    HTTP://server_name/appset/application/DataManager/ConversionFiles.
    Save it under the company folder.
    Go to the server, where BPC has been installed. Go to the folder
    \Webfolders\appset\application\DataManager\ConversionFiles
    Check whether the conversion file is present or not. We should have both "myconversion.xls" and "myconversion.cdm".
    Create a new transformation file. Under the *CONVERSION section, we define the conversion file to be used. The format would be
    Dimension_Name=conversion_file_name.xls!sheet_name
    In our example, it would be myconversion.xls!account (note the use of both the conversion file name and the sheet name).
    Save and validate the transformation file. Usually the issue is with the conversion file name and the sheet name; check all the steps.
    Hope this helps.
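The *CONVERSION naming convention above can be checked mechanically. A hedged Python sketch that parses one such line and derives the two files that must exist on the server (the folder prefix, names, and paths are illustrative):

```python
import re

line = "C_Category=[COMPANY]myconversion.xls!account"

# Dimension = [optional folder] workbook ! sheet
m = re.match(r"(\w+)=(?:\[(\w+)\])?([^!]+)!(\w+)", line)
dimension, folder, workbook, sheet = m.groups()
print(dimension, folder, workbook, sheet)  # C_Category COMPANY myconversion.xls account

# Both the .xls and the generated .cdm must exist under
# ...\DataManager\ConversionFiles on the server
cdm_file = workbook.rsplit(".", 1)[0] + ".cdm"
print(cdm_file)  # myconversion.cdm
```

If either derived file name is missing from the ConversionFiles folder, or the sheet name after the `!` does not match the worksheet inside the workbook, you get exactly the "sheet/conversion file does not exist" validation warnings seen in these threads.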

  • How to use a SAP table in Validation Look Up

    Hi Experts,
    In my job, for every record in the input data set I need to check for the value of a data set column in an SAP table.
    I am using a Validation transform with the "Exists in Table" option, where I specify the column in the SAP table to look up.
    It gives me error BODI-1112468, saying that an SAP table cannot be used in a validation lookup.
    Please suggest any other way to resolve it.
    Thanks in advance.

    user12088323 wrote:
    How to use a Sybase table in an Oracle SQL statement?
    Sybase version: 11.9.2.4
    Oracle version: 10.2.05
    Thanks.
    Any Oracle client connected to the Oracle database can access Sybase data through the Database Gateway for Sybase (it requires an additional license) or the Database Gateway for ODBC (it's free).
    The Oracle client and the Oracle database can reside on different machines. The gateway accepts connections only from the Oracle database.
    A connection to the gateway is established through a database link when it is first used in an Oracle session. In this context, a connection refers to the connection between the Oracle database and the gateway. The connection remains established until the Oracle session ends. Another session or user can access the same database link and get a distinct connection to the gateway and Sybase database.
    Database links are active for the duration of a gateway session. If you want to close a database link during a session, you can do so with the ALTER SESSION statement.
    To access the Sybase server, you must create a database link. A public database link is the most common kind of database link.
    SQL> CREATE PUBLIC DATABASE LINK dblink CONNECT TO
      2  "user" IDENTIFIED BY "password" USING 'tns_name_entry';
    -- dblink is the complete database link name.
    -- tns_name_entry specifies the Oracle Net connect descriptor in the tnsnames.ora file that identifies the gateway.
    After the database link is created, you can verify the connection to the Sybase database as follows:
    SQL> SELECT * FROM DUAL@dblink;
    Configuring Oracle Database Gateway for Sybase

  • BODS code migration between different versions

    Hi,
    I have code as an .ATL file from version 12.2.2. I want to use this code in BODS 4.1.
    Is this code compatible with 4.1?
    Can anyone please explain clearly how I can achieve this?

    Hi,
    With respect to the ATL, it is about 90% compatible; depending on which transforms you use (for example, the Query and Validation transforms), about 10% of the information needs to be rechecked, but there are no major changes.
    You also need to consider the Management Console and other dependencies.
    Regards,
    Manoj.
