FDM event scripts firing twice during data loads

Here's an interesting one. I have added the following to three different event scripts (one at a time, ensuring only one of these exists at any one time), to clear data before loading to Essbase:
Event Script content:
' Declare local variables
Dim objShell
Dim strCMD
' Call MaxL script to run data clear calculation.
Set objShell = CreateObject("WScript.Shell")
strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
API.DataWindow.Utilities.mShellAndWait strCMD, 0
MaxL Script:
login ******* identified by ******* on *******;
execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *******.*******;
exit;
However, it appears that the clear is carried out twice, both before and after the data has been loaded to Essbase. This has been verified at each step by checking the Essbase application log:
No event script:
- No Essbase data clear in application log
Adding above to "BefExportToDat" event script:
- Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
- Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
Adding above to "AftExportToDat" event script:
- Script is executed once after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in Essbase Application log.
- Script is then executed a second time when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
Adding above to "BefLoad" event script:
- Script is NOT executed after clicking on Export in FDM Web Client (before the "Target System Load" modal popup is displayed).
- Script is executed AFTER the data load to Essbase when clicking on the OK button in the "Target System Load" modal popup. Entries are visible in Essbase Application log.
Some notes on the above:
1. "BefExportToDat" and "AftExportToDat" are both executed twice, before and after the "Target System Load" modal popup. :-(
2. "BefLoad" is executed AFTER the data is loaded to Essbase! :-( :-(
Does anyone have any idea how we might execute an Essbase database clear before loading data, and not after we have loaded fresh data? And perhaps why the above event scripts appear to be firing twice?! There does not appear to be any logic to this!
BefExportToDat - Essbase Application Log entries:
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
AftExportToDat - Essbase Application Log entries:
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
BefLoad - Essbase Application Log entries:
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...
[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)
Data Load Updated [98] cells
[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]
[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]

Hi Larry,
As mentioned, our exports do not appear to be generating the "-B.Dat" and "-C.Dat" files at present. However, you are correct with the Export and Load event scripts firing twice (once for the main TB file and again for the journal file). Does this also mean it could continue to fire an additional two times for the "-B.Dat" and "-C.Dat" files?
On the last run, the output was as follows with the modified scripts:
After clicking on Export in Workflow, the Target System Load modal popup is displayed, and the first two files have been generated:
14.24.15.0527_BefExportToDat.txt
14.24.17.0617_AftExportToDat.txt
After clicking on OK in the Target System Load modal popup, the actual load to Essbase takes place. A further six files are generated:
14.24.21.0289_BefLoad.txt
14.24.22.0117_AftLoad.txt
*14.24.22.0152_BefExportToDat-A.txt*
*14.24.22.0414_AftExportToDat-A.txt*
*14.24.22.0433_BefLoad-A.txt*
*14.24.22.0652_AftLoad-A.txt*
This makes a lot more sense, since one can see that the event scripts are being run a second time against the journal files during the data load. Many thanks, this solves my problem as I can now place my script where I want in the process chain. It's just a shame that there are not separate event scripts to distinguish between the various .Dat exports/loads, which are clearly occurring at separate times in the process chain.
Many thanks! :-)
P.S. Updated script below if anyone wishes to use it:
Sub BefExportToDat(strLoc, strCat, strPer, strTCat, strTPer, strFile)
    ' Declare local variables
    Dim strF, fso, tf, t, temp, m, milliseconds, strSuf

    ' Build a timestamp (including milliseconds) for the trace file name
    t = Timer
    temp = Int(t)
    m = Int((t - temp) * 1000)
    milliseconds = String(4 - Len(m), "0") & m
    strF = "D:\TEST\" & Replace(Time, ":", ".") & "." & milliseconds & "_BefExportToDat"

    ' Append the export file suffix ("-A", "-B" or "-C") if present, so each run is identifiable
    strSuf = UCase(Left(Right(strFile, 6), 2))
    If strSuf = "-A" Or strSuf = "-B" Or strSuf = "-C" Then
        strF = strF & strSuf & ".txt"
    Else
        strF = strF & ".txt"
    End If

    ' Write the name of the file being exported to the trace file
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set tf = fso.CreateTextFile(strF, True)
    tf.WriteLine(strFile)
    tf.Close
    Set fso = Nothing
End Sub
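P.P.S. For completeness, a minimal sketch (untested) showing how the same suffix check could be used to restrict the data clear to the main trial balance export only, so the MaxL clear from the original post does not fire again for the journal (or -B/-C) files:
Sub BefExportToDat(strLoc, strCat, strPer, strTCat, strTPer, strFile)
    ' Declare local variables
    Dim strCMD, strSuf
    ' Identify which export file triggered this event ("-A"/"-B"/"-C" = journal and other secondary files)
    strSuf = UCase(Left(Right(strFile, 6), 2))
    ' Only run the Essbase clear for the main trial balance file
    If strSuf <> "-A" And strSuf <> "-B" And strSuf <> "-C" Then
        strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
        API.DataWindow.Utilities.mShellAndWait strCMD, 0
    End If
End Sub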

Similar Messages

  • Executing Essbase Clear Script through FDM event scripts

    Hi,
    Is it possible to access and execute an Essbase clear script through FDM event scripts? I know one method is to modify the Load action, but I was wondering if it can be done any other way.
    Thanks in advance.

    The Essbase client is installed on the FDM application server, so yes. You can create a batch file that calls a calc script via the Essbase client and execute this from your event script; a sketch is shown below.
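    For illustration, a minimal sketch of that approach (the MaxL file name, application/database/calc-script names and paths are placeholders, not taken from this thread). A small MaxL file runs an existing calc script:
    ClearData.mxl:
    login ******* identified by ******* on *******;
    execute calculation MyApp.MyDb.ClearCalc;
    exit;
    The event script then shells out to it, following the same pattern as the original post:
    Dim strCMD
    strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\ClearData.mxl"
    API.DataWindow.Utilities.mShellAndWait strCMD, 0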

  • How to debug a transfer rule during data load?

    I am conducting a flat file (Excel sheet saved as a CSV file) data load. The flat file contains a date field and the value is '12/18/1988'. In the transfer rule for this field, I use a function call to transform this value to '19881218', which corresponds to the BW DATS format, but the InfoPackage monitor shows a red error:
    "Value '1981218' of characteristic 0DATE is not a number with 000008 spaces".
    Somehow, the last digit of the year 1988 was cut off, and the year picked up is 198 rather than 1988. The function code is shown below:
    FUNCTION ZDM_CONVERT_DATE.
    ""Local Interface:
    *"  IMPORTING
    *"     REFERENCE(CHARDATE) TYPE  STRING
    *"  EXPORTING
    *"     REFERENCE(DATE) TYPE  D
    DATA:
    c_date(2) TYPE c,
    c_month(2) TYPE c,
    c_year(4) TYPE c,
    c_date_combined(8) TYPE c.
    data: text(10).
    text = chardate.
    search text for '/'.
    if sy-fdpos = 1.
      concatenate '0' text into text.
    endif.
    c_month = text(2).
    c_date = text+3(2).
    c_year = text+6(4).
    CONCATENATE c_year c_month c_date INTO c_date_combined.
    date = c_date_combined.
    ENDFUNCTION.
    Could experts here tell me what's wrong, and also how to debug a transfer rule during a data load?
    Thanks

    hey Bhanu/AHP,
    I found the reason. Originally, I set the character length for the date InfoObject ZCHARDAT1 to 9, then found that the date field value (12/18/1988) has length 10. I modified the InfoObject ZCHARDAT1 length from 9 to 10 and activated it. But when defining the transfer rule for this field, before the code screen, I clicked the radio button "Selected Fields", picked the field /BIC/ZCHARDAT1, and continued to the transfer rule code screen, where the declaration lines for the InfoObject /BIC/ZCHARDAT1 are as follows:
      InfoObject ZCHARDAT1: CHAR - 000009
        /BIC/ZCHARDAT1(000009) TYPE C,
    That means even though I've modified the length to 10 for the InfoObject and activated it, the transfer rule code screen still uses the old length 9. Any idea how to get it to pick up the length 10 in the transfer rule code screen definition?
    Thanks

  • Number of parallel process definition during data load from R/3 to BI

    Dear Friends,
    We are using BI 7.00. We have a requirement to increase the number of parallel processes during data load from R/3 to BI. I want to modify this for a particular DataSource and check the effect. Can experts provide helpful answers to the following questions?
    1) When a load is running or has taken place, where can we see how many parallel processes that particular load has used?
    2) Where should I change the setting for the number of parallel processes for the data load (from R/3 to BI), and not within BI?
    3) How does the system work, and what will be the net result of increasing or decreasing the number of parallel processes?
    Expecting experts' help.
    Regards,
    M.M

    Dear Des Gallagher,
    Thank you very much for the useful information provided. The following was my observation.
    From the posts in this forum, I was given to understand that the setting for a specific DataSource can be made at the InfoPackage and DTP level. I did so and found that there is no change in the load, i.e., the system by default takes only one parallel process even though I maintained 6.
    Can you kindly explain the above point, i.e.,
    1) Even though the value is maintained at the InfoPackage level, will the system consider it or not? If not, from which transaction does the system derive the single parallel process?
    Actually, we wanted to increase the package size but failed because I could not understand what values have to be maintained. Can you explain in detail?
    Can you clarify my doubt and provide a solution?
    Regards,
    M.M

  • Error trying to display a report file in FDM event script

    Hi
    I am trying to display a file (report) during an event script and receive the error "Object reference not set to an instance of an object". The process works in version 9.3 but is generating this error during our upgrade testing on 11.1.2.2 (different hardware).
    The file (check report) is getting written correctly to the \OutBox\Log directory. The file contains a standard Check Report with a file extension of .pdf (lngFileType = 31).
    We have no issues creating the file.
    We try to display the file on the screen for the user using the following code, which fails with the "Object reference not set to an instance of an object" error and also a message "Could not load XML file:" with the file name and path:
    'Open report file in web client
    RES.PlngActionType = 4
    RES.PstrActionValue = strFileName
    Symptoms:
    We can open the same file as text with RES.PlngActionType = 1
    Everyone has rights to open/read a file in the directory
    File can be opened manually and appears correct
    Any help would be appreciated. I have not found similar issues on the forum. Support has offered no suggestions to date

    Hi and thanks for your response.
    Yes, we have tried all the various types of files (ScriptActionTypes). Type = 1 will open the file, but as a text stream without formatting, etc. If we try Type = 3, we receive a different error message:
    Error: Invalid Report ID: 0
    Detail: Stacktrace:
    upsAppServerDM._clsAppServerDM.fPublishReport(lngReportID[Int32], lngPubFileType[Int32], strReportSQL[String], strSubReportSQL[String])
    Hyperion.FDM.Pages.ViewReportActionEvent.ExecuteReport(reportID[Int32], sqlStatement[String])
    Hyperion.FDM.Pages.ViewReportActionEvent.PreviewReport()
    Also, the Validation Report is enabled (we are using Check with Warnings - ID 133). I have tried changing the default Web Settings from .PDF to .html and back again, and tried the various other default options available under Web Settings.
    If we create a simple test text file, we can open that file as text (Type = 1).
    Here is the complete error message when employing Type = 4 (XML):
    Error: Object reference not set to an instance of an object.
    Detail: Stacktrace:
    Hyperion.FDM.Pages.IntersectionSummaryCheck.FillTitleRow()
    Hyperion.FDM.Pages.IntersectionSummaryCheck.Page_Load(sender[Object], e[EventArgs])
    System.Web.UI.Control.OnLoad(e[EventArgs])
    Hyperion.FDM.Pages.BasePage.OnLoad(e[EventArgs])
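    As a stopgap while the PDF/XML route is investigated, the event script can fall back to the action type already confirmed to work in this thread (Type = 1, plain text). A minimal sketch, reusing the RES object and strFileName variable from the original code:
    'Open the check report as plain text in the web client (workaround until Type = 4 is resolved)
    RES.PlngActionType = 1
    RES.PstrActionValue = strFileName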

  • Auto-kick off MaxL script after Oracle GL data load?

    Hi guys, this question will involve 2 different modules: Hyperion and Oracle GL.
    My client's accounting department updates Oracle GL on a daily basis. My end-user client would like a script to automatically kick off the existing MaxL script that performs our daily data load in Hyperion. Currently, the MaxL script is executed manually.
    What's the best approach to build a connection so the two modules can communicate with each other? Can we use a timer to trigger the run? If so, how?

    #1 External scheduler.
    I've worked on Appworx, and it can build chains of dependent tasks. There are many other external schedulers, such as Tivoli.
    #2 As Daniel pointed out, you can use Windows scheduler.
    For every successful GL load, add a file to a folder which is accessible to your Essbase task:
    COPY Nul C:\Hyperion\Scripts\Trigger\GL_Load_Finished.txt
    Create another bat file which is scheduled to run every 5 or 10 minutes (this should start just after your GL load scheduled task).
    This is an example I have for a triggered Essbase job:
    IF EXIST %BASE_DIR%\Trigger\Full_Build_Started.txt (
        Echo "Full Build started"
    ) else (
        IF EXIST %BASE_DIR%\Trigger\Custom_Build_Started.txt (
            Echo "Custom Build started"
        ) else (
            IF EXIST %BASE_DIR%\Trigger\Post_Build_Batch_Started.txt (
                Echo "Post Build started"
            ) else (
                IF EXIST %BASE_DIR%\Trigger\Start_Full_Build.txt (
                    Echo "Trigger found starting batch"
                    MOVE %BASE_DIR%\Trigger\Start_Full_Build.txt %BASE_DIR%\Trigger\Full_Build_Started.txt
                    call %BASE_DIR%\Scripts\Batch_Files\Monthly_Build_All_Cubes.bat
                ) else (
                    IF EXIST %BASE_DIR%\Trigger\Start_Custom_Build.txt (
                        Echo "Trigger found starting Custom batch"
                        MOVE %BASE_DIR%\Trigger\Start_Custom_Build.txt %BASE_DIR%\Trigger\Custom_Build_Started.txt
                        call %BASE_DIR%\Scripts\Batch_Files\Monthly_Build_All_Cubes_Custom.bat
                    ) else (
                        IF EXIST %BASE_DIR%\Trigger\Start_Post_Build_Batch.txt (
                            Echo "Trigger found starting Post Build batch"
                            MOVE %BASE_DIR%\Trigger\Start_Post_Build_Batch.txt %BASE_DIR%\Trigger\Post_Build_Batch_Started.txt
                            call %BASE_DIR%\Scripts\Batch_Files\Monthly_Post_Build_All_Cubes.bat
                        )
                    )
                )
            )
        )
    )
    So if this bat file finds Start_Full_Build.txt in the trigger location, it renames it to Full_Build_Started.txt and calls the full build (likewise for the custom and post builds).
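    Adapted to the original question, a minimal polling sketch (illustrative only; essmsh is assumed to be on the PATH of the scheduled task's user, and the folder names and Daily_Load.mxl stand in for the existing daily MaxL script):
    REM Scheduled every 5-10 minutes in Windows Task Scheduler.
    REM If the GL trigger file exists, rename it and kick off the existing daily MaxL load.
    IF EXIST C:\Hyperion\Scripts\Trigger\GL_Load_Finished.txt (
        MOVE C:\Hyperion\Scripts\Trigger\GL_Load_Finished.txt C:\Hyperion\Scripts\Trigger\GL_Load_Started.txt
        call essmsh C:\Hyperion\Scripts\Daily_Load.mxl
    )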
    Regards
    Celvin
    http://www.orahyplabs.com

  • Query Execution during Data Loads (extraction)

    I think BI 7.0 permits that, but I would still like to confirm with the gurus.
    Can we/users continue to access data or execute queries while extraction is going on? Can we load data during query execution?
    What are the pros and cons of doing that?
    Always appreciative of your help.
    Suresh

    Hi,
    Query execution will not really hamper data loading or vice versa. But freshly loaded data would not be available for reporting before it gets activated in the infoprovider. Also in case of a cube, if the 'delete overlapping requests' step is to be performed, there could be erroneous looking data in the report till the time this step runs - that is, between the time when the new load has come in and the old request is deleted. That is why loads are best scheduled when users are not working on the system.

  • Segmentation fault error during data load in parallel with multiple rules

    Hi,
    I'm trying to do a SQL data load in parallel with multiple rules files (4 or 5 rules, maybe), and I'm getting a "segmentation fault" error. I tested with 3 rules files and it worked fine. We're using Essbase System 9.3.2, with UDB (v8) as the SQL data source. The ODBC driver is DataDirect 5.2 DB2 Wire Protocol Driver (ARdb222). Please let me know if you have any information on this.
    thx.
    Y

  • Maxl Error during data load - file size limit?

    Does anyone know if there is a file size limit while importing data into an ASO cube via MaxL? I have tried to execute:
    Import Database TST_ASO.J_ASO_DB data
    using server test data file '/XX/xXX/XXX.txt'
    using server rules_file '/XXX/XXX/XXX.rul'
    to load_buffer with buffer_id 1
    on error write to '/XXX.log';
    It errors out after about 10 minutes and gives "unexpected Essbase error 1130610". The file is about 1.5 gigs of data. The file location is right. I have tried the same code with a smaller file and it works. Do I need to increase my cache or anything? I also got "DATAERRORLIMIT" reached and I cannot find the log file for this...? Thanks!

    Have you looked in the data error log to see what kind of errors you are getting? The odds are high that you are trying to load data into calculated members (or upper-level members), resulting in errors. It is most likely the former.
    You specify the error file with the
    on error write to '/XXX.log';
    statement. Have you looked for this file to find out why you are getting errors? Do yourself a favor: load the smaller file and look for the error file to see what kind of an error you are getting. It is possible that your error file is larger than your load file, since multiple errors on a single load item may result in a restatement of the entire load line for each error.
    This is a starting point for your exploration into the problem.
    DATAERRORLIMIT is set in the config file, default 1000, max 65000.
    NOMSGLOGGINGONDATAERRORLIMIT, if set to TRUE, just stops logging and continues the load when the data error limit is reached. I'd advise using this only in a test environment since it doesn't solve the initial problem of data errors.
    Probably what you'll have to do is ignore some of the columns in the data load that load into calculated fields. If you have some upper-level members, you could put them in a skip-loading condition.
    Let us know what works for you.
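    For reference, these are essbase.cfg settings on the Essbase server (a sketch based on the values mentioned above; the server needs a restart to pick up essbase.cfg changes):
    DATAERRORLIMIT 65000
    NOMSGLOGGINGONDATAERRORLIMIT FALSE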

  • Errors during Data loads

    Hi,
    Our end users are preparing their scripts for UAT, and I have been asked to provide a list of data errors which they might need to test during UAT.
    Can someone please help me with this? This is very urgent!
    Thanks,
    RPK.
    Message was edited by:
            RPK

    HI,
    1. No IDocs could be sent to SAP BW using RFC.
    2. IDocs were found in the ALE inbox for the source system that are not updated; processing is overdue.
    5. 1st - It is a PC file source system.
       2nd - There are duplicate data records leading to the error.
    7. 1st - It is a PC file source system.
       2nd - For uploading data from the PC file, it seems the file was not kept on the application server and the job was scheduled in the background.
    8. The background processing was not finished in the source system. It is possible that the background processing in the source system was terminated. System response: there are incomplete data packets present.
    9. After looking at all the above error messages, we find that if you want to load data with the delta update, you must first initialize the delta process.
    10. Here we can see that the activation of the ODS has failed, as the request status in the ODS may not be green.
    13. The fiscal year is mostly the financial year and varies depending on which fiscal year variant the client is using. A client may consider April the start month of the (financial) year; this can be achieved by defining the fiscal year. One client can have different fiscal year definitions with different fiscal year variants, so fiscal year variants determine the start and end of the fiscal year. For example, in India we follow a fiscal year from April to March. You can have a look at the fiscal year variant and the periods in transaction OB29.
    14. These types of errors occur due to space problems.

  • Member In Outline, Not Found During Data Load

    I am exporting data from Database A (zero-level export in columns to a text file) using MaxL. I then import this data (using MaxL) into Database B on another server, which has the exact same outline. We set this up about 3 weeks ago; the export/import scripts run every day and have been working fine. However, today I got an error while loading into Database B: 3 members could not be loaded:
    Member 73997612 Not Found In Database
    Member STCR Not Found In Database
    Member 22-340101 Not Found In Database
    Initially I thought that these 3 members had been added to Database A, and were therefore in the export file but probably didn't exist in Database B. However, when I went to Database B and did a "find members" on the outline, all 3 members are there. Just a note: they had been added manually, but they were added.
    Then I thought maybe they had been added in the wrong dimension. I checked the location of the above three members in Database A, and they were added in the same location in the hierarchy in Database B.
    So if the members exist in Database B and are not in the wrong location, why can't they be loaded to Database B ?
    I also wanted to add that there seemed to be several data values which existed for these members. I am guessing they came from the export.
    Version is 11.1.2.1. OS = Oracle Enterprise Linux
    Has anybody encountered an issue like this before ? I don't think it has to do with the member names, they seem to fit syntax requirements PLUS they are exported from the production database.
    Thanks in advance
    Edited by: EssbaseApprentice on Feb 9, 2012 11:12 AM
    Edited by: EssbaseApprentice on Feb 9, 2012 11:54 AM

    Check that the member names are exactly the same (without any extra spaces, spelling differences, etc.).
    Check the storage type.

  • Concatenate two fields during data loading - SQL Loader

    How do I concatenate two fields (date and time) into a timestamp field during loading?
    The data is in text format and contains variable data.
    Example data:
    abc#10.02.2003#16:23:12.12345#dsd#
    Expected output in a timestamp field:
    10.02.2003 16:23:12.12345

    Let me try:
    LOAD DATA
    INFILE 'myfile.dat'
    INTO TABLE mytable
    FIELDS TERMINATED BY '#' OPTIONALLY ENCLOSED BY '"'
    (col1 CHAR(3),
    col2 TIMESTAMP "DD.MM.YYYY HH24:MI:SS.FF5" ":col2 || ' ' || :field1",
    field1 FILLER CHAR,
    col3 CHAR(3))
    The FILLER datatype tells sqlldr that this part of the record is not data to be loaded into the table.
    P.S. I am not sure if I got the timestamp format string right.
    P.P.S. This will only work with 9i, as 8i and below do not have a timestamp datatype.
    P.P.P.S. Your other option would be to define an external table (again 9i) and then do an insert statement into the final table: insert into mytable (select col1, col2||col3, col4 ...
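    For the external-table route mentioned in the last P.S., a rough sketch (9i and later; the table name mytable_ext, the directory object data_dir and the column names are illustrative, not from the original post):
    CREATE TABLE mytable_ext (
      col1      VARCHAR2(3),
      date_str  VARCHAR2(10),
      time_str  VARCHAR2(20),
      col3      VARCHAR2(3)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '#' OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('myfile.dat')
    );
    -- Concatenate date and time while inserting into the target table
    INSERT INTO mytable (col1, col2, col3)
    SELECT col1,
           TO_TIMESTAMP(date_str || ' ' || time_str, 'DD.MM.YYYY HH24:MI:SS.FF5'),
           col3
    FROM mytable_ext;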

  • Error during data load due to special characters in source data

    Hi Experts,
    We are trying to load Billing data into the BW using the billing item datasource. It seems that there are some special characters in the source data. When the record with these characters is encountered, the request turns red and the package is not loaded even into the PSA. The error we get in the monitor is something like
    'RECORD 5028: Contents from field ****  cannot be converted into type CURR',
    where the field **** is a key figure of type currency. We managed to identify the said record in RSA3 on the source system and found that one of the fields contains some invalid (special) characters that show up as squares in RSA3. The data in the rest of the fields, including the fields mentioned in the error, looks correct.
    Our source system is a non-Unicode system whereas the BW system is Unicode enabled. I figure that the data in the rest of the fields is getting misaligned due to the presence of the invalid characters in the above field. This was confirmed when we unassigned the field with the special characters from the transfer rules and removed the source field from the transfer structure. After doing this, the data was loaded successfully and the request turned green.
    Can anyone suggest a way to either filter out such invalid characters from the source data or make some settings in the BW systems such that the special characters are no longer invalid in the BW system? We cannot write code in the transfer rules because the data package does not even come into the PSA. Is there any other method to solve this problem?
    Regards,
    Ted

    Hi Thad,
    I was wondering: whether the system is Unicode or non-Unicode should not matter for the amount and currency fields, as currencies are defined by SAP and the currency code part (at least the 3 characters) is plain English.
    Could this be because of some inconsistency in the data?
    I would like to know which currency had the special characters in that particular record.
    Hope that helps.
    Regards
    Mr Kapadia

  • How to prevent OLAP from doing unneccessary aggredations during data load?

    Hi,
    I'm trying to create a relatively simple two-dimensional OLAP cube (you might want to call it "OLAP square"). My current environment is 11.2EE with AWM for workspace management.
    One dimension is date, all years->year->month->day, the other one is production unit, implemented as a hierarchy with a certain machine at the bottom level. The fact is defined by a pair of  bottom-level values of these dimensions; for instance, a measure is taken once a day from each machine. I would like to store these detailed facts in a cube together with aggregates, so they could be easily drilled down to without querying the original fact table.
    The aggregation rules are set to "Aggregate from level = default" (which is day and machine respectively) for both of my dimensions, the cube is mapped to fact table with dimension tables, the data is loaded, and the whole thing is working as expected.
    The problem is with the load itself, I noticed it being too slow for my amount of sample data. After some investigation of the issue I found out a query in cube_build_log table, a query the data is actually being loaded with.
    <SQL>
      <![CDATA[
    SELECT /*+  bypass_recursive_check  cursor_sharing_exact  no_expand  no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      MAX(T7_TEMPERATURE)  ALIAS_39,
      MAX(T7_TEMPERATURE)  ALIAS_40,
      MAX(T7_METAL_HEIGHT)  ALIAS_41
    FROM
      (SELECT /*+  no_rewrite */
        T1."DATE_TRUNC" T7_DATE_TRUNC,
        T1."METAL_HEIGHT" T7_METAL_HEIGHT,
        T1."TEMPERATURE" T7_TEMPERATURE,
        T1."POT_GLOBAL_ID" T7_POT_GLOBAL_ID
      FROM
        POTS."POT_BATH" T1   )
      T7,
      (SELECT /*+  no_rewrite */
        T1."ID_DIM" T4_ID_DIM,
        T1."ID_DAY" T4_ID_DAY
      FROM
        RI."DIM_DATES" T1   )
      T4,
      (SELECT /*+  no_rewrite */
        T1."ID_DIM" T1_ID_DIM,
        T1."ID_POT" T1_ID_POT
      FROM
        RI."DIM_POTS" T1   )
      T1
    WHERE
      ((T4_ID_DIM = T7_DATE_TRUNC)
        AND (T1_ID_DIM = T7_POT_GLOBAL_ID)
        AND ((T7_DATE_TRUNC)  IN  < a long long list of dates for currently processed cube partition is clipped >  ) ) ) 
    GROUP BY
      (T1_ID_POT, T4_ID_DAY) 
    ORDER BY
      T1_ID_POT ASC NULLS LAST ,
      T4_ID_DAY ASC NULLS LAST ]]>
    </SQL>
    Notice T4_ID_DAY,  T1_ID_POT in the top level column list - these are bottom-level identifiers of my dimensions, which means the query isn't actually doing any aggregation here, as there is only one fact per each pair of (ID_DAY, ID_POT).
    What I want to do is to somehow load the data without doing this (totally useless in my case) intermediate aggregation. Basically, I want it to be something like
    SELECT /*+  bypass_recursive_check  cursor_sharing_exact  no_expand  no_rewrite */
      T4_ID_DAY ALIAS_37,
      T1_ID_POT ALIAS_38,
      T7_TEMPERATURE  ALIAS_39,
      T7_TEMPERATURE  ALIAS_40,
      T7_METAL_HEIGHT  ALIAS_41
    FROM etc...
    without any aggregations. In fact, I can even live with this loading query, as the amounts of data are not that large, but I want things to work the right way (more or less).
    Any chance to do it?
    Thanks.

    I defined a primary key over all the dimension keys in the fact table, but for some reason the build still uses an aggregation. Probably because the aggregation operator for the cube I'm currently playing with is actually set, and I don't see a way to undefine it from the UI toolbar you are referring to. This is a piece of the mapping section from the workspace file I exported using AWM:
    <CubeMap
          Name="MAP1"
          IsSolved="False"
          Query="POT_BATH_T"
          AggregationMethod="SUM">
    Looks like the build aggregates because it is clearly told to do so by the AggregationMethod attribute? Any way to override it?

  • PERFORM_CONFLICT_TAB_TYPE  shortdump during data loading into ODS

    Hi,
    I'm trying to load data into an ODS in the Production and QA systems, but I'm getting a short dump that says PERFORM_CONFLICT_TAB_TYPE.
    In the Development system, the data loads into the ODS fine.
    So please tell me what I have to do.
    I will assign the points.
    Rizwan

    here it is..
    Note 707986 - Writing in trans. InfoCubes: PERFORM_CONFLICT_TAB_TYPE
    Summary
    Symptom
    When data is written to a transactional InfoCube, the termination PERFORM_CONFLICT_TAB_TYPE occurs. The short dump lists the following reasons for the termination:
               ("X") The row types of the two tables are incompatible.
               ("X") The table keys of the two tables do not correspond.
    Other terms
    transactional InfoCube, SEM, BPS, BPS0, APO
    Reason and Prerequisites
    The error is caused by an intensified type check in the ABAP runtime environment.
    Solution
    Workaround for BW 3.0B (SP16-19), BW 3.1 (SP10-13)
               Apply the attached correction instructions.
    BW 3.0B
               Import Support Package 20 for 3.0B (BW3.0B Patch20 or SAPKW30B20) into your BW system. The Support Package is available once note 0647752 with the short text "SAPBWNews BW3.0B Support Package 20", which describes this Support Package in more detail, has been released for customers.
    BW 3.10 Content
               Import Support Package 14 for 3.10 (BW3. 10 Patch14 or SAPKW31014) into your BW system. The Support Package is available once note 0601051  with the short text "SAPBWNews BW 3.1 Content Support Package 14" has been released for customers.
    BW3.50
               Import Support Package 03 for 3.5 (BW3.50 Patch03 or SAPKW35003) into your BW system. The Support Package is available once note 0693363 with the short text "SAPBWNews BW 3.5 Support Package 03", which describes this Support Package in more detail, has been released for customers.
     The notes specified may already be available to provide advance information before the Support Package is released. However, in this case the short text still contains the term "Preliminary version".
    Header Data
    Release Status: Released for Customer
    Released on: 18.02.2004  08:11:39
    Priority: Correction with medium priority
    Category: Program error
    Primary Component: BW-BEX-OT-DBIF Interface to Database
    Secondary Components: FIN-SEM-BPS Business Planning and Simulation
     Releases
     Software Component | Release | From Release | To Release | And Subsequent
     SAP_BW | 30 | 30B | 30B |
     SAP_BW | 310 | 310 | 310 |
     SAP_BW | 35 | 350 | 350 |
     Support Packages
     Support Package | Release | Package Name
     SAP_BW_VIRTUAL_COMP | 30B | SAPK-30B20INVCBWTECH
    Related Notes
    693363 - SAPBWNews BW SP03 NW'04 Stack 03 RIN
    647752 - SAPBWNews BW 3.0B Support Package 20
    601051 - SAPBWNews BW 3.1 Content Support Package 14
     Correction Instructions
     Correction Instruction | Valid from | Valid to | Software Component | Ref. Correction | Last Modification
     301776 | 30B | 350 | SAP_BW | J19K013852 | 18.02.2004 08:03:33
     Attributes
     Attribute | Value
     weitere Komponenten (further components) | 0000031199
    Vishvesh
