Data import from ASCII file : javascript to determine correct account

Hi,
I am importing data from an ASCII file and need to use a conversion file with javascript to determine the correct account depending on the amount I am importing.
As an example:
If external account B4711 carries an amount of 1000 it needs to be imported into internal account A890, if it carries an amount of -1000 it needs to be imported into internal account B890.
I have the VB script to do this in the MS version but am struggling with the javascript for the NW version.
In the transformation file, the account is built by concatenating the amount from the load file, the string |, and the account from the load file. In my example I would get either 1000|B4711 or -1000|B4711. This part works fine.
The conversion file looks like this:
external                  internal                                                                     formula
*|B4711                 *IF(js:%external%.substring(0,1)=- then B890; A890)      
The idea is that if the first character is the minus sign, the conversion will return account B890; if it is any other character, the conversion will return A890. The actual result is that all data is imported into account A890 irrespective of the amount (both positive and negative amounts).
Can anyone help me with this?
Thanks,
Arnold

Hi Arnold,
Oh no! I thought this was going in the Formula column. I don't know how I got that idea.
So, I reviewed the documentation ([http://help.sap.com/saphelp_bpc75_nw/helpdata/en/81/94a8a5febd40268d5c59b4fc31be37/frameset.htm]) and you are correct that you should be able to use JS with the *IF statement; I was incorrect about that, at least according to the docs. In your original JS you probably just needed to use == instead of = and put the minus sign in double quotes.
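With that change, the expression from your conversion file would end up as something like this (untested, kept in the same column as your original attempt):
*IF(js:%external%.substring(0,1)=="-" then B890; A890)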
They also give an example of how to do it without JS, which I guess would look like
*If(%external%=-* then 242000; 128000)
So, that might work as well, though I'm not sure about how it's going to handle that minus sign.
Ethan

Similar Messages

  • Problem in data import from dump file on NAS device

    I am using Oracle 10g on Solaris 10. I have mounted a NAS device on my Solaris machine.
    I have created a directory object DIR on my NAS storage, in which I have placed the dump file.
    Now when I execute the following command to import data:
    impdp user/pass DIRECTORY=DIR DUMPFILE=MQA.DMP logfile=import.log
    I then get the following error message:
    ORA-39002: invalid operation
    ORA-39070: Unable to open the log file.
    ORA-29283: invalid file operation
    ORA-06512: at "SYS.UTL_FILE", line 475
    ORA-29283: invalid file operation
    Can anybody help?

    Also, Data Pump from a NAS device can perform slowly.
    Take a look at this:
    10gR2 "database backup" to nfs mounted dir results in ORA-27054 NFS error
    Note:356508.1 - NFS Changes In 10gR2 Slow Performance (RMAN, Export, Datapump) From Metalink.
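    Separately, the ORA-39070/ORA-29283 errors usually mean the database server's OS user cannot reach or write to the directory path, so it is worth confirming the directory object and its privileges first. A minimal check could look like this (run as a DBA; import_user stands for whatever account you run impdp with):
    SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DIR';
    GRANT READ, WRITE ON DIRECTORY DIR TO import_user;
    Note that the path behind the directory object must exist on the database server itself and be writable by the oracle OS owner, not just be visible from your client.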

  • Data loading from flat file to cube using BW 3.5

    Hi Experts,
    Kindly give me the detailed steps, with screens, for data loading from a flat file to a cube using BW 3.5. Please.

    Hi ,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
           1.      Select the application components in which you want to create the DataSource and choose Create DataSource.
           2.      On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
           3.      Go to the General tab page.
                                a.      Enter descriptions for the DataSource (short, medium, long).
                                b.      As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
                                c.      Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
           4.      Go to the Extraction tab page.
                                a.      Define the delta process for the DataSource.
                                b.      Specify whether you want the DataSource to support direct access to data.
                                c.      Real-time data acquisition is not supported for data transfer from files.
                                d.      Select the adapter for the data transfer. You can load text files or binary files from your local work station or from the application server.
    Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
    Binary files contain data in the form of Bytes. A file of this type can contain any type of Byte value, including Bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
                                e.      Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file. If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
                                  f.      Depending on the adapter and the file to be loaded, make further settings.
    ■       For binary files:
    Specify the character record settings for the data that you want to transfer.
    ■       Text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
    If the data separator character is a part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
    You chose the ; character as the data separator. However, your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters. (A small sample file illustrating this is shown after this procedure.)
    If the escape characters do not enclose the value but are used within the value, the system interprets the escape characters as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
    In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
                                g.      Make the settings for the number format (thousand separator and character used to represent a decimal point), as required.
                                h.      Make the settings for currency conversion, as required.
                                  i.      Make any further settings that are dependent on your selection, as required.
           5.      Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
                                a.      Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
                                b.      In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
           6.      Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
                                a.      To define a field, choose Insert Row and specify a field name.
                                b.      Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
                                c.      Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
                                d.      Change the data type of the field if required.
                                e.      Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
                                  f.      Specify whether lowercase is supported.
                                g.      Specify whether the source provides the data in the internal or external format.
                                h.      If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
                                  i.      If required, specify a conversion routine that converts data from an external format into an internal format.
                                  j.      Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
                                k.      Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
                                  l.      Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
           7.      Check, save and activate the DataSource.
           8.      Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm
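    For illustration, a CSV fragment matching the escape-character example above might look like this (hypothetical field names and values):
    MATERIAL;DESCRIPTION;AMOUNT
    4711;"Pump; industrial";"12;45"
    4712;Valve;100
    Here the second and third fields of the first data row contain the separator, so they are enclosed in the escape character; the other values need no enclosing.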

  • Can I recover my data information from excel file?

    For some reason I have lost (cleared all) my data. Is there any option to import the data from an Excel file back into the view responses sheet? Or how else can I recover my data?
    Antonio

    If you had saved your Excel file, then you may revert the Excel file to the last saved version. Follow the steps below to do this:
    On the File tab, click Open.
    Double-click the name of the file that you have open in Excel.
    Click Yes to reopen the Excel file.
    If you had not saved the Excel file, then follow the steps below to recover your file.
    Click the File tab.
    Click Recent.
    Click Recover Unsaved Workbooks.

  • Import From CSV File statement runs forever, no error, does not finish

    Hello,
    I am trying to import a CSV file in a JAVA program, with the following statement:
    IMPORT FROM CSV FILE '/debug/testdatabase/FILE.csv'
    INTO "JOSEPH"."TEST_TABLE"
    WITH COLUMN LIST IN FIRST ROW
    RECORD DELIMITED BY '\n'
    FIELD DELIMITED BY '\t' ERROR LOG '/debug/testdatabase/file.err'
    THREADS 10
    BATCH 10000
    I have two HANA instances on different machines A and B:
    Both machines run HANA version 1.00.74.00.389160 (NewDB100_REL), while the OS is
    SUSE Linux Enterprise Server 11.1 on machine A and
    SUSE Linux Enterprise Server 11.2 on machine B.
    The statement above runs fine on machine A and the rows are imported properly from JAVA as well as when executed from HANA Studio SQL console.
    If I copy the file to machine B and try the exact same statement with the same file, it does not finish (neither from JAVA nor from HANA Studio SQL console). There is no error either. It cannot be cancelled, only a HANA restart stops the statement. Also the sample file I use has only 2 rows, and memory does not seem to be a problem.
    I seem to have a similar problem to the one described here, but the answers there do not help me: http://scn.sap.com/thread/3396582
    I specified the record delimiter, and I used a Python script to check for any strange characters that are not supposed to be there, but didn't find any.
    If I copy the file to my windows PC and use the "File Menu -> Import -> SAP Hana Content -> Data from Local file" function, it imports the file correctly into B, but I need to be able to do it from JAVA.
    Machine A administration view:
    Machine B administration view:
    If you have any idea what might cause this behavior or where I can find more information on this problem please give me a hint.

    Hi Joseph,
    First, from the pics, the revision of your SAP HANA instance is 73 instead of 74. Since I have no identical environment, I cannot test it for you. But can you try the simplest scenario? Create a table with only one column and try to import a CSV file with only one row.
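    A minimal version of that test could look like this (hypothetical table and file names, following the syntax of your original statement):
    CREATE COLUMN TABLE "JOSEPH"."TEST_ONE" ("ID" INTEGER);
    IMPORT FROM CSV FILE '/debug/testdatabase/one_row.csv'
    INTO "JOSEPH"."TEST_ONE"
    RECORD DELIMITED BY '\n'
    FIELD DELIMITED BY ','
    ERROR LOG '/debug/testdatabase/one_row.err';
    If even this hangs on machine B, the file itself is probably not the cause.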
    Best regards,
    Wenjun

  • Error Message in Data Import from Excel

    Hi All
    Please can you help - I am trying to import some BP data using the option in the menu Administration - Data Import/Export - Data Import from Excel.
    I have created my import file as a text file and provided the relevant mappings. Having located my text file, I then get the following error message:
    Row Number 1: Internal error (-2007) occurred.
    Has anyone got any ideas as to what this means I've done wrong?
    Thanks

    Dear, 
    Kindly refer to note 1296487 below:
    Symptom
    In an attempt to import a Business Partner from an Excel file to SAP Business One, the following error messages are displayed and the Business Partner is not imported:
        o  When there are no House Banks in the company, the following error messages are displayed:
           Row Number 1: Internal error 'House Bank Accounts' occurred, and, 0 Records imported successfully.
        o  When there are no Payment Methods in the company, the following error messages are displayed:
           Internal error 'Payment Methods for Payment Wizard' (OPYM) (-2007) occurred, and, 0 Records imported successfully.
        o  When Payment Methods and House Bank are defined in the company, the following error messages are displayed:
           Row Number 1: Internal error (-2007) occurred, and, 0 Records imported successfully.
    Other terms
    DB, export, BP
    Reason and Prerequisites
    Application error
    Solution
    SAP intends to provide patches in order to solve the described problem. The section Reference to Related Notes below will list the specific patches when available. The corresponding Info file of the patches on SAP Service Marketplace will also show the correction / SAP Note number. Be aware that these references can only be set at patch release date. SAP will deliver patches only for selected releases at its own discretion, based on the business impact and the complexity of the implementation.
    If this is your case, we would like to inform you that we plan to fix the problem you raised in patch 05 for SAP Business One 2007 A FP01 / SP01 Release.
    For SAP Business One Patch Delivery Schedule please look at:
    http://service.sap.com/smb/sbo/patches
    Best regards,
    Apple

  • Can you use a subquery when importing from a file?

    I was wondering if the DBMS_DATAPUMP API allows using a subquery (DATA_FILTER) when executing an import from a file?
    I am unable to limit my data when creating the export file because my subquery requires a database link, which is not an option on the server I'm working on. So I'm wondering if I can limit my data on the import from a file, rather than on the export?
    Thanks for any advice...
    ivalum21

    I was wondering if the DBMS_DATAPUMP API allows using a subquery (DATA_FILTER) when executing an import from a file?
    From the documentation I would say yes:
    SUBQUERY:
    Specifies a subquery that is added to the end of the SELECT statement for the table. If you specify a WHERE clause in the subquery, you can restrict the rows that are selected. Specifying an ORDER BY clause orders the rows dumped in the export which improves performance when migrating from heap-organized tables to index-organized tables.
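    A rough sketch of what that could look like on the import side with the DBMS_DATAPUMP API (hypothetical directory, dump file, table name, and WHERE clause; untested):
    DECLARE
      h       NUMBER;
      l_state VARCHAR2(30);
    BEGIN
      -- open a table-mode import job
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      -- point the job at the existing export dump file
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'export.dmp', directory => 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      -- restrict the rows loaded for one table via the SUBQUERY data filter
      DBMS_DATAPUMP.DATA_FILTER(handle => h, name => 'SUBQUERY',
                                value => 'WHERE created_date > SYSDATE - 30',
                                table_name => 'MY_TABLE');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, l_state);
    END;
    /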

  • Reference Data Imports from ECC

    All of a sudden our reference data imports from ECC to ESourcing are failing with this error.
                "is either unknown data type or it is missing the required DataType(..) declaration"
    They used to work.  Can someone point me towards the place to check for the issue?  There is a DataType Declaration at the top of the data file (see example) and the data file format has not been changed.
                    #DataType(masterdata.Currency)
                    "DISPLAY_NAME","DOCUMENT_DESCRIPTION","IN_EURO","DISPLAY_PRECISION","STORAGE_PRECISION"
                    "ADP","Andoran peseta" "","0","0"
    Thanks,
    Keith
    Edited by: Keith Wendel on Nov 11, 2009 11:50 AM
    NOTE - I changed brackets to parentheses in the two indented lines above to make them stop appearing as HTML links.

    Turned out that there were unprintable characters at the front of the files...caused by the encoding used in the source ECC system ABAP program.
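    If you ever need to spot such characters, one quick check is to dump the first few bytes of the file; a UTF-8 byte order mark, for example, shows up as ef bb bf (the file name here is hypothetical):
    head -c 8 Currency.csv | xxd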

  • Data import from EBS failed via FDMEE in FDM. Getting error message "Error connecting to AIF URL".

    FDM data import from EBS failed via FDMEE after rolling back the 11.1.2.3.500 patch. Getting the below error message in the ERPI Adapter log.
    *** clsGetFinData.fExecuteDataRule @ 2/18/2015 5:36:17 AM ***
    PeriodKey = 5/31/2013 12:00:00 AM
    PriorPeriodKey = 4/30/2013 12:00:00 AM
    Rule Name = 6001
    Execution Mode = FULLREFRESH
    System.Runtime.InteropServices.COMException (0x80040209): Error connecting to AIF URL.
    at Oracle.Erpi.ErpiFdmCommon.ExecuteRule(String userName, String ssoToken, String ruleName, String executionMode, String priorPeriodKey, String periodKey, String& loadId)
    at fdmERPIfinE1.clsGetFinData.fExecuteDataRule(String strERPIUserID, String strDataRuleName, String strExecutionMode, String strPeriodKey, String strPriorPeriodKey)
    Any help Please?
    Thanks

    Hi
    Getting this error in ErpiIntergrator0.log. ODI session IDs were not generated in ODI / FDMEE. If I import from FDMEE itself, it imports data from EBS.
    <[ServletContext@809342788[app:AIF module:aif path:/aif spec-version:2.5 version:11.1.2.0]] Servlet failed with Exception
    java.lang.RuntimeException
    at com.hyperion.aif.servlet.FDMRuleServlet.doPost(FDMRuleServlet.java:76)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
    at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)

  • Error While Importing from EIF file into AW

    Hi All,
    I am facing the following error while importing from an EIF file into an AW, both using AWM and from the SQL prompt.
    java.sql.SQLException: ORA-36786: (IOSEC04) File POCAW/abc.eif does not exist or cannot be accessed.
    ORA-06512: at "SYS.DBMS_AW", line 42
    ORA-06512: at line 1
    at oracle.olap.awm.eif.eifimport.ImportDialogActionThread.doExportOrImport(ImportDialogActionThread.java:172)
    at oracle.olap.awm.eif.EifDialogActionThread.doAction(EifDialogActionThread.java:58)
    at oracle.olap.awm.ui.dialog.ThreadedDialogActionThread.run(ThreadedDialogActionThread.java:49)
    I have already created the directory POCAW.
    I have done the following steps:
    CREATE OR REPLACE DIRECTORY POCAW as 'D:\Olap_POC\export';
    GRANT all ON DIRECTORY POCAW to PUBLIC;
    execute dbms_aw.execute ('aw create ''POC_AW''');
    execute dbms_aw.execute('cda pocaw');
    execute dbms_aw.execute('dtb attach POC_AW rw');
    execute dbms_aw.execute('import all from eif file ''/abc.eif'' dfns update ');
    Any idea or thought would be appreciated.
    Thanks in advance,
    Anwar

    execute dbms_aw.execute('import all from eif file ''/abc.eif'' dfns update ');
    Maybe one of the following file names could help (have never tried it):
    ''abc.eif''
    ''POC_AW\abc.eif''
    Yes, you don't need the slash. CDA is like setting a default directory.
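    Put together, the import command without the leading slash would look something like this (untested, using the same AW and file name as above):
    execute dbms_aw.execute('import all from eif file ''abc.eif'' dfns update');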

  • "Import from Text File" & "Upload from Clipboard" icon not appearng in F110

    Hi Experts,
    While making payments to vendors through transaction F110, on the 'Parameter' screen (just next to the 'Status' screen) there is a vendor field for selection by range, and just ahead of it there is an icon for multiple vendor selection (the bold arrow). If I click on that multiple-selection icon, another screen opens for multiple vendor selection. On this screen the 'Enter, Copy, Delete, etc.' icons appear at the bottom, but the "Import from Text File" and "Upload from Clipboard" icons do not appear there.
    Can you please let me know how to bring those icons there?
    Points would be awarded.
    Rgds,
    Shailesh

    Hi Ravi,
    I have not yet created anything and have just entered the parameters. On the multiple-selection screen for vendors there are 5 fields for individual values and 5 for range selection. You can even add more than five range or individual selections by clicking on the page button, but I am not finding these two icons there.
    Can you please check at your end and let me know whether you see them, or what to do to bring those two icons into F110.
    Rgds,
    Shailesh

  • Importing From Flat File with Dynamic Columns

    HI
    I am using SSIS 2008. I have a folder with four (4) ".txt" files; each file has 2 columns (ID, NAME). I loaded the 4 files into one destination, but today I received one more ".txt" file which has 3 columns (ID, NAME, JOB). How can I get a message that a new column has arrived in the source, and how can I create the extra column in my destination table dynamically? Please help me.

    Hi Sasidhar,
    You need a Script Task to read the names and number of columns in the first row of the flat file each time and store them in a variable, then create a staging table dynamically based on this variable and modify the destination table definition if one or more new columns need to be added, and then use the staging table to load the destination table. I am afraid there is no available working script for your scenario, and you need some .NET coding experience to achieve your goal. Here is an example you can refer to:
    http://www.citagus.com/citagus/blog/importing-from-flat-file-with-dynamic-columns/ 
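    As a rough standalone sketch of the header check (not a ready-made Script Task; the file path, tab delimiter, and destination table name are assumptions):
    using System;
    using System.IO;
    using System.Linq;
    class FlatFileSchemaCheck
    {
        static void Main()
        {
            string filePath = @"C:\Import\new_file.txt";   // hypothetical incoming file
            string[] knownColumns = { "ID", "NAME" };      // columns already in the destination table
            // the first row of the flat file holds the column names, tab-delimited here
            string[] fileColumns = File.ReadLines(filePath).First().Split('\t');
            // any header that is not in the known list is a new column
            foreach (string col in fileColumns.Except(knownColumns, StringComparer.OrdinalIgnoreCase))
            {
                Console.WriteLine("New column detected in source: " + col);
                // statement you could run against the destination before loading the file
                Console.WriteLine("ALTER TABLE dbo.Destination ADD [" + col + "] NVARCHAR(255) NULL;");
            }
        }
    }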
    Regards,
    Mike Yin
    TechNet Community Support

  • Problem in importing from dmp file

    Hi ,
    I am facing a problem while importing from a dmp file on a Unix server.
    The error is as follows:
    Export file created by EXPORT:V08.01.07 via conventional path
    IMP-00013: only a DBA can import a file exported by another DBA
    IMP-00000: Import terminated unsuccessfully
    The export was taken with user xxx@yyy.
    The import command is: imp xxx/ppp file=abc.dmp TABLES=xxx.table1 ignore=y feedback=500
    Please help me out.
    Atul Chougule

    I tried with FROMUSER / TOUSER, but the result is the same.
    $ imp uuu/ppp file=aaa.dmp fromuser=uuu touser=uuu ignore=Y
    Import: Release 8.1.7.3.0 - Production on Thu Jun 15 06:44:49 2006
    (c) Copyright 2000 Oracle Corporation. All rights reserved.
    Connected to: Oracle8i Enterprise Edition Release 8.1.7.3.0 - Production
    With the Partitioning option
    JServer Release 8.1.7.3.0 - Production
    Export file created by EXPORT:V08.01.07 via conventional path
    IMP-00013: only a DBA can import a file exported by another DBA
    IMP-00000: Import terminated unsuccessfully
    I am stuck and don't know what to do.
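    For what it's worth, IMP-00013 means the export was taken by a user holding the EXP_FULL_DATABASE role, so the importing user needs the IMP_FULL_DATABASE role (or a DBA has to run the import). A DBA could grant it like this (uuu being the importing account from the command above):
    GRANT IMP_FULL_DATABASE TO uuu;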

  • Error importing from BIAR file (BO XI 3.0)

    Dear BO community,
    I am using the BIAR tool for backup purposes. For my backup, I use a properties file as input which looks like the one shown below:
    action=exportXML
    exportBiarLocation=C:/Temp/BOBackup.biar
    userName=Administrator
    password=XXX
    CMS=XXX:6400
    authentication=secEnterprise
    includeSecurity=true
    exportDependencies=true
    exportQuery1=select * from ci_infoobjects
    exportQuery2=select * from ci_systemobjects
    exportQuery3=select * from ci_appobjects
    exportQueriesTotal=3
    The file creation seems to work without any problems. A file is created. However, when importing the file into another CMS, I am getting the following error:
    Failed to open the BIAR file. Reason: com.businessobjects.sdk.biar.om.internal.iw.ObjectManagerIWException:Input/Output exception occurred: 'Couldn't find entry 'BusinessObjects.xml'.
    I have opened the BIAR file using a ZIP program and checked that there is no "businessobjects.xml" file in there. When creating BIARs using only a single query, this file exists and the import is no problem.
    Does anyone have an idea what's happening and can help me?
    Thank you and best regards!

    Hello ALL,
    I have the same problem when I am trying to import from a BIAR file. Please tell me how you resolved this issue.
    I did not understand much from your previous message.
    We are using BO XI R3.1, and below is the error message that the Import Wizard is throwing.
    Failed to open the BIAR file. Reason: com.businessobjects.sdk.biar.om.internal.iw.ObjectManagerIWException:required dependencies not found target system : 'AWAS3XkMSEtGn1TXI370kAU'
    On both PCs the BO version is BO XI R3.1 SP3, and I am currently using Windows XP Professional version 2002 SP3.
    Thanks
    Saurabh shivam

  • VA31 schedule line agreement data upload from flat file

    Hi abapers
    I have to upload some data (VA31) from a flat file to my database (schedule line agreement data). I am using a user exit for it, but I can't figure out which user exit will serve the purpose and where to check it. I tried using SDTRM001, MEETA001 and the V45A series, but it's not working. I set breakpoints in these user exits but it is not stopping at the breakpoints.
    Can anyone help me find which user exit will work in this case?
    Thanks in Advance
    Annu

    Hi Prash,
    Check these posts:
    Re: Increasing the length of Infoobject from 60 to 240 characters
    Re: InfoObject > 60
    Bye
    Dinesh
