BPC 7.5 NW -- Data Manager Export:  Fixed-Length Fields?

Hi Experts,
Client has a requirement that we export a batch of transaction data from BPC, which is generally no problem, but in this case they want to do it with Fixed-Length fields in the Export file.
I've done a lot of BPC Data Manager Imports with Fixed-Length fields, and that function really works well.  But, has anyone tried to use the Export package with this option?  It doesn't seem to be working in my case yet.
Any tips?
Thanks so much, as always,
Garrett
=======================
Update -- After going back to review the documentation, it looks like the *PAD() function is actually intended for export, not import, which makes sense.  ...The SAP online help library says that it's meant for import, but I now believe that is a typo.
Also, I've added the line "OUTPUTFORMAT = NORMAL" in my *OPTIONS section.  Has anyone else managed to get export working on BPC 7.5 NW?
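For reference, here is the shape of the transformation file I'm testing with. OUTPUTFORMAT = NORMAL is straight from the help library; the FORMAT option and the *PAD() mapping lines are my best guess extrapolated from the fixed-length import examples, so treat their exact syntax as an assumption:
*OPTIONS
FORMAT = FIXED
OUTPUTFORMAT = NORMAL
*MAPPING
ACCOUNT=*PAD(12)
TIME=*PAD(8)
SIGNEDDATA=*PAD(15)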
Edited by: Garrett Tedeman on Mar 3, 2011 1:37 PM


Similar Messages

  • SSIS Environmental Variables not working in BPC 7.5(MS) Data Manager

    Hi,
We have an SSIS 2008 package that retrieves variable values from environment variables.  The package runs successfully when executed from DTExec, but fails when executed from a BPC 7.5 (MS) Data Manager package.  Looking through the logs, we've been able to pinpoint the issue: the SSIS package is unable to retrieve variable values from environment variables when running from Data Manager packages.  We tried using XML configuration files to store the variable values, but this did not work either when running the SSIS package from Data Manager.  Has anyone seen this before?  If so, what was the solution?
    Thanks,
    Glenn Van Der Werff

    Hi Glenn,
    Can you set package variables in Data Manager script as a workaround?
    For example, here is how we set connection string:
    GLOBAL(DRILL_CN_STRING,Data Source=%SQLSERVER%;Initial Catalog=%DRILL_DB%;Provider=SQLOLEDB.1;Integrated Security=SSPI;Application Name=SSIS-BPC_Incremental_Oracle;Auto Translate=False;)
    DRILL_CN_STRING is a variable in our package and %<Parameter Name>% are parameters for that connection string.
    Regarding Env Variables and XML... there could be a security-related and/or Package ProtectionLevel issue when the package runs under the BPC system account.
    Hope that helps,
    Akim

  • Uploading data into a fixed length file

    hello experts,
    I got a task to upload data from an internal table into a fixed-length positional file. Please help me.
    regards,
    sriram.

    Hi there.  What you basically need to do is set up the path and name for the export file, move your records to an output table, and then transfer the data to the file that you have specified.  We usually set up our export programs so that the user can choose to download the file locally or onto the application server.  So first we define the following fields on the selection screen:
    PARAMETERS: p_file(128) LOWER CASE OBLIGATORY.         "File
    * Path name proposed from the system standard
    PARAMETERS: p_path LIKE rlgrap-filename MODIF ID fpn.  "Path Name
    PARAMETERS: p_local AS CHECKBOX.                       "Local File Flag
    * Declarations used below (added here so the snippet is self-contained)
    CONSTANTS: ztc_on TYPE c VALUE 'X'.
    DATA: w_message         TYPE char80,
          w_stop            TYPE c,
          lw_emergency_flag TYPE c.
    Then we use this to set the path to the application server:
    INITIALIZATION.
      CALL FUNCTION 'FILE_GET_NAME'
        EXPORTING
          logical_filename = 'Z_AO_HR_UP_LOAD'
          parameter_1      = space
        IMPORTING
          emergency_flag   = lw_emergency_flag
          file_name        = p_path
        EXCEPTIONS
          file_not_found   = 1
          OTHERS           = 2.
    AT SELECTION-SCREEN OUTPUT.
    * Make the path name display-only
      LOOP AT SCREEN.
        CHECK screen-group1 = 'FPN'.
        screen-input = 0.                  "Output (Display) only
        MODIFY SCREEN.
      ENDLOOP.
    After your START-OF-SELECTION statement, you need to open the file for output:
      IF p_local NE ztc_on.
    * Get path/file name using the logical file name.
        CALL FUNCTION 'FILE_GET_NAME'
          EXPORTING
            logical_filename = 'Z_AO_HR_UP_LOAD'
            parameter_1      = p_file
          IMPORTING
            emergency_flag   = lw_emergency_flag
            file_name        = p_path
          EXCEPTIONS
            file_not_found   = 1
            OTHERS           = 2.
        IF sy-subrc <> 0.
          w_message = 'No Output file - Missing Logical File Z_AO_HR_UP_LOAD'.
          WRITE: / w_message.
          IF sy-batch = 'X'.               "Is this in a job?
            NEW-PAGE.
            PERFORM end_of_selection.
            MESSAGE e000(38) WITH w_message. "This causes the job to cancel
          ELSE.
            MESSAGE i000(38) WITH w_message.
          ENDIF.
          w_stop = 'X'.
          STOP.
        ENDIF.
    *Check if output file is already present.
        OPEN DATASET p_path FOR INPUT IN TEXT MODE.
        IF sy-subrc EQ 0.
          CLOSE DATASET p_path.
          WRITE: 'Output Data file already present ', p_path.
    *Use the following for logging error message
          CONCATENATE 'Output Data file already present ' p_path INTO
                       w_message.
          IF sy-batch = ztc_on.            "Is this in a job?
            NEW-PAGE.
            PERFORM end_of_selection.
            MESSAGE e000(38) WITH w_message. "This causes the job to cancel
          ELSE.
            MESSAGE i000(38) WITH w_message.
          ENDIF.
          w_stop = 'X'.
          STOP.
        ENDIF.
        OPEN DATASET p_path FOR OUTPUT IN TEXT MODE.
        IF sy-subrc <> 0.
          WRITE: 'Unable to open output dataset - ', p_path.
    *Use the following for logging error message
          CONCATENATE 'Unable to open output dataset - ' p_path INTO
                       w_message.
          IF sy-batch = ztc_on.            "Is this in a job?
            NEW-PAGE.
            PERFORM end_of_selection.
            MESSAGE e000(38) WITH w_message. "This causes the job to cancel
          ELSE.
            MESSAGE i000(38) WITH w_message.
          ENDIF.
          w_stop = 'X'.
          STOP.
        ENDIF.
      ENDIF.
    Next you perform all of the steps to put your data in the internal table. After the end of selection we then transfer the data from the original internal table (which has many separate fields defined) to a second internal table, t_outfile, in which each record has a single field that is the total length of the original itab record:
    *Internal table for Output data
    DATA: BEGIN OF t_outfile OCCURS 0,
            text(1000),
          END OF t_outfile.
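    For instance, the move might look like this (a sketch only; the source table and field names are hypothetical, and each field lands at a fixed offset of the 1000-character output line):
    * Sketch: build one fixed-length line per source record
      LOOP AT t_data.                             "original multi-field itab (hypothetical)
        CLEAR t_outfile.
        t_outfile-text+0(10)  = t_data-glaccount. "positions 1-10
        t_outfile-text+10(8)  = t_data-postdate.  "positions 11-18
        t_outfile-text+18(15) = t_data-amount.    "positions 19-33
        APPEND t_outfile.
      ENDLOOP.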
    Then you can download your file, checking whether it goes to a local drive or the application server:
      IF p_local EQ ztc_on.
        CALL FUNCTION 'WS_DOWNLOAD'
          EXPORTING
            filename                = p_file
          TABLES
            data_tab                = t_outfile
          EXCEPTIONS
            file_open_error         = 1
            file_write_error        = 2
            invalid_filesize        = 3
            invalid_table_width     = 4
            invalid_type            = 5
            no_batch                = 6
            unknown_error           = 7
            gui_refuse_filetransfer = 8
            OTHERS                  = 9.
      ELSE.
        LOOP AT t_outfile.
          TRANSFER t_outfile TO p_path.
        ENDLOOP.
      ENDIF.
    Now close your dataset if it is on the application server:
      IF p_local NE ztc_on.
        CLOSE DATASET p_path.
      ENDIF.
    I hope this helps!
    - April King

  • Import data from a Fixed Length File into Oracle database

    Hi,
    I would like to import data from a Fixed Length text file into a table using HTML DB application.
    As of now, i have a .sql file(that uses external table) to import the data into Oracle tables
    I would like to integrate this in my HTML DB application so that the user can directly import the data into the table.
    Sample data
    XXXYYYYZZZZZ
    Data should be read to table that has
    Col1 Col2 Col3
    XXX YYYY ZZZZZ

    Hi,
    > I would like to import data from a Fixed Length text file into a table using HTML DB application.
    AFAIK, fixed-length imports are not something you can do directly with the HTML DB tools or available APIs.
    > As of now, i have a .sql file (that uses external table) to import the data into Oracle tables. I would like to integrate this in my HTML DB application so that the user can directly import the data into the table.
    Any fixed-length data needs to have a specification associated with it that indicates which character position begins a new column and what each column represents. Some also include what datatype should be used for each of those.
    If you really want to do this I would suggest that you create a table that you can store the file spec in. It would probably have columns for field name, start position, length or end position, datatype and any alias/column name you might want to apply. You would then use this table in a PL/SQL procedure (probably a package with the main processing procedure and various supporting procedures/functions) to read the file into memory, apply the file spec to each line and then do inserts into your table.
    Obviously, this is just a concept or strategy. Implementing it will depend on your PL/SQL skills and determination. If you want to pursue this strategy I'm sure you can find some jump-starts by doing a search on the PL/SQL forum.
    This may not sound like an answer - but the answer is you need to code it to fit your requirements. Hope that helps.
    Earl

  • Export to fixed length field format

    Sometimes we need to export records from our student system as a fixed-length text file to integrate with other software.
    Basically the report will only contain rows of records, but we want to export it to a fixed-length-field text file.
    And then automate the process.
    We also have Crystal enterprise server.  What is the easy and best way to accomplish this?
    Thanks

    Sorry.  Please ignore my last message.  It went into the wrong thread.
    There is one huge problem with exporting to Fixed Length in 8.5.  The Fixed Length export will truncate any trailing white space.  So if you have a column that's 5 characters long but the current value is only 3, it will only be 3 characters long.
    To get around this, I've had to create a formula for each column in the report that checks the length of the field and pads or truncates accordingly.
    For example:
    If Length ({Table.FIELD1}) > 10 Then
         {Table.FIELD1} [1 to 10]
    Else {Table.FIELD1} & ReplicateString ("*", 10 - Length ({Table.FIELD1}));
    So this column should be 10 characters long.  If the field is longer than 10 characters it will be truncated to 10; otherwise the extra positions will be padded with *.  You can't use spaces to pad because the export driver will truncate them, so I decided * was the best choice; for my purposes I knew that character would never be used.
    Good luck,
    Brian

  • Data Manager - export/import

    My questions is as follows:
    Export from: Oracle 8.1.7 (on Unix)
    Import to: Oracle 8.0.5 (Windows 2000)
    How should I set the parameters in the Data Manager to make all tables/procedures/triggers etc. come across complete and undamaged?
    I have tried many things - but usually the procedures become invalid and no triggers come across.
    I have used IMP80 as well for importing, with the same result.
    Regards
    Olav Torvund
    [email protected]

    This is exactly why everyone out there should not get too comfortable using OEM to manage their databases. If you know how to do a task using the command line, then by all means do it that way. Use OEM or DBA Studio for things you're not so sure of, or if you need a visual to help you do it right. The command line is quicker and more precise because you have total control of what info you're sending to the db.
    Others may disagree with the advice I'm giving - which is to NOT use OEM if you know how to do the task via the command line - and further input is welcome. What do others think about this?
    Roy

  • BPC 7.5 NW -- Data Manager Import Fails When Loading 40,000 Records?

    Hi Experts,
    Can't believe I'm posting this because the Data Manager component has always been one of the best parts of BPC.  But, since we got SP04 applied last week, every IMPORT process I've run that has more than 40,000 records fails.
    The result logs show that the CONVERT task completes just fine, but they don't really show the LOAD task.  ...Not exactly sure what's going on here.  So far I've taken the following two steps to try for resolution:
    (1.)  Re-added the IMPORT package in Organize Package List from the Library to have a "fresh" version.  Didn't help.
    (2.)  In the "Modify Package" screens, there is a PACKAGESIZE parameter that is 40,000 by default.  I was able to locate that in BI using transaction RSA1 and changed it to 10,000,000.  Saved it.  Tried it.  Didn't help either.
    Has anyone seen this kind of behavior before?
    Thanks,
    Garrett

    Update -- This problem may now be resolved.
    I have been able to conduct test IMPORTs of 48,000, then 96,000 and then 1.7 million records.  All were fine.
    It turns out that the difference is that the text files were sorted by amount in the ones that failed.  They were sorted by GLAccount in column A for the ones that succeeded.
    Edit:  Yep, all files loaded normally when re-sorted by GLACCOUNT, etc. on the left-hand side.  Apparently, with that many records the sort order can confuse the system or something.
    Edited by: Garrett Tedeman on Nov 18, 2010 11:41 AM

  • BPC NW 10.0 - Data Manager Prompt changing from SELECTINPUT to SELECT cleared values

    Dear BPC Experts,
    We recently went from SP13 patch 4 to SP19 patch 1.  When we made changes to the PROMPT values in Data Manager (Organize > Package > Modify Script > PROMPT), we saw different behavior when switching from SELECTINPUT to SELECT in our development system than in our production environment.  In development, when we changed the value from SELECTINPUT to SELECT, the values entered for the Variable name such as %SELECTION% in Property1, "Select the members to CLEAR" in Property2, and %DIMS% remained.  However, when we changed from SELECTINPUT to SELECT in the production system, the values for the Variable names and Properties were cleared out.  Does anyone know why the values were kept in our development system but not in our production system during this type of activity?  I would like to understand the two different behaviors and what controls them.  We prefer that the Property values not be cleared.
    Thank you in advance for your assistance.
    Kind regards,
    Lisa

    Hi Vadim,
    Excellent point, I should have included images as that likely would have shown this odd behavior.
    When I made the change in our development system, switching a package from SELECTINPUT to SELECT after we applied SP19 patch 1, the values outlined in the image below were retained for Variable Name, Property 2, and Property 3.
    When I made the same change to a package in our production system after we applied SP19 patch 1, the values for Variable Name, Property 2, and Property 3 were cleared, per the image below.  The odd thing is that initially it looked like the values stayed.  It was only after you saved and went back in that you saw the values were gone.
    Any help in understanding this behavior change would be greatly appreciated.
    Thank you,
    Lisa
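    For reference, the PROMPT line in question follows the pattern of the standard Clear package script, roughly like the sketch below (argument order reconstructed from the thread's description of Variable Name and Properties; verify against your SP level):
    PROMPT(SELECTINPUT,%SELECTION%,,"Select the members to CLEAR",%DIMS%)
    Switching the first argument to SELECT is the change that cleared the remaining arguments in production.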

  • BPC NW - Cannot run Data Manager

    Hello,
    I have recently installed the NW version of BPC on my computer, but it looks like the installation did not complete correctly, as I cannot run all the components.
    When I access the Data Manager menu, the corresponding toolbar is not installed and I have encountered the following failure scenarios when trying to access the standard menu:
    1. I get a blank message when clicking on the Data Manager option, which leads into the Data Manager menu.
    2. When clicking on the Upload data file option, it lets me select the source of the data, but I get no response for the destination of the file.
    3. When accessing Run a data management package, I get a Visual Basic error:
    "Run-time error 91; Object variable or With block variable not set"
    4. In other options of the standard menu I also get several error messages that make it impossible to work with the menu, so data cannot be loaded into any application.
    I recently installed (and uninstalled) the SQL version of BPC and it was working OK; however, the NW version does not seem to be working properly.
    I have uninstalled and reinstalled several times following the recommendations (as detailed below), but still have not succeeded.
    Client Machine:
    1. Uninstalled BPC for Office and BPC Administration.
    2. Deleted C:\Documents and Settings\username\My Documents\OutlookSoft folder
    3. Deleted C:\Program Files\BPC folder
    4. Removed the OUTLOOKSOFT entry from the registry. Start Menu \ Run \ cmd \ regedit. Delete the HKEY_LOCAL_MACHINE\SOFTWARE\OUTLOOKSOFT folder.
    5. Reboot PC.
    We have also checked note 1098742, but after uninstalling and re-installing BPC NW it still displays the same error message.
    I have also tried note 1108598, but it looks to be relevant for the MS version only, and all the components are already available on my computer.
    Has anyone encountered a similar problem?
    Thanks in advance.
    Kind regards
    Begonia

    Hi
    We are having the same issue of "RUN-TIME ERROR - Invalid Key" and still cannot find a solution. The other Visual Basic runtime error encountered along with the one mentioned above is:
    Automation error: The callee (Server application) not available and disappeared, all connections are invalid.
    Do post any solution discovered.
    Thanks,
    Oscar

  • Data Manager : Export records to excel

    Hi MDM pals,
    Data Manager lets a user select records in a repository and export them to an Excel file. Options exist to choose the fields and qualifiers to be exported. The output file will be a single Excel file with all records in distinct columns.
    Having said that, I need to export records in the main table and the qualified lookup tables (with their respective qualifiers) into different Excel files. Please advise how to split them.
    Need stated as below :
    1) Excel file 1 : Main table (Customer)
    2) Excel file 2 : Qualified Table 1 (Sales Data) - Should contain Sales records(non qualifiers and qualifiers)  for each customer.
    3) Excel file 3 : Qualified Table 1 (Pricing Data) - Should contain Price records(non qualifiers and Qualifiers) for each customer.
    Regards,
    Vinay M.S

    Hi Vinay,
    Yes this can be achieved but you need to perform the export three times.
    1) Main table (Customer)
    Go to File -> Export To -> Excel and select all the fields; this will simply export all the main table records. If you want to export selected records, then either select them manually or perform a search, and then go for the export.
    2) Excel file 2 : Qualified Table 1 (Sales Data) - Should contain Sales records(non qualifiers and qualifiers) for each customer.
    Go to File -> Export To -> Excel and remove all the main table fields.  Select the Customer Number field and the qualified lookup field Sales Data from the list of available fields and move them to Fields to Export.  Once you move the Sales Data field, the Qualifiers checkbox will be enabled; check it and you will get the list of qualifiers in the available fields.  You can select all the qualifiers or just some of them, depending on what you want in the Excel file.  Then click OK and check the output file.
    3) Excel file 3 : Qualified Table 1 (Pricing Data) - Should contain Price records(non qualifiers and Qualifiers) for each customer.
    Similar to 2
    Regards,
    Jitesh Talreja

  • Exporting fixed length records in a .txt format

    Post Author: jnesbitt
    CA Forum: Crystal Reports
    How can I configure Crystal Reports detail records so they can be exported in a fixed-length .txt record format?
    Using Crystal Reports 9.2.3.884

    Post Author: SKodidine
    CA Forum: Crystal Reports
    I have created CR detail records with a fixed length of 400 and then exported them to text.
    This is how I did it; there might be other ways.  Even though I created them in CR XI, it should be similar in CR 9.
    Let us assume that your fixed-length record needs to be 400 bytes long.  Create a formula with all the fields concatenated that you would otherwise display in the details section.  Create a second formula that uses the Space function to add trailing spaces.
    Here is an example for a fixed length record of 400:
    Here are my two fields in my record that have been concatenated.
    Formula 1: RecordType & totext(BatchNumber,"0000000");
    Formula 2: {@Formula1} & space(400 - length({@Formula1}));

  • How to create an UTF8 data file with max length fields

    We have a legacy application which allows a maximum length of 30 bytes for a transparent text field.
    The DB length of the field is 40 characters.
    In the old non-Unicode (NUC) days it was so simple:
    TYPES: BEGIN OF t_items,
             hwmc(30) TYPE c,
           END OF t_items.
    A MOVE, and that was it.
    Now we have migrated to Unicode, use UTF-8 for the external data, and are puzzling over how to create an output field with a fixed length of 30 bytes.
    Has somebody an elegant idea?
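    One possible approach (a sketch only, not a definitive answer): convert the text character by character to UTF-8 and stop before the 30-byte budget is exceeded, so no multi-byte character gets split.  The variable names here are illustrative:
    * Sketch: cut lv_text down to at most 30 bytes of UTF-8, whole characters only
    DATA: lv_text   TYPE string,
          lv_result TYPE string,
          lv_char   TYPE string,
          lv_buf    TYPE xstring,
          lv_used   TYPE i,
          lv_idx    TYPE i,
          lv_len    TYPE i,
          lo_conv   TYPE REF TO cl_abap_conv_out_ce.

    lv_text = 'Übergrößenträger Außendienst'.          "example input
    lo_conv = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).

    lv_len = strlen( lv_text ).
    DO lv_len TIMES.
      lv_idx  = sy-index - 1.
      lv_char = lv_text+lv_idx(1).            "next character
      lo_conv->reset( ).
      lo_conv->write( data = lv_char ).
      lv_buf = lo_conv->get_buffer( ).        "UTF-8 bytes of this character
      IF lv_used + xstrlen( lv_buf ) > 30.    "would exceed the 30-byte budget
        EXIT.
      ENDIF.
      lv_used = lv_used + xstrlen( lv_buf ).
      CONCATENATE lv_result lv_char INTO lv_result.
    ENDDO.
    The result can then be padded with spaces up to 30 bytes if the target format requires it.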

    wyfwong, >> Does the schema which stores a large BLOB column have to create a large tablespace in order to store the column value, or does the database automatically enlarge the tablespace when a large BLOB column is loaded into the table? <<
    The answer to this question would depend on how you defined the tablespace and the datafiles used to store the tablespace objects to begin with: raw partitions versus files, autoextend on or off, etc.
    It would seem to me to be foolish not to predefine the tablespace at least large enough to hold the tables and indexes based on their expected near-term data loads. That is, if you know the initial load of data will be 60G, then the tablespace should be larger than 60G to start with.
    HTH -- Mark D Powell --

  • Output file as a text file with tab delimited and fixed length fields

    Hi all,
    I have developed a custom report which outputs an Excel file on the desktop of the user who executes the report.  Now I need to create an additional (second) file with almost the same data as the first one.
    I'm using the FM GUI_DOWNLOAD to create the file.  I need the second file to be a .txt file (tab-delimited) and I also want the fields to have a fixed length.  For this file format, what parameters do I need to pass to the FM?
    BR,
    SRM Tech.

    Thanks for the prompt reply.
    Also, on the selection screen I enter the path where the output file needs to be downloaded, e.g. C:/Output_folder/Output.xls.  Now if I need a text file, do I need to give the filename as C:/Output_folder/Output.txt?
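    For the text version, a call along these lines should work (a sketch; lt_output stands in for your own output table, with each field pre-padded to its fixed width before the call):
    * Sketch: download the prepared itab as a tab-delimited text file
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename              = 'C:/Output_folder/Output.txt'
          filetype              = 'ASC'   "plain text
          write_field_separator = 'X'     "tab between fields; omit for pure fixed-length
        TABLES
          data_tab              = lt_output
        EXCEPTIONS
          OTHERS                = 1.
    And yes, the .txt extension simply comes from whatever filename you pass; GUI_DOWNLOAD does not care about the extension.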

  • How to output non-fixed length field by DMEE?

    Hi,
    In DMEE, I need to set a fixed length for each field. If the source data length is less than what we set, it will output spaces to fill the rest of the field. Is there a way to output exactly what I need, with no extra spaces?
    Thank you!

    Hi
    The extra spaces can be removed as follows:
    In the transaction DMEE --> Head --> Format attributes --> Field type = 1
    Best regards
    Jean Daniel

  • Data management - Export Data on a text file and work on Excel/Access

    Hello,
    When I use the Data Export tool to retrieve all Accounts data, the CRM creates a txt file with more than 255 columns (number of record types per account). I can't import it into a new Access database because the Access import tool does not accept that many columns. So I tried to open it in Excel by importing the text file. However, some accounts have a comment in a large text field, and when Excel converted the text file the data was shifted relative to the column headers because of this specific "comments" record type.
    Does anyone know whether it's possible to reduce the number of record types exported by selecting specific ones, or a process for importing into Excel/Access that resolves this positioning problem?
    Regards,
    Guillaume

    Guillaume, you can create a historical account report, specify the fields you want, and then download it to Excel. Also, if you use the export utility, which produces a CSV file with more than 255 columns, delete the columns you do not want to get it under 255 and then convert to Excel.
