Concatenating members in the data file before loading, due to a change in the outline

We have 8 dimensions (Essbase 11.1.1.3).
The data file has been coming from the source system in this format for 3 years, as below:
D1, D2, D3, D4, D5, D6, Product, Currency
., ., ., ., ., ., a, USD
., ., ., ., ., ., a, EUR
., ., ., ., ., ., b, GBP
., ., ., ., ., ., b, INR
Now the product names have been changed in the outline to:
a_USD
a_EUR
b_GBP
b_INR
So, is there any way in the Hyperion suite (in the rules file, or anywhere else) where I can concatenate Product and Currency and get the file as:
D1, D2, D3, D4, D5, D6, Product, Currency
., ., ., ., ., ., a_USD, USD
., ., ., ., ., ., a_EUR, EUR
., ., ., ., ., ., b_GBP, GBP
., ., ., ., ., ., b_INR, INR
Please do let me know
Thanks in Advance.

While what Mehmet wrote is correct, if this is anything more than a quick and dirty fix, may I suggest you go down another ETL path? (Yes, this is Cameron's load rule rant coming right at you, but in abbreviated form.)
ETL in a load rule is great to get the job done, but is a pain to maintain. You would be (unpleasantly) surprised how these have a way of growing. Have you given a thought to fixing it at the source or doing some true ETL in a tool like ODI, or staging it in SQL and doing the mods there? I know, for a simple(ish) change, that seems overkill, but load rules for the purposes of ETL are Evil.
Regards,
Cameron Lackpour
P.S. For anyone who must see me go bonkers over this, see: http://www.network54.com/Forum/58296/thread/1311019508/Data+Load+Optimization+-Headervs+Column
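If you do go the staging route, the transformation itself is tiny once it lives outside the rules file. Below is a minimal pre-processing sketch in Java, assuming the comma-delimited layout shown above with Product and Currency as the last two fields and no header row; the file names are hypothetical:

import java.io.*;

public class ConcatProductCurrency {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader("data_in.txt"));
        PrintWriter out = new PrintWriter(new FileWriter("data_out.txt"));
        String line;
        while ((line = in.readLine()) != null) {
            String[] f = line.split(",", -1);
            String currency = f[f.length - 1].trim();                  // e.g. "USD"
            f[f.length - 2] = f[f.length - 2].trim() + "_" + currency; // "a" -> "a_USD"
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < f.length; i++) {       // rebuild the delimited record
                if (i > 0) sb.append(',');
                sb.append(f[i]);
            }
            out.println(sb);
        }
        out.close();
        in.close();
    }
}

The same change is a one-liner if you stage the file in a relational table instead (e.g. UPDATE stage SET product = product || '_' || currency), which keeps the logic visible in exactly the way Cameron suggests.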

Similar Messages

  • SQL Loader and foreign characters in the data file problem

    Hello,
    I have run into an issue which I can't find an answer for. When I run SQL Loader, one of my control files is used to get file content (LOBFILE) and one of the fields in the data file has a path to that file. The control file looks like:
    LOAD DATA
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    (
    doc_core_id "iceberg.seq_rpt_document_core.nextval",
    -- created_date POSITION(1) date "yyyy-mm-dd:hh24:mi:ss",
    created_date date "yyyy-mm-dd:hh24:mi:ss",
    document_size,
    hash,
    body_format,
    is_generic_doc,
    is_legacy_doc,
    external_filename FILLER char(275) ENCLOSED by '"',
    body LOBFILE(external_filename) terminated by EOF
    )
    A sample data file looks like:
    0,2012-10-22:10:09:35,21,BB51344DD2127002118E286A197ECD4A,text,N,N,"E:\tmp\misc_files\index_testers\foreign\شیمیایی.txt"
    0,2012-10-22:10:09:35,17,CF85BE76B1E20704180534E19D363CF8,text,N,N,"E:\tmp\misc_files\index_testers\foreign\ลอบวางระเบิด.txt"
    0,2012-10-22:10:09:35,23552,47DB382558D69F170227AA18179FD0F0,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\leesburgis_á_ñ_é_í_ó_ú_¿_¡_ü_99.doc"
    0,2012-10-22:10:09:35,17,83FCA0377445B60CE422DE8994900A79,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\làm thế nào bạn làm ngày hôm nay"
    The problem is that when I run this, SQL Loader throws an error that it can't find the file. It appears that it can't interpret the foreign characters in a way that allows it to find that path. I have tried adding a CHARACTERSET (using AL32UTF8 or UTF8) value in the control file, but that only has some success with Western languages, not the ones listed above. Also, there is no defined set of languages that could be found in the data file; it essentially could be any language.
    Does anyone know if there is a way to somehow get SQL Loader to "understand" the file system paths when a folder and/or file name could be in some other language?
    Thanks for any thoughts - Peter

    Thanks for the reply Harry. If I try to open the file in various text editors like Wordpad, Notepad, GVIM, and Textpad, they all display the foreign characters differently. Only Notepad comes close to displaying the characters properly. I have a C# app that will read the file and display the contents, and it renders it fine. If you look at the directory of files in Windows Explorer, they are all displayed properly. So it seems things like .Net and Windows have some mechanism to understand the characters in order to render them properly. Other applications, again like Wordpad, do not know how to render them properly. It would seem that whatever SQL Loader is using to "read" the data files also is not rendering the characters properly, which prevents it from finding the directory path to the file. If I add "CHARACTERSET AL32UTF8" in the control file, all is fine when dealing with Western languages (e.g. German, Spanish) but not for the Eastern languages (e.g. Thai, Chinese). So... telling SQL Loader to use a character set seems to work, but not in all cases. AL32UTF8 is the character set that the Oracle database was created with. I have not had any luck if I try to set the CHARACTERSET to whatever the Thai character set is, for example. The problem there, though, is that even if that did work, I can't target specific languages because the data could come from anywhere. It's like I need some sort of global "super set" character set to use. It seems like CHARACTERSET is the right track to follow, but I am not sure, and even if it is, is there a way to handle all languages?
    Thanks - Peter

  • Error: The following translators were not loaded due to errors: ASP.htm has information duplicated by another translator

    Hi,
    Recently my computer had problems and I had to install Windows 7 instead of Windows Vista. I made a clean installation of Dreamweaver CS4. I was able to export all the sites from my old computer, but whenever I try to open any file for editing I get this pop-up error message:
    The following translators were not loaded due to errors: ASP.htm has information duplicated by another translator.
    Same error for ASP.Net.htm, ColdFusion.htm, Date.htm, FlashObject.htm, ICERegions.htm, JSP.htm, PHP_MySQL.htm, Server Model SSH.htm, Spry.htm, SpryWidget.htm, XSLT.htm and XSLTransform.htm
    I saw several threads with similar error message, where the translators were invalid, while mine are duplicated by another translator. But I tried the fixes anyway.
    I tried removing Configure folder, and WinFileCache folder. Didn't help. I also tried to create a new site and get all the files from the remote. Same problem.
    Any help will be greatly appreciated.
    Valentina

    Sorry for the bump, but this post helped me solve a similar issue: "The following translators were not loaded due to errors: ColdFusion.htm: has configuration information that is invalid" every time I try to open any document in Dreamweaver. As I don't program in ColdFusion, I renamed the file to ColdFusion.bck and everything works for me again. I'm not sure what caused the issue, but thanks for pointing out where I can find the translator files.
    -Dreamweaver CS5.5, Windows7-64

  • How to get Dynamic data file in SQL*Loader?

    Hi all,
    I have a requirement that needs to import data every day using a background process. I have created a control file with a fixed data file name and am calling it through Java. It is working fine.
    The thing is, the data file name is not constant; it changes every day (the file name format is YearMonthDate_Clients.cvs). But the data file location is fixed.
    How can I get the data file name dynamically?
    Can someone has the answer?
    Thanks in advance,
    Kalyogi

    I'm not a Java developer, but I've used this before:
    File folder = new File("c:/");
    File[] listOfFiles = folder.listFiles();
    for (int i = 0; i < listOfFiles.length; i++) {
        if (listOfFiles[i].isFile()) {
            // Here the filename is printed, but you can pass it to whatever/however you are invoking SqlLdr
            System.out.println("File " + listOfFiles[i].getName());
        }
    }
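    Since the file name always follows the YearMonthDate_Clients.cvs pattern in a fixed folder, another option is to build today's name directly instead of scanning the directory, and then hand it to SQL*Loader. A minimal sketch; the folder, credentials, and control-file name are hypothetical:

    import java.io.File;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DailyLoad {
        public static void main(String[] args) throws Exception {
            // Build today's file name in the YearMonthDate_Clients.cvs pattern
            String name = new SimpleDateFormat("yyyyMMdd").format(new Date()) + "_Clients.cvs";
            File dataFile = new File("c:/loads", name); // fixed location (hypothetical)
            if (!dataFile.isFile()) {
                System.err.println("No data file found for today: " + dataFile);
                return;
            }
            // Pass the resolved name to SQL*Loader on the command line
            Process p = new ProcessBuilder("sqlldr",
                    "userid=scott/tiger",           // placeholder credentials
                    "control=c:/loads/clients.ctl", // hypothetical control file
                    "data=" + dataFile.getAbsolutePath())
                    .inheritIO()
                    .start();
            System.exit(p.waitFor());
        }
    }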

  • The .prel files will not load into Premiere Elements 13 and say that there is a missing codec. This is a school's site license that we've used for more than a month.

    The .prel files will not load into Premiere Elements 13 and say that there is a missing codec. This is a school's site license that we've used in our lab for more than a month.

    GPPNeil
    Premiere Elements 13 on Windows 8.1 (assumed 64 bit).
    First..
    1. Have you updated from 13 to the 13.1 Update using an opened project's Help Menu/Update? If not, please do so.
    2. Do the problems exist if you disable the antivirus and firewall(s)?
    3. Are you running the program as Administrator? Do you have the latest version of QuickTime installed on your computer?
    4. Have you refreshed the Online Services using Premiere Elements Editor Edit Menu/Preferences/Web Sharing?
    5. Have you updated the Adobe Application Manager?
    http://www.adobe.com/support/downloads/detail.jsp?ftpID=4773
    6. Is your video card/graphics card driver version up to date according to the web site of the manufacturer of the card?
    7. Lots more we could go through, such as deleting the Adobe Premiere Elements Prefs file and, if necessary, the whole 13.0 Folder in which it exists.
    But, I think that you have enough things going on here to warrant an uninstall, free ccleaner run through (regular cleaner and registry cleaner parts) and then reinstall with the antivirus and firewall(s) disabled.
    CCleaner - PC Optimization and Cleaning - Free Download
    If the problems persist after all that then we will adjust to additional troubleshooting maneuvers.
    Any questions or need clarification, please do not hesitate to ask.
    Thank you.
    ATR

  • Clear Data Manager Package Error "The data file is empty."

    Hi,
    When I run the Clear data package in Data Manager, I receive the error "The data file is empty." I selected a very specific set of dimension values (none are calculated) and am on BPC 7.5 SP3. I subsequently turned on debugging to troubleshoot, but do not see any obvious issues leading to the error message. The log file with debugging turned on is below. Any help would be greatly appreciated!
    Thanks.
    Tom
    TOTAL STEPS  3
    1. Export_Zero:        completed  in 1 sec.
    2. Load Cube:          Failed  in 0 sec.
    3. Clear:              completed  in 0 sec.
    [Selection]
    ENABLETASK= Yes
    CHECKLCK= Yes
    (Member Selection)
    Category: ACTUAL
    Time: 2010.C_SEP
    Affiliate: az_swhd
    Account: Donor_DART_ID_1
    Functional: Benchmark_F
    Report: Cons
    Restriction: AnyRestricted
    [Messages]
    The data file is empty. Please check the data file and try again.
    [EvModifyScript Detail]
    12-28-2010  17:30:05 - Debug turned ON
    INFO(%TEMPFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, APPSET, ESMetrics)
    TASK(EXPORT_ZERO, APP, CONSOLIDATED)
    TASK(EXPORT_ZERO, USER, NESSGROUP\tbardwil)
    TASK(EXPORT_ZERO, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(EXPORT_ZERO, SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    TASK(EXPORT_ZERO, DATATRANSFERMODE, 2)
    TASK(LOAD CUBE, APPSET, ESMetrics)
    TASK(LOAD CUBE, APP, CONSOLIDATED)
    TASK(LOAD CUBE, USER, NESSGROUP\tbardwil)
    TASK(LOAD CUBE, FILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(LOAD CUBE, DATATRANSFERMODE, 4)
    TASK(LOAD CUBE, DMMCOPY, 0)
    TASK(LOAD CUBE, PKGTYPE, 0)
    TASK(LOAD CUBE, CHECKLCK, 1)
    TASK(CLEAR COMMENTS, APPSET, ESMetrics)
    TASK(CLEAR COMMENTS, APP, CONSOLIDATED)
    TASK(CLEAR COMMENTS, USER, NESSGROUP\tbardwil)
    TASK(CLEAR COMMENTS, DATATRANSFERMODE, 0)
    TASK(CLEAR COMMENTS, SELECTIONORFILE, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)
    TASK(CLEAR COMMENTS, ENABLETASK, 1)
    TASK(CLEAR COMMENTS, CHECKLCK, 1)
    INFO(%ENABLETASK%, 1)
    INFO(%CHECKLCK%, 1)
    INFO(%SELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%TOSELECTION%, [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'))
    INFO(%APPSET%, ESMetrics)
    INFO(%APP%, CONSOLIDATED)
    INFO(%CONVERSION_INSTRUCTIONS%, )
    INFO(%FACTCONVERSION_INSTRUCTIONS%, )
    INFO(%SELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\FROM_51_.TMP)
    INFO(%TOSELECTIONFILE%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\TO_51_.TMP)
    INFO(%DEFAULT_MEASURE%, PERIODIC)
    INFO(%MEASURES%, Periodic,QTD,YTD)
    INFO(%OLAPSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%SQLSERVER%, ETSCSAP047940.EASTER-SEALS.ORG)
    INFO(%APPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\)
    INFO(%DATAPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\DataFiles\)
    INFO(%DATAROOTPATH%, C:\BPC\Data\WebFolders\)
    INFO(%SELECTIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\SelectionFiles\)
    INFO(%CONVERSIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\ConversionFiles\)
    INFO(%TEMPPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\)
    INFO(%LOGICPATH%, C:\BPC\Data\WebFolders\ESMetrics\Adminapp\CONSOLIDATED\)
    INFO(%TRANSFORMATIONPATH%, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\TransformationFiles\)
    INFO(%DIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])
    INFO(%FACTDIMS%, [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID])
    INFO(%CATEGORY_DIM%, [Category])
    INFO(%TIME_DIM%, [Time])
    INFO(%ENTITY_DIM%, [Affiliate])
    INFO(%ACCOUNT_DIM%, [Account])
    INFO(%CURRENCY_DIM%, )
    INFO(%APP_LIST%, Consolidated,ES_INC,GrantMgmt,LegalApp,LRate,Ownership,Rate)
    INFO(%ACCOUNT_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_SET%, AZ_SWHD)
    INFO(%CATEGORY_SET%, ACTUAL)
    INFO(%FUNCTIONAL_SET%, BENCHMARK_F)
    INFO(%REPORT_SET%, CONS)
    INFO(%RESTRICTION_SET%, ANYRESTRICTED)
    INFO(%TIME_SET%, 2010.C_SEP)
    INFO(%ACCOUNT_TO_SET%, DONOR_DART_ID_1)
    INFO(%AFFILIATE_TO_SET%, AZ_SWHD)
    INFO(%CATEGORY_TO_SET%, ACTUAL)
    INFO(%FUNCTIONAL_TO_SET%, BENCHMARK_F)
    INFO(%REPORT_TO_SET%, CONS)
    INFO(%RESTRICTION_TO_SET%, ANYRESTRICTED)
    INFO(%TIME_TO_SET%, 2010.C_SEP)
    INFO(DATAMGRGLOBALBPU, )
    INFO(DATAMGRGLOBALCLIENTMACHINEID, ETSCWLT048794)
    INFO(DATAMGRGLOBALERROR, )
    INFO(DATAMGRGLOBALPACKAGEINFOR, )
    INFO(DATAMGRGLOBALPACKAGENAME, C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\DataManager\PackageFiles\System Files/Clear.dtsx)
    INFO(DATAMGRGLOBALSEQ, 51)
    INFO(DATAMGRGLOBALSITEID, )
    INFO(MODIFYSCRIPT, DEBUG(ON)<BR>PROMPT(SELECTINPUT,[CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED'),,"SELECT THE MEMBERS TO CLEAR",[Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[Time])<BR>PROMPT(RADIOBUTTON,1,"DO YOU WANT TO CLEAR COMMENTS ASSOCIATED WITH DATA REGIONS IN BPC?",1,{"YES","NO"},{"1","0"})<BR>PROMPT(RADIOBUTTON,1,"SELECT WHETHER TO CHECK WORK STATUS SETTINGS WHEN DELETING COMMENTS.",1,{"YES, DELETE COMMENTS WITH WORK STATUS SETTINGS","NO, DO NO DELETE COMMENTS WITH WORK STATUS SETTINGS"},{"1","0"})<BR>INFO(C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempfdla_51_.tmp)<BR>TASK(EXPORT_ZERO,APPSET,ESMetrics)<BR>TASK(EXPORT_ZERO,APP,CONSOLIDATED)<BR>TASK(EXPORT_ZERO,USER,NESSGROUP\tbardwil)<BR>TASK(EXPORT_ZERO,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(EXPORT_ZERO,SQL,
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR>TASK(EXPORT_ZERO,DATATRANSFERMODE,2)<BR>TASK(LOAD CUBE,APPSET,ESMetrics)<BR>TASK(LOAD CUBE,APP,CONSOLIDATED)<BR>TASK(LOAD CUBE,USER,NESSGROUP\tbardwil)<BR>TASK(LOAD CUBE,FILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(LOAD CUBE,DATATRANSFERMODE,4)<BR>TASK(LOAD CUBE,DMMCOPY,0)<BR>TASK(LOAD CUBE,PKGTYPE,0)<BR>TASK(LOAD CUBE,CHECKLCK,1)<BR>TASK(CLEAR COMMENTS,APPSET,ESMetrics)<BR>TASK(CLEAR COMMENTS,APP,CONSOLIDATED)<BR>TASK(CLEAR COMMENTS,USER,NESSGROUP\tbardwil)<BR>TASK(CLEAR COMMENTS,DATATRANSFERMODE,0)<BR>TASK(CLEAR COMMENTS,SELECTIONORFILE,C:\BPC\Data\WebFolders\ESMetrics\CONSOLIDATED\PrivatePublications\tbardwil\TempFiles\Tempwbh9_51_.tmp)<BR>TASK(CLEAR COMMENTS,ENABLETASK,1)<BR>TASK(CLEAR COMMENTS,CHECKLCK,1)<BR>BEGININFO(
    select [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 as SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) as ZeroTable  group by [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)
    )<BR><BR><BR><BR><BR><BR><BR>SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM ( SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED') UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFACTWBCONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')  UNION ALL SELECT [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID],0 AS SIGNEDDATA FROM TBLFAC2CONSOLIDATED WHERE [CATEGORY] in (N'ACTUAL') and [TIMEID] in (N'20100900') and [AFFILIATE] in (N'AZ_SWHD') and [ACCOUNT] in (N'DONOR_DART_ID_1') and [FUNCTIONAL] in (N'BENCHMARK_F') and [REPORT] in (N'CONS') and [RESTRICTION] in (N'ANYRESTRICTED')) AS ZEROTABLE  GROUP BY [Account],[Affiliate],[Category],[Functional],[Report],[Restriction],[TIMEID] OPTION(MAXDOP 1)<BR><BR><BR><BR>ENDINFO<BR><BR><BR>)

    You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) and the release (5.1, 7.0, 7.5) of BPC which you are using.
    Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, whereby we announced new dedicated forums for BPC, which are the proper place to post your questions regarding BPC in the future to reach the right audience.
    Thanks and best regards,
    [Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)

  • Validating the Data in Excel before uploading

    Hi all,
    I have a requirement where I need to upload an Excel file from the user's local system to the server.
    But before saving it, I want to do certain validations on the Excel file which the user is trying to upload:
    1. Is the Excel file in the same format as expected, i.e. the column order, all the required columns, etc.?
    2. Is there a possibility for me to check whether certain column values are mandatory?
    Can you please guide me by providing some docs on column validation of the Excel file.
    Thanks,
    Swetha

    Hi Swetha,
    First you have to create one value node, and that node should have 5 attributes. For example, if your Excel file has 5 columns of data, you can create 5 attributes under the value node.
    See the code below.
    IPrivateASNFileUplaodView.IASNDetails_PopUpNode node = wdContext.nodeASNDetails_PopUp();
    IPrivateASNFileUplaodView.IASNDetails_PopUpElement ele;
    boolean flag = false;
    if (node.size() > 0) {
        for (int i = 0; i < node.size(); i++) {
            ele = node.getASNDetails_PopUpElementAt(i);
            // Column 1: quantity is mandatory
            if (ele.getOpn_Quantity() != null) {
                flag = true;
            } else {
                flag = false;
                break;
            }
            // Column 2: PO number is mandatory
            if (ele.getPo_Number() != null && ele.getPo_Number().length() != 0) {
                flag = true;
            } else {
                flag = false;
                break;
            }
            // Column 3: PO item is mandatory
            if (ele.getPo_Item() != null && ele.getPo_Item().length() != 0) {
                flag = true;
            } else {
                flag = false;
                break;
            }
        }
    }
    if (flag) {
        wdThis.wdGetCO_ASNFileUplaodController().executeYmm_Sc_File_Upload_02_Input();
        wdComponentAPI.getMessageManager().reportSuccess("File Upload Successfully");
    } else {
        wdComponentAPI.getMessageManager().raiseException("Please fill in all the values in the Excel sheet and upload again.", true);
    }
    Using this code, you can validate the data in Excel before uploading.
    Hope this is helpful for you, Swetha.
    Regards
    Vijay Kalluri

  • Why is shrinking the data file bad?

    Hi All,
    Why is shrinking the data file bad?
    If it is bad, then why does SQL Server have this option?

    There are several reasons why shrinking is bad. One of the reasons was recorded already by Louis Jordan around 1950, where he sang "What's the Use of Getting Sober, when You Are Gonna Get Drunk Again?" That is, why shrink something that will grow again? Growing a database file takes some resources, since it has to be zero-initialised. For a data file this can be avoided by granting the service account for SQL Server the permission "Perform Volume Operations" (if I recall the name correctly). For a log file it is inevitable. While SQL Server grows the file, this causes a standstill for the process that triggered the file to grow, and can easily cause a bigger outage.
    Another reason that shrink is bad is that it adds a lot of fragmentation to the database. Unlike the above, there is no reason it has to be that way; it's simply not an optimal implementation.
    Shrinking can also be really poor when you have big LOB columns.
    So why does shrink exist? Because sometimes, shrinking a data file or a log file is a perfectly legit operation. Say that you have a transaction database, and you decide to move transactions that are more than 12 months old to an archive database, and you also implement a process to move data to the archive database on a quarterly basis. Before you implemented the archive database, the transaction database was 1.5 TB, but after you moved ten years' worth of data, you now have a 250 GB database. In this case, it would be perfectly reasonable to shrink the data file to, say, 300 GB to make it more manageable.
    Or say that you make a change to your reindex job so that it runs with bulk_logged recovery rather than full. This may permit you to reduce the size of the log file by 50% (a number taken out of thin air).
    There are lots of other situations, but the key is: you have reason to believe that the data file will stay within the size you shrink it to. Some people are obsessed with space and shrink whenever they get the occasion, but that is a bad idea.
    Erland Sommarskog, SQL Server MVP, [email protected]
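    For reference, the shrink in the archive scenario above is a single command. A minimal sketch issuing it through JDBC; the server, database, logical file name, and target size are hypothetical, and DBCC SHRINKFILE takes the target size in MB:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ShrinkOnce {
        public static void main(String[] args) throws Exception {
            // Hypothetical SQL Server connection; requires the Microsoft JDBC driver
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://localhost;databaseName=Transactions;integratedSecurity=true");
                 Statement st = con.createStatement()) {
                // Shrink the data file to roughly 300 GB (the target is given in MB)
                st.execute("DBCC SHRINKFILE (N'Transactions_Data', 300000)");
            }
        }
    }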

  • Where are the data files kept for 6.2.2?

    I have a client whose PC crashed. She'd been doing full backups to Carbonite. I've reinstalled Windows and the Palm Desktop v6.2.2 (I *think* that's what was running before). Now I want to restore her address book.
    After doing a full recovery from Carbonite, I've got files all over the place, and copies of copies of copies.  Ugh.
    So... where should I look to restore her most recent address book? I've searched the entire PC for *.aba and found about 4 copies of the same file, with a date of 3-9-09. That's *not* the last time she updated that address book.
    If I know where the files are stored, I can hopefully find them in the carbonite backup, put them in the correct spot on the PC, and be done.
    Help?
    Thanks!
    Bill
    <><
    Post relates to: Palm m105

    If you are trying to access the data in a .dat file, you will want to use Palm Desktop 4.1.4 or Palm Desktop 4.2. If you have the CD that came with the Palm, you should try installing that version.
    Once installed, you will need to create a new dummy HotSync profile; call it Test or whatever you'd like. Then find the associated data folder on the PC. It could be in the c:\program files\palm folder or the user's My Documents folder, depending on how old the software is.
    Then, copy (don't move!) the .dat file(s) you have recovered into the correct PIM folder. So, the address.dat file should be copied into the Address Book or Calendar folder, replacing the empty data file there.
    If Palm Desktop is still running, quit and restart it and see if the data is there. If it isn't, the data file is either in the wrong spot or is corrupted.
    Hope that helps,
    Alan G

  • Updating iTunes always deletes the data file behind my music in my library

    Hopefully, someone can help me. Every time I update my iTunes, it somehow deletes the data files for my songs, even though the songs still appear in my library. When I go to listen or sync to my iPod, it says it can't find the original data files, and so I have to reload all my CDs again; I basically just don't update iTunes anymore.
    Can anyone tell me why this is going on and how I can stop this from happening so that I can update itunes without losing all my songs?
    Thanks a bunch in advance!

    This appears to be a common problem lately on some computers... try the following:
    Disconnect any iPod from the computer.
    Temporarily remove any hackies or beta apps.
    Make sure iTunes is in the original Apple install location (the Applications folder).
    Refer to these Apple info docs:
    http://docs.info.apple.com/article.html?artnum=301749e
    http://docs.info.apple.com/article.html?artnum=93313
    Remember .... Seek and Ye shall find!!
    DIXIE

  • Clear Data package - "The data file is empty." Error

    Hi,
    When I run the Clear package in Data Manager, I get the error "The data file is empty". We are running BPC 7.5 SP3. There are no calculated members selected. Any suggestions?
    Thanks.
    Tom

    Hi Tom,
    If I'm not mistaken, when we run the clear data package, BPC inserts negative values for the combination of dimensions selected.
    For example, if you have selected time as 2010.DEC, it would reverse out all these values by inserting negative values, which would then total up to zero.
    So I still think it could be a problem with the Transformation file.
    P.S. Maybe you can try using the Clear from Fact table package.
    Santosh

  • I have written a binary file with a specific header format in LabVIEW 8.6 and tried to read the same data file using LabVIEW 7.1. Is there any way to read the data file (from LabVIEW 8.6) using LabVIEW 7.1?

    I have written a binary file with a specific header format in LabVIEW 8.6 and tried to read the same data file using LabVIEW 7.1. Here I found some difficulty. Is there any way to read the data file (from LabVIEW 8.6) using LabVIEW 7.1?

    I can think of two possible stumbling blocks:
    What are your 8.6 options for "byte order" and "prepend array or string size"?
    Overall, many file I/O functions have changed with LabVIEW 8.0, so there might not be an exact 1:1 code conversion, and you might need to make some modifications. For example, in 7.1 you should use "write file"; the "binary file VIs" are special-purpose (I16 or SGL). What is your data type?
    LabVIEW Champion. Do more with less code and in less time.

  • Data Integrator has no read permissions to get the data file using FTP

    Hi,
    I wonder if you could help.
    We have installed the Data Integrator and are using FTP to get the data file from the SAP server to the DI server. The files are created by the SAP admin user with permissions 660. The FTP user is not in the sapsys group, so it cannot read the files.
    Has anyone come across this problem, and how did you solve it?
    Many thanks.

    Hi,
    you might want to post your entry in the EIM forum, where the Data Integrator topics are also handled:
    Data Services and Data Quality
    Ingo

  • How can I put the signature into the data file?

    Hi all,
    I want to put the signature into the data file, meaning the data file content will be at the top and the signature will be at the bottom. How can I do that?
    Thanks.
    example:-
    dataFile.txt
    Name=...
    Address=...
    Signature=6A07C70E....123FEAB(Hex)

    Thanks for your reply.
    First, I have a key pair (public key and private key). I use the private key to initialize the Signature and use the Signature to sign a data file (dataFile.txt). So when I need to confirm the data file has not been modified, I have to verify the data file with the Signature and the public key. I hardcoded the public key in my program, so my client (the receiver of the data) needs only the Signature file and the data file (2 files).
    So how can I combine the Signature file and the data file into 1 file?
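    One common way to do this, matching the layout sketched above, is to append the hex signature as a trailing "Signature=" line and have the receiver split it off before verifying. A minimal sketch; the "SHA256withRSA" algorithm and the exact line format are assumptions, not something confirmed in the thread:

    import java.nio.file.*;
    import java.security.*;

    public class EmbedSignature {
        // Appends "Signature=<hex>" as the last line of the data file.
        static void signAndAppend(Path dataFile, PrivateKey priv) throws Exception {
            byte[] data = Files.readAllBytes(dataFile);
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initSign(priv);
            sig.update(data);
            StringBuilder hex = new StringBuilder();
            for (byte b : sig.sign()) hex.append(String.format("%02X", b));
            Files.write(dataFile, ("\nSignature=" + hex).getBytes(), StandardOpenOption.APPEND);
        }

        // Strips the trailing "Signature=" line, then verifies the rest of the file.
        static boolean verify(Path dataFile, PublicKey pub) throws Exception {
            String all = new String(Files.readAllBytes(dataFile));
            int cut = all.lastIndexOf("\nSignature=");
            if (cut < 0) return false;
            String hex = all.substring(cut + "\nSignature=".length()).trim();
            byte[] sigBytes = new byte[hex.length() / 2];
            for (int i = 0; i < sigBytes.length; i++)
                sigBytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
            Signature sig = Signature.getInstance("SHA256withRSA");
            sig.initVerify(pub);
            sig.update(all.substring(0, cut).getBytes());
            return sig.verify(sigBytes);
        }
    }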

  • Error writing the project file. Error loading type library / dll

    Hi all,
    I am trying to create a new portal project in PDK.NET, but I am getting this error: "Error writing the project file. Error loading type library / dll". Can anyone tell me how to solve this error?
    rgds
    rgds

    I get this same error. It was working fine until I installed the SAP .Net Connector 2.0. Now it does not work, even after the connector was uninstalled. Please help.
