Substitution Variables in Rules File as data source

Hey all,
This is a continuation of a previous post of mine. I have set some substitution variables at the all-applications, all-databases level, but when I go into the rules file (File > Open SQL), it doesn't give me any substitution variables to select (the drop-down is greyed out). I have a few variables, of which SQL_SOURCE is one.
Other DBs on the same server, same application are OK.
Thanks in advance
David

Hi,
I did a quick test relevant to your post.
What I noticed is: in the drop-down, I'm able to see only those substitution variables (i.e., data sources) whose values exist as DSNs on the server where EAS is installed.
In other words, I've done the below:
a) Created a DSN, SQL_Source_DSN
b) Created a substitution variable, sv_DSN, whose value is SQL_Source_DSN, and assigned it to all apps and DBs.
c) From the Data Prep Editor, when I choose Open SQL, I'm able to see and use SQL_Source_DSN.
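If it helps, a quick way to confirm that a DSN actually resolves on the box where EAS runs is to open it directly. A minimal Python sketch, assuming pyodbc is installed and using placeholder credentials:

import pyodbc

# Try to open the DSN exactly as the server would; a failure here means
# the DSN is missing or misconfigured on this machine.
conn = pyodbc.connect("DSN=SQL_Source_DSN;UID=user;PWD=secret", timeout=5)
print(conn.getinfo(pyodbc.SQL_DBMS_NAME))  # e.g. "Microsoft SQL Server"
conn.close()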
- Natesh

Similar Messages

  • Performance issue with big CSV files as data source

    Hi,
    We are creating Crystal Reports for a large banking corporation with CSV files as the data source. For some reports, we need to join 2 CSV files. The problem we have is that when the 2 CSV files are very large (both >200 MB), performance is very bad and it takes an hour or so to refresh the data in the Crystal Reports designer. The case is the same for either CR 11.5 or CR 2008.
    And my question is, is there any way to improve performance in such situations? For example, can we create an index on the CSV files? If you have ever created reports connecting to CSV, your suggestions will be highly appreciated.
    Thanks,
    Ray

    Certainly a reasonable concern...
    The question at this point is, How are the reports going to be used and deployed once they are in production?
    I'd look at it from that direction.
    For example... They may be able to dump the data directly to another database on a separate server that would insulate the main enterprise server. This would allow the main server to run the necessary queries during off peak hours and would isolate any reporting activity to a "reporting database".
    This would also keep the data secure and encrypted (it would continue to enjoy the security provided by an RDBMS). Text & csv files can be copied, emailed, altered & deleted by anyone who sees them. Placing them in encrypted .zip folders prevents them from being read by external applications.
    Hope you liked the sales pitch I wrote for you to give to the client... =^)
    If all else fails and you're stuck using the CSV files, at least see if they can get it all into one file. Joining the 2 files is killing you performance-wise... more so than using 1 massive file.
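    If the two files share a key column, another workaround is to pre-join them once outside Crystal so the report reads a single flat file. A rough Python sketch, assuming pandas is available; the file names and the key column account_id are invented for illustration:

    import pandas as pd

    # Load both large CSVs and join them once, outside Crystal Reports.
    left = pd.read_csv("transactions.csv")
    right = pd.read_csv("accounts.csv")

    # Inner join on the shared key; change 'how' if unmatched rows must be kept.
    merged = left.merge(right, on="account_id", how="inner")

    # Write a single file for the report to consume.
    merged.to_csv("merged_for_crystal.csv", index=False)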
    Jason

  • Substitution Variables in rules file to replace a member name with another

    Hi everybody,
    Can I use substitution variables to replace a member name with another name in a rules file?
    Please let me know if we can use substitution vars.
    Essbase version: 11.1.2
    Thanks,
    K.as

    Are you the same guy?
    http://www.network54.com/Forum/58296/thread/1287447433/Susbtitution+Variables+in+rules+file+to+replace+a+member+name+with+another+name
    The answer is still, "No".
    Regards,
    Cameron Lackpour

  • 2.5 GB CSV file as data source for Crystal report

    Hi Experts,
    I was asked to create a Crystal Report using a CSV file as the data source; the file is pretty huge (2.4 GB). Could you point me to any doc that explains the steps, mainly the data connectivity?
    The objective is to create a Crystal Report using that CSV file as the data source, save the report as .rpt with the data, and send the results to the customer to be read with Crystal Reports Viewer, or save the results to PDF.
    Please suggest the steps, as I am new to Crystal Reports and to CSV as a source.
    BR, Nanda Kishore

    Nanda,
    The issue of having some records with commas and some with semicolons will need to be resolved before you can do an import. Assuming that there are no semicolons in any of the text values of the report, you could do a "Find & Replace" to convert the semicolons to commas.
    If find & replace isn't an option, you'll need to get the files separately.
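    For the find & replace itself, anything that streams the file works. A throwaway Python sketch (file names are placeholders) that avoids loading a multi-GB file into memory, assuming no semicolons occur inside text values:

    # Convert semicolon-delimited lines to comma-delimited, one line at a time.
    with open("source.csv", "r", encoding="utf-8") as src, \
         open("normalized.csv", "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(line.replace(";", ","))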
    I've never used the Import Export Wizard myself. I've always used the BULK INSERT command.
    It would look something like this...
    BULK INSERT SQLServerTableName
    FROM 'c:\My_CSV_File.csv'
    WITH (FIELDTERMINATOR = ',')
    This of course implies that your table has the same columns, in the same order as the csv files and that each column is the correct data type to accept the incoming data.
    If you continue to have issues getting your data into SQL Server Express, please post in one of these two forums
    [Transact-SQL|http://social.msdn.microsoft.com/Forums/en-US/transactsql/threads]
    [SQL Server Express|http://social.msdn.microsoft.com/Forums/en-US/sqlexpress/threads]
    The Transact-SQL forum has some VERY knowledgeable people (including MVPs and book authors) posting answers.
    I've never posted to the SQL Server Express forum, but I'm sure they can troubleshoot your issues with the Import Export Wizard.
    If you post in one of them, please copy the post link back to this thread so I can continue to help.
    Jason

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    I have a couple of questions regarding data extraction.
    1. If you have a flat file DataSource, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    5. If you want to add a field to a DataSource after 6 months of using it, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now, given there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Thanks!

    Hi,
    1. If you have a flat file DataSource, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, hence you can't change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    When we don't find a suitable standard extractor, we go for a generic one (e.g. if I want sales information along with finance information in one DataSource, there is generally no standard extractor for that, hence we go for a generic DS).
    Check the below link for More about generic DS .
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Time Stamp (if the table has a timestamp field, the last-changed records can be identified by it, so it is easy to get the delta based on the timestamp);
    Calend. Day (if the table doesn't have a timestamp field, look for a calendar-day field, so the delta can be based on the date on which documents were changed);
    Numeric Pointer (if the table has neither of the above, we go for this option, where the delta is based on a steadily increasing numeric value, such as a document number).
    A generic sketch of the timestamp approach follows below.
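    To make the timestamp option concrete, here is a generic illustration in plain Python with SQLite; this is not SAP BW code, and the table and column names are invented:

    import sqlite3

    conn = sqlite3.connect("source.db")  # stand-in for the source system

    # Checkpoint persisted from the previous extraction run.
    last_checkpoint = "2011-01-01 00:00:00"

    # The delta is just the rows changed since the checkpoint.
    rows = conn.execute(
        "SELECT doc_no, amount, changed_at FROM sales_docs "
        "WHERE changed_at > ? ORDER BY changed_at",
        (last_checkpoint,),
    ).fetchall()

    # After a successful load, advance the checkpoint for the next run.
    if rows:
        last_checkpoint = rows[-1][2]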
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic DataSource is nothing but extracting data directly from the database table, without any interface between the application systems, so no setup table is involved.
    4. What are the steps in terms of InfoPackages before, during and after delta initialization? I believe that you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method and why?
    Correct
    5. If you want to add a field to a DataSource after 6 months of using it, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does it work now, given there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DS), you will get data from that date onwards, not historical data; the setup table does not come into it (delta records come from the delta queue, not from the setup table).
    If you want historic data for the new field, then you need setup table deletion, etc.
    Hope it is clear.
    Regards,
    Satya

  • How to configure crystal report xml file as data source in BOE in Solaris?

    Hi,
    How do I configure Crystal Reports with an XML file as the data source on Solaris? I didn't find any suitable driver for XML / Excel files on Sun Solaris.
    Which driver do I have to use to connect an XML file to Crystal Reports so I can view my report in Solaris BOE?
    And the same question for an Excel file as a data source for Crystal Reports.
    Thanks

    Hi Don, thanks for the reply.
    In a Windows environment I do not have any problem creating a Crystal Report from an XML file or an Excel file. After creating the reports, when I publish them to the BOE server on Solaris, I get a connection failed error.
    My Solaris BOE server doesn't have any network connection to Windows machines, so I have to place the files on the Solaris server.
    Below are the steps I tried:
    1. Created crystal reports in CR Designer on Windows using the ADO.Net (XML) driver, and in another attempt the XML Web Services driver. The reports work well, as they should.
    2. Saved to the BOE repository on the Solaris server from Crystal Reports and changed the database configuration settings as follows:
        - Used custom database logon information and specified cr_xml as the custom driver.
        - Changed the database path to the file directory path according to the Solaris server file path </app/../../>
        - Tried a table prefix also.
        - Selected the radio button "Use same database logon as when report is run" and saved.
    My environment:
    SAP BO XI 3.1 SP3
    Crystal Reports 2008 SP3
    SunOS
    CR development on Windows 7.
    For Excel I tried ODBC on Windows, but I can't find any ODBC or JDBC drivers for Excel on Solaris.
    Any help to solve these issues would be appreciated.
    Thanks
    Nagalla

  • Using environment variable in rule file

    Hi,
    I want to do a data copy from one cube (scenario Forecast) to another (scenarios FcstJan, FcstFeb, and so on) based on which month is currently running. I want to use an environment variable in the rule file. The Forecast data will be extracted to .txt and imported into the second cube under FcstJan if the current month is Jan, and likewise for Feb, Mar, and so on. CurrentMth will be my environment variable.
    Can anyone please let me know how I can use that variable in the rule file, or whether there is another way to load the same data?
    Thanks.

    You could just set a substitution variable and then use the substitution variable in the header of the rule.
    Have a read of: Re: Data error
    to understand using substitution variables in the header of a rule file.
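    Since the value should track the current month automatically, one option is a small scheduled script that derives the scenario name and pushes it into the substitution variable through the MaxL shell. A minimal sketch, assuming essmsh is on the PATH; the host and credentials are placeholders:

    import subprocess
    import tempfile
    from datetime import date

    # Derive this month's scenario name, e.g. "FcstJan".
    scenario = "Fcst" + date.today().strftime("%b")

    # MaxL script: log in and repoint the substitution variable.
    maxl = (
        "login admin identified by password on essbase_host;\n"
        f"alter system set variable CurrentMth '{scenario}';\n"
        "logout;\n"
    )

    with tempfile.NamedTemporaryFile("w", suffix=".msh", delete=False) as f:
        f.write(maxl)
        script = f.name

    subprocess.run(["essmsh", script], check=True)

    Run on the first of each month, this keeps the rule-file header pointing at the right FcstXxx scenario without touching the rule file itself.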
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Simple creation of Update Rule from BW Data Source

    Hi guys,
    Pertaining to standard SAP Business Content extractors:
    I am referring to InfoCube: 0PA_C01 (Headcount and Personnel Actions).
    I am attempting to create an Update Rule from InfoSource: 0HR_PA_PA_1 (Headcount).
    This InfoSource: 0HR_PA_PA_1 (Headcount) is connected to a BW DataSource (not R/3!), 0HR_PA_PA_1.
    I have created an InfoPackage for this InfoSource and managed to get 15 records (in contrast to my 68800 records from InfoSource: 0HR_PA_0 (Employee)).
    So, when I create an Update Rule to connect InfoCube: 0PA_C01 (Headcount and Personnel Actions) to InfoSource: 0HR_PA_PA_1 (Headcount), I get the following error:
    ERROR: IC=0PA_C01 IS=0HR_PA_PA_1 error when checking the update rules
    Could you please also advise why I only get 15 records for DataSource 0HR_PA_PA_1?
    P/S: I am on BW 3.5

    Hey Rohini,
    This DataSource: 0HR_PA_PA_1 (Headcount) is tricky to me because it's a BW DataSource.
    The exact error message is as follows:
    "Error Message: RSAU461
    IC=0PA_C01 IS=0HR_PA_PA_1 error when checking the update rules"
    My exact problem is that I don't see any values for the following fields in my InfoCube: 0PA_C01 (Headcount and Personnel Actions):
      Country;
      Country Code;
      Gender;
      Nationality;
      Language;
      Postal Code;
      Region;
      Position;
      Job;
      Payroll Area;
      Payroll Group;
    Pay Scales;
    Pay Grades
    This is because this information is supplied by InfoSource: 0HR_PA_PA_1,
    but I don't have an Update Rule for this InfoSource in my InfoCube: 0PA_C01.
    So, that's why I am attempting to create this additional Update Rule.
    And also, could someone enlighten me as to why SAP would not include such a standard Update Rule when they have already identified the needed fields in the Cube? This is supposed to be STANDARD, workable Business Content, right?
    P/S: I have applied Note 336229.

  • SQL Agent Job failing - not using credentials in the config file for Data source

    Hi
    We have an SSIS pkg that is scheduled as a SQL Agent job using a proxy account. The pkg contains data sources for connecting to different SQL Servers, and the proxy account does not have access to the external DBs. The data source credentials are stored in the config file.
    Why is the job not using the credentials in the config file, and instead trying to use the proxy account and failing?
    Does the proxy account need access to all the external DBs in the pkg? If so, what is the purpose of the config file?
    I am sorry, I am not an SSIS person, just trying to understand. If anyone can explain, that will be great!
    Thank you!
    VR

    Please take a look at these URLs:
    Schedule a Package by using SQL Server Agent
    SSIS package does not run when called from a SQL Server Agent job step
    Cheers,
    Saeid Hasani
    Database Consultant

  • Not able to add an excel file as data source to create new values in the mapped domain

    Hi
    I am trying to use some sample data from an Excel file to improve the quality of a Knowledge Base I have created in the Data Quality Client. I followed the following steps:
    Knowledge Discovery
    Data Source: Excel File
    Browsed to the Excel File on my local drive.
    I'm getting the following error: Failed reading Excel File
    I have checked the security settings and provided full control to the user. I'm not sure what the issue is here. I am totally new in this field and, with my little knowledge, am trying to build a Knowledge Base, but this hurdle has stopped me.
    Thanks in Advance

    Hi
    You can add an Excel file as the data source for a universe; below are the steps.
    Ensure that the following steps are done before inserting a table from Excel sheet into a universe.
    1. Go to Excel and highlight all the cells that you want in the same table.
    2. Go to the Name box from Insert->Name->Define and give a name (in Designer, you see this name as a table of all the values you have selected).
    3. Go to Designer and pop up the table browser. You have to drag and drop the name you gave.
    Also check the SP level of your BO installation.
    You can use a .xls file as a data source, but you cannot use .xlsx (Office 2007).
    Regards,
    Rajesh

  • Error (IES 10901) (WIS 10901) when using Excel 2010 file as data source..

    Hello,
    I have an annoying issue with using an Excel 2010 file as a data source for a universe. Here is the background information.
    Setup.
    Client: Laptop running Win XP 32bit, Client tools installed with SP2 v9 (latest)
    Server: Win 2008 x64, BOE 4.0 SP2 v9  (latest)
    Task:
    1. On the client, with the information design tool, created a Project, DB Connection, and Data Foundation using an Excel source on a share:
    server\share\data.xlsx
    2. Published everything to the Repository.
    3. Tried to load the universe in Web Intelligence by creating a new report. I drag across the dimensions. When I go to preview the data, I receive the error:
    Database error: [Microsoft][ODBC Excel Driver] External table is not in the expected format. (IES 10901) (WIS 10901)
    4. If I change the Excel file to the 2003 format (*.xls), it all works correctly. The issue is that 2003 cannot hold as much data as a 2010 file, so I need to use 2010.
    Extra Notes:
    1. Both the client and server have the latest Access drivers http://www.microsoft.com/download/en/details.aspx?id=13255
    2. Both client and server have a system DSN setup using the 32bit ODBC, same DSN names to the network share.
    3. The data excel source sits on the server on a share.
    4. I believe that the issue is within BOE: when you connect to the Excel file via ODBC using Crystal 2011, you need to right-click on the connection and tick "System Tables" to be able to use the Excel 2010 file. The BOE information design tool does not seem to have this option.
    Any advice on this would be great.
    thanks
    david

    Issue Resolved:
    Install Access Database Engine 2010 Service Pack 1 on the server:
    http://support.microsoft.com/kb/2460011
    Reboot server once completed.

  • Multiple excel files - ODBC Data Source Administrator

    I have about 1750 Excel files that I need to convert to text format.
    Is it possible to ... configure all files in the ODBC Data Source Administrator window?

    If you need these DSNs for Java only, you can define them directly inside the connect string.
    Look at these previous topics:
    http://forum.java.sun.com/thread.jsp?forum=48&thread=150956
    http://forum.java.sun.com/thread.jsp?forum=48&thread=218662
    Call again, if you don't succeed.
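    If the end goal is simply converting the 1750 workbooks to text, it may be easier to skip ODBC entirely and convert them in a loop. A rough Python sketch, assuming pandas (with an Excel engine such as xlrd or openpyxl) is installed; the folder paths are placeholders:

    from pathlib import Path
    import pandas as pd

    src_dir = Path("C:/excel_files")   # folder holding the ~1750 workbooks
    dst_dir = Path("C:/text_files")
    dst_dir.mkdir(exist_ok=True)

    for xls in src_dir.glob("*.xls*"):
        # Read the first sheet and write it out as tab-delimited text.
        df = pd.read_excel(xls, sheet_name=0)
        df.to_csv(dst_dir / (xls.stem + ".txt"), sep="\t", index=False)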

  • Replacing the Header in rule file without changing source file(header defi)

    Hi,
    I am loading the data using the header in the source file (.xls).
    I need to load the same file but with a different header (say, instead of A1 I need A2) without any changes in the source file.
    Can someone please provide information on how to change the header without opening/changing the source file?
    Thanks in advance
    Anubhav Bisht

    Hi SM,
    Thanks for the reply.
    The problem is that there are two members in the header defined in the source file that I need to change while loading the data.
    The load rule is picking the header up from the source file, so replacing in the field is not working.
    Do you have any other option/suggestion?
    Regards
    Anubhav

  • Changes to source data on rules file

    Hi,
    Is there any way to replace the data which comes into the rules file from the text source file? (I don't want to change the source text file.)
    Ex: replace AAAA with BBBB?
    I know some changes we can make, like
    creating text, splitting the data, merging the data, moving a column.

    Hi, I need one more bit of help.
    My data source in the text file is like below:
    field1(---space---),field2(---space---),field3
    xxxx(---space---),yyyy(---space---),zzzz
    xxxx(---space---),yyyy(---space---),zzzz
    In the rule file I have used comma as the delimiter, and
    I am trying to combine field1 and field2 to get the result xxxxyyyy.
    The result comes out in one field, but the value has spaces in between: xxxx(---space---)yyyy.
    Note: just for understanding purposes I wrote (---space---); it is nothing but blank spaces.
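    Outside the rule file, the padding is easy to see and strip. A tiny Python sketch (the file name is a placeholder) showing that trimming each field before concatenating is what produces xxxxyyyy:

    # Each field is space-padded, so trim the pieces before joining them.
    with open("load_file.txt", "r") as f:
        for line in f:
            fields = [part.strip() for part in line.split(",")]
            print(fields[0] + fields[1], fields[2])  # "xxxxyyyy zzzz"

    In the load rule itself, the equivalent step is removing the trailing spaces from each field before the join.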

  • Unable to collect Update/Transfer rules Info & Data sources from BI content

    Hi,
    We are trying to activate InfoProvider 0PY_CO2 from BI Content by selecting the grouping option "In Data Flow Before". BI is unable to collect all corresponding objects, such as update rules, the InfoSource, transfer rules, and the DataSource for the selected InfoProvider. Only the InfoArea and associated InfoObjects are being collected for the installation.
    We are experiencing the same problem while activating most of the InfoProviders. However, in some cases some of these objects get activated, but not all of the necessary objects that feed data to the InfoProvider (cube or MultiProvider).
    Also, InfoPackages and InfoObject catalogs are not being collected for any objects activated/installed so far from the BI Content.
    We are trying to activate BW 3.5 content in a BI 7.0 environment. The BI Content release is 703 and the support pack level is 008.
    Kindly share any pointers to troubleshoot this problem, please.
    Thanks
    Venkat

    Shiv,
    For some InfoProviders, all associated objects (including transfer and update rules, the InfoSource, and the DataSources) are getting collected and were successfully installed. But this is true for only about 10% of the InfoProviders that we have installed so far.
    For all the remaining objects, we are unable to proceed further.
    Thanks
    Venkat.
