Loading data into a cube from a DSO and a flat file

Hi All,
I have an Infocube with fields
product
plant
customer
quantity
country
I am trying to load data into it, a few fields from a DSO and the others from a flat file,
e.g. plant and country from the DSO,
and product, customer and quantity from the flat file.
I have created 2 transformations:
one from the DSO -> cube (which works);
the other transformation does not get activated -- it gives an error message saying "no source fields are assigned to the rule".
Is it possible to make a load of this sort? If so, why is the transformation giving that error?
Please do help!
Thanks and Regards,
Radhika

Dear Friend,
Not sure what you have done; it should be like this:
DataSource --- DSO1 --- Cube1
Flat file --- DSO (optional) --- Cube2
Then you should build a MultiProvider on top of these cubes (Cube1 and Cube2) and create the query on that.
Please check whether this is what you have done.
Hope this helps.
Thanks
sukhi

Similar Messages

  • How to load data into a cube from multiple InfoSources?

    Hi friends,
    How do I load data into a cube from multiple InfoSources? Could you please answer this question?
    Thanks in advance.

    Hi ,
    Say, for example, you need to load data into 1 cube from 3 InfoSources:
    1) You need to create 3 update rules for the cube.
    2) Each time you create an update rule, mention the name of the InfoSource and create the update rules correspondingly.
    Regards
    satish

  • Data is not uploaded from the DSO to the cube

    Dear Experts,
    In one of my process chains the data is not uploaded from the DSO to the cube.
    I have tried to upload the data month-wise as well, but still, after a certain number of data records, the load gets stuck.
    I have also recreated the indexes.
    When I check DB02, the tablespace is shown as 18491 MB, and the used space is 97%.
    Please suggest.

    Hi.....
    I didn't get your point about recreating the index before loading.
    Basically, the process should be: Delete index --> Load --> Create index.
    You have mentioned that 97% of the space is in use; please check with the Basis team on that.
    Also, check in SM37 whether there is any job which has been running for a long time but is not progressing, or which you think you don't need, and kill that job.
    Regards,
    Debjani.....

  • Loaded data amount in the cube and the amount in the data monitor

    Hi,
    when I load data into the cube, the inserted data amount in the administration section shows 650,000 data sets. The monitor of that request shows a lot of data packages. When I sum the data packages, the total is about 700,000 data sets.
    Where does the difference come from?
    Thanks!

    Hi ,
    If it is a full load to the cube, all the records are updated in it, since in a cube data can be overwritten.
    If it is a delta load and you want to see why the difference occurs between the records transferred and the records added to the cube,
    you can go to the Manage screen of the DSO, go to the Contents tab, and click the Change Log button at the bottom. Check the number of entries in that table: those entries are the records added to the cube, since only these records are new; the other records with the same key are already present in the cube.

  • Do we have to load data into a BPS cube before entering values?

    Hello,
    Can someone explain the flow of data in BPS? I have created a copy of a regular cube and will be working on making it into a transactional cube (as per help.sap.com). So do I need to load the same data into the transactional cube from R/3 as I load into the regular cube, or do I just have to create the planning folders and layouts and let the users enter their planning values?
    Do we have to send the data that users input back to R/3?
    In short, if someone can explain the flow of data to me, it would be great.
    Thanks a lot in advance.

    Hi,
    If you are working on Funds Management, then I think there's an extractor for it, and clients usually plan and retract the data back to SAP. Also, loading data is not just for comparing plan and actuals; the client might be reporting on the data.
    BPS is a part of BI; the main area that is normally implemented first is the BW part, which is for reporting, and then you go for BPS.
    What I am not sure about is that these are very generic questions; can you be more specific as to what problem you are facing? All the points mentioned by you would be handled during a blueprinting session.
    thanks

  • Input-ready query is not showing loaded data in the cube

    Dear Experts,
    With the input-ready query we have the problem that it does not show values that were not entered through that query. Is there any setting in the input-ready query that we can use to show the data loaded into the cube as well as the data entered through the input-ready query itself?
    Thanks,
    Gopi R

    Hi,
    input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So you can check the status of the requests in the real-time InfoCube. There should be only green requests and at most one yellow request.
    In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
    1. InfoCube/!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
    2. MPRO/!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the MultiProvider
    If the input ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a multiprovider the second case is relevant. If the input-ready query is defined on a multiprovider containing aggregation levels again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
    Regards,
    Gregor

  • How to load data into an ODS from multiple InfoSources

    Hi all,
    I am given a task to load data into an ODS from 3 InfoSources.
    Can someone please give me the flow?
    Thank you in advance.

    Hi Hara Pradhan,
    You have to create 3 update rules, giving one of the 3 different InfoSources while creating each update rule. And you have to create the InfoPackages under each InfoSource. With this you can load the data into the same data target from multiple InfoSources.
    Hope it helps!
    Assign points if it helps!

  • Getting an error while loading data into an ASO cube from a flat file

    Hi All,
    I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
    Does anyone have a solution?
    Regards,
    VM

    Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you using a load rule as well? Also, which version of Essbase and ODI are you using?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How to convert from a SQL Server table to a flat file (txt file)

    I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
    1. Import/Export Wizard
    2. bcp utility
    3. SSIS
    1. Import/Export Wizard
    First and most manual technique is the Import Wizard. This is great for ad-hoc and just-slam-it-in tasks.
    In SSMS, right-click the database you want to import into. Scroll to Tasks and select Import Data…
    For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing here with this file I picked is that there are quote (") qualifiers, so we need to make sure we add " into the text qualifier field. The wizard will not do this for you.
    Go through the remaining pages to view everything. No further changes should be needed, though.
    Hit Next after checking the pages out and select your destination. This in our case will be DBA.dbo.zips.
    Following the destination step, go into the Edit Mappings section to ensure the types and counts look good.
    Hit Next and then Finish. Once completed, you will see the count of rows transferred and the success or failure result.
    Import wizard completed, and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited by the formatting of the file being imported. For this file it actually works well, with a small format file to show the contents and the mappings to SQL Server.
    To create a format file, all we really need is the type and the count of columns for the most basic files. In our case the qualifier makes it a bit difficult, but there is a trick to ignoring it. The trick is basically to throw a field into the format file that references the qualifier but ignores it in the import process.
    Given that, our format file in this case would appear like this:
    9.0
    9
    1 SQLCHAR 0 0 "\"" 0 dummy1 ""
    2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
    3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
    4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
    5 SQLCHAR 0 50 "\"," 4 Field4 ""
    6 SQLCHAR 0 50 "," 5 Field5 ""
    7 SQLCHAR 0 50 "," 6 Field6 ""
    8 SQLCHAR 0 50 "," 7 Field7 ""
    9 SQLCHAR 0 50 "\n" 8 Field8 ""
    The bcp call would be as follows:
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "C:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run, you should see this in the command prompt after executing the statement:
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    ...
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
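    Note that the original question asked for the opposite direction (SQL Server table to flat file). bcp covers that as well, via the out and queryout keywords. A minimal sketch, assuming the same DBA..zips table and server as above (the output file names are made up for illustration):
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips out "C:\zips_export.txt" -c -t, -S LKFW0133 -T
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp "SELECT Field1, Field8 FROM DBA..zips" queryout "C:\zips_subset.txt" -c -t, -S LKFW0133 -T
    Here -c exports in character mode and -t, sets a comma as the field terminator; queryout exports the result of an arbitrary query instead of a whole table.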
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
    CREATE TABLE zips (
    Col1 nvarchar(50),
    Col2 nvarchar(50),
    Col3 nvarchar(50),
    Col4 nvarchar(50),
    Col5 nvarchar(50),
    Col6 nvarchar(50),
    Col7 nvarchar(50),
    Col8 nvarchar(50)
    )
    GO
    INSERT INTO zips
    SELECT *
    FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
    FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
    ) as t1 ;
    GO
    That was simple enough, given the work on the format file that we already did. BULK INSERT isn't as fast as bcp but gives you some freedom from within T-SQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground: SSIS.
    We can use many methods in SSIS to get data from point A to point B. I'll show you the Data Flow task and the SSIS version of BULK INSERT.
    First, create a new Integration Services project.
    Create a new flat file connection by right-clicking the Connection Managers area. This will be used in both methods.
    Bulk insert
    You can use a format file here as well, which is beneficial when moving methods around. This essentially calls the same process, with format file usage. Drag over a Bulk Insert task and double-click it to go into the editor.
    Fill in the information, starting with the connection. This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and, again, we have some data.
    Data Flow method
    Bring over a Data Flow task and double-click it to go to the Data Flow tab.
    Bring over a Flat File Source and a SQL Server Destination. Edit the Flat File Source to use the connection manager ("the file") we already created. Connect the two once they are there.
    Double-click the SQL Server Destination task to open the editor. Enter the connection manager information and select the table to import into.
    Go into the mappings and connect the dots, so to speak.
    A typical type-conversion issue is Unicode to non-Unicode.
    We fix this with a Data Conversion task or an explicit conversion in the editor. Data Conversion tasks are usually the route I take. Drag over a Data Conversion task and place it between the Flat File Source and the SQL Server Destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
    Sense we’re in the SSIS package we can use that awesome “script task” to show SlqBulkCopy.  Not only fast but also handy for those really “unique” file formats we receive so often
    Bring over a script task into the control flow
    Double click the task and go to the script page.  Click the Design script to open up the code behind
    Ahsan Kabir
    http://www.aktechforum.blogspot.com/

  • I had to redownload cs6 from another drive and all the files are there but it wont open any of it

    I had to redownload CS6 from another drive and all the files are there, but it won't open any of them.

    Copying/transferring will not work.  You need to install the software using installation files.
    CS6 - http://helpx.adobe.com/x-productkb/policy-pricing/cs6-product-downloads.html

  • Not getting data in cube from lookup DSO

    Hi guys,
    I have a transformation with a source DSO and the target cube, in which I do a lookup on another DSO in the end routine. But I am not getting the records I need in the cube, which are supposed to flow from the other DSO. Any ideas would be appreciated.
    regards...

    Hi,
    When we do lookups in an end routine, we don't have active rules in the transformation for the looked-up fields.
    The transformation setting "Update behavior of end routine" has two options:
       Only target fields with active rules
       All target fields (independent of active rules)
    By default it is set to the first, so we have to change it to
    "All target fields (independent of active rules)" in order to get the lookup data updated independent of active rules.
    In older releases, where "Update behavior of end routine" was not available, we used to map the InfoObject to a constant instead.
    Check it once, because it happened to me before.
    Hope it works.

  • Error while loading data into the cube

    Hi,
    I loaded data into the PSA, and when I load the data into the cube through a Data Transfer Process, I get an error (red).
    Through "Manage", I can see the request in red. How do I get to know the exact error? Also, what could be the possible reason for this?
    Also, can someone explain the Data Transfer Process (not in a process chain)?
    Regards,
    Sam

    Hi Sam,
    After you load the data through the DTP (after clicking the Execute button), just go to the monitor screen and press the Refresh button; you can find the logs right there.
    Otherwise, in the request screen, beside the request number, you can see the log icon and click on that.
    DTP means:
    DTP is used for the data transfer process from the PSA to a data target,
    i.e. to load data into data targets or InfoProviders like DSOs, cubes, and so on.
    Check this link for DTP:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    In your case, the problem may be the date formats or special characters; check those.
    REGARDS
    @JAY

  • Loading the cube from 3 DataSources and getting 3 records for each key field

    Hi All,
    I am loading an InfoCube from 3 separate DataSources. These 3 DataSources are UD DataSources and their common source system is UD Connect.
    Each of the DataSources contains a unique key field, 'Incident Number' (the same as we would use in DataSources for a DSO).
    The problem is, when I load data with these 3 DataSources into the cube, there are 3 records for each 'Incident Number'.
    We have reports on this InfoCube, and the reports also display 3 records for each incident number.
    If I remove the Incident Number key field from 2 of the DataSources, the data from those DataSources does not reach the cube.
    For many of you this may be a minor problem (or may not be a problem at all!), but as a new joiner in the SAP field, this has become a showstopper issue for me.
    Please suggest.
    Thanks in Advance.

    Hi Pravender,
    Thanks for your interest.
    The scenario is, I have 3 DataSources from the same source system. All 3 DataSources have different fields except 'Incident Number', so each field has only one value in the report. But due to the 3 separate DataSources, the report shows 3 records, displaying the values of each DataSource in a separate record.
    There is no field in the query output which has different values for the different source systems. Due to the 3 records in the cube, one record will contain the value for a particular field and the other two records will show a blank for that field.
    Regards.

  • Loading data to a cube

    Hi All,
    We have created one cube and loaded data successfully.
    But we have one dimension named PeriodType; the members of the dimension are Annual, Quarterly and Monthly.
    The client asked to make Monthly + and the remaining members ~.
    For that we have created one view, in which we added one column, aggr_cons.
    I have defined it as + for Monthly and the rest as ~, and I'm using that column as the consolidation operator.
    But the data load is loading only a few records.
    Actually, the client asked me to create an attribute dimension, but that is not possible in EIS.
    We are using EIS 7.1.2 and SQL Server 2005.
    For the PeriodType dimension we have written the query like:
    select distinct PeriodType, aggr_cons
    from ClaimsData_2
    PeriodType is a column in the table and it contains Annual, Quarterly, Monthly.
    Please let me know any ideas on how to do this.
    The only thing is I have to make Monthly + and the remaining members ~.
    Thanks,
    prathap
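    For reference, the aggr_cons column described above could also be computed inside the view rather than stored in the table. A minimal sketch, assuming the ClaimsData_2 table and column names from the post (the view name vw_PeriodType is made up here):
    CREATE VIEW vw_PeriodType AS
    SELECT DISTINCT
        PeriodType,
        -- consolidation operator: + for Monthly, ~ for everything else
        CASE WHEN PeriodType = 'Monthly' THEN '+' ELSE '~' END AS aggr_cons
    FROM ClaimsData_2;
    EIS would then read PeriodType and aggr_cons from this view instead of the base table.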

    Hi Pratap,
    First, make the following changes in the data source (i.e. SQL Server, Oracle, whatever you are using):
    1- Create a new table, say 'Population', and add two columns, say id and population, with values like 100, 200, etc. Define id as the primary key.
    2- Now, assuming that you have a SQL table called 'Product', add a column called 'Attribute' and create a relationship with column 'id' of table 'Population' through a foreign key.
    Now make the following changes in the OLAP metadata and metaoutline:
    1- Suppose you have a Product dimension; enable one of its columns as an attribute.
    2- OK, now open the metaoutline and expand the Product dimension in the left panel. It will now show the attributes that you associated.
    3- Select an attribute and drag it to the right panel. It will create an attribute dimension automatically.
    Hope it answers you.
    Atul K,
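    A minimal T-SQL sketch of the data source changes in steps 1 and 2 above (the table and column names follow the post; the constraint name FK_Product_Population is made up here):
    CREATE TABLE Population (
        id INT PRIMARY KEY,    -- key that the attribute association joins on
        population INT         -- attribute values such as 100, 200, etc.
    );
    -- assuming an existing Product table, as in step 2
    ALTER TABLE Product
        ADD Attribute INT;
    ALTER TABLE Product
        ADD CONSTRAINT FK_Product_Population
        FOREIGN KEY (Attribute) REFERENCES Population(id);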

  • Unable to load data into Essbase cube using Essbase Studio

    Hi
    We are creating an Essbase cube in Essbase Studio, using flat files as data sources.
    We have put the different hierarchies into different flat files and created one fact file containing the dimension intersections along with the data.
    We are able to create the cube and the hierarchies, but we are not able to load any data.
    We are getting the following error:
    Failed to deploy Essbase cube.
    Caused by: Unable to perform dataload from more than one flat file.
    Could anyone please help with this?

    Oh this was killing me, so I did this test in 11.1.1.3:
    1) Excel 2007 format -- no go, Essbase didn't see it
    2) Excel 2003 format, three sheets -- only the first sheet was read into an empty rule
    3) Excel 2003 format, one sheet -- the first sheet was read into an empty rule
    4) Excel 95 format, one sheet -- the first sheet was read into an empty rule
    The lesson?
    1) Excel 2007/2010 sheets don't work (no surprise there as the .xlsx format isn't supported).
    2) Excel 2003 and lower (hey, if you have Excel 4, I'll bet that works as well) work, but only the first sheet is recognized.
    Regards,
    Cameron Lackpour
