Can CLOUD data source be joined to a combined data source?

Hi experts,
I have created a cloud data source and I am trying to join it with a combined data source, but I am running into a problem.
This is how I joined the two data sources. I checked the joined data source and it looks OK.
When I open the report, the error appears as shown.
The error is just one line, so I simply don't know how to solve this.
Any help is greatly appreciated.
Thanks in advance.
Regards,
Fred.

Hi Murali,
I saw that the solution you provided seems to work for the user.
What I want to do is transfer the hierarchy from 0GL_ACCOUNT to another InfoObject, 0GLACCEXT. However, when I right-clicked the hierarchy and tried to create transfer rules, the Transfer Rules dialog offered only DataSource in the source selection, so I cannot select 0GL_ACCOUNT_HIER as my source.
Any advice?
Many thanks,
Vince

Similar Messages

  • Can SharePoint 2010 read data from an external SQL Server 2012 data Source?

    Hi,
    I would like to know whether SharePoint can read data from a SQL Server 2012 external data source.
    I am not talking about the SharePoint internal database; I need to get data from a SQL database that is used by another application. For my SharePoint server I am using SQL Server 2008 R2 as the internal database engine, but I need to get some other data from another data source that is configured in SQL Server 2012. Can anyone tell me whether there is any problem accessing data from SQL Server 2012? If there is no problem, please let me know which SQL data source versions are compatible with SharePoint 2010 and SharePoint 2010 SP1.
    Thanks!
    Regards,
    Henoy 

    Hi Romeo ,
    I have already visited this page, but there is no answer to my question. I just want to know whether we can do the same from a SQL Server 2012 server.
    Please help me to understand whether SharePoint 2010 is compatible with getting data from a SQL Server 2012 external data source.
    Thanks for your instant reply.
    Regards,
    Henoy TM.
    +919035752610

  • HT4847 How can I download my backup data? And how can I manage the data on iCloud?

    How can I download my backup data? And how can I manage the data on iCloud?

    You can't download an iCloud backup, except to restore it to your device should you ever need to.
    iCloud data can be managed within the apps on your iPad. Any changes you make within the apps whose data you are syncing with iCloud will be reflected in iCloud. You can also manage some of this data on icloud.com from your computer.
    This article explains ways to manage your iCloud storage space, should you need to reduce your iCloud storage: http://support.apple.com/kb/ht4847.

  • Using SSIS 2012 - merge join component to transfer data to destination provided it does not exist

    HI Folks,
    I have a table - parts_amer and this table exists in source & destination server as well.
    CREATE TABLE [dbo].[Parts_AMER](
     [ServiceTag] [varchar](30) NOT NULL,
     [ComponentID] [decimal](18, 0) NOT NULL,
     [PartNumber] [varchar](20) NULL,
     [Description] [varchar](400) NULL,
     [Qty] [decimal](8, 0) NOT NULL,
     [SrcCommodityCod] [varchar](40) NULL,
     [PartShortDesc] [varchar](100) NULL,
     [SKU] [varchar](30) NULL,
     [SourceInsertUpdateDate] [datetime2](7) NOT NULL,
     CONSTRAINT [PK_Parts_AMER] PRIMARY KEY CLUSTERED
     (
      [ServiceTag] ASC,
      [ComponentID] ASC
     )
    )
    I need to execute the following query using SSIS components so that only the data that does not exist at the destination is transferred:
    select source.*
    from parts_amer source left join parts_amer destination
    on source.ServiceTag = destination.ServiceTag
    and source.ComponentID=destination.ComponentID
    where destination.ServiceTag  is null and destination.ComponentID is null
    Question: can the Merge Join component help with this?
    Please help out.
    Thanks.

    Hi Rvn_venky2605,
    The Merge Join Transformation is used to join two sorted datasets using a FULL, LEFT, or INNER join, so it is not suitable for your scenario. As James mentioned, you can use a Lookup Transformation to redirect the non-matched records to the destination table.
    Another option is to write a T-SQL script that uses the MERGE statement and execute it via an Execute SQL Task.
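    A minimal sketch of that second option, assuming the source copy of Parts_AMER is reachable from the destination connection via a linked server (the name [SourceServer].[SourceDb] below is a hypothetical placeholder):
    -- Insert only the rows from the source copy of Parts_AMER that do not yet
    -- exist at the destination, matching on the primary key columns.
    MERGE [dbo].[Parts_AMER] AS dest
    USING [SourceServer].[SourceDb].[dbo].[Parts_AMER] AS src
        ON  dest.ServiceTag  = src.ServiceTag
        AND dest.ComponentID = src.ComponentID
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ServiceTag, ComponentID, PartNumber, [Description], Qty,
                SrcCommodityCod, PartShortDesc, SKU, SourceInsertUpdateDate)
        VALUES (src.ServiceTag, src.ComponentID, src.PartNumber, src.[Description], src.Qty,
                src.SrcCommodityCod, src.PartShortDesc, src.SKU, src.SourceInsertUpdateDate);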
    References:
    http://oakdome.com/programming/SSIS_Lookup.php 
    http://www.mssqltips.com/sqlservertip/1511/lookup-and-cache-transforms-in-sql-server-integration-services/ 
    Regards,
    Mike Yin
    TechNet Community Support

  • The key columns of the country measure group attribute do not match in either number or data types to the key columns of the source attribute

    I have a country dimension which is used in 5 cubes. Now I want to change the 'key column' property of the country attribute. Whenever I do this, I get the error "The key columns of the country measure group attribute do not match in either number or
    data types to the key columns of the source attribute". I don't understand what this error is about.
    Can someone please help? Thanks in advance.
    -Regards,
    Raj Patil

    Sounds like you need to check the Dimension Usage tab to verify the relationships between the dimension and the fact on the measure group.
    Hi Talktorajpatil,
    As Jon said, you can verify the relationships between the dimension and the fact on the measure group on the Dimension Usage tab. Use this section to define how you "join" your measure groups to your dimensions. There may be a dimension-measure-group relationship
    that is defined using the wrong attributes, and you will need to select the correct attributes to link the dimensions to the measure groups. Here are some similar threads for your reference.
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/74203b66-8a71-4681-8e47-8f99cce87b3d/error-on-the-measure-group-which-do-not-match-the-data-type-of-the-key-column?forum=sqlanalysisservices
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/2421058d-fd4a-44b8-8c7c-b0b349bbef2d/measure-group-attribute-key-column-does-not-match-source-attribute?forum=sqlanalysisservices
    Hope this helps.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Validate data in R/3 while uploading from an external source to BW

    Hi
    My requirement is that while uploading data (2 to 5 lakh, i.e. 200,000 to 500,000, records) from an external source file (XLS or Notepad) to BW, I have to check whether that data is present in the R/3 system or not.
    Can anyone suggest the best way to do this validation (do any function modules exist for it?) with performance in mind,
    and also list the function modules that give the best performance?

    You need to explain it in more detail:
    the process,
    the document number,
    the keys used to match against SAP for validating the records,
    etc.
    Regards,
    Madan

  • The ADO NET Source was unable to process the data. ORA-64203: Destination buffer too small to hold CLOB data after character set conversion.

    We developed an SSIS package to pull data from an Oracle source to SQL Server 2012. We used an ADO.NET source to pull the records, but we get the error below after pulling some 40K records.
    [ADO NET Source [2]] Error: The ADO NET Source was unable to process the data. ORA-64203: Destination buffer too small to hold CLOB data after character set conversion.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ADO NET Source returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
    Is there anything that we can do to fix this?

    Hi,
    I tried both:
    * Having the schema type as Nvarchar(max) - I get the same error.
    * Using an OLE DB source instead of the ADO.NET source, with the "Oracle Provider for OLE DB" driver - I get the error below.
    [OLE DB Source [478]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC0202009. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
    Additional info:
    * Here it is the source task that fails, not the conversion or destination task.
    Thanks,
    Loganathan A.
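    A workaround sometimes used for ORA-64203 (not confirmed in this thread, and only acceptable if truncating long values is OK) is to convert the CLOB to a plain VARCHAR2 inside the Oracle source query itself, so the provider never has to buffer the full LOB. A minimal sketch, assuming a hypothetical source table SRC_TABLE with a key column SRC_ID and a CLOB column NOTES_CLOB (all names placeholders):
    -- Return at most 4000 characters of the CLOB as VARCHAR2, starting at offset 1;
    -- anything beyond that length is silently truncated.
    SELECT t.SRC_ID,
           DBMS_LOB.SUBSTR(t.NOTES_CLOB, 4000, 1) AS NOTES_TEXT
    FROM   SRC_TABLE t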

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system, where I pull data from a source using a Direct Access DTP. (Does not extract from PSA, extracts from Source)
    The Source is a table from the Oracle DB and using a datasource and a Direct Access DTP, I pull data from this table into my BW Infocube.
    The DTP's package size has been set to 100,000 and whenever this load is triggered, a lot of data records from the source table are fetched in various Data packages. This has been working fine and works fine now as well.
    But, very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from source.
    It ends with the message "No more data records found" even though we have records waiting to be pulled. The DTP in the process chain does not even fail, and the chain continues to the next step with a "Green" status.
    Have you faced a similar situation in any of your systems?  What is the cause?  How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your Suggestions.
    Unfortunately, I am not able to implement any of your suggestions because I am not allowed to change the DTP settings.
    So I am working on finding the root cause of this issue and came across SAP Note 1506944 ("Only one package is always extracted during direct access"), which says this is a program error.
    Hence, I am checking further with SAP on this and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Siri can't find music unless I enable use cellular data for Music

    I have an iPhone 4S with the latest iOS. I ask Siri to play music, but "she" can't find it unless I enable Use Cellular Data for Music.
    As soon as I enabled this setting in the Settings -> Cellular section, Siri could find and play the music.
    I certainly don't understand why this is happening. Why isn't Siri able to just find what is local to my iPhone and play the music?
    Can someone enlighten me on why this would happen?

    Hi Chris,
    Songs are on the iPhone - there is no cloud symbol (I'm familiar with that as well). 
    I've updated my signature - I am running the latest OS X and iOS (thanks for reminding me).

  • Using report parameter in data set filter expression with an SSAS data source

    I have an SSRS report with an SSAS data source.
    Report parameters:
    Param1 - text, single select
    Param2 - text, multi-select
    Dataset:
    In Query Designer, I want to include Param1 as a filter expression so I can have "Dimension1 Begins with @Param2". I'm not sure of the exact syntax to make the parameter work in this.
    The point is to filter my data set on Param1: if A is selected for Param1, I want the data set to have a filter saying "Dimension1 begins with A".
    Does anyone know how to use a report parameter in the dataset filter expression for an SSAS data source?

    Hi,
    Try this; maybe the dates you are comparing are not in the same format.
    I have tested data templates in EBS, but not with dates.
    to_date(date,'dd/mm/YY') between to_date(:p_from_date ,'dd/mm/YY') and to_date(:p_to_date ,'dd/mm/YY')
    If that doesn't work, try putting literal values instead of your parameters, like:
    to_date(date,'dd/mm/YY') between to_date(:p_from_date ,'10/01/07') and to_date(:p_to_date ,'01/12/07') ... put dates for which you know the report has some values...
    If that still doesn't work, try printing the values of the two parameters and the date from the SELECT somewhere in your report to see what you actually have in them.
    Hope it helps.
    Regards, Joe.

  • How to merge data in BW from R3 and a legacy source

    Hi
    I would like to combine data from R3 and from a legacy system in BW as follows: I have two similar info objects:
    One info object contains the data from R3 (which contains some history, but basically the data is only really well maintained starting from Jan 1st 2008).
    The other info object contains data from a legacy HR data warehouse which contains data back to 2001 and up to Dec 31st 2007.
    I would like to combine the two info objects in a multiprovider in BW, such that if I run a query for a period before Jan 2008 the data is taken from the 'legacy' info object, and when I run a query for Jan 2008 or later the data is taken from the 'R3' info object. (Note that if I specify a range, e.g. Dec 2007 - Jan 2008, then data from both info objects is to be taken.)
    A problem I see is that the DATEFROM of rows in the time dependent tables of the 'R3' info object is not as of Jan 1st 2008 but can be any date before. Thus I wonder how an overlap between the 2 sources can be avoided.
    Is there a way to accomplish this?
    Regards and thanks for your help, Felix

    The how-to at the following URL shows how to store, retrieve, and display characters of different languages from an Oracle database on a JSP page.
    http://otn.oracle.com/sample_code/tech/java/codesnippet/jdbc/nls/Globalization.html
    HTH
    Chandar

  • I backed up files onto an external hard drive from my old PC. Now I want to retrieve the files to put on my Mac, but it is telling me to format the drive for Mac. How can I do this without erasing all of the data on my hard drive?

    I backed up files onto an external hard drive from my old PC. Now I want to retrieve the files to put on my Mac, but it is telling me to format the drive for Mac. How can I do this without erasing all of the data on my hard drive?

    Your drive was used with a PC and formatted NTFS, which is a proprietary Microsoft format.
    You need to install a third-party program that will read the NTFS format.
    There is various software from Paragon, Tuxera, and NTFS-3G.
    When you get the data off and have verified it to be good, reformat the drive: HFS+ for Mac-only use, or, for use with both Mac and PC, MS-DOS (FAT32) for files under 4GB (best) or exFAT for files larger than 4GB. exFAT is also proprietary, and Microsoft is applying for a patent, which would likely mean OS X won't be allowed to read it anymore without a licensing fee; you would then have to pay yet another third-party software company to read the format, just like with NTFS.
    The less you have to rely on third-party software to read your drives, the better; that way, if you have an issue and need to read the drive on another machine, you don't need the software, and an internet connection, and a credit card, and... and... and...

  • Can we configure the SU as an input field in the source screen of system-guided putaway?

    Experts,
    If I want to do system-guided putaway (by TO) for SU-managed materials, there is nothing I need to scan in the source field, and I can only see the SU and storage bin input fields on the destination screen (only these two fields can be configured as input fields for scanning). So the user has to manually hit F1 to save and hit Next to go to the destination screen. How can I have the SU field scanned on the source screen? (Can we make the SU field an input field for scanning on the source screen?)
    Thanks in advance

    We have configured both the SU and the source bin as input fields on RF. Are you using a customised RF program or SAP standard? We have this driven through a mix of standard and customised programs, referring back to Verification Data Profiles and movement assignments in the Mobile Data Entry configuration of LE -> WM.

  • Difference between the data staging of generic and application-specific data sources

    Hi,
    Can anyone tell me the difference between the data staging of generic and application-specific data sources? We know that LO data is staged in the queued delta, the update queue, and the BW delta queue; I want to know where generic data is actually staged before it is loaded into BW.
    Thanks.

    Generic data sources are based on either a DB table/view, a function module, or an ABAP query. So normally the data is staged in the corresponding DB tables, or you calculate it at extraction time. There is no update queue like in LO.
    Best regards
       Dirk

  • Cross database join Oracle - SQL server date column - nQSError: 46008

    Hi,
    I am using OBIEE 10.1.3.4.2 and
    I am able to make cross database join between Oracle and SQL server using varchar columns, but I am getting this error:
    nQSError: 22024 - A comparison is being carried out between noncompatible types, when I try to make a "foreign key" join between two tables (one from Oracle, the second from SQL Server) using number columns (INT, DOUBLE...). It is strange, but when I make a "complex join" in the physical layer, no error is thrown and everything works fine.
    But I am not able to make a join between the tables using a date column. The column in the Oracle table has the DATE datatype, while the column in the SQL Server table has the datetimeoffset(7) datatype (example: 2011-07-19 13:14:22.2032605 +02:00). So I tried to cast datetimeoffset(7) to the date datatype using "convert(DATE,HappenedOn,120)", which returns 2011-07-19. In this format, BI can show the converted date column and I can filter on it, but I am not able to make a physical join with the Oracle table using this column.
    A query that uses data from both joined tables gives me this error:
    [nQSError: 46008] Internal error: File .\DataType\SUKeyCompare.cpp, line 875. (HY000)
    Do you have any tips on how to solve this "bug"?
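    For illustration only (this is not an answer from the thread): one way to sidestep the datatype mismatch is to expose an already-converted column from the SQL Server side, for example through a view, so that the OBIEE physical join compares a plain date on both sides. A minimal sketch, assuming a hypothetical table dbo.Events that holds the datetimeoffset(7) column HappenedOn:
    -- Expose HappenedOn as a plain DATE so it can be joined against the Oracle DATE column.
    CREATE VIEW dbo.Events_ForJoin AS
    SELECT  e.EventId,                          -- hypothetical key column
            CONVERT(date, e.HappenedOn) AS HappenedOnDate
    FROM    dbo.Events AS e;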

    Parsing the command column to get the SSIS package file name may be your only option here.
