Data sources for Power BI

Is there documentation on what data sources Power BI currently supports, and what the future roadmap looks like? We have the data sources below, all of which sit on on-premises servers.
Excel files
SharePoint lists
SQL Server OLAP Cube
SQL Server database
Text files
SQL Azure
Wondering if Power BI will be a good option for us or not.

Hi,
The data sources supported by the model itself are those supported by PowerPivot and Excel. In terms of scheduled refresh, from the list above, SharePoint lists (through the OData feed), SQL Azure, and SQL Server are supported with the coming general availability.
Guy

Similar Messages

  • Any experiences with Hive/HBase as a data source for Power BI?

    Any real-life experiences with fetching and automatically refreshing data from HDInsight for Power BI?
    Kenny_I

    Hi Kenny,
    I've used HDInsight with Power BI on a handful of prototypes/PoCs where HIVE has been used to condense very high grain data or unstructured data into something a lot more manageable. As you may already know, the HIVE tables are just projected metadata
    on top of folders which contain the data across one or more files, or subfolders of files (i.e. partitions). Initially I had hoped for Power Query's 'From Microsoft Azure HDInsight' data source to be able to pull data from HDInsight by generating HIVE queries
    (HQL), but instead it hits the underlying HDFS (which is implemented as Azure blob storage) and pulls data directly from the files. As a result, it gives pretty much the same experience/functionality as using the 'From Microsoft Azure Blob Storage' data source.
    One thing I found out the hard way is that if data is stored in HIVE tables that use an optimised file format such as ORC, Power Query is unable to pull the actual data out of those files (it doesn't know how to read their contents). You'll need to write some
    HQL that outputs the data in a plain text format such as CSV or TSV to be able to pull it in via Power Query. In some scenarios I have got around this by using Sqoop or SSIS to load the data into a Microsoft Azure SQL Database and then pulling that
    data into Power Query/Power Pivot, but I realise this may not always be feasible.
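    For illustration only (a rough sketch, not taken from a real environment): the kind of HQL I mean, driven here through the Hive JDBC driver against HiveServer2. The host, credentials, and table names are placeholders.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class HiveTextExport {
            public static void main(String[] args) throws Exception {
                // Placeholder HiveServer2 endpoint and credentials - adjust for your cluster.
                Class.forName("org.apache.hive.jdbc.HiveDriver");
                String url = "jdbc:hive2://my-hdinsight-head-node:10000/default";
                try (Connection con = DriverManager.getConnection(url, "hiveuser", "password");
                     Statement stmt = con.createStatement()) {
                    // Re-materialise the ORC-backed table as plain delimited text files,
                    // so the files behind the new HIVE table can be read by Power Query.
                    stmt.execute("DROP TABLE IF EXISTS sales_export_text");
                    stmt.execute(
                        "CREATE TABLE sales_export_text " +
                        "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' " +
                        "STORED AS TEXTFILE " +
                        "AS SELECT * FROM sales_orc");
                }
            }
        }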
    There is the Microsoft Hive ODBC Driver to pull data from HDInsight/Hadoop via HIVE tables on a local machine, but I've not yet found a streamlined way to get workbooks in Power BI to pull data from HDInsight/Hadoop via HIVE.
    I'd be interested to learn about other experiences with HDInsight and Power BI too.
    Edit: Forgot to mention that the Data Management Gateway was needed in all cases due to the use of Power Query.
    Regards,
    Michael Amadi
    Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to vote it as helpful :)
    Website: http://www.nimblelearn.com, Twitter:
    @nimblelearn

  • Data source for this data connection isn't registered for Power BI

    Hi, I am getting this error when I set Schedule Data Refresh to refresh data from the database. How do I register my connection to the data source? Is this a fix on the SQL Server side or in Power BI?
    FAILURE INFORMATION
    Failure
    Correlation ID: c5132b7a-3c54-4f12-a048-3ddcf0b95f26
    Data connection Status
    SqlServer twcperfsql1 OfficePerf OnPremise error: Sorry, the data source for this data connection isn't registered for Power BI. Tell your Power BI admin the admin needs to register this connection to this on-premises data source.
    Thanks for your help.

    I'm running into the same issue. I did configure the gateway and then the data source in Power BI.
    When I go to the Power BI Admin Center and click on "Test Connection" it seems to be working. But when I refresh the data from Power BI it doesn't work. If I edit the spreadsheet in Excel and refresh the data in PowerPivot, then it works. So
    I'm not sure why the solution doesn't refresh from the automated service in Power BI without this unhelpful error.
    Thanks
    Fabian
    In case it helps this is the actual error:
    Failure Correlation ID: ecc73a16-7264-45b2-9baf-e3448f007211                                                     
    Power Query - dbo_TableAOnPremise error: Sorry, the data source for this data connection isn't registered for Power BI. Ask your Power BI admin to register the data source in the Power BI admin center.
    Some further information I have found so far: the gateway is receiving the requests for the refresh and it shows the message below, so the data source is found and communication seems to be happening from server to client and back. I'm not sure why the server
    doesn't seem to like the data the gateway is sending back to it.
    The feed 'dbo_TableA' was successfully accessed at 7/15/2014 4:23:26 PM.
    The request took 00:00:00.0000105 seconds.
    Activity ID: e8464e5d-3f0a-49c2-b10b-450fec3e5940

  • While creating the data source for SQL Server I am getting an error

    Hi, I am new to Power BI sites and Office 365.
    These are my first steps:
    1. I have successfully created a gateway in Office 365.
    2. Now I am creating the data source for that gateway using SQL Server R2.
    3. While creating the data source against that gateway, at the Set Credentials step a pop-up window appears showing the data source settings (Data Source Name, Loading..., Credentials type, Privacy Level) followed by the message "Failed to verify parameter".
    This is the result I get every time. How should I resolve this problem? Please let me know if anyone knows the answer, and kindly give me the proper solution.
    Regards

    Any suggestions for Chinnarianu?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager (Blog,
    Small Basic,
    Wiki Ninjas,
    Wiki)
    Answer an interesting question?
    Create a wiki article about it!

  • Can't create sql data source in Power BI

    Hello All,
    I'm trying to use the SQL database from my ACT! CRM as a data source for Power BI.
    I've succeeded in installing the Microsoft Data Management Gateway and connecting it to Power BI; the status is 'running'.
    However, when I try to add the SQL data source and enter credentials to log on to the 'act7' instance of SQL Server, the response is 'failed to test connection, login failed'.
    Can anyone help?

    Hi Mark,
    When you specify the credentials, are you able to connect to the data source directly? For example, if you're at home, please ensure a VPN is established between you and the data source. This is another factor of authentication when exposing a data source for workbook
    refresh.
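    As a quick sanity check (illustration only; the server name, database, and credentials are placeholders, and it assumes the Microsoft JDBC Driver for SQL Server is on the classpath), you can try a direct connection to the 'act7' instance:

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class Act7ConnectionTest {
            public static void main(String[] args) throws Exception {
                // Named instances are resolved through the SQL Server Browser service;
                // replace MYSERVER, the database name, and the credentials with your own.
                String url = "jdbc:sqlserver://MYSERVER;instanceName=act7;databaseName=ActDemo;encrypt=false";
                try (Connection con = DriverManager.getConnection(url, "sa", "password")) {
                    System.out.println("Connected: " + !con.isClosed());
                }
            }
        }

    If this direct test fails with the same 'login failed' message, the problem is with the SQL login or credentials themselves rather than with Power BI or the gateway.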
    Samuel

  • Unable to generate "Export data source" for Master data infoobject

    Hi Gurus,
    I'm in BW 3.5
    I need to generate "Export data source" for a master data info object.
    I followed below steps:
        I ticked the checkbox "Characterist. is export data source" in the master data Info object maintenance screen.
        In the Infosource area, for the desired Infoobject --> Right click --> Generate export datasource --> a dialog box displays saying "The generation of DataSource  was successful"
    Problem:
    Even after refreshing, etc., I couldn't find the generated DataSource 8**** for the above master data. I also tried to replicate the DataSource from the MYSELF source system, but couldn't find it there either. I don't know where the generated export data source is hiding.
    Could anyone please help?
    Thanks,
    Venkat

    Hi Venkat,
    I guess that after creating an export DataSource for an InfoObject, this DataSource should appear in RSA1 -> Source systems -> MYSELF -> DataSource overview (with a name starting with 8). Since it's a data mart, it should also appear as a system-generated InfoSource, and in that case you would be able to assign the appropriate InfoSource in the update rules of the data target (make sure that in the InfoProviders tab of RSA1 you have inserted your InfoObject as a data target).
    However, there are some bugs related to export DataSources. Check whether you can see your export DataSource in RSA6.
    If you don't see it, consider applying OSS Note #816892 - "30BSP26: Export DS for master data is not generated".
    Note #876845 - "30BSP29: InfoObject change: Export DataSource not adjusted" may also be useful.
    -Vikram

  • Standard BI Data Source for FMGLFLEXT

    Hello Friends,
    The standard data source for the Financials New General Ledger is 0FI_GL_10 (General Ledger: Leading Ledger Balances), which fetches data from the FAGLFLEXT table using the extract structure FAGL_EXTSTRUCT_LEAD. However, there has been a migration from the FAGL tables to the FMGL tables, and FAGLFLEXT is no longer populated with any FI postings data; the new table that is being updated is FMGLFLEXT. GL migration from FAGL to FMGL is a standard procedure, but I am not finding any standard data source that fetches data from the FMGLFLEXT table.
    Can anybody let me know if there is a standard DataSource for this, or should we create a custom data source to fetch the data from FMGLFLEXT using the FMGL_EXTSTRUCT_LEAD extract structure?
    Appreciate help in this regard.
    VB

    Hello,
    The requirement is resolved by the following:
    If you want to use your own totals table (in this scenario FMGLFLEXT) as your basis, you first have to create the corresponding extraction structure using transaction FAGLBW01. In this way, all fields of the totals table are transferred into the extraction structure. Then, in the OLTP system, call up transaction FAGLBW03, configure the DataSource, and save your settings.
    http://help.sap.com/saphelp_nw70/helpdata/en/be/928f40f5767d17e10000000a1550b0/frameset.htm
    Regards,
    VB

  • Can you use SQL as a data source for a project in the same way you can in Excel?

    Excel allows you to create a data source that executes a SQL stored procedure, display that data as a table in a spreadsheet and have that data automatically refresh each time you open the spreadsheet. Is it possible to do the same thing in MS Project, displaying
    the data from the stored procedure as a series of tasks?
    Here's what I'm trying to do: I have a stored procedure that pulls task data meeting specific criteria from all projects in Project Server. We're currently displaying this data as an Excel report. However, the data includes start dates and durations, so
    it would be nice to be able to display it as a Gantt chart. I've played around with creating a Gantt chart in Excel and have been able to do a very basic one, but it doesn't quite fit our needs.

    No, you cannot use SQL directly as a data source for a project.
    You have three options to achieve this:
    1. Create a SharePoint list with the desired columns, fill in the desired data, then create an MS Project plan from the SharePoint list.
    2. Create an SSRS report in which you display a Gantt chart (Joe has given you that link).
    3. Write a macro in MPP that takes data from your Excel workbook, where Excel fetches the data from your stored procedure. Either schedule it to run every day to update your data, or
    create an Excel report that updates automatically and write a macro in MPP that fetches the data and then publishes it so that it is available to team members.
    kirtesh

  • SharePoint PPS 2013 Dashboard Designer error "Please check the data source for any unsaved changes and click on Test Data Source button"

    Hi,
    I am getting the error below in SharePoint PPS 2013 Dashboard Designer while creating the Analysis Services connection using "PROVIDER="MSOLAP";DATA SOURCE="http://testpivot2013:9090/Source%20Documents/TestSSource.xlsx"":
    "An error occurred connecting to this data source. Please check the data source for any unsaved changes and click on Test Data Source button to confirm connection to the data source."
    I have checked all the sites and completed all the steps, but I am still getting the error. It's frustrating; everything is configured correctly, yet the error persists.
    Thanks in advance.
    Poomani Sankaran

    Hi Poomani,
    Thanks for posting your issue,
    You have to install SQL Server 2012 ADOMD.NET on your machine, and then browse the URL below to create a SharePoint dashboard with an Analysis Services data source step by step.
    http://www.c-sharpcorner.com/UploadFile/a9d961/create-an-analysis-service-data-source-connection-using-shar/
    I hope this is helpful to you; if so, mark it as Helpful.
    If this works, please mark it as Answered.
    Regards,
    Dharmendra Singh (MCPD-EA | MCTS)
    Blog : http://sharepoint-community.net/profile/DharmendraSingh

  • Trying to change the data source for a Crystal Report.

    <p>The method below represents my best attempt to programmatically change the data source of a Crystal Report. The goal is to have a routine that will update the data source for reports after they have been distributed to production servers. So far I have not been successful in saving the report back to the CMS. No exceptions are thrown, but when I view the Database Configuration of the report in the CMC, nothing has changed.
    </p>
    <p>
    Am I missing a step, or is there another way to accomplish this?
    </p>
    <p>
    Thank you.
    </p>
    <hr />
    <pre>
    private void test(String reportName)
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects;
       IInfoObject reportObj;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dc;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB; //0;
       Fields connFields = null;
       String queryStr = "Select * From CI_INFOOBJECTS " +
          "Where SI_NAME='wfr.rpt' AND SI_KIND='CrystalReport' AND SI_INSTANCE=0";

       newInfoObjects = getCms().executeQuery(queryStr);

       if (newInfoObjects.size() > 0)
       {
          reportObj = (IInfoObject)newInfoObjects.get(0);
          try
          {
             clientDoc = getCms().getReportAppFactory().openDocument(
                reportObj
                , OpenReportOptions._refreshRepositoryObjects
                , java.util.Locale.US);
             dc = clientDoc.getDatabaseController();
             conInfos = dc.getConnectionInfos(null);

             // Replace every connection on the report with one pointing at the new server.
             for (int i = 0; i < conInfos.size(); ++i)
             {
                oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(i);
                newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                pBag = newConInfo.getAttributes();
                pBag.putStringValue("QE_ServerDescription", "alio");
                logonProps = new PropertyBag();
                logonProps.putStringValue("Trusted_Connection", "false");
                logonProps.putStringValue("Server", "alio");
                pBag.put("QE_LogonProperties", logonProps);
                newConInfo.setUserName("admin");
                newConInfo.setPassword("password");
                dc.replaceConnection(
                   oldConInfo
                   , newConInfo
                   , connFields
                   , connOptions);
             }
          }
          catch (ReportSDKServerException Ex)
          {
             String msg = "A server error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          catch (Exception Ex)
          {
             String msg = "An error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
             Utility.errorOut(msg, Ex);
          }
          finally
          {
             clientDoc.save();
             getCms().commitToInfoStore(newInfoObjects);
             clientDoc.close();
          }
       }
    }
    </pre>
    Edited by: Mark Young on Sep 10, 2009 2:16 PM

    <p>Justin,</p>
    <p>
    Thank you for the reply. Time constraints have not allowed me to post back to this thread
    till now. I will try your suggestion. My assumption is that <i>Save the report back to the
    info store</i> refers to <span class="code">IInfoStore.commit(IInfoObjects)</span>.
    </p>
    <p>
    I'm afraid that I do not understand why I don't want to change the report client document,
    or why <i>successfully exporting the report with the new login/password</i> is not what I
    want to do. Any explanation on that statement would be appreciated.
    </p>
    <p>
    I did find a way to accomplish my goal. It involved adding the SSOKEY property to the
    logon property bag. Below you'll see my revised code which modifies the report logon and
    server. I have no idea what
    this does, and SAP support has not been able to tell me why it works. However, what I
    discovered is that if I changed the report option, <b>Database Configuration -> When
    viewing report:</b>, in the CMS to <span class="bi">Use same database logon as when report
    is run</span> from <span class="bi">Prompt the user for database logon</span>, then the
    SSOKEY property had been added to the logon property bag having an empty string as its
    value. This allowed me to successfully update and save the modified logon back to the CMS.
    </p>
    <p>
    So I took a chance and added code to always add the SSOKEY property with an empty string
    as its value, and I could then successfully modify and save the report's logon info
    and server. Again, I don't know what this means, but it has worked so far. If anyone has
    some insight or comments, either are welcome. Thank you in advance.
    </p>
    <br />
    <hr />
    <pre>
    private void changeDataSourceOfAWFCrystalReports()
       throws SDKException, ReportSDKException, java.io.IOException
    {
       IInfoObjects newInfoObjects = null;
       IInfoObject reportObj = null;
       IReport curReport = null;
       ReportClientDocument clientDoc = new ReportClientDocument();
       DatabaseController dbController;
       PropertyBag pBag;
       PropertyBag logonProps;
       ConnectionInfo newConInfo;
       ConnectionInfo oldConInfo;
       ConnectionInfos conInfos;
       int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB;
       Fields connFields = null;
       String outputStr;
       int numOfReports;
       int numOfQueryPages;
       double progressIncrementPerPage = 30;
       int progressIncrementPerReport = 0;
       // Path query to reports is in a .properties file.
       String queryStr = getAppSettingsFile().getWscAwfCrystalReportPathQuery();
       try
       {
          // Executes IInfoStore.getPageingQuery() and generates a list of queries.
          getCms().setPathQueryQueries(queryStr, 100);
          numOfQueryPages = 0;
          // Gets a List<String> of the IPageResult returned from IInfoStore.getPageingQuery().
          if (getCms().getPathQueryQueries() != null)
             numOfQueryPages = getCms().getPathQueryQueries().size();
          if (numOfQueryPages > 0)
             // Use 30% of progress bar for the following loop.
             progressIncrementPerPage = Math.floor(30.0 / (double)numOfQueryPages);

          for (int queryPageIndex = 0; queryPageIndex < numOfQueryPages; ++queryPageIndex)
          {
             // Gets the IInfoObjects returned from the current page query.
             newInfoObjects = getCms().getPathQueryResultSetPage(queryPageIndex);
             numOfReports = newInfoObjects.size();
             if (newInfoObjects != null && numOfReports > 0)
             {
                progressIncrementPerReport =
                   Math.round((float)Math.floor(progressIncrementPerPage / (double)numOfReports));
                for (int reportIndex = 0; reportIndex < numOfReports; ++reportIndex)
                {
                   reportObj = (IInfoObject)newInfoObjects.get(reportIndex);
                   curReport = (IReport)reportObj;
                   clientDoc = getCms().getReportAppFactory().openDocument(
                      reportObj
                      , OpenReportOptions._refreshRepositoryObjects
                      , java.util.Locale.US);
                   dbController = clientDoc.getDatabaseController();
                   conInfos = dbController.getConnectionInfos(null);
                   for (int conInfosIndex = 0; conInfosIndex < conInfos.size(); ++conInfosIndex)
                   {
                      oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                      newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                      pBag = newConInfo.getAttributes();
                      pBag.putStringValue(
                         "QE_ServerDescription"
                         , getConfigFile().getDBDataSourceConnections());
                      logonProps = new PropertyBag();
                      logonProps.putStringValue("Trusted_Connection", "false");
                      // Adding SSOKEY with an empty value is what made the saved logon stick.
                      logonProps.putStringValue("SSOKEY", "");
                      logonProps.putStringValue(
                         "Server"
                         , getConfigFile().getDBDataSourceConnections());
                      pBag.put("QE_LogonProperties", logonProps);
                      newConInfo.setUserName(getConfigFile().getUNVConnectionUserName());
                      newConInfo.setPassword(getConfigFile().getUNVConnectionPasswordDecrypted());
                      dbController.replaceConnection(
                         oldConInfo
                         , newConInfo
                         , connFields
                         , connOptions);
                      newConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                   } // end for on conInfosIndex
                   clientDoc.save();
                } // end for on reportIndex
             } // end if on newInfoObjects
          } // end for on queryPageIndex
       } // end try
       catch (ReportSDKServerException Ex)
       {
          // handle...
       }
       catch (Exception Ex)
       {
          // handle...
       }
       finally
       {
          getCms().commitToInfoStore(newInfoObjects);
          if (clientDoc != null)
             clientDoc.close();
       }
    }
    </pre>

  • Error while creating Data Source for master data attributes

    Hi BI Experts,
    Well, it has been some time since I was involved in extraction in BI; I primarily handled reporting in my last assignments.
    I was trying extraction with flat files in SAP BI 7 (new to SAP BI 7 but very familiar with BW 3.5) but failed during the upload of master data attributes and texts into an InfoObject (say IOSP_Mat).
    Here is the procedure I followed after creating the characteristic IOSP_Mat: I created a source system for the flat file, followed by a DataSource for master data attributes, and selected all the parameters correctly (CSV file format, data separator ',', and the other settings). Now, when I try to look at the proposed data in the next tab using 'Load example data', it does not show the desired result. The columns I maintained in the flat file are MAT_NUMBER and MAT_NAME (with, say, 100 records in the file).
    The result is the same when I try to load the text data; the columns maintained there are LANGUAGE, MAT_NUMBER, Short Description (the same 100 records).
    I then used transaction RSA1OLD to upload the file the 3.5 way and created an InfoSource for master data/texts/hierarchies for IOSP_Mat.
    Now, when I try to upload it using an InfoPackage for master and text data, I observe that the data is not maintained in the characteristic IOSP_Mat.
    When I monitored the load, I found that the data had not even been uploaded to the PSA level.
    Can you BI experts tell me the answer to this?
    Thanks,
    Srijith

    Apologies to all of you for the late response; I was busy with some other activities.
    I don't remember the exact message, but I remember the data was not loaded even to the PSA level. I will try it again and post the exact message.
    Thanks again for your quick response, and once again sorry for my late response.
    Thanks,
    Sri

  • OIM 9.1.0.2 - Weblogic JDBC Multi Data Sources for Oracle RAC

    Does OIM 9.1.0.2 BP07 support WebLogic JDBC Multi Data Sources (Services > JDBC > Multi Data Sources) for Oracle RAC, instead of inserting the "Oracle RAC JDBC URL" on the JDBC data sources for xlDS and xlXADS (Services > JDBC > Data Sources > xlDS|xlXADS > Connection Pool > URL)?
    If yes, are there any other modifications that need to be made in OIM, or just the change to the data sources?

    Yes, it's supported. You install against one instance of the RAC server directly, then you update the config.xml file and the JDBC resource in your WebLogic server with the full RAC address. It is documented for installation against RAC: http://docs.oracle.com/cd/E14049_01/doc.9101/e14047/database.htm#insertedID2
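    For reference (illustration only; the host names, port, service name, and credentials are placeholders), a "full RAC address" is a thin-driver URL carrying the complete TNS descriptor that lists every node and connects through the shared service name:

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class RacUrlCheck {
            public static void main(String[] args) throws Exception {
                // Full descriptor naming both RAC nodes; replace hosts, port,
                // service name, and credentials with your own values.
                String racUrl =
                    "jdbc:oracle:thin:@(DESCRIPTION=" +
                    "(ADDRESS_LIST=(LOAD_BALANCE=ON)(FAILOVER=ON)" +
                    "(ADDRESS=(PROTOCOL=TCP)(HOST=racnode1)(PORT=1521))" +
                    "(ADDRESS=(PROTOCOL=TCP)(HOST=racnode2)(PORT=1521)))" +
                    "(CONNECT_DATA=(SERVICE_NAME=oimdb)))";
                try (Connection con = DriverManager.getConnection(racUrl, "xladm", "password")) {
                    System.out.println("Connected to: " + con.getMetaData().getURL());
                }
            }
        }

    The same URL, written out as a single string, is the kind of value the reply above refers to for the WebLogic JDBC resource and config.xml.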
    -Kevin

  • Using an SAS2008 Cube as a data source for BI Publisher errors

    Hi All,
    I am trying to use a SAS 2008 cube as a data source for a report in BI Publisher. When I defined the data source under the admin section I gave it the following details:
    Data Source=ntsasqa01;Provider=msolap.3;Initial Catalog=WWPR_SMG
    With a username / password that has access to the cube.
    When I try to test the connection, the OC4J instance stops running and the test neither finishes nor errors. If I use the data source, set up a report, and then try to view data, I get an error like the one below:
    /app/oracle/OracleBI/xmlp/XMLP/Admin/DataSource/msmdacc.dll (     0509-022 Cannot load module /app/oracle/OracleBI/xmlp/XMLP/Admin/DataSource/msmdacc.dll.
         0509-103 The module has an invalid magic number.)
    I am not sure if the error is related to the initial issue of not being able to complete the test.
    I need to find a solution though as I have to make use of MDX to access the cubes.
    Please could someone give me some advice.
    Thanks,
    Kim.

    Hi Kim
    1. SAS cubes are not currently supported - http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/e10417.pdf. Unless SAS cubes can use the MS drivers?
    2. If you try to pursue it, you might check the permissions on the DLL in the error to ensure you have rights over it.
    Either way, you are not supported with a SAS cube, so you are somewhat on your own. Sorry, I don't use SAS so I'm not going to be much help other than stating the obvious :0)
    Regards
    Tim

  • Creation of a generic extractor and data source for the FAGLFLEXA table

    Hi All,
    Need to create a generic extractor and data source for the FAGLFLEXA table to support AR reporting. This table contains the necessary profit center information to perform LOB reporting against the AR data.
    Please advise on how to do this.
    Regards, Vishal

    Hi Vishal,
    This seems a fairly simple task:
    1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a transactional DataSource, a master data DataSource, or a text DataSource.
    2. Name it accordingly and then create it.
    3. Give it a description and then enter the table name FAGLFLEXA.
    4. Save it and activate it. If you need it to be delta-enabled, click on Delta and choose accordingly.
    If you still face a problem, do mail me at [email protected]
    Assign points if helpful
    Regards,
    Himanshu
