Validation data extraction

Hi,
I have enabled 'Collect Validation Statistics' in the Validation transform, but I do not have a media/Flash player, so Data Validation is not displayed in the Management Console (MC).
Is it possible to retrieve the validation statistics from somewhere else?
Appreciate your help.
Arun

Thanks Michael, I will check the reference guide.
In my case, both the input and output files are flat files. Though the repository is in SQL, I am not writing any output there.
In that case, does querying the database help? It has to write to some tables or files outside the database, right?
regards,
Arun
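If the reference guide confirms that the statistics are written to the repository, a plain SQL query against the repository database is enough; below is a minimal sketch in Python over ODBC. The table and column names (AL_VALIDATION_STATS, PASS_COUNT, FAIL_COUNT) are placeholders assumed purely for illustration, so check the repository schema or the reference guide for the real names.

# Hypothetical sketch: read validation-transform statistics straight from the
# Data Services repository database instead of the Management Console.
# Table/column names below are placeholders - verify them against your
# repository schema (the reference guide lists the AL_* metadata tables).
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=repo_host;DATABASE=ds_repo;UID=repo_user;PWD=secret"
)

def fetch_validation_stats(job_name):
    """Return rows of (job, transform, pass count, fail count) for one job."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.execute(
            """
            SELECT JOB_NAME, TRANSFORM_NAME, PASS_COUNT, FAIL_COUNT
            FROM AL_VALIDATION_STATS          -- placeholder table name
            WHERE JOB_NAME = ?
            ORDER BY TRANSFORM_NAME
            """,
            job_name,
        )
        return cur.fetchall()

if __name__ == "__main__":
    for row in fetch_validation_stats("JOB_LOAD_CUSTOMERS"):
        print(row)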

Similar Messages

  • Master data extraction from SAP ECC

         Hi All,
    I am a newbie here and teaching myself SAP BI. I have looked through all the forums regarding master data extraction from SAP ECC but could not find answers to my question. Please help me out.
    I want to extract customer attributes from SAP ECC. I have identified the standard DataSource 0CUSTOMER_ATTR and replicated it in SAP BI. I have created an InfoPackage for a full update. I checked the extractor in the extractor checker (RSA3) and found 2 records for 0CUSTOMER_ATTR.
    When I run the InfoPackage, it remains in yellow state until it times out. Please let me know in case I am missing anything, and please let me know if there is any other process for master data extraction similar to transaction data extraction.

    Hi All,
    I did the below, and after clicking Execute in the simple job, it takes me back to the Monitor InfoPackage screen.
    "From your InfoPackage monitor --> menu Environment --> Job Overview --> Job in the Source System --> it prompts for the source system (ECC) user ID and password; enter them and you will get your job in SM37 (ECC). Select the job and click the log details icon; there you can see details about your error. Please share a screenshot of that error."
    Please find the screenshots.
    I did Environment --> Check Connection and found the below:

  • Data extraction from Oracle database

    Hello all,
    I have to extract data from legacy database tables. I need to apply a lot of conditions to the data extraction using SQL statements in order to get only valid master data, transaction data, SAP date format, etc. Unfortunately I don't have the luxury of accessing the legacy system's database tables to create table views and apply SELECT statements.
    Is there any other way by which I can filter data on the source system side without getting into the legacy system, i.e., from the BW DataSource side?
    I am supposed to use both UD Connect and DB Connect to test which will work out as the better way of data extraction, but my question above is the same for either interface.
    This is very urgent as we are in the design phase.
    Points will be rewarded immediately.
    Thanks
    Message was edited by:
            Shail

    Well, I and everyone know that it can be done in BI.
    I apologize that I did not mention it in my question.
    I am looking for a very specific answer: is there any trick we can do on the source system side from BI, or somewhere we can insert SQL statements in the InfoPackage or DataSource?
    Thanks

  • How to automate data extraction from the KSB1 and GR55 transaction codes

    Hi All,
    Can you please let me know if there is a way to automate data extraction from transaction codes KSB1 and GR55? I have to extract data from 5 different servers, i.e., a different server for each region, and within each region I have different controlling area codes. Following are the details which I use to extract the data. It takes too long for me to extract data from all these regions and controlling area codes using my parameters; it is very time-consuming, so I want to automate this process. I am an end user, so I don't have any admin rights. Please let me know of any workable solution as soon as possible.
    Production areas : PNA for Americas, PSI for Asia Pacific and Japan, PGY for Germany, PIT for Italy and PEU for Europe
    Controlling area codes in PNA: CAR for Argentina, CBR for Brazil, CMX for Mexico and CUS for USA. In the same way, there are many other controlling area codes for all the other production areas.
    Period From 1 to 12
    Fiscal Year : 2009
    Cost Centre Group : G_6284
    Cost Element Group : 1742000000
    Please let me know in case you need more details.

    Hi,
    Here follows a translation from German:
    Enabling the SAP GUI (client) for Windows
    Start SAP Logon and log on to the SAP server.
    Click the Customize Local Layout button on the toolbar.
    Click Options and then click the Scripting tab.
    Select the Enable Scripting checkbox.
    Deselect the checkboxes 'Notify when a script is assigned to an active GUI' and 'Notify when a script opens a connection'.
    Save the settings and restart the SAP GUI.
    Enabling the SAP server
    With the following procedure you can enable scripting on the server temporarily; a value set this way is lost when the server is restarted.
    Start SAP Logon and log on to the SAP server.
    Start transaction RZ11.
    Enter sapgui/user_scripting in the Maintain Profile Parameters window.
    Click Display.
    In the Display Profile Parameter Attributes window, click Change Value.
    Enter TRUE in the New Value field.
    Save the settings and log out from the SAP GUI.
    Quit SAP Logon.
    Note:
    If the server administrator has edited the application server profile of the SAP system to include sapgui/user_scripting = TRUE, scripting is enabled by default when the server is restarted.
    SAP provides an option to change the network connection mode for any server. Two connection modes are available: high-speed connection (LAN) and low-speed connection. Although Functional Tester works in both modes, a script recorded with a high-speed connection is played back only in that mode, and the same applies to the other mode: you must play back your SAP script in the same network connection mode in which it was recorded. The high-speed connection mode is recommended, as it offers a greater number of valid recognition properties.
    Regards,
    ScriptMan
    Edited by: ScriptMan on Apr 13, 2010 12:32 PM
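    Once scripting is enabled on both the client and the server, a KSB1 run can be driven from a script instead of by hand. Below is a minimal sketch of that idea in Python via the SAP GUI Scripting COM interface; it assumes an SAP GUI session is already logged on, and the findById control IDs for the selection screen are illustrative guesses that should be confirmed with the Script Recorder before use.

    # Hedged sketch: drive KSB1 through SAP GUI Scripting (Windows only).
    # Assumes an SAP GUI session is already logged on. The findById paths for
    # the selection screen are illustrative - record them with the Script
    # Recording and Playback tool to get the real IDs for your system.
    import win32com.client

    def run_ksb1(controlling_area, fiscal_year, period_from, period_to):
        sap_gui = win32com.client.GetObject("SAPGUI")      # running SAP GUI process
        application = sap_gui.GetScriptingEngine
        session = application.Children(0).Children(0)      # first connection, first session

        # Set the controlling area via OKKS, then open KSB1.
        session.findById("wnd[0]/tbar[0]/okcd").text = "/nOKKS"
        session.findById("wnd[0]").sendVKey(0)
        session.findById("wnd[0]/usr/ctxtTKA01-KOKRS").text = controlling_area  # guessed field ID
        session.findById("wnd[0]").sendVKey(0)

        session.findById("wnd[0]/tbar[0]/okcd").text = "/nKSB1"
        session.findById("wnd[0]").sendVKey(0)

        # Fill the selection screen (field IDs below are guesses - verify by recording).
        session.findById("wnd[0]/usr/txtGJAHR").text = fiscal_year
        session.findById("wnd[0]/usr/txtPERAB").text = period_from
        session.findById("wnd[0]/usr/txtPERBI").text = period_to
        session.findById("wnd[0]/tbar[1]/btn[8]").press()  # Execute (F8)

        # Export the resulting list, e.g. System > List > Save > Local File;
        # record that sequence once to capture the exact control IDs.

    if __name__ == "__main__":
        run_ksb1("CUS", "2009", "1", "12")

    Looping such a script over the production areas and controlling area codes listed above would remove the manual, region-by-region extraction.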

  • MDMGX - After Data Extraction

    Hi,
    I am new to the MDM generic extraction concept and understand the process up to the generation of the XSD and XML files. Below are the doubts in my mind; please help me clear them up.
    1. What is the use of the XSD file generation? Is it used as the source template for the MDM Import Manager?
    2. What is the use of the Timeout option in 'Define Repositories and FTP Server Details'?
    3. How do I import the multiple generated XML files into the MDM server via the Import Manager? For example:
    1. Repository Fields :
    Product ID -- UNIQUE Key FIELD
    Product Desc
    Country
    Country Description
    ISO Code
    Field1
    Field2
    2. Extracted XML files from MDMGX will be
    File1 --> Product ID & Desc
    File2 ---> Country & Country Desc
    Please take the above example, or any valid example, and explain.
    Thanks in advance.

    Hi Rakesh,
    SAP has delivered standard extractions for reference data and for master data.
    The T-code MDMGX is for reference data and is used to load the sub-tables of MDM.
    The MDM business content contains the standard ports and maps which are required for reference data.
    The below thread explains the procedure to configure MDMGX:
    Extract Data using MDMGX
    There is a sequence to extracting the data. You can load only the relevant sub-tables used in your repository, using the selection criteria.
    No XI/PI is required, as you can configure FTP, or you can download the files to your desktop and load them manually.
    Master Data Extraction
    The T-code MDM_CLNT_EXTR is used for master data extraction.
    A distribution model is required for this, and configuration in PI is required.
    Follow the below link for more details:
    MDM_CLNT_EXTR
    Regards,
    Antony

  • Regular expression for date extraction

    hi all,
    please show me how I can extract a date from a statement. The date has four forms, such as
    8-8-2008 or 8/8/2008 or 8-aug-2008 or 8.8.2008.
    The following query can extract only the 8/8/2008 form:
    SELECT REGEXP_SUBSTR(' the conference will be on 8/8/2008', '[0-9]{1,}/[0-9]{1,}/[0-9]{2,}') FROM dual;
    regards
    Ayham
    Edited by: Ayham on Jul 29, 2012 10:17 AM
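    As a sketch of one pattern that covers all four forms, the alternation below allows -, / or . as the separator and either digits or a three-letter month name in the middle position. It is written in Python purely to show the shape of the pattern and does not resolve the day-versus-month ambiguity discussed in the reply below; translating it to Oracle's REGEXP_SUBSTR would mean using POSIX-style classes such as [0-9] and [[:alpha:]] and an ordinary group instead of \d and (?:...).

    # Sketch: one pattern for dates like 8-8-2008, 8/8/2008, 8-aug-2008, 8.8.2008.
    # It only locates the substring; it does not decide which number is the day
    # and which is the month (see the discussion of 9/8/2008 below).
    import re

    DATE_PATTERN = re.compile(
        r'\b\d{1,2}[-/.](?:\d{1,2}|[A-Za-z]{3})[-/.]\d{4}\b'
    )

    samples = [
        "the conference will be on 8-8-2008",
        "the conference will be on 8/8/2008",
        "the conference will be on 8-aug-2008",
        "the conference will be on 8.8.2008",
    ]

    for text in samples:
        match = DATE_PATTERN.search(text)
        print(match.group(0) if match else "no date found")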

    Solomon Yakobson wrote:
    Slow_Moe wrote: so why do you keep nagging about 10-11-12??
    Because the OP asked for help on validating a date. In order to do that we need to validate the year, month and day. When we see 8/8/2008 we can't figure out whether the first 8 represents the month or the day; we know 2008 is the year, and that the month is August and the day is 8, only because 8 is used in both positions. But the OP will have other dates, right? So if 15/8/2008 comes along, how can we validate it without knowing whether 15 represents the day or the month? We could say that since 15 > 12 it represents the day and therefore 8 represents the month, and 8/15/2008 could be validated using the same logic. In all such cases we can deduce the actual date. But what about 9/8/2008? We can validate it and tell it is a valid date, but we can't tell what that date is. So the OP has to clarify what is actually needed: validate a date regardless of format without being able to tell the actual date, or validate a date based on some format. If the latter, the OP needs to provide a format mask.
    SY.
    Exactly why is it that many of you "guru status" wannabe aces always presume everyone else is a complete idiot?
    "But what about 9/8/2008. We can validate it and tell it is a valid date but we can't tell what that date is."
    Wow, really? You must be some kind of genius! Come on now, don't you find it embarrassing to believe this is something you have to explain to people? No? OK, I'll grant you 100 reward points for that incredible insight... no, wait, 1000. It was that good.

  • BODS 3.1: SAP R/3 data extraction - what is the difference between the 2 data flows?

    Hi.
    Can anyone advise what the difference is between the following two data extraction flows for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 data flow (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP data flow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the 'execute preloaded' option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly:
    DF2 >> SAP R/3 table (this has a red arrow, like in OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    It can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction, and the advantages/disadvantages of the two data flows.
    What we are not understanding is this:
    if we can pull data from the R/3 table directly through a query transformation into the target table,
    why use the flow of creating an R/3 data flow,
    then doing a query transformation again,
    and then populating the target database?
    There might be some practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use an R/3 data flow and the ABAP program, which we upload to the R/3 system
    so as to be able to use the 'execute preloaded' option and run the jobs.
    Since we do not have any control over our R/3 servers, nor do we have anyone for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF1, which has some 15 R/3 tables.
    For each table we have created a separate data flow.
    And in some of the data flows, for the SAP tables which had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow.
    And still the entire workflow and data extraction happens fine.
    In fact I tried creating a new sample data flow and tested it,
    using direct download and also execute preloaded.
    I did not see any major difference in the time taken for data extraction,
    because anyhow we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate and then bring the data in as a table for Universe consumption.
    Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow because we are having problems on our dev and QA environments, which give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and it is these that are having the delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    So, trying to resolve this problem, I ended up with the option of pulling the R/3 table directly, without using the ABAP flow, just by trial and error of each and every drag-and-drop option, because we had to urgently do a POC and deliver the data for the entire e-recruiting module of SAP.
    I don't know whether I can do this direct pulling of data for the new job which I have created,
    which has 2 workflows with 15 data flows in each workflow,
    and push this job into production.
    I also don't know whether I can bypass this ABAP flow and do a direct pull of R/3 data in all the data flows in the future, for any of our SAP R/3 data extraction requirements. This technical understanding of the difference between the 2 flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular data extraction approach.
    As advised I shall check the schedules for a week, and then we shall move it probably into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Open data extraction orders -  Applying Support Packs

    Dear All,
    I have done the IDES 4.6C SR2 installation.
    While updating the support packs, I get a message in the CHECK_REQUIREMENTS phase:
    Open data extraction orders
    There are still open data extraction orders in the system.
    Process these before the start of the object import, because changes to the ABAP Dictionary structures could lead to data extraction orders no longer being readable after the import and their processing terminating.
    For more details about this problem, see Note 328181.
    Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
    I have checked the Note.
    But this is something I am facing for the first time.
    Any suggestions?
    Rgds,
    NK

    The exact message is :
    Phase CHECK_REQUIREMENTS: Explanation of the Errors
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.

  • Bulk API V2.0 Data extract support for additional objects (Campaign,Email,Form,FormData,LandingPage)?

    allison.moore
    Are there any plans for adding the following objects to Bulk API v2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API makes it complicated.

    Thanks for the quick response. Extracting these objects using the REST API with depth=complete poses lots of complications from the code perspective, since these objects contain multiple nested or embedded objects within them. Is there any guideline on how to extract these objects using REST so that we can get all the data which is required for analysis/reporting?
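    As a point of comparison while those objects remain REST-only, here is a rough sketch of the Bulk API 2.0 export pattern (define an export, sync it, poll, then page the staged data), shown for contacts, one of the objects Bulk 2.0 does support; the endpoint paths, field statements and status values are my assumptions about the general Bulk 2.0 workflow and should be verified against the current Eloqua documentation.

    # Hedged sketch of the Eloqua Bulk API 2.0 export workflow
    # (define -> sync -> poll -> fetch), shown for contacts. Endpoint paths and
    # field statements are assumptions - verify against the Bulk API docs.
    import time
    import requests

    BASE_URL = "https://secure.p01.eloqua.com"   # your pod's base URL
    AUTH = ("SiteName\\user.name", "password")   # basic auth: company\username

    def bulk_export_contacts():
        # 1) Define the export: which fields to pull, expressed in Eloqua markup.
        export_def = {
            "name": "Contact export sketch",
            "fields": {
                "EmailAddress": "{{Contact.Field(C_EmailAddress)}}",
                "CreatedAt": "{{Contact.CreatedAt}}",
            },
        }
        export = requests.post(BASE_URL + "/api/bulk/2.0/contacts/exports",
                               json=export_def, auth=AUTH).json()

        # 2) Kick off a sync that stages the data for download.
        sync = requests.post(BASE_URL + "/api/bulk/2.0/syncs",
                             json={"syncedInstanceUri": export["uri"]},
                             auth=AUTH).json()

        # 3) Poll until the sync is finished.
        while True:
            status = requests.get(BASE_URL + "/api/bulk/2.0" + sync["uri"],
                                  auth=AUTH).json()
            if status["status"] in ("success", "warning", "error"):
                break
            time.sleep(5)

        # 4) Page through the staged data.
        return requests.get(BASE_URL + "/api/bulk/2.0" + sync["uri"] + "/data",
                            auth=AUTH, params={"limit": 1000, "offset": 0}).json()

    if __name__ == "__main__":
        print(bulk_export_contacts())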

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advice is required on the following:
    1) A data extract was done previously from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and then the process chain will first clear all the data in the ODS and cube and then reload, activate, etc.
    2) In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period and not all the data as before.
    3) Data must be appended in future.
    4) The current InfoPackage in the ODS is a full upload.
    5) The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    -->> Try to load the data into the ODS in overwrite mode with a full update as before (this adds new records and overwrites previous records with the latest values), and push the delta from this ODS to the cube.
    If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, or full), and push only the delta onward.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it Helps
    Srini

  • Error message 'Please enter a valid date' in background job

    Hi all,
    The background job which is scheduled in SM37 is giving an error for the dates 01.01.2009 to 30.01.2009.
    But up to 31.12.2008 it was working fine.
    Error message:
    This is happening in the production and development clients.
    While running the report, it shows the error message 'Please enter a valid date'.
    One function module is used in the report for getting the fiscal year from a date.
    It is showing the following error message for the particular date range 01.01.2009 to 30.01.2009:
    GM_GET_FISCAL_YEAR
    Error
    Fiscal year variant V3 not defined or maintained for date 01.01.2009
    Thanks
    Susanta

    Hi,
    Did you set the date fields to be dynamic in your variant? I.e., if in your variant you specified a date range from last year, this will not work for this year and will eventually give you an error message that the date is not valid...
    Dev.

  • BOM Valid Date

    Hi,
    While creating a BOM, due to some human error we entered the wrong valid-from date. Now I want to change it.
    Please suggest the T-code or any customization to change it.
    "I am online"

    Dear Pawar
    These are the basic IMG settings needed in order to use the engineering change management features.
    x refers to a tick.
    First, set up the control data in OS54.
    Revision sections
    Fields :-
    Revision level active            x
    Ext. revision level              x
    Higher revision level            x
    Object Management Record sections
    Fields :-
    Object maintenance               x
    Assign alternative date          x
    Override value/assignment date sections
    Fields :-
    Only with leading change mst.    x
    Setting 'Revision level active' activates engineering change management for the
    material master. If you do not want engineering change management for materials,
    remove the tick.
    With a tick in 'Higher revision level', the new revision level always has to be higher
    than the old revision level.
    Second, set up the external number range in OS53.
    If you use the internal number range, then remove the external number range.
    Third, define the status for the change master records in transaction SM30 - V_T419S.
    Change No. Status   Chg Poss   Date Chg   Dist. Lock   Description
    1                    x          x          ' '         Active
    2                   ' '        ' '         ' '         Inactive
    3                   ' '        ' '          x          Locked
    Fourth, define modification parameters for the BOM in OS27.
    Fields :-
    BOM Validity Maint.               x
    EC Management Active             x
    History Requirement             ' '
    Unit Piece                      PC
    The rest of the fields are blank.
    If History Requirement is ticked, every time during creation of a BOM, SAP will
    prompt you for an ECN number, as it is compulsory.
    Lastly, define the field selection for routing in OP5A.
    PP task lists: initial screen -> Change number -> Transaction code -> Tick Req.
    Setting the Transaction code field to Required indicates that you have to enter the ECN number ........
    Hope this will help you; one of our friends gave me this.
    Pavan

  • Regd: How to find the validity date for a user in the central user system

    Hi Experts;
    I want to get the list of users with the profile SAP_ALL, with details like validity, user type, user name and user ID.
    I can get this through SUIM for each individual system. It is very difficult to log in to each system and generate the report, so I preferred to go for the central system.
    But if I use the central user system, I have no option to find the validity and user type for the system (SUIM -> cross-system application).
    I have also tried the table USR02 (which gives only the list of users in the central system).
    So are there any possible ways to find the users with the profile SAP_ALL, with their validity dates, in the central user system, so that I can easily generate it as one report instead of logging in to each and every system?
    Regards
    Sanjeev.S

    Hi Ruchit
    Thanks for your reply. I want to find the validity date of all users having the SAP_ALL
    profile in all the child systems connected through the central user system. So is it possible
    to do that in the central system by executing the report?
    If I execute that report in the central user system, will it give the details of all the child
    systems connected to the central system?
    I think it will give only the result for the central system and not the child systems connected to the central system. Please clarify this for me.
    I can execute the report by logging in to each child system, but that takes very long hours for me, since there are many systems in my landscape.
    Awaiting your reply.
    Thanks
    Sanjeev.S
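    If the cross-system report route does not cover the child systems, one workaround is to loop over them from a script via RFC. The sketch below uses SAP's PyRFC library with RFC_READ_TABLE to read profile assignments (UST04) and user validity/type (USR02) in each child system; the table and field names reflect my understanding of the standard user tables and should be verified, and RFC_READ_TABLE has row-width limits of its own, so treat this as an assumption to check rather than a finished report.

    # Hedged sketch: collect SAP_ALL users plus validity dates and user type
    # from several child systems over RFC. Assumes SAP's PyRFC library and an
    # RFC-capable user in each system; UST04 (user-profile assignments) and
    # USR02 (logon data with GLTGV/GLTGB validity and USTYP user type) are my
    # assumptions to verify.
    from pyrfc import Connection

    SYSTEMS = {
        "PNA": {"ashost": "pna-host", "sysnr": "00", "client": "100",
                "user": "RFC_USER", "passwd": "secret"},
        "PSI": {"ashost": "psi-host", "sysnr": "00", "client": "100",
                "user": "RFC_USER", "passwd": "secret"},
    }

    def read_table(conn, table, fields, where=""):
        """Small RFC_READ_TABLE wrapper returning a list of field dicts."""
        options = [{"TEXT": where}] if where else []
        result = conn.call("RFC_READ_TABLE", QUERY_TABLE=table, DELIMITER="|",
                           FIELDS=[{"FIELDNAME": f} for f in fields],
                           OPTIONS=options)
        return [dict(zip(fields, [v.strip() for v in row["WA"].split("|")]))
                for row in result["DATA"]]

    def sap_all_report():
        for sid, params in SYSTEMS.items():
            conn = Connection(**params)
            try:
                assignments = read_table(conn, "UST04", ["BNAME", "PROFILE"],
                                         where="PROFILE EQ 'SAP_ALL'")
                users = {u["BNAME"]: u for u in read_table(
                    conn, "USR02", ["BNAME", "GLTGV", "GLTGB", "USTYP"])}
                for a in assignments:
                    u = users.get(a["BNAME"], {})
                    print(sid, a["BNAME"], u.get("GLTGV"), u.get("GLTGB"),
                          u.get("USTYP"))
            finally:
                conn.close()

    if __name__ == "__main__":
        sap_all_report()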

  • iPhoto says it cannot import my photos because they may be an unrecognized file or the file may not contain valid data. The operation could not be completed. Error code: ImageCaptureCoreError-9905.

    Unable to download photos from my camera. iPhoto returns an error message that states: The following files could not be imported (they may be an unrecognized file or the files may not contain valid data). FILE NAME: 100_0490.JPG      REASON: com.apple.ImageCaptureCoreError-9905. What's up with that? I did not have a problem importing photos under the Leopard OS; however, now that I have upgraded to Yosemite 10.10.3, the iPhoto app is not performing the imports.

    Are you using Photos, or iPhoto? Maybe you've got a corrupt file on your camera. I would try using the Image Capture application to import the bad picture to the desktop. If it works and the picture looks OK, delete that picture from the camera, then try to import into iPhoto again. You can then import the picture it flagged as bad from your desktop.
    Or, try downloading all the pictures using Image Capture and then import them from the download folder.
