Not All RFx Data Extracted to "0BBP_BID" ODS (SRM - Bidding Engine)

Hi Expert,
I'm currently working on SRM reporting in SAP BW, specifically Bidding Engine reporting. I have installed the Bidding Engine Business Content and run the extraction to the InfoProviders.
The problem is that in the ODS "0BBP_BID" not all RFx (Bid Invitation) data is extracted. After investigating the issue, I found that the data that is not extracted has a Process Type (RFx Type) of Z* (user defined).
Is there any setting I have to maintain so that Process Type "Z*" documents are extracted to the ODS?
Please help and suggest a solution.
Thanks,
Gilang

John
First of all, without a user ID and password it is not possible for bidders to enter the SRM system to submit bids, so business partners have to be created in SRM with contact person (bidder) details.
You can try one thing. It is just my thought; I don't know how viable it is, and I have never tried it.
Create a dummy business partner (vendor) in SRM and create many contact persons, i.e. one for each vendor, with their ID/password and e-mail address. Then select all the vendors (like receiving multiple bids from the same vendor) and send the bid invitation.
Bidders will log in and submit their bids. The buyer will then see the bids from all vendors (actually from one vendor, but with different contact persons).
The buyer can then decide on the winning bid, create a business partner for the winning bidder, and continue with the purchasing activities.
This way you can avoid creating many business partners at the time of creating bid invitations.
The other alternative, if you don't want to create IDs and passwords for bidders, is to send the tender details to all vendors offline via e-mail and ask them to send their prices to the buyer offline.
Regards
Jagdish

Similar Messages

  • Firefox did not send all the data in my text form; only a piece of the information arrived. Where can I find a cache of the sent text?

    Firefox did not send all the data in my text form; only a piece of the information got through. Where can I find a cache of the sent text?
    For example, I posted an article on a news site, but found that only a small part of my article was posted.

    I'm not sure whether Firefox saved that information. There is a chance that it is in the session history file, especially if you haven't closed that tab yet. To check that:
    Open your currently active Firefox settings folder (AKA your Firefox profile folder) using
    Help > Troubleshooting Information > "Show Folder" button
    Find all files starting with the name sessionstore and copy them to a safe working location (e.g., your documents folder). In order to read a .js file, it is useful to rename it to a .txt file. Or you could drag it to Wordpad or your favorite word processing software.
    The script will look somewhat like gibberish, but if you search for a word in your article that doesn't normally appear in web pages, you may be able to find your data.
    Any luck?
    For the future, you might want to consider this add-on, although I don't know how long it saves data after you submit a form: [https://addons.mozilla.org/en-US/firefox/addon/lazarus-form-recovery/].

  • Firefox opens web pages but not all the data or pictures.

    Firefox 22.0 opens web pages slowly; it takes up to 30 seconds to load, and then not all the pictures and data are loaded, or it times out. If I hit the reload button, everything loads OK. This happens on many sites, not just a few.
    I use Google Chrome as my primary browser, and it is quite fast. My internet connection is commercial and it is OMG fast. I would like to replace Google Chrome (I hate being tracked), but at this point Firefox is just too slow.
    Please advise how to correct these issues.
    Thanks

    Which security software (firewall, anti-virus) do you have?
    Make sure that Firefox is fully trusted in your security software.
    You can check for problems with preferences: rename or delete the prefs.js file (and any numbered prefs-##.js files and a possible user.js file) to reset all prefs to their default values.
    *http://kb.mozillazine.org/Preferences_not_saved
    *http://kb.mozillazine.org/Resetting_preferences
    Start Firefox in Safe Mode to check if one of the extensions (Firefox/Firefox/Tools > Add-ons > Extensions) or hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox/Firefox/Tools > Add-ons > Appearance).
    *Do NOT click the Reset button on the Safe Mode start window or otherwise make changes.
    *https://support.mozilla.org/kb/Safe+Mode
    *https://support.mozilla.org/kb/Troubleshooting+extensions+and+themes

  • I had to replace my iPhone, so when I got my new one and restored it, not all my data transferred; I am still missing my audiobooks. How do I get them back?

    I have tried restoring purchases and looking to download them from iCloud and iTunes, but I can't find my audiobooks.

    Hi there,
    If you backed up to iCloud, you may want to review the article below.
    iCloud: Backup and restore overview
    http://support.apple.com/kb/ht4859
    The following items are not backed up to iCloud. You can sync these items with a computer using iTunes:
    Music, movies and TV shows not purchased from the iTunes Store
    Podcasts and audio books
    Photos that were originally synced from your computer
    Hope that helps,
    Griff W.

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advice is required with the following:
    1. A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain then first clears all the data in the ODS and cube and then reloads, activates, etc.
    2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. In future the source will only provide the data for a specific period, not all the data as before.
    3. Data must be appended in future.
    4. The current InfoPackage in the ODS is a full upload.
    5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    --> Try to load the data into the ODS in overwrite mode with a full update as before (this adds new records and overwrites previous records with the latest values). Push the delta from this ODS to the cube.
    If the existing ODS is loaded in additive mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
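    As a rough illustration of the overwrite-and-delta idea described above (not SAP code; the key fields, key figure and change-log model below are simplified assumptions), here is a small Python sketch:

    ```python
    # Toy model of a standard ODS/DSO: overwrite by key, change log feeds the cube delta.
    KEY = ("doc", "item")          # assumed ODS key fields
    KEYFIG = "amount"              # assumed additive key figure

    active = {}                    # active table: key tuple -> record
    change_log = []                # list of (key, before_amount, after_amount) entries

    def load_overwrite(records):
        """Full/overwrite load: the latest record per key wins; the change log records the delta."""
        for rec in records:
            key = tuple(rec[f] for f in KEY)
            before = active.get(key, {}).get(KEYFIG, 0)
            active[key] = rec
            change_log.append((key, before, rec[KEYFIG]))

    def delta_for_cube():
        """The delta pushed onward is the net change per key since the last extraction."""
        delta = {}
        for key, before, after in change_log:
            delta[key] = delta.get(key, 0) + (after - before)
        change_log.clear()
        return delta

    # First full load, then a corrected reload of the same document:
    load_overwrite([{"doc": "1000", "item": "1", "amount": 540}])
    print(delta_for_cube())   # {('1000', '1'): 540}
    load_overwrite([{"doc": "1000", "item": "1", "amount": 500}])
    print(delta_for_cube())   # {('1000', '1'): -40} -> only the correction reaches the cube
    ```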
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it Helps
    Srini

  • How to find all the Master data extract structures and Extractors

    I intend to create Master data dimensions closely similar to SAP BI in a 3rd party system. I would like to use SAP's standard extractors for populating the master data structures in SAP BI and then use a proprietary technology to create similar structures in a 3rd party database.
    Question: How to get a complete list of all Master data extract structures and corresponding extractors?
    Example: In ECC, if I go to SE80, choose 'Package', enter 'MDX' and press the 'Display' (spectacles) button, I get a list of structures and views under "Dictionary Objects" covering Material, Customer, Vendor and Plant texts and attributes.
    How do I get the remainder of the Master data extract structures viz. Purchase Info Records, Address, Org Unit etc?
    Regards
    Sasanka

    Hi,
    Try the table ROOSOURCE and search for DataSources whose names contain *ATTR (for master data attributes) and *TEXT (for texts).
    This will give you the list of all the master data sources in the system, and the table contains columns that tell you the extract structure and the extractor function module used by each.
    Thanks
    Ajeet
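    If you need this list outside the SAP GUI, one possible approach (a sketch only, not something from this thread) is to read ROOSOURCE remotely with the standard RFC_READ_TABLE function module, e.g. via the Python pyrfc library. The connection parameters are placeholders, and EXSTRUCT / EXTRACTOR are my assumption of the columns holding the extract structure and function module, so please verify them in SE16:

    ```python
    from pyrfc import Connection

    # Placeholder connection parameters - replace with your own system details.
    conn = Connection(ashost="my.ecc.host", sysnr="00", client="100",
                      user="RFC_USER", passwd="secret")

    # Read active DataSources whose names end in ATTR or TEXT from ROOSOURCE.
    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE="ROOSOURCE",
        DELIMITER="|",
        OPTIONS=[{"TEXT": "OBJVERS = 'A' AND ( OLTPSOURCE LIKE '%ATTR'"},
                 {"TEXT": "OR OLTPSOURCE LIKE '%TEXT' )"}],
        FIELDS=[{"FIELDNAME": "OLTPSOURCE"},
                {"FIELDNAME": "EXSTRUCT"},     # extract structure (assumed column name)
                {"FIELDNAME": "EXTRACTOR"}],   # extractor function module (assumed column name)
    )

    for row in result["DATA"]:
        datasource, struct, extractor = [c.strip() for c in row["WA"].split("|")]
        print(datasource, struct, extractor)
    ```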

  • Data Extract Design

    Hi All,
    I have a requirement to extract data from various different tables, but only particular columns.
    The requirements are the same for different databases, hence I thought of having a single generic approach and reusing the same code to perform the extract and create an ASCII file.
    Below is the typical scenario I want to achieve; I need your expert input to start off.
    a) Define the required columns -- this should be configurable, i.e. columns can be added or removed in future.
    b) Extract the column names from the database for those defined in step a) above.
    c) Extract the data from the relevant tables/columns for various conditions, based on steps a) and b) above.
    d) Create an ASCII file for all the data extracted.
    I'm unsure if there is anything wrong with this; please suggest the best approach.
    Regs,
    R

    user10177353 wrote:
    I'm unsure if there is anything wrong with this; please suggest the best approach.
    The first thing to bear in mind is that developing a generic, dynamic solution is considerably more complicated than writing a set of extract statements. So you need to be sure that the effort you're about to expend will save you more time than writing a script and copying/editing it for subsequent re-use.
    You'll probably need three tables:
    1. Extracts - one record per extract definition (perhaps including info such as target file name)
    2. Extract tables - the tables for each extract
    3. Extract columns - one record for each extracted column.
    I'm writing this as though you'll be extracting more than one table per run.
    The writing to file is the trickiest bit. Choose a good separator. Remember that although we call them CSV files, commas actually make a remarkably poor choice of separator, as way too much data contains them. Go for something really unlikely, ideally a multi-character separator like ||¬.
    Also remember text files only take strings, so you need to convert your data to text. Use the data dictionary ALL_TAB_COLUMNS view to get the metadata for the extracted columns, and apply explicit masks to date and numeric columns. You may want to allow date columns to have masks which include or exclude the time element.
    Consider what you want to do with complex data types (LOBs, UDTs, etc).
    Finally, you need to address the problem of the extract file's location. PL/SQL has a lot of utilities to wrangle files but they only work on the server side. So if you want to write to a local drive you'll need to use SPOOL.
    One last thought: how will you import the data? It would probably be a good idea to use this mechanism to generate DDL for a matching external table.
    Cheers, APC
    Edited by: APC on May 4, 2012 1:08 PM
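    As a rough Python illustration of the metadata-driven layout APC describes (this is not his code; the extract name, table and column names, masks, and the sqlite3 stand-in for the real source database are all made up for the example):

    ```python
    import sqlite3
    from datetime import datetime

    SEP = "||¬"   # deliberately unlikely multi-character separator, as suggested above

    # Metadata in the spirit of the three tables described: extracts -> tables -> columns,
    # each column with an explicit mask so dates and numbers become unambiguous text.
    EXTRACTS = {
        "customer_extract": {
            "target_file": "customer_extract.txt",
            "tables": {
                "customers": [
                    ("cust_id",    None),         # plain text, no mask
                    ("name",       None),
                    ("created_on", "%d.%m.%Y"),   # explicit date mask
                    ("balance",    "{:.2f}"),     # explicit numeric mask
                ],
            },
        },
    }

    def to_text(value, mask):
        """Apply the configured mask so every value becomes a string."""
        if value is None:
            return ""
        if mask is None:
            return str(value)
        if mask.startswith("%"):                           # date mask
            return datetime.fromisoformat(str(value)).strftime(mask)
        return mask.format(value)                          # numeric mask

    def run_extract(conn, name):
        """Build the SELECT from the metadata and write one delimited line per row."""
        cfg = EXTRACTS[name]
        with open(cfg["target_file"], "w", encoding="utf-8") as out:
            for table, columns in cfg["tables"].items():
                col_list = ", ".join(col for col, _ in columns)
                for row in conn.execute(f"SELECT {col_list} FROM {table}"):
                    out.write(SEP.join(to_text(v, m) for v, (_, m) in zip(row, columns)) + "\n")

    # Demo with an in-memory database standing in for the real source system.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (cust_id TEXT, name TEXT, created_on TEXT, balance REAL)")
    conn.execute("INSERT INTO customers VALUES ('C001', 'ACME', '2012-05-04', 1234.5)")
    run_extract(conn, "customer_extract")
    print(open("customer_extract.txt", encoding="utf-8").read())
    # C001||¬ACME||¬04.05.2012||¬1234.50
    ```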

  • Urgent: Not all records extracted to ODS

    Hi,
    Not all of the records I am trying to load get updated to ODS. Ex:
    Doc#     Item#     Created On     Debit/Credit
    1000     1     02012007     540
    1000     2     02012007     540
    1000     1     02022007     -540
    1000     2     02022007     -540
    These 4 records appear as 2 records in ODS as:
    1000     1     02022007     -540
    1000     2     02022007     -540
    When I simulate the update it shows me 4 records up to the communication structure, but from the communication structure to the ODS they come in as only 2 records. There is no routine in the update rules; it is just a direct one-to-one mapping. I am having this problem for records with the same line item # within a document. Please advise how to solve this issue…
    Thank you,
    sam

    I personally don't think there would be a requirement to load two line items the way you mention; if that is the case there should generally be some record number also coming in with the data. The data example you gave doesn't look right/complete to me somehow.
    Anyway, you can get the record number in many ways. You can create a number range object in the system (check transaction SNUM), and you can then use standard functions in your code to get the next number (I think you can check the NUMBER_RANGE* functions in SE37 to find an appropriate one). Or you can always read the MAX() number stored in this field in the ODS and start from there in your routine (likely not as robust as the first option).
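    To illustrate the likely cause (a sketch, assuming the ODS key is only Doc# and Item#, while Created On and the amount are data fields), here is a short Python example of why the four records collapse to two, and how an added record number in the key preserves them:

    ```python
    # Incoming records from the simulation (Doc#, Item#, Created On, Debit/Credit):
    records = [
        ("1000", "1", "02012007",  540),
        ("1000", "2", "02012007",  540),
        ("1000", "1", "02022007", -540),
        ("1000", "2", "02022007", -540),
    ]

    def load_ods(records, key_fields):
        """Overwrite-mode ODS: the last record per key wins."""
        ods = {}
        for doc, item, created_on, amount in records:
            rec = {"doc": doc, "item": item, "created_on": created_on, "amount": amount}
            key = tuple(rec[f] for f in key_fields)
            ods[key] = rec                      # same key -> the earlier record is overwritten
        return ods

    # Key = Doc# + Item#: the two postings per item collapse, only 2 records survive.
    print(len(load_ods(records, ("doc", "item"))))     # 2

    # Add a record number (e.g. from a number range object) to the key: all 4 survive.
    numbered = [r + (i,) for i, r in enumerate(records, start=1)]

    def load_ods_with_recno(records):
        ods = {}
        for doc, item, created_on, amount, recno in records:
            ods[(doc, item, recno)] = (created_on, amount)
        return ods

    print(len(load_ods_with_recno(numbered)))           # 4
    ```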

  • Data extraction from Siebel to BW - use BC or not

    Hello
    I am doing data extraction from Siebel to BW. I can map just 15% of the Siebel fields to SAP BC DataSources. Do you still recommend using BC, or should I forget it and create a whole bunch of new InfoObjects and custom ODS objects/cubes?

    Hi,
    If the data is coming from outside SAP, then select the InfoObjects carefully, because SAP-provided InfoObjects are more meaningful and integrate better with other InfoObjects (compounded objects, navigational attributes and so on). If you create any custom objects, you have to take care of all of them in the overall design and architecture.
    I prefer to choose existing BC InfoObjects first and create new objects only if none are available. But here you also need a lot of functional knowledge.

  • 'Get All New Data Request by Request' option not working between DSO and Cube

    Hi BI's..
    Could anyone please tell me why the options 'Get One Request Only' and 'Get All New Data Request by Request' are not working in a DTP between a standard DSO and an InfoCube?
    Scenario:
    I have loaded the data year-wise, say FY 2000 to FY 2009, via the InfoPackage into a write-optimised DSO (10 requests), then loaded it request by request into a standard DSO and activated each request. I have selected the option 'Get All New Data Request by Request' in the DTPs, and it works fine between the write-optimised DSO and the standard DSO, but not between the standard DSO and the cube: when I execute that DTP it takes everything as a single request from the standard DSO to the cube (10 requests become a single request).
    Regards,
    Sari.

    Hi,
    Which of the options below does your DTP extraction setting use? It should be Change Log, assuming you are not deleting the change log data.
    Delta Init. Extraction from...
    - Active Table (with archive)
    - Active Table (without archive)
    - Archive ( full extraction only)
    - Change Log
    Also, if you want to enable deltas, please do not delete the change log. That could create issues with further updates from the DSO.
    Hope that helps.
    Regards
    Mr Kapadia
    *Assigning points is the way to say thanks*

  • Delta upload request from ODS is not reaching the planning data target

    Hi,
    I have a process chain in which there is a delta upload from an ODS to 3 different data targets. One of the data targets is the planning cube. Two days ago I changed the process chain schedule, and since then the delta request is not happening for the planning cube. The process chain contains the steps that switch the planning cube to load mode and back to planning mode. Everything else is working fine, but the delta is not happening for the planning cube; the delta requests are going correctly to the other 2 data targets. It is now as if the delta has excluded the planning cube. Another thing: no planning package is appearing in the InfoSources, and when I try to upload manually from the ODS to the data targets it says that there is no delta. Please help me with this; it is a production problem. I have checked all the data monitors and everything is fine, but no delta is happening for the planning cube.
    Thanks
    Naveen.

    Resolved

  • Data extraction is not taking place from BW to R/3

    Hi Experts,
    We recently upgraded our BW 3.1 system to NW2004s/BI (BW 7.00). Now we are not able to extract data from the R/3 4.6C source system.
    We checked, and the data extraction job is not getting triggered in R/3; it is not receiving the IDoc. All the settings in WE20 and WE21 are correct. I would appreciate your quick help. I am not sure whether the question has been posted in the correct forum, as I am a newbie on the SDN forum; if it is not in the correct forum, please let me know.
    Many Thanks

    hi Arun,
    Thanks for your quick reply. Also, I made a mistake in the post subject: it is from R/3 to BW. I have checked all the ALE settings and the RFC connection, and everything is alright, but I don't know why the data is not getting extracted. I think the problem might lie with the function module. If you have any other ideas apart from RFC/delta, please let me know. I am quite sure this is not an issue with the ALE configuration.
    Just to make this issue a little clearer: the exact issue I am now facing is related to IDocs. The IDocs are going to status 56 in R/3, and all the control records in R/3 are blank. This has created a nightmare; all efforts are going in vain...
    Many Thanks,
    Chinmaya
    Message was edited by:
            Chinmaya Dash

  • If Generic R3 Data Extraction does it all, why other extraction techniques?

    Hi,
    I am reviewing some documents about data extraction and I have a quick question.
    With
    1. standard Business Content extraction and
    2. Generic R/3 Data Extraction (which by definition, can be used to extract any data from R/3)
    why are there other techniques for extraction?
    For example, I do not recall the details, but I remember that there was a different technique to extract FI data and another to extract CO data.
    Why is the need for these other techniques if Generic R/3 Data Extraction can do it all?
    (By the way, if you recall the FI and CO extraction techniques you may give me some hints)
    Thanks

    Hi Amanda,
    Would like to put something from my side as well.
    Generic extractors are a strong tool, but they come with a lot of limitations: only three kinds of delta method are supported, delta can be captured on changes to only one field, and they cannot be used with cluster and pool tables, which form a major portion of the FI domain. Even if you write a function module extractor, it has its limitations with delta enabling, delta capture and performance, which again leaves a lot of work to be done by us.
    What SAP has coded internally for the standard FI-CO DataSources is very complex; trying to achieve the same through generic DataSources and custom coding every time would make SAP very tough for the masses to use and for beginners like you and me to learn.
    A standard transaction in SAP may hit several tables, and capturing the exact changes may require writing quite complex logic and a lot of thorough testing.
    We do build our own DataSources for FI-CO as well, but most of the time the need is satisfied by the existing DataSources.
    I think SAP knows what kinds of reports are required by users in R/3, as they studied this thoroughly and then came up with these extractors to make the same kinds of reports possible in BW.
    This is the advantage of SAP over other tools: it has built-in logic to extract the data from the standard tables, capture all kinds of changes in delta, and manage delta easily without you having to bother too much about internal complications. All you have to do initially is start the work with a knowledge of FI and of the BW DataSources; slowly you can start playing with them, come up with customized ones, and explore SAP as your requirements grow.
    Rest others have already explained.
    Thanks
    Ajeet

  • Extract all the data grids in one click

    Hi All,
    I need to download all the data forms that I have in the system.
    We have a fat hierarchy, and I'm looking for a short way to extract all the data grids without extracting from every folder in the hierarchy separately.
    I don't care about the type of the file/files; it can be one long file...
    Do you know what action i can perform here?
    Thanks,
    Orit

    Hi,
    What operating system are you using?  What version of Excel are you using?  It's possible you are using a version of Excel we do not support.  Also, can you send me an email with your exported Excel file, so I can try opening it myself?  My email address is [email protected]
    Thanks,
    Todd

  • Unfortunately, through an update I lost all the apps on my iPad 2. Now I cannot pull the data from the cloud to my iPad. How is this possible?

    Unfortunately, through an update I lost all the apps on my iPad 2. Now I cannot pull the data from the cloud to my iPad. How is this possible?

    If you just installed iCloud does that mean you updated the iOS that's running on your iPad?  If so, you'll want to restore all the programs you have from the backup you hopefully made.
    Refer to these articles for help.
    iTunes: Backing up, updating, and restoring iOS software.
    If you don't want to use iCloud, simply don't activate it.
    You can also download the programs again.
    If you live in a country that supports re-downloading apps then you can re-download them.  You can refer to this article for more help.
    Downloading past purchases from the App Store and iTunes Store
    Want to know whether your country supports downloading past purchases?
    iTunes in the Cloud Availability
