Issue with Delta upload in Extractor
Hi all,
We are using the extractor 2LIS_11_VAITM to load the Sales Overview cube through the delta process. Whenever a new record is entered (to be specific, whenever a new sales order is created), the data is extracted and uploaded into the cube properly.
But when an existing record is changed (an existing sales order), the record is not picked up by the R/3 extractor (2LIS_11_VAITM), which means changed records are not being extracted as per the schedule.
Any hints to solve the issue would be helpful.
Regards,
Ram.
Hello,
Is the load using Direct Delta, Queued Delta, or Unserialized V3 Update?
Is the data from InfoSource 2LIS_11_VAITM feeding the InfoCube directly, or does it go into an ODS first? If it feeds the InfoCube directly, do you have a document number InfoObject in the cube? How are you checking whether existing data has been updated or a new record has arrived in your cube? Take any document number which you think was changed in R/3 and check it in the PSA too.
Has the problem of existing documents not being updated always been there, or did it just start today?
GSM.
Similar Messages
-
Issue with Delta in Function Module
Hi Team,
I have an issue with delta in generic extraction using a function module. The full load is working fine and I have taken POST_DATE as the delta field. Please check the code for any missing delta-related statements.
FUNCTION ZRSAX_BIW_MANGEMENT_RAT.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
*"     VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*"     VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"     VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
*"  TABLES
*"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"     E_T_DATA STRUCTURE ZQMBW_FUJ_MANAGEMENT OPTIONAL
*"  EXCEPTIONS
*"     NO_MORE_DATA
*"     ERROR_PASSED_TO_MESS_HANDLER
*"----------------------------------------------------------------------

* Example: DataSource for table MANAGEMENT RATING
  TABLES: ZQMBW_MANAGEMENT.

* Auxiliary selection criteria structure
  DATA: L_S_SELECT TYPE SRSC_S_SELECT.

* Maximum number of lines for DB table
  STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
* counter
           S_COUNTER_DATAPAKID LIKE SY-TABIX,
* cursor
           S_CURSOR TYPE CURSOR.

  RANGES: POST_DATE FOR ZMMTVEND_RATING-POST_DATE,
          VENDOR FOR ZMMTVEND_RATING-VENDOR.

* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF I_INITFLAG = SBIWA_C_FLAG_ON.

* Initialization: check input parameters,
* buffer input parameters,
* prepare data selection

* Check DataSource validity
    CASE I_DSOURCE.
      WHEN 'ZQMMANAGEMENT_DS'.
      WHEN OTHERS.
        IF 1 = 2. MESSAGE E009(R3). ENDIF.
* This is a typical log call. Please write every error message like this
        LOG_WRITE 'E'                  "message type
                  'R3'                 "message class
                  '009'                "message number
                  I_DSOURCE            "message variable 1
                  ' '.                 "message variable 2
        RAISE ERROR_PASSED_TO_MESS_HANDLER.
    ENDCASE.

    APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

* Fill parameter buffer for data extraction calls
    S_S_IF-REQUNR  = I_REQUNR.
    S_S_IF-DSOURCE = I_DSOURCE.
    S_S_IF-MAXSIZE = I_MAXSIZE.

* Fill field list table for an optimized select statement
* (in case there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
    APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

  ELSE. "Initialization mode or data extraction?

    IF S_COUNTER_DATAPAKID = 0.

* Fill range tables. BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'VENDOR'.
        MOVE-CORRESPONDING L_S_SELECT TO VENDOR.
        CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
          EXPORTING
            INPUT  = VENDOR-LOW
          IMPORTING
            OUTPUT = VENDOR-LOW.
        CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
          EXPORTING
            INPUT  = VENDOR-HIGH
          IMPORTING
            OUTPUT = VENDOR-HIGH.
        APPEND VENDOR.
      ENDLOOP.

      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'POST_DATE'.
        MOVE-CORRESPONDING L_S_SELECT TO POST_DATE.
* Convert external DD.MM.YYYY selections to internal YYYYMMDD
        CONCATENATE L_S_SELECT-LOW+6(4) L_S_SELECT-LOW+3(2)
                    L_S_SELECT-LOW+0(2) INTO POST_DATE-LOW.
        CONCATENATE L_S_SELECT-HIGH+6(4) L_S_SELECT-HIGH+3(2)
                    L_S_SELECT-HIGH+0(2) INTO POST_DATE-HIGH.
        APPEND POST_DATE.
      ENDLOOP.

* Get management rating details
      OPEN CURSOR WITH HOLD S_CURSOR FOR
        SELECT VENDOR POST_DATE OVERALL_MNGT_RAT OVERALL_DEV_RAT
          FROM ZMMTVEND_RATING
          WHERE VENDOR IN VENDOR
            AND POST_DATE IN POST_DATE.
    ENDIF.

* Fetch records into interface table
    FETCH NEXT CURSOR S_CURSOR
      APPENDING CORRESPONDING FIELDS
      OF TABLE E_T_DATA
      PACKAGE SIZE S_S_IF-MAXSIZE.

    S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

    IF SY-SUBRC <> 0.
      CLOSE CURSOR S_CURSOR.
      RAISE NO_MORE_DATA.
    ENDIF.

  ENDIF. "Initialization mode or data extraction?

ENDFUNCTION.
Hi,
Check URLs:
How to populate the ranges using FM for the SELECTs
Re: Generic Delta Function Module -
Recurring issue with mobile upload
Hi all,
Using Lightroom 5.6, I am experiencing ongoing issues with synchronising (uploading) LR to mobile devices. In brief, the upload will typically fail mid-session, with some collections partially or fully uploaded and some not started. (This may initially have been caused by going offline during an upload.) It will not recover. Deleting all uploaded data to prompt a restart does not help; it does not restart either.
Previously I have had to contact support and send the diagnostic report. Apparently there is a flag on the server end that needs resetting. Not the most satisfactory of solutions, as it requires an action outside the user's control, but it does/did reset the file transfer interface. Apparently this bug was scheduled for fixing in the next update. I understand I am on the latest (5.6) version, so does anyone know if this is still a bug?
Meantime, can support please reset my connection?
Regards .... Alastair
Hi Alastair,
I've contacted you privately.
Thanks,
Ignacio -
Issue with Delta Load in BI 7.0... Need resolution
Hi
I am having difficulty with a delta load which uses a generic extractor. The generic extractor is based on a view of two tables. I use the system date to perform the delta load. If the system date increases by a day, the load is expected to pick up the extra records. One of the tables used in the view, for master data, does not have the system date in it.
The data does not even come up to the PSA; it keeps saying there are no records. Is it because I loaded the data for yesterday and am manually adding today's data?
I am not sure what the cause of the delta failing is.
Appreciate any suggestions to take care of the issue.
Thanks.... SMaa
Hi
The generic DataSource supports the following delta types:
1. Calendar day
2. Numeric pointer
3. Time stamp
Calendar day - the delta is based on a calendar day; we can run the delta only once per day, and ideally at the end of the day, to minimize missing delta records.
Numeric pointer - this type of delta is suitable only when we are extracting data from a table which supports only the creation of new records / changes to existing records.
It supports:
Additive delta: with the delta type additive delta, each record to be loaded returns only the respective changes to key figures, for key figures that can be aggregated. The extracted data is added up in BI (targets: DSO and InfoCube).
New status for changed records: with the delta type new status for changed records, each record to be loaded returns the new status for all key figures and characteristics. The values in BI are overwritten (targets: DSO and master data).
Time stamp - using a timestamp we can run the delta multiple times per day, but we need to use a safety lower limit and a safety upper limit of at least 5 minutes.
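The timestamp safety intervals described above can be sketched as follows (in Python for illustration only; the function and parameter names are ours, not SAP's):

```python
from datetime import datetime, timedelta

def delta_window(last_upper, now, lower_safety=5, upper_safety=5):
    """Selection interval for a timestamp-based generic delta (sketch).

    lower_safety / upper_safety are minutes (at least 5 is recommended).
    The lower limit reaches back before the previous run's upper limit,
    and the upper limit stops short of 'now', so records committed with
    a slight delay are not lost between two delta runs.
    """
    low = last_upper - timedelta(minutes=lower_safety)
    high = now - timedelta(minutes=upper_safety)
    return low, high

# a run at 12:00 after a previous run whose upper limit was 09:00
low, high = delta_window(datetime(2011, 8, 8, 9, 0),
                         datetime(2011, 8, 8, 12, 0))
```

The overlap at the lower end means some records are extracted twice, which is why this delta type is normally paired with a DSO that overwrites.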
As you specified, the DataSource is based on a view of two tables, one containing the date and one that does not.
Kindly check the above lines and verify whether the view's primary key could be used to determine the delta for your DataSource.
Also let us know if any standard SAP tables are used in creating the view and, if so, what the size of the DataSource load is every day.
Thanks.
Nazeer -
Hi Friends,
I have an issue with loading.
1. From source 2LIS_13_VDITM, data loads to two targets, ZIC_SEG and 0SD_C03, and from 0SD_C03 it loads again to ZSD_C03W and ZSD_C03M through a DTP.
I did a repair full load on 08.08.2011 to the PSA and loaded this request manually to cube ZIC_SEG.
I forgot to delete this request from the PSA, as I did not want to load it to the other targets.
Before I noticed, the delta requests had already loaded and pulled my repair full request into the other targets as well.
As I have not done any selective deletions on the other cubes, there may be double entries.
I am planning the steps below in order to rectify the issue:
1. Do a selective deletion, in all three targets, of the delta request which got loaded on the 8th along with the repair full.
2. Delete the repair full request from the PSA.
So now the delta request which got loaded after the repair full request is left in the PSA.
My question is: if my process chain runs today, will this delta request in the PSA also be pulled again into the cube through the DTP?
Kindly share any other ideas. This is urgent.
Regards,
Banu
Hi Banu,
If the data in the cubes is not compressed, then follow the steps below:
1) Delete the latest request in cube ZIC_SEG.
2) Delete the latest request from cubes ZSD_C03W and ZSD_C03M, and then from 0SD_C03.
3) Load the delta request manually using the DTP (in the DTP you can load by giving the request number) to cubes ZIC_SEG and 0SD_C03.
4) Now run the delta DTP to load the delta request from 0SD_C03 to ZSD_C03W and ZSD_C03M.
The next time your process chain runs, it will load only that particular day's delta request.
It will work.
Regards,
Venkatesh -
Issues with file upload in flex mobile application (sharepoint as backend)
Hello,
I am working on a Flex mobile application for the Android platform, for which we have SharePoint as a backend.
(Flex SDK 4.6 and AIR 3.9)
Issue which we are facing is as follows:
We are communicating with the backend server using webservices: example:
<s:WebService id="kWebService" wsdl="http://www.kservice.net/kdatabaseservice.asmx?WSDL" >
<s:operation name="AddPost"
resultFormat="object"
result="addPostResult(event)"
fault="postsfaulterr(event)" />
</s:WebService>
The above services are working fine, but we are facing an issue with one service, which is related to file upload.
File uploads under 10 MB work fine, but when we try to upload a larger file to the server, it fails to process.
We are sending a ByteArray to the backend, and the backend code writes that ByteArray into a file.
We have tried many ways to overcome this situation: we have checked the configuration for the file upload size on the server, and we have tried WCF services as well. Please help us on this critical point as soon as possible.
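One approach that is commonly used for this situation, though it is not confirmed by this thread, is chunked upload: split the ByteArray into pieces below the service limit, send each piece in its own call, and have the backend append them to the target file. A minimal sketch of the client-side split (in Python for illustration; the 5 MB chunk size is an assumption):

```python
def split_into_chunks(data: bytes, chunk_size: int = 5 * 1024 * 1024):
    """Split an upload payload into chunks below the service size limit.

    Each chunk would be sent through a separate web-service call and
    reassembled (appended in order) on the server side.
    """
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

payload = bytes(12 * 1024 * 1024)      # a 12 MB dummy payload
chunks = split_into_chunks(payload)    # 5 MB + 5 MB + 2 MB
```

The server must track chunk order (for example via an index parameter on the call) so the file is reassembled correctly.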
Thanks
Dhwani
Prashant8809 wrote:
Hi
>
> I have already gone through the video by Thomas Jung for multiple file upload but it saves the contents in server and not in >transparent table. So please suggest me alternative solutions.
>
>
> Regards
> Prashant Chauhan
What do you mean that my video saves the contents in the server and not in a transparent table? I save the data into a temporary database table so it can be accessed by the parent WDA session. From there the WDA session can do whatever it wants with it. What do you mean by transparent table? That would be a database table. Do you actually mean an internal table? If so, just read the data from the temporary database table into memory. -
Issue with Image upload in DAM (changes the size)
Hi,
We are seeing issues with some .jpeg files whose dimensions get changed when uploaded into CQ 5.4. For example, I have W-Training-Shoes-hotspot-1b-140x140.2.jpeg with 140x140 dimensions. When I upload it into CQ, it saves with width=1600 and height=140. Has anybody seen this issue before?
If I copy the original file, resave it at 140x140 size, and reload it into CQ, it loads fine with width=140 and height=140.
It looks like the asset has wrong metadata. As a workaround, add a process step right after "Metadata extraction" in the update-asset workflow, and have a script update the asset with the correct dimensions.
-
Issue with delta queue initialization
Hello
I am having a problem with the initialization of the delta queue for a certain extractor. The initialization was successful, and I can see the extractor in the delta queue maintenance (RSA7) in ECC, but when users post transactions in ECC, they are not coming into RSA7.
The extractor is 0CO_OM_WBS_6, so it is not the standard LO cockpit, where we have V3 (background) jobs which have to be triggered.
One more thing: the record which was posted was backdated, i.e. the date is some time in the past. But I think the delta queue should still pick up such records. Can anyone explain how the delta queue functions based on the timestamp?
Please advise.
Here is the OSS note for FI DataSources.
OSS 485958:
Symptom
The FI line item delta DataSources can identify new and changed data only to the day because the source tables only contain the CPU date and not the time as time characteristic for the change.
This results in a safety interval of at least one day.
This can cause problems if the batch processing of the period-end closing is still running on the day following the period-end closing and all data of the closed period is to be extracted into BW on this day.
Example:
The fiscal year of the company ends on December 31, 2002.
On January 01, 2003, all data of the closed fiscal year is to be extracted into BW. In this case, the FI line item delta DataSources only extract the data with a safety interval of at least one day.
This is the data that was posted with a CPU date up to December 31, 2002.
However, the batch programs for closing the 2002 fiscal year are still running on January 01, 2003.
The postings created by these programs with a January 01 date for the 2002 fiscal year can therefore only be extracted with delta extraction on January 02, 2003. However, extraction should already be possible on January 01, 2003.
Other terms
FI line item delta DataSources
0FI_AP_4; 0FI_AR_4; 0FI_GL_4
Reason and Prerequisites
This problem is caused by the program design.
Solution
The functions are enhanced as of PI 2002.1, standard version.
Due to a customized setting, it is now possible for the FI line item extractors to read the data up to the current date during the delta runs.
During the next delta run, the data is then read again from the day when the previous extraction was called, up to the current day of the call for the new extraction. This means that the delta extraction of the data always occurs with time intervals overlapping by one day.
The delta calculation by the ODS objects ensures that the repeatedly extracted data does not cause duplicate values in the InfoCube.
Activate the new functions as follows:
1. PI 2001.1 or PI 2001.2:
Implement the new functions with the correction instructions in this note. As of PI 2002.1, these changes are already implemented in the standard version.
2. Start a test run for an FI line item DataSource via transaction RSA3.
3. Two new parameters are now available in the BWOM_SETTINGS table.
4. Parameter BWFIOVERLA (default ' ')
Set this parameter to 'X' using transaction SE16, to activate the new function of overlapping time intervals.
Caution
The new function also uses a safety interval of one day to determine the to-value of the selection if the time limit set by BWFITIMBOR is not reached. This means that only data up to the day before the extraction was called is selected if the extraction starts before 02:00:00 (the default for BWFITIMBOR).
Parameter BWFITIMBOR (format HH:MM:SS, default 020000)
The default value for this parameter is 02:00:00. Do not change this value.
If this time limit is not reached, only data up to the previous day (with BWFIOVERLA = 'X'), or the day before that (with BWFIOVERLA = ' '), is selected.
This logic prevents the loss of records with a CPU date of the previous day that are still being posted when the extraction is executed.
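The to-value logic the note describes can be sketched as follows (in Python for illustration only; the function and argument names are ours, and the behaviour is our reading of the note, not SAP code):

```python
from datetime import date, time, timedelta

def fi_upper_limit(call_date, call_time,
                   bwfiovera=True, bwfitimbor=time(2, 0, 0)):
    """To-value of the CPU-date selection for an FI line item delta run.

    bwfiovera=True models BWFIOVERLA = 'X' (overlapping intervals);
    bwfitimbor models the BWFITIMBOR time limit (default 02:00:00).
    """
    if call_time < bwfitimbor:
        # before the time limit an extra one-day safety interval applies
        return call_date - timedelta(days=1 if bwfiovera else 2)
    # after the limit: current day with 'X', previous day without it
    return call_date if bwfiovera else call_date - timedelta(days=1)
```

With the note's example, a run on January 01, 2003 after 02:00 with BWFIOVERLA = 'X' reads up to January 01, while the old behaviour would stop at December 31, 2002.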
Check any OSS notes for your data source. -
Issue with delta load from R/3 to BW
Hi frnds,
There is a standard DataSource, 2LIS_05_ITEM, in R/3. Every day we extract data from R/3 to BW based on this standard DataSource with a delta load. There are two fields, ZX and ZY, in BW; sometimes the data for these two fields is not extracted into BW with the delta update, but at other times these two fields are extracted properly with the same delta update.
Why does it happen like this? Please give some inputs on this.
Regards,
Satya.
Hello,
If it is a standard field, then it is getting populated in the correct way.
If it is a custom field, then you need to analyze the records for which it is getting populated and the ones for which it is not.
It is quite possible that some customization in CMOD results in this behaviour.
Also, check the underlying tables to see the correct values.
Regards
Ajeet -
Issue with the standard SAP Extractor 2LIS_04_P_COMP
Hello gurus,
I am working on the PP and PM modules. The standard extractor 2LIS_04_P_COMP is already in use in the development system.
I have made changes to a Production Order (PO) and checked the delta for the BW load; the changes in the fields Release, Delivered and PO Confirmed are not populated with values. But when I check the same PO in transaction COR3, it has all of the above statuses.
Could you please let me know why it is behaving this way? Has anyone faced this scenario?
Regards
Hello Gurus,
In this extractor I am getting negative values in the ENMNG (Withdrawn Quantity) field for the after-image records, for the goods which are consumed for production, so I am getting negative values at report level. Could anyone kindly let me know the logic working behind this?
Regards -
Hello, I am having problems with the RSA7 queue. I have to delete the init and recreate it every day. All day long it works fine, but then it stops working the next day. In my monitor all I see is a yellow request. Can someone tell me how to fix this issue? Thanks.
Why are you deleting the init every day and recreating it? What error message have you seen in the monitor?
Is the present load an init or a delta? What did you see in the monitor screen? Is it still running?
You can check the jobs in SM37 in BW and R/3, and also check in SM58 / WE02 whether any IDocs are stuck. -
Issue with file upload to Microsoft One Drive
Hi,
My uploaded file (25.6 MB, 44 seconds long) from Adobe Premiere Elements 9 will not play on Microsoft OneDrive (MOD).
Is anyone else having problems using MOD?
I created the file I uploaded on my new HP Envy computer (x64 processor with 8 GB of RAM).
I created this file in the Adobe Premiere Elements 9 software, saved it to my computer, and then uploaded it to MOD.
I created it from my project by choosing the MPEG category; then, within the MPEG category, I chose the HD 720p 30 preset as the smallest and fastest option to upload.
I saved it as a file in a folder on my computer. I don't know if this makes a difference, but when I right-clicked I chose "Windows Media Player" as the "Opens With" option. I don't think the "Opens With" option really matters, though; doesn't MOD automatically convert the file to whatever format it wants to use anyway?
So that is the file I uploaded to MOD using the MOD "Upload" button.
When I clicked to play, after waiting for what I assume was a buffering process, I got a "Sorry, this video can't be played." message.
Also, FYI, I uploaded it to Google Drive. It plays, but pauses to buffer about 30 seconds in.
YouTube is another service I uploaded to. It plays perfectly there, but I would prefer to play it on MOD. Your feedback is appreciated.
Thanks,
Ed'sAPE9
Hi,
Are you facing this error with a specific browser? Test it in another browser.
If yes, check below SAP Note
1900896 - Browser: IE Restrictions for Portal and NWBC content - IE Quirks and Standards Document Modes -
Slideshow issues with my uploaded site
I have recently built a website in Muse and exported it to HTML. I uploaded my website to my web host and everything works... except all my slideshows are now up in my banner space in multiple web browsers, such as Chrome, Mozilla Firefox, Safari and Internet Explorer. I did not have this issue when previewing my site, only after going live. How can I fix this?
I looked in Chrome, Safari and Firefox and it looks fine to me. Have you got it sorted out?
-
Hi
I am facing a problem uploading data. The problem is that the data goes up to the PSA but after that fails to go into the DSO, as there is a special character ('#') problem.
Can anyone suggest something?
Thanks in advance
Hi,
The issue is regarding special character,
If your load is through the PSA, then you can edit the value in the PSA. To do this you have to delete the request from the target without setting the QM status to red, otherwise it will not allow you to edit in the PSA. Then do reconstruction, i.e. push the request from the PSA to the target.
Or
Go to SE38, execute the program RSKC_ALLOWED_CHAR_MAINTAIN, and add ALL_CAPITAL or the characters you want to allow.
Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the characters you have entered.
Or,
Go to transaction RSKC and permit that character.
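The effect of the permitted-character check can be sketched as follows (in Python for illustration only; the default permitted set shown here is an assumption — check table RSALLOWEDCHAR for your system's actual value):

```python
# Approximation of BW's default permitted character set
PERMITTED = set(' !"%&\'()*+,-./:;<=>?_0123456789'
                'ABCDEFGHIJKLMNOPQRSTUVWXYZ')

def clean_value(value: str, permitted=PERMITTED) -> str:
    """Replace characters BW would reject (displayed as '#') with spaces,
    mimicking a manual edit of the value in the PSA."""
    return ''.join(ch if ch in permitted else ' ' for ch in value)

print(clean_value('ABC\u0001DEF'))   # 'ABC DEF'
```

The same idea can be applied in a transfer routine so the cleanup happens automatically instead of per failed request.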
Hope it helps,
Thanks & Regards,
SD -
Hi,
I made a very involved 75-page photo book, and I want to purchase it now. So I click Buy Now, all my info is verified and 1-Click is set up, but as soon as the upload (assembling book) begins, it gets to page three and hangs (beachball).
Page three was blank, and it happened; I made it an image, and it happened; I checked the fonts, but there is no text on that page, and it still happens; I repaired permissions, rebuilt the thumbnail cache, made a copy of the library...
Any ideas? I need to get this ordered!
Thank you,
Joe
Others have reported similar problems with ordering books from a Mac Pro. It is thought to be associated with the graphics card, NVIDIA GT7300.
Happy Holidays