New scenario - alternative materials - background batch determination
Dear friends.
Please help in mapping the requirement:
A particular finished material, e.g. material A, has alternative raw materials R1, R2, and R3. After creating the process order,
during batch determination the system has to check the stock of R1 first; if it meets the demand of the order quantity, it has to take the stock from R1 on top priority.
If only partial stock is available for R1, it has to take that partial stock first and then go to R2 for the remaining quantity. This has to take place automatically in the background.
E.g., let us say the order quantity is 25 kg:
stock of R1 = 10 kg
stock of R2 = 5 kg
stock of R3 = 50 kg
During batch determination in the background, the system has to check R1 on top priority and take the available 10 kg,
then take 5 kg from R2, and the remaining 10 kg from R3.
Had the stock of R1 been sufficient, it would take the full quantity from R1, and the BOM line-item quantities for R2 and R3 would need to be set to 0 so that no planned costs are picked up for those materials.
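Standard batch determination selects batches of a single material; it does not by itself split one requirement across alternative BOM components, so this behaviour usually needs custom logic (for example in a batch-determination or order-component enhancement; which hook to use is an assumption that depends on the release). The quantity split itself is plain greedy allocation; here is a language-agnostic sketch, written in Python with the poster's example numbers:

```python
def allocate(demand, stocks):
    """Greedy split: draw from each alternative in priority order
    until the demand is covered; return the split and any shortfall."""
    allocation = {}
    remaining = demand
    for material, stock in stocks:       # stocks listed in priority order
        if remaining <= 0:
            break
        taken = min(stock, remaining)    # take what is available, capped at the need
        if taken > 0:
            allocation[material] = taken
            remaining -= taken
    return allocation, remaining

# The example from the post: an order of 25 kg against R1/R2/R3.
split, shortfall = allocate(25, [("R1", 10), ("R2", 5), ("R3", 50)])
# -> {"R1": 10, "R2": 5, "R3": 10}, shortfall 0
```

With sufficient R1 stock, the loop simply stops after the first alternative, which matches the requirement that R2 and R3 then get quantity 0.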
I checked the alternative-material scenario, but it was of no use; I cannot use usage probability here.
Please guide.
Hi,
For determining planned cost and for consideration in the MRP run, you have to enter a usage probability for the BOM item.
Thanks
Kedar
Similar Messages
-
How to use the WS_DOWNLOAD function module in the background (batch job)
Hi all,
Can any one tell,
How to use the WS_DOWNLOAD function module in the background (batch job)
Thanks,
Ravi Kumar.
As discussed above, you can use DATASET techniques to write the file to the application server. Then you could do a BDC call to transaction CG3Y to transfer the file to the presentation server.
Just press F1 on the keyword DATASET in SE80 for more information.
BDC Call Info found here: <a href="http://www.sap-img.com/bdc.htm">http://www.sap-img.com/bdc.htm</a>
Regards,
Philip Johannesen -
Background batch input processing: error log handling
Dear Friends,
Currently I am scheduling a job for batch processing of goods receipts through a BAPI. After a successful run of this program, I append all the success and error logs into one internal table and send this table to the spool for future reference.
Is there any other way to handle the error logs?
Because with the method I am using, the user has to run transaction SP01 every time to access that spool request and check the error logs.
Kindly suggest ....
Regards,
Sunny V
The best way will be to create application logs for the report's messages. These stay in the system forever unless you delete them.
A few one-time settings are required. After that you can use transactions
SLG1 (Analyze application log) and
SLG2 (Application Log: Delete expired logs).
For more info take the help from below link:
[Create Application Log|http://help.sap.com/saphelp_nw04/Helpdata/EN/2a/fa0216493111d182b70000e829fbfe/frameset.htm]
Let me know if you find any difficulty in doing this. -
Truly dumb question: how do I get PS to actually complete a command, e.g. straighten a horizon, and finalise the change? Even doing a "save as" with a new file name and reloading brings the image back, straightened, but with the skewed background frame still there. Is there an "apply change" or similar command that I simply cannot see?
brings the image back, straightened, but with the skewed back ground frame
PSE did what you told it to do. Choose the Straighten and crop edges or fill edges options when you use the straighten tool if you want it to do more than just straighten the contents of the image. Many people prefer the plain straighten because they'd rather crop themselves than let PSE do it. -
Problem scheduling a job in the background
hi all,
My project scenario is: I have written a program to create WBS elements.
Creating one WBS element in production takes 20 to 30 minutes, as there are many transactional BDCs.
So I want to run that particular code (all the BDCs) in the background.
So I wrote another program named 'zback_job_for_wbs_creation' and added all the required code.
In the old program I have written some code to execute this new program in the background; the code is below:
FORM submit_for_job.
  TABLES: btcevtjob.
  DATA: l_jobname   LIKE tbtco-jobname,
        l_jobnumber LIKE rsjobinfo-jobnumb.
  DATA: count   LIKE btcevtjob-jobcount,
        jobname LIKE btcevtjob-jobname.
  DATA: job_was_released LIKE btch0000-char1.

  jobname = 'WBS_CREATION'.

  EXPORT s_scrnum TO MEMORY ID 'W_SCRNUM'.
  EXPORT iscrh    TO MEMORY ID 'W_ISCRH'.
  EXPORT iscrl1   TO MEMORY ID 'W_ISCRL1'.
  EXPORT iscrl2   TO MEMORY ID 'W_ISCRL2'.
  EXPORT iscrl2b  TO MEMORY ID 'W_ISCRL2B'.
  EXPORT iscrl2a  TO MEMORY ID 'W_ISCRL2A'.
  EXPORT iscrl1a  TO MEMORY ID 'W_ISCRL1A'.

  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = jobname
    IMPORTING
      jobcount = count.

  SUBMIT zback_job_for_wbs_creation
    WITH s_scrnnum IN s_scrnum
    VIA JOB jobname NUMBER count
    TO SAP-SPOOL WITHOUT SPOOL DYNPRO
    AND RETURN.

  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount             = count
      jobname              = jobname
      strtimmed            = 'X'
    IMPORTING
      job_was_released     = job_was_released
    EXCEPTIONS
      cant_start_immediate = 1
      invalid_startdate    = 2
      jobname_missing      = 3
      job_close_failed     = 4
      job_nosteps          = 5
      job_notex            = 6
      lock_failed          = 7
      OTHERS               = 8.
Here I have put a breakpoint in my submitted program, but execution never reaches the program, and the job closes successfully without executing my code (the submitted program).
In the above SUBMIT statement, if I just write SUBMIT progname AND RETURN, then it does reach the program,
but then the JOB_CLOSE function module throws exception 5 (job_nosteps).
So please help me out with this.
I want my submitted program to run in the background.
Thanks.
Hi,
Start the job monitor (SM37) and search for your job; I think you will find it.
Jobs are not linked to the GUI frontend, so you can't debug your background job that way.
If you want to debug, you must be tricky:
code a never-ending loop with an exit condition which you can control from the debugger.
DATA:
  stop_for_capture.          " flag: set this in the debugger to leave the loop
DO.
  IF sy-uname NE 'HABICH'.   " change to your account
    EXIT.
  ENDIF.
  IF NOT stop_for_capture IS INITIAL.
    EXIT.
  ENDIF.
ENDDO.
When your job is running, start transaction SM50, select the relevant batch work process, and
choose Program -> Debugging from the menu.
regards
Walter Habich -
Background job (report) scheduled 49 hrs ago, still ACTIVE??
Hi Experts,
Please clarify one simple doubt: would a report (which has only one SELECT statement, with the key in the WHERE clause) take 49 hours to execute? I scheduled a background job for my_report 49 hours ago and it is still ACTIVE. Or did it go into an infinite loop?
thanks
Hi
No, a single SELECT query won't take that much time.
Maybe the program went into an infinite loop; please check that.
Reward if useful.
Below is the best way to write a select query.
Ways of Performance Tuning
1. Selection Criteria
2. Select Statements
Select Queries
SQL Interface
Aggregate Functions
For all Entries
Select Over more than one Internal table
Selection Criteria
1. Restrict the data in the selection criteria itself, rather than filtering it out in the ABAP code using the CHECK statement.
2. Select with selection list.
Points # 1/2
SELECT * FROM SBOOK INTO SBOOK_WA.
CHECK: SBOOK_WA-CARRID = 'LH' AND
SBOOK_WA-CONNID = '0400'.
ENDSELECT.
The above code can be optimized much further by the code written below, which avoids CHECK and selects with a selection list:
SELECT CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK
WHERE CARRID = 'LH' AND
CONNID = '0400'.
Select Statements Select Queries
1. Avoid nested selects
2. Select all the records in a single shot using into table clause of select statement rather than to use Append statements.
3. When a base table has multiple indices, the where clause should be in the order of the index, either a primary or a secondary index.
4. For testing existence, use a SELECT ... UP TO 1 ROWS statement instead of a SELECT-ENDSELECT loop with an EXIT.
5. Use SELECT SINGLE if all primary key fields are supplied in the WHERE condition.
Point # 1
SELECT * FROM EKKO INTO EKKO_WA.
SELECT * FROM EKAN INTO EKAN_WA
WHERE EBELN = EKKO_WA-EBELN.
ENDSELECT.
ENDSELECT.
The above code can be much more optimized by the code written below.
SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB
FROM EKKO AS P INNER JOIN EKAN AS F
ON P~EBELN = F~EBELN.
Note: A simple SELECT loop is a single database access whose result is passed to the ABAP program line by line. Nested SELECT loops mean that the number of accesses in the inner loop is multiplied by the number of accesses in the outer loop. One should therefore use nested SELECT loops only if the selection in the outer loop contains very few lines or the outer loop is a SELECT SINGLE statement.
Point # 2
SELECT * FROM SBOOK INTO SBOOK_WA.
CHECK: SBOOK_WA-CARRID = 'LH' AND
SBOOK_WA-CONNID = '0400'.
ENDSELECT.
The above code can be optimized much further by the code written below, which avoids CHECK, selects with a selection list, and fetches the data in one shot using INTO TABLE:
SELECT CARRID CONNID FLDATE BOOKID FROM SBOOK INTO TABLE T_SBOOK
WHERE CARRID = 'LH' AND
CONNID = '0400'.
Point # 3
To choose an index, the optimizer checks the field names specified in the where clause and then uses an index that has the same order of the fields . In certain scenarios, it is advisable to check whether a new index can speed up the performance of a program. This will come handy in programs that access data from the finance tables.
Point # 4
SELECT * FROM SBOOK INTO SBOOK_WA
UP TO 1 ROWS
WHERE CARRID = 'LH'.
ENDSELECT.
The above code is more optimized as compared to the code mentioned below for testing existence of a record.
SELECT * FROM SBOOK INTO SBOOK_WA
WHERE CARRID = 'LH'.
EXIT.
ENDSELECT.
Point # 5
If all primary key fields are supplied in the Where condition you can even use Select Single.
Select Single requires one communication with the database system, whereas Select-Endselect needs two.
Select Statements contd.. SQL Interface
1. Use column updates instead of single-row updates
to update your database tables.
2. For all frequently used Select statements, try to use an index.
3. Using buffered tables improves the performance considerably.
Point # 1
SELECT * FROM SFLIGHT INTO SFLIGHT_WA.
SFLIGHT_WA-SEATSOCC =
SFLIGHT_WA-SEATSOCC - 1.
UPDATE SFLIGHT FROM SFLIGHT_WA.
ENDSELECT.
The above mentioned code can be more optimized by using the following code
UPDATE SFLIGHT
SET SEATSOCC = SEATSOCC - 1.
Point # 2
SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA
WHERE CARRID = 'LH'
AND CONNID = '0400'.
ENDSELECT.
The above mentioned code can be more optimized by using the following code
SELECT * FROM SBOOK CLIENT SPECIFIED INTO SBOOK_WA
WHERE MANDT IN ( SELECT MANDT FROM T000 )
AND CARRID = 'LH'
AND CONNID = '0400'.
ENDSELECT.
Point # 3
Bypassing the buffer increases network load considerably:
SELECT SINGLE * FROM T100 INTO T100_WA
BYPASSING BUFFER
WHERE SPRSL = 'D'
AND ARBGB = '00'
AND MSGNR = '999'.
The above mentioned code can be more optimized by using the following code
SELECT SINGLE * FROM T100 INTO T100_WA
WHERE SPRSL = 'D'
AND ARBGB = '00'
AND MSGNR = '999'.
Select Statements contd Aggregate Functions
If you want to find the maximum, minimum, sum and average value or the count of a database column, use a select list with aggregate functions instead of computing the aggregates yourself.
Some of the Aggregate functions allowed in SAP are MAX, MIN, AVG, SUM, COUNT, COUNT( * )
Consider the following extract.
Maxno = 0.
Select * from zflight where airln = 'LF' and cntry = 'IN'.
Check zflight-fligh > maxno.
Maxno = zflight-fligh.
Endselect.
The above mentioned code can be much more optimized by using the following code.
Select max( fligh ) from zflight into maxno where airln = 'LF' and cntry = 'IN'.
Select Statements contd For All Entries
FOR ALL ENTRIES creates a WHERE clause in which all the entries of the driver table are combined with OR. If the number of entries in the driver table exceeds rsdb/max_blocking_factor, several similar SQL statements are executed to limit the length of the WHERE clause.
The plus
Large amount of data
Mixing processing and reading of data
Fast internal reprocessing of data
Fast
The Minus
Difficult to program/understand
Memory could be critical (use FREE or PACKAGE size)
Points that must be considered with FOR ALL ENTRIES:
Check that data is present in the driver table
Sorting the driver table
Removing duplicates from the driver table
Consider the following piece of extract
Loop at int_cntry.
Select single * from zfligh into int_fligh
where cntry = int_cntry-cntry.
Append int_fligh.
Endloop.
The above mentioned can be more optimized by using the following code.
Sort int_cntry by cntry.
Delete adjacent duplicates from int_cntry.
If NOT int_cntry[] is INITIAL.
Select * from zfligh appending table int_fligh
For all entries in int_cntry
Where cntry = int_cntry-cntry.
Endif.
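The splitting by the blocking factor described above can be pictured with a small sketch. This is a language-agnostic illustration in Python, not actual kernel behaviour; the field name and the blocking factor of 5 are made up for the example (the real limit comes from the profile parameter rsdb/max_blocking_factor):

```python
def split_driver_table(keys, blocking_factor):
    """Mimic how FOR ALL ENTRIES turns a driver table into several
    SQL statements, each with at most `blocking_factor` OR branches."""
    # Deduplicate while keeping order, as one should do before FOR ALL ENTRIES.
    seen, unique = set(), []
    for k in keys:
        if k not in seen:
            seen.add(k)
            unique.append(k)
    # One WHERE clause per chunk of `blocking_factor` keys.
    chunks = [unique[i:i + blocking_factor]
              for i in range(0, len(unique), blocking_factor)]
    return ["WHERE cntry = '%s'" % "' OR cntry = '".join(c) for c in chunks]

stmts = split_driver_table(["IN", "US", "DE", "IN", "FR", "JP", "BR", "CN"], 5)
# 7 unique keys with a blocking factor of 5 -> 2 generated statements
```

This also makes the earlier advice concrete: an empty driver table means zero restrictions (the full table would be read), and duplicates would only inflate the generated OR lists.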
Select Statements contd Select Over more than one Internal table
1. It is better to use views instead of nested SELECT statements.
2. To read data from several logically connected tables, use a join instead of nested SELECT statements. Joins are preferred only if all the primary keys are available in the WHERE clause for the tables that are joined; if the primary keys are not provided in the join, the joining of the tables itself takes time.
3. Instead of using nested Select loops it is often better to use subqueries.
Point # 1
SELECT * FROM DD01L INTO DD01L_WA
WHERE DOMNAME LIKE 'CHAR%'
AND AS4LOCAL = 'A'.
SELECT SINGLE * FROM DD01T INTO DD01T_WA
WHERE DOMNAME = DD01L_WA-DOMNAME
AND AS4LOCAL = 'A'
AND AS4VERS = DD01L_WA-AS4VERS
AND DDLANGUAGE = SY-LANGU.
ENDSELECT.
The above code can be optimized further by extracting all the data from view DD01V:
SELECT * FROM DD01V INTO DD01V_WA
WHERE DOMNAME LIKE 'CHAR%'
AND DDLANGUAGE = SY-LANGU.
ENDSELECT.
Point # 2
SELECT * FROM EKKO INTO EKKO_WA.
SELECT * FROM EKAN INTO EKAN_WA
WHERE EBELN = EKKO_WA-EBELN.
ENDSELECT.
ENDSELECT.
The above code can be much more optimized by the code written below.
SELECT P~F1 P~F2 F~F3 F~F4 INTO TABLE ITAB
FROM EKKO AS P INNER JOIN EKAN AS F
ON P~EBELN = F~EBELN.
Point # 3
SELECT * FROM SPFLI
INTO TABLE T_SPFLI
WHERE CITYFROM = 'FRANKFURT'
AND CITYTO = 'NEW YORK'.
SELECT * FROM SFLIGHT AS F
INTO SFLIGHT_WA
FOR ALL ENTRIES IN T_SPFLI
WHERE SEATSOCC < F~SEATSMAX
AND CARRID = T_SPFLI-CARRID
AND CONNID = T_SPFLI-CONNID
AND FLDATE BETWEEN '19990101' AND '19990331'.
ENDSELECT.
The above mentioned code can be even more optimized by using subqueries instead of for all entries.
SELECT * FROM SFLIGHT AS F INTO SFLIGHT_WA
WHERE SEATSOCC < F~SEATSMAX
AND EXISTS ( SELECT * FROM SPFLI
WHERE CARRID = F~CARRID
AND CONNID = F~CONNID
AND CITYFROM = 'FRANKFURT'
AND CITYTO = 'NEW YORK' )
AND FLDATE BETWEEN '19990101' AND '19990331'.
ENDSELECT.
1. Table operations should be done using explicit work areas rather than via header lines.
2. Always try to use binary search instead of linear search. But don't forget to sort your internal table before that.
3. A dynamic key access is slower than a static one, since the key specification must be evaluated at runtime.
4. A binary search using secondary index takes considerably less time.
5. LOOP ... WHERE is faster than LOOP/CHECK because LOOP ... WHERE evaluates the specified condition internally.
6. Modifying selected components using MODIFY itab TRANSPORTING f1 f2.. accelerates the task of updating a line of an internal table.
Point # 2
READ TABLE ITAB INTO WA WITH KEY K = 'X' BINARY SEARCH.
IS MUCH FASTER THAN USING
READ TABLE ITAB INTO WA WITH KEY K = 'X'.
If TAB has n entries, linear search runs in O( n ) time, whereas binary search takes only O( log2( n ) ).
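The O( n ) versus O( log2( n ) ) difference is the same in any language; here is a small Python sketch using the standard bisect module, which is the analogue of READ TABLE ... BINARY SEARCH on a sorted table:

```python
import bisect

table = sorted(range(0, 1000, 2))    # a sorted "internal table" of even keys

def linear_read(tab, key):
    for i, k in enumerate(tab):      # O(n): scans until the key is found
        if k == key:
            return i
    return -1                        # sy-subrc <> 0, so to speak

def binary_read(tab, key):
    i = bisect.bisect_left(tab, key) # O(log n): halves the range each step
    return i if i < len(tab) and tab[i] == key else -1

# Both find the same row; the binary variant just gets there far faster.
assert linear_read(table, 498) == binary_read(table, 498)
```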
Point # 3
READ TABLE ITAB INTO WA WITH KEY K = 'X'.
IS FASTER THAN USING
READ TABLE ITAB INTO WA WITH KEY (NAME) = 'X'.
Point # 5
LOOP AT ITAB INTO WA WHERE K = 'X'.
ENDLOOP.
The above code is much faster than using
LOOP AT ITAB INTO WA.
CHECK WA-K = 'X'.
ENDLOOP.
Point # 6
WA-DATE = SY-DATUM.
MODIFY ITAB FROM WA INDEX 1 TRANSPORTING DATE.
The above code is more optimized as compared to
WA-DATE = SY-DATUM.
MODIFY ITAB FROM WA INDEX 1.
7. Accessing the table entries directly in a "LOOP ... ASSIGNING ..." accelerates the task of updating a set of lines of an internal table considerably
8. If collect semantics is required, it is always better to use to COLLECT rather than READ BINARY and then ADD.
9. "APPEND LINES OF itab1 TO itab2" accelerates the task of appending a table to another table considerably as compared to LOOP-APPEND-ENDLOOP.
10. DELETE ADJACENT DUPLICATES accelerates the task of deleting duplicate entries considerably as compared to READ-LOOP-DELETE-ENDLOOP.
11. "DELETE itab FROM ... TO ..." accelerates the task of deleting a sequence of lines considerably as compared to DO -DELETE-ENDDO.
Point # 7
Modifying selected components only makes the program faster as compared to Modifying all lines completely.
e.g,
LOOP AT ITAB ASSIGNING <WA>.
I = SY-TABIX MOD 2.
IF I = 0.
<WA>-FLAG = 'X'.
ENDIF.
ENDLOOP.
The above code works faster as compared to
LOOP AT ITAB INTO WA.
I = SY-TABIX MOD 2.
IF I = 0.
WA-FLAG = 'X'.
MODIFY ITAB FROM WA.
ENDIF.
ENDLOOP.
Point # 8
LOOP AT ITAB1 INTO WA1.
READ TABLE ITAB2 INTO WA2 WITH KEY K = WA1-K BINARY SEARCH.
IF SY-SUBRC = 0.
ADD: WA1-VAL1 TO WA2-VAL1,
WA1-VAL2 TO WA2-VAL2.
MODIFY ITAB2 FROM WA2 INDEX SY-TABIX TRANSPORTING VAL1 VAL2.
ELSE.
INSERT WA1 INTO ITAB2 INDEX SY-TABIX.
ENDIF.
ENDLOOP.
The above code uses BINARY SEARCH for collect semantics. READ BINARY runs in O( log2(n) ) time. The above piece of code can be more optimized by
LOOP AT ITAB1 INTO WA.
COLLECT WA INTO ITAB2.
ENDLOOP.
SORT ITAB2 BY K.
COLLECT, however, uses a hash algorithm and is therefore independent of the number of entries (i.e. O(1)).
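The COLLECT behaviour (rows with the same key have their numeric fields summed; new keys are inserted) corresponds to accumulating into a hash map, which is why each lookup is O(1). A minimal Python analogue (the field names are invented for the example):

```python
def collect(rows):
    """COLLECT-like aggregation: rows with the same key have their
    numeric fields added; each key is looked up in O(1) via a dict."""
    totals = {}
    for key, val1, val2 in rows:
        old = totals.get(key, (0, 0))          # hash lookup, not a search
        totals[key] = (old[0] + val1, old[1] + val2)
    return totals

result = collect([("A", 1, 10), ("B", 2, 20), ("A", 3, 30)])
# -> {"A": (4, 40), "B": (2, 20)}
```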
Point # 9
APPEND LINES OF ITAB1 TO ITAB2.
This is more optimized as compared to
LOOP AT ITAB1 INTO WA.
APPEND WA TO ITAB2.
ENDLOOP.
Point # 10
DELETE ADJACENT DUPLICATES FROM ITAB COMPARING K.
This is much more optimized as compared to
READ TABLE ITAB INDEX 1 INTO PREV_LINE.
LOOP AT ITAB FROM 2 INTO WA.
IF WA = PREV_LINE.
DELETE ITAB.
ELSE.
PREV_LINE = WA.
ENDIF.
ENDLOOP.
Point # 11
DELETE ITAB FROM 450 TO 550.
This is much more optimized as compared to
DO 101 TIMES.
DELETE ITAB INDEX 450.
ENDDO.
12. Copy internal tables using ITAB2[] = ITAB1[] rather than LOOP-APPEND-ENDLOOP.
13. Specify the sort key as restrictively as possible to run the program faster.
Point # 12
ITAB2[] = ITAB1[].
This is much more optimized as compared to
REFRESH ITAB2.
LOOP AT ITAB1 INTO WA.
APPEND WA TO ITAB2.
ENDLOOP.
Point # 13
SORT ITAB BY K. makes the program run faster compared with SORT ITAB.
Internal Tables contd
Hashed and Sorted tables
1. For single read access hashed tables are more optimized as compared to sorted tables.
2. For partial sequential access, sorted tables are more optimized as compared to hashed tables.
Hashed And Sorted Tables
Point # 1
Consider the following example where HTAB is a hashed table and STAB is a sorted table
DO 250 TIMES.
N = 4 * SY-INDEX.
READ TABLE HTAB INTO WA WITH TABLE KEY K = N.
IF SY-SUBRC = 0.
ENDIF.
ENDDO.
This runs faster for single read access as compared to the following same code for sorted table
DO 250 TIMES.
N = 4 * SY-INDEX.
READ TABLE STAB INTO WA WITH TABLE KEY K = N.
IF SY-SUBRC = 0.
ENDIF.
ENDDO.
Point # 2
Similarly for Partial Sequential access the STAB runs faster as compared to HTAB
LOOP AT STAB INTO WA WHERE K = SUBKEY.
ENDLOOP.
This runs faster as compared to
LOOP AT HTAB INTO WA WHERE K = SUBKEY.
ENDLOOP. -
Hi Experts,
I have one scenario: I have scheduled a program in the background.
When I execute that background job program, I need to transfer the data to the application server.
Please suggest how to do this.
Ramesh.
Thanks in advance.
There is a system variable, sy-batch, which tells you whether the job is running in the background or not:
OPEN DATASET p_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc NE 0.
  MESSAGE e001(zsd_mes).
  EXIT.
ELSE.
* Data is downloaded to the application server file path
  LOOP AT it_tab2 INTO wa_tab2.
    TRANSFER wa_tab2 TO p_file.
  ENDLOOP.
ENDIF.
* Close the application server file (mandatory).
CLOSE DATASET p_file.
How to debug a background job
Hi All,
I want to debug the program, but it takes a long time, so I am running it in the background; however, I need to debug the program from a particular point. Is there any other way to debug the program?
Meaning: run the program in the background up to a particular point, and from there run it in debug mode.
Please suggest.
Thanks & Regards,
Hari
Hi,
You can do this only after the job has finished execution. It simulates the exact background scenario, with the same selection-screen values as used in the job and sy-batch set to 'X'.
Use SM37 to get the list of jobs, type 'JDBG' in the command line (no '/'), put the cursor on the job, and press ENTER.
You are in debug mode now. Step through SAP program (press F7 couple of times) until you get to code you need.
Hope this helps u.
Thanks. -
Restricting background jobs for end users
Hi all, I am an ABAPer, but I need to work on an authorization issue.
I have to restrict our FI end users (8 users) from background jobs.
The present scenario is that they can run other users' jobs as well in SM35, including their own. Recently our company audit found this and gave the solution below:
Excessive batch job authorization was given to end users (end users should be restricted via the objects S_BTCH_ADM and S_BTCH_NAM, which allow them to release and schedule jobs using other users' IDs).
Going through this, I changed the value of S_BTCH_ADM to N, S_BTCH_NAM to *,
and S_BTCH_JOB to LIST, RELE, PROT.
Kindly give me the solution <removed_by_moderator>
Regards
Chandra
Edited by: Julius Bussche on Jun 16, 2008 6:02 AM
> Is there any possibility to restrict the naming convention of background jobs, i.e. in the SM36 screen job-name column? If I want to restrict it to a particular naming convention, is that possible? For example, Z:XYZ as the first part of the job name.
AFAIK you can only restrict this for JOBACTION 'SHOW' and this would only work if your JOBGROUP field was already populated with a value.
The name of the job would not work. So I would say that you need to arrange that organizationally.
I think way back a few years ago there was a thread here about this and the possibility of using the user's ID as the JOBGROUP value - but if I remember correctly it was disbanded as being impractical.
So the answer is most likely: No.
Cheers,
Julius -
What happens in the back ground when you schedule a Sales Order?
Gurus,
What happens in the back ground when you schedule a Sales Order?
Assemble to Order scenario.
Edited by: 792050 on May 17, 2011 3:24 AM
If I merge libraries and then back the merged libraries up onto an external drive or DVD, do I just back up the libraries themselves, or do I also back up the original files?
If you're running a Managed Library, then backing up the Library also backs up the original files.
I ask because my HDD is getting pretty full. I want to reduce the amount of stuff on it, and the iPhoto libraries are the biggest single consumer of space at the moment (Nearly 100GB).
1. Quit iPhoto
2. Copy the iPhoto Library from your Pictures Folder to the External Disk.
3. Hold down the option (or alt) key while launching iPhoto. From the resulting menu select 'Choose Library' and navigate to the new location. From that point on this will be the default location of your library.
4. Test the library and when you're sure all is well, trash the one on your internal HD to free up space.
Regards
TD
P.s. If you're running a Managed library, then it's the default setting, and iPhoto copies files into the iPhoto Library when Importing
If you're running a Referenced Library, then you made a change at iPhoto -> Preferences -> Advanced and iPhoto is NOT copying the files into the iPhoto Library when importing. -
SD New Scenario- customer free supplied material
Hi Experts,
My client's requirement is: the customer raises a sales order for one product, and for that product the customer supplies one specific spare part to the company free of cost.
The company will use that part in making the finished product. So, in this case, what are the possibilities to deliver the finished product?
How do I map the process from sales order to invoice?
please help me to map the scenario.
Regards
Madhu
Hi,
I faced a similar scenario a while back, but the solution was not based in SD; it was based in MM and PP.
My solution is different from the previous one because, if I understood correctly, the customer sends products to you; you don't send anything to the customer. If I am right, then you cannot use the subcontracting process.
This is how we solved it:
-> Material type: you need a non-valuated material type to map this product. You can use a standard one, but I would use a Z one so that I am free to adapt it in the future. Please use a separate number range so that you can easily distinguish the customer's materials from your own.
-> You map the delivery of your customer's materials through a purchase order where the vendor is your customer. You need a new account assignment category that is not relevant for billing, i.e. one where pricing is not relevant and where the system does not expect to receive an invoice.
-> You need to set up the BOMs so that your customer's material is used in the production of your finished product. As the material type is non-valuated, there will not be any costs for this product when you produce it.
-> You sell your finished product to the customer with a normal sales order (OR). Probably, because one of the components belongs to your customer, the price will be lower.
Hope it helps. Regards, -
Error in BDC session in background (using PP03 T-code)
Hi Experts,
When I execute the session (PP03 transaction), it works fine in foreground as well as in display-errors mode, but it does not work in background mode: it raises the exception CNTL_SYSTEM_ERROR.
I used a flat file on the presentation server, which I accessed with the GUI_UPLOAD function module.
I am also getting the error using datasets. After uploading the presentation-server data into one internal table, I sent it to the application server using OPEN DATASET (TRANSFER); after that I read the application-server data using OPEN DATASET (READ)
into another internal table (declared with the same structure as the flat file).
But this time also it gives the same error (runtime exception).
Please tell me how I can handle this using datasets. It is very urgent.
Please, anybody help me (if anyone has worked on this PP03 transaction).
Send the code or the full details as soon as possible.
Regards,
dattu malge.
Hi,
Go to the transaction SM35 and select your session and then click the Process Button.
Here you select the Processing Mode as "Background".
It is not possible for the same session to be executed by more than one user at the same time.
RSBDCSUB is used to automate the processing of Batch input session.
Cheers,
Hakim -
How do I turn off programs running in the background on my iPad?
How do I turn off apps running in the background with the new update?
How to Close Apps
Double-tap the Home button, then swipe the app (not the icon) upwards. Tap the Home button when finished.
From Here > http://support.apple.com/kb/HT4211 -
Hi Experts,
See the following code.
When I run this program in the foreground it gives a time-out error (the current run-time limit is 600 seconds). When I run it in the background with the immediate-execute option, after some time the job is cancelled; when I look at the job log, it says "More memory space requested".
Can anybody please go through the code and tell me what the problem is?
*& Report ZTEST_XML
REPORT ZTEST_XML.
*data: ifile like salfldir occurs 0 with header line.
data:p_path TYPE rsmrgstr-path value 'F:\usr\sap\CD5\serena\OCOS\'.
data: p_file type string.
TYPES: BEGIN OF t_xml_line, "Structure for holding XML data
data(256) TYPE x,
END OF t_xml_line.
data: BEGIN OF t_xml occurs 0,
rec TYPE x,
END OF t_xml.
data: begin of jtab occurs 0,
text type string,
end of jtab.
DATA: l_xml_table TYPE TABLE OF t_xml_line. " XML Table of the structure
data: wa(256) type x.
data: wa_xml like l_xml_table occurs 0.
data: l_str type string.
data: start_line type i,
end_line type i,
v_str type string.
data: cnt type i,
idx type i,
count1 type i,
len type i,
diff type i.
data: catalogid type string.
DATA : BEGIN OF DAT OCCURS 0,
LIN type string,
END OF DAT.
DATA: BASIC_TEXT LIKE THEAD OCCURS 0 WITH HEADER LINE.
DATA: YGUID LIKE COMM_PCAT_CTY-GUID,
ZGUID LIKE COMM_PCAT_CTV-PARENT_CATEGORY.
DATA: BASICTEXT like TLINE OCCURS 0 WITH HEADER LINE.
DATA: IT_THEAD LIKE THEAD OCCURS 0 WITH HEADER LINE.
data: begin of IT_SERENA occurs 0,
ID LIKE COMM_PCAT_CTY-ID,
TEXT type string,
end of IT_SERENA.
DATA:WA_SERENA like IT_SERENA.
DATA: wa_files TYPE rsfillst,
ifile LIKE TABLE OF wa_files.
data: begin of it_error occurs 0,
err type string,
end of it_error.
data: errorfile type string value 'error.txt'.
CALL FUNCTION 'SUBST_GET_FILE_LIST'
EXPORTING
DIRNAME = p_path
FILENM = '*'
PATTERN =
TABLES
FILE_LIST = ifile
EXCEPTIONS
ACCESS_ERROR = 1
OTHERS = 2.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
loop at ifile into wa_files.
if not wa_files-name cs '.xml'.
delete ifile.
endif.
endloop.
loop at ifile into wa_files.
concatenate p_path wa_files-name into p_file.
open dataset p_file for input in text mode encoding default.
if sy-subrc = 0.
do.
read dataset p_file into l_str.
if l_str = space.
exit.
endif.
if sy-subrc <> 0.
concatenate text-002 p_file into it_error-err.
append it_error.
clear it_error.
exit.
else.
jtab-text = l_str.
append jtab.
clear jtab.
endif.
enddo.
else.
concatenate text-001 p_file into it_error-err.
append it_error.
clear it_error.
continue.
endif.
loop at jtab.
if jtab-text cs 'Catalog ID'.
idx = sy-tabix .
endif.
endloop.
read table jtab index idx.
if sy-subrc = 0.
catalogid = jtab-text.
endif.
if catalogid cs '<![CDATA['.
shift catalogid left by 56 places.
endif.
if catalogid cs ']]></field>'.
replace ']]></field>' with space into catalogid.
endif.
write:/ 'Catalogid is:', catalogid.
IT_SERENA-ID = catalogid.
LOOP AT jTAB.
if jtab-text cs 'longDescription'.
start_line = sy-tabix + 1.
exit.
endif.
endloop.
cnt = start_line.
do.
read table jtab index cnt.
if jtab-text cs '</field>'.
end_line = sy-tabix.
exit.
else.
cnt = cnt + 1.
endif.
enddo.
loop at jtab from start_line to end_line.
concatenate jtab-text v_str into v_str separated by space.
endloop.
shift v_str left by 18 places.
shift v_str left by 10 places.
len = strlen( v_str ).
DO.
IF COUNT1 >= LEN.
EXIT.
ENDIF.
DIFF = LEN - COUNT1.
IF DIFF < 125.
DAT-LIN = v_str+COUNT1(DIFF).
ELSE.
DAT-LIN = v_str+COUNT1(125).
ENDIF.
APPEND DAT.
WRITE :/ DAT-LIN.
CLEAR DAT.
COUNT1 = COUNT1 + 125.
ENDDO.
REFRESH BASIC_TEXT.
REFRESH BASICTEXT.
CLEAR YGUID.
CLEAR ZGUID.
SELECT SINGLE GUID FROM COMM_PCAT_CTY INTO YGUID
WHERE ID = IT_SERENA-ID.
IF NOT YGUID IS initial.
if sy-subrc <> 0.
concatenate text-003 p_file into it_error-err.
append it_error.
clear it_error.
else.
SELECT SINGLE GUID FROM COMM_PCAT_CTV INTO ZGUID
WHERE PARENT_CATEGORY = YGUID.
ENDIF.
BASIC_TEXT-TDOBJECT = 'PCAT_CTY'.
BASIC_TEXT-TDNAME = ZGUID.
BASIC_TEXT-TDID = '0001'.
BASIC_TEXT-TDSPRAS = 'EN'.
APPEND BASIC_TEXT.
loop at DAT.
BASICTEXT-TDFORMAT = '*'.
BASICTEXT-TDLINE = DAT-LIN.
APPEND BASICTEXT.
endloop.
refresh DAT.
loop at basictext.
if basictext-tdline cs '![CDATA['.
replace '![CDATA[' with space into basictext-tdline.
ENDIF.
modify basictext.
ENDLOOP.
loop at basictext.
if basictext-tdline cs ']]>'.
replace ']]>' with space into basictext-tdline.
modify basictext.
endif.
endloop.
CALL FUNCTION 'SAVE_TEXT'
EXPORTING
CLIENT = SY-MANDT
HEADER = BASIC_TEXT
INSERT = 'X'
SAVEMODE_DIRECT = 'X'
OWNER_SPECIFIED = ' '
LOCAL_CAT = ' '
IMPORTING
FUNCTION =
NEWHEADER = IT_THEAD
TABLES
LINES = BASICTEXT
EXCEPTIONS
ID = 1
LANGUAGE = 2
NAME = 3
OBJECT = 4
OTHERS = 5.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CLEAR IT_THEAD.
IF SY-SUBRC = 0.
WRITE:/ 'DATA UPLOADED SUCCESSFULLY'.
ELSE.
WRITE:/ 'DATA NOT UPLOADED'.
ENDIF.
refresh jtab.
refresh DAT.
COUNT1 = 0.
clear v_str.
REFRESH BASICTEXT.
refresh dat.
close dataset p_file.
endloop.
open dataset errorfile for appending in text mode encoding default.
if sy-subrc = 0.
loop at it_error.
transfer it_error-err to errorfile.
endloop.
close dataset errorfile.
endif.
Is there any other FM similar to SAVE_TEXT?
That depends. Can you please provide some more info on how the programs are linked:
several steps in batch planning
cascade of submits
and what kind of failure might occur:
shortdump
Kind Regards
Klaus -
Hi everyone,
Please help me out with this.
I have written a program to create a WBS element; it takes 20 to 30 minutes to create one element in production. In my program there is a part of the code which creates WBS elements,
so I want to run that part of the code in the background,
only that part of the code.
I even did the coding for it: I wrote a new program and pasted that part of the code into the new program,
and I submit this new program from the old one by creating a background job.
I am exporting the few required internal tables for the new report, and I have imported all the internal tables in the new report as well,
but this exporting/importing is still not happening. Please
tell me what the problem can be.
Thanks.
Your code is not optimized for performance; can you send me your code?
see my business card for my id.
Thanks