Forecast - Background Job Issue

Hi,
Working on SCM 4.1
We are getting an issue with the Forecast run in the background.
After the background job runs for the forecast, we get an error message for some of the SKU-locations:
Error ---> "Forecast Values will be negative"
Because of the above error, the forecast is not updated in the forecast key figure!
If we run the same SKU-location interactively, under the same forecast profiles and settings, it gives the forecast values.
And in the message tab it gives the same message: "Forecast Values will be negative"
But in the foreground it produces the forecast.
So why is the system not populating the forecast in the planning book in the background job?
Is there any specific reason? How can we solve this?
Thanks in advance.
Regards,
Rajesh patil

Rajesh,
It is normal to see the error message "Forecast Values will be negative" due to a decreasing trend. In this case, the negative values just don't get stored in the stat forecast key figure; instead it will be null.
Now to the question of why positive values are not written in background. Please make sure you run the forecast at the same level in both Interactive Planning and the background. Let's say you use Product - Customer in the planning book; then select only these levels in the forecast batch job as well.
Let me know if this helped.
Hari Vomkarey

Similar Messages

  • Background job cancelled in Solman system.

    Hi experts,
Two background jobs were cancelled in the Solman system.
    1. SM:SELFDIAGNOSIS
    Job started
    Step 001 started (program RDSWP_SELF_DIAGNOSIS, variant &0000000001018, user ID SAP*)
    ABAP/4 processor: MOVE_TO_LIT_NOTALLOWED_NODATA
    Job cancelled
    2.SM:SYNC SAP SESSIONS
    Job started
    Step 001 started (program RDSWPCISERVICEPLAN, variant &0000000001018, user ID SAP*)
    Your user has no SAP Support Portal user assigned
    Job cancelled after system exception ERROR_MESSAGE
Please give me the solution to fix it.
    Regards,

    Hi Vivek,
1). Adobe Forms (transaction SFP) offers a feature where you can pass the value of the table instead of the table itself. For this you have to go to SFP > INTERFACE > Interface (tab) > (click on) Import.
Here, you should see all the parameters listed. On the parameter that you want to change, tick the "Pass Value" checkbox. This solved the issue and prevented the dump.
    Exception MOVE_TO_LIT_NOTALLOWED_NODATA
Refer to Note 834318.
2). Please maintain your S-user ID in the AISUSER transaction. Refer to the link below for screenshots.
    http://wiki.sdn.sap.com/wiki/display/SM/HowtomaintainthetransactionAISUSERforstandardandVARscenarios
    Regards,
    Arjun
    Edited by: Arjun Venkateswarlu. on Jan 12, 2012 9:35 PM

  • Background job for automatic creation of billing document

    Dear All
Could you advise me on how to set up a background job for automatic creation of the billing document?
    Thanks and regards,
    Sylwia
    Edited by: S. KOWALSKA on Jan 12, 2009 10:12 AM

    Dear Sylwia
Please don't post the same question in multiple threads. You are receiving feedback in the other forum. Check this link:
[Background job for automatic creation of billing document|Background job for automatic creation of billing document]
    thanks
    G. Lakshmipathi

  • Forecasting Background job

    Hi Gurus,
We run the S&OP forecasting background job once a month using RMCA0766, but every time it runs, the next month's run fails. I wasn't able to find out why it fails. But if I run the job manually using transaction MC8G, it tells me to reset the status to run it the next time. I would like to know the correct procedure to run this program in the background, whether using RMCA0766 or MC8G. Also, do we need to reset the status every month to run it again? Can someone please guide me on that.
    I would really appreciate if someone could throw some insights on it.
    thanks
    Babu

You need to run 3 programs in sequence for any mass processing job via SM37:
1.  RMCAxxx6  ->  This will rebuild the planning status for any object according to your variant setting in the job.  xxx is the name of the structure.  In your case, for S076, it will be RMCA0766.
2.  RMCP6RES  ->  This will reset the status of the job.  You need to create the variant and specify the job name in the variant.  The job name is defined in MC8D.
3.  RMCP6BIN  ->  This runs the mass processing itself.
    There is an OSS note describe this, but I can't remember the number.

  • "can not start a job" issue in AWM

    Hi ALL,
I am maintaining my cube from PL/SQL with the following options:
1. buildtype = "BACKGROUND"
2. trackstatus = "true"
3. maxjobqueues = 3
I get the following error when I look at the "olapsys.xml_load_log" table:
***Error Occured: Failed to Build(Refresh) DB_OWNER.MY_ANALYTICAL_WORKSPACE Analytic Workspace. Can not start a job.
Can anybody explain when and why this error occurs? I have spent a lot of time searching for this issue, but have found no one else facing it.
Hi Keith, it will be great if you can answer this one.
My database version is 10.2.0.4.0 and my AWM version is 10.2.0.3.0.
    Kind Regards,
    QQ
    Message was edited by:
    dwh_10g

    Applies to:
    Oracle OLAP - Version: 10.1 to 11.1
    This problem can occur on any platform.
    Symptoms
    - We have an AW maintenance / refresh script or procedure that contains BuildType="BACKGROUND", so that the AW maintenance task will be sent to the Oracle Job queue.
    - When we execute the AW maintenance / refresh script or procedure, we do not get any errors in the foreground, the script/procedure has been executed successfully.
    - However when we look into the build/refresh log (see <Note 351688.1> for details) we see that the maintenance/refresh task failed with:
    13:29:39 Failed to Submit a Job to Build(Refresh) Analytic Workspace <schema>.<AW Name>.
    13:29:39 ***Error Occured in BUILD_DRIVER
    - In the generated SQL trace for the session of the user who launches the AW build/refresh script or procedure, we see that ORA-27486 insufficient privileges error occurred at creation of the job.
We see from the relevant bit of the SQL trace that err=27486 occurred while executing the #20 statement, which is 'begin DBMS_SCHEDULER.CREATE_JOB ...', and the statement is parsed and executed as the user having uid=91:
PARSING IN CURSOR #20 len=118 dep=2 uid=91 oct=47 lid=91 tim=1176987702199571 hv=1976722458 ad='76dd8bcc'
begin
DBMS_SCHEDULER.CREATE_JOB(:1, 'plsql_block', :2, 0, null, null, null,
'DEFAULT_JOB_CLASS', true, true, :3); end;
END OF STMT
PARSE #20:c=1000,e=1100,p=0,cr=0,cu=0,mis=1,r=0,dep=2,og=1,tim=1176987702199561
EXEC #20:c=65990,e=125465,p=10,cr=1090,cu=3,mis=1,r=0,dep=2,og=1,tim=1176987702325167
ERROR #20:err=27486 tim=465202984
    Cause
    User who tries to create a job (executes DBMS_SCHEDULER.CREATE_JOB() procedure) does not have the sufficient privileges.
    Solution
    1. Identify the user under which the job is supposed to be created. This user is not necessarily the same as the user who launched AW build/refresh script or procedure. Get the corresponding username from one of the %_USERS views e.g. from ALL_USERS.
    e.g.
    SELECT user_id,username FROM all_users WHERE user_id=91;
2. Identify the system privileges currently assigned to the user by connecting as the user whose privileges need to be determined, and execute:
    SELECT * FROM SESSION_PRIVS ORDER BY PRIVILEGE;
    3. Ensure that the CREATE JOB privilege is listed.
    The CREATE JOB privilege can be granted in various ways to a user:
    - via role OLAP_USER or via role OLAP_DBA (see view ROLE_SYS_PRIVS for what privs are assigned to a role):
    GRANT OLAP_USER TO <username>;
    - explicitly
    GRANT CREATE JOB TO <username>;
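Once the privilege is in place, a quick sanity check is to create and drop a trivial scheduler job as the affected user. This is only an illustrative sketch (the job name PRIV_CHECK_JOB is a placeholder, not part of the original note):

```sql
-- Connect as the user identified in step 1, then try to create a
-- disabled, do-nothing job and drop it again. If this block succeeds,
-- the ORA-27486 during the AW build/refresh should be resolved.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'PRIV_CHECK_JOB',   -- illustrative name only
    job_type   => 'PLSQL_BLOCK',
    job_action => 'BEGIN NULL; END;',
    enabled    => FALSE);             -- creation alone tests the privilege
  DBMS_SCHEDULER.DROP_JOB(job_name => 'PRIV_CHECK_JOB');
END;
/
```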

  • Creating Job Issue

    Hi Experts,
I have created a page in Apex where I'm using a file browse item, say :P13_FILEBROWSE (with CSV-format files), and other items too. On click of the Submit button I'm calling a package which dynamically and successfully creates a table from the browsed CSV file.
Now my requirement is to create a dbms_scheduler job for the existing package (as above), so that if I upload a very huge file it runs in the backend as a DBMS job. If I call the package directly from Apex, it works successfully. But if I create a job for that package, I get an error because the Apex file browse item is not identified.
In the package I'm using wwv_flow_files, from which I'm extracting the file and creating the table. Through the job it is unable to recognise the Apex file browse item. For more information please go through the following code.
I'm calling the package from Apex, which successfully creates the table:
htmldb_sid_50.parse_file(:P13_FILEBROWSE,'P13_COLLECTION','P13_HEADINGS','P13_COLUMNS','P13_DDL');
htmldb_sid_50.parse_file(:P13_FILEBROWSE,'P13_COLLECTION','P13_HEADINGS','P13_COLUMNS','P13_DDL','gdqdata.gdq_table1');
Following is the package code:
    create or replace PACKAGE BODY htmldb_sid_50
    AS
    TYPE varchar2_t IS TABLE OF VARCHAR2(32767) INDEX BY binary_integer;
    -- Private functions --{{{
    PROCEDURE delete_collection ( --{{{
    -- Delete the collection if it exists
p_collection_name IN VARCHAR2
)
IS
    BEGIN
    IF (htmldb_collection.collection_exists(p_collection_name))
    THEN
    htmldb_collection.delete_collection(p_collection_name);
    END IF;
    END delete_collection; --}}}
    PROCEDURE csv_to_array ( --{{{
    -- Utility to take a CSV string, parse it into a PL/SQL table
    -- Note that it takes care of some elements optionally enclosed
    -- by double-quotes.
    p_csv_string IN VARCHAR2,
    p_array OUT wwv_flow_global.vc_arr2,
    p_num IN VARCHAR2 DEFAULT null,
p_separator IN VARCHAR2 := ','
)
IS
    l_start_separator PLS_INTEGER := 0;
    l_stop_separator PLS_INTEGER := 0;
    l_length PLS_INTEGER := 0;
    l_idx BINARY_INTEGER := 0;
    l_new_idx BINARY_INTEGER := 0;
    l_quote_enclosed BOOLEAN := FALSE;
    l_offset PLS_INTEGER := 1;
    p_csv_string_test varchar2(4000);
    cell_value varchar2(400);
    BEGIN
    l_length := NVL(LENGTH(p_csv_string),0);
    IF (l_length <= 0)
    THEN
    RETURN;
    END IF;
    LOOP
    l_idx := l_idx + 1;
    l_quote_enclosed := FALSE;
    -- p_csv_string
    p_csv_string_test := replace(p_csv_string,'""','@@');
    --p_csv_string := p_csv_string_test;
    IF SUBSTR(p_csv_string_test, l_start_separator + 1, 1) = '"'
    THEN
    l_quote_enclosed := TRUE;
    l_offset := 2;
    l_stop_separator := INSTR(p_csv_string_test, '"', l_start_separator + l_offset, 1);
    ELSE
    l_offset := 1;
    l_stop_separator := INSTR(p_csv_string_test, p_separator, l_start_separator + l_offset, 1);
    END IF;
    IF l_stop_separator = 0
    THEN
    l_stop_separator := l_length + 1;
    END IF;
    if l_idx = 1 then
    p_array(l_idx) := p_num;
    l_idx := l_idx + 1;
    end if;
    if(l_idx < 50) then
    cell_value := (SUBSTR(p_csv_string_test, l_start_separator + l_offset,(l_stop_separator - l_start_separator - l_offset)));
    cell_value := replace(cell_value, '@@', '"');
    p_array(l_idx) := cell_value;
    end if;
    EXIT WHEN l_stop_separator >= l_length;
    IF l_quote_enclosed
    THEN
    l_stop_separator := l_stop_separator + 1;
    END IF;
    l_start_separator := l_stop_separator;
    END LOOP;
    END csv_to_array; --}}}
    PROCEDURE get_records(p_blob IN blob,p_records OUT varchar2_t) --{{{
    IS
    l_record_separator VARCHAR2(2) := chr(13)||chr(10);
    l_last INTEGER;
    l_current INTEGER;
    BEGIN
    -- Sigh, stupid DOS/Unix newline stuff. If HTMLDB has generated the file,
    -- it will be a Unix text file. If user has manually created the file, it
    -- will have DOS newlines.
    -- If the file has a DOS newline (cr+lf), use that
    -- If the file does not have a DOS newline, use a Unix newline (lf)
    IF (NVL(dbms_lob.instr(p_blob,utl_raw.cast_to_raw(l_record_separator),1,1),0)=0)
    THEN
    l_record_separator := chr(10);
    END IF;
    l_last := 1;
    LOOP
    l_current := dbms_lob.instr( p_blob, utl_raw.cast_to_raw(l_record_separator), l_last, 1 );
    EXIT WHEN (nvl(l_current,0) = 0);
    p_records(p_records.count+1) := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob,l_current-l_last,l_last));
    l_last := l_current+length(l_record_separator);
    END LOOP;
    END get_records; --}}}
    -- Utility functions --{{{
    PROCEDURE parse_textarea ( --{{{
    p_textarea IN VARCHAR2,
p_collection_name IN VARCHAR2
)
IS
    l_index INTEGER;
    l_string VARCHAR2(32767) := TRANSLATE(p_textarea,chr(10)||chr(13)||' ,','@@@@');
    l_element VARCHAR2(100);
    BEGIN
    l_string := l_string||'@';
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    LOOP
    l_index := instr(l_string,'@');
    EXIT WHEN NVL(l_index,0)=0;
    l_element := substr(l_string,1,l_index-1);
    IF (trim(l_element) IS NOT NULL)
    THEN
    htmldb_collection.add_member(p_collection_name,l_element);
    END IF;
    l_string := substr(l_string,l_index+1);
    END LOOP;
    END parse_textarea; --}}}
    PROCEDURE parse_file( --{{{
    p_file_name IN VARCHAR2,
    p_collection_name IN VARCHAR2,
    p_headings_item IN VARCHAR2,
    p_columns_item IN VARCHAR2,
    p_ddl_item IN VARCHAR2,
p_table_name IN VARCHAR2 DEFAULT NULL
)
IS
    l_blob blob;
    l_records varchar2_t;
    l_record wwv_flow_global.vc_arr2;
    l_datatypes wwv_flow_global.vc_arr2;
    l_headings VARCHAR2(4000);
    l_columns VARCHAR2(4000);
    l_seq_id NUMBER;
    l_num_columns INTEGER;
    l_ddl VARCHAR2(4000);
    l_keyword varchar2(300);
    l_records_count number;
    l_count number;
    l_filename varchar2(400);
    BEGIN
    --delete from test_mylog;
    -- insert into t_log(slog) values('p_file_name '||p_file_name||'p_collection_name '||p_collection_name||'p_headings_item '||p_headings_item||'p_columns_item '||p_columns_item||'p_ddl_item '||p_ddl_item||'p_table_name '||p_table_name);
    -- commit;
    insert into testing_job values(p_file_name,p_collection_name,p_headings_item,p_columns_item,p_ddl_item,p_table_name);commit;
    IF (p_table_name is not null)
    THEN
    BEGIN
    execute immediate 'drop table '||p_table_name;
    EXCEPTION
    WHEN OTHERS THEN null;
    END;
    l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
    htmldb_util.set_session_state('P13_DEBUG',l_ddl);
    execute immediate l_ddl;
    l_ddl := 'insert into '||p_table_name||' '||
    'select '||v(p_columns_item)||' '||
    'from htmldb_collections '||
    'where seq_id > 1 and collection_name='''||p_collection_name||'''';
    htmldb_util.set_session_state('P13_DEBUG',v('P13_DEBUG')||'/'||l_ddl);
    execute immediate l_ddl;
    RETURN;
    END IF;
    BEGIN
    /*select blob_content into l_blob from wwv_flow_files
    where name=p_file_name;*/
    select the_blob into l_blob from t_batch_attachments
    where blob_filename=p_file_name;
    -- insert into testing_job values (l_blob);commit;
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    raise_application_error(-20000,'File not found, id='||p_file_name);
    END;
    get_records(l_blob,l_records);
    IF (l_records.count < 2)
    THEN
    raise_application_error(-20000,'File must have at least 2 ROWS, id='||p_file_name);
    END IF;
    -- Initialize collection
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    -- Get column headings and datatypes
    csv_to_array(l_records(1),l_record,'sid');
         --l_records_count:=l_record.count+1;
    --csv_to_array(l_records(l_records_count),l_datatypes);
         l_num_columns := l_record.count;
         if (l_num_columns > 100) then
    raise_application_error(-20000,'Max. of 100 columns allowed, id='||p_file_name);
    end if;
    -- Get column headings and names
    FOR i IN 1..l_record.count
    LOOP
    l_record(i) := trim(l_record(i));
    l_record(i) := replace(l_record(i),' ','_');
    l_record(i) := replace(l_record(i),'-','_');
    l_record(i) := replace(l_record(i),'(','_');
    l_record(i) := replace(l_record(i),')','_');
    l_record(i) := replace(l_record(i),':','_');
    l_record(i) := replace(l_record(i),';','_');
    l_record(i) := replace(l_record(i),'.','_');
    l_record(i) := replace(l_record(i),'"','_');
    l_headings := l_headings||':'||l_record(i);
    l_columns := l_columns||',c'||lpad(i,3,'0');
    END LOOP;
    l_headings := ltrim(l_headings,':');
    l_columns := ltrim(l_columns,',');
    htmldb_util.set_session_state(p_headings_item,l_headings);
    htmldb_util.set_session_state(p_columns_item,l_columns);
         -- Get datatypes
    FOR i IN 1..l_record.count
    LOOP
    l_datatypes(i):='varchar2(400)';
    l_ddl := l_ddl||','||l_record(i)||' '||l_datatypes(i);
    END LOOP;
    l_ddl := '('||ltrim(l_ddl,',')||')';
    htmldb_util.set_session_state(p_ddl_item,l_ddl);
    -- Save data into specified collection
    FOR i IN 1..l_records.count
    LOOP
    csv_to_array(l_records(i),l_record,i-1);
    l_seq_id := htmldb_collection.add_member(p_collection_name,'dummy');
    FOR i IN 1..l_record.count
    LOOP
    htmldb_collection.update_member_attribute(
    p_collection_name=> p_collection_name,
    p_seq => l_seq_id,
    p_attr_number => i,
p_attr_value => l_record(i)
);
END LOOP;
    END LOOP;
    -- DELETE FROM wwv_flow_files WHERE name=p_file_name;
END parse_file; --}}}
BEGIN
NULL;
END htmldb_sid_50;
/
Now I'm stuck when I call the package through the following job:
    DBMS_SCHEDULER.create_job (
    job_name => 'New_Testing1',
    job_type => 'PLSQL_BLOCK',
    job_action => 'begin
    GDQ.htmldb_sid_50.parse_file ('''||:P13_FILEBROWSE||''',''P13_COLLECTION'',''P13_HEADINGS'',''P13_COLUMNS'',''P13_DDL'');
    htmldb_sid_50.parse_file('''||:P13_FILEBROWSE||''',''P13_COLLECTION'' ,''P13_HEADINGS'',''P13_COLUMNS'',''P13_DDL'',''gdqdata.gdq_table1'');
end;',
start_date => NULL,
repeat_interval => NULL,
end_date => NULL,
enabled => TRUE,
comments => 'UPLOAD FILE INTO TABLE IN GDQ.');
I believe there is a security issue in Apex related to wwv_flow_files which is creating the problem. Can anybody please assist me with this issue?
    Thanks in Advance
    Danalaxmi

    Please post the details of the application release, along with the complete error message and the steps you followed to reproduce the issue.
    Thanks,
    Hussein
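One general observation on the job posted above (a sketch under assumptions, not a confirmed fix): APEX bind variables such as :P13_FILEBROWSE, and session-state calls like v() inside the package, are only meaningful while the APEX session exists, so a DBMS_SCHEDULER job that runs later in its own session cannot see them. A common workaround is to resolve the values during page submit and embed them in job_action as literals:

```sql
-- Sketch: build job_action while the APEX session still exists, so the
-- filename is embedded as a literal instead of referencing session state.
-- Note: the package itself also calls v()/set_session_state, which would
-- equally fail in the job session and would need the same treatment.
DECLARE
  l_file   VARCHAR2(400)  := :P13_FILEBROWSE;  -- resolved now, in-session
  l_action VARCHAR2(4000);
BEGIN
  l_action := 'begin gdq.htmldb_sid_50.parse_file('
           || ''''  || l_file || ''', '
           || '''P13_COLLECTION'', ''P13_HEADINGS'', '
           || '''P13_COLUMNS'', ''P13_DDL'', ''gdqdata.gdq_table1''); end;';
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'NEW_TESTING1',
    job_type   => 'PLSQL_BLOCK',
    job_action => l_action,
    enabled    => TRUE,
    comments   => 'UPLOAD FILE INTO TABLE IN GDQ.');
END;
/
```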

  • RPCS0000 job issue

    Hello,
    I have an issue when I run the program "RPCS0000" for RPTIME00 in the background.
    In SM37, the job appears but after one minute, it changes from "active" status to "interrupt"
    Could someone help on this please?
    Kind regards
    Karim

    Here is the error message :
When calling the SAP spool system, an (unspecified) internal error occurred.
More detailed description of error: "spool overflow"
Further information: see below under "Spool error information".
RSPO - Error
code: 1  RSPOEINTERN  Some other internal error
spool overflow
module: rspospno  no: 29  line: 351  T100: PO001
TSL01: FBN

  • Oracle 11g OEM job issue

    Hi all,
After the Oracle 11g migration, I have been facing issues with OEM scheduling. Twice the job did not run, even after the 11.2.0.7 upgrade. Hence I would like to explore the possibility of scheduling the same via DBMS jobs (we have not faced even a single failure since migration). However, there is no mail alert available for successful completion of the job. In this regard I would like to have your views and your help in creating alerts to proceed further.
Thanks

    Hi Ravi,
The link really helps, even though I have been looking for the root cause so as to overcome the job failures (scheduled jobs).
Actually, after completion/success of my scheduled job I may also get an "INITIALIZATION ERROR" status message, so I need to re-schedule the particular job again for the next day.
Here is the output of the error:
    Job Name=OEM_FLEET_ICL_SEND_6_30_AM_DAILY
    Job Owner=SYSMAN
    Job Type=SQL Script
    Target Type=Database Instance
    Timestamp=Apr 19, 2011 6:30:44 AM
    Status=Initialization Error
    Step Output=
    Command:Output Log
    SQLPlus: Release 11.2.0.2.0 Production on Tue Apr 19 06:30:15 2011*
    Copyright (c) 1982, 2010, Oracle.  All rights reserved.
    SQL> SQL> SQL> SQL> Connected.
    SQL> SQL> SQL> SQL>
    PL/SQL procedure successfully completed.
    SQL> SP2-0103: Nothing in SQL buffer to run.
    SQL>
    Commit complete.
    SQL> SQL> SQL> Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    *~~~End Step Output/Error Log~~~*
Kindly advise me on this.
    Thanks

  • RFC Job Issue

    Dear experts,
    scenario:
Many parallel jobs are being triggered in our R/3 server. These are triggered by RFC from BI and they are posting IDocs in R/3. During this process, each job reads the NRIV table and posts IDocs in R/3. Some jobs read this table for more than 7000 seconds. During this time many parallel jobs are running from the same user and occupying all background processes in the CI and app servers as well.
Can I restrict the number of background jobs triggered by this RFC user? Do any RFC profile parameters need to change?
We are running on Solaris and DB2.
    Regards
    Ravi

    Hi,
Are you running only with the CI instance, or do you also have other instances? As you have mentioned that the CI is loaded with all of the BG processes, kindly check the points below:
1) Schedule the jobs with a 10-15 minute interval so that the table does not get updated for the same entry by the same job every time
2) Check RZ12 / SARFC for resource utilization
3) You also need one or more additional instances if you are frequently facing WP resource issues or high utilization of resources
    Thanks,
    Saleem

  • Forecast BG Job and messages to be Ignored

    Dear Experts,
The background activity definition for forecast has some 50-odd message numbers that can be checked to set the batch job (TS_BATCH_RUN) to success, but the message class /SAPAPO/MA has several scores of messages related to the context of stat forecast that are not included in this background activity definition (to check). This leads to a situation where the error management in the respective process chains fails and/or the batch job itself ends with errors but still shows as finished.
Is there a cleaner solution to this nuisance, apart from the random bunch of message-specific notes released by SAP on an ad-hoc basis? Example message numbers are 502, 503, 504... especially if I KNOW which messages can safely be excluded from raising errors.
    Thanks
    BS

    Hello BS
I agree. It's an area which requires a cleaner solution.
Please check SAP Note 1688648 - DP Job log in process chain: show when process status is set.
In addition, the successor step in the process chain should be allowed to start on the event "ALWAYS" after the Statistical Forecasting step is finished, regardless of error or success. This should help you trigger the next step without manual intervention.
Also please check whether these notes are relevant as well - OSS Notes 0001238917, 0001250385 & 0001399861. These notes are interrelated.
    Hope this will help to some extent.
    Let us hope we get some solid inputs on this topic.
    Thank you
    Satish Waghmare

  • GC 10.1 on WIndows Job issue

    Hi,
I'm running into an issue where a job, e.g. the backup job, gets stuck in "Waiting for Output". The resolution I've found is to restart the agent on the target database after I stop the job. This stops the job, and then I resubmit it manually to get it to work.
Furthermore, if it's a daily job, I need to resubmit another daily job manually. Of course, this is working for me, apart from the hassle.
Just wondering if anyone else is having the same issue. Again, I'm running 10.1 GC on Windows 2003.
    Cheers,
    Rodney

Yes - I have had the same problem with release 1 on Windows. I am beginning to suspect that the June 1 release date is for 2007!
Thanks for confirming, ecoombs, that you are seeing the same problem.
    Cheers.

  • Sharepoint 2010 Content Deployemnt Job issue with duplicate fields in User information List

    Hi friends,
I am facing the below issue with the content deployment job.
It was working earlier, but for the last couple of days all the content deployment jobs in the production environment have been failing with the error below:
Field name already exists. The name used for this field is already used by another field in the list. Select another name and try again.
ObjectName="User Information List".
When I check the fields in the User Information List on the target site, I find that a couple of columns are duplicated, like "Ask me about", "First name", "Last name", etc.
Do I need to drop the target site collection or recreate it with a fresh content deployment job?
Please suggest.
    Regards
    Subrat

    Hi,
    According to your post, my understanding is that you got duplicate field error.
    Based on the error message, you can try to use the following code sample to remove duplicate records, and check whether it works:
    http://social.msdn.microsoft.com/Forums/en-US/sharepointgeneralprevious/thread/41ee04bd-91fb-4bf9-932a-bac42c56c357
    Here is a similar issue, you can also use the ‘RemoveDuplicateColumn64’ provided:
    http://sharepointsurfer.wordpress.com/2012/04/27/how-to-fix-publishing-site-content-deployment-error-duplicate-first-name-column/
    What’s more, as you had said, you can recreate a site with a fresh deployment job.
    Thanks & Regards,
    Jason
    Jason Guo
    TechNet Community Support

  • Background job issue

    The background job - SAP_COLLECTOR_FOR_NONE_R3_STAT is taking too long to complete.
    It is stuck with -
    Reason - RFC  
    Status - On-hold
    Double clicking on the work process I see this -
    Waiting f.
    Gateway A92SV093XI034
    Report / Spool action
    CL_SLD_ACCESSOR===============CP
    On the SAP system of server A92SV093XI034, rdisp/max_gateways = 500 and the SLD is running fine.
    So, could you please tell me the cause of the work process being stuck?

The job has been running for the last 2 days, so there are no errors.
I cancelled the jobs now.
Also, I checked the SAPSLDAPI RFC and it is not connecting to the SLD.
So it is most likely a problem with the SLD connection.
I checked everything as per the link -
http://help.sap.com/saphelp_nw04/helpdata/en/78/20244134a56532e10000000a1550b0/frameset.htm
I just can't figure out what's wrong.

  • Background Job Issue

    Hi,
My report program executes successfully in the foreground but not in the background. Can anyone help me with this?
My report program has a selection screen. I entered the data for it and executed it through the menu option Program -> Execute in Background. When it asked for an output device I gave LP01, set the number of copies to 1, checked the "Print all" radio button, and clicked OK.
It asked me for a start time; I selected Immediate and clicked Save.
It gave the status message "Background job scheduled".
Now, when I look in SM37 under my name, the job is always active and never finishes.
Thanks in advance.

Just check whether the job is actually running in the background:
Double-click on the job name and note the server & process ID.
Go back to the SM37 screen.
Click on Application Servers, or directly open transaction SM51.
Select the server and find the process ID with the help of the details you noted.
Now you can see the database level.
Check whether the values are moving or not. This way you can track the details.
If not, ask the BASIS guys to check whether the job is reading from rollback or not.
If it is, cancel and rerun the job... or else leave the job; it will finish in some time.
Hope everything is clear.

  • Schedule jobs issue after the upgrade

We have Data Services jobs scheduled via the management console in a 4.0 environment. I upgraded the repository to 4.2 and attached it in the CMC and also in the Server Manager. The two environments are on different servers.
When I log in to the Data Services management console and click on the upgraded repository's schedules, the scheduled jobs are listed there, but there is no job server attached to them. When I try to activate or edit a schedule, I get the error below.
Schedule: billing_rep Error: [Socket is closed] When contacting the server the above exception was encountered, so it will be assumed the schedule is not active anymore.
Am I missing any steps?

    Hi Jawahar,
Below is my suggestion regarding your query:
Edit the existing job server, update the details of the local repository, and save it.
Now try to map the same job server to the scheduled jobs and check whether it works.
Or else:
Create a new job server and assign it to the local repository.
In this case you will have to update all of your real-time & batch job configurations.
    Thanks,
    Daya
