Relay through process channel

Hi,
For the last three days our mail server has been relaying mail:
30-Jul-2007 06:01:42.68 tcp_local ims-ms E 2 [email protected] rfc822;[email protected] malas+SPAM@ims-ms-daemon
30-Jul-2007 06:01:42.69 tcp_local process E 2 rfc822;[email protected] [email protected]
30-Jul-2007 06:01:42.74 process tcp_local E 2 rfc822;[email protected] [email protected]
30-Jul-2007 06:01:42.74 process D 2 rfc822;[email protected] [email protected]
30-Jul-2007 06:01:42.77 ims-ms D 2 [email protected] rfc822;[email protected] malas+SPAM@ims-ms-daemon
30-Jul-2007 06:01:43.91 tcp_local R 2 rfc822;[email protected] [email protected] dns;inbound.adcommoutdoors.com.netsolmail.net (ns-mr15.netsolmail.com ESMTP) smtp;550 5.2.1 <[email protected]>... Mailbox disabled for this recipient
How the hell does the spammer get into the process channel?
In my SEND_ACCESS mapping I need to allow mail from the process channel so that the automatic vacation message works:
SEND_ACCESS
[..] (standard "$NBad$ destination$ system" stuff)
! allow relaying from intranet:
tcp_intranet|*|*|* $YIntranet
tcp_auth|*|*|* $YAuthenticated
tcp_local|*|ims-ms|* $Y
process|*|*|* $YAutoreply
*|*|*|* $NNot$ allowed!$ Are$ authentication$ and$ SSL$ activated?
How can I stop the spammer from getting into the process channel, which relays his mail back out? Or can I set up the autoreply function to send authenticated mail?
Regards and thanks,
David

Hi,
> How the hell does the spammer get into the process channel?
The spammer didn't come into the process channel; this was the destination. From what I can tell, the recipient got a copy of the email in their SPAM folder (I assume you have some kind of spam-filtering software), and a bounce message was created (which I assume was a vacation reply) and sent back to the sender.
> How can I stop the spammer from getting into the process channel, which relays his mail back out? Or can I set up the autoreply function to send authenticated mail?
You can use a system sieve and the 'novacation' sieve command to stop emails detected as spam from having a vacation message sent back.
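A minimal system-sieve sketch of this (the "Spam-Test" header name and its value here are assumptions about what your spam filter adds; substitute whatever header your filter actually sets):

# Suppress vacation/autoreply generation for messages the
# spam filter has already tagged.
if header :contains "Spam-Test" "SPAM" {
    novacation;
}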
Please refer to the email thread "how to get vacation messages not to respond to spam?" at:
http://lists.balius.com/pipermail/info-ims-archive/2005-November/023598.html
Regards,
Shane.

Similar Messages

  • Broadcasting through process chains

    How can I broadcast reports to individual mailboxes in the company through process chains? Say current data should be broadcast to individual email accounts every time a new sales order or purchase order is created, or at the end of the day it should automatically trigger the mail with the report. Can anyone help with this?

    Hi,
    Here see the blog details.
    The goal of information broadcasting is to distribute the right information, in the appropriate format, to the right people, through different channels, at the right time.
    With the BEx Broadcaster, you can precalculate queries, query views, Web templates, reports and workbooks and broadcast them by e-mail or to the portal. In addition to the precalculated documents in various formats (HTML, MHTML, ZIP and so on) that contain historical data, you can also generate online links.
    Accessing the Broadcaster
    The broadcaster can also be accessed via the Portal through the delivered BI Role.
    You can determine the Scheduling for Information Broadcaster
    Based on a data change event triggered by a process chain.
    Based on a pre-defined time point.
    Freely definable scheduling.
    Steps to Schedule Information Broadcaster based on a data change event triggered by a process chain
    Create the Query in the Query Designer.
    Execute the report in the BEx Web Analyzer.
    Click on "Send" to create the settings for broadcasting.
    The Broadcast Wizard takes you through a series of prompts where you supply the key information required to develop a broadcast. At any time you can leave the Wizard and use the standard settings dialogs which offer more settings.
    Then schedule the broadcast. If you start the Broadcaster for a query (or template or workbook) that gets data from an InfoProvider that will be selected in the process chain, you can select the InfoProvider for Scheduling.
    Create the process chain and include the data change event: include the process type "Trigger Event Data Change (for Broadcaster)"; it is available under "Load Process and Post-Processing".
    The Process Chain is created including the process types: (1) Start (2) Execute InfoPackage (3) Delta Data Transfer Process (4) Activate DSO (5) Trigger Event data Change.
    When you create the variant for the Event Data Change process, a checkbox lets you indicate when the broadcast should trigger.
    As soon as that InfoProvider is affected by a process chain, the broadcasting is triggered.
    After successful activation you can now schedule your chain. Press the "Schedule" button or choose the menu "Execution -> Schedule". The chain will be scheduled as a background job; you can see it in SM37, where you will find a job named "BI_PROCESS_TRIGGER". Unfortunately, every process chain is scheduled with a job of this name; the job variant tells you which process chain will be executed. During execution, the steps defined in RSPCPROCESSCHAIN are executed one after the other, and the execution of each next step is triggered by events defined in that table. You can watch SM37 for newly executed jobs starting with "BI_" or look at the protocol view of the chain.
    You can monitor the Broadcaster from the SOST transaction.
    Note:
    Depending on authorizations, end-users can schedule their Broadcasting Settings.
    Only those queries for which "Execution with Data Change in the InfoProvider" is checked when you schedule them will be triggered by the process chain event.
    You may wish to refer to SAP Note 760775 (Settings for Information Broadcasting).
    Hope this helps you.
    Regards,
    Rakesh

  • How to print a Crystal output on paper through the Process Scheduler in PeopleSoft

    Post Author: rajaumareddy
    CA Forum: Charts and Graphs
    While printing a Crystal report through the Process Scheduler in PeopleSoft, it gives an error run status; I have set the destination to Printer.
    Could you help me find a solution?
    1. What is the required Process Scheduler setup?
    2. Is there any server-side setup (default server)?

    Srinivas, Thanks for your quick reply.
    This is a smartform, and I am sending the output as a PDF, except for the first page, on which I have the text to be printed in the email body. In this email body text I have to display an email address.
    When the user clicks on the email address, it should open Outlook with that email ID in the To field. Hope this helps.
    Thank you,
    Surya

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in BW 3.5. I am in the development process.
    My loading process is:
    Start - 2 Delta Loads - 1 Full Load - ODS Activation - Delete Index - Further Update - Delete overlapping requests from infocube - Creating Index.
    My question is:
    When I load for the first time I get some data; for the next load I should get zero records, as there is no new data, but I am getting the same number of records again. Maybe it is taking the data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you get all the changed records, but after those two deltas you again have a full load step, which picks up the whole of the data all over again.
    Other reasons you could be getting the same number of records:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time; if you initialized the deltas with "Initialization without data transfer", then when you ran them for the first time they picked up the whole of the data, and running a full load after that will pick up the same number of records too.
    If the two deltas you are talking about run one after another, then you got the data because of some changes; since you are loading from a single ODS to a cube, both your delta and full loads will pick up the same data "for the first time" during data marting, as they have the same DataSource (the ODS).
    Hopefully this will serve your purpose.
    Thanks & Regards
    Vaibhave Sharma
    Edited by: Vaibhave Sharma on Sep 3, 2008 10:28 PM

  • Incorrect data after activating the request through Process chain.

    Dear SDN chaps.
    This morning I encountered a strange issue in a DSO.
    I have a DSO which is updated from an AL11 (application server) flat file.
    While I am loading it to the PSA there are no issues, and after loading it to the DSO there is no issue either: it passes through the routine and the data populates properly in the new data table. But after successful activation of the request through the process chain, I am getting wrong records in the active data table.
    Then I deleted the request and reran it manually, i.e. I triggered the DTP and ran the manual activation; surprisingly, accurate records come through with the manual process.
    I am just wondering why it is not working through the process chain, why it shows incorrect records through process chain execution, and yet shows accurate records through the manual upload process.
    Could someone please help me get to the bottom of this? By the way, mine is SAP BI 7 SP20 & SP05 for BW 7.01.
    Thanks
    K M R
      

    Hi Pra
    Thanks for your response..
    We are doing PSA deletion and then we are uploading the data to the PSA as well as the DSO.
    The issue is not in the loading part; we are facing it in the activation of the request. If I execute the activation through the process chain it succeeds, but the values are incorrect. If I do the manual activation, it succeeds with correct data.
    I even tried with a new chain, but I am still facing the issue.
    The surprising thing is that in the new data table the data is perfect both ways, manual update and process chain update; only during activation do I get incorrect records in the active data table.
    Appreciate your help on this.
    Thanks
    K M R
    Edited by: K M R on Jul 9, 2010 11:09 AM

  • How to call program through process chain

    Hi Gurus,
    I need to execute an ABAP program through a process chain. I have used 'ABAP Program' as the process type in the process chain (this is the first time I am using this process type). When I execute the process chain, the ABAP program does not execute. Eagerly anticipating your reply.
    Regards
    Shiva

    Hi
    I managed the execution of RSCRM jobs in a PC (process chain) as follows:
    1. Execute the query through transaction RSCRM_BAPI.
    2. Go to SM37 and copy the job name (the active one).
    3. Create the following program:
    *& Report /WST/RSCRM_START *
    REPORT /wst/rscrm_start.

    " Job name copied from SM37.
    PARAMETERS: l_bid TYPE sysuuid_c.

    " Start the RSCRM report in batch.
    CALL METHOD cl_rscrmbw_bapi=>exec_rep_in_batch
      EXPORTING
        i_barepid = l_bid.
    4. Execute the program and fill the parameter with the job name.
    5. Save this as a program variant and use it in the PC as a normal ABAP program step.
    I hope that helps.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    regards
    ashwin

  • Problem in Activating Hierarchy Attribute change run through process chain.

    Hi All,
    I have a problem with a hierarchy attribute change run executed through a process chain. The process chain completes successfully without any error,
    and even though the hierarchy change run completes successfully, the hierarchy objects are not activated,
    so every day I have to activate the hierarchy objects manually.
    Also, in one attribute change run process I have included 10 hierarchy objects. Is that correct, or do I have to create 10 separate attribute change run processes in the process chain? Please throw some light on this issue.
    With regards,
    Hari

    Hi venkat,
    Yes, I have already included the 'Save Hierarchies' process in the process chain.
    My main problem is that the hierarchy InfoObjects are not activated via the process chain, and the process chain does not even give an error message. Also, in which log table are the hierarchy objects maintained (such as run time, date, status, etc.)?
    Are there any other settings to be made in the process chain?
    with regards,
    hari

  • Error in the source system while loading data through a process chain

    Hi,
    I am facing an issue while loading data for certain extractors through a process chain. The load of 0BPM_WIHEAD, 0BP_LOGHIST, 0BPM_OBJREL, 0BPM_DEADLINES (there are 2-3 more extractors) fails daily with the message "Error occurred in the source system", and if we repeat it, it executes successfully.
    Regards,
    Javed

    Hi..
    It means that the extraction job is failing in the source system.
    Please check the job log in the source system: copy the request number, go to SM37 in the source system, enter * and the request number, execute, and check the job log.
    There may be multiple reasons behind this:
    1) Connection problem: go to SM59 and test the connection. If you don't have access to SM59, go to RSA1 --> Source Systems --> right click --> Connection Parameters --> Connection Test.
    2) Or maybe work processes are not available, and the jobs are failing due to a timeout.
    In both cases you can check with the Basis team whether they can suggest something, or you can change the process chain scheduling time if possible and schedule it at a time when fewer jobs are running.
    Regards,
    Debjani..

  • Archiving Infocube through Process Chain...

    Hi All,
    I need help in creating a process chain for archiving an InfoCube. I am able to archive the InfoCube manually but not through a process chain.
    Is it possible to archive an InfoCube through a process chain? If yes, please give the steps to create the process chain for archiving.
    Thanks in advance.
    Bandana.

    Hi,
    It is possible to archive data from an InfoCube via a process chain.
    Have a start process followed by 'Archive Data from an InfoProvider'. The trick lies in the variants used for the archiving steps in the chain.
    Create a process by dragging in the 'Archive Data...' process, give a name for the variant, and use this variant for writing the archive file. Choose your archiving process (the same archiving process you created to archive data from the InfoCube); as this is the write phase, do not check the 'Continue Open Archiving Requests' checkbox, and choose the option '40 Write phase completed successfully' under 'Continue Process Until Target Status'. Now enter the required selection conditions. In case you want to reuse this chain, give a relative value under the 'Primary Time Restriction' tab and save this variant. This is your variant 1.
    Now drag in the same archiving process and create a variant; in this process you need to select 'Continue Open Archiving Request' and choose the option '70 Deletion phase confirmed and request completed' from the dropdown list. This variant deletes the data from the InfoCube. This is your variant 2.
    So now you have a process chain with Start --> archiving process with variant 1 (to write the archive file) --> archiving process with variant 2 (to delete the data from the cube).
    That's it!
    Regards
    Edited by: Ellora Dobbala on Apr 8, 2009 5:28 PM

  • 50 Parallel Processing through Process Chain

    Hello Experts,
    Here is what I am trying to do.
    I want to create a process chain which runs 50 (for example) ABAP programs in parallel. I can create them one by one through process types, selecting 'ABAP Program' and dragging it into the chain, which is really very time consuming and tedious.
    Does anybody have a better approach? Is there any table where I can create all these parallel steps and put them in a chain, or can I copy each step and put it in parallel in the chain?
    A quick reply will be appreciated.
    Points are guaranteed for the right resolution.
    Thanks in Advance.

    Hi "Believe in Jainism",
    Do you need to execute other process chain steps after executing the 50 abap programs? If not, why not just do the following:
    1) Create an ABAP program
    2) Inside the ABAP program, use the SUBMIT statement to execute the 50 ABAP programs in a background task.
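    For illustration, a minimal sketch of option 2 (the report name Z_PARALLEL_STARTER and the Z_PROGRAM_* names are placeholders; JOB_OPEN / SUBMIT ... VIA JOB / JOB_CLOSE is the standard pattern for launching reports as background jobs, which then run in parallel):
    REPORT z_parallel_starter.

    " Placeholder names - replace with your 50 report names.
    DATA: lt_progs    TYPE TABLE OF programm,
          lv_prog     TYPE programm,
          lv_jobname  TYPE btcjob,
          lv_jobcount TYPE btcjobcnt.

    APPEND 'Z_PROGRAM_01' TO lt_progs.
    APPEND 'Z_PROGRAM_02' TO lt_progs.

    LOOP AT lt_progs INTO lv_prog.
      lv_jobname = lv_prog.
      " Open one background job per program.
      CALL FUNCTION 'JOB_OPEN'
        EXPORTING
          jobname  = lv_jobname
        IMPORTING
          jobcount = lv_jobcount.
      " Schedule the report inside that job and return here.
      SUBMIT (lv_prog) VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.
      " Release the job to start immediately; the jobs then run in parallel.
      CALL FUNCTION 'JOB_CLOSE'
        EXPORTING
          jobname   = lv_jobname
          jobcount  = lv_jobcount
          strtimmed = 'X'.
    ENDLOOP.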
    If you need to execute steps after the 50 ABAP programs, you can try to create a custom process type that executes the 50 ABAP programs in background tasks, monitors their statuses, and terminates only once all of the programs have executed successfully or one of them has failed.
    Hope this helps.
    P.S. If you don't mind, may I ask why do you need to execute 50 ABAP programs in parallel?

  • PSA Deletion through Process Chain

    Hi Experts,
    Currently I am working on BW 3.5. I would like to delete old PSA requests through a process chain. I need some clarification; please provide your suggestions. Thanks in advance.
    1) In SAP BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does this process type delete the requests from the PSA only, or will it delete from the PSA and the change log tables? In the SDN threads, some people said it deletes the PSA requests only, and some said it deletes the PSA requests as well as the change log.
    2) Currently we have six process chains, and each has master and transaction data loads (data loaded through the PSA). Similarly, I am planning to create six PSA deletion process chains, which include master and transaction data deletion. I ran into a little complication finding the object name (PSA table name); please refer to the screenshot. Is there any shortcut to find the full list of object names (PSA tables) currently used in the daily process chain loads?
    3) For the request selection, I prefer "Only successfully booked/updated requests"; I did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference.
    http://img818.imageshack.us/img818/3963/psa1.jpg
    Thanks,
    RR

    Hi Murali,
    Thanks for the response. I do understand identifying the DataSource and the PSA retention period (days). Let me elaborate on my question a little more. Thanks.
    1) In SAP BW 3.5 there is only one process type, 'Deleting Requests from the PSA', available for this activity. Does it delete the requests from the PSA only, or from the PSA and the change log tables? In the SDN threads, some people said it deletes the PSA requests only, and some said it deletes the PSA requests as well as the change log. If I am not mistaken, in BW 3.5 PSA deletion also runs through the change log. Am I right?
    2) Currently we have six process chains, and each has master and transaction data loads (data loaded through the PSA). Similarly, I am planning to create six PSA deletion process chains, which include master and transaction data deletion. I ran into a little complication finding the object name (PSA table name); please refer to the screenshot. In the daily process chains we have many stages in the master and transaction loads, so I can go to the InfoPackage level, find the DataSource, and identify the PSA table name, but is there a simpler way to find the PSA table name?
    http://img818.imageshack.us/img818/3963/psa1.jpg
    3) For the request selection, I prefer "Only successfully booked/updated requests"; I did not select "Only those requests with errors that are not booked in a data target". Please share your view on this selection preference. May I know which option is optimal in a normal business process? If I select neither option, will it delete all the requests in the PSA?
    Thanks.
    RR.

  • B2B  Custom document through Http Channel.

    Hi
    Normally, in order to exchange EDIFACT documents, we use EDI identifiers to receive documents from multiple trading partners.
    But we have a scenario where we need to use a single custom Purchase Order document definition and receive documents from multiple trading partners through the HTTP channel; we are passing XML documents. How do we get a document identifier for custom documents, and how do we get the identifier tag from the input document to differentiate between trading partners?
    Thanks in advance.
    Regards
    Chaitanya.

    Hi Anuj,
    Thanks for the reply.
    In my scenario I have to receive custom documents (cXML files) from two remote trading partners through the HTTP channel. I created one custom document definition and an agreement for each trading partner using that document definition.
    If the input document comes from SOA, I can easily set the fromTP and toTP values in a SOA mediator or BPEL and route the document to the particular agreement.
    But in my case the input document does not come from SOA (it may come from an HTTP servlet or some other way), so I cannot set the toTP and fromTP values.
    If it were an EDI document we could route it according to the identifiers, but in my case it is a custom document, i.e. cXML.
    In my XML I have an <Identity>xxx</Identity> tag; the value of this Identity tag tells us which partner the document is coming from. Using this Identity tag, can I route the document to the particular agreement?
    So based on the above Identity tag, how can I find the trading partner in B2B?
    Thanks in advance,
    Regards,
    Chaitanya.

  • Populating the batch manufacturing date automatically through process messages

    Hi gurus,
    We are using a control recipe to download data into an MES system and uploading data through process messages to do the confirmation and goods receipt. We are using batch management and calculating the shelf-life expiration date at the time of confirmation as manufacturing date + shelf life. The issue is that through process messages we are not able to populate the batch manufacturing date automatically, and we get a confirmation error after sending the process message to SAP. In the process message category we are using the characteristics PPPI_END_DATE, PPPI_EVENT_DATE and PPPI_START_DATE as date fields.
    Can anybody suggest how we can populate the batch manufacturing date automatically through process messages?
    With thanks
    Rajib Pathak

    thanks

  • Delete a specific attachment file through the conversion channel?

    version : iMS5.2 sp1
    O/S : Solaris 2.6 Generic_105181-29
    I want to delete a specific attachment file (e.g. ALTDESK.ZIP) through the conversion channel.
    So I set it up as below.
    1) In mappings file
    =================
    CONVERSIONS
    IN-CHAN=tcp_intranet;OUT-CHAN=tcp_local;CONVERT Yes
    ==============
    I only want to delete the attached file on traffic from tcp_intranet to tcp_local.
    2) msg-INSTANCE/imta/config/conversions
    Example mail headers:
    --- omit ----
    MIME-version: 1.0
    X-Mailer: iPlanet Messenger Express 5.2 Patch 1 (built Aug 19 2002)
    Content-type: multipart/mixed; boundary=--6b2385053506b85
    Content-language: ko
    X-Accept-Language: ko
    Priority: normal
    This is a multi-part message in MIME format.
    ----6b2385053506b85
    Content-Type: text/plain; charset=EUC-KR
    Content-Disposition: inline
    Content-Transfer-Encoding: quoted-printable
    ----6b2385053506b85
    Content-Type: application/x-zip-compressed
    Content-Transfer-Encoding: base64
    Content-Disposition: attachment; filename=ALTDESK.ZIP
    - conversions file setting
    ==================================================
    in-channel=tcp_intranet; out-channel=tcp_local;
    in-type=application; in-subtype=x-zip-compressed;
    parameter-symbol-0=ALTDESK.ZIP; parameter-copy-0=*;
    dparameter-symbol-0=ALTDESK.ZIP; dparameter-copy-0=*;
    message-header-file=2; original-header-file=1;
    override-header-file=1; override-option-file=1;
    command="/product/leeky/convert.sh"
    ============================================
    3) /product/leeky/convert.sh file
    ========================
    #!/bin/sh
    if [ $? -eq 1 ]; then
      echo "STATUS=178030178" >> $OUTPUT_OPTIONS
    else
      cp $INPUT_FILE $OUTPUT_FILE
    fi
    =========================
    4) The problems I face are:
    - All matched zip files are deleted. As you can see in 2), I only want to delete the ALTDESK.ZIP file, but at the moment all of the zip-compressed attachments are deleted.
    - The conversion channel works (even if all zip-compressed files are deleted), but sometimes it does not work (2 or 3 times out of 10). I do not know why.
    - Above all, I am not sure the settings in 2) and 3) are good. English is a second language for me, so it was not easy to understand the conversion channel settings in the Admin Guide.
    - How can I see the output of $OUTPUT_OPTIONS? I do not know where I can see that.
    Is there anybody who can help me?

    The section of the admin guide which can help is:
    http://docs.sun.com/source/816-6009-10/channel2.htm#42283
    Here there is an explanation of how the mime headers of the message part would align with the entries one would put into the conversions file entry. (The document has an error where it talks about APPARENT_NAME and APPARENT_FILENAME. It should really say the words "NAME" and "FILENAME" respectively).
    Based on that document, the MIME headers of your message part :
    Content-Type: application/x-zip-compressed
    Content-Transfer-Encoding: base64
    Content-Disposition: attachment; filename=ALTDESK.ZIP
    will align with a conversions file setting of:
    in-channel=tcp_intranet; out-channel=tcp_local;
    in-type=application; in-subtype=x-zip-compressed;
    parameter-symbol-0=NAME; parameter-copy-0=*;
    dparameter-symbol-0=FILENAME; dparameter-copy-0=*;
    message-header-file=2; original-header-file=1;
    override-header-file=1; override-option-file=1;
    command="/product/leeky/convert.sh"
    and a /product/leeky/convert.sh script which reads something like:
    #!/bin/sh
    # Check the part's filename= value (from Content-Disposition) against the list.
    grep "$FILENAME" /product/leek/badfiles.list
    if [ $? -eq 1 ]; then
      # No match on $FILENAME: hand the special status back to the channel.
      echo "STATUS=178030178" >> $OUTPUT_OPTIONS
    else
      # Also check the name= value (from Content-Type) against the list.
      grep "$NAME" /product/leek/badfiles.list
      if [ $? -eq 1 ]; then
        echo "STATUS=178030178" >> $OUTPUT_OPTIONS
      else
        # Both matched: pass the part through unchanged.
        cp $INPUT_FILE $OUTPUT_FILE
      fi
    fi
    The lines:
    parameter-symbol-0=NAME; parameter-copy-0=*;
    dparameter-symbol-0=FILENAME; dparameter-copy-0=*;
    tell the conversion channel to make the environment variable $NAME available to your program, with a value corresponding to the name= clause on the Content-Type line of the MIME headers. The environment variable $FILENAME is made available to your program and takes on the value extracted from the filename= clause on the Content-Disposition line of the MIME headers.
    The document at :
    http://docs.sun.com/source/816-6092-10/conversion.html
    may help provide other examples.

  • Broadcasting Workbooks through Process Chain

    Dear friends,
    I want to broadcast workbooks through process chain. I have tried two programs
    1) RSRD_BROADCAST_STARTER
    2) RSRD_BROADCAST_BATCH
    My requirement is that, I want a log to be mailed to me stating the success or failure of the broadcast of the workbook.
    Through the first program, I am able to send the log to my e-mail ID; however, I am not able to send the workbook.
    Through the second program, when I execute the process chain, it immediately reports that the process chain has executed. After some time the log mail arrives stating success, and after some more time the workbook gets broadcast.
    I want the workbook to be broadcast first, and then the log should come, specifying the success or failure of the precalculation of the workbook on the precalculation server.
    Please let me know if somebody knows how to do it.
    Regards
    Himanshu

    Hi Sapna,
    Yes, you have to install a precalculation server for broadcasting workbooks. The package is at:
    http://service.sap.com/swdc --> Download --> Support Packages & Patches --> Entry by Application Group --> SAP NETWEAVER --> SAP NETWEAVER 04 --> BI Precalculation
    You can check its status in transaction RSPRECADMIN, or via transaction SPRO --> SAP Reference IMG --> SAP NetWeaver --> Business Intelligence --> Settings for Reporting and Analysis --> Settings for Information Broadcasting --> Administrate Precalculation Server.
    The precalculation itself is set up by opening the query --> Publish --> BEx Broadcaster; there you can also make new settings (three or four tabs, one of which is Precalculate).
    Regards,
    Nisha Jagtap.

Maybe you are looking for

  • Query running on sql commands prompt not running on report region

    Hi All, Facing a weird issue now. I have written a report query which is running absolutely fine in sql command prompt but when i trying to run this as a report it is just processing and the report is not loading. What could be the reason behing this

  • Date and time formatting not responding

    I'm trying to total a column based on a date or date window.  My formula looks like this. SUMIF(Date,B1,'$') I have 2 tables.  The table with the formula references the other table, and compares Date in table 1 to B1 in table 2.  B1 is a date I enter

  • Pricing condition in idoc PORDCR05

    Hi all, Our current business process is- we will get POs from our legacy system in flat files, SAP will convert these into idoc basic type PORDCR05 / message type PORDCR. We are using BAPI_PO_CREATE to create PO from idoc. New requirement is to send

  • Same synced drives, different remaining space - aaarrggghh!

    This may not pertain to Mountain Lion exclusively, in fact i know it doesn't, but nobody (not even the manufaturer can explain this: I have two 2TB Western Digital Caviar Green drives both formatted on my mac pro. I typically use Decimus Synk to sync

  • Computer feels slow and laggy

    The description is a bit vague, but it's hard to describe the exact problem. I hope I'm posting the correct sub forum. I'll start with giving all relevant system information I can think of: root@linux ~ % uname -a Linux linux 2.6.35-ARCH #1 SMP PREEM