Question about CRM extractions

Hi all,
The CRM team has defined several business transaction types in CRM customizing. These transactions are transferred to BW using DataSource 0CRM_SALES_ACT_1. Now there are new requirements based on transaction types (BW: 0CRM_PRCTYP) which are not yet included in the extraction. No selections take place at InfoPackage level.
Two questions:
- Is it necessary to indicate somewhere which transaction types should be included in the extraction, or are there other possible reasons why these transactions are not transferred?
- I may need another DataSource for these transaction types. If so, how do I know which DataSource belongs to a particular business transaction?
Thanks in advance,
Henk.

Hi Eric,
   I think the license is an internet user license, which is cheaper than a SAP GUI user license. Ask your SAP sales contact to check it.
Regards.
Manuel

Similar Messages

  • Question about data extraction from web forms

    I am developing a simple web form in Dreamweaver MX to improve accessibility for users who rely on screen reader software, since PDF forms are still very difficult for most screen readers to navigate. I was wondering whether there is a way to take an .asp web form that a user fills out and, when they hit a print button, extract the data from the form, open an Adobe PDF form, and populate the data they entered into specific fields within the PDF form, allowing them to print an official copy of the form they completed in the web version. Any insight into this possibility is greatly appreciated!
    Thanks,
    AU PSD

    Regex? Lots of indexOf? Parsing...
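    Beyond parsing, one commonly used alternative (not necessarily what the replier had in mind) is to have the server turn the submitted form values into an FDF file that references the official PDF form; opening the FDF in Acrobat/Reader loads the PDF and fills the named fields. Below is a minimal, hedged sketch in Java; the field names (Name, Email), the output file name and the form path (official_form.pdf) are placeholder assumptions that would need to match the actual PDF form.

    import java.io.FileWriter;
    import java.io.IOException;

    public class FdfExport {
        // Writes a minimal FDF file; opening it in Acrobat/Reader loads the referenced
        // PDF form and fills the named fields with the submitted values.
        // Note: real code should escape '(' ')' and '\' characters inside the values.
        static void writeFdf(String pdfPath, String name, String email) throws IOException {
            String fdf =
                "%FDF-1.2\n" +
                "1 0 obj\n" +
                "<< /FDF << /Fields [\n" +
                "  << /T (Name) /V (" + name + ") >>\n" +     // "Name" is a placeholder field name
                "  << /T (Email) /V (" + email + ") >>\n" +   // "Email" is a placeholder field name
                "] /F (" + pdfPath + ") >> >>\n" +
                "endobj\n" +
                "trailer\n<< /Root 1 0 R >>\n" +
                "%%EOF\n";
            try (FileWriter out = new FileWriter("prefill.fdf")) {
                out.write(fdf);
            }
        }

        public static void main(String[] args) throws IOException {
            // The values would normally come from the submitted .asp form fields.
            writeFdf("official_form.pdf", "Jane Doe", "jane@example.com");
        }
    }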

  • Question About CRM Best Practices Configuration Guide...

    In the CRM Connectivity (C71) Configuration Guide, sections 3.1.2.2.2 and 3.1.2.2.3, two clients are mentioned: Client 000 and the Application Client. What are these two clients? I assumed Client 000 was my CRM client, but that sounds the same as what the application client should be.
    http://help.sap.com/bp_crmv340/CRM_DE/BBLibrary/Documentation/C71_BB_ConfigGuide_EN_US.doc

    Keith,
    Client 000 is not the application client.
    The client which is used in the middleware (e.g. CRM quality client - R/3 quality client, or CRM production client - R/3 production client) is the application client.
    You have to do it once in client 000 and once in the client you created yourself, which is used in the middleware connectivity.
    regards,
    Bapujee

  • Question about IPC for CRM 5.0

    Hi ,
    I have a question about CRM 5.0 in relation to the IPC.
    As we know, the IPC will be available internally from CRM 5.0, but we would like to know whether the internal IPC will be distributed with the different applications or whether it will be one IPC for all of them.
    Thanks in advance
    BR//
    Ankur

    Hi Ankur,
    See my reply in your other thread, Question about IPC for CRM 5.0.
    Reward points if it helps!
    Best regards,
    Vikash.

  • Regarding CRM extraction.....

    Hi Gurus,
    I have a few questions regarding CRM extraction.
    Q1: If we enhance a CRM transactional Datasource, do we have to write the ABAP logic in SE19 or CMOD or both?
    Q2: If we enhance a CRM MasterData Datasource, do we have to write the ABAP logic in SE19 or CMOD or both?
    Q3: What is the significance of the T-CODES BWA1, BWA5, BWA7 ?
    Q4: When do we implement the BADI CRM_BWA_MFLOW, when do we implement the BADI CRM_BWA_SFLOW, and when do we implement both of them?
    Q5: What is the significance of the function module CRM_BADI_GET_BDOC? Is there any other function module which is used in CRM extraction that has to be implemented in coding?
    Please throw some light on my questions.
    Thanks
    Nice day
    Regards
    Sam

    Hi,
    For delta extraction from CRM to BI, please follow the steps below:
    Go to transaction GNRWB
    Select BUS_TRANS_MSG
    Select (on the right, the services): BWA_DELTA3, BWA_FILL, BWA_queue
    Press Generate.
    Also check for the following:
    1. The delta should have been initialized successfully.
    2. Confirm that all BDocs of type BUS_TRANS_MSG are processed with success in SMW01.
    3. If there are queues in SMQ1 with erroneous status, activate these queues. In transaction SMQ1, if there are queues whose names begin with CRM_BWAn (where n is a number), activate them in the same transaction.
    4 a) If required, activate the DataSource: go to transaction BWA5, select the required DataSource and activate it.
    4 b) The delta may not be active; activate the delta in BWA7 by selecting the name of the DataSource and pressing the candle icon for 'activate delta'.
    5. In the BW system, go to transaction RSA1 > Modeling > InfoSources > select the InfoSource > right-click the selected InfoSource > choose the option 'Replicate DataSources'. Activate the InfoSource.
    6. Go to the scheduler for the InfoSource > select delta in the update > choose the option 'PSA only' (in the Processing tab) > start immediately.
    Check the entry in RSA7 in the OLTP (CRM) system.
    Hope it helps you.
    Regards,
    Nagaraju.V

  • Question about extracting pages and the default file naming

    I have a question about extracting pages. When I do the function, Adobe saves the individual files as "<file name><space><page number>", so the files look like this: "filename 1.pdf", "filename 2.pdf", "filename 3.pdf". Without too many gory details, I am using Excel to concatenate some data to dynamically build a hyperlink to these extracted files. It causes me problems, however, for the space to be in the file name. Is there any way to change the default behavior of this function to perhaps use a dash or underscore instead of a space?

    No, you can't change the default naming scheme. You can do it yourself if you extract the pages using a script instead of using the built-in command.
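    If scripting Acrobat itself isn't an option, another workaround is to leave the extraction as it is and batch-rename the output afterwards so the space becomes an underscore, which keeps the Excel-built hyperlinks simple. A minimal sketch in Java (the folder name "extracted" is an assumption; adjust it to wherever the pages were saved):

    import java.io.File;

    public class RenameExtractedPages {
        public static void main(String[] args) {
            File dir = new File("extracted");   // folder holding the extracted pages (assumed)
            File[] pdfs = dir.listFiles((d, name) -> name.toLowerCase().endsWith(".pdf"));
            if (pdfs == null) {
                System.err.println("Folder not found: " + dir.getAbsolutePath());
                return;
            }
            for (File pdf : pdfs) {
                String fixed = pdf.getName().replace(' ', '_');   // "filename 1.pdf" -> "filename_1.pdf"
                if (!fixed.equals(pdf.getName())) {
                    boolean ok = pdf.renameTo(new File(dir, fixed));
                    System.out.println((ok ? "renamed " : "failed  ") + pdf.getName());
                }
            }
        }
    }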

  • A question about users assigned roles extraction

    Dear all,
    I have a question about extracting the list of users with their assigned roles. I need the list of the users who have already been created, along with their assigned roles. According to what I found on Google, there is a table named AGR_USERS which provides the roles assigned to each user. Yet this table provides only the SAP ID of each user along with the assigned roles. What I need in addition is the first name and second name of each user.
    So, do you know any table providing at least the following information:
    1) First name of each user
    2) Second name of each user
    3) SAP ID of each user
    4) All assigned roles to each user.
    NOTE: I really need to have first name and second name in separate columns
    Thanks in advance,
    Dariyoosh

    >
    Shekar.J wrote:
    > Agr_users for the user ID and role assignments
    > USR02 to check the validity of the User ID
    > and USER_ADDR for the first name and last name
    >
    > You can create a Table join of the above 3 tables to retrieve the data you require
    Thanks to you and others for your attention to my problem
    I don't know anything about ABAP programming; is there any transaction that allows creating this join? It seems to me that the column "UNAME" in the table "AGR_USERS" and the column "BNAME" in the table "USER_ADDR" both refer to the SAP ID of the user. As a result, the condition of the join would be "WHERE (UNAME = BNAME)". Is there any transaction/programme that allows creating this join?
    Thanks in advance,
    Dariyoosh

  • Some question about SAP CRM consultant

    Dear all,
    I'm a center-based consultant in China. Our company was among the first batch to run SAP CRM in China, in 2006; our version is CRM 5.0. From my two years' working experience, the take-up of CRM in our company is not very good: the complex UI and synchronization problems make our partners unwilling to use the system.
    I used to do some ABAP development, then I moved on to MM & SD. CRM work is a recent thing. After some months of working in CRM, I found I am more interested in CRM.
    I want to find a new job at an outside consulting company in the future, and I want to focus on CRM. One reason is that I'm interested in CRM; the other is that CRM is a newer product and not so many people are doing it. However, I have some worries. First, SAP CRM does not seem very mature, at least from my understanding of version 5.0: we need to develop a lot, not just use user exits. I don't know whether 7.0 has changed a lot; I've seen the UI of 7.0 and it seems much better, but does that mean more companies will choose SAP CRM 7.0 rather than Siebel? In China there do not seem to be many CRM projects now, and I don't know the status in other countries. Second, CRM may differ a lot from an ERP implementation, because there are many processes which can't be solidified.
    In a word, I'm worried about a CRM consultant's future, and I hope the gurus can give me answers. Thanks.

    Hi Feng
    It is very interesting for clients as well as consultants to work on SAP CRM. Career-wise you have ample opportunities in it. With the advent of the various features in the CRM 7.0 version, SAP is now more user friendly and is easier to implement, with many processes covered as part of the vanilla offering.
    I wish you all the best and hope this helps.
    Regds,
    Raghu

  • Regarding bw crm extraction

    Hello Guys,
    I need to know the steps for BW CRM extraction. I know this has been asked many times, and I have read some threads before posting this question.
    I know that we have to activate the BW metadata in CRM for the extractors to work, but somehow the steps mentioned in the various threads do not seem to be adequate.
    Can we have one thread where we cover all of BW CRM extraction, right from step 1? Can someone please post it? Also, it would be great if we could discuss the implications (what goes on inside when we execute each step). It's also good to get a technical understanding of how this works.

    HI Mukund,
    Steps for Extracting data from CRM:
    Configuration Steps
    1. Click on 'Assign Dialog RFC Destination'. If your default RFC destination is not a dialog RFC destination, you need to create an additional dialog RFC destination and assign it to your default RFC destination.
    2. Execute transaction SBIW in CRM.
    3. Open BC DataSources.
    4. Click on 'Transfer Application Component Hierarchy'. The application component hierarchy is transferred.
    5. Run SPRO in CRM and go to CRM -> CRM Analytics.
    6. Go to transaction SBIW -> Settings for Application-Specific DataSources -> Settings for BW Adapter.
    7. Click on 'Activate BW Adapter Metadata' and select the relevant DataSources for CRM sales.
    8. Click on 'Copy DataSources', say yes and proceed.
    9. Log on to the BW system and execute transaction RSA1. Create a source system to establish connectivity with the CRM server. A source system is created (LSYSCRM200). (Prerequisite: both BW and CRM should have background/RFC users and logical systems defined.)
    10. Business content activation for the CRM sales area is done.
    11. Click on the source system and choose 'Replicate DataSources'.
    Which DataSource are you using?
    Hope this helps.
    Regards,
    Raj

  • Some questions about the integration between BIEE and EBS

    Hi,
    I'm a newbie to BIEE. These days I am having a look at the BIEE architecture and the BIEE components. In the next project there is some work on BIEE development based on the EBS application. I have some questions about the integration:
    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both the BIEE 10g and 11g versions be integrated with EBS R12?
    2) In the BIEE Administration tool, the first step is to create physical tables. If the source application is EBS, is it still necessary to create the physical tables?
    3) If the physical table creation is needed, how do we transfer the data from the EBS source tables to the BIEE physical tables? Which ETL tool is preferred by most developers: Warehouse Builder or Oracle Data Integrator?
    4) During the data transfer phase, if there is a very large volume of data to transfer, how do we keep it consistent? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and users open a BIEE report when 50% is completed, can they see the new 50% of the data in the reports? Is there any transaction control in the ETL phase?
    Could anyone give me some guidance? I would appreciate any other information as well.
    Thanks in advance.

    1) Generally, are the BIEE database and application server separate from the EBS database and application? Can both BIEE 10g and 11g be integrated with EBS R12?
    > You should consider Oracle BI Applications here, which uses OBIEE as the reporting tool with different pre-built modules. Both 10g and 11g come with different versions of BI Apps, which support sources like Siebel CRM, EBS, PeopleSoft, JD Edwards etc.
    2) In the BIEE Administration tool, the first step is to create physical tables. If the source application is EBS, is it still necessary to create the physical tables?
    > It is independent of any source. This is OBIEE modeling: you create the RPD with all of its layers. If you build it from scratch you will need to create all the layers; if BI Apps is used, you get a pre-built RPD along with the other pre-built components.
    3) If the physical table creation is needed, how do we transfer the data from the EBS source tables to the BIEE physical tables? Which ETL tool is preferred by most developers: Warehouse Builder or Oracle Data Integrator?
    > BI Apps comes with pre-built ETL mappings, used mainly with Informatica. Only BI Apps 7.9.5.2 comes with ODI, but Oracle plans to use only ODI for further releases.
    4) During the data transfer phase, if there is a very large volume of data to transfer, how do we keep it consistent? For example, if 1 million rows need to be transferred from the source database to the BIEE physical tables and users open a BIEE report when 50% is completed, can they see the new 50% of the data in the reports? Is there any transaction control in the ETL phase?
    > Users will still see the old data, because it is good practice to turn the cache on and purge it after every load.
    Refer to http://www.oracle.com/us/solutions/ent-performance-bi/bi-applications-066544.html and many more docs on Google.
    Hope this helps

  • Basic questions about Infocube

    Hi, everyone.
    I have some very basic questions about InfoCube data handling.
    With an InfoPackage, I extracted all the data about employees from R/3. But there were some mistakes when the employee data was entered, like positions, so I just extracted those employees' data once more.
    Now I have two requests in the InfoCube, where the first one has some wrong data. Is there any solution for this?
    I might have got it all wrong, so as a beginner, any suggestions and explanations would be appreciated.

    Hi,
    You can manually delete the earlier request by going to the Manage option of the cube: select the request and click the delete icon at the bottom.
    The other option is to make a setting in the InfoPackage to delete similar or overlapping requests:
    Data Targets tab --> 6th column, 'Automatic loading/deletion of similar requests' --> click the blank icon --> you get a pop-up --> select the radio button 'Delete existing requests' --> Selection Conditions --> InfoSource is the same, DataSource is the same and source system is the same --> selections are 'Same or More Comprehensive'.
    Assign points if useful
    Regards
    Venkata Devaraj !!!

  • Questions about 1003051 - TDMS 3.0 corrections - Composite SAP Note

    Experts:
    We are a big SAP shop and want to set up TDMS for ERP, BI, HCM and CRM.
    We have some questions about note 1003051:
    1) Is this note about ERP (and HCM) only? If so, what are the notes for TDMS on BI and CRM?
    2) Should we apply this note to all systems involved: sender, controller (SM 7.0 in our case) and receiver?
    3) We have EHP1...EHP4 in our ERP (and HCM), EHP1 on BI and perhaps EHP1 on CRM (not sure yet).
        a) What is the impact of the EHPs on TDMS?
        b) Should we install TDMS before applying the EHPs, or does the sequence not matter?
    Thanks for your help!

    1) Is this note about ERP (and HCM) only? If so, what are the notes for TDMS on BI and CRM?
    > This is a composite note and contains information about various other notes related to TDMS. Refer to each note mentioned in this note individually to assess this.
    2) Should we apply this note to all systems involved: sender, controller (SM 7.0 in our case) and receiver?
    > It depends from note to note; each note contains the necessary information about the system it has to be installed on.
    3) We have EHP1...EHP4 in our ERP (and HCM), EHP1 on BI and perhaps EHP1 on CRM (not sure yet).
    a) What is the impact of the EHPs on TDMS?
    > TDMS as standard supports Basis 7.0 systems; however, it generally works fine with 7.01 too.
    b) Should we install TDMS before applying the EHPs, or does the sequence not matter?
    > TDMS 3.0 is only available for systems on Basis 4.6C to 7.0, so install TDMS first and then the EHP.
    I hope these answers help.
    Regards
    Pankaj.

  • Question about LRU in a replicated cache

    Hi Tangosol,
    I have a question about how the LRU eviction policy works in a replicated cache that uses a local cache for its backing map. My cache config looks like this:
    <replicated-scheme>
      <scheme-name>local-repl-scheme</scheme-name>
      <backing-map-scheme>
        <local-scheme>
          <scheme-ref>base-local-scheme</scheme-ref>
        </local-scheme>
      </backing-map-scheme>
    </replicated-scheme>
    <local-scheme>
      <scheme-name>base-local-scheme</scheme-name>
      <eviction-policy>LRU</eviction-policy>
      <high-units>50</high-units>
      <low-units>20</low-units>
      <expiry-delay/>
      <flush-delay/>
    </local-scheme>
    My test code does the following:
    1. Inserts 50 entries into the cache
    2. Checks to see that the cache size is 50
    3. Inserts 1 additional entry (as I understand it, this should cause the eviction logic to kick in)
    4. Checks the cache size again, expecting it to now be 20
    With HYBRID and LFU eviction policies, the above logic works exactly as expected. When I switch to LRU however, the code at step 2 always returns a value significantly less than 50. All 50 inserts appear to complete successfully, so I can only assume that some of the entries have already been evicted by the time I get to step 2.
    Any thoughts?
    Thanks.
    Pete L.
    Addendum:
    As usual, in attempting to boil this issue down to its essential elements, I left out some details that turned out to be important. The logic that causes the condition to occur looks more like:
    1. Loop 2 times:
    2. Create named cache instance "TestReplCache"
    3. Insert 50 cache entries
    4. Verify that cache size == 50
    5. Insert 1 additional entry
    6. Verify that cache size == 20
    7. call cache.release()
    8. End Loop
    With this logic, the problem occurs on the second pass of the loop. Step 4 reports a cache size of < 50. This happens with LRU, LFU, and HYBRID-- so my initial characterization of this problem is incorrect. The salient details appear to be that I am using the same cache name each pass of the loop and that I am calling release() at the end of the loop. (If I call destroy() instead, all works as expected.)
    So... my revised question(s) would be: is this behavior expected? Is calling destroy() my only recourse?
    Message was edited by: planeski

    Robert,
    Attached are my sample code and cache config files. The code is a bit contrived-- it's extracted from a JUnit test case. Typically, we wouldn't re-use the same cache name in this way. What caught my eye however, was the fact that this same test case does not exhibit this behavior when running against a local cache directly (as opposed to a repl cache backed by a local cache.)
    Why call release? Well, again, when running this same test case against a local cache, you have to call release or it won't work. I figured the same applied to a repl cache backed by a local cache.
    Now that I understand this is more a byproduct of how my unit tests are written and not an issue with LRU eviction (as I originally thought), it's not a big deal-- more of a curiosity than a problem.
    Pete L.
    Attachment: coherence-cache-config.xml (to use this attachment, rename 545.bin to coherence-cache-config.xml after the download is complete)
    Attachment: LruTest.java (to use this attachment, rename 546.bin to LruTest.java after the download is complete)
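    For reference, here is a minimal sketch of the test loop described above, using the classic Tangosol/Coherence Java API. The cache name, keys and values are illustrative assumptions; the point it shows is the one from the discussion: destroy() removes the cache cluster-wide, whereas release() only detaches the local reference, which is why the second pass behaved unexpectedly when release() was used.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    public class LruReplTest {
        public static void main(String[] args) {
            for (int pass = 1; pass <= 2; pass++) {
                // Obtain the replicated cache configured with the local backing map above.
                NamedCache cache = CacheFactory.getCache("TestReplCache");
                for (int i = 0; i < 50; i++) {
                    cache.put("key-" + i, "value-" + i);
                }
                System.out.println("pass " + pass + ": size after 50 puts = " + cache.size());
                cache.put("key-50", "value-50");   // 51st entry should trigger eviction down to low-units
                System.out.println("pass " + pass + ": size after 51st put = " + cache.size());
                // destroy() removes the cache cluster-wide; release() would only drop this
                // client's reference and leave the backing data in place for the next pass.
                cache.destroy();
            }
            CacheFactory.shutdown();
        }
    }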

  • Question about ERMS push

    Hi Guru,
    I am prototyping the ERMS push solution in CRM7 and have some questions about the solution SAP help provided.
    Below is the detail about ERMS push:
    Here the e-mail is first handled by the e-mail pull mechanism: it is converted into a
    SAPoffice mail and analyzed by ERMS to find out more details about the mail (like language and certain keywords).
    Then the mail (including the additional information from the ERMS analysis) is transferred to the
    CMS (Communication Management Software). The CMS determines the appropriate agent team and
    dispatches the mail via the push process to an available agent of that team.
    Below is the sap help link:
    http://help.sap.com/saphelp_crm70/helpdata/EN/0e/6a22b86821468691bd5abb51dfd81e/content.htm
    I have the below questions about the solution in the help link:
    1. It mentions the e-mail profile (setting the agent inbox as the e-mail provider), and I changed the "default" profile delivered by SAP. I set up the rule policy according to the help and assigned it in the service manager profile. The purpose of ERMS push is to push e-mail to the CMS instead of sending it to the agent inbox using ERMS. Which business role should this e-mail profile be assigned to? Is it IC_AGENT?
    2. The help also mentions setting up "ERMS_ACTION" as the communication system ID in CRMM_BCB_ADM. Does this ID need to be added to the CMS profile? If so, which business role should this CMS profile be assigned to? Is it IC_AGENT?
    3. ERMS uses workflow WS00200001. After the e-mail is pushed to the CMS, what status should the workflow have, in progress or completed? Also, is it supposed to have an agent assigned in the workflow task?
    4. After the CMS pushes the e-mail back to CRM, there will be a pop-up for the agent to accept or reject. Will an interaction record be created once the agent clicks accept?
    It would be great if you could shed some light on this.
    Thanks in advance!
    Zhi Jie Kong
    Edited by: Zhijie Kong on Apr 28, 2011 4:32 PM
    Edited by: Zhijie Kong on Apr 28, 2011 4:47 PM

    Hello Zhijie,
    Let me see if I can help address some of your questions.
    1) It doesn't matter which business role you use. You can copy IC_AGENT, for example. The important thing is, as Mariusz mentions in the thread "ERMS email push: problem with CAD and transfer", your E-Mail profile must be set to E-Mail Provider = 2 (Agent Inbox).
    2) No, this ID itself does not need to be added to any business role (as I assume it is hardcoded in the SAP workflow as Mariusz mentions).
    3) From what I remember, the ERMS push e-mails are not set to complete by the system, and therefore can still get inadvertently routed to agents! I recommend having a second rule in your Rule Modeler policy to route the ERMS push e-mails to a special, separate queue where you can close them out easily without worrying about them getting assigned to any agents.
    4) Yes, the email will arrive like a phone call with the accept/reject buttons flashing (though it will show as an email, not a phone call). And yes, when the agent accepts an Interaction Record will be created by the system automatically.
    I hope this helps you!
    Regards,
    John

  • Questions about RH Server features

    Hi Adobe users,
    we're considering recommending RoboHelp Server as a publishing solution to a client. They are already using RoboHelp and want to make the contents available online. Another issue is centralized management of their help content documents. I'd be very glad if someone could answer the following questions:
    Is it possible to integrate RH Server with a CRM system or another user management system for authentication? Can it be integrated in a single sign-on infrastructure?
    How does the access control mechanism work? Can I set permissions for who is allowed to publish certain documents/projects?
    Does it support workflows (review/accept/reject)? Or is this done using the staging server?
    Does it support content versioning? Or do we additionally need RoboSource Control for that?
    Does it support locking (check-in, check-out) of documents to allow concurrent editing?
    In short: does RoboHelp Server act as a document management system, or do the authors have to store the contents on their local computers and publish only the final version? Another issue is translation management: can the authors publish one language version, which is then translated by other people (in other countries) and published to the same server?
    Thanks a lot in advance for any help!
    Andreas Hartmann

    Wow, Andreas. Welcome to the Forums.
    That's quite a laundry list!
    The answers are pretty much "yes", if I understand correctly. That said, I don't know that much about CRMs, so your mileage may vary.
    To start, you might want to take a look at my article on the Adobe Developer Network for an overall perspective: Adobe RoboHelp Server 6 improves the feedback loop.
    I view RoboHelp and RoboHelp Server as a "content" management rather than a "document" management system, though this is a bit of semantics. Using variables, conditional build tags and Single Source Layouts, you can manage the content of your output in a fairly granular way (including, to some level, your translation issues).
    In other words, a team of multiple authors can manage the content of a website in the way you describe with a combination of:
    1. The authoring client (RoboHelp's main application) to develop content and manage it.
    2. RoboSource Control (for the check-in, check-out and versioning you mentioned), where authoring content source material is stored on a central server for backup and access by the team.
    3. RoboHelp Server for managing the authentication of who has access to the web server for publishing, etc. RoboHelp Server is an Active Server Pages application running on IIS, typically with an MS SQL Server or Oracle DB as the back end. As such, it uses straightforward Windows Server authentication methods. When you set up the author's project to publish to the server, it will ask you for the username and password. Once this is configured, it's seamless from there. That login name/password can be the same as whatever you call your single sign-on pair, as long as it conforms to the Windows Server scheme mentioned earlier.
    The RoboHelp author who is designated as the "admin(s)" of the team can create and assign permissions from the authoring client and otherwise manage the server site remotely, without having to pester the web administrator for maintenance of the site.
    The staging server scenario you mention is an option that is completely up to your team and the IT folks. RoboHelp Server doesn't care one way or another. It's just an application sitting on an IIS web server. That server could be the "development" or "staging" server, or the so-called "production" or live server.
    The different languages you mention could be published to the same server and commingled into a single site (kind of messy), or you could have different language sites (as long as they are different domains/IP addresses) sitting on the same RoboHelp Server enabled machine.
    Regarding language, one important item to research is the RoboHelp version you are using and its language support. The much-anticipated Adobe RoboHelp 7 (now in beta and expected "before the end of the year") will have full Unicode/double-byte character support for 35 languages and a wonderful way of handling translation workflows.
    Well, that should get you started. Let me know what I've missed!
    Thanx,
    john

Maybe you are looking for

  • Acrobat 6 "Save As" settings weirdness

    Hi - I'm trying to save/export a PDF as either a Word or, basically, any sort of text document. I'm getting an error message saying: "This document is not a tagged PDF. Please set your Save As preferences to generate tags for untagged documents." Whe

  • Web page displays much too small

    I'm trying to develop a web page to display on the Pre, but no matter what I set the size of the page to it shows up very small on the Pre. Here's the raw html. Changing the font size to large didn't seem to change anything. No matter what it shows u

  • Need to chg text of a field in SM30: Compare Flag

    Hi, I created a table & created a table maintance Generator. I want to change the text of a field in SM30. Currently it is taking from Dictonary Field text. I went to SE11->Table Maintance Generator-> Modification->Maintance Screen->Flow Logic There

  • Java script and scriplets

    Hi, I have got a .jsp page wherein there is a 2 dimentional string array defined within a scriplet. I have defined an other function in java script within the same page. Now I need to access the string array within the java script. Can someone please

  • Not a problem per say....

    Okay well let me first start out by saying I am not a Mac newbie just been away for awhile. I got a great deal on a used iMac which I love by the way. Well my problem isn't with my Mac it is with my wireless router/cable modem (yes I have one of thos